Mar 11 00:54:02 crc systemd[1]: Starting Kubernetes Kubelet...
Mar 11 00:54:02 crc restorecon[4743]: Relabeled /var/lib/kubelet/config.json from system_u:object_r:unlabeled_t:s0 to system_u:object_r:container_var_lib_t:s0
Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/device-plugins not reset as customized by admin to system_u:object_r:container_file_t:s0
Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/device-plugins/kubelet.sock not reset as customized by admin to system_u:object_r:container_file_t:s0
Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/volumes/kubernetes.io~configmap/nginx-conf/..2025_02_23_05_40_35.4114275528/nginx.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/22e96971 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/21c98286 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/0f1869e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/46889d52 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/5b6a5969 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c963
Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/6c7921f5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/4804f443 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/2a46b283 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/a6b5573e not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/4f88ee5b not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/5a4eee4b not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c963
Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/cd87c521 not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_33_42.2574241751 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_33_42.2574241751/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/38602af4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/1483b002 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/0346718b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/d3ed4ada not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/3bb473a5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/8cd075a9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/00ab4760 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/54a21c09 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c589,c726
Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/70478888 not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/43802770 not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/955a0edc not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/bca2d009 not reset as customized by admin to system_u:object_r:container_file_t:s0:c140,c1009
Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/b295f9bd not reset as customized by admin to system_u:object_r:container_file_t:s0:c589,c726
Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..2025_02_23_05_21_22.3617465230 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..2025_02_23_05_21_22.3617465230/cnibincopy.sh not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/cnibincopy.sh not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..2025_02_23_05_21_22.2050650026 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..2025_02_23_05_21_22.2050650026/allowlist.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/allowlist.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/bc46ea27 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/5731fc1b not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/5e1b2a3c not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/943f0936 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/3f764ee4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/8695e3f9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/aed7aa86 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/c64d7448 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/0ba16bd2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/207a939f not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/54aa8cdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/1f5fa595 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/bf9c8153 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/47fba4ea not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/7ae55ce9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/7906a268 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/ce43fa69 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/7fc7ea3a not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/d8c38b7d not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/9ef015fb not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/b9db6a41 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991
Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/b1733d79 not reset as customized by admin to system_u:object_r:container_file_t:s0:c476,c820
Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/afccd338 not reset as customized by admin to system_u:object_r:container_file_t:s0:c272,c818
Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/9df0a185 not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991
Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/18938cf8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c476,c820
Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/7ab4eb23 not reset as customized by admin to system_u:object_r:container_file_t:s0:c272,c818
Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/56930be6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991
Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides/..2025_02_23_05_21_35.630010865 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..2025_02_23_05_21_35.1088506337 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..2025_02_23_05_21_35.1088506337/ovnkube.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/ovnkube.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/0d8e3722 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/d22b2e76 not reset as customized by admin to system_u:object_r:container_file_t:s0:c382,c850
Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/e036759f not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/2734c483 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/57878fe7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/3f3c2e58 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/375bec3e not reset as customized by admin to system_u:object_r:container_file_t:s0:c382,c850
Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/7bc41e08 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/48c7a72d not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/4b66701f not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/a5a1c202 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666/additional-cert-acceptance-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666/additional-pod-admission-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/additional-cert-acceptance-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/additional-pod-admission-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides/..2025_02_23_05_21_40.1388695756 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/26f3df5b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/6d8fb21d not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/50e94777 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/208473b3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/ec9e08ba not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/3b787c39 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/208eaed5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/93aa3a2b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/3c697968 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/ba950ec9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/cb5cdb37 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/f2df9827 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..2025_02_23_05_22_30.473230615 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..2025_02_23_05_22_30.473230615/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_24_06_22_02.1904938450 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_24_06_22_02.1904938450/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/fedaa673 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/9ca2df95 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/b2d7460e not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/2207853c not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/241c1c29 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/2d910eaf not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..2025_02_23_05_23_49.3726007728 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..2025_02_23_05_23_49.3726007728/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..2025_02_23_05_23_49.841175008 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..2025_02_23_05_23_49.841175008/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.843437178 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.843437178/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/c6c0f2e7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/399edc97 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/8049f7cc not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/0cec5484 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/312446d0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c406,c828
Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/8e56a35d not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.133159589 not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.133159589/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Mar 11 00:54:02 crc restorecon[4743]:
/var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/2d30ddb9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c380,c909 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/eca8053d not reset as customized by admin to system_u:object_r:container_file_t:s0:c380,c909 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/c3a25c9a not reset as customized by admin to system_u:object_r:container_file_t:s0:c168,c522 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/b9609c22 not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/e8b0eca9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c106,c418 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/b36a9c3f not reset as customized by admin to system_u:object_r:container_file_t:s0:c529,c711 Mar 11 00:54:02 crc restorecon[4743]: 
/var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/38af7b07 not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/ae821620 not reset as customized by admin to system_u:object_r:container_file_t:s0:c106,c418 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/baa23338 not reset as customized by admin to system_u:object_r:container_file_t:s0:c529,c711 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/2c534809 not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3532625537 not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3532625537/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c661,c999 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/59b29eae not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c381 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/c91a8e4f not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c381 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/4d87494a not reset as customized by admin to system_u:object_r:container_file_t:s0:c442,c857 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/1e33ca63 not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/8dea7be2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/d0b04a99 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/d84f01e7 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c12,c18 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/4109059b not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/a7258a3e not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/05bdf2b6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/f3261b51 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/315d045e not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/5fdcf278 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/d053f757 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/c2850dc7 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..2025_02_23_05_22_30.2390596521 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..2025_02_23_05_22_30.2390596521/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/fcfb0b2b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/c7ac9b7d not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 00:54:02 crc 
restorecon[4743]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/fa0c0d52 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/c609b6ba not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/2be6c296 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/89a32653 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/4eb9afeb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/13af6efa not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/b03f9724 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/e3d105cc not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 11 00:54:02 crc restorecon[4743]: 
/var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/3aed4d83 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1906041176 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1906041176/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/0765fa6e not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/2cefc627 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c2,c18 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/3dcc6345 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/365af391 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-Default.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-TechPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-DevPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-TechPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 11 00:54:02 crc restorecon[4743]: 
/var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-DevPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-Default.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/b1130c0f not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/236a5913 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/b9432e26 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/5ddb0e3f not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/986dc4fd not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/8a23ff9a not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c9,c12 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/9728ae68 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/665f31d0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1255385357 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1255385357/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 11 00:54:02 crc restorecon[4743]: 
/var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_23_57.573792656 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_23_57.573792656/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_22_30.3254245399 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_22_30.3254245399/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 11 00:54:02 crc 
restorecon[4743]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/136c9b42 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/98a1575b not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/cac69136 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/5deb77a7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/2ae53400 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3608339744 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Mar 11 
00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3608339744/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/e46f2326 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/dc688d3c not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/3497c3cd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/177eb008 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c2,c13 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3819292994 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3819292994/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/af5a2afa not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/d780cb1f not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/49b0f374 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Mar 11 00:54:02 crc restorecon[4743]: 
/var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/26fbb125 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.3244779536 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.3244779536/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/cf14125a not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/b7f86972 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c5,c11 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/e51d739c not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/88ba6a69 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/669a9acf not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/5cd51231 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/75349ec7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/15c26839 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/45023dcd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/2bb66a50 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/64d03bdd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 11 00:54:02 crc 
restorecon[4743]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/ab8e7ca0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/bb9be25f not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.2034221258 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.2034221258/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/9a0b61d3 not reset as customized by admin 
to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/d471b9d2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/8cb76b8e not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/11a00840 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/ec355a92 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/992f735e not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1782968797 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1782968797/config.yaml not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/d59cdbbc not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/72133ff0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/c56c834c not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/d13724c7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/0a498258 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Mar 11 00:54:02 crc restorecon[4743]: 
/var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fa471982 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fc900d92 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fa7d68da not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/4bacf9b4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/424021b1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/fc2e31a3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/f51eefac not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/c8997f2f not reset as customized 
by admin to system_u:object_r:container_file_t:s0:c9,c22 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/7481f599 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..2025_02_23_05_22_49.2255460704 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..2025_02_23_05_22_49.2255460704/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/fdafea19 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Mar 11 00:54:02 crc restorecon[4743]: 
/var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/d0e1c571 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/ee398915 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/682bb6b8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/a3e67855 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/a989f289 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/915431bd not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/7796fdab not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/dcdb5f19 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/a3aaa88c not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/5508e3e6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/160585de not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/e99f8da3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/8bc85570 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/a5861c91 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/84db1135 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/9e1a6043 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/c1aba1c2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/d55ccd6d not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Mar 11 00:54:02 crc restorecon[4743]: 
/var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/971cc9f6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/8f2e3dcf not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/ceb35e9c not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/1c192745 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/5209e501 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/f83de4df not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/e7b978ac not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/c64304a1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/5384386b not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/etc-hosts not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c268,c620 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/multus-admission-controller/cce3e3ff not reset as customized by admin to system_u:object_r:container_file_t:s0:c435,c756 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/multus-admission-controller/8fb75465 not reset as customized by admin to system_u:object_r:container_file_t:s0:c268,c620 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/kube-rbac-proxy/740f573e not reset as customized by admin to system_u:object_r:container_file_t:s0:c435,c756 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/kube-rbac-proxy/32fd1134 not reset as customized by admin to system_u:object_r:container_file_t:s0:c268,c620 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/0a861bd3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/80363026 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/bfa952a8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c129,c158 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_23_05_33_31.2122464563 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_23_05_33_31.2122464563/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config/..2025_02_23_05_33_31.333075221 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Mar 11 00:54:02 crc 
restorecon[4743]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/793bf43d not reset as customized by admin to system_u:object_r:container_file_t:s0:c381,c387 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/7db1bb6e not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/4f6a0368 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/c12c7d86 not reset as customized by admin to system_u:object_r:container_file_t:s0:c381,c387 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/36c4a773 not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/4c1e98ae not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/a4c8115c not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/setup/7db1802e not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Mar 11 00:54:02 crc restorecon[4743]: 
/var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver/a008a7ab not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-cert-syncer/2c836bac not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-cert-regeneration-controller/0ce62299 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-insecure-readyz/945d2457 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-check-endpoints/7d5c1dd8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/advanced-cluster-management not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/advanced-cluster-management/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-broker-rhel8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-broker-rhel8/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-online not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 00:54:02 crc restorecon[4743]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-online/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams-console not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams-console/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq7-interconnect-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq7-interconnect-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-automation-platform-operator not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-automation-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-cloud-addons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-cloud-addons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry-3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry-3/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-load-balancer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-load-balancer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-businessautomation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-businessautomation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator/index.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/businessautomation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/businessautomation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cephcsi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cephcsi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cincinnati-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cincinnati-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-kube-descheduler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-kube-descheduler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-logging not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-logging/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/compliance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/compliance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/container-security-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/container-security-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/costmanagement-metrics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/costmanagement-metrics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cryostat-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cryostat-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datagrid not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datagrid/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devspaces not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devspaces/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devworkspace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devworkspace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dpu-network-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dpu-network-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eap not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eap/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-dns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-dns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/file-integrity-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/file-integrity-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-apicurito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-apicurito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-console not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-console/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-online not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-online/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gatekeeper-operator-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gatekeeper-operator-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jws-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jws-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management-hub not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management-hub/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kiali-ossm not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kiali-ossm/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubevirt-hyperconverged not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubevirt-hyperconverged/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logic-operator-rhel8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logic-operator-rhel8/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lvms-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lvms-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mcg-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mcg-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mta-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mta-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtr-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtr-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-client-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-client-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-csi-addons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-csi-addons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-multicluster-orchestrator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-multicluster-orchestrator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-prometheus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-prometheus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-hub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-hub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/bundle-v1.15.0.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/channel.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/package.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-custom-metrics-autoscaler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-custom-metrics-autoscaler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-gitops-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-gitops-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-pipelines-operator-rh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-pipelines-operator-rh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-secondary-scheduler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-secondary-scheduler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-bridge-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-bridge-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/recipe not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/recipe/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-camel-k not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-camel-k/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-hawtio-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-hawtio-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redhat-oadp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redhat-oadp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rh-service-binding-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rh-service-binding-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhacs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhacs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhbk-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhbk-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhdh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhdh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 00:54:02
crc restorecon[4743]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-prometheus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-prometheus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhpam-kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhpam-kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhsso-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhsso-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rook-ceph-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rook-ceph-operator/catalog.json not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/run-once-duration-override-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/run-once-duration-override-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sandboxed-containers-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sandboxed-containers-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/security-profiles-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/security-profiles-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 00:54:02 crc restorecon[4743]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/serverless-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/serverless-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-registry-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-registry-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator3 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator3/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/submariner not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/submariner/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tang-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tang-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 00:54:02 crc restorecon[4743]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustee-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustee-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volsync-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volsync-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/web-terminal not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/web-terminal/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/bc8d0691 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/6b76097a not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/34d1af30 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/312ba61c not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/645d5dd1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/16e825f0 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/4cf51fc9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/2a23d348 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/075dbd49 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c842,c986 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/dd585ddd not reset as customized by admin to system_u:object_r:container_file_t:s0:c377,c642 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/17ebd0ab not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c343 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/005579f4 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c842,c986 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_23_05_23_11.449897510 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_23_05_23_11.449897510/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_23_11.1287037894 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c764,c897 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..2025_02_23_05_23_11.1301053334 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..2025_02_23_05_23_11.1301053334/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/bf5f3b9c not reset as customized by admin to system_u:object_r:container_file_t:s0:c49,c263 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/af276eb7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c701 Mar 11 00:54:02 crc restorecon[4743]: 
/var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/ea28e322 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/692e6683 not reset as customized by admin to system_u:object_r:container_file_t:s0:c49,c263 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/871746a7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c701 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/4eb2e958 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..2025_02_24_06_09_06.2875086261 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..2025_02_24_06_09_06.2875086261/console-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/console-config.yaml not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_09_06.286118152 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_09_06.286118152/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..2025_02_24_06_09_06.3865795478 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 11 00:54:02 crc restorecon[4743]: 
/var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..2025_02_24_06_09_06.3865795478/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..2025_02_24_06_09_06.584414814 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..2025_02_24_06_09_06.584414814/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 11 00:54:02 crc restorecon[4743]: 
/var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/containers/console/ca9b62da not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/containers/console/0edd6fce not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837 not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.openshift-global-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 11 00:54:02 crc restorecon[4743]: 
/var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.openshift-global-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.1071801880 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c14,c22 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.1071801880/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..2025_02_24_06_20_07.2494444877 not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..2025_02_24_06_20_07.2494444877/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/tls-ca-bundle.pem not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c14,c22 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/containers/controller-manager/89b4555f not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..2025_02_23_05_23_22.4071100442 not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..2025_02_23_05_23_22.4071100442/Corefile not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/Corefile not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/655fcd71 not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c457,c841 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/0d43c002 not reset as customized by admin to system_u:object_r:container_file_t:s0:c55,c1022 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/e68efd17 not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/9acf9b65 not reset as customized by admin to system_u:object_r:container_file_t:s0:c457,c841 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/5ae3ff11 not reset as customized by admin to system_u:object_r:container_file_t:s0:c55,c1022 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/1e59206a not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/27af16d1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c304,c1017 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/7918e729 not reset as customized by admin to system_u:object_r:container_file_t:s0:c853,c893 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/5d976d0e not reset as customized by admin to system_u:object_r:container_file_t:s0:c585,c981 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c5,c25 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..2025_02_23_05_38_56.1112187283 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..2025_02_23_05_38_56.1112187283/controller-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/controller-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_38_56.2839772658 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_38_56.2839772658/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c5,c25 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/d7f55cbb not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/f0812073 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/1a56cbeb not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/7fdd437e not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/cdfb5652 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_24_06_17_29.3844392896 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c336,c787 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_24_06_17_29.3844392896/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..2025_02_24_06_17_29.848549803 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..2025_02_24_06_17_29.848549803/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 
Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..2025_02_24_06_17_29.780046231 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..2025_02_24_06_17_29.780046231/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 11 
00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 11 00:54:02 crc restorecon[4743]: 
/var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_17_29.2729721485 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_17_29.2729721485/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/fix-audit-permissions/fb93119e not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/openshift-apiserver/f1e8fc0e not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/openshift-apiserver-check-endpoints/218511f3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 11 00:54:02 crc restorecon[4743]: 
/var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs/k8s-webhook-server not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs/k8s-webhook-server/serving-certs not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/ca8af7b3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/72cc8a75 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/6e8a3760 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..2025_02_23_05_27_30.557428972 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Mar 11 00:54:02 crc 
restorecon[4743]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..2025_02_23_05_27_30.557428972/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/4c3455c0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/2278acb0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/4b453e4f not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/3ec09bda not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c10,c16 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132/anchors not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132/anchors/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/anchors not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 00:54:02 crc restorecon[4743]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c10,c16 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/edk2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/edk2/cacerts.bin not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/java not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/java/cacerts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/openssl not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/openssl/ca-bundle.trust.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 00:54:02 crc 
restorecon[4743]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/email-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/objsign-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2ae6433e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fde84897.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/75680d2e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/openshift-service-serving-signer_1740288168.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/facfc4fa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8f5a969c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CFCA_EV_ROOT.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9ef4a08a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ingress-operator_1740288202.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2f332aed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/248c8271.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d10a21f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ACCVRAIZ1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a94d09e5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c9a4d3b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/40193066.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AC_RAIZ_FNMT-RCM.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cd8c0d63.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b936d1c6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CA_Disig_Root_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4fd49c6c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AC_RAIZ_FNMT-RCM_SERVIDORES_SEGUROS.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b81b93f0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f9a69fa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certigna.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b30d5fda.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ANF_Secure_Server_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b433981b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/93851c9e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9282e51c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e7dd1bc4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Actalis_Authentication_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/930ac5d2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f47b495.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e113c810.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5931b5bc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Commercial.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2b349938.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e48193cf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/302904dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a716d4ed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Networking.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/93bc0acc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/86212b19.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certigna_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Premium.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b727005e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dbc54cab.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f51bb24c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c28a8a30.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Premium_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9c8dfbd4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ccc52f49.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cb1c3204.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ce5e74ef.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fd08c599.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6d41d539.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fb5fa911.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e35234b1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8cb5ee0f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a7c655d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f8fc53da.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/de6d66f3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d41b5e2a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/41a3f684.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1df5a75f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_2011.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e36a6752.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b872f2b4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9576d26b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/228f89db.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_Root_CA_ECC_TLS_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fb717492.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2d21b73c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0b1b94ef.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/595e996b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_Root_CA_RSA_TLS_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9b46e03d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/128f4b91.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Buypass_Class_3_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/81f2d2b1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Autoridad_de_Certificacion_Firmaprofesional_CIF_A62634068.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3bde41ac.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d16a5865.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_EC-384_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/BJCA_Global_Root_CA1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0179095f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ffa7f1eb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9482e63a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d4dae3dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/BJCA_Global_Root_CA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3e359ba6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7e067d03.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/95aff9e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d7746a63.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Baltimore_CyberTrust_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/653b494a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3ad48a91.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Network_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Buypass_Class_2_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/54657681.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/82223c44.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e8de2f56.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2d9dafe4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d96b65e2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee64a828.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/40547a79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5a3f0ff8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a780d93.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/34d996fb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_ECC_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/eed8c118.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/89c02a45.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certainly_Root_R1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b1159c4c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_RSA_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d6325660.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d4c339cb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8312c4c1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certainly_Root_E1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8508e720.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5fdd185d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/48bec511.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/69105f4f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0b9bc432.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Network_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/32888f65.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_ECC_Root-01.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b03dec0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/219d9499.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_ECC_Root-02.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5acf816d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cbf06781.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_RSA_Root-01.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dc99f41e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_RSA_Root-02.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AAA_Certificate_Services.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/985c1f52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8794b4e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_BR_Root_CA_1_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e7c037b4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 11 00:54:02 crc restorecon[4743]:
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ef954a4e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_EV_Root_CA_1_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2add47b6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/90c5a3c8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_Root_Class_3_CA_2_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0f3e76e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/53a1b57a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 00:54:02 crc restorecon[4743]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_Root_Class_3_CA_2_EV_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5ad8a5d6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/68dd7389.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9d04f354.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 00:54:02 crc restorecon[4743]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d6437c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/062cdee6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bd43e1dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7f3d5d1d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c491639e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_E46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 00:54:02 crc restorecon[4743]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3513523f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/399e7759.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/feffd413.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d18e9066.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/607986c7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c90bc37d.0 not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1b0f7e5c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e08bfd1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dd8e9d41.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ed39abd0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a3418fda.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bc3f2570.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 00:54:02 crc restorecon[4743]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_High_Assurance_EV_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/244b5494.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/81b9768f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4be590e0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_TLS_ECC_P384_Root_G5.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9846683b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 00:54:02 crc restorecon[4743]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/252252d2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e8e7201.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ISRG_Root_X1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_TLS_RSA4096_Root_G5.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d52c538d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c44cc0c0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_R46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 00:54:02 crc restorecon[4743]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Trusted_Root_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/75d1b2ed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a2c66da8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ecccd8db.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust.net_Certification_Authority__2048_.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/aee5f10d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 00:54:02 crc restorecon[4743]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3e7271e8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0e59380.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4c3982f2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b99d060.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bf64f35b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0a775a30.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/002c0b4f.0 not reset 
as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cc450945.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_EC1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/106f3e4d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b3fb433b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4042bcee.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 00:54:02 crc 
restorecon[4743]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/02265526.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/455f1b52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0d69c7e1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9f727ac7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5e98733a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f0cd152c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 00:54:02 crc restorecon[4743]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dc4d6a89.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6187b673.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/FIRMAPROFESIONAL_CA_ROOT-A_WEB.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ba8887ce.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/068570d1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f081611a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/48a195d8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GDCA_TrustAUTH_R5_ROOT.pem 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0f6fa695.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ab59055e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b92fd57f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GLOBALTRUST_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fa5da96b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1ec40989.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7719f463.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 00:54:02 crc restorecon[4743]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1001acf7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f013ecaf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/626dceaf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c559d742.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1d3472b9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9479c8c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a81e292b.0 not reset as customized by admin 
to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4bfab552.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Go_Daddy_Class_2_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Sectigo_Public_Server_Authentication_Root_E46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Go_Daddy_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e071171e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/57bcb2da.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HARICA_TLS_ECC_Root_CA_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 
Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ab5346f4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5046c355.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HARICA_TLS_RSA_Root_CA_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/865fbdf9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/da0cfd1d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/85cde254.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hellenic_Academic_and_Research_Institutions_ECC_RootCA_2015.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 00:54:02 crc restorecon[4743]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cbb3f32b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SecureSign_RootCA11.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hellenic_Academic_and_Research_Institutions_RootCA_2015.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5860aaa6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/31188b5e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HiPKI_Root_CA_-_G1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c7f1359b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 00:54:02 crc restorecon[4743]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f15c80c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hongkong_Post_Root_CA_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/09789157.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ISRG_Root_X2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/18856ac4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e09d511.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/IdenTrust_Commercial_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 00:54:02 crc restorecon[4743]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cf701eeb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d06393bb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/IdenTrust_Public_Sector_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/10531352.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Izenpe.com.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SecureTrust_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0ed035a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 00:54:02 crc restorecon[4743]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsec_e-Szigno_Root_CA_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8160b96c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e8651083.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2c63f966.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_RootCA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsoft_ECC_Root_Certificate_Authority_2017.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d89cda1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 00:54:02 crc restorecon[4743]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/01419da9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_TLS_RSA_Root_CA_2022.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b7a5b843.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsoft_RSA_Root_Certificate_Authority_2017.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bf53fb88.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9591a472.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3afde786.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 00:54:02 crc restorecon[4743]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SwissSign_Gold_CA_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/NAVER_Global_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3fb36b73.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d39b0a2c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a89d74c2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cd58d51e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b7db1890.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 00:54:02 crc restorecon[4743]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/NetLock_Arany__Class_Gold__F__tan__s__tv__ny.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/988a38cb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/60afe812.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f39fc864.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5443e9e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/OISTE_WISeKey_Global_Root_GB_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e73d606e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 00:54:02 crc restorecon[4743]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dfc0fe80.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b66938e9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e1eab7c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/OISTE_WISeKey_Global_Root_GC_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/773e07ad.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c899c73.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d59297b8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ddcda989.0 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_1_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/749e9e03.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/52b525c7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_RootCA3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d7e8dc79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a819ef2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 00:54:02 crc restorecon[4743]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/08063a00.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b483515.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_2_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/064e0aa9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1f58a078.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6f7454b3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7fa05551.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_3.pem not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/76faf6c0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9339512a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f387163d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee37c333.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_3_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e18bfb83.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e442e424.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 00:54:02 crc restorecon[4743]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fe8a2cd8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/23f4c490.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5cd81ad7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_EV_Root_Certification_Authority_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f0c70a8d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7892ad52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SZAFIR_ROOT_CA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 00:54:02 crc restorecon[4743]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4f316efb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_EV_Root_Certification_Authority_RSA_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/06dc52d5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/583d0756.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Sectigo_Public_Server_Authentication_Root_R46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_Root_Certification_Authority_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0bf05006.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 00:54:02 crc restorecon[4743]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/88950faa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9046744a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c860d51.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_Root_Certification_Authority_RSA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6fa5da56.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/33ee480d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Secure_Global_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 00:54:02 crc restorecon[4743]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/63a2c897.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_TLS_ECC_Root_CA_2022.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bdacca6f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ff34af3f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dbff3a01.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_ECC_RootCA1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_Root_CA_-_C1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Class_2_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/406c9bb1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_ECC_Root_CA_-_C3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Services_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SwissSign_Silver_CA_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/99e1b953.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/T-TeleSec_GlobalRoot_Class_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/vTrus_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/T-TeleSec_GlobalRoot_Class_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/14bc7599.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TUBITAK_Kamu_SM_SSL_Kok_Sertifikasi_-_Surum_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TWCA_Global_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a3adc42.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TWCA_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f459871d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telekom_Security_TLS_ECC_Root_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_Root_CA_-_G1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telekom_Security_TLS_RSA_Root_2023.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TeliaSonera_Root_CA_v1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telia_Root_CA_v2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8f103249.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f058632f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca-certificates.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TrustAsia_Global_Root_CA_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9bf03295.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/98aaf404.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TrustAsia_Global_Root_CA_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1cef98f5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/073bfcc5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2923b3f9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f249de83.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/edcbddb5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_ECC_Root_CA_-_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_ECC_P256_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9b5697b0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1ae85e5e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b74d2bd5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_ECC_P384_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d887a5bb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9aef356c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TunTrust_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fd64f3fc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e13665f9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/UCA_Extended_Validation_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0f5dc4f3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/da7377f6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/UCA_Global_G2_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c01eb047.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/304d27c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ed858448.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/USERTrust_ECC_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f30dd6ad.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/04f60c28.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/vTrus_ECC_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/USERTrust_RSA_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fc5a8f99.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/35105088.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee532fd5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/XRamp_Global_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/706f604c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/76579174.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/certSIGN_ROOT_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d86cdd1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/882de061.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/certSIGN_ROOT_CA_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f618aec.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a9d40e02.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e-Szigno_Root_CA_2017.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e868b802.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/83e9984f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ePKI_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca6e4ad9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9d6523ce.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4b718d9b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/869fbf79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/containers/registry/f8d22bdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17
Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/6e8bbfac not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17
Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/54dd7996 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17
Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/a4f1bb05 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17
Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/207129da not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17
Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/c1df39e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17
Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/15b8f1cd not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17
Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3523263858 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3523263858/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..2025_02_23_05_27_49.3256605594 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..2025_02_23_05_27_49.3256605594/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/77bd6913 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/2382c1b1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/704ce128 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/70d16fe0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/bfb95535 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/57a8e8e2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404
Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3413793711 not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404
Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3413793711/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404
Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404
Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404
Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404
Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/1b9d3e5e not reset as customized by admin to system_u:object_r:container_file_t:s0:c107,c917
Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/fddb173c not reset as customized by admin to system_u:object_r:container_file_t:s0:c202,c983
Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/95d3c6c4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404
Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/bfb5fff5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/2aef40aa not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/c0391cad not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928
Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/1119e69d not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007
Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/660608b4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928
Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/8220bd53 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928
Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/cluster-policy-controller/85f99d5c not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007
Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/cluster-policy-controller/4b0225f6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928
Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-cert-syncer/9c2a3394 not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007
Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-cert-syncer/e820b243 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928
Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-recovery-controller/1ca52ea0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007
Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-recovery-controller/e6988e45 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928
Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..2025_02_24_06_09_21.2517297950 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..2025_02_24_06_09_21.2517297950/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/6655f00b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/98bc3986 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/08e3458a not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/2a191cb0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/6c4eeefb not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/f61a549c not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553
Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/hostpath-provisioner/24891863 not reset as customized by admin to system_u:object_r:container_file_t:s0:c37,c572
Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/hostpath-provisioner/fbdfd89c not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553
Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/liveness-probe/9b63b3bc not reset as customized by admin to system_u:object_r:container_file_t:s0:c37,c572
Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/liveness-probe/8acde6d6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553
Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/node-driver-registrar/59ecbba3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553
Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/csi-provisioner/685d4be3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553
Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/openshift-route-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/openshift-route-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/openshift-route-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/openshift-route-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Mar 11 00:54:02 crc restorecon[4743]:
/var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.2950937851 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.2950937851/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/containers/route-controller-manager/feaea55e not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abinitio-runtime-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abinitio-runtime-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/accuknox-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/accuknox-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aci-containers-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aci-containers-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 00:54:02 crc restorecon[4743]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airlock-microgateway not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airlock-microgateway/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ako-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ako-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloy not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloy/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anchore-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 
11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anchore-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-cloud-operator 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-cloud-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 00:54:02 crc restorecon[4743]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-dcap-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-dcap-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cfm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cfm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium-enterprise not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium-enterprise/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloud-native-postgresql not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloud-native-postgresql/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudera-streams-messaging-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 00:54:02 crc restorecon[4743]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudera-streams-messaging-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudnative-pg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudnative-pg/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cnfv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cnfv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/conjur-follower-operator not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/conjur-follower-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/coroot-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/coroot-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cte-k8s-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cte-k8s-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 00:54:02 crc restorecon[4743]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-deploy-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-deploy-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-release-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 00:54:02 crc restorecon[4743]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-release-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edb-hcp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edb-hcp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-eck-operator-certified not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-eck-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/federatorai-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/federatorai-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fujitsu-enterprise-postgres-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fujitsu-enterprise-postgres-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 00:54:02 crc restorecon[4743]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/function-mesh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/function-mesh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/harness-gitops-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/harness-gitops-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hcp-terraform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hcp-terraform-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hpe-ezmeral-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hpe-ezmeral-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-application-gateway-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-application-gateway-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 00:54:02 crc restorecon[4743]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-directory-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-directory-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-dr-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-dr-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 00:54:02 crc restorecon[4743]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-licensing-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-licensing-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-sds-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-sds-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infrastructure-asset-orchestrator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infrastructure-asset-orchestrator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 00:54:02 crc restorecon[4743]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-device-plugins-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-device-plugins-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-kubernetes-power-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-kubernetes-power-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-openshift-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-openshift-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8s-triliovault not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8s-triliovault/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 00:54:02 crc restorecon[4743]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-ati-updates not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-ati-updates/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-framework not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-framework/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-ingress not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-ingress/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-licensing not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-licensing/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-sso not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-sso/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-load-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-load-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-loadcore-agents not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 00:54:02 crc restorecon[4743]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-loadcore-agents/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nats-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nats-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nimbusmosaic-dusim not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nimbusmosaic-dusim/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-rest-api-browser-v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-rest-api-browser-v1/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 00:54:02 crc restorecon[4743]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-appsec not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-appsec/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-db/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-diagnostics not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-diagnostics/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-logging not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-logging/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-migration not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-migration/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-msg-broker not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-msg-broker/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-notifications not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 00:54:02 crc restorecon[4743]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-notifications/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-stats-dashboards not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-stats-dashboards/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-storage not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-storage/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-test-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-test-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-ui not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-ui/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-websocket-service not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-websocket-service/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kong-gateway-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kong-gateway-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubearmor-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubearmor-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 00:54:02 crc restorecon[4743]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lenovo-locd-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lenovo-locd-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memcached-operator-ogaye not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memcached-operator-ogaye/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memory-machine-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memory-machine-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 00:54:02 crc restorecon[4743]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-enterprise not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-enterprise/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netapp-spark-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netapp-spark-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 00:54:02 crc restorecon[4743]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-adm-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-adm-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-repository-ha-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 00:54:02 crc restorecon[4743]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-repository-ha-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nginx-ingress-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nginx-ingress-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nim-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nim-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxiq-operator-certified not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxiq-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxrm-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxrm-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odigos-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odigos-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/open-liberty-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/open-liberty-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 00:54:02 crc restorecon[4743]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftartifactoryha-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftartifactoryha-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftxray-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftxray-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/operator-certification-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/operator-certification-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 00:54:02 crc restorecon[4743]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pmem-csi-operator-os not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pmem-csi-operator-os/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo-certified not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-component-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-component-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 00:54:02 crc restorecon[4743]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-fabric-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-fabric-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sanstoragecsi-operator-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sanstoragecsi-operator-bundle/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/smilecdr-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/smilecdr-operator/catalog.json not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sriov-fec not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sriov-fec/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-commons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-commons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-zookeeper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-zookeeper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 00:54:02 crc restorecon[4743]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-tsc-client-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-tsc-client-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tawon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tawon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tigera-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tigera-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-secrets-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-secrets-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vcp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vcp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/webotx-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/webotx-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 00:54:02 crc restorecon[4743]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 00:54:02 
crc restorecon[4743]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 00:54:02 crc 
restorecon[4743]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/63709497 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/d966b7fd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/f5773757 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/81c9edb9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/57bf57ee not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/86f5e6aa not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/0aabe31d not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/d2af85c2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 00:54:02 crc restorecon[4743]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/09d157d9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 00:54:02 crc restorecon[4743]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acm-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 
Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acm-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acmpca-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acmpca-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigateway-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigateway-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigatewayv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigatewayv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 00:54:02 crc restorecon[4743]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-applicationautoscaling-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-applicationautoscaling-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-athena-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-athena-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudfront-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudfront-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudtrail-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 00:54:02 crc restorecon[4743]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudtrail-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatch-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatch-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatchlogs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatchlogs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-documentdb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-documentdb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 00:54:02 crc restorecon[4743]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-dynamodb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-dynamodb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ec2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ec2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecr-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecr-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecs-controller/catalog.json not reset as customized by admin 
to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-efs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-efs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eks-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eks-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elasticache-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elasticache-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elbv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 00:54:02 crc restorecon[4743]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elbv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-emrcontainers-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-emrcontainers-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eventbridge-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eventbridge-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-iam-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-iam-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kafka-controller 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kafka-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-keyspaces-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-keyspaces-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kinesis-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kinesis-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kms-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kms-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 00:54:02 crc restorecon[4743]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-lambda-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-lambda-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-memorydb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-memorydb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-mq-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-mq-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-networkfirewall-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-networkfirewall-controller/catalog.json not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-opensearchservice-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-opensearchservice-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-organizations-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-organizations-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-pipes-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-pipes-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-prometheusservice-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 00:54:02 crc restorecon[4743]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-prometheusservice-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-rds-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-rds-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-recyclebin-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-recyclebin-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 00:54:02 crc restorecon[4743]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53resolver-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53resolver-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-s3-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-s3-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sagemaker-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sagemaker-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-secretsmanager-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 00:54:02 crc restorecon[4743]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-secretsmanager-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ses-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ses-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sfn-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sfn-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sns-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sns-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sqs-controller not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sqs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ssm-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ssm-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-wafv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-wafv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 00:54:02 crc restorecon[4743]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airflow-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airflow-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloydb-omni-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloydb-omni-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alvearie-imaging-ingestion not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alvearie-imaging-ingestion/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amd-gpu-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amd-gpu-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/analytics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/analytics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/annotationlab not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/annotationlab/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-api-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 00:54:02 crc restorecon[4743]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-api-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apimatic-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apimatic-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/application-services-metering-operator not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/application-services-metering-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/argocd-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/argocd-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/assisted-service-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/assisted-service-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 00:54:02 crc restorecon[4743]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/automotive-infra not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/automotive-infra/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-efs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-efs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/awss3-operator-registry not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/awss3-operator-registry/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/azure-service-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/azure-service-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/beegfs-csi-driver-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/beegfs-csi-driver-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-k not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 00:54:02 crc restorecon[4743]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-k/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-karavan-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-karavan-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator-community not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator-community/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-utils-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-utils-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-aas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-aas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-impairment-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-impairment-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 00:54:02 crc restorecon[4743]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/codeflare-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/codeflare-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-kubevirt-hyperconverged not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-kubevirt-hyperconverged/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-trivy-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-trivy-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-windows-machine-config-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-windows-machine-config-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/customized-user-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/customized-user-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cxl-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cxl-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dapr-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dapr-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datatrucker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datatrucker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dbaas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dbaas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/debezium-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/debezium-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/deployment-validation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/deployment-validation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devopsinabox not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devopsinabox/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-amlen-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-amlen-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-che not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-che/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ecr-secret-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ecr-secret-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edp-keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edp-keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/egressip-ipam-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/egressip-ipam-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ember-csi-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ember-csi-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/etcd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/etcd/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eventing-kogito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eventing-kogito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-secrets-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-secrets-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flink-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flink-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8gb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8gb/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fossul-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fossul-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/github-arc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/github-arc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitops-primer not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitops-primer/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitwebhook-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitwebhook-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/global-load-balancer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/global-load-balancer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/grafana-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/grafana-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/group-sync-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/group-sync-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hawtio-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hawtio-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hedvig-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hedvig-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hive-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hive-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/horreum-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/horreum-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hyperfoil-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hyperfoil-bundle/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator-community not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator-community/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-spectrum-scale-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-spectrum-scale-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibmcloud-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibmcloud-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infinispan not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infinispan/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/integrity-shield-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/integrity-shield-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ipfs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ipfs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/istio-workspace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/istio-workspace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kaoto-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kaoto-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keda not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keda/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keepalived-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keepalived-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-permissions-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-permissions-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/klusterlet not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/klusterlet/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/koku-metrics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/koku-metrics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/konveyor-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/konveyor-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/korrel8r not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/korrel8r/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kuadrant-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kuadrant-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kube-green not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kube-green/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubernetes-imagepuller-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubernetes-imagepuller-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/l5-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/l5-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/layer7-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/layer7-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lbconfig-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lbconfig-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lib-bucket-provisioner not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lib-bucket-provisioner/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/limitador-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/limitador-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logging-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logging-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-helm-operator/catalog.json
not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mariadb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mariadb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marin3r not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 00:54:02 crc restorecon[4743]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marin3r/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mercury-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mercury-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/microcks not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/microcks/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/move2kube-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/move2kube-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multi-nic-cni-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multi-nic-cni-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-global-hub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-global-hub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 00:54:02 crc restorecon[4743]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-operators-subscription not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-operators-subscription/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/must-gather-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/must-gather-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/namespace-configuration-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/namespace-configuration-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ncn-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 00:54:02 crc restorecon[4743]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ncn-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ndmspc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ndmspc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator-m88i not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator-m88i/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nfs-provisioner-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nfs-provisioner-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nlp-server not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nlp-server/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 00:54:02 crc restorecon[4743]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-discovery-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-discovery-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nsm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nsm-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oadp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oadp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oci-ccm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oci-ccm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 00:54:02 crc restorecon[4743]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odoo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odoo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opendatahub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opendatahub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openebs not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openebs/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-nfd-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-nfd-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-node-upgrade-mutex-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-node-upgrade-mutex-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-qiskit-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-qiskit-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 00:54:02 crc restorecon[4743]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patch-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patch-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patterns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patterns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pelorus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pelorus-operator/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/percona-xtradb-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/percona-xtradb-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-essentials not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-essentials/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/postgresql not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/postgresql/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/proactive-node-scaling-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 00:54:02 crc restorecon[4743]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/proactive-node-scaling-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/project-quay not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/project-quay/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus-exporter-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus-exporter-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pulp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pulp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 00:54:02 crc restorecon[4743]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-messaging-topology-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-messaging-topology-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/reportportal-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/reportportal-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/resource-locker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 00:54:02 crc restorecon[4743]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/resource-locker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhoas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhoas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ripsaw not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ripsaw/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sailoperator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sailoperator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-commerce-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-commerce-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-data-intelligence-observer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-data-intelligence-observer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-hana-express-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-hana-express-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 00:54:02 crc restorecon[4743]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-binding-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-binding-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/shipwright-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/shipwright-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sigstore-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sigstore-helm-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snapscheduler not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snapscheduler/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snyk-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 00:54:02 crc restorecon[4743]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snyk-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/socmmd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/socmmd/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonar-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonar-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosivio not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosivio/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonataflow-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 00:54:02 crc 
restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonataflow-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosreport-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosreport-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/spark-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/spark-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/special-resource-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/special-resource-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/strimzi-kafka-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/strimzi-kafka-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/syndesis not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/syndesis/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 00:54:02 crc restorecon[4743]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tagger not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tagger/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tf-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tf-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 00:54:02 crc 
restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tidb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tidb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trident-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trident-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustify-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustify-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ucs-ci-solutions-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ucs-ci-solutions-operator/catalog.json not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/universal-crossplane not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/universal-crossplane/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/varnish-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/varnish-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-config-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-config-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/verticadb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 00:54:02 crc restorecon[4743]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/verticadb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volume-expander-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volume-expander-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/wandb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/wandb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/windup-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/windup-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yaks not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yaks/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/c0fe7256 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/c30319e4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/e6b1dd45 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/2bb643f0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/920de426 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/70fa1e87 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/a1c12a2f not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/9442e6c7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/5b45ec72 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abot-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abot-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 00:54:02 crc restorecon[4743]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 00:54:02 crc restorecon[4743]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 00:54:02 crc restorecon[4743]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/entando-k8s-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 00:54:02 crc restorecon[4743]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/entando-k8s-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator-rhmp not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-paygo-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-paygo-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 00:54:02 crc restorecon[4743]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-term-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-term-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/linstor-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 00:54:02 crc restorecon[4743]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/linstor-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 00:54:02 crc restorecon[4743]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-deploy-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 00:54:02 crc restorecon[4743]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-deploy-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-paygo-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-paygo-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 00:54:02 crc restorecon[4743]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vfunction-server-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vfunction-server-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yugabyte-platform-operator-bundle-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 00:54:02 crc restorecon[4743]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yugabyte-platform-operator-bundle-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 00:54:02 crc restorecon[4743]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 00:54:03 crc restorecon[4743]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 00:54:03 crc restorecon[4743]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 00:54:03 crc restorecon[4743]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 00:54:03 crc restorecon[4743]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 00:54:03 crc restorecon[4743]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 00:54:03 crc restorecon[4743]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 11 00:54:03 crc restorecon[4743]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 00:54:03 crc restorecon[4743]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 00:54:03 crc restorecon[4743]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 00:54:03 crc restorecon[4743]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 00:54:03 crc restorecon[4743]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 00:54:03 crc restorecon[4743]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 00:54:03 crc restorecon[4743]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 00:54:03 crc restorecon[4743]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 00:54:03 crc restorecon[4743]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/3c9f3a59 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 00:54:03 crc restorecon[4743]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/1091c11b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 00:54:03 crc restorecon[4743]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/9a6821c6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 00:54:03 crc restorecon[4743]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/ec0c35e2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 00:54:03 crc restorecon[4743]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/517f37e7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 00:54:03 crc restorecon[4743]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/6214fe78 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 00:54:03 crc restorecon[4743]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/ba189c8b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 00:54:03 crc restorecon[4743]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/351e4f31 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 00:54:03 crc restorecon[4743]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/c0f219ff not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 00:54:03 crc restorecon[4743]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/etc-hosts 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Mar 11 00:54:03 crc restorecon[4743]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/8069f607 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Mar 11 00:54:03 crc restorecon[4743]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/559c3d82 not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Mar 11 00:54:03 crc restorecon[4743]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/605ad488 not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Mar 11 00:54:03 crc restorecon[4743]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/148df488 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Mar 11 00:54:03 crc restorecon[4743]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/3bf6dcb4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Mar 11 00:54:03 crc restorecon[4743]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/022a2feb not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Mar 11 00:54:03 crc restorecon[4743]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/938c3924 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Mar 11 00:54:03 crc restorecon[4743]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/729fe23e not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Mar 11 00:54:03 crc restorecon[4743]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/1fd5cbd4 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c247,c522 Mar 11 00:54:03 crc restorecon[4743]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/a96697e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Mar 11 00:54:03 crc restorecon[4743]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/e155ddca not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Mar 11 00:54:03 crc restorecon[4743]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/10dd0e0f not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Mar 11 00:54:03 crc restorecon[4743]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 11 00:54:03 crc restorecon[4743]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..2025_02_24_06_09_35.3018472960 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 11 00:54:03 crc restorecon[4743]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..2025_02_24_06_09_35.3018472960/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 11 00:54:03 crc restorecon[4743]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 11 00:54:03 crc restorecon[4743]: 
/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 11 00:54:03 crc restorecon[4743]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 11 00:54:03 crc restorecon[4743]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..2025_02_24_06_09_35.4262376737 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 11 00:54:03 crc restorecon[4743]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..2025_02_24_06_09_35.4262376737/audit.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 11 00:54:03 crc restorecon[4743]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 11 00:54:03 crc restorecon[4743]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/audit.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 11 00:54:03 crc restorecon[4743]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 11 00:54:03 crc restorecon[4743]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..2025_02_24_06_09_35.2630275752 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 11 00:54:03 crc 
restorecon[4743]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..2025_02_24_06_09_35.2630275752/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 11 00:54:03 crc restorecon[4743]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 11 00:54:03 crc restorecon[4743]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 11 00:54:03 crc restorecon[4743]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 11 00:54:03 crc restorecon[4743]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..2025_02_24_06_09_35.2376963788 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 11 00:54:03 crc restorecon[4743]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..2025_02_24_06_09_35.2376963788/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 11 00:54:03 crc restorecon[4743]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 11 00:54:03 crc restorecon[4743]: 
/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 11 00:54:03 crc restorecon[4743]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 11 00:54:03 crc restorecon[4743]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/containers/oauth-openshift/6f2c8392 not reset as customized by admin to system_u:object_r:container_file_t:s0:c267,c588 Mar 11 00:54:03 crc restorecon[4743]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/containers/oauth-openshift/bd241ad9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 11 00:54:03 crc restorecon[4743]: /var/lib/kubelet/plugins not reset as customized by admin to system_u:object_r:container_file_t:s0 Mar 11 00:54:03 crc restorecon[4743]: /var/lib/kubelet/plugins/csi-hostpath not reset as customized by admin to system_u:object_r:container_file_t:s0 Mar 11 00:54:03 crc restorecon[4743]: /var/lib/kubelet/plugins/csi-hostpath/csi.sock not reset as customized by admin to system_u:object_r:container_file_t:s0 Mar 11 00:54:03 crc restorecon[4743]: /var/lib/kubelet/plugins/kubernetes.io not reset as customized by admin to system_u:object_r:container_file_t:s0 Mar 11 00:54:03 crc restorecon[4743]: /var/lib/kubelet/plugins/kubernetes.io/csi not reset as customized by admin to system_u:object_r:container_file_t:s0 Mar 11 00:54:03 crc restorecon[4743]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner not reset as customized by admin to system_u:object_r:container_file_t:s0 Mar 11 00:54:03 crc restorecon[4743]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983 not reset as customized by admin to 
system_u:object_r:container_file_t:s0 Mar 11 00:54:03 crc restorecon[4743]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount not reset as customized by admin to system_u:object_r:container_file_t:s0 Mar 11 00:54:03 crc restorecon[4743]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/vol_data.json not reset as customized by admin to system_u:object_r:container_file_t:s0 Mar 11 00:54:03 crc restorecon[4743]: /var/lib/kubelet/plugins_registry not reset as customized by admin to system_u:object_r:container_file_t:s0 Mar 11 00:54:03 crc restorecon[4743]: Relabeled /var/usrlocal/bin/kubenswrapper from system_u:object_r:bin_t:s0 to system_u:object_r:kubelet_exec_t:s0 Mar 11 00:54:03 crc kubenswrapper[4744]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Mar 11 00:54:03 crc kubenswrapper[4744]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version. Mar 11 00:54:03 crc kubenswrapper[4744]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Mar 11 00:54:03 crc kubenswrapper[4744]: Flag --register-with-taints has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. 
Mar 11 00:54:03 crc kubenswrapper[4744]: Flag --pod-infra-container-image has been deprecated, will be removed in a future release. Image garbage collector will get sandbox image information from CRI. Mar 11 00:54:03 crc kubenswrapper[4744]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Mar 11 00:54:03 crc kubenswrapper[4744]: I0311 00:54:03.660345 4744 server.go:211] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime" Mar 11 00:54:03 crc kubenswrapper[4744]: W0311 00:54:03.665734 4744 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Mar 11 00:54:03 crc kubenswrapper[4744]: W0311 00:54:03.665768 4744 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Mar 11 00:54:03 crc kubenswrapper[4744]: W0311 00:54:03.665778 4744 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Mar 11 00:54:03 crc kubenswrapper[4744]: W0311 00:54:03.665786 4744 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Mar 11 00:54:03 crc kubenswrapper[4744]: W0311 00:54:03.665797 4744 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Mar 11 00:54:03 crc kubenswrapper[4744]: W0311 00:54:03.665807 4744 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Mar 11 00:54:03 crc kubenswrapper[4744]: W0311 00:54:03.665816 4744 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Mar 11 00:54:03 crc kubenswrapper[4744]: W0311 00:54:03.665825 4744 feature_gate.go:330] unrecognized feature gate: NewOLM Mar 11 00:54:03 crc kubenswrapper[4744]: W0311 00:54:03.665834 4744 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Mar 11 00:54:03 crc kubenswrapper[4744]: W0311 
00:54:03.665842 4744 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Mar 11 00:54:03 crc kubenswrapper[4744]: W0311 00:54:03.665851 4744 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Mar 11 00:54:03 crc kubenswrapper[4744]: W0311 00:54:03.665859 4744 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Mar 11 00:54:03 crc kubenswrapper[4744]: W0311 00:54:03.665867 4744 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Mar 11 00:54:03 crc kubenswrapper[4744]: W0311 00:54:03.665874 4744 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Mar 11 00:54:03 crc kubenswrapper[4744]: W0311 00:54:03.665882 4744 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Mar 11 00:54:03 crc kubenswrapper[4744]: W0311 00:54:03.665892 4744 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. Mar 11 00:54:03 crc kubenswrapper[4744]: W0311 00:54:03.665903 4744 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Mar 11 00:54:03 crc kubenswrapper[4744]: W0311 00:54:03.665912 4744 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Mar 11 00:54:03 crc kubenswrapper[4744]: W0311 00:54:03.665920 4744 feature_gate.go:330] unrecognized feature gate: PlatformOperators Mar 11 00:54:03 crc kubenswrapper[4744]: W0311 00:54:03.665928 4744 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Mar 11 00:54:03 crc kubenswrapper[4744]: W0311 00:54:03.665937 4744 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Mar 11 00:54:03 crc kubenswrapper[4744]: W0311 00:54:03.665970 4744 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Mar 11 00:54:03 crc kubenswrapper[4744]: W0311 00:54:03.665979 4744 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Mar 11 00:54:03 crc kubenswrapper[4744]: W0311 
00:54:03.665989 4744 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Mar 11 00:54:03 crc kubenswrapper[4744]: W0311 00:54:03.665997 4744 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Mar 11 00:54:03 crc kubenswrapper[4744]: W0311 00:54:03.666005 4744 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Mar 11 00:54:03 crc kubenswrapper[4744]: W0311 00:54:03.666013 4744 feature_gate.go:330] unrecognized feature gate: PinnedImages Mar 11 00:54:03 crc kubenswrapper[4744]: W0311 00:54:03.666021 4744 feature_gate.go:330] unrecognized feature gate: OVNObservability Mar 11 00:54:03 crc kubenswrapper[4744]: W0311 00:54:03.666029 4744 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Mar 11 00:54:03 crc kubenswrapper[4744]: W0311 00:54:03.666037 4744 feature_gate.go:330] unrecognized feature gate: SignatureStores Mar 11 00:54:03 crc kubenswrapper[4744]: W0311 00:54:03.666044 4744 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Mar 11 00:54:03 crc kubenswrapper[4744]: W0311 00:54:03.666052 4744 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Mar 11 00:54:03 crc kubenswrapper[4744]: W0311 00:54:03.666059 4744 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Mar 11 00:54:03 crc kubenswrapper[4744]: W0311 00:54:03.666067 4744 feature_gate.go:330] unrecognized feature gate: GatewayAPI Mar 11 00:54:03 crc kubenswrapper[4744]: W0311 00:54:03.666075 4744 feature_gate.go:330] unrecognized feature gate: InsightsConfig Mar 11 00:54:03 crc kubenswrapper[4744]: W0311 00:54:03.666083 4744 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Mar 11 00:54:03 crc kubenswrapper[4744]: W0311 00:54:03.666090 4744 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Mar 11 00:54:03 crc kubenswrapper[4744]: W0311 00:54:03.666098 4744 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Mar 11 00:54:03 
crc kubenswrapper[4744]: W0311 00:54:03.666107 4744 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Mar 11 00:54:03 crc kubenswrapper[4744]: W0311 00:54:03.666116 4744 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Mar 11 00:54:03 crc kubenswrapper[4744]: W0311 00:54:03.666124 4744 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Mar 11 00:54:03 crc kubenswrapper[4744]: W0311 00:54:03.666131 4744 feature_gate.go:330] unrecognized feature gate: Example Mar 11 00:54:03 crc kubenswrapper[4744]: W0311 00:54:03.666139 4744 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Mar 11 00:54:03 crc kubenswrapper[4744]: W0311 00:54:03.666147 4744 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Mar 11 00:54:03 crc kubenswrapper[4744]: W0311 00:54:03.666154 4744 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Mar 11 00:54:03 crc kubenswrapper[4744]: W0311 00:54:03.666163 4744 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Mar 11 00:54:03 crc kubenswrapper[4744]: W0311 00:54:03.666171 4744 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Mar 11 00:54:03 crc kubenswrapper[4744]: W0311 00:54:03.666180 4744 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Mar 11 00:54:03 crc kubenswrapper[4744]: W0311 00:54:03.666189 4744 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Mar 11 00:54:03 crc kubenswrapper[4744]: W0311 00:54:03.666197 4744 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Mar 11 00:54:03 crc kubenswrapper[4744]: W0311 00:54:03.666207 4744 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. 
Mar 11 00:54:03 crc kubenswrapper[4744]: W0311 00:54:03.666218 4744 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Mar 11 00:54:03 crc kubenswrapper[4744]: W0311 00:54:03.666229 4744 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Mar 11 00:54:03 crc kubenswrapper[4744]: W0311 00:54:03.666238 4744 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Mar 11 00:54:03 crc kubenswrapper[4744]: W0311 00:54:03.666246 4744 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Mar 11 00:54:03 crc kubenswrapper[4744]: W0311 00:54:03.666253 4744 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Mar 11 00:54:03 crc kubenswrapper[4744]: W0311 00:54:03.666261 4744 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Mar 11 00:54:03 crc kubenswrapper[4744]: W0311 00:54:03.666272 4744 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Mar 11 00:54:03 crc kubenswrapper[4744]: W0311 00:54:03.666280 4744 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Mar 11 00:54:03 crc kubenswrapper[4744]: W0311 00:54:03.666288 4744 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Mar 11 00:54:03 crc kubenswrapper[4744]: W0311 00:54:03.666295 4744 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Mar 11 00:54:03 crc kubenswrapper[4744]: W0311 00:54:03.666305 4744 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. 
Mar 11 00:54:03 crc kubenswrapper[4744]: W0311 00:54:03.666314 4744 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Mar 11 00:54:03 crc kubenswrapper[4744]: W0311 00:54:03.666323 4744 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Mar 11 00:54:03 crc kubenswrapper[4744]: W0311 00:54:03.666331 4744 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Mar 11 00:54:03 crc kubenswrapper[4744]: W0311 00:54:03.666339 4744 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Mar 11 00:54:03 crc kubenswrapper[4744]: W0311 00:54:03.666347 4744 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Mar 11 00:54:03 crc kubenswrapper[4744]: W0311 00:54:03.666356 4744 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Mar 11 00:54:03 crc kubenswrapper[4744]: W0311 00:54:03.666364 4744 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Mar 11 00:54:03 crc kubenswrapper[4744]: W0311 00:54:03.666372 4744 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Mar 11 00:54:03 crc kubenswrapper[4744]: W0311 00:54:03.666380 4744 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Mar 11 00:54:03 crc kubenswrapper[4744]: I0311 00:54:03.667670 4744 flags.go:64] FLAG: --address="0.0.0.0" Mar 11 00:54:03 crc kubenswrapper[4744]: I0311 00:54:03.667701 4744 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]" Mar 11 00:54:03 crc kubenswrapper[4744]: I0311 00:54:03.667723 4744 flags.go:64] FLAG: --anonymous-auth="true" Mar 11 00:54:03 crc kubenswrapper[4744]: I0311 00:54:03.667736 4744 flags.go:64] FLAG: --application-metrics-count-limit="100" Mar 11 00:54:03 crc kubenswrapper[4744]: I0311 00:54:03.667749 4744 flags.go:64] FLAG: --authentication-token-webhook="false" Mar 11 00:54:03 crc kubenswrapper[4744]: I0311 00:54:03.667760 4744 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s" Mar 11 00:54:03 crc 
kubenswrapper[4744]: I0311 00:54:03.667773 4744 flags.go:64] FLAG: --authorization-mode="AlwaysAllow" Mar 11 00:54:03 crc kubenswrapper[4744]: I0311 00:54:03.667788 4744 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s" Mar 11 00:54:03 crc kubenswrapper[4744]: I0311 00:54:03.667799 4744 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s" Mar 11 00:54:03 crc kubenswrapper[4744]: I0311 00:54:03.667810 4744 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id" Mar 11 00:54:03 crc kubenswrapper[4744]: I0311 00:54:03.667821 4744 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig" Mar 11 00:54:03 crc kubenswrapper[4744]: I0311 00:54:03.667832 4744 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki" Mar 11 00:54:03 crc kubenswrapper[4744]: I0311 00:54:03.667842 4744 flags.go:64] FLAG: --cgroup-driver="cgroupfs" Mar 11 00:54:03 crc kubenswrapper[4744]: I0311 00:54:03.667854 4744 flags.go:64] FLAG: --cgroup-root="" Mar 11 00:54:03 crc kubenswrapper[4744]: I0311 00:54:03.667864 4744 flags.go:64] FLAG: --cgroups-per-qos="true" Mar 11 00:54:03 crc kubenswrapper[4744]: I0311 00:54:03.667875 4744 flags.go:64] FLAG: --client-ca-file="" Mar 11 00:54:03 crc kubenswrapper[4744]: I0311 00:54:03.667885 4744 flags.go:64] FLAG: --cloud-config="" Mar 11 00:54:03 crc kubenswrapper[4744]: I0311 00:54:03.667895 4744 flags.go:64] FLAG: --cloud-provider="" Mar 11 00:54:03 crc kubenswrapper[4744]: I0311 00:54:03.667905 4744 flags.go:64] FLAG: --cluster-dns="[]" Mar 11 00:54:03 crc kubenswrapper[4744]: I0311 00:54:03.667917 4744 flags.go:64] FLAG: --cluster-domain="" Mar 11 00:54:03 crc kubenswrapper[4744]: I0311 00:54:03.667927 4744 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf" Mar 11 00:54:03 crc kubenswrapper[4744]: I0311 00:54:03.667938 4744 flags.go:64] FLAG: --config-dir="" Mar 11 00:54:03 crc kubenswrapper[4744]: I0311 00:54:03.667948 4744 flags.go:64] FLAG: 
--container-hints="/etc/cadvisor/container_hints.json" Mar 11 00:54:03 crc kubenswrapper[4744]: I0311 00:54:03.667960 4744 flags.go:64] FLAG: --container-log-max-files="5" Mar 11 00:54:03 crc kubenswrapper[4744]: I0311 00:54:03.667972 4744 flags.go:64] FLAG: --container-log-max-size="10Mi" Mar 11 00:54:03 crc kubenswrapper[4744]: I0311 00:54:03.667983 4744 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock" Mar 11 00:54:03 crc kubenswrapper[4744]: I0311 00:54:03.667994 4744 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock" Mar 11 00:54:03 crc kubenswrapper[4744]: I0311 00:54:03.668005 4744 flags.go:64] FLAG: --containerd-namespace="k8s.io" Mar 11 00:54:03 crc kubenswrapper[4744]: I0311 00:54:03.668015 4744 flags.go:64] FLAG: --contention-profiling="false" Mar 11 00:54:03 crc kubenswrapper[4744]: I0311 00:54:03.668026 4744 flags.go:64] FLAG: --cpu-cfs-quota="true" Mar 11 00:54:03 crc kubenswrapper[4744]: I0311 00:54:03.668036 4744 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms" Mar 11 00:54:03 crc kubenswrapper[4744]: I0311 00:54:03.668047 4744 flags.go:64] FLAG: --cpu-manager-policy="none" Mar 11 00:54:03 crc kubenswrapper[4744]: I0311 00:54:03.668058 4744 flags.go:64] FLAG: --cpu-manager-policy-options="" Mar 11 00:54:03 crc kubenswrapper[4744]: I0311 00:54:03.668071 4744 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s" Mar 11 00:54:03 crc kubenswrapper[4744]: I0311 00:54:03.668081 4744 flags.go:64] FLAG: --enable-controller-attach-detach="true" Mar 11 00:54:03 crc kubenswrapper[4744]: I0311 00:54:03.668092 4744 flags.go:64] FLAG: --enable-debugging-handlers="true" Mar 11 00:54:03 crc kubenswrapper[4744]: I0311 00:54:03.668102 4744 flags.go:64] FLAG: --enable-load-reader="false" Mar 11 00:54:03 crc kubenswrapper[4744]: I0311 00:54:03.668114 4744 flags.go:64] FLAG: --enable-server="true" Mar 11 00:54:03 crc kubenswrapper[4744]: I0311 00:54:03.668124 4744 flags.go:64] FLAG: --enforce-node-allocatable="[pods]" Mar 11 
00:54:03 crc kubenswrapper[4744]: I0311 00:54:03.668138 4744 flags.go:64] FLAG: --event-burst="100" Mar 11 00:54:03 crc kubenswrapper[4744]: I0311 00:54:03.668147 4744 flags.go:64] FLAG: --event-qps="50" Mar 11 00:54:03 crc kubenswrapper[4744]: I0311 00:54:03.668156 4744 flags.go:64] FLAG: --event-storage-age-limit="default=0" Mar 11 00:54:03 crc kubenswrapper[4744]: I0311 00:54:03.668165 4744 flags.go:64] FLAG: --event-storage-event-limit="default=0" Mar 11 00:54:03 crc kubenswrapper[4744]: I0311 00:54:03.668173 4744 flags.go:64] FLAG: --eviction-hard="" Mar 11 00:54:03 crc kubenswrapper[4744]: I0311 00:54:03.668184 4744 flags.go:64] FLAG: --eviction-max-pod-grace-period="0" Mar 11 00:54:03 crc kubenswrapper[4744]: I0311 00:54:03.668194 4744 flags.go:64] FLAG: --eviction-minimum-reclaim="" Mar 11 00:54:03 crc kubenswrapper[4744]: I0311 00:54:03.668203 4744 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s" Mar 11 00:54:03 crc kubenswrapper[4744]: I0311 00:54:03.668212 4744 flags.go:64] FLAG: --eviction-soft="" Mar 11 00:54:03 crc kubenswrapper[4744]: I0311 00:54:03.668221 4744 flags.go:64] FLAG: --eviction-soft-grace-period="" Mar 11 00:54:03 crc kubenswrapper[4744]: I0311 00:54:03.668230 4744 flags.go:64] FLAG: --exit-on-lock-contention="false" Mar 11 00:54:03 crc kubenswrapper[4744]: I0311 00:54:03.668239 4744 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false" Mar 11 00:54:03 crc kubenswrapper[4744]: I0311 00:54:03.668248 4744 flags.go:64] FLAG: --experimental-mounter-path="" Mar 11 00:54:03 crc kubenswrapper[4744]: I0311 00:54:03.668256 4744 flags.go:64] FLAG: --fail-cgroupv1="false" Mar 11 00:54:03 crc kubenswrapper[4744]: I0311 00:54:03.668265 4744 flags.go:64] FLAG: --fail-swap-on="true" Mar 11 00:54:03 crc kubenswrapper[4744]: I0311 00:54:03.668274 4744 flags.go:64] FLAG: --feature-gates="" Mar 11 00:54:03 crc kubenswrapper[4744]: I0311 00:54:03.668285 4744 flags.go:64] FLAG: --file-check-frequency="20s" Mar 11 00:54:03 crc 
kubenswrapper[4744]: I0311 00:54:03.668294 4744 flags.go:64] FLAG: --global-housekeeping-interval="1m0s" Mar 11 00:54:03 crc kubenswrapper[4744]: I0311 00:54:03.668303 4744 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge" Mar 11 00:54:03 crc kubenswrapper[4744]: I0311 00:54:03.668313 4744 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1" Mar 11 00:54:03 crc kubenswrapper[4744]: I0311 00:54:03.668321 4744 flags.go:64] FLAG: --healthz-port="10248" Mar 11 00:54:03 crc kubenswrapper[4744]: I0311 00:54:03.668330 4744 flags.go:64] FLAG: --help="false" Mar 11 00:54:03 crc kubenswrapper[4744]: I0311 00:54:03.668342 4744 flags.go:64] FLAG: --hostname-override="" Mar 11 00:54:03 crc kubenswrapper[4744]: I0311 00:54:03.668350 4744 flags.go:64] FLAG: --housekeeping-interval="10s" Mar 11 00:54:03 crc kubenswrapper[4744]: I0311 00:54:03.668359 4744 flags.go:64] FLAG: --http-check-frequency="20s" Mar 11 00:54:03 crc kubenswrapper[4744]: I0311 00:54:03.668374 4744 flags.go:64] FLAG: --image-credential-provider-bin-dir="" Mar 11 00:54:03 crc kubenswrapper[4744]: I0311 00:54:03.668383 4744 flags.go:64] FLAG: --image-credential-provider-config="" Mar 11 00:54:03 crc kubenswrapper[4744]: I0311 00:54:03.668392 4744 flags.go:64] FLAG: --image-gc-high-threshold="85" Mar 11 00:54:03 crc kubenswrapper[4744]: I0311 00:54:03.668400 4744 flags.go:64] FLAG: --image-gc-low-threshold="80" Mar 11 00:54:03 crc kubenswrapper[4744]: I0311 00:54:03.668409 4744 flags.go:64] FLAG: --image-service-endpoint="" Mar 11 00:54:03 crc kubenswrapper[4744]: I0311 00:54:03.668418 4744 flags.go:64] FLAG: --kernel-memcg-notification="false" Mar 11 00:54:03 crc kubenswrapper[4744]: I0311 00:54:03.668427 4744 flags.go:64] FLAG: --kube-api-burst="100" Mar 11 00:54:03 crc kubenswrapper[4744]: I0311 00:54:03.668436 4744 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf" Mar 11 00:54:03 crc kubenswrapper[4744]: I0311 00:54:03.668445 4744 flags.go:64] FLAG: --kube-api-qps="50" Mar 
11 00:54:03 crc kubenswrapper[4744]: I0311 00:54:03.668455 4744 flags.go:64] FLAG: --kube-reserved="" Mar 11 00:54:03 crc kubenswrapper[4744]: I0311 00:54:03.668464 4744 flags.go:64] FLAG: --kube-reserved-cgroup="" Mar 11 00:54:03 crc kubenswrapper[4744]: I0311 00:54:03.668473 4744 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig" Mar 11 00:54:03 crc kubenswrapper[4744]: I0311 00:54:03.668484 4744 flags.go:64] FLAG: --kubelet-cgroups="" Mar 11 00:54:03 crc kubenswrapper[4744]: I0311 00:54:03.668493 4744 flags.go:64] FLAG: --local-storage-capacity-isolation="true" Mar 11 00:54:03 crc kubenswrapper[4744]: I0311 00:54:03.668502 4744 flags.go:64] FLAG: --lock-file="" Mar 11 00:54:03 crc kubenswrapper[4744]: I0311 00:54:03.668540 4744 flags.go:64] FLAG: --log-cadvisor-usage="false" Mar 11 00:54:03 crc kubenswrapper[4744]: I0311 00:54:03.668550 4744 flags.go:64] FLAG: --log-flush-frequency="5s" Mar 11 00:54:03 crc kubenswrapper[4744]: I0311 00:54:03.668560 4744 flags.go:64] FLAG: --log-json-info-buffer-size="0" Mar 11 00:54:03 crc kubenswrapper[4744]: I0311 00:54:03.668574 4744 flags.go:64] FLAG: --log-json-split-stream="false" Mar 11 00:54:03 crc kubenswrapper[4744]: I0311 00:54:03.668583 4744 flags.go:64] FLAG: --log-text-info-buffer-size="0" Mar 11 00:54:03 crc kubenswrapper[4744]: I0311 00:54:03.668592 4744 flags.go:64] FLAG: --log-text-split-stream="false" Mar 11 00:54:03 crc kubenswrapper[4744]: I0311 00:54:03.668601 4744 flags.go:64] FLAG: --logging-format="text" Mar 11 00:54:03 crc kubenswrapper[4744]: I0311 00:54:03.668611 4744 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id" Mar 11 00:54:03 crc kubenswrapper[4744]: I0311 00:54:03.668620 4744 flags.go:64] FLAG: --make-iptables-util-chains="true" Mar 11 00:54:03 crc kubenswrapper[4744]: I0311 00:54:03.668630 4744 flags.go:64] FLAG: --manifest-url="" Mar 11 00:54:03 crc kubenswrapper[4744]: I0311 00:54:03.668638 4744 flags.go:64] FLAG: --manifest-url-header="" Mar 11 
00:54:03 crc kubenswrapper[4744]: I0311 00:54:03.668650 4744 flags.go:64] FLAG: --max-housekeeping-interval="15s" Mar 11 00:54:03 crc kubenswrapper[4744]: I0311 00:54:03.668659 4744 flags.go:64] FLAG: --max-open-files="1000000" Mar 11 00:54:03 crc kubenswrapper[4744]: I0311 00:54:03.668671 4744 flags.go:64] FLAG: --max-pods="110" Mar 11 00:54:03 crc kubenswrapper[4744]: I0311 00:54:03.668680 4744 flags.go:64] FLAG: --maximum-dead-containers="-1" Mar 11 00:54:03 crc kubenswrapper[4744]: I0311 00:54:03.668690 4744 flags.go:64] FLAG: --maximum-dead-containers-per-container="1" Mar 11 00:54:03 crc kubenswrapper[4744]: I0311 00:54:03.668700 4744 flags.go:64] FLAG: --memory-manager-policy="None" Mar 11 00:54:03 crc kubenswrapper[4744]: I0311 00:54:03.668711 4744 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s" Mar 11 00:54:03 crc kubenswrapper[4744]: I0311 00:54:03.668722 4744 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s" Mar 11 00:54:03 crc kubenswrapper[4744]: I0311 00:54:03.668732 4744 flags.go:64] FLAG: --node-ip="192.168.126.11" Mar 11 00:54:03 crc kubenswrapper[4744]: I0311 00:54:03.668742 4744 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/control-plane=,node-role.kubernetes.io/master=,node.openshift.io/os_id=rhcos" Mar 11 00:54:03 crc kubenswrapper[4744]: I0311 00:54:03.668763 4744 flags.go:64] FLAG: --node-status-max-images="50" Mar 11 00:54:03 crc kubenswrapper[4744]: I0311 00:54:03.668772 4744 flags.go:64] FLAG: --node-status-update-frequency="10s" Mar 11 00:54:03 crc kubenswrapper[4744]: I0311 00:54:03.668782 4744 flags.go:64] FLAG: --oom-score-adj="-999" Mar 11 00:54:03 crc kubenswrapper[4744]: I0311 00:54:03.668791 4744 flags.go:64] FLAG: --pod-cidr="" Mar 11 00:54:03 crc kubenswrapper[4744]: I0311 00:54:03.668800 4744 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:33549946e22a9ffa738fd94b1345f90921bc8f92fa6137784cb33c77ad806f9d" Mar 11 00:54:03 crc kubenswrapper[4744]: 
I0311 00:54:03.668815 4744 flags.go:64] FLAG: --pod-manifest-path="" Mar 11 00:54:03 crc kubenswrapper[4744]: I0311 00:54:03.668824 4744 flags.go:64] FLAG: --pod-max-pids="-1" Mar 11 00:54:03 crc kubenswrapper[4744]: I0311 00:54:03.668834 4744 flags.go:64] FLAG: --pods-per-core="0" Mar 11 00:54:03 crc kubenswrapper[4744]: I0311 00:54:03.668842 4744 flags.go:64] FLAG: --port="10250" Mar 11 00:54:03 crc kubenswrapper[4744]: I0311 00:54:03.668853 4744 flags.go:64] FLAG: --protect-kernel-defaults="false" Mar 11 00:54:03 crc kubenswrapper[4744]: I0311 00:54:03.668862 4744 flags.go:64] FLAG: --provider-id="" Mar 11 00:54:03 crc kubenswrapper[4744]: I0311 00:54:03.668871 4744 flags.go:64] FLAG: --qos-reserved="" Mar 11 00:54:03 crc kubenswrapper[4744]: I0311 00:54:03.668905 4744 flags.go:64] FLAG: --read-only-port="10255" Mar 11 00:54:03 crc kubenswrapper[4744]: I0311 00:54:03.668914 4744 flags.go:64] FLAG: --register-node="true" Mar 11 00:54:03 crc kubenswrapper[4744]: I0311 00:54:03.668924 4744 flags.go:64] FLAG: --register-schedulable="true" Mar 11 00:54:03 crc kubenswrapper[4744]: I0311 00:54:03.668932 4744 flags.go:64] FLAG: --register-with-taints="node-role.kubernetes.io/master=:NoSchedule" Mar 11 00:54:03 crc kubenswrapper[4744]: I0311 00:54:03.668947 4744 flags.go:64] FLAG: --registry-burst="10" Mar 11 00:54:03 crc kubenswrapper[4744]: I0311 00:54:03.668956 4744 flags.go:64] FLAG: --registry-qps="5" Mar 11 00:54:03 crc kubenswrapper[4744]: I0311 00:54:03.668965 4744 flags.go:64] FLAG: --reserved-cpus="" Mar 11 00:54:03 crc kubenswrapper[4744]: I0311 00:54:03.668973 4744 flags.go:64] FLAG: --reserved-memory="" Mar 11 00:54:03 crc kubenswrapper[4744]: I0311 00:54:03.668984 4744 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf" Mar 11 00:54:03 crc kubenswrapper[4744]: I0311 00:54:03.668993 4744 flags.go:64] FLAG: --root-dir="/var/lib/kubelet" Mar 11 00:54:03 crc kubenswrapper[4744]: I0311 00:54:03.669002 4744 flags.go:64] FLAG: --rotate-certificates="false" Mar 11 
00:54:03 crc kubenswrapper[4744]: I0311 00:54:03.669011 4744 flags.go:64] FLAG: --rotate-server-certificates="false" Mar 11 00:54:03 crc kubenswrapper[4744]: I0311 00:54:03.669020 4744 flags.go:64] FLAG: --runonce="false" Mar 11 00:54:03 crc kubenswrapper[4744]: I0311 00:54:03.669028 4744 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service" Mar 11 00:54:03 crc kubenswrapper[4744]: I0311 00:54:03.669037 4744 flags.go:64] FLAG: --runtime-request-timeout="2m0s" Mar 11 00:54:03 crc kubenswrapper[4744]: I0311 00:54:03.669046 4744 flags.go:64] FLAG: --seccomp-default="false" Mar 11 00:54:03 crc kubenswrapper[4744]: I0311 00:54:03.669055 4744 flags.go:64] FLAG: --serialize-image-pulls="true" Mar 11 00:54:03 crc kubenswrapper[4744]: I0311 00:54:03.669067 4744 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s" Mar 11 00:54:03 crc kubenswrapper[4744]: I0311 00:54:03.669076 4744 flags.go:64] FLAG: --storage-driver-db="cadvisor" Mar 11 00:54:03 crc kubenswrapper[4744]: I0311 00:54:03.669085 4744 flags.go:64] FLAG: --storage-driver-host="localhost:8086" Mar 11 00:54:03 crc kubenswrapper[4744]: I0311 00:54:03.669094 4744 flags.go:64] FLAG: --storage-driver-password="root" Mar 11 00:54:03 crc kubenswrapper[4744]: I0311 00:54:03.669103 4744 flags.go:64] FLAG: --storage-driver-secure="false" Mar 11 00:54:03 crc kubenswrapper[4744]: I0311 00:54:03.669112 4744 flags.go:64] FLAG: --storage-driver-table="stats" Mar 11 00:54:03 crc kubenswrapper[4744]: I0311 00:54:03.669120 4744 flags.go:64] FLAG: --storage-driver-user="root" Mar 11 00:54:03 crc kubenswrapper[4744]: I0311 00:54:03.669130 4744 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s" Mar 11 00:54:03 crc kubenswrapper[4744]: I0311 00:54:03.669139 4744 flags.go:64] FLAG: --sync-frequency="1m0s" Mar 11 00:54:03 crc kubenswrapper[4744]: I0311 00:54:03.669148 4744 flags.go:64] FLAG: --system-cgroups="" Mar 11 00:54:03 crc kubenswrapper[4744]: I0311 00:54:03.669157 4744 flags.go:64] FLAG: 
--system-reserved="cpu=200m,ephemeral-storage=350Mi,memory=350Mi" Mar 11 00:54:03 crc kubenswrapper[4744]: I0311 00:54:03.669171 4744 flags.go:64] FLAG: --system-reserved-cgroup="" Mar 11 00:54:03 crc kubenswrapper[4744]: I0311 00:54:03.669180 4744 flags.go:64] FLAG: --tls-cert-file="" Mar 11 00:54:03 crc kubenswrapper[4744]: I0311 00:54:03.669188 4744 flags.go:64] FLAG: --tls-cipher-suites="[]" Mar 11 00:54:03 crc kubenswrapper[4744]: I0311 00:54:03.669199 4744 flags.go:64] FLAG: --tls-min-version="" Mar 11 00:54:03 crc kubenswrapper[4744]: I0311 00:54:03.669207 4744 flags.go:64] FLAG: --tls-private-key-file="" Mar 11 00:54:03 crc kubenswrapper[4744]: I0311 00:54:03.669217 4744 flags.go:64] FLAG: --topology-manager-policy="none" Mar 11 00:54:03 crc kubenswrapper[4744]: I0311 00:54:03.669226 4744 flags.go:64] FLAG: --topology-manager-policy-options="" Mar 11 00:54:03 crc kubenswrapper[4744]: I0311 00:54:03.669235 4744 flags.go:64] FLAG: --topology-manager-scope="container" Mar 11 00:54:03 crc kubenswrapper[4744]: I0311 00:54:03.669244 4744 flags.go:64] FLAG: --v="2" Mar 11 00:54:03 crc kubenswrapper[4744]: I0311 00:54:03.669256 4744 flags.go:64] FLAG: --version="false" Mar 11 00:54:03 crc kubenswrapper[4744]: I0311 00:54:03.669302 4744 flags.go:64] FLAG: --vmodule="" Mar 11 00:54:03 crc kubenswrapper[4744]: I0311 00:54:03.669313 4744 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec" Mar 11 00:54:03 crc kubenswrapper[4744]: I0311 00:54:03.669323 4744 flags.go:64] FLAG: --volume-stats-agg-period="1m0s" Mar 11 00:54:03 crc kubenswrapper[4744]: W0311 00:54:03.669564 4744 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Mar 11 00:54:03 crc kubenswrapper[4744]: W0311 00:54:03.669578 4744 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Mar 11 00:54:03 crc kubenswrapper[4744]: W0311 00:54:03.669588 4744 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. 
It will be removed in a future release. Mar 11 00:54:03 crc kubenswrapper[4744]: W0311 00:54:03.669598 4744 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Mar 11 00:54:03 crc kubenswrapper[4744]: W0311 00:54:03.669607 4744 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Mar 11 00:54:03 crc kubenswrapper[4744]: W0311 00:54:03.669615 4744 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Mar 11 00:54:03 crc kubenswrapper[4744]: W0311 00:54:03.669625 4744 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Mar 11 00:54:03 crc kubenswrapper[4744]: W0311 00:54:03.669633 4744 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Mar 11 00:54:03 crc kubenswrapper[4744]: W0311 00:54:03.669644 4744 feature_gate.go:330] unrecognized feature gate: Example Mar 11 00:54:03 crc kubenswrapper[4744]: W0311 00:54:03.669652 4744 feature_gate.go:330] unrecognized feature gate: PinnedImages Mar 11 00:54:03 crc kubenswrapper[4744]: W0311 00:54:03.669660 4744 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Mar 11 00:54:03 crc kubenswrapper[4744]: W0311 00:54:03.669668 4744 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Mar 11 00:54:03 crc kubenswrapper[4744]: W0311 00:54:03.669676 4744 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Mar 11 00:54:03 crc kubenswrapper[4744]: W0311 00:54:03.669683 4744 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Mar 11 00:54:03 crc kubenswrapper[4744]: W0311 00:54:03.669693 4744 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. 
Mar 11 00:54:03 crc kubenswrapper[4744]: W0311 00:54:03.669702 4744 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Mar 11 00:54:03 crc kubenswrapper[4744]: W0311 00:54:03.669712 4744 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Mar 11 00:54:03 crc kubenswrapper[4744]: W0311 00:54:03.669721 4744 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Mar 11 00:54:03 crc kubenswrapper[4744]: W0311 00:54:03.669729 4744 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Mar 11 00:54:03 crc kubenswrapper[4744]: W0311 00:54:03.669738 4744 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Mar 11 00:54:03 crc kubenswrapper[4744]: W0311 00:54:03.669746 4744 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Mar 11 00:54:03 crc kubenswrapper[4744]: W0311 00:54:03.669754 4744 feature_gate.go:330] unrecognized feature gate: SignatureStores Mar 11 00:54:03 crc kubenswrapper[4744]: W0311 00:54:03.669763 4744 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Mar 11 00:54:03 crc kubenswrapper[4744]: W0311 00:54:03.669772 4744 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Mar 11 00:54:03 crc kubenswrapper[4744]: W0311 00:54:03.669782 4744 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Mar 11 00:54:03 crc kubenswrapper[4744]: W0311 00:54:03.669793 4744 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Mar 11 00:54:03 crc kubenswrapper[4744]: W0311 00:54:03.669801 4744 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Mar 11 00:54:03 crc kubenswrapper[4744]: W0311 00:54:03.669810 4744 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Mar 11 00:54:03 crc kubenswrapper[4744]: W0311 00:54:03.669820 4744 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Mar 11 00:54:03 crc kubenswrapper[4744]: W0311 00:54:03.669829 4744 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Mar 11 00:54:03 crc kubenswrapper[4744]: W0311 00:54:03.669837 4744 feature_gate.go:330] unrecognized feature gate: OVNObservability Mar 11 00:54:03 crc kubenswrapper[4744]: W0311 00:54:03.669844 4744 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Mar 11 00:54:03 crc kubenswrapper[4744]: W0311 00:54:03.669852 4744 feature_gate.go:330] unrecognized feature gate: PlatformOperators Mar 11 00:54:03 crc kubenswrapper[4744]: W0311 00:54:03.669860 4744 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Mar 11 00:54:03 crc kubenswrapper[4744]: W0311 00:54:03.669867 4744 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Mar 11 00:54:03 crc kubenswrapper[4744]: W0311 00:54:03.669876 4744 feature_gate.go:330] unrecognized feature gate: InsightsConfig Mar 11 00:54:03 crc kubenswrapper[4744]: W0311 00:54:03.669885 4744 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Mar 11 00:54:03 crc kubenswrapper[4744]: W0311 00:54:03.669893 4744 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Mar 11 00:54:03 crc kubenswrapper[4744]: W0311 00:54:03.669901 4744 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Mar 11 00:54:03 crc kubenswrapper[4744]: 
W0311 00:54:03.669909 4744 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Mar 11 00:54:03 crc kubenswrapper[4744]: W0311 00:54:03.669920 4744 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Mar 11 00:54:03 crc kubenswrapper[4744]: W0311 00:54:03.669928 4744 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Mar 11 00:54:03 crc kubenswrapper[4744]: W0311 00:54:03.669936 4744 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Mar 11 00:54:03 crc kubenswrapper[4744]: W0311 00:54:03.669946 4744 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. Mar 11 00:54:03 crc kubenswrapper[4744]: W0311 00:54:03.669956 4744 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Mar 11 00:54:03 crc kubenswrapper[4744]: W0311 00:54:03.669964 4744 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Mar 11 00:54:03 crc kubenswrapper[4744]: W0311 00:54:03.669972 4744 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Mar 11 00:54:03 crc kubenswrapper[4744]: W0311 00:54:03.669980 4744 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Mar 11 00:54:03 crc kubenswrapper[4744]: W0311 00:54:03.669988 4744 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Mar 11 00:54:03 crc kubenswrapper[4744]: W0311 00:54:03.669996 4744 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Mar 11 00:54:03 crc kubenswrapper[4744]: W0311 00:54:03.670004 4744 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Mar 11 00:54:03 crc kubenswrapper[4744]: W0311 00:54:03.670012 4744 feature_gate.go:330] unrecognized feature gate: NewOLM Mar 11 00:54:03 crc kubenswrapper[4744]: W0311 00:54:03.670020 4744 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Mar 11 00:54:03 crc kubenswrapper[4744]: W0311 00:54:03.670027 4744 feature_gate.go:330] unrecognized feature 
gate: IngressControllerDynamicConfigurationManager
Mar 11 00:54:03 crc kubenswrapper[4744]: W0311 00:54:03.670035 4744 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes
Mar 11 00:54:03 crc kubenswrapper[4744]: W0311 00:54:03.670042 4744 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController
Mar 11 00:54:03 crc kubenswrapper[4744]: W0311 00:54:03.670050 4744 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles
Mar 11 00:54:03 crc kubenswrapper[4744]: W0311 00:54:03.670058 4744 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota
Mar 11 00:54:03 crc kubenswrapper[4744]: W0311 00:54:03.670066 4744 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS
Mar 11 00:54:03 crc kubenswrapper[4744]: W0311 00:54:03.670073 4744 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs
Mar 11 00:54:03 crc kubenswrapper[4744]: W0311 00:54:03.670081 4744 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities
Mar 11 00:54:03 crc kubenswrapper[4744]: W0311 00:54:03.670088 4744 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration
Mar 11 00:54:03 crc kubenswrapper[4744]: W0311 00:54:03.670096 4744 feature_gate.go:330] unrecognized feature gate: GatewayAPI
Mar 11 00:54:03 crc kubenswrapper[4744]: W0311 00:54:03.670104 4744 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration
Mar 11 00:54:03 crc kubenswrapper[4744]: W0311 00:54:03.670113 4744 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI
Mar 11 00:54:03 crc kubenswrapper[4744]: W0311 00:54:03.670129 4744 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation
Mar 11 00:54:03 crc kubenswrapper[4744]: W0311 00:54:03.670137 4744 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS
Mar 11 00:54:03 crc kubenswrapper[4744]: W0311 00:54:03.670145 4744 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization
Mar 11 00:54:03 crc kubenswrapper[4744]: W0311 00:54:03.670152 4744 feature_gate.go:330] unrecognized feature gate: UpgradeStatus
Mar 11 00:54:03 crc kubenswrapper[4744]: W0311 00:54:03.670160 4744 feature_gate.go:330] unrecognized feature gate: HardwareSpeed
Mar 11 00:54:03 crc kubenswrapper[4744]: W0311 00:54:03.670168 4744 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup
Mar 11 00:54:03 crc kubenswrapper[4744]: I0311 00:54:03.670193 4744 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]}
Mar 11 00:54:03 crc kubenswrapper[4744]: I0311 00:54:03.684938 4744 server.go:491] "Kubelet version" kubeletVersion="v1.31.5"
Mar 11 00:54:03 crc kubenswrapper[4744]: I0311 00:54:03.685022 4744 server.go:493] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK=""
Mar 11 00:54:03 crc kubenswrapper[4744]: W0311 00:54:03.685207 4744 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release.
Mar 11 00:54:03 crc kubenswrapper[4744]: W0311 00:54:03.685248 4744 feature_gate.go:330] unrecognized feature gate: SignatureStores
Mar 11 00:54:03 crc kubenswrapper[4744]: W0311 00:54:03.685261 4744 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement
Mar 11 00:54:03 crc kubenswrapper[4744]: W0311 00:54:03.685274 4744 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor
Mar 11 00:54:03 crc kubenswrapper[4744]: W0311 00:54:03.685289 4744 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform
Mar 11 00:54:03 crc kubenswrapper[4744]: W0311 00:54:03.685300 4744 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation
Mar 11 00:54:03 crc kubenswrapper[4744]: W0311 00:54:03.685310 4744 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy
Mar 11 00:54:03 crc kubenswrapper[4744]: W0311 00:54:03.685320 4744 feature_gate.go:330] unrecognized feature gate: ManagedBootImages
Mar 11 00:54:03 crc kubenswrapper[4744]: W0311 00:54:03.685331 4744 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather
Mar 11 00:54:03 crc kubenswrapper[4744]: W0311 00:54:03.685342 4744 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall
Mar 11 00:54:03 crc kubenswrapper[4744]: W0311 00:54:03.685352 4744 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS
Mar 11 00:54:03 crc kubenswrapper[4744]: W0311 00:54:03.685362 4744 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS
Mar 11 00:54:03 crc kubenswrapper[4744]: W0311 00:54:03.685372 4744 feature_gate.go:330] unrecognized feature gate: ExternalOIDC
Mar 11 00:54:03 crc kubenswrapper[4744]: W0311 00:54:03.685381 4744 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet
Mar 11 00:54:03 crc kubenswrapper[4744]: W0311 00:54:03.685391 4744 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs
Mar 11 00:54:03 crc kubenswrapper[4744]: W0311 00:54:03.685402 4744 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification
Mar 11 00:54:03 crc kubenswrapper[4744]: W0311 00:54:03.685412 4744 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController
Mar 11 00:54:03 crc kubenswrapper[4744]: W0311 00:54:03.685421 4744 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion
Mar 11 00:54:03 crc kubenswrapper[4744]: W0311 00:54:03.685433 4744 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization
Mar 11 00:54:03 crc kubenswrapper[4744]: W0311 00:54:03.685446 4744 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets
Mar 11 00:54:03 crc kubenswrapper[4744]: W0311 00:54:03.685458 4744 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks
Mar 11 00:54:03 crc kubenswrapper[4744]: W0311 00:54:03.685474 4744 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release.
Mar 11 00:54:03 crc kubenswrapper[4744]: W0311 00:54:03.685488 4744 feature_gate.go:330] unrecognized feature gate: HardwareSpeed
Mar 11 00:54:03 crc kubenswrapper[4744]: W0311 00:54:03.685501 4744 feature_gate.go:330] unrecognized feature gate: NewOLM
Mar 11 00:54:03 crc kubenswrapper[4744]: W0311 00:54:03.685542 4744 feature_gate.go:330] unrecognized feature gate: GatewayAPI
Mar 11 00:54:03 crc kubenswrapper[4744]: W0311 00:54:03.685555 4744 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot
Mar 11 00:54:03 crc kubenswrapper[4744]: W0311 00:54:03.685567 4744 feature_gate.go:330] unrecognized feature gate: Example
Mar 11 00:54:03 crc kubenswrapper[4744]: W0311 00:54:03.685578 4744 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes
Mar 11 00:54:03 crc kubenswrapper[4744]: W0311 00:54:03.685587 4744 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup
Mar 11 00:54:03 crc kubenswrapper[4744]: W0311 00:54:03.685601 4744 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release.
Mar 11 00:54:03 crc kubenswrapper[4744]: W0311 00:54:03.685613 4744 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Mar 11 00:54:03 crc kubenswrapper[4744]: W0311 00:54:03.685625 4744 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity
Mar 11 00:54:03 crc kubenswrapper[4744]: W0311 00:54:03.685636 4744 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB
Mar 11 00:54:03 crc kubenswrapper[4744]: W0311 00:54:03.685647 4744 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure
Mar 11 00:54:03 crc kubenswrapper[4744]: W0311 00:54:03.685660 4744 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy
Mar 11 00:54:03 crc kubenswrapper[4744]: W0311 00:54:03.685671 4744 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration
Mar 11 00:54:03 crc kubenswrapper[4744]: W0311 00:54:03.685681 4744 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation
Mar 11 00:54:03 crc kubenswrapper[4744]: W0311 00:54:03.685691 4744 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags
Mar 11 00:54:03 crc kubenswrapper[4744]: W0311 00:54:03.685701 4744 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements
Mar 11 00:54:03 crc kubenswrapper[4744]: W0311 00:54:03.685710 4744 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration
Mar 11 00:54:03 crc kubenswrapper[4744]: W0311 00:54:03.685724 4744 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Mar 11 00:54:03 crc kubenswrapper[4744]: W0311 00:54:03.685737 4744 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission
Mar 11 00:54:03 crc kubenswrapper[4744]: W0311 00:54:03.685747 4744 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration
Mar 11 00:54:03 crc kubenswrapper[4744]: W0311 00:54:03.685757 4744 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes
Mar 11 00:54:03 crc kubenswrapper[4744]: W0311 00:54:03.685768 4744 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS
Mar 11 00:54:03 crc kubenswrapper[4744]: W0311 00:54:03.685778 4744 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters
Mar 11 00:54:03 crc kubenswrapper[4744]: W0311 00:54:03.685788 4744 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer
Mar 11 00:54:03 crc kubenswrapper[4744]: W0311 00:54:03.685799 4744 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles
Mar 11 00:54:03 crc kubenswrapper[4744]: W0311 00:54:03.685809 4744 feature_gate.go:330] unrecognized feature gate: InsightsConfig
Mar 11 00:54:03 crc kubenswrapper[4744]: W0311 00:54:03.685819 4744 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Mar 11 00:54:03 crc kubenswrapper[4744]: W0311 00:54:03.685829 4744 feature_gate.go:330] unrecognized feature gate: OnClusterBuild
Mar 11 00:54:03 crc kubenswrapper[4744]: W0311 00:54:03.685839 4744 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource
Mar 11 00:54:03 crc kubenswrapper[4744]: W0311 00:54:03.685851 4744 feature_gate.go:330] unrecognized feature gate: PinnedImages
Mar 11 00:54:03 crc kubenswrapper[4744]: W0311 00:54:03.685861 4744 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud
Mar 11 00:54:03 crc kubenswrapper[4744]: W0311 00:54:03.685875 4744 feature_gate.go:330] unrecognized feature gate: DNSNameResolver
Mar 11 00:54:03 crc kubenswrapper[4744]: W0311 00:54:03.685885 4744 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI
Mar 11 00:54:03 crc kubenswrapper[4744]: W0311 00:54:03.685895 4744 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig
Mar 11 00:54:03 crc kubenswrapper[4744]: W0311 00:54:03.685905 4744 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics
Mar 11 00:54:03 crc kubenswrapper[4744]: W0311 00:54:03.685915 4744 feature_gate.go:330] unrecognized feature gate: UpgradeStatus
Mar 11 00:54:03 crc kubenswrapper[4744]: W0311 00:54:03.685925 4744 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS
Mar 11 00:54:03 crc kubenswrapper[4744]: W0311 00:54:03.685937 4744 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS
Mar 11 00:54:03 crc kubenswrapper[4744]: W0311 00:54:03.685948 4744 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP
Mar 11 00:54:03 crc kubenswrapper[4744]: W0311 00:54:03.685958 4744 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota
Mar 11 00:54:03 crc kubenswrapper[4744]: W0311 00:54:03.685967 4744 feature_gate.go:330] unrecognized feature gate: OVNObservability
Mar 11 00:54:03 crc kubenswrapper[4744]: W0311 00:54:03.685978 4744 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Mar 11 00:54:03 crc kubenswrapper[4744]: W0311 00:54:03.685989 4744 feature_gate.go:330] unrecognized feature gate: PlatformOperators
Mar 11 00:54:03 crc kubenswrapper[4744]: W0311 00:54:03.685999 4744 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS
Mar 11 00:54:03 crc kubenswrapper[4744]: W0311 00:54:03.686009 4744 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack
Mar 11 00:54:03 crc kubenswrapper[4744]: W0311 00:54:03.686019 4744 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig
Mar 11 00:54:03 crc kubenswrapper[4744]: W0311 00:54:03.686029 4744 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities
Mar 11 00:54:03 crc kubenswrapper[4744]: W0311 00:54:03.686041 4744 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode
Mar 11 00:54:03 crc kubenswrapper[4744]: I0311 00:54:03.686059 4744 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]}
Mar 11 00:54:03 crc kubenswrapper[4744]: W0311 00:54:03.686359 4744 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB
Mar 11 00:54:03 crc kubenswrapper[4744]: W0311 00:54:03.686383 4744 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy
Mar 11 00:54:03 crc kubenswrapper[4744]: W0311 00:54:03.686395 4744 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup
Mar 11 00:54:03 crc kubenswrapper[4744]: W0311 00:54:03.686407 4744 feature_gate.go:330] unrecognized feature gate: PinnedImages
Mar 11 00:54:03 crc kubenswrapper[4744]: W0311 00:54:03.686420 4744 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud
Mar 11 00:54:03 crc kubenswrapper[4744]: W0311 00:54:03.686431 4744 feature_gate.go:330] unrecognized feature gate: OnClusterBuild
Mar 11 00:54:03 crc kubenswrapper[4744]: W0311 00:54:03.686442 4744 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration
Mar 11 00:54:03 crc kubenswrapper[4744]: W0311 00:54:03.686454 4744 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks
Mar 11 00:54:03 crc kubenswrapper[4744]: W0311 00:54:03.686464 4744 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS
Mar 11 00:54:03 crc kubenswrapper[4744]: W0311 00:54:03.686476 4744 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements
Mar 11 00:54:03 crc kubenswrapper[4744]: W0311 00:54:03.686486 4744 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification
Mar 11 00:54:03 crc kubenswrapper[4744]: W0311 00:54:03.686496 4744 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes
Mar 11 00:54:03 crc kubenswrapper[4744]: W0311 00:54:03.686506 4744 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource
Mar 11 00:54:03 crc kubenswrapper[4744]: W0311 00:54:03.686548 4744 feature_gate.go:330] unrecognized feature gate: DNSNameResolver
Mar 11 00:54:03 crc kubenswrapper[4744]: W0311 00:54:03.686559 4744 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS
Mar 11 00:54:03 crc kubenswrapper[4744]: W0311 00:54:03.686569 4744 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation
Mar 11 00:54:03 crc kubenswrapper[4744]: W0311 00:54:03.686583 4744 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release.
Mar 11 00:54:03 crc kubenswrapper[4744]: W0311 00:54:03.686598 4744 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles
Mar 11 00:54:03 crc kubenswrapper[4744]: W0311 00:54:03.686610 4744 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform
Mar 11 00:54:03 crc kubenswrapper[4744]: W0311 00:54:03.686621 4744 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes
Mar 11 00:54:03 crc kubenswrapper[4744]: W0311 00:54:03.686632 4744 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig
Mar 11 00:54:03 crc kubenswrapper[4744]: W0311 00:54:03.686642 4744 feature_gate.go:330] unrecognized feature gate: PlatformOperators
Mar 11 00:54:03 crc kubenswrapper[4744]: W0311 00:54:03.686653 4744 feature_gate.go:330] unrecognized feature gate: HardwareSpeed
Mar 11 00:54:03 crc kubenswrapper[4744]: W0311 00:54:03.686666 4744 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release.
Mar 11 00:54:03 crc kubenswrapper[4744]: W0311 00:54:03.686679 4744 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet
Mar 11 00:54:03 crc kubenswrapper[4744]: W0311 00:54:03.686690 4744 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization
Mar 11 00:54:03 crc kubenswrapper[4744]: W0311 00:54:03.686700 4744 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack
Mar 11 00:54:03 crc kubenswrapper[4744]: W0311 00:54:03.686712 4744 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer
Mar 11 00:54:03 crc kubenswrapper[4744]: W0311 00:54:03.686722 4744 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Mar 11 00:54:03 crc kubenswrapper[4744]: W0311 00:54:03.686731 4744 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS
Mar 11 00:54:03 crc kubenswrapper[4744]: W0311 00:54:03.686742 4744 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS
Mar 11 00:54:03 crc kubenswrapper[4744]: W0311 00:54:03.686751 4744 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig
Mar 11 00:54:03 crc kubenswrapper[4744]: W0311 00:54:03.686761 4744 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode
Mar 11 00:54:03 crc kubenswrapper[4744]: W0311 00:54:03.686770 4744 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics
Mar 11 00:54:03 crc kubenswrapper[4744]: W0311 00:54:03.686782 4744 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot
Mar 11 00:54:03 crc kubenswrapper[4744]: W0311 00:54:03.686799 4744 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Mar 11 00:54:03 crc kubenswrapper[4744]: W0311 00:54:03.686810 4744 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Mar 11 00:54:03 crc kubenswrapper[4744]: W0311 00:54:03.686820 4744 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS
Mar 11 00:54:03 crc kubenswrapper[4744]: W0311 00:54:03.686830 4744 feature_gate.go:330] unrecognized feature gate: ExternalOIDC
Mar 11 00:54:03 crc kubenswrapper[4744]: W0311 00:54:03.686840 4744 feature_gate.go:330] unrecognized feature gate: UpgradeStatus
Mar 11 00:54:03 crc kubenswrapper[4744]: W0311 00:54:03.686850 4744 feature_gate.go:330] unrecognized feature gate: ManagedBootImages
Mar 11 00:54:03 crc kubenswrapper[4744]: W0311 00:54:03.686860 4744 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity
Mar 11 00:54:03 crc kubenswrapper[4744]: W0311 00:54:03.686870 4744 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration
Mar 11 00:54:03 crc kubenswrapper[4744]: W0311 00:54:03.686881 4744 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota
Mar 11 00:54:03 crc kubenswrapper[4744]: W0311 00:54:03.686890 4744 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation
Mar 11 00:54:03 crc kubenswrapper[4744]: W0311 00:54:03.686903 4744 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release.
Mar 11 00:54:03 crc kubenswrapper[4744]: W0311 00:54:03.686916 4744 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration
Mar 11 00:54:03 crc kubenswrapper[4744]: W0311 00:54:03.686928 4744 feature_gate.go:330] unrecognized feature gate: GatewayAPI
Mar 11 00:54:03 crc kubenswrapper[4744]: W0311 00:54:03.686940 4744 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather
Mar 11 00:54:03 crc kubenswrapper[4744]: W0311 00:54:03.686951 4744 feature_gate.go:330] unrecognized feature gate: InsightsConfig
Mar 11 00:54:03 crc kubenswrapper[4744]: W0311 00:54:03.686961 4744 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall
Mar 11 00:54:03 crc kubenswrapper[4744]: W0311 00:54:03.686972 4744 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Mar 11 00:54:03 crc kubenswrapper[4744]: W0311 00:54:03.686982 4744 feature_gate.go:330] unrecognized feature gate: NewOLM
Mar 11 00:54:03 crc kubenswrapper[4744]: W0311 00:54:03.686993 4744 feature_gate.go:330] unrecognized feature gate: SignatureStores
Mar 11 00:54:03 crc kubenswrapper[4744]: W0311 00:54:03.687002 4744 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController
Mar 11 00:54:03 crc kubenswrapper[4744]: W0311 00:54:03.687012 4744 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets
Mar 11 00:54:03 crc kubenswrapper[4744]: W0311 00:54:03.687022 4744 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor
Mar 11 00:54:03 crc kubenswrapper[4744]: W0311 00:54:03.687032 4744 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS
Mar 11 00:54:03 crc kubenswrapper[4744]: W0311 00:54:03.687041 4744 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion
Mar 11 00:54:03 crc kubenswrapper[4744]: W0311 00:54:03.687051 4744 feature_gate.go:330] unrecognized feature gate: Example
Mar 11 00:54:03 crc kubenswrapper[4744]: W0311 00:54:03.687061 4744 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI
Mar 11 00:54:03 crc kubenswrapper[4744]: W0311 00:54:03.687071 4744 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure
Mar 11 00:54:03 crc kubenswrapper[4744]: W0311 00:54:03.687081 4744 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission
Mar 11 00:54:03 crc kubenswrapper[4744]: W0311 00:54:03.687091 4744 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters
Mar 11 00:54:03 crc kubenswrapper[4744]: W0311 00:54:03.687100 4744 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement
Mar 11 00:54:03 crc kubenswrapper[4744]: W0311 00:54:03.687110 4744 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy
Mar 11 00:54:03 crc kubenswrapper[4744]: W0311 00:54:03.687120 4744 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities
Mar 11 00:54:03 crc kubenswrapper[4744]: W0311 00:54:03.687131 4744 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs
Mar 11 00:54:03 crc kubenswrapper[4744]: W0311 00:54:03.687141 4744 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP
Mar 11 00:54:03 crc kubenswrapper[4744]: W0311 00:54:03.687151 4744 feature_gate.go:330] unrecognized feature gate: OVNObservability
Mar 11 00:54:03 crc kubenswrapper[4744]: W0311 00:54:03.687162 4744 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags
Mar 11 00:54:03 crc kubenswrapper[4744]: I0311 00:54:03.687179 4744 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]}
Mar 11 00:54:03 crc kubenswrapper[4744]: I0311 00:54:03.688662 4744 server.go:940] "Client rotation is on, will bootstrap in background"
Mar 11 00:54:03 crc kubenswrapper[4744]: E0311 00:54:03.694749 4744 bootstrap.go:266] "Unhandled Error" err="part of the existing bootstrap client certificate in /var/lib/kubelet/kubeconfig is expired: 2026-02-24 05:52:08 +0000 UTC" logger="UnhandledError"
Mar 11 00:54:03 crc kubenswrapper[4744]: I0311 00:54:03.700844 4744 bootstrap.go:101] "Use the bootstrap credentials to request a cert, and set kubeconfig to point to the certificate dir"
Mar 11 00:54:03 crc kubenswrapper[4744]: I0311 00:54:03.701024 4744 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-client-current.pem".
Mar 11 00:54:03 crc kubenswrapper[4744]: I0311 00:54:03.703083 4744 server.go:997] "Starting client certificate rotation"
Mar 11 00:54:03 crc kubenswrapper[4744]: I0311 00:54:03.703129 4744 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate rotation is enabled
Mar 11 00:54:03 crc kubenswrapper[4744]: I0311 00:54:03.703370 4744 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates
Mar 11 00:54:03 crc kubenswrapper[4744]: I0311 00:54:03.732483 4744 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Mar 11 00:54:03 crc kubenswrapper[4744]: E0311 00:54:03.737892 4744 certificate_manager.go:562] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post \"https://api-int.crc.testing:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": dial tcp 38.102.83.58:6443: connect: connection refused" logger="UnhandledError"
Mar 11 00:54:03 crc kubenswrapper[4744]: I0311 00:54:03.740574 4744 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Mar 11 00:54:03 crc kubenswrapper[4744]: I0311 00:54:03.759729 4744 log.go:25] "Validated CRI v1 runtime API"
Mar 11 00:54:03 crc kubenswrapper[4744]: I0311 00:54:03.802204 4744 log.go:25] "Validated CRI v1 image API"
Mar 11 00:54:03 crc kubenswrapper[4744]: I0311 00:54:03.804995 4744 server.go:1437] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd"
Mar 11 00:54:03 crc kubenswrapper[4744]: I0311 00:54:03.813240 4744 fs.go:133] Filesystem UUIDs: map[0b076daa-c26a-46d2-b3a6-72a8dbc6e257:/dev/vda4 2026-03-11-00-49-23-00:/dev/sr0 7B77-95E7:/dev/vda2 de0497b0-db1b-465a-b278-03db02455c71:/dev/vda3]
Mar 11 00:54:03 crc kubenswrapper[4744]: I0311 00:54:03.813301 4744 fs.go:134] Filesystem partitions: map[/dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /dev/vda3:{mountpoint:/boot major:252 minor:3 fsType:ext4 blockSize:0} /dev/vda4:{mountpoint:/var major:252 minor:4 fsType:xfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /run/user/1000:{mountpoint:/run/user/1000 major:0 minor:42 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:30 fsType:tmpfs blockSize:0} /var/lib/etcd:{mountpoint:/var/lib/etcd major:0 minor:43 fsType:tmpfs blockSize:0}]
Mar 11 00:54:03 crc kubenswrapper[4744]: I0311 00:54:03.841769 4744 manager.go:217] Machine: {Timestamp:2026-03-11 00:54:03.838110986 +0000 UTC m=+0.642328671 CPUVendorID:AuthenticAMD NumCores:12 NumPhysicalCores:1 NumSockets:12 CpuFrequency:2800000 MemoryCapacity:33654128640 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:21801e6708c44f15b81395eb736a7cec SystemUUID:b27597af-c36d-4084-a073-6dfdbb017181 BootID:51e320bc-e184-46b0-b151-baf1fef55472 Filesystems:[{Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16827064320 Type:vfs Inodes:4108170 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6730825728 Type:vfs Inodes:819200 HasInodes:true} {Device:/dev/vda4 DeviceMajor:252 DeviceMinor:4 Capacity:85292941312 Type:vfs Inodes:41679680 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:30 Capacity:16827064320 Type:vfs Inodes:1048576 HasInodes:true} {Device:/dev/vda3 DeviceMajor:252 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true} {Device:/run/user/1000 DeviceMajor:0 DeviceMinor:42 Capacity:3365412864 Type:vfs Inodes:821634 HasInodes:true} {Device:/var/lib/etcd DeviceMajor:0 DeviceMinor:43 Capacity:1073741824 Type:vfs Inodes:4108170 HasInodes:true}] DiskMap:map[252:0:{Name:vda Major:252 Minor:0 Size:214748364800 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:fa:16:3e:75:9f:d7 Speed:0 Mtu:1500} {Name:br-int MacAddress:d6:39:55:2e:22:71 Speed:0 Mtu:1400} {Name:ens3 MacAddress:fa:16:3e:75:9f:d7 Speed:-1 Mtu:1500} {Name:ens7 MacAddress:fa:16:3e:30:05:dc Speed:-1 Mtu:1500} {Name:ens7.20 MacAddress:52:54:00:1f:7a:17 Speed:-1 Mtu:1496} {Name:ens7.21 MacAddress:52:54:00:7a:7f:ee Speed:-1 Mtu:1496} {Name:ens7.22 MacAddress:52:54:00:f0:b1:80 Speed:-1 Mtu:1496} {Name:ens7.23 MacAddress:52:54:00:6b:b9:9a Speed:-1 Mtu:1496} {Name:eth10 MacAddress:a6:e9:72:b2:c5:f1 Speed:0 Mtu:1500} {Name:ovn-k8s-mp0 MacAddress:0a:58:0a:d9:00:02 Speed:0 Mtu:1400} {Name:ovs-system MacAddress:ba:ce:65:b9:c6:e8 Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:33654128640 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:0 Size:16777216 Type:Unified Level:3}] SocketID:0 BookID: DrawerID:} {Id:0 Threads:[1] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:1 Size:16777216 Type:Unified Level:3}] SocketID:1 BookID: DrawerID:} {Id:0 Threads:[10] Caches:[{Id:10 Size:32768 Type:Data Level:1} {Id:10 Size:32768 Type:Instruction Level:1} {Id:10 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:10 Size:16777216 Type:Unified Level:3}] SocketID:10 BookID: DrawerID:} {Id:0 Threads:[11] Caches:[{Id:11 Size:32768 Type:Data Level:1} {Id:11 Size:32768 Type:Instruction Level:1} {Id:11 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:11 Size:16777216 Type:Unified Level:3}] SocketID:11 BookID: DrawerID:} {Id:0 Threads:[2] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:2 Size:16777216 Type:Unified Level:3}] SocketID:2 BookID: DrawerID:} {Id:0 Threads:[3] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:3 Size:16777216 Type:Unified Level:3}] SocketID:3 BookID: DrawerID:} {Id:0 Threads:[4] Caches:[{Id:4 Size:32768 Type:Data Level:1} {Id:4 Size:32768 Type:Instruction Level:1} {Id:4 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:4 Size:16777216 Type:Unified Level:3}] SocketID:4 BookID: DrawerID:} {Id:0 Threads:[5] Caches:[{Id:5 Size:32768 Type:Data Level:1} {Id:5 Size:32768 Type:Instruction Level:1} {Id:5 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:5 Size:16777216 Type:Unified Level:3}] SocketID:5 BookID: DrawerID:} {Id:0 Threads:[6] Caches:[{Id:6 Size:32768 Type:Data Level:1} {Id:6 Size:32768 Type:Instruction Level:1} {Id:6 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:6 Size:16777216 Type:Unified Level:3}] SocketID:6 BookID: DrawerID:} {Id:0 Threads:[7] Caches:[{Id:7 Size:32768 Type:Data Level:1} {Id:7 Size:32768 Type:Instruction Level:1} {Id:7 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:7 Size:16777216 Type:Unified Level:3}] SocketID:7 BookID: DrawerID:} {Id:0 Threads:[8] Caches:[{Id:8 Size:32768 Type:Data Level:1} {Id:8 Size:32768 Type:Instruction Level:1} {Id:8 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:8 Size:16777216 Type:Unified Level:3}] SocketID:8 BookID: DrawerID:} {Id:0 Threads:[9] Caches:[{Id:9 Size:32768 Type:Data Level:1} {Id:9 Size:32768 Type:Instruction Level:1} {Id:9 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:9 Size:16777216 Type:Unified Level:3}] SocketID:9 BookID: DrawerID:}] Caches:[] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None}
Mar 11 00:54:03 crc kubenswrapper[4744]: I0311 00:54:03.842188 4744 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available.
Mar 11 00:54:03 crc kubenswrapper[4744]: I0311 00:54:03.842382 4744 manager.go:233] Version: {KernelVersion:5.14.0-427.50.2.el9_4.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 418.94.202502100215-0 DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:}
Mar 11 00:54:03 crc kubenswrapper[4744]: I0311 00:54:03.844356 4744 swap_util.go:113] "Swap is on" /proc/swaps contents="Filename\t\t\t\tType\t\tSize\t\tUsed\t\tPriority"
Mar 11 00:54:03 crc kubenswrapper[4744]: I0311 00:54:03.844699 4744 container_manager_linux.go:267] "Container manager verified user specified cgroup-root exists" cgroupRoot=[]
Mar 11 00:54:03 crc kubenswrapper[4744]: I0311 00:54:03.844756 4744 container_manager_linux.go:272] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"crc","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"200m","ephemeral-storage":"350Mi","memory":"350Mi"},"HardEvictionThresholds":[{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"ExperimentalMemoryManagerPolicy":"None","ExperimentalMemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2}
Mar 11 00:54:03 crc kubenswrapper[4744]: I0311 00:54:03.845138 4744 topology_manager.go:138] "Creating topology manager with none policy"
Mar 11 00:54:03 crc kubenswrapper[4744]: I0311 00:54:03.845159 4744 container_manager_linux.go:303] "Creating device plugin manager"
Mar 11 00:54:03 crc kubenswrapper[4744]: I0311 00:54:03.845984 4744 manager.go:142] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock"
Mar 11 00:54:03 crc kubenswrapper[4744]: I0311 00:54:03.846048 4744 server.go:66] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock"
Mar 11 00:54:03 crc kubenswrapper[4744]: I0311 00:54:03.846852 4744 state_mem.go:36] "Initialized new in-memory state store"
Mar 11 00:54:03 crc kubenswrapper[4744]: I0311 00:54:03.847002 4744 server.go:1245] "Using root directory" path="/var/lib/kubelet"
Mar 11 00:54:03 crc kubenswrapper[4744]: I0311 00:54:03.851401 4744 kubelet.go:418] "Attempting to sync node with API server"
Mar 11 00:54:03 crc kubenswrapper[4744]: I0311 00:54:03.851442 4744 kubelet.go:313] "Adding static pod path" path="/etc/kubernetes/manifests"
Mar 11 00:54:03 crc kubenswrapper[4744]: I0311 00:54:03.851494 4744 file.go:69] "Watching path" path="/etc/kubernetes/manifests"
Mar 11 00:54:03 crc kubenswrapper[4744]: I0311 00:54:03.851546 4744 kubelet.go:324] "Adding apiserver pod source"
Mar 11 00:54:03 crc kubenswrapper[4744]: I0311 00:54:03.851567 4744 apiserver.go:42] "Waiting for node sync before watching apiserver pods"
Mar 11 00:54:03 crc kubenswrapper[4744]: W0311 00:54:03.862543 4744 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 38.102.83.58:6443: connect: connection refused
Mar 11 00:54:03 crc kubenswrapper[4744]: E0311 00:54:03.862905 4744 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 38.102.83.58:6443: connect: connection refused" logger="UnhandledError"
Mar 11 00:54:03 crc kubenswrapper[4744]: W0311 00:54:03.862908 4744 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": dial tcp 38.102.83.58:6443: connect: connection refused
Mar 11 00:54:03 crc kubenswrapper[4744]: E0311 00:54:03.863722 4744 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": dial tcp 38.102.83.58:6443: connect: connection refused" logger="UnhandledError"
Mar 11 00:54:03 crc kubenswrapper[4744]: I0311 00:54:03.863937 4744 kuberuntime_manager.go:262] "Container runtime initialized" containerRuntime="cri-o" version="1.31.5-4.rhaos4.18.gitdad78d5.el9" apiVersion="v1"
Mar 11 00:54:03 crc kubenswrapper[4744]: I0311 00:54:03.865078 4744 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-server-current.pem".
Mar 11 00:54:03 crc kubenswrapper[4744]: I0311 00:54:03.866677 4744 kubelet.go:854] "Not starting ClusterTrustBundle informer because we are in static kubelet mode" Mar 11 00:54:03 crc kubenswrapper[4744]: I0311 00:54:03.868756 4744 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume" Mar 11 00:54:03 crc kubenswrapper[4744]: I0311 00:54:03.868844 4744 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir" Mar 11 00:54:03 crc kubenswrapper[4744]: I0311 00:54:03.868865 4744 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/git-repo" Mar 11 00:54:03 crc kubenswrapper[4744]: I0311 00:54:03.868879 4744 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/host-path" Mar 11 00:54:03 crc kubenswrapper[4744]: I0311 00:54:03.868904 4744 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/nfs" Mar 11 00:54:03 crc kubenswrapper[4744]: I0311 00:54:03.868919 4744 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/secret" Mar 11 00:54:03 crc kubenswrapper[4744]: I0311 00:54:03.868934 4744 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/iscsi" Mar 11 00:54:03 crc kubenswrapper[4744]: I0311 00:54:03.868956 4744 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/downward-api" Mar 11 00:54:03 crc kubenswrapper[4744]: I0311 00:54:03.868975 4744 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/fc" Mar 11 00:54:03 crc kubenswrapper[4744]: I0311 00:54:03.868989 4744 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/configmap" Mar 11 00:54:03 crc kubenswrapper[4744]: I0311 00:54:03.869008 4744 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/projected" Mar 11 00:54:03 crc kubenswrapper[4744]: I0311 00:54:03.869022 4744 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/local-volume" Mar 11 00:54:03 crc kubenswrapper[4744]: I0311 00:54:03.872333 4744 plugins.go:603] "Loaded volume plugin" 
pluginName="kubernetes.io/csi" Mar 11 00:54:03 crc kubenswrapper[4744]: I0311 00:54:03.873166 4744 server.go:1280] "Started kubelet" Mar 11 00:54:03 crc kubenswrapper[4744]: I0311 00:54:03.874582 4744 server.go:163] "Starting to listen" address="0.0.0.0" port=10250 Mar 11 00:54:03 crc kubenswrapper[4744]: I0311 00:54:03.874590 4744 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Mar 11 00:54:03 crc systemd[1]: Started Kubernetes Kubelet. Mar 11 00:54:03 crc kubenswrapper[4744]: I0311 00:54:03.876080 4744 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.58:6443: connect: connection refused Mar 11 00:54:03 crc kubenswrapper[4744]: I0311 00:54:03.877168 4744 server.go:236] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Mar 11 00:54:03 crc kubenswrapper[4744]: I0311 00:54:03.877199 4744 server.go:460] "Adding debug handlers to kubelet server" Mar 11 00:54:03 crc kubenswrapper[4744]: I0311 00:54:03.879019 4744 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate rotation is enabled Mar 11 00:54:03 crc kubenswrapper[4744]: I0311 00:54:03.879167 4744 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Mar 11 00:54:03 crc kubenswrapper[4744]: E0311 00:54:03.881569 4744 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 11 00:54:03 crc kubenswrapper[4744]: I0311 00:54:03.881670 4744 volume_manager.go:287] "The desired_state_of_world populator starts" Mar 11 00:54:03 crc kubenswrapper[4744]: I0311 00:54:03.881684 4744 volume_manager.go:289] "Starting Kubelet Volume Manager" Mar 11 00:54:03 crc kubenswrapper[4744]: I0311 00:54:03.881800 4744 desired_state_of_world_populator.go:146] "Desired state populator starts to run" Mar 11 00:54:03 crc 
kubenswrapper[4744]: E0311 00:54:03.882232 4744 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.58:6443: connect: connection refused" interval="200ms" Mar 11 00:54:03 crc kubenswrapper[4744]: W0311 00:54:03.882494 4744 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 38.102.83.58:6443: connect: connection refused Mar 11 00:54:03 crc kubenswrapper[4744]: E0311 00:54:03.882621 4744 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 38.102.83.58:6443: connect: connection refused" logger="UnhandledError" Mar 11 00:54:03 crc kubenswrapper[4744]: I0311 00:54:03.883782 4744 factory.go:55] Registering systemd factory Mar 11 00:54:03 crc kubenswrapper[4744]: I0311 00:54:03.883839 4744 factory.go:221] Registration of the systemd container factory successfully Mar 11 00:54:03 crc kubenswrapper[4744]: I0311 00:54:03.884618 4744 factory.go:153] Registering CRI-O factory Mar 11 00:54:03 crc kubenswrapper[4744]: I0311 00:54:03.884664 4744 factory.go:221] Registration of the crio container factory successfully Mar 11 00:54:03 crc kubenswrapper[4744]: I0311 00:54:03.884796 4744 factory.go:219] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory Mar 11 00:54:03 crc kubenswrapper[4744]: I0311 00:54:03.884878 4744 factory.go:103] Registering Raw factory Mar 11 00:54:03 crc kubenswrapper[4744]: I0311 
00:54:03.884913 4744 manager.go:1196] Started watching for new ooms in manager Mar 11 00:54:03 crc kubenswrapper[4744]: I0311 00:54:03.889938 4744 manager.go:319] Starting recovery of all containers Mar 11 00:54:03 crc kubenswrapper[4744]: E0311 00:54:03.884705 4744 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/default/events\": dial tcp 38.102.83.58:6443: connect: connection refused" event="&Event{ObjectMeta:{crc.189ba3505e2c8f55 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-11 00:54:03.873111893 +0000 UTC m=+0.677329538,LastTimestamp:2026-03-11 00:54:03.873111893 +0000 UTC m=+0.677329538,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 11 00:54:03 crc kubenswrapper[4744]: I0311 00:54:03.912863 4744 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert" seLinuxMountContext="" Mar 11 00:54:03 crc kubenswrapper[4744]: I0311 00:54:03.913368 4744 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca" seLinuxMountContext="" Mar 11 00:54:03 crc kubenswrapper[4744]: I0311 00:54:03.913616 4744 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca" 
seLinuxMountContext="" Mar 11 00:54:03 crc kubenswrapper[4744]: I0311 00:54:03.913800 4744 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics" seLinuxMountContext="" Mar 11 00:54:03 crc kubenswrapper[4744]: I0311 00:54:03.914012 4744 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bd23aa5c-e532-4e53-bccf-e79f130c5ae8" volumeName="kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2" seLinuxMountContext="" Mar 11 00:54:03 crc kubenswrapper[4744]: I0311 00:54:03.914181 4744 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d75a4c96-2883-4a0b-bab2-0fab2b6c0b49" volumeName="kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script" seLinuxMountContext="" Mar 11 00:54:03 crc kubenswrapper[4744]: I0311 00:54:03.914339 4744 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" seLinuxMountContext="" Mar 11 00:54:03 crc kubenswrapper[4744]: I0311 00:54:03.916666 4744 reconstruct.go:144] "Volume is marked device as uncertain and added into the actual state" volumeName="kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" deviceMountPath="/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount" Mar 11 00:54:03 crc kubenswrapper[4744]: I0311 00:54:03.916780 4744 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="44663579-783b-4372-86d6-acf235a62d72" 
volumeName="kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc" seLinuxMountContext="" Mar 11 00:54:03 crc kubenswrapper[4744]: I0311 00:54:03.916865 4744 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login" seLinuxMountContext="" Mar 11 00:54:03 crc kubenswrapper[4744]: I0311 00:54:03.916892 4744 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template" seLinuxMountContext="" Mar 11 00:54:03 crc kubenswrapper[4744]: I0311 00:54:03.916920 4744 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config" seLinuxMountContext="" Mar 11 00:54:03 crc kubenswrapper[4744]: I0311 00:54:03.916951 4744 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh" seLinuxMountContext="" Mar 11 00:54:03 crc kubenswrapper[4744]: I0311 00:54:03.916975 4744 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca" seLinuxMountContext="" Mar 11 00:54:03 crc kubenswrapper[4744]: I0311 00:54:03.917078 4744 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" 
volumeName="kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct" seLinuxMountContext="" Mar 11 00:54:03 crc kubenswrapper[4744]: I0311 00:54:03.917104 4744 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf" seLinuxMountContext="" Mar 11 00:54:03 crc kubenswrapper[4744]: I0311 00:54:03.917127 4744 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh" seLinuxMountContext="" Mar 11 00:54:03 crc kubenswrapper[4744]: I0311 00:54:03.917151 4744 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access" seLinuxMountContext="" Mar 11 00:54:03 crc kubenswrapper[4744]: I0311 00:54:03.917174 4744 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client" seLinuxMountContext="" Mar 11 00:54:03 crc kubenswrapper[4744]: I0311 00:54:03.917197 4744 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit" seLinuxMountContext="" Mar 11 00:54:03 crc kubenswrapper[4744]: I0311 00:54:03.917218 4744 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="37a5e44f-9a88-4405-be8a-b645485e7312" volumeName="kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls" 
seLinuxMountContext="" Mar 11 00:54:03 crc kubenswrapper[4744]: I0311 00:54:03.917240 4744 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6731426b-95fe-49ff-bb5f-40441049fde2" volumeName="kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh" seLinuxMountContext="" Mar 11 00:54:03 crc kubenswrapper[4744]: I0311 00:54:03.917263 4744 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib" seLinuxMountContext="" Mar 11 00:54:03 crc kubenswrapper[4744]: I0311 00:54:03.917283 4744 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr" seLinuxMountContext="" Mar 11 00:54:03 crc kubenswrapper[4744]: I0311 00:54:03.917305 4744 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth" seLinuxMountContext="" Mar 11 00:54:03 crc kubenswrapper[4744]: I0311 00:54:03.917372 4744 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz" seLinuxMountContext="" Mar 11 00:54:03 crc kubenswrapper[4744]: I0311 00:54:03.917400 4744 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j" seLinuxMountContext="" Mar 11 00:54:03 crc kubenswrapper[4744]: 
I0311 00:54:03.917431 4744 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client" seLinuxMountContext="" Mar 11 00:54:03 crc kubenswrapper[4744]: I0311 00:54:03.917456 4744 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client" seLinuxMountContext="" Mar 11 00:54:03 crc kubenswrapper[4744]: I0311 00:54:03.917480 4744 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8" seLinuxMountContext="" Mar 11 00:54:03 crc kubenswrapper[4744]: I0311 00:54:03.917502 4744 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3ab1a177-2de0-46d9-b765-d0d0649bb42e" volumeName="kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert" seLinuxMountContext="" Mar 11 00:54:03 crc kubenswrapper[4744]: I0311 00:54:03.917578 4744 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp" seLinuxMountContext="" Mar 11 00:54:03 crc kubenswrapper[4744]: I0311 00:54:03.917611 4744 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49ef4625-1d3a-4a9f-b595-c2433d32326d" volumeName="kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v" seLinuxMountContext="" Mar 11 00:54:03 crc kubenswrapper[4744]: I0311 00:54:03.917642 4744 reconstruct.go:130] "Volume is marked 
as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca" seLinuxMountContext="" Mar 11 00:54:03 crc kubenswrapper[4744]: I0311 00:54:03.917668 4744 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides" seLinuxMountContext="" Mar 11 00:54:03 crc kubenswrapper[4744]: I0311 00:54:03.917703 4744 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config" seLinuxMountContext="" Mar 11 00:54:03 crc kubenswrapper[4744]: I0311 00:54:03.917737 4744 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert" seLinuxMountContext="" Mar 11 00:54:03 crc kubenswrapper[4744]: I0311 00:54:03.917823 4744 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb" seLinuxMountContext="" Mar 11 00:54:03 crc kubenswrapper[4744]: I0311 00:54:03.917854 4744 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert" seLinuxMountContext="" Mar 11 00:54:03 crc kubenswrapper[4744]: I0311 00:54:03.917889 4744 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" 
volumeName="kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca" seLinuxMountContext="" Mar 11 00:54:03 crc kubenswrapper[4744]: I0311 00:54:03.917916 4744 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5" seLinuxMountContext="" Mar 11 00:54:03 crc kubenswrapper[4744]: I0311 00:54:03.917947 4744 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert" seLinuxMountContext="" Mar 11 00:54:03 crc kubenswrapper[4744]: I0311 00:54:03.917975 4744 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle" seLinuxMountContext="" Mar 11 00:54:03 crc kubenswrapper[4744]: I0311 00:54:03.918007 4744 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle" seLinuxMountContext="" Mar 11 00:54:03 crc kubenswrapper[4744]: I0311 00:54:03.918036 4744 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access" seLinuxMountContext="" Mar 11 00:54:03 crc kubenswrapper[4744]: I0311 00:54:03.918064 4744 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" 
volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles" seLinuxMountContext="" Mar 11 00:54:03 crc kubenswrapper[4744]: I0311 00:54:03.918109 4744 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token" seLinuxMountContext="" Mar 11 00:54:03 crc kubenswrapper[4744]: I0311 00:54:03.918151 4744 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert" seLinuxMountContext="" Mar 11 00:54:03 crc kubenswrapper[4744]: I0311 00:54:03.918196 4744 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content" seLinuxMountContext="" Mar 11 00:54:03 crc kubenswrapper[4744]: I0311 00:54:03.918232 4744 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle" seLinuxMountContext="" Mar 11 00:54:03 crc kubenswrapper[4744]: I0311 00:54:03.918262 4744 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates" seLinuxMountContext="" Mar 11 00:54:03 crc kubenswrapper[4744]: I0311 00:54:03.918285 4744 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config" 
seLinuxMountContext="" Mar 11 00:54:03 crc kubenswrapper[4744]: I0311 00:54:03.918314 4744 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782" seLinuxMountContext="" Mar 11 00:54:03 crc kubenswrapper[4744]: I0311 00:54:03.918347 4744 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd" seLinuxMountContext="" Mar 11 00:54:03 crc kubenswrapper[4744]: I0311 00:54:03.918373 4744 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="efdd0498-1daa-4136-9a4a-3b948c2293fc" volumeName="kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs" seLinuxMountContext="" Mar 11 00:54:03 crc kubenswrapper[4744]: I0311 00:54:03.918398 4744 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls" seLinuxMountContext="" Mar 11 00:54:03 crc kubenswrapper[4744]: I0311 00:54:03.918422 4744 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="20b0d48f-5fd6-431c-a545-e3c800c7b866" volumeName="kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds" seLinuxMountContext="" Mar 11 00:54:03 crc kubenswrapper[4744]: I0311 00:54:03.918449 4744 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv" seLinuxMountContext="" Mar 11 00:54:03 crc kubenswrapper[4744]: I0311 
00:54:03.918471 4744 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert" seLinuxMountContext="" Mar 11 00:54:03 crc kubenswrapper[4744]: I0311 00:54:03.918493 4744 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides" seLinuxMountContext="" Mar 11 00:54:03 crc kubenswrapper[4744]: I0311 00:54:03.918544 4744 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl" seLinuxMountContext="" Mar 11 00:54:03 crc kubenswrapper[4744]: I0311 00:54:03.918568 4744 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb" seLinuxMountContext="" Mar 11 00:54:03 crc kubenswrapper[4744]: I0311 00:54:03.918590 4744 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d75a4c96-2883-4a0b-bab2-0fab2b6c0b49" volumeName="kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb" seLinuxMountContext="" Mar 11 00:54:03 crc kubenswrapper[4744]: I0311 00:54:03.918626 4744 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config" seLinuxMountContext="" Mar 11 00:54:03 crc kubenswrapper[4744]: I0311 00:54:03.918656 4744 reconstruct.go:130] "Volume is marked as uncertain and added 
into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert" seLinuxMountContext="" Mar 11 00:54:03 crc kubenswrapper[4744]: I0311 00:54:03.918686 4744 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities" seLinuxMountContext="" Mar 11 00:54:03 crc kubenswrapper[4744]: I0311 00:54:03.918711 4744 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls" seLinuxMountContext="" Mar 11 00:54:03 crc kubenswrapper[4744]: I0311 00:54:03.918750 4744 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert" seLinuxMountContext="" Mar 11 00:54:03 crc kubenswrapper[4744]: I0311 00:54:03.918793 4744 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp" seLinuxMountContext="" Mar 11 00:54:03 crc kubenswrapper[4744]: I0311 00:54:03.918836 4744 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert" seLinuxMountContext="" Mar 11 00:54:03 crc kubenswrapper[4744]: I0311 00:54:03.918865 4744 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="efdd0498-1daa-4136-9a4a-3b948c2293fc" 
volumeName="kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt" seLinuxMountContext="" Mar 11 00:54:03 crc kubenswrapper[4744]: I0311 00:54:03.918887 4744 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key" seLinuxMountContext="" Mar 11 00:54:03 crc kubenswrapper[4744]: I0311 00:54:03.918909 4744 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="37a5e44f-9a88-4405-be8a-b645485e7312" volumeName="kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf" seLinuxMountContext="" Mar 11 00:54:03 crc kubenswrapper[4744]: I0311 00:54:03.918930 4744 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config" seLinuxMountContext="" Mar 11 00:54:03 crc kubenswrapper[4744]: I0311 00:54:03.918951 4744 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5" seLinuxMountContext="" Mar 11 00:54:03 crc kubenswrapper[4744]: I0311 00:54:03.918972 4744 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert" seLinuxMountContext="" Mar 11 00:54:03 crc kubenswrapper[4744]: I0311 00:54:03.918997 4744 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs" 
seLinuxMountContext="" Mar 11 00:54:03 crc kubenswrapper[4744]: I0311 00:54:03.919093 4744 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist" seLinuxMountContext="" Mar 11 00:54:03 crc kubenswrapper[4744]: I0311 00:54:03.919116 4744 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides" seLinuxMountContext="" Mar 11 00:54:03 crc kubenswrapper[4744]: I0311 00:54:03.919140 4744 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate" seLinuxMountContext="" Mar 11 00:54:03 crc kubenswrapper[4744]: I0311 00:54:03.919162 4744 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" volumeName="kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7" seLinuxMountContext="" Mar 11 00:54:03 crc kubenswrapper[4744]: I0311 00:54:03.919186 4744 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert" seLinuxMountContext="" Mar 11 00:54:03 crc kubenswrapper[4744]: I0311 00:54:03.919208 4744 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7" seLinuxMountContext="" Mar 11 00:54:03 crc kubenswrapper[4744]: I0311 
00:54:03.919233 4744 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" volumeName="kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8" seLinuxMountContext="" Mar 11 00:54:03 crc kubenswrapper[4744]: I0311 00:54:03.919262 4744 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data" seLinuxMountContext="" Mar 11 00:54:03 crc kubenswrapper[4744]: I0311 00:54:03.919292 4744 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token" seLinuxMountContext="" Mar 11 00:54:03 crc kubenswrapper[4744]: I0311 00:54:03.919324 4744 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images" seLinuxMountContext="" Mar 11 00:54:03 crc kubenswrapper[4744]: I0311 00:54:03.919348 4744 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52" seLinuxMountContext="" Mar 11 00:54:03 crc kubenswrapper[4744]: I0311 00:54:03.919373 4744 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls" seLinuxMountContext="" Mar 11 00:54:03 crc kubenswrapper[4744]: I0311 00:54:03.919398 4744 reconstruct.go:130] "Volume is 
marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config" seLinuxMountContext="" Mar 11 00:54:03 crc kubenswrapper[4744]: I0311 00:54:03.919434 4744 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" volumeName="kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf" seLinuxMountContext="" Mar 11 00:54:03 crc kubenswrapper[4744]: I0311 00:54:03.919470 4744 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets" seLinuxMountContext="" Mar 11 00:54:03 crc kubenswrapper[4744]: I0311 00:54:03.919564 4744 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs" seLinuxMountContext="" Mar 11 00:54:03 crc kubenswrapper[4744]: I0311 00:54:03.919587 4744 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content" seLinuxMountContext="" Mar 11 00:54:03 crc kubenswrapper[4744]: I0311 00:54:03.919630 4744 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config" seLinuxMountContext="" Mar 11 00:54:03 crc kubenswrapper[4744]: I0311 00:54:03.919750 4744 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle" seLinuxMountContext="" Mar 11 00:54:03 crc kubenswrapper[4744]: I0311 00:54:03.919772 4744 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume" seLinuxMountContext="" Mar 11 00:54:03 crc kubenswrapper[4744]: I0311 00:54:03.919798 4744 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" volumeName="kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m" seLinuxMountContext="" Mar 11 00:54:03 crc kubenswrapper[4744]: I0311 00:54:03.919828 4744 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config" seLinuxMountContext="" Mar 11 00:54:03 crc kubenswrapper[4744]: I0311 00:54:03.919857 4744 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3b6479f0-333b-4a96-9adf-2099afdc2447" volumeName="kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr" seLinuxMountContext="" Mar 11 00:54:03 crc kubenswrapper[4744]: I0311 00:54:03.919882 4744 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig" seLinuxMountContext="" Mar 11 00:54:03 crc kubenswrapper[4744]: I0311 00:54:03.919907 4744 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" 
volumeName="kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx" seLinuxMountContext="" Mar 11 00:54:03 crc kubenswrapper[4744]: I0311 00:54:03.919936 4744 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert" seLinuxMountContext="" Mar 11 00:54:03 crc kubenswrapper[4744]: I0311 00:54:03.920001 4744 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk" seLinuxMountContext="" Mar 11 00:54:03 crc kubenswrapper[4744]: I0311 00:54:03.920030 4744 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv" seLinuxMountContext="" Mar 11 00:54:03 crc kubenswrapper[4744]: I0311 00:54:03.920177 4744 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca" seLinuxMountContext="" Mar 11 00:54:03 crc kubenswrapper[4744]: I0311 00:54:03.920206 4744 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert" seLinuxMountContext="" Mar 11 00:54:03 crc kubenswrapper[4744]: I0311 00:54:03.920228 4744 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" 
volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca" seLinuxMountContext="" Mar 11 00:54:03 crc kubenswrapper[4744]: I0311 00:54:03.920252 4744 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls" seLinuxMountContext="" Mar 11 00:54:03 crc kubenswrapper[4744]: I0311 00:54:03.920286 4744 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7" seLinuxMountContext="" Mar 11 00:54:03 crc kubenswrapper[4744]: I0311 00:54:03.920325 4744 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn" seLinuxMountContext="" Mar 11 00:54:03 crc kubenswrapper[4744]: I0311 00:54:03.920414 4744 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config" seLinuxMountContext="" Mar 11 00:54:03 crc kubenswrapper[4744]: I0311 00:54:03.920469 4744 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca" seLinuxMountContext="" Mar 11 00:54:03 crc kubenswrapper[4744]: I0311 00:54:03.920567 4744 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" 
volumeName="kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities" seLinuxMountContext="" Mar 11 00:54:03 crc kubenswrapper[4744]: I0311 00:54:03.920609 4744 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" volumeName="kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls" seLinuxMountContext="" Mar 11 00:54:03 crc kubenswrapper[4744]: I0311 00:54:03.920647 4744 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs" seLinuxMountContext="" Mar 11 00:54:03 crc kubenswrapper[4744]: I0311 00:54:03.920681 4744 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" volumeName="kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85" seLinuxMountContext="" Mar 11 00:54:03 crc kubenswrapper[4744]: I0311 00:54:03.920713 4744 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config" seLinuxMountContext="" Mar 11 00:54:03 crc kubenswrapper[4744]: I0311 00:54:03.920742 4744 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca" seLinuxMountContext="" Mar 11 00:54:03 crc kubenswrapper[4744]: I0311 00:54:03.920847 4744 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert" 
seLinuxMountContext="" Mar 11 00:54:03 crc kubenswrapper[4744]: I0311 00:54:03.920877 4744 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error" seLinuxMountContext="" Mar 11 00:54:03 crc kubenswrapper[4744]: I0311 00:54:03.920949 4744 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert" seLinuxMountContext="" Mar 11 00:54:03 crc kubenswrapper[4744]: I0311 00:54:03.921047 4744 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" volumeName="kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls" seLinuxMountContext="" Mar 11 00:54:03 crc kubenswrapper[4744]: I0311 00:54:03.921082 4744 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token" seLinuxMountContext="" Mar 11 00:54:03 crc kubenswrapper[4744]: I0311 00:54:03.921110 4744 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert" seLinuxMountContext="" Mar 11 00:54:03 crc kubenswrapper[4744]: I0311 00:54:03.921214 4744 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert" seLinuxMountContext="" Mar 11 00:54:03 crc kubenswrapper[4744]: I0311 00:54:03.921271 4744 
reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access" seLinuxMountContext="" Mar 11 00:54:03 crc kubenswrapper[4744]: I0311 00:54:03.921336 4744 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config" seLinuxMountContext="" Mar 11 00:54:03 crc kubenswrapper[4744]: I0311 00:54:03.921363 4744 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy" seLinuxMountContext="" Mar 11 00:54:03 crc kubenswrapper[4744]: I0311 00:54:03.921390 4744 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config" seLinuxMountContext="" Mar 11 00:54:03 crc kubenswrapper[4744]: I0311 00:54:03.921412 4744 manager.go:324] Recovery completed Mar 11 00:54:03 crc kubenswrapper[4744]: I0311 00:54:03.921416 4744 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz" seLinuxMountContext="" Mar 11 00:54:03 crc kubenswrapper[4744]: I0311 00:54:03.921486 4744 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca" seLinuxMountContext="" Mar 11 00:54:03 crc kubenswrapper[4744]: I0311 
00:54:03.921540 4744 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config" seLinuxMountContext="" Mar 11 00:54:03 crc kubenswrapper[4744]: I0311 00:54:03.921590 4744 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config" seLinuxMountContext="" Mar 11 00:54:03 crc kubenswrapper[4744]: I0311 00:54:03.921612 4744 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5" seLinuxMountContext="" Mar 11 00:54:03 crc kubenswrapper[4744]: I0311 00:54:03.921661 4744 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config" seLinuxMountContext="" Mar 11 00:54:03 crc kubenswrapper[4744]: I0311 00:54:03.921682 4744 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm" seLinuxMountContext="" Mar 11 00:54:03 crc kubenswrapper[4744]: I0311 00:54:03.921739 4744 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config" seLinuxMountContext="" Mar 11 00:54:03 crc kubenswrapper[4744]: I0311 00:54:03.921769 4744 reconstruct.go:130] "Volume is marked as 
uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88" seLinuxMountContext="" Mar 11 00:54:03 crc kubenswrapper[4744]: I0311 00:54:03.921797 4744 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3ab1a177-2de0-46d9-b765-d0d0649bb42e" volumeName="kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj" seLinuxMountContext="" Mar 11 00:54:03 crc kubenswrapper[4744]: I0311 00:54:03.921864 4744 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy" seLinuxMountContext="" Mar 11 00:54:03 crc kubenswrapper[4744]: I0311 00:54:03.921897 4744 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token" seLinuxMountContext="" Mar 11 00:54:03 crc kubenswrapper[4744]: I0311 00:54:03.921926 4744 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca" seLinuxMountContext="" Mar 11 00:54:03 crc kubenswrapper[4744]: I0311 00:54:03.921984 4744 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle" seLinuxMountContext="" Mar 11 00:54:03 crc kubenswrapper[4744]: I0311 00:54:03.922005 4744 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access" seLinuxMountContext="" Mar 11 00:54:03 crc kubenswrapper[4744]: I0311 00:54:03.922027 4744 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5b88f790-22fa-440e-b583-365168c0b23d" volumeName="kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn" seLinuxMountContext="" Mar 11 00:54:03 crc kubenswrapper[4744]: I0311 00:54:03.922048 4744 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh" seLinuxMountContext="" Mar 11 00:54:03 crc kubenswrapper[4744]: I0311 00:54:03.922079 4744 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates" seLinuxMountContext="" Mar 11 00:54:03 crc kubenswrapper[4744]: I0311 00:54:03.922101 4744 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert" seLinuxMountContext="" Mar 11 00:54:03 crc kubenswrapper[4744]: I0311 00:54:03.922122 4744 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert" seLinuxMountContext="" Mar 11 00:54:03 crc kubenswrapper[4744]: I0311 00:54:03.922272 4744 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" 
volumeName="kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert" seLinuxMountContext="" Mar 11 00:54:03 crc kubenswrapper[4744]: I0311 00:54:03.922319 4744 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6731426b-95fe-49ff-bb5f-40441049fde2" volumeName="kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls" seLinuxMountContext="" Mar 11 00:54:03 crc kubenswrapper[4744]: I0311 00:54:03.922339 4744 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls" seLinuxMountContext="" Mar 11 00:54:03 crc kubenswrapper[4744]: I0311 00:54:03.922390 4744 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert" seLinuxMountContext="" Mar 11 00:54:03 crc kubenswrapper[4744]: I0311 00:54:03.922463 4744 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca" seLinuxMountContext="" Mar 11 00:54:03 crc kubenswrapper[4744]: I0311 00:54:03.922497 4744 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz" seLinuxMountContext="" Mar 11 00:54:03 crc kubenswrapper[4744]: I0311 00:54:03.922619 4744 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" 
volumeName="kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle" seLinuxMountContext="" Mar 11 00:54:03 crc kubenswrapper[4744]: I0311 00:54:03.922640 4744 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images" seLinuxMountContext="" Mar 11 00:54:03 crc kubenswrapper[4744]: I0311 00:54:03.922718 4744 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca" seLinuxMountContext="" Mar 11 00:54:03 crc kubenswrapper[4744]: I0311 00:54:03.922774 4744 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config" seLinuxMountContext="" Mar 11 00:54:03 crc kubenswrapper[4744]: I0311 00:54:03.922802 4744 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert" seLinuxMountContext="" Mar 11 00:54:03 crc kubenswrapper[4744]: I0311 00:54:03.922832 4744 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d751cbb-f2e2-430d-9754-c882a5e924a5" volumeName="kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl" seLinuxMountContext="" Mar 11 00:54:03 crc kubenswrapper[4744]: I0311 00:54:03.922852 4744 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content" 
seLinuxMountContext="" Mar 11 00:54:03 crc kubenswrapper[4744]: I0311 00:54:03.922923 4744 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert" seLinuxMountContext="" Mar 11 00:54:03 crc kubenswrapper[4744]: I0311 00:54:03.922944 4744 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config" seLinuxMountContext="" Mar 11 00:54:03 crc kubenswrapper[4744]: I0311 00:54:03.922964 4744 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config" seLinuxMountContext="" Mar 11 00:54:03 crc kubenswrapper[4744]: I0311 00:54:03.922986 4744 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config" seLinuxMountContext="" Mar 11 00:54:03 crc kubenswrapper[4744]: I0311 00:54:03.923104 4744 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5b88f790-22fa-440e-b583-365168c0b23d" volumeName="kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs" seLinuxMountContext="" Mar 11 00:54:03 crc kubenswrapper[4744]: I0311 00:54:03.923130 4744 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls" seLinuxMountContext="" Mar 11 00:54:03 crc kubenswrapper[4744]: I0311 00:54:03.923184 4744 reconstruct.go:130] "Volume is marked as 
uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4" seLinuxMountContext="" Mar 11 00:54:03 crc kubenswrapper[4744]: I0311 00:54:03.923227 4744 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert" seLinuxMountContext="" Mar 11 00:54:03 crc kubenswrapper[4744]: I0311 00:54:03.923302 4744 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config" seLinuxMountContext="" Mar 11 00:54:03 crc kubenswrapper[4744]: I0311 00:54:03.923406 4744 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies" seLinuxMountContext="" Mar 11 00:54:03 crc kubenswrapper[4744]: I0311 00:54:03.923436 4744 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection" seLinuxMountContext="" Mar 11 00:54:03 crc kubenswrapper[4744]: I0311 00:54:03.923567 4744 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls" seLinuxMountContext="" Mar 11 00:54:03 crc kubenswrapper[4744]: I0311 00:54:03.923622 4744 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs" seLinuxMountContext="" Mar 11 00:54:03 crc kubenswrapper[4744]: I0311 00:54:03.923650 4744 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs" seLinuxMountContext="" Mar 11 00:54:03 crc kubenswrapper[4744]: I0311 00:54:03.923680 4744 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config" seLinuxMountContext="" Mar 11 00:54:03 crc kubenswrapper[4744]: I0311 00:54:03.923757 4744 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities" seLinuxMountContext="" Mar 11 00:54:03 crc kubenswrapper[4744]: I0311 00:54:03.923786 4744 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config" seLinuxMountContext="" Mar 11 00:54:03 crc kubenswrapper[4744]: I0311 00:54:03.923812 4744 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert" seLinuxMountContext="" Mar 11 00:54:03 crc kubenswrapper[4744]: I0311 00:54:03.923849 4744 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" 
volumeName="kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted" seLinuxMountContext="" Mar 11 00:54:03 crc kubenswrapper[4744]: I0311 00:54:03.923886 4744 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg" seLinuxMountContext="" Mar 11 00:54:03 crc kubenswrapper[4744]: I0311 00:54:03.923943 4744 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="20b0d48f-5fd6-431c-a545-e3c800c7b866" volumeName="kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert" seLinuxMountContext="" Mar 11 00:54:03 crc kubenswrapper[4744]: I0311 00:54:03.923970 4744 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle" seLinuxMountContext="" Mar 11 00:54:03 crc kubenswrapper[4744]: I0311 00:54:03.924026 4744 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert" seLinuxMountContext="" Mar 11 00:54:03 crc kubenswrapper[4744]: I0311 00:54:03.924060 4744 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle" seLinuxMountContext="" Mar 11 00:54:03 crc kubenswrapper[4744]: I0311 00:54:03.924097 4744 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" 
volumeName="kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7" seLinuxMountContext="" Mar 11 00:54:03 crc kubenswrapper[4744]: I0311 00:54:03.924124 4744 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert" seLinuxMountContext="" Mar 11 00:54:03 crc kubenswrapper[4744]: I0311 00:54:03.924168 4744 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config" seLinuxMountContext="" Mar 11 00:54:03 crc kubenswrapper[4744]: I0311 00:54:03.924219 4744 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls" seLinuxMountContext="" Mar 11 00:54:03 crc kubenswrapper[4744]: I0311 00:54:03.924696 4744 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca" seLinuxMountContext="" Mar 11 00:54:03 crc kubenswrapper[4744]: I0311 00:54:03.924717 4744 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca" seLinuxMountContext="" Mar 11 00:54:03 crc kubenswrapper[4744]: I0311 00:54:03.924739 4744 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp" 
seLinuxMountContext="" Mar 11 00:54:03 crc kubenswrapper[4744]: I0311 00:54:03.924760 4744 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf" seLinuxMountContext="" Mar 11 00:54:03 crc kubenswrapper[4744]: I0311 00:54:03.924813 4744 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" volumeName="kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca" seLinuxMountContext="" Mar 11 00:54:03 crc kubenswrapper[4744]: I0311 00:54:03.924837 4744 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session" seLinuxMountContext="" Mar 11 00:54:03 crc kubenswrapper[4744]: I0311 00:54:03.924864 4744 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config" seLinuxMountContext="" Mar 11 00:54:03 crc kubenswrapper[4744]: I0311 00:54:03.924891 4744 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6" seLinuxMountContext="" Mar 11 00:54:03 crc kubenswrapper[4744]: I0311 00:54:03.924931 4744 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies" seLinuxMountContext="" Mar 11 00:54:03 crc kubenswrapper[4744]: I0311 
00:54:03.925016 4744 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz" seLinuxMountContext="" Mar 11 00:54:03 crc kubenswrapper[4744]: I0311 00:54:03.925126 4744 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert" seLinuxMountContext="" Mar 11 00:54:03 crc kubenswrapper[4744]: I0311 00:54:03.925218 4744 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert" seLinuxMountContext="" Mar 11 00:54:03 crc kubenswrapper[4744]: I0311 00:54:03.925265 4744 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls" seLinuxMountContext="" Mar 11 00:54:03 crc kubenswrapper[4744]: I0311 00:54:03.925286 4744 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities" seLinuxMountContext="" Mar 11 00:54:03 crc kubenswrapper[4744]: I0311 00:54:03.925304 4744 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content" seLinuxMountContext="" Mar 11 00:54:03 crc kubenswrapper[4744]: I0311 00:54:03.925367 4744 reconstruct.go:130] "Volume is marked as uncertain and added into the actual 
state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c" seLinuxMountContext="" Mar 11 00:54:03 crc kubenswrapper[4744]: I0311 00:54:03.925462 4744 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config" seLinuxMountContext="" Mar 11 00:54:03 crc kubenswrapper[4744]: I0311 00:54:03.925485 4744 reconstruct.go:97] "Volume reconstruction finished" Mar 11 00:54:03 crc kubenswrapper[4744]: I0311 00:54:03.925499 4744 reconciler.go:26] "Reconciler: start to sync state" Mar 11 00:54:03 crc kubenswrapper[4744]: I0311 00:54:03.935490 4744 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 11 00:54:03 crc kubenswrapper[4744]: I0311 00:54:03.937666 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 00:54:03 crc kubenswrapper[4744]: I0311 00:54:03.937778 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 00:54:03 crc kubenswrapper[4744]: I0311 00:54:03.937799 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 00:54:03 crc kubenswrapper[4744]: I0311 00:54:03.939034 4744 cpu_manager.go:225] "Starting CPU manager" policy="none" Mar 11 00:54:03 crc kubenswrapper[4744]: I0311 00:54:03.939057 4744 cpu_manager.go:226] "Reconciling" reconcilePeriod="10s" Mar 11 00:54:03 crc kubenswrapper[4744]: I0311 00:54:03.939081 4744 state_mem.go:36] "Initialized new in-memory state store" Mar 11 00:54:03 crc kubenswrapper[4744]: I0311 00:54:03.951723 4744 policy_none.go:49] "None policy: Start" Mar 11 00:54:03 crc kubenswrapper[4744]: I0311 00:54:03.952936 4744 memory_manager.go:170] 
"Starting memorymanager" policy="None" Mar 11 00:54:03 crc kubenswrapper[4744]: I0311 00:54:03.952968 4744 state_mem.go:35] "Initializing new in-memory state store" Mar 11 00:54:03 crc kubenswrapper[4744]: I0311 00:54:03.968863 4744 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv4" Mar 11 00:54:03 crc kubenswrapper[4744]: I0311 00:54:03.973379 4744 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv6" Mar 11 00:54:03 crc kubenswrapper[4744]: I0311 00:54:03.973460 4744 status_manager.go:217] "Starting to sync pod status with apiserver" Mar 11 00:54:03 crc kubenswrapper[4744]: I0311 00:54:03.973501 4744 kubelet.go:2335] "Starting kubelet main sync loop" Mar 11 00:54:03 crc kubenswrapper[4744]: E0311 00:54:03.973585 4744 kubelet.go:2359] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]" Mar 11 00:54:03 crc kubenswrapper[4744]: W0311 00:54:03.974605 4744 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 38.102.83.58:6443: connect: connection refused Mar 11 00:54:03 crc kubenswrapper[4744]: E0311 00:54:03.974726 4744 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 38.102.83.58:6443: connect: connection refused" logger="UnhandledError" Mar 11 00:54:03 crc kubenswrapper[4744]: E0311 00:54:03.981824 4744 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 11 00:54:04 crc kubenswrapper[4744]: I0311 00:54:04.017302 4744 manager.go:334] "Starting Device Plugin manager" Mar 11 00:54:04 crc kubenswrapper[4744]: I0311 
00:54:04.017390 4744 manager.go:513] "Failed to read data from checkpoint" checkpoint="kubelet_internal_checkpoint" err="checkpoint is not found" Mar 11 00:54:04 crc kubenswrapper[4744]: I0311 00:54:04.017408 4744 server.go:79] "Starting device plugin registration server" Mar 11 00:54:04 crc kubenswrapper[4744]: I0311 00:54:04.017996 4744 eviction_manager.go:189] "Eviction manager: starting control loop" Mar 11 00:54:04 crc kubenswrapper[4744]: I0311 00:54:04.018019 4744 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Mar 11 00:54:04 crc kubenswrapper[4744]: I0311 00:54:04.018229 4744 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry" Mar 11 00:54:04 crc kubenswrapper[4744]: I0311 00:54:04.018592 4744 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts" Mar 11 00:54:04 crc kubenswrapper[4744]: I0311 00:54:04.018622 4744 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Mar 11 00:54:04 crc kubenswrapper[4744]: E0311 00:54:04.031904 4744 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Mar 11 00:54:04 crc kubenswrapper[4744]: I0311 00:54:04.074569 4744 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-apiserver/kube-apiserver-crc","openshift-kube-controller-manager/kube-controller-manager-crc","openshift-kube-scheduler/openshift-kube-scheduler-crc","openshift-machine-config-operator/kube-rbac-proxy-crio-crc","openshift-etcd/etcd-crc"] Mar 11 00:54:04 crc kubenswrapper[4744]: I0311 00:54:04.074790 4744 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 11 00:54:04 crc kubenswrapper[4744]: I0311 00:54:04.077164 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 00:54:04 crc kubenswrapper[4744]: I0311 00:54:04.077243 4744 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 00:54:04 crc kubenswrapper[4744]: I0311 00:54:04.077265 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 00:54:04 crc kubenswrapper[4744]: I0311 00:54:04.077578 4744 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 11 00:54:04 crc kubenswrapper[4744]: I0311 00:54:04.077912 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 11 00:54:04 crc kubenswrapper[4744]: I0311 00:54:04.078014 4744 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 11 00:54:04 crc kubenswrapper[4744]: I0311 00:54:04.079705 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 00:54:04 crc kubenswrapper[4744]: I0311 00:54:04.079745 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 00:54:04 crc kubenswrapper[4744]: I0311 00:54:04.079764 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 00:54:04 crc kubenswrapper[4744]: I0311 00:54:04.081650 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 00:54:04 crc kubenswrapper[4744]: I0311 00:54:04.081704 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 00:54:04 crc kubenswrapper[4744]: I0311 00:54:04.081802 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 00:54:04 crc kubenswrapper[4744]: I0311 00:54:04.082157 4744 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 11 
00:54:04 crc kubenswrapper[4744]: I0311 00:54:04.083191 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 11 00:54:04 crc kubenswrapper[4744]: I0311 00:54:04.083238 4744 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 11 00:54:04 crc kubenswrapper[4744]: E0311 00:54:04.084017 4744 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.58:6443: connect: connection refused" interval="400ms" Mar 11 00:54:04 crc kubenswrapper[4744]: I0311 00:54:04.086970 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 00:54:04 crc kubenswrapper[4744]: I0311 00:54:04.087037 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 00:54:04 crc kubenswrapper[4744]: I0311 00:54:04.087057 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 00:54:04 crc kubenswrapper[4744]: I0311 00:54:04.087829 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 00:54:04 crc kubenswrapper[4744]: I0311 00:54:04.087897 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 00:54:04 crc kubenswrapper[4744]: I0311 00:54:04.087920 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 00:54:04 crc kubenswrapper[4744]: I0311 00:54:04.088301 4744 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 11 00:54:04 crc kubenswrapper[4744]: I0311 00:54:04.089594 4744 util.go:30] "No sandbox for pod can be 
found. Need to start a new one" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Mar 11 00:54:04 crc kubenswrapper[4744]: I0311 00:54:04.089663 4744 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 11 00:54:04 crc kubenswrapper[4744]: I0311 00:54:04.092181 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 00:54:04 crc kubenswrapper[4744]: I0311 00:54:04.092255 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 00:54:04 crc kubenswrapper[4744]: I0311 00:54:04.092276 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 00:54:04 crc kubenswrapper[4744]: I0311 00:54:04.092189 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 00:54:04 crc kubenswrapper[4744]: I0311 00:54:04.092453 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 00:54:04 crc kubenswrapper[4744]: I0311 00:54:04.092478 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 00:54:04 crc kubenswrapper[4744]: I0311 00:54:04.092613 4744 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 11 00:54:04 crc kubenswrapper[4744]: I0311 00:54:04.093000 4744 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Mar 11 00:54:04 crc kubenswrapper[4744]: I0311 00:54:04.093053 4744 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 11 00:54:04 crc kubenswrapper[4744]: I0311 00:54:04.095641 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 00:54:04 crc kubenswrapper[4744]: I0311 00:54:04.095720 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 00:54:04 crc kubenswrapper[4744]: I0311 00:54:04.095754 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 00:54:04 crc kubenswrapper[4744]: I0311 00:54:04.095793 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 00:54:04 crc kubenswrapper[4744]: I0311 00:54:04.095830 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 00:54:04 crc kubenswrapper[4744]: I0311 00:54:04.095859 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 00:54:04 crc kubenswrapper[4744]: I0311 00:54:04.096139 4744 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-etcd/etcd-crc" Mar 11 00:54:04 crc kubenswrapper[4744]: I0311 00:54:04.096191 4744 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 11 00:54:04 crc kubenswrapper[4744]: I0311 00:54:04.097769 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 00:54:04 crc kubenswrapper[4744]: I0311 00:54:04.097838 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 00:54:04 crc kubenswrapper[4744]: I0311 00:54:04.097866 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 00:54:04 crc kubenswrapper[4744]: I0311 00:54:04.118153 4744 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 11 00:54:04 crc kubenswrapper[4744]: I0311 00:54:04.119368 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 00:54:04 crc kubenswrapper[4744]: I0311 00:54:04.119433 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 00:54:04 crc kubenswrapper[4744]: I0311 00:54:04.119456 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 00:54:04 crc kubenswrapper[4744]: I0311 00:54:04.119500 4744 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 11 00:54:04 crc kubenswrapper[4744]: E0311 00:54:04.120493 4744 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.58:6443: connect: connection refused" node="crc" Mar 11 00:54:04 crc kubenswrapper[4744]: I0311 00:54:04.127841 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" 
(UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 11 00:54:04 crc kubenswrapper[4744]: I0311 00:54:04.127906 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 11 00:54:04 crc kubenswrapper[4744]: I0311 00:54:04.127960 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 11 00:54:04 crc kubenswrapper[4744]: I0311 00:54:04.128013 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 11 00:54:04 crc kubenswrapper[4744]: I0311 00:54:04.128113 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 11 00:54:04 crc kubenswrapper[4744]: I0311 00:54:04.128171 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: 
\"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Mar 11 00:54:04 crc kubenswrapper[4744]: I0311 00:54:04.128270 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Mar 11 00:54:04 crc kubenswrapper[4744]: I0311 00:54:04.229937 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 11 00:54:04 crc kubenswrapper[4744]: I0311 00:54:04.230120 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 11 00:54:04 crc kubenswrapper[4744]: I0311 00:54:04.230203 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 11 00:54:04 crc kubenswrapper[4744]: I0311 00:54:04.230362 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod 
\"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 11 00:54:04 crc kubenswrapper[4744]: I0311 00:54:04.230451 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Mar 11 00:54:04 crc kubenswrapper[4744]: I0311 00:54:04.230463 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 11 00:54:04 crc kubenswrapper[4744]: I0311 00:54:04.230424 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 11 00:54:04 crc kubenswrapper[4744]: I0311 00:54:04.230576 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Mar 11 00:54:04 crc kubenswrapper[4744]: I0311 00:54:04.230500 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " 
pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Mar 11 00:54:04 crc kubenswrapper[4744]: I0311 00:54:04.230819 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 11 00:54:04 crc kubenswrapper[4744]: I0311 00:54:04.230897 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 11 00:54:04 crc kubenswrapper[4744]: I0311 00:54:04.231107 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 11 00:54:04 crc kubenswrapper[4744]: I0311 00:54:04.230934 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 11 00:54:04 crc kubenswrapper[4744]: I0311 00:54:04.231259 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 11 00:54:04 crc kubenswrapper[4744]: I0311 00:54:04.231291 4744 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 11 00:54:04 crc kubenswrapper[4744]: I0311 00:54:04.231388 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 11 00:54:04 crc kubenswrapper[4744]: I0311 00:54:04.231553 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 11 00:54:04 crc kubenswrapper[4744]: I0311 00:54:04.231663 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Mar 11 00:54:04 crc kubenswrapper[4744]: I0311 00:54:04.231721 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Mar 11 00:54:04 crc kubenswrapper[4744]: I0311 00:54:04.231740 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: 
\"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Mar 11 00:54:04 crc kubenswrapper[4744]: I0311 00:54:04.231666 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 11 00:54:04 crc kubenswrapper[4744]: I0311 00:54:04.231810 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 11 00:54:04 crc kubenswrapper[4744]: I0311 00:54:04.321401 4744 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 11 00:54:04 crc kubenswrapper[4744]: I0311 00:54:04.323155 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 00:54:04 crc kubenswrapper[4744]: I0311 00:54:04.323213 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 00:54:04 crc kubenswrapper[4744]: I0311 00:54:04.323231 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 00:54:04 crc kubenswrapper[4744]: I0311 00:54:04.323271 4744 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 11 00:54:04 crc kubenswrapper[4744]: E0311 00:54:04.323969 4744 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 
38.102.83.58:6443: connect: connection refused" node="crc" Mar 11 00:54:04 crc kubenswrapper[4744]: I0311 00:54:04.333422 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Mar 11 00:54:04 crc kubenswrapper[4744]: I0311 00:54:04.333485 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 11 00:54:04 crc kubenswrapper[4744]: I0311 00:54:04.333564 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 11 00:54:04 crc kubenswrapper[4744]: I0311 00:54:04.333611 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 11 00:54:04 crc kubenswrapper[4744]: I0311 00:54:04.333657 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Mar 11 00:54:04 crc kubenswrapper[4744]: I0311 00:54:04.333689 4744 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 11 00:54:04 crc kubenswrapper[4744]: I0311 00:54:04.333720 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 11 00:54:04 crc kubenswrapper[4744]: I0311 00:54:04.333747 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 11 00:54:04 crc kubenswrapper[4744]: I0311 00:54:04.334023 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Mar 11 00:54:04 crc kubenswrapper[4744]: I0311 00:54:04.334116 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 11 00:54:04 crc kubenswrapper[4744]: I0311 00:54:04.334166 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 11 00:54:04 crc kubenswrapper[4744]: I0311 00:54:04.334205 4744 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 11 00:54:04 crc kubenswrapper[4744]: I0311 00:54:04.334240 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Mar 11 00:54:04 crc kubenswrapper[4744]: I0311 00:54:04.334279 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 11 00:54:04 crc kubenswrapper[4744]: I0311 00:54:04.334318 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 11 00:54:04 crc kubenswrapper[4744]: I0311 00:54:04.334350 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 11 00:54:04 crc kubenswrapper[4744]: I0311 00:54:04.429468 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 11 00:54:04 crc kubenswrapper[4744]: I0311 00:54:04.438238 4744 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 11 00:54:04 crc kubenswrapper[4744]: I0311 00:54:04.463399 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Mar 11 00:54:04 crc kubenswrapper[4744]: E0311 00:54:04.485400 4744 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.58:6443: connect: connection refused" interval="800ms" Mar 11 00:54:04 crc kubenswrapper[4744]: I0311 00:54:04.487751 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Mar 11 00:54:04 crc kubenswrapper[4744]: W0311 00:54:04.493002 4744 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3dcd261975c3d6b9a6ad6367fd4facd3.slice/crio-e49d1401d774992aee490db7d0ae11e353ed41dc2de6322c41af7306d5d436c9 WatchSource:0}: Error finding container e49d1401d774992aee490db7d0ae11e353ed41dc2de6322c41af7306d5d436c9: Status 404 returned error can't find the container with id e49d1401d774992aee490db7d0ae11e353ed41dc2de6322c41af7306d5d436c9 Mar 11 00:54:04 crc kubenswrapper[4744]: I0311 00:54:04.496017 4744 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-etcd/etcd-crc" Mar 11 00:54:04 crc kubenswrapper[4744]: W0311 00:54:04.509011 4744 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd1b160f5dda77d281dd8e69ec8d817f9.slice/crio-23970c2abd3f2029362e1f07dc9d7b375cbc32ef9ebb7256b0b0d8554d318052 WatchSource:0}: Error finding container 23970c2abd3f2029362e1f07dc9d7b375cbc32ef9ebb7256b0b0d8554d318052: Status 404 returned error can't find the container with id 23970c2abd3f2029362e1f07dc9d7b375cbc32ef9ebb7256b0b0d8554d318052 Mar 11 00:54:04 crc kubenswrapper[4744]: W0311 00:54:04.520323 4744 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2139d3e2895fc6797b9c76a1b4c9886d.slice/crio-e72abbe4e437890959e91c80e8ac7838b739ebfebbb2a01f96e93e423a80f191 WatchSource:0}: Error finding container e72abbe4e437890959e91c80e8ac7838b739ebfebbb2a01f96e93e423a80f191: Status 404 returned error can't find the container with id e72abbe4e437890959e91c80e8ac7838b739ebfebbb2a01f96e93e423a80f191 Mar 11 00:54:04 crc kubenswrapper[4744]: I0311 00:54:04.725130 4744 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 11 00:54:04 crc kubenswrapper[4744]: I0311 00:54:04.727020 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 00:54:04 crc kubenswrapper[4744]: I0311 00:54:04.727063 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 00:54:04 crc kubenswrapper[4744]: I0311 00:54:04.727077 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 00:54:04 crc kubenswrapper[4744]: I0311 00:54:04.727114 4744 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 11 00:54:04 crc kubenswrapper[4744]: E0311 
00:54:04.728058 4744 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.58:6443: connect: connection refused" node="crc" Mar 11 00:54:04 crc kubenswrapper[4744]: I0311 00:54:04.876958 4744 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.58:6443: connect: connection refused Mar 11 00:54:04 crc kubenswrapper[4744]: W0311 00:54:04.908208 4744 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": dial tcp 38.102.83.58:6443: connect: connection refused Mar 11 00:54:04 crc kubenswrapper[4744]: E0311 00:54:04.908364 4744 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": dial tcp 38.102.83.58:6443: connect: connection refused" logger="UnhandledError" Mar 11 00:54:04 crc kubenswrapper[4744]: W0311 00:54:04.947881 4744 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 38.102.83.58:6443: connect: connection refused Mar 11 00:54:04 crc kubenswrapper[4744]: E0311 00:54:04.948033 4744 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 38.102.83.58:6443: connect: connection refused" logger="UnhandledError" Mar 11 00:54:04 crc 
kubenswrapper[4744]: I0311 00:54:04.981117 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"e49d1401d774992aee490db7d0ae11e353ed41dc2de6322c41af7306d5d436c9"} Mar 11 00:54:04 crc kubenswrapper[4744]: I0311 00:54:04.983353 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"610574bc482a05381dfa39377b0a43855870c93e1ce4c08582d0397de022d14a"} Mar 11 00:54:04 crc kubenswrapper[4744]: I0311 00:54:04.984994 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"77334561746ecc80b1e7e25cd2e22a819f37b9a6941da8121e04e30f325cd8cc"} Mar 11 00:54:04 crc kubenswrapper[4744]: I0311 00:54:04.986508 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"e72abbe4e437890959e91c80e8ac7838b739ebfebbb2a01f96e93e423a80f191"} Mar 11 00:54:04 crc kubenswrapper[4744]: I0311 00:54:04.987827 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerStarted","Data":"23970c2abd3f2029362e1f07dc9d7b375cbc32ef9ebb7256b0b0d8554d318052"} Mar 11 00:54:05 crc kubenswrapper[4744]: E0311 00:54:05.286405 4744 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.58:6443: connect: connection refused" interval="1.6s" Mar 11 00:54:05 crc kubenswrapper[4744]: W0311 00:54:05.346721 4744 reflector.go:561] 
k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 38.102.83.58:6443: connect: connection refused Mar 11 00:54:05 crc kubenswrapper[4744]: E0311 00:54:05.346803 4744 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 38.102.83.58:6443: connect: connection refused" logger="UnhandledError" Mar 11 00:54:05 crc kubenswrapper[4744]: W0311 00:54:05.403624 4744 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 38.102.83.58:6443: connect: connection refused Mar 11 00:54:05 crc kubenswrapper[4744]: E0311 00:54:05.403738 4744 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 38.102.83.58:6443: connect: connection refused" logger="UnhandledError" Mar 11 00:54:05 crc kubenswrapper[4744]: I0311 00:54:05.528613 4744 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 11 00:54:05 crc kubenswrapper[4744]: I0311 00:54:05.531542 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 00:54:05 crc kubenswrapper[4744]: I0311 00:54:05.531612 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 00:54:05 crc kubenswrapper[4744]: I0311 00:54:05.531632 4744 kubelet_node_status.go:724] 
"Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 00:54:05 crc kubenswrapper[4744]: I0311 00:54:05.531678 4744 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 11 00:54:05 crc kubenswrapper[4744]: E0311 00:54:05.532307 4744 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.58:6443: connect: connection refused" node="crc" Mar 11 00:54:05 crc kubenswrapper[4744]: I0311 00:54:05.847727 4744 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates Mar 11 00:54:05 crc kubenswrapper[4744]: E0311 00:54:05.849153 4744 certificate_manager.go:562] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post \"https://api-int.crc.testing:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": dial tcp 38.102.83.58:6443: connect: connection refused" logger="UnhandledError" Mar 11 00:54:05 crc kubenswrapper[4744]: I0311 00:54:05.877645 4744 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.58:6443: connect: connection refused Mar 11 00:54:05 crc kubenswrapper[4744]: I0311 00:54:05.993945 4744 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="edfc2acc25990d1ccbca879e186fb9dfc0a6664fa1df5cee62958ff52b217748" exitCode=0 Mar 11 00:54:05 crc kubenswrapper[4744]: I0311 00:54:05.994110 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"edfc2acc25990d1ccbca879e186fb9dfc0a6664fa1df5cee62958ff52b217748"} Mar 11 00:54:05 crc 
kubenswrapper[4744]: I0311 00:54:05.994141 4744 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 11 00:54:05 crc kubenswrapper[4744]: I0311 00:54:05.995580 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 00:54:05 crc kubenswrapper[4744]: I0311 00:54:05.995616 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 00:54:05 crc kubenswrapper[4744]: I0311 00:54:05.995633 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 00:54:05 crc kubenswrapper[4744]: I0311 00:54:05.997664 4744 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="28d9226ae39a5ef12c0715e064e540ee7f14b855833284b8a5af039f2c663a36" exitCode=0 Mar 11 00:54:05 crc kubenswrapper[4744]: I0311 00:54:05.997756 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"28d9226ae39a5ef12c0715e064e540ee7f14b855833284b8a5af039f2c663a36"} Mar 11 00:54:05 crc kubenswrapper[4744]: I0311 00:54:05.997847 4744 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 11 00:54:05 crc kubenswrapper[4744]: I0311 00:54:05.997883 4744 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 11 00:54:05 crc kubenswrapper[4744]: I0311 00:54:05.999642 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 00:54:05 crc kubenswrapper[4744]: I0311 00:54:05.999700 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 00:54:05 crc kubenswrapper[4744]: I0311 00:54:05.999722 4744 kubelet_node_status.go:724] "Recording event 
message for node" node="crc" event="NodeHasSufficientPID" Mar 11 00:54:06 crc kubenswrapper[4744]: I0311 00:54:06.000153 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 00:54:06 crc kubenswrapper[4744]: I0311 00:54:06.000222 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 00:54:06 crc kubenswrapper[4744]: I0311 00:54:06.000244 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 00:54:06 crc kubenswrapper[4744]: I0311 00:54:06.000811 4744 generic.go:334] "Generic (PLEG): container finished" podID="d1b160f5dda77d281dd8e69ec8d817f9" containerID="779c97b306c3647569a5bade5ab4a8ca49f1fa84ded5fa830235ab1d4a039038" exitCode=0 Mar 11 00:54:06 crc kubenswrapper[4744]: I0311 00:54:06.000924 4744 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 11 00:54:06 crc kubenswrapper[4744]: I0311 00:54:06.000929 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerDied","Data":"779c97b306c3647569a5bade5ab4a8ca49f1fa84ded5fa830235ab1d4a039038"} Mar 11 00:54:06 crc kubenswrapper[4744]: I0311 00:54:06.002552 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 00:54:06 crc kubenswrapper[4744]: I0311 00:54:06.002615 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 00:54:06 crc kubenswrapper[4744]: I0311 00:54:06.002635 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 00:54:06 crc kubenswrapper[4744]: I0311 00:54:06.004813 4744 generic.go:334] "Generic (PLEG): container finished" podID="3dcd261975c3d6b9a6ad6367fd4facd3" 
containerID="9690b251bd56f0d091b5ae52e71e94528ba6afd0e61919398662fdadd4a9f89f" exitCode=0 Mar 11 00:54:06 crc kubenswrapper[4744]: I0311 00:54:06.004863 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerDied","Data":"9690b251bd56f0d091b5ae52e71e94528ba6afd0e61919398662fdadd4a9f89f"} Mar 11 00:54:06 crc kubenswrapper[4744]: I0311 00:54:06.005028 4744 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 11 00:54:06 crc kubenswrapper[4744]: I0311 00:54:06.006669 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 00:54:06 crc kubenswrapper[4744]: I0311 00:54:06.006730 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 00:54:06 crc kubenswrapper[4744]: I0311 00:54:06.006757 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 00:54:06 crc kubenswrapper[4744]: I0311 00:54:06.009005 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"82831ac0a065cf2ea80ca05df203fc6de89ad279faae95364ab2519b689453d8"} Mar 11 00:54:06 crc kubenswrapper[4744]: I0311 00:54:06.009073 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"832be2ac0e5b4b69727f022a0663974a495e008d3a781fa4d237d3347d39c154"} Mar 11 00:54:06 crc kubenswrapper[4744]: I0311 00:54:06.009101 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" 
event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"eb1e3ef7609aa323e622249594eb4ac5cd5cacfa496d8189b3f612e29cb11773"} Mar 11 00:54:06 crc kubenswrapper[4744]: I0311 00:54:06.878159 4744 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.58:6443: connect: connection refused Mar 11 00:54:06 crc kubenswrapper[4744]: E0311 00:54:06.887959 4744 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.58:6443: connect: connection refused" interval="3.2s" Mar 11 00:54:07 crc kubenswrapper[4744]: I0311 00:54:07.015909 4744 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="fd9cf6603d67dc75a034fdbdfa82ce4551e42b5ef1c535f8d42c4cb3bc7b3d30" exitCode=0 Mar 11 00:54:07 crc kubenswrapper[4744]: I0311 00:54:07.015993 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"fd9cf6603d67dc75a034fdbdfa82ce4551e42b5ef1c535f8d42c4cb3bc7b3d30"} Mar 11 00:54:07 crc kubenswrapper[4744]: I0311 00:54:07.016140 4744 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 11 00:54:07 crc kubenswrapper[4744]: I0311 00:54:07.018526 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 00:54:07 crc kubenswrapper[4744]: I0311 00:54:07.018564 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 00:54:07 crc kubenswrapper[4744]: I0311 00:54:07.018576 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientPID" Mar 11 00:54:07 crc kubenswrapper[4744]: I0311 00:54:07.019735 4744 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 11 00:54:07 crc kubenswrapper[4744]: I0311 00:54:07.019732 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerStarted","Data":"e3db74ff70ae81764297d0f56ae2c56f33d40d9e0025aacba1f56045eae524b5"} Mar 11 00:54:07 crc kubenswrapper[4744]: I0311 00:54:07.020689 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 00:54:07 crc kubenswrapper[4744]: I0311 00:54:07.020733 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 00:54:07 crc kubenswrapper[4744]: I0311 00:54:07.020755 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 00:54:07 crc kubenswrapper[4744]: I0311 00:54:07.024758 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"01519860f21aa252a579dd461cd32233ed13bfb46d851c1c19410e2632ad4389"} Mar 11 00:54:07 crc kubenswrapper[4744]: I0311 00:54:07.035876 4744 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 11 00:54:07 crc kubenswrapper[4744]: I0311 00:54:07.035893 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"ca4d36508e362c9b7f7365ade7dda3dec0c70f9df1be158a034aeaaa772c797e"} Mar 11 00:54:07 crc kubenswrapper[4744]: I0311 00:54:07.038268 4744 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeHasSufficientMemory" Mar 11 00:54:07 crc kubenswrapper[4744]: I0311 00:54:07.038312 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 00:54:07 crc kubenswrapper[4744]: I0311 00:54:07.038325 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 00:54:07 crc kubenswrapper[4744]: I0311 00:54:07.041637 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"31713fedc28cf71b8df25e22000768f5d00e6de1b228a632aa3c51aea11eea7c"} Mar 11 00:54:07 crc kubenswrapper[4744]: I0311 00:54:07.132918 4744 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 11 00:54:07 crc kubenswrapper[4744]: I0311 00:54:07.135507 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 00:54:07 crc kubenswrapper[4744]: I0311 00:54:07.135614 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 00:54:07 crc kubenswrapper[4744]: I0311 00:54:07.135633 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 00:54:07 crc kubenswrapper[4744]: I0311 00:54:07.135921 4744 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 11 00:54:07 crc kubenswrapper[4744]: E0311 00:54:07.136781 4744 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.58:6443: connect: connection refused" node="crc" Mar 11 00:54:07 crc kubenswrapper[4744]: W0311 00:54:07.309743 4744 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get 
"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": dial tcp 38.102.83.58:6443: connect: connection refused Mar 11 00:54:07 crc kubenswrapper[4744]: E0311 00:54:07.309839 4744 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": dial tcp 38.102.83.58:6443: connect: connection refused" logger="UnhandledError" Mar 11 00:54:07 crc kubenswrapper[4744]: W0311 00:54:07.400954 4744 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 38.102.83.58:6443: connect: connection refused Mar 11 00:54:07 crc kubenswrapper[4744]: E0311 00:54:07.401065 4744 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 38.102.83.58:6443: connect: connection refused" logger="UnhandledError" Mar 11 00:54:07 crc kubenswrapper[4744]: W0311 00:54:07.810657 4744 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 38.102.83.58:6443: connect: connection refused Mar 11 00:54:07 crc kubenswrapper[4744]: E0311 00:54:07.810802 4744 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 38.102.83.58:6443: 
connect: connection refused" logger="UnhandledError" Mar 11 00:54:07 crc kubenswrapper[4744]: W0311 00:54:07.837061 4744 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 38.102.83.58:6443: connect: connection refused Mar 11 00:54:07 crc kubenswrapper[4744]: E0311 00:54:07.837155 4744 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 38.102.83.58:6443: connect: connection refused" logger="UnhandledError" Mar 11 00:54:07 crc kubenswrapper[4744]: I0311 00:54:07.877177 4744 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.58:6443: connect: connection refused Mar 11 00:54:08 crc kubenswrapper[4744]: I0311 00:54:08.049835 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"7fba0fd93d322bd73e9140c40efe4dd1469525a5d6d26a0d1e32ba8e0920a2a4"} Mar 11 00:54:08 crc kubenswrapper[4744]: I0311 00:54:08.049922 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"0b3b81c668c0cd9ccdc04d171476dbbbe2bca625de84ad8aff48be69bbaa46b0"} Mar 11 00:54:08 crc kubenswrapper[4744]: I0311 00:54:08.049974 4744 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 11 00:54:08 crc kubenswrapper[4744]: I0311 00:54:08.051759 4744 kubelet_node_status.go:724] "Recording event 
message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 00:54:08 crc kubenswrapper[4744]: I0311 00:54:08.051847 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 00:54:08 crc kubenswrapper[4744]: I0311 00:54:08.051877 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 00:54:08 crc kubenswrapper[4744]: I0311 00:54:08.057971 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"5fa88cf531264ffcd2ba649948dcf77221d18e7e94bb7e8e039b4c9f197d464e"} Mar 11 00:54:08 crc kubenswrapper[4744]: I0311 00:54:08.058028 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"c9f3a6d71079f1717c07e0b026f34d1bfd6391a5bb6745bcdab535d5290e2aa3"} Mar 11 00:54:08 crc kubenswrapper[4744]: I0311 00:54:08.058050 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"e702350d7b926ec4df6853a20470ea7515f1e1c98720db1a4d672ea4e03e82fa"} Mar 11 00:54:08 crc kubenswrapper[4744]: I0311 00:54:08.058069 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"2d93a4815010e9ee724175731b03bf82804921c5bbb9fe8be236057f90057f28"} Mar 11 00:54:08 crc kubenswrapper[4744]: I0311 00:54:08.058141 4744 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 11 00:54:08 crc kubenswrapper[4744]: I0311 00:54:08.059464 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientMemory" Mar 11 00:54:08 crc kubenswrapper[4744]: I0311 00:54:08.059566 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 00:54:08 crc kubenswrapper[4744]: I0311 00:54:08.059593 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 00:54:08 crc kubenswrapper[4744]: I0311 00:54:08.064396 4744 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="bccf1928ef633d8209fbff4bb71b38f77361ca43e852c5cfa8529a4b4031e245" exitCode=0 Mar 11 00:54:08 crc kubenswrapper[4744]: I0311 00:54:08.064496 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"bccf1928ef633d8209fbff4bb71b38f77361ca43e852c5cfa8529a4b4031e245"} Mar 11 00:54:08 crc kubenswrapper[4744]: I0311 00:54:08.064593 4744 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 11 00:54:08 crc kubenswrapper[4744]: I0311 00:54:08.064662 4744 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 11 00:54:08 crc kubenswrapper[4744]: I0311 00:54:08.064664 4744 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 11 00:54:08 crc kubenswrapper[4744]: I0311 00:54:08.065939 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 00:54:08 crc kubenswrapper[4744]: I0311 00:54:08.065993 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 00:54:08 crc kubenswrapper[4744]: I0311 00:54:08.066014 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 00:54:08 crc kubenswrapper[4744]: I0311 00:54:08.067504 4744 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 00:54:08 crc kubenswrapper[4744]: I0311 00:54:08.067592 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 00:54:08 crc kubenswrapper[4744]: I0311 00:54:08.067613 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 00:54:08 crc kubenswrapper[4744]: I0311 00:54:08.067606 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 00:54:08 crc kubenswrapper[4744]: I0311 00:54:08.067669 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 00:54:08 crc kubenswrapper[4744]: I0311 00:54:08.067686 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 00:54:08 crc kubenswrapper[4744]: I0311 00:54:08.955930 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Mar 11 00:54:09 crc kubenswrapper[4744]: I0311 00:54:09.080135 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"852392ca4fef569658b15e593c3c30b1079a7f59b765ca2d9fc658c42249586a"} Mar 11 00:54:09 crc kubenswrapper[4744]: I0311 00:54:09.080245 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"9c1cfbe4f99dbf1bd16e0cd6732534be630dc6127842dcbf6ee0efd6d3b9d673"} Mar 11 00:54:09 crc kubenswrapper[4744]: I0311 00:54:09.080255 4744 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Mar 11 00:54:09 crc kubenswrapper[4744]: I0311 00:54:09.080317 4744 kubelet_node_status.go:401] "Setting 
node annotation to enable volume controller attach/detach" Mar 11 00:54:09 crc kubenswrapper[4744]: I0311 00:54:09.080271 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"cdb483b36dc9e6541351c051052844f192758b53d5ef7363a0b924be57ce9ca0"} Mar 11 00:54:09 crc kubenswrapper[4744]: I0311 00:54:09.080363 4744 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 11 00:54:09 crc kubenswrapper[4744]: I0311 00:54:09.087012 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 00:54:09 crc kubenswrapper[4744]: I0311 00:54:09.087191 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 00:54:09 crc kubenswrapper[4744]: I0311 00:54:09.087233 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 00:54:09 crc kubenswrapper[4744]: I0311 00:54:09.087720 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 00:54:09 crc kubenswrapper[4744]: I0311 00:54:09.087777 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 00:54:09 crc kubenswrapper[4744]: I0311 00:54:09.087805 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 00:54:09 crc kubenswrapper[4744]: I0311 00:54:09.459922 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 11 00:54:10 crc kubenswrapper[4744]: I0311 00:54:10.091882 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" 
event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"eba7e9bfb431d0fee5860ee8602254d4c66bac6529691d8f2639598cf30509d9"} Mar 11 00:54:10 crc kubenswrapper[4744]: I0311 00:54:10.091963 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"6ee528bcd7bebc942b3ac8d59654402bb3629e259d6ba425f763feb6103f0255"} Mar 11 00:54:10 crc kubenswrapper[4744]: I0311 00:54:10.091989 4744 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Mar 11 00:54:10 crc kubenswrapper[4744]: I0311 00:54:10.092066 4744 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 11 00:54:10 crc kubenswrapper[4744]: I0311 00:54:10.092154 4744 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 11 00:54:10 crc kubenswrapper[4744]: I0311 00:54:10.093218 4744 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 11 00:54:10 crc kubenswrapper[4744]: I0311 00:54:10.093684 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 00:54:10 crc kubenswrapper[4744]: I0311 00:54:10.093732 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 00:54:10 crc kubenswrapper[4744]: I0311 00:54:10.093751 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 00:54:10 crc kubenswrapper[4744]: I0311 00:54:10.094197 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 00:54:10 crc kubenswrapper[4744]: I0311 00:54:10.094268 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 00:54:10 crc kubenswrapper[4744]: I0311 
00:54:10.094288 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 00:54:10 crc kubenswrapper[4744]: I0311 00:54:10.094856 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 00:54:10 crc kubenswrapper[4744]: I0311 00:54:10.094910 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 00:54:10 crc kubenswrapper[4744]: I0311 00:54:10.094933 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 00:54:10 crc kubenswrapper[4744]: I0311 00:54:10.179176 4744 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates Mar 11 00:54:10 crc kubenswrapper[4744]: I0311 00:54:10.337183 4744 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 11 00:54:10 crc kubenswrapper[4744]: I0311 00:54:10.339268 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 00:54:10 crc kubenswrapper[4744]: I0311 00:54:10.339327 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 00:54:10 crc kubenswrapper[4744]: I0311 00:54:10.339346 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 00:54:10 crc kubenswrapper[4744]: I0311 00:54:10.339389 4744 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 11 00:54:10 crc kubenswrapper[4744]: I0311 00:54:10.457816 4744 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 11 00:54:11 crc kubenswrapper[4744]: I0311 00:54:11.095934 4744 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Mar 11 00:54:11 crc kubenswrapper[4744]: I0311 
00:54:11.096030 4744 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 11 00:54:11 crc kubenswrapper[4744]: I0311 00:54:11.096061 4744 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 11 00:54:11 crc kubenswrapper[4744]: I0311 00:54:11.097751 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 00:54:11 crc kubenswrapper[4744]: I0311 00:54:11.097805 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 00:54:11 crc kubenswrapper[4744]: I0311 00:54:11.097825 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 00:54:11 crc kubenswrapper[4744]: I0311 00:54:11.098340 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 00:54:11 crc kubenswrapper[4744]: I0311 00:54:11.098419 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 00:54:11 crc kubenswrapper[4744]: I0311 00:54:11.098446 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 00:54:12 crc kubenswrapper[4744]: I0311 00:54:12.652229 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 11 00:54:12 crc kubenswrapper[4744]: I0311 00:54:12.652557 4744 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 11 00:54:12 crc kubenswrapper[4744]: I0311 00:54:12.654381 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 00:54:12 crc kubenswrapper[4744]: I0311 00:54:12.654440 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Mar 11 00:54:12 crc kubenswrapper[4744]: I0311 00:54:12.654467 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 00:54:12 crc kubenswrapper[4744]: I0311 00:54:12.849020 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-etcd/etcd-crc" Mar 11 00:54:12 crc kubenswrapper[4744]: I0311 00:54:12.849247 4744 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 11 00:54:12 crc kubenswrapper[4744]: I0311 00:54:12.851153 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 00:54:12 crc kubenswrapper[4744]: I0311 00:54:12.851212 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 00:54:12 crc kubenswrapper[4744]: I0311 00:54:12.851235 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 00:54:14 crc kubenswrapper[4744]: E0311 00:54:14.036001 4744 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Mar 11 00:54:14 crc kubenswrapper[4744]: I0311 00:54:14.131857 4744 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 11 00:54:14 crc kubenswrapper[4744]: I0311 00:54:14.132185 4744 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 11 00:54:14 crc kubenswrapper[4744]: I0311 00:54:14.133897 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 00:54:14 crc kubenswrapper[4744]: I0311 00:54:14.134243 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 00:54:14 crc kubenswrapper[4744]: 
I0311 00:54:14.134273 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 00:54:14 crc kubenswrapper[4744]: I0311 00:54:14.139784 4744 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 11 00:54:14 crc kubenswrapper[4744]: I0311 00:54:14.878463 4744 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 11 00:54:15 crc kubenswrapper[4744]: I0311 00:54:15.111137 4744 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 11 00:54:15 crc kubenswrapper[4744]: I0311 00:54:15.111470 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 11 00:54:15 crc kubenswrapper[4744]: I0311 00:54:15.113077 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 00:54:15 crc kubenswrapper[4744]: I0311 00:54:15.113140 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 00:54:15 crc kubenswrapper[4744]: I0311 00:54:15.113162 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 00:54:15 crc kubenswrapper[4744]: I0311 00:54:15.116694 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 11 00:54:16 crc kubenswrapper[4744]: I0311 00:54:16.115048 4744 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 11 00:54:16 crc kubenswrapper[4744]: I0311 00:54:16.116725 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 00:54:16 crc 
kubenswrapper[4744]: I0311 00:54:16.116798 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 00:54:16 crc kubenswrapper[4744]: I0311 00:54:16.116820 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 00:54:16 crc kubenswrapper[4744]: I0311 00:54:16.813375 4744 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-etcd/etcd-crc" Mar 11 00:54:16 crc kubenswrapper[4744]: I0311 00:54:16.813661 4744 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 11 00:54:16 crc kubenswrapper[4744]: I0311 00:54:16.815356 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 00:54:16 crc kubenswrapper[4744]: I0311 00:54:16.815404 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 00:54:16 crc kubenswrapper[4744]: I0311 00:54:16.815423 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 00:54:16 crc kubenswrapper[4744]: I0311 00:54:16.967874 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 11 00:54:17 crc kubenswrapper[4744]: I0311 00:54:17.118256 4744 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 11 00:54:17 crc kubenswrapper[4744]: I0311 00:54:17.120227 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 00:54:17 crc kubenswrapper[4744]: I0311 00:54:17.120353 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 00:54:17 crc kubenswrapper[4744]: I0311 00:54:17.120381 4744 kubelet_node_status.go:724] "Recording event 
message for node" node="crc" event="NodeHasSufficientPID" Mar 11 00:54:17 crc kubenswrapper[4744]: I0311 00:54:17.878645 4744 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 11 00:54:17 crc kubenswrapper[4744]: I0311 00:54:17.878791 4744 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 11 00:54:18 crc kubenswrapper[4744]: I0311 00:54:18.121261 4744 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 11 00:54:18 crc kubenswrapper[4744]: I0311 00:54:18.122881 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 00:54:18 crc kubenswrapper[4744]: I0311 00:54:18.122950 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 00:54:18 crc kubenswrapper[4744]: I0311 00:54:18.122974 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 00:54:18 crc kubenswrapper[4744]: E0311 00:54:18.323481 4744 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/default/events\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T00:54:18Z is after 2026-02-23T05:33:13Z" 
event="&Event{ObjectMeta:{crc.189ba3505e2c8f55 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-11 00:54:03.873111893 +0000 UTC m=+0.677329538,LastTimestamp:2026-03-11 00:54:03.873111893 +0000 UTC m=+0.677329538,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 11 00:54:18 crc kubenswrapper[4744]: W0311 00:54:18.327662 4744 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T00:54:18Z is after 2026-02-23T05:33:13Z Mar 11 00:54:18 crc kubenswrapper[4744]: E0311 00:54:18.327801 4744 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T00:54:18Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Mar 11 00:54:18 crc kubenswrapper[4744]: W0311 00:54:18.330919 4744 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T00:54:18Z is after 2026-02-23T05:33:13Z Mar 11 00:54:18 crc kubenswrapper[4744]: E0311 00:54:18.331017 4744 
reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T00:54:18Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Mar 11 00:54:18 crc kubenswrapper[4744]: E0311 00:54:18.334117 4744 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T00:54:18Z is after 2026-02-23T05:33:13Z" interval="6.4s" Mar 11 00:54:18 crc kubenswrapper[4744]: W0311 00:54:18.334891 4744 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T00:54:18Z is after 2026-02-23T05:33:13Z Mar 11 00:54:18 crc kubenswrapper[4744]: E0311 00:54:18.334962 4744 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T00:54:18Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Mar 11 00:54:18 crc kubenswrapper[4744]: E0311 00:54:18.341679 4744 certificate_manager.go:562] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: 
cannot create certificate signing request: Post \"https://api-int.crc.testing:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T00:54:18Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Mar 11 00:54:18 crc kubenswrapper[4744]: E0311 00:54:18.347161 4744 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T00:54:18Z is after 2026-02-23T05:33:13Z" node="crc" Mar 11 00:54:18 crc kubenswrapper[4744]: I0311 00:54:18.357786 4744 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver namespace/openshift-kube-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 403" start-of-body={"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/livez\"","reason":"Forbidden","details":{},"code":403} Mar 11 00:54:18 crc kubenswrapper[4744]: I0311 00:54:18.357904 4744 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 403" Mar 11 00:54:18 crc kubenswrapper[4744]: W0311 00:54:18.358558 4744 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T00:54:18Z is after 2026-02-23T05:33:13Z Mar 11 00:54:18 crc kubenswrapper[4744]: E0311 00:54:18.358668 4744 reflector.go:158] "Unhandled Error" 
err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T00:54:18Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Mar 11 00:54:18 crc kubenswrapper[4744]: I0311 00:54:18.358959 4744 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T00:54:18Z is after 2026-02-23T05:33:13Z Mar 11 00:54:18 crc kubenswrapper[4744]: I0311 00:54:18.363042 4744 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver namespace/openshift-kube-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 403" start-of-body={"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/livez\"","reason":"Forbidden","details":{},"code":403} Mar 11 00:54:18 crc kubenswrapper[4744]: I0311 00:54:18.363094 4744 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 403" Mar 11 00:54:18 crc kubenswrapper[4744]: I0311 00:54:18.888304 4744 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T00:54:18Z is after 2026-02-23T05:33:13Z Mar 11 00:54:19 crc 
kubenswrapper[4744]: I0311 00:54:19.126796 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log" Mar 11 00:54:19 crc kubenswrapper[4744]: I0311 00:54:19.129297 4744 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="5fa88cf531264ffcd2ba649948dcf77221d18e7e94bb7e8e039b4c9f197d464e" exitCode=255 Mar 11 00:54:19 crc kubenswrapper[4744]: I0311 00:54:19.129381 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"5fa88cf531264ffcd2ba649948dcf77221d18e7e94bb7e8e039b4c9f197d464e"} Mar 11 00:54:19 crc kubenswrapper[4744]: I0311 00:54:19.148978 4744 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 11 00:54:19 crc kubenswrapper[4744]: I0311 00:54:19.157053 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 00:54:19 crc kubenswrapper[4744]: I0311 00:54:19.157132 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 00:54:19 crc kubenswrapper[4744]: I0311 00:54:19.157158 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 00:54:19 crc kubenswrapper[4744]: I0311 00:54:19.158260 4744 scope.go:117] "RemoveContainer" containerID="5fa88cf531264ffcd2ba649948dcf77221d18e7e94bb7e8e039b4c9f197d464e" Mar 11 00:54:19 crc kubenswrapper[4744]: I0311 00:54:19.883490 4744 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 
2026-03-11T00:54:19Z is after 2026-02-23T05:33:13Z Mar 11 00:54:20 crc kubenswrapper[4744]: I0311 00:54:20.134055 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log" Mar 11 00:54:20 crc kubenswrapper[4744]: I0311 00:54:20.137995 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"da09dec387a8345501d5e6a881acff04af465daa09ab4bf8ceafc6d345684e1f"} Mar 11 00:54:20 crc kubenswrapper[4744]: I0311 00:54:20.138280 4744 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 11 00:54:20 crc kubenswrapper[4744]: I0311 00:54:20.139661 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 00:54:20 crc kubenswrapper[4744]: I0311 00:54:20.139739 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 00:54:20 crc kubenswrapper[4744]: I0311 00:54:20.139775 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 00:54:20 crc kubenswrapper[4744]: I0311 00:54:20.466360 4744 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 11 00:54:20 crc kubenswrapper[4744]: I0311 00:54:20.881823 4744 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T00:54:20Z is after 2026-02-23T05:33:13Z Mar 11 00:54:21 crc kubenswrapper[4744]: I0311 00:54:21.145355 4744 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/1.log" Mar 11 00:54:21 crc kubenswrapper[4744]: I0311 00:54:21.146192 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log" Mar 11 00:54:21 crc kubenswrapper[4744]: I0311 00:54:21.149896 4744 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="da09dec387a8345501d5e6a881acff04af465daa09ab4bf8ceafc6d345684e1f" exitCode=255 Mar 11 00:54:21 crc kubenswrapper[4744]: I0311 00:54:21.150033 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"da09dec387a8345501d5e6a881acff04af465daa09ab4bf8ceafc6d345684e1f"} Mar 11 00:54:21 crc kubenswrapper[4744]: I0311 00:54:21.150118 4744 scope.go:117] "RemoveContainer" containerID="5fa88cf531264ffcd2ba649948dcf77221d18e7e94bb7e8e039b4c9f197d464e" Mar 11 00:54:21 crc kubenswrapper[4744]: I0311 00:54:21.150137 4744 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 11 00:54:21 crc kubenswrapper[4744]: I0311 00:54:21.151921 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 00:54:21 crc kubenswrapper[4744]: I0311 00:54:21.151994 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 00:54:21 crc kubenswrapper[4744]: I0311 00:54:21.152021 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 00:54:21 crc kubenswrapper[4744]: I0311 00:54:21.153194 4744 scope.go:117] "RemoveContainer" containerID="da09dec387a8345501d5e6a881acff04af465daa09ab4bf8ceafc6d345684e1f" Mar 11 00:54:21 
crc kubenswrapper[4744]: E0311 00:54:21.153623 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 11 00:54:21 crc kubenswrapper[4744]: I0311 00:54:21.159054 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 11 00:54:21 crc kubenswrapper[4744]: I0311 00:54:21.882265 4744 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T00:54:21Z is after 2026-02-23T05:33:13Z Mar 11 00:54:22 crc kubenswrapper[4744]: I0311 00:54:22.156176 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/1.log" Mar 11 00:54:22 crc kubenswrapper[4744]: I0311 00:54:22.159564 4744 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 11 00:54:22 crc kubenswrapper[4744]: I0311 00:54:22.160962 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 00:54:22 crc kubenswrapper[4744]: I0311 00:54:22.161030 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 00:54:22 crc kubenswrapper[4744]: I0311 00:54:22.161056 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 00:54:22 crc kubenswrapper[4744]: 
I0311 00:54:22.162031 4744 scope.go:117] "RemoveContainer" containerID="da09dec387a8345501d5e6a881acff04af465daa09ab4bf8ceafc6d345684e1f" Mar 11 00:54:22 crc kubenswrapper[4744]: E0311 00:54:22.162364 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 11 00:54:22 crc kubenswrapper[4744]: I0311 00:54:22.652785 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 11 00:54:22 crc kubenswrapper[4744]: I0311 00:54:22.882565 4744 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T00:54:22Z is after 2026-02-23T05:33:13Z Mar 11 00:54:23 crc kubenswrapper[4744]: I0311 00:54:23.162383 4744 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 11 00:54:23 crc kubenswrapper[4744]: I0311 00:54:23.163957 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 00:54:23 crc kubenswrapper[4744]: I0311 00:54:23.164025 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 00:54:23 crc kubenswrapper[4744]: I0311 00:54:23.164045 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 00:54:23 crc kubenswrapper[4744]: I0311 00:54:23.165169 4744 scope.go:117] "RemoveContainer" 
containerID="da09dec387a8345501d5e6a881acff04af465daa09ab4bf8ceafc6d345684e1f" Mar 11 00:54:23 crc kubenswrapper[4744]: E0311 00:54:23.165473 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 11 00:54:23 crc kubenswrapper[4744]: I0311 00:54:23.880055 4744 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T00:54:23Z is after 2026-02-23T05:33:13Z Mar 11 00:54:24 crc kubenswrapper[4744]: E0311 00:54:24.036594 4744 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Mar 11 00:54:24 crc kubenswrapper[4744]: I0311 00:54:24.165749 4744 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 11 00:54:24 crc kubenswrapper[4744]: I0311 00:54:24.167381 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 00:54:24 crc kubenswrapper[4744]: I0311 00:54:24.167449 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 00:54:24 crc kubenswrapper[4744]: I0311 00:54:24.167472 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 00:54:24 crc kubenswrapper[4744]: I0311 00:54:24.168497 4744 scope.go:117] "RemoveContainer" containerID="da09dec387a8345501d5e6a881acff04af465daa09ab4bf8ceafc6d345684e1f" 
Mar 11 00:54:24 crc kubenswrapper[4744]: E0311 00:54:24.168836 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 11 00:54:24 crc kubenswrapper[4744]: I0311 00:54:24.641642 4744 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 11 00:54:24 crc kubenswrapper[4744]: E0311 00:54:24.739754 4744 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T00:54:24Z is after 2026-02-23T05:33:13Z" interval="7s" Mar 11 00:54:24 crc kubenswrapper[4744]: I0311 00:54:24.747808 4744 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 11 00:54:24 crc kubenswrapper[4744]: I0311 00:54:24.749428 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 00:54:24 crc kubenswrapper[4744]: I0311 00:54:24.749480 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 00:54:24 crc kubenswrapper[4744]: I0311 00:54:24.749499 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 00:54:24 crc kubenswrapper[4744]: I0311 00:54:24.749562 4744 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 11 00:54:24 crc kubenswrapper[4744]: E0311 00:54:24.754271 4744 kubelet_node_status.go:99] "Unable to 
register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T00:54:24Z is after 2026-02-23T05:33:13Z" node="crc" Mar 11 00:54:24 crc kubenswrapper[4744]: I0311 00:54:24.883082 4744 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T00:54:24Z is after 2026-02-23T05:33:13Z Mar 11 00:54:25 crc kubenswrapper[4744]: I0311 00:54:25.168922 4744 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 11 00:54:25 crc kubenswrapper[4744]: I0311 00:54:25.170425 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 00:54:25 crc kubenswrapper[4744]: I0311 00:54:25.170559 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 00:54:25 crc kubenswrapper[4744]: I0311 00:54:25.170587 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 00:54:25 crc kubenswrapper[4744]: I0311 00:54:25.171939 4744 scope.go:117] "RemoveContainer" containerID="da09dec387a8345501d5e6a881acff04af465daa09ab4bf8ceafc6d345684e1f" Mar 11 00:54:25 crc kubenswrapper[4744]: E0311 00:54:25.172363 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 11 00:54:25 crc 
kubenswrapper[4744]: W0311 00:54:25.250602 4744 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T00:54:25Z is after 2026-02-23T05:33:13Z Mar 11 00:54:25 crc kubenswrapper[4744]: E0311 00:54:25.250753 4744 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T00:54:25Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Mar 11 00:54:25 crc kubenswrapper[4744]: I0311 00:54:25.881694 4744 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T00:54:25Z is after 2026-02-23T05:33:13Z Mar 11 00:54:26 crc kubenswrapper[4744]: W0311 00:54:26.099104 4744 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T00:54:26Z is after 2026-02-23T05:33:13Z Mar 11 00:54:26 crc kubenswrapper[4744]: E0311 00:54:26.099212 4744 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get 
\"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T00:54:26Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Mar 11 00:54:26 crc kubenswrapper[4744]: W0311 00:54:26.251610 4744 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T00:54:26Z is after 2026-02-23T05:33:13Z Mar 11 00:54:26 crc kubenswrapper[4744]: E0311 00:54:26.251761 4744 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T00:54:26Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Mar 11 00:54:26 crc kubenswrapper[4744]: I0311 00:54:26.747418 4744 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates Mar 11 00:54:26 crc kubenswrapper[4744]: E0311 00:54:26.753457 4744 certificate_manager.go:562] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post \"https://api-int.crc.testing:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T00:54:26Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Mar 11 00:54:26 crc kubenswrapper[4744]: I0311 00:54:26.862319 4744 kubelet.go:2542] 
"SyncLoop (probe)" probe="startup" status="started" pod="openshift-etcd/etcd-crc" Mar 11 00:54:26 crc kubenswrapper[4744]: I0311 00:54:26.862650 4744 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 11 00:54:26 crc kubenswrapper[4744]: I0311 00:54:26.864429 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 00:54:26 crc kubenswrapper[4744]: I0311 00:54:26.864485 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 00:54:26 crc kubenswrapper[4744]: I0311 00:54:26.864504 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 00:54:26 crc kubenswrapper[4744]: I0311 00:54:26.882807 4744 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T00:54:26Z is after 2026-02-23T05:33:13Z Mar 11 00:54:26 crc kubenswrapper[4744]: I0311 00:54:26.886447 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-etcd/etcd-crc" Mar 11 00:54:27 crc kubenswrapper[4744]: I0311 00:54:27.174947 4744 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 11 00:54:27 crc kubenswrapper[4744]: I0311 00:54:27.176726 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 00:54:27 crc kubenswrapper[4744]: I0311 00:54:27.176797 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 00:54:27 crc kubenswrapper[4744]: I0311 00:54:27.176816 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientPID" Mar 11 00:54:27 crc kubenswrapper[4744]: I0311 00:54:27.878995 4744 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 11 00:54:27 crc kubenswrapper[4744]: I0311 00:54:27.880252 4744 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 11 00:54:27 crc kubenswrapper[4744]: I0311 00:54:27.882591 4744 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T00:54:27Z is after 2026-02-23T05:33:13Z Mar 11 00:54:28 crc kubenswrapper[4744]: E0311 00:54:28.329483 4744 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/default/events\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T00:54:28Z is after 2026-02-23T05:33:13Z" event="&Event{ObjectMeta:{crc.189ba3505e2c8f55 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-11 
00:54:03.873111893 +0000 UTC m=+0.677329538,LastTimestamp:2026-03-11 00:54:03.873111893 +0000 UTC m=+0.677329538,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 11 00:54:28 crc kubenswrapper[4744]: I0311 00:54:28.881056 4744 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T00:54:28Z is after 2026-02-23T05:33:13Z Mar 11 00:54:29 crc kubenswrapper[4744]: W0311 00:54:29.490279 4744 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T00:54:29Z is after 2026-02-23T05:33:13Z Mar 11 00:54:29 crc kubenswrapper[4744]: E0311 00:54:29.490405 4744 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T00:54:29Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Mar 11 00:54:29 crc kubenswrapper[4744]: I0311 00:54:29.882493 4744 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T00:54:29Z is after 2026-02-23T05:33:13Z Mar 11 
00:54:30 crc kubenswrapper[4744]: I0311 00:54:30.882481 4744 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T00:54:30Z is after 2026-02-23T05:33:13Z Mar 11 00:54:31 crc kubenswrapper[4744]: E0311 00:54:31.745402 4744 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T00:54:31Z is after 2026-02-23T05:33:13Z" interval="7s" Mar 11 00:54:31 crc kubenswrapper[4744]: I0311 00:54:31.754732 4744 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 11 00:54:31 crc kubenswrapper[4744]: I0311 00:54:31.757022 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 00:54:31 crc kubenswrapper[4744]: I0311 00:54:31.757100 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 00:54:31 crc kubenswrapper[4744]: I0311 00:54:31.757122 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 00:54:31 crc kubenswrapper[4744]: I0311 00:54:31.757176 4744 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 11 00:54:31 crc kubenswrapper[4744]: E0311 00:54:31.762948 4744 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T00:54:31Z is after 2026-02-23T05:33:13Z" node="crc" Mar 11 
00:54:31 crc kubenswrapper[4744]: I0311 00:54:31.881911 4744 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T00:54:31Z is after 2026-02-23T05:33:13Z Mar 11 00:54:32 crc kubenswrapper[4744]: I0311 00:54:32.883707 4744 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T00:54:32Z is after 2026-02-23T05:33:13Z Mar 11 00:54:33 crc kubenswrapper[4744]: I0311 00:54:33.881974 4744 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T00:54:33Z is after 2026-02-23T05:33:13Z Mar 11 00:54:34 crc kubenswrapper[4744]: E0311 00:54:34.036835 4744 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Mar 11 00:54:34 crc kubenswrapper[4744]: I0311 00:54:34.880472 4744 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T00:54:34Z is after 2026-02-23T05:33:13Z Mar 11 00:54:35 crc kubenswrapper[4744]: I0311 00:54:35.882033 4744 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get 
"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T00:54:35Z is after 2026-02-23T05:33:13Z Mar 11 00:54:36 crc kubenswrapper[4744]: I0311 00:54:36.882940 4744 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T00:54:36Z is after 2026-02-23T05:33:13Z Mar 11 00:54:36 crc kubenswrapper[4744]: I0311 00:54:36.973924 4744 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 11 00:54:36 crc kubenswrapper[4744]: I0311 00:54:36.975775 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 00:54:36 crc kubenswrapper[4744]: I0311 00:54:36.975841 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 00:54:36 crc kubenswrapper[4744]: I0311 00:54:36.975862 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 00:54:36 crc kubenswrapper[4744]: I0311 00:54:36.976824 4744 scope.go:117] "RemoveContainer" containerID="da09dec387a8345501d5e6a881acff04af465daa09ab4bf8ceafc6d345684e1f" Mar 11 00:54:37 crc kubenswrapper[4744]: I0311 00:54:37.389960 4744 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": read tcp 192.168.126.11:57992->192.168.126.11:10357: read: connection reset by peer" start-of-body= Mar 11 00:54:37 crc kubenswrapper[4744]: I0311 00:54:37.390052 4744 prober.go:107] "Probe failed" 
probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": read tcp 192.168.126.11:57992->192.168.126.11:10357: read: connection reset by peer" Mar 11 00:54:37 crc kubenswrapper[4744]: I0311 00:54:37.390143 4744 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 11 00:54:37 crc kubenswrapper[4744]: I0311 00:54:37.390353 4744 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 11 00:54:37 crc kubenswrapper[4744]: I0311 00:54:37.392873 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 00:54:37 crc kubenswrapper[4744]: I0311 00:54:37.392985 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 00:54:37 crc kubenswrapper[4744]: I0311 00:54:37.393016 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 00:54:37 crc kubenswrapper[4744]: I0311 00:54:37.394138 4744 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="cluster-policy-controller" containerStatusID={"Type":"cri-o","ID":"832be2ac0e5b4b69727f022a0663974a495e008d3a781fa4d237d3347d39c154"} pod="openshift-kube-controller-manager/kube-controller-manager-crc" containerMessage="Container cluster-policy-controller failed startup probe, will be restarted" Mar 11 00:54:37 crc kubenswrapper[4744]: I0311 00:54:37.394457 4744 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" 
containerID="cri-o://832be2ac0e5b4b69727f022a0663974a495e008d3a781fa4d237d3347d39c154" gracePeriod=30 Mar 11 00:54:37 crc kubenswrapper[4744]: I0311 00:54:37.881997 4744 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T00:54:37Z is after 2026-02-23T05:33:13Z Mar 11 00:54:38 crc kubenswrapper[4744]: I0311 00:54:38.227754 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/cluster-policy-controller/0.log" Mar 11 00:54:38 crc kubenswrapper[4744]: I0311 00:54:38.228578 4744 generic.go:334] "Generic (PLEG): container finished" podID="f614b9022728cf315e60c057852e563e" containerID="832be2ac0e5b4b69727f022a0663974a495e008d3a781fa4d237d3347d39c154" exitCode=255 Mar 11 00:54:38 crc kubenswrapper[4744]: I0311 00:54:38.228685 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerDied","Data":"832be2ac0e5b4b69727f022a0663974a495e008d3a781fa4d237d3347d39c154"} Mar 11 00:54:38 crc kubenswrapper[4744]: I0311 00:54:38.228768 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"19989b1018d06da0230a62c4c231865e9d16267d0eb570a409da006b246005b4"} Mar 11 00:54:38 crc kubenswrapper[4744]: I0311 00:54:38.228962 4744 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 11 00:54:38 crc kubenswrapper[4744]: I0311 00:54:38.230398 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientMemory" Mar 11 00:54:38 crc kubenswrapper[4744]: I0311 00:54:38.230478 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 00:54:38 crc kubenswrapper[4744]: I0311 00:54:38.230499 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 00:54:38 crc kubenswrapper[4744]: I0311 00:54:38.234174 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/1.log" Mar 11 00:54:38 crc kubenswrapper[4744]: I0311 00:54:38.236814 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"f05226ed47e0d3456b502ba975307afdc37df02206c574c7775b2e601250db49"} Mar 11 00:54:38 crc kubenswrapper[4744]: I0311 00:54:38.236998 4744 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 11 00:54:38 crc kubenswrapper[4744]: I0311 00:54:38.238096 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 00:54:38 crc kubenswrapper[4744]: I0311 00:54:38.238146 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 00:54:38 crc kubenswrapper[4744]: I0311 00:54:38.238165 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 00:54:38 crc kubenswrapper[4744]: E0311 00:54:38.334952 4744 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/default/events\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T00:54:38Z is after 2026-02-23T05:33:13Z" 
event="&Event{ObjectMeta:{crc.189ba3505e2c8f55 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-11 00:54:03.873111893 +0000 UTC m=+0.677329538,LastTimestamp:2026-03-11 00:54:03.873111893 +0000 UTC m=+0.677329538,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 11 00:54:38 crc kubenswrapper[4744]: E0311 00:54:38.750594 4744 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T00:54:38Z is after 2026-02-23T05:33:13Z" interval="7s" Mar 11 00:54:38 crc kubenswrapper[4744]: I0311 00:54:38.763733 4744 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 11 00:54:38 crc kubenswrapper[4744]: I0311 00:54:38.765284 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 00:54:38 crc kubenswrapper[4744]: I0311 00:54:38.765316 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 00:54:38 crc kubenswrapper[4744]: I0311 00:54:38.765329 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 00:54:38 crc kubenswrapper[4744]: I0311 00:54:38.765363 4744 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 11 00:54:38 crc kubenswrapper[4744]: E0311 00:54:38.769883 4744 kubelet_node_status.go:99] "Unable to register node with API server" err="Post 
\"https://api-int.crc.testing:6443/api/v1/nodes\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T00:54:38Z is after 2026-02-23T05:33:13Z" node="crc" Mar 11 00:54:38 crc kubenswrapper[4744]: I0311 00:54:38.879658 4744 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T00:54:38Z is after 2026-02-23T05:33:13Z Mar 11 00:54:39 crc kubenswrapper[4744]: W0311 00:54:39.171089 4744 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T00:54:39Z is after 2026-02-23T05:33:13Z Mar 11 00:54:39 crc kubenswrapper[4744]: E0311 00:54:39.171234 4744 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T00:54:39Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Mar 11 00:54:39 crc kubenswrapper[4744]: I0311 00:54:39.242447 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/2.log" Mar 11 00:54:39 crc kubenswrapper[4744]: I0311 00:54:39.243369 4744 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/1.log" Mar 11 00:54:39 crc kubenswrapper[4744]: I0311 00:54:39.246293 4744 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="f05226ed47e0d3456b502ba975307afdc37df02206c574c7775b2e601250db49" exitCode=255 Mar 11 00:54:39 crc kubenswrapper[4744]: I0311 00:54:39.246366 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"f05226ed47e0d3456b502ba975307afdc37df02206c574c7775b2e601250db49"} Mar 11 00:54:39 crc kubenswrapper[4744]: I0311 00:54:39.246439 4744 scope.go:117] "RemoveContainer" containerID="da09dec387a8345501d5e6a881acff04af465daa09ab4bf8ceafc6d345684e1f" Mar 11 00:54:39 crc kubenswrapper[4744]: I0311 00:54:39.246691 4744 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 11 00:54:39 crc kubenswrapper[4744]: I0311 00:54:39.248085 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 00:54:39 crc kubenswrapper[4744]: I0311 00:54:39.248155 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 00:54:39 crc kubenswrapper[4744]: I0311 00:54:39.248179 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 00:54:39 crc kubenswrapper[4744]: I0311 00:54:39.249233 4744 scope.go:117] "RemoveContainer" containerID="f05226ed47e0d3456b502ba975307afdc37df02206c574c7775b2e601250db49" Mar 11 00:54:39 crc kubenswrapper[4744]: E0311 00:54:39.249633 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 20s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 11 00:54:39 crc kubenswrapper[4744]: I0311 00:54:39.883190 4744 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T00:54:39Z is after 2026-02-23T05:33:13Z Mar 11 00:54:40 crc kubenswrapper[4744]: I0311 00:54:40.251569 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/2.log" Mar 11 00:54:40 crc kubenswrapper[4744]: W0311 00:54:40.718591 4744 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T00:54:40Z is after 2026-02-23T05:33:13Z Mar 11 00:54:40 crc kubenswrapper[4744]: E0311 00:54:40.719214 4744 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T00:54:40Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Mar 11 00:54:40 crc kubenswrapper[4744]: I0311 00:54:40.882282 4744 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get 
"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T00:54:40Z is after 2026-02-23T05:33:13Z Mar 11 00:54:41 crc kubenswrapper[4744]: I0311 00:54:41.882495 4744 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T00:54:41Z is after 2026-02-23T05:33:13Z Mar 11 00:54:42 crc kubenswrapper[4744]: I0311 00:54:42.652750 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 11 00:54:42 crc kubenswrapper[4744]: I0311 00:54:42.653025 4744 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 11 00:54:42 crc kubenswrapper[4744]: I0311 00:54:42.655009 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 00:54:42 crc kubenswrapper[4744]: I0311 00:54:42.655283 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 00:54:42 crc kubenswrapper[4744]: I0311 00:54:42.655305 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 00:54:42 crc kubenswrapper[4744]: I0311 00:54:42.656913 4744 scope.go:117] "RemoveContainer" containerID="f05226ed47e0d3456b502ba975307afdc37df02206c574c7775b2e601250db49" Mar 11 00:54:42 crc kubenswrapper[4744]: E0311 00:54:42.658215 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-apiserver-check-endpoints 
pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 11 00:54:42 crc kubenswrapper[4744]: I0311 00:54:42.883626 4744 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T00:54:42Z is after 2026-02-23T05:33:13Z Mar 11 00:54:43 crc kubenswrapper[4744]: I0311 00:54:43.882941 4744 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T00:54:43Z is after 2026-02-23T05:33:13Z Mar 11 00:54:44 crc kubenswrapper[4744]: I0311 00:54:44.012780 4744 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates Mar 11 00:54:44 crc kubenswrapper[4744]: E0311 00:54:44.018844 4744 certificate_manager.go:562] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post \"https://api-int.crc.testing:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T00:54:44Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Mar 11 00:54:44 crc kubenswrapper[4744]: E0311 00:54:44.020375 4744 certificate_manager.go:440] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Reached backoff limit, still unable to rotate certs: timed out waiting for the condition" logger="UnhandledError" Mar 11 00:54:44 crc 
kubenswrapper[4744]: E0311 00:54:44.037042 4744 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Mar 11 00:54:44 crc kubenswrapper[4744]: I0311 00:54:44.641962 4744 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 11 00:54:44 crc kubenswrapper[4744]: I0311 00:54:44.642224 4744 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 11 00:54:44 crc kubenswrapper[4744]: I0311 00:54:44.643933 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 00:54:44 crc kubenswrapper[4744]: I0311 00:54:44.643989 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 00:54:44 crc kubenswrapper[4744]: I0311 00:54:44.644008 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 00:54:44 crc kubenswrapper[4744]: I0311 00:54:44.644876 4744 scope.go:117] "RemoveContainer" containerID="f05226ed47e0d3456b502ba975307afdc37df02206c574c7775b2e601250db49" Mar 11 00:54:44 crc kubenswrapper[4744]: E0311 00:54:44.645170 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 11 00:54:44 crc kubenswrapper[4744]: I0311 00:54:44.878194 4744 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 11 00:54:44 crc kubenswrapper[4744]: I0311 00:54:44.878445 4744 kubelet_node_status.go:401] 
"Setting node annotation to enable volume controller attach/detach" Mar 11 00:54:44 crc kubenswrapper[4744]: I0311 00:54:44.880070 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 00:54:44 crc kubenswrapper[4744]: I0311 00:54:44.880135 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 00:54:44 crc kubenswrapper[4744]: I0311 00:54:44.880154 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 00:54:44 crc kubenswrapper[4744]: I0311 00:54:44.882708 4744 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T00:54:44Z is after 2026-02-23T05:33:13Z Mar 11 00:54:44 crc kubenswrapper[4744]: W0311 00:54:44.919453 4744 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T00:54:44Z is after 2026-02-23T05:33:13Z Mar 11 00:54:44 crc kubenswrapper[4744]: E0311 00:54:44.919947 4744 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T00:54:44Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Mar 11 00:54:45 crc kubenswrapper[4744]: E0311 00:54:45.756657 4744 
controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T00:54:45Z is after 2026-02-23T05:33:13Z" interval="7s" Mar 11 00:54:45 crc kubenswrapper[4744]: I0311 00:54:45.770897 4744 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 11 00:54:45 crc kubenswrapper[4744]: I0311 00:54:45.772753 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 00:54:45 crc kubenswrapper[4744]: I0311 00:54:45.772820 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 00:54:45 crc kubenswrapper[4744]: I0311 00:54:45.772840 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 00:54:45 crc kubenswrapper[4744]: I0311 00:54:45.772877 4744 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 11 00:54:45 crc kubenswrapper[4744]: E0311 00:54:45.778441 4744 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T00:54:45Z is after 2026-02-23T05:33:13Z" node="crc" Mar 11 00:54:45 crc kubenswrapper[4744]: I0311 00:54:45.882424 4744 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T00:54:45Z is after 2026-02-23T05:33:13Z Mar 11 00:54:46 crc kubenswrapper[4744]: W0311 00:54:46.784209 4744 
reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T00:54:46Z is after 2026-02-23T05:33:13Z Mar 11 00:54:46 crc kubenswrapper[4744]: E0311 00:54:46.784321 4744 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T00:54:46Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Mar 11 00:54:46 crc kubenswrapper[4744]: I0311 00:54:46.882494 4744 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T00:54:46Z is after 2026-02-23T05:33:13Z Mar 11 00:54:46 crc kubenswrapper[4744]: I0311 00:54:46.967914 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 11 00:54:46 crc kubenswrapper[4744]: I0311 00:54:46.968222 4744 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 11 00:54:46 crc kubenswrapper[4744]: I0311 00:54:46.970099 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 00:54:46 crc kubenswrapper[4744]: I0311 00:54:46.970166 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 00:54:46 crc kubenswrapper[4744]: I0311 
00:54:46.970185 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 00:54:47 crc kubenswrapper[4744]: I0311 00:54:47.878487 4744 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 11 00:54:47 crc kubenswrapper[4744]: I0311 00:54:47.878629 4744 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 11 00:54:47 crc kubenswrapper[4744]: I0311 00:54:47.880364 4744 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T00:54:47Z is after 2026-02-23T05:33:13Z Mar 11 00:54:48 crc kubenswrapper[4744]: E0311 00:54:48.340777 4744 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/default/events\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T00:54:48Z is after 2026-02-23T05:33:13Z" event="&Event{ObjectMeta:{crc.189ba3505e2c8f55 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-11 00:54:03.873111893 +0000 UTC m=+0.677329538,LastTimestamp:2026-03-11 00:54:03.873111893 +0000 UTC m=+0.677329538,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 11 00:54:48 crc kubenswrapper[4744]: I0311 00:54:48.881620 4744 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T00:54:48Z is after 2026-02-23T05:33:13Z Mar 11 00:54:49 crc kubenswrapper[4744]: I0311 00:54:49.882233 4744 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T00:54:49Z is after 2026-02-23T05:33:13Z Mar 11 00:54:50 crc kubenswrapper[4744]: I0311 00:54:50.882218 4744 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T00:54:50Z is after 2026-02-23T05:33:13Z Mar 11 00:54:51 crc kubenswrapper[4744]: I0311 00:54:51.883012 4744 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is 
not yet valid: current time 2026-03-11T00:54:51Z is after 2026-02-23T05:33:13Z Mar 11 00:54:52 crc kubenswrapper[4744]: E0311 00:54:52.762148 4744 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T00:54:52Z is after 2026-02-23T05:33:13Z" interval="7s" Mar 11 00:54:52 crc kubenswrapper[4744]: I0311 00:54:52.779323 4744 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 11 00:54:52 crc kubenswrapper[4744]: I0311 00:54:52.781381 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 00:54:52 crc kubenswrapper[4744]: I0311 00:54:52.781450 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 00:54:52 crc kubenswrapper[4744]: I0311 00:54:52.781473 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 00:54:52 crc kubenswrapper[4744]: I0311 00:54:52.781547 4744 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 11 00:54:52 crc kubenswrapper[4744]: E0311 00:54:52.784541 4744 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T00:54:52Z is after 2026-02-23T05:33:13Z" node="crc" Mar 11 00:54:52 crc kubenswrapper[4744]: I0311 00:54:52.883396 4744 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not 
yet valid: current time 2026-03-11T00:54:52Z is after 2026-02-23T05:33:13Z Mar 11 00:54:53 crc kubenswrapper[4744]: I0311 00:54:53.884091 4744 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T00:54:53Z is after 2026-02-23T05:33:13Z Mar 11 00:54:54 crc kubenswrapper[4744]: E0311 00:54:54.037285 4744 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Mar 11 00:54:54 crc kubenswrapper[4744]: I0311 00:54:54.883742 4744 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T00:54:54Z is after 2026-02-23T05:33:13Z Mar 11 00:54:55 crc kubenswrapper[4744]: I0311 00:54:55.882633 4744 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T00:54:55Z is after 2026-02-23T05:33:13Z Mar 11 00:54:55 crc kubenswrapper[4744]: I0311 00:54:55.973962 4744 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 11 00:54:55 crc kubenswrapper[4744]: I0311 00:54:55.975811 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 00:54:55 crc kubenswrapper[4744]: I0311 00:54:55.975872 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 00:54:55 crc 
kubenswrapper[4744]: I0311 00:54:55.975893 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 00:54:55 crc kubenswrapper[4744]: I0311 00:54:55.976882 4744 scope.go:117] "RemoveContainer" containerID="f05226ed47e0d3456b502ba975307afdc37df02206c574c7775b2e601250db49" Mar 11 00:54:55 crc kubenswrapper[4744]: E0311 00:54:55.977200 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 11 00:54:56 crc kubenswrapper[4744]: I0311 00:54:56.884561 4744 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 11 00:54:57 crc kubenswrapper[4744]: I0311 00:54:57.878977 4744 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 11 00:54:57 crc kubenswrapper[4744]: I0311 00:54:57.879128 4744 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 11 00:54:57 crc 
kubenswrapper[4744]: I0311 00:54:57.886096 4744 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 11 00:54:58 crc kubenswrapper[4744]: E0311 00:54:58.349553 4744 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189ba3505e2c8f55 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-11 00:54:03.873111893 +0000 UTC m=+0.677329538,LastTimestamp:2026-03-11 00:54:03.873111893 +0000 UTC m=+0.677329538,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 11 00:54:58 crc kubenswrapper[4744]: E0311 00:54:58.355985 4744 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189ba350620643ef default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-11 00:54:03.937711087 +0000 UTC m=+0.741928732,LastTimestamp:2026-03-11 00:54:03.937711087 +0000 UTC m=+0.741928732,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 
UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 11 00:54:58 crc kubenswrapper[4744]: E0311 00:54:58.363745 4744 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189ba35062077b62 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-11 00:54:03.937790818 +0000 UTC m=+0.742008453,LastTimestamp:2026-03-11 00:54:03.937790818 +0000 UTC m=+0.742008453,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 11 00:54:58 crc kubenswrapper[4744]: E0311 00:54:58.370302 4744 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189ba3506207c3ec default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node crc status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-11 00:54:03.937809388 +0000 UTC m=+0.742027023,LastTimestamp:2026-03-11 00:54:03.937809388 +0000 UTC m=+0.742027023,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 11 00:54:58 crc kubenswrapper[4744]: E0311 00:54:58.377059 4744 event.go:359] "Server rejected event (will not retry!)" 
err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189ba350670b52d8 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeAllocatableEnforced,Message:Updated Node Allocatable limit across pods,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-11 00:54:04.021928664 +0000 UTC m=+0.826146279,LastTimestamp:2026-03-11 00:54:04.021928664 +0000 UTC m=+0.826146279,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 11 00:54:58 crc kubenswrapper[4744]: E0311 00:54:58.386026 4744 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189ba350620643ef\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189ba350620643ef default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-11 00:54:03.937711087 +0000 UTC m=+0.741928732,LastTimestamp:2026-03-11 00:54:04.077218125 +0000 UTC m=+0.881435770,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 11 00:54:58 crc kubenswrapper[4744]: E0311 00:54:58.396023 4744 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189ba35062077b62\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" 
event="&Event{ObjectMeta:{crc.189ba35062077b62 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-11 00:54:03.937790818 +0000 UTC m=+0.742008453,LastTimestamp:2026-03-11 00:54:04.077256375 +0000 UTC m=+0.881474020,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 11 00:54:58 crc kubenswrapper[4744]: E0311 00:54:58.403821 4744 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189ba3506207c3ec\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189ba3506207c3ec default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node crc status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-11 00:54:03.937809388 +0000 UTC m=+0.742027023,LastTimestamp:2026-03-11 00:54:04.077275836 +0000 UTC m=+0.881493481,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 11 00:54:58 crc kubenswrapper[4744]: E0311 00:54:58.405847 4744 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189ba350620643ef\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189ba350620643ef default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-11 00:54:03.937711087 +0000 UTC m=+0.741928732,LastTimestamp:2026-03-11 00:54:04.079732996 +0000 UTC m=+0.883950641,Count:3,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 11 00:54:58 crc kubenswrapper[4744]: E0311 00:54:58.412651 4744 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189ba35062077b62\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189ba35062077b62 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-11 00:54:03.937790818 +0000 UTC m=+0.742008453,LastTimestamp:2026-03-11 00:54:04.079756716 +0000 UTC m=+0.883974361,Count:3,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 11 00:54:58 crc kubenswrapper[4744]: E0311 00:54:58.419612 4744 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189ba3506207c3ec\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189ba3506207c3ec default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node crc 
status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-11 00:54:03.937809388 +0000 UTC m=+0.742027023,LastTimestamp:2026-03-11 00:54:04.079773867 +0000 UTC m=+0.883991512,Count:3,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 11 00:54:58 crc kubenswrapper[4744]: E0311 00:54:58.426335 4744 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189ba350620643ef\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189ba350620643ef default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-11 00:54:03.937711087 +0000 UTC m=+0.741928732,LastTimestamp:2026-03-11 00:54:04.081673607 +0000 UTC m=+0.885891222,Count:4,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 11 00:54:58 crc kubenswrapper[4744]: E0311 00:54:58.433542 4744 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189ba35062077b62\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189ba35062077b62 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-11 00:54:03.937790818 +0000 UTC 
m=+0.742008453,LastTimestamp:2026-03-11 00:54:04.081785129 +0000 UTC m=+0.886002744,Count:4,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 11 00:54:58 crc kubenswrapper[4744]: E0311 00:54:58.439016 4744 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189ba3506207c3ec\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189ba3506207c3ec default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node crc status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-11 00:54:03.937809388 +0000 UTC m=+0.742027023,LastTimestamp:2026-03-11 00:54:04.08180994 +0000 UTC m=+0.886027555,Count:4,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 11 00:54:58 crc kubenswrapper[4744]: E0311 00:54:58.444871 4744 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189ba350620643ef\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189ba350620643ef default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-11 00:54:03.937711087 +0000 UTC m=+0.741928732,LastTimestamp:2026-03-11 00:54:04.087021745 +0000 UTC m=+0.891239390,Count:5,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 
UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 11 00:54:58 crc kubenswrapper[4744]: E0311 00:54:58.451304 4744 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189ba35062077b62\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189ba35062077b62 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-11 00:54:03.937790818 +0000 UTC m=+0.742008453,LastTimestamp:2026-03-11 00:54:04.087049686 +0000 UTC m=+0.891267331,Count:5,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 11 00:54:58 crc kubenswrapper[4744]: E0311 00:54:58.458420 4744 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189ba3506207c3ec\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189ba3506207c3ec default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node crc status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-11 00:54:03.937809388 +0000 UTC m=+0.742027023,LastTimestamp:2026-03-11 00:54:04.087067706 +0000 UTC m=+0.891285341,Count:5,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 11 00:54:58 crc kubenswrapper[4744]: E0311 00:54:58.464145 4744 event.go:359] 
"Server rejected event (will not retry!)" err="events \"crc.189ba350620643ef\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189ba350620643ef default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-11 00:54:03.937711087 +0000 UTC m=+0.741928732,LastTimestamp:2026-03-11 00:54:04.08788046 +0000 UTC m=+0.892098095,Count:6,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 11 00:54:58 crc kubenswrapper[4744]: E0311 00:54:58.472161 4744 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189ba35062077b62\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189ba35062077b62 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-11 00:54:03.937790818 +0000 UTC m=+0.742008453,LastTimestamp:2026-03-11 00:54:04.08791063 +0000 UTC m=+0.892128265,Count:6,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 11 00:54:58 crc kubenswrapper[4744]: E0311 00:54:58.478901 4744 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189ba3506207c3ec\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group 
\"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189ba3506207c3ec default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node crc status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-11 00:54:03.937809388 +0000 UTC m=+0.742027023,LastTimestamp:2026-03-11 00:54:04.087930391 +0000 UTC m=+0.892148026,Count:6,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 11 00:54:58 crc kubenswrapper[4744]: E0311 00:54:58.489758 4744 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189ba350620643ef\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189ba350620643ef default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-11 00:54:03.937711087 +0000 UTC m=+0.741928732,LastTimestamp:2026-03-11 00:54:04.092240882 +0000 UTC m=+0.896458527,Count:7,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 11 00:54:58 crc kubenswrapper[4744]: E0311 00:54:58.496453 4744 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189ba35062077b62\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189ba35062077b62 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-11 00:54:03.937790818 +0000 UTC m=+0.742008453,LastTimestamp:2026-03-11 00:54:04.092267893 +0000 UTC m=+0.896485538,Count:7,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 11 00:54:58 crc kubenswrapper[4744]: E0311 00:54:58.503488 4744 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189ba3506207c3ec\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189ba3506207c3ec default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node crc status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-11 00:54:03.937809388 +0000 UTC m=+0.742027023,LastTimestamp:2026-03-11 00:54:04.092286883 +0000 UTC m=+0.896504528,Count:7,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 11 00:54:58 crc kubenswrapper[4744]: E0311 00:54:58.509109 4744 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189ba350620643ef\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189ba350620643ef default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status 
is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-11 00:54:03.937711087 +0000 UTC m=+0.741928732,LastTimestamp:2026-03-11 00:54:04.092427765 +0000 UTC m=+0.896645410,Count:8,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 11 00:54:58 crc kubenswrapper[4744]: E0311 00:54:58.517493 4744 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189ba35062077b62\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189ba35062077b62 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-11 00:54:03.937790818 +0000 UTC m=+0.742008453,LastTimestamp:2026-03-11 00:54:04.092468846 +0000 UTC m=+0.896686481,Count:8,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 11 00:54:58 crc kubenswrapper[4744]: E0311 00:54:58.526683 4744 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189ba350836b9142 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager},},Reason:Pulled,Message:Container image 
\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-11 00:54:04.497998146 +0000 UTC m=+1.302215761,LastTimestamp:2026-03-11 00:54:04.497998146 +0000 UTC m=+1.302215761,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 11 00:54:58 crc kubenswrapper[4744]: E0311 00:54:58.536704 4744 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189ba350836ce3c7 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-11 00:54:04.498084807 +0000 UTC m=+1.302302452,LastTimestamp:2026-03-11 00:54:04.498084807 +0000 UTC m=+1.302302452,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 11 00:54:58 crc kubenswrapper[4744]: E0311 00:54:58.545893 4744 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189ba350836b22c9 openshift-kube-scheduler 0 
0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{wait-for-host-port},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-11 00:54:04.497969865 +0000 UTC m=+1.302187480,LastTimestamp:2026-03-11 00:54:04.497969865 +0000 UTC m=+1.302187480,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 11 00:54:58 crc kubenswrapper[4744]: E0311 00:54:58.552328 4744 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-crc.189ba35084ab89c7 openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-crc,UID:d1b160f5dda77d281dd8e69ec8d817f9,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-11 00:54:04.518967751 +0000 UTC m=+1.323185366,LastTimestamp:2026-03-11 00:54:04.518967751 +0000 UTC m=+1.323185366,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 11 00:54:58 crc 
kubenswrapper[4744]: E0311 00:54:58.557051 4744 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189ba35084f32400 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-11 00:54:04.523660288 +0000 UTC m=+1.327877903,LastTimestamp:2026-03-11 00:54:04.523660288 +0000 UTC m=+1.327877903,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 11 00:54:58 crc kubenswrapper[4744]: E0311 00:54:58.563267 4744 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189ba350abe93c45 openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{wait-for-host-port},},Reason:Created,Message:Created container wait-for-host-port,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-11 00:54:05.177322565 +0000 UTC m=+1.981540190,LastTimestamp:2026-03-11 00:54:05.177322565 +0000 UTC 
m=+1.981540190,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 11 00:54:58 crc kubenswrapper[4744]: E0311 00:54:58.567672 4744 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189ba350abea3fc4 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager},},Reason:Created,Message:Created container kube-controller-manager,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-11 00:54:05.177388996 +0000 UTC m=+1.981606631,LastTimestamp:2026-03-11 00:54:05.177388996 +0000 UTC m=+1.981606631,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 11 00:54:58 crc kubenswrapper[4744]: E0311 00:54:58.573974 4744 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189ba350abec7c79 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Created,Message:Created container setup,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-11 00:54:05.177535609 +0000 UTC m=+1.981753244,LastTimestamp:2026-03-11 
00:54:05.177535609 +0000 UTC m=+1.981753244,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 11 00:54:58 crc kubenswrapper[4744]: E0311 00:54:58.580508 4744 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-crc.189ba350abed031f openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-crc,UID:d1b160f5dda77d281dd8e69ec8d817f9,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Created,Message:Created container setup,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-11 00:54:05.177570079 +0000 UTC m=+1.981787674,LastTimestamp:2026-03-11 00:54:05.177570079 +0000 UTC m=+1.981787674,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 11 00:54:58 crc kubenswrapper[4744]: E0311 00:54:58.585839 4744 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189ba350abee49ed openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Created,Message:Created container setup,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-11 00:54:05.177653741 +0000 UTC 
m=+1.981871356,LastTimestamp:2026-03-11 00:54:05.177653741 +0000 UTC m=+1.981871356,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 11 00:54:58 crc kubenswrapper[4744]: E0311 00:54:58.589989 4744 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189ba350acd00715 openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{wait-for-host-port},},Reason:Started,Message:Started container wait-for-host-port,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-11 00:54:05.192447765 +0000 UTC m=+1.996665370,LastTimestamp:2026-03-11 00:54:05.192447765 +0000 UTC m=+1.996665370,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 11 00:54:58 crc kubenswrapper[4744]: E0311 00:54:58.594482 4744 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189ba350ad187b8c openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager},},Reason:Started,Message:Started container 
kube-controller-manager,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-11 00:54:05.197196172 +0000 UTC m=+2.001413777,LastTimestamp:2026-03-11 00:54:05.197196172 +0000 UTC m=+2.001413777,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 11 00:54:58 crc kubenswrapper[4744]: E0311 00:54:58.598394 4744 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189ba350ad2d0a86 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-11 00:54:05.198543494 +0000 UTC m=+2.002761099,LastTimestamp:2026-03-11 00:54:05.198543494 +0000 UTC m=+2.002761099,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 11 00:54:58 crc kubenswrapper[4744]: E0311 00:54:58.602398 4744 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189ba350ad392e32 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Started,Message:Started container setup,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-11 00:54:05.199339058 +0000 UTC m=+2.003556663,LastTimestamp:2026-03-11 00:54:05.199339058 +0000 UTC m=+2.003556663,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 11 00:54:58 crc kubenswrapper[4744]: E0311 00:54:58.607096 4744 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189ba350ad5e219b openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Started,Message:Started container setup,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-11 00:54:05.201760667 +0000 UTC m=+2.005978272,LastTimestamp:2026-03-11 00:54:05.201760667 +0000 UTC m=+2.005978272,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 11 00:54:58 crc kubenswrapper[4744]: E0311 00:54:58.610902 4744 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-crc.189ba350ad5fa29d openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-crc,UID:d1b160f5dda77d281dd8e69ec8d817f9,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Started,Message:Started container setup,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-11 00:54:05.201859229 +0000 UTC m=+2.006076844,LastTimestamp:2026-03-11 00:54:05.201859229 +0000 UTC m=+2.006076844,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 11 00:54:58 crc kubenswrapper[4744]: E0311 00:54:58.617839 4744 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189ba350c2d089c7 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Created,Message:Created container cluster-policy-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-11 00:54:05.561579975 +0000 UTC m=+2.365797590,LastTimestamp:2026-03-11 00:54:05.561579975 +0000 UTC m=+2.365797590,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 11 00:54:58 crc kubenswrapper[4744]: E0311 00:54:58.624295 4744 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" 
event="&Event{ObjectMeta:{kube-controller-manager-crc.189ba350c3f586cd openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Started,Message:Started container cluster-policy-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-11 00:54:05.580781261 +0000 UTC m=+2.384998896,LastTimestamp:2026-03-11 00:54:05.580781261 +0000 UTC m=+2.384998896,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 11 00:54:58 crc kubenswrapper[4744]: E0311 00:54:58.629172 4744 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189ba350c40d8cd2 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager-cert-syncer},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-11 00:54:05.582355666 +0000 UTC m=+2.386573301,LastTimestamp:2026-03-11 00:54:05.582355666 +0000 UTC m=+2.386573301,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 
UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 11 00:54:58 crc kubenswrapper[4744]: E0311 00:54:58.633539 4744 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189ba350d2d0fb3d openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager-cert-syncer},},Reason:Created,Message:Created container kube-controller-manager-cert-syncer,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-11 00:54:05.830044477 +0000 UTC m=+2.634262092,LastTimestamp:2026-03-11 00:54:05.830044477 +0000 UTC m=+2.634262092,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 11 00:54:58 crc kubenswrapper[4744]: E0311 00:54:58.640009 4744 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189ba350d3c1dde0 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager-cert-syncer},},Reason:Started,Message:Started container 
kube-controller-manager-cert-syncer,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-11 00:54:05.845831136 +0000 UTC m=+2.650048771,LastTimestamp:2026-03-11 00:54:05.845831136 +0000 UTC m=+2.650048771,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 11 00:54:58 crc kubenswrapper[4744]: E0311 00:54:58.644826 4744 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189ba350d3df4a3b openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager-recovery-controller},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-11 00:54:05.847759419 +0000 UTC m=+2.651977054,LastTimestamp:2026-03-11 00:54:05.847759419 +0000 UTC m=+2.651977054,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 11 00:54:58 crc kubenswrapper[4744]: E0311 00:54:58.653249 4744 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189ba350dccba284 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 
UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-11 00:54:05.997466244 +0000 UTC m=+2.801683879,LastTimestamp:2026-03-11 00:54:05.997466244 +0000 UTC m=+2.801683879,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 11 00:54:58 crc kubenswrapper[4744]: E0311 00:54:58.658769 4744 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189ba350dd2120ee openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{etcd-ensure-env-vars},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-11 00:54:06.003069166 +0000 UTC m=+2.807286801,LastTimestamp:2026-03-11 00:54:06.003069166 +0000 UTC m=+2.807286801,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 11 00:54:58 crc kubenswrapper[4744]: E0311 00:54:58.663749 4744 event.go:359] "Server rejected event (will not retry!)" err="events 
is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-crc.189ba350dd3a1a97 openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-crc,UID:d1b160f5dda77d281dd8e69ec8d817f9,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-rbac-proxy-crio},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-11 00:54:06.004705943 +0000 UTC m=+2.808923588,LastTimestamp:2026-03-11 00:54:06.004705943 +0000 UTC m=+2.808923588,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 11 00:54:58 crc kubenswrapper[4744]: E0311 00:54:58.670887 4744 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189ba350dd766178 openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-11 00:54:06.008656248 +0000 UTC 
m=+2.812873893,LastTimestamp:2026-03-11 00:54:06.008656248 +0000 UTC m=+2.812873893,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 11 00:54:58 crc kubenswrapper[4744]: E0311 00:54:58.677778 4744 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189ba350ff6789b1 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager-recovery-controller},},Reason:Created,Message:Created container kube-controller-manager-recovery-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-11 00:54:06.578108849 +0000 UTC m=+3.382326454,LastTimestamp:2026-03-11 00:54:06.578108849 +0000 UTC m=+3.382326454,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 11 00:54:58 crc kubenswrapper[4744]: E0311 00:54:58.684921 4744 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189ba35101399156 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager-recovery-controller},},Reason:Started,Message:Started container kube-controller-manager-recovery-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-11 00:54:06.608650582 +0000 UTC m=+3.412868197,LastTimestamp:2026-03-11 00:54:06.608650582 +0000 UTC m=+3.412868197,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 11 00:54:58 crc kubenswrapper[4744]: E0311 00:54:58.690238 4744 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189ba351039983ed openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{etcd-ensure-env-vars},},Reason:Created,Message:Created container etcd-ensure-env-vars,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-11 00:54:06.648493037 +0000 UTC m=+3.452710682,LastTimestamp:2026-03-11 00:54:06.648493037 +0000 UTC m=+3.452710682,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 11 00:54:58 crc kubenswrapper[4744]: E0311 00:54:58.697144 4744 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" 
event="&Event{ObjectMeta:{kube-rbac-proxy-crio-crc.189ba35103c1c150 openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-crc,UID:d1b160f5dda77d281dd8e69ec8d817f9,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-rbac-proxy-crio},},Reason:Created,Message:Created container kube-rbac-proxy-crio,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-11 00:54:06.651130192 +0000 UTC m=+3.455347837,LastTimestamp:2026-03-11 00:54:06.651130192 +0000 UTC m=+3.455347837,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 11 00:54:58 crc kubenswrapper[4744]: E0311 00:54:58.702250 4744 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-crc.189ba35105716e97 openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-crc,UID:d1b160f5dda77d281dd8e69ec8d817f9,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-rbac-proxy-crio},},Reason:Started,Message:Started container kube-rbac-proxy-crio,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-11 00:54:06.679420567 +0000 UTC m=+3.483638182,LastTimestamp:2026-03-11 00:54:06.679420567 +0000 UTC m=+3.483638182,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 11 00:54:58 crc kubenswrapper[4744]: E0311 00:54:58.708071 4744 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: 
User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189ba35105b43a35 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{etcd-ensure-env-vars},},Reason:Started,Message:Started container etcd-ensure-env-vars,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-11 00:54:06.683798069 +0000 UTC m=+3.488015684,LastTimestamp:2026-03-11 00:54:06.683798069 +0000 UTC m=+3.488015684,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 11 00:54:58 crc kubenswrapper[4744]: E0311 00:54:58.714447 4744 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189ba3510cbec319 openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler},},Reason:Created,Message:Created container kube-scheduler,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-11 00:54:06.801928985 +0000 UTC m=+3.606146590,LastTimestamp:2026-03-11 00:54:06.801928985 +0000 UTC m=+3.606146590,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 11 00:54:58 crc kubenswrapper[4744]: E0311 00:54:58.720339 4744 event.go:359] "Server rejected event (will not retry!)" err="events 
is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189ba3510d274f36 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:Created,Message:Created container kube-apiserver,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-11 00:54:06.808780598 +0000 UTC m=+3.612998203,LastTimestamp:2026-03-11 00:54:06.808780598 +0000 UTC m=+3.612998203,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 11 00:54:58 crc kubenswrapper[4744]: E0311 00:54:58.726047 4744 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189ba3510d7618b8 openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler},},Reason:Started,Message:Started container kube-scheduler,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-11 00:54:06.813943992 +0000 UTC m=+3.618161597,LastTimestamp:2026-03-11 00:54:06.813943992 +0000 UTC m=+3.618161597,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 11 00:54:58 crc kubenswrapper[4744]: E0311 00:54:58.732475 4744 event.go:359] 
"Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189ba3510d8c11d1 openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler-cert-syncer},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-11 00:54:06.815384017 +0000 UTC m=+3.619601622,LastTimestamp:2026-03-11 00:54:06.815384017 +0000 UTC m=+3.619601622,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 11 00:54:58 crc kubenswrapper[4744]: E0311 00:54:58.737447 4744 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189ba3510ea94af4 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:Started,Message:Started container kube-apiserver,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-11 00:54:06.834076404 +0000 UTC m=+3.638294009,LastTimestamp:2026-03-11 00:54:06.834076404 +0000 UTC 
m=+3.638294009,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 11 00:54:58 crc kubenswrapper[4744]: E0311 00:54:58.742147 4744 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189ba3510ecc858a openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-cert-syncer},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-11 00:54:06.836385162 +0000 UTC m=+3.640602767,LastTimestamp:2026-03-11 00:54:06.836385162 +0000 UTC m=+3.640602767,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 11 00:54:58 crc kubenswrapper[4744]: E0311 00:54:58.748895 4744 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189ba35119c1ae83 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{etcd-resources-copy},},Reason:Pulled,Message:Container image 
\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-11 00:54:07.020224131 +0000 UTC m=+3.824441746,LastTimestamp:2026-03-11 00:54:07.020224131 +0000 UTC m=+3.824441746,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 11 00:54:58 crc kubenswrapper[4744]: E0311 00:54:58.756018 4744 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189ba3511cfd141d openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler-cert-syncer},},Reason:Created,Message:Created container kube-scheduler-cert-syncer,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-11 00:54:07.074448413 +0000 UTC m=+3.878666028,LastTimestamp:2026-03-11 00:54:07.074448413 +0000 UTC m=+3.878666028,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 11 00:54:58 crc kubenswrapper[4744]: E0311 00:54:58.762870 4744 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189ba3511d3afc59 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-cert-syncer},},Reason:Created,Message:Created container kube-apiserver-cert-syncer,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-11 00:54:07.078505561 +0000 UTC m=+3.882723166,LastTimestamp:2026-03-11 00:54:07.078505561 +0000 UTC m=+3.882723166,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 11 00:54:58 crc kubenswrapper[4744]: E0311 00:54:58.769404 4744 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189ba3511e92b9c6 openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler-cert-syncer},},Reason:Started,Message:Started container kube-scheduler-cert-syncer,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-11 00:54:07.101032902 +0000 UTC m=+3.905250507,LastTimestamp:2026-03-11 00:54:07.101032902 +0000 UTC m=+3.905250507,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 11 00:54:58 crc kubenswrapper[4744]: E0311 00:54:58.775721 4744 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" 
event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189ba3511ea5e744 openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler-recovery-controller},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-11 00:54:07.102289732 +0000 UTC m=+3.906507327,LastTimestamp:2026-03-11 00:54:07.102289732 +0000 UTC m=+3.906507327,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 11 00:54:58 crc kubenswrapper[4744]: E0311 00:54:58.782704 4744 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189ba3511eb485d4 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-cert-syncer},},Reason:Started,Message:Started container kube-apiserver-cert-syncer,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-11 00:54:07.103247828 +0000 UTC m=+3.907465453,LastTimestamp:2026-03-11 00:54:07.103247828 +0000 UTC m=+3.907465453,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 11 00:54:58 crc kubenswrapper[4744]: E0311 
00:54:58.788745 4744 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189ba3511ec89e40 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-cert-regeneration-controller},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-11 00:54:07.1045648 +0000 UTC m=+3.908782405,LastTimestamp:2026-03-11 00:54:07.1045648 +0000 UTC m=+3.908782405,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 11 00:54:58 crc kubenswrapper[4744]: E0311 00:54:58.794724 4744 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189ba3512b40ced8 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{etcd-resources-copy},},Reason:Created,Message:Created container etcd-resources-copy,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-11 00:54:07.313768152 +0000 UTC m=+4.117985757,LastTimestamp:2026-03-11 00:54:07.313768152 +0000 UTC 
m=+4.117985757,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 11 00:54:58 crc kubenswrapper[4744]: E0311 00:54:58.801959 4744 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189ba3512b8fec61 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-cert-regeneration-controller},},Reason:Created,Message:Created container kube-apiserver-cert-regeneration-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-11 00:54:07.318953057 +0000 UTC m=+4.123170662,LastTimestamp:2026-03-11 00:54:07.318953057 +0000 UTC m=+4.123170662,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 11 00:54:58 crc kubenswrapper[4744]: E0311 00:54:58.805133 4744 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189ba3512be8349e openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler-recovery-controller},},Reason:Created,Message:Created container 
kube-scheduler-recovery-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-11 00:54:07.324738718 +0000 UTC m=+4.128956343,LastTimestamp:2026-03-11 00:54:07.324738718 +0000 UTC m=+4.128956343,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 11 00:54:58 crc kubenswrapper[4744]: E0311 00:54:58.808348 4744 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189ba3512c535df2 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{etcd-resources-copy},},Reason:Started,Message:Started container etcd-resources-copy,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-11 00:54:07.33176165 +0000 UTC m=+4.135979275,LastTimestamp:2026-03-11 00:54:07.33176165 +0000 UTC m=+4.135979275,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 11 00:54:58 crc kubenswrapper[4744]: E0311 00:54:58.811810 4744 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189ba3512c769def openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-cert-regeneration-controller},},Reason:Started,Message:Started container kube-apiserver-cert-regeneration-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-11 00:54:07.334071791 +0000 UTC m=+4.138289406,LastTimestamp:2026-03-11 00:54:07.334071791 +0000 UTC m=+4.138289406,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 11 00:54:58 crc kubenswrapper[4744]: E0311 00:54:58.813844 4744 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189ba3512c8833ec openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-insecure-readyz},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-11 00:54:07.3352243 +0000 UTC m=+4.139441915,LastTimestamp:2026-03-11 00:54:07.3352243 +0000 UTC m=+4.139441915,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 11 00:54:58 crc kubenswrapper[4744]: E0311 00:54:58.819205 4744 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User 
\"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189ba3512ce7c567 openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler-recovery-controller},},Reason:Started,Message:Started container kube-scheduler-recovery-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-11 00:54:07.341487463 +0000 UTC m=+4.145705078,LastTimestamp:2026-03-11 00:54:07.341487463 +0000 UTC m=+4.145705078,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 11 00:54:58 crc kubenswrapper[4744]: E0311 00:54:58.826087 4744 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189ba351376668b1 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-insecure-readyz},},Reason:Created,Message:Created container kube-apiserver-insecure-readyz,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-11 00:54:07.517558961 +0000 UTC m=+4.321776566,LastTimestamp:2026-03-11 00:54:07.517558961 +0000 UTC m=+4.321776566,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 11 00:54:58 crc 
kubenswrapper[4744]: E0311 00:54:58.830686 4744 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189ba351382618e8 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-insecure-readyz},},Reason:Started,Message:Started container kube-apiserver-insecure-readyz,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-11 00:54:07.530121448 +0000 UTC m=+4.334339063,LastTimestamp:2026-03-11 00:54:07.530121448 +0000 UTC m=+4.334339063,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 11 00:54:58 crc kubenswrapper[4744]: E0311 00:54:58.836862 4744 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189ba351383f7ef3 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-check-endpoints},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-11 00:54:07.531785971 +0000 UTC 
m=+4.336003586,LastTimestamp:2026-03-11 00:54:07.531785971 +0000 UTC m=+4.336003586,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 11 00:54:58 crc kubenswrapper[4744]: E0311 00:54:58.842918 4744 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189ba351442cf5c2 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-check-endpoints},},Reason:Created,Message:Created container kube-apiserver-check-endpoints,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-11 00:54:07.731897794 +0000 UTC m=+4.536115429,LastTimestamp:2026-03-11 00:54:07.731897794 +0000 UTC m=+4.536115429,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 11 00:54:58 crc kubenswrapper[4744]: E0311 00:54:58.850058 4744 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189ba35144fd56b4 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-check-endpoints},},Reason:Started,Message:Started container 
kube-apiserver-check-endpoints,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-11 00:54:07.7455541 +0000 UTC m=+4.549771735,LastTimestamp:2026-03-11 00:54:07.7455541 +0000 UTC m=+4.549771735,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 11 00:54:58 crc kubenswrapper[4744]: E0311 00:54:58.856077 4744 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189ba351584b9318 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcdctl},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-11 00:54:08.069448472 +0000 UTC m=+4.873666117,LastTimestamp:2026-03-11 00:54:08.069448472 +0000 UTC m=+4.873666117,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 11 00:54:58 crc kubenswrapper[4744]: E0311 00:54:58.863068 4744 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189ba35167b138af openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcdctl},},Reason:Created,Message:Created container etcdctl,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-11 00:54:08.327768239 +0000 UTC m=+5.131985884,LastTimestamp:2026-03-11 00:54:08.327768239 +0000 UTC m=+5.131985884,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 11 00:54:58 crc kubenswrapper[4744]: E0311 00:54:58.869303 4744 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189ba3516875b630 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcdctl},},Reason:Started,Message:Started container etcdctl,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-11 00:54:08.340645424 +0000 UTC m=+5.144863069,LastTimestamp:2026-03-11 00:54:08.340645424 +0000 UTC m=+5.144863069,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 11 00:54:58 crc kubenswrapper[4744]: E0311 00:54:58.875064 4744 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189ba35168902d66 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-11 00:54:08.342379878 +0000 UTC m=+5.146597513,LastTimestamp:2026-03-11 00:54:08.342379878 +0000 UTC m=+5.146597513,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 11 00:54:58 crc kubenswrapper[4744]: I0311 00:54:58.881447 4744 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 11 00:54:58 crc kubenswrapper[4744]: E0311 00:54:58.881889 4744 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189ba3517a328fea openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd},},Reason:Created,Message:Created container etcd,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-11 00:54:08.638234602 +0000 UTC m=+5.442452247,LastTimestamp:2026-03-11 00:54:08.638234602 +0000 UTC m=+5.442452247,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 11 00:54:58 crc kubenswrapper[4744]: E0311 
00:54:58.885119 4744 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189ba3517b67e8c1 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd},},Reason:Started,Message:Started container etcd,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-11 00:54:08.658507969 +0000 UTC m=+5.462725604,LastTimestamp:2026-03-11 00:54:08.658507969 +0000 UTC m=+5.462725604,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 11 00:54:58 crc kubenswrapper[4744]: E0311 00:54:58.889273 4744 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189ba3517b93285e openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-metrics},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-11 00:54:08.661342302 +0000 UTC m=+5.465559907,LastTimestamp:2026-03-11 00:54:08.661342302 +0000 UTC m=+5.465559907,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 11 
00:54:58 crc kubenswrapper[4744]: E0311 00:54:58.894441 4744 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189ba3518b899490 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-metrics},},Reason:Created,Message:Created container etcd-metrics,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-11 00:54:08.929150096 +0000 UTC m=+5.733367741,LastTimestamp:2026-03-11 00:54:08.929150096 +0000 UTC m=+5.733367741,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 11 00:54:58 crc kubenswrapper[4744]: E0311 00:54:58.899134 4744 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189ba3518c82b791 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-metrics},},Reason:Started,Message:Started container etcd-metrics,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-11 00:54:08.945477521 +0000 UTC m=+5.749695166,LastTimestamp:2026-03-11 00:54:08.945477521 +0000 UTC m=+5.749695166,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 11 00:54:58 crc kubenswrapper[4744]: E0311 00:54:58.907018 4744 event.go:359] 
"Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189ba3518c9b7fb7 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-readyz},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-11 00:54:08.947101623 +0000 UTC m=+5.751319258,LastTimestamp:2026-03-11 00:54:08.947101623 +0000 UTC m=+5.751319258,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 11 00:54:58 crc kubenswrapper[4744]: E0311 00:54:58.916425 4744 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189ba3519cde0d34 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-readyz},},Reason:Created,Message:Created container etcd-readyz,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-11 00:54:09.219898676 +0000 UTC m=+6.024116311,LastTimestamp:2026-03-11 00:54:09.219898676 +0000 UTC m=+6.024116311,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 11 00:54:58 crc 
kubenswrapper[4744]: E0311 00:54:58.923008 4744 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189ba3519e15f07e openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-readyz},},Reason:Started,Message:Started container etcd-readyz,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-11 00:54:09.240338558 +0000 UTC m=+6.044556203,LastTimestamp:2026-03-11 00:54:09.240338558 +0000 UTC m=+6.044556203,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 11 00:54:58 crc kubenswrapper[4744]: E0311 00:54:58.929388 4744 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189ba3519e30c864 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-rev},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-11 00:54:09.242097764 +0000 UTC m=+6.046315409,LastTimestamp:2026-03-11 00:54:09.242097764 +0000 UTC m=+6.046315409,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 
UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 11 00:54:58 crc kubenswrapper[4744]: E0311 00:54:58.938159 4744 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189ba351adb8a8dd openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-rev},},Reason:Created,Message:Created container etcd-rev,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-11 00:54:09.502660829 +0000 UTC m=+6.306878464,LastTimestamp:2026-03-11 00:54:09.502660829 +0000 UTC m=+6.306878464,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 11 00:54:58 crc kubenswrapper[4744]: E0311 00:54:58.945181 4744 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189ba351aee68111 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-rev},},Reason:Started,Message:Started container etcd-rev,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-11 00:54:09.522442513 +0000 UTC m=+6.326660149,LastTimestamp:2026-03-11 00:54:09.522442513 +0000 UTC m=+6.326660149,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 11 
00:54:58 crc kubenswrapper[4744]: E0311 00:54:58.955245 4744 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event=< Mar 11 00:54:58 crc kubenswrapper[4744]: &Event{ObjectMeta:{kube-controller-manager-crc.189ba353a0f98a10 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:ProbeError,Message:Startup probe error: Get "https://192.168.126.11:10357/healthz": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers) Mar 11 00:54:58 crc kubenswrapper[4744]: body: Mar 11 00:54:58 crc kubenswrapper[4744]: ,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-11 00:54:17.878743568 +0000 UTC m=+14.682961213,LastTimestamp:2026-03-11 00:54:17.878743568 +0000 UTC m=+14.682961213,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,} Mar 11 00:54:58 crc kubenswrapper[4744]: > Mar 11 00:54:58 crc kubenswrapper[4744]: E0311 00:54:58.962357 4744 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189ba353a0fb46c3 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Unhealthy,Message:Startup probe failed: Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers),Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-11 00:54:17.878857411 +0000 UTC m=+14.683075056,LastTimestamp:2026-03-11 00:54:17.878857411 +0000 UTC m=+14.683075056,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 11 00:54:58 crc kubenswrapper[4744]: I0311 00:54:58.962503 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Mar 11 00:54:58 crc kubenswrapper[4744]: I0311 00:54:58.962789 4744 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 11 00:54:58 crc kubenswrapper[4744]: I0311 00:54:58.964458 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 00:54:58 crc kubenswrapper[4744]: I0311 00:54:58.964547 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 00:54:58 crc kubenswrapper[4744]: I0311 00:54:58.964566 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 00:54:58 crc kubenswrapper[4744]: E0311 00:54:58.969088 4744 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event=< Mar 11 00:54:58 crc kubenswrapper[4744]: 
&Event{ObjectMeta:{kube-apiserver-crc.189ba353bd886a33 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:ProbeError,Message:Startup probe error: HTTP probe failed with statuscode: 403 Mar 11 00:54:58 crc kubenswrapper[4744]: body: {"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/livez\"","reason":"Forbidden","details":{},"code":403} Mar 11 00:54:58 crc kubenswrapper[4744]: Mar 11 00:54:58 crc kubenswrapper[4744]: ,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-11 00:54:18.357869107 +0000 UTC m=+15.162086722,LastTimestamp:2026-03-11 00:54:18.357869107 +0000 UTC m=+15.162086722,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,} Mar 11 00:54:58 crc kubenswrapper[4744]: > Mar 11 00:54:58 crc kubenswrapper[4744]: E0311 00:54:58.975630 4744 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189ba353bd89e22f openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:Unhealthy,Message:Startup probe failed: HTTP probe failed with statuscode: 403,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-11 00:54:18.357965359 +0000 UTC m=+15.162182974,LastTimestamp:2026-03-11 
00:54:18.357965359 +0000 UTC m=+15.162182974,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 11 00:54:58 crc kubenswrapper[4744]: E0311 00:54:58.983205 4744 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-apiserver-crc.189ba353bd886a33\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event=< Mar 11 00:54:58 crc kubenswrapper[4744]: &Event{ObjectMeta:{kube-apiserver-crc.189ba353bd886a33 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:ProbeError,Message:Startup probe error: HTTP probe failed with statuscode: 403 Mar 11 00:54:58 crc kubenswrapper[4744]: body: {"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/livez\"","reason":"Forbidden","details":{},"code":403} Mar 11 00:54:58 crc kubenswrapper[4744]: Mar 11 00:54:58 crc kubenswrapper[4744]: ,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-11 00:54:18.357869107 +0000 UTC m=+15.162086722,LastTimestamp:2026-03-11 00:54:18.363076672 +0000 UTC m=+15.167294287,Count:2,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,} Mar 11 00:54:58 crc kubenswrapper[4744]: > Mar 11 00:54:58 crc kubenswrapper[4744]: E0311 00:54:58.990696 4744 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-apiserver-crc.189ba353bd89e22f\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" 
event="&Event{ObjectMeta:{kube-apiserver-crc.189ba353bd89e22f openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:Unhealthy,Message:Startup probe failed: HTTP probe failed with statuscode: 403,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-11 00:54:18.357965359 +0000 UTC m=+15.162182974,LastTimestamp:2026-03-11 00:54:18.363142544 +0000 UTC m=+15.167360159,Count:2,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 11 00:54:58 crc kubenswrapper[4744]: E0311 00:54:58.998081 4744 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-apiserver-crc.189ba351383f7ef3\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189ba351383f7ef3 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-check-endpoints},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-11 00:54:07.531785971 +0000 UTC m=+4.336003586,LastTimestamp:2026-03-11 00:54:19.160022484 +0000 UTC m=+15.964240129,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 11 
00:54:59 crc kubenswrapper[4744]: E0311 00:54:59.005646 4744 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-apiserver-crc.189ba351442cf5c2\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189ba351442cf5c2 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-check-endpoints},},Reason:Created,Message:Created container kube-apiserver-check-endpoints,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-11 00:54:07.731897794 +0000 UTC m=+4.536115429,LastTimestamp:2026-03-11 00:54:19.583340402 +0000 UTC m=+16.387558027,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 11 00:54:59 crc kubenswrapper[4744]: E0311 00:54:59.019139 4744 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-apiserver-crc.189ba35144fd56b4\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189ba35144fd56b4 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-check-endpoints},},Reason:Started,Message:Started container kube-apiserver-check-endpoints,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-11 00:54:07.7455541 +0000 UTC m=+4.549771735,LastTimestamp:2026-03-11 00:54:19.691038593 +0000 
UTC m=+16.495256198,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 11 00:54:59 crc kubenswrapper[4744]: E0311 00:54:59.024378 4744 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-controller-manager-crc.189ba353a0f98a10\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event=< Mar 11 00:54:59 crc kubenswrapper[4744]: &Event{ObjectMeta:{kube-controller-manager-crc.189ba353a0f98a10 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:ProbeError,Message:Startup probe error: Get "https://192.168.126.11:10357/healthz": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers) Mar 11 00:54:59 crc kubenswrapper[4744]: body: Mar 11 00:54:59 crc kubenswrapper[4744]: ,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-11 00:54:17.878743568 +0000 UTC m=+14.682961213,LastTimestamp:2026-03-11 00:54:27.880208311 +0000 UTC m=+24.684425996,Count:2,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,} Mar 11 00:54:59 crc kubenswrapper[4744]: > Mar 11 00:54:59 crc kubenswrapper[4744]: E0311 00:54:59.030981 4744 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-controller-manager-crc.189ba353a0fb46c3\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" 
event="&Event{ObjectMeta:{kube-controller-manager-crc.189ba353a0fb46c3 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Unhealthy,Message:Startup probe failed: Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers),Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-11 00:54:17.878857411 +0000 UTC m=+14.683075056,LastTimestamp:2026-03-11 00:54:27.880454577 +0000 UTC m=+24.684672222,Count:2,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 11 00:54:59 crc kubenswrapper[4744]: E0311 00:54:59.040278 4744 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event=< Mar 11 00:54:59 crc kubenswrapper[4744]: &Event{ObjectMeta:{kube-controller-manager-crc.189ba3582bf01aff openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:ProbeError,Message:Startup probe error: Get "https://192.168.126.11:10357/healthz": read tcp 192.168.126.11:57992->192.168.126.11:10357: read: connection reset by peer Mar 11 00:54:59 crc kubenswrapper[4744]: body: Mar 11 00:54:59 crc kubenswrapper[4744]: ,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-11 
00:54:37.390027519 +0000 UTC m=+34.194245144,LastTimestamp:2026-03-11 00:54:37.390027519 +0000 UTC m=+34.194245144,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,} Mar 11 00:54:59 crc kubenswrapper[4744]: > Mar 11 00:54:59 crc kubenswrapper[4744]: E0311 00:54:59.047476 4744 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189ba3582bf11db7 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Unhealthy,Message:Startup probe failed: Get \"https://192.168.126.11:10357/healthz\": read tcp 192.168.126.11:57992->192.168.126.11:10357: read: connection reset by peer,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-11 00:54:37.390093751 +0000 UTC m=+34.194311376,LastTimestamp:2026-03-11 00:54:37.390093751 +0000 UTC m=+34.194311376,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 11 00:54:59 crc kubenswrapper[4744]: E0311 00:54:59.053996 4744 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189ba3582c334739 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Killing,Message:Container cluster-policy-controller failed startup probe, will be restarted,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-11 00:54:37.394429753 +0000 UTC m=+34.198647398,LastTimestamp:2026-03-11 00:54:37.394429753 +0000 UTC m=+34.198647398,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 11 00:54:59 crc kubenswrapper[4744]: E0311 00:54:59.061185 4744 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-controller-manager-crc.189ba350ad2d0a86\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189ba350ad2d0a86 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-11 00:54:05.198543494 +0000 UTC m=+2.002761099,LastTimestamp:2026-03-11 00:54:37.420168732 +0000 UTC m=+34.224386357,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 11 00:54:59 crc kubenswrapper[4744]: E0311 00:54:59.068957 4744 
event.go:359] "Server rejected event (will not retry!)" err="events \"kube-controller-manager-crc.189ba350c2d089c7\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189ba350c2d089c7 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Created,Message:Created container cluster-policy-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-11 00:54:05.561579975 +0000 UTC m=+2.365797590,LastTimestamp:2026-03-11 00:54:37.696672232 +0000 UTC m=+34.500889847,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 11 00:54:59 crc kubenswrapper[4744]: E0311 00:54:59.076181 4744 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-controller-manager-crc.189ba350c3f586cd\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189ba350c3f586cd openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Started,Message:Started container cluster-policy-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-11 00:54:05.580781261 +0000 UTC 
m=+2.384998896,LastTimestamp:2026-03-11 00:54:37.710354248 +0000 UTC m=+34.514571863,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 11 00:54:59 crc kubenswrapper[4744]: E0311 00:54:59.086569 4744 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-controller-manager-crc.189ba353a0f98a10\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event=< Mar 11 00:54:59 crc kubenswrapper[4744]: &Event{ObjectMeta:{kube-controller-manager-crc.189ba353a0f98a10 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:ProbeError,Message:Startup probe error: Get "https://192.168.126.11:10357/healthz": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers) Mar 11 00:54:59 crc kubenswrapper[4744]: body: Mar 11 00:54:59 crc kubenswrapper[4744]: ,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-11 00:54:17.878743568 +0000 UTC m=+14.682961213,LastTimestamp:2026-03-11 00:54:47.878589917 +0000 UTC m=+44.682807592,Count:3,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,} Mar 11 00:54:59 crc kubenswrapper[4744]: > Mar 11 00:54:59 crc kubenswrapper[4744]: E0311 00:54:59.093971 4744 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-controller-manager-crc.189ba353a0fb46c3\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" 
event="&Event{ObjectMeta:{kube-controller-manager-crc.189ba353a0fb46c3 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Unhealthy,Message:Startup probe failed: Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers),Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-11 00:54:17.878857411 +0000 UTC m=+14.683075056,LastTimestamp:2026-03-11 00:54:47.878670859 +0000 UTC m=+44.682888494,Count:3,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 11 00:54:59 crc kubenswrapper[4744]: E0311 00:54:59.104302 4744 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-controller-manager-crc.189ba353a0f98a10\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event=< Mar 11 00:54:59 crc kubenswrapper[4744]: &Event{ObjectMeta:{kube-controller-manager-crc.189ba353a0f98a10 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:ProbeError,Message:Startup probe error: Get "https://192.168.126.11:10357/healthz": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers) Mar 11 00:54:59 crc kubenswrapper[4744]: body: Mar 11 00:54:59 crc kubenswrapper[4744]: 
,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-11 00:54:17.878743568 +0000 UTC m=+14.682961213,LastTimestamp:2026-03-11 00:54:57.879075618 +0000 UTC m=+54.683293263,Count:4,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,} Mar 11 00:54:59 crc kubenswrapper[4744]: > Mar 11 00:54:59 crc kubenswrapper[4744]: E0311 00:54:59.769637 4744 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"crc\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="7s" Mar 11 00:54:59 crc kubenswrapper[4744]: I0311 00:54:59.785200 4744 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 11 00:54:59 crc kubenswrapper[4744]: I0311 00:54:59.787306 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 00:54:59 crc kubenswrapper[4744]: I0311 00:54:59.787548 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 00:54:59 crc kubenswrapper[4744]: I0311 00:54:59.787730 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 00:54:59 crc kubenswrapper[4744]: I0311 00:54:59.787927 4744 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 11 00:54:59 crc kubenswrapper[4744]: E0311 00:54:59.795046 4744 kubelet_node_status.go:99] "Unable to register node with API server" err="nodes is forbidden: User \"system:anonymous\" cannot create resource \"nodes\" in API group \"\" at the cluster scope" node="crc" Mar 11 00:54:59 crc kubenswrapper[4744]: I0311 00:54:59.883114 4744 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User 
"system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 11 00:55:00 crc kubenswrapper[4744]: I0311 00:55:00.882623 4744 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 11 00:55:01 crc kubenswrapper[4744]: I0311 00:55:01.889322 4744 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 11 00:55:02 crc kubenswrapper[4744]: I0311 00:55:02.884830 4744 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 11 00:55:03 crc kubenswrapper[4744]: I0311 00:55:03.884041 4744 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 11 00:55:04 crc kubenswrapper[4744]: E0311 00:55:04.037478 4744 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Mar 11 00:55:04 crc kubenswrapper[4744]: I0311 00:55:04.885864 4744 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 11 00:55:05 crc kubenswrapper[4744]: I0311 00:55:05.502718 4744 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" 
pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 11 00:55:05 crc kubenswrapper[4744]: I0311 00:55:05.502893 4744 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 11 00:55:05 crc kubenswrapper[4744]: I0311 00:55:05.504388 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 00:55:05 crc kubenswrapper[4744]: I0311 00:55:05.504435 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 00:55:05 crc kubenswrapper[4744]: I0311 00:55:05.504447 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 00:55:05 crc kubenswrapper[4744]: I0311 00:55:05.510469 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 11 00:55:05 crc kubenswrapper[4744]: I0311 00:55:05.885298 4744 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 11 00:55:06 crc kubenswrapper[4744]: I0311 00:55:06.334184 4744 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 11 00:55:06 crc kubenswrapper[4744]: I0311 00:55:06.335650 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 00:55:06 crc kubenswrapper[4744]: I0311 00:55:06.335698 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 00:55:06 crc kubenswrapper[4744]: I0311 00:55:06.335708 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 00:55:06 crc kubenswrapper[4744]: E0311 
00:55:06.775537 4744 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"crc\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="7s" Mar 11 00:55:06 crc kubenswrapper[4744]: I0311 00:55:06.796121 4744 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 11 00:55:06 crc kubenswrapper[4744]: I0311 00:55:06.797728 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 00:55:06 crc kubenswrapper[4744]: I0311 00:55:06.797798 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 00:55:06 crc kubenswrapper[4744]: I0311 00:55:06.797818 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 00:55:06 crc kubenswrapper[4744]: I0311 00:55:06.797862 4744 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 11 00:55:06 crc kubenswrapper[4744]: E0311 00:55:06.803147 4744 kubelet_node_status.go:99] "Unable to register node with API server" err="nodes is forbidden: User \"system:anonymous\" cannot create resource \"nodes\" in API group \"\" at the cluster scope" node="crc" Mar 11 00:55:06 crc kubenswrapper[4744]: I0311 00:55:06.890672 4744 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 11 00:55:07 crc kubenswrapper[4744]: I0311 00:55:07.882182 4744 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 11 00:55:07 crc 
kubenswrapper[4744]: I0311 00:55:07.974742 4744 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 11 00:55:07 crc kubenswrapper[4744]: I0311 00:55:07.977397 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 00:55:07 crc kubenswrapper[4744]: I0311 00:55:07.977455 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 00:55:07 crc kubenswrapper[4744]: I0311 00:55:07.977468 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 00:55:07 crc kubenswrapper[4744]: I0311 00:55:07.978252 4744 scope.go:117] "RemoveContainer" containerID="f05226ed47e0d3456b502ba975307afdc37df02206c574c7775b2e601250db49" Mar 11 00:55:08 crc kubenswrapper[4744]: I0311 00:55:08.342716 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/2.log" Mar 11 00:55:08 crc kubenswrapper[4744]: I0311 00:55:08.344928 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"da0b8a54609bfac02c7fac32aa814351cea89b342ba3bd610c6b88e31610ff09"} Mar 11 00:55:08 crc kubenswrapper[4744]: I0311 00:55:08.345121 4744 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 11 00:55:08 crc kubenswrapper[4744]: I0311 00:55:08.346232 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 00:55:08 crc kubenswrapper[4744]: I0311 00:55:08.346277 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 00:55:08 crc kubenswrapper[4744]: I0311 00:55:08.346287 4744 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 00:55:08 crc kubenswrapper[4744]: I0311 00:55:08.881787 4744 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 11 00:55:09 crc kubenswrapper[4744]: I0311 00:55:09.350040 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/3.log" Mar 11 00:55:09 crc kubenswrapper[4744]: I0311 00:55:09.350793 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/2.log" Mar 11 00:55:09 crc kubenswrapper[4744]: I0311 00:55:09.353702 4744 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="da0b8a54609bfac02c7fac32aa814351cea89b342ba3bd610c6b88e31610ff09" exitCode=255 Mar 11 00:55:09 crc kubenswrapper[4744]: I0311 00:55:09.353776 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"da0b8a54609bfac02c7fac32aa814351cea89b342ba3bd610c6b88e31610ff09"} Mar 11 00:55:09 crc kubenswrapper[4744]: I0311 00:55:09.353882 4744 scope.go:117] "RemoveContainer" containerID="f05226ed47e0d3456b502ba975307afdc37df02206c574c7775b2e601250db49" Mar 11 00:55:09 crc kubenswrapper[4744]: I0311 00:55:09.354050 4744 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 11 00:55:09 crc kubenswrapper[4744]: I0311 00:55:09.355242 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 00:55:09 crc 
kubenswrapper[4744]: I0311 00:55:09.355290 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 00:55:09 crc kubenswrapper[4744]: I0311 00:55:09.355309 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 00:55:09 crc kubenswrapper[4744]: I0311 00:55:09.356356 4744 scope.go:117] "RemoveContainer" containerID="da0b8a54609bfac02c7fac32aa814351cea89b342ba3bd610c6b88e31610ff09" Mar 11 00:55:09 crc kubenswrapper[4744]: E0311 00:55:09.356730 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 11 00:55:09 crc kubenswrapper[4744]: I0311 00:55:09.886120 4744 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 11 00:55:10 crc kubenswrapper[4744]: I0311 00:55:10.359193 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/3.log" Mar 11 00:55:10 crc kubenswrapper[4744]: I0311 00:55:10.880726 4744 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 11 00:55:11 crc kubenswrapper[4744]: I0311 00:55:11.884267 4744 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" 
is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 11 00:55:12 crc kubenswrapper[4744]: I0311 00:55:12.653155 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 11 00:55:12 crc kubenswrapper[4744]: I0311 00:55:12.653470 4744 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 11 00:55:12 crc kubenswrapper[4744]: I0311 00:55:12.655782 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 00:55:12 crc kubenswrapper[4744]: I0311 00:55:12.655994 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 00:55:12 crc kubenswrapper[4744]: I0311 00:55:12.656150 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 00:55:12 crc kubenswrapper[4744]: I0311 00:55:12.657269 4744 scope.go:117] "RemoveContainer" containerID="da0b8a54609bfac02c7fac32aa814351cea89b342ba3bd610c6b88e31610ff09" Mar 11 00:55:12 crc kubenswrapper[4744]: E0311 00:55:12.657784 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 11 00:55:12 crc kubenswrapper[4744]: I0311 00:55:12.883645 4744 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 11 00:55:13 crc kubenswrapper[4744]: E0311 00:55:13.782491 4744 
controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"crc\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="7s" Mar 11 00:55:13 crc kubenswrapper[4744]: I0311 00:55:13.803804 4744 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 11 00:55:13 crc kubenswrapper[4744]: I0311 00:55:13.810347 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 00:55:13 crc kubenswrapper[4744]: I0311 00:55:13.810426 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 00:55:13 crc kubenswrapper[4744]: I0311 00:55:13.810450 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 00:55:13 crc kubenswrapper[4744]: I0311 00:55:13.810496 4744 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 11 00:55:13 crc kubenswrapper[4744]: E0311 00:55:13.818496 4744 kubelet_node_status.go:99] "Unable to register node with API server" err="nodes is forbidden: User \"system:anonymous\" cannot create resource \"nodes\" in API group \"\" at the cluster scope" node="crc" Mar 11 00:55:13 crc kubenswrapper[4744]: I0311 00:55:13.884444 4744 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 11 00:55:14 crc kubenswrapper[4744]: E0311 00:55:14.037654 4744 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Mar 11 00:55:14 crc kubenswrapper[4744]: I0311 00:55:14.641347 4744 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" 
pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 11 00:55:14 crc kubenswrapper[4744]: I0311 00:55:14.641602 4744 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 11 00:55:14 crc kubenswrapper[4744]: I0311 00:55:14.643079 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 00:55:14 crc kubenswrapper[4744]: I0311 00:55:14.643142 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 00:55:14 crc kubenswrapper[4744]: I0311 00:55:14.643163 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 00:55:14 crc kubenswrapper[4744]: I0311 00:55:14.643973 4744 scope.go:117] "RemoveContainer" containerID="da0b8a54609bfac02c7fac32aa814351cea89b342ba3bd610c6b88e31610ff09" Mar 11 00:55:14 crc kubenswrapper[4744]: E0311 00:55:14.644245 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 11 00:55:14 crc kubenswrapper[4744]: I0311 00:55:14.882855 4744 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 11 00:55:15 crc kubenswrapper[4744]: I0311 00:55:15.883971 4744 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 11 00:55:16 
crc kubenswrapper[4744]: I0311 00:55:16.022081 4744 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates Mar 11 00:55:16 crc kubenswrapper[4744]: I0311 00:55:16.041700 4744 reflector.go:368] Caches populated for *v1.CertificateSigningRequest from k8s.io/client-go/tools/watch/informerwatcher.go:146 Mar 11 00:55:16 crc kubenswrapper[4744]: I0311 00:55:16.883949 4744 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 11 00:55:17 crc kubenswrapper[4744]: I0311 00:55:17.881153 4744 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 11 00:55:18 crc kubenswrapper[4744]: W0311 00:55:18.775063 4744 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: nodes "crc" is forbidden: User "system:anonymous" cannot list resource "nodes" in API group "" at the cluster scope Mar 11 00:55:18 crc kubenswrapper[4744]: E0311 00:55:18.775558 4744 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: nodes \"crc\" is forbidden: User \"system:anonymous\" cannot list resource \"nodes\" in API group \"\" at the cluster scope" logger="UnhandledError" Mar 11 00:55:18 crc kubenswrapper[4744]: I0311 00:55:18.884973 4744 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 11 00:55:18 crc kubenswrapper[4744]: I0311 00:55:18.964143 4744 csr.go:261] certificate signing request csr-m6g4j is 
approved, waiting to be issued Mar 11 00:55:18 crc kubenswrapper[4744]: I0311 00:55:18.975450 4744 csr.go:257] certificate signing request csr-m6g4j is issued Mar 11 00:55:19 crc kubenswrapper[4744]: I0311 00:55:19.071242 4744 reconstruct.go:205] "DevicePaths of reconstructed volumes updated" Mar 11 00:55:19 crc kubenswrapper[4744]: I0311 00:55:19.648782 4744 reflector.go:368] Caches populated for *v1.Service from k8s.io/client-go/informers/factory.go:160 Mar 11 00:55:19 crc kubenswrapper[4744]: I0311 00:55:19.702721 4744 transport.go:147] "Certificate rotation detected, shutting down client connections to start using new credentials" Mar 11 00:55:19 crc kubenswrapper[4744]: W0311 00:55:19.703253 4744 reflector.go:484] k8s.io/client-go/informers/factory.go:160: watch of *v1.Service ended with: very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received Mar 11 00:55:19 crc kubenswrapper[4744]: I0311 00:55:19.978181 4744 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate expiration is 2027-02-24 05:54:36 +0000 UTC, rotation deadline is 2026-11-28 12:36:15.543000953 +0000 UTC Mar 11 00:55:19 crc kubenswrapper[4744]: I0311 00:55:19.978234 4744 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Waiting 6299h40m55.564772502s for next certificate rotation Mar 11 00:55:20 crc kubenswrapper[4744]: I0311 00:55:20.818603 4744 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 11 00:55:20 crc kubenswrapper[4744]: I0311 00:55:20.821930 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 00:55:20 crc kubenswrapper[4744]: I0311 00:55:20.821990 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 00:55:20 crc kubenswrapper[4744]: I0311 00:55:20.822008 4744 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 00:55:20 crc kubenswrapper[4744]: I0311 00:55:20.822248 4744 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 11 00:55:20 crc kubenswrapper[4744]: I0311 00:55:20.836561 4744 kubelet_node_status.go:115] "Node was previously registered" node="crc" Mar 11 00:55:20 crc kubenswrapper[4744]: I0311 00:55:20.837076 4744 kubelet_node_status.go:79] "Successfully registered node" node="crc" Mar 11 00:55:20 crc kubenswrapper[4744]: E0311 00:55:20.837132 4744 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"crc\": node \"crc\" not found" Mar 11 00:55:20 crc kubenswrapper[4744]: I0311 00:55:20.842883 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 00:55:20 crc kubenswrapper[4744]: I0311 00:55:20.843003 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 00:55:20 crc kubenswrapper[4744]: I0311 00:55:20.843025 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 00:55:20 crc kubenswrapper[4744]: I0311 00:55:20.843054 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 00:55:20 crc kubenswrapper[4744]: I0311 00:55:20.843074 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T00:55:20Z","lastTransitionTime":"2026-03-11T00:55:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 11 00:55:20 crc kubenswrapper[4744]: E0311 00:55:20.862821 4744 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-11T00:55:20Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-11T00:55:20Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-11T00:55:20Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-11T00:55:20Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-11T00:55:20Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-11T00:55:20Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-11T00:55:20Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-11T00:55:20Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"51e320bc-e184-46b0-b151-baf1fef55472\\\",\\\"systemUUID\\\":\\\"b27597af-c36d-4084-a073-6dfdbb017181\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 11 00:55:20 crc kubenswrapper[4744]: I0311 00:55:20.874558 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 00:55:20 crc kubenswrapper[4744]: I0311 00:55:20.874611 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 00:55:20 crc kubenswrapper[4744]: I0311 00:55:20.874629 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 00:55:20 crc kubenswrapper[4744]: I0311 00:55:20.874653 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 00:55:20 crc kubenswrapper[4744]: I0311 00:55:20.874672 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T00:55:20Z","lastTransitionTime":"2026-03-11T00:55:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 11 00:55:20 crc kubenswrapper[4744]: E0311 00:55:20.889267 4744 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-11T00:55:20Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-11T00:55:20Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-11T00:55:20Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-11T00:55:20Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-11T00:55:20Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-11T00:55:20Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-11T00:55:20Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-11T00:55:20Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"51e320bc-e184-46b0-b151-baf1fef55472\\\",\\\"systemUUID\\\":\\\"b27597af-c36d-4084-a073-6dfdbb017181\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 11 00:55:20 crc kubenswrapper[4744]: I0311 00:55:20.900155 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 00:55:20 crc kubenswrapper[4744]: I0311 00:55:20.900241 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 00:55:20 crc kubenswrapper[4744]: I0311 00:55:20.900261 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 00:55:20 crc kubenswrapper[4744]: I0311 00:55:20.900289 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 00:55:20 crc kubenswrapper[4744]: I0311 00:55:20.900309 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T00:55:20Z","lastTransitionTime":"2026-03-11T00:55:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 11 00:55:20 crc kubenswrapper[4744]: E0311 00:55:20.915301 4744 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-11T00:55:20Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-11T00:55:20Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-11T00:55:20Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-11T00:55:20Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-11T00:55:20Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-11T00:55:20Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-11T00:55:20Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-11T00:55:20Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"51e320bc-e184-46b0-b151-baf1fef55472\\\",\\\"systemUUID\\\":\\\"b27597af-c36d-4084-a073-6dfdbb017181\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 11 00:55:20 crc kubenswrapper[4744]: I0311 00:55:20.925768 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 00:55:20 crc kubenswrapper[4744]: I0311 00:55:20.925842 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 00:55:20 crc kubenswrapper[4744]: I0311 00:55:20.925904 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 00:55:20 crc kubenswrapper[4744]: I0311 00:55:20.925934 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 00:55:20 crc kubenswrapper[4744]: I0311 00:55:20.925955 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T00:55:20Z","lastTransitionTime":"2026-03-11T00:55:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 11 00:55:20 crc kubenswrapper[4744]: E0311 00:55:20.942102 4744 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-11T00:55:20Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-11T00:55:20Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-11T00:55:20Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-11T00:55:20Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-11T00:55:20Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-11T00:55:20Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-11T00:55:20Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-11T00:55:20Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"51e320bc-e184-46b0-b151-baf1fef55472\\\",\\\"systemUUID\\\":\\\"b27597af-c36d-4084-a073-6dfdbb017181\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 11 00:55:20 crc kubenswrapper[4744]: E0311 00:55:20.942931 4744 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 11 00:55:20 crc kubenswrapper[4744]: E0311 00:55:20.943141 4744 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 11 00:55:21 crc kubenswrapper[4744]: E0311 00:55:21.043812 4744 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 11 00:55:21 crc kubenswrapper[4744]: E0311 00:55:21.144822 4744 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 11 00:55:21 crc kubenswrapper[4744]: E0311 00:55:21.245017 4744 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 11 00:55:21 crc kubenswrapper[4744]: E0311 00:55:21.346048 4744 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 11 00:55:21 crc kubenswrapper[4744]: E0311 00:55:21.447303 4744 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 11 00:55:21 crc kubenswrapper[4744]: E0311 00:55:21.548257 4744 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 11 00:55:21 crc kubenswrapper[4744]: E0311 00:55:21.648545 4744 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 11 00:55:21 crc kubenswrapper[4744]: E0311 00:55:21.749744 4744 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 11 00:55:21 crc kubenswrapper[4744]: E0311 00:55:21.850817 4744 
kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 11 00:55:21 crc kubenswrapper[4744]: E0311 00:55:21.951876 4744 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 11 00:55:22 crc kubenswrapper[4744]: E0311 00:55:22.052273 4744 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 11 00:55:22 crc kubenswrapper[4744]: E0311 00:55:22.152432 4744 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 11 00:55:22 crc kubenswrapper[4744]: E0311 00:55:22.252920 4744 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 11 00:55:22 crc kubenswrapper[4744]: E0311 00:55:22.353902 4744 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 11 00:55:22 crc kubenswrapper[4744]: E0311 00:55:22.454732 4744 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 11 00:55:22 crc kubenswrapper[4744]: E0311 00:55:22.555233 4744 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 11 00:55:22 crc kubenswrapper[4744]: E0311 00:55:22.655757 4744 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 11 00:55:22 crc kubenswrapper[4744]: E0311 00:55:22.756150 4744 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 11 00:55:22 crc kubenswrapper[4744]: E0311 00:55:22.856873 4744 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 11 00:55:22 crc kubenswrapper[4744]: E0311 00:55:22.957944 4744 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 11 00:55:23 crc 
kubenswrapper[4744]: E0311 00:55:23.058130 4744 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 11 00:55:23 crc kubenswrapper[4744]: E0311 00:55:23.159091 4744 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 11 00:55:23 crc kubenswrapper[4744]: E0311 00:55:23.259236 4744 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 11 00:55:23 crc kubenswrapper[4744]: E0311 00:55:23.360344 4744 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 11 00:55:23 crc kubenswrapper[4744]: E0311 00:55:23.460960 4744 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 11 00:55:23 crc kubenswrapper[4744]: E0311 00:55:23.561145 4744 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 11 00:55:23 crc kubenswrapper[4744]: E0311 00:55:23.661725 4744 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 11 00:55:23 crc kubenswrapper[4744]: E0311 00:55:23.762507 4744 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 11 00:55:23 crc kubenswrapper[4744]: E0311 00:55:23.863620 4744 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 11 00:55:23 crc kubenswrapper[4744]: E0311 00:55:23.964079 4744 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 11 00:55:24 crc kubenswrapper[4744]: E0311 00:55:24.037810 4744 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Mar 11 00:55:24 crc kubenswrapper[4744]: E0311 00:55:24.065029 4744 kubelet_node_status.go:503] "Error getting the current 
node from lister" err="node \"crc\" not found" Mar 11 00:55:24 crc kubenswrapper[4744]: E0311 00:55:24.165567 4744 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 11 00:55:24 crc kubenswrapper[4744]: E0311 00:55:24.265746 4744 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 11 00:55:24 crc kubenswrapper[4744]: E0311 00:55:24.366192 4744 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 11 00:55:24 crc kubenswrapper[4744]: E0311 00:55:24.466811 4744 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 11 00:55:24 crc kubenswrapper[4744]: E0311 00:55:24.567606 4744 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 11 00:55:24 crc kubenswrapper[4744]: E0311 00:55:24.668031 4744 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 11 00:55:24 crc kubenswrapper[4744]: E0311 00:55:24.768896 4744 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 11 00:55:24 crc kubenswrapper[4744]: E0311 00:55:24.869332 4744 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 11 00:55:24 crc kubenswrapper[4744]: E0311 00:55:24.970489 4744 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 11 00:55:25 crc kubenswrapper[4744]: E0311 00:55:25.071437 4744 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 11 00:55:25 crc kubenswrapper[4744]: E0311 00:55:25.171868 4744 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 11 00:55:25 crc kubenswrapper[4744]: E0311 00:55:25.271997 4744 
kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 11 00:55:25 crc kubenswrapper[4744]: E0311 00:55:25.372884 4744 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 11 00:55:25 crc kubenswrapper[4744]: E0311 00:55:25.473902 4744 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 11 00:55:25 crc kubenswrapper[4744]: E0311 00:55:25.574763 4744 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 11 00:55:25 crc kubenswrapper[4744]: E0311 00:55:25.675749 4744 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 11 00:55:25 crc kubenswrapper[4744]: E0311 00:55:25.776362 4744 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 11 00:55:25 crc kubenswrapper[4744]: E0311 00:55:25.877494 4744 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 11 00:55:25 crc kubenswrapper[4744]: E0311 00:55:25.978467 4744 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 11 00:55:26 crc kubenswrapper[4744]: E0311 00:55:26.079111 4744 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 11 00:55:26 crc kubenswrapper[4744]: E0311 00:55:26.179947 4744 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 11 00:55:26 crc kubenswrapper[4744]: I0311 00:55:26.187637 4744 reflector.go:368] Caches populated for *v1.RuntimeClass from k8s.io/client-go/informers/factory.go:160 Mar 11 00:55:26 crc kubenswrapper[4744]: E0311 00:55:26.280927 4744 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 11 00:55:26 crc 
kubenswrapper[4744]: E0311 00:55:26.381913 4744 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 11 00:55:26 crc kubenswrapper[4744]: E0311 00:55:26.482321 4744 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 11 00:55:26 crc kubenswrapper[4744]: E0311 00:55:26.582462 4744 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 11 00:55:26 crc kubenswrapper[4744]: E0311 00:55:26.682650 4744 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 11 00:55:26 crc kubenswrapper[4744]: E0311 00:55:26.783780 4744 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 11 00:55:26 crc kubenswrapper[4744]: E0311 00:55:26.884774 4744 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 11 00:55:26 crc kubenswrapper[4744]: E0311 00:55:26.985462 4744 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 11 00:55:27 crc kubenswrapper[4744]: E0311 00:55:27.086848 4744 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 11 00:55:27 crc kubenswrapper[4744]: E0311 00:55:27.187855 4744 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 11 00:55:27 crc kubenswrapper[4744]: E0311 00:55:27.288683 4744 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 11 00:55:27 crc kubenswrapper[4744]: E0311 00:55:27.389796 4744 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 11 00:55:27 crc kubenswrapper[4744]: E0311 00:55:27.490831 4744 kubelet_node_status.go:503] "Error getting the current node from lister" 
err="node \"crc\" not found" Mar 11 00:55:27 crc kubenswrapper[4744]: E0311 00:55:27.591315 4744 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 11 00:55:27 crc kubenswrapper[4744]: E0311 00:55:27.709754 4744 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 11 00:55:27 crc kubenswrapper[4744]: E0311 00:55:27.810302 4744 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 11 00:55:27 crc kubenswrapper[4744]: E0311 00:55:27.911632 4744 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 11 00:55:27 crc kubenswrapper[4744]: I0311 00:55:27.974072 4744 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 11 00:55:27 crc kubenswrapper[4744]: I0311 00:55:27.976053 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 00:55:27 crc kubenswrapper[4744]: I0311 00:55:27.976098 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 00:55:27 crc kubenswrapper[4744]: I0311 00:55:27.976116 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 00:55:27 crc kubenswrapper[4744]: I0311 00:55:27.977064 4744 scope.go:117] "RemoveContainer" containerID="da0b8a54609bfac02c7fac32aa814351cea89b342ba3bd610c6b88e31610ff09" Mar 11 00:55:27 crc kubenswrapper[4744]: E0311 00:55:27.977361 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" 
podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 11 00:55:28 crc kubenswrapper[4744]: E0311 00:55:28.012122 4744 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 11 00:55:28 crc kubenswrapper[4744]: E0311 00:55:28.112690 4744 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 11 00:55:28 crc kubenswrapper[4744]: E0311 00:55:28.213848 4744 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 11 00:55:28 crc kubenswrapper[4744]: E0311 00:55:28.314386 4744 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 11 00:55:28 crc kubenswrapper[4744]: E0311 00:55:28.414563 4744 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 11 00:55:28 crc kubenswrapper[4744]: E0311 00:55:28.515999 4744 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 11 00:55:28 crc kubenswrapper[4744]: E0311 00:55:28.616681 4744 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 11 00:55:28 crc kubenswrapper[4744]: E0311 00:55:28.717220 4744 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 11 00:55:28 crc kubenswrapper[4744]: E0311 00:55:28.818181 4744 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 11 00:55:28 crc kubenswrapper[4744]: E0311 00:55:28.918584 4744 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 11 00:55:28 crc kubenswrapper[4744]: I0311 00:55:28.974592 4744 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 11 00:55:28 crc kubenswrapper[4744]: I0311 00:55:28.976415 4744 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 00:55:28 crc kubenswrapper[4744]: I0311 00:55:28.976468 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 00:55:28 crc kubenswrapper[4744]: I0311 00:55:28.976487 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 00:55:29 crc kubenswrapper[4744]: E0311 00:55:29.019725 4744 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 11 00:55:29 crc kubenswrapper[4744]: E0311 00:55:29.120106 4744 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 11 00:55:29 crc kubenswrapper[4744]: E0311 00:55:29.222213 4744 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 11 00:55:29 crc kubenswrapper[4744]: E0311 00:55:29.323737 4744 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 11 00:55:29 crc kubenswrapper[4744]: E0311 00:55:29.424466 4744 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 11 00:55:29 crc kubenswrapper[4744]: E0311 00:55:29.525498 4744 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 11 00:55:29 crc kubenswrapper[4744]: E0311 00:55:29.626637 4744 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 11 00:55:29 crc kubenswrapper[4744]: E0311 00:55:29.727335 4744 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 11 00:55:29 crc kubenswrapper[4744]: E0311 00:55:29.828242 4744 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 11 00:55:29 crc 
kubenswrapper[4744]: E0311 00:55:29.930262 4744 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 11 00:55:30 crc kubenswrapper[4744]: E0311 00:55:30.030916 4744 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 11 00:55:30 crc kubenswrapper[4744]: E0311 00:55:30.131595 4744 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 11 00:55:30 crc kubenswrapper[4744]: E0311 00:55:30.232276 4744 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 11 00:55:30 crc kubenswrapper[4744]: I0311 00:55:30.265861 4744 reflector.go:368] Caches populated for *v1.CSIDriver from k8s.io/client-go/informers/factory.go:160 Mar 11 00:55:30 crc kubenswrapper[4744]: E0311 00:55:30.333314 4744 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 11 00:55:30 crc kubenswrapper[4744]: E0311 00:55:30.434400 4744 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 11 00:55:30 crc kubenswrapper[4744]: E0311 00:55:30.535207 4744 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 11 00:55:30 crc kubenswrapper[4744]: E0311 00:55:30.635820 4744 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 11 00:55:30 crc kubenswrapper[4744]: E0311 00:55:30.736129 4744 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 11 00:55:30 crc kubenswrapper[4744]: E0311 00:55:30.837055 4744 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 11 00:55:30 crc kubenswrapper[4744]: E0311 00:55:30.937850 4744 kubelet_node_status.go:503] "Error getting the current node from lister" err="node 
\"crc\" not found" Mar 11 00:55:30 crc kubenswrapper[4744]: E0311 00:55:30.962927 4744 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"crc\": node \"crc\" not found" Mar 11 00:55:30 crc kubenswrapper[4744]: I0311 00:55:30.968589 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 00:55:30 crc kubenswrapper[4744]: I0311 00:55:30.968640 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 00:55:30 crc kubenswrapper[4744]: I0311 00:55:30.968658 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 00:55:30 crc kubenswrapper[4744]: I0311 00:55:30.968684 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 00:55:30 crc kubenswrapper[4744]: I0311 00:55:30.968704 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T00:55:30Z","lastTransitionTime":"2026-03-11T00:55:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 11 00:55:30 crc kubenswrapper[4744]: E0311 00:55:30.984845 4744 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-11T00:55:30Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-11T00:55:30Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-11T00:55:30Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-11T00:55:30Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-11T00:55:30Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-11T00:55:30Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-11T00:55:30Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-11T00:55:30Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"51e320bc-e184-46b0-b151-baf1fef55472\\\",\\\"systemUUID\\\":\\\"b27597af-c36d-4084-a073-6dfdbb017181\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 11 00:55:30 crc kubenswrapper[4744]: I0311 00:55:30.989632 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 00:55:30 crc kubenswrapper[4744]: I0311 00:55:30.989681 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 00:55:30 crc kubenswrapper[4744]: I0311 00:55:30.989699 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 00:55:30 crc kubenswrapper[4744]: I0311 00:55:30.989723 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 00:55:30 crc kubenswrapper[4744]: I0311 00:55:30.989742 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T00:55:30Z","lastTransitionTime":"2026-03-11T00:55:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 11 00:55:31 crc kubenswrapper[4744]: E0311 00:55:31.005605 4744 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-11T00:55:30Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-11T00:55:30Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-11T00:55:30Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-11T00:55:30Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-11T00:55:30Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-11T00:55:30Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-11T00:55:30Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-11T00:55:30Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"51e320bc-e184-46b0-b151-baf1fef55472\\\",\\\"systemUUID\\\":\\\"b27597af-c36d-4084-a073-6dfdbb017181\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 11 00:55:31 crc kubenswrapper[4744]: I0311 00:55:31.010407 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 00:55:31 crc kubenswrapper[4744]: I0311 00:55:31.010466 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 00:55:31 crc kubenswrapper[4744]: I0311 00:55:31.010485 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 00:55:31 crc kubenswrapper[4744]: I0311 00:55:31.010538 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 00:55:31 crc kubenswrapper[4744]: I0311 00:55:31.010557 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T00:55:31Z","lastTransitionTime":"2026-03-11T00:55:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 11 00:55:31 crc kubenswrapper[4744]: E0311 00:55:31.026136 4744 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-11T00:55:31Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-11T00:55:31Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-11T00:55:31Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-11T00:55:31Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-11T00:55:31Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-11T00:55:31Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-11T00:55:31Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-11T00:55:31Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"51e320bc-e184-46b0-b151-baf1fef55472\\\",\\\"systemUUID\\\":\\\"b27597af-c36d-4084-a073-6dfdbb017181\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 11 00:55:31 crc kubenswrapper[4744]: I0311 00:55:31.030926 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 00:55:31 crc kubenswrapper[4744]: I0311 00:55:31.031001 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 00:55:31 crc kubenswrapper[4744]: I0311 00:55:31.031022 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 00:55:31 crc kubenswrapper[4744]: I0311 00:55:31.031055 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 00:55:31 crc kubenswrapper[4744]: I0311 00:55:31.031080 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T00:55:31Z","lastTransitionTime":"2026-03-11T00:55:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 11 00:55:31 crc kubenswrapper[4744]: E0311 00:55:31.047452 4744 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-11T00:55:31Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-11T00:55:31Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-11T00:55:31Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-11T00:55:31Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-11T00:55:31Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-11T00:55:31Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-11T00:55:31Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-11T00:55:31Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"51e320bc-e184-46b0-b151-baf1fef55472\\\",\\\"systemUUID\\\":\\\"b27597af-c36d-4084-a073-6dfdbb017181\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 11 00:55:31 crc kubenswrapper[4744]: E0311 00:55:31.047720 4744 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 11 00:55:31 crc kubenswrapper[4744]: E0311 00:55:31.047771 4744 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 11 00:55:31 crc kubenswrapper[4744]: E0311 00:55:31.148490 4744 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 11 00:55:31 crc kubenswrapper[4744]: E0311 00:55:31.249590 4744 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 11 00:55:31 crc kubenswrapper[4744]: E0311 00:55:31.350013 4744 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 11 00:55:31 crc kubenswrapper[4744]: E0311 00:55:31.450708 4744 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 11 00:55:31 crc kubenswrapper[4744]: E0311 00:55:31.550999 4744 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 11 00:55:31 crc kubenswrapper[4744]: E0311 00:55:31.652118 4744 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 11 00:55:31 crc kubenswrapper[4744]: E0311 00:55:31.752593 4744 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 11 00:55:31 crc kubenswrapper[4744]: E0311 00:55:31.853590 4744 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 11 00:55:31 crc kubenswrapper[4744]: E0311 00:55:31.954053 4744 
kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 11 00:55:32 crc kubenswrapper[4744]: E0311 00:55:32.054472 4744 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 11 00:55:32 crc kubenswrapper[4744]: E0311 00:55:32.155644 4744 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 11 00:55:32 crc kubenswrapper[4744]: E0311 00:55:32.256445 4744 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 11 00:55:32 crc kubenswrapper[4744]: E0311 00:55:32.357370 4744 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 11 00:55:32 crc kubenswrapper[4744]: E0311 00:55:32.458152 4744 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 11 00:55:32 crc kubenswrapper[4744]: E0311 00:55:32.558622 4744 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 11 00:55:32 crc kubenswrapper[4744]: E0311 00:55:32.659612 4744 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 11 00:55:32 crc kubenswrapper[4744]: E0311 00:55:32.759732 4744 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 11 00:55:32 crc kubenswrapper[4744]: E0311 00:55:32.860594 4744 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 11 00:55:32 crc kubenswrapper[4744]: E0311 00:55:32.961408 4744 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 11 00:55:33 crc kubenswrapper[4744]: E0311 00:55:33.061866 4744 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 11 00:55:33 crc 
kubenswrapper[4744]: E0311 00:55:33.162930 4744 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 11 00:55:33 crc kubenswrapper[4744]: E0311 00:55:33.263746 4744 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 11 00:55:33 crc kubenswrapper[4744]: E0311 00:55:33.364811 4744 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 11 00:55:33 crc kubenswrapper[4744]: E0311 00:55:33.465879 4744 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 11 00:55:33 crc kubenswrapper[4744]: E0311 00:55:33.566696 4744 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 11 00:55:33 crc kubenswrapper[4744]: E0311 00:55:33.667479 4744 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 11 00:55:33 crc kubenswrapper[4744]: E0311 00:55:33.768310 4744 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 11 00:55:33 crc kubenswrapper[4744]: E0311 00:55:33.869408 4744 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 11 00:55:33 crc kubenswrapper[4744]: E0311 00:55:33.970141 4744 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 11 00:55:34 crc kubenswrapper[4744]: E0311 00:55:34.038623 4744 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Mar 11 00:55:34 crc kubenswrapper[4744]: E0311 00:55:34.070914 4744 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 11 00:55:34 crc kubenswrapper[4744]: E0311 00:55:34.171900 4744 kubelet_node_status.go:503] "Error getting the current 
node from lister" err="node \"crc\" not found" Mar 11 00:55:34 crc kubenswrapper[4744]: E0311 00:55:34.272899 4744 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 11 00:55:34 crc kubenswrapper[4744]: E0311 00:55:34.373563 4744 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 11 00:55:34 crc kubenswrapper[4744]: E0311 00:55:34.474491 4744 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 11 00:55:34 crc kubenswrapper[4744]: E0311 00:55:34.575486 4744 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 11 00:55:34 crc kubenswrapper[4744]: E0311 00:55:34.676546 4744 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 11 00:55:34 crc kubenswrapper[4744]: E0311 00:55:34.777611 4744 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 11 00:55:34 crc kubenswrapper[4744]: E0311 00:55:34.878585 4744 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 11 00:55:34 crc kubenswrapper[4744]: E0311 00:55:34.979662 4744 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 11 00:55:35 crc kubenswrapper[4744]: E0311 00:55:35.081092 4744 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 11 00:55:35 crc kubenswrapper[4744]: E0311 00:55:35.182208 4744 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 11 00:55:35 crc kubenswrapper[4744]: E0311 00:55:35.282971 4744 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 11 00:55:35 crc kubenswrapper[4744]: E0311 00:55:35.383842 4744 
kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 11 00:55:35 crc kubenswrapper[4744]: E0311 00:55:35.484941 4744 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 11 00:55:35 crc kubenswrapper[4744]: E0311 00:55:35.586088 4744 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 11 00:55:35 crc kubenswrapper[4744]: E0311 00:55:35.686216 4744 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 11 00:55:35 crc kubenswrapper[4744]: E0311 00:55:35.786610 4744 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 11 00:55:35 crc kubenswrapper[4744]: E0311 00:55:35.887567 4744 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 11 00:55:35 crc kubenswrapper[4744]: E0311 00:55:35.988307 4744 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 11 00:55:36 crc kubenswrapper[4744]: E0311 00:55:36.089415 4744 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 11 00:55:36 crc kubenswrapper[4744]: E0311 00:55:36.190269 4744 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 11 00:55:36 crc kubenswrapper[4744]: E0311 00:55:36.290592 4744 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 11 00:55:36 crc kubenswrapper[4744]: E0311 00:55:36.391731 4744 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 11 00:55:36 crc kubenswrapper[4744]: E0311 00:55:36.492147 4744 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 11 00:55:36 crc 
kubenswrapper[4744]: E0311 00:55:36.593237 4744 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 11 00:55:36 crc kubenswrapper[4744]: E0311 00:55:36.694114 4744 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 11 00:55:36 crc kubenswrapper[4744]: E0311 00:55:36.794689 4744 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 11 00:55:36 crc kubenswrapper[4744]: E0311 00:55:36.894811 4744 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 11 00:55:36 crc kubenswrapper[4744]: E0311 00:55:36.995271 4744 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 11 00:55:37 crc kubenswrapper[4744]: E0311 00:55:37.096195 4744 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 11 00:55:37 crc kubenswrapper[4744]: E0311 00:55:37.196314 4744 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 11 00:55:37 crc kubenswrapper[4744]: E0311 00:55:37.297043 4744 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 11 00:55:37 crc kubenswrapper[4744]: E0311 00:55:37.397253 4744 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 11 00:55:37 crc kubenswrapper[4744]: E0311 00:55:37.498095 4744 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 11 00:55:37 crc kubenswrapper[4744]: E0311 00:55:37.598412 4744 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 11 00:55:37 crc kubenswrapper[4744]: E0311 00:55:37.699589 4744 kubelet_node_status.go:503] "Error getting the current node from lister" 
err="node \"crc\" not found" Mar 11 00:55:37 crc kubenswrapper[4744]: E0311 00:55:37.799998 4744 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 11 00:55:37 crc kubenswrapper[4744]: E0311 00:55:37.901248 4744 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 11 00:55:38 crc kubenswrapper[4744]: E0311 00:55:38.002776 4744 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 11 00:55:38 crc kubenswrapper[4744]: E0311 00:55:38.104058 4744 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 11 00:55:38 crc kubenswrapper[4744]: E0311 00:55:38.204929 4744 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 11 00:55:38 crc kubenswrapper[4744]: E0311 00:55:38.306016 4744 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 11 00:55:38 crc kubenswrapper[4744]: E0311 00:55:38.406881 4744 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 11 00:55:38 crc kubenswrapper[4744]: E0311 00:55:38.507696 4744 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 11 00:55:38 crc kubenswrapper[4744]: E0311 00:55:38.609009 4744 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 11 00:55:38 crc kubenswrapper[4744]: E0311 00:55:38.710091 4744 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 11 00:55:38 crc kubenswrapper[4744]: E0311 00:55:38.810401 4744 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 11 00:55:38 crc kubenswrapper[4744]: E0311 00:55:38.911459 4744 kubelet_node_status.go:503] 
"Error getting the current node from lister" err="node \"crc\" not found" Mar 11 00:55:39 crc kubenswrapper[4744]: E0311 00:55:39.012996 4744 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 11 00:55:39 crc kubenswrapper[4744]: E0311 00:55:39.113783 4744 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 11 00:55:39 crc kubenswrapper[4744]: E0311 00:55:39.214819 4744 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 11 00:55:39 crc kubenswrapper[4744]: E0311 00:55:39.315219 4744 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 11 00:55:39 crc kubenswrapper[4744]: E0311 00:55:39.416260 4744 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 11 00:55:39 crc kubenswrapper[4744]: E0311 00:55:39.516632 4744 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 11 00:55:39 crc kubenswrapper[4744]: E0311 00:55:39.617705 4744 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 11 00:55:39 crc kubenswrapper[4744]: E0311 00:55:39.718691 4744 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 11 00:55:39 crc kubenswrapper[4744]: E0311 00:55:39.819011 4744 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 11 00:55:39 crc kubenswrapper[4744]: E0311 00:55:39.919167 4744 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 11 00:55:40 crc kubenswrapper[4744]: E0311 00:55:40.019748 4744 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 11 00:55:40 crc kubenswrapper[4744]: E0311 
00:55:40.120703 4744 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 11 00:55:40 crc kubenswrapper[4744]: E0311 00:55:40.221814 4744 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 11 00:55:40 crc kubenswrapper[4744]: E0311 00:55:40.322339 4744 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 11 00:55:40 crc kubenswrapper[4744]: E0311 00:55:40.423115 4744 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 11 00:55:40 crc kubenswrapper[4744]: E0311 00:55:40.523721 4744 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 11 00:55:40 crc kubenswrapper[4744]: E0311 00:55:40.624484 4744 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 11 00:55:40 crc kubenswrapper[4744]: E0311 00:55:40.725082 4744 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 11 00:55:40 crc kubenswrapper[4744]: E0311 00:55:40.825371 4744 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 11 00:55:40 crc kubenswrapper[4744]: E0311 00:55:40.926127 4744 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 11 00:55:40 crc kubenswrapper[4744]: I0311 00:55:40.974017 4744 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 11 00:55:40 crc kubenswrapper[4744]: I0311 00:55:40.976101 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 00:55:40 crc kubenswrapper[4744]: I0311 00:55:40.976137 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 
00:55:40 crc kubenswrapper[4744]: I0311 00:55:40.976155 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 00:55:40 crc kubenswrapper[4744]: I0311 00:55:40.977273 4744 scope.go:117] "RemoveContainer" containerID="da0b8a54609bfac02c7fac32aa814351cea89b342ba3bd610c6b88e31610ff09" Mar 11 00:55:40 crc kubenswrapper[4744]: E0311 00:55:40.977659 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 11 00:55:41 crc kubenswrapper[4744]: E0311 00:55:41.027071 4744 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 11 00:55:41 crc kubenswrapper[4744]: E0311 00:55:41.128161 4744 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 11 00:55:41 crc kubenswrapper[4744]: E0311 00:55:41.150475 4744 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"crc\": node \"crc\" not found" Mar 11 00:55:41 crc kubenswrapper[4744]: I0311 00:55:41.156426 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 00:55:41 crc kubenswrapper[4744]: I0311 00:55:41.156477 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 00:55:41 crc kubenswrapper[4744]: I0311 00:55:41.156498 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 00:55:41 crc kubenswrapper[4744]: I0311 00:55:41.156576 4744 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeNotReady" Mar 11 00:55:41 crc kubenswrapper[4744]: I0311 00:55:41.156599 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T00:55:41Z","lastTransitionTime":"2026-03-11T00:55:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 11 00:55:41 crc kubenswrapper[4744]: E0311 00:55:41.172474 4744 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-11T00:55:41Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-11T00:55:41Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-11T00:55:41Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-11T00:55:41Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-11T00:55:41Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-11T00:55:41Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-11T00:55:41Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-11T00:55:41Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"
registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\
"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb617
3ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"reg
istry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@s
ha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"51e320bc-e184-46b0-b151-baf1fef55472\\\",\\\"systemUUID\\\":\\\"b27597af-c36d-4084-a073-6dfdbb017181\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 11 00:55:41 crc kubenswrapper[4744]: I0311 00:55:41.177967 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 00:55:41 crc kubenswrapper[4744]: I0311 00:55:41.178011 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 00:55:41 crc kubenswrapper[4744]: I0311 00:55:41.178031 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 00:55:41 crc kubenswrapper[4744]: I0311 00:55:41.178059 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 00:55:41 crc kubenswrapper[4744]: I0311 00:55:41.178077 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T00:55:41Z","lastTransitionTime":"2026-03-11T00:55:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 11 00:55:41 crc kubenswrapper[4744]: E0311 00:55:41.192824 4744 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-11T00:55:41Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-11T00:55:41Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-11T00:55:41Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-11T00:55:41Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-11T00:55:41Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-11T00:55:41Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-11T00:55:41Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-11T00:55:41Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"51e320bc-e184-46b0-b151-baf1fef55472\\\",\\\"systemUUID\\\":\\\"b27597af-c36d-4084-a073-6dfdbb017181\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 11 00:55:41 crc kubenswrapper[4744]: I0311 00:55:41.197551 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 00:55:41 crc kubenswrapper[4744]: I0311 00:55:41.197613 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 00:55:41 crc kubenswrapper[4744]: I0311 00:55:41.197630 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 00:55:41 crc kubenswrapper[4744]: I0311 00:55:41.197654 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 00:55:41 crc kubenswrapper[4744]: I0311 00:55:41.197672 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T00:55:41Z","lastTransitionTime":"2026-03-11T00:55:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 11 00:55:41 crc kubenswrapper[4744]: E0311 00:55:41.213317 4744 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-11T00:55:41Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-11T00:55:41Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-11T00:55:41Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-11T00:55:41Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-11T00:55:41Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-11T00:55:41Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-11T00:55:41Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-11T00:55:41Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"51e320bc-e184-46b0-b151-baf1fef55472\\\",\\\"systemUUID\\\":\\\"b27597af-c36d-4084-a073-6dfdbb017181\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 11 00:55:41 crc kubenswrapper[4744]: I0311 00:55:41.217763 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 00:55:41 crc kubenswrapper[4744]: I0311 00:55:41.217814 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 00:55:41 crc kubenswrapper[4744]: I0311 00:55:41.217831 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 00:55:41 crc kubenswrapper[4744]: I0311 00:55:41.217857 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 00:55:41 crc kubenswrapper[4744]: I0311 00:55:41.217878 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T00:55:41Z","lastTransitionTime":"2026-03-11T00:55:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 11 00:55:41 crc kubenswrapper[4744]: E0311 00:55:41.235183 4744 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-11T00:55:41Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-11T00:55:41Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-11T00:55:41Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-11T00:55:41Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-11T00:55:41Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-11T00:55:41Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-11T00:55:41Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-11T00:55:41Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"51e320bc-e184-46b0-b151-baf1fef55472\\\",\\\"systemUUID\\\":\\\"b27597af-c36d-4084-a073-6dfdbb017181\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 11 00:55:41 crc kubenswrapper[4744]: E0311 00:55:41.235350 4744 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 11 00:55:41 crc kubenswrapper[4744]: E0311 00:55:41.235387 4744 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 11 00:55:41 crc kubenswrapper[4744]: E0311 00:55:41.336075 4744 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 11 00:55:41 crc kubenswrapper[4744]: E0311 00:55:41.437098 4744 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 11 00:55:41 crc kubenswrapper[4744]: E0311 00:55:41.538086 4744 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 11 00:55:41 crc kubenswrapper[4744]: E0311 00:55:41.638903 4744 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 11 00:55:41 crc kubenswrapper[4744]: E0311 00:55:41.739045 4744 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 11 00:55:41 crc kubenswrapper[4744]: E0311 00:55:41.839855 4744 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 11 00:55:41 crc kubenswrapper[4744]: E0311 00:55:41.940746 4744 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 11 00:55:42 crc kubenswrapper[4744]: E0311 00:55:42.041256 4744 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 11 00:55:42 crc kubenswrapper[4744]: E0311 00:55:42.141641 4744 
kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 11 00:55:42 crc kubenswrapper[4744]: E0311 00:55:42.241929 4744 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 11 00:55:42 crc kubenswrapper[4744]: E0311 00:55:42.342430 4744 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 11 00:55:42 crc kubenswrapper[4744]: E0311 00:55:42.443501 4744 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 11 00:55:42 crc kubenswrapper[4744]: E0311 00:55:42.543828 4744 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 11 00:55:42 crc kubenswrapper[4744]: E0311 00:55:42.644300 4744 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 11 00:55:42 crc kubenswrapper[4744]: E0311 00:55:42.744708 4744 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 11 00:55:42 crc kubenswrapper[4744]: E0311 00:55:42.845040 4744 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 11 00:55:42 crc kubenswrapper[4744]: E0311 00:55:42.945912 4744 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 11 00:55:43 crc kubenswrapper[4744]: E0311 00:55:43.047457 4744 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 11 00:55:43 crc kubenswrapper[4744]: E0311 00:55:43.148015 4744 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 11 00:55:43 crc kubenswrapper[4744]: E0311 00:55:43.248763 4744 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 11 00:55:43 crc 
kubenswrapper[4744]: E0311 00:55:43.349290 4744 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 11 00:55:43 crc kubenswrapper[4744]: E0311 00:55:43.449400 4744 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 11 00:55:43 crc kubenswrapper[4744]: E0311 00:55:43.550327 4744 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 11 00:55:43 crc kubenswrapper[4744]: E0311 00:55:43.651617 4744 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 11 00:55:43 crc kubenswrapper[4744]: E0311 00:55:43.752894 4744 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 11 00:55:43 crc kubenswrapper[4744]: E0311 00:55:43.853221 4744 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 11 00:55:43 crc kubenswrapper[4744]: E0311 00:55:43.954237 4744 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 11 00:55:44 crc kubenswrapper[4744]: E0311 00:55:44.039708 4744 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Mar 11 00:55:44 crc kubenswrapper[4744]: E0311 00:55:44.054346 4744 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 11 00:55:44 crc kubenswrapper[4744]: E0311 00:55:44.154895 4744 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 11 00:55:44 crc kubenswrapper[4744]: E0311 00:55:44.255669 4744 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 11 00:55:44 crc kubenswrapper[4744]: E0311 00:55:44.356773 4744 kubelet_node_status.go:503] "Error getting the current 
node from lister" err="node \"crc\" not found" Mar 11 00:55:44 crc kubenswrapper[4744]: E0311 00:55:44.457409 4744 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 11 00:55:44 crc kubenswrapper[4744]: E0311 00:55:44.558243 4744 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 11 00:55:44 crc kubenswrapper[4744]: E0311 00:55:44.658983 4744 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 11 00:55:44 crc kubenswrapper[4744]: E0311 00:55:44.759127 4744 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 11 00:55:44 crc kubenswrapper[4744]: E0311 00:55:44.860507 4744 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 11 00:55:44 crc kubenswrapper[4744]: E0311 00:55:44.961203 4744 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 11 00:55:45 crc kubenswrapper[4744]: E0311 00:55:45.062017 4744 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 11 00:55:45 crc kubenswrapper[4744]: E0311 00:55:45.163216 4744 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 11 00:55:45 crc kubenswrapper[4744]: E0311 00:55:45.264297 4744 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 11 00:55:45 crc kubenswrapper[4744]: E0311 00:55:45.364924 4744 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 11 00:55:45 crc kubenswrapper[4744]: E0311 00:55:45.465142 4744 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 11 00:55:45 crc kubenswrapper[4744]: E0311 00:55:45.565717 4744 
kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 11 00:55:45 crc kubenswrapper[4744]: E0311 00:55:45.666064 4744 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 11 00:55:45 crc kubenswrapper[4744]: E0311 00:55:45.766889 4744 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 11 00:55:45 crc kubenswrapper[4744]: E0311 00:55:45.868415 4744 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 11 00:55:45 crc kubenswrapper[4744]: E0311 00:55:45.969266 4744 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 11 00:55:46 crc kubenswrapper[4744]: E0311 00:55:46.069644 4744 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 11 00:55:46 crc kubenswrapper[4744]: E0311 00:55:46.169851 4744 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 11 00:55:46 crc kubenswrapper[4744]: E0311 00:55:46.270370 4744 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 11 00:55:46 crc kubenswrapper[4744]: E0311 00:55:46.370508 4744 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 11 00:55:46 crc kubenswrapper[4744]: E0311 00:55:46.471508 4744 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 11 00:55:46 crc kubenswrapper[4744]: E0311 00:55:46.572489 4744 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 11 00:55:46 crc kubenswrapper[4744]: E0311 00:55:46.673606 4744 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 11 00:55:46 crc 
kubenswrapper[4744]: E0311 00:55:46.774607 4744 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 11 00:55:46 crc kubenswrapper[4744]: E0311 00:55:46.875322 4744 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 11 00:55:46 crc kubenswrapper[4744]: E0311 00:55:46.975609 4744 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 11 00:55:47 crc kubenswrapper[4744]: E0311 00:55:47.077048 4744 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 11 00:55:47 crc kubenswrapper[4744]: E0311 00:55:47.177889 4744 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 11 00:55:47 crc kubenswrapper[4744]: E0311 00:55:47.278664 4744 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 11 00:55:47 crc kubenswrapper[4744]: E0311 00:55:47.379313 4744 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 11 00:55:47 crc kubenswrapper[4744]: E0311 00:55:47.480569 4744 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 11 00:55:47 crc kubenswrapper[4744]: E0311 00:55:47.581476 4744 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 11 00:55:47 crc kubenswrapper[4744]: E0311 00:55:47.682227 4744 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 11 00:55:47 crc kubenswrapper[4744]: E0311 00:55:47.783351 4744 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 11 00:55:47 crc kubenswrapper[4744]: E0311 00:55:47.901999 4744 kubelet_node_status.go:503] "Error getting the current node from lister" 
err="node \"crc\" not found" Mar 11 00:55:48 crc kubenswrapper[4744]: E0311 00:55:48.002719 4744 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 11 00:55:48 crc kubenswrapper[4744]: E0311 00:55:48.103691 4744 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 11 00:55:48 crc kubenswrapper[4744]: E0311 00:55:48.204897 4744 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 11 00:55:48 crc kubenswrapper[4744]: E0311 00:55:48.305759 4744 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 11 00:55:48 crc kubenswrapper[4744]: E0311 00:55:48.405866 4744 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 11 00:55:48 crc kubenswrapper[4744]: E0311 00:55:48.506432 4744 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 11 00:55:48 crc kubenswrapper[4744]: E0311 00:55:48.606851 4744 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 11 00:55:48 crc kubenswrapper[4744]: E0311 00:55:48.707734 4744 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 11 00:55:48 crc kubenswrapper[4744]: E0311 00:55:48.807869 4744 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 11 00:55:48 crc kubenswrapper[4744]: E0311 00:55:48.908550 4744 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 11 00:55:49 crc kubenswrapper[4744]: E0311 00:55:49.008697 4744 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 11 00:55:49 crc kubenswrapper[4744]: E0311 00:55:49.109770 4744 kubelet_node_status.go:503] 
"Error getting the current node from lister" err="node \"crc\" not found" Mar 11 00:55:49 crc kubenswrapper[4744]: E0311 00:55:49.210706 4744 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 11 00:55:49 crc kubenswrapper[4744]: E0311 00:55:49.310936 4744 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 11 00:55:49 crc kubenswrapper[4744]: E0311 00:55:49.411317 4744 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 11 00:55:49 crc kubenswrapper[4744]: E0311 00:55:49.512364 4744 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 11 00:55:49 crc kubenswrapper[4744]: E0311 00:55:49.612586 4744 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 11 00:55:49 crc kubenswrapper[4744]: E0311 00:55:49.712768 4744 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 11 00:55:49 crc kubenswrapper[4744]: E0311 00:55:49.813122 4744 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 11 00:55:49 crc kubenswrapper[4744]: E0311 00:55:49.913637 4744 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 11 00:55:50 crc kubenswrapper[4744]: E0311 00:55:50.014748 4744 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 11 00:55:50 crc kubenswrapper[4744]: E0311 00:55:50.115342 4744 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 11 00:55:50 crc kubenswrapper[4744]: E0311 00:55:50.216271 4744 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 11 00:55:50 crc kubenswrapper[4744]: E0311 
00:55:50.317507 4744 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 11 00:55:50 crc kubenswrapper[4744]: E0311 00:55:50.418776 4744 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 11 00:55:50 crc kubenswrapper[4744]: E0311 00:55:50.519805 4744 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 11 00:55:50 crc kubenswrapper[4744]: E0311 00:55:50.620840 4744 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 11 00:55:50 crc kubenswrapper[4744]: E0311 00:55:50.721471 4744 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 11 00:55:50 crc kubenswrapper[4744]: E0311 00:55:50.821681 4744 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 11 00:55:50 crc kubenswrapper[4744]: E0311 00:55:50.922501 4744 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 11 00:55:51 crc kubenswrapper[4744]: E0311 00:55:51.022780 4744 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 11 00:55:51 crc kubenswrapper[4744]: E0311 00:55:51.123884 4744 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 11 00:55:51 crc kubenswrapper[4744]: E0311 00:55:51.224260 4744 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 11 00:55:51 crc kubenswrapper[4744]: E0311 00:55:51.324830 4744 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 11 00:55:51 crc kubenswrapper[4744]: E0311 00:55:51.425235 4744 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 11 
00:55:51 crc kubenswrapper[4744]: E0311 00:55:51.500195 4744 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"crc\": node \"crc\" not found" Mar 11 00:55:51 crc kubenswrapper[4744]: I0311 00:55:51.506339 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 00:55:51 crc kubenswrapper[4744]: I0311 00:55:51.506409 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 00:55:51 crc kubenswrapper[4744]: I0311 00:55:51.506431 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 00:55:51 crc kubenswrapper[4744]: I0311 00:55:51.506463 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 00:55:51 crc kubenswrapper[4744]: I0311 00:55:51.506482 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T00:55:51Z","lastTransitionTime":"2026-03-11T00:55:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 11 00:55:51 crc kubenswrapper[4744]: E0311 00:55:51.522847 4744 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-11T00:55:51Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-11T00:55:51Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-11T00:55:51Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-11T00:55:51Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-11T00:55:51Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-11T00:55:51Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-11T00:55:51Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-11T00:55:51Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"51e320bc-e184-46b0-b151-baf1fef55472\\\",\\\"systemUUID\\\":\\\"b27597af-c36d-4084-a073-6dfdbb017181\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 11 00:55:51 crc kubenswrapper[4744]: I0311 00:55:51.529099 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 00:55:51 crc kubenswrapper[4744]: I0311 00:55:51.529294 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 00:55:51 crc kubenswrapper[4744]: I0311 00:55:51.529443 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 00:55:51 crc kubenswrapper[4744]: I0311 00:55:51.529646 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 00:55:51 crc kubenswrapper[4744]: I0311 00:55:51.529805 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T00:55:51Z","lastTransitionTime":"2026-03-11T00:55:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 11 00:55:51 crc kubenswrapper[4744]: E0311 00:55:51.545633 4744 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-11T00:55:51Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-11T00:55:51Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-11T00:55:51Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-11T00:55:51Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-11T00:55:51Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-11T00:55:51Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-11T00:55:51Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-11T00:55:51Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"51e320bc-e184-46b0-b151-baf1fef55472\\\",\\\"systemUUID\\\":\\\"b27597af-c36d-4084-a073-6dfdbb017181\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 11 00:55:51 crc kubenswrapper[4744]: I0311 00:55:51.550839 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 00:55:51 crc kubenswrapper[4744]: I0311 00:55:51.551054 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 00:55:51 crc kubenswrapper[4744]: I0311 00:55:51.551203 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 00:55:51 crc kubenswrapper[4744]: I0311 00:55:51.551337 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 00:55:51 crc kubenswrapper[4744]: I0311 00:55:51.551465 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T00:55:51Z","lastTransitionTime":"2026-03-11T00:55:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 11 00:55:51 crc kubenswrapper[4744]: E0311 00:55:51.567221 4744 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-11T00:55:51Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-11T00:55:51Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-11T00:55:51Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-11T00:55:51Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-11T00:55:51Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-11T00:55:51Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-11T00:55:51Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-11T00:55:51Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"51e320bc-e184-46b0-b151-baf1fef55472\\\",\\\"systemUUID\\\":\\\"b27597af-c36d-4084-a073-6dfdbb017181\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 11 00:55:51 crc kubenswrapper[4744]: I0311 00:55:51.572348 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 00:55:51 crc kubenswrapper[4744]: I0311 00:55:51.572612 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 00:55:51 crc kubenswrapper[4744]: I0311 00:55:51.572773 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 00:55:51 crc kubenswrapper[4744]: I0311 00:55:51.572937 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 00:55:51 crc kubenswrapper[4744]: I0311 00:55:51.573098 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T00:55:51Z","lastTransitionTime":"2026-03-11T00:55:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 11 00:55:51 crc kubenswrapper[4744]: E0311 00:55:51.590762 4744 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-11T00:55:51Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-11T00:55:51Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-11T00:55:51Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-11T00:55:51Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-11T00:55:51Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-11T00:55:51Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-11T00:55:51Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-11T00:55:51Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"51e320bc-e184-46b0-b151-baf1fef55472\\\",\\\"systemUUID\\\":\\\"b27597af-c36d-4084-a073-6dfdbb017181\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 11 00:55:51 crc kubenswrapper[4744]: E0311 00:55:51.590987 4744 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 11 00:55:51 crc kubenswrapper[4744]: E0311 00:55:51.591038 4744 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 11 00:55:51 crc kubenswrapper[4744]: E0311 00:55:51.691138 4744 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 11 00:55:51 crc kubenswrapper[4744]: E0311 00:55:51.792494 4744 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 11 00:55:51 crc kubenswrapper[4744]: E0311 00:55:51.892670 4744 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 11 00:55:51 crc kubenswrapper[4744]: E0311 00:55:51.993457 4744 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 11 00:55:52 crc kubenswrapper[4744]: E0311 00:55:52.094890 4744 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 11 00:55:52 crc kubenswrapper[4744]: E0311 00:55:52.195680 4744 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 11 00:55:52 crc kubenswrapper[4744]: E0311 00:55:52.296015 4744 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 11 00:55:52 crc kubenswrapper[4744]: E0311 00:55:52.397230 4744 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 11 00:55:52 crc kubenswrapper[4744]: E0311 00:55:52.497719 4744 
kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 11 00:55:52 crc kubenswrapper[4744]: E0311 00:55:52.598637 4744 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 11 00:55:52 crc kubenswrapper[4744]: E0311 00:55:52.698899 4744 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 11 00:55:52 crc kubenswrapper[4744]: E0311 00:55:52.799981 4744 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 11 00:55:52 crc kubenswrapper[4744]: E0311 00:55:52.900954 4744 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 11 00:55:53 crc kubenswrapper[4744]: E0311 00:55:53.001616 4744 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 11 00:55:53 crc kubenswrapper[4744]: E0311 00:55:53.102004 4744 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 11 00:55:53 crc kubenswrapper[4744]: E0311 00:55:53.202123 4744 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 11 00:55:53 crc kubenswrapper[4744]: E0311 00:55:53.303200 4744 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 11 00:55:53 crc kubenswrapper[4744]: E0311 00:55:53.404376 4744 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 11 00:55:53 crc kubenswrapper[4744]: E0311 00:55:53.505637 4744 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 11 00:55:53 crc kubenswrapper[4744]: E0311 00:55:53.606549 4744 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 11 00:55:53 crc 
kubenswrapper[4744]: E0311 00:55:53.707132 4744 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 11 00:55:53 crc kubenswrapper[4744]: E0311 00:55:53.808087 4744 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 11 00:55:53 crc kubenswrapper[4744]: E0311 00:55:53.908603 4744 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 11 00:55:53 crc kubenswrapper[4744]: I0311 00:55:53.973914 4744 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 11 00:55:53 crc kubenswrapper[4744]: I0311 00:55:53.976437 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 00:55:53 crc kubenswrapper[4744]: I0311 00:55:53.976647 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 00:55:53 crc kubenswrapper[4744]: I0311 00:55:53.976813 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 00:55:53 crc kubenswrapper[4744]: I0311 00:55:53.978011 4744 scope.go:117] "RemoveContainer" containerID="da0b8a54609bfac02c7fac32aa814351cea89b342ba3bd610c6b88e31610ff09" Mar 11 00:55:54 crc kubenswrapper[4744]: E0311 00:55:54.009692 4744 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 11 00:55:54 crc kubenswrapper[4744]: E0311 00:55:54.040252 4744 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Mar 11 00:55:54 crc kubenswrapper[4744]: E0311 00:55:54.110357 4744 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 11 00:55:54 crc kubenswrapper[4744]: E0311 00:55:54.211313 4744 kubelet_node_status.go:503] "Error getting the 
current node from lister" err="node \"crc\" not found" Mar 11 00:55:54 crc kubenswrapper[4744]: E0311 00:55:54.312061 4744 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 11 00:55:54 crc kubenswrapper[4744]: E0311 00:55:54.413508 4744 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 11 00:55:54 crc kubenswrapper[4744]: I0311 00:55:54.499985 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/3.log" Mar 11 00:55:54 crc kubenswrapper[4744]: I0311 00:55:54.502626 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"4b2b952a5490bff5028c83fdfb0942e84bac782d470cede57ec6a31f483a407e"} Mar 11 00:55:54 crc kubenswrapper[4744]: I0311 00:55:54.502846 4744 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 11 00:55:54 crc kubenswrapper[4744]: I0311 00:55:54.504582 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 00:55:54 crc kubenswrapper[4744]: I0311 00:55:54.504646 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 00:55:54 crc kubenswrapper[4744]: I0311 00:55:54.504668 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 00:55:54 crc kubenswrapper[4744]: E0311 00:55:54.514413 4744 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 11 00:55:54 crc kubenswrapper[4744]: E0311 00:55:54.615572 4744 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 11 00:55:54 crc 
kubenswrapper[4744]: E0311 00:55:54.716625 4744 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 11 00:55:54 crc kubenswrapper[4744]: E0311 00:55:54.817658 4744 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 11 00:55:54 crc kubenswrapper[4744]: E0311 00:55:54.918434 4744 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 11 00:55:55 crc kubenswrapper[4744]: E0311 00:55:55.018628 4744 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 11 00:55:55 crc kubenswrapper[4744]: E0311 00:55:55.118836 4744 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 11 00:55:55 crc kubenswrapper[4744]: E0311 00:55:55.219940 4744 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 11 00:55:55 crc kubenswrapper[4744]: E0311 00:55:55.320451 4744 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 11 00:55:55 crc kubenswrapper[4744]: E0311 00:55:55.421060 4744 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 11 00:55:55 crc kubenswrapper[4744]: I0311 00:55:55.507376 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/4.log" Mar 11 00:55:55 crc kubenswrapper[4744]: I0311 00:55:55.508368 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/3.log" Mar 11 00:55:55 crc kubenswrapper[4744]: I0311 00:55:55.510950 4744 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" 
containerID="4b2b952a5490bff5028c83fdfb0942e84bac782d470cede57ec6a31f483a407e" exitCode=255 Mar 11 00:55:55 crc kubenswrapper[4744]: I0311 00:55:55.511012 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"4b2b952a5490bff5028c83fdfb0942e84bac782d470cede57ec6a31f483a407e"} Mar 11 00:55:55 crc kubenswrapper[4744]: I0311 00:55:55.511071 4744 scope.go:117] "RemoveContainer" containerID="da0b8a54609bfac02c7fac32aa814351cea89b342ba3bd610c6b88e31610ff09" Mar 11 00:55:55 crc kubenswrapper[4744]: I0311 00:55:55.511316 4744 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 11 00:55:55 crc kubenswrapper[4744]: I0311 00:55:55.512743 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 00:55:55 crc kubenswrapper[4744]: I0311 00:55:55.512785 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 00:55:55 crc kubenswrapper[4744]: I0311 00:55:55.512802 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 00:55:55 crc kubenswrapper[4744]: I0311 00:55:55.513724 4744 scope.go:117] "RemoveContainer" containerID="4b2b952a5490bff5028c83fdfb0942e84bac782d470cede57ec6a31f483a407e" Mar 11 00:55:55 crc kubenswrapper[4744]: E0311 00:55:55.514177 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 1m20s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 11 00:55:55 crc kubenswrapper[4744]: E0311 00:55:55.544050 4744 
kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 11 00:55:55 crc kubenswrapper[4744]: E0311 00:55:55.644451 4744 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 11 00:55:55 crc kubenswrapper[4744]: E0311 00:55:55.744651 4744 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 11 00:55:55 crc kubenswrapper[4744]: E0311 00:55:55.844835 4744 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 11 00:55:55 crc kubenswrapper[4744]: E0311 00:55:55.945100 4744 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 11 00:55:56 crc kubenswrapper[4744]: E0311 00:55:56.046278 4744 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 11 00:55:56 crc kubenswrapper[4744]: E0311 00:55:56.147549 4744 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 11 00:55:56 crc kubenswrapper[4744]: E0311 00:55:56.248100 4744 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 11 00:55:56 crc kubenswrapper[4744]: E0311 00:55:56.349095 4744 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 11 00:55:56 crc kubenswrapper[4744]: E0311 00:55:56.450248 4744 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 11 00:55:56 crc kubenswrapper[4744]: I0311 00:55:56.517209 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/4.log" Mar 11 00:55:56 crc kubenswrapper[4744]: E0311 00:55:56.550803 4744 kubelet_node_status.go:503] "Error getting the current 
node from lister" err="node \"crc\" not found" Mar 11 00:55:56 crc kubenswrapper[4744]: E0311 00:55:56.651596 4744 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 11 00:55:56 crc kubenswrapper[4744]: E0311 00:55:56.752590 4744 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 11 00:55:56 crc kubenswrapper[4744]: E0311 00:55:56.853762 4744 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 11 00:55:56 crc kubenswrapper[4744]: E0311 00:55:56.954614 4744 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 11 00:55:56 crc kubenswrapper[4744]: I0311 00:55:56.974508 4744 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 11 00:55:56 crc kubenswrapper[4744]: I0311 00:55:56.976292 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 00:55:56 crc kubenswrapper[4744]: I0311 00:55:56.976340 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 00:55:56 crc kubenswrapper[4744]: I0311 00:55:56.976360 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 00:55:57 crc kubenswrapper[4744]: E0311 00:55:57.055174 4744 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 11 00:55:57 crc kubenswrapper[4744]: E0311 00:55:57.156190 4744 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 11 00:55:57 crc kubenswrapper[4744]: E0311 00:55:57.256618 4744 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 11 00:55:57 crc kubenswrapper[4744]: E0311 00:55:57.357551 4744 
kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 11 00:55:57 crc kubenswrapper[4744]: E0311 00:55:57.458609 4744 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 11 00:55:57 crc kubenswrapper[4744]: E0311 00:55:57.559203 4744 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 11 00:55:57 crc kubenswrapper[4744]: E0311 00:55:57.660243 4744 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 11 00:55:57 crc kubenswrapper[4744]: E0311 00:55:57.761309 4744 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 11 00:55:57 crc kubenswrapper[4744]: E0311 00:55:57.862346 4744 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 11 00:55:57 crc kubenswrapper[4744]: E0311 00:55:57.963401 4744 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 11 00:55:58 crc kubenswrapper[4744]: E0311 00:55:58.064365 4744 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 11 00:55:58 crc kubenswrapper[4744]: E0311 00:55:58.165135 4744 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 11 00:55:58 crc kubenswrapper[4744]: E0311 00:55:58.266196 4744 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 11 00:55:58 crc kubenswrapper[4744]: E0311 00:55:58.366463 4744 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 11 00:55:58 crc kubenswrapper[4744]: E0311 00:55:58.467555 4744 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 11 00:55:58 crc 
kubenswrapper[4744]: E0311 00:55:58.567825 4744 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 11 00:55:58 crc kubenswrapper[4744]: E0311 00:55:58.668390 4744 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 11 00:55:58 crc kubenswrapper[4744]: E0311 00:55:58.769598 4744 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 11 00:55:58 crc kubenswrapper[4744]: E0311 00:55:58.870565 4744 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 11 00:55:58 crc kubenswrapper[4744]: E0311 00:55:58.971139 4744 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 11 00:55:59 crc kubenswrapper[4744]: E0311 00:55:59.071883 4744 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 11 00:55:59 crc kubenswrapper[4744]: E0311 00:55:59.172505 4744 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 11 00:55:59 crc kubenswrapper[4744]: E0311 00:55:59.273655 4744 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 11 00:55:59 crc kubenswrapper[4744]: E0311 00:55:59.374593 4744 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 11 00:55:59 crc kubenswrapper[4744]: E0311 00:55:59.474919 4744 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 11 00:55:59 crc kubenswrapper[4744]: E0311 00:55:59.575452 4744 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 11 00:55:59 crc kubenswrapper[4744]: E0311 00:55:59.676403 4744 kubelet_node_status.go:503] "Error getting the current node from lister" 
err="node \"crc\" not found" Mar 11 00:55:59 crc kubenswrapper[4744]: E0311 00:55:59.777617 4744 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 11 00:55:59 crc kubenswrapper[4744]: E0311 00:55:59.878446 4744 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 11 00:55:59 crc kubenswrapper[4744]: E0311 00:55:59.979280 4744 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 11 00:56:00 crc kubenswrapper[4744]: E0311 00:56:00.080315 4744 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 11 00:56:00 crc kubenswrapper[4744]: E0311 00:56:00.180978 4744 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 11 00:56:00 crc kubenswrapper[4744]: E0311 00:56:00.281328 4744 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 11 00:56:00 crc kubenswrapper[4744]: E0311 00:56:00.381992 4744 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 11 00:56:00 crc kubenswrapper[4744]: E0311 00:56:00.482163 4744 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 11 00:56:00 crc kubenswrapper[4744]: E0311 00:56:00.582404 4744 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 11 00:56:00 crc kubenswrapper[4744]: E0311 00:56:00.683105 4744 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 11 00:56:00 crc kubenswrapper[4744]: E0311 00:56:00.783367 4744 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 11 00:56:00 crc kubenswrapper[4744]: E0311 00:56:00.883563 4744 kubelet_node_status.go:503] 
"Error getting the current node from lister" err="node \"crc\" not found" Mar 11 00:56:00 crc kubenswrapper[4744]: E0311 00:56:00.983737 4744 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 11 00:56:01 crc kubenswrapper[4744]: E0311 00:56:01.084654 4744 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 11 00:56:01 crc kubenswrapper[4744]: E0311 00:56:01.184919 4744 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 11 00:56:01 crc kubenswrapper[4744]: E0311 00:56:01.285920 4744 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 11 00:56:01 crc kubenswrapper[4744]: E0311 00:56:01.386086 4744 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 11 00:56:01 crc kubenswrapper[4744]: I0311 00:56:01.404499 4744 reflector.go:368] Caches populated for *v1.Service from k8s.io/client-go/informers/factory.go:160 Mar 11 00:56:01 crc kubenswrapper[4744]: E0311 00:56:01.487222 4744 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 11 00:56:01 crc kubenswrapper[4744]: E0311 00:56:01.587377 4744 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 11 00:56:01 crc kubenswrapper[4744]: E0311 00:56:01.688212 4744 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 11 00:56:01 crc kubenswrapper[4744]: E0311 00:56:01.737421 4744 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"crc\": node \"crc\" not found" Mar 11 00:56:01 crc kubenswrapper[4744]: I0311 00:56:01.749286 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 00:56:01 crc 
kubenswrapper[4744]: I0311 00:56:01.749350 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 00:56:01 crc kubenswrapper[4744]: I0311 00:56:01.749368 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 00:56:01 crc kubenswrapper[4744]: I0311 00:56:01.749392 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 00:56:01 crc kubenswrapper[4744]: I0311 00:56:01.749410 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T00:56:01Z","lastTransitionTime":"2026-03-11T00:56:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 11 00:56:01 crc kubenswrapper[4744]: E0311 00:56:01.764856 4744 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-11T00:56:01Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:01Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-11T00:56:01Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:01Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-11T00:56:01Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:01Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-11T00:56:01Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:01Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"51e320bc-e184-46b0-b151-baf1fef55472\\\",\\\"systemUUID\\\":\\\"b27597af-c36d-4084-a073-6dfdbb017181\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 11 00:56:01 crc kubenswrapper[4744]: I0311 00:56:01.772167 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 00:56:01 crc kubenswrapper[4744]: I0311 00:56:01.772222 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 00:56:01 crc kubenswrapper[4744]: I0311 00:56:01.772243 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 00:56:01 crc kubenswrapper[4744]: I0311 00:56:01.772268 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 00:56:01 crc kubenswrapper[4744]: I0311 00:56:01.772285 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T00:56:01Z","lastTransitionTime":"2026-03-11T00:56:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 11 00:56:01 crc kubenswrapper[4744]: E0311 00:56:01.786852 4744 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-11T00:56:01Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:01Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-11T00:56:01Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:01Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-11T00:56:01Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:01Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-11T00:56:01Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:01Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"51e320bc-e184-46b0-b151-baf1fef55472\\\",\\\"systemUUID\\\":\\\"b27597af-c36d-4084-a073-6dfdbb017181\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 11 00:56:01 crc kubenswrapper[4744]: I0311 00:56:01.791935 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 00:56:01 crc kubenswrapper[4744]: I0311 00:56:01.791996 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 00:56:01 crc kubenswrapper[4744]: I0311 00:56:01.792014 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 00:56:01 crc kubenswrapper[4744]: I0311 00:56:01.792042 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 00:56:01 crc kubenswrapper[4744]: I0311 00:56:01.792060 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T00:56:01Z","lastTransitionTime":"2026-03-11T00:56:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 11 00:56:01 crc kubenswrapper[4744]: E0311 00:56:01.807278 4744 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-11T00:56:01Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:01Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-11T00:56:01Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:01Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-11T00:56:01Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:01Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-11T00:56:01Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:01Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"51e320bc-e184-46b0-b151-baf1fef55472\\\",\\\"systemUUID\\\":\\\"b27597af-c36d-4084-a073-6dfdbb017181\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 11 00:56:01 crc kubenswrapper[4744]: I0311 00:56:01.811667 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 00:56:01 crc kubenswrapper[4744]: I0311 00:56:01.811744 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 00:56:01 crc kubenswrapper[4744]: I0311 00:56:01.811768 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 00:56:01 crc kubenswrapper[4744]: I0311 00:56:01.811800 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 00:56:01 crc kubenswrapper[4744]: I0311 00:56:01.811825 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T00:56:01Z","lastTransitionTime":"2026-03-11T00:56:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 11 00:56:01 crc kubenswrapper[4744]: E0311 00:56:01.828908 4744 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-11T00:56:01Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:01Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-11T00:56:01Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:01Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-11T00:56:01Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:01Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-11T00:56:01Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:01Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"51e320bc-e184-46b0-b151-baf1fef55472\\\",\\\"systemUUID\\\":\\\"b27597af-c36d-4084-a073-6dfdbb017181\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 11 00:56:01 crc kubenswrapper[4744]: E0311 00:56:01.829136 4744 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 11 00:56:01 crc kubenswrapper[4744]: E0311 00:56:01.829188 4744 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 11 00:56:01 crc kubenswrapper[4744]: E0311 00:56:01.930130 4744 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 11 00:56:02 crc kubenswrapper[4744]: E0311 00:56:02.031192 4744 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 11 00:56:02 crc kubenswrapper[4744]: E0311 00:56:02.132175 4744 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 11 00:56:02 crc kubenswrapper[4744]: E0311 00:56:02.233224 4744 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 11 00:56:02 crc kubenswrapper[4744]: E0311 00:56:02.333995 4744 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 11 00:56:02 crc kubenswrapper[4744]: E0311 00:56:02.435107 4744 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 11 00:56:02 crc kubenswrapper[4744]: E0311 00:56:02.535959 4744 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 11 00:56:02 crc kubenswrapper[4744]: E0311 00:56:02.636932 4744 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 11 00:56:02 crc kubenswrapper[4744]: I0311 00:56:02.653324 4744 
kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 11 00:56:02 crc kubenswrapper[4744]: I0311 00:56:02.653636 4744 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 11 00:56:02 crc kubenswrapper[4744]: I0311 00:56:02.655478 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 00:56:02 crc kubenswrapper[4744]: I0311 00:56:02.655571 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 00:56:02 crc kubenswrapper[4744]: I0311 00:56:02.655596 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 00:56:02 crc kubenswrapper[4744]: I0311 00:56:02.656668 4744 scope.go:117] "RemoveContainer" containerID="4b2b952a5490bff5028c83fdfb0942e84bac782d470cede57ec6a31f483a407e" Mar 11 00:56:02 crc kubenswrapper[4744]: E0311 00:56:02.656967 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 1m20s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 11 00:56:02 crc kubenswrapper[4744]: E0311 00:56:02.738093 4744 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 11 00:56:02 crc kubenswrapper[4744]: E0311 00:56:02.839174 4744 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 11 00:56:02 crc kubenswrapper[4744]: E0311 00:56:02.939657 4744 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 11 00:56:03 crc kubenswrapper[4744]: E0311 
00:56:03.040260 4744 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 11 00:56:03 crc kubenswrapper[4744]: E0311 00:56:03.141314 4744 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 11 00:56:03 crc kubenswrapper[4744]: E0311 00:56:03.242223 4744 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 11 00:56:03 crc kubenswrapper[4744]: E0311 00:56:03.343184 4744 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 11 00:56:03 crc kubenswrapper[4744]: E0311 00:56:03.443539 4744 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 11 00:56:03 crc kubenswrapper[4744]: E0311 00:56:03.544703 4744 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 11 00:56:03 crc kubenswrapper[4744]: E0311 00:56:03.645472 4744 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 11 00:56:03 crc kubenswrapper[4744]: E0311 00:56:03.746586 4744 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 11 00:56:03 crc kubenswrapper[4744]: E0311 00:56:03.847391 4744 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 11 00:56:03 crc kubenswrapper[4744]: E0311 00:56:03.947865 4744 kubelet_node_status.go:497] "Node not becoming ready in time after startup" Mar 11 00:56:04 crc kubenswrapper[4744]: E0311 00:56:04.040729 4744 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Mar 11 00:56:04 crc kubenswrapper[4744]: E0311 00:56:04.053364 4744 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false 
reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 11 00:56:04 crc kubenswrapper[4744]: I0311 00:56:04.641389 4744 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 11 00:56:04 crc kubenswrapper[4744]: I0311 00:56:04.641747 4744 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 11 00:56:04 crc kubenswrapper[4744]: I0311 00:56:04.643616 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 00:56:04 crc kubenswrapper[4744]: I0311 00:56:04.643659 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 00:56:04 crc kubenswrapper[4744]: I0311 00:56:04.643671 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 00:56:04 crc kubenswrapper[4744]: I0311 00:56:04.644350 4744 scope.go:117] "RemoveContainer" containerID="4b2b952a5490bff5028c83fdfb0942e84bac782d470cede57ec6a31f483a407e" Mar 11 00:56:04 crc kubenswrapper[4744]: E0311 00:56:04.644674 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 1m20s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 11 00:56:09 crc kubenswrapper[4744]: E0311 00:56:09.055769 4744 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
Mar 11 00:56:11 crc kubenswrapper[4744]: I0311 00:56:11.629775 4744 reflector.go:368] Caches populated for *v1.Node from k8s.io/client-go/informers/factory.go:160 Mar 11 00:56:11 crc kubenswrapper[4744]: I0311 00:56:11.923049 4744 apiserver.go:52] "Watching apiserver" Mar 11 00:56:11 crc kubenswrapper[4744]: I0311 00:56:11.930789 4744 reflector.go:368] Caches populated for *v1.Pod from pkg/kubelet/config/apiserver.go:66 Mar 11 00:56:11 crc kubenswrapper[4744]: I0311 00:56:11.932000 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-daemon-678nx","openshift-dns/node-resolver-6ghqv","openshift-multus/multus-additional-cni-plugins-sj4cl","openshift-multus/network-metrics-daemon-tdnf7","openshift-network-diagnostics/network-check-target-xd92c","openshift-ovn-kubernetes/ovnkube-node-78fcc","openshift-image-registry/node-ca-8z8gf","openshift-network-diagnostics/network-check-source-55646444c4-trplf","openshift-network-node-identity/network-node-identity-vrzqb","openshift-network-operator/iptables-alerter-4ln5h","openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-rplbw","openshift-multus/multus-xlclh","openshift-network-console/networking-console-plugin-85b44fc459-gdk6g","openshift-network-operator/network-operator-58b4c7f79c-55gtf"] Mar 11 00:56:11 crc kubenswrapper[4744]: I0311 00:56:11.932760 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 11 00:56:11 crc kubenswrapper[4744]: I0311 00:56:11.932785 4744 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 11 00:56:11 crc kubenswrapper[4744]: E0311 00:56:11.933173 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 11 00:56:11 crc kubenswrapper[4744]: I0311 00:56:11.933057 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 11 00:56:11 crc kubenswrapper[4744]: E0311 00:56:11.933504 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 11 00:56:11 crc kubenswrapper[4744]: I0311 00:56:11.933874 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 11 00:56:11 crc kubenswrapper[4744]: I0311 00:56:11.933967 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 11 00:56:11 crc kubenswrapper[4744]: I0311 00:56:11.934187 4744 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 11 00:56:11 crc kubenswrapper[4744]: E0311 00:56:11.934301 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 11 00:56:11 crc kubenswrapper[4744]: I0311 00:56:11.934715 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-tdnf7" Mar 11 00:56:11 crc kubenswrapper[4744]: E0311 00:56:11.934828 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-tdnf7" podUID="7aeb1578-fe93-4bec-8f43-17d0923fa5c0" Mar 11 00:56:11 crc kubenswrapper[4744]: I0311 00:56:11.935086 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-8z8gf" Mar 11 00:56:11 crc kubenswrapper[4744]: I0311 00:56:11.935404 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-xlclh" Mar 11 00:56:11 crc kubenswrapper[4744]: I0311 00:56:11.935541 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-rplbw" Mar 11 00:56:11 crc kubenswrapper[4744]: I0311 00:56:11.935578 4744 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/node-resolver-6ghqv" Mar 11 00:56:11 crc kubenswrapper[4744]: I0311 00:56:11.935594 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-sj4cl" Mar 11 00:56:11 crc kubenswrapper[4744]: I0311 00:56:11.935627 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-78fcc" Mar 11 00:56:11 crc kubenswrapper[4744]: I0311 00:56:11.935730 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-daemon-678nx" Mar 11 00:56:11 crc kubenswrapper[4744]: I0311 00:56:11.941815 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"kube-root-ca.crt" Mar 11 00:56:11 crc kubenswrapper[4744]: I0311 00:56:11.942242 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-operator"/"metrics-tls" Mar 11 00:56:11 crc kubenswrapper[4744]: I0311 00:56:11.942805 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"openshift-service-ca.crt" Mar 11 00:56:11 crc kubenswrapper[4744]: I0311 00:56:11.944973 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"iptables-alerter-script" Mar 11 00:56:11 crc kubenswrapper[4744]: I0311 00:56:11.945154 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-node-identity"/"network-node-identity-cert" Mar 11 00:56:11 crc kubenswrapper[4744]: I0311 00:56:11.945108 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"env-overrides" Mar 11 00:56:11 crc kubenswrapper[4744]: I0311 00:56:11.945269 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"env-overrides" Mar 11 00:56:11 crc kubenswrapper[4744]: I0311 
00:56:11.945414 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"kube-root-ca.crt" Mar 11 00:56:11 crc kubenswrapper[4744]: I0311 00:56:11.945450 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"node-ca-dockercfg-4777p" Mar 11 00:56:11 crc kubenswrapper[4744]: I0311 00:56:11.945573 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"default-dockercfg-2q5b6" Mar 11 00:56:11 crc kubenswrapper[4744]: I0311 00:56:11.945600 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"ovnkube-identity-cm" Mar 11 00:56:11 crc kubenswrapper[4744]: I0311 00:56:11.945610 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"openshift-service-ca.crt" Mar 11 00:56:11 crc kubenswrapper[4744]: I0311 00:56:11.945642 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"kube-root-ca.crt" Mar 11 00:56:11 crc kubenswrapper[4744]: I0311 00:56:11.947919 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"openshift-service-ca.crt" Mar 11 00:56:11 crc kubenswrapper[4744]: I0311 00:56:11.948356 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"openshift-service-ca.crt" Mar 11 00:56:11 crc kubenswrapper[4744]: I0311 00:56:11.948370 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"node-resolver-dockercfg-kz9s7" Mar 11 00:56:11 crc kubenswrapper[4744]: I0311 00:56:11.949066 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"cni-copy-resources" Mar 11 00:56:11 crc kubenswrapper[4744]: I0311 00:56:11.949949 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"openshift-service-ca.crt" Mar 11 00:56:11 crc 
kubenswrapper[4744]: I0311 00:56:11.950565 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-control-plane-dockercfg-gs7dd" Mar 11 00:56:11 crc kubenswrapper[4744]: I0311 00:56:11.951023 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"kube-root-ca.crt" Mar 11 00:56:11 crc kubenswrapper[4744]: I0311 00:56:11.951240 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"default-cni-sysctl-allowlist" Mar 11 00:56:11 crc kubenswrapper[4744]: I0311 00:56:11.951622 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"openshift-service-ca.crt" Mar 11 00:56:11 crc kubenswrapper[4744]: I0311 00:56:11.951657 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"image-registry-certificates" Mar 11 00:56:11 crc kubenswrapper[4744]: I0311 00:56:11.951671 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-rbac-proxy" Mar 11 00:56:11 crc kubenswrapper[4744]: I0311 00:56:11.951643 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-config" Mar 11 00:56:11 crc kubenswrapper[4744]: I0311 00:56:11.954051 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-control-plane-metrics-cert" Mar 11 00:56:11 crc kubenswrapper[4744]: I0311 00:56:11.954345 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ancillary-tools-dockercfg-vnmsz" Mar 11 00:56:11 crc kubenswrapper[4744]: I0311 00:56:11.954361 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"openshift-service-ca.crt" Mar 11 00:56:11 crc kubenswrapper[4744]: I0311 00:56:11.954493 4744 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-dns"/"kube-root-ca.crt" Mar 11 00:56:11 crc kubenswrapper[4744]: I0311 00:56:11.954638 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-root-ca.crt" Mar 11 00:56:11 crc kubenswrapper[4744]: I0311 00:56:11.954710 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-node-metrics-cert" Mar 11 00:56:11 crc kubenswrapper[4744]: I0311 00:56:11.955400 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-script-lib" Mar 11 00:56:11 crc kubenswrapper[4744]: I0311 00:56:11.954955 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-daemon-dockercfg-r5tcq" Mar 11 00:56:11 crc kubenswrapper[4744]: I0311 00:56:11.955048 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"proxy-tls" Mar 11 00:56:11 crc kubenswrapper[4744]: I0311 00:56:11.955139 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"multus-daemon-config" Mar 11 00:56:11 crc kubenswrapper[4744]: I0311 00:56:11.955230 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"kube-root-ca.crt" Mar 11 00:56:11 crc kubenswrapper[4744]: I0311 00:56:11.956019 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-node-dockercfg-pwtwl" Mar 11 00:56:11 crc kubenswrapper[4744]: I0311 00:56:11.961895 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:11Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 11 00:56:11 crc kubenswrapper[4744]: I0311 00:56:11.983688 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:11Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook 
approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 11 00:56:11 crc kubenswrapper[4744]: I0311 00:56:11.990229 4744 desired_state_of_world_populator.go:154] "Finished populating initial desired state of world" Mar 11 00:56:11 crc kubenswrapper[4744]: I0311 00:56:11.995809 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager/kube-controller-manager-crc"] Mar 11 00:56:11 crc kubenswrapper[4744]: I0311 00:56:11.999160 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-tdnf7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7aeb1578-fe93-4bec-8f43-17d0923fa5c0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:11Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:11Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-th54b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-th54b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T00:56:11Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-tdnf7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 11 00:56:12 crc kubenswrapper[4744]: I0311 00:56:12.014270 4744 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:11Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 11 00:56:12 crc kubenswrapper[4744]: I0311 00:56:12.030294 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:11Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 11 00:56:12 crc kubenswrapper[4744]: I0311 00:56:12.044347 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-8z8gf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ad8329c6-d511-446f-b617-99778d12b878\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:11Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:11Z\\\",\\\"message\\\":\\\"containers with unready status: 
[node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7gczk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T00:56:11Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-8z8gf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 11 00:56:12 crc kubenswrapper[4744]: I0311 00:56:12.059038 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-678nx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a15dc7ac-7c34-4135-b6eb-a85122800ce9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:11Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:11Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d9t5q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d9t5q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T00:56:11Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-678nx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 11 00:56:12 crc kubenswrapper[4744]: I0311 00:56:12.062571 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Mar 11 00:56:12 crc kubenswrapper[4744]: I0311 00:56:12.062634 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Mar 11 00:56:12 crc kubenswrapper[4744]: I0311 00:56:12.062676 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Mar 11 00:56:12 crc kubenswrapper[4744]: I0311 00:56:12.062724 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Mar 11 00:56:12 crc kubenswrapper[4744]: I0311 00:56:12.062809 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Mar 11 00:56:12 crc kubenswrapper[4744]: I0311 00:56:12.063091 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 00:56:12 crc kubenswrapper[4744]: I0311 00:56:12.063804 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Mar 11 00:56:12 crc kubenswrapper[4744]: I0311 00:56:12.063900 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Mar 11 00:56:12 crc kubenswrapper[4744]: I0311 00:56:12.063954 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls\") pod \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\" (UID: \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\") " Mar 11 00:56:12 crc kubenswrapper[4744]: I0311 00:56:12.063996 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Mar 11 00:56:12 crc kubenswrapper[4744]: I0311 00:56:12.064043 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: 
\"22c825df-677d-4ca6-82db-3454ed06e783\") " Mar 11 00:56:12 crc kubenswrapper[4744]: I0311 00:56:12.064088 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Mar 11 00:56:12 crc kubenswrapper[4744]: I0311 00:56:12.064129 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Mar 11 00:56:12 crc kubenswrapper[4744]: I0311 00:56:12.064178 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Mar 11 00:56:12 crc kubenswrapper[4744]: I0311 00:56:12.064218 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Mar 11 00:56:12 crc kubenswrapper[4744]: I0311 00:56:12.064256 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-htfz6\" (UniqueName: \"kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Mar 11 00:56:12 crc kubenswrapper[4744]: I0311 00:56:12.064297 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Mar 11 00:56:12 crc kubenswrapper[4744]: I0311 00:56:12.064331 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Mar 11 00:56:12 crc kubenswrapper[4744]: I0311 00:56:12.064333 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 00:56:12 crc kubenswrapper[4744]: I0311 00:56:12.064371 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Mar 11 00:56:12 crc kubenswrapper[4744]: I0311 00:56:12.064396 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle" (OuterVolumeSpecName: "service-ca-bundle") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "service-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 11 00:56:12 crc kubenswrapper[4744]: I0311 00:56:12.064391 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities" (OuterVolumeSpecName: "utilities") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 11 00:56:12 crc kubenswrapper[4744]: I0311 00:56:12.064409 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cfbct\" (UniqueName: \"kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Mar 11 00:56:12 crc kubenswrapper[4744]: I0311 00:56:12.064593 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sb6h7\" (UniqueName: \"kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Mar 11 00:56:12 crc kubenswrapper[4744]: I0311 00:56:12.064761 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8tdtz\" (UniqueName: \"kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Mar 11 00:56:12 crc kubenswrapper[4744]: I0311 00:56:12.064785 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert" (OuterVolumeSpecName: "webhook-cert") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "webhook-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 00:56:12 crc kubenswrapper[4744]: I0311 00:56:12.064801 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nzwt7\" (UniqueName: \"kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7\") pod \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\" (UID: \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\") " Mar 11 00:56:12 crc kubenswrapper[4744]: I0311 00:56:12.064913 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct" (OuterVolumeSpecName: "kube-api-access-cfbct") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "kube-api-access-cfbct". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 00:56:12 crc kubenswrapper[4744]: I0311 00:56:12.064925 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Mar 11 00:56:12 crc kubenswrapper[4744]: I0311 00:56:12.064960 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca" (OuterVolumeSpecName: "client-ca") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "client-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 11 00:56:12 crc kubenswrapper[4744]: I0311 00:56:12.065014 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Mar 11 00:56:12 crc kubenswrapper[4744]: I0311 00:56:12.065059 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Mar 11 00:56:12 crc kubenswrapper[4744]: I0311 00:56:12.065095 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Mar 11 00:56:12 crc kubenswrapper[4744]: I0311 00:56:12.065129 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Mar 11 00:56:12 crc kubenswrapper[4744]: I0311 00:56:12.065174 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Mar 11 00:56:12 crc kubenswrapper[4744]: I0311 00:56:12.065211 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s4n52\" (UniqueName: 
\"kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Mar 11 00:56:12 crc kubenswrapper[4744]: I0311 00:56:12.065250 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Mar 11 00:56:12 crc kubenswrapper[4744]: I0311 00:56:12.065294 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fcqwp\" (UniqueName: \"kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Mar 11 00:56:12 crc kubenswrapper[4744]: I0311 00:56:12.065334 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Mar 11 00:56:12 crc kubenswrapper[4744]: I0311 00:56:12.065374 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 11 00:56:12 crc kubenswrapper[4744]: I0311 00:56:12.065386 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7" (OuterVolumeSpecName: "kube-api-access-nzwt7") pod "96b93a3a-6083-4aea-8eab-fe1aa8245ad9" (UID: 
"96b93a3a-6083-4aea-8eab-fe1aa8245ad9"). InnerVolumeSpecName "kube-api-access-nzwt7". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 00:56:12 crc kubenswrapper[4744]: I0311 00:56:12.065408 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Mar 11 00:56:12 crc kubenswrapper[4744]: I0311 00:56:12.065412 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7" (OuterVolumeSpecName: "kube-api-access-sb6h7") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "kube-api-access-sb6h7". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 00:56:12 crc kubenswrapper[4744]: I0311 00:56:12.065445 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2d4wz\" (UniqueName: \"kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Mar 11 00:56:12 crc kubenswrapper[4744]: I0311 00:56:12.065484 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Mar 11 00:56:12 crc kubenswrapper[4744]: I0311 00:56:12.065502 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: 
"0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 00:56:12 crc kubenswrapper[4744]: I0311 00:56:12.065552 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Mar 11 00:56:12 crc kubenswrapper[4744]: I0311 00:56:12.065591 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca" (OuterVolumeSpecName: "etcd-ca") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 11 00:56:12 crc kubenswrapper[4744]: I0311 00:56:12.065626 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zkvpv\" (UniqueName: \"kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Mar 11 00:56:12 crc kubenswrapper[4744]: I0311 00:56:12.065784 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Mar 11 00:56:12 crc kubenswrapper[4744]: I0311 00:56:12.065843 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2w9zh\" (UniqueName: \"kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: 
\"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Mar 11 00:56:12 crc kubenswrapper[4744]: I0311 00:56:12.065881 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Mar 11 00:56:12 crc kubenswrapper[4744]: I0311 00:56:12.065870 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 11 00:56:12 crc kubenswrapper[4744]: I0311 00:56:12.065917 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Mar 11 00:56:12 crc kubenswrapper[4744]: I0311 00:56:12.065955 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 11 00:56:12 crc kubenswrapper[4744]: I0311 00:56:12.065882 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls" (OuterVolumeSpecName: "samples-operator-tls") pod "a0128f3a-b052-44ed-a84e-c4c8aaf17c13" (UID: "a0128f3a-b052-44ed-a84e-c4c8aaf17c13"). InnerVolumeSpecName "samples-operator-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 00:56:12 crc kubenswrapper[4744]: I0311 00:56:12.065996 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Mar 11 00:56:12 crc kubenswrapper[4744]: I0311 00:56:12.065918 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz" (OuterVolumeSpecName: "kube-api-access-8tdtz") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "kube-api-access-8tdtz". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 00:56:12 crc kubenswrapper[4744]: I0311 00:56:12.065892 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config" (OuterVolumeSpecName: "config") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 11 00:56:12 crc kubenswrapper[4744]: I0311 00:56:12.066034 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Mar 11 00:56:12 crc kubenswrapper[4744]: I0311 00:56:12.066109 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Mar 11 00:56:12 crc kubenswrapper[4744]: I0311 00:56:12.066154 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7c4vf\" (UniqueName: \"kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Mar 11 00:56:12 crc kubenswrapper[4744]: I0311 00:56:12.066176 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 00:56:12 crc kubenswrapper[4744]: I0311 00:56:12.066198 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Mar 11 00:56:12 crc kubenswrapper[4744]: I0311 00:56:12.066241 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Mar 11 00:56:12 crc kubenswrapper[4744]: I0311 00:56:12.066279 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert\") pod \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\" (UID: \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\") " Mar 11 00:56:12 crc kubenswrapper[4744]: I0311 00:56:12.066328 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xcphl\" (UniqueName: \"kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Mar 11 00:56:12 crc kubenswrapper[4744]: I0311 00:56:12.066368 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Mar 11 00:56:12 crc kubenswrapper[4744]: I0311 00:56:12.066408 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started 
for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Mar 11 00:56:12 crc kubenswrapper[4744]: I0311 00:56:12.066201 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh" (OuterVolumeSpecName: "kube-api-access-2w9zh") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "kube-api-access-2w9zh". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 00:56:12 crc kubenswrapper[4744]: I0311 00:56:12.066623 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pcxfs\" (UniqueName: \"kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Mar 11 00:56:12 crc kubenswrapper[4744]: I0311 00:56:12.066433 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert" (OuterVolumeSpecName: "ovn-control-plane-metrics-cert") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "ovn-control-plane-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 00:56:12 crc kubenswrapper[4744]: I0311 00:56:12.066462 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config" (OuterVolumeSpecName: "auth-proxy-config") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "auth-proxy-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 11 00:56:12 crc kubenswrapper[4744]: I0311 00:56:12.066668 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 00:56:12 crc kubenswrapper[4744]: I0311 00:56:12.066590 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert" (OuterVolumeSpecName: "srv-cert") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "srv-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 00:56:12 crc kubenswrapper[4744]: I0311 00:56:12.066580 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate" (OuterVolumeSpecName: "default-certificate") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "default-certificate". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 00:56:12 crc kubenswrapper[4744]: I0311 00:56:12.067011 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error" (OuterVolumeSpecName: "v4-0-config-user-template-error") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-error". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 00:56:12 crc kubenswrapper[4744]: I0311 00:56:12.067051 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 00:56:12 crc kubenswrapper[4744]: I0311 00:56:12.067137 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv" (OuterVolumeSpecName: "kube-api-access-zkvpv") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "kube-api-access-zkvpv". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 00:56:12 crc kubenswrapper[4744]: I0311 00:56:12.067186 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6" (OuterVolumeSpecName: "kube-api-access-htfz6") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "kube-api-access-htfz6". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 00:56:12 crc kubenswrapper[4744]: I0311 00:56:12.067294 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist" (OuterVolumeSpecName: "cni-sysctl-allowlist") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "cni-sysctl-allowlist". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 11 00:56:12 crc kubenswrapper[4744]: I0311 00:56:12.066684 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Mar 11 00:56:12 crc kubenswrapper[4744]: I0311 00:56:12.067444 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 00:56:12 crc kubenswrapper[4744]: I0311 00:56:12.067497 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Mar 11 00:56:12 crc kubenswrapper[4744]: I0311 00:56:12.067571 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dbsvg\" (UniqueName: \"kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Mar 11 00:56:12 crc kubenswrapper[4744]: I0311 00:56:12.067612 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Mar 11 00:56:12 crc kubenswrapper[4744]: I0311 00:56:12.067647 4744 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Mar 11 00:56:12 crc kubenswrapper[4744]: I0311 00:56:12.067678 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Mar 11 00:56:12 crc kubenswrapper[4744]: I0311 00:56:12.067708 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Mar 11 00:56:12 crc kubenswrapper[4744]: I0311 00:56:12.067740 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Mar 11 00:56:12 crc kubenswrapper[4744]: I0311 00:56:12.067740 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls" (OuterVolumeSpecName: "machine-api-operator-tls") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "machine-api-operator-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 00:56:12 crc kubenswrapper[4744]: I0311 00:56:12.067775 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Mar 11 00:56:12 crc kubenswrapper[4744]: I0311 00:56:12.067777 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config" (OuterVolumeSpecName: "mcd-auth-proxy-config") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "mcd-auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 11 00:56:12 crc kubenswrapper[4744]: I0311 00:56:12.067809 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Mar 11 00:56:12 crc kubenswrapper[4744]: I0311 00:56:12.067825 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities" (OuterVolumeSpecName: "utilities") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 11 00:56:12 crc kubenswrapper[4744]: I0311 00:56:12.067846 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wxkg8\" (UniqueName: \"kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8\") pod \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\" (UID: \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\") " Mar 11 00:56:12 crc kubenswrapper[4744]: I0311 00:56:12.067883 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Mar 11 00:56:12 crc kubenswrapper[4744]: I0311 00:56:12.067929 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume" (OuterVolumeSpecName: "config-volume") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 11 00:56:12 crc kubenswrapper[4744]: I0311 00:56:12.067958 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config" (OuterVolumeSpecName: "config") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 11 00:56:12 crc kubenswrapper[4744]: I0311 00:56:12.068034 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 11 00:56:12 crc kubenswrapper[4744]: I0311 00:56:12.068085 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Mar 11 00:56:12 crc kubenswrapper[4744]: I0311 00:56:12.068122 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Mar 11 00:56:12 crc kubenswrapper[4744]: I0311 00:56:12.068158 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Mar 11 00:56:12 crc kubenswrapper[4744]: I0311 00:56:12.068194 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Mar 11 00:56:12 crc kubenswrapper[4744]: I0311 00:56:12.068226 4744 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Mar 11 00:56:12 crc kubenswrapper[4744]: I0311 00:56:12.068259 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Mar 11 00:56:12 crc kubenswrapper[4744]: I0311 00:56:12.068293 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Mar 11 00:56:12 crc kubenswrapper[4744]: I0311 00:56:12.068329 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 11 00:56:12 crc kubenswrapper[4744]: I0311 00:56:12.068369 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w9rds\" (UniqueName: \"kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds\") pod \"20b0d48f-5fd6-431c-a545-e3c800c7b866\" (UID: \"20b0d48f-5fd6-431c-a545-e3c800c7b866\") " Mar 11 00:56:12 crc kubenswrapper[4744]: I0311 00:56:12.068402 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: 
\"43509403-f426-496e-be36-56cef71462f5\") " Mar 11 00:56:12 crc kubenswrapper[4744]: I0311 00:56:12.068434 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Mar 11 00:56:12 crc kubenswrapper[4744]: I0311 00:56:12.068466 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Mar 11 00:56:12 crc kubenswrapper[4744]: I0311 00:56:12.068508 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Mar 11 00:56:12 crc kubenswrapper[4744]: I0311 00:56:12.068569 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rnphk\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Mar 11 00:56:12 crc kubenswrapper[4744]: I0311 00:56:12.068602 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Mar 11 00:56:12 crc kubenswrapper[4744]: I0311 00:56:12.068638 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qg5z5\" (UniqueName: 
\"kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Mar 11 00:56:12 crc kubenswrapper[4744]: I0311 00:56:12.068666 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Mar 11 00:56:12 crc kubenswrapper[4744]: I0311 00:56:12.068696 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 11 00:56:12 crc kubenswrapper[4744]: I0311 00:56:12.068736 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Mar 11 00:56:12 crc kubenswrapper[4744]: I0311 00:56:12.068771 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tk88c\" (UniqueName: \"kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Mar 11 00:56:12 crc kubenswrapper[4744]: I0311 00:56:12.068809 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Mar 11 00:56:12 crc 
kubenswrapper[4744]: I0311 00:56:12.068883 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 11 00:56:12 crc kubenswrapper[4744]: I0311 00:56:12.068921 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Mar 11 00:56:12 crc kubenswrapper[4744]: I0311 00:56:12.068955 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Mar 11 00:56:12 crc kubenswrapper[4744]: I0311 00:56:12.068993 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Mar 11 00:56:12 crc kubenswrapper[4744]: I0311 00:56:12.069030 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x4zgh\" (UniqueName: \"kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Mar 11 00:56:12 crc kubenswrapper[4744]: I0311 00:56:12.069065 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session\") pod 
\"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 11 00:56:12 crc kubenswrapper[4744]: I0311 00:56:12.069096 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Mar 11 00:56:12 crc kubenswrapper[4744]: I0311 00:56:12.069134 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xcgwh\" (UniqueName: \"kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Mar 11 00:56:12 crc kubenswrapper[4744]: I0311 00:56:12.069173 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Mar 11 00:56:12 crc kubenswrapper[4744]: I0311 00:56:12.069212 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Mar 11 00:56:12 crc kubenswrapper[4744]: I0311 00:56:12.069245 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Mar 11 00:56:12 crc kubenswrapper[4744]: I0311 00:56:12.069280 4744 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Mar 11 00:56:12 crc kubenswrapper[4744]: I0311 00:56:12.069316 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Mar 11 00:56:12 crc kubenswrapper[4744]: I0311 00:56:12.069336 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 00:56:12 crc kubenswrapper[4744]: I0311 00:56:12.069359 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fqsjt\" (UniqueName: \"kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt\") pod \"efdd0498-1daa-4136-9a4a-3b948c2293fc\" (UID: \"efdd0498-1daa-4136-9a4a-3b948c2293fc\") " Mar 11 00:56:12 crc kubenswrapper[4744]: I0311 00:56:12.069377 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52" (OuterVolumeSpecName: "kube-api-access-s4n52") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "kube-api-access-s4n52". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 00:56:12 crc kubenswrapper[4744]: I0311 00:56:12.069406 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lz9wn\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Mar 11 00:56:12 crc kubenswrapper[4744]: I0311 00:56:12.069445 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jkwtn\" (UniqueName: \"kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn\") pod \"5b88f790-22fa-440e-b583-365168c0b23d\" (UID: \"5b88f790-22fa-440e-b583-365168c0b23d\") " Mar 11 00:56:12 crc kubenswrapper[4744]: I0311 00:56:12.069487 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w4xd4\" (UniqueName: \"kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Mar 11 00:56:12 crc kubenswrapper[4744]: I0311 00:56:12.069489 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "proxy-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 00:56:12 crc kubenswrapper[4744]: I0311 00:56:12.069557 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4d4hj\" (UniqueName: \"kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj\") pod \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\" (UID: \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\") " Mar 11 00:56:12 crc kubenswrapper[4744]: I0311 00:56:12.069572 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle" (OuterVolumeSpecName: "signing-cabundle") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "signing-cabundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 11 00:56:12 crc kubenswrapper[4744]: I0311 00:56:12.069604 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9xfj7\" (UniqueName: \"kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Mar 11 00:56:12 crc kubenswrapper[4744]: I0311 00:56:12.069645 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x7zkh\" (UniqueName: \"kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh\") pod \"6731426b-95fe-49ff-bb5f-40441049fde2\" (UID: \"6731426b-95fe-49ff-bb5f-40441049fde2\") " Mar 11 00:56:12 crc kubenswrapper[4744]: I0311 00:56:12.069681 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d6qdx\" (UniqueName: \"kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Mar 11 00:56:12 crc 
kubenswrapper[4744]: I0311 00:56:12.069717 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zgdk5\" (UniqueName: \"kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Mar 11 00:56:12 crc kubenswrapper[4744]: I0311 00:56:12.069750 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Mar 11 00:56:12 crc kubenswrapper[4744]: I0311 00:56:12.069787 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v47cf\" (UniqueName: \"kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Mar 11 00:56:12 crc kubenswrapper[4744]: I0311 00:56:12.069818 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Mar 11 00:56:12 crc kubenswrapper[4744]: I0311 00:56:12.069630 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf" (OuterVolumeSpecName: "kube-api-access-7c4vf") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "kube-api-access-7c4vf". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 00:56:12 crc kubenswrapper[4744]: I0311 00:56:12.069854 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lzf88\" (UniqueName: \"kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Mar 11 00:56:12 crc kubenswrapper[4744]: I0311 00:56:12.069892 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jhbk2\" (UniqueName: \"kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2\") pod \"bd23aa5c-e532-4e53-bccf-e79f130c5ae8\" (UID: \"bd23aa5c-e532-4e53-bccf-e79f130c5ae8\") " Mar 11 00:56:12 crc kubenswrapper[4744]: I0311 00:56:12.069928 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Mar 11 00:56:12 crc kubenswrapper[4744]: I0311 00:56:12.069962 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Mar 11 00:56:12 crc kubenswrapper[4744]: I0311 00:56:12.069998 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 11 00:56:12 crc kubenswrapper[4744]: I0311 00:56:12.070033 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started 
for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 11 00:56:12 crc kubenswrapper[4744]: I0311 00:56:12.070069 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Mar 11 00:56:12 crc kubenswrapper[4744]: I0311 00:56:12.070109 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Mar 11 00:56:12 crc kubenswrapper[4744]: I0311 00:56:12.070144 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Mar 11 00:56:12 crc kubenswrapper[4744]: I0311 00:56:12.070180 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Mar 11 00:56:12 crc kubenswrapper[4744]: I0311 00:56:12.070217 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vt5rc\" (UniqueName: \"kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc\") pod \"44663579-783b-4372-86d6-acf235a62d72\" (UID: 
\"44663579-783b-4372-86d6-acf235a62d72\") " Mar 11 00:56:12 crc kubenswrapper[4744]: I0311 00:56:12.070249 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 11 00:56:12 crc kubenswrapper[4744]: I0311 00:56:12.070287 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Mar 11 00:56:12 crc kubenswrapper[4744]: I0311 00:56:12.070322 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qs4fp\" (UniqueName: \"kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Mar 11 00:56:12 crc kubenswrapper[4744]: I0311 00:56:12.070364 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Mar 11 00:56:12 crc kubenswrapper[4744]: I0311 00:56:12.070403 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Mar 11 00:56:12 crc kubenswrapper[4744]: I0311 00:56:12.070441 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs\" 
(UniqueName: \"kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs\") pod \"5b88f790-22fa-440e-b583-365168c0b23d\" (UID: \"5b88f790-22fa-440e-b583-365168c0b23d\") " Mar 11 00:56:12 crc kubenswrapper[4744]: I0311 00:56:12.070480 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w7l8j\" (UniqueName: \"kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Mar 11 00:56:12 crc kubenswrapper[4744]: I0311 00:56:12.070536 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Mar 11 00:56:12 crc kubenswrapper[4744]: I0311 00:56:12.070575 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Mar 11 00:56:12 crc kubenswrapper[4744]: I0311 00:56:12.070609 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d4lsv\" (UniqueName: \"kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Mar 11 00:56:12 crc kubenswrapper[4744]: I0311 00:56:12.070643 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: 
\"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 11 00:56:12 crc kubenswrapper[4744]: I0311 00:56:12.070680 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mg5zb\" (UniqueName: \"kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Mar 11 00:56:12 crc kubenswrapper[4744]: I0311 00:56:12.070715 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Mar 11 00:56:12 crc kubenswrapper[4744]: I0311 00:56:12.070750 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Mar 11 00:56:12 crc kubenswrapper[4744]: I0311 00:56:12.070788 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gf66m\" (UniqueName: \"kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m\") pod \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\" (UID: \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\") " Mar 11 00:56:12 crc kubenswrapper[4744]: I0311 00:56:12.070829 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 11 00:56:12 crc kubenswrapper[4744]: I0311 00:56:12.070868 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca-bundle\" 
(UniqueName: \"kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Mar 11 00:56:12 crc kubenswrapper[4744]: I0311 00:56:12.070906 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Mar 11 00:56:12 crc kubenswrapper[4744]: I0311 00:56:12.070945 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mnrrd\" (UniqueName: \"kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Mar 11 00:56:12 crc kubenswrapper[4744]: I0311 00:56:12.070990 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pj782\" (UniqueName: \"kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Mar 11 00:56:12 crc kubenswrapper[4744]: I0311 00:56:12.071021 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Mar 11 00:56:12 crc kubenswrapper[4744]: I0311 00:56:12.071060 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Mar 
11 00:56:12 crc kubenswrapper[4744]: I0311 00:56:12.071102 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Mar 11 00:56:12 crc kubenswrapper[4744]: I0311 00:56:12.071137 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6g6sz\" (UniqueName: \"kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Mar 11 00:56:12 crc kubenswrapper[4744]: I0311 00:56:12.071173 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 11 00:56:12 crc kubenswrapper[4744]: I0311 00:56:12.071209 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls\") pod \"6731426b-95fe-49ff-bb5f-40441049fde2\" (UID: \"6731426b-95fe-49ff-bb5f-40441049fde2\") " Mar 11 00:56:12 crc kubenswrapper[4744]: I0311 00:56:12.071246 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Mar 11 00:56:12 crc kubenswrapper[4744]: I0311 00:56:12.071277 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Mar 11 00:56:12 crc kubenswrapper[4744]: I0311 00:56:12.071306 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Mar 11 00:56:12 crc kubenswrapper[4744]: I0311 00:56:12.071332 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Mar 11 00:56:12 crc kubenswrapper[4744]: I0311 00:56:12.071368 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Mar 11 00:56:12 crc kubenswrapper[4744]: I0311 00:56:12.071400 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs\") pod \"efdd0498-1daa-4136-9a4a-3b948c2293fc\" (UID: \"efdd0498-1daa-4136-9a4a-3b948c2293fc\") " Mar 11 00:56:12 crc kubenswrapper[4744]: I0311 00:56:12.071434 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Mar 11 00:56:12 crc kubenswrapper[4744]: I0311 
00:56:12.071473 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 11 00:56:12 crc kubenswrapper[4744]: I0311 00:56:12.071534 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6ccd8\" (UniqueName: \"kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Mar 11 00:56:12 crc kubenswrapper[4744]: I0311 00:56:12.071575 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x2m85\" (UniqueName: \"kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85\") pod \"cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d\" (UID: \"cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d\") " Mar 11 00:56:12 crc kubenswrapper[4744]: I0311 00:56:12.071611 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Mar 11 00:56:12 crc kubenswrapper[4744]: I0311 00:56:12.071647 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 11 00:56:12 crc kubenswrapper[4744]: I0311 00:56:12.071685 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"kube-api-access-249nr\" (UniqueName: \"kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Mar 11 00:56:12 crc kubenswrapper[4744]: I0311 00:56:12.071726 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Mar 11 00:56:12 crc kubenswrapper[4744]: I0311 00:56:12.071764 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bf2bz\" (UniqueName: \"kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Mar 11 00:56:12 crc kubenswrapper[4744]: I0311 00:56:12.071794 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Mar 11 00:56:12 crc kubenswrapper[4744]: I0311 00:56:12.071823 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Mar 11 00:56:12 crc kubenswrapper[4744]: I0311 00:56:12.071852 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ngvvp\" (UniqueName: \"kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: 
\"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 11 00:56:12 crc kubenswrapper[4744]: I0311 00:56:12.071878 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Mar 11 00:56:12 crc kubenswrapper[4744]: I0311 00:56:12.071912 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls\") pod \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\" (UID: \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\") " Mar 11 00:56:12 crc kubenswrapper[4744]: I0311 00:56:12.070248 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls" (OuterVolumeSpecName: "machine-approver-tls") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "machine-approver-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 00:56:12 crc kubenswrapper[4744]: I0311 00:56:12.070559 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 00:56:12 crc kubenswrapper[4744]: I0311 00:56:12.071014 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config" (OuterVolumeSpecName: "config") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 11 00:56:12 crc kubenswrapper[4744]: I0311 00:56:12.071026 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl" (OuterVolumeSpecName: "kube-api-access-xcphl") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "kube-api-access-xcphl". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 00:56:12 crc kubenswrapper[4744]: I0311 00:56:12.070694 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates" (OuterVolumeSpecName: "available-featuregates") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "available-featuregates". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 11 00:56:12 crc kubenswrapper[4744]: I0311 00:56:12.071672 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca" (OuterVolumeSpecName: "client-ca") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 11 00:56:12 crc kubenswrapper[4744]: I0311 00:56:12.071788 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp" (OuterVolumeSpecName: "kube-api-access-fcqwp") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "kube-api-access-fcqwp". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 00:56:12 crc kubenswrapper[4744]: I0311 00:56:12.071864 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz" (OuterVolumeSpecName: "kube-api-access-2d4wz") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "kube-api-access-2d4wz". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 00:56:12 crc kubenswrapper[4744]: E0311 00:56:12.071969 4744 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-11 00:56:12.571935617 +0000 UTC m=+129.376153262 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 11 00:56:12 crc kubenswrapper[4744]: I0311 00:56:12.072655 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Mar 11 00:56:12 crc kubenswrapper[4744]: I0311 00:56:12.072671 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod 
"9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 11 00:56:12 crc kubenswrapper[4744]: I0311 00:56:12.072743 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 11 00:56:12 crc kubenswrapper[4744]: I0311 00:56:12.072814 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Mar 11 00:56:12 crc kubenswrapper[4744]: I0311 00:56:12.072879 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "env-overrides". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 11 00:56:12 crc kubenswrapper[4744]: I0311 00:56:12.072920 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Mar 11 00:56:12 crc kubenswrapper[4744]: I0311 00:56:12.072984 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Mar 11 00:56:12 crc kubenswrapper[4744]: I0311 00:56:12.073289 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Mar 11 00:56:12 crc kubenswrapper[4744]: I0311 00:56:12.073362 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Mar 11 00:56:12 crc kubenswrapper[4744]: I0311 00:56:12.073428 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Mar 11 00:56:12 crc kubenswrapper[4744]: I0311 00:56:12.073493 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-binary-copy\" (UniqueName: 
\"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Mar 11 00:56:12 crc kubenswrapper[4744]: I0311 00:56:12.073589 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Mar 11 00:56:12 crc kubenswrapper[4744]: I0311 00:56:12.073655 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-279lb\" (UniqueName: \"kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Mar 11 00:56:12 crc kubenswrapper[4744]: I0311 00:56:12.073715 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Mar 11 00:56:12 crc kubenswrapper[4744]: I0311 00:56:12.073774 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Mar 11 00:56:12 crc kubenswrapper[4744]: I0311 00:56:12.073834 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 11 00:56:12 crc kubenswrapper[4744]: I0311 
00:56:12.073896 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Mar 11 00:56:12 crc kubenswrapper[4744]: I0311 00:56:12.073955 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Mar 11 00:56:12 crc kubenswrapper[4744]: I0311 00:56:12.074014 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Mar 11 00:56:12 crc kubenswrapper[4744]: I0311 00:56:12.074075 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Mar 11 00:56:12 crc kubenswrapper[4744]: I0311 00:56:12.074142 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Mar 11 00:56:12 crc kubenswrapper[4744]: I0311 00:56:12.074212 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kfwg7\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7\") pod 
\"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 11 00:56:12 crc kubenswrapper[4744]: I0311 00:56:12.074273 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert\") pod \"20b0d48f-5fd6-431c-a545-e3c800c7b866\" (UID: \"20b0d48f-5fd6-431c-a545-e3c800c7b866\") " Mar 11 00:56:12 crc kubenswrapper[4744]: I0311 00:56:12.074336 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 11 00:56:12 crc kubenswrapper[4744]: I0311 00:56:12.074388 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pjr6v\" (UniqueName: \"kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v\") pod \"49ef4625-1d3a-4a9f-b595-c2433d32326d\" (UID: \"49ef4625-1d3a-4a9f-b595-c2433d32326d\") " Mar 11 00:56:12 crc kubenswrapper[4744]: I0311 00:56:12.075993 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 11 00:56:12 crc kubenswrapper[4744]: I0311 00:56:12.076067 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca\") pod \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\" (UID: \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\") " Mar 11 00:56:12 crc kubenswrapper[4744]: I0311 00:56:12.076224 4744 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 11 00:56:12 crc kubenswrapper[4744]: I0311 00:56:12.076291 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/6ff04e11-e747-44c5-b049-371a5d422157-ovnkube-script-lib\") pod \"ovnkube-node-78fcc\" (UID: \"6ff04e11-e747-44c5-b049-371a5d422157\") " pod="openshift-ovn-kubernetes/ovnkube-node-78fcc" Mar 11 00:56:12 crc kubenswrapper[4744]: I0311 00:56:12.076361 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 11 00:56:12 crc kubenswrapper[4744]: I0311 00:56:12.076422 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/a15dc7ac-7c34-4135-b6eb-a85122800ce9-mcd-auth-proxy-config\") pod \"machine-config-daemon-678nx\" (UID: \"a15dc7ac-7c34-4135-b6eb-a85122800ce9\") " pod="openshift-machine-config-operator/machine-config-daemon-678nx" Mar 11 00:56:12 crc kubenswrapper[4744]: I0311 00:56:12.076483 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/6ff04e11-e747-44c5-b049-371a5d422157-node-log\") pod \"ovnkube-node-78fcc\" (UID: \"6ff04e11-e747-44c5-b049-371a5d422157\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-78fcc" Mar 11 00:56:12 crc kubenswrapper[4744]: I0311 00:56:12.076654 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fr9zj\" (UniqueName: \"kubernetes.io/projected/6ff04e11-e747-44c5-b049-371a5d422157-kube-api-access-fr9zj\") pod \"ovnkube-node-78fcc\" (UID: \"6ff04e11-e747-44c5-b049-371a5d422157\") " pod="openshift-ovn-kubernetes/ovnkube-node-78fcc" Mar 11 00:56:12 crc kubenswrapper[4744]: I0311 00:56:12.076719 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 11 00:56:12 crc kubenswrapper[4744]: I0311 00:56:12.076775 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/e16bf0f3-533b-4114-89c6-195a85273e98-system-cni-dir\") pod \"multus-xlclh\" (UID: \"e16bf0f3-533b-4114-89c6-195a85273e98\") " pod="openshift-multus/multus-xlclh" Mar 11 00:56:12 crc kubenswrapper[4744]: I0311 00:56:12.076833 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/e16bf0f3-533b-4114-89c6-195a85273e98-host-var-lib-kubelet\") pod \"multus-xlclh\" (UID: \"e16bf0f3-533b-4114-89c6-195a85273e98\") " pod="openshift-multus/multus-xlclh" Mar 11 00:56:12 crc kubenswrapper[4744]: I0311 00:56:12.076928 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: 
\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 11 00:56:12 crc kubenswrapper[4744]: I0311 00:56:12.076999 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 11 00:56:12 crc kubenswrapper[4744]: I0311 00:56:12.077060 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/6ff04e11-e747-44c5-b049-371a5d422157-var-lib-openvswitch\") pod \"ovnkube-node-78fcc\" (UID: \"6ff04e11-e747-44c5-b049-371a5d422157\") " pod="openshift-ovn-kubernetes/ovnkube-node-78fcc" Mar 11 00:56:12 crc kubenswrapper[4744]: I0311 00:56:12.077123 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/92f4c9df-1087-4820-8e07-1120f02df454-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-rplbw\" (UID: \"92f4c9df-1087-4820-8e07-1120f02df454\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-rplbw" Mar 11 00:56:12 crc kubenswrapper[4744]: I0311 00:56:12.077198 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 11 00:56:12 crc kubenswrapper[4744]: I0311 00:56:12.080297 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/e16bf0f3-533b-4114-89c6-195a85273e98-etc-kubernetes\") pod \"multus-xlclh\" (UID: \"e16bf0f3-533b-4114-89c6-195a85273e98\") " pod="openshift-multus/multus-xlclh" Mar 11 00:56:12 crc kubenswrapper[4744]: I0311 00:56:12.080416 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8h2vm\" (UniqueName: \"kubernetes.io/projected/e146399d-0685-4ea8-96e4-c76c9478a23a-kube-api-access-8h2vm\") pod \"node-resolver-6ghqv\" (UID: \"e146399d-0685-4ea8-96e4-c76c9478a23a\") " pod="openshift-dns/node-resolver-6ghqv" Mar 11 00:56:12 crc kubenswrapper[4744]: I0311 00:56:12.080489 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 11 00:56:12 crc kubenswrapper[4744]: I0311 00:56:12.080550 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/6ff04e11-e747-44c5-b049-371a5d422157-host-run-ovn-kubernetes\") pod \"ovnkube-node-78fcc\" (UID: \"6ff04e11-e747-44c5-b049-371a5d422157\") " pod="openshift-ovn-kubernetes/ovnkube-node-78fcc" Mar 11 00:56:12 crc kubenswrapper[4744]: I0311 00:56:12.080592 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/6ff04e11-e747-44c5-b049-371a5d422157-host-cni-netd\") pod \"ovnkube-node-78fcc\" (UID: \"6ff04e11-e747-44c5-b049-371a5d422157\") " pod="openshift-ovn-kubernetes/ovnkube-node-78fcc" Mar 11 00:56:12 crc kubenswrapper[4744]: I0311 00:56:12.080631 4744 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/e16bf0f3-533b-4114-89c6-195a85273e98-multus-cni-dir\") pod \"multus-xlclh\" (UID: \"e16bf0f3-533b-4114-89c6-195a85273e98\") " pod="openshift-multus/multus-xlclh" Mar 11 00:56:12 crc kubenswrapper[4744]: I0311 00:56:12.080667 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/e16bf0f3-533b-4114-89c6-195a85273e98-cnibin\") pod \"multus-xlclh\" (UID: \"e16bf0f3-533b-4114-89c6-195a85273e98\") " pod="openshift-multus/multus-xlclh" Mar 11 00:56:12 crc kubenswrapper[4744]: I0311 00:56:12.080703 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pgqmk\" (UniqueName: \"kubernetes.io/projected/e16bf0f3-533b-4114-89c6-195a85273e98-kube-api-access-pgqmk\") pod \"multus-xlclh\" (UID: \"e16bf0f3-533b-4114-89c6-195a85273e98\") " pod="openshift-multus/multus-xlclh" Mar 11 00:56:12 crc kubenswrapper[4744]: I0311 00:56:12.080745 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2kz5\" (UniqueName: \"kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 11 00:56:12 crc kubenswrapper[4744]: I0311 00:56:12.080785 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/6ff04e11-e747-44c5-b049-371a5d422157-run-openvswitch\") pod \"ovnkube-node-78fcc\" (UID: \"6ff04e11-e747-44c5-b049-371a5d422157\") " pod="openshift-ovn-kubernetes/ovnkube-node-78fcc" Mar 11 00:56:12 crc kubenswrapper[4744]: I0311 00:56:12.080819 4744 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/6ff04e11-e747-44c5-b049-371a5d422157-run-ovn\") pod \"ovnkube-node-78fcc\" (UID: \"6ff04e11-e747-44c5-b049-371a5d422157\") " pod="openshift-ovn-kubernetes/ovnkube-node-78fcc" Mar 11 00:56:12 crc kubenswrapper[4744]: I0311 00:56:12.080883 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6ff04e11-e747-44c5-b049-371a5d422157-ovn-node-metrics-cert\") pod \"ovnkube-node-78fcc\" (UID: \"6ff04e11-e747-44c5-b049-371a5d422157\") " pod="openshift-ovn-kubernetes/ovnkube-node-78fcc" Mar 11 00:56:12 crc kubenswrapper[4744]: I0311 00:56:12.080920 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/ac70f681-9bcf-4f8c-a175-ed7d4e9da471-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-sj4cl\" (UID: \"ac70f681-9bcf-4f8c-a175-ed7d4e9da471\") " pod="openshift-multus/multus-additional-cni-plugins-sj4cl" Mar 11 00:56:12 crc kubenswrapper[4744]: I0311 00:56:12.080966 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 11 00:56:12 crc kubenswrapper[4744]: I0311 00:56:12.080998 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/92f4c9df-1087-4820-8e07-1120f02df454-env-overrides\") pod \"ovnkube-control-plane-749d76644c-rplbw\" (UID: \"92f4c9df-1087-4820-8e07-1120f02df454\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-rplbw" 
Mar 11 00:56:12 crc kubenswrapper[4744]: I0311 00:56:12.081068 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/e16bf0f3-533b-4114-89c6-195a85273e98-cni-binary-copy\") pod \"multus-xlclh\" (UID: \"e16bf0f3-533b-4114-89c6-195a85273e98\") " pod="openshift-multus/multus-xlclh" Mar 11 00:56:12 crc kubenswrapper[4744]: I0311 00:56:12.081101 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/e16bf0f3-533b-4114-89c6-195a85273e98-multus-socket-dir-parent\") pod \"multus-xlclh\" (UID: \"e16bf0f3-533b-4114-89c6-195a85273e98\") " pod="openshift-multus/multus-xlclh" Mar 11 00:56:12 crc kubenswrapper[4744]: I0311 00:56:12.081142 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 11 00:56:12 crc kubenswrapper[4744]: I0311 00:56:12.081187 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7gczk\" (UniqueName: \"kubernetes.io/projected/ad8329c6-d511-446f-b617-99778d12b878-kube-api-access-7gczk\") pod \"node-ca-8z8gf\" (UID: \"ad8329c6-d511-446f-b617-99778d12b878\") " pod="openshift-image-registry/node-ca-8z8gf" Mar 11 00:56:12 crc kubenswrapper[4744]: I0311 00:56:12.081222 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lkpvs\" (UniqueName: \"kubernetes.io/projected/92f4c9df-1087-4820-8e07-1120f02df454-kube-api-access-lkpvs\") pod \"ovnkube-control-plane-749d76644c-rplbw\" (UID: \"92f4c9df-1087-4820-8e07-1120f02df454\") " 
pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-rplbw" Mar 11 00:56:12 crc kubenswrapper[4744]: I0311 00:56:12.081258 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/a15dc7ac-7c34-4135-b6eb-a85122800ce9-proxy-tls\") pod \"machine-config-daemon-678nx\" (UID: \"a15dc7ac-7c34-4135-b6eb-a85122800ce9\") " pod="openshift-machine-config-operator/machine-config-daemon-678nx" Mar 11 00:56:12 crc kubenswrapper[4744]: I0311 00:56:12.081299 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d9t5q\" (UniqueName: \"kubernetes.io/projected/a15dc7ac-7c34-4135-b6eb-a85122800ce9-kube-api-access-d9t5q\") pod \"machine-config-daemon-678nx\" (UID: \"a15dc7ac-7c34-4135-b6eb-a85122800ce9\") " pod="openshift-machine-config-operator/machine-config-daemon-678nx" Mar 11 00:56:12 crc kubenswrapper[4744]: I0311 00:56:12.081336 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/e16bf0f3-533b-4114-89c6-195a85273e98-hostroot\") pod \"multus-xlclh\" (UID: \"e16bf0f3-533b-4114-89c6-195a85273e98\") " pod="openshift-multus/multus-xlclh" Mar 11 00:56:12 crc kubenswrapper[4744]: I0311 00:56:12.081371 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/7aeb1578-fe93-4bec-8f43-17d0923fa5c0-metrics-certs\") pod \"network-metrics-daemon-tdnf7\" (UID: \"7aeb1578-fe93-4bec-8f43-17d0923fa5c0\") " pod="openshift-multus/network-metrics-daemon-tdnf7" Mar 11 00:56:12 crc kubenswrapper[4744]: I0311 00:56:12.081545 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/6ff04e11-e747-44c5-b049-371a5d422157-etc-openvswitch\") pod 
\"ovnkube-node-78fcc\" (UID: \"6ff04e11-e747-44c5-b049-371a5d422157\") " pod="openshift-ovn-kubernetes/ovnkube-node-78fcc" Mar 11 00:56:12 crc kubenswrapper[4744]: I0311 00:56:12.081582 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/a15dc7ac-7c34-4135-b6eb-a85122800ce9-rootfs\") pod \"machine-config-daemon-678nx\" (UID: \"a15dc7ac-7c34-4135-b6eb-a85122800ce9\") " pod="openshift-machine-config-operator/machine-config-daemon-678nx" Mar 11 00:56:12 crc kubenswrapper[4744]: I0311 00:56:12.081609 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/e16bf0f3-533b-4114-89c6-195a85273e98-host-var-lib-cni-bin\") pod \"multus-xlclh\" (UID: \"e16bf0f3-533b-4114-89c6-195a85273e98\") " pod="openshift-multus/multus-xlclh" Mar 11 00:56:12 crc kubenswrapper[4744]: I0311 00:56:12.081643 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-identity-cm\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 11 00:56:12 crc kubenswrapper[4744]: I0311 00:56:12.081679 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/6ff04e11-e747-44c5-b049-371a5d422157-host-run-netns\") pod \"ovnkube-node-78fcc\" (UID: \"6ff04e11-e747-44c5-b049-371a5d422157\") " pod="openshift-ovn-kubernetes/ovnkube-node-78fcc" Mar 11 00:56:12 crc kubenswrapper[4744]: I0311 00:56:12.082056 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: 
\"kubernetes.io/host-path/6ff04e11-e747-44c5-b049-371a5d422157-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-78fcc\" (UID: \"6ff04e11-e747-44c5-b049-371a5d422157\") " pod="openshift-ovn-kubernetes/ovnkube-node-78fcc" Mar 11 00:56:12 crc kubenswrapper[4744]: I0311 00:56:12.074017 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 00:56:12 crc kubenswrapper[4744]: I0311 00:56:12.084102 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs" (OuterVolumeSpecName: "metrics-certs") pod "5b88f790-22fa-440e-b583-365168c0b23d" (UID: "5b88f790-22fa-440e-b583-365168c0b23d"). InnerVolumeSpecName "metrics-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 00:56:12 crc kubenswrapper[4744]: I0311 00:56:12.074082 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth" (OuterVolumeSpecName: "stats-auth") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "stats-auth". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 00:56:12 crc kubenswrapper[4744]: I0311 00:56:12.074267 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images" (OuterVolumeSpecName: "images") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "images". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 11 00:56:12 crc kubenswrapper[4744]: I0311 00:56:12.075503 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 00:56:12 crc kubenswrapper[4744]: I0311 00:56:12.075553 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib" (OuterVolumeSpecName: "ovnkube-script-lib") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovnkube-script-lib". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 11 00:56:12 crc kubenswrapper[4744]: I0311 00:56:12.076062 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 00:56:12 crc kubenswrapper[4744]: I0311 00:56:12.076284 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert" (OuterVolumeSpecName: "srv-cert") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "srv-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 00:56:12 crc kubenswrapper[4744]: I0311 00:56:12.076463 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert" (OuterVolumeSpecName: "package-server-manager-serving-cert") pod "3ab1a177-2de0-46d9-b765-d0d0649bb42e" (UID: "3ab1a177-2de0-46d9-b765-d0d0649bb42e"). InnerVolumeSpecName "package-server-manager-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 00:56:12 crc kubenswrapper[4744]: I0311 00:56:12.076692 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca" (OuterVolumeSpecName: "etcd-service-ca") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 11 00:56:12 crc kubenswrapper[4744]: I0311 00:56:12.076997 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config" (OuterVolumeSpecName: "config") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 11 00:56:12 crc kubenswrapper[4744]: I0311 00:56:12.077701 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "trusted-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 11 00:56:12 crc kubenswrapper[4744]: I0311 00:56:12.077954 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb" (OuterVolumeSpecName: "kube-api-access-mg5zb") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "kube-api-access-mg5zb". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 00:56:12 crc kubenswrapper[4744]: I0311 00:56:12.077964 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs" (OuterVolumeSpecName: "certs") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 00:56:12 crc kubenswrapper[4744]: E0311 00:56:12.078023 4744 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 11 00:56:12 crc kubenswrapper[4744]: E0311 00:56:12.084796 4744 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-11 00:56:12.584760225 +0000 UTC m=+129.388977920 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 11 00:56:12 crc kubenswrapper[4744]: I0311 00:56:12.085052 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 00:56:12 crc kubenswrapper[4744]: I0311 00:56:12.085171 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca" (OuterVolumeSpecName: "etcd-serving-ca") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "etcd-serving-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 11 00:56:12 crc kubenswrapper[4744]: I0311 00:56:12.078899 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert" (OuterVolumeSpecName: "profile-collector-cert") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "profile-collector-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 00:56:12 crc kubenswrapper[4744]: I0311 00:56:12.078117 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "96b93a3a-6083-4aea-8eab-fe1aa8245ad9" (UID: "96b93a3a-6083-4aea-8eab-fe1aa8245ad9"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 00:56:12 crc kubenswrapper[4744]: I0311 00:56:12.079156 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782" (OuterVolumeSpecName: "kube-api-access-pj782") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "kube-api-access-pj782". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 00:56:12 crc kubenswrapper[4744]: I0311 00:56:12.079737 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m" (OuterVolumeSpecName: "kube-api-access-gf66m") pod "a0128f3a-b052-44ed-a84e-c4c8aaf17c13" (UID: "a0128f3a-b052-44ed-a84e-c4c8aaf17c13"). InnerVolumeSpecName "kube-api-access-gf66m". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 00:56:12 crc kubenswrapper[4744]: I0311 00:56:12.079715 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk" (OuterVolumeSpecName: "kube-api-access-rnphk") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "kube-api-access-rnphk". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 00:56:12 crc kubenswrapper[4744]: I0311 00:56:12.079994 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 00:56:12 crc kubenswrapper[4744]: I0311 00:56:12.080170 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 00:56:12 crc kubenswrapper[4744]: I0311 00:56:12.080191 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls" (OuterVolumeSpecName: "control-plane-machine-set-operator-tls") pod "6731426b-95fe-49ff-bb5f-40441049fde2" (UID: "6731426b-95fe-49ff-bb5f-40441049fde2"). InnerVolumeSpecName "control-plane-machine-set-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 00:56:12 crc kubenswrapper[4744]: I0311 00:56:12.081260 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp" (OuterVolumeSpecName: "kube-api-access-qs4fp") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "kube-api-access-qs4fp". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 00:56:12 crc kubenswrapper[4744]: I0311 00:56:12.081310 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj" (OuterVolumeSpecName: "kube-api-access-4d4hj") pod "3ab1a177-2de0-46d9-b765-d0d0649bb42e" (UID: "3ab1a177-2de0-46d9-b765-d0d0649bb42e"). InnerVolumeSpecName "kube-api-access-4d4hj". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 00:56:12 crc kubenswrapper[4744]: I0311 00:56:12.081797 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 11 00:56:12 crc kubenswrapper[4744]: I0311 00:56:12.081811 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 00:56:12 crc kubenswrapper[4744]: I0311 00:56:12.081830 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-oauth-config". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 00:56:12 crc kubenswrapper[4744]: I0311 00:56:12.082264 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 00:56:12 crc kubenswrapper[4744]: I0311 00:56:12.082319 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 11 00:56:12 crc kubenswrapper[4744]: I0311 00:56:12.082486 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds" (OuterVolumeSpecName: "kube-api-access-w9rds") pod "20b0d48f-5fd6-431c-a545-e3c800c7b866" (UID: "20b0d48f-5fd6-431c-a545-e3c800c7b866"). InnerVolumeSpecName "kube-api-access-w9rds". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 00:56:12 crc kubenswrapper[4744]: I0311 00:56:12.082623 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2" (OuterVolumeSpecName: "kube-api-access-jhbk2") pod "bd23aa5c-e532-4e53-bccf-e79f130c5ae8" (UID: "bd23aa5c-e532-4e53-bccf-e79f130c5ae8"). InnerVolumeSpecName "kube-api-access-jhbk2". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 00:56:12 crc kubenswrapper[4744]: I0311 00:56:12.082960 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs" (OuterVolumeSpecName: "kube-api-access-pcxfs") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "kube-api-access-pcxfs". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 00:56:12 crc kubenswrapper[4744]: I0311 00:56:12.082976 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7" (OuterVolumeSpecName: "kube-api-access-9xfj7") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "kube-api-access-9xfj7". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 00:56:12 crc kubenswrapper[4744]: I0311 00:56:12.084139 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config" (OuterVolumeSpecName: "config") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 11 00:56:12 crc kubenswrapper[4744]: I0311 00:56:12.084171 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs" (OuterVolumeSpecName: "tmpfs") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "tmpfs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 11 00:56:12 crc kubenswrapper[4744]: I0311 00:56:12.084331 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config" (OuterVolumeSpecName: "config") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 11 00:56:12 crc kubenswrapper[4744]: I0311 00:56:12.084290 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/ac70f681-9bcf-4f8c-a175-ed7d4e9da471-cni-binary-copy\") pod \"multus-additional-cni-plugins-sj4cl\" (UID: \"ac70f681-9bcf-4f8c-a175-ed7d4e9da471\") " pod="openshift-multus/multus-additional-cni-plugins-sj4cl" Mar 11 00:56:12 crc kubenswrapper[4744]: I0311 00:56:12.085774 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca" (OuterVolumeSpecName: "service-ca") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "service-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 11 00:56:12 crc kubenswrapper[4744]: I0311 00:56:12.085844 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/ad8329c6-d511-446f-b617-99778d12b878-host\") pod \"node-ca-8z8gf\" (UID: \"ad8329c6-d511-446f-b617-99778d12b878\") " pod="openshift-image-registry/node-ca-8z8gf" Mar 11 00:56:12 crc kubenswrapper[4744]: I0311 00:56:12.085836 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz" (OuterVolumeSpecName: "kube-api-access-bf2bz") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "kube-api-access-bf2bz". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 00:56:12 crc kubenswrapper[4744]: I0311 00:56:12.085912 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities" (OuterVolumeSpecName: "utilities") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 11 00:56:12 crc kubenswrapper[4744]: I0311 00:56:12.085922 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/6ff04e11-e747-44c5-b049-371a5d422157-run-systemd\") pod \"ovnkube-node-78fcc\" (UID: \"6ff04e11-e747-44c5-b049-371a5d422157\") " pod="openshift-ovn-kubernetes/ovnkube-node-78fcc" Mar 11 00:56:12 crc kubenswrapper[4744]: I0311 00:56:12.085984 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6ff04e11-e747-44c5-b049-371a5d422157-env-overrides\") pod \"ovnkube-node-78fcc\" (UID: \"6ff04e11-e747-44c5-b049-371a5d422157\") " pod="openshift-ovn-kubernetes/ovnkube-node-78fcc" Mar 11 00:56:12 crc kubenswrapper[4744]: I0311 00:56:12.086052 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fqrjx\" (UniqueName: \"kubernetes.io/projected/ac70f681-9bcf-4f8c-a175-ed7d4e9da471-kube-api-access-fqrjx\") pod \"multus-additional-cni-plugins-sj4cl\" (UID: \"ac70f681-9bcf-4f8c-a175-ed7d4e9da471\") " pod="openshift-multus/multus-additional-cni-plugins-sj4cl" Mar 11 00:56:12 crc kubenswrapper[4744]: I0311 00:56:12.086226 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7" (OuterVolumeSpecName: "kube-api-access-kfwg7") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "kube-api-access-kfwg7". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 00:56:12 crc kubenswrapper[4744]: I0311 00:56:12.086768 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 00:56:12 crc kubenswrapper[4744]: I0311 00:56:12.088153 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz" (OuterVolumeSpecName: "kube-api-access-6g6sz") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "kube-api-access-6g6sz". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 00:56:12 crc kubenswrapper[4744]: I0311 00:56:12.086949 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert" (OuterVolumeSpecName: "cert") pod "20b0d48f-5fd6-431c-a545-e3c800c7b866" (UID: "20b0d48f-5fd6-431c-a545-e3c800c7b866"). InnerVolumeSpecName "cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 00:56:12 crc kubenswrapper[4744]: I0311 00:56:12.087030 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh" (OuterVolumeSpecName: "kube-api-access-x7zkh") pod "6731426b-95fe-49ff-bb5f-40441049fde2" (UID: "6731426b-95fe-49ff-bb5f-40441049fde2"). InnerVolumeSpecName "kube-api-access-x7zkh". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 00:56:12 crc kubenswrapper[4744]: I0311 00:56:12.087564 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls" (OuterVolumeSpecName: "image-registry-operator-tls") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "image-registry-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 00:56:12 crc kubenswrapper[4744]: I0311 00:56:12.087750 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx" (OuterVolumeSpecName: "kube-api-access-d6qdx") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "kube-api-access-d6qdx". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 00:56:12 crc kubenswrapper[4744]: I0311 00:56:12.087929 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics" (OuterVolumeSpecName: "marketplace-operator-metrics") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "marketplace-operator-metrics". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 00:56:12 crc kubenswrapper[4744]: I0311 00:56:12.088086 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 00:56:12 crc kubenswrapper[4744]: I0311 00:56:12.088431 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 00:56:12 crc kubenswrapper[4744]: I0311 00:56:12.088436 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5" (OuterVolumeSpecName: "kube-api-access-zgdk5") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "kube-api-access-zgdk5". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 00:56:12 crc kubenswrapper[4744]: I0311 00:56:12.088445 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8" (OuterVolumeSpecName: "kube-api-access-6ccd8") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "kube-api-access-6ccd8". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 00:56:12 crc kubenswrapper[4744]: I0311 00:56:12.088460 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 00:56:12 crc kubenswrapper[4744]: I0311 00:56:12.088460 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85" (OuterVolumeSpecName: "kube-api-access-x2m85") pod "cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" (UID: "cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d"). InnerVolumeSpecName "kube-api-access-x2m85". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 00:56:12 crc kubenswrapper[4744]: I0311 00:56:12.089037 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 11 00:56:12 crc kubenswrapper[4744]: I0311 00:56:12.089273 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 00:56:12 crc kubenswrapper[4744]: I0311 00:56:12.088960 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/e16bf0f3-533b-4114-89c6-195a85273e98-host-var-lib-cni-multus\") pod \"multus-xlclh\" (UID: \"e16bf0f3-533b-4114-89c6-195a85273e98\") " pod="openshift-multus/multus-xlclh" Mar 11 00:56:12 crc kubenswrapper[4744]: I0311 00:56:12.089545 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-identity-cm\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 11 00:56:12 crc kubenswrapper[4744]: I0311 00:56:12.090286 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr" (OuterVolumeSpecName: "kube-api-access-249nr") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "kube-api-access-249nr". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 00:56:12 crc kubenswrapper[4744]: I0311 00:56:12.090411 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/e16bf0f3-533b-4114-89c6-195a85273e98-host-run-multus-certs\") pod \"multus-xlclh\" (UID: \"e16bf0f3-533b-4114-89c6-195a85273e98\") " pod="openshift-multus/multus-xlclh" Mar 11 00:56:12 crc kubenswrapper[4744]: I0311 00:56:12.090463 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/6ff04e11-e747-44c5-b049-371a5d422157-host-kubelet\") pod \"ovnkube-node-78fcc\" (UID: \"6ff04e11-e747-44c5-b049-371a5d422157\") " pod="openshift-ovn-kubernetes/ovnkube-node-78fcc" Mar 11 00:56:12 crc kubenswrapper[4744]: I0311 00:56:12.090505 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/e16bf0f3-533b-4114-89c6-195a85273e98-multus-daemon-config\") pod \"multus-xlclh\" (UID: \"e16bf0f3-533b-4114-89c6-195a85273e98\") " pod="openshift-multus/multus-xlclh" Mar 11 00:56:12 crc kubenswrapper[4744]: I0311 00:56:12.090568 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/ac70f681-9bcf-4f8c-a175-ed7d4e9da471-tuning-conf-dir\") pod \"multus-additional-cni-plugins-sj4cl\" (UID: \"ac70f681-9bcf-4f8c-a175-ed7d4e9da471\") " pod="openshift-multus/multus-additional-cni-plugins-sj4cl" Mar 11 00:56:12 crc kubenswrapper[4744]: I0311 00:56:12.090610 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/92f4c9df-1087-4820-8e07-1120f02df454-ovnkube-config\") pod 
\"ovnkube-control-plane-749d76644c-rplbw\" (UID: \"92f4c9df-1087-4820-8e07-1120f02df454\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-rplbw" Mar 11 00:56:12 crc kubenswrapper[4744]: I0311 00:56:12.090664 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rczfb\" (UniqueName: \"kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 11 00:56:12 crc kubenswrapper[4744]: I0311 00:56:12.090788 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/e146399d-0685-4ea8-96e4-c76c9478a23a-hosts-file\") pod \"node-resolver-6ghqv\" (UID: \"e146399d-0685-4ea8-96e4-c76c9478a23a\") " pod="openshift-dns/node-resolver-6ghqv" Mar 11 00:56:12 crc kubenswrapper[4744]: I0311 00:56:12.090872 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/6ff04e11-e747-44c5-b049-371a5d422157-systemd-units\") pod \"ovnkube-node-78fcc\" (UID: \"6ff04e11-e747-44c5-b049-371a5d422157\") " pod="openshift-ovn-kubernetes/ovnkube-node-78fcc" Mar 11 00:56:12 crc kubenswrapper[4744]: I0311 00:56:12.090909 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/6ff04e11-e747-44c5-b049-371a5d422157-log-socket\") pod \"ovnkube-node-78fcc\" (UID: \"6ff04e11-e747-44c5-b049-371a5d422157\") " pod="openshift-ovn-kubernetes/ovnkube-node-78fcc" Mar 11 00:56:12 crc kubenswrapper[4744]: I0311 00:56:12.090946 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: 
\"kubernetes.io/configmap/6ff04e11-e747-44c5-b049-371a5d422157-ovnkube-config\") pod \"ovnkube-node-78fcc\" (UID: \"6ff04e11-e747-44c5-b049-371a5d422157\") " pod="openshift-ovn-kubernetes/ovnkube-node-78fcc" Mar 11 00:56:12 crc kubenswrapper[4744]: I0311 00:56:12.090984 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/ac70f681-9bcf-4f8c-a175-ed7d4e9da471-system-cni-dir\") pod \"multus-additional-cni-plugins-sj4cl\" (UID: \"ac70f681-9bcf-4f8c-a175-ed7d4e9da471\") " pod="openshift-multus/multus-additional-cni-plugins-sj4cl" Mar 11 00:56:12 crc kubenswrapper[4744]: I0311 00:56:12.091020 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/ac70f681-9bcf-4f8c-a175-ed7d4e9da471-os-release\") pod \"multus-additional-cni-plugins-sj4cl\" (UID: \"ac70f681-9bcf-4f8c-a175-ed7d4e9da471\") " pod="openshift-multus/multus-additional-cni-plugins-sj4cl" Mar 11 00:56:12 crc kubenswrapper[4744]: I0311 00:56:12.091064 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 11 00:56:12 crc kubenswrapper[4744]: I0311 00:56:12.091104 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/ad8329c6-d511-446f-b617-99778d12b878-serviceca\") pod \"node-ca-8z8gf\" (UID: \"ad8329c6-d511-446f-b617-99778d12b878\") " pod="openshift-image-registry/node-ca-8z8gf" Mar 11 00:56:12 crc kubenswrapper[4744]: I0311 00:56:12.091138 4744 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/6ff04e11-e747-44c5-b049-371a5d422157-host-cni-bin\") pod \"ovnkube-node-78fcc\" (UID: \"6ff04e11-e747-44c5-b049-371a5d422157\") " pod="openshift-ovn-kubernetes/ovnkube-node-78fcc" Mar 11 00:56:12 crc kubenswrapper[4744]: I0311 00:56:12.091178 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rdwmf\" (UniqueName: \"kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 11 00:56:12 crc kubenswrapper[4744]: I0311 00:56:12.091215 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/e16bf0f3-533b-4114-89c6-195a85273e98-os-release\") pod \"multus-xlclh\" (UID: \"e16bf0f3-533b-4114-89c6-195a85273e98\") " pod="openshift-multus/multus-xlclh" Mar 11 00:56:12 crc kubenswrapper[4744]: I0311 00:56:12.091247 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/e16bf0f3-533b-4114-89c6-195a85273e98-host-run-k8s-cni-cncf-io\") pod \"multus-xlclh\" (UID: \"e16bf0f3-533b-4114-89c6-195a85273e98\") " pod="openshift-multus/multus-xlclh" Mar 11 00:56:12 crc kubenswrapper[4744]: I0311 00:56:12.091281 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/e16bf0f3-533b-4114-89c6-195a85273e98-host-run-netns\") pod \"multus-xlclh\" (UID: \"e16bf0f3-533b-4114-89c6-195a85273e98\") " pod="openshift-multus/multus-xlclh" Mar 11 00:56:12 crc kubenswrapper[4744]: I0311 00:56:12.091312 4744 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/e16bf0f3-533b-4114-89c6-195a85273e98-multus-conf-dir\") pod \"multus-xlclh\" (UID: \"e16bf0f3-533b-4114-89c6-195a85273e98\") " pod="openshift-multus/multus-xlclh" Mar 11 00:56:12 crc kubenswrapper[4744]: I0311 00:56:12.091352 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-th54b\" (UniqueName: \"kubernetes.io/projected/7aeb1578-fe93-4bec-8f43-17d0923fa5c0-kube-api-access-th54b\") pod \"network-metrics-daemon-tdnf7\" (UID: \"7aeb1578-fe93-4bec-8f43-17d0923fa5c0\") " pod="openshift-multus/network-metrics-daemon-tdnf7" Mar 11 00:56:12 crc kubenswrapper[4744]: I0311 00:56:12.091394 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/6ff04e11-e747-44c5-b049-371a5d422157-host-slash\") pod \"ovnkube-node-78fcc\" (UID: \"6ff04e11-e747-44c5-b049-371a5d422157\") " pod="openshift-ovn-kubernetes/ovnkube-node-78fcc" Mar 11 00:56:12 crc kubenswrapper[4744]: I0311 00:56:12.091425 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/ac70f681-9bcf-4f8c-a175-ed7d4e9da471-cnibin\") pod \"multus-additional-cni-plugins-sj4cl\" (UID: \"ac70f681-9bcf-4f8c-a175-ed7d4e9da471\") " pod="openshift-multus/multus-additional-cni-plugins-sj4cl" Mar 11 00:56:12 crc kubenswrapper[4744]: I0311 00:56:12.091642 4744 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qs4fp\" (UniqueName: \"kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp\") on node \"crc\" DevicePath \"\"" Mar 11 00:56:12 crc kubenswrapper[4744]: I0311 00:56:12.091669 4744 reconciler_common.go:293] "Volume detached for volume \"srv-cert\" (UniqueName: 
\"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert\") on node \"crc\" DevicePath \"\"" Mar 11 00:56:12 crc kubenswrapper[4744]: I0311 00:56:12.091692 4744 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 11 00:56:12 crc kubenswrapper[4744]: I0311 00:56:12.091713 4744 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 11 00:56:12 crc kubenswrapper[4744]: I0311 00:56:12.091791 4744 reconciler_common.go:293] "Volume detached for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs\") on node \"crc\" DevicePath \"\"" Mar 11 00:56:12 crc kubenswrapper[4744]: E0311 00:56:12.092628 4744 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Mar 11 00:56:12 crc kubenswrapper[4744]: I0311 00:56:12.092658 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh" (OuterVolumeSpecName: "kube-api-access-x4zgh") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "kube-api-access-x4zgh". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 00:56:12 crc kubenswrapper[4744]: E0311 00:56:12.092744 4744 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-11 00:56:12.592717639 +0000 UTC m=+129.396935254 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Mar 11 00:56:12 crc kubenswrapper[4744]: I0311 00:56:12.092780 4744 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mg5zb\" (UniqueName: \"kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb\") on node \"crc\" DevicePath \"\"" Mar 11 00:56:12 crc kubenswrapper[4744]: I0311 00:56:12.092825 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config" (OuterVolumeSpecName: "config") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 11 00:56:12 crc kubenswrapper[4744]: I0311 00:56:12.092951 4744 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config\") on node \"crc\" DevicePath \"\"" Mar 11 00:56:12 crc kubenswrapper[4744]: I0311 00:56:12.092972 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 00:56:12 crc kubenswrapper[4744]: I0311 00:56:12.093213 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 00:56:12 crc kubenswrapper[4744]: I0311 00:56:12.093249 4744 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gf66m\" (UniqueName: \"kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m\") on node \"crc\" DevicePath \"\"" Mar 11 00:56:12 crc kubenswrapper[4744]: I0311 00:56:12.093292 4744 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token\") on node \"crc\" DevicePath \"\"" Mar 11 00:56:12 crc kubenswrapper[4744]: I0311 00:56:12.093505 4744 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pj782\" (UniqueName: \"kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782\") on node \"crc\" DevicePath \"\"" Mar 11 00:56:12 crc kubenswrapper[4744]: I0311 00:56:12.093566 4744 reconciler_common.go:293] "Volume detached for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca\") on node \"crc\" DevicePath \"\"" Mar 11 00:56:12 crc kubenswrapper[4744]: I0311 00:56:12.093586 4744 reconciler_common.go:293] "Volume detached for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls\") on node \"crc\" DevicePath \"\"" Mar 11 00:56:12 crc kubenswrapper[4744]: I0311 00:56:12.093605 4744 reconciler_common.go:293] "Volume 
detached for volume \"kube-api-access-6g6sz\" (UniqueName: \"kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz\") on node \"crc\" DevicePath \"\"" Mar 11 00:56:12 crc kubenswrapper[4744]: I0311 00:56:12.093606 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 11 00:56:12 crc kubenswrapper[4744]: I0311 00:56:12.093621 4744 reconciler_common.go:293] "Volume detached for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs\") on node \"crc\" DevicePath \"\"" Mar 11 00:56:12 crc kubenswrapper[4744]: I0311 00:56:12.093685 4744 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access\") on node \"crc\" DevicePath \"\"" Mar 11 00:56:12 crc kubenswrapper[4744]: I0311 00:56:12.093707 4744 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 11 00:56:12 crc kubenswrapper[4744]: I0311 00:56:12.093728 4744 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6ccd8\" (UniqueName: \"kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8\") on node \"crc\" DevicePath \"\"" Mar 11 00:56:12 crc kubenswrapper[4744]: I0311 00:56:12.093746 4744 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x2m85\" (UniqueName: \"kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85\") on node \"crc\" DevicePath \"\"" Mar 11 00:56:12 crc 
kubenswrapper[4744]: I0311 00:56:12.093764 4744 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bf2bz\" (UniqueName: \"kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz\") on node \"crc\" DevicePath \"\"" Mar 11 00:56:12 crc kubenswrapper[4744]: I0311 00:56:12.093780 4744 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-249nr\" (UniqueName: \"kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr\") on node \"crc\" DevicePath \"\"" Mar 11 00:56:12 crc kubenswrapper[4744]: I0311 00:56:12.093796 4744 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities\") on node \"crc\" DevicePath \"\"" Mar 11 00:56:12 crc kubenswrapper[4744]: I0311 00:56:12.093812 4744 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls\") on node \"crc\" DevicePath \"\"" Mar 11 00:56:12 crc kubenswrapper[4744]: I0311 00:56:12.093827 4744 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls\") on node \"crc\" DevicePath \"\"" Mar 11 00:56:12 crc kubenswrapper[4744]: I0311 00:56:12.093845 4744 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca\") on node \"crc\" DevicePath \"\"" Mar 11 00:56:12 crc kubenswrapper[4744]: I0311 00:56:12.093868 4744 reconciler_common.go:293] "Volume detached for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics\") on node \"crc\" DevicePath \"\"" Mar 11 00:56:12 crc kubenswrapper[4744]: I0311 00:56:12.093884 4744 reconciler_common.go:293] "Volume detached for volume 
\"serving-cert\" (UniqueName: \"kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 11 00:56:12 crc kubenswrapper[4744]: I0311 00:56:12.093901 4744 reconciler_common.go:293] "Volume detached for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls\") on node \"crc\" DevicePath \"\"" Mar 11 00:56:12 crc kubenswrapper[4744]: I0311 00:56:12.093918 4744 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kfwg7\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7\") on node \"crc\" DevicePath \"\"" Mar 11 00:56:12 crc kubenswrapper[4744]: I0311 00:56:12.093933 4744 reconciler_common.go:293] "Volume detached for volume \"cert\" (UniqueName: \"kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert\") on node \"crc\" DevicePath \"\"" Mar 11 00:56:12 crc kubenswrapper[4744]: I0311 00:56:12.093950 4744 reconciler_common.go:293] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets\") on node \"crc\" DevicePath \"\"" Mar 11 00:56:12 crc kubenswrapper[4744]: I0311 00:56:12.093970 4744 reconciler_common.go:293] "Volume detached for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle\") on node \"crc\" DevicePath \"\"" Mar 11 00:56:12 crc kubenswrapper[4744]: I0311 00:56:12.093986 4744 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client\") on node \"crc\" DevicePath \"\"" Mar 11 00:56:12 crc kubenswrapper[4744]: I0311 00:56:12.094001 4744 reconciler_common.go:293] "Volume detached for volume \"service-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 11 00:56:12 crc kubenswrapper[4744]: I0311 00:56:12.094016 4744 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume\") on node \"crc\" DevicePath \"\"" Mar 11 00:56:12 crc kubenswrapper[4744]: I0311 00:56:12.094031 4744 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca\") on node \"crc\" DevicePath \"\"" Mar 11 00:56:12 crc kubenswrapper[4744]: I0311 00:56:12.094047 4744 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config\") on node \"crc\" DevicePath \"\"" Mar 11 00:56:12 crc kubenswrapper[4744]: I0311 00:56:12.094063 4744 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities\") on node \"crc\" DevicePath \"\"" Mar 11 00:56:12 crc kubenswrapper[4744]: I0311 00:56:12.094078 4744 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca\") on node \"crc\" DevicePath \"\"" Mar 11 00:56:12 crc kubenswrapper[4744]: I0311 00:56:12.094092 4744 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access\") on node \"crc\" DevicePath \"\"" Mar 11 00:56:12 crc kubenswrapper[4744]: I0311 00:56:12.094107 4744 reconciler_common.go:293] "Volume detached for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls\") on node \"crc\" DevicePath \"\"" Mar 11 00:56:12 crc kubenswrapper[4744]: 
I0311 00:56:12.094122 4744 reconciler_common.go:293] "Volume detached for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert\") on node \"crc\" DevicePath \"\"" Mar 11 00:56:12 crc kubenswrapper[4744]: I0311 00:56:12.094137 4744 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-htfz6\" (UniqueName: \"kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6\") on node \"crc\" DevicePath \"\"" Mar 11 00:56:12 crc kubenswrapper[4744]: I0311 00:56:12.094153 4744 reconciler_common.go:293] "Volume detached for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert\") on node \"crc\" DevicePath \"\"" Mar 11 00:56:12 crc kubenswrapper[4744]: I0311 00:56:12.094166 4744 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 11 00:56:12 crc kubenswrapper[4744]: I0311 00:56:12.094180 4744 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access\") on node \"crc\" DevicePath \"\"" Mar 11 00:56:12 crc kubenswrapper[4744]: I0311 00:56:12.094196 4744 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config\") on node \"crc\" DevicePath \"\"" Mar 11 00:56:12 crc kubenswrapper[4744]: I0311 00:56:12.094211 4744 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls\") on node \"crc\" DevicePath \"\"" Mar 11 00:56:12 crc kubenswrapper[4744]: I0311 00:56:12.094293 4744 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8tdtz\" (UniqueName: 
\"kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz\") on node \"crc\" DevicePath \"\"" Mar 11 00:56:12 crc kubenswrapper[4744]: I0311 00:56:12.094309 4744 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cfbct\" (UniqueName: \"kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct\") on node \"crc\" DevicePath \"\"" Mar 11 00:56:12 crc kubenswrapper[4744]: I0311 00:56:12.094324 4744 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sb6h7\" (UniqueName: \"kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7\") on node \"crc\" DevicePath \"\"" Mar 11 00:56:12 crc kubenswrapper[4744]: I0311 00:56:12.094341 4744 reconciler_common.go:293] "Volume detached for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates\") on node \"crc\" DevicePath \"\"" Mar 11 00:56:12 crc kubenswrapper[4744]: I0311 00:56:12.094361 4744 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nzwt7\" (UniqueName: \"kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7\") on node \"crc\" DevicePath \"\"" Mar 11 00:56:12 crc kubenswrapper[4744]: I0311 00:56:12.094376 4744 reconciler_common.go:293] "Volume detached for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca\") on node \"crc\" DevicePath \"\"" Mar 11 00:56:12 crc kubenswrapper[4744]: I0311 00:56:12.094391 4744 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 11 00:56:12 crc kubenswrapper[4744]: I0311 00:56:12.094405 4744 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s4n52\" (UniqueName: 
\"kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52\") on node \"crc\" DevicePath \"\"" Mar 11 00:56:12 crc kubenswrapper[4744]: I0311 00:56:12.094420 4744 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca\") on node \"crc\" DevicePath \"\"" Mar 11 00:56:12 crc kubenswrapper[4744]: I0311 00:56:12.094434 4744 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 11 00:56:12 crc kubenswrapper[4744]: I0311 00:56:12.094448 4744 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 11 00:56:12 crc kubenswrapper[4744]: I0311 00:56:12.094467 4744 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error\") on node \"crc\" DevicePath \"\"" Mar 11 00:56:12 crc kubenswrapper[4744]: I0311 00:56:12.094486 4744 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 11 00:56:12 crc kubenswrapper[4744]: I0311 00:56:12.094501 4744 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities\") on node \"crc\" DevicePath \"\"" Mar 11 00:56:12 crc kubenswrapper[4744]: I0311 00:56:12.094534 4744 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fcqwp\" (UniqueName: \"kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp\") on node \"crc\" DevicePath 
\"\"" Mar 11 00:56:12 crc kubenswrapper[4744]: I0311 00:56:12.094551 4744 reconciler_common.go:293] "Volume detached for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist\") on node \"crc\" DevicePath \"\"" Mar 11 00:56:12 crc kubenswrapper[4744]: I0311 00:56:12.094568 4744 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zkvpv\" (UniqueName: \"kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv\") on node \"crc\" DevicePath \"\"" Mar 11 00:56:12 crc kubenswrapper[4744]: I0311 00:56:12.094584 4744 reconciler_common.go:293] "Volume detached for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert\") on node \"crc\" DevicePath \"\"" Mar 11 00:56:12 crc kubenswrapper[4744]: I0311 00:56:12.094598 4744 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2d4wz\" (UniqueName: \"kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz\") on node \"crc\" DevicePath \"\"" Mar 11 00:56:12 crc kubenswrapper[4744]: I0311 00:56:12.094655 4744 reconciler_common.go:293] "Volume detached for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Mar 11 00:56:12 crc kubenswrapper[4744]: I0311 00:56:12.094670 4744 reconciler_common.go:293] "Volume detached for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Mar 11 00:56:12 crc kubenswrapper[4744]: I0311 00:56:12.094687 4744 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config\") on node \"crc\" DevicePath \"\"" Mar 11 00:56:12 crc kubenswrapper[4744]: 
I0311 00:56:12.094701 4744 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2w9zh\" (UniqueName: \"kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh\") on node \"crc\" DevicePath \"\"" Mar 11 00:56:12 crc kubenswrapper[4744]: I0311 00:56:12.094716 4744 reconciler_common.go:293] "Volume detached for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate\") on node \"crc\" DevicePath \"\"" Mar 11 00:56:12 crc kubenswrapper[4744]: I0311 00:56:12.094731 4744 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 11 00:56:12 crc kubenswrapper[4744]: I0311 00:56:12.094745 4744 reconciler_common.go:293] "Volume detached for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls\") on node \"crc\" DevicePath \"\"" Mar 11 00:56:12 crc kubenswrapper[4744]: I0311 00:56:12.094763 4744 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7c4vf\" (UniqueName: \"kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf\") on node \"crc\" DevicePath \"\"" Mar 11 00:56:12 crc kubenswrapper[4744]: I0311 00:56:12.094779 4744 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls\") on node \"crc\" DevicePath \"\"" Mar 11 00:56:12 crc kubenswrapper[4744]: I0311 00:56:12.094849 4744 reconciler_common.go:293] "Volume detached for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls\") on node \"crc\" DevicePath \"\"" Mar 11 00:56:12 crc kubenswrapper[4744]: I0311 00:56:12.094868 4744 reconciler_common.go:293] "Volume detached for volume 
\"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca\") on node \"crc\" DevicePath \"\"" Mar 11 00:56:12 crc kubenswrapper[4744]: I0311 00:56:12.094884 4744 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca\") on node \"crc\" DevicePath \"\"" Mar 11 00:56:12 crc kubenswrapper[4744]: I0311 00:56:12.094900 4744 reconciler_common.go:293] "Volume detached for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 11 00:56:12 crc kubenswrapper[4744]: I0311 00:56:12.094917 4744 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xcphl\" (UniqueName: \"kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl\") on node \"crc\" DevicePath \"\"" Mar 11 00:56:12 crc kubenswrapper[4744]: I0311 00:56:12.094935 4744 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pcxfs\" (UniqueName: \"kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs\") on node \"crc\" DevicePath \"\"" Mar 11 00:56:12 crc kubenswrapper[4744]: I0311 00:56:12.094949 4744 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config\") on node \"crc\" DevicePath \"\"" Mar 11 00:56:12 crc kubenswrapper[4744]: I0311 00:56:12.094963 4744 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides\") on node \"crc\" DevicePath \"\"" Mar 11 00:56:12 crc kubenswrapper[4744]: I0311 00:56:12.094977 4744 reconciler_common.go:293] "Volume detached for volume \"stats-auth\" (UniqueName: 
\"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth\") on node \"crc\" DevicePath \"\"" Mar 11 00:56:12 crc kubenswrapper[4744]: I0311 00:56:12.094991 4744 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 11 00:56:12 crc kubenswrapper[4744]: I0311 00:56:12.095007 4744 reconciler_common.go:293] "Volume detached for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib\") on node \"crc\" DevicePath \"\"" Mar 11 00:56:12 crc kubenswrapper[4744]: I0311 00:56:12.095022 4744 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca\") on node \"crc\" DevicePath \"\"" Mar 11 00:56:12 crc kubenswrapper[4744]: I0311 00:56:12.095037 4744 reconciler_common.go:293] "Volume detached for volume \"certs\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs\") on node \"crc\" DevicePath \"\"" Mar 11 00:56:12 crc kubenswrapper[4744]: I0311 00:56:12.095052 4744 reconciler_common.go:293] "Volume detached for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert\") on node \"crc\" DevicePath \"\"" Mar 11 00:56:12 crc kubenswrapper[4744]: I0311 00:56:12.095067 4744 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls\") on node \"crc\" DevicePath \"\"" Mar 11 00:56:12 crc kubenswrapper[4744]: I0311 00:56:12.095089 4744 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 11 00:56:12 crc 
kubenswrapper[4744]: I0311 00:56:12.095103 4744 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca\") on node \"crc\" DevicePath \"\"" Mar 11 00:56:12 crc kubenswrapper[4744]: I0311 00:56:12.095118 4744 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w9rds\" (UniqueName: \"kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds\") on node \"crc\" DevicePath \"\"" Mar 11 00:56:12 crc kubenswrapper[4744]: I0311 00:56:12.095133 4744 reconciler_common.go:293] "Volume detached for volume \"images\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images\") on node \"crc\" DevicePath \"\"" Mar 11 00:56:12 crc kubenswrapper[4744]: I0311 00:56:12.095148 4744 reconciler_common.go:293] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config\") on node \"crc\" DevicePath \"\"" Mar 11 00:56:12 crc kubenswrapper[4744]: I0311 00:56:12.095162 4744 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rnphk\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk\") on node \"crc\" DevicePath \"\"" Mar 11 00:56:12 crc kubenswrapper[4744]: I0311 00:56:12.095178 4744 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 11 00:56:12 crc kubenswrapper[4744]: I0311 00:56:12.095192 4744 reconciler_common.go:293] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls\") on node \"crc\" DevicePath \"\"" Mar 11 00:56:12 crc kubenswrapper[4744]: I0311 00:56:12.095208 4744 reconciler_common.go:293] "Volume detached for volume 
\"kube-api-access-4d4hj\" (UniqueName: \"kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj\") on node \"crc\" DevicePath \"\"" Mar 11 00:56:12 crc kubenswrapper[4744]: I0311 00:56:12.095223 4744 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9xfj7\" (UniqueName: \"kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7\") on node \"crc\" DevicePath \"\"" Mar 11 00:56:12 crc kubenswrapper[4744]: I0311 00:56:12.095238 4744 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x7zkh\" (UniqueName: \"kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh\") on node \"crc\" DevicePath \"\"" Mar 11 00:56:12 crc kubenswrapper[4744]: I0311 00:56:12.095256 4744 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d6qdx\" (UniqueName: \"kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx\") on node \"crc\" DevicePath \"\"" Mar 11 00:56:12 crc kubenswrapper[4744]: I0311 00:56:12.095271 4744 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zgdk5\" (UniqueName: \"kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5\") on node \"crc\" DevicePath \"\"" Mar 11 00:56:12 crc kubenswrapper[4744]: I0311 00:56:12.095285 4744 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jhbk2\" (UniqueName: \"kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2\") on node \"crc\" DevicePath \"\"" Mar 11 00:56:12 crc kubenswrapper[4744]: I0311 00:56:12.095300 4744 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client\") on node \"crc\" DevicePath \"\"" Mar 11 00:56:12 crc kubenswrapper[4744]: I0311 00:56:12.095315 4744 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config\") on node \"crc\" DevicePath \"\"" Mar 11 00:56:12 crc kubenswrapper[4744]: I0311 00:56:12.095056 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:11Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 11 00:56:12 crc kubenswrapper[4744]: I0311 00:56:12.094763 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config" (OuterVolumeSpecName: "multus-daemon-config") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "multus-daemon-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 11 00:56:12 crc kubenswrapper[4744]: I0311 00:56:12.095310 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy" (OuterVolumeSpecName: "cni-binary-copy") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "cni-binary-copy". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 11 00:56:12 crc kubenswrapper[4744]: I0311 00:56:12.095837 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc" (OuterVolumeSpecName: "kube-api-access-vt5rc") pod "44663579-783b-4372-86d6-acf235a62d72" (UID: "44663579-783b-4372-86d6-acf235a62d72"). InnerVolumeSpecName "kube-api-access-vt5rc". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 00:56:12 crc kubenswrapper[4744]: I0311 00:56:12.096077 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf" (OuterVolumeSpecName: "kube-api-access-v47cf") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "kube-api-access-v47cf". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 00:56:12 crc kubenswrapper[4744]: I0311 00:56:12.098293 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config" (OuterVolumeSpecName: "config") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 11 00:56:12 crc kubenswrapper[4744]: I0311 00:56:12.099050 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 11 00:56:12 crc kubenswrapper[4744]: I0311 00:56:12.099088 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token" (OuterVolumeSpecName: "node-bootstrap-token") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "node-bootstrap-token". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 00:56:12 crc kubenswrapper[4744]: I0311 00:56:12.099301 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 11 00:56:12 crc kubenswrapper[4744]: I0311 00:56:12.100503 4744 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory" Mar 11 00:56:12 crc kubenswrapper[4744]: I0311 00:56:12.101048 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config" (OuterVolumeSpecName: "config") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 11 00:56:12 crc kubenswrapper[4744]: I0311 00:56:12.101672 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 11 00:56:12 crc kubenswrapper[4744]: I0311 00:56:12.102534 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 00:56:12 crc kubenswrapper[4744]: I0311 00:56:12.102558 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities" (OuterVolumeSpecName: "utilities") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 11 00:56:12 crc kubenswrapper[4744]: I0311 00:56:12.102718 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config" (OuterVolumeSpecName: "console-config") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 11 00:56:12 crc kubenswrapper[4744]: E0311 00:56:12.103019 4744 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 11 00:56:12 crc kubenswrapper[4744]: E0311 00:56:12.103109 4744 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 11 00:56:12 crc kubenswrapper[4744]: E0311 00:56:12.103205 4744 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 11 00:56:12 crc kubenswrapper[4744]: E0311 00:56:12.103318 4744 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-03-11 00:56:12.60329099 +0000 UTC m=+129.407508615 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 11 00:56:12 crc kubenswrapper[4744]: I0311 00:56:12.103311 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5" (OuterVolumeSpecName: "kube-api-access-qg5z5") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "kube-api-access-qg5z5". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 00:56:12 crc kubenswrapper[4744]: I0311 00:56:12.103346 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 11 00:56:12 crc kubenswrapper[4744]: I0311 00:56:12.103914 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca" (OuterVolumeSpecName: "service-ca") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "service-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 11 00:56:12 crc kubenswrapper[4744]: I0311 00:56:12.104068 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 00:56:12 crc kubenswrapper[4744]: I0311 00:56:12.104389 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert" (OuterVolumeSpecName: "apiservice-cert") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "apiservice-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 00:56:12 crc kubenswrapper[4744]: I0311 00:56:12.104749 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs" (OuterVolumeSpecName: "metrics-certs") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "metrics-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 00:56:12 crc kubenswrapper[4744]: I0311 00:56:12.105358 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy" (OuterVolumeSpecName: "cni-binary-copy") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "cni-binary-copy". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 11 00:56:12 crc kubenswrapper[4744]: I0311 00:56:12.106991 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images" (OuterVolumeSpecName: "images") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "images". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 11 00:56:12 crc kubenswrapper[4744]: I0311 00:56:12.107328 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb" (OuterVolumeSpecName: "kube-api-access-279lb") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "kube-api-access-279lb". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 00:56:12 crc kubenswrapper[4744]: I0311 00:56:12.108333 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 11 00:56:12 crc kubenswrapper[4744]: I0311 00:56:12.108495 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config" (OuterVolumeSpecName: "config") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 11 00:56:12 crc kubenswrapper[4744]: I0311 00:56:12.109169 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 00:56:12 crc kubenswrapper[4744]: I0311 00:56:12.109595 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config" (OuterVolumeSpecName: "encryption-config") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "encryption-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 00:56:12 crc kubenswrapper[4744]: I0311 00:56:12.109696 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config" (OuterVolumeSpecName: "config") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 11 00:56:12 crc kubenswrapper[4744]: I0311 00:56:12.117368 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v" (OuterVolumeSpecName: "kube-api-access-pjr6v") pod "49ef4625-1d3a-4a9f-b595-c2433d32326d" (UID: "49ef4625-1d3a-4a9f-b595-c2433d32326d"). InnerVolumeSpecName "kube-api-access-pjr6v". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 00:56:12 crc kubenswrapper[4744]: I0311 00:56:12.117409 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 11 00:56:12 crc kubenswrapper[4744]: I0311 00:56:12.117649 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert" (OuterVolumeSpecName: "profile-collector-cert") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "profile-collector-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 00:56:12 crc kubenswrapper[4744]: I0311 00:56:12.118257 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle" (OuterVolumeSpecName: "service-ca-bundle") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "service-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 11 00:56:12 crc kubenswrapper[4744]: I0311 00:56:12.118547 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg" (OuterVolumeSpecName: "kube-api-access-dbsvg") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "kube-api-access-dbsvg". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 00:56:12 crc kubenswrapper[4744]: I0311 00:56:12.119337 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 11 00:56:12 crc kubenswrapper[4744]: I0311 00:56:12.123424 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd" (OuterVolumeSpecName: "kube-api-access-mnrrd") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "kube-api-access-mnrrd". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 00:56:12 crc kubenswrapper[4744]: I0311 00:56:12.123471 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "etcd-client". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 00:56:12 crc kubenswrapper[4744]: I0311 00:56:12.124495 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 11 00:56:12 crc kubenswrapper[4744]: I0311 00:56:12.128073 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config" (OuterVolumeSpecName: "auth-proxy-config") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 11 00:56:12 crc kubenswrapper[4744]: I0311 00:56:12.129470 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8" (OuterVolumeSpecName: "kube-api-access-wxkg8") pod "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" (UID: "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59"). InnerVolumeSpecName "kube-api-access-wxkg8". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 00:56:12 crc kubenswrapper[4744]: I0311 00:56:12.129532 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "metrics-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 00:56:12 crc kubenswrapper[4744]: I0311 00:56:12.129661 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j" (OuterVolumeSpecName: "kube-api-access-w7l8j") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "kube-api-access-w7l8j". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 00:56:12 crc kubenswrapper[4744]: I0311 00:56:12.129951 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config" (OuterVolumeSpecName: "config") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 11 00:56:12 crc kubenswrapper[4744]: I0311 00:56:12.130074 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c" (OuterVolumeSpecName: "kube-api-access-tk88c") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "kube-api-access-tk88c". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 00:56:12 crc kubenswrapper[4744]: I0311 00:56:12.130487 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "trusted-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 11 00:56:12 crc kubenswrapper[4744]: I0311 00:56:12.130507 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "ovnkube-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 11 00:56:12 crc kubenswrapper[4744]: I0311 00:56:12.130872 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh" (OuterVolumeSpecName: "kube-api-access-xcgwh") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "kube-api-access-xcgwh". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 00:56:12 crc kubenswrapper[4744]: I0311 00:56:12.131063 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovnkube-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 11 00:56:12 crc kubenswrapper[4744]: I0311 00:56:12.131115 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "bound-sa-token". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 00:56:12 crc kubenswrapper[4744]: I0311 00:56:12.131329 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 00:56:12 crc kubenswrapper[4744]: I0311 00:56:12.131536 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn" (OuterVolumeSpecName: "kube-api-access-jkwtn") pod "5b88f790-22fa-440e-b583-365168c0b23d" (UID: "5b88f790-22fa-440e-b583-365168c0b23d"). InnerVolumeSpecName "kube-api-access-jkwtn". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 00:56:12 crc kubenswrapper[4744]: I0311 00:56:12.131626 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca" (OuterVolumeSpecName: "marketplace-trusted-ca") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "marketplace-trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 11 00:56:12 crc kubenswrapper[4744]: I0311 00:56:12.131706 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "env-overrides". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 11 00:56:12 crc kubenswrapper[4744]: I0311 00:56:12.131722 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert" (OuterVolumeSpecName: "v4-0-config-system-serving-cert") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 00:56:12 crc kubenswrapper[4744]: E0311 00:56:12.131782 4744 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 11 00:56:12 crc kubenswrapper[4744]: E0311 00:56:12.131806 4744 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 11 00:56:12 crc kubenswrapper[4744]: E0311 00:56:12.131823 4744 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 11 00:56:12 crc kubenswrapper[4744]: E0311 00:56:12.131889 4744 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-03-11 00:56:12.6318705 +0000 UTC m=+129.436088115 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 11 00:56:12 crc kubenswrapper[4744]: I0311 00:56:12.131955 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 00:56:12 crc kubenswrapper[4744]: I0311 00:56:12.132146 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4" (OuterVolumeSpecName: "kube-api-access-w4xd4") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "kube-api-access-w4xd4". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 00:56:12 crc kubenswrapper[4744]: I0311 00:56:12.132237 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88" (OuterVolumeSpecName: "kube-api-access-lzf88") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "kube-api-access-lzf88". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 00:56:12 crc kubenswrapper[4744]: I0311 00:56:12.132316 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 11 00:56:12 crc kubenswrapper[4744]: I0311 00:56:12.132794 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config" (OuterVolumeSpecName: "config") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 11 00:56:12 crc kubenswrapper[4744]: I0311 00:56:12.132801 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login" (OuterVolumeSpecName: "v4-0-config-user-template-login") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-login". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 00:56:12 crc kubenswrapper[4744]: I0311 00:56:12.132847 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca" (OuterVolumeSpecName: "v4-0-config-system-service-ca") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-service-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 11 00:56:12 crc kubenswrapper[4744]: I0311 00:56:12.133007 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 00:56:12 crc kubenswrapper[4744]: I0311 00:56:12.133085 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config" (OuterVolumeSpecName: "mcc-auth-proxy-config") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "mcc-auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 11 00:56:12 crc kubenswrapper[4744]: I0311 00:56:12.133139 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:11Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 11 00:56:12 crc kubenswrapper[4744]: I0311 00:56:12.133446 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn" (OuterVolumeSpecName: "kube-api-access-lz9wn") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "kube-api-access-lz9wn". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 00:56:12 crc kubenswrapper[4744]: I0311 00:56:12.132789 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config" (OuterVolumeSpecName: "encryption-config") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "encryption-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 00:56:12 crc kubenswrapper[4744]: I0311 00:56:12.133632 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca" (OuterVolumeSpecName: "image-import-ca") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "image-import-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 11 00:56:12 crc kubenswrapper[4744]: I0311 00:56:12.134704 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert" (OuterVolumeSpecName: "ovn-node-metrics-cert") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovn-node-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 00:56:12 crc kubenswrapper[4744]: I0311 00:56:12.134851 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection" (OuterVolumeSpecName: "v4-0-config-user-template-provider-selection") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-provider-selection". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 00:56:12 crc kubenswrapper[4744]: I0311 00:56:12.135088 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle" (OuterVolumeSpecName: "v4-0-config-system-trusted-ca-bundle") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 11 00:56:12 crc kubenswrapper[4744]: I0311 00:56:12.135372 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca" (OuterVolumeSpecName: "serviceca") pod "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" (UID: "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59"). InnerVolumeSpecName "serviceca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 11 00:56:12 crc kubenswrapper[4744]: I0311 00:56:12.136738 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv" (OuterVolumeSpecName: "kube-api-access-d4lsv") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "kube-api-access-d4lsv". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 00:56:12 crc kubenswrapper[4744]: I0311 00:56:12.137733 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "audit-policies". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 11 00:56:12 crc kubenswrapper[4744]: I0311 00:56:12.137955 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs" (OuterVolumeSpecName: "webhook-certs") pod "efdd0498-1daa-4136-9a4a-3b948c2293fc" (UID: "efdd0498-1daa-4136-9a4a-3b948c2293fc"). InnerVolumeSpecName "webhook-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 00:56:12 crc kubenswrapper[4744]: I0311 00:56:12.137671 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt" (OuterVolumeSpecName: "kube-api-access-fqsjt") pod "efdd0498-1daa-4136-9a4a-3b948c2293fc" (UID: "efdd0498-1daa-4136-9a4a-3b948c2293fc"). InnerVolumeSpecName "kube-api-access-fqsjt". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 00:56:12 crc kubenswrapper[4744]: I0311 00:56:12.138473 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rczfb\" (UniqueName: \"kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 11 00:56:12 crc kubenswrapper[4744]: I0311 00:56:12.145757 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key" (OuterVolumeSpecName: "signing-key") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "signing-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 00:56:12 crc kubenswrapper[4744]: I0311 00:56:12.145844 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config" (OuterVolumeSpecName: "config") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 11 00:56:12 crc kubenswrapper[4744]: I0311 00:56:12.146039 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca" (OuterVolumeSpecName: "etcd-serving-ca") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "etcd-serving-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 11 00:56:12 crc kubenswrapper[4744]: I0311 00:56:12.146503 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit" (OuterVolumeSpecName: "audit") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "audit". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 11 00:56:12 crc kubenswrapper[4744]: I0311 00:56:12.146552 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config" (OuterVolumeSpecName: "config") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 11 00:56:12 crc kubenswrapper[4744]: I0311 00:56:12.147082 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-xlclh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e16bf0f3-533b-4114-89c6-195a85273e98\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:11Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:11Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/
kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pgqmk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T00:56:11Z\\\"}}\" for pod \"openshift-multus\"/\"multus-xlclh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 11 00:56:12 crc kubenswrapper[4744]: I0311 00:56:12.147731 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2kz5\" (UniqueName: \"kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 11 00:56:12 crc kubenswrapper[4744]: I0311 00:56:12.149097 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session" (OuterVolumeSpecName: "v4-0-config-system-session") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-session". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 00:56:12 crc kubenswrapper[4744]: I0311 00:56:12.149152 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs" (OuterVolumeSpecName: "v4-0-config-system-router-certs") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-router-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 00:56:12 crc kubenswrapper[4744]: I0311 00:56:12.149976 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rdwmf\" (UniqueName: \"kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 11 00:56:12 crc kubenswrapper[4744]: I0311 00:56:12.150982 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template" (OuterVolumeSpecName: "v4-0-config-system-ocp-branding-template") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-ocp-branding-template". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 00:56:12 crc kubenswrapper[4744]: I0311 00:56:12.151032 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data" (OuterVolumeSpecName: "v4-0-config-user-idp-0-file-data") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-idp-0-file-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 00:56:12 crc kubenswrapper[4744]: I0311 00:56:12.151193 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp" (OuterVolumeSpecName: "kube-api-access-ngvvp") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "kube-api-access-ngvvp". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 00:56:12 crc kubenswrapper[4744]: I0311 00:56:12.152241 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig" (OuterVolumeSpecName: "v4-0-config-system-cliconfig") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-cliconfig". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 11 00:56:12 crc kubenswrapper[4744]: I0311 00:56:12.157609 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-6ghqv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e146399d-0685-4ea8-96e4-c76c9478a23a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:11Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:11Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8h2vm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T00:56:11Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-6ghqv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 11 00:56:12 crc kubenswrapper[4744]: I0311 00:56:12.162568 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 11 00:56:12 crc kubenswrapper[4744]: I0311 00:56:12.163210 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 11 00:56:12 crc kubenswrapper[4744]: I0311 00:56:12.172290 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-sj4cl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ac70f681-9bcf-4f8c-a175-ed7d4e9da471\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:11Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:11Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:11Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqrjx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqrjx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\"
:false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqrjx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqrjx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\
\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqrjx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqrjx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/s
erviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqrjx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T00:56:11Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-sj4cl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 11 00:56:12 crc kubenswrapper[4744]: I0311 00:56:12.174238 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "ca-trust-extracted". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 11 00:56:12 crc kubenswrapper[4744]: I0311 00:56:12.182772 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 11 00:56:12 crc kubenswrapper[4744]: I0311 00:56:12.196047 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/6ff04e11-e747-44c5-b049-371a5d422157-etc-openvswitch\") pod \"ovnkube-node-78fcc\" (UID: \"6ff04e11-e747-44c5-b049-371a5d422157\") " pod="openshift-ovn-kubernetes/ovnkube-node-78fcc" Mar 11 00:56:12 crc kubenswrapper[4744]: I0311 00:56:12.196101 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/a15dc7ac-7c34-4135-b6eb-a85122800ce9-rootfs\") pod \"machine-config-daemon-678nx\" (UID: \"a15dc7ac-7c34-4135-b6eb-a85122800ce9\") " pod="openshift-machine-config-operator/machine-config-daemon-678nx" Mar 11 00:56:12 crc kubenswrapper[4744]: I0311 00:56:12.196129 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/e16bf0f3-533b-4114-89c6-195a85273e98-host-var-lib-cni-bin\") pod \"multus-xlclh\" (UID: \"e16bf0f3-533b-4114-89c6-195a85273e98\") " pod="openshift-multus/multus-xlclh" Mar 11 00:56:12 crc kubenswrapper[4744]: I0311 00:56:12.196154 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/ac70f681-9bcf-4f8c-a175-ed7d4e9da471-cni-binary-copy\") pod \"multus-additional-cni-plugins-sj4cl\" (UID: \"ac70f681-9bcf-4f8c-a175-ed7d4e9da471\") " pod="openshift-multus/multus-additional-cni-plugins-sj4cl" Mar 11 00:56:12 crc kubenswrapper[4744]: I0311 00:56:12.196178 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/6ff04e11-e747-44c5-b049-371a5d422157-host-run-netns\") pod \"ovnkube-node-78fcc\" (UID: \"6ff04e11-e747-44c5-b049-371a5d422157\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-78fcc" Mar 11 00:56:12 crc kubenswrapper[4744]: I0311 00:56:12.196202 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/6ff04e11-e747-44c5-b049-371a5d422157-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-78fcc\" (UID: \"6ff04e11-e747-44c5-b049-371a5d422157\") " pod="openshift-ovn-kubernetes/ovnkube-node-78fcc" Mar 11 00:56:12 crc kubenswrapper[4744]: I0311 00:56:12.196228 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/ad8329c6-d511-446f-b617-99778d12b878-host\") pod \"node-ca-8z8gf\" (UID: \"ad8329c6-d511-446f-b617-99778d12b878\") " pod="openshift-image-registry/node-ca-8z8gf" Mar 11 00:56:12 crc kubenswrapper[4744]: I0311 00:56:12.196251 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/6ff04e11-e747-44c5-b049-371a5d422157-run-systemd\") pod \"ovnkube-node-78fcc\" (UID: \"6ff04e11-e747-44c5-b049-371a5d422157\") " pod="openshift-ovn-kubernetes/ovnkube-node-78fcc" Mar 11 00:56:12 crc kubenswrapper[4744]: I0311 00:56:12.196273 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6ff04e11-e747-44c5-b049-371a5d422157-env-overrides\") pod \"ovnkube-node-78fcc\" (UID: \"6ff04e11-e747-44c5-b049-371a5d422157\") " pod="openshift-ovn-kubernetes/ovnkube-node-78fcc" Mar 11 00:56:12 crc kubenswrapper[4744]: I0311 00:56:12.196297 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fqrjx\" (UniqueName: \"kubernetes.io/projected/ac70f681-9bcf-4f8c-a175-ed7d4e9da471-kube-api-access-fqrjx\") pod \"multus-additional-cni-plugins-sj4cl\" (UID: \"ac70f681-9bcf-4f8c-a175-ed7d4e9da471\") " 
pod="openshift-multus/multus-additional-cni-plugins-sj4cl" Mar 11 00:56:12 crc kubenswrapper[4744]: I0311 00:56:12.196321 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/e16bf0f3-533b-4114-89c6-195a85273e98-host-var-lib-cni-multus\") pod \"multus-xlclh\" (UID: \"e16bf0f3-533b-4114-89c6-195a85273e98\") " pod="openshift-multus/multus-xlclh" Mar 11 00:56:12 crc kubenswrapper[4744]: I0311 00:56:12.196348 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/e16bf0f3-533b-4114-89c6-195a85273e98-host-run-multus-certs\") pod \"multus-xlclh\" (UID: \"e16bf0f3-533b-4114-89c6-195a85273e98\") " pod="openshift-multus/multus-xlclh" Mar 11 00:56:12 crc kubenswrapper[4744]: I0311 00:56:12.196373 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/6ff04e11-e747-44c5-b049-371a5d422157-host-kubelet\") pod \"ovnkube-node-78fcc\" (UID: \"6ff04e11-e747-44c5-b049-371a5d422157\") " pod="openshift-ovn-kubernetes/ovnkube-node-78fcc" Mar 11 00:56:12 crc kubenswrapper[4744]: I0311 00:56:12.196395 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/e16bf0f3-533b-4114-89c6-195a85273e98-multus-daemon-config\") pod \"multus-xlclh\" (UID: \"e16bf0f3-533b-4114-89c6-195a85273e98\") " pod="openshift-multus/multus-xlclh" Mar 11 00:56:12 crc kubenswrapper[4744]: I0311 00:56:12.196420 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/ac70f681-9bcf-4f8c-a175-ed7d4e9da471-tuning-conf-dir\") pod \"multus-additional-cni-plugins-sj4cl\" (UID: \"ac70f681-9bcf-4f8c-a175-ed7d4e9da471\") " pod="openshift-multus/multus-additional-cni-plugins-sj4cl" Mar 11 
00:56:12 crc kubenswrapper[4744]: I0311 00:56:12.196447 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/92f4c9df-1087-4820-8e07-1120f02df454-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-rplbw\" (UID: \"92f4c9df-1087-4820-8e07-1120f02df454\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-rplbw" Mar 11 00:56:12 crc kubenswrapper[4744]: I0311 00:56:12.196474 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/e146399d-0685-4ea8-96e4-c76c9478a23a-hosts-file\") pod \"node-resolver-6ghqv\" (UID: \"e146399d-0685-4ea8-96e4-c76c9478a23a\") " pod="openshift-dns/node-resolver-6ghqv" Mar 11 00:56:12 crc kubenswrapper[4744]: I0311 00:56:12.196537 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/6ff04e11-e747-44c5-b049-371a5d422157-systemd-units\") pod \"ovnkube-node-78fcc\" (UID: \"6ff04e11-e747-44c5-b049-371a5d422157\") " pod="openshift-ovn-kubernetes/ovnkube-node-78fcc" Mar 11 00:56:12 crc kubenswrapper[4744]: I0311 00:56:12.196562 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/6ff04e11-e747-44c5-b049-371a5d422157-log-socket\") pod \"ovnkube-node-78fcc\" (UID: \"6ff04e11-e747-44c5-b049-371a5d422157\") " pod="openshift-ovn-kubernetes/ovnkube-node-78fcc" Mar 11 00:56:12 crc kubenswrapper[4744]: I0311 00:56:12.196586 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6ff04e11-e747-44c5-b049-371a5d422157-ovnkube-config\") pod \"ovnkube-node-78fcc\" (UID: \"6ff04e11-e747-44c5-b049-371a5d422157\") " pod="openshift-ovn-kubernetes/ovnkube-node-78fcc" Mar 11 00:56:12 crc kubenswrapper[4744]: I0311 00:56:12.196611 4744 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/ac70f681-9bcf-4f8c-a175-ed7d4e9da471-system-cni-dir\") pod \"multus-additional-cni-plugins-sj4cl\" (UID: \"ac70f681-9bcf-4f8c-a175-ed7d4e9da471\") " pod="openshift-multus/multus-additional-cni-plugins-sj4cl" Mar 11 00:56:12 crc kubenswrapper[4744]: I0311 00:56:12.196637 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/ac70f681-9bcf-4f8c-a175-ed7d4e9da471-os-release\") pod \"multus-additional-cni-plugins-sj4cl\" (UID: \"ac70f681-9bcf-4f8c-a175-ed7d4e9da471\") " pod="openshift-multus/multus-additional-cni-plugins-sj4cl" Mar 11 00:56:12 crc kubenswrapper[4744]: I0311 00:56:12.196663 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/e16bf0f3-533b-4114-89c6-195a85273e98-os-release\") pod \"multus-xlclh\" (UID: \"e16bf0f3-533b-4114-89c6-195a85273e98\") " pod="openshift-multus/multus-xlclh" Mar 11 00:56:12 crc kubenswrapper[4744]: I0311 00:56:12.196689 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/e16bf0f3-533b-4114-89c6-195a85273e98-host-run-k8s-cni-cncf-io\") pod \"multus-xlclh\" (UID: \"e16bf0f3-533b-4114-89c6-195a85273e98\") " pod="openshift-multus/multus-xlclh" Mar 11 00:56:12 crc kubenswrapper[4744]: I0311 00:56:12.196715 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/e16bf0f3-533b-4114-89c6-195a85273e98-host-run-netns\") pod \"multus-xlclh\" (UID: \"e16bf0f3-533b-4114-89c6-195a85273e98\") " pod="openshift-multus/multus-xlclh" Mar 11 00:56:12 crc kubenswrapper[4744]: I0311 00:56:12.196741 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"serviceca\" (UniqueName: \"kubernetes.io/configmap/ad8329c6-d511-446f-b617-99778d12b878-serviceca\") pod \"node-ca-8z8gf\" (UID: \"ad8329c6-d511-446f-b617-99778d12b878\") " pod="openshift-image-registry/node-ca-8z8gf" Mar 11 00:56:12 crc kubenswrapper[4744]: I0311 00:56:12.196766 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/6ff04e11-e747-44c5-b049-371a5d422157-host-cni-bin\") pod \"ovnkube-node-78fcc\" (UID: \"6ff04e11-e747-44c5-b049-371a5d422157\") " pod="openshift-ovn-kubernetes/ovnkube-node-78fcc" Mar 11 00:56:12 crc kubenswrapper[4744]: I0311 00:56:12.196792 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/e16bf0f3-533b-4114-89c6-195a85273e98-multus-conf-dir\") pod \"multus-xlclh\" (UID: \"e16bf0f3-533b-4114-89c6-195a85273e98\") " pod="openshift-multus/multus-xlclh" Mar 11 00:56:12 crc kubenswrapper[4744]: I0311 00:56:12.196821 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-th54b\" (UniqueName: \"kubernetes.io/projected/7aeb1578-fe93-4bec-8f43-17d0923fa5c0-kube-api-access-th54b\") pod \"network-metrics-daemon-tdnf7\" (UID: \"7aeb1578-fe93-4bec-8f43-17d0923fa5c0\") " pod="openshift-multus/network-metrics-daemon-tdnf7" Mar 11 00:56:12 crc kubenswrapper[4744]: I0311 00:56:12.196818 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/e146399d-0685-4ea8-96e4-c76c9478a23a-hosts-file\") pod \"node-resolver-6ghqv\" (UID: \"e146399d-0685-4ea8-96e4-c76c9478a23a\") " pod="openshift-dns/node-resolver-6ghqv" Mar 11 00:56:12 crc kubenswrapper[4744]: I0311 00:56:12.196832 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/6ff04e11-e747-44c5-b049-371a5d422157-host-run-netns\") pod 
\"ovnkube-node-78fcc\" (UID: \"6ff04e11-e747-44c5-b049-371a5d422157\") " pod="openshift-ovn-kubernetes/ovnkube-node-78fcc" Mar 11 00:56:12 crc kubenswrapper[4744]: I0311 00:56:12.196894 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/e16bf0f3-533b-4114-89c6-195a85273e98-host-var-lib-cni-multus\") pod \"multus-xlclh\" (UID: \"e16bf0f3-533b-4114-89c6-195a85273e98\") " pod="openshift-multus/multus-xlclh" Mar 11 00:56:12 crc kubenswrapper[4744]: I0311 00:56:12.196915 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/6ff04e11-e747-44c5-b049-371a5d422157-host-slash\") pod \"ovnkube-node-78fcc\" (UID: \"6ff04e11-e747-44c5-b049-371a5d422157\") " pod="openshift-ovn-kubernetes/ovnkube-node-78fcc" Mar 11 00:56:12 crc kubenswrapper[4744]: I0311 00:56:12.196944 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/e16bf0f3-533b-4114-89c6-195a85273e98-host-run-multus-certs\") pod \"multus-xlclh\" (UID: \"e16bf0f3-533b-4114-89c6-195a85273e98\") " pod="openshift-multus/multus-xlclh" Mar 11 00:56:12 crc kubenswrapper[4744]: I0311 00:56:12.196981 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/6ff04e11-e747-44c5-b049-371a5d422157-systemd-units\") pod \"ovnkube-node-78fcc\" (UID: \"6ff04e11-e747-44c5-b049-371a5d422157\") " pod="openshift-ovn-kubernetes/ovnkube-node-78fcc" Mar 11 00:56:12 crc kubenswrapper[4744]: I0311 00:56:12.196992 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/6ff04e11-e747-44c5-b049-371a5d422157-host-kubelet\") pod \"ovnkube-node-78fcc\" (UID: \"6ff04e11-e747-44c5-b049-371a5d422157\") " pod="openshift-ovn-kubernetes/ovnkube-node-78fcc" Mar 11 00:56:12 
crc kubenswrapper[4744]: I0311 00:56:12.197008 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/a15dc7ac-7c34-4135-b6eb-a85122800ce9-rootfs\") pod \"machine-config-daemon-678nx\" (UID: \"a15dc7ac-7c34-4135-b6eb-a85122800ce9\") " pod="openshift-machine-config-operator/machine-config-daemon-678nx" Mar 11 00:56:12 crc kubenswrapper[4744]: I0311 00:56:12.197014 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/6ff04e11-e747-44c5-b049-371a5d422157-etc-openvswitch\") pod \"ovnkube-node-78fcc\" (UID: \"6ff04e11-e747-44c5-b049-371a5d422157\") " pod="openshift-ovn-kubernetes/ovnkube-node-78fcc" Mar 11 00:56:12 crc kubenswrapper[4744]: I0311 00:56:12.197081 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/6ff04e11-e747-44c5-b049-371a5d422157-log-socket\") pod \"ovnkube-node-78fcc\" (UID: \"6ff04e11-e747-44c5-b049-371a5d422157\") " pod="openshift-ovn-kubernetes/ovnkube-node-78fcc" Mar 11 00:56:12 crc kubenswrapper[4744]: I0311 00:56:12.197143 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/e16bf0f3-533b-4114-89c6-195a85273e98-host-var-lib-cni-bin\") pod \"multus-xlclh\" (UID: \"e16bf0f3-533b-4114-89c6-195a85273e98\") " pod="openshift-multus/multus-xlclh" Mar 11 00:56:12 crc kubenswrapper[4744]: I0311 00:56:12.197252 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/6ff04e11-e747-44c5-b049-371a5d422157-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-78fcc\" (UID: \"6ff04e11-e747-44c5-b049-371a5d422157\") " pod="openshift-ovn-kubernetes/ovnkube-node-78fcc" Mar 11 00:56:12 crc kubenswrapper[4744]: I0311 00:56:12.197369 4744 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/ad8329c6-d511-446f-b617-99778d12b878-host\") pod \"node-ca-8z8gf\" (UID: \"ad8329c6-d511-446f-b617-99778d12b878\") " pod="openshift-image-registry/node-ca-8z8gf" Mar 11 00:56:12 crc kubenswrapper[4744]: I0311 00:56:12.197410 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/6ff04e11-e747-44c5-b049-371a5d422157-run-systemd\") pod \"ovnkube-node-78fcc\" (UID: \"6ff04e11-e747-44c5-b049-371a5d422157\") " pod="openshift-ovn-kubernetes/ovnkube-node-78fcc" Mar 11 00:56:12 crc kubenswrapper[4744]: I0311 00:56:12.198009 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/e16bf0f3-533b-4114-89c6-195a85273e98-multus-daemon-config\") pod \"multus-xlclh\" (UID: \"e16bf0f3-533b-4114-89c6-195a85273e98\") " pod="openshift-multus/multus-xlclh" Mar 11 00:56:12 crc kubenswrapper[4744]: I0311 00:56:12.198274 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6ff04e11-e747-44c5-b049-371a5d422157-ovnkube-config\") pod \"ovnkube-node-78fcc\" (UID: \"6ff04e11-e747-44c5-b049-371a5d422157\") " pod="openshift-ovn-kubernetes/ovnkube-node-78fcc" Mar 11 00:56:12 crc kubenswrapper[4744]: I0311 00:56:12.198420 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/ac70f681-9bcf-4f8c-a175-ed7d4e9da471-system-cni-dir\") pod \"multus-additional-cni-plugins-sj4cl\" (UID: \"ac70f681-9bcf-4f8c-a175-ed7d4e9da471\") " pod="openshift-multus/multus-additional-cni-plugins-sj4cl" Mar 11 00:56:12 crc kubenswrapper[4744]: I0311 00:56:12.198559 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: 
\"kubernetes.io/host-path/ac70f681-9bcf-4f8c-a175-ed7d4e9da471-os-release\") pod \"multus-additional-cni-plugins-sj4cl\" (UID: \"ac70f681-9bcf-4f8c-a175-ed7d4e9da471\") " pod="openshift-multus/multus-additional-cni-plugins-sj4cl" Mar 11 00:56:12 crc kubenswrapper[4744]: I0311 00:56:12.198640 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6ff04e11-e747-44c5-b049-371a5d422157-env-overrides\") pod \"ovnkube-node-78fcc\" (UID: \"6ff04e11-e747-44c5-b049-371a5d422157\") " pod="openshift-ovn-kubernetes/ovnkube-node-78fcc" Mar 11 00:56:12 crc kubenswrapper[4744]: I0311 00:56:12.198655 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/e16bf0f3-533b-4114-89c6-195a85273e98-os-release\") pod \"multus-xlclh\" (UID: \"e16bf0f3-533b-4114-89c6-195a85273e98\") " pod="openshift-multus/multus-xlclh" Mar 11 00:56:12 crc kubenswrapper[4744]: I0311 00:56:12.198697 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/e16bf0f3-533b-4114-89c6-195a85273e98-host-run-k8s-cni-cncf-io\") pod \"multus-xlclh\" (UID: \"e16bf0f3-533b-4114-89c6-195a85273e98\") " pod="openshift-multus/multus-xlclh" Mar 11 00:56:12 crc kubenswrapper[4744]: I0311 00:56:12.198742 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/6ff04e11-e747-44c5-b049-371a5d422157-host-cni-bin\") pod \"ovnkube-node-78fcc\" (UID: \"6ff04e11-e747-44c5-b049-371a5d422157\") " pod="openshift-ovn-kubernetes/ovnkube-node-78fcc" Mar 11 00:56:12 crc kubenswrapper[4744]: I0311 00:56:12.198742 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/e16bf0f3-533b-4114-89c6-195a85273e98-multus-conf-dir\") pod \"multus-xlclh\" (UID: 
\"e16bf0f3-533b-4114-89c6-195a85273e98\") " pod="openshift-multus/multus-xlclh" Mar 11 00:56:12 crc kubenswrapper[4744]: I0311 00:56:12.198834 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/e16bf0f3-533b-4114-89c6-195a85273e98-host-run-netns\") pod \"multus-xlclh\" (UID: \"e16bf0f3-533b-4114-89c6-195a85273e98\") " pod="openshift-multus/multus-xlclh" Mar 11 00:56:12 crc kubenswrapper[4744]: I0311 00:56:12.199064 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/ac70f681-9bcf-4f8c-a175-ed7d4e9da471-tuning-conf-dir\") pod \"multus-additional-cni-plugins-sj4cl\" (UID: \"ac70f681-9bcf-4f8c-a175-ed7d4e9da471\") " pod="openshift-multus/multus-additional-cni-plugins-sj4cl" Mar 11 00:56:12 crc kubenswrapper[4744]: I0311 00:56:12.196853 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/6ff04e11-e747-44c5-b049-371a5d422157-host-slash\") pod \"ovnkube-node-78fcc\" (UID: \"6ff04e11-e747-44c5-b049-371a5d422157\") " pod="openshift-ovn-kubernetes/ovnkube-node-78fcc" Mar 11 00:56:12 crc kubenswrapper[4744]: I0311 00:56:12.199152 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/ac70f681-9bcf-4f8c-a175-ed7d4e9da471-cnibin\") pod \"multus-additional-cni-plugins-sj4cl\" (UID: \"ac70f681-9bcf-4f8c-a175-ed7d4e9da471\") " pod="openshift-multus/multus-additional-cni-plugins-sj4cl" Mar 11 00:56:12 crc kubenswrapper[4744]: I0311 00:56:12.199183 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/a15dc7ac-7c34-4135-b6eb-a85122800ce9-mcd-auth-proxy-config\") pod \"machine-config-daemon-678nx\" (UID: \"a15dc7ac-7c34-4135-b6eb-a85122800ce9\") " 
pod="openshift-machine-config-operator/machine-config-daemon-678nx" Mar 11 00:56:12 crc kubenswrapper[4744]: I0311 00:56:12.199255 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/6ff04e11-e747-44c5-b049-371a5d422157-ovnkube-script-lib\") pod \"ovnkube-node-78fcc\" (UID: \"6ff04e11-e747-44c5-b049-371a5d422157\") " pod="openshift-ovn-kubernetes/ovnkube-node-78fcc" Mar 11 00:56:12 crc kubenswrapper[4744]: I0311 00:56:12.198628 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/ac70f681-9bcf-4f8c-a175-ed7d4e9da471-cni-binary-copy\") pod \"multus-additional-cni-plugins-sj4cl\" (UID: \"ac70f681-9bcf-4f8c-a175-ed7d4e9da471\") " pod="openshift-multus/multus-additional-cni-plugins-sj4cl" Mar 11 00:56:12 crc kubenswrapper[4744]: I0311 00:56:12.199388 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 11 00:56:12 crc kubenswrapper[4744]: I0311 00:56:12.199419 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/6ff04e11-e747-44c5-b049-371a5d422157-node-log\") pod \"ovnkube-node-78fcc\" (UID: \"6ff04e11-e747-44c5-b049-371a5d422157\") " pod="openshift-ovn-kubernetes/ovnkube-node-78fcc" Mar 11 00:56:12 crc kubenswrapper[4744]: I0311 00:56:12.199661 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fr9zj\" (UniqueName: \"kubernetes.io/projected/6ff04e11-e747-44c5-b049-371a5d422157-kube-api-access-fr9zj\") pod \"ovnkube-node-78fcc\" (UID: \"6ff04e11-e747-44c5-b049-371a5d422157\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-78fcc" Mar 11 00:56:12 crc kubenswrapper[4744]: I0311 00:56:12.199738 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 11 00:56:12 crc kubenswrapper[4744]: I0311 00:56:12.199857 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/e16bf0f3-533b-4114-89c6-195a85273e98-system-cni-dir\") pod \"multus-xlclh\" (UID: \"e16bf0f3-533b-4114-89c6-195a85273e98\") " pod="openshift-multus/multus-xlclh" Mar 11 00:56:12 crc kubenswrapper[4744]: I0311 00:56:12.199920 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/e16bf0f3-533b-4114-89c6-195a85273e98-host-var-lib-kubelet\") pod \"multus-xlclh\" (UID: \"e16bf0f3-533b-4114-89c6-195a85273e98\") " pod="openshift-multus/multus-xlclh" Mar 11 00:56:12 crc kubenswrapper[4744]: I0311 00:56:12.199951 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/6ff04e11-e747-44c5-b049-371a5d422157-var-lib-openvswitch\") pod \"ovnkube-node-78fcc\" (UID: \"6ff04e11-e747-44c5-b049-371a5d422157\") " pod="openshift-ovn-kubernetes/ovnkube-node-78fcc" Mar 11 00:56:12 crc kubenswrapper[4744]: I0311 00:56:12.200030 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/92f4c9df-1087-4820-8e07-1120f02df454-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-rplbw\" (UID: \"92f4c9df-1087-4820-8e07-1120f02df454\") " 
pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-rplbw" Mar 11 00:56:12 crc kubenswrapper[4744]: I0311 00:56:12.200078 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/e16bf0f3-533b-4114-89c6-195a85273e98-etc-kubernetes\") pod \"multus-xlclh\" (UID: \"e16bf0f3-533b-4114-89c6-195a85273e98\") " pod="openshift-multus/multus-xlclh" Mar 11 00:56:12 crc kubenswrapper[4744]: I0311 00:56:12.200102 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8h2vm\" (UniqueName: \"kubernetes.io/projected/e146399d-0685-4ea8-96e4-c76c9478a23a-kube-api-access-8h2vm\") pod \"node-resolver-6ghqv\" (UID: \"e146399d-0685-4ea8-96e4-c76c9478a23a\") " pod="openshift-dns/node-resolver-6ghqv" Mar 11 00:56:12 crc kubenswrapper[4744]: I0311 00:56:12.200128 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/6ff04e11-e747-44c5-b049-371a5d422157-host-run-ovn-kubernetes\") pod \"ovnkube-node-78fcc\" (UID: \"6ff04e11-e747-44c5-b049-371a5d422157\") " pod="openshift-ovn-kubernetes/ovnkube-node-78fcc" Mar 11 00:56:12 crc kubenswrapper[4744]: I0311 00:56:12.200153 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/6ff04e11-e747-44c5-b049-371a5d422157-host-cni-netd\") pod \"ovnkube-node-78fcc\" (UID: \"6ff04e11-e747-44c5-b049-371a5d422157\") " pod="openshift-ovn-kubernetes/ovnkube-node-78fcc" Mar 11 00:56:12 crc kubenswrapper[4744]: I0311 00:56:12.200180 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/e16bf0f3-533b-4114-89c6-195a85273e98-multus-cni-dir\") pod \"multus-xlclh\" (UID: \"e16bf0f3-533b-4114-89c6-195a85273e98\") " pod="openshift-multus/multus-xlclh" Mar 11 00:56:12 crc 
kubenswrapper[4744]: I0311 00:56:12.200209 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/e16bf0f3-533b-4114-89c6-195a85273e98-cnibin\") pod \"multus-xlclh\" (UID: \"e16bf0f3-533b-4114-89c6-195a85273e98\") " pod="openshift-multus/multus-xlclh" Mar 11 00:56:12 crc kubenswrapper[4744]: I0311 00:56:12.200241 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pgqmk\" (UniqueName: \"kubernetes.io/projected/e16bf0f3-533b-4114-89c6-195a85273e98-kube-api-access-pgqmk\") pod \"multus-xlclh\" (UID: \"e16bf0f3-533b-4114-89c6-195a85273e98\") " pod="openshift-multus/multus-xlclh" Mar 11 00:56:12 crc kubenswrapper[4744]: I0311 00:56:12.200274 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/92f4c9df-1087-4820-8e07-1120f02df454-env-overrides\") pod \"ovnkube-control-plane-749d76644c-rplbw\" (UID: \"92f4c9df-1087-4820-8e07-1120f02df454\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-rplbw" Mar 11 00:56:12 crc kubenswrapper[4744]: I0311 00:56:12.200299 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/e16bf0f3-533b-4114-89c6-195a85273e98-cni-binary-copy\") pod \"multus-xlclh\" (UID: \"e16bf0f3-533b-4114-89c6-195a85273e98\") " pod="openshift-multus/multus-xlclh" Mar 11 00:56:12 crc kubenswrapper[4744]: I0311 00:56:12.200328 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/6ff04e11-e747-44c5-b049-371a5d422157-run-openvswitch\") pod \"ovnkube-node-78fcc\" (UID: \"6ff04e11-e747-44c5-b049-371a5d422157\") " pod="openshift-ovn-kubernetes/ovnkube-node-78fcc" Mar 11 00:56:12 crc kubenswrapper[4744]: I0311 00:56:12.200352 4744 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/6ff04e11-e747-44c5-b049-371a5d422157-run-ovn\") pod \"ovnkube-node-78fcc\" (UID: \"6ff04e11-e747-44c5-b049-371a5d422157\") " pod="openshift-ovn-kubernetes/ovnkube-node-78fcc" Mar 11 00:56:12 crc kubenswrapper[4744]: I0311 00:56:12.200375 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6ff04e11-e747-44c5-b049-371a5d422157-ovn-node-metrics-cert\") pod \"ovnkube-node-78fcc\" (UID: \"6ff04e11-e747-44c5-b049-371a5d422157\") " pod="openshift-ovn-kubernetes/ovnkube-node-78fcc" Mar 11 00:56:12 crc kubenswrapper[4744]: I0311 00:56:12.200399 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/ac70f681-9bcf-4f8c-a175-ed7d4e9da471-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-sj4cl\" (UID: \"ac70f681-9bcf-4f8c-a175-ed7d4e9da471\") " pod="openshift-multus/multus-additional-cni-plugins-sj4cl" Mar 11 00:56:12 crc kubenswrapper[4744]: I0311 00:56:12.200412 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/92f4c9df-1087-4820-8e07-1120f02df454-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-rplbw\" (UID: \"92f4c9df-1087-4820-8e07-1120f02df454\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-rplbw" Mar 11 00:56:12 crc kubenswrapper[4744]: I0311 00:56:12.200485 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/e16bf0f3-533b-4114-89c6-195a85273e98-multus-socket-dir-parent\") pod \"multus-xlclh\" (UID: \"e16bf0f3-533b-4114-89c6-195a85273e98\") " pod="openshift-multus/multus-xlclh" Mar 11 00:56:12 crc kubenswrapper[4744]: I0311 00:56:12.200421 4744 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/e16bf0f3-533b-4114-89c6-195a85273e98-multus-socket-dir-parent\") pod \"multus-xlclh\" (UID: \"e16bf0f3-533b-4114-89c6-195a85273e98\") " pod="openshift-multus/multus-xlclh" Mar 11 00:56:12 crc kubenswrapper[4744]: I0311 00:56:12.200749 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d9t5q\" (UniqueName: \"kubernetes.io/projected/a15dc7ac-7c34-4135-b6eb-a85122800ce9-kube-api-access-d9t5q\") pod \"machine-config-daemon-678nx\" (UID: \"a15dc7ac-7c34-4135-b6eb-a85122800ce9\") " pod="openshift-machine-config-operator/machine-config-daemon-678nx" Mar 11 00:56:12 crc kubenswrapper[4744]: I0311 00:56:12.200832 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/e16bf0f3-533b-4114-89c6-195a85273e98-hostroot\") pod \"multus-xlclh\" (UID: \"e16bf0f3-533b-4114-89c6-195a85273e98\") " pod="openshift-multus/multus-xlclh" Mar 11 00:56:12 crc kubenswrapper[4744]: I0311 00:56:12.200870 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/7aeb1578-fe93-4bec-8f43-17d0923fa5c0-metrics-certs\") pod \"network-metrics-daemon-tdnf7\" (UID: \"7aeb1578-fe93-4bec-8f43-17d0923fa5c0\") " pod="openshift-multus/network-metrics-daemon-tdnf7" Mar 11 00:56:12 crc kubenswrapper[4744]: I0311 00:56:12.201042 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7gczk\" (UniqueName: \"kubernetes.io/projected/ad8329c6-d511-446f-b617-99778d12b878-kube-api-access-7gczk\") pod \"node-ca-8z8gf\" (UID: \"ad8329c6-d511-446f-b617-99778d12b878\") " pod="openshift-image-registry/node-ca-8z8gf" Mar 11 00:56:12 crc kubenswrapper[4744]: I0311 00:56:12.201106 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-lkpvs\" (UniqueName: \"kubernetes.io/projected/92f4c9df-1087-4820-8e07-1120f02df454-kube-api-access-lkpvs\") pod \"ovnkube-control-plane-749d76644c-rplbw\" (UID: \"92f4c9df-1087-4820-8e07-1120f02df454\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-rplbw" Mar 11 00:56:12 crc kubenswrapper[4744]: I0311 00:56:12.201143 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/a15dc7ac-7c34-4135-b6eb-a85122800ce9-proxy-tls\") pod \"machine-config-daemon-678nx\" (UID: \"a15dc7ac-7c34-4135-b6eb-a85122800ce9\") " pod="openshift-machine-config-operator/machine-config-daemon-678nx" Mar 11 00:56:12 crc kubenswrapper[4744]: I0311 00:56:12.201188 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/a15dc7ac-7c34-4135-b6eb-a85122800ce9-mcd-auth-proxy-config\") pod \"machine-config-daemon-678nx\" (UID: \"a15dc7ac-7c34-4135-b6eb-a85122800ce9\") " pod="openshift-machine-config-operator/machine-config-daemon-678nx" Mar 11 00:56:12 crc kubenswrapper[4744]: I0311 00:56:12.201227 4744 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pjr6v\" (UniqueName: \"kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v\") on node \"crc\" DevicePath \"\"" Mar 11 00:56:12 crc kubenswrapper[4744]: I0311 00:56:12.201191 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/6ff04e11-e747-44c5-b049-371a5d422157-ovnkube-script-lib\") pod \"ovnkube-node-78fcc\" (UID: \"6ff04e11-e747-44c5-b049-371a5d422157\") " pod="openshift-ovn-kubernetes/ovnkube-node-78fcc" Mar 11 00:56:12 crc kubenswrapper[4744]: I0311 00:56:12.201247 4744 reconciler_common.go:293] "Volume detached for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca\") 
on node \"crc\" DevicePath \"\"" Mar 11 00:56:12 crc kubenswrapper[4744]: I0311 00:56:12.201265 4744 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 11 00:56:12 crc kubenswrapper[4744]: I0311 00:56:12.201279 4744 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 11 00:56:12 crc kubenswrapper[4744]: I0311 00:56:12.201294 4744 reconciler_common.go:293] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted\") on node \"crc\" DevicePath \"\"" Mar 11 00:56:12 crc kubenswrapper[4744]: I0311 00:56:12.201312 4744 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dbsvg\" (UniqueName: \"kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg\") on node \"crc\" DevicePath \"\"" Mar 11 00:56:12 crc kubenswrapper[4744]: I0311 00:56:12.201312 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/ac70f681-9bcf-4f8c-a175-ed7d4e9da471-cnibin\") pod \"multus-additional-cni-plugins-sj4cl\" (UID: \"ac70f681-9bcf-4f8c-a175-ed7d4e9da471\") " pod="openshift-multus/multus-additional-cni-plugins-sj4cl" Mar 11 00:56:12 crc kubenswrapper[4744]: I0311 00:56:12.201329 4744 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config\") on node \"crc\" DevicePath \"\"" Mar 11 00:56:12 crc kubenswrapper[4744]: I0311 00:56:12.201317 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: 
\"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 11 00:56:12 crc kubenswrapper[4744]: I0311 00:56:12.201380 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/6ff04e11-e747-44c5-b049-371a5d422157-host-run-ovn-kubernetes\") pod \"ovnkube-node-78fcc\" (UID: \"6ff04e11-e747-44c5-b049-371a5d422157\") " pod="openshift-ovn-kubernetes/ovnkube-node-78fcc" Mar 11 00:56:12 crc kubenswrapper[4744]: I0311 00:56:12.201349 4744 reconciler_common.go:293] "Volume detached for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config\") on node \"crc\" DevicePath \"\"" Mar 11 00:56:12 crc kubenswrapper[4744]: I0311 00:56:12.201419 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/6ff04e11-e747-44c5-b049-371a5d422157-host-cni-netd\") pod \"ovnkube-node-78fcc\" (UID: \"6ff04e11-e747-44c5-b049-371a5d422157\") " pod="openshift-ovn-kubernetes/ovnkube-node-78fcc" Mar 11 00:56:12 crc kubenswrapper[4744]: I0311 00:56:12.201447 4744 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wxkg8\" (UniqueName: \"kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8\") on node \"crc\" DevicePath \"\"" Mar 11 00:56:12 crc kubenswrapper[4744]: I0311 00:56:12.201686 4744 reconciler_common.go:293] "Volume detached for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Mar 11 00:56:12 crc kubenswrapper[4744]: I0311 00:56:12.201705 4744 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 11 00:56:12 crc kubenswrapper[4744]: I0311 00:56:12.201723 4744 reconciler_common.go:293] "Volume detached for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Mar 11 00:56:12 crc kubenswrapper[4744]: I0311 00:56:12.201742 4744 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides\") on node \"crc\" DevicePath \"\"" Mar 11 00:56:12 crc kubenswrapper[4744]: I0311 00:56:12.201759 4744 reconciler_common.go:293] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 11 00:56:12 crc kubenswrapper[4744]: I0311 00:56:12.201777 4744 reconciler_common.go:293] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 11 00:56:12 crc kubenswrapper[4744]: I0311 00:56:12.201791 4744 reconciler_common.go:293] "Volume detached for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca\") on node \"crc\" DevicePath \"\"" Mar 11 00:56:12 crc kubenswrapper[4744]: I0311 00:56:12.201805 4744 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client\") on node \"crc\" DevicePath \"\"" Mar 11 00:56:12 crc kubenswrapper[4744]: I0311 00:56:12.201818 4744 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qg5z5\" (UniqueName: \"kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5\") on node 
\"crc\" DevicePath \"\"" Mar 11 00:56:12 crc kubenswrapper[4744]: I0311 00:56:12.201831 4744 reconciler_common.go:293] "Volume detached for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca\") on node \"crc\" DevicePath \"\"" Mar 11 00:56:12 crc kubenswrapper[4744]: I0311 00:56:12.201872 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/e16bf0f3-533b-4114-89c6-195a85273e98-multus-cni-dir\") pod \"multus-xlclh\" (UID: \"e16bf0f3-533b-4114-89c6-195a85273e98\") " pod="openshift-multus/multus-xlclh" Mar 11 00:56:12 crc kubenswrapper[4744]: I0311 00:56:12.201876 4744 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies\") on node \"crc\" DevicePath \"\"" Mar 11 00:56:12 crc kubenswrapper[4744]: I0311 00:56:12.201912 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/6ff04e11-e747-44c5-b049-371a5d422157-node-log\") pod \"ovnkube-node-78fcc\" (UID: \"6ff04e11-e747-44c5-b049-371a5d422157\") " pod="openshift-ovn-kubernetes/ovnkube-node-78fcc" Mar 11 00:56:12 crc kubenswrapper[4744]: I0311 00:56:12.201923 4744 reconciler_common.go:293] "Volume detached for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert\") on node \"crc\" DevicePath \"\"" Mar 11 00:56:12 crc kubenswrapper[4744]: I0311 00:56:12.201941 4744 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tk88c\" (UniqueName: \"kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c\") on node \"crc\" DevicePath \"\"" Mar 11 00:56:12 crc kubenswrapper[4744]: I0311 00:56:12.201957 4744 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config\") on node \"crc\" DevicePath \"\"" Mar 11 00:56:12 crc kubenswrapper[4744]: I0311 00:56:12.201972 4744 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config\") on node \"crc\" DevicePath \"\"" Mar 11 00:56:12 crc kubenswrapper[4744]: I0311 00:56:12.201985 4744 reconciler_common.go:293] "Volume detached for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit\") on node \"crc\" DevicePath \"\"" Mar 11 00:56:12 crc kubenswrapper[4744]: I0311 00:56:12.202000 4744 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x4zgh\" (UniqueName: \"kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh\") on node \"crc\" DevicePath \"\"" Mar 11 00:56:12 crc kubenswrapper[4744]: I0311 00:56:12.202017 4744 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session\") on node \"crc\" DevicePath \"\"" Mar 11 00:56:12 crc kubenswrapper[4744]: I0311 00:56:12.202064 4744 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 11 00:56:12 crc kubenswrapper[4744]: I0311 00:56:12.202083 4744 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xcgwh\" (UniqueName: \"kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh\") on node \"crc\" DevicePath \"\"" Mar 11 00:56:12 crc kubenswrapper[4744]: I0311 00:56:12.202098 4744 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token\") on node \"crc\" DevicePath 
\"\"" Mar 11 00:56:12 crc kubenswrapper[4744]: I0311 00:56:12.202115 4744 reconciler_common.go:293] "Volume detached for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca\") on node \"crc\" DevicePath \"\"" Mar 11 00:56:12 crc kubenswrapper[4744]: I0311 00:56:12.202183 4744 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 11 00:56:12 crc kubenswrapper[4744]: I0311 00:56:12.202229 4744 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token\") on node \"crc\" DevicePath \"\"" Mar 11 00:56:12 crc kubenswrapper[4744]: E0311 00:56:12.202239 4744 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Mar 11 00:56:12 crc kubenswrapper[4744]: E0311 00:56:12.202308 4744 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7aeb1578-fe93-4bec-8f43-17d0923fa5c0-metrics-certs podName:7aeb1578-fe93-4bec-8f43-17d0923fa5c0 nodeName:}" failed. No retries permitted until 2026-03-11 00:56:12.702287051 +0000 UTC m=+129.506504676 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/7aeb1578-fe93-4bec-8f43-17d0923fa5c0-metrics-certs") pod "network-metrics-daemon-tdnf7" (UID: "7aeb1578-fe93-4bec-8f43-17d0923fa5c0") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 11 00:56:12 crc kubenswrapper[4744]: I0311 00:56:12.202489 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/e16bf0f3-533b-4114-89c6-195a85273e98-hostroot\") pod \"multus-xlclh\" (UID: \"e16bf0f3-533b-4114-89c6-195a85273e98\") " pod="openshift-multus/multus-xlclh" Mar 11 00:56:12 crc kubenswrapper[4744]: I0311 00:56:12.202683 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 11 00:56:12 crc kubenswrapper[4744]: I0311 00:56:12.202764 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/e16bf0f3-533b-4114-89c6-195a85273e98-system-cni-dir\") pod \"multus-xlclh\" (UID: \"e16bf0f3-533b-4114-89c6-195a85273e98\") " pod="openshift-multus/multus-xlclh" Mar 11 00:56:12 crc kubenswrapper[4744]: I0311 00:56:12.202817 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/e16bf0f3-533b-4114-89c6-195a85273e98-host-var-lib-kubelet\") pod \"multus-xlclh\" (UID: \"e16bf0f3-533b-4114-89c6-195a85273e98\") " pod="openshift-multus/multus-xlclh" Mar 11 00:56:12 crc kubenswrapper[4744]: I0311 00:56:12.202867 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: 
\"kubernetes.io/host-path/6ff04e11-e747-44c5-b049-371a5d422157-var-lib-openvswitch\") pod \"ovnkube-node-78fcc\" (UID: \"6ff04e11-e747-44c5-b049-371a5d422157\") " pod="openshift-ovn-kubernetes/ovnkube-node-78fcc" Mar 11 00:56:12 crc kubenswrapper[4744]: I0311 00:56:12.204149 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 00:56:12 crc kubenswrapper[4744]: I0311 00:56:12.204266 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 00:56:12 crc kubenswrapper[4744]: I0311 00:56:12.204350 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 00:56:12 crc kubenswrapper[4744]: I0311 00:56:12.204428 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 00:56:12 crc kubenswrapper[4744]: I0311 00:56:12.204503 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T00:56:12Z","lastTransitionTime":"2026-03-11T00:56:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 11 00:56:12 crc kubenswrapper[4744]: I0311 00:56:12.204864 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/e16bf0f3-533b-4114-89c6-195a85273e98-etc-kubernetes\") pod \"multus-xlclh\" (UID: \"e16bf0f3-533b-4114-89c6-195a85273e98\") " pod="openshift-multus/multus-xlclh" Mar 11 00:56:12 crc kubenswrapper[4744]: I0311 00:56:12.205053 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/ad8329c6-d511-446f-b617-99778d12b878-serviceca\") pod \"node-ca-8z8gf\" (UID: \"ad8329c6-d511-446f-b617-99778d12b878\") " pod="openshift-image-registry/node-ca-8z8gf" Mar 11 00:56:12 crc kubenswrapper[4744]: I0311 00:56:12.205130 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/e16bf0f3-533b-4114-89c6-195a85273e98-cnibin\") pod \"multus-xlclh\" (UID: \"e16bf0f3-533b-4114-89c6-195a85273e98\") " pod="openshift-multus/multus-xlclh" Mar 11 00:56:12 crc kubenswrapper[4744]: I0311 00:56:12.205176 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/6ff04e11-e747-44c5-b049-371a5d422157-run-ovn\") pod \"ovnkube-node-78fcc\" (UID: \"6ff04e11-e747-44c5-b049-371a5d422157\") " pod="openshift-ovn-kubernetes/ovnkube-node-78fcc" Mar 11 00:56:12 crc kubenswrapper[4744]: I0311 00:56:12.205239 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/6ff04e11-e747-44c5-b049-371a5d422157-run-openvswitch\") pod \"ovnkube-node-78fcc\" (UID: \"6ff04e11-e747-44c5-b049-371a5d422157\") " pod="openshift-ovn-kubernetes/ovnkube-node-78fcc" Mar 11 00:56:12 crc kubenswrapper[4744]: I0311 00:56:12.206036 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" 
(UniqueName: \"kubernetes.io/configmap/ac70f681-9bcf-4f8c-a175-ed7d4e9da471-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-sj4cl\" (UID: \"ac70f681-9bcf-4f8c-a175-ed7d4e9da471\") " pod="openshift-multus/multus-additional-cni-plugins-sj4cl" Mar 11 00:56:12 crc kubenswrapper[4744]: I0311 00:56:12.207011 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-78fcc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6ff04e11-e747-44c5-b049-371a5d422157\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:11Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:11Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:11Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fr9zj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fr9zj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fr9zj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fr9zj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fr9zj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fr9zj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fr9zj\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fr9zj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fr9zj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T00:56:11Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-78fcc\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 11 00:56:12 crc kubenswrapper[4744]: I0311 00:56:12.208383 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/e16bf0f3-533b-4114-89c6-195a85273e98-cni-binary-copy\") pod \"multus-xlclh\" (UID: \"e16bf0f3-533b-4114-89c6-195a85273e98\") " pod="openshift-multus/multus-xlclh" Mar 11 00:56:12 crc kubenswrapper[4744]: I0311 00:56:12.208636 4744 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fqsjt\" (UniqueName: \"kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt\") on node \"crc\" DevicePath \"\"" Mar 11 00:56:12 crc kubenswrapper[4744]: I0311 00:56:12.213374 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/a15dc7ac-7c34-4135-b6eb-a85122800ce9-proxy-tls\") pod \"machine-config-daemon-678nx\" (UID: \"a15dc7ac-7c34-4135-b6eb-a85122800ce9\") " pod="openshift-machine-config-operator/machine-config-daemon-678nx" Mar 11 00:56:12 crc kubenswrapper[4744]: I0311 00:56:12.213817 4744 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lz9wn\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn\") on node \"crc\" DevicePath \"\"" Mar 11 00:56:12 crc kubenswrapper[4744]: I0311 00:56:12.213853 4744 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jkwtn\" (UniqueName: \"kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn\") on node \"crc\" DevicePath \"\"" Mar 11 00:56:12 crc kubenswrapper[4744]: I0311 00:56:12.213871 4744 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w4xd4\" (UniqueName: 
\"kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4\") on node \"crc\" DevicePath \"\"" Mar 11 00:56:12 crc kubenswrapper[4744]: I0311 00:56:12.213959 4744 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 11 00:56:12 crc kubenswrapper[4744]: I0311 00:56:12.213979 4744 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 11 00:56:12 crc kubenswrapper[4744]: I0311 00:56:12.213996 4744 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v47cf\" (UniqueName: \"kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf\") on node \"crc\" DevicePath \"\"" Mar 11 00:56:12 crc kubenswrapper[4744]: I0311 00:56:12.214010 4744 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config\") on node \"crc\" DevicePath \"\"" Mar 11 00:56:12 crc kubenswrapper[4744]: I0311 00:56:12.214024 4744 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lzf88\" (UniqueName: \"kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88\") on node \"crc\" DevicePath \"\"" Mar 11 00:56:12 crc kubenswrapper[4744]: I0311 00:56:12.214041 4744 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data\") on node \"crc\" DevicePath \"\"" Mar 11 00:56:12 crc kubenswrapper[4744]: I0311 00:56:12.214061 4744 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-cliconfig\" (UniqueName: 
\"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig\") on node \"crc\" DevicePath \"\"" Mar 11 00:56:12 crc kubenswrapper[4744]: I0311 00:56:12.214078 4744 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies\") on node \"crc\" DevicePath \"\"" Mar 11 00:56:12 crc kubenswrapper[4744]: I0311 00:56:12.214095 4744 reconciler_common.go:293] "Volume detached for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config\") on node \"crc\" DevicePath \"\"" Mar 11 00:56:12 crc kubenswrapper[4744]: I0311 00:56:12.214112 4744 reconciler_common.go:293] "Volume detached for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy\") on node \"crc\" DevicePath \"\"" Mar 11 00:56:12 crc kubenswrapper[4744]: I0311 00:56:12.214125 4744 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 11 00:56:12 crc kubenswrapper[4744]: I0311 00:56:12.214140 4744 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vt5rc\" (UniqueName: \"kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc\") on node \"crc\" DevicePath \"\"" Mar 11 00:56:12 crc kubenswrapper[4744]: I0311 00:56:12.214179 4744 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca\") on node \"crc\" DevicePath \"\"" Mar 11 00:56:12 crc kubenswrapper[4744]: I0311 00:56:12.214195 4744 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w7l8j\" (UniqueName: 
\"kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j\") on node \"crc\" DevicePath \"\"" Mar 11 00:56:12 crc kubenswrapper[4744]: I0311 00:56:12.214211 4744 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Mar 11 00:56:12 crc kubenswrapper[4744]: I0311 00:56:12.214225 4744 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 11 00:56:12 crc kubenswrapper[4744]: I0311 00:56:12.214240 4744 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d4lsv\" (UniqueName: \"kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv\") on node \"crc\" DevicePath \"\"" Mar 11 00:56:12 crc kubenswrapper[4744]: I0311 00:56:12.214254 4744 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs\") on node \"crc\" DevicePath \"\"" Mar 11 00:56:12 crc kubenswrapper[4744]: I0311 00:56:12.214268 4744 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 11 00:56:12 crc kubenswrapper[4744]: I0311 00:56:12.214288 4744 reconciler_common.go:293] "Volume detached for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 11 00:56:12 crc kubenswrapper[4744]: I0311 00:56:12.214301 4744 reconciler_common.go:293] "Volume detached for volume \"profile-collector-cert\" (UniqueName: 
\"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert\") on node \"crc\" DevicePath \"\"" Mar 11 00:56:12 crc kubenswrapper[4744]: I0311 00:56:12.214315 4744 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mnrrd\" (UniqueName: \"kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd\") on node \"crc\" DevicePath \"\"" Mar 11 00:56:12 crc kubenswrapper[4744]: I0311 00:56:12.214330 4744 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls\") on node \"crc\" DevicePath \"\"" Mar 11 00:56:12 crc kubenswrapper[4744]: I0311 00:56:12.214345 4744 reconciler_common.go:293] "Volume detached for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config\") on node \"crc\" DevicePath \"\"" Mar 11 00:56:12 crc kubenswrapper[4744]: I0311 00:56:12.214360 4744 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 11 00:56:12 crc kubenswrapper[4744]: I0311 00:56:12.214373 4744 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config\") on node \"crc\" DevicePath \"\"" Mar 11 00:56:12 crc kubenswrapper[4744]: I0311 00:56:12.214387 4744 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config\") on node \"crc\" DevicePath \"\"" Mar 11 00:56:12 crc kubenswrapper[4744]: I0311 00:56:12.214403 4744 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities\") on node \"crc\" 
DevicePath \"\"" Mar 11 00:56:12 crc kubenswrapper[4744]: I0311 00:56:12.214416 4744 reconciler_common.go:293] "Volume detached for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs\") on node \"crc\" DevicePath \"\"" Mar 11 00:56:12 crc kubenswrapper[4744]: I0311 00:56:12.214454 4744 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection\") on node \"crc\" DevicePath \"\"" Mar 11 00:56:12 crc kubenswrapper[4744]: I0311 00:56:12.214469 4744 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca\") on node \"crc\" DevicePath \"\"" Mar 11 00:56:12 crc kubenswrapper[4744]: I0311 00:56:12.214484 4744 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template\") on node \"crc\" DevicePath \"\"" Mar 11 00:56:12 crc kubenswrapper[4744]: I0311 00:56:12.214500 4744 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ngvvp\" (UniqueName: \"kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp\") on node \"crc\" DevicePath \"\"" Mar 11 00:56:12 crc kubenswrapper[4744]: I0311 00:56:12.214534 4744 reconciler_common.go:293] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config\") on node \"crc\" DevicePath \"\"" Mar 11 00:56:12 crc kubenswrapper[4744]: I0311 00:56:12.214588 4744 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login\") 
on node \"crc\" DevicePath \"\"" Mar 11 00:56:12 crc kubenswrapper[4744]: I0311 00:56:12.214605 4744 reconciler_common.go:293] "Volume detached for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token\") on node \"crc\" DevicePath \"\"" Mar 11 00:56:12 crc kubenswrapper[4744]: I0311 00:56:12.214643 4744 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config\") on node \"crc\" DevicePath \"\"" Mar 11 00:56:12 crc kubenswrapper[4744]: I0311 00:56:12.214658 4744 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access\") on node \"crc\" DevicePath \"\"" Mar 11 00:56:12 crc kubenswrapper[4744]: I0311 00:56:12.214907 4744 reconciler_common.go:293] "Volume detached for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert\") on node \"crc\" DevicePath \"\"" Mar 11 00:56:12 crc kubenswrapper[4744]: I0311 00:56:12.214925 4744 reconciler_common.go:293] "Volume detached for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs\") on node \"crc\" DevicePath \"\"" Mar 11 00:56:12 crc kubenswrapper[4744]: I0311 00:56:12.214939 4744 reconciler_common.go:293] "Volume detached for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy\") on node \"crc\" DevicePath \"\"" Mar 11 00:56:12 crc kubenswrapper[4744]: I0311 00:56:12.214952 4744 reconciler_common.go:293] "Volume detached for volume \"images\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images\") on node \"crc\" DevicePath \"\"" Mar 11 00:56:12 crc kubenswrapper[4744]: I0311 00:56:12.214966 4744 reconciler_common.go:293] "Volume detached for 
volume \"kube-api-access-279lb\" (UniqueName: \"kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb\") on node \"crc\" DevicePath \"\"" Mar 11 00:56:12 crc kubenswrapper[4744]: I0311 00:56:12.214979 4744 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config\") on node \"crc\" DevicePath \"\"" Mar 11 00:56:12 crc kubenswrapper[4744]: I0311 00:56:12.214992 4744 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config\") on node \"crc\" DevicePath \"\"" Mar 11 00:56:12 crc kubenswrapper[4744]: I0311 00:56:12.215006 4744 reconciler_common.go:293] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates\") on node \"crc\" DevicePath \"\"" Mar 11 00:56:12 crc kubenswrapper[4744]: I0311 00:56:12.215021 4744 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config\") on node \"crc\" DevicePath \"\"" Mar 11 00:56:12 crc kubenswrapper[4744]: I0311 00:56:12.215034 4744 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config\") on node \"crc\" DevicePath \"\"" Mar 11 00:56:12 crc kubenswrapper[4744]: I0311 00:56:12.215046 4744 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 11 00:56:12 crc kubenswrapper[4744]: I0311 00:56:12.215059 4744 reconciler_common.go:293] "Volume detached for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key\") on node \"crc\" DevicePath \"\"" Mar 11 
00:56:12 crc kubenswrapper[4744]: I0311 00:56:12.215073 4744 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 11 00:56:12 crc kubenswrapper[4744]: I0311 00:56:12.217639 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:11Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 11 00:56:12 crc kubenswrapper[4744]: I0311 00:56:12.220700 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-th54b\" (UniqueName: \"kubernetes.io/projected/7aeb1578-fe93-4bec-8f43-17d0923fa5c0-kube-api-access-th54b\") pod \"network-metrics-daemon-tdnf7\" (UID: \"7aeb1578-fe93-4bec-8f43-17d0923fa5c0\") " pod="openshift-multus/network-metrics-daemon-tdnf7" Mar 11 00:56:12 crc kubenswrapper[4744]: E0311 00:56:12.220158 4744 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status 
\"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-11T00:56:12Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:12Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-11T00:56:12Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:12Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-11T00:56:12Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:12Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-11T00:56:12Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:12Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"51e320bc-e184-46b0-b151-baf1fef55472\\\",\\\"systemUUID\\\":\\\"b27597af-c36d-4084-a073-6dfdbb017181\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 11 00:56:12 crc kubenswrapper[4744]: I0311 00:56:12.221342 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/92f4c9df-1087-4820-8e07-1120f02df454-env-overrides\") pod \"ovnkube-control-plane-749d76644c-rplbw\" (UID: \"92f4c9df-1087-4820-8e07-1120f02df454\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-rplbw" Mar 11 00:56:12 crc kubenswrapper[4744]: I0311 00:56:12.223225 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fqrjx\" (UniqueName: \"kubernetes.io/projected/ac70f681-9bcf-4f8c-a175-ed7d4e9da471-kube-api-access-fqrjx\") pod \"multus-additional-cni-plugins-sj4cl\" (UID: \"ac70f681-9bcf-4f8c-a175-ed7d4e9da471\") " pod="openshift-multus/multus-additional-cni-plugins-sj4cl" Mar 11 00:56:12 crc kubenswrapper[4744]: I0311 00:56:12.223752 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6ff04e11-e747-44c5-b049-371a5d422157-ovn-node-metrics-cert\") pod \"ovnkube-node-78fcc\" (UID: \"6ff04e11-e747-44c5-b049-371a5d422157\") " pod="openshift-ovn-kubernetes/ovnkube-node-78fcc" Mar 11 00:56:12 crc kubenswrapper[4744]: I0311 00:56:12.225322 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 00:56:12 crc kubenswrapper[4744]: I0311 00:56:12.225377 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 00:56:12 crc kubenswrapper[4744]: I0311 00:56:12.225391 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 00:56:12 crc 
kubenswrapper[4744]: I0311 00:56:12.225410 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 00:56:12 crc kubenswrapper[4744]: I0311 00:56:12.225441 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T00:56:12Z","lastTransitionTime":"2026-03-11T00:56:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 11 00:56:12 crc kubenswrapper[4744]: I0311 00:56:12.226962 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-rplbw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"92f4c9df-1087-4820-8e07-1120f02df454\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:11Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:11Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy 
ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lkpvs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lkpvs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T00:56:11Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-rplbw\": Internal error occurred: failed calling 
webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 11 00:56:12 crc kubenswrapper[4744]: I0311 00:56:12.227898 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/92f4c9df-1087-4820-8e07-1120f02df454-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-rplbw\" (UID: \"92f4c9df-1087-4820-8e07-1120f02df454\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-rplbw" Mar 11 00:56:12 crc kubenswrapper[4744]: I0311 00:56:12.228101 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lkpvs\" (UniqueName: \"kubernetes.io/projected/92f4c9df-1087-4820-8e07-1120f02df454-kube-api-access-lkpvs\") pod \"ovnkube-control-plane-749d76644c-rplbw\" (UID: \"92f4c9df-1087-4820-8e07-1120f02df454\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-rplbw" Mar 11 00:56:12 crc kubenswrapper[4744]: I0311 00:56:12.228172 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8h2vm\" (UniqueName: \"kubernetes.io/projected/e146399d-0685-4ea8-96e4-c76c9478a23a-kube-api-access-8h2vm\") pod \"node-resolver-6ghqv\" (UID: \"e146399d-0685-4ea8-96e4-c76c9478a23a\") " pod="openshift-dns/node-resolver-6ghqv" Mar 11 00:56:12 crc kubenswrapper[4744]: I0311 00:56:12.231214 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d9t5q\" (UniqueName: \"kubernetes.io/projected/a15dc7ac-7c34-4135-b6eb-a85122800ce9-kube-api-access-d9t5q\") pod \"machine-config-daemon-678nx\" (UID: \"a15dc7ac-7c34-4135-b6eb-a85122800ce9\") " pod="openshift-machine-config-operator/machine-config-daemon-678nx" Mar 11 00:56:12 crc kubenswrapper[4744]: I0311 00:56:12.231327 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-fr9zj\" (UniqueName: \"kubernetes.io/projected/6ff04e11-e747-44c5-b049-371a5d422157-kube-api-access-fr9zj\") pod \"ovnkube-node-78fcc\" (UID: \"6ff04e11-e747-44c5-b049-371a5d422157\") " pod="openshift-ovn-kubernetes/ovnkube-node-78fcc" Mar 11 00:56:12 crc kubenswrapper[4744]: I0311 00:56:12.231481 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7gczk\" (UniqueName: \"kubernetes.io/projected/ad8329c6-d511-446f-b617-99778d12b878-kube-api-access-7gczk\") pod \"node-ca-8z8gf\" (UID: \"ad8329c6-d511-446f-b617-99778d12b878\") " pod="openshift-image-registry/node-ca-8z8gf" Mar 11 00:56:12 crc kubenswrapper[4744]: I0311 00:56:12.233202 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pgqmk\" (UniqueName: \"kubernetes.io/projected/e16bf0f3-533b-4114-89c6-195a85273e98-kube-api-access-pgqmk\") pod \"multus-xlclh\" (UID: \"e16bf0f3-533b-4114-89c6-195a85273e98\") " pod="openshift-multus/multus-xlclh" Mar 11 00:56:12 crc kubenswrapper[4744]: E0311 00:56:12.236185 4744 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-11T00:56:12Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:12Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-11T00:56:12Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:12Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-11T00:56:12Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:12Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-11T00:56:12Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:12Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"51e320bc-e184-46b0-b151-baf1fef55472\\\",\\\"systemUUID\\\":\\\"b27597af-c36d-4084-a073-6dfdbb017181\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 11 00:56:12 crc kubenswrapper[4744]: I0311 00:56:12.239410 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 00:56:12 crc kubenswrapper[4744]: I0311 00:56:12.239442 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 00:56:12 crc kubenswrapper[4744]: I0311 00:56:12.239452 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 00:56:12 crc kubenswrapper[4744]: I0311 00:56:12.239467 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 00:56:12 crc kubenswrapper[4744]: I0311 00:56:12.239477 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T00:56:12Z","lastTransitionTime":"2026-03-11T00:56:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 11 00:56:12 crc kubenswrapper[4744]: E0311 00:56:12.249045 4744 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-11T00:56:12Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:12Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-11T00:56:12Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:12Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-11T00:56:12Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:12Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-11T00:56:12Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:12Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"51e320bc-e184-46b0-b151-baf1fef55472\\\",\\\"systemUUID\\\":\\\"b27597af-c36d-4084-a073-6dfdbb017181\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 11 00:56:12 crc kubenswrapper[4744]: I0311 00:56:12.252740 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 00:56:12 crc kubenswrapper[4744]: I0311 00:56:12.252894 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 00:56:12 crc kubenswrapper[4744]: I0311 00:56:12.253016 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 00:56:12 crc kubenswrapper[4744]: I0311 00:56:12.253140 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 00:56:12 crc kubenswrapper[4744]: I0311 00:56:12.253304 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T00:56:12Z","lastTransitionTime":"2026-03-11T00:56:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 11 00:56:12 crc kubenswrapper[4744]: E0311 00:56:12.263290 4744 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-11T00:56:12Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:12Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-11T00:56:12Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:12Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-11T00:56:12Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:12Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-11T00:56:12Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:12Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"51e320bc-e184-46b0-b151-baf1fef55472\\\",\\\"systemUUID\\\":\\\"b27597af-c36d-4084-a073-6dfdbb017181\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 11 00:56:12 crc kubenswrapper[4744]: I0311 00:56:12.263935 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 11 00:56:12 crc kubenswrapper[4744]: I0311 00:56:12.267674 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 00:56:12 crc kubenswrapper[4744]: I0311 00:56:12.267877 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 00:56:12 crc kubenswrapper[4744]: I0311 00:56:12.267973 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 00:56:12 crc kubenswrapper[4744]: I0311 00:56:12.268076 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 00:56:12 crc kubenswrapper[4744]: I0311 00:56:12.268171 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T00:56:12Z","lastTransitionTime":"2026-03-11T00:56:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 11 00:56:12 crc kubenswrapper[4744]: I0311 00:56:12.282543 4744 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 11 00:56:12 crc kubenswrapper[4744]: E0311 00:56:12.282734 4744 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-11T00:56:12Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:12Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-11T00:56:12Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:12Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-11T00:56:12Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:12Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-11T00:56:12Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:12Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"51e320bc-e184-46b0-b151-baf1fef55472\\\",\\\"systemUUID\\\":\\\"b27597af-c36d-4084-a073-6dfdbb017181\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 11 00:56:12 crc kubenswrapper[4744]: E0311 00:56:12.282905 4744 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 11 00:56:12 crc kubenswrapper[4744]: W0311 00:56:12.298831 4744 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podef543e1b_8068_4ea3_b32a_61027b32e95d.slice/crio-c451dcb36d3b7ca011c072eb7e3fa51584b7ca621ac0743331864098a4bb922c WatchSource:0}: Error finding container c451dcb36d3b7ca011c072eb7e3fa51584b7ca621ac0743331864098a4bb922c: Status 404 returned error can't find the container with id c451dcb36d3b7ca011c072eb7e3fa51584b7ca621ac0743331864098a4bb922c Mar 11 00:56:12 crc kubenswrapper[4744]: I0311 00:56:12.305854 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 11 00:56:12 crc kubenswrapper[4744]: I0311 00:56:12.316101 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-8z8gf" Mar 11 00:56:12 crc kubenswrapper[4744]: I0311 00:56:12.332892 4744 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-rplbw" Mar 11 00:56:12 crc kubenswrapper[4744]: W0311 00:56:12.340767 4744 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd75a4c96_2883_4a0b_bab2_0fab2b6c0b49.slice/crio-58ce3507c2ed4bc2d163ae390e6ed52154ab750f1aa0a23bd9383e6ccb0e7957 WatchSource:0}: Error finding container 58ce3507c2ed4bc2d163ae390e6ed52154ab750f1aa0a23bd9383e6ccb0e7957: Status 404 returned error can't find the container with id 58ce3507c2ed4bc2d163ae390e6ed52154ab750f1aa0a23bd9383e6ccb0e7957 Mar 11 00:56:12 crc kubenswrapper[4744]: I0311 00:56:12.350980 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-xlclh" Mar 11 00:56:12 crc kubenswrapper[4744]: W0311 00:56:12.355070 4744 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podad8329c6_d511_446f_b617_99778d12b878.slice/crio-67e64c4d338ae3b42d71901dae8112f1417c969f1716e22ff07556cd67179fa9 WatchSource:0}: Error finding container 67e64c4d338ae3b42d71901dae8112f1417c969f1716e22ff07556cd67179fa9: Status 404 returned error can't find the container with id 67e64c4d338ae3b42d71901dae8112f1417c969f1716e22ff07556cd67179fa9 Mar 11 00:56:12 crc kubenswrapper[4744]: I0311 00:56:12.363506 4744 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/node-resolver-6ghqv" Mar 11 00:56:12 crc kubenswrapper[4744]: W0311 00:56:12.368381 4744 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod92f4c9df_1087_4820_8e07_1120f02df454.slice/crio-c11ba8a873985aac059204f12aa24f0cc4d28dfb1feba5f31b1368053453ce66 WatchSource:0}: Error finding container c11ba8a873985aac059204f12aa24f0cc4d28dfb1feba5f31b1368053453ce66: Status 404 returned error can't find the container with id c11ba8a873985aac059204f12aa24f0cc4d28dfb1feba5f31b1368053453ce66 Mar 11 00:56:12 crc kubenswrapper[4744]: I0311 00:56:12.387561 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-sj4cl" Mar 11 00:56:12 crc kubenswrapper[4744]: W0311 00:56:12.398908 4744 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode16bf0f3_533b_4114_89c6_195a85273e98.slice/crio-700a12a9423d65e01a9e98e0cb4e159c02a82cd8bd935b12e8af9dbfa1dcf455 WatchSource:0}: Error finding container 700a12a9423d65e01a9e98e0cb4e159c02a82cd8bd935b12e8af9dbfa1dcf455: Status 404 returned error can't find the container with id 700a12a9423d65e01a9e98e0cb4e159c02a82cd8bd935b12e8af9dbfa1dcf455 Mar 11 00:56:12 crc kubenswrapper[4744]: I0311 00:56:12.408170 4744 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-daemon-678nx" Mar 11 00:56:12 crc kubenswrapper[4744]: W0311 00:56:12.409623 4744 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode146399d_0685_4ea8_96e4_c76c9478a23a.slice/crio-9c66091fe2cd5c2ee60e8437c4be10cad7ba12fe0a028c431ceec68150ae7438 WatchSource:0}: Error finding container 9c66091fe2cd5c2ee60e8437c4be10cad7ba12fe0a028c431ceec68150ae7438: Status 404 returned error can't find the container with id 9c66091fe2cd5c2ee60e8437c4be10cad7ba12fe0a028c431ceec68150ae7438 Mar 11 00:56:12 crc kubenswrapper[4744]: W0311 00:56:12.426924 4744 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podac70f681_9bcf_4f8c_a175_ed7d4e9da471.slice/crio-3eb548f5798d345f227c96a2ea1e198fc0a36e888f72448fd625a2bf7f538d8a WatchSource:0}: Error finding container 3eb548f5798d345f227c96a2ea1e198fc0a36e888f72448fd625a2bf7f538d8a: Status 404 returned error can't find the container with id 3eb548f5798d345f227c96a2ea1e198fc0a36e888f72448fd625a2bf7f538d8a Mar 11 00:56:12 crc kubenswrapper[4744]: I0311 00:56:12.429709 4744 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-78fcc" Mar 11 00:56:12 crc kubenswrapper[4744]: W0311 00:56:12.444247 4744 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda15dc7ac_7c34_4135_b6eb_a85122800ce9.slice/crio-39a01df1760f04566efde2993a37dde660d1a5ea422c3ebfde5f323d0944c527 WatchSource:0}: Error finding container 39a01df1760f04566efde2993a37dde660d1a5ea422c3ebfde5f323d0944c527: Status 404 returned error can't find the container with id 39a01df1760f04566efde2993a37dde660d1a5ea422c3ebfde5f323d0944c527 Mar 11 00:56:12 crc kubenswrapper[4744]: W0311 00:56:12.473235 4744 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6ff04e11_e747_44c5_b049_371a5d422157.slice/crio-149bb7f8bc16884920b6e48e26185cd3ea4656a535d32aea578edd5e1ba9b507 WatchSource:0}: Error finding container 149bb7f8bc16884920b6e48e26185cd3ea4656a535d32aea578edd5e1ba9b507: Status 404 returned error can't find the container with id 149bb7f8bc16884920b6e48e26185cd3ea4656a535d32aea578edd5e1ba9b507 Mar 11 00:56:12 crc kubenswrapper[4744]: I0311 00:56:12.573833 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-6ghqv" event={"ID":"e146399d-0685-4ea8-96e4-c76c9478a23a","Type":"ContainerStarted","Data":"9c66091fe2cd5c2ee60e8437c4be10cad7ba12fe0a028c431ceec68150ae7438"} Mar 11 00:56:12 crc kubenswrapper[4744]: I0311 00:56:12.575451 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-rplbw" event={"ID":"92f4c9df-1087-4820-8e07-1120f02df454","Type":"ContainerStarted","Data":"c11ba8a873985aac059204f12aa24f0cc4d28dfb1feba5f31b1368053453ce66"} Mar 11 00:56:12 crc kubenswrapper[4744]: I0311 00:56:12.576502 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-8z8gf" 
event={"ID":"ad8329c6-d511-446f-b617-99778d12b878","Type":"ContainerStarted","Data":"67e64c4d338ae3b42d71901dae8112f1417c969f1716e22ff07556cd67179fa9"} Mar 11 00:56:12 crc kubenswrapper[4744]: I0311 00:56:12.579747 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" event={"ID":"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49","Type":"ContainerStarted","Data":"58ce3507c2ed4bc2d163ae390e6ed52154ab750f1aa0a23bd9383e6ccb0e7957"} Mar 11 00:56:12 crc kubenswrapper[4744]: I0311 00:56:12.584575 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-678nx" event={"ID":"a15dc7ac-7c34-4135-b6eb-a85122800ce9","Type":"ContainerStarted","Data":"39a01df1760f04566efde2993a37dde660d1a5ea422c3ebfde5f323d0944c527"} Mar 11 00:56:12 crc kubenswrapper[4744]: I0311 00:56:12.590061 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-78fcc" event={"ID":"6ff04e11-e747-44c5-b049-371a5d422157","Type":"ContainerStarted","Data":"149bb7f8bc16884920b6e48e26185cd3ea4656a535d32aea578edd5e1ba9b507"} Mar 11 00:56:12 crc kubenswrapper[4744]: I0311 00:56:12.594903 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-xlclh" event={"ID":"e16bf0f3-533b-4114-89c6-195a85273e98","Type":"ContainerStarted","Data":"700a12a9423d65e01a9e98e0cb4e159c02a82cd8bd935b12e8af9dbfa1dcf455"} Mar 11 00:56:12 crc kubenswrapper[4744]: I0311 00:56:12.597805 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" event={"ID":"37a5e44f-9a88-4405-be8a-b645485e7312","Type":"ContainerStarted","Data":"70781f61d16d3300118eac0bbe8c1d6774c8773f721d5c5e4e83eb7267319bf2"} Mar 11 00:56:12 crc kubenswrapper[4744]: I0311 00:56:12.601361 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" 
event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"7c4d16fc0af970b13be0d202345567993d215bf342174b1b568c4b4ac26c0017"} Mar 11 00:56:12 crc kubenswrapper[4744]: I0311 00:56:12.601397 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"c451dcb36d3b7ca011c072eb7e3fa51584b7ca621ac0743331864098a4bb922c"} Mar 11 00:56:12 crc kubenswrapper[4744]: I0311 00:56:12.602563 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-sj4cl" event={"ID":"ac70f681-9bcf-4f8c-a175-ed7d4e9da471","Type":"ContainerStarted","Data":"3eb548f5798d345f227c96a2ea1e198fc0a36e888f72448fd625a2bf7f538d8a"} Mar 11 00:56:12 crc kubenswrapper[4744]: I0311 00:56:12.602762 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:11Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 11 00:56:12 crc kubenswrapper[4744]: I0311 00:56:12.611079 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-rplbw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"92f4c9df-1087-4820-8e07-1120f02df454\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:11Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:11Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy 
ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lkpvs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lkpvs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T00:56:11Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-rplbw\": Internal error occurred: failed calling 
webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 11 00:56:12 crc kubenswrapper[4744]: I0311 00:56:12.618639 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 11 00:56:12 crc kubenswrapper[4744]: E0311 00:56:12.618768 4744 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-11 00:56:13.618747328 +0000 UTC m=+130.422964933 (durationBeforeRetry 1s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 11 00:56:12 crc kubenswrapper[4744]: I0311 00:56:12.618799 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 11 00:56:12 crc kubenswrapper[4744]: I0311 00:56:12.618855 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 11 00:56:12 crc kubenswrapper[4744]: I0311 00:56:12.619096 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:11Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: 
connect: connection refused" Mar 11 00:56:12 crc kubenswrapper[4744]: E0311 00:56:12.619164 4744 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 11 00:56:12 crc kubenswrapper[4744]: E0311 00:56:12.619190 4744 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 11 00:56:12 crc kubenswrapper[4744]: E0311 00:56:12.619206 4744 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 11 00:56:12 crc kubenswrapper[4744]: E0311 00:56:12.619048 4744 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Mar 11 00:56:12 crc kubenswrapper[4744]: I0311 00:56:12.619249 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 11 00:56:12 crc kubenswrapper[4744]: E0311 00:56:12.619281 4744 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-03-11 00:56:13.619259213 +0000 UTC m=+130.423476818 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 11 00:56:12 crc kubenswrapper[4744]: E0311 00:56:12.619321 4744 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-11 00:56:13.619294724 +0000 UTC m=+130.423512319 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Mar 11 00:56:12 crc kubenswrapper[4744]: E0311 00:56:12.619328 4744 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 11 00:56:12 crc kubenswrapper[4744]: E0311 00:56:12.619357 4744 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-11 00:56:13.619349286 +0000 UTC m=+130.423566891 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 11 00:56:12 crc kubenswrapper[4744]: I0311 00:56:12.625830 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-tdnf7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7aeb1578-fe93-4bec-8f43-17d0923fa5c0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:11Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:11Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-th54b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-th54b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T00:56:11Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-tdnf7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 11 00:56:12 crc kubenswrapper[4744]: I0311 00:56:12.634007 4744 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-machine-config-operator/machine-config-daemon-678nx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a15dc7ac-7c34-4135-b6eb-a85122800ce9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:11Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:11Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d9t5q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d9t5q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T00:56:11Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-678nx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 11 00:56:12 crc kubenswrapper[4744]: I0311 00:56:12.648889 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1a87b2ce-541f-40ba-a568-93914e7dea62\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:54:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:54:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:55:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:55:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:54:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://19989b1018d06da0230a62c4c231865e9d16267d0eb570a409da006b246005b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://832be2ac0e5b4b69727f022a0663974a495e008d3a781fa4d237d3347d39c154\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-11T00:54:37Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 
')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager -v=2\\\\nI0311 00:54:06.849622 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. Worst graceful lease acquisition is {26s}.\\\\nI0311 00:54:06.852083 1 observer_polling.go:159] Starting file observer\\\\nI0311 00:54:06.910593 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0311 00:54:06.914052 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nF0311 00:54:37.381234 1 cmd.go:179] failed checking apiserver connectivity: Get \\\\\\\"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/openshift-kube-controller-manager/leases/cluster-policy-controller-lock\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T00:54:36Z is after 
2026-02-23T05:33:13Z\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-11T00:54:05Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T00:54:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://eb1e3ef7609aa323e622249594eb4ac5cd5cacfa496d8189b3f612e29cb11773\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T00:54:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://82831ac0a065cf2ea80ca05df203fc6de89ad279faae95364ab2519b689453d8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T00:54:05Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ca4d36508e362c9b7f7365ade7dda3dec0c70f9df1be158a034aeaaa772c797e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T00:54:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T00:54:04Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 11 00:56:12 crc kubenswrapper[4744]: I0311 00:56:12.661997 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:11Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 11 00:56:12 crc kubenswrapper[4744]: I0311 00:56:12.675268 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:11Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 11 00:56:12 crc kubenswrapper[4744]: I0311 00:56:12.680865 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-8z8gf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ad8329c6-d511-446f-b617-99778d12b878\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:11Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:11Z\\\",\\\"message\\\":\\\"containers with unready status: 
[node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7gczk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T00:56:11Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-8z8gf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 11 00:56:12 crc kubenswrapper[4744]: I0311 00:56:12.693336 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-sj4cl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ac70f681-9bcf-4f8c-a175-ed7d4e9da471\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:11Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:11Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:11Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqrjx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqrjx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\"
:false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqrjx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqrjx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\
\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqrjx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqrjx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/s
erviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqrjx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T00:56:11Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-sj4cl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 11 00:56:12 crc kubenswrapper[4744]: I0311 00:56:12.715152 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-78fcc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6ff04e11-e747-44c5-b049-371a5d422157\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:11Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:11Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:11Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fr9zj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/v
ar/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fr9zj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fr9zj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fr9zj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},
{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fr9zj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fr9zj\\\",\\\"readOnly\\\":true,\\\
"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mount
Path\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fr9zj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fr9zj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://524c79ee3bb53b187166cbc6bf19cbb3311ab561608a1d3c819288c1fb684481\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\
\\"startedAt\\\":\\\"2026-03-11T00:56:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fr9zj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T00:56:11Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-78fcc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 11 00:56:12 crc kubenswrapper[4744]: I0311 00:56:12.719924 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/7aeb1578-fe93-4bec-8f43-17d0923fa5c0-metrics-certs\") pod \"network-metrics-daemon-tdnf7\" (UID: \"7aeb1578-fe93-4bec-8f43-17d0923fa5c0\") " pod="openshift-multus/network-metrics-daemon-tdnf7" Mar 11 00:56:12 crc kubenswrapper[4744]: I0311 00:56:12.720004 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 11 00:56:12 crc kubenswrapper[4744]: E0311 00:56:12.720136 4744 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 11 00:56:12 crc kubenswrapper[4744]: E0311 00:56:12.720157 4744 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object 
"openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 11 00:56:12 crc kubenswrapper[4744]: E0311 00:56:12.720170 4744 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 11 00:56:12 crc kubenswrapper[4744]: E0311 00:56:12.720212 4744 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-03-11 00:56:13.720197231 +0000 UTC m=+130.524414836 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 11 00:56:12 crc kubenswrapper[4744]: E0311 00:56:12.720419 4744 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Mar 11 00:56:12 crc kubenswrapper[4744]: E0311 00:56:12.720590 4744 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7aeb1578-fe93-4bec-8f43-17d0923fa5c0-metrics-certs podName:7aeb1578-fe93-4bec-8f43-17d0923fa5c0 nodeName:}" failed. No retries permitted until 2026-03-11 00:56:13.720562432 +0000 UTC m=+130.524780037 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/7aeb1578-fe93-4bec-8f43-17d0923fa5c0-metrics-certs") pod "network-metrics-daemon-tdnf7" (UID: "7aeb1578-fe93-4bec-8f43-17d0923fa5c0") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 11 00:56:12 crc kubenswrapper[4744]: I0311 00:56:12.732946 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:11Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 11 00:56:12 crc kubenswrapper[4744]: I0311 00:56:12.747441 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:11Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 11 00:56:12 crc kubenswrapper[4744]: I0311 00:56:12.758022 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-xlclh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e16bf0f3-533b-4114-89c6-195a85273e98\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:11Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:11Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/
kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pgqmk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T00:56:11Z\\\"}}\" for pod \"openshift-multus\"/\"multus-xlclh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 11 00:56:12 crc kubenswrapper[4744]: I0311 00:56:12.767740 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-6ghqv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e146399d-0685-4ea8-96e4-c76c9478a23a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:11Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:11Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8h2vm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T00:56:11Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-6ghqv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 11 00:56:12 crc kubenswrapper[4744]: I0311 00:56:12.777658 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:12Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:12Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://80f7b8faf68f817bd772c7988a46ffcc286551806463dc59776e5c6e18deb8f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T00:56:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: 
connection refused" Mar 11 00:56:12 crc kubenswrapper[4744]: I0311 00:56:12.787999 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:11Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 11 00:56:12 crc kubenswrapper[4744]: I0311 00:56:12.797229 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-xlclh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e16bf0f3-533b-4114-89c6-195a85273e98\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:11Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:11Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"
recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pgqmk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T00:56:11Z\\\"}}\" for pod \"openshift-multus\"/\"multus-xlclh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 11 00:56:12 crc kubenswrapper[4744]: I0311 00:56:12.804316 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-6ghqv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e146399d-0685-4ea8-96e4-c76c9478a23a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:11Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:11Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8h2vm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T00:56:11Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-6ghqv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 11 00:56:12 crc kubenswrapper[4744]: I0311 00:56:12.816909 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-sj4cl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ac70f681-9bcf-4f8c-a175-ed7d4e9da471\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:11Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:11Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:11Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqrjx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqrjx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\"
:false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqrjx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqrjx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\
\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqrjx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqrjx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/s
erviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqrjx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T00:56:11Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-sj4cl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 11 00:56:12 crc kubenswrapper[4744]: I0311 00:56:12.832721 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-78fcc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6ff04e11-e747-44c5-b049-371a5d422157\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:11Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:11Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:11Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fr9zj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/v
ar/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fr9zj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fr9zj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fr9zj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},
{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fr9zj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fr9zj\\\",\\\"readOnly\\\":true,\\\
"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mount
Path\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fr9zj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fr9zj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://524c79ee3bb53b187166cbc6bf19cbb3311ab561608a1d3c819288c1fb684481\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\
\\"startedAt\\\":\\\"2026-03-11T00:56:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fr9zj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T00:56:11Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-78fcc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 11 00:56:12 crc kubenswrapper[4744]: I0311 00:56:12.843197 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:11Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 11 00:56:12 crc kubenswrapper[4744]: I0311 00:56:12.855381 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-rplbw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"92f4c9df-1087-4820-8e07-1120f02df454\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:11Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:11Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy 
ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lkpvs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lkpvs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T00:56:11Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-rplbw\": Internal error occurred: failed calling 
webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 11 00:56:12 crc kubenswrapper[4744]: I0311 00:56:12.866165 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-tdnf7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7aeb1578-fe93-4bec-8f43-17d0923fa5c0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:11Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:11Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-th54b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-th54b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T00:56:11Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-tdnf7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 11 00:56:12 crc kubenswrapper[4744]: I0311 00:56:12.877595 4744 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:11Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: 
connect: connection refused" Mar 11 00:56:12 crc kubenswrapper[4744]: I0311 00:56:12.887503 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:11Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 11 00:56:12 crc kubenswrapper[4744]: I0311 00:56:12.900818 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:11Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 11 00:56:12 crc kubenswrapper[4744]: I0311 00:56:12.911370 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-8z8gf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ad8329c6-d511-446f-b617-99778d12b878\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:11Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:11Z\\\",\\\"message\\\":\\\"containers with unready status: 
[node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7gczk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T00:56:11Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-8z8gf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 11 00:56:12 crc kubenswrapper[4744]: I0311 00:56:12.923257 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-678nx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a15dc7ac-7c34-4135-b6eb-a85122800ce9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:11Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:11Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d9t5q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d9t5q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T00:56:11Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-678nx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 11 00:56:12 crc kubenswrapper[4744]: I0311 00:56:12.937241 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1a87b2ce-541f-40ba-a568-93914e7dea62\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:54:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:54:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:55:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:55:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:54:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://19989b1018d06da0230a62c4c231865e9d16267d0eb570a409da006b246005b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://832be2ac0e5b4b69727f022a0663974a495e008d3a781fa4d237d3347d39c154\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-11T00:54:37Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 
')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager -v=2\\\\nI0311 00:54:06.849622 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. Worst graceful lease acquisition is {26s}.\\\\nI0311 00:54:06.852083 1 observer_polling.go:159] Starting file observer\\\\nI0311 00:54:06.910593 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0311 00:54:06.914052 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nF0311 00:54:37.381234 1 cmd.go:179] failed checking apiserver connectivity: Get \\\\\\\"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/openshift-kube-controller-manager/leases/cluster-policy-controller-lock\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T00:54:36Z is after 
2026-02-23T05:33:13Z\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-11T00:54:05Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T00:54:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://eb1e3ef7609aa323e622249594eb4ac5cd5cacfa496d8189b3f612e29cb11773\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T00:54:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://82831ac0a065cf2ea80ca05df203fc6de89ad279faae95364ab2519b689453d8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T00:54:05Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ca4d36508e362c9b7f7365ade7dda3dec0c70f9df1be158a034aeaaa772c797e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T00:54:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T00:54:04Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 11 00:56:12 crc kubenswrapper[4744]: I0311 00:56:12.973748 4744 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-tdnf7" Mar 11 00:56:12 crc kubenswrapper[4744]: E0311 00:56:12.973972 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-tdnf7" podUID="7aeb1578-fe93-4bec-8f43-17d0923fa5c0" Mar 11 00:56:13 crc kubenswrapper[4744]: I0311 00:56:13.607977 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-rplbw" event={"ID":"92f4c9df-1087-4820-8e07-1120f02df454","Type":"ContainerStarted","Data":"2bfa60a94390dd23da621dd81523dde8107fe6f3beba45ad1b7a96c0d6cd4f56"} Mar 11 00:56:13 crc kubenswrapper[4744]: I0311 00:56:13.608040 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-rplbw" event={"ID":"92f4c9df-1087-4820-8e07-1120f02df454","Type":"ContainerStarted","Data":"34eabb0e2af63857d40e20376f7979131fef683885949c1ea75e0a4949968872"} Mar 11 00:56:13 crc kubenswrapper[4744]: I0311 00:56:13.611678 4744 generic.go:334] "Generic (PLEG): container finished" podID="6ff04e11-e747-44c5-b049-371a5d422157" containerID="524c79ee3bb53b187166cbc6bf19cbb3311ab561608a1d3c819288c1fb684481" exitCode=0 Mar 11 00:56:13 crc kubenswrapper[4744]: I0311 00:56:13.611731 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-78fcc" event={"ID":"6ff04e11-e747-44c5-b049-371a5d422157","Type":"ContainerDied","Data":"524c79ee3bb53b187166cbc6bf19cbb3311ab561608a1d3c819288c1fb684481"} Mar 11 00:56:13 crc kubenswrapper[4744]: I0311 00:56:13.613589 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-xlclh" 
event={"ID":"e16bf0f3-533b-4114-89c6-195a85273e98","Type":"ContainerStarted","Data":"3cf2ef2701f6c6e8715d3c41c3d7c4d704d6f03989831a0c3d34e92ab7a6dd50"} Mar 11 00:56:13 crc kubenswrapper[4744]: I0311 00:56:13.615454 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" event={"ID":"37a5e44f-9a88-4405-be8a-b645485e7312","Type":"ContainerStarted","Data":"80f7b8faf68f817bd772c7988a46ffcc286551806463dc59776e5c6e18deb8f6"} Mar 11 00:56:13 crc kubenswrapper[4744]: I0311 00:56:13.617606 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-678nx" event={"ID":"a15dc7ac-7c34-4135-b6eb-a85122800ce9","Type":"ContainerStarted","Data":"d3d16f6053ae793fbf194d5cbb556a8603bae74b52c9f07aa937dbfd654e3894"} Mar 11 00:56:13 crc kubenswrapper[4744]: I0311 00:56:13.617663 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-678nx" event={"ID":"a15dc7ac-7c34-4135-b6eb-a85122800ce9","Type":"ContainerStarted","Data":"9d5c92e7037925f24a5b88f538d4822fae94b65ea5f07960f3e465148975af4e"} Mar 11 00:56:13 crc kubenswrapper[4744]: I0311 00:56:13.619744 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"064a29fa0ed4ecddcf11aa3175a3fc935147341c372af46f94eee470a9ba363c"} Mar 11 00:56:13 crc kubenswrapper[4744]: I0311 00:56:13.621634 4744 generic.go:334] "Generic (PLEG): container finished" podID="ac70f681-9bcf-4f8c-a175-ed7d4e9da471" containerID="09583de802dbeffc06642379e30e628cf6541f82d5a11caa554cdf17e408476b" exitCode=0 Mar 11 00:56:13 crc kubenswrapper[4744]: I0311 00:56:13.621687 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-sj4cl" 
event={"ID":"ac70f681-9bcf-4f8c-a175-ed7d4e9da471","Type":"ContainerDied","Data":"09583de802dbeffc06642379e30e628cf6541f82d5a11caa554cdf17e408476b"} Mar 11 00:56:13 crc kubenswrapper[4744]: I0311 00:56:13.623285 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-6ghqv" event={"ID":"e146399d-0685-4ea8-96e4-c76c9478a23a","Type":"ContainerStarted","Data":"9fd8ff476d73855cbb3da64ae999f54bad2263813a70dba1aa61507b53102d42"} Mar 11 00:56:13 crc kubenswrapper[4744]: I0311 00:56:13.624653 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-8z8gf" event={"ID":"ad8329c6-d511-446f-b617-99778d12b878","Type":"ContainerStarted","Data":"27268ce5ab229a693eeb80e4dfc38c14f18ad32dc201ace682b5516070c52fde"} Mar 11 00:56:13 crc kubenswrapper[4744]: I0311 00:56:13.629506 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 11 00:56:13 crc kubenswrapper[4744]: E0311 00:56:13.629639 4744 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-11 00:56:15.629613804 +0000 UTC m=+132.433831409 (durationBeforeRetry 2s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 11 00:56:13 crc kubenswrapper[4744]: I0311 00:56:13.629710 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 11 00:56:13 crc kubenswrapper[4744]: I0311 00:56:13.629749 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 11 00:56:13 crc kubenswrapper[4744]: I0311 00:56:13.629793 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 11 00:56:13 crc kubenswrapper[4744]: E0311 00:56:13.629898 4744 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Mar 
11 00:56:13 crc kubenswrapper[4744]: E0311 00:56:13.629940 4744 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 11 00:56:13 crc kubenswrapper[4744]: E0311 00:56:13.629949 4744 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-11 00:56:15.629940324 +0000 UTC m=+132.434157929 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Mar 11 00:56:13 crc kubenswrapper[4744]: E0311 00:56:13.630000 4744 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 11 00:56:13 crc kubenswrapper[4744]: E0311 00:56:13.630031 4744 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-11 00:56:15.630008816 +0000 UTC m=+132.434226431 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 11 00:56:13 crc kubenswrapper[4744]: E0311 00:56:13.630041 4744 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 11 00:56:13 crc kubenswrapper[4744]: E0311 00:56:13.630059 4744 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 11 00:56:13 crc kubenswrapper[4744]: E0311 00:56:13.630139 4744 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-03-11 00:56:15.630115589 +0000 UTC m=+132.434333214 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 11 00:56:13 crc kubenswrapper[4744]: I0311 00:56:13.630641 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:11Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T00:56:13Z is after 2025-08-24T17:21:41Z" Mar 11 00:56:13 crc kubenswrapper[4744]: I0311 00:56:13.652272 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-xlclh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e16bf0f3-533b-4114-89c6-195a85273e98\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:11Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:11Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"
recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pgqmk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T00:56:11Z\\\"}}\" for pod \"openshift-multus\"/\"multus-xlclh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T00:56:13Z is after 2025-08-24T17:21:41Z" Mar 11 00:56:13 crc kubenswrapper[4744]: I0311 00:56:13.666316 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-6ghqv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e146399d-0685-4ea8-96e4-c76c9478a23a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:11Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:11Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8h2vm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T00:56:11Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-6ghqv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T00:56:13Z is after 2025-08-24T17:21:41Z" Mar 11 00:56:13 crc kubenswrapper[4744]: I0311 00:56:13.683735 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-sj4cl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ac70f681-9bcf-4f8c-a175-ed7d4e9da471\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:11Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:11Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:11Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqrjx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqrjx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\"
:false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqrjx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqrjx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\
\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqrjx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqrjx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/s
erviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqrjx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T00:56:11Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-sj4cl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T00:56:13Z is after 2025-08-24T17:21:41Z" Mar 11 00:56:13 crc kubenswrapper[4744]: I0311 00:56:13.706623 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-78fcc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6ff04e11-e747-44c5-b049-371a5d422157\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:11Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:11Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:11Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fr9zj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/v
ar/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fr9zj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fr9zj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fr9zj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},
{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fr9zj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fr9zj\\\",\\\"readOnly\\\":true,\\\
"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mount
Path\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fr9zj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fr9zj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://524c79ee3bb53b187166cbc6bf19cbb3311ab561608a1d3c819288c1fb684481\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\
\\"startedAt\\\":\\\"2026-03-11T00:56:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fr9zj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T00:56:11Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-78fcc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T00:56:13Z is after 2025-08-24T17:21:41Z" Mar 11 00:56:13 crc kubenswrapper[4744]: I0311 00:56:13.722965 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:12Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:12Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://80f7b8faf68f817bd772c7988a46ffcc286551806463dc59776e5c6e18deb8f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T00:56:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-11T00:56:13Z is after 2025-08-24T17:21:41Z" Mar 11 00:56:13 crc kubenswrapper[4744]: I0311 00:56:13.731316 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 11 00:56:13 crc kubenswrapper[4744]: E0311 00:56:13.732590 4744 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 11 00:56:13 crc kubenswrapper[4744]: E0311 00:56:13.732636 4744 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 11 00:56:13 crc kubenswrapper[4744]: E0311 00:56:13.732657 4744 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 11 00:56:13 crc kubenswrapper[4744]: E0311 00:56:13.732727 4744 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-03-11 00:56:15.732701756 +0000 UTC m=+132.536919561 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 11 00:56:13 crc kubenswrapper[4744]: I0311 00:56:13.733285 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/7aeb1578-fe93-4bec-8f43-17d0923fa5c0-metrics-certs\") pod \"network-metrics-daemon-tdnf7\" (UID: \"7aeb1578-fe93-4bec-8f43-17d0923fa5c0\") " pod="openshift-multus/network-metrics-daemon-tdnf7" Mar 11 00:56:13 crc kubenswrapper[4744]: E0311 00:56:13.733506 4744 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Mar 11 00:56:13 crc kubenswrapper[4744]: E0311 00:56:13.733620 4744 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7aeb1578-fe93-4bec-8f43-17d0923fa5c0-metrics-certs podName:7aeb1578-fe93-4bec-8f43-17d0923fa5c0 nodeName:}" failed. No retries permitted until 2026-03-11 00:56:15.733593982 +0000 UTC m=+132.537811607 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/7aeb1578-fe93-4bec-8f43-17d0923fa5c0-metrics-certs") pod "network-metrics-daemon-tdnf7" (UID: "7aeb1578-fe93-4bec-8f43-17d0923fa5c0") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 11 00:56:13 crc kubenswrapper[4744]: I0311 00:56:13.737164 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:11Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T00:56:13Z is after 2025-08-24T17:21:41Z" Mar 11 00:56:13 crc kubenswrapper[4744]: I0311 00:56:13.750710 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-rplbw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"92f4c9df-1087-4820-8e07-1120f02df454\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://34eabb0e2af63857d40e20376f7979131fef683885949c1ea75e0a4949968872\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T00:56:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lkpvs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2bfa60a94390dd23da621dd81523dde8107fe
6f3beba45ad1b7a96c0d6cd4f56\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T00:56:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lkpvs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T00:56:11Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-rplbw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T00:56:13Z is after 2025-08-24T17:21:41Z" Mar 11 00:56:13 crc kubenswrapper[4744]: I0311 00:56:13.765110 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:11Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify 
certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T00:56:13Z is after 2025-08-24T17:21:41Z" Mar 11 00:56:13 crc kubenswrapper[4744]: I0311 00:56:13.777314 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-tdnf7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7aeb1578-fe93-4bec-8f43-17d0923fa5c0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:11Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:11Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-th54b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-th54b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T00:56:11Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-tdnf7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T00:56:13Z is after 2025-08-24T17:21:41Z" Mar 11 00:56:13 crc 
kubenswrapper[4744]: I0311 00:56:13.789257 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:11Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T00:56:13Z is after 2025-08-24T17:21:41Z" Mar 11 00:56:13 crc kubenswrapper[4744]: I0311 00:56:13.801733 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:11Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T00:56:13Z is after 2025-08-24T17:21:41Z" Mar 11 00:56:13 crc kubenswrapper[4744]: I0311 00:56:13.811638 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-8z8gf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ad8329c6-d511-446f-b617-99778d12b878\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:11Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:11Z\\\",\\\"message\\\":\\\"containers with unready status: 
[node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7gczk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T00:56:11Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-8z8gf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T00:56:13Z is after 2025-08-24T17:21:41Z" Mar 11 00:56:13 crc kubenswrapper[4744]: I0311 00:56:13.822130 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-678nx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a15dc7ac-7c34-4135-b6eb-a85122800ce9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:11Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:11Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d9t5q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d9t5q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T00:56:11Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-678nx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T00:56:13Z is after 2025-08-24T17:21:41Z" Mar 11 00:56:13 crc kubenswrapper[4744]: I0311 00:56:13.835431 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1a87b2ce-541f-40ba-a568-93914e7dea62\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:54:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:54:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:55:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:55:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:54:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://19989b1018d06da0230a62c4c231865e9d16267d0eb570a409da006b246005b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://832be2ac0e5b4b69727f022a0663974a495e008d3a781fa4d237d3347d39c154\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-11T00:54:37Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop 
\\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager -v=2\\\\nI0311 00:54:06.849622 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. Worst graceful lease acquisition is {26s}.\\\\nI0311 00:54:06.852083 1 observer_polling.go:159] Starting file observer\\\\nI0311 00:54:06.910593 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0311 00:54:06.914052 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nF0311 00:54:37.381234 1 cmd.go:179] failed checking apiserver connectivity: Get \\\\\\\"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/openshift-kube-controller-manager/leases/cluster-policy-controller-lock\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T00:54:36Z is after 
2026-02-23T05:33:13Z\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-11T00:54:05Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T00:54:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://eb1e3ef7609aa323e622249594eb4ac5cd5cacfa496d8189b3f612e29cb11773\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T00:54:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://82831ac0a065cf2ea80ca05df203fc6de89ad279faae95364ab2519b689453d8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T00:54:05Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ca4d36508e362c9b7f7365ade7dda3dec0c70f9df1be158a034aeaaa772c797e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T00:54:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T00:54:04Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T00:56:13Z is after 2025-08-24T17:21:41Z" Mar 11 00:56:13 crc kubenswrapper[4744]: I0311 00:56:13.849306 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:11Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T00:56:13Z is after 2025-08-24T17:21:41Z" Mar 11 00:56:13 crc kubenswrapper[4744]: I0311 00:56:13.863712 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-8z8gf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ad8329c6-d511-446f-b617-99778d12b878\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://27268ce5ab229a693eeb80e4dfc38c14f18ad32dc201ace682b5516070c52fde\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T00:56:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7gczk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T00:56:11Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-8z8gf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T00:56:13Z is after 2025-08-24T17:21:41Z" Mar 11 00:56:13 crc kubenswrapper[4744]: I0311 00:56:13.876882 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-678nx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a15dc7ac-7c34-4135-b6eb-a85122800ce9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d3d16f6053ae793fbf194d5cbb556a8603bae74b52c9f07aa937dbfd654e3894\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshif
t-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T00:56:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d9t5q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9d5c92e7037925f24a5b88f538d4822fae94b65ea5f07960f3e465148975af4e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T00:56:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d9t5q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T00:56:11Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-678nx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call 
webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T00:56:13Z is after 2025-08-24T17:21:41Z" Mar 11 00:56:13 crc kubenswrapper[4744]: I0311 00:56:13.889798 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1a87b2ce-541f-40ba-a568-93914e7dea62\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:54:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:54:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:55:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:55:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:54:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://19989b1018d06da0230a62c4c231865e9d16267d0eb570a409da006b246005b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://832be2ac0e5b4b69727f022a0663974a495e008d3a781fa4d237d3347d39c154\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-11T00:54:37Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss 
-Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager -v=2\\\\nI0311 00:54:06.849622 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. Worst graceful lease acquisition is {26s}.\\\\nI0311 00:54:06.852083 1 observer_polling.go:159] Starting file observer\\\\nI0311 00:54:06.910593 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0311 00:54:06.914052 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nF0311 00:54:37.381234 1 cmd.go:179] failed checking apiserver connectivity: Get \\\\\\\"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/openshift-kube-controller-manager/leases/cluster-policy-controller-lock\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T00:54:36Z is after 
2026-02-23T05:33:13Z\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-11T00:54:05Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T00:54:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://eb1e3ef7609aa323e622249594eb4ac5cd5cacfa496d8189b3f612e29cb11773\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T00:54:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://82831ac0a065cf2ea80ca05df203fc6de89ad279faae95364ab2519b689453d8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T00:54:05Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ca4d36508e362c9b7f7365ade7dda3dec0c70f9df1be158a034aeaaa772c797e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T00:54:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T00:54:04Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T00:56:13Z is after 2025-08-24T17:21:41Z" Mar 11 00:56:13 crc kubenswrapper[4744]: I0311 00:56:13.902653 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:11Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T00:56:13Z is after 2025-08-24T17:21:41Z" Mar 11 00:56:13 crc kubenswrapper[4744]: I0311 00:56:13.915760 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-xlclh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e16bf0f3-533b-4114-89c6-195a85273e98\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3cf2ef2701f6c6e8715d3c41c3d7c4d704d6f03989831a0c3d34e92ab7a6dd50\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T00:56:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pgqmk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T00:56:11Z\\\"}}\" for pod \"openshift-multus\"/\"multus-xlclh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T00:56:13Z is after 2025-08-24T17:21:41Z" Mar 11 00:56:13 crc kubenswrapper[4744]: I0311 00:56:13.927150 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-6ghqv" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e146399d-0685-4ea8-96e4-c76c9478a23a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9fd8ff476d73855cbb3da64ae999f54bad2263813a70dba1aa61507b53102d42\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T00:56:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8h2vm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T00:56:11Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-6ghqv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T00:56:13Z is after 2025-08-24T17:21:41Z" Mar 11 00:56:13 crc kubenswrapper[4744]: I0311 00:56:13.946682 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-sj4cl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ac70f681-9bcf-4f8c-a175-ed7d4e9da471\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:11Z\\\",\\\"message\\\":\\\"containers with incomplete status: [cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:11Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:11Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqrjx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://09583de802dbeffc06642379e30e628cf6541f82d5a11caa554cdf17e408476b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://09583de802dbeffc06642379e30e628cf6541f82d5a11caa554cdf17e408476b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T00:56:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T00:56:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqrjx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqrjx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube
-api-access-fqrjx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqrjx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqrjx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\
\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqrjx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T00:56:11Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-sj4cl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T00:56:13Z is after 2025-08-24T17:21:41Z" Mar 11 00:56:13 crc kubenswrapper[4744]: I0311 00:56:13.974524 4744 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 11 00:56:13 crc kubenswrapper[4744]: I0311 00:56:13.974437 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-78fcc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6ff04e11-e747-44c5-b049-371a5d422157\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:11Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:11Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fr9zj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fr9zj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fr9zj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fr9zj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fr9zj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fr9zj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fr9zj\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fr9zj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://524c79ee3bb53b187166cbc6bf19cbb3311ab561608a1d3c819288c1fb684481\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://524c79ee3bb53b187166cbc6bf19cbb3311ab561608a1d3c819288c1fb684481\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T00:56:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T00:56:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fr9zj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T00:56:11Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-78fcc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T00:56:13Z is after 2025-08-24T17:21:41Z" Mar 11 00:56:13 crc kubenswrapper[4744]: I0311 00:56:13.974577 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 11 00:56:13 crc kubenswrapper[4744]: E0311 00:56:13.974879 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 11 00:56:13 crc kubenswrapper[4744]: E0311 00:56:13.974899 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 11 00:56:13 crc kubenswrapper[4744]: I0311 00:56:13.975088 4744 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 11 00:56:13 crc kubenswrapper[4744]: E0311 00:56:13.975255 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 11 00:56:13 crc kubenswrapper[4744]: I0311 00:56:13.980025 4744 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="01ab3dd5-8196-46d0-ad33-122e2ca51def" path="/var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes" Mar 11 00:56:13 crc kubenswrapper[4744]: I0311 00:56:13.981471 4744 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" path="/var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes" Mar 11 00:56:13 crc kubenswrapper[4744]: I0311 00:56:13.983903 4744 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09efc573-dbb6-4249-bd59-9b87aba8dd28" path="/var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes" Mar 11 00:56:13 crc kubenswrapper[4744]: I0311 00:56:13.985243 4744 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b574797-001e-440a-8f4e-c0be86edad0f" path="/var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes" Mar 11 00:56:13 crc kubenswrapper[4744]: I0311 00:56:13.987289 4744 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b78653f-4ff9-4508-8672-245ed9b561e3" path="/var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes" Mar 11 00:56:13 crc kubenswrapper[4744]: I0311 00:56:13.988494 4744 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="1386a44e-36a2-460c-96d0-0359d2b6f0f5" path="/var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes" Mar 11 00:56:13 crc kubenswrapper[4744]: I0311 00:56:13.989803 4744 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1bf7eb37-55a3-4c65-b768-a94c82151e69" path="/var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes" Mar 11 00:56:13 crc kubenswrapper[4744]: I0311 00:56:13.992548 4744 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1d611f23-29be-4491-8495-bee1670e935f" path="/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes" Mar 11 00:56:13 crc kubenswrapper[4744]: I0311 00:56:13.993050 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:12Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:12Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://80f7b8faf68f817bd772c7988a46ffcc286551806463dc59776e5c6e18deb8f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCou
nt\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T00:56:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T00:56:13Z is after 2025-08-24T17:21:41Z" Mar 11 00:56:13 crc kubenswrapper[4744]: I0311 00:56:13.993937 4744 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="20b0d48f-5fd6-431c-a545-e3c800c7b866" path="/var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/volumes" Mar 11 00:56:13 crc kubenswrapper[4744]: I0311 00:56:13.996592 4744 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" path="/var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes" Mar 11 00:56:14 crc kubenswrapper[4744]: I0311 00:56:14.001071 4744 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="22c825df-677d-4ca6-82db-3454ed06e783" path="/var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes" Mar 11 00:56:14 crc kubenswrapper[4744]: I0311 00:56:14.002861 4744 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="25e176fe-21b4-4974-b1ed-c8b94f112a7f" path="/var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes" Mar 11 00:56:14 crc kubenswrapper[4744]: I0311 
00:56:14.005033 4744 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" path="/var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes" Mar 11 00:56:14 crc kubenswrapper[4744]: I0311 00:56:14.006315 4744 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="31d8b7a1-420e-4252-a5b7-eebe8a111292" path="/var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes" Mar 11 00:56:14 crc kubenswrapper[4744]: I0311 00:56:14.007239 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:11Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T00:56:14Z is after 2025-08-24T17:21:41Z" Mar 11 00:56:14 crc kubenswrapper[4744]: I0311 00:56:14.007582 4744 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3ab1a177-2de0-46d9-b765-d0d0649bb42e" path="/var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/volumes" Mar 11 00:56:14 crc kubenswrapper[4744]: I0311 00:56:14.008961 4744 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" path="/var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes" Mar 11 00:56:14 crc kubenswrapper[4744]: I0311 00:56:14.010317 4744 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="43509403-f426-496e-be36-56cef71462f5" path="/var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes" Mar 11 00:56:14 crc kubenswrapper[4744]: I0311 00:56:14.011321 4744 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="44663579-783b-4372-86d6-acf235a62d72" path="/var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/volumes" Mar 11 00:56:14 crc kubenswrapper[4744]: I0311 00:56:14.012020 
4744 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="496e6271-fb68-4057-954e-a0d97a4afa3f" path="/var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes" Mar 11 00:56:14 crc kubenswrapper[4744]: I0311 00:56:14.012701 4744 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" path="/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes" Mar 11 00:56:14 crc kubenswrapper[4744]: I0311 00:56:14.013714 4744 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49ef4625-1d3a-4a9f-b595-c2433d32326d" path="/var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/volumes" Mar 11 00:56:14 crc kubenswrapper[4744]: I0311 00:56:14.014347 4744 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4bb40260-dbaa-4fb0-84df-5e680505d512" path="/var/lib/kubelet/pods/4bb40260-dbaa-4fb0-84df-5e680505d512/volumes" Mar 11 00:56:14 crc kubenswrapper[4744]: I0311 00:56:14.015311 4744 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5225d0e4-402f-4861-b410-819f433b1803" path="/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes" Mar 11 00:56:14 crc kubenswrapper[4744]: I0311 00:56:14.016116 4744 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5441d097-087c-4d9a-baa8-b210afa90fc9" path="/var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes" Mar 11 00:56:14 crc kubenswrapper[4744]: I0311 00:56:14.016692 4744 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="57a731c4-ef35-47a8-b875-bfb08a7f8011" path="/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes" Mar 11 00:56:14 crc kubenswrapper[4744]: I0311 00:56:14.017825 4744 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5b88f790-22fa-440e-b583-365168c0b23d" path="/var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/volumes" Mar 11 00:56:14 crc kubenswrapper[4744]: I0311 00:56:14.018541 
4744 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5fe579f8-e8a6-4643-bce5-a661393c4dde" path="/var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/volumes" Mar 11 00:56:14 crc kubenswrapper[4744]: I0311 00:56:14.019437 4744 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6402fda4-df10-493c-b4e5-d0569419652d" path="/var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes" Mar 11 00:56:14 crc kubenswrapper[4744]: I0311 00:56:14.020181 4744 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6509e943-70c6-444c-bc41-48a544e36fbd" path="/var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes" Mar 11 00:56:14 crc kubenswrapper[4744]: I0311 00:56:14.021105 4744 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6731426b-95fe-49ff-bb5f-40441049fde2" path="/var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/volumes" Mar 11 00:56:14 crc kubenswrapper[4744]: I0311 00:56:14.021685 4744 kubelet_volumes.go:152] "Cleaned up orphaned volume subpath from pod" podUID="6ea678ab-3438-413e-bfe3-290ae7725660" path="/var/lib/kubelet/pods/6ea678ab-3438-413e-bfe3-290ae7725660/volume-subpaths/run-systemd/ovnkube-controller/6" Mar 11 00:56:14 crc kubenswrapper[4744]: I0311 00:56:14.021860 4744 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6ea678ab-3438-413e-bfe3-290ae7725660" path="/var/lib/kubelet/pods/6ea678ab-3438-413e-bfe3-290ae7725660/volumes" Mar 11 00:56:14 crc kubenswrapper[4744]: I0311 00:56:14.024189 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-rplbw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"92f4c9df-1087-4820-8e07-1120f02df454\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://34eabb0e2af63857d40e20376f7979131fef683885949c1ea75e0a4949968872\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T00:56:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lkpvs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2bfa60a94390dd23da621dd81523dde8107fe
6f3beba45ad1b7a96c0d6cd4f56\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T00:56:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lkpvs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T00:56:11Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-rplbw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T00:56:14Z is after 2025-08-24T17:21:41Z" Mar 11 00:56:14 crc kubenswrapper[4744]: I0311 00:56:14.024675 4744 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7539238d-5fe0-46ed-884e-1c3b566537ec" path="/var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes" Mar 11 00:56:14 crc kubenswrapper[4744]: I0311 00:56:14.025253 4744 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7583ce53-e0fe-4a16-9e4d-50516596a136" 
path="/var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes" Mar 11 00:56:14 crc kubenswrapper[4744]: I0311 00:56:14.025920 4744 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7bb08738-c794-4ee8-9972-3a62ca171029" path="/var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes" Mar 11 00:56:14 crc kubenswrapper[4744]: I0311 00:56:14.027655 4744 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="87cf06ed-a83f-41a7-828d-70653580a8cb" path="/var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes" Mar 11 00:56:14 crc kubenswrapper[4744]: I0311 00:56:14.028858 4744 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" path="/var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes" Mar 11 00:56:14 crc kubenswrapper[4744]: I0311 00:56:14.029449 4744 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="925f1c65-6136-48ba-85aa-3a3b50560753" path="/var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes" Mar 11 00:56:14 crc kubenswrapper[4744]: I0311 00:56:14.030541 4744 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" path="/var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/volumes" Mar 11 00:56:14 crc kubenswrapper[4744]: I0311 00:56:14.031253 4744 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9d4552c7-cd75-42dd-8880-30dd377c49a4" path="/var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes" Mar 11 00:56:14 crc kubenswrapper[4744]: I0311 00:56:14.032198 4744 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" path="/var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/volumes" Mar 11 00:56:14 crc kubenswrapper[4744]: I0311 00:56:14.032863 4744 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a31745f5-9847-4afe-82a5-3161cc66ca93" 
path="/var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes" Mar 11 00:56:14 crc kubenswrapper[4744]: I0311 00:56:14.033921 4744 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" path="/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes" Mar 11 00:56:14 crc kubenswrapper[4744]: I0311 00:56:14.034997 4744 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6312bbd-5731-4ea0-a20f-81d5a57df44a" path="/var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/volumes" Mar 11 00:56:14 crc kubenswrapper[4744]: I0311 00:56:14.035533 4744 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" path="/var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes" Mar 11 00:56:14 crc kubenswrapper[4744]: I0311 00:56:14.036496 4744 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" path="/var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes" Mar 11 00:56:14 crc kubenswrapper[4744]: I0311 00:56:14.037118 4744 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bd23aa5c-e532-4e53-bccf-e79f130c5ae8" path="/var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/volumes" Mar 11 00:56:14 crc kubenswrapper[4744]: I0311 00:56:14.038237 4744 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bf126b07-da06-4140-9a57-dfd54fc6b486" path="/var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes" Mar 11 00:56:14 crc kubenswrapper[4744]: I0311 00:56:14.038959 4744 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c03ee662-fb2f-4fc4-a2c1-af487c19d254" path="/var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes" Mar 11 00:56:14 crc kubenswrapper[4744]: I0311 00:56:14.039917 4744 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" 
path="/var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/volumes" Mar 11 00:56:14 crc kubenswrapper[4744]: I0311 00:56:14.040934 4744 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e7e6199b-1264-4501-8953-767f51328d08" path="/var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes" Mar 11 00:56:14 crc kubenswrapper[4744]: I0311 00:56:14.041816 4744 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="efdd0498-1daa-4136-9a4a-3b948c2293fc" path="/var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/volumes" Mar 11 00:56:14 crc kubenswrapper[4744]: I0311 00:56:14.042329 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:11Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could 
not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T00:56:14Z is after 2025-08-24T17:21:41Z" Mar 11 00:56:14 crc kubenswrapper[4744]: I0311 00:56:14.043290 4744 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" path="/var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/volumes" Mar 11 00:56:14 crc kubenswrapper[4744]: I0311 00:56:14.044950 4744 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fda69060-fa79-4696-b1a6-7980f124bf7c" path="/var/lib/kubelet/pods/fda69060-fa79-4696-b1a6-7980f124bf7c/volumes" Mar 11 00:56:14 crc kubenswrapper[4744]: I0311 00:56:14.069198 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:13Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:13Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://064a29fa0ed4ecddcf11aa3175a3fc935147341c372af46f94eee470a9ba363c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T00:56:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7c4d16fc0af970b13be0d202345567993d215bf342174b1b568c4b4ac26c0017\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T00:56:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T00:56:14Z is after 2025-08-24T17:21:41Z" Mar 11 00:56:14 crc kubenswrapper[4744]: E0311 00:56:14.073919 4744 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
Mar 11 00:56:14 crc kubenswrapper[4744]: I0311 00:56:14.086939 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-tdnf7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7aeb1578-fe93-4bec-8f43-17d0923fa5c0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:11Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:11Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-th54b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-th54b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T00:56:11Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-tdnf7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T00:56:14Z is after 2025-08-24T17:21:41Z" Mar 11 00:56:14 crc 
kubenswrapper[4744]: I0311 00:56:14.108703 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:11Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T00:56:14Z is after 2025-08-24T17:21:41Z" Mar 11 00:56:14 crc kubenswrapper[4744]: I0311 00:56:14.128943 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-rplbw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"92f4c9df-1087-4820-8e07-1120f02df454\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://34eabb0e2af63857d40e20376f7979131fef683885949c1ea75e0a4949968872\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T00:56:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lkpvs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2bfa60a94390dd23da621dd81523dde8107fe
6f3beba45ad1b7a96c0d6cd4f56\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T00:56:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lkpvs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T00:56:11Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-rplbw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T00:56:14Z is after 2025-08-24T17:21:41Z" Mar 11 00:56:14 crc kubenswrapper[4744]: I0311 00:56:14.147335 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:13Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:13Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://064a29fa0ed4ecddcf11aa3175a3fc935147341c372af46f94eee470a9ba363c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T00:56:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7c4d16fc0af970b13be0d202345567993d215bf342174b1b568c4b4ac26c0017\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T00:56:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T00:56:14Z is after 2025-08-24T17:21:41Z" Mar 11 00:56:14 crc kubenswrapper[4744]: I0311 00:56:14.165414 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-tdnf7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7aeb1578-fe93-4bec-8f43-17d0923fa5c0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:11Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:11Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-th54b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-th54b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T00:56:11Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-tdnf7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T00:56:14Z is after 2025-08-24T17:21:41Z" Mar 11 00:56:14 crc 
kubenswrapper[4744]: I0311 00:56:14.186233 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-678nx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a15dc7ac-7c34-4135-b6eb-a85122800ce9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d3d16f6053ae793fbf194d5cbb556a8603bae74b52c9f07aa937dbfd654e3894\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T00:56:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/se
rviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d9t5q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9d5c92e7037925f24a5b88f538d4822fae94b65ea5f07960f3e465148975af4e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T00:56:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d9t5q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T00:56:11Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-678nx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T00:56:14Z is after 2025-08-24T17:21:41Z" Mar 11 00:56:14 crc kubenswrapper[4744]: I0311 00:56:14.221616 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1a87b2ce-541f-40ba-a568-93914e7dea62\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:54:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:54:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:55:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:55:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:54:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://19989b1018d06da0230a62c4c231865e9d16267d0eb570a409da006b246005b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://832be2ac0e5b4b69727f022a0663974a495e008d3a781fa4d237d3347d39c154\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-11T00:54:37Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager 
-v=2\\\\nI0311 00:54:06.849622 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. Worst graceful lease acquisition is {26s}.\\\\nI0311 00:54:06.852083 1 observer_polling.go:159] Starting file observer\\\\nI0311 00:54:06.910593 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0311 00:54:06.914052 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nF0311 00:54:37.381234 1 cmd.go:179] failed checking apiserver connectivity: Get \\\\\\\"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/openshift-kube-controller-manager/leases/cluster-policy-controller-lock\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T00:54:36Z is after 
2026-02-23T05:33:13Z\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-11T00:54:05Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T00:54:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://eb1e3ef7609aa323e622249594eb4ac5cd5cacfa496d8189b3f612e29cb11773\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T00:54:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://82831ac0a065cf2ea80ca05df203fc6de89ad279faae95364ab2519b689453d8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T00:54:05Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ca4d36508e362c9b7f7365ade7dda3dec0c70f9df1be158a034aeaaa772c797e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T00:54:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T00:54:04Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T00:56:14Z is after 2025-08-24T17:21:41Z" Mar 11 00:56:14 crc kubenswrapper[4744]: I0311 00:56:14.263797 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:11Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T00:56:14Z is after 2025-08-24T17:21:41Z" Mar 11 00:56:14 crc kubenswrapper[4744]: I0311 00:56:14.300036 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:11Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T00:56:14Z is after 2025-08-24T17:21:41Z" Mar 11 00:56:14 crc kubenswrapper[4744]: I0311 00:56:14.337027 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-8z8gf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ad8329c6-d511-446f-b617-99778d12b878\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://27268ce5ab229a693eeb80e4dfc38c14f18ad32dc201ace682b5516070c52fde\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T00:56:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7gczk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T00:56:11Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-8z8gf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T00:56:14Z is after 2025-08-24T17:21:41Z" Mar 11 00:56:14 crc kubenswrapper[4744]: I0311 00:56:14.377562 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-sj4cl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ac70f681-9bcf-4f8c-a175-ed7d4e9da471\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:11Z\\\",\\\"message\\\":\\\"containers with incomplete status: [cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:11Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:11Z\\\",\\\"message\\\":\\\"containers with unready 
status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqrjx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://09583de802dbeffc06642379e30e628cf6541f82d5a11caa554cdf17e408476b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://09583de802dbeffc06642379e30e628cf6541f82d5a11caa554cdf17e408476b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T00:56:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T00:56:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\
":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqrjx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqrjx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":
\\\"kube-api-access-fqrjx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqrjx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqrjx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\
\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqrjx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T00:56:11Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-sj4cl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T00:56:14Z is after 2025-08-24T17:21:41Z" Mar 11 00:56:14 crc kubenswrapper[4744]: I0311 00:56:14.469816 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-78fcc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6ff04e11-e747-44c5-b049-371a5d422157\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:11Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:11Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fr9zj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fr9zj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fr9zj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fr9zj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fr9zj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fr9zj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fr9zj\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fr9zj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://524c79ee3bb53b187166cbc6bf19cbb3311ab561608a1d3c819288c1fb684481\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://524c79ee3bb53b187166cbc6bf19cbb3311ab561608a1d3c819288c1fb684481\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T00:56:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T00:56:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fr9zj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T00:56:11Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-78fcc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T00:56:14Z is after 2025-08-24T17:21:41Z" Mar 11 00:56:14 crc kubenswrapper[4744]: I0311 00:56:14.504828 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:12Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:12Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://80f7b8faf68f817bd772c7988a46ffcc286551806463dc59776e5c6e18deb8f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\"
:{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T00:56:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T00:56:14Z is after 2025-08-24T17:21:41Z" Mar 11 00:56:14 crc kubenswrapper[4744]: I0311 00:56:14.533725 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:11Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T00:56:14Z is after 2025-08-24T17:21:41Z" Mar 11 00:56:14 crc kubenswrapper[4744]: I0311 00:56:14.560634 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-xlclh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e16bf0f3-533b-4114-89c6-195a85273e98\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3cf2ef2701f6c6e8715d3c41c3d7c4d704d6f03989831a0c3d34e92ab7a6dd50\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T00:56:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pgqmk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T00:56:11Z\\\"}}\" for pod \"openshift-multus\"/\"multus-xlclh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T00:56:14Z is after 2025-08-24T17:21:41Z" Mar 11 00:56:14 crc kubenswrapper[4744]: I0311 00:56:14.580350 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-6ghqv" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e146399d-0685-4ea8-96e4-c76c9478a23a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9fd8ff476d73855cbb3da64ae999f54bad2263813a70dba1aa61507b53102d42\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T00:56:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8h2vm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T00:56:11Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-6ghqv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T00:56:14Z is after 2025-08-24T17:21:41Z" Mar 11 00:56:14 crc kubenswrapper[4744]: I0311 00:56:14.629714 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-78fcc" event={"ID":"6ff04e11-e747-44c5-b049-371a5d422157","Type":"ContainerStarted","Data":"84d4ada431e7a91cc86b5cb9bba4316c233c5f666f21bce1d8b6df0019b6ad76"} Mar 11 00:56:14 crc kubenswrapper[4744]: I0311 00:56:14.629759 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-78fcc" event={"ID":"6ff04e11-e747-44c5-b049-371a5d422157","Type":"ContainerStarted","Data":"27bc44383b34aea31ffcebae0f1f464c31918edc0c37f01d6bd66166e34fd4a8"} Mar 11 00:56:14 crc kubenswrapper[4744]: I0311 00:56:14.631640 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-sj4cl" event={"ID":"ac70f681-9bcf-4f8c-a175-ed7d4e9da471","Type":"ContainerStarted","Data":"9a370bc02fc3d53e08d44ce24a85d1e5b20f06a4933b61c8fef04aff8cfe6081"} Mar 11 00:56:14 crc kubenswrapper[4744]: I0311 00:56:14.663245 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-8z8gf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ad8329c6-d511-446f-b617-99778d12b878\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://27268ce5ab229a693eeb80e4dfc38c14f18ad32dc201ace682b5516070c52fde\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T00:56:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7gczk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T00:56:11Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-8z8gf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T00:56:14Z is after 2025-08-24T17:21:41Z" Mar 11 00:56:14 crc kubenswrapper[4744]: I0311 00:56:14.679904 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-678nx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a15dc7ac-7c34-4135-b6eb-a85122800ce9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d3d16f6053ae793fbf194d5cbb556a8603bae74b52c9f07aa937dbfd654e3894\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshif
t-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T00:56:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d9t5q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9d5c92e7037925f24a5b88f538d4822fae94b65ea5f07960f3e465148975af4e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T00:56:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d9t5q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T00:56:11Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-678nx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call 
webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T00:56:14Z is after 2025-08-24T17:21:41Z" Mar 11 00:56:14 crc kubenswrapper[4744]: I0311 00:56:14.693414 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1a87b2ce-541f-40ba-a568-93914e7dea62\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:54:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:54:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:55:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:55:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:54:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://19989b1018d06da0230a62c4c231865e9d16267d0eb570a409da006b246005b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://832be2ac0e5b4b69727f022a0663974a495e008d3a781fa4d237d3347d39c154\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-11T00:54:37Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss 
-Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager -v=2\\\\nI0311 00:54:06.849622 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. Worst graceful lease acquisition is {26s}.\\\\nI0311 00:54:06.852083 1 observer_polling.go:159] Starting file observer\\\\nI0311 00:54:06.910593 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0311 00:54:06.914052 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nF0311 00:54:37.381234 1 cmd.go:179] failed checking apiserver connectivity: Get \\\\\\\"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/openshift-kube-controller-manager/leases/cluster-policy-controller-lock\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T00:54:36Z is after 
2026-02-23T05:33:13Z\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-11T00:54:05Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T00:54:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://eb1e3ef7609aa323e622249594eb4ac5cd5cacfa496d8189b3f612e29cb11773\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T00:54:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://82831ac0a065cf2ea80ca05df203fc6de89ad279faae95364ab2519b689453d8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T00:54:05Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ca4d36508e362c9b7f7365ade7dda3dec0c70f9df1be158a034aeaaa772c797e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T00:54:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T00:54:04Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T00:56:14Z is after 2025-08-24T17:21:41Z" Mar 11 00:56:14 crc kubenswrapper[4744]: I0311 00:56:14.736349 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:11Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T00:56:14Z is after 2025-08-24T17:21:41Z" Mar 11 00:56:14 crc kubenswrapper[4744]: I0311 00:56:14.781860 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:11Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T00:56:14Z is after 2025-08-24T17:21:41Z" Mar 11 00:56:14 crc kubenswrapper[4744]: I0311 00:56:14.817129 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-6ghqv" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e146399d-0685-4ea8-96e4-c76c9478a23a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9fd8ff476d73855cbb3da64ae999f54bad2263813a70dba1aa61507b53102d42\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T00:56:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8h2vm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T00:56:11Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-6ghqv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T00:56:14Z is after 2025-08-24T17:21:41Z" Mar 11 00:56:14 crc kubenswrapper[4744]: I0311 00:56:14.856161 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-sj4cl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ac70f681-9bcf-4f8c-a175-ed7d4e9da471\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:11Z\\\",\\\"message\\\":\\\"containers with incomplete status: [cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:11Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:11Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqrjx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://09583de802dbeffc06642379e30e628cf6541f82d5a11caa554cdf17e408476b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://09583de802dbeffc06642379e30e628cf6541f82d5a11caa554cdf17e408476b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T00:56:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T00:56:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqrjx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9a370bc02fc3d53e08d44ce24a85d1e5b20f06a4933b61c8fef04aff8cfe6081\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T00:56:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqrjx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath
\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqrjx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqrjx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqrjx\\\",\\\"readOn
ly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqrjx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T00:56:11Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-sj4cl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T00:56:14Z is after 2025-08-24T17:21:41Z" Mar 11 00:56:14 crc kubenswrapper[4744]: I0311 00:56:14.901245 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-78fcc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6ff04e11-e747-44c5-b049-371a5d422157\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:11Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:11Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fr9zj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fr9zj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fr9zj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fr9zj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fr9zj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fr9zj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fr9zj\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fr9zj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://524c79ee3bb53b187166cbc6bf19cbb3311ab561608a1d3c819288c1fb684481\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://524c79ee3bb53b187166cbc6bf19cbb3311ab561608a1d3c819288c1fb684481\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T00:56:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T00:56:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fr9zj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T00:56:11Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-78fcc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T00:56:14Z is after 2025-08-24T17:21:41Z" Mar 11 00:56:14 crc kubenswrapper[4744]: I0311 00:56:14.938412 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:12Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:12Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://80f7b8faf68f817bd772c7988a46ffcc286551806463dc59776e5c6e18deb8f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\"
:{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T00:56:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T00:56:14Z is after 2025-08-24T17:21:41Z" Mar 11 00:56:14 crc kubenswrapper[4744]: I0311 00:56:14.974223 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-tdnf7" Mar 11 00:56:14 crc kubenswrapper[4744]: E0311 00:56:14.974593 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-tdnf7" podUID="7aeb1578-fe93-4bec-8f43-17d0923fa5c0" Mar 11 00:56:14 crc kubenswrapper[4744]: I0311 00:56:14.978561 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:11Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T00:56:14Z is after 2025-08-24T17:21:41Z" Mar 11 00:56:15 crc kubenswrapper[4744]: I0311 00:56:15.024809 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-xlclh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e16bf0f3-533b-4114-89c6-195a85273e98\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3cf2ef2701f6c6e8715d3c41c3d7c4d704d6f03989831a0c3d34e92ab7a6dd50\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T00:56:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pgqmk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T00:56:11Z\\\"}}\" for pod \"openshift-multus\"/\"multus-xlclh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T00:56:15Z is after 2025-08-24T17:21:41Z" Mar 11 00:56:15 crc kubenswrapper[4744]: I0311 00:56:15.055079 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:11Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T00:56:15Z is after 2025-08-24T17:21:41Z" Mar 11 00:56:15 crc kubenswrapper[4744]: I0311 00:56:15.096925 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-rplbw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"92f4c9df-1087-4820-8e07-1120f02df454\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://34eabb0e2af63857d40e20376f7979131fef683885949c1ea75e0a4949968872\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T00:56:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lkpvs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2bfa60a94390dd23da621dd81523dde8107fe
6f3beba45ad1b7a96c0d6cd4f56\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T00:56:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lkpvs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T00:56:11Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-rplbw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T00:56:15Z is after 2025-08-24T17:21:41Z" Mar 11 00:56:15 crc kubenswrapper[4744]: I0311 00:56:15.140154 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:13Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:13Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://064a29fa0ed4ecddcf11aa3175a3fc935147341c372af46f94eee470a9ba363c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T00:56:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7c4d16fc0af970b13be0d202345567993d215bf342174b1b568c4b4ac26c0017\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T00:56:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T00:56:15Z is after 2025-08-24T17:21:41Z" Mar 11 00:56:15 crc kubenswrapper[4744]: I0311 00:56:15.173812 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-tdnf7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7aeb1578-fe93-4bec-8f43-17d0923fa5c0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:11Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:11Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-th54b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-th54b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T00:56:11Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-tdnf7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T00:56:15Z is after 2025-08-24T17:21:41Z" Mar 11 00:56:15 crc 
kubenswrapper[4744]: I0311 00:56:15.637914 4744 generic.go:334] "Generic (PLEG): container finished" podID="ac70f681-9bcf-4f8c-a175-ed7d4e9da471" containerID="9a370bc02fc3d53e08d44ce24a85d1e5b20f06a4933b61c8fef04aff8cfe6081" exitCode=0 Mar 11 00:56:15 crc kubenswrapper[4744]: I0311 00:56:15.638024 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-sj4cl" event={"ID":"ac70f681-9bcf-4f8c-a175-ed7d4e9da471","Type":"ContainerDied","Data":"9a370bc02fc3d53e08d44ce24a85d1e5b20f06a4933b61c8fef04aff8cfe6081"} Mar 11 00:56:15 crc kubenswrapper[4744]: I0311 00:56:15.644280 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-78fcc" event={"ID":"6ff04e11-e747-44c5-b049-371a5d422157","Type":"ContainerStarted","Data":"8132bd89cd281c66b2735b3fa85c3fc773e7140d6aaeaefe0c0a2c3e3f743080"} Mar 11 00:56:15 crc kubenswrapper[4744]: I0311 00:56:15.644330 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-78fcc" event={"ID":"6ff04e11-e747-44c5-b049-371a5d422157","Type":"ContainerStarted","Data":"98ba07651ae84a42f92411fee85a15562e325c61ff2d8fcba4962d21c51fce73"} Mar 11 00:56:15 crc kubenswrapper[4744]: I0311 00:56:15.644344 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-78fcc" event={"ID":"6ff04e11-e747-44c5-b049-371a5d422157","Type":"ContainerStarted","Data":"d7cd308cadb08ea56bbc709c6401ab1d3eec24f0414a7ee1c57188c5e60adffe"} Mar 11 00:56:15 crc kubenswrapper[4744]: I0311 00:56:15.644358 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-78fcc" event={"ID":"6ff04e11-e747-44c5-b049-371a5d422157","Type":"ContainerStarted","Data":"e37e770ef43eabf7e0bb162d7e700eaa2d61c7ceca9737de5872c1caeec06774"} Mar 11 00:56:15 crc kubenswrapper[4744]: I0311 00:56:15.656211 4744 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-image-registry/node-ca-8z8gf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ad8329c6-d511-446f-b617-99778d12b878\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://27268ce5ab229a693eeb80e4dfc38c14f18ad32dc201ace682b5516070c52fde\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T00:56:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7gczk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.
126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T00:56:11Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-8z8gf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T00:56:15Z is after 2025-08-24T17:21:41Z" Mar 11 00:56:15 crc kubenswrapper[4744]: I0311 00:56:15.660232 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 11 00:56:15 crc kubenswrapper[4744]: I0311 00:56:15.660405 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 11 00:56:15 crc kubenswrapper[4744]: E0311 00:56:15.660542 4744 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Mar 11 00:56:15 crc kubenswrapper[4744]: E0311 00:56:15.660576 4744 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2026-03-11 00:56:19.660487476 +0000 UTC m=+136.464705121 (durationBeforeRetry 4s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 11 00:56:15 crc kubenswrapper[4744]: E0311 00:56:15.660631 4744 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-11 00:56:19.660614339 +0000 UTC m=+136.464831984 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Mar 11 00:56:15 crc kubenswrapper[4744]: I0311 00:56:15.660735 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 11 00:56:15 crc kubenswrapper[4744]: I0311 00:56:15.660816 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod 
\"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 11 00:56:15 crc kubenswrapper[4744]: E0311 00:56:15.660987 4744 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 11 00:56:15 crc kubenswrapper[4744]: E0311 00:56:15.661026 4744 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 11 00:56:15 crc kubenswrapper[4744]: E0311 00:56:15.661056 4744 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-11 00:56:19.661037131 +0000 UTC m=+136.465254746 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 11 00:56:15 crc kubenswrapper[4744]: E0311 00:56:15.661060 4744 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 11 00:56:15 crc kubenswrapper[4744]: E0311 00:56:15.661088 4744 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 11 00:56:15 crc kubenswrapper[4744]: E0311 00:56:15.661169 4744 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-03-11 00:56:19.661154675 +0000 UTC m=+136.465372310 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 11 00:56:15 crc kubenswrapper[4744]: I0311 00:56:15.675555 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-678nx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a15dc7ac-7c34-4135-b6eb-a85122800ce9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d3d16f6053ae793fbf194d5cbb556a8603bae74b52c9f07aa937dbfd654e3894\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\
"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T00:56:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d9t5q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9d5c92e7037925f24a5b88f538d4822fae94b65ea5f07960f3e465148975af4e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T00:56:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d9t5q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T00:56:11Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-678nx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not 
yet valid: current time 2026-03-11T00:56:15Z is after 2025-08-24T17:21:41Z" Mar 11 00:56:15 crc kubenswrapper[4744]: I0311 00:56:15.698860 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1a87b2ce-541f-40ba-a568-93914e7dea62\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:54:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:54:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:55:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:55:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:54:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://19989b1018d06da0230a62c4c231865e9d16267d0eb570a409da006b246005b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://832be2ac0e5b4b69727f022a0663974a495e008d3a781fa4d237d3347d39c154\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-11T00:54:37Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' 
']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager -v=2\\\\nI0311 00:54:06.849622 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. Worst graceful lease acquisition is {26s}.\\\\nI0311 00:54:06.852083 1 observer_polling.go:159] Starting file observer\\\\nI0311 00:54:06.910593 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0311 00:54:06.914052 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nF0311 00:54:37.381234 1 cmd.go:179] failed checking apiserver connectivity: Get \\\\\\\"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/openshift-kube-controller-manager/leases/cluster-policy-controller-lock\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T00:54:36Z is after 
2026-02-23T05:33:13Z\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-11T00:54:05Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T00:54:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://eb1e3ef7609aa323e622249594eb4ac5cd5cacfa496d8189b3f612e29cb11773\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T00:54:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://82831ac0a065cf2ea80ca05df203fc6de89ad279faae95364ab2519b689453d8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T00:54:05Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ca4d36508e362c9b7f7365ade7dda3dec0c70f9df1be158a034aeaaa772c797e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T00:54:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T00:54:04Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T00:56:15Z is after 2025-08-24T17:21:41Z" Mar 11 00:56:15 crc kubenswrapper[4744]: I0311 00:56:15.719397 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:11Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T00:56:15Z is after 2025-08-24T17:21:41Z" Mar 11 00:56:15 crc kubenswrapper[4744]: I0311 00:56:15.750874 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:11Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T00:56:15Z is after 2025-08-24T17:21:41Z" Mar 11 00:56:15 crc kubenswrapper[4744]: I0311 00:56:15.762582 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: 
\"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 11 00:56:15 crc kubenswrapper[4744]: I0311 00:56:15.762757 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/7aeb1578-fe93-4bec-8f43-17d0923fa5c0-metrics-certs\") pod \"network-metrics-daemon-tdnf7\" (UID: \"7aeb1578-fe93-4bec-8f43-17d0923fa5c0\") " pod="openshift-multus/network-metrics-daemon-tdnf7" Mar 11 00:56:15 crc kubenswrapper[4744]: E0311 00:56:15.762803 4744 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 11 00:56:15 crc kubenswrapper[4744]: E0311 00:56:15.762853 4744 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 11 00:56:15 crc kubenswrapper[4744]: E0311 00:56:15.762880 4744 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 11 00:56:15 crc kubenswrapper[4744]: E0311 00:56:15.762980 4744 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-03-11 00:56:19.762947069 +0000 UTC m=+136.567164714 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 11 00:56:15 crc kubenswrapper[4744]: E0311 00:56:15.763201 4744 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Mar 11 00:56:15 crc kubenswrapper[4744]: E0311 00:56:15.763351 4744 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7aeb1578-fe93-4bec-8f43-17d0923fa5c0-metrics-certs podName:7aeb1578-fe93-4bec-8f43-17d0923fa5c0 nodeName:}" failed. No retries permitted until 2026-03-11 00:56:19.76331593 +0000 UTC m=+136.567533575 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/7aeb1578-fe93-4bec-8f43-17d0923fa5c0-metrics-certs") pod "network-metrics-daemon-tdnf7" (UID: "7aeb1578-fe93-4bec-8f43-17d0923fa5c0") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 11 00:56:15 crc kubenswrapper[4744]: I0311 00:56:15.768721 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-6ghqv" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e146399d-0685-4ea8-96e4-c76c9478a23a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9fd8ff476d73855cbb3da64ae999f54bad2263813a70dba1aa61507b53102d42\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T00:56:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8h2vm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T00:56:11Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-6ghqv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T00:56:15Z is after 2025-08-24T17:21:41Z" Mar 11 00:56:15 crc kubenswrapper[4744]: I0311 00:56:15.791794 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-sj4cl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ac70f681-9bcf-4f8c-a175-ed7d4e9da471\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:11Z\\\",\\\"message\\\":\\\"containers with incomplete status: [bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:11Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:11Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqrjx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://09583de802dbeffc06642379e30e628cf6541f82d5a11caa554cdf17e408476b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://09583de802dbeffc06642379e30e628cf6541f82d5a11caa554cdf17e408476b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T00:56:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T00:56:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqrjx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9a370bc02fc3d53e08d44ce24a85d1e5b20f06a4933b61c8fef04aff8cfe6081\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9a370bc02fc3d53e08d44ce24a85d1e5b20f06a4933b61c8fef04aff8cfe6081\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T00:56:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T00:56:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqrjx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodIn
itializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqrjx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqrjx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-
release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqrjx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqrjx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T00:56:11Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-sj4cl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T00:56:15Z is after 2025-08-24T17:21:41Z" Mar 11 00:56:15 crc kubenswrapper[4744]: I0311 00:56:15.812110 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-78fcc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6ff04e11-e747-44c5-b049-371a5d422157\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:11Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:11Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fr9zj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fr9zj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fr9zj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fr9zj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fr9zj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fr9zj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fr9zj\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fr9zj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://524c79ee3bb53b187166cbc6bf19cbb3311ab561608a1d3c819288c1fb684481\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://524c79ee3bb53b187166cbc6bf19cbb3311ab561608a1d3c819288c1fb684481\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T00:56:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T00:56:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fr9zj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T00:56:11Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-78fcc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T00:56:15Z is after 2025-08-24T17:21:41Z" Mar 11 00:56:15 crc kubenswrapper[4744]: I0311 00:56:15.826907 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:12Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:12Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://80f7b8faf68f817bd772c7988a46ffcc286551806463dc59776e5c6e18deb8f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\"
:{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T00:56:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T00:56:15Z is after 2025-08-24T17:21:41Z" Mar 11 00:56:15 crc kubenswrapper[4744]: I0311 00:56:15.837373 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:11Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T00:56:15Z is after 2025-08-24T17:21:41Z" Mar 11 00:56:15 crc kubenswrapper[4744]: I0311 00:56:15.851666 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-xlclh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e16bf0f3-533b-4114-89c6-195a85273e98\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3cf2ef2701f6c6e8715d3c41c3d7c4d704d6f03989831a0c3d34e92ab7a6dd50\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T00:56:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pgqmk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T00:56:11Z\\\"}}\" for pod \"openshift-multus\"/\"multus-xlclh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T00:56:15Z is after 2025-08-24T17:21:41Z" Mar 11 00:56:15 crc kubenswrapper[4744]: I0311 00:56:15.869351 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:11Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T00:56:15Z is after 2025-08-24T17:21:41Z" Mar 11 00:56:15 crc kubenswrapper[4744]: I0311 00:56:15.880601 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-rplbw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"92f4c9df-1087-4820-8e07-1120f02df454\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://34eabb0e2af63857d40e20376f7979131fef683885949c1ea75e0a4949968872\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T00:56:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lkpvs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2bfa60a94390dd23da621dd81523dde8107fe
6f3beba45ad1b7a96c0d6cd4f56\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T00:56:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lkpvs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T00:56:11Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-rplbw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T00:56:15Z is after 2025-08-24T17:21:41Z" Mar 11 00:56:15 crc kubenswrapper[4744]: I0311 00:56:15.892362 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:13Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:13Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://064a29fa0ed4ecddcf11aa3175a3fc935147341c372af46f94eee470a9ba363c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T00:56:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7c4d16fc0af970b13be0d202345567993d215bf342174b1b568c4b4ac26c0017\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T00:56:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T00:56:15Z is after 2025-08-24T17:21:41Z" Mar 11 00:56:15 crc kubenswrapper[4744]: I0311 00:56:15.903298 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-tdnf7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7aeb1578-fe93-4bec-8f43-17d0923fa5c0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:11Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:11Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-th54b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-th54b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T00:56:11Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-tdnf7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T00:56:15Z is after 2025-08-24T17:21:41Z" Mar 11 00:56:15 crc 
kubenswrapper[4744]: I0311 00:56:15.974279 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 11 00:56:15 crc kubenswrapper[4744]: E0311 00:56:15.974462 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 11 00:56:15 crc kubenswrapper[4744]: I0311 00:56:15.974663 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 11 00:56:15 crc kubenswrapper[4744]: E0311 00:56:15.974832 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 11 00:56:15 crc kubenswrapper[4744]: I0311 00:56:15.974910 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 11 00:56:15 crc kubenswrapper[4744]: E0311 00:56:15.975010 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 11 00:56:16 crc kubenswrapper[4744]: I0311 00:56:16.653550 4744 generic.go:334] "Generic (PLEG): container finished" podID="ac70f681-9bcf-4f8c-a175-ed7d4e9da471" containerID="6ad4e761e6f478df27c55306ecb235f8cbbe0954433cbe7a029d585d609a08db" exitCode=0 Mar 11 00:56:16 crc kubenswrapper[4744]: I0311 00:56:16.653666 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-sj4cl" event={"ID":"ac70f681-9bcf-4f8c-a175-ed7d4e9da471","Type":"ContainerDied","Data":"6ad4e761e6f478df27c55306ecb235f8cbbe0954433cbe7a029d585d609a08db"} Mar 11 00:56:16 crc kubenswrapper[4744]: I0311 00:56:16.656081 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" event={"ID":"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49","Type":"ContainerStarted","Data":"bff1516676048ec93cb3569f8720eb042036b92afc1c65f1c58b15fb604f58b0"} Mar 11 00:56:16 crc kubenswrapper[4744]: I0311 00:56:16.678961 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:13Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:13Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://064a29fa0ed4ecddcf11aa3175a3fc935147341c372af46f94eee470a9ba363c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T00:56:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7c4d16fc0af970b13be0d202345567993d215bf342174b1b568c4b4ac26c0017\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T00:56:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T00:56:16Z is after 2025-08-24T17:21:41Z" Mar 11 00:56:16 crc kubenswrapper[4744]: I0311 00:56:16.700176 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-tdnf7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7aeb1578-fe93-4bec-8f43-17d0923fa5c0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:11Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:11Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-th54b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-th54b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T00:56:11Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-tdnf7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T00:56:16Z is after 2025-08-24T17:21:41Z" Mar 11 00:56:16 crc 
kubenswrapper[4744]: I0311 00:56:16.725391 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1a87b2ce-541f-40ba-a568-93914e7dea62\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:54:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:54:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:55:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:55:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:54:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://19989b1018d06da0230a62c4c231865e9d16267d0eb570a409da006b246005b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://832be2ac0e5b4b69727f022a0663974a495e008d3a781fa4d237d3347d39c154\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-11T00:54:37Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start 
--config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager -v=2\\\\nI0311 00:54:06.849622 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. Worst graceful lease acquisition is {26s}.\\\\nI0311 00:54:06.852083 1 observer_polling.go:159] Starting file observer\\\\nI0311 00:54:06.910593 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0311 00:54:06.914052 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nF0311 00:54:37.381234 1 cmd.go:179] failed checking apiserver connectivity: Get \\\\\\\"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/openshift-kube-controller-manager/leases/cluster-policy-controller-lock\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T00:54:36Z is after 
2026-02-23T05:33:13Z\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-11T00:54:05Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T00:54:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://eb1e3ef7609aa323e622249594eb4ac5cd5cacfa496d8189b3f612e29cb11773\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T00:54:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://82831ac0a065cf2ea80ca05df203fc6de89ad279faae95364ab2519b689453d8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T00:54:05Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ca4d36508e362c9b7f7365ade7dda3dec0c70f9df1be158a034aeaaa772c797e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T00:54:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T00:54:04Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T00:56:16Z is after 2025-08-24T17:21:41Z" Mar 11 00:56:16 crc kubenswrapper[4744]: I0311 00:56:16.744954 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:11Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T00:56:16Z is after 2025-08-24T17:21:41Z" Mar 11 00:56:16 crc kubenswrapper[4744]: I0311 00:56:16.766133 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:11Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T00:56:16Z is after 2025-08-24T17:21:41Z" Mar 11 00:56:16 crc kubenswrapper[4744]: I0311 00:56:16.786591 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-8z8gf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ad8329c6-d511-446f-b617-99778d12b878\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://27268ce5ab229a693eeb80e4dfc38c14f18ad32dc201ace682b5516070c52fde\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T00:56:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7gczk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T00:56:11Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-8z8gf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T00:56:16Z is after 2025-08-24T17:21:41Z" Mar 11 00:56:16 crc kubenswrapper[4744]: I0311 00:56:16.802495 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-678nx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a15dc7ac-7c34-4135-b6eb-a85122800ce9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d3d16f6053ae793fbf194d5cbb556a8603bae74b52c9f07aa937dbfd654e3894\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshif
t-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T00:56:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d9t5q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9d5c92e7037925f24a5b88f538d4822fae94b65ea5f07960f3e465148975af4e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T00:56:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d9t5q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T00:56:11Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-678nx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call 
webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T00:56:16Z is after 2025-08-24T17:21:41Z" Mar 11 00:56:16 crc kubenswrapper[4744]: I0311 00:56:16.820136 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:12Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:12Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://80f7b8faf68f817bd772c7988a46ffcc286551806463dc59776e5c6e18deb8f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T00:56:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kub
ernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T00:56:16Z is after 2025-08-24T17:21:41Z" Mar 11 00:56:16 crc kubenswrapper[4744]: I0311 00:56:16.839766 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:11Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T00:56:16Z is after 2025-08-24T17:21:41Z" Mar 11 00:56:16 crc kubenswrapper[4744]: I0311 00:56:16.873738 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-xlclh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e16bf0f3-533b-4114-89c6-195a85273e98\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3cf2ef2701f6c6e8715d3c41c3d7c4d704d6f03989831a0c3d34e92ab7a6dd50\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T00:56:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pgqmk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T00:56:11Z\\\"}}\" for pod \"openshift-multus\"/\"multus-xlclh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T00:56:16Z is after 2025-08-24T17:21:41Z" Mar 11 00:56:16 crc kubenswrapper[4744]: I0311 00:56:16.894583 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-6ghqv" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e146399d-0685-4ea8-96e4-c76c9478a23a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9fd8ff476d73855cbb3da64ae999f54bad2263813a70dba1aa61507b53102d42\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T00:56:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8h2vm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T00:56:11Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-6ghqv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T00:56:16Z is after 2025-08-24T17:21:41Z" Mar 11 00:56:16 crc kubenswrapper[4744]: I0311 00:56:16.919859 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-sj4cl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ac70f681-9bcf-4f8c-a175-ed7d4e9da471\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:11Z\\\",\\\"message\\\":\\\"containers with incomplete status: [routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:11Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:11Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqrjx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://09583de802dbeffc06642379e30e628cf6541f82d5a11caa554cdf17e408476b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://09583de802dbeffc06642379e30e628cf6541f82d5a11caa554cdf17e408476b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T00:56:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T00:56:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqrjx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9a370bc02fc3d53e08d44ce24a85d1e5b20f06a4933b61c8fef04aff8cfe6081\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9a370bc02fc3d53e08d44ce24a85d1e5b20f06a4933b61c8fef04aff8cfe6081\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T00:56:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T00:56:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqrjx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6ad4e761e6f478df27c55306ecb235f8cbbe0954433cbe7a029d585d609a08db\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6ad4e761e6f478df27c55306ecb235f8cbbe0954433cbe7a029d585d609a08db\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T00:56:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T00:56:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqrjx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqrjx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\
\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqrjx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqrjx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T00:56:11Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-sj4cl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T00:56:16Z is after 2025-08-24T17:21:41Z" Mar 11 00:56:16 crc kubenswrapper[4744]: I0311 
00:56:16.952290 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-78fcc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6ff04e11-e747-44c5-b049-371a5d422157\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:11Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:11Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fr9zj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fr9zj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fr9zj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fr9zj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fr9zj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fr9zj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fr9zj\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fr9zj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://524c79ee3bb53b187166cbc6bf19cbb3311ab561608a1d3c819288c1fb684481\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://524c79ee3bb53b187166cbc6bf19cbb3311ab561608a1d3c819288c1fb684481\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T00:56:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T00:56:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fr9zj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T00:56:11Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-78fcc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T00:56:16Z is after 2025-08-24T17:21:41Z" Mar 11 00:56:16 crc kubenswrapper[4744]: I0311 00:56:16.975113 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-tdnf7" Mar 11 00:56:16 crc kubenswrapper[4744]: E0311 00:56:16.975505 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-tdnf7" podUID="7aeb1578-fe93-4bec-8f43-17d0923fa5c0" Mar 11 00:56:16 crc kubenswrapper[4744]: I0311 00:56:16.976625 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:11Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T00:56:16Z is after 2025-08-24T17:21:41Z" Mar 11 00:56:16 crc kubenswrapper[4744]: I0311 00:56:16.997679 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-rplbw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"92f4c9df-1087-4820-8e07-1120f02df454\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://34eabb0e2af63857d40e20376f7979131fef683885949c1ea75e0a4949968872\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T00:56:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lkpvs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2bfa60a94390dd23da621dd81523dde8107fe
6f3beba45ad1b7a96c0d6cd4f56\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T00:56:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lkpvs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T00:56:11Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-rplbw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T00:56:16Z is after 2025-08-24T17:21:41Z" Mar 11 00:56:17 crc kubenswrapper[4744]: I0311 00:56:17.019216 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:11Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T00:56:17Z is after 2025-08-24T17:21:41Z" Mar 11 00:56:17 crc kubenswrapper[4744]: I0311 00:56:17.039564 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-rplbw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"92f4c9df-1087-4820-8e07-1120f02df454\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://34eabb0e2af63857d40e20376f7979131fef683885949c1ea75e0a4949968872\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T00:56:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lkpvs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2bfa60a94390dd23da621dd81523dde8107fe
6f3beba45ad1b7a96c0d6cd4f56\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T00:56:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lkpvs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T00:56:11Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-rplbw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T00:56:17Z is after 2025-08-24T17:21:41Z" Mar 11 00:56:17 crc kubenswrapper[4744]: I0311 00:56:17.060152 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:13Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:13Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://064a29fa0ed4ecddcf11aa3175a3fc935147341c372af46f94eee470a9ba363c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T00:56:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7c4d16fc0af970b13be0d202345567993d215bf342174b1b568c4b4ac26c0017\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T00:56:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T00:56:17Z is after 2025-08-24T17:21:41Z" Mar 11 00:56:17 crc kubenswrapper[4744]: I0311 00:56:17.082614 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-tdnf7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7aeb1578-fe93-4bec-8f43-17d0923fa5c0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:11Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:11Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-th54b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-th54b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T00:56:11Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-tdnf7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T00:56:17Z is after 2025-08-24T17:21:41Z" Mar 11 00:56:17 crc 
kubenswrapper[4744]: I0311 00:56:17.102622 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:16Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:16Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bff1516676048ec93cb3569f8720eb042036b92afc1c65f1c58b15fb604f58b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T00:56:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling 
webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T00:56:17Z is after 2025-08-24T17:21:41Z" Mar 11 00:56:17 crc kubenswrapper[4744]: I0311 00:56:17.121814 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:11Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T00:56:17Z is after 2025-08-24T17:21:41Z" Mar 11 00:56:17 crc kubenswrapper[4744]: I0311 00:56:17.139638 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-8z8gf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ad8329c6-d511-446f-b617-99778d12b878\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://27268ce5ab229a693eeb80e4dfc38c14f18ad32dc201ace682b5516070c52fde\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T00:56:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7gczk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T00:56:11Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-8z8gf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T00:56:17Z is after 2025-08-24T17:21:41Z" Mar 11 00:56:17 crc kubenswrapper[4744]: I0311 00:56:17.161806 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-678nx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a15dc7ac-7c34-4135-b6eb-a85122800ce9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d3d16f6053ae793fbf194d5cbb556a8603bae74b52c9f07aa937dbfd654e3894\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshif
t-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T00:56:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d9t5q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9d5c92e7037925f24a5b88f538d4822fae94b65ea5f07960f3e465148975af4e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T00:56:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d9t5q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T00:56:11Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-678nx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call 
webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T00:56:17Z is after 2025-08-24T17:21:41Z" Mar 11 00:56:17 crc kubenswrapper[4744]: I0311 00:56:17.182128 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1a87b2ce-541f-40ba-a568-93914e7dea62\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:54:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:54:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:55:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:55:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:54:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://19989b1018d06da0230a62c4c231865e9d16267d0eb570a409da006b246005b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://832be2ac0e5b4b69727f022a0663974a495e008d3a781fa4d237d3347d39c154\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-11T00:54:37Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss 
-Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager -v=2\\\\nI0311 00:54:06.849622 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. Worst graceful lease acquisition is {26s}.\\\\nI0311 00:54:06.852083 1 observer_polling.go:159] Starting file observer\\\\nI0311 00:54:06.910593 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0311 00:54:06.914052 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nF0311 00:54:37.381234 1 cmd.go:179] failed checking apiserver connectivity: Get \\\\\\\"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/openshift-kube-controller-manager/leases/cluster-policy-controller-lock\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T00:54:36Z is after 
2026-02-23T05:33:13Z\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-11T00:54:05Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T00:54:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://eb1e3ef7609aa323e622249594eb4ac5cd5cacfa496d8189b3f612e29cb11773\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T00:54:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://82831ac0a065cf2ea80ca05df203fc6de89ad279faae95364ab2519b689453d8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T00:54:05Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ca4d36508e362c9b7f7365ade7dda3dec0c70f9df1be158a034aeaaa772c797e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T00:54:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T00:54:04Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T00:56:17Z is after 2025-08-24T17:21:41Z" Mar 11 00:56:17 crc kubenswrapper[4744]: I0311 00:56:17.202796 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:11Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T00:56:17Z is after 2025-08-24T17:21:41Z" Mar 11 00:56:17 crc kubenswrapper[4744]: I0311 00:56:17.223569 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-xlclh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e16bf0f3-533b-4114-89c6-195a85273e98\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3cf2ef2701f6c6e8715d3c41c3d7c4d704d6f03989831a0c3d34e92ab7a6dd50\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T00:56:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pgqmk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T00:56:11Z\\\"}}\" for pod \"openshift-multus\"/\"multus-xlclh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T00:56:17Z is after 2025-08-24T17:21:41Z" Mar 11 00:56:17 crc kubenswrapper[4744]: I0311 00:56:17.241185 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-6ghqv" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e146399d-0685-4ea8-96e4-c76c9478a23a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9fd8ff476d73855cbb3da64ae999f54bad2263813a70dba1aa61507b53102d42\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T00:56:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8h2vm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T00:56:11Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-6ghqv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T00:56:17Z is after 2025-08-24T17:21:41Z" Mar 11 00:56:17 crc kubenswrapper[4744]: I0311 00:56:17.265156 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-sj4cl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ac70f681-9bcf-4f8c-a175-ed7d4e9da471\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:11Z\\\",\\\"message\\\":\\\"containers with incomplete status: [routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:11Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:11Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqrjx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://09583de802dbeffc06642379e30e628cf6541f82d5a11caa554cdf17e408476b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://09583de802dbeffc06642379e30e628cf6541f82d5a11caa554cdf17e408476b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T00:56:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T00:56:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqrjx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9a370bc02fc3d53e08d44ce24a85d1e5b20f06a4933b61c8fef04aff8cfe6081\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9a370bc02fc3d53e08d44ce24a85d1e5b20f06a4933b61c8fef04aff8cfe6081\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T00:56:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T00:56:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqrjx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6ad4e761e6f478df27c55306ecb235f8cbbe0954433cbe7a029d585d609a08db\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6ad4e761e6f478df27c55306ecb235f8cbbe0954433cbe7a029d585d609a08db\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T00:56:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T00:56:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqrjx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqrjx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\
\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqrjx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqrjx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T00:56:11Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-sj4cl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T00:56:17Z is after 2025-08-24T17:21:41Z" Mar 11 00:56:17 crc kubenswrapper[4744]: I0311 
00:56:17.297242 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-78fcc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6ff04e11-e747-44c5-b049-371a5d422157\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:11Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:11Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fr9zj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fr9zj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fr9zj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fr9zj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fr9zj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fr9zj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fr9zj\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fr9zj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://524c79ee3bb53b187166cbc6bf19cbb3311ab561608a1d3c819288c1fb684481\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://524c79ee3bb53b187166cbc6bf19cbb3311ab561608a1d3c819288c1fb684481\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T00:56:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T00:56:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fr9zj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T00:56:11Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-78fcc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T00:56:17Z is after 2025-08-24T17:21:41Z" Mar 11 00:56:17 crc kubenswrapper[4744]: I0311 00:56:17.327167 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:12Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:12Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://80f7b8faf68f817bd772c7988a46ffcc286551806463dc59776e5c6e18deb8f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\"
:{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T00:56:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T00:56:17Z is after 2025-08-24T17:21:41Z" Mar 11 00:56:17 crc kubenswrapper[4744]: I0311 00:56:17.668780 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-78fcc" event={"ID":"6ff04e11-e747-44c5-b049-371a5d422157","Type":"ContainerStarted","Data":"742e3b6ac0accae72e9285636ab0920db1d3e0625d6093adf1606f57b484b139"} Mar 11 00:56:17 crc kubenswrapper[4744]: I0311 00:56:17.674167 4744 generic.go:334] "Generic (PLEG): container finished" podID="ac70f681-9bcf-4f8c-a175-ed7d4e9da471" containerID="96ebe5cbd3477966cb49d08397fee1954529001d0bede507c64ca71c25786a87" exitCode=0 Mar 11 00:56:17 crc kubenswrapper[4744]: I0311 00:56:17.674279 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-sj4cl" event={"ID":"ac70f681-9bcf-4f8c-a175-ed7d4e9da471","Type":"ContainerDied","Data":"96ebe5cbd3477966cb49d08397fee1954529001d0bede507c64ca71c25786a87"} Mar 11 00:56:17 crc kubenswrapper[4744]: I0311 00:56:17.692006 4744 
status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:16Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:16Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bff1516676048ec93cb3569f8720eb042036b92afc1c65f1c58b15fb604f58b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T00:56:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T00:56:17Z is after 2025-08-24T17:21:41Z" Mar 11 00:56:17 crc kubenswrapper[4744]: I0311 00:56:17.715222 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:11Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T00:56:17Z is after 2025-08-24T17:21:41Z" Mar 11 00:56:17 crc kubenswrapper[4744]: I0311 00:56:17.736031 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-8z8gf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ad8329c6-d511-446f-b617-99778d12b878\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://27268ce5ab229a693eeb80e4dfc38c14f18ad32dc201ace682b5516070c52fde\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T00:56:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7gczk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T00:56:11Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-8z8gf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T00:56:17Z is after 2025-08-24T17:21:41Z" Mar 11 00:56:17 crc kubenswrapper[4744]: I0311 00:56:17.755491 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-678nx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a15dc7ac-7c34-4135-b6eb-a85122800ce9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d3d16f6053ae793fbf194d5cbb556a8603bae74b52c9f07aa937dbfd654e3894\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshif
t-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T00:56:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d9t5q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9d5c92e7037925f24a5b88f538d4822fae94b65ea5f07960f3e465148975af4e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T00:56:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d9t5q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T00:56:11Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-678nx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call 
webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T00:56:17Z is after 2025-08-24T17:21:41Z" Mar 11 00:56:17 crc kubenswrapper[4744]: I0311 00:56:17.771057 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1a87b2ce-541f-40ba-a568-93914e7dea62\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:54:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:54:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:55:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:55:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:54:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://19989b1018d06da0230a62c4c231865e9d16267d0eb570a409da006b246005b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://832be2ac0e5b4b69727f022a0663974a495e008d3a781fa4d237d3347d39c154\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-11T00:54:37Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss 
-Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager -v=2\\\\nI0311 00:54:06.849622 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. Worst graceful lease acquisition is {26s}.\\\\nI0311 00:54:06.852083 1 observer_polling.go:159] Starting file observer\\\\nI0311 00:54:06.910593 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0311 00:54:06.914052 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nF0311 00:54:37.381234 1 cmd.go:179] failed checking apiserver connectivity: Get \\\\\\\"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/openshift-kube-controller-manager/leases/cluster-policy-controller-lock\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T00:54:36Z is after 
2026-02-23T05:33:13Z\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-11T00:54:05Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T00:54:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://eb1e3ef7609aa323e622249594eb4ac5cd5cacfa496d8189b3f612e29cb11773\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T00:54:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://82831ac0a065cf2ea80ca05df203fc6de89ad279faae95364ab2519b689453d8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T00:54:05Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ca4d36508e362c9b7f7365ade7dda3dec0c70f9df1be158a034aeaaa772c797e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T00:54:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T00:54:04Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T00:56:17Z is after 2025-08-24T17:21:41Z" Mar 11 00:56:17 crc kubenswrapper[4744]: I0311 00:56:17.792794 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:12Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:12Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://80f7b8faf68f817bd772c7988a46ffcc286551806463dc59776e5c6e18deb8f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T00:56:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-11T00:56:17Z is after 2025-08-24T17:21:41Z" Mar 11 00:56:17 crc kubenswrapper[4744]: I0311 00:56:17.814202 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:11Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T00:56:17Z is after 2025-08-24T17:21:41Z" Mar 11 00:56:17 crc kubenswrapper[4744]: I0311 00:56:17.830573 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-xlclh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e16bf0f3-533b-4114-89c6-195a85273e98\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3cf2ef2701f6c6e8715d3c41c3d7c4d704d6f03989831a0c3d34e92ab7a6dd50\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T00:56:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pgqmk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T00:56:11Z\\\"}}\" for pod \"openshift-multus\"/\"multus-xlclh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T00:56:17Z is after 2025-08-24T17:21:41Z" Mar 11 00:56:17 crc kubenswrapper[4744]: I0311 00:56:17.845911 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-6ghqv" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e146399d-0685-4ea8-96e4-c76c9478a23a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9fd8ff476d73855cbb3da64ae999f54bad2263813a70dba1aa61507b53102d42\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T00:56:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8h2vm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T00:56:11Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-6ghqv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T00:56:17Z is after 2025-08-24T17:21:41Z" Mar 11 00:56:17 crc kubenswrapper[4744]: I0311 00:56:17.863902 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-sj4cl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ac70f681-9bcf-4f8c-a175-ed7d4e9da471\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:11Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:11Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:11Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqrjx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://09583de802dbeffc06642379e30e628cf6541f82d5a11caa554cdf17e408476b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://09583de802dbeffc06642379e30e628cf6541f82d5a11caa554cdf17e408476b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T00:56:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T00:56:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqrjx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9a370bc02fc3d53e08d44ce24a85d1e5b20f06a4933b61c8fef04aff8cfe6081\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9a370bc02fc3d53e08d44ce24a85d1e5b20f06a4933b61c8fef04aff8cfe6081\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T00:56:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T00:56:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqrjx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6ad4e761e6f478df27c55306ecb235f8cbbe0954433cbe7a029d585d609a08db\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6ad4e761e6f478df27c55306ecb235f8cbbe0954433cbe7a029d585d609a08db\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T00:56:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T00:56:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqrjx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://96ebe5cbd3477966cb49d08397fee1954529001d0bede507c64ca71c25786a87\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://96ebe5cbd3477966cb49d08397fee1954529001d0bede507c64ca71c25786a87\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T00:56:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T00:56:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqrjx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqrjx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqrjx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T00:56:11Z\\\"}}\" for pod 
\"openshift-multus\"/\"multus-additional-cni-plugins-sj4cl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T00:56:17Z is after 2025-08-24T17:21:41Z" Mar 11 00:56:17 crc kubenswrapper[4744]: I0311 00:56:17.891132 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-78fcc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6ff04e11-e747-44c5-b049-371a5d422157\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:11Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:11Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fr9zj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fr9zj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fr9zj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fr9zj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fr9zj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fr9zj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fr9zj\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fr9zj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://524c79ee3bb53b187166cbc6bf19cbb3311ab561608a1d3c819288c1fb684481\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://524c79ee3bb53b187166cbc6bf19cbb3311ab561608a1d3c819288c1fb684481\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T00:56:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T00:56:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fr9zj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T00:56:11Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-78fcc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T00:56:17Z is after 2025-08-24T17:21:41Z" Mar 11 00:56:17 crc kubenswrapper[4744]: I0311 00:56:17.907989 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:11Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T00:56:17Z is after 2025-08-24T17:21:41Z" Mar 11 00:56:17 crc kubenswrapper[4744]: I0311 00:56:17.919880 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-rplbw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"92f4c9df-1087-4820-8e07-1120f02df454\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://34eabb0e2af63857d40e20376f7979131fef683885949c1ea75e0a4949968872\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T00:56:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lkpvs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2bfa60a94390dd23da621dd81523dde8107fe
6f3beba45ad1b7a96c0d6cd4f56\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T00:56:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lkpvs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T00:56:11Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-rplbw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T00:56:17Z is after 2025-08-24T17:21:41Z" Mar 11 00:56:17 crc kubenswrapper[4744]: I0311 00:56:17.932628 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-tdnf7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7aeb1578-fe93-4bec-8f43-17d0923fa5c0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:11Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:11Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-th54b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-th54b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T00:56:11Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-tdnf7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T00:56:17Z is after 2025-08-24T17:21:41Z" Mar 11 00:56:17 crc 
kubenswrapper[4744]: I0311 00:56:17.949644 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:13Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:13Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://064a29fa0ed4ecddcf11aa3175a3fc935147341c372af46f94eee470a9ba363c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T00:56:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7c4d16fc0af970b13be0d202345567993d215bf342174b1b568c4b4ac26c0017\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art
-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T00:56:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T00:56:17Z is after 2025-08-24T17:21:41Z" Mar 11 00:56:17 crc kubenswrapper[4744]: I0311 00:56:17.973796 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 11 00:56:17 crc kubenswrapper[4744]: I0311 00:56:17.973875 4744 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 11 00:56:17 crc kubenswrapper[4744]: E0311 00:56:17.973930 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 11 00:56:17 crc kubenswrapper[4744]: E0311 00:56:17.974163 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 11 00:56:17 crc kubenswrapper[4744]: I0311 00:56:17.974254 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 11 00:56:17 crc kubenswrapper[4744]: E0311 00:56:17.974383 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 11 00:56:18 crc kubenswrapper[4744]: I0311 00:56:18.687333 4744 generic.go:334] "Generic (PLEG): container finished" podID="ac70f681-9bcf-4f8c-a175-ed7d4e9da471" containerID="cdbd8c4d24a706d746c84147c617f511da67e77e20235bb94b3b365552ef8c23" exitCode=0 Mar 11 00:56:18 crc kubenswrapper[4744]: I0311 00:56:18.687486 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-sj4cl" event={"ID":"ac70f681-9bcf-4f8c-a175-ed7d4e9da471","Type":"ContainerDied","Data":"cdbd8c4d24a706d746c84147c617f511da67e77e20235bb94b3b365552ef8c23"} Mar 11 00:56:18 crc kubenswrapper[4744]: I0311 00:56:18.717135 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:16Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:16Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bff1516676048ec93cb3569f8720eb042036b92afc1c65f1c58b15fb604f58b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{}
,\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T00:56:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T00:56:18Z is after 2025-08-24T17:21:41Z" Mar 11 00:56:18 crc kubenswrapper[4744]: I0311 00:56:18.757490 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:11Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T00:56:18Z is after 2025-08-24T17:21:41Z" Mar 11 00:56:18 crc kubenswrapper[4744]: I0311 00:56:18.772884 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-8z8gf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ad8329c6-d511-446f-b617-99778d12b878\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://27268ce5ab229a693eeb80e4dfc38c14f18ad32dc201ace682b5516070c52fde\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T00:56:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7gczk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T00:56:11Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-8z8gf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T00:56:18Z is after 2025-08-24T17:21:41Z" Mar 11 00:56:18 crc kubenswrapper[4744]: I0311 00:56:18.791432 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-678nx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a15dc7ac-7c34-4135-b6eb-a85122800ce9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d3d16f6053ae793fbf194d5cbb556a8603bae74b52c9f07aa937dbfd654e3894\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshif
t-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T00:56:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d9t5q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9d5c92e7037925f24a5b88f538d4822fae94b65ea5f07960f3e465148975af4e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T00:56:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d9t5q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T00:56:11Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-678nx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call 
webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T00:56:18Z is after 2025-08-24T17:21:41Z" Mar 11 00:56:18 crc kubenswrapper[4744]: I0311 00:56:18.810176 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1a87b2ce-541f-40ba-a568-93914e7dea62\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:54:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:54:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:55:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:55:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:54:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://19989b1018d06da0230a62c4c231865e9d16267d0eb570a409da006b246005b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://832be2ac0e5b4b69727f022a0663974a495e008d3a781fa4d237d3347d39c154\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-11T00:54:37Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss 
-Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager -v=2\\\\nI0311 00:54:06.849622 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. Worst graceful lease acquisition is {26s}.\\\\nI0311 00:54:06.852083 1 observer_polling.go:159] Starting file observer\\\\nI0311 00:54:06.910593 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0311 00:54:06.914052 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nF0311 00:54:37.381234 1 cmd.go:179] failed checking apiserver connectivity: Get \\\\\\\"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/openshift-kube-controller-manager/leases/cluster-policy-controller-lock\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T00:54:36Z is after 
2026-02-23T05:33:13Z\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-11T00:54:05Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T00:54:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://eb1e3ef7609aa323e622249594eb4ac5cd5cacfa496d8189b3f612e29cb11773\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T00:54:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://82831ac0a065cf2ea80ca05df203fc6de89ad279faae95364ab2519b689453d8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T00:54:05Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ca4d36508e362c9b7f7365ade7dda3dec0c70f9df1be158a034aeaaa772c797e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T00:54:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T00:54:04Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T00:56:18Z is after 2025-08-24T17:21:41Z" Mar 11 00:56:18 crc kubenswrapper[4744]: I0311 00:56:18.821878 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:11Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T00:56:18Z is after 2025-08-24T17:21:41Z" Mar 11 00:56:18 crc kubenswrapper[4744]: I0311 00:56:18.836906 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-xlclh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e16bf0f3-533b-4114-89c6-195a85273e98\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3cf2ef2701f6c6e8715d3c41c3d7c4d704d6f03989831a0c3d34e92ab7a6dd50\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T00:56:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pgqmk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T00:56:11Z\\\"}}\" for pod \"openshift-multus\"/\"multus-xlclh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T00:56:18Z is after 2025-08-24T17:21:41Z" Mar 11 00:56:18 crc kubenswrapper[4744]: I0311 00:56:18.846798 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-6ghqv" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e146399d-0685-4ea8-96e4-c76c9478a23a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9fd8ff476d73855cbb3da64ae999f54bad2263813a70dba1aa61507b53102d42\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T00:56:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8h2vm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T00:56:11Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-6ghqv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T00:56:18Z is after 2025-08-24T17:21:41Z" Mar 11 00:56:18 crc kubenswrapper[4744]: I0311 00:56:18.861261 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-sj4cl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ac70f681-9bcf-4f8c-a175-ed7d4e9da471\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:11Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:11Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:11Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqrjx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://09583de802dbeffc06642379e30e628cf6541f82d5a11caa554cdf17e408476b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://09583de802dbeffc06642379e30e628cf6541f82d5a11caa554cdf17e408476b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T00:56:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T00:56:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqrjx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9a370bc02fc3d53e08d44ce24a85d1e5b20f06a4933b61c8fef04aff8cfe6081\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9a370bc02fc3d53e08d44ce24a85d1e5b20f06a4933b61c8fef04aff8cfe6081\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T00:56:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T00:56:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqrjx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6ad4e761e6f478df27c55306ecb235f8cbbe0954433cbe7a029d585d609a08db\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6ad4e761e6f478df27c55306ecb235f8cbbe0954433cbe7a029d585d609a08db\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T00:56:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T00:56:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqrjx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://96ebe5cbd3477966cb49d08397fee1954529001d0bede507c64ca71c25786a87\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://96ebe5cbd3477966cb49d08397fee1954529001d0bede507c64ca71c25786a87\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T00:56:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T00:56:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqrjx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cdbd8c4d24a706d746c84147c617f511da67e77e20235bb94b3b365552ef8c23\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cdbd8c4d24a706d746c84147c617f511da67e77e20235bb94b3b365552ef8c23\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T00:56:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T00:56:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqrjx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"c
nibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqrjx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T00:56:11Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-sj4cl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T00:56:18Z is after 2025-08-24T17:21:41Z" Mar 11 00:56:18 crc kubenswrapper[4744]: I0311 00:56:18.880736 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-78fcc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6ff04e11-e747-44c5-b049-371a5d422157\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:11Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:11Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fr9zj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/v
ar/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fr9zj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fr9zj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fr9zj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},
{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fr9zj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fr9zj\\\",\\\"readOnly\\\":true,\\\
"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mount
Path\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fr9zj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fr9zj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://524c79ee3bb53b187166cbc6bf19cbb3311ab561608a1d3c819288c1fb684481\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\"
:{\\\"containerID\\\":\\\"cri-o://524c79ee3bb53b187166cbc6bf19cbb3311ab561608a1d3c819288c1fb684481\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T00:56:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T00:56:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fr9zj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T00:56:11Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-78fcc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T00:56:18Z is after 2025-08-24T17:21:41Z" Mar 11 00:56:18 crc kubenswrapper[4744]: I0311 00:56:18.893943 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:12Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:12Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://80f7b8faf68f817bd772c7988a46ffcc286551806463dc59776e5c6e18deb8f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T00:56:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-11T00:56:18Z is after 2025-08-24T17:21:41Z" Mar 11 00:56:18 crc kubenswrapper[4744]: I0311 00:56:18.907039 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:11Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T00:56:18Z is after 2025-08-24T17:21:41Z" Mar 11 00:56:18 crc kubenswrapper[4744]: I0311 00:56:18.920752 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-rplbw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"92f4c9df-1087-4820-8e07-1120f02df454\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://34eabb0e2af63857d40e20376f7979131fef683885949c1ea75e0a4949968872\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T00:56:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lkpvs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2bfa60a94390dd23da621dd81523dde8107fe
6f3beba45ad1b7a96c0d6cd4f56\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T00:56:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lkpvs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T00:56:11Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-rplbw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T00:56:18Z is after 2025-08-24T17:21:41Z" Mar 11 00:56:18 crc kubenswrapper[4744]: I0311 00:56:18.937948 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:13Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:13Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://064a29fa0ed4ecddcf11aa3175a3fc935147341c372af46f94eee470a9ba363c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T00:56:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7c4d16fc0af970b13be0d202345567993d215bf342174b1b568c4b4ac26c0017\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T00:56:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T00:56:18Z is after 2025-08-24T17:21:41Z" Mar 11 00:56:18 crc kubenswrapper[4744]: I0311 00:56:18.957941 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-tdnf7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7aeb1578-fe93-4bec-8f43-17d0923fa5c0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:11Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:11Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-th54b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-th54b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T00:56:11Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-tdnf7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T00:56:18Z is after 2025-08-24T17:21:41Z" Mar 11 00:56:18 crc 
kubenswrapper[4744]: I0311 00:56:18.974837 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-tdnf7" Mar 11 00:56:18 crc kubenswrapper[4744]: E0311 00:56:18.975065 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-tdnf7" podUID="7aeb1578-fe93-4bec-8f43-17d0923fa5c0" Mar 11 00:56:18 crc kubenswrapper[4744]: I0311 00:56:18.991072 4744 scope.go:117] "RemoveContainer" containerID="4b2b952a5490bff5028c83fdfb0942e84bac782d470cede57ec6a31f483a407e" Mar 11 00:56:18 crc kubenswrapper[4744]: E0311 00:56:18.991323 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 1m20s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 11 00:56:18 crc kubenswrapper[4744]: I0311 00:56:18.991942 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Mar 11 00:56:19 crc kubenswrapper[4744]: E0311 00:56:19.075452 4744 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
Mar 11 00:56:19 crc kubenswrapper[4744]: I0311 00:56:19.697404 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-78fcc" event={"ID":"6ff04e11-e747-44c5-b049-371a5d422157","Type":"ContainerStarted","Data":"aa4304285d6a4bb306a291f5a152fa229d42c20a3bc7abb0fd4137bc3f16ee56"} Mar 11 00:56:19 crc kubenswrapper[4744]: I0311 00:56:19.697705 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-78fcc" Mar 11 00:56:19 crc kubenswrapper[4744]: I0311 00:56:19.697747 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-78fcc" Mar 11 00:56:19 crc kubenswrapper[4744]: I0311 00:56:19.698021 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-78fcc" Mar 11 00:56:19 crc kubenswrapper[4744]: I0311 00:56:19.703777 4744 generic.go:334] "Generic (PLEG): container finished" podID="ac70f681-9bcf-4f8c-a175-ed7d4e9da471" containerID="81090de2003b39fdaee05a64219b7ebefa57068b2682e821d55ea9615ba67d4b" exitCode=0 Mar 11 00:56:19 crc kubenswrapper[4744]: I0311 00:56:19.703881 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-sj4cl" event={"ID":"ac70f681-9bcf-4f8c-a175-ed7d4e9da471","Type":"ContainerDied","Data":"81090de2003b39fdaee05a64219b7ebefa57068b2682e821d55ea9615ba67d4b"} Mar 11 00:56:19 crc kubenswrapper[4744]: I0311 00:56:19.704871 4744 scope.go:117] "RemoveContainer" containerID="4b2b952a5490bff5028c83fdfb0942e84bac782d470cede57ec6a31f483a407e" Mar 11 00:56:19 crc kubenswrapper[4744]: E0311 00:56:19.705229 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 1m20s restarting failed container=kube-apiserver-check-endpoints 
pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 11 00:56:19 crc kubenswrapper[4744]: I0311 00:56:19.708059 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 11 00:56:19 crc kubenswrapper[4744]: E0311 00:56:19.708174 4744 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-11 00:56:27.708148998 +0000 UTC m=+144.512366623 (durationBeforeRetry 8s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 11 00:56:19 crc kubenswrapper[4744]: I0311 00:56:19.708211 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 11 00:56:19 crc kubenswrapper[4744]: I0311 00:56:19.708258 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 11 00:56:19 crc kubenswrapper[4744]: I0311 00:56:19.708306 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 11 00:56:19 crc kubenswrapper[4744]: E0311 00:56:19.708387 4744 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Mar 11 00:56:19 crc kubenswrapper[4744]: E0311 00:56:19.708423 4744 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-11 00:56:27.708413745 +0000 UTC m=+144.512631360 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Mar 11 00:56:19 crc kubenswrapper[4744]: E0311 00:56:19.708507 4744 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 11 00:56:19 crc kubenswrapper[4744]: E0311 00:56:19.708563 4744 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 11 00:56:19 crc kubenswrapper[4744]: E0311 00:56:19.708581 4744 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 11 00:56:19 crc kubenswrapper[4744]: E0311 00:56:19.708615 4744 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-03-11 00:56:27.708603961 +0000 UTC m=+144.512821576 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 11 00:56:19 crc kubenswrapper[4744]: E0311 00:56:19.708673 4744 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 11 00:56:19 crc kubenswrapper[4744]: E0311 00:56:19.708699 4744 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-11 00:56:27.708692314 +0000 UTC m=+144.512909929 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 11 00:56:19 crc kubenswrapper[4744]: I0311 00:56:19.729677 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:13Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:13Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://064a29fa0ed4ecddcf11aa3175a3fc935147341c372af46f94eee470a9ba363c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T00:56:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"nam
e\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7c4d16fc0af970b13be0d202345567993d215bf342174b1b568c4b4ac26c0017\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T00:56:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T00:56:19Z is after 2025-08-24T17:21:41Z" Mar 11 00:56:19 crc kubenswrapper[4744]: I0311 00:56:19.739666 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-78fcc" Mar 11 00:56:19 crc kubenswrapper[4744]: I0311 00:56:19.739756 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openshift-ovn-kubernetes/ovnkube-node-78fcc" Mar 11 00:56:19 crc kubenswrapper[4744]: I0311 00:56:19.749138 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-tdnf7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7aeb1578-fe93-4bec-8f43-17d0923fa5c0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:11Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:11Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-th54b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-th54b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T00:56:11Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-tdnf7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T00:56:19Z is after 2025-08-24T17:21:41Z" Mar 11 00:56:19 crc 
kubenswrapper[4744]: I0311 00:56:19.765785 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1a87b2ce-541f-40ba-a568-93914e7dea62\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:54:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:54:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:55:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:55:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:54:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://19989b1018d06da0230a62c4c231865e9d16267d0eb570a409da006b246005b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://832be2ac0e5b4b69727f022a0663974a495e008d3a781fa4d237d3347d39c154\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-11T00:54:37Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start 
--config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager -v=2\\\\nI0311 00:54:06.849622 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. Worst graceful lease acquisition is {26s}.\\\\nI0311 00:54:06.852083 1 observer_polling.go:159] Starting file observer\\\\nI0311 00:54:06.910593 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0311 00:54:06.914052 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nF0311 00:54:37.381234 1 cmd.go:179] failed checking apiserver connectivity: Get \\\\\\\"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/openshift-kube-controller-manager/leases/cluster-policy-controller-lock\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T00:54:36Z is after 
2026-02-23T05:33:13Z\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-11T00:54:05Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T00:54:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://eb1e3ef7609aa323e622249594eb4ac5cd5cacfa496d8189b3f612e29cb11773\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T00:54:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://82831ac0a065cf2ea80ca05df203fc6de89ad279faae95364ab2519b689453d8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T00:54:05Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ca4d36508e362c9b7f7365ade7dda3dec0c70f9df1be158a034aeaaa772c797e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T00:54:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T00:54:04Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T00:56:19Z is after 2025-08-24T17:21:41Z" Mar 11 00:56:19 crc kubenswrapper[4744]: I0311 00:56:19.778373 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:16Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:16Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bff1516676048ec93cb3569f8720eb042036b92afc1c65f1c58b15fb604f58b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T00:56:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-11T00:56:19Z is after 2025-08-24T17:21:41Z" Mar 11 00:56:19 crc kubenswrapper[4744]: I0311 00:56:19.796728 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:11Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T00:56:19Z is after 2025-08-24T17:21:41Z" Mar 11 00:56:19 crc kubenswrapper[4744]: I0311 00:56:19.810847 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 11 00:56:19 crc kubenswrapper[4744]: I0311 00:56:19.811172 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/7aeb1578-fe93-4bec-8f43-17d0923fa5c0-metrics-certs\") pod \"network-metrics-daemon-tdnf7\" (UID: \"7aeb1578-fe93-4bec-8f43-17d0923fa5c0\") " pod="openshift-multus/network-metrics-daemon-tdnf7" Mar 11 00:56:19 crc kubenswrapper[4744]: E0311 00:56:19.811741 4744 secret.go:188] Couldn't get 
secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Mar 11 00:56:19 crc kubenswrapper[4744]: E0311 00:56:19.811818 4744 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7aeb1578-fe93-4bec-8f43-17d0923fa5c0-metrics-certs podName:7aeb1578-fe93-4bec-8f43-17d0923fa5c0 nodeName:}" failed. No retries permitted until 2026-03-11 00:56:27.811794221 +0000 UTC m=+144.616011836 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/7aeb1578-fe93-4bec-8f43-17d0923fa5c0-metrics-certs") pod "network-metrics-daemon-tdnf7" (UID: "7aeb1578-fe93-4bec-8f43-17d0923fa5c0") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 11 00:56:19 crc kubenswrapper[4744]: E0311 00:56:19.814866 4744 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 11 00:56:19 crc kubenswrapper[4744]: E0311 00:56:19.814916 4744 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 11 00:56:19 crc kubenswrapper[4744]: E0311 00:56:19.814936 4744 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 11 00:56:19 crc kubenswrapper[4744]: E0311 00:56:19.814998 4744 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. 
No retries permitted until 2026-03-11 00:56:27.814980775 +0000 UTC m=+144.619198500 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 11 00:56:19 crc kubenswrapper[4744]: I0311 00:56:19.816494 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-8z8gf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ad8329c6-d511-446f-b617-99778d12b878\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://27268ce5ab229a693eeb80e4dfc38c14f18ad32dc201ace682b5516070c52fde\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sh
a256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T00:56:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7gczk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T00:56:11Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-8z8gf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T00:56:19Z is after 2025-08-24T17:21:41Z" Mar 11 00:56:19 crc kubenswrapper[4744]: I0311 00:56:19.832239 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-678nx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a15dc7ac-7c34-4135-b6eb-a85122800ce9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d3d16f6053ae793fbf194d5cbb556a8603bae74b52c9f07aa937dbfd654e3894\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T00:56:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d9t5q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9d5c92e7037925f24a5b88f538d4822fae94b65e
a5f07960f3e465148975af4e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T00:56:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d9t5q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T00:56:11Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-678nx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T00:56:19Z is after 2025-08-24T17:21:41Z" Mar 11 00:56:19 crc kubenswrapper[4744]: I0311 00:56:19.874963 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-78fcc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6ff04e11-e747-44c5-b049-371a5d422157\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:11Z\\\",\\\"message\\\":\\\"containers with unready status: [nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:11Z\\\",\\\"message\\\":\\\"containers with unready status: [nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8132bd89cd281c66b2735b3fa85c3fc773e7140d6aaeaefe0c0a2c3e3f743080\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T00:56:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fr9zj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e37e770ef43eabf7e0bb162d7e700eaa2d61c7ceca9737de5872c1caeec06774\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T00:56:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\
",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fr9zj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://98ba07651ae84a42f92411fee85a15562e325c61ff2d8fcba4962d21c51fce73\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T00:56:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fr9zj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d7cd308cadb08ea56bbc709c6401ab1d3eec24f0414a7ee1c57188c5e60adffe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T00:56:14Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fr9zj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://84d4ada431e7a91cc86b5cb9bba4316c233c5f666f21bce1d8b6df0019b6ad76\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T00:56:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fr9zj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://27bc44383b34aea31ffcebae0f1f464c31918edc0c37f01d6bd66166e34fd4a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T00:56:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fr9zj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://aa4304285d6a4bb306a291f5a152fa229d42c20a3bc7abb0fd4137bc3f16ee56\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T00:56:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fr9zj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://742e3b6ac0accae72e9285636ab0920db1d3e0625d6093adf1606f57b484b139\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b
17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T00:56:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fr9zj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://524c79ee3bb53b187166cbc6bf19cbb3311ab561608a1d3c819288c1fb684481\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://524c79ee3bb53b187166cbc6bf19cbb3311ab561608a1d3c819288c1fb684481\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T00:56:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T00:56:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fr9zj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Ru
nning\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T00:56:11Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-78fcc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T00:56:19Z is after 2025-08-24T17:21:41Z" Mar 11 00:56:19 crc kubenswrapper[4744]: I0311 00:56:19.897798 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:12Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:12Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://80f7b8faf68f817bd772c7988a46ffcc286551806463dc59776e5c6e18deb8f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-
11T00:56:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T00:56:19Z is after 2025-08-24T17:21:41Z" Mar 11 00:56:19 crc kubenswrapper[4744]: I0311 00:56:19.917526 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:11Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T00:56:19Z is after 2025-08-24T17:21:41Z" Mar 11 00:56:19 crc kubenswrapper[4744]: I0311 00:56:19.933315 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-xlclh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e16bf0f3-533b-4114-89c6-195a85273e98\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3cf2ef2701f6c6e8715d3c41c3d7c4d704d6f03989831a0c3d34e92ab7a6dd50\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T00:56:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pgqmk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T00:56:11Z\\\"}}\" for pod \"openshift-multus\"/\"multus-xlclh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T00:56:19Z is after 2025-08-24T17:21:41Z" Mar 11 00:56:19 crc kubenswrapper[4744]: I0311 00:56:19.944496 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-6ghqv" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e146399d-0685-4ea8-96e4-c76c9478a23a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9fd8ff476d73855cbb3da64ae999f54bad2263813a70dba1aa61507b53102d42\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T00:56:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8h2vm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T00:56:11Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-6ghqv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T00:56:19Z is after 2025-08-24T17:21:41Z" Mar 11 00:56:19 crc kubenswrapper[4744]: I0311 00:56:19.961486 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-sj4cl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ac70f681-9bcf-4f8c-a175-ed7d4e9da471\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:11Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:11Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:11Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqrjx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://09583de802dbeffc06642379e30e628cf6541f82d5a11caa554cdf17e408476b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://09583de802dbeffc06642379e30e628cf6541f82d5a11caa554cdf17e408476b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T00:56:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T00:56:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqrjx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9a370bc02fc3d53e08d44ce24a85d1e5b20f06a4933b61c8fef04aff8cfe6081\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9a370bc02fc3d53e08d44ce24a85d1e5b20f06a4933b61c8fef04aff8cfe6081\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T00:56:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T00:56:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqrjx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6ad4e761e6f478df27c55306ecb235f8cbbe0954433cbe7a029d585d609a08db\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6ad4e761e6f478df27c55306ecb235f8cbbe0954433cbe7a029d585d609a08db\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T00:56:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T00:56:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqrjx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://96ebe5cbd3477966cb49d08397fee1954529001d0bede507c64ca71c25786a87\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://96ebe5cbd3477966cb49d08397fee1954529001d0bede507c64ca71c25786a87\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T00:56:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T00:56:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqrjx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cdbd8c4d24a706d746c84147c617f511da67e77e20235bb94b3b365552ef8c23\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cdbd8c4d24a706d746c84147c617f511da67e77e20235bb94b3b365552ef8c23\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T00:56:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T00:56:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqrjx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"c
nibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqrjx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T00:56:11Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-sj4cl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T00:56:19Z is after 2025-08-24T17:21:41Z" Mar 11 00:56:19 crc kubenswrapper[4744]: I0311 00:56:19.974639 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 11 00:56:19 crc kubenswrapper[4744]: I0311 00:56:19.974766 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 11 00:56:19 crc kubenswrapper[4744]: E0311 00:56:19.974957 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 11 00:56:19 crc kubenswrapper[4744]: I0311 00:56:19.974987 4744 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 11 00:56:19 crc kubenswrapper[4744]: I0311 00:56:19.975832 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e522f1d8-5329-414c-88d5-79e6f3b615be\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:54:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:54:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:54:04Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:54:04Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:54:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://31713fedc28cf71b8df25e22000768f5d00e6de1b228a632aa3c51aea11eea7c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T00:54:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e702350d7b926ec4df6853a20470ea7515f1e1c98720db1a4d672ea4e03e82fa\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T00:54:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://2d93a4815010e9ee724175731b03bf82804921c5bbb9fe8be236057f90057f28\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T00:54:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4b2b952a5490bff5028c83fdfb0942e84bac782d470cede57ec6a31f483a407e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4b2b952a5490bff5028c83fdfb0942e84bac782d470cede57ec6a31f483a407e\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-11T00:55:54Z\\\",\\\"message\\\":\\\"le observer\\\\nW0311 00:55:54.578734 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0311 00:55:54.579118 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0311 00:55:54.580623 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-3889625742/tls.crt::/tmp/serving-cert-3889625742/tls.key\\\\\\\"\\\\nI0311 00:55:54.806131 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0311 00:55:54.812332 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0311 00:55:54.812369 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0311 00:55:54.812401 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0311 00:55:54.812411 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0311 00:55:54.819161 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0311 00:55:54.819196 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0311 00:55:54.819211 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0311 00:55:54.819223 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0311 00:55:54.819235 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0311 00:55:54.819243 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0311 00:55:54.819248 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0311 00:55:54.819254 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0311 00:55:54.821179 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-11T00:55:54Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 1m20s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c9f3a6d71079f1717c07e0b026f34d1bfd6391a5bb6745bcdab535d5290e2aa3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T00:54:07Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://edfc2acc25990d1ccbca879e186fb9dfc0a6664fa1df5cee62958ff52b217748\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://edfc2acc25990d1ccbca879e186fb9dfc0a6664fa1df5cee62958ff52b217748\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T00:54:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T00:54:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T00:54:04Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T00:56:19Z is after 2025-08-24T17:21:41Z" Mar 11 00:56:19 crc kubenswrapper[4744]: E0311 00:56:19.976682 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 11 00:56:19 crc kubenswrapper[4744]: E0311 00:56:19.977135 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 11 00:56:19 crc kubenswrapper[4744]: I0311 00:56:19.995529 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:11Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T00:56:19Z is after 2025-08-24T17:21:41Z" Mar 11 00:56:20 crc kubenswrapper[4744]: I0311 00:56:20.007577 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-rplbw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"92f4c9df-1087-4820-8e07-1120f02df454\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://34eabb0e2af63857d40e20376f7979131fef683885949c1ea75e0a4949968872\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T00:56:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lkpvs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2bfa60a94390dd23da621dd81523dde8107fe
6f3beba45ad1b7a96c0d6cd4f56\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T00:56:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lkpvs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T00:56:11Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-rplbw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T00:56:20Z is after 2025-08-24T17:21:41Z" Mar 11 00:56:20 crc kubenswrapper[4744]: I0311 00:56:20.021603 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:12Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:12Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://80f7b8faf68f817bd772c7988a46ffcc286551806463dc59776e5c6e18deb8f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T00:56:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-11T00:56:20Z is after 2025-08-24T17:21:41Z" Mar 11 00:56:20 crc kubenswrapper[4744]: I0311 00:56:20.033624 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:11Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T00:56:20Z is after 2025-08-24T17:21:41Z" Mar 11 00:56:20 crc kubenswrapper[4744]: I0311 00:56:20.048405 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-xlclh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e16bf0f3-533b-4114-89c6-195a85273e98\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3cf2ef2701f6c6e8715d3c41c3d7c4d704d6f03989831a0c3d34e92ab7a6dd50\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T00:56:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pgqmk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T00:56:11Z\\\"}}\" for pod \"openshift-multus\"/\"multus-xlclh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T00:56:20Z is after 2025-08-24T17:21:41Z" Mar 11 00:56:20 crc kubenswrapper[4744]: I0311 00:56:20.062135 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-6ghqv" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e146399d-0685-4ea8-96e4-c76c9478a23a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9fd8ff476d73855cbb3da64ae999f54bad2263813a70dba1aa61507b53102d42\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T00:56:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8h2vm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T00:56:11Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-6ghqv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T00:56:20Z is after 2025-08-24T17:21:41Z" Mar 11 00:56:20 crc kubenswrapper[4744]: I0311 00:56:20.087457 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-sj4cl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ac70f681-9bcf-4f8c-a175-ed7d4e9da471\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:11Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:11Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqrjx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://09583de802dbeffc06642379e30e628cf6541f82d5a11caa554cdf17e408476b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://09583de802dbeffc06642379e30e628cf6541f82d5a11caa554cdf17e408476b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T00:56:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T00:56:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqrjx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9a370bc02fc3d53e08d44ce24a85d1e5b20f06a4933b61c8fef04aff8cfe6081\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9a370bc02fc3d53e08d44ce24a85d1e5b20f06a4933b61c8fef04aff8cfe6081\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T00:56:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T00:56:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqrjx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6ad4e761e6f478df27c55306ecb235f8cbbe0954433cbe7a029d585d609a08db\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6ad4e761e6f478df27c55306ecb235f8cbbe0954433cbe7a029d585d609a08db\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T00:56:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T00:56:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqrjx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://96ebe5cbd3477966cb49d08397fee1954529001d0bede507c64ca71c25786a87\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://96ebe5cbd3477966cb49d08397fee1954529001d0bede507c64ca71c25786a87\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T00:56:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T00:56:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqrjx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cdbd8c4d24a706d746c84147c617f511da67e77e20235bb94b3b365552ef8c23\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cdbd8c4d24a706d746c84147c617f511da67e77e20235bb94b3b365552ef8c23\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T00:56:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T00:56:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqrjx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://81090de2003b39fdaee05a64219b7ebefa57068b2682e821d55ea9615ba67d4b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"
ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://81090de2003b39fdaee05a64219b7ebefa57068b2682e821d55ea9615ba67d4b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T00:56:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T00:56:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqrjx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T00:56:11Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-sj4cl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T00:56:20Z is after 2025-08-24T17:21:41Z" Mar 11 00:56:20 crc kubenswrapper[4744]: I0311 00:56:20.122642 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-78fcc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6ff04e11-e747-44c5-b049-371a5d422157\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:11Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:11Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8132bd89cd281c66b2735b3fa85c3fc773e7140d6aaeaefe0c0a2c3e3f743080\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T00:56:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fr9zj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e37e770ef43eabf7e0bb162d7e700eaa2d61c7ceca9737de5872c1caeec06774\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T00:56:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fr9zj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://98ba07651ae84a42f92411fee85a15562e325c61ff2d8fcba4962d21c51fce73\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T00:56:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fr9zj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d7cd308cadb08ea56bbc709c6401ab1d3eec24f0414a7ee1c57188c5e60adffe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T00:56:14Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fr9zj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://84d4ada431e7a91cc86b5cb9bba4316c233c5f666f21bce1d8b6df0019b6ad76\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T00:56:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fr9zj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://27bc44383b34aea31ffcebae0f1f464c31918edc0c37f01d6bd66166e34fd4a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T00:56:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fr9zj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://aa4304285d6a4bb306a291f5a152fa229d42c20a3bc7abb0fd4137bc3f16ee56\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T00:56:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fr9zj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://742e3b6ac0accae72e9285636ab0920db1d3e0625d6093adf1606f57b484b139\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b
17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T00:56:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fr9zj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://524c79ee3bb53b187166cbc6bf19cbb3311ab561608a1d3c819288c1fb684481\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://524c79ee3bb53b187166cbc6bf19cbb3311ab561608a1d3c819288c1fb684481\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T00:56:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T00:56:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fr9zj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Run
ning\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T00:56:11Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-78fcc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T00:56:20Z is after 2025-08-24T17:21:41Z" Mar 11 00:56:20 crc kubenswrapper[4744]: I0311 00:56:20.140755 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e522f1d8-5329-414c-88d5-79e6f3b615be\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:54:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:54:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:54:04Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:54:04Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:54:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://31713fedc28cf71b8df25e22000768f5d00e6de1b228a632aa3c51aea11eea7c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T00:54:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e702350d7b926ec4df6853a20470ea7515f1e1c98720db1a4d672ea4e03e82fa\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T00:54:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://2d93a4815010e9ee724175731b03bf82804921c5bbb9fe8be236057f90057f28\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T00:54:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4b2b952a5490bff5028c83fdfb0942e84bac782d470cede57ec6a31f483a407e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4b2b952a5490bff5028c83fdfb0942e84bac782d470cede57ec6a31f483a407e\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-11T00:55:54Z\\\",\\\"message\\\":\\\"le observer\\\\nW0311 00:55:54.578734 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0311 00:55:54.579118 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0311 00:55:54.580623 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-3889625742/tls.crt::/tmp/serving-cert-3889625742/tls.key\\\\\\\"\\\\nI0311 00:55:54.806131 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0311 00:55:54.812332 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0311 00:55:54.812369 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0311 00:55:54.812401 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0311 00:55:54.812411 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0311 00:55:54.819161 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0311 00:55:54.819196 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0311 00:55:54.819211 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0311 00:55:54.819223 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0311 00:55:54.819235 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0311 00:55:54.819243 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0311 00:55:54.819248 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0311 00:55:54.819254 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0311 00:55:54.821179 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-11T00:55:54Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 1m20s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c9f3a6d71079f1717c07e0b026f34d1bfd6391a5bb6745bcdab535d5290e2aa3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T00:54:07Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://edfc2acc25990d1ccbca879e186fb9dfc0a6664fa1df5cee62958ff52b217748\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://edfc2acc25990d1ccbca879e186fb9dfc0a6664fa1df5cee62958ff52b217748\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T00:54:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T00:54:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T00:54:04Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T00:56:20Z is after 2025-08-24T17:21:41Z" Mar 11 00:56:20 crc kubenswrapper[4744]: I0311 00:56:20.167044 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:11Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container 
could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T00:56:20Z is after 2025-08-24T17:21:41Z" Mar 11 00:56:20 crc kubenswrapper[4744]: I0311 00:56:20.183099 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-rplbw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"92f4c9df-1087-4820-8e07-1120f02df454\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://34eabb0e2af63857d40e20376f7979131fef683885949c1ea75e0a4949968872\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T00:56:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lkpvs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2bfa60a94390dd23da621dd81523dde8107fe
6f3beba45ad1b7a96c0d6cd4f56\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T00:56:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lkpvs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T00:56:11Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-rplbw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T00:56:20Z is after 2025-08-24T17:21:41Z" Mar 11 00:56:20 crc kubenswrapper[4744]: I0311 00:56:20.203422 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:13Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:13Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://064a29fa0ed4ecddcf11aa3175a3fc935147341c372af46f94eee470a9ba363c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T00:56:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7c4d16fc0af970b13be0d202345567993d215bf342174b1b568c4b4ac26c0017\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T00:56:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T00:56:20Z is after 2025-08-24T17:21:41Z" Mar 11 00:56:20 crc kubenswrapper[4744]: I0311 00:56:20.215282 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-tdnf7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7aeb1578-fe93-4bec-8f43-17d0923fa5c0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:11Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:11Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-th54b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-th54b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T00:56:11Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-tdnf7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T00:56:20Z is after 2025-08-24T17:21:41Z" Mar 11 00:56:20 crc 
kubenswrapper[4744]: I0311 00:56:20.232170 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1a87b2ce-541f-40ba-a568-93914e7dea62\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:54:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:54:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:55:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:55:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:54:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://19989b1018d06da0230a62c4c231865e9d16267d0eb570a409da006b246005b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://832be2ac0e5b4b69727f022a0663974a495e008d3a781fa4d237d3347d39c154\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-11T00:54:37Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start 
--config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager -v=2\\\\nI0311 00:54:06.849622 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. Worst graceful lease acquisition is {26s}.\\\\nI0311 00:54:06.852083 1 observer_polling.go:159] Starting file observer\\\\nI0311 00:54:06.910593 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0311 00:54:06.914052 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nF0311 00:54:37.381234 1 cmd.go:179] failed checking apiserver connectivity: Get \\\\\\\"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/openshift-kube-controller-manager/leases/cluster-policy-controller-lock\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T00:54:36Z is after 
2026-02-23T05:33:13Z\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-11T00:54:05Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T00:54:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://eb1e3ef7609aa323e622249594eb4ac5cd5cacfa496d8189b3f612e29cb11773\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T00:54:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://82831ac0a065cf2ea80ca05df203fc6de89ad279faae95364ab2519b689453d8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T00:54:05Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ca4d36508e362c9b7f7365ade7dda3dec0c70f9df1be158a034aeaaa772c797e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T00:54:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T00:54:04Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T00:56:20Z is after 2025-08-24T17:21:41Z" Mar 11 00:56:20 crc kubenswrapper[4744]: I0311 00:56:20.245149 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:16Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:16Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bff1516676048ec93cb3569f8720eb042036b92afc1c65f1c58b15fb604f58b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T00:56:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-11T00:56:20Z is after 2025-08-24T17:21:41Z" Mar 11 00:56:20 crc kubenswrapper[4744]: I0311 00:56:20.259304 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:11Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T00:56:20Z is after 2025-08-24T17:21:41Z" Mar 11 00:56:20 crc kubenswrapper[4744]: I0311 00:56:20.272791 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-8z8gf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ad8329c6-d511-446f-b617-99778d12b878\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://27268ce5ab229a693eeb80e4dfc38c14f18ad32dc201ace682b5516070c52fde\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T00:56:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7gczk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T00:56:11Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-8z8gf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T00:56:20Z is after 2025-08-24T17:21:41Z" Mar 11 00:56:20 crc kubenswrapper[4744]: I0311 00:56:20.291740 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-678nx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a15dc7ac-7c34-4135-b6eb-a85122800ce9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d3d16f6053ae793fbf194d5cbb556a8603bae74b52c9f07aa937dbfd654e3894\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshif
t-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T00:56:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d9t5q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9d5c92e7037925f24a5b88f538d4822fae94b65ea5f07960f3e465148975af4e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T00:56:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d9t5q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T00:56:11Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-678nx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call 
webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T00:56:20Z is after 2025-08-24T17:21:41Z" Mar 11 00:56:20 crc kubenswrapper[4744]: I0311 00:56:20.716382 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-sj4cl" event={"ID":"ac70f681-9bcf-4f8c-a175-ed7d4e9da471","Type":"ContainerStarted","Data":"fafb45aea7f19d6db5ddf62115fb453eebda33f0be934b59e40083cd8387c896"} Mar 11 00:56:20 crc kubenswrapper[4744]: I0311 00:56:20.739000 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:12Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:12Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://80f7b8faf68f817bd772c7988a46ffcc286551806463dc59776e5c6e18deb8f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\
\"2026-03-11T00:56:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T00:56:20Z is after 2025-08-24T17:21:41Z" Mar 11 00:56:20 crc kubenswrapper[4744]: I0311 00:56:20.759931 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:11Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T00:56:20Z is after 2025-08-24T17:21:41Z" Mar 11 00:56:20 crc kubenswrapper[4744]: I0311 00:56:20.783279 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-xlclh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e16bf0f3-533b-4114-89c6-195a85273e98\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3cf2ef2701f6c6e8715d3c41c3d7c4d704d6f03989831a0c3d34e92ab7a6dd50\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T00:56:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pgqmk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T00:56:11Z\\\"}}\" for pod \"openshift-multus\"/\"multus-xlclh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T00:56:20Z is after 2025-08-24T17:21:41Z" Mar 11 00:56:20 crc kubenswrapper[4744]: I0311 00:56:20.801198 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-6ghqv" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e146399d-0685-4ea8-96e4-c76c9478a23a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9fd8ff476d73855cbb3da64ae999f54bad2263813a70dba1aa61507b53102d42\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T00:56:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8h2vm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T00:56:11Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-6ghqv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T00:56:20Z is after 2025-08-24T17:21:41Z" Mar 11 00:56:20 crc kubenswrapper[4744]: I0311 00:56:20.824447 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-sj4cl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ac70f681-9bcf-4f8c-a175-ed7d4e9da471\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fafb45aea7f19d6db5ddf62115fb453eebda33f0be934b59e40083cd8387c896\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97
f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T00:56:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqrjx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://09583de802dbeffc06642379e30e628cf6541f82d5a11caa554cdf17e408476b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://09583de802dbeffc06642379e30e628cf6541f82d5a11caa554cdf17e408476b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T00:56:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T00:56:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqrjx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9a370bc02fc3d53e08d44ce24a85d1e5b20f06a4933b61c8fef04aff8cfe6081\\\",\\\"image\\\":\
\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9a370bc02fc3d53e08d44ce24a85d1e5b20f06a4933b61c8fef04aff8cfe6081\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T00:56:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T00:56:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqrjx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6ad4e761e6f478df27c55306ecb235f8cbbe0954433cbe7a029d585d609a08db\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6ad4e761e6f478df27c55306ecb235f8cbbe0954433cbe7a029d585d609a08db\\\",\\\"exitCode\\\
":0,\\\"finishedAt\\\":\\\"2026-03-11T00:56:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T00:56:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqrjx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://96ebe5cbd3477966cb49d08397fee1954529001d0bede507c64ca71c25786a87\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://96ebe5cbd3477966cb49d08397fee1954529001d0bede507c64ca71c25786a87\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T00:56:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T00:56:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqrjx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cdbd8c4d24a706d746c84147c617f511
da67e77e20235bb94b3b365552ef8c23\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cdbd8c4d24a706d746c84147c617f511da67e77e20235bb94b3b365552ef8c23\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T00:56:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T00:56:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqrjx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://81090de2003b39fdaee05a64219b7ebefa57068b2682e821d55ea9615ba67d4b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://81090de2003b39fdaee05a64219b7ebefa57068b2682e821d55ea9615ba67d4b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T00:56:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\"
:\\\"2026-03-11T00:56:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqrjx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T00:56:11Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-sj4cl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T00:56:20Z is after 2025-08-24T17:21:41Z" Mar 11 00:56:20 crc kubenswrapper[4744]: I0311 00:56:20.857614 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-78fcc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6ff04e11-e747-44c5-b049-371a5d422157\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:11Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:11Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8132bd89cd281c66b2735b3fa85c3fc773e7140d6aaeaefe0c0a2c3e3f743080\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T00:56:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fr9zj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e37e770ef43eabf7e0bb162d7e700eaa2d61c7ceca9737de5872c1caeec06774\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T00:56:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fr9zj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://98ba07651ae84a42f92411fee85a15562e325c61ff2d8fcba4962d21c51fce73\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T00:56:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fr9zj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d7cd308cadb08ea56bbc709c6401ab1d3eec24f0414a7ee1c57188c5e60adffe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d20994829
19d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T00:56:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fr9zj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://84d4ada431e7a91cc86b5cb9bba4316c233c5f666f21bce1d8b6df0019b6ad76\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T00:56:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fr9zj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://27bc44383b34aea31ffcebae0f1f464c31918edc0c37f01d6bd66166e34fd4a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cd
d47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T00:56:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fr9zj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://aa4304285d6a4bb306a291f5a152fa229d42c20a3bc7abb0fd4137bc3f16ee56\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T00:56:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mou
ntPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fr9zj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://742e3b6ac0accae72e9285636ab0920db1d3e0625d6093adf1606f57b484b139\\\",\
\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T00:56:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fr9zj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://524c79ee3bb53b187166cbc6bf19cbb3311ab561608a1d3c819288c1fb684481\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://524c79ee3bb53b187166cbc6bf19cbb3311ab561608a1d3c819288c1fb684481\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T00:56:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T00:56:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\
\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fr9zj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T00:56:11Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-78fcc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T00:56:20Z is after 2025-08-24T17:21:41Z" Mar 11 00:56:20 crc kubenswrapper[4744]: I0311 00:56:20.877899 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e522f1d8-5329-414c-88d5-79e6f3b615be\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:54:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:54:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:54:04Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:54:04Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:54:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://31713fedc28cf71b8df25e22000768f5d00e6de1b228a632aa3c51aea11eea7c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T00:54:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e702350d7b926ec4df6853a20470ea7515f1e1c98720db1a4d672ea4e03e82fa\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T00:54:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://2d93a4815010e9ee724175731b03bf82804921c5bbb9fe8be236057f90057f28\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T00:54:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4b2b952a5490bff5028c83fdfb0942e84bac782d470cede57ec6a31f483a407e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4b2b952a5490bff5028c83fdfb0942e84bac782d470cede57ec6a31f483a407e\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-11T00:55:54Z\\\",\\\"message\\\":\\\"le observer\\\\nW0311 00:55:54.578734 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0311 00:55:54.579118 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0311 00:55:54.580623 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-3889625742/tls.crt::/tmp/serving-cert-3889625742/tls.key\\\\\\\"\\\\nI0311 00:55:54.806131 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0311 00:55:54.812332 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0311 00:55:54.812369 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0311 00:55:54.812401 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0311 00:55:54.812411 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0311 00:55:54.819161 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0311 00:55:54.819196 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0311 00:55:54.819211 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0311 00:55:54.819223 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0311 00:55:54.819235 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0311 00:55:54.819243 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0311 00:55:54.819248 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0311 00:55:54.819254 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0311 00:55:54.821179 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-11T00:55:54Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 1m20s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c9f3a6d71079f1717c07e0b026f34d1bfd6391a5bb6745bcdab535d5290e2aa3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T00:54:07Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://edfc2acc25990d1ccbca879e186fb9dfc0a6664fa1df5cee62958ff52b217748\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://edfc2acc25990d1ccbca879e186fb9dfc0a6664fa1df5cee62958ff52b217748\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T00:54:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T00:54:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T00:54:04Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T00:56:20Z is after 2025-08-24T17:21:41Z" Mar 11 00:56:20 crc kubenswrapper[4744]: I0311 00:56:20.893466 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:11Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container 
could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T00:56:20Z is after 2025-08-24T17:21:41Z" Mar 11 00:56:20 crc kubenswrapper[4744]: I0311 00:56:20.907372 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-rplbw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"92f4c9df-1087-4820-8e07-1120f02df454\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://34eabb0e2af63857d40e20376f7979131fef683885949c1ea75e0a4949968872\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T00:56:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lkpvs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2bfa60a94390dd23da621dd81523dde8107fe
6f3beba45ad1b7a96c0d6cd4f56\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T00:56:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lkpvs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T00:56:11Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-rplbw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T00:56:20Z is after 2025-08-24T17:21:41Z" Mar 11 00:56:20 crc kubenswrapper[4744]: I0311 00:56:20.921666 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:13Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:13Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://064a29fa0ed4ecddcf11aa3175a3fc935147341c372af46f94eee470a9ba363c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T00:56:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7c4d16fc0af970b13be0d202345567993d215bf342174b1b568c4b4ac26c0017\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T00:56:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T00:56:20Z is after 2025-08-24T17:21:41Z" Mar 11 00:56:20 crc kubenswrapper[4744]: I0311 00:56:20.934093 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-tdnf7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7aeb1578-fe93-4bec-8f43-17d0923fa5c0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:11Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:11Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-th54b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-th54b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T00:56:11Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-tdnf7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T00:56:20Z is after 2025-08-24T17:21:41Z" Mar 11 00:56:20 crc 
kubenswrapper[4744]: I0311 00:56:20.948592 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1a87b2ce-541f-40ba-a568-93914e7dea62\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:54:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:54:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:55:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:55:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:54:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://19989b1018d06da0230a62c4c231865e9d16267d0eb570a409da006b246005b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://832be2ac0e5b4b69727f022a0663974a495e008d3a781fa4d237d3347d39c154\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-11T00:54:37Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start 
--config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager -v=2\\\\nI0311 00:54:06.849622 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. Worst graceful lease acquisition is {26s}.\\\\nI0311 00:54:06.852083 1 observer_polling.go:159] Starting file observer\\\\nI0311 00:54:06.910593 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0311 00:54:06.914052 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nF0311 00:54:37.381234 1 cmd.go:179] failed checking apiserver connectivity: Get \\\\\\\"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/openshift-kube-controller-manager/leases/cluster-policy-controller-lock\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T00:54:36Z is after 
2026-02-23T05:33:13Z\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-11T00:54:05Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T00:54:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://eb1e3ef7609aa323e622249594eb4ac5cd5cacfa496d8189b3f612e29cb11773\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T00:54:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://82831ac0a065cf2ea80ca05df203fc6de89ad279faae95364ab2519b689453d8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T00:54:05Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ca4d36508e362c9b7f7365ade7dda3dec0c70f9df1be158a034aeaaa772c797e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T00:54:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T00:54:04Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T00:56:20Z is after 2025-08-24T17:21:41Z" Mar 11 00:56:20 crc kubenswrapper[4744]: I0311 00:56:20.965397 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:16Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:16Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bff1516676048ec93cb3569f8720eb042036b92afc1c65f1c58b15fb604f58b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T00:56:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-11T00:56:20Z is after 2025-08-24T17:21:41Z" Mar 11 00:56:20 crc kubenswrapper[4744]: I0311 00:56:20.973995 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-tdnf7" Mar 11 00:56:20 crc kubenswrapper[4744]: E0311 00:56:20.974127 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-tdnf7" podUID="7aeb1578-fe93-4bec-8f43-17d0923fa5c0" Mar 11 00:56:20 crc kubenswrapper[4744]: I0311 00:56:20.984403 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:11Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T00:56:20Z is after 2025-08-24T17:21:41Z" Mar 11 00:56:20 crc kubenswrapper[4744]: I0311 00:56:20.998401 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-8z8gf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ad8329c6-d511-446f-b617-99778d12b878\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://27268ce5ab229a693eeb80e4dfc38c14f18ad32dc201ace682b5516070c52fde\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T00:56:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7gczk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T00:56:11Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-8z8gf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T00:56:20Z is after 2025-08-24T17:21:41Z" Mar 11 00:56:21 crc kubenswrapper[4744]: I0311 00:56:21.013397 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-678nx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a15dc7ac-7c34-4135-b6eb-a85122800ce9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d3d16f6053ae793fbf194d5cbb556a8603bae74b52c9f07aa937dbfd654e3894\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshif
t-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T00:56:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d9t5q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9d5c92e7037925f24a5b88f538d4822fae94b65ea5f07960f3e465148975af4e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T00:56:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d9t5q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T00:56:11Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-678nx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call 
webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T00:56:21Z is after 2025-08-24T17:21:41Z" Mar 11 00:56:21 crc kubenswrapper[4744]: I0311 00:56:21.979037 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 11 00:56:21 crc kubenswrapper[4744]: E0311 00:56:21.979237 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 11 00:56:21 crc kubenswrapper[4744]: I0311 00:56:21.979936 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 11 00:56:21 crc kubenswrapper[4744]: E0311 00:56:21.980037 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 11 00:56:21 crc kubenswrapper[4744]: I0311 00:56:21.980113 4744 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 11 00:56:21 crc kubenswrapper[4744]: E0311 00:56:21.980199 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 11 00:56:22 crc kubenswrapper[4744]: I0311 00:56:22.566315 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 00:56:22 crc kubenswrapper[4744]: I0311 00:56:22.566380 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 00:56:22 crc kubenswrapper[4744]: I0311 00:56:22.566398 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 00:56:22 crc kubenswrapper[4744]: I0311 00:56:22.566424 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 00:56:22 crc kubenswrapper[4744]: I0311 00:56:22.566443 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T00:56:22Z","lastTransitionTime":"2026-03-11T00:56:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 11 00:56:22 crc kubenswrapper[4744]: E0311 00:56:22.586288 4744 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-11T00:56:22Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:22Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-11T00:56:22Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:22Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-11T00:56:22Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:22Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-11T00:56:22Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:22Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"51e320bc-e184-46b0-b151-baf1fef55472\\\",\\\"systemUUID\\\":\\\"b27597af-c36d-4084-a073-6dfdbb017181\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T00:56:22Z is after 2025-08-24T17:21:41Z" Mar 11 00:56:22 crc kubenswrapper[4744]: I0311 00:56:22.592069 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 00:56:22 crc kubenswrapper[4744]: I0311 00:56:22.592148 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 00:56:22 crc kubenswrapper[4744]: I0311 00:56:22.592168 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 00:56:22 crc kubenswrapper[4744]: I0311 00:56:22.592199 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 00:56:22 crc kubenswrapper[4744]: I0311 00:56:22.592217 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T00:56:22Z","lastTransitionTime":"2026-03-11T00:56:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 11 00:56:22 crc kubenswrapper[4744]: E0311 00:56:22.612069 4744 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-11T00:56:22Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:22Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-11T00:56:22Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:22Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-11T00:56:22Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:22Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-11T00:56:22Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:22Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"51e320bc-e184-46b0-b151-baf1fef55472\\\",\\\"systemUUID\\\":\\\"b27597af-c36d-4084-a073-6dfdbb017181\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T00:56:22Z is after 2025-08-24T17:21:41Z" Mar 11 00:56:22 crc kubenswrapper[4744]: I0311 00:56:22.618162 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 00:56:22 crc kubenswrapper[4744]: I0311 00:56:22.618258 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 00:56:22 crc kubenswrapper[4744]: I0311 00:56:22.618285 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 00:56:22 crc kubenswrapper[4744]: I0311 00:56:22.618322 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 00:56:22 crc kubenswrapper[4744]: I0311 00:56:22.618362 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T00:56:22Z","lastTransitionTime":"2026-03-11T00:56:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 11 00:56:22 crc kubenswrapper[4744]: E0311 00:56:22.638658 4744 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-11T00:56:22Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:22Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-11T00:56:22Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:22Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-11T00:56:22Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:22Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-11T00:56:22Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:22Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"51e320bc-e184-46b0-b151-baf1fef55472\\\",\\\"systemUUID\\\":\\\"b27597af-c36d-4084-a073-6dfdbb017181\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T00:56:22Z is after 2025-08-24T17:21:41Z" Mar 11 00:56:22 crc kubenswrapper[4744]: I0311 00:56:22.644522 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 00:56:22 crc kubenswrapper[4744]: I0311 00:56:22.644597 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 00:56:22 crc kubenswrapper[4744]: I0311 00:56:22.644618 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 00:56:22 crc kubenswrapper[4744]: I0311 00:56:22.644647 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 00:56:22 crc kubenswrapper[4744]: I0311 00:56:22.644664 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T00:56:22Z","lastTransitionTime":"2026-03-11T00:56:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 11 00:56:22 crc kubenswrapper[4744]: E0311 00:56:22.663499 4744 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-11T00:56:22Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:22Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-11T00:56:22Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:22Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-11T00:56:22Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:22Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-11T00:56:22Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:22Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"51e320bc-e184-46b0-b151-baf1fef55472\\\",\\\"systemUUID\\\":\\\"b27597af-c36d-4084-a073-6dfdbb017181\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T00:56:22Z is after 2025-08-24T17:21:41Z" Mar 11 00:56:22 crc kubenswrapper[4744]: I0311 00:56:22.668434 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 00:56:22 crc kubenswrapper[4744]: I0311 00:56:22.668471 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 00:56:22 crc kubenswrapper[4744]: I0311 00:56:22.668505 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 00:56:22 crc kubenswrapper[4744]: I0311 00:56:22.668555 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 00:56:22 crc kubenswrapper[4744]: I0311 00:56:22.668572 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T00:56:22Z","lastTransitionTime":"2026-03-11T00:56:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 11 00:56:22 crc kubenswrapper[4744]: E0311 00:56:22.687739 4744 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-11T00:56:22Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:22Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-11T00:56:22Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:22Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-11T00:56:22Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:22Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-11T00:56:22Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:22Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"51e320bc-e184-46b0-b151-baf1fef55472\\\",\\\"systemUUID\\\":\\\"b27597af-c36d-4084-a073-6dfdbb017181\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T00:56:22Z is after 2025-08-24T17:21:41Z" Mar 11 00:56:22 crc kubenswrapper[4744]: E0311 00:56:22.688004 4744 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 11 00:56:22 crc kubenswrapper[4744]: I0311 00:56:22.770434 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-78fcc_6ff04e11-e747-44c5-b049-371a5d422157/ovnkube-controller/0.log" Mar 11 00:56:22 crc kubenswrapper[4744]: I0311 00:56:22.775710 4744 generic.go:334] "Generic (PLEG): container finished" podID="6ff04e11-e747-44c5-b049-371a5d422157" containerID="aa4304285d6a4bb306a291f5a152fa229d42c20a3bc7abb0fd4137bc3f16ee56" exitCode=1 Mar 11 00:56:22 crc kubenswrapper[4744]: I0311 00:56:22.775778 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-78fcc" event={"ID":"6ff04e11-e747-44c5-b049-371a5d422157","Type":"ContainerDied","Data":"aa4304285d6a4bb306a291f5a152fa229d42c20a3bc7abb0fd4137bc3f16ee56"} Mar 11 00:56:22 crc kubenswrapper[4744]: I0311 00:56:22.777628 4744 scope.go:117] "RemoveContainer" containerID="aa4304285d6a4bb306a291f5a152fa229d42c20a3bc7abb0fd4137bc3f16ee56" Mar 11 00:56:22 crc kubenswrapper[4744]: I0311 00:56:22.797699 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-8z8gf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ad8329c6-d511-446f-b617-99778d12b878\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://27268ce5ab229a693eeb80e4dfc38c14f18ad32dc201ace682b5516070c52fde\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T00:56:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7gczk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T00:56:11Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-8z8gf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T00:56:22Z is after 2025-08-24T17:21:41Z" Mar 11 00:56:22 crc kubenswrapper[4744]: I0311 00:56:22.820362 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-678nx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a15dc7ac-7c34-4135-b6eb-a85122800ce9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d3d16f6053ae793fbf194d5cbb556a8603bae74b52c9f07aa937dbfd654e3894\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshif
t-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T00:56:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d9t5q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9d5c92e7037925f24a5b88f538d4822fae94b65ea5f07960f3e465148975af4e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T00:56:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d9t5q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T00:56:11Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-678nx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call 
webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T00:56:22Z is after 2025-08-24T17:21:41Z" Mar 11 00:56:22 crc kubenswrapper[4744]: I0311 00:56:22.842026 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1a87b2ce-541f-40ba-a568-93914e7dea62\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:54:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:54:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:55:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:55:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:54:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://19989b1018d06da0230a62c4c231865e9d16267d0eb570a409da006b246005b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://832be2ac0e5b4b69727f022a0663974a495e008d3a781fa4d237d3347d39c154\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-11T00:54:37Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss 
-Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager -v=2\\\\nI0311 00:54:06.849622 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. Worst graceful lease acquisition is {26s}.\\\\nI0311 00:54:06.852083 1 observer_polling.go:159] Starting file observer\\\\nI0311 00:54:06.910593 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0311 00:54:06.914052 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nF0311 00:54:37.381234 1 cmd.go:179] failed checking apiserver connectivity: Get \\\\\\\"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/openshift-kube-controller-manager/leases/cluster-policy-controller-lock\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T00:54:36Z is after 
2026-02-23T05:33:13Z\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-11T00:54:05Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T00:54:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://eb1e3ef7609aa323e622249594eb4ac5cd5cacfa496d8189b3f612e29cb11773\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T00:54:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://82831ac0a065cf2ea80ca05df203fc6de89ad279faae95364ab2519b689453d8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T00:54:05Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ca4d36508e362c9b7f7365ade7dda3dec0c70f9df1be158a034aeaaa772c797e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T00:54:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T00:54:04Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T00:56:22Z is after 2025-08-24T17:21:41Z" Mar 11 00:56:22 crc kubenswrapper[4744]: I0311 00:56:22.871318 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:16Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:16Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bff1516676048ec93cb3569f8720eb042036b92afc1c65f1c58b15fb604f58b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T00:56:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-11T00:56:22Z is after 2025-08-24T17:21:41Z" Mar 11 00:56:22 crc kubenswrapper[4744]: I0311 00:56:22.894696 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:11Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T00:56:22Z is after 2025-08-24T17:21:41Z" Mar 11 00:56:22 crc kubenswrapper[4744]: I0311 00:56:22.910565 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-6ghqv" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e146399d-0685-4ea8-96e4-c76c9478a23a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9fd8ff476d73855cbb3da64ae999f54bad2263813a70dba1aa61507b53102d42\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T00:56:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8h2vm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T00:56:11Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-6ghqv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T00:56:22Z is after 2025-08-24T17:21:41Z" Mar 11 00:56:22 crc kubenswrapper[4744]: I0311 00:56:22.933770 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-sj4cl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ac70f681-9bcf-4f8c-a175-ed7d4e9da471\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fafb45aea7f19d6db5ddf62115fb453eebda33f0be934b59e40083cd8387c896\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97
f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T00:56:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqrjx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://09583de802dbeffc06642379e30e628cf6541f82d5a11caa554cdf17e408476b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://09583de802dbeffc06642379e30e628cf6541f82d5a11caa554cdf17e408476b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T00:56:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T00:56:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqrjx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9a370bc02fc3d53e08d44ce24a85d1e5b20f06a4933b61c8fef04aff8cfe6081\\\",\\\"image\\\":\
\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9a370bc02fc3d53e08d44ce24a85d1e5b20f06a4933b61c8fef04aff8cfe6081\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T00:56:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T00:56:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqrjx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6ad4e761e6f478df27c55306ecb235f8cbbe0954433cbe7a029d585d609a08db\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6ad4e761e6f478df27c55306ecb235f8cbbe0954433cbe7a029d585d609a08db\\\",\\\"exitCode\\\
":0,\\\"finishedAt\\\":\\\"2026-03-11T00:56:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T00:56:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqrjx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://96ebe5cbd3477966cb49d08397fee1954529001d0bede507c64ca71c25786a87\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://96ebe5cbd3477966cb49d08397fee1954529001d0bede507c64ca71c25786a87\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T00:56:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T00:56:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqrjx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cdbd8c4d24a706d746c84147c617f511
da67e77e20235bb94b3b365552ef8c23\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cdbd8c4d24a706d746c84147c617f511da67e77e20235bb94b3b365552ef8c23\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T00:56:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T00:56:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqrjx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://81090de2003b39fdaee05a64219b7ebefa57068b2682e821d55ea9615ba67d4b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://81090de2003b39fdaee05a64219b7ebefa57068b2682e821d55ea9615ba67d4b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T00:56:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\"
:\\\"2026-03-11T00:56:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqrjx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T00:56:11Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-sj4cl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T00:56:22Z is after 2025-08-24T17:21:41Z" Mar 11 00:56:22 crc kubenswrapper[4744]: I0311 00:56:22.961488 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-78fcc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6ff04e11-e747-44c5-b049-371a5d422157\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:11Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:11Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8132bd89cd281c66b2735b3fa85c3fc773e7140d6aaeaefe0c0a2c3e3f743080\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T00:56:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fr9zj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e37e770ef43eabf7e0bb162d7e700eaa2d61c7ceca9737de5872c1caeec06774\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T00:56:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fr9zj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://98ba07651ae84a42f92411fee85a15562e325c61ff2d8fcba4962d21c51fce73\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T00:56:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fr9zj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d7cd308cadb08ea56bbc709c6401ab1d3eec24f0414a7ee1c57188c5e60adffe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d20994829
19d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T00:56:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fr9zj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://84d4ada431e7a91cc86b5cb9bba4316c233c5f666f21bce1d8b6df0019b6ad76\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T00:56:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fr9zj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://27bc44383b34aea31ffcebae0f1f464c31918edc0c37f01d6bd66166e34fd4a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cd
d47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T00:56:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fr9zj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://aa4304285d6a4bb306a291f5a152fa229d42c20a3bc7abb0fd4137bc3f16ee56\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://aa4304285d6a4bb306a291f5a152fa229d42c20a3bc7abb0fd4137bc3f16ee56\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-11T00
:56:22Z\\\",\\\"message\\\":\\\"e (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0311 00:56:22.252931 6736 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0311 00:56:22.253017 6736 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0311 00:56:22.253083 6736 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0311 00:56:22.253095 6736 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0311 00:56:22.253101 6736 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0311 00:56:22.253122 6736 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0311 00:56:22.253135 6736 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0311 00:56:22.253140 6736 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0311 00:56:22.253145 6736 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0311 00:56:22.253157 6736 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0311 00:56:22.253172 6736 handler.go:208] Removed *v1.Node event handler 2\\\\nI0311 00:56:22.253192 6736 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0311 00:56:22.253215 6736 factory.go:656] Stopping watch factory\\\\nI0311 00:56:22.253216 6736 handler.go:208] Removed *v1.Node event handler 7\\\\nI0311 00:56:22.253236 6736 ovnkube.go:599] Stopped ovnkube\\\\nI0311 
0\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-11T00:56:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"ku
be-api-access-fr9zj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://742e3b6ac0accae72e9285636ab0920db1d3e0625d6093adf1606f57b484b139\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T00:56:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fr9zj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://524c79ee3bb53b187166cbc6bf19cbb3311ab561608a1d3c819288c1fb684481\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://524c79ee3bb53b187166cbc6bf19cbb3311ab561608a1d3c819288c1fb684481\\
\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T00:56:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T00:56:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fr9zj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T00:56:11Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-78fcc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T00:56:22Z is after 2025-08-24T17:21:41Z" Mar 11 00:56:22 crc kubenswrapper[4744]: I0311 00:56:22.974660 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-tdnf7" Mar 11 00:56:22 crc kubenswrapper[4744]: E0311 00:56:22.975400 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-tdnf7" podUID="7aeb1578-fe93-4bec-8f43-17d0923fa5c0" Mar 11 00:56:22 crc kubenswrapper[4744]: I0311 00:56:22.986250 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:12Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:12Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://80f7b8faf68f817bd772c7988a46ffcc286551806463dc59776e5c6e18deb8f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T00:56:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveRea
dOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T00:56:22Z is after 2025-08-24T17:21:41Z" Mar 11 00:56:22 crc kubenswrapper[4744]: I0311 00:56:22.993334 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler/openshift-kube-scheduler-crc"] Mar 11 00:56:23 crc kubenswrapper[4744]: I0311 00:56:23.008939 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:11Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The 
container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T00:56:23Z is after 2025-08-24T17:21:41Z" Mar 11 00:56:23 crc kubenswrapper[4744]: I0311 00:56:23.031081 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-xlclh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e16bf0f3-533b-4114-89c6-195a85273e98\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3cf2ef2701f6c6e8715d3c41c3d7c4d704d6f03989831a0c3d34e92ab7a6dd50\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T00:56:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pgqmk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T00:56:11Z\\\"}}\" for pod \"openshift-multus\"/\"multus-xlclh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T00:56:23Z is after 2025-08-24T17:21:41Z" Mar 11 00:56:23 crc kubenswrapper[4744]: I0311 00:56:23.053121 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e522f1d8-5329-414c-88d5-79e6f3b615be\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:54:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:54:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:54:04Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:54:04Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:54:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://31713fedc28cf71b8df25e22000768f5d00e6de1b228a632aa3c51aea11eea7c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T00:54:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserv
er\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e702350d7b926ec4df6853a20470ea7515f1e1c98720db1a4d672ea4e03e82fa\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T00:54:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2d93a4815010e9ee724175731b03bf82804921c5bbb9fe8be236057f90057f28\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T00:54:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4b2b952a5490bff5028c83fdfb0942e84bac782d470cede57ec6a31f483a407e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc
276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4b2b952a5490bff5028c83fdfb0942e84bac782d470cede57ec6a31f483a407e\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-11T00:55:54Z\\\",\\\"message\\\":\\\"le observer\\\\nW0311 00:55:54.578734 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0311 00:55:54.579118 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0311 00:55:54.580623 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3889625742/tls.crt::/tmp/serving-cert-3889625742/tls.key\\\\\\\"\\\\nI0311 00:55:54.806131 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0311 00:55:54.812332 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0311 00:55:54.812369 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0311 00:55:54.812401 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0311 00:55:54.812411 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0311 00:55:54.819161 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0311 00:55:54.819196 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0311 00:55:54.819211 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0311 00:55:54.819223 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0311 00:55:54.819235 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0311 
00:55:54.819243 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0311 00:55:54.819248 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0311 00:55:54.819254 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0311 00:55:54.821179 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-11T00:55:54Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 1m20s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c9f3a6d71079f1717c07e0b026f34d1bfd6391a5bb6745bcdab535d5290e2aa3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T00:54:07Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://edfc2acc25990d1ccbca879e186fb9dfc0a6664fa1df5cee62958ff52b217748\\\",\\\"image\\\":\\\"quay.io/openshift-release-d
ev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://edfc2acc25990d1ccbca879e186fb9dfc0a6664fa1df5cee62958ff52b217748\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T00:54:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T00:54:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T00:54:04Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T00:56:23Z is after 2025-08-24T17:21:41Z" Mar 11 00:56:23 crc kubenswrapper[4744]: I0311 00:56:23.071685 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:11Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T00:56:23Z is after 2025-08-24T17:21:41Z" Mar 11 00:56:23 crc kubenswrapper[4744]: I0311 00:56:23.093079 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-rplbw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"92f4c9df-1087-4820-8e07-1120f02df454\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://34eabb0e2af63857d40e20376f7979131fef683885949c1ea75e0a4949968872\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T00:56:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lkpvs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2bfa60a94390dd23da621dd81523dde8107fe
6f3beba45ad1b7a96c0d6cd4f56\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T00:56:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lkpvs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T00:56:11Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-rplbw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T00:56:23Z is after 2025-08-24T17:21:41Z" Mar 11 00:56:23 crc kubenswrapper[4744]: I0311 00:56:23.109289 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:13Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:13Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://064a29fa0ed4ecddcf11aa3175a3fc935147341c372af46f94eee470a9ba363c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T00:56:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7c4d16fc0af970b13be0d202345567993d215bf342174b1b568c4b4ac26c0017\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T00:56:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T00:56:23Z is after 2025-08-24T17:21:41Z" Mar 11 00:56:23 crc kubenswrapper[4744]: I0311 00:56:23.123718 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-tdnf7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7aeb1578-fe93-4bec-8f43-17d0923fa5c0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:11Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:11Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-th54b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-th54b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T00:56:11Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-tdnf7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T00:56:23Z is after 2025-08-24T17:21:41Z" Mar 11 00:56:23 crc 
kubenswrapper[4744]: I0311 00:56:23.784108 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-78fcc_6ff04e11-e747-44c5-b049-371a5d422157/ovnkube-controller/0.log" Mar 11 00:56:23 crc kubenswrapper[4744]: I0311 00:56:23.789345 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-78fcc" event={"ID":"6ff04e11-e747-44c5-b049-371a5d422157","Type":"ContainerStarted","Data":"9a1996e9bc928fb639a65a4dab19a19d9661eabdf82ddd23a0dfff6fba30a039"} Mar 11 00:56:23 crc kubenswrapper[4744]: I0311 00:56:23.790363 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-78fcc" Mar 11 00:56:23 crc kubenswrapper[4744]: I0311 00:56:23.872958 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:13Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:13Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://064a29fa0ed4ecddcf11aa3175a3fc935147341c372af46f94eee470a9ba363c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\
\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T00:56:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7c4d16fc0af970b13be0d202345567993d215bf342174b1b568c4b4ac26c0017\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T00:56:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T00:56:23Z is after 
2025-08-24T17:21:41Z" Mar 11 00:56:23 crc kubenswrapper[4744]: I0311 00:56:23.903818 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-tdnf7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7aeb1578-fe93-4bec-8f43-17d0923fa5c0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:11Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:11Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-th54b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-th54b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T00:56:11Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-tdnf7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T00:56:23Z is after 2025-08-24T17:21:41Z" Mar 11 00:56:23 crc 
kubenswrapper[4744]: I0311 00:56:23.920504 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:16Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:16Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bff1516676048ec93cb3569f8720eb042036b92afc1c65f1c58b15fb604f58b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T00:56:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling 
webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T00:56:23Z is after 2025-08-24T17:21:41Z" Mar 11 00:56:23 crc kubenswrapper[4744]: I0311 00:56:23.937075 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:11Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T00:56:23Z is after 2025-08-24T17:21:41Z" Mar 11 00:56:23 crc kubenswrapper[4744]: I0311 00:56:23.952667 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-8z8gf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ad8329c6-d511-446f-b617-99778d12b878\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://27268ce5ab229a693eeb80e4dfc38c14f18ad32dc201ace682b5516070c52fde\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T00:56:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7gczk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T00:56:11Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-8z8gf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T00:56:23Z is after 2025-08-24T17:21:41Z" Mar 11 00:56:23 crc kubenswrapper[4744]: I0311 00:56:23.968082 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-678nx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a15dc7ac-7c34-4135-b6eb-a85122800ce9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d3d16f6053ae793fbf194d5cbb556a8603bae74b52c9f07aa937dbfd654e3894\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshif
t-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T00:56:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d9t5q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9d5c92e7037925f24a5b88f538d4822fae94b65ea5f07960f3e465148975af4e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T00:56:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d9t5q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T00:56:11Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-678nx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call 
webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T00:56:23Z is after 2025-08-24T17:21:41Z" Mar 11 00:56:23 crc kubenswrapper[4744]: I0311 00:56:23.973765 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 11 00:56:23 crc kubenswrapper[4744]: I0311 00:56:23.973985 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 11 00:56:23 crc kubenswrapper[4744]: I0311 00:56:23.974121 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 11 00:56:23 crc kubenswrapper[4744]: E0311 00:56:23.974048 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 11 00:56:23 crc kubenswrapper[4744]: E0311 00:56:23.974255 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 11 00:56:23 crc kubenswrapper[4744]: E0311 00:56:23.974340 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 11 00:56:23 crc kubenswrapper[4744]: I0311 00:56:23.989774 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1a87b2ce-541f-40ba-a568-93914e7dea62\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:54:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:54:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:55:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:55:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:54:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://19989b1018d06da0230a62c4c231865e9d16267d0eb570a409da006b246005b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-d
ev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://832be2ac0e5b4b69727f022a0663974a495e008d3a781fa4d237d3347d39c154\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-11T00:54:37Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager -v=2\\\\nI0311 00:54:06.849622 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. 
Worst graceful lease acquisition is {26s}.\\\\nI0311 00:54:06.852083 1 observer_polling.go:159] Starting file observer\\\\nI0311 00:54:06.910593 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0311 00:54:06.914052 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nF0311 00:54:37.381234 1 cmd.go:179] failed checking apiserver connectivity: Get \\\\\\\"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/openshift-kube-controller-manager/leases/cluster-policy-controller-lock\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T00:54:36Z is after 2026-02-23T05:33:13Z\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-11T00:54:05Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T00:54:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://eb1e3ef7609aa323e622249594eb4ac5cd5cacfa496d8189b3f612e29cb11773\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\
":{\\\"startedAt\\\":\\\"2026-03-11T00:54:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://82831ac0a065cf2ea80ca05df203fc6de89ad279faae95364ab2519b689453d8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T00:54:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ca4d36508e362c9b7f7365ade7dda3dec0c70f9df1be158a034aeaaa772c797e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T00:54:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-d
ir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T00:54:04Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T00:56:23Z is after 2025-08-24T17:21:41Z" Mar 11 00:56:24 crc kubenswrapper[4744]: I0311 00:56:24.007893 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e8d341ff-c992-4426-acea-a34e4a61fa6f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:54:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:54:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:54:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:54:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:54:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://01519860f21aa252a579dd461cd32233ed13bfb46d851c1c19410e2632ad4389\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791
fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T00:54:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0b3b81c668c0cd9ccdc04d171476dbbbe2bca625de84ad8aff48be69bbaa46b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T00:54:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7fba0fd93d322bd73e9140c40efe4dd1469525a5d6d26a0d1e32ba8e0920a2a4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T00:54:07Z\\\"}},\\\"volumeM
ounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9690b251bd56f0d091b5ae52e71e94528ba6afd0e61919398662fdadd4a9f89f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9690b251bd56f0d091b5ae52e71e94528ba6afd0e61919398662fdadd4a9f89f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T00:54:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T00:54:05Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T00:54:04Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T00:56:24Z is after 2025-08-24T17:21:41Z" Mar 11 00:56:24 crc kubenswrapper[4744]: I0311 00:56:24.024717 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:11Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T00:56:24Z is after 2025-08-24T17:21:41Z" Mar 11 00:56:24 crc kubenswrapper[4744]: I0311 00:56:24.045575 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-xlclh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e16bf0f3-533b-4114-89c6-195a85273e98\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3cf2ef2701f6c6e8715d3c41c3d7c4d704d6f03989831a0c3d34e92ab7a6dd50\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T00:56:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pgqmk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T00:56:11Z\\\"}}\" for pod \"openshift-multus\"/\"multus-xlclh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T00:56:24Z is after 2025-08-24T17:21:41Z" Mar 11 00:56:24 crc kubenswrapper[4744]: I0311 00:56:24.068618 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-6ghqv" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e146399d-0685-4ea8-96e4-c76c9478a23a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9fd8ff476d73855cbb3da64ae999f54bad2263813a70dba1aa61507b53102d42\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T00:56:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8h2vm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T00:56:11Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-6ghqv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T00:56:24Z is after 2025-08-24T17:21:41Z" Mar 11 00:56:24 crc kubenswrapper[4744]: E0311 00:56:24.077399 4744 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 11 00:56:24 crc kubenswrapper[4744]: I0311 00:56:24.089918 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-sj4cl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ac70f681-9bcf-4f8c-a175-ed7d4e9da471\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fafb45aea7f
19d6db5ddf62115fb453eebda33f0be934b59e40083cd8387c896\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T00:56:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqrjx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://09583de802dbeffc06642379e30e628cf6541f82d5a11caa554cdf17e408476b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://09583de802dbeffc06642379e30e628cf6541f82d5a11caa554cdf17e408476b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T00:56:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T00:56:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\
"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqrjx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9a370bc02fc3d53e08d44ce24a85d1e5b20f06a4933b61c8fef04aff8cfe6081\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9a370bc02fc3d53e08d44ce24a85d1e5b20f06a4933b61c8fef04aff8cfe6081\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T00:56:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T00:56:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqrjx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6ad4e761e6f478df27c55306ecb235f8cbbe0954433cbe7a029d585d609a08db\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd
367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6ad4e761e6f478df27c55306ecb235f8cbbe0954433cbe7a029d585d609a08db\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T00:56:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T00:56:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqrjx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://96ebe5cbd3477966cb49d08397fee1954529001d0bede507c64ca71c25786a87\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://96ebe5cbd3477966cb49d08397fee1954529001d0bede507c64ca71c25786a87\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T00:56:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T00:56:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"
,\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqrjx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cdbd8c4d24a706d746c84147c617f511da67e77e20235bb94b3b365552ef8c23\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cdbd8c4d24a706d746c84147c617f511da67e77e20235bb94b3b365552ef8c23\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T00:56:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T00:56:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqrjx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://81090de2003b39fdaee05a64219b7ebefa57068b2682e821d55ea9615ba67d4b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\
\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://81090de2003b39fdaee05a64219b7ebefa57068b2682e821d55ea9615ba67d4b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T00:56:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T00:56:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqrjx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T00:56:11Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-sj4cl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T00:56:24Z is after 2025-08-24T17:21:41Z" Mar 11 00:56:24 crc kubenswrapper[4744]: I0311 00:56:24.114356 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-78fcc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6ff04e11-e747-44c5-b049-371a5d422157\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:11Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:11Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8132bd89cd281c66b2735b3fa85c3fc773e7140d6aaeaefe0c0a2c3e3f743080\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T00:56:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fr9zj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e37e770ef43eabf7e0bb162d7e700eaa2d61c7ceca9737de5872c1caeec06774\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T00:56:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fr9zj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://98ba07651ae84a42f92411fee85a15562e325c61ff2d8fcba4962d21c51fce73\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T00:56:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fr9zj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d7cd308cadb08ea56bbc709c6401ab1d3eec24f0414a7ee1c57188c5e60adffe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T00:56:14Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fr9zj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://84d4ada431e7a91cc86b5cb9bba4316c233c5f666f21bce1d8b6df0019b6ad76\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T00:56:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fr9zj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://27bc44383b34aea31ffcebae0f1f464c31918edc0c37f01d6bd66166e34fd4a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T00:56:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fr9zj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9a1996e9bc928fb639a65a4dab19a19d9661eabdf82ddd23a0dfff6fba30a039\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://aa4304285d6a4bb306a291f5a152fa229d42c20a3bc7abb0fd4137bc3f16ee56\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-11T00:56:22Z\\\",\\\"message\\\":\\\"e (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0311 00:56:22.252931 6736 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0311 00:56:22.253017 6736 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0311 00:56:22.253083 6736 handler.go:190] Sending *v1.Namespace event handler 1 for 
removal\\\\nI0311 00:56:22.253095 6736 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0311 00:56:22.253101 6736 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0311 00:56:22.253122 6736 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0311 00:56:22.253135 6736 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0311 00:56:22.253140 6736 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0311 00:56:22.253145 6736 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0311 00:56:22.253157 6736 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0311 00:56:22.253172 6736 handler.go:208] Removed *v1.Node event handler 2\\\\nI0311 00:56:22.253192 6736 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0311 00:56:22.253215 6736 factory.go:656] Stopping watch factory\\\\nI0311 00:56:22.253216 6736 handler.go:208] Removed *v1.Node event handler 7\\\\nI0311 00:56:22.253236 6736 ovnkube.go:599] Stopped ovnkube\\\\nI0311 
0\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-11T00:56:19Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T00:56:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\"
:\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fr9zj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://742e3b6ac0accae72e9285636ab0920db1d3e0625d6093adf1606f57b484b139\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T00:56:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fr9zj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://524c79ee3bb53b187166cbc6bf19cbb3311ab561608a1d3c819288c1fb684481\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\
":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://524c79ee3bb53b187166cbc6bf19cbb3311ab561608a1d3c819288c1fb684481\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T00:56:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T00:56:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fr9zj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T00:56:11Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-78fcc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T00:56:24Z is after 2025-08-24T17:21:41Z" Mar 11 00:56:24 crc kubenswrapper[4744]: I0311 00:56:24.135148 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:12Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:12Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://80f7b8faf68f817bd772c7988a46ffcc286551806463dc59776e5c6e18deb8f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T00:56:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-11T00:56:24Z is after 2025-08-24T17:21:41Z" Mar 11 00:56:24 crc kubenswrapper[4744]: I0311 00:56:24.149712 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:11Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T00:56:24Z is after 2025-08-24T17:21:41Z" Mar 11 00:56:24 crc kubenswrapper[4744]: I0311 00:56:24.161909 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-rplbw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"92f4c9df-1087-4820-8e07-1120f02df454\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://34eabb0e2af63857d40e20376f7979131fef683885949c1ea75e0a4949968872\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T00:56:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lkpvs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2bfa60a94390dd23da621dd81523dde8107fe
6f3beba45ad1b7a96c0d6cd4f56\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T00:56:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lkpvs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T00:56:11Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-rplbw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T00:56:24Z is after 2025-08-24T17:21:41Z" Mar 11 00:56:24 crc kubenswrapper[4744]: I0311 00:56:24.181495 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e522f1d8-5329-414c-88d5-79e6f3b615be\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:54:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:54:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:54:04Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:54:04Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:54:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://31713fedc28cf71b8df25e22000768f5d00e6de1b228a632aa3c51aea11eea7c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T00:54:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e702350d7b926ec4df6853a20470ea7515f1e1c98720db1a4d672ea4e03e82fa\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T00:54:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2d93a4815010e9ee724175731b03bf82804921c5bbb9fe8be236057f90057f28\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T00:54:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4b2b952a5490bff5028c83fdfb0942e84bac782d470cede57ec6a31f483a407e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4b2b952a5490bff5028c83fdfb0942e84bac782d470cede57ec6a31f483a407e\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-11T00:55:54Z\\\",\\\"message\\\":\\\"le observer\\\\nW0311 00:55:54.578734 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0311 00:55:54.579118 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0311 00:55:54.580623 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3889625742/tls.crt::/tmp/serving-cert-3889625742/tls.key\\\\\\\"\\\\nI0311 00:55:54.806131 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0311 00:55:54.812332 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0311 00:55:54.812369 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0311 00:55:54.812401 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0311 00:55:54.812411 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0311 00:55:54.819161 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0311 00:55:54.819196 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0311 00:55:54.819211 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0311 00:55:54.819223 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0311 00:55:54.819235 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0311 
00:55:54.819243 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0311 00:55:54.819248 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0311 00:55:54.819254 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0311 00:55:54.821179 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-11T00:55:54Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 1m20s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c9f3a6d71079f1717c07e0b026f34d1bfd6391a5bb6745bcdab535d5290e2aa3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T00:54:07Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://edfc2acc25990d1ccbca879e186fb9dfc0a6664fa1df5cee62958ff52b217748\\\",\\\"image\\\":\\\"quay.io/openshift-release-d
ev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://edfc2acc25990d1ccbca879e186fb9dfc0a6664fa1df5cee62958ff52b217748\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T00:54:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T00:54:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T00:54:04Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T00:56:24Z is after 2025-08-24T17:21:41Z" Mar 11 00:56:24 crc kubenswrapper[4744]: I0311 00:56:24.205477 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e522f1d8-5329-414c-88d5-79e6f3b615be\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:54:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:54:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:54:04Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:54:04Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:54:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://31713fedc28cf71b8df25e22000768f5d00e6de1b228a632aa3c51aea11eea7c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T00:54:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e702350d7b926ec4df6853a20470ea7515f1e1c98720db1a4d672ea4e03e82fa\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T00:54:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2d93a4815010e9ee724175731b03bf82804921c5bbb9fe8be236057f90057f28\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T00:54:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4b2b952a5490bff5028c83fdfb0942e84bac782d470cede57ec6a31f483a407e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4b2b952a5490bff5028c83fdfb0942e84bac782d470cede57ec6a31f483a407e\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-11T00:55:54Z\\\",\\\"message\\\":\\\"le observer\\\\nW0311 00:55:54.578734 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0311 00:55:54.579118 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0311 00:55:54.580623 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3889625742/tls.crt::/tmp/serving-cert-3889625742/tls.key\\\\\\\"\\\\nI0311 00:55:54.806131 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0311 00:55:54.812332 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0311 00:55:54.812369 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0311 00:55:54.812401 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0311 00:55:54.812411 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0311 00:55:54.819161 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0311 00:55:54.819196 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0311 00:55:54.819211 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0311 00:55:54.819223 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0311 00:55:54.819235 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0311 
00:55:54.819243 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0311 00:55:54.819248 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0311 00:55:54.819254 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0311 00:55:54.821179 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-11T00:55:54Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 1m20s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c9f3a6d71079f1717c07e0b026f34d1bfd6391a5bb6745bcdab535d5290e2aa3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T00:54:07Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://edfc2acc25990d1ccbca879e186fb9dfc0a6664fa1df5cee62958ff52b217748\\\",\\\"image\\\":\\\"quay.io/openshift-release-d
ev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://edfc2acc25990d1ccbca879e186fb9dfc0a6664fa1df5cee62958ff52b217748\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T00:54:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T00:54:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T00:54:04Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T00:56:24Z is after 2025-08-24T17:21:41Z" Mar 11 00:56:24 crc kubenswrapper[4744]: I0311 00:56:24.224784 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:11Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T00:56:24Z is after 2025-08-24T17:21:41Z" Mar 11 00:56:24 crc kubenswrapper[4744]: I0311 00:56:24.243938 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-rplbw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"92f4c9df-1087-4820-8e07-1120f02df454\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://34eabb0e2af63857d40e20376f7979131fef683885949c1ea75e0a4949968872\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T00:56:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lkpvs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2bfa60a94390dd23da621dd81523dde8107fe
6f3beba45ad1b7a96c0d6cd4f56\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T00:56:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lkpvs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T00:56:11Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-rplbw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T00:56:24Z is after 2025-08-24T17:21:41Z" Mar 11 00:56:24 crc kubenswrapper[4744]: I0311 00:56:24.258385 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:13Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:13Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://064a29fa0ed4ecddcf11aa3175a3fc935147341c372af46f94eee470a9ba363c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T00:56:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7c4d16fc0af970b13be0d202345567993d215bf342174b1b568c4b4ac26c0017\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T00:56:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T00:56:24Z is after 2025-08-24T17:21:41Z" Mar 11 00:56:24 crc kubenswrapper[4744]: I0311 00:56:24.272698 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-tdnf7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7aeb1578-fe93-4bec-8f43-17d0923fa5c0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:11Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:11Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-th54b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-th54b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T00:56:11Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-tdnf7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T00:56:24Z is after 2025-08-24T17:21:41Z" Mar 11 00:56:24 crc 
kubenswrapper[4744]: I0311 00:56:24.293091 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1a87b2ce-541f-40ba-a568-93914e7dea62\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:54:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:54:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:55:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:55:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:54:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://19989b1018d06da0230a62c4c231865e9d16267d0eb570a409da006b246005b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://832be2ac0e5b4b69727f022a0663974a495e008d3a781fa4d237d3347d39c154\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-11T00:54:37Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start 
--config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager -v=2\\\\nI0311 00:54:06.849622 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. Worst graceful lease acquisition is {26s}.\\\\nI0311 00:54:06.852083 1 observer_polling.go:159] Starting file observer\\\\nI0311 00:54:06.910593 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0311 00:54:06.914052 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nF0311 00:54:37.381234 1 cmd.go:179] failed checking apiserver connectivity: Get \\\\\\\"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/openshift-kube-controller-manager/leases/cluster-policy-controller-lock\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T00:54:36Z is after 
2026-02-23T05:33:13Z\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-11T00:54:05Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T00:54:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://eb1e3ef7609aa323e622249594eb4ac5cd5cacfa496d8189b3f612e29cb11773\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T00:54:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://82831ac0a065cf2ea80ca05df203fc6de89ad279faae95364ab2519b689453d8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T00:54:05Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ca4d36508e362c9b7f7365ade7dda3dec0c70f9df1be158a034aeaaa772c797e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T00:54:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T00:54:04Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T00:56:24Z is after 2025-08-24T17:21:41Z" Mar 11 00:56:24 crc kubenswrapper[4744]: I0311 00:56:24.309991 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e8d341ff-c992-4426-acea-a34e4a61fa6f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:54:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:54:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:54:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:54:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:54:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://01519860f21aa252a579dd461cd32233ed13bfb46d851c1c19410e2632ad4389\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T00:54:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0b3b81c668c0cd9ccdc04d171476dbbbe2bca625de84ad8aff48be69bbaa46b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T00:54:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7fba0fd93d322bd73e9140c40efe4dd1469525a5d6d26a0d1e32ba8e0920a2a4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T00:54:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9690b251bd56f0d091b5ae52e71e94528ba6afd0e61919398662fdadd4a9f89f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://9690b251bd56f0d091b5ae52e71e94528ba6afd0e61919398662fdadd4a9f89f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T00:54:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T00:54:05Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T00:54:04Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T00:56:24Z is after 2025-08-24T17:21:41Z" Mar 11 00:56:24 crc kubenswrapper[4744]: I0311 00:56:24.327355 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:16Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:16Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bff1516676048ec93cb3569f8720eb042036b92afc1c65f1c58b15fb604f58b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T00:56:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-11T00:56:24Z is after 2025-08-24T17:21:41Z" Mar 11 00:56:24 crc kubenswrapper[4744]: I0311 00:56:24.344742 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:11Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T00:56:24Z is after 2025-08-24T17:21:41Z" Mar 11 00:56:24 crc kubenswrapper[4744]: I0311 00:56:24.360437 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-8z8gf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ad8329c6-d511-446f-b617-99778d12b878\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://27268ce5ab229a693eeb80e4dfc38c14f18ad32dc201ace682b5516070c52fde\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T00:56:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7gczk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T00:56:11Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-8z8gf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T00:56:24Z is after 2025-08-24T17:21:41Z" Mar 11 00:56:24 crc kubenswrapper[4744]: I0311 00:56:24.372370 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-678nx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a15dc7ac-7c34-4135-b6eb-a85122800ce9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d3d16f6053ae793fbf194d5cbb556a8603bae74b52c9f07aa937dbfd654e3894\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshif
t-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T00:56:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d9t5q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9d5c92e7037925f24a5b88f538d4822fae94b65ea5f07960f3e465148975af4e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T00:56:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d9t5q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T00:56:11Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-678nx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call 
webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T00:56:24Z is after 2025-08-24T17:21:41Z" Mar 11 00:56:24 crc kubenswrapper[4744]: I0311 00:56:24.385230 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:12Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:12Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://80f7b8faf68f817bd772c7988a46ffcc286551806463dc59776e5c6e18deb8f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T00:56:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kub
ernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T00:56:24Z is after 2025-08-24T17:21:41Z" Mar 11 00:56:24 crc kubenswrapper[4744]: I0311 00:56:24.398170 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:11Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T00:56:24Z is after 2025-08-24T17:21:41Z" Mar 11 00:56:24 crc kubenswrapper[4744]: I0311 00:56:24.415656 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-xlclh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e16bf0f3-533b-4114-89c6-195a85273e98\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3cf2ef2701f6c6e8715d3c41c3d7c4d704d6f03989831a0c3d34e92ab7a6dd50\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T00:56:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pgqmk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T00:56:11Z\\\"}}\" for pod \"openshift-multus\"/\"multus-xlclh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T00:56:24Z is after 2025-08-24T17:21:41Z" Mar 11 00:56:24 crc kubenswrapper[4744]: I0311 00:56:24.425490 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-6ghqv" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e146399d-0685-4ea8-96e4-c76c9478a23a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9fd8ff476d73855cbb3da64ae999f54bad2263813a70dba1aa61507b53102d42\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T00:56:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8h2vm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T00:56:11Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-6ghqv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T00:56:24Z is after 2025-08-24T17:21:41Z" Mar 11 00:56:24 crc kubenswrapper[4744]: I0311 00:56:24.440949 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-sj4cl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ac70f681-9bcf-4f8c-a175-ed7d4e9da471\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fafb45aea7f19d6db5ddf62115fb453eebda33f0be934b59e40083cd8387c896\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97
f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T00:56:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqrjx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://09583de802dbeffc06642379e30e628cf6541f82d5a11caa554cdf17e408476b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://09583de802dbeffc06642379e30e628cf6541f82d5a11caa554cdf17e408476b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T00:56:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T00:56:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqrjx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9a370bc02fc3d53e08d44ce24a85d1e5b20f06a4933b61c8fef04aff8cfe6081\\\",\\\"image\\\":\
\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9a370bc02fc3d53e08d44ce24a85d1e5b20f06a4933b61c8fef04aff8cfe6081\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T00:56:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T00:56:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqrjx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6ad4e761e6f478df27c55306ecb235f8cbbe0954433cbe7a029d585d609a08db\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6ad4e761e6f478df27c55306ecb235f8cbbe0954433cbe7a029d585d609a08db\\\",\\\"exitCode\\\
":0,\\\"finishedAt\\\":\\\"2026-03-11T00:56:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T00:56:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqrjx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://96ebe5cbd3477966cb49d08397fee1954529001d0bede507c64ca71c25786a87\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://96ebe5cbd3477966cb49d08397fee1954529001d0bede507c64ca71c25786a87\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T00:56:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T00:56:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqrjx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cdbd8c4d24a706d746c84147c617f511
da67e77e20235bb94b3b365552ef8c23\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cdbd8c4d24a706d746c84147c617f511da67e77e20235bb94b3b365552ef8c23\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T00:56:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T00:56:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqrjx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://81090de2003b39fdaee05a64219b7ebefa57068b2682e821d55ea9615ba67d4b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://81090de2003b39fdaee05a64219b7ebefa57068b2682e821d55ea9615ba67d4b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T00:56:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\"
:\\\"2026-03-11T00:56:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqrjx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T00:56:11Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-sj4cl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T00:56:24Z is after 2025-08-24T17:21:41Z" Mar 11 00:56:24 crc kubenswrapper[4744]: I0311 00:56:24.469284 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-78fcc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6ff04e11-e747-44c5-b049-371a5d422157\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:11Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:11Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8132bd89cd281c66b2735b3fa85c3fc773e7140d6aaeaefe0c0a2c3e3f743080\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T00:56:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fr9zj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e37e770ef43eabf7e0bb162d7e700eaa2d61c7ceca9737de5872c1caeec06774\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T00:56:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fr9zj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://98ba07651ae84a42f92411fee85a15562e325c61ff2d8fcba4962d21c51fce73\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T00:56:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fr9zj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d7cd308cadb08ea56bbc709c6401ab1d3eec24f0414a7ee1c57188c5e60adffe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d20994829
19d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T00:56:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fr9zj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://84d4ada431e7a91cc86b5cb9bba4316c233c5f666f21bce1d8b6df0019b6ad76\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T00:56:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fr9zj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://27bc44383b34aea31ffcebae0f1f464c31918edc0c37f01d6bd66166e34fd4a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cd
d47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T00:56:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fr9zj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9a1996e9bc928fb639a65a4dab19a19d9661eabdf82ddd23a0dfff6fba30a039\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://aa4304285d6a4bb306a291f5a152fa229d42c20a3bc7abb0fd4137bc3f16ee56\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-11T00:56:22Z\\\",\\\"message\\\":\\\"e (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0311 00:56:22.252931 6736 
handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0311 00:56:22.253017 6736 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0311 00:56:22.253083 6736 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0311 00:56:22.253095 6736 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0311 00:56:22.253101 6736 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0311 00:56:22.253122 6736 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0311 00:56:22.253135 6736 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0311 00:56:22.253140 6736 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0311 00:56:22.253145 6736 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0311 00:56:22.253157 6736 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0311 00:56:22.253172 6736 handler.go:208] Removed *v1.Node event handler 2\\\\nI0311 00:56:22.253192 6736 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0311 00:56:22.253215 6736 factory.go:656] Stopping watch factory\\\\nI0311 00:56:22.253216 6736 handler.go:208] Removed *v1.Node event handler 7\\\\nI0311 00:56:22.253236 6736 ovnkube.go:599] Stopped ovnkube\\\\nI0311 
0\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-11T00:56:19Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T00:56:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\"
:\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fr9zj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://742e3b6ac0accae72e9285636ab0920db1d3e0625d6093adf1606f57b484b139\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T00:56:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fr9zj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://524c79ee3bb53b187166cbc6bf19cbb3311ab561608a1d3c819288c1fb684481\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\
":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://524c79ee3bb53b187166cbc6bf19cbb3311ab561608a1d3c819288c1fb684481\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T00:56:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T00:56:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fr9zj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T00:56:11Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-78fcc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T00:56:24Z is after 2025-08-24T17:21:41Z" Mar 11 00:56:24 crc kubenswrapper[4744]: I0311 00:56:24.794142 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-78fcc_6ff04e11-e747-44c5-b049-371a5d422157/ovnkube-controller/1.log" Mar 11 00:56:24 crc kubenswrapper[4744]: I0311 00:56:24.795111 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-78fcc_6ff04e11-e747-44c5-b049-371a5d422157/ovnkube-controller/0.log" Mar 11 00:56:24 crc kubenswrapper[4744]: I0311 00:56:24.797832 4744 generic.go:334] "Generic (PLEG): container finished" podID="6ff04e11-e747-44c5-b049-371a5d422157" containerID="9a1996e9bc928fb639a65a4dab19a19d9661eabdf82ddd23a0dfff6fba30a039" exitCode=1 Mar 11 00:56:24 crc kubenswrapper[4744]: I0311 00:56:24.797917 4744 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openshift-ovn-kubernetes/ovnkube-node-78fcc" event={"ID":"6ff04e11-e747-44c5-b049-371a5d422157","Type":"ContainerDied","Data":"9a1996e9bc928fb639a65a4dab19a19d9661eabdf82ddd23a0dfff6fba30a039"} Mar 11 00:56:24 crc kubenswrapper[4744]: I0311 00:56:24.798016 4744 scope.go:117] "RemoveContainer" containerID="aa4304285d6a4bb306a291f5a152fa229d42c20a3bc7abb0fd4137bc3f16ee56" Mar 11 00:56:24 crc kubenswrapper[4744]: I0311 00:56:24.800291 4744 scope.go:117] "RemoveContainer" containerID="9a1996e9bc928fb639a65a4dab19a19d9661eabdf82ddd23a0dfff6fba30a039" Mar 11 00:56:24 crc kubenswrapper[4744]: E0311 00:56:24.800794 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 10s restarting failed container=ovnkube-controller pod=ovnkube-node-78fcc_openshift-ovn-kubernetes(6ff04e11-e747-44c5-b049-371a5d422157)\"" pod="openshift-ovn-kubernetes/ovnkube-node-78fcc" podUID="6ff04e11-e747-44c5-b049-371a5d422157" Mar 11 00:56:24 crc kubenswrapper[4744]: I0311 00:56:24.819944 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-xlclh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e16bf0f3-533b-4114-89c6-195a85273e98\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3cf2ef2701f6c6e8715d3c41c3d7c4d704d6f03989831a0c3d34e92ab7a6dd50\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T00:56:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pgqmk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T00:56:11Z\\\"}}\" for pod \"openshift-multus\"/\"multus-xlclh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T00:56:24Z is after 2025-08-24T17:21:41Z" Mar 11 00:56:24 crc kubenswrapper[4744]: I0311 00:56:24.838385 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-6ghqv" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e146399d-0685-4ea8-96e4-c76c9478a23a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9fd8ff476d73855cbb3da64ae999f54bad2263813a70dba1aa61507b53102d42\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T00:56:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8h2vm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T00:56:11Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-6ghqv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T00:56:24Z is after 2025-08-24T17:21:41Z" Mar 11 00:56:24 crc kubenswrapper[4744]: I0311 00:56:24.861871 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-sj4cl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ac70f681-9bcf-4f8c-a175-ed7d4e9da471\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fafb45aea7f19d6db5ddf62115fb453eebda33f0be934b59e40083cd8387c896\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97
f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T00:56:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqrjx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://09583de802dbeffc06642379e30e628cf6541f82d5a11caa554cdf17e408476b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://09583de802dbeffc06642379e30e628cf6541f82d5a11caa554cdf17e408476b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T00:56:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T00:56:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqrjx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9a370bc02fc3d53e08d44ce24a85d1e5b20f06a4933b61c8fef04aff8cfe6081\\\",\\\"image\\\":\
\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9a370bc02fc3d53e08d44ce24a85d1e5b20f06a4933b61c8fef04aff8cfe6081\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T00:56:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T00:56:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqrjx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6ad4e761e6f478df27c55306ecb235f8cbbe0954433cbe7a029d585d609a08db\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6ad4e761e6f478df27c55306ecb235f8cbbe0954433cbe7a029d585d609a08db\\\",\\\"exitCode\\\
":0,\\\"finishedAt\\\":\\\"2026-03-11T00:56:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T00:56:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqrjx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://96ebe5cbd3477966cb49d08397fee1954529001d0bede507c64ca71c25786a87\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://96ebe5cbd3477966cb49d08397fee1954529001d0bede507c64ca71c25786a87\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T00:56:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T00:56:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqrjx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cdbd8c4d24a706d746c84147c617f511
da67e77e20235bb94b3b365552ef8c23\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cdbd8c4d24a706d746c84147c617f511da67e77e20235bb94b3b365552ef8c23\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T00:56:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T00:56:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqrjx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://81090de2003b39fdaee05a64219b7ebefa57068b2682e821d55ea9615ba67d4b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://81090de2003b39fdaee05a64219b7ebefa57068b2682e821d55ea9615ba67d4b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T00:56:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\"
:\\\"2026-03-11T00:56:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqrjx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T00:56:11Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-sj4cl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T00:56:24Z is after 2025-08-24T17:21:41Z" Mar 11 00:56:24 crc kubenswrapper[4744]: I0311 00:56:24.913926 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-78fcc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6ff04e11-e747-44c5-b049-371a5d422157\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:11Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:11Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8132bd89cd281c66b2735b3fa85c3fc773e7140d6aaeaefe0c0a2c3e3f743080\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T00:56:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fr9zj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e37e770ef43eabf7e0bb162d7e700eaa2d61c7ceca9737de5872c1caeec06774\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T00:56:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fr9zj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://98ba07651ae84a42f92411fee85a15562e325c61ff2d8fcba4962d21c51fce73\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T00:56:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fr9zj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d7cd308cadb08ea56bbc709c6401ab1d3eec24f0414a7ee1c57188c5e60adffe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d20994829
19d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T00:56:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fr9zj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://84d4ada431e7a91cc86b5cb9bba4316c233c5f666f21bce1d8b6df0019b6ad76\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T00:56:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fr9zj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://27bc44383b34aea31ffcebae0f1f464c31918edc0c37f01d6bd66166e34fd4a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cd
d47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T00:56:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fr9zj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9a1996e9bc928fb639a65a4dab19a19d9661eabdf82ddd23a0dfff6fba30a039\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://aa4304285d6a4bb306a291f5a152fa229d42c20a3bc7abb0fd4137bc3f16ee56\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-11T00:56:22Z\\\",\\\"message\\\":\\\"e (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0311 00:56:22.252931 6736 
handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0311 00:56:22.253017 6736 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0311 00:56:22.253083 6736 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0311 00:56:22.253095 6736 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0311 00:56:22.253101 6736 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0311 00:56:22.253122 6736 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0311 00:56:22.253135 6736 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0311 00:56:22.253140 6736 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0311 00:56:22.253145 6736 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0311 00:56:22.253157 6736 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0311 00:56:22.253172 6736 handler.go:208] Removed *v1.Node event handler 2\\\\nI0311 00:56:22.253192 6736 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0311 00:56:22.253215 6736 factory.go:656] Stopping watch factory\\\\nI0311 00:56:22.253216 6736 handler.go:208] Removed *v1.Node event handler 7\\\\nI0311 00:56:22.253236 6736 ovnkube.go:599] Stopped ovnkube\\\\nI0311 0\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-11T00:56:19Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9a1996e9bc928fb639a65a4dab19a19d9661eabdf82ddd23a0dfff6fba30a039\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-11T00:56:24Z\\\",\\\"message\\\":\\\":24.074233 6873 reflector.go:311] Stopping reflector *v1.EgressIP (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressip/v1/apis/informers/externalversions/factory.go:140\\\\nI0311 00:56:24.074426 6873 reflector.go:311] Stopping reflector 
*v1alpha1.BaselineAdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI0311 00:56:24.074740 6873 reflector.go:311] Stopping reflector *v1alpha1.AdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI0311 00:56:24.074745 6873 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0311 00:56:24.074779 6873 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0311 00:56:24.074787 6873 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0311 00:56:24.074812 6873 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0311 00:56:24.074847 6873 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0311 00:56:24.074862 6873 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0311 00:56:24.074879 6873 factory.go:656] Stopping watch factory\\\\nI0311 00:56:24.074895 6873 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0311 00:56:24.074907 6873 handler.go:208] Removed *v1.Namespace 
ev\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-11T00:56:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"k
ube-api-access-fr9zj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://742e3b6ac0accae72e9285636ab0920db1d3e0625d6093adf1606f57b484b139\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T00:56:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fr9zj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://524c79ee3bb53b187166cbc6bf19cbb3311ab561608a1d3c819288c1fb684481\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://524c79ee3bb53b187166cbc6bf19cbb3311ab561608a1d3c819288c1fb684481\
\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T00:56:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T00:56:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fr9zj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T00:56:11Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-78fcc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T00:56:24Z is after 2025-08-24T17:21:41Z" Mar 11 00:56:24 crc kubenswrapper[4744]: I0311 00:56:24.939423 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:12Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:12Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://80f7b8faf68f817bd772c7988a46ffcc286551806463dc59776e5c6e18deb8f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T00:56:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-11T00:56:24Z is after 2025-08-24T17:21:41Z" Mar 11 00:56:24 crc kubenswrapper[4744]: I0311 00:56:24.957508 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:11Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T00:56:24Z is after 2025-08-24T17:21:41Z" Mar 11 00:56:24 crc kubenswrapper[4744]: I0311 00:56:24.970792 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-rplbw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"92f4c9df-1087-4820-8e07-1120f02df454\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://34eabb0e2af63857d40e20376f7979131fef683885949c1ea75e0a4949968872\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T00:56:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lkpvs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2bfa60a94390dd23da621dd81523dde8107fe
6f3beba45ad1b7a96c0d6cd4f56\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T00:56:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lkpvs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T00:56:11Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-rplbw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T00:56:24Z is after 2025-08-24T17:21:41Z" Mar 11 00:56:24 crc kubenswrapper[4744]: I0311 00:56:24.973736 4744 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-tdnf7" Mar 11 00:56:24 crc kubenswrapper[4744]: E0311 00:56:24.973884 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-tdnf7" podUID="7aeb1578-fe93-4bec-8f43-17d0923fa5c0" Mar 11 00:56:24 crc kubenswrapper[4744]: I0311 00:56:24.985795 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e522f1d8-5329-414c-88d5-79e6f3b615be\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:54:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:54:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:54:04Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:54:04Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:54:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://31713fedc28cf71b8df25e22000768f5d00e6de1b228a632aa3c51aea11eea7c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T00:54:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e702350d7b926ec4df6853a20470ea7515f1e1c98720db1a4d672ea4e03e82fa\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T00:54:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://2d93a4815010e9ee724175731b03bf82804921c5bbb9fe8be236057f90057f28\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T00:54:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4b2b952a5490bff5028c83fdfb0942e84bac782d470cede57ec6a31f483a407e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4b2b952a5490bff5028c83fdfb0942e84bac782d470cede57ec6a31f483a407e\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-11T00:55:54Z\\\",\\\"message\\\":\\\"le observer\\\\nW0311 00:55:54.578734 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0311 00:55:54.579118 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0311 00:55:54.580623 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-3889625742/tls.crt::/tmp/serving-cert-3889625742/tls.key\\\\\\\"\\\\nI0311 00:55:54.806131 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0311 00:55:54.812332 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0311 00:55:54.812369 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0311 00:55:54.812401 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0311 00:55:54.812411 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0311 00:55:54.819161 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0311 00:55:54.819196 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0311 00:55:54.819211 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0311 00:55:54.819223 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0311 00:55:54.819235 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0311 00:55:54.819243 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0311 00:55:54.819248 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0311 00:55:54.819254 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0311 00:55:54.821179 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-11T00:55:54Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 1m20s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c9f3a6d71079f1717c07e0b026f34d1bfd6391a5bb6745bcdab535d5290e2aa3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T00:54:07Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://edfc2acc25990d1ccbca879e186fb9dfc0a6664fa1df5cee62958ff52b217748\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://edfc2acc25990d1ccbca879e186fb9dfc0a6664fa1df5cee62958ff52b217748\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T00:54:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T00:54:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T00:54:04Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T00:56:24Z is after 2025-08-24T17:21:41Z" Mar 11 00:56:24 crc kubenswrapper[4744]: I0311 00:56:24.998498 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:11Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container 
could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T00:56:24Z is after 2025-08-24T17:21:41Z" Mar 11 00:56:25 crc kubenswrapper[4744]: I0311 00:56:25.011362 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:13Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:13Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://064a29fa0ed4ecddcf11aa3175a3fc935147341c372af46f94eee470a9ba363c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T00:56:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7c4d16fc0af970b13be0d202345567993d215bf342174b1b568c4b4ac26c0017\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T00:56:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T00:56:25Z is after 2025-08-24T17:21:41Z" Mar 11 00:56:25 crc kubenswrapper[4744]: I0311 00:56:25.024310 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-tdnf7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7aeb1578-fe93-4bec-8f43-17d0923fa5c0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:11Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:11Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-th54b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-th54b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T00:56:11Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-tdnf7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T00:56:25Z is after 2025-08-24T17:21:41Z" Mar 11 00:56:25 crc 
kubenswrapper[4744]: I0311 00:56:25.045503 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:11Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T00:56:25Z is after 2025-08-24T17:21:41Z" Mar 11 00:56:25 crc kubenswrapper[4744]: I0311 00:56:25.061986 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-8z8gf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ad8329c6-d511-446f-b617-99778d12b878\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://27268ce5ab229a693eeb80e4dfc38c14f18ad32dc201ace682b5516070c52fde\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T00:56:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7gczk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T00:56:11Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-8z8gf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T00:56:25Z is after 2025-08-24T17:21:41Z" Mar 11 00:56:25 crc kubenswrapper[4744]: I0311 00:56:25.080260 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-678nx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a15dc7ac-7c34-4135-b6eb-a85122800ce9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d3d16f6053ae793fbf194d5cbb556a8603bae74b52c9f07aa937dbfd654e3894\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshif
t-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T00:56:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d9t5q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9d5c92e7037925f24a5b88f538d4822fae94b65ea5f07960f3e465148975af4e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T00:56:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d9t5q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T00:56:11Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-678nx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call 
webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T00:56:25Z is after 2025-08-24T17:21:41Z" Mar 11 00:56:25 crc kubenswrapper[4744]: I0311 00:56:25.099217 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1a87b2ce-541f-40ba-a568-93914e7dea62\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:54:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:54:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:55:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:55:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:54:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://19989b1018d06da0230a62c4c231865e9d16267d0eb570a409da006b246005b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://832be2ac0e5b4b69727f022a0663974a495e008d3a781fa4d237d3347d39c154\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-11T00:54:37Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss 
-Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager -v=2\\\\nI0311 00:54:06.849622 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. Worst graceful lease acquisition is {26s}.\\\\nI0311 00:54:06.852083 1 observer_polling.go:159] Starting file observer\\\\nI0311 00:54:06.910593 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0311 00:54:06.914052 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nF0311 00:54:37.381234 1 cmd.go:179] failed checking apiserver connectivity: Get \\\\\\\"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/openshift-kube-controller-manager/leases/cluster-policy-controller-lock\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T00:54:36Z is after 
2026-02-23T05:33:13Z\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-11T00:54:05Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T00:54:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://eb1e3ef7609aa323e622249594eb4ac5cd5cacfa496d8189b3f612e29cb11773\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T00:54:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://82831ac0a065cf2ea80ca05df203fc6de89ad279faae95364ab2519b689453d8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T00:54:05Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ca4d36508e362c9b7f7365ade7dda3dec0c70f9df1be158a034aeaaa772c797e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T00:54:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T00:54:04Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T00:56:25Z is after 2025-08-24T17:21:41Z" Mar 11 00:56:25 crc kubenswrapper[4744]: I0311 00:56:25.117895 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e8d341ff-c992-4426-acea-a34e4a61fa6f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:54:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:54:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:54:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:54:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:54:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://01519860f21aa252a579dd461cd32233ed13bfb46d851c1c19410e2632ad4389\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T00:54:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0b3b81c668c0cd9ccdc04d171476dbbbe2bca625de84ad8aff48be69bbaa46b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T00:54:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7fba0fd93d322bd73e9140c40efe4dd1469525a5d6d26a0d1e32ba8e0920a2a4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T00:54:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9690b251bd56f0d091b5ae52e71e94528ba6afd0e61919398662fdadd4a9f89f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://9690b251bd56f0d091b5ae52e71e94528ba6afd0e61919398662fdadd4a9f89f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T00:54:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T00:54:05Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T00:54:04Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T00:56:25Z is after 2025-08-24T17:21:41Z" Mar 11 00:56:25 crc kubenswrapper[4744]: I0311 00:56:25.136200 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:16Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:16Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bff1516676048ec93cb3569f8720eb042036b92afc1c65f1c58b15fb604f58b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T00:56:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-11T00:56:25Z is after 2025-08-24T17:21:41Z" Mar 11 00:56:25 crc kubenswrapper[4744]: I0311 00:56:25.804668 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-78fcc_6ff04e11-e747-44c5-b049-371a5d422157/ovnkube-controller/1.log" Mar 11 00:56:25 crc kubenswrapper[4744]: I0311 00:56:25.810732 4744 scope.go:117] "RemoveContainer" containerID="9a1996e9bc928fb639a65a4dab19a19d9661eabdf82ddd23a0dfff6fba30a039" Mar 11 00:56:25 crc kubenswrapper[4744]: E0311 00:56:25.811066 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 10s restarting failed container=ovnkube-controller pod=ovnkube-node-78fcc_openshift-ovn-kubernetes(6ff04e11-e747-44c5-b049-371a5d422157)\"" pod="openshift-ovn-kubernetes/ovnkube-node-78fcc" podUID="6ff04e11-e747-44c5-b049-371a5d422157" Mar 11 00:56:25 crc kubenswrapper[4744]: I0311 00:56:25.832587 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-xlclh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e16bf0f3-533b-4114-89c6-195a85273e98\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3cf2ef2701f6c6e8715d3c41c3d7c4d704d6f03989831a0c3d34e92ab7a6dd50\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T00:56:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pgqmk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T00:56:11Z\\\"}}\" for pod \"openshift-multus\"/\"multus-xlclh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T00:56:25Z is after 2025-08-24T17:21:41Z" Mar 11 00:56:25 crc kubenswrapper[4744]: I0311 00:56:25.850983 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-6ghqv" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e146399d-0685-4ea8-96e4-c76c9478a23a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9fd8ff476d73855cbb3da64ae999f54bad2263813a70dba1aa61507b53102d42\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T00:56:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8h2vm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T00:56:11Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-6ghqv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T00:56:25Z is after 2025-08-24T17:21:41Z" Mar 11 00:56:25 crc kubenswrapper[4744]: I0311 00:56:25.875187 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-sj4cl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ac70f681-9bcf-4f8c-a175-ed7d4e9da471\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fafb45aea7f19d6db5ddf62115fb453eebda33f0be934b59e40083cd8387c896\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97
f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T00:56:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqrjx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://09583de802dbeffc06642379e30e628cf6541f82d5a11caa554cdf17e408476b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://09583de802dbeffc06642379e30e628cf6541f82d5a11caa554cdf17e408476b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T00:56:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T00:56:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqrjx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9a370bc02fc3d53e08d44ce24a85d1e5b20f06a4933b61c8fef04aff8cfe6081\\\",\\\"image\\\":\
\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9a370bc02fc3d53e08d44ce24a85d1e5b20f06a4933b61c8fef04aff8cfe6081\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T00:56:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T00:56:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqrjx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6ad4e761e6f478df27c55306ecb235f8cbbe0954433cbe7a029d585d609a08db\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6ad4e761e6f478df27c55306ecb235f8cbbe0954433cbe7a029d585d609a08db\\\",\\\"exitCode\\\
":0,\\\"finishedAt\\\":\\\"2026-03-11T00:56:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T00:56:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqrjx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://96ebe5cbd3477966cb49d08397fee1954529001d0bede507c64ca71c25786a87\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://96ebe5cbd3477966cb49d08397fee1954529001d0bede507c64ca71c25786a87\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T00:56:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T00:56:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqrjx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cdbd8c4d24a706d746c84147c617f511
da67e77e20235bb94b3b365552ef8c23\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cdbd8c4d24a706d746c84147c617f511da67e77e20235bb94b3b365552ef8c23\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T00:56:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T00:56:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqrjx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://81090de2003b39fdaee05a64219b7ebefa57068b2682e821d55ea9615ba67d4b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://81090de2003b39fdaee05a64219b7ebefa57068b2682e821d55ea9615ba67d4b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T00:56:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\"
:\\\"2026-03-11T00:56:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqrjx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T00:56:11Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-sj4cl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T00:56:25Z is after 2025-08-24T17:21:41Z" Mar 11 00:56:25 crc kubenswrapper[4744]: I0311 00:56:25.914137 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-78fcc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6ff04e11-e747-44c5-b049-371a5d422157\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:11Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:11Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8132bd89cd281c66b2735b3fa85c3fc773e7140d6aaeaefe0c0a2c3e3f743080\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T00:56:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fr9zj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e37e770ef43eabf7e0bb162d7e700eaa2d61c7ceca9737de5872c1caeec06774\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T00:56:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fr9zj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://98ba07651ae84a42f92411fee85a15562e325c61ff2d8fcba4962d21c51fce73\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T00:56:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fr9zj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d7cd308cadb08ea56bbc709c6401ab1d3eec24f0414a7ee1c57188c5e60adffe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d20994829
19d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T00:56:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fr9zj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://84d4ada431e7a91cc86b5cb9bba4316c233c5f666f21bce1d8b6df0019b6ad76\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T00:56:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fr9zj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://27bc44383b34aea31ffcebae0f1f464c31918edc0c37f01d6bd66166e34fd4a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cd
d47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T00:56:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fr9zj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9a1996e9bc928fb639a65a4dab19a19d9661eabdf82ddd23a0dfff6fba30a039\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9a1996e9bc928fb639a65a4dab19a19d9661eabdf82ddd23a0dfff6fba30a039\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-11T00:56:24Z\\\",\\\"message\\\":\\\":24.074233 6873 reflector.go:311] Stopping reflector *v1.EgressIP (0s) from 
github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressip/v1/apis/informers/externalversions/factory.go:140\\\\nI0311 00:56:24.074426 6873 reflector.go:311] Stopping reflector *v1alpha1.BaselineAdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI0311 00:56:24.074740 6873 reflector.go:311] Stopping reflector *v1alpha1.AdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI0311 00:56:24.074745 6873 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0311 00:56:24.074779 6873 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0311 00:56:24.074787 6873 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0311 00:56:24.074812 6873 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0311 00:56:24.074847 6873 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0311 00:56:24.074862 6873 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0311 00:56:24.074879 6873 factory.go:656] Stopping watch factory\\\\nI0311 00:56:24.074895 6873 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0311 00:56:24.074907 6873 handler.go:208] Removed *v1.Namespace ev\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-11T00:56:22Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-78fcc_openshift-ovn-kubernetes(6ff04e11-e747-44c5-b049-371a5d422157)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fr9zj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://742e3b6ac0accae72e9285636ab0920db1d3e0625d6093adf1606f57b484b139\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T00:56:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fr9zj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://524c79ee3bb53b187166cbc6bf19cbb3311ab561608a1d3c819288c1fb684481\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://524c79ee3bb53b1871
66cbc6bf19cbb3311ab561608a1d3c819288c1fb684481\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T00:56:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T00:56:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fr9zj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T00:56:11Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-78fcc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T00:56:25Z is after 2025-08-24T17:21:41Z" Mar 11 00:56:25 crc kubenswrapper[4744]: I0311 00:56:25.939150 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:12Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:12Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://80f7b8faf68f817bd772c7988a46ffcc286551806463dc59776e5c6e18deb8f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T00:56:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-11T00:56:25Z is after 2025-08-24T17:21:41Z" Mar 11 00:56:25 crc kubenswrapper[4744]: I0311 00:56:25.960770 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:11Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T00:56:25Z is after 2025-08-24T17:21:41Z" Mar 11 00:56:25 crc kubenswrapper[4744]: I0311 00:56:25.973838 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 11 00:56:25 crc kubenswrapper[4744]: I0311 00:56:25.973839 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 11 00:56:25 crc kubenswrapper[4744]: E0311 00:56:25.974392 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 11 00:56:25 crc kubenswrapper[4744]: E0311 00:56:25.974407 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 11 00:56:25 crc kubenswrapper[4744]: I0311 00:56:25.973900 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 11 00:56:25 crc kubenswrapper[4744]: E0311 00:56:25.975143 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 11 00:56:25 crc kubenswrapper[4744]: I0311 00:56:25.979437 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-rplbw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"92f4c9df-1087-4820-8e07-1120f02df454\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://34eabb0e2af63857d40e20376f7979131fef683885949c1ea75e0a4949968872\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T00:56:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control
-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lkpvs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2bfa60a94390dd23da621dd81523dde8107fe6f3beba45ad1b7a96c0d6cd4f56\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T00:56:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lkpvs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T00:56:11Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-rplbw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T00:56:25Z is after 2025-08-24T17:21:41Z" Mar 11 00:56:26 crc kubenswrapper[4744]: I0311 00:56:26.002618 4744 status_manager.go:875] "Failed to update 
status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e522f1d8-5329-414c-88d5-79e6f3b615be\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:54:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:54:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:54:04Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:54:04Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:54:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://31713fedc28cf71b8df25e22000768f5d00e6de1b228a632aa3c51aea11eea7c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T00:54:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-po
d-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e702350d7b926ec4df6853a20470ea7515f1e1c98720db1a4d672ea4e03e82fa\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T00:54:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2d93a4815010e9ee724175731b03bf82804921c5bbb9fe8be236057f90057f28\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T00:54:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4b2b952a5490bff5028c83fdfb0942e84bac782d470cede57ec6a31f483a407e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"ima
geID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4b2b952a5490bff5028c83fdfb0942e84bac782d470cede57ec6a31f483a407e\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-11T00:55:54Z\\\",\\\"message\\\":\\\"le observer\\\\nW0311 00:55:54.578734 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0311 00:55:54.579118 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0311 00:55:54.580623 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3889625742/tls.crt::/tmp/serving-cert-3889625742/tls.key\\\\\\\"\\\\nI0311 00:55:54.806131 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0311 00:55:54.812332 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0311 00:55:54.812369 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0311 00:55:54.812401 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0311 00:55:54.812411 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0311 00:55:54.819161 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0311 00:55:54.819196 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0311 00:55:54.819211 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0311 00:55:54.819223 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0311 00:55:54.819235 1 
secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0311 00:55:54.819243 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0311 00:55:54.819248 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0311 00:55:54.819254 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0311 00:55:54.821179 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-11T00:55:54Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 1m20s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c9f3a6d71079f1717c07e0b026f34d1bfd6391a5bb6745bcdab535d5290e2aa3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T00:54:07Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://edfc2acc25990d1
ccbca879e186fb9dfc0a6664fa1df5cee62958ff52b217748\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://edfc2acc25990d1ccbca879e186fb9dfc0a6664fa1df5cee62958ff52b217748\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T00:54:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T00:54:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T00:54:04Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T00:56:25Z is after 2025-08-24T17:21:41Z" Mar 11 00:56:26 crc kubenswrapper[4744]: I0311 00:56:26.022673 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:11Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T00:56:26Z is after 2025-08-24T17:21:41Z" Mar 11 00:56:26 crc kubenswrapper[4744]: I0311 00:56:26.042212 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:13Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:13Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://064a29fa0ed4ecddcf11aa3175a3fc935147341c372af46f94eee470a9ba363c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T00:56:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7c4d16fc0af970b13be0d202345567993d215bf342174b1b568c4b4ac26c0017\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T00:56:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T00:56:26Z is after 2025-08-24T17:21:41Z" Mar 11 00:56:26 crc kubenswrapper[4744]: I0311 00:56:26.058359 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-tdnf7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7aeb1578-fe93-4bec-8f43-17d0923fa5c0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:11Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:11Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-th54b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-th54b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T00:56:11Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-tdnf7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T00:56:26Z is after 2025-08-24T17:21:41Z" Mar 11 00:56:26 crc 
kubenswrapper[4744]: I0311 00:56:26.077307 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:11Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T00:56:26Z is after 2025-08-24T17:21:41Z" Mar 11 00:56:26 crc kubenswrapper[4744]: I0311 00:56:26.092944 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-8z8gf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ad8329c6-d511-446f-b617-99778d12b878\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://27268ce5ab229a693eeb80e4dfc38c14f18ad32dc201ace682b5516070c52fde\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T00:56:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7gczk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T00:56:11Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-8z8gf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T00:56:26Z is after 2025-08-24T17:21:41Z" Mar 11 00:56:26 crc kubenswrapper[4744]: I0311 00:56:26.111373 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-678nx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a15dc7ac-7c34-4135-b6eb-a85122800ce9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d3d16f6053ae793fbf194d5cbb556a8603bae74b52c9f07aa937dbfd654e3894\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshif
t-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T00:56:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d9t5q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9d5c92e7037925f24a5b88f538d4822fae94b65ea5f07960f3e465148975af4e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T00:56:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d9t5q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T00:56:11Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-678nx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call 
webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T00:56:26Z is after 2025-08-24T17:21:41Z" Mar 11 00:56:26 crc kubenswrapper[4744]: I0311 00:56:26.134634 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1a87b2ce-541f-40ba-a568-93914e7dea62\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:54:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:54:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:55:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:55:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:54:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://19989b1018d06da0230a62c4c231865e9d16267d0eb570a409da006b246005b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://832be2ac0e5b4b69727f022a0663974a495e008d3a781fa4d237d3347d39c154\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-11T00:54:37Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss 
-Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager -v=2\\\\nI0311 00:54:06.849622 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. Worst graceful lease acquisition is {26s}.\\\\nI0311 00:54:06.852083 1 observer_polling.go:159] Starting file observer\\\\nI0311 00:54:06.910593 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0311 00:54:06.914052 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nF0311 00:54:37.381234 1 cmd.go:179] failed checking apiserver connectivity: Get \\\\\\\"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/openshift-kube-controller-manager/leases/cluster-policy-controller-lock\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T00:54:36Z is after 
2026-02-23T05:33:13Z\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-11T00:54:05Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T00:54:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://eb1e3ef7609aa323e622249594eb4ac5cd5cacfa496d8189b3f612e29cb11773\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T00:54:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://82831ac0a065cf2ea80ca05df203fc6de89ad279faae95364ab2519b689453d8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T00:54:05Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ca4d36508e362c9b7f7365ade7dda3dec0c70f9df1be158a034aeaaa772c797e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T00:54:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T00:54:04Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T00:56:26Z is after 2025-08-24T17:21:41Z" Mar 11 00:56:26 crc kubenswrapper[4744]: I0311 00:56:26.158475 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e8d341ff-c992-4426-acea-a34e4a61fa6f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:54:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:54:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:54:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:54:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:54:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://01519860f21aa252a579dd461cd32233ed13bfb46d851c1c19410e2632ad4389\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T00:54:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0b3b81c668c0cd9ccdc04d171476dbbbe2bca625de84ad8aff48be69bbaa46b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T00:54:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7fba0fd93d322bd73e9140c40efe4dd1469525a5d6d26a0d1e32ba8e0920a2a4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T00:54:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9690b251bd56f0d091b5ae52e71e94528ba6afd0e61919398662fdadd4a9f89f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://9690b251bd56f0d091b5ae52e71e94528ba6afd0e61919398662fdadd4a9f89f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T00:54:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T00:54:05Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T00:54:04Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T00:56:26Z is after 2025-08-24T17:21:41Z" Mar 11 00:56:26 crc kubenswrapper[4744]: I0311 00:56:26.181213 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:16Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:16Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bff1516676048ec93cb3569f8720eb042036b92afc1c65f1c58b15fb604f58b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T00:56:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-11T00:56:26Z is after 2025-08-24T17:21:41Z" Mar 11 00:56:26 crc kubenswrapper[4744]: I0311 00:56:26.974965 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-tdnf7" Mar 11 00:56:26 crc kubenswrapper[4744]: E0311 00:56:26.975766 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-tdnf7" podUID="7aeb1578-fe93-4bec-8f43-17d0923fa5c0" Mar 11 00:56:27 crc kubenswrapper[4744]: I0311 00:56:27.727940 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 11 00:56:27 crc kubenswrapper[4744]: I0311 00:56:27.728142 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 11 00:56:27 crc kubenswrapper[4744]: E0311 00:56:27.728272 4744 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-11 00:56:43.728226709 +0000 UTC m=+160.532444344 (durationBeforeRetry 16s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 11 00:56:27 crc kubenswrapper[4744]: E0311 00:56:27.728356 4744 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 11 00:56:27 crc kubenswrapper[4744]: E0311 00:56:27.728389 4744 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 11 00:56:27 crc kubenswrapper[4744]: E0311 00:56:27.728411 4744 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 11 00:56:27 crc kubenswrapper[4744]: I0311 00:56:27.728410 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 11 00:56:27 crc kubenswrapper[4744]: E0311 00:56:27.728554 4744 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr 
podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-03-11 00:56:43.728476067 +0000 UTC m=+160.532693722 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 11 00:56:27 crc kubenswrapper[4744]: I0311 00:56:27.728640 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 11 00:56:27 crc kubenswrapper[4744]: E0311 00:56:27.728651 4744 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 11 00:56:27 crc kubenswrapper[4744]: E0311 00:56:27.728718 4744 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-11 00:56:43.728703213 +0000 UTC m=+160.532920848 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 11 00:56:27 crc kubenswrapper[4744]: E0311 00:56:27.728767 4744 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Mar 11 00:56:27 crc kubenswrapper[4744]: E0311 00:56:27.728817 4744 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-11 00:56:43.728803746 +0000 UTC m=+160.533021391 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Mar 11 00:56:27 crc kubenswrapper[4744]: I0311 00:56:27.829481 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/7aeb1578-fe93-4bec-8f43-17d0923fa5c0-metrics-certs\") pod \"network-metrics-daemon-tdnf7\" (UID: \"7aeb1578-fe93-4bec-8f43-17d0923fa5c0\") " pod="openshift-multus/network-metrics-daemon-tdnf7" Mar 11 00:56:27 crc kubenswrapper[4744]: E0311 00:56:27.829802 4744 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Mar 11 00:56:27 crc kubenswrapper[4744]: E0311 00:56:27.830217 4744 nestedpendingoperations.go:348] 
Operation for "{volumeName:kubernetes.io/secret/7aeb1578-fe93-4bec-8f43-17d0923fa5c0-metrics-certs podName:7aeb1578-fe93-4bec-8f43-17d0923fa5c0 nodeName:}" failed. No retries permitted until 2026-03-11 00:56:43.830175443 +0000 UTC m=+160.634393078 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/7aeb1578-fe93-4bec-8f43-17d0923fa5c0-metrics-certs") pod "network-metrics-daemon-tdnf7" (UID: "7aeb1578-fe93-4bec-8f43-17d0923fa5c0") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 11 00:56:27 crc kubenswrapper[4744]: I0311 00:56:27.830065 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 11 00:56:27 crc kubenswrapper[4744]: E0311 00:56:27.830691 4744 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 11 00:56:27 crc kubenswrapper[4744]: E0311 00:56:27.830947 4744 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 11 00:56:27 crc kubenswrapper[4744]: E0311 00:56:27.831111 4744 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 11 00:56:27 crc kubenswrapper[4744]: E0311 00:56:27.831398 4744 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-03-11 00:56:43.831361478 +0000 UTC m=+160.635579113 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 11 00:56:27 crc kubenswrapper[4744]: I0311 00:56:27.974540 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 11 00:56:27 crc kubenswrapper[4744]: I0311 00:56:27.974646 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 11 00:56:27 crc kubenswrapper[4744]: I0311 00:56:27.974668 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 11 00:56:27 crc kubenswrapper[4744]: E0311 00:56:27.975270 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 11 00:56:27 crc kubenswrapper[4744]: E0311 00:56:27.975354 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 11 00:56:27 crc kubenswrapper[4744]: E0311 00:56:27.975562 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 11 00:56:28 crc kubenswrapper[4744]: I0311 00:56:28.973761 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-tdnf7" Mar 11 00:56:28 crc kubenswrapper[4744]: E0311 00:56:28.974009 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-tdnf7" podUID="7aeb1578-fe93-4bec-8f43-17d0923fa5c0" Mar 11 00:56:29 crc kubenswrapper[4744]: E0311 00:56:29.079564 4744 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 11 00:56:29 crc kubenswrapper[4744]: I0311 00:56:29.973779 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 11 00:56:29 crc kubenswrapper[4744]: I0311 00:56:29.973918 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 11 00:56:29 crc kubenswrapper[4744]: E0311 00:56:29.973967 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 11 00:56:29 crc kubenswrapper[4744]: I0311 00:56:29.973918 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 11 00:56:29 crc kubenswrapper[4744]: E0311 00:56:29.974102 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 11 00:56:29 crc kubenswrapper[4744]: E0311 00:56:29.974286 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 11 00:56:30 crc kubenswrapper[4744]: I0311 00:56:30.974748 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-tdnf7" Mar 11 00:56:30 crc kubenswrapper[4744]: E0311 00:56:30.975208 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-tdnf7" podUID="7aeb1578-fe93-4bec-8f43-17d0923fa5c0" Mar 11 00:56:30 crc kubenswrapper[4744]: I0311 00:56:30.989235 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-crc"] Mar 11 00:56:31 crc kubenswrapper[4744]: I0311 00:56:31.974183 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 11 00:56:31 crc kubenswrapper[4744]: I0311 00:56:31.974260 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 11 00:56:31 crc kubenswrapper[4744]: I0311 00:56:31.974183 4744 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 11 00:56:31 crc kubenswrapper[4744]: E0311 00:56:31.974399 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 11 00:56:31 crc kubenswrapper[4744]: E0311 00:56:31.974574 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 11 00:56:31 crc kubenswrapper[4744]: E0311 00:56:31.974740 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 11 00:56:32 crc kubenswrapper[4744]: I0311 00:56:32.974576 4744 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-tdnf7" Mar 11 00:56:32 crc kubenswrapper[4744]: E0311 00:56:32.974969 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-tdnf7" podUID="7aeb1578-fe93-4bec-8f43-17d0923fa5c0" Mar 11 00:56:32 crc kubenswrapper[4744]: I0311 00:56:32.975150 4744 scope.go:117] "RemoveContainer" containerID="4b2b952a5490bff5028c83fdfb0942e84bac782d470cede57ec6a31f483a407e" Mar 11 00:56:32 crc kubenswrapper[4744]: E0311 00:56:32.975334 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 1m20s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 11 00:56:32 crc kubenswrapper[4744]: I0311 00:56:32.989241 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 00:56:32 crc kubenswrapper[4744]: I0311 00:56:32.989273 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 00:56:32 crc kubenswrapper[4744]: I0311 00:56:32.989284 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 00:56:32 crc kubenswrapper[4744]: I0311 00:56:32.989298 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 00:56:32 crc kubenswrapper[4744]: I0311 00:56:32.989311 4744 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T00:56:32Z","lastTransitionTime":"2026-03-11T00:56:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 11 00:56:33 crc kubenswrapper[4744]: E0311 00:56:33.009938 4744 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-11T00:56:32Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:32Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-11T00:56:32Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:32Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-11T00:56:32Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:32Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-11T00:56:32Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:32Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"
registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\
"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb617
3ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"reg
istry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@s
ha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"51e320bc-e184-46b0-b151-baf1fef55472\\\",\\\"systemUUID\\\":\\\"b27597af-c36d-4084-a073-6dfdbb017181\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T00:56:33Z is after 2025-08-24T17:21:41Z" Mar 11 00:56:33 crc kubenswrapper[4744]: I0311 00:56:33.015569 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 00:56:33 crc kubenswrapper[4744]: I0311 00:56:33.015640 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 00:56:33 crc kubenswrapper[4744]: I0311 00:56:33.015664 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 00:56:33 crc kubenswrapper[4744]: I0311 00:56:33.015694 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 00:56:33 crc kubenswrapper[4744]: I0311 00:56:33.015718 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T00:56:33Z","lastTransitionTime":"2026-03-11T00:56:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 11 00:56:33 crc kubenswrapper[4744]: E0311 00:56:33.034660 4744 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-11T00:56:33Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:33Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-11T00:56:33Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:33Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-11T00:56:33Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:33Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-11T00:56:33Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:33Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"51e320bc-e184-46b0-b151-baf1fef55472\\\",\\\"systemUUID\\\":\\\"b27597af-c36d-4084-a073-6dfdbb017181\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T00:56:33Z is after 2025-08-24T17:21:41Z" Mar 11 00:56:33 crc kubenswrapper[4744]: I0311 00:56:33.040485 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 00:56:33 crc kubenswrapper[4744]: I0311 00:56:33.040567 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 00:56:33 crc kubenswrapper[4744]: I0311 00:56:33.040584 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 00:56:33 crc kubenswrapper[4744]: I0311 00:56:33.040608 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 00:56:33 crc kubenswrapper[4744]: I0311 00:56:33.040627 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T00:56:33Z","lastTransitionTime":"2026-03-11T00:56:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 11 00:56:33 crc kubenswrapper[4744]: E0311 00:56:33.060388 4744 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-11T00:56:33Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:33Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-11T00:56:33Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:33Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-11T00:56:33Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:33Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-11T00:56:33Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:33Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"51e320bc-e184-46b0-b151-baf1fef55472\\\",\\\"systemUUID\\\":\\\"b27597af-c36d-4084-a073-6dfdbb017181\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T00:56:33Z is after 2025-08-24T17:21:41Z" Mar 11 00:56:33 crc kubenswrapper[4744]: I0311 00:56:33.067944 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 00:56:33 crc kubenswrapper[4744]: I0311 00:56:33.068233 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 00:56:33 crc kubenswrapper[4744]: I0311 00:56:33.068400 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 00:56:33 crc kubenswrapper[4744]: I0311 00:56:33.068623 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 00:56:33 crc kubenswrapper[4744]: I0311 00:56:33.068787 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T00:56:33Z","lastTransitionTime":"2026-03-11T00:56:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 11 00:56:33 crc kubenswrapper[4744]: E0311 00:56:33.092198 4744 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-11T00:56:33Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:33Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-11T00:56:33Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:33Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-11T00:56:33Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:33Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-11T00:56:33Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:33Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"51e320bc-e184-46b0-b151-baf1fef55472\\\",\\\"systemUUID\\\":\\\"b27597af-c36d-4084-a073-6dfdbb017181\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T00:56:33Z is after 2025-08-24T17:21:41Z" Mar 11 00:56:33 crc kubenswrapper[4744]: I0311 00:56:33.098419 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 00:56:33 crc kubenswrapper[4744]: I0311 00:56:33.098488 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 00:56:33 crc kubenswrapper[4744]: I0311 00:56:33.098507 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 00:56:33 crc kubenswrapper[4744]: I0311 00:56:33.098574 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 00:56:33 crc kubenswrapper[4744]: I0311 00:56:33.098615 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T00:56:33Z","lastTransitionTime":"2026-03-11T00:56:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 11 00:56:33 crc kubenswrapper[4744]: E0311 00:56:33.118877 4744 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-11T00:56:33Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:33Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-11T00:56:33Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:33Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-11T00:56:33Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:33Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-11T00:56:33Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:33Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"51e320bc-e184-46b0-b151-baf1fef55472\\\",\\\"systemUUID\\\":\\\"b27597af-c36d-4084-a073-6dfdbb017181\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T00:56:33Z is after 2025-08-24T17:21:41Z" Mar 11 00:56:33 crc kubenswrapper[4744]: E0311 00:56:33.119156 4744 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 11 00:56:33 crc kubenswrapper[4744]: I0311 00:56:33.974747 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 11 00:56:33 crc kubenswrapper[4744]: I0311 00:56:33.974816 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 11 00:56:33 crc kubenswrapper[4744]: E0311 00:56:33.974957 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 11 00:56:33 crc kubenswrapper[4744]: E0311 00:56:33.975101 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 11 00:56:33 crc kubenswrapper[4744]: I0311 00:56:33.975216 4744 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 11 00:56:33 crc kubenswrapper[4744]: E0311 00:56:33.975297 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 11 00:56:33 crc kubenswrapper[4744]: I0311 00:56:33.995036 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-6ghqv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e146399d-0685-4ea8-96e4-c76c9478a23a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9fd8ff476d73855cbb3da64ae999f54bad2263813a70dba1aa61507b53102d42\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":
\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T00:56:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8h2vm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T00:56:11Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-6ghqv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T00:56:33Z is after 2025-08-24T17:21:41Z" Mar 11 00:56:34 crc kubenswrapper[4744]: I0311 00:56:34.019688 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-sj4cl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ac70f681-9bcf-4f8c-a175-ed7d4e9da471\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fafb45aea7f19d6db5ddf62115fb453eebda33f0be934b59e40083cd8387c896\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T00:56:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqrjx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://09583de802dbeffc06642379e30e628cf6541f82d5a11caa554cdf17e408476b\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://09583de802dbeffc06642379e30e628cf6541f82d5a11caa554cdf17e408476b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T00:56:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T00:56:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqrjx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9a370bc02fc3d53e08d44ce24a85d1e5b20f06a4933b61c8fef04aff8cfe6081\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9a370bc02fc3d53e08d44ce24a85d1e5b20f06a4933b61c8fef04aff8cfe6081\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T00:56:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T00:56:14Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqrjx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6ad4e761e6f478df27c55306ecb235f8cbbe0954433cbe7a029d585d609a08db\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6ad4e761e6f478df27c55306ecb235f8cbbe0954433cbe7a029d585d609a08db\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T00:56:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T00:56:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqrjx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://96ebe
5cbd3477966cb49d08397fee1954529001d0bede507c64ca71c25786a87\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://96ebe5cbd3477966cb49d08397fee1954529001d0bede507c64ca71c25786a87\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T00:56:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T00:56:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqrjx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cdbd8c4d24a706d746c84147c617f511da67e77e20235bb94b3b365552ef8c23\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cdbd8c4d24a706d746c84147c617f511da67e77e20235bb94b3b365552ef8c23\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T00:56:18Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-03-11T00:56:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqrjx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://81090de2003b39fdaee05a64219b7ebefa57068b2682e821d55ea9615ba67d4b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://81090de2003b39fdaee05a64219b7ebefa57068b2682e821d55ea9615ba67d4b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T00:56:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T00:56:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqrjx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T00:56:11Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-sj4cl\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T00:56:34Z is after 2025-08-24T17:21:41Z" Mar 11 00:56:34 crc kubenswrapper[4744]: I0311 00:56:34.052041 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-78fcc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6ff04e11-e747-44c5-b049-371a5d422157\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:11Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:11Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8132bd89cd281c66b2735b3fa85c3fc773e7140d6aaeaefe0c0a2c3e3f743080\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T00:56:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fr9zj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e37e770ef43eabf7e0bb162d7e700eaa2d61c7ceca9737de5872c1caeec06774\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T00:56:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fr9zj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://98ba07651ae84a42f92411fee85a15562e325c61ff2d8fcba4962d21c51fce73\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T00:56:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fr9zj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d7cd308cadb08ea56bbc709c6401ab1d3eec24f0414a7ee1c57188c5e60adffe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T00:56:14Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fr9zj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://84d4ada431e7a91cc86b5cb9bba4316c233c5f666f21bce1d8b6df0019b6ad76\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T00:56:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fr9zj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://27bc44383b34aea31ffcebae0f1f464c31918edc0c37f01d6bd66166e34fd4a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T00:56:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fr9zj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9a1996e9bc928fb639a65a4dab19a19d9661eabdf82ddd23a0dfff6fba30a039\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9a1996e9bc928fb639a65a4dab19a19d9661eabdf82ddd23a0dfff6fba30a039\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-11T00:56:24Z\\\",\\\"message\\\":\\\":24.074233 6873 reflector.go:311] Stopping reflector *v1.EgressIP (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressip/v1/apis/informers/externalversions/factory.go:140\\\\nI0311 00:56:24.074426 6873 reflector.go:311] Stopping reflector *v1alpha1.BaselineAdminNetworkPolicy (0s) from 
sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI0311 00:56:24.074740 6873 reflector.go:311] Stopping reflector *v1alpha1.AdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI0311 00:56:24.074745 6873 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0311 00:56:24.074779 6873 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0311 00:56:24.074787 6873 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0311 00:56:24.074812 6873 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0311 00:56:24.074847 6873 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0311 00:56:24.074862 6873 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0311 00:56:24.074879 6873 factory.go:656] Stopping watch factory\\\\nI0311 00:56:24.074895 6873 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0311 00:56:24.074907 6873 handler.go:208] Removed *v1.Namespace ev\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-11T00:56:22Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-78fcc_openshift-ovn-kubernetes(6ff04e11-e747-44c5-b049-371a5d422157)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fr9zj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://742e3b6ac0accae72e9285636ab0920db1d3e0625d6093adf1606f57b484b139\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T00:56:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fr9zj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://524c79ee3bb53b187166cbc6bf19cbb3311ab561608a1d3c819288c1fb684481\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://524c79ee3bb53b1871
66cbc6bf19cbb3311ab561608a1d3c819288c1fb684481\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T00:56:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T00:56:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fr9zj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T00:56:11Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-78fcc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T00:56:34Z is after 2025-08-24T17:21:41Z" Mar 11 00:56:34 crc kubenswrapper[4744]: I0311 00:56:34.074918 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:12Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:12Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://80f7b8faf68f817bd772c7988a46ffcc286551806463dc59776e5c6e18deb8f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T00:56:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-11T00:56:34Z is after 2025-08-24T17:21:41Z" Mar 11 00:56:34 crc kubenswrapper[4744]: E0311 00:56:34.080620 4744 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 11 00:56:34 crc kubenswrapper[4744]: I0311 00:56:34.097092 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:11Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T00:56:34Z is after 2025-08-24T17:21:41Z" Mar 11 00:56:34 crc kubenswrapper[4744]: I0311 00:56:34.118494 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-xlclh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e16bf0f3-533b-4114-89c6-195a85273e98\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3cf2ef2701f6c6e8715d3c41c3d7c4d704d6f03989831a0c3d34e92ab7a6dd50\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T00:56:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pgqmk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T00:56:11Z\\\"}}\" for pod \"openshift-multus\"/\"multus-xlclh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T00:56:34Z is after 2025-08-24T17:21:41Z" Mar 11 00:56:34 crc kubenswrapper[4744]: I0311 00:56:34.140681 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e522f1d8-5329-414c-88d5-79e6f3b615be\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:54:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:54:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:54:04Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:54:04Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:54:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://31713fedc28cf71b8df25e22000768f5d00e6de1b228a632aa3c51aea11eea7c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T00:54:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserv
er\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e702350d7b926ec4df6853a20470ea7515f1e1c98720db1a4d672ea4e03e82fa\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T00:54:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2d93a4815010e9ee724175731b03bf82804921c5bbb9fe8be236057f90057f28\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T00:54:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4b2b952a5490bff5028c83fdfb0942e84bac782d470cede57ec6a31f483a407e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc
276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4b2b952a5490bff5028c83fdfb0942e84bac782d470cede57ec6a31f483a407e\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-11T00:55:54Z\\\",\\\"message\\\":\\\"le observer\\\\nW0311 00:55:54.578734 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0311 00:55:54.579118 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0311 00:55:54.580623 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3889625742/tls.crt::/tmp/serving-cert-3889625742/tls.key\\\\\\\"\\\\nI0311 00:55:54.806131 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0311 00:55:54.812332 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0311 00:55:54.812369 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0311 00:55:54.812401 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0311 00:55:54.812411 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0311 00:55:54.819161 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0311 00:55:54.819196 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0311 00:55:54.819211 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0311 00:55:54.819223 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0311 00:55:54.819235 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0311 
00:55:54.819243 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0311 00:55:54.819248 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0311 00:55:54.819254 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0311 00:55:54.821179 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-11T00:55:54Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 1m20s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c9f3a6d71079f1717c07e0b026f34d1bfd6391a5bb6745bcdab535d5290e2aa3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T00:54:07Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://edfc2acc25990d1ccbca879e186fb9dfc0a6664fa1df5cee62958ff52b217748\\\",\\\"image\\\":\\\"quay.io/openshift-release-d
ev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://edfc2acc25990d1ccbca879e186fb9dfc0a6664fa1df5cee62958ff52b217748\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T00:54:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T00:54:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T00:54:04Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T00:56:34Z is after 2025-08-24T17:21:41Z" Mar 11 00:56:34 crc kubenswrapper[4744]: I0311 00:56:34.159541 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f6d4e3bc-7c73-40d6-9bde-5178882a794d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:54:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:54:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:54:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:54:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:54:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e3db74ff70ae81764297d0f56ae2c56f33d40d9e0025aacba1f56045eae524b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T00:54:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://779c97b306c3647569a5bade5ab4a8ca49f1fa84ded5fa830235ab1d4a039038\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962
a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://779c97b306c3647569a5bade5ab4a8ca49f1fa84ded5fa830235ab1d4a039038\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T00:54:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T00:54:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T00:54:04Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T00:56:34Z is after 2025-08-24T17:21:41Z" Mar 11 00:56:34 crc kubenswrapper[4744]: I0311 00:56:34.180923 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:11Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T00:56:34Z is after 2025-08-24T17:21:41Z" Mar 11 00:56:34 crc kubenswrapper[4744]: I0311 00:56:34.197068 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-rplbw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"92f4c9df-1087-4820-8e07-1120f02df454\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://34eabb0e2af63857d40e20376f7979131fef683885949c1ea75e0a4949968872\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T00:56:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lkpvs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2bfa60a94390dd23da621dd81523dde8107fe
6f3beba45ad1b7a96c0d6cd4f56\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T00:56:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lkpvs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T00:56:11Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-rplbw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T00:56:34Z is after 2025-08-24T17:21:41Z" Mar 11 00:56:34 crc kubenswrapper[4744]: I0311 00:56:34.211720 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:13Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:13Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://064a29fa0ed4ecddcf11aa3175a3fc935147341c372af46f94eee470a9ba363c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T00:56:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7c4d16fc0af970b13be0d202345567993d215bf342174b1b568c4b4ac26c0017\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T00:56:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T00:56:34Z is after 2025-08-24T17:21:41Z" Mar 11 00:56:34 crc kubenswrapper[4744]: I0311 00:56:34.226783 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-tdnf7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7aeb1578-fe93-4bec-8f43-17d0923fa5c0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:11Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:11Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-th54b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-th54b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T00:56:11Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-tdnf7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T00:56:34Z is after 2025-08-24T17:21:41Z" Mar 11 00:56:34 crc 
kubenswrapper[4744]: I0311 00:56:34.241861 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-8z8gf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ad8329c6-d511-446f-b617-99778d12b878\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://27268ce5ab229a693eeb80e4dfc38c14f18ad32dc201ace682b5516070c52fde\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T00:56:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7
gczk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T00:56:11Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-8z8gf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T00:56:34Z is after 2025-08-24T17:21:41Z" Mar 11 00:56:34 crc kubenswrapper[4744]: I0311 00:56:34.258645 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-678nx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a15dc7ac-7c34-4135-b6eb-a85122800ce9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d3d16f6053ae793fbf194d5cbb556a8603bae74b52c9f07aa937dbfd654e3894\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T00:56:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d9t5q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9d5c92e7037925f24a5b88f538d4822fae94b65ea5f07960f3e465148975af4e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T00:56:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d9t5q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T00:56:11Z\\\"}}\" for pod 
\"openshift-machine-config-operator\"/\"machine-config-daemon-678nx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T00:56:34Z is after 2025-08-24T17:21:41Z" Mar 11 00:56:34 crc kubenswrapper[4744]: I0311 00:56:34.276903 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1a87b2ce-541f-40ba-a568-93914e7dea62\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:54:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:54:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:55:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:55:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:54:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://19989b1018d06da0230a62c4c231865e9d16267d0eb570a409da006b246005b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://832be2ac0e5b4b69727f022a0663974a495e008d3a781fa4d
237d3347d39c154\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-11T00:54:37Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager -v=2\\\\nI0311 00:54:06.849622 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. Worst graceful lease acquisition is {26s}.\\\\nI0311 00:54:06.852083 1 observer_polling.go:159] Starting file observer\\\\nI0311 00:54:06.910593 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0311 00:54:06.914052 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nF0311 00:54:37.381234 1 cmd.go:179] failed checking apiserver connectivity: Get \\\\\\\"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/openshift-kube-controller-manager/leases/cluster-policy-controller-lock\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T00:54:36Z is after 
2026-02-23T05:33:13Z\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-11T00:54:05Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T00:54:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://eb1e3ef7609aa323e622249594eb4ac5cd5cacfa496d8189b3f612e29cb11773\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T00:54:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://82831ac0a065cf2ea80ca05df203fc6de89ad279faae95364ab2519b689453d8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T00:54:05Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ca4d36508e362c9b7f7365ade7dda3dec0c70f9df1be158a034aeaaa772c797e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T00:54:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T00:54:04Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T00:56:34Z is after 2025-08-24T17:21:41Z" Mar 11 00:56:34 crc kubenswrapper[4744]: I0311 00:56:34.293954 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e8d341ff-c992-4426-acea-a34e4a61fa6f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:54:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:54:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:54:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:54:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:54:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://01519860f21aa252a579dd461cd32233ed13bfb46d851c1c19410e2632ad4389\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T00:54:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0b3b81c668c0cd9ccdc04d171476dbbbe2bca625de84ad8aff48be69bbaa46b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T00:54:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7fba0fd93d322bd73e9140c40efe4dd1469525a5d6d26a0d1e32ba8e0920a2a4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T00:54:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9690b251bd56f0d091b5ae52e71e94528ba6afd0e61919398662fdadd4a9f89f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://9690b251bd56f0d091b5ae52e71e94528ba6afd0e61919398662fdadd4a9f89f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T00:54:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T00:54:05Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T00:54:04Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T00:56:34Z is after 2025-08-24T17:21:41Z" Mar 11 00:56:34 crc kubenswrapper[4744]: I0311 00:56:34.310474 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:16Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:16Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bff1516676048ec93cb3569f8720eb042036b92afc1c65f1c58b15fb604f58b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T00:56:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-11T00:56:34Z is after 2025-08-24T17:21:41Z" Mar 11 00:56:34 crc kubenswrapper[4744]: I0311 00:56:34.328706 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:11Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T00:56:34Z is after 2025-08-24T17:21:41Z" Mar 11 00:56:34 crc kubenswrapper[4744]: I0311 00:56:34.974801 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-tdnf7" Mar 11 00:56:34 crc kubenswrapper[4744]: E0311 00:56:34.975019 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-tdnf7" podUID="7aeb1578-fe93-4bec-8f43-17d0923fa5c0" Mar 11 00:56:35 crc kubenswrapper[4744]: I0311 00:56:35.977702 4744 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 11 00:56:35 crc kubenswrapper[4744]: I0311 00:56:35.977732 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 11 00:56:35 crc kubenswrapper[4744]: E0311 00:56:35.978956 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 11 00:56:35 crc kubenswrapper[4744]: E0311 00:56:35.979038 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 11 00:56:35 crc kubenswrapper[4744]: I0311 00:56:35.977807 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 11 00:56:35 crc kubenswrapper[4744]: E0311 00:56:35.979163 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 11 00:56:36 crc kubenswrapper[4744]: I0311 00:56:36.973768 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-tdnf7" Mar 11 00:56:36 crc kubenswrapper[4744]: E0311 00:56:36.974669 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-tdnf7" podUID="7aeb1578-fe93-4bec-8f43-17d0923fa5c0" Mar 11 00:56:36 crc kubenswrapper[4744]: I0311 00:56:36.975125 4744 scope.go:117] "RemoveContainer" containerID="9a1996e9bc928fb639a65a4dab19a19d9661eabdf82ddd23a0dfff6fba30a039" Mar 11 00:56:37 crc kubenswrapper[4744]: I0311 00:56:37.862592 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-78fcc_6ff04e11-e747-44c5-b049-371a5d422157/ovnkube-controller/1.log" Mar 11 00:56:37 crc kubenswrapper[4744]: I0311 00:56:37.866842 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-78fcc" event={"ID":"6ff04e11-e747-44c5-b049-371a5d422157","Type":"ContainerStarted","Data":"92b1336c58d3eb24389590807921be9410d32af1b1416d2091b4d515cfd6dfcc"} Mar 11 00:56:37 crc kubenswrapper[4744]: I0311 00:56:37.867590 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-78fcc" Mar 11 00:56:37 crc kubenswrapper[4744]: I0311 00:56:37.893237 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e522f1d8-5329-414c-88d5-79e6f3b615be\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:54:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:54:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:54:04Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:54:04Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:54:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://31713fedc28cf71b8df25e22000768f5d00e6de1b228a632aa3c51aea11eea7c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T00:54:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e702350d7b926ec4df6853a20470ea7515f1e1c98720db1a4d672ea4e03e82fa\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T00:54:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2d93a4815010e9ee724175731b03bf82804921c5bbb9fe8be236057f90057f28\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T00:54:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4b2b952a5490bff5028c83fdfb0942e84bac782d470cede57ec6a31f483a407e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4b2b952a5490bff5028c83fdfb0942e84bac782d470cede57ec6a31f483a407e\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-11T00:55:54Z\\\",\\\"message\\\":\\\"le observer\\\\nW0311 00:55:54.578734 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0311 00:55:54.579118 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0311 00:55:54.580623 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3889625742/tls.crt::/tmp/serving-cert-3889625742/tls.key\\\\\\\"\\\\nI0311 00:55:54.806131 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0311 00:55:54.812332 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0311 00:55:54.812369 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0311 00:55:54.812401 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0311 00:55:54.812411 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0311 00:55:54.819161 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0311 00:55:54.819196 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0311 00:55:54.819211 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0311 00:55:54.819223 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0311 00:55:54.819235 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0311 
00:55:54.819243 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0311 00:55:54.819248 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0311 00:55:54.819254 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0311 00:55:54.821179 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-11T00:55:54Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 1m20s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c9f3a6d71079f1717c07e0b026f34d1bfd6391a5bb6745bcdab535d5290e2aa3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T00:54:07Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://edfc2acc25990d1ccbca879e186fb9dfc0a6664fa1df5cee62958ff52b217748\\\",\\\"image\\\":\\\"quay.io/openshift-release-d
ev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://edfc2acc25990d1ccbca879e186fb9dfc0a6664fa1df5cee62958ff52b217748\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T00:54:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T00:54:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T00:54:04Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T00:56:37Z is after 2025-08-24T17:21:41Z" Mar 11 00:56:37 crc kubenswrapper[4744]: I0311 00:56:37.911128 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f6d4e3bc-7c73-40d6-9bde-5178882a794d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:54:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:54:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:54:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:54:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:54:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e3db74ff70ae81764297d0f56ae2c56f33d40d9e0025aacba1f56045eae524b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T00:54:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://779c97b306c3647569a5bade5ab4a8ca49f1fa84ded5fa830235ab1d4a039038\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962
a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://779c97b306c3647569a5bade5ab4a8ca49f1fa84ded5fa830235ab1d4a039038\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T00:54:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T00:54:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T00:54:04Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T00:56:37Z is after 2025-08-24T17:21:41Z" Mar 11 00:56:37 crc kubenswrapper[4744]: I0311 00:56:37.933116 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:11Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T00:56:37Z is after 2025-08-24T17:21:41Z" Mar 11 00:56:37 crc kubenswrapper[4744]: I0311 00:56:37.954129 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-rplbw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"92f4c9df-1087-4820-8e07-1120f02df454\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://34eabb0e2af63857d40e20376f7979131fef683885949c1ea75e0a4949968872\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T00:56:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lkpvs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2bfa60a94390dd23da621dd81523dde8107fe
6f3beba45ad1b7a96c0d6cd4f56\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T00:56:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lkpvs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T00:56:11Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-rplbw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T00:56:37Z is after 2025-08-24T17:21:41Z" Mar 11 00:56:37 crc kubenswrapper[4744]: I0311 00:56:37.974076 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 11 00:56:37 crc kubenswrapper[4744]: I0311 00:56:37.974169 4744 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 11 00:56:37 crc kubenswrapper[4744]: I0311 00:56:37.974229 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 11 00:56:37 crc kubenswrapper[4744]: E0311 00:56:37.974285 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 11 00:56:37 crc kubenswrapper[4744]: E0311 00:56:37.974499 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 11 00:56:37 crc kubenswrapper[4744]: E0311 00:56:37.974737 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 11 00:56:37 crc kubenswrapper[4744]: I0311 00:56:37.978234 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:13Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:13Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://064a29fa0ed4ecddcf11aa3175a3fc935147341c372af46f94eee470a9ba363c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T00:56:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://
7c4d16fc0af970b13be0d202345567993d215bf342174b1b568c4b4ac26c0017\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T00:56:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T00:56:37Z is after 2025-08-24T17:21:41Z" Mar 11 00:56:38 crc kubenswrapper[4744]: I0311 00:56:38.011132 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-tdnf7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7aeb1578-fe93-4bec-8f43-17d0923fa5c0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:11Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:11Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-th54b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-th54b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T00:56:11Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-tdnf7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T00:56:38Z is after 2025-08-24T17:21:41Z" Mar 11 00:56:38 crc 
kubenswrapper[4744]: I0311 00:56:38.028125 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-678nx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a15dc7ac-7c34-4135-b6eb-a85122800ce9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d3d16f6053ae793fbf194d5cbb556a8603bae74b52c9f07aa937dbfd654e3894\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T00:56:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/se
rviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d9t5q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9d5c92e7037925f24a5b88f538d4822fae94b65ea5f07960f3e465148975af4e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T00:56:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d9t5q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T00:56:11Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-678nx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T00:56:38Z is after 2025-08-24T17:21:41Z" Mar 11 00:56:38 crc kubenswrapper[4744]: I0311 00:56:38.047584 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1a87b2ce-541f-40ba-a568-93914e7dea62\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:54:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:54:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:55:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:55:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:54:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://19989b1018d06da0230a62c4c231865e9d16267d0eb570a409da006b246005b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://832be2ac0e5b4b69727f022a0663974a495e008d3a781fa4d237d3347d39c154\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-11T00:54:37Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager 
-v=2\\\\nI0311 00:54:06.849622 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. Worst graceful lease acquisition is {26s}.\\\\nI0311 00:54:06.852083 1 observer_polling.go:159] Starting file observer\\\\nI0311 00:54:06.910593 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0311 00:54:06.914052 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nF0311 00:54:37.381234 1 cmd.go:179] failed checking apiserver connectivity: Get \\\\\\\"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/openshift-kube-controller-manager/leases/cluster-policy-controller-lock\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T00:54:36Z is after 
2026-02-23T05:33:13Z\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-11T00:54:05Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T00:54:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://eb1e3ef7609aa323e622249594eb4ac5cd5cacfa496d8189b3f612e29cb11773\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T00:54:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://82831ac0a065cf2ea80ca05df203fc6de89ad279faae95364ab2519b689453d8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T00:54:05Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ca4d36508e362c9b7f7365ade7dda3dec0c70f9df1be158a034aeaaa772c797e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T00:54:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T00:54:04Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T00:56:38Z is after 2025-08-24T17:21:41Z" Mar 11 00:56:38 crc kubenswrapper[4744]: I0311 00:56:38.067669 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e8d341ff-c992-4426-acea-a34e4a61fa6f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:54:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:54:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:54:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:54:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:54:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://01519860f21aa252a579dd461cd32233ed13bfb46d851c1c19410e2632ad4389\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T00:54:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0b3b81c668c0cd9ccdc04d171476dbbbe2bca625de84ad8aff48be69bbaa46b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T00:54:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7fba0fd93d322bd73e9140c40efe4dd1469525a5d6d26a0d1e32ba8e0920a2a4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T00:54:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9690b251bd56f0d091b5ae52e71e94528ba6afd0e61919398662fdadd4a9f89f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://9690b251bd56f0d091b5ae52e71e94528ba6afd0e61919398662fdadd4a9f89f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T00:54:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T00:54:05Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T00:54:04Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T00:56:38Z is after 2025-08-24T17:21:41Z" Mar 11 00:56:38 crc kubenswrapper[4744]: I0311 00:56:38.087646 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:16Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:16Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bff1516676048ec93cb3569f8720eb042036b92afc1c65f1c58b15fb604f58b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T00:56:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-11T00:56:38Z is after 2025-08-24T17:21:41Z" Mar 11 00:56:38 crc kubenswrapper[4744]: I0311 00:56:38.109650 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:11Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T00:56:38Z is after 2025-08-24T17:21:41Z" Mar 11 00:56:38 crc kubenswrapper[4744]: I0311 00:56:38.128717 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-8z8gf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ad8329c6-d511-446f-b617-99778d12b878\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://27268ce5ab229a693eeb80e4dfc38c14f18ad32dc201ace682b5516070c52fde\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T00:56:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7gczk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T00:56:11Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-8z8gf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T00:56:38Z is after 2025-08-24T17:21:41Z" Mar 11 00:56:38 crc kubenswrapper[4744]: I0311 00:56:38.154097 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-sj4cl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ac70f681-9bcf-4f8c-a175-ed7d4e9da471\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fafb45aea7f19d6db5ddf62115fb453eebda33f0be934b59e40083cd8387c896\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release
-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T00:56:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqrjx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://09583de802dbeffc06642379e30e628cf6541f82d5a11caa554cdf17e408476b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://09583de802dbeffc06642379e30e628cf6541f82d5a11caa554cdf17e408476b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T00:56:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T00:56:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqrjx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9a370bc02fc3
d53e08d44ce24a85d1e5b20f06a4933b61c8fef04aff8cfe6081\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9a370bc02fc3d53e08d44ce24a85d1e5b20f06a4933b61c8fef04aff8cfe6081\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T00:56:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T00:56:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqrjx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6ad4e761e6f478df27c55306ecb235f8cbbe0954433cbe7a029d585d609a08db\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6ad4e761e6f4
78df27c55306ecb235f8cbbe0954433cbe7a029d585d609a08db\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T00:56:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T00:56:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqrjx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://96ebe5cbd3477966cb49d08397fee1954529001d0bede507c64ca71c25786a87\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://96ebe5cbd3477966cb49d08397fee1954529001d0bede507c64ca71c25786a87\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T00:56:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T00:56:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqrjx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\
\\"}]},{\\\"containerID\\\":\\\"cri-o://cdbd8c4d24a706d746c84147c617f511da67e77e20235bb94b3b365552ef8c23\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cdbd8c4d24a706d746c84147c617f511da67e77e20235bb94b3b365552ef8c23\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T00:56:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T00:56:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqrjx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://81090de2003b39fdaee05a64219b7ebefa57068b2682e821d55ea9615ba67d4b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://81090de2003b39fdaee05a64219b7ebefa57068b2682e821d55ea9615ba67d4b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"202
6-03-11T00:56:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T00:56:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqrjx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T00:56:11Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-sj4cl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T00:56:38Z is after 2025-08-24T17:21:41Z" Mar 11 00:56:38 crc kubenswrapper[4744]: I0311 00:56:38.187601 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-78fcc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6ff04e11-e747-44c5-b049-371a5d422157\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:11Z\\\",\\\"message\\\":\\\"containers with 
unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:11Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8132bd89cd281c66b2735b3fa85c3fc773e7140d6aaeaefe0c0a2c3e3f743080\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T00:56:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fr9zj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e37e770ef43eabf7e0bb162d7e700eaa2d61c7ceca9737de5872c1caeec06774\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\
\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T00:56:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fr9zj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://98ba07651ae84a42f92411fee85a15562e325c61ff2d8fcba4962d21c51fce73\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T00:56:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fr9zj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d7cd308cadb08ea56bbc709c6401ab1d3eec24f0414a7ee1c57188c5e60adffe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd
47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T00:56:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fr9zj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://84d4ada431e7a91cc86b5cb9bba4316c233c5f666f21bce1d8b6df0019b6ad76\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T00:56:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fr9zj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://27bc44383b34aea31ffcebae0f1f464c31918edc0c37f01d6bd66166e34fd4a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev
@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T00:56:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fr9zj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://92b1336c58d3eb24389590807921be9410d32af1b1416d2091b4d515cfd6dfcc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9a1996e9bc928fb639a65a4dab19a19d9661eabdf82ddd23a0dfff6fba30a039\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-11T00:56:24Z\\\",\\\"message\\\":\\\":24.074233 6873 reflector.go:311] Stopping reflector *v1.EgressIP (0s) 
from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressip/v1/apis/informers/externalversions/factory.go:140\\\\nI0311 00:56:24.074426 6873 reflector.go:311] Stopping reflector *v1alpha1.BaselineAdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI0311 00:56:24.074740 6873 reflector.go:311] Stopping reflector *v1alpha1.AdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI0311 00:56:24.074745 6873 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0311 00:56:24.074779 6873 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0311 00:56:24.074787 6873 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0311 00:56:24.074812 6873 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0311 00:56:24.074847 6873 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0311 00:56:24.074862 6873 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0311 00:56:24.074879 6873 factory.go:656] Stopping watch factory\\\\nI0311 00:56:24.074895 6873 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0311 00:56:24.074907 6873 handler.go:208] Removed *v1.Namespace 
ev\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-11T00:56:22Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T00:56:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\
":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fr9zj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://742e3b6ac0accae72e9285636ab0920db1d3e0625d6093adf1606f57b484b139\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T00:56:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fr9zj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://524c79ee3bb53b187166cbc6bf19cbb3311ab561608a1d3c819288c1fb684481\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\
\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://524c79ee3bb53b187166cbc6bf19cbb3311ab561608a1d3c819288c1fb684481\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T00:56:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T00:56:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fr9zj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T00:56:11Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-78fcc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T00:56:38Z is after 2025-08-24T17:21:41Z" Mar 11 00:56:38 crc kubenswrapper[4744]: I0311 00:56:38.203787 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:12Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:12Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://80f7b8faf68f817bd772c7988a46ffcc286551806463dc59776e5c6e18deb8f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T00:56:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-11T00:56:38Z is after 2025-08-24T17:21:41Z" Mar 11 00:56:38 crc kubenswrapper[4744]: I0311 00:56:38.217345 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:11Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T00:56:38Z is after 2025-08-24T17:21:41Z" Mar 11 00:56:38 crc kubenswrapper[4744]: I0311 00:56:38.230612 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-xlclh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e16bf0f3-533b-4114-89c6-195a85273e98\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3cf2ef2701f6c6e8715d3c41c3d7c4d704d6f03989831a0c3d34e92ab7a6dd50\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T00:56:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pgqmk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T00:56:11Z\\\"}}\" for pod \"openshift-multus\"/\"multus-xlclh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T00:56:38Z is after 2025-08-24T17:21:41Z" Mar 11 00:56:38 crc kubenswrapper[4744]: I0311 00:56:38.243395 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-6ghqv" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e146399d-0685-4ea8-96e4-c76c9478a23a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9fd8ff476d73855cbb3da64ae999f54bad2263813a70dba1aa61507b53102d42\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T00:56:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8h2vm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T00:56:11Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-6ghqv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T00:56:38Z is after 2025-08-24T17:21:41Z" Mar 11 00:56:38 crc kubenswrapper[4744]: I0311 00:56:38.875397 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-78fcc_6ff04e11-e747-44c5-b049-371a5d422157/ovnkube-controller/2.log" Mar 11 00:56:38 crc kubenswrapper[4744]: I0311 00:56:38.876470 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-78fcc_6ff04e11-e747-44c5-b049-371a5d422157/ovnkube-controller/1.log" Mar 11 00:56:38 crc kubenswrapper[4744]: I0311 00:56:38.880922 4744 generic.go:334] "Generic (PLEG): container finished" podID="6ff04e11-e747-44c5-b049-371a5d422157" containerID="92b1336c58d3eb24389590807921be9410d32af1b1416d2091b4d515cfd6dfcc" exitCode=1 Mar 11 00:56:38 crc kubenswrapper[4744]: I0311 00:56:38.881001 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-78fcc" event={"ID":"6ff04e11-e747-44c5-b049-371a5d422157","Type":"ContainerDied","Data":"92b1336c58d3eb24389590807921be9410d32af1b1416d2091b4d515cfd6dfcc"} Mar 11 00:56:38 crc kubenswrapper[4744]: I0311 00:56:38.881074 4744 scope.go:117] "RemoveContainer" containerID="9a1996e9bc928fb639a65a4dab19a19d9661eabdf82ddd23a0dfff6fba30a039" Mar 11 00:56:38 crc kubenswrapper[4744]: I0311 00:56:38.882625 4744 scope.go:117] "RemoveContainer" containerID="92b1336c58d3eb24389590807921be9410d32af1b1416d2091b4d515cfd6dfcc" Mar 11 00:56:38 crc kubenswrapper[4744]: E0311 00:56:38.883256 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to 
\"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-78fcc_openshift-ovn-kubernetes(6ff04e11-e747-44c5-b049-371a5d422157)\"" pod="openshift-ovn-kubernetes/ovnkube-node-78fcc" podUID="6ff04e11-e747-44c5-b049-371a5d422157" Mar 11 00:56:38 crc kubenswrapper[4744]: I0311 00:56:38.901211 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f6d4e3bc-7c73-40d6-9bde-5178882a794d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:54:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:54:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:54:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:54:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:54:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e3db74ff70ae81764297d0f56ae2c56f33d40d9e0025aacba1f56045eae524b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T00:
54:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://779c97b306c3647569a5bade5ab4a8ca49f1fa84ded5fa830235ab1d4a039038\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://779c97b306c3647569a5bade5ab4a8ca49f1fa84ded5fa830235ab1d4a039038\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T00:54:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T00:54:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T00:54:04Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T00:56:38Z is after 2025-08-24T17:21:41Z" Mar 11 00:56:38 crc kubenswrapper[4744]: I0311 00:56:38.921466 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:11Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T00:56:38Z is after 2025-08-24T17:21:41Z" Mar 11 00:56:38 crc kubenswrapper[4744]: I0311 00:56:38.940394 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-rplbw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"92f4c9df-1087-4820-8e07-1120f02df454\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://34eabb0e2af63857d40e20376f7979131fef683885949c1ea75e0a4949968872\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T00:56:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lkpvs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2bfa60a94390dd23da621dd81523dde8107fe
6f3beba45ad1b7a96c0d6cd4f56\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T00:56:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lkpvs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T00:56:11Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-rplbw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T00:56:38Z is after 2025-08-24T17:21:41Z" Mar 11 00:56:38 crc kubenswrapper[4744]: I0311 00:56:38.959491 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e522f1d8-5329-414c-88d5-79e6f3b615be\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:54:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:54:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:54:04Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:54:04Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:54:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://31713fedc28cf71b8df25e22000768f5d00e6de1b228a632aa3c51aea11eea7c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T00:54:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e702350d7b926ec4df6853a20470ea7515f1e1c98720db1a4d672ea4e03e82fa\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T00:54:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2d93a4815010e9ee724175731b03bf82804921c5bbb9fe8be236057f90057f28\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T00:54:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4b2b952a5490bff5028c83fdfb0942e84bac782d470cede57ec6a31f483a407e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4b2b952a5490bff5028c83fdfb0942e84bac782d470cede57ec6a31f483a407e\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-11T00:55:54Z\\\",\\\"message\\\":\\\"le observer\\\\nW0311 00:55:54.578734 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0311 00:55:54.579118 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0311 00:55:54.580623 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3889625742/tls.crt::/tmp/serving-cert-3889625742/tls.key\\\\\\\"\\\\nI0311 00:55:54.806131 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0311 00:55:54.812332 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0311 00:55:54.812369 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0311 00:55:54.812401 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0311 00:55:54.812411 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0311 00:55:54.819161 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0311 00:55:54.819196 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0311 00:55:54.819211 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0311 00:55:54.819223 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0311 00:55:54.819235 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0311 
00:55:54.819243 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0311 00:55:54.819248 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0311 00:55:54.819254 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0311 00:55:54.821179 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-11T00:55:54Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 1m20s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c9f3a6d71079f1717c07e0b026f34d1bfd6391a5bb6745bcdab535d5290e2aa3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T00:54:07Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://edfc2acc25990d1ccbca879e186fb9dfc0a6664fa1df5cee62958ff52b217748\\\",\\\"image\\\":\\\"quay.io/openshift-release-d
ev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://edfc2acc25990d1ccbca879e186fb9dfc0a6664fa1df5cee62958ff52b217748\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T00:54:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T00:54:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T00:54:04Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T00:56:38Z is after 2025-08-24T17:21:41Z" Mar 11 00:56:38 crc kubenswrapper[4744]: I0311 00:56:38.974382 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-tdnf7" Mar 11 00:56:38 crc kubenswrapper[4744]: E0311 00:56:38.974600 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-tdnf7" podUID="7aeb1578-fe93-4bec-8f43-17d0923fa5c0" Mar 11 00:56:38 crc kubenswrapper[4744]: I0311 00:56:38.977760 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-tdnf7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7aeb1578-fe93-4bec-8f43-17d0923fa5c0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:11Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:11Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-th54b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-th54b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T00:56:11Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-tdnf7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T00:56:38Z is after 2025-08-24T17:21:41Z" Mar 11 00:56:38 crc 
kubenswrapper[4744]: I0311 00:56:38.998878 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:13Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:13Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://064a29fa0ed4ecddcf11aa3175a3fc935147341c372af46f94eee470a9ba363c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T00:56:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7c4d16fc0af970b13be0d202345567993d215bf342174b1b568c4b4ac26c0017\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art
-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T00:56:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T00:56:38Z is after 2025-08-24T17:21:41Z" Mar 11 00:56:39 crc kubenswrapper[4744]: I0311 00:56:39.017479 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e8d341ff-c992-4426-acea-a34e4a61fa6f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:54:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:54:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:54:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:54:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:54:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://01519860f21aa252a579dd461cd32233ed13bfb46d851c1c19410e2632ad4389\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T00:54:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0b3b81c668c0cd9ccdc04d171476dbbbe2bca625de84ad8aff48be69bbaa46b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T00:54:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7fba0fd93d322bd73e9140c40efe4dd1469525a5d6d26a0d1e32ba8e0920a2a4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T00:54:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9690b251bd56f0d091b5ae52e71e94528ba6afd0e61919398662fdadd4a9f89f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://9690b251bd56f0d091b5ae52e71e94528ba6afd0e61919398662fdadd4a9f89f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T00:54:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T00:54:05Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T00:54:04Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T00:56:39Z is after 2025-08-24T17:21:41Z" Mar 11 00:56:39 crc kubenswrapper[4744]: I0311 00:56:39.036778 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:16Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:16Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bff1516676048ec93cb3569f8720eb042036b92afc1c65f1c58b15fb604f58b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T00:56:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-11T00:56:39Z is after 2025-08-24T17:21:41Z" Mar 11 00:56:39 crc kubenswrapper[4744]: I0311 00:56:39.056829 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:11Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T00:56:39Z is after 2025-08-24T17:21:41Z" Mar 11 00:56:39 crc kubenswrapper[4744]: I0311 00:56:39.072770 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-8z8gf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ad8329c6-d511-446f-b617-99778d12b878\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://27268ce5ab229a693eeb80e4dfc38c14f18ad32dc201ace682b5516070c52fde\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T00:56:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7gczk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T00:56:11Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-8z8gf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T00:56:39Z is after 2025-08-24T17:21:41Z" Mar 11 00:56:39 crc kubenswrapper[4744]: E0311 00:56:39.082788 4744 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 11 00:56:39 crc kubenswrapper[4744]: I0311 00:56:39.092465 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-678nx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a15dc7ac-7c34-4135-b6eb-a85122800ce9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d3d16f6053ae793fbf194d5cbb556a8603bae74b52c9f07aa937dbfd654e3894\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T00:56:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d9t5q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9d5c92e7037925f24a5b88f538d4822fae94b65e
a5f07960f3e465148975af4e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T00:56:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d9t5q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T00:56:11Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-678nx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T00:56:39Z is after 2025-08-24T17:21:41Z" Mar 11 00:56:39 crc kubenswrapper[4744]: I0311 00:56:39.113803 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1a87b2ce-541f-40ba-a568-93914e7dea62\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:54:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:54:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:55:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:55:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:54:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://19989b1018d06da0230a62c4c231865e9d16267d0eb570a409da006b246005b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://832be2ac0e5b4b69727f022a0663974a495e008d3a781fa4d237d3347d39c154\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-11T00:54:37Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager 
-v=2\\\\nI0311 00:54:06.849622 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. Worst graceful lease acquisition is {26s}.\\\\nI0311 00:54:06.852083 1 observer_polling.go:159] Starting file observer\\\\nI0311 00:54:06.910593 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0311 00:54:06.914052 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nF0311 00:54:37.381234 1 cmd.go:179] failed checking apiserver connectivity: Get \\\\\\\"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/openshift-kube-controller-manager/leases/cluster-policy-controller-lock\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T00:54:36Z is after 
2026-02-23T05:33:13Z\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-11T00:54:05Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T00:54:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://eb1e3ef7609aa323e622249594eb4ac5cd5cacfa496d8189b3f612e29cb11773\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T00:54:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://82831ac0a065cf2ea80ca05df203fc6de89ad279faae95364ab2519b689453d8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T00:54:05Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ca4d36508e362c9b7f7365ade7dda3dec0c70f9df1be158a034aeaaa772c797e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T00:54:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T00:54:04Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T00:56:39Z is after 2025-08-24T17:21:41Z" Mar 11 00:56:39 crc kubenswrapper[4744]: I0311 00:56:39.137428 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:12Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:12Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://80f7b8faf68f817bd772c7988a46ffcc286551806463dc59776e5c6e18deb8f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T00:56:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-11T00:56:39Z is after 2025-08-24T17:21:41Z" Mar 11 00:56:39 crc kubenswrapper[4744]: I0311 00:56:39.157430 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:11Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T00:56:39Z is after 2025-08-24T17:21:41Z" Mar 11 00:56:39 crc kubenswrapper[4744]: I0311 00:56:39.177289 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-xlclh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e16bf0f3-533b-4114-89c6-195a85273e98\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3cf2ef2701f6c6e8715d3c41c3d7c4d704d6f03989831a0c3d34e92ab7a6dd50\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T00:56:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pgqmk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T00:56:11Z\\\"}}\" for pod \"openshift-multus\"/\"multus-xlclh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T00:56:39Z is after 2025-08-24T17:21:41Z" Mar 11 00:56:39 crc kubenswrapper[4744]: I0311 00:56:39.192375 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-6ghqv" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e146399d-0685-4ea8-96e4-c76c9478a23a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9fd8ff476d73855cbb3da64ae999f54bad2263813a70dba1aa61507b53102d42\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T00:56:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8h2vm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T00:56:11Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-6ghqv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T00:56:39Z is after 2025-08-24T17:21:41Z" Mar 11 00:56:39 crc kubenswrapper[4744]: I0311 00:56:39.227679 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-sj4cl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ac70f681-9bcf-4f8c-a175-ed7d4e9da471\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fafb45aea7f19d6db5ddf62115fb453eebda33f0be934b59e40083cd8387c896\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97
f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T00:56:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqrjx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://09583de802dbeffc06642379e30e628cf6541f82d5a11caa554cdf17e408476b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://09583de802dbeffc06642379e30e628cf6541f82d5a11caa554cdf17e408476b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T00:56:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T00:56:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqrjx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9a370bc02fc3d53e08d44ce24a85d1e5b20f06a4933b61c8fef04aff8cfe6081\\\",\\\"image\\\":\
\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9a370bc02fc3d53e08d44ce24a85d1e5b20f06a4933b61c8fef04aff8cfe6081\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T00:56:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T00:56:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqrjx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6ad4e761e6f478df27c55306ecb235f8cbbe0954433cbe7a029d585d609a08db\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6ad4e761e6f478df27c55306ecb235f8cbbe0954433cbe7a029d585d609a08db\\\",\\\"exitCode\\\
":0,\\\"finishedAt\\\":\\\"2026-03-11T00:56:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T00:56:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqrjx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://96ebe5cbd3477966cb49d08397fee1954529001d0bede507c64ca71c25786a87\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://96ebe5cbd3477966cb49d08397fee1954529001d0bede507c64ca71c25786a87\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T00:56:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T00:56:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqrjx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cdbd8c4d24a706d746c84147c617f511
da67e77e20235bb94b3b365552ef8c23\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cdbd8c4d24a706d746c84147c617f511da67e77e20235bb94b3b365552ef8c23\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T00:56:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T00:56:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqrjx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://81090de2003b39fdaee05a64219b7ebefa57068b2682e821d55ea9615ba67d4b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://81090de2003b39fdaee05a64219b7ebefa57068b2682e821d55ea9615ba67d4b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T00:56:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\"
:\\\"2026-03-11T00:56:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqrjx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T00:56:11Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-sj4cl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T00:56:39Z is after 2025-08-24T17:21:41Z" Mar 11 00:56:39 crc kubenswrapper[4744]: I0311 00:56:39.257400 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-78fcc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6ff04e11-e747-44c5-b049-371a5d422157\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:11Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:11Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8132bd89cd281c66b2735b3fa85c3fc773e7140d6aaeaefe0c0a2c3e3f743080\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T00:56:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fr9zj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e37e770ef43eabf7e0bb162d7e700eaa2d61c7ceca9737de5872c1caeec06774\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T00:56:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fr9zj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://98ba07651ae84a42f92411fee85a15562e325c61ff2d8fcba4962d21c51fce73\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T00:56:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fr9zj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d7cd308cadb08ea56bbc709c6401ab1d3eec24f0414a7ee1c57188c5e60adffe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d20994829
19d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T00:56:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fr9zj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://84d4ada431e7a91cc86b5cb9bba4316c233c5f666f21bce1d8b6df0019b6ad76\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T00:56:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fr9zj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://27bc44383b34aea31ffcebae0f1f464c31918edc0c37f01d6bd66166e34fd4a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cd
d47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T00:56:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fr9zj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://92b1336c58d3eb24389590807921be9410d32af1b1416d2091b4d515cfd6dfcc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9a1996e9bc928fb639a65a4dab19a19d9661eabdf82ddd23a0dfff6fba30a039\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-11T00:56:24Z\\\",\\\"message\\\":\\\":24.074233 6873 reflector.go:311] Stopping reflector *v1.EgressIP (0s) from 
github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressip/v1/apis/informers/externalversions/factory.go:140\\\\nI0311 00:56:24.074426 6873 reflector.go:311] Stopping reflector *v1alpha1.BaselineAdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI0311 00:56:24.074740 6873 reflector.go:311] Stopping reflector *v1alpha1.AdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI0311 00:56:24.074745 6873 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0311 00:56:24.074779 6873 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0311 00:56:24.074787 6873 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0311 00:56:24.074812 6873 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0311 00:56:24.074847 6873 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0311 00:56:24.074862 6873 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0311 00:56:24.074879 6873 factory.go:656] Stopping watch factory\\\\nI0311 00:56:24.074895 6873 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0311 00:56:24.074907 6873 handler.go:208] Removed *v1.Namespace ev\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-11T00:56:22Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://92b1336c58d3eb24389590807921be9410d32af1b1416d2091b4d515cfd6dfcc\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-11T00:56:38Z\\\",\\\"message\\\":\\\"hift-dns/dns-default_UDP_node_router+switch_crc options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[udp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.4.10:53:]}] Rows:[] Columns:[] Mutations:[] 
Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {4c1be812-05d3-4f45-91b5-a853a5c8de71}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0311 00:56:37.981791 7053 services_controller.go:356] Processing sync for service default/openshift for network=default\\\\nI0311 00:56:37.982673 7053 services_controller.go:360] Finished syncing service openshift on namespace default for network=default : 880.837µs\\\\nI0311 00:56:37.982410 7053 factory.go:1336] Added *v1.EgressIP event handler 8\\\\nI0311 00:56:37.982645 7053 services_controller.go:360] Finished syncing service kube-controller-manager on namespace openshift-kube-controller-manager for network=default : 4.472262ms\\\\nI0311 00:56:37.983070 7053 factory.go:1336] Added *v1.EgressFirewall event handler 9\\\\nI0311 00:56:37.983162 7053 controller.go:132] Adding controller ef_node_controller event handlers\\\\nI0311 00:56:37.983204 7053 ovnkube.go:599] Stopped ovnkube\\\\nI0311 00:56:37.983237 7053 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF0311 00:56:37.983343 7053 
ovnkube.go:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-11T00:56:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\
\\":\\\"kube-api-access-fr9zj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://742e3b6ac0accae72e9285636ab0920db1d3e0625d6093adf1606f57b484b139\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T00:56:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fr9zj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://524c79ee3bb53b187166cbc6bf19cbb3311ab561608a1d3c819288c1fb684481\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://524c79ee3bb53b187166cbc6bf19cbb3311ab561608a1d3c819288c1
fb684481\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T00:56:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T00:56:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fr9zj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T00:56:11Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-78fcc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T00:56:39Z is after 2025-08-24T17:21:41Z" Mar 11 00:56:39 crc kubenswrapper[4744]: I0311 00:56:39.889595 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-78fcc_6ff04e11-e747-44c5-b049-371a5d422157/ovnkube-controller/2.log" Mar 11 00:56:39 crc kubenswrapper[4744]: I0311 00:56:39.896114 4744 scope.go:117] "RemoveContainer" containerID="92b1336c58d3eb24389590807921be9410d32af1b1416d2091b4d515cfd6dfcc" Mar 11 00:56:39 crc kubenswrapper[4744]: E0311 00:56:39.896397 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-78fcc_openshift-ovn-kubernetes(6ff04e11-e747-44c5-b049-371a5d422157)\"" pod="openshift-ovn-kubernetes/ovnkube-node-78fcc" podUID="6ff04e11-e747-44c5-b049-371a5d422157" Mar 11 00:56:39 crc kubenswrapper[4744]: I0311 00:56:39.921474 4744 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e522f1d8-5329-414c-88d5-79e6f3b615be\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:54:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:54:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:54:04Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:54:04Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:54:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://31713fedc28cf71b8df25e22000768f5d00e6de1b228a632aa3c51aea11eea7c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T00:54:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"
name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e702350d7b926ec4df6853a20470ea7515f1e1c98720db1a4d672ea4e03e82fa\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T00:54:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2d93a4815010e9ee724175731b03bf82804921c5bbb9fe8be236057f90057f28\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T00:54:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4b2b952a5490bff5028c83fdfb0942e84bac782d470cede57ec6a31f483a407e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"qua
y.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4b2b952a5490bff5028c83fdfb0942e84bac782d470cede57ec6a31f483a407e\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-11T00:55:54Z\\\",\\\"message\\\":\\\"le observer\\\\nW0311 00:55:54.578734 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0311 00:55:54.579118 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0311 00:55:54.580623 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3889625742/tls.crt::/tmp/serving-cert-3889625742/tls.key\\\\\\\"\\\\nI0311 00:55:54.806131 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0311 00:55:54.812332 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0311 00:55:54.812369 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0311 00:55:54.812401 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0311 00:55:54.812411 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0311 00:55:54.819161 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0311 00:55:54.819196 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0311 00:55:54.819211 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0311 00:55:54.819223 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0311 00:55:54.819235 1 secure_serving.go:69] Use of 
insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0311 00:55:54.819243 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0311 00:55:54.819248 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0311 00:55:54.819254 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0311 00:55:54.821179 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-11T00:55:54Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 1m20s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c9f3a6d71079f1717c07e0b026f34d1bfd6391a5bb6745bcdab535d5290e2aa3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T00:54:07Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://edfc2acc25990d1ccbca879e186fb9dfc0a6664fa1df
5cee62958ff52b217748\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://edfc2acc25990d1ccbca879e186fb9dfc0a6664fa1df5cee62958ff52b217748\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T00:54:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T00:54:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T00:54:04Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T00:56:39Z is after 2025-08-24T17:21:41Z" Mar 11 00:56:39 crc kubenswrapper[4744]: I0311 00:56:39.939784 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f6d4e3bc-7c73-40d6-9bde-5178882a794d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:54:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:54:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:54:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:54:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:54:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e3db74ff70ae81764297d0f56ae2c56f33d40d9e0025aacba1f56045eae524b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T00:54:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://779c97b306c3647569a5bade5ab4a8ca49f1fa84ded5fa830235ab1d4a039038\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962
a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://779c97b306c3647569a5bade5ab4a8ca49f1fa84ded5fa830235ab1d4a039038\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T00:54:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T00:54:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T00:54:04Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T00:56:39Z is after 2025-08-24T17:21:41Z" Mar 11 00:56:39 crc kubenswrapper[4744]: I0311 00:56:39.958337 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:11Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T00:56:39Z is after 2025-08-24T17:21:41Z" Mar 11 00:56:39 crc kubenswrapper[4744]: I0311 00:56:39.974826 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 11 00:56:39 crc kubenswrapper[4744]: I0311 00:56:39.974868 4744 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 11 00:56:39 crc kubenswrapper[4744]: I0311 00:56:39.974827 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 11 00:56:39 crc kubenswrapper[4744]: E0311 00:56:39.975023 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 11 00:56:39 crc kubenswrapper[4744]: E0311 00:56:39.975114 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 11 00:56:39 crc kubenswrapper[4744]: E0311 00:56:39.975328 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 11 00:56:39 crc kubenswrapper[4744]: I0311 00:56:39.975910 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-rplbw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"92f4c9df-1087-4820-8e07-1120f02df454\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://34eabb0e2af63857d40e20376f7979131fef683885949c1ea75e0a4949968872\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T00:56:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-
plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lkpvs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2bfa60a94390dd23da621dd81523dde8107fe6f3beba45ad1b7a96c0d6cd4f56\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T00:56:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lkpvs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T00:56:11Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-rplbw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T00:56:39Z is after 2025-08-24T17:21:41Z" Mar 11 00:56:39 crc kubenswrapper[4744]: I0311 00:56:39.996618 4744 status_manager.go:875] "Failed to update 
status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:13Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:13Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://064a29fa0ed4ecddcf11aa3175a3fc935147341c372af46f94eee470a9ba363c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T00:56:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7c4d16fc0af970b13be0d202345567993d215bf342174b1b568c4b4ac26c0017\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imag
eID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T00:56:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T00:56:39Z is after 2025-08-24T17:21:41Z" Mar 11 00:56:40 crc kubenswrapper[4744]: I0311 00:56:40.013096 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-tdnf7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7aeb1578-fe93-4bec-8f43-17d0923fa5c0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:11Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:11Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-th54b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-th54b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T00:56:11Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-tdnf7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T00:56:40Z is after 2025-08-24T17:21:41Z" Mar 11 00:56:40 crc 
kubenswrapper[4744]: I0311 00:56:40.032230 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1a87b2ce-541f-40ba-a568-93914e7dea62\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:54:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:54:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:55:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:55:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:54:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://19989b1018d06da0230a62c4c231865e9d16267d0eb570a409da006b246005b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://832be2ac0e5b4b69727f022a0663974a495e008d3a781fa4d237d3347d39c154\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-11T00:54:37Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start 
--config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager -v=2\\\\nI0311 00:54:06.849622 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. Worst graceful lease acquisition is {26s}.\\\\nI0311 00:54:06.852083 1 observer_polling.go:159] Starting file observer\\\\nI0311 00:54:06.910593 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0311 00:54:06.914052 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nF0311 00:54:37.381234 1 cmd.go:179] failed checking apiserver connectivity: Get \\\\\\\"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/openshift-kube-controller-manager/leases/cluster-policy-controller-lock\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T00:54:36Z is after 
2026-02-23T05:33:13Z\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-11T00:54:05Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T00:54:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://eb1e3ef7609aa323e622249594eb4ac5cd5cacfa496d8189b3f612e29cb11773\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T00:54:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://82831ac0a065cf2ea80ca05df203fc6de89ad279faae95364ab2519b689453d8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T00:54:05Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ca4d36508e362c9b7f7365ade7dda3dec0c70f9df1be158a034aeaaa772c797e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T00:54:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T00:54:04Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T00:56:40Z is after 2025-08-24T17:21:41Z" Mar 11 00:56:40 crc kubenswrapper[4744]: I0311 00:56:40.049899 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e8d341ff-c992-4426-acea-a34e4a61fa6f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:54:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:54:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:54:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:54:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:54:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://01519860f21aa252a579dd461cd32233ed13bfb46d851c1c19410e2632ad4389\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T00:54:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0b3b81c668c0cd9ccdc04d171476dbbbe2bca625de84ad8aff48be69bbaa46b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T00:54:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7fba0fd93d322bd73e9140c40efe4dd1469525a5d6d26a0d1e32ba8e0920a2a4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T00:54:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9690b251bd56f0d091b5ae52e71e94528ba6afd0e61919398662fdadd4a9f89f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://9690b251bd56f0d091b5ae52e71e94528ba6afd0e61919398662fdadd4a9f89f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T00:54:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T00:54:05Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T00:54:04Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T00:56:40Z is after 2025-08-24T17:21:41Z" Mar 11 00:56:40 crc kubenswrapper[4744]: I0311 00:56:40.066832 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:16Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:16Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bff1516676048ec93cb3569f8720eb042036b92afc1c65f1c58b15fb604f58b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T00:56:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-11T00:56:40Z is after 2025-08-24T17:21:41Z" Mar 11 00:56:40 crc kubenswrapper[4744]: I0311 00:56:40.084453 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:11Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T00:56:40Z is after 2025-08-24T17:21:41Z" Mar 11 00:56:40 crc kubenswrapper[4744]: I0311 00:56:40.099811 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-8z8gf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ad8329c6-d511-446f-b617-99778d12b878\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://27268ce5ab229a693eeb80e4dfc38c14f18ad32dc201ace682b5516070c52fde\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T00:56:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7gczk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T00:56:11Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-8z8gf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T00:56:40Z is after 2025-08-24T17:21:41Z" Mar 11 00:56:40 crc kubenswrapper[4744]: I0311 00:56:40.117630 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-678nx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a15dc7ac-7c34-4135-b6eb-a85122800ce9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d3d16f6053ae793fbf194d5cbb556a8603bae74b52c9f07aa937dbfd654e3894\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshif
t-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T00:56:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d9t5q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9d5c92e7037925f24a5b88f538d4822fae94b65ea5f07960f3e465148975af4e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T00:56:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d9t5q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T00:56:11Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-678nx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call 
webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T00:56:40Z is after 2025-08-24T17:21:41Z" Mar 11 00:56:40 crc kubenswrapper[4744]: I0311 00:56:40.138743 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:12Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:12Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://80f7b8faf68f817bd772c7988a46ffcc286551806463dc59776e5c6e18deb8f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T00:56:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kub
ernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T00:56:40Z is after 2025-08-24T17:21:41Z" Mar 11 00:56:40 crc kubenswrapper[4744]: I0311 00:56:40.157182 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:11Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T00:56:40Z is after 2025-08-24T17:21:41Z" Mar 11 00:56:40 crc kubenswrapper[4744]: I0311 00:56:40.177594 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-xlclh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e16bf0f3-533b-4114-89c6-195a85273e98\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3cf2ef2701f6c6e8715d3c41c3d7c4d704d6f03989831a0c3d34e92ab7a6dd50\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T00:56:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pgqmk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T00:56:11Z\\\"}}\" for pod \"openshift-multus\"/\"multus-xlclh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T00:56:40Z is after 2025-08-24T17:21:41Z" Mar 11 00:56:40 crc kubenswrapper[4744]: I0311 00:56:40.193040 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-6ghqv" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e146399d-0685-4ea8-96e4-c76c9478a23a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9fd8ff476d73855cbb3da64ae999f54bad2263813a70dba1aa61507b53102d42\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T00:56:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8h2vm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T00:56:11Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-6ghqv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T00:56:40Z is after 2025-08-24T17:21:41Z" Mar 11 00:56:40 crc kubenswrapper[4744]: I0311 00:56:40.215684 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-sj4cl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ac70f681-9bcf-4f8c-a175-ed7d4e9da471\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fafb45aea7f19d6db5ddf62115fb453eebda33f0be934b59e40083cd8387c896\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97
f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T00:56:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqrjx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://09583de802dbeffc06642379e30e628cf6541f82d5a11caa554cdf17e408476b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://09583de802dbeffc06642379e30e628cf6541f82d5a11caa554cdf17e408476b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T00:56:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T00:56:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqrjx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9a370bc02fc3d53e08d44ce24a85d1e5b20f06a4933b61c8fef04aff8cfe6081\\\",\\\"image\\\":\
\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9a370bc02fc3d53e08d44ce24a85d1e5b20f06a4933b61c8fef04aff8cfe6081\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T00:56:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T00:56:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqrjx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6ad4e761e6f478df27c55306ecb235f8cbbe0954433cbe7a029d585d609a08db\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6ad4e761e6f478df27c55306ecb235f8cbbe0954433cbe7a029d585d609a08db\\\",\\\"exitCode\\\
":0,\\\"finishedAt\\\":\\\"2026-03-11T00:56:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T00:56:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqrjx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://96ebe5cbd3477966cb49d08397fee1954529001d0bede507c64ca71c25786a87\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://96ebe5cbd3477966cb49d08397fee1954529001d0bede507c64ca71c25786a87\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T00:56:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T00:56:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqrjx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cdbd8c4d24a706d746c84147c617f511
da67e77e20235bb94b3b365552ef8c23\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cdbd8c4d24a706d746c84147c617f511da67e77e20235bb94b3b365552ef8c23\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T00:56:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T00:56:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqrjx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://81090de2003b39fdaee05a64219b7ebefa57068b2682e821d55ea9615ba67d4b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://81090de2003b39fdaee05a64219b7ebefa57068b2682e821d55ea9615ba67d4b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T00:56:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\"
:\\\"2026-03-11T00:56:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqrjx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T00:56:11Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-sj4cl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T00:56:40Z is after 2025-08-24T17:21:41Z" Mar 11 00:56:40 crc kubenswrapper[4744]: I0311 00:56:40.246364 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-78fcc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6ff04e11-e747-44c5-b049-371a5d422157\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:11Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:11Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8132bd89cd281c66b2735b3fa85c3fc773e7140d6aaeaefe0c0a2c3e3f743080\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T00:56:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fr9zj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e37e770ef43eabf7e0bb162d7e700eaa2d61c7ceca9737de5872c1caeec06774\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T00:56:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fr9zj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://98ba07651ae84a42f92411fee85a15562e325c61ff2d8fcba4962d21c51fce73\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T00:56:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fr9zj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d7cd308cadb08ea56bbc709c6401ab1d3eec24f0414a7ee1c57188c5e60adffe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d20994829
19d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T00:56:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fr9zj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://84d4ada431e7a91cc86b5cb9bba4316c233c5f666f21bce1d8b6df0019b6ad76\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T00:56:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fr9zj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://27bc44383b34aea31ffcebae0f1f464c31918edc0c37f01d6bd66166e34fd4a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cd
d47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T00:56:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fr9zj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://92b1336c58d3eb24389590807921be9410d32af1b1416d2091b4d515cfd6dfcc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://92b1336c58d3eb24389590807921be9410d32af1b1416d2091b4d515cfd6dfcc\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-11T00:56:38Z\\\",\\\"message\\\":\\\"hift-dns/dns-default_UDP_node_router+switch_crc options:{GoMap:map[event:false 
hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[udp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.4.10:53:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {4c1be812-05d3-4f45-91b5-a853a5c8de71}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0311 00:56:37.981791 7053 services_controller.go:356] Processing sync for service default/openshift for network=default\\\\nI0311 00:56:37.982673 7053 services_controller.go:360] Finished syncing service openshift on namespace default for network=default : 880.837µs\\\\nI0311 00:56:37.982410 7053 factory.go:1336] Added *v1.EgressIP event handler 8\\\\nI0311 00:56:37.982645 7053 services_controller.go:360] Finished syncing service kube-controller-manager on namespace openshift-kube-controller-manager for network=default : 4.472262ms\\\\nI0311 00:56:37.983070 7053 factory.go:1336] Added *v1.EgressFirewall event handler 9\\\\nI0311 00:56:37.983162 7053 controller.go:132] Adding controller ef_node_controller event handlers\\\\nI0311 00:56:37.983204 7053 ovnkube.go:599] Stopped ovnkube\\\\nI0311 00:56:37.983237 7053 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF0311 00:56:37.983343 7053 ovnkube.go:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-11T00:56:37Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-78fcc_openshift-ovn-kubernetes(6ff04e11-e747-44c5-b049-371a5d422157)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fr9zj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://742e3b6ac0accae72e9285636ab0920db1d3e0625d6093adf1606f57b484b139\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T00:56:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fr9zj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://524c79ee3bb53b187166cbc6bf19cbb3311ab561608a1d3c819288c1fb684481\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://524c79ee3bb53b1871
66cbc6bf19cbb3311ab561608a1d3c819288c1fb684481\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T00:56:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T00:56:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fr9zj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T00:56:11Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-78fcc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T00:56:40Z is after 2025-08-24T17:21:41Z" Mar 11 00:56:40 crc kubenswrapper[4744]: I0311 00:56:40.974320 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-tdnf7" Mar 11 00:56:40 crc kubenswrapper[4744]: E0311 00:56:40.974662 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-tdnf7" podUID="7aeb1578-fe93-4bec-8f43-17d0923fa5c0" Mar 11 00:56:41 crc kubenswrapper[4744]: I0311 00:56:41.974493 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 11 00:56:41 crc kubenswrapper[4744]: I0311 00:56:41.974577 4744 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 11 00:56:41 crc kubenswrapper[4744]: I0311 00:56:41.974629 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 11 00:56:41 crc kubenswrapper[4744]: E0311 00:56:41.974818 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 11 00:56:41 crc kubenswrapper[4744]: E0311 00:56:41.974958 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 11 00:56:41 crc kubenswrapper[4744]: E0311 00:56:41.975116 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 11 00:56:42 crc kubenswrapper[4744]: I0311 00:56:42.974578 4744 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-tdnf7" Mar 11 00:56:42 crc kubenswrapper[4744]: E0311 00:56:42.974821 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-tdnf7" podUID="7aeb1578-fe93-4bec-8f43-17d0923fa5c0" Mar 11 00:56:43 crc kubenswrapper[4744]: I0311 00:56:43.386856 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 00:56:43 crc kubenswrapper[4744]: I0311 00:56:43.387333 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 00:56:43 crc kubenswrapper[4744]: I0311 00:56:43.387351 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 00:56:43 crc kubenswrapper[4744]: I0311 00:56:43.387379 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 00:56:43 crc kubenswrapper[4744]: I0311 00:56:43.387399 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T00:56:43Z","lastTransitionTime":"2026-03-11T00:56:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 11 00:56:43 crc kubenswrapper[4744]: E0311 00:56:43.407558 4744 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-11T00:56:43Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:43Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-11T00:56:43Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:43Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-11T00:56:43Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:43Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-11T00:56:43Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:43Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"51e320bc-e184-46b0-b151-baf1fef55472\\\",\\\"systemUUID\\\":\\\"b27597af-c36d-4084-a073-6dfdbb017181\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T00:56:43Z is after 2025-08-24T17:21:41Z" Mar 11 00:56:43 crc kubenswrapper[4744]: I0311 00:56:43.413482 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 00:56:43 crc kubenswrapper[4744]: I0311 00:56:43.413569 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 00:56:43 crc kubenswrapper[4744]: I0311 00:56:43.413590 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 00:56:43 crc kubenswrapper[4744]: I0311 00:56:43.413615 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 00:56:43 crc kubenswrapper[4744]: I0311 00:56:43.413633 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T00:56:43Z","lastTransitionTime":"2026-03-11T00:56:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 11 00:56:43 crc kubenswrapper[4744]: E0311 00:56:43.433859 4744 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-11T00:56:43Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:43Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-11T00:56:43Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:43Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-11T00:56:43Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:43Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-11T00:56:43Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:43Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"51e320bc-e184-46b0-b151-baf1fef55472\\\",\\\"systemUUID\\\":\\\"b27597af-c36d-4084-a073-6dfdbb017181\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T00:56:43Z is after 2025-08-24T17:21:41Z" Mar 11 00:56:43 crc kubenswrapper[4744]: I0311 00:56:43.441307 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 00:56:43 crc kubenswrapper[4744]: I0311 00:56:43.441370 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 00:56:43 crc kubenswrapper[4744]: I0311 00:56:43.441392 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 00:56:43 crc kubenswrapper[4744]: I0311 00:56:43.441427 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 00:56:43 crc kubenswrapper[4744]: I0311 00:56:43.441459 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T00:56:43Z","lastTransitionTime":"2026-03-11T00:56:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 11 00:56:43 crc kubenswrapper[4744]: E0311 00:56:43.463653 4744 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-11T00:56:43Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:43Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-11T00:56:43Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:43Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-11T00:56:43Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:43Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-11T00:56:43Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:43Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"51e320bc-e184-46b0-b151-baf1fef55472\\\",\\\"systemUUID\\\":\\\"b27597af-c36d-4084-a073-6dfdbb017181\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T00:56:43Z is after 2025-08-24T17:21:41Z" Mar 11 00:56:43 crc kubenswrapper[4744]: I0311 00:56:43.469176 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 00:56:43 crc kubenswrapper[4744]: I0311 00:56:43.469281 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 00:56:43 crc kubenswrapper[4744]: I0311 00:56:43.469305 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 00:56:43 crc kubenswrapper[4744]: I0311 00:56:43.469333 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 00:56:43 crc kubenswrapper[4744]: I0311 00:56:43.469351 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T00:56:43Z","lastTransitionTime":"2026-03-11T00:56:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 11 00:56:43 crc kubenswrapper[4744]: E0311 00:56:43.488792 4744 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-11T00:56:43Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:43Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-11T00:56:43Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:43Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-11T00:56:43Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:43Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-11T00:56:43Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:43Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"51e320bc-e184-46b0-b151-baf1fef55472\\\",\\\"systemUUID\\\":\\\"b27597af-c36d-4084-a073-6dfdbb017181\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T00:56:43Z is after 2025-08-24T17:21:41Z" Mar 11 00:56:43 crc kubenswrapper[4744]: I0311 00:56:43.493615 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 00:56:43 crc kubenswrapper[4744]: I0311 00:56:43.493677 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 00:56:43 crc kubenswrapper[4744]: I0311 00:56:43.493701 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 00:56:43 crc kubenswrapper[4744]: I0311 00:56:43.493737 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 00:56:43 crc kubenswrapper[4744]: I0311 00:56:43.493760 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T00:56:43Z","lastTransitionTime":"2026-03-11T00:56:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 11 00:56:43 crc kubenswrapper[4744]: E0311 00:56:43.514597 4744 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-11T00:56:43Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:43Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-11T00:56:43Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:43Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-11T00:56:43Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:43Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-11T00:56:43Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:43Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"51e320bc-e184-46b0-b151-baf1fef55472\\\",\\\"systemUUID\\\":\\\"b27597af-c36d-4084-a073-6dfdbb017181\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T00:56:43Z is after 2025-08-24T17:21:41Z" Mar 11 00:56:43 crc kubenswrapper[4744]: E0311 00:56:43.514868 4744 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 11 00:56:43 crc kubenswrapper[4744]: I0311 00:56:43.752751 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 11 00:56:43 crc kubenswrapper[4744]: I0311 00:56:43.752967 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 11 00:56:43 crc kubenswrapper[4744]: E0311 00:56:43.753021 4744 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-11 00:57:15.752971431 +0000 UTC m=+192.557189076 (durationBeforeRetry 32s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 11 00:56:43 crc kubenswrapper[4744]: E0311 00:56:43.753147 4744 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 11 00:56:43 crc kubenswrapper[4744]: I0311 00:56:43.753194 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 11 00:56:43 crc kubenswrapper[4744]: E0311 00:56:43.753235 4744 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-11 00:57:15.753210158 +0000 UTC m=+192.557427803 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 11 00:56:43 crc kubenswrapper[4744]: I0311 00:56:43.753334 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 11 00:56:43 crc kubenswrapper[4744]: E0311 00:56:43.753428 4744 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Mar 11 00:56:43 crc kubenswrapper[4744]: E0311 00:56:43.753604 4744 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-11 00:57:15.753568929 +0000 UTC m=+192.557786544 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Mar 11 00:56:43 crc kubenswrapper[4744]: E0311 00:56:43.753612 4744 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 11 00:56:43 crc kubenswrapper[4744]: E0311 00:56:43.753640 4744 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 11 00:56:43 crc kubenswrapper[4744]: E0311 00:56:43.753663 4744 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 11 00:56:43 crc kubenswrapper[4744]: E0311 00:56:43.753729 4744 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-03-11 00:57:15.753712193 +0000 UTC m=+192.557929838 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 11 00:56:43 crc kubenswrapper[4744]: I0311 00:56:43.854899 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 11 00:56:43 crc kubenswrapper[4744]: I0311 00:56:43.854981 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/7aeb1578-fe93-4bec-8f43-17d0923fa5c0-metrics-certs\") pod \"network-metrics-daemon-tdnf7\" (UID: \"7aeb1578-fe93-4bec-8f43-17d0923fa5c0\") " pod="openshift-multus/network-metrics-daemon-tdnf7" Mar 11 00:56:43 crc kubenswrapper[4744]: E0311 00:56:43.855174 4744 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Mar 11 00:56:43 crc kubenswrapper[4744]: E0311 00:56:43.855262 4744 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7aeb1578-fe93-4bec-8f43-17d0923fa5c0-metrics-certs podName:7aeb1578-fe93-4bec-8f43-17d0923fa5c0 nodeName:}" failed. No retries permitted until 2026-03-11 00:57:15.855240614 +0000 UTC m=+192.659458229 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/7aeb1578-fe93-4bec-8f43-17d0923fa5c0-metrics-certs") pod "network-metrics-daemon-tdnf7" (UID: "7aeb1578-fe93-4bec-8f43-17d0923fa5c0") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 11 00:56:43 crc kubenswrapper[4744]: E0311 00:56:43.855324 4744 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 11 00:56:43 crc kubenswrapper[4744]: E0311 00:56:43.855418 4744 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 11 00:56:43 crc kubenswrapper[4744]: E0311 00:56:43.855445 4744 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 11 00:56:43 crc kubenswrapper[4744]: E0311 00:56:43.855617 4744 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-03-11 00:57:15.855575694 +0000 UTC m=+192.659793449 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 11 00:56:43 crc kubenswrapper[4744]: I0311 00:56:43.974100 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 11 00:56:43 crc kubenswrapper[4744]: I0311 00:56:43.974240 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 11 00:56:43 crc kubenswrapper[4744]: E0311 00:56:43.974360 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 11 00:56:43 crc kubenswrapper[4744]: E0311 00:56:43.974489 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 11 00:56:43 crc kubenswrapper[4744]: I0311 00:56:43.974682 4744 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 11 00:56:43 crc kubenswrapper[4744]: E0311 00:56:43.974860 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 11 00:56:44 crc kubenswrapper[4744]: I0311 00:56:44.011241 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-sj4cl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ac70f681-9bcf-4f8c-a175-ed7d4e9da471\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fafb45aea7f19d6db5ddf62115fb453eebda33f0be934b59e40083cd8387c896\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"q
uay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T00:56:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqrjx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://09583de802dbeffc06642379e30e628cf6541f82d5a11caa554cdf17e408476b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://09583de802dbeffc06642379e30e628cf6541f82d5a11caa554cdf17e408476b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T00:56:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T00:56:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqrjx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":
\\\"cri-o://9a370bc02fc3d53e08d44ce24a85d1e5b20f06a4933b61c8fef04aff8cfe6081\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9a370bc02fc3d53e08d44ce24a85d1e5b20f06a4933b61c8fef04aff8cfe6081\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T00:56:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T00:56:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqrjx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6ad4e761e6f478df27c55306ecb235f8cbbe0954433cbe7a029d585d609a08db\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":
\\\"cri-o://6ad4e761e6f478df27c55306ecb235f8cbbe0954433cbe7a029d585d609a08db\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T00:56:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T00:56:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqrjx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://96ebe5cbd3477966cb49d08397fee1954529001d0bede507c64ca71c25786a87\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://96ebe5cbd3477966cb49d08397fee1954529001d0bede507c64ca71c25786a87\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T00:56:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T00:56:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqrjx\\\",\\\"readOnly\\\":true,\\\"recursiveRe
adOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cdbd8c4d24a706d746c84147c617f511da67e77e20235bb94b3b365552ef8c23\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cdbd8c4d24a706d746c84147c617f511da67e77e20235bb94b3b365552ef8c23\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T00:56:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T00:56:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqrjx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://81090de2003b39fdaee05a64219b7ebefa57068b2682e821d55ea9615ba67d4b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://81090de2003b39fdaee05a64219b7ebefa57068b2682e821d55ea9615ba67d4b\\\",\\\"exitCode\\\":0,\\
\"finishedAt\\\":\\\"2026-03-11T00:56:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T00:56:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqrjx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T00:56:11Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-sj4cl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T00:56:44Z is after 2025-08-24T17:21:41Z" Mar 11 00:56:44 crc kubenswrapper[4744]: I0311 00:56:44.045080 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-78fcc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6ff04e11-e747-44c5-b049-371a5d422157\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:11Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:11Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8132bd89cd281c66b2735b3fa85c3fc773e7140d6aaeaefe0c0a2c3e3f743080\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T00:56:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fr9zj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e37e770ef43eabf7e0bb162d7e700eaa2d61c7ceca9737de5872c1caeec06774\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T00:56:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fr9zj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://98ba07651ae84a42f92411fee85a15562e325c61ff2d8fcba4962d21c51fce73\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T00:56:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fr9zj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d7cd308cadb08ea56bbc709c6401ab1d3eec24f0414a7ee1c57188c5e60adffe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T00:56:14Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fr9zj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://84d4ada431e7a91cc86b5cb9bba4316c233c5f666f21bce1d8b6df0019b6ad76\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T00:56:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fr9zj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://27bc44383b34aea31ffcebae0f1f464c31918edc0c37f01d6bd66166e34fd4a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T00:56:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fr9zj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://92b1336c58d3eb24389590807921be9410d32af1b1416d2091b4d515cfd6dfcc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://92b1336c58d3eb24389590807921be9410d32af1b1416d2091b4d515cfd6dfcc\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-11T00:56:38Z\\\",\\\"message\\\":\\\"hift-dns/dns-default_UDP_node_router+switch_crc options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[udp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.4.10:53:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == 
{4c1be812-05d3-4f45-91b5-a853a5c8de71}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0311 00:56:37.981791 7053 services_controller.go:356] Processing sync for service default/openshift for network=default\\\\nI0311 00:56:37.982673 7053 services_controller.go:360] Finished syncing service openshift on namespace default for network=default : 880.837µs\\\\nI0311 00:56:37.982410 7053 factory.go:1336] Added *v1.EgressIP event handler 8\\\\nI0311 00:56:37.982645 7053 services_controller.go:360] Finished syncing service kube-controller-manager on namespace openshift-kube-controller-manager for network=default : 4.472262ms\\\\nI0311 00:56:37.983070 7053 factory.go:1336] Added *v1.EgressFirewall event handler 9\\\\nI0311 00:56:37.983162 7053 controller.go:132] Adding controller ef_node_controller event handlers\\\\nI0311 00:56:37.983204 7053 ovnkube.go:599] Stopped ovnkube\\\\nI0311 00:56:37.983237 7053 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF0311 00:56:37.983343 7053 ovnkube.go:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-11T00:56:37Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-78fcc_openshift-ovn-kubernetes(6ff04e11-e747-44c5-b049-371a5d422157)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fr9zj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://742e3b6ac0accae72e9285636ab0920db1d3e0625d6093adf1606f57b484b139\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T00:56:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fr9zj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://524c79ee3bb53b187166cbc6bf19cbb3311ab561608a1d3c819288c1fb684481\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://524c79ee3bb53b1871
66cbc6bf19cbb3311ab561608a1d3c819288c1fb684481\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T00:56:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T00:56:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fr9zj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T00:56:11Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-78fcc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T00:56:44Z is after 2025-08-24T17:21:41Z" Mar 11 00:56:44 crc kubenswrapper[4744]: I0311 00:56:44.065722 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:12Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:12Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://80f7b8faf68f817bd772c7988a46ffcc286551806463dc59776e5c6e18deb8f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T00:56:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-11T00:56:44Z is after 2025-08-24T17:21:41Z" Mar 11 00:56:44 crc kubenswrapper[4744]: E0311 00:56:44.084168 4744 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 11 00:56:44 crc kubenswrapper[4744]: I0311 00:56:44.088846 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:11Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T00:56:44Z is after 2025-08-24T17:21:41Z" Mar 11 00:56:44 crc kubenswrapper[4744]: I0311 00:56:44.109177 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-xlclh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e16bf0f3-533b-4114-89c6-195a85273e98\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3cf2ef2701f6c6e8715d3c41c3d7c4d704d6f03989831a0c3d34e92ab7a6dd50\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T00:56:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pgqmk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T00:56:11Z\\\"}}\" for pod \"openshift-multus\"/\"multus-xlclh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T00:56:44Z is after 2025-08-24T17:21:41Z" Mar 11 00:56:44 crc kubenswrapper[4744]: I0311 00:56:44.124489 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-6ghqv" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e146399d-0685-4ea8-96e4-c76c9478a23a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9fd8ff476d73855cbb3da64ae999f54bad2263813a70dba1aa61507b53102d42\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T00:56:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8h2vm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T00:56:11Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-6ghqv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T00:56:44Z is after 2025-08-24T17:21:41Z" Mar 11 00:56:44 crc kubenswrapper[4744]: I0311 00:56:44.148350 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e522f1d8-5329-414c-88d5-79e6f3b615be\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:54:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:54:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:54:04Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:54:04Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:54:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://31713fedc28cf71b8df25e22000768f5d00e6de1b228a632aa3c51aea11eea7c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T00:54:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e702350d7b926ec4df6853a20470ea7515f1e1c98720db1a4d672ea4e03e82fa\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T00:54:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://2d93a4815010e9ee724175731b03bf82804921c5bbb9fe8be236057f90057f28\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T00:54:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4b2b952a5490bff5028c83fdfb0942e84bac782d470cede57ec6a31f483a407e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4b2b952a5490bff5028c83fdfb0942e84bac782d470cede57ec6a31f483a407e\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-11T00:55:54Z\\\",\\\"message\\\":\\\"le observer\\\\nW0311 00:55:54.578734 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0311 00:55:54.579118 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0311 00:55:54.580623 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-3889625742/tls.crt::/tmp/serving-cert-3889625742/tls.key\\\\\\\"\\\\nI0311 00:55:54.806131 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0311 00:55:54.812332 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0311 00:55:54.812369 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0311 00:55:54.812401 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0311 00:55:54.812411 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0311 00:55:54.819161 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0311 00:55:54.819196 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0311 00:55:54.819211 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0311 00:55:54.819223 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0311 00:55:54.819235 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0311 00:55:54.819243 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0311 00:55:54.819248 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0311 00:55:54.819254 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0311 00:55:54.821179 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-11T00:55:54Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 1m20s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c9f3a6d71079f1717c07e0b026f34d1bfd6391a5bb6745bcdab535d5290e2aa3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T00:54:07Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://edfc2acc25990d1ccbca879e186fb9dfc0a6664fa1df5cee62958ff52b217748\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://edfc2acc25990d1ccbca879e186fb9dfc0a6664fa1df5cee62958ff52b217748\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T00:54:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T00:54:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T00:54:04Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T00:56:44Z is after 2025-08-24T17:21:41Z" Mar 11 00:56:44 crc kubenswrapper[4744]: I0311 00:56:44.166457 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f6d4e3bc-7c73-40d6-9bde-5178882a794d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:54:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:54:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:54:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:54:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:54:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e3db74ff70ae81764297d0f56ae2c56f33d40d9e0025aacba1f56045eae524b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:2
42b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T00:54:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://779c97b306c3647569a5bade5ab4a8ca49f1fa84ded5fa830235ab1d4a039038\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://779c97b306c3647569a5bade5ab4a8ca49f1fa84ded5fa830235ab1d4a039038\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T00:54:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T00:54:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T00:54:04Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T00:56:44Z is after 
2025-08-24T17:21:41Z" Mar 11 00:56:44 crc kubenswrapper[4744]: I0311 00:56:44.186051 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:11Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T00:56:44Z is after 2025-08-24T17:21:41Z" Mar 11 00:56:44 crc kubenswrapper[4744]: I0311 00:56:44.204034 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-rplbw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"92f4c9df-1087-4820-8e07-1120f02df454\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://34eabb0e2af63857d40e20376f7979131fef683885949c1ea75e0a4949968872\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T00:56:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lkpvs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2bfa60a94390dd23da621dd81523dde8107fe
6f3beba45ad1b7a96c0d6cd4f56\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T00:56:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lkpvs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T00:56:11Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-rplbw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T00:56:44Z is after 2025-08-24T17:21:41Z" Mar 11 00:56:44 crc kubenswrapper[4744]: I0311 00:56:44.223261 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:13Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:13Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://064a29fa0ed4ecddcf11aa3175a3fc935147341c372af46f94eee470a9ba363c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T00:56:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7c4d16fc0af970b13be0d202345567993d215bf342174b1b568c4b4ac26c0017\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T00:56:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T00:56:44Z is after 2025-08-24T17:21:41Z" Mar 11 00:56:44 crc kubenswrapper[4744]: I0311 00:56:44.237880 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-tdnf7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7aeb1578-fe93-4bec-8f43-17d0923fa5c0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:11Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:11Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-th54b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-th54b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T00:56:11Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-tdnf7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T00:56:44Z is after 2025-08-24T17:21:41Z" Mar 11 00:56:44 crc 
kubenswrapper[4744]: I0311 00:56:44.255973 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-678nx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a15dc7ac-7c34-4135-b6eb-a85122800ce9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d3d16f6053ae793fbf194d5cbb556a8603bae74b52c9f07aa937dbfd654e3894\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T00:56:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/se
rviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d9t5q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9d5c92e7037925f24a5b88f538d4822fae94b65ea5f07960f3e465148975af4e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T00:56:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d9t5q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T00:56:11Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-678nx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T00:56:44Z is after 2025-08-24T17:21:41Z" Mar 11 00:56:44 crc kubenswrapper[4744]: I0311 00:56:44.277567 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1a87b2ce-541f-40ba-a568-93914e7dea62\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:54:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:54:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:55:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:55:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:54:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://19989b1018d06da0230a62c4c231865e9d16267d0eb570a409da006b246005b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://832be2ac0e5b4b69727f022a0663974a495e008d3a781fa4d237d3347d39c154\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-11T00:54:37Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager 
-v=2\\\\nI0311 00:54:06.849622 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. Worst graceful lease acquisition is {26s}.\\\\nI0311 00:54:06.852083 1 observer_polling.go:159] Starting file observer\\\\nI0311 00:54:06.910593 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0311 00:54:06.914052 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nF0311 00:54:37.381234 1 cmd.go:179] failed checking apiserver connectivity: Get \\\\\\\"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/openshift-kube-controller-manager/leases/cluster-policy-controller-lock\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T00:54:36Z is after 
2026-02-23T05:33:13Z\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-11T00:54:05Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T00:54:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://eb1e3ef7609aa323e622249594eb4ac5cd5cacfa496d8189b3f612e29cb11773\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T00:54:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://82831ac0a065cf2ea80ca05df203fc6de89ad279faae95364ab2519b689453d8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T00:54:05Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ca4d36508e362c9b7f7365ade7dda3dec0c70f9df1be158a034aeaaa772c797e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T00:54:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T00:54:04Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T00:56:44Z is after 2025-08-24T17:21:41Z" Mar 11 00:56:44 crc kubenswrapper[4744]: I0311 00:56:44.295706 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e8d341ff-c992-4426-acea-a34e4a61fa6f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:54:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:54:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:54:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:54:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:54:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://01519860f21aa252a579dd461cd32233ed13bfb46d851c1c19410e2632ad4389\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T00:54:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0b3b81c668c0cd9ccdc04d171476dbbbe2bca625de84ad8aff48be69bbaa46b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T00:54:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7fba0fd93d322bd73e9140c40efe4dd1469525a5d6d26a0d1e32ba8e0920a2a4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T00:54:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9690b251bd56f0d091b5ae52e71e94528ba6afd0e61919398662fdadd4a9f89f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://9690b251bd56f0d091b5ae52e71e94528ba6afd0e61919398662fdadd4a9f89f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T00:54:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T00:54:05Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T00:54:04Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T00:56:44Z is after 2025-08-24T17:21:41Z" Mar 11 00:56:44 crc kubenswrapper[4744]: I0311 00:56:44.316088 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:16Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:16Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bff1516676048ec93cb3569f8720eb042036b92afc1c65f1c58b15fb604f58b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T00:56:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-11T00:56:44Z is after 2025-08-24T17:21:41Z" Mar 11 00:56:44 crc kubenswrapper[4744]: I0311 00:56:44.338650 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:11Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T00:56:44Z is after 2025-08-24T17:21:41Z" Mar 11 00:56:44 crc kubenswrapper[4744]: I0311 00:56:44.360537 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-8z8gf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ad8329c6-d511-446f-b617-99778d12b878\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://27268ce5ab229a693eeb80e4dfc38c14f18ad32dc201ace682b5516070c52fde\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T00:56:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7gczk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T00:56:11Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-8z8gf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T00:56:44Z is after 2025-08-24T17:21:41Z" Mar 11 00:56:44 crc kubenswrapper[4744]: I0311 00:56:44.974060 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-tdnf7" Mar 11 00:56:44 crc kubenswrapper[4744]: E0311 00:56:44.974266 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-tdnf7" podUID="7aeb1578-fe93-4bec-8f43-17d0923fa5c0" Mar 11 00:56:45 crc kubenswrapper[4744]: I0311 00:56:45.974442 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 11 00:56:45 crc kubenswrapper[4744]: I0311 00:56:45.974627 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 11 00:56:45 crc kubenswrapper[4744]: I0311 00:56:45.974794 4744 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 11 00:56:45 crc kubenswrapper[4744]: E0311 00:56:45.974988 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 11 00:56:45 crc kubenswrapper[4744]: E0311 00:56:45.975383 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 11 00:56:45 crc kubenswrapper[4744]: I0311 00:56:45.975496 4744 scope.go:117] "RemoveContainer" containerID="4b2b952a5490bff5028c83fdfb0942e84bac782d470cede57ec6a31f483a407e" Mar 11 00:56:45 crc kubenswrapper[4744]: E0311 00:56:45.976059 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 1m20s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 11 00:56:45 crc kubenswrapper[4744]: E0311 00:56:45.976172 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in 
/etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 11 00:56:46 crc kubenswrapper[4744]: I0311 00:56:46.974573 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-tdnf7" Mar 11 00:56:46 crc kubenswrapper[4744]: E0311 00:56:46.974840 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-tdnf7" podUID="7aeb1578-fe93-4bec-8f43-17d0923fa5c0" Mar 11 00:56:47 crc kubenswrapper[4744]: I0311 00:56:47.974446 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 11 00:56:47 crc kubenswrapper[4744]: I0311 00:56:47.974580 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 11 00:56:47 crc kubenswrapper[4744]: I0311 00:56:47.974594 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 11 00:56:47 crc kubenswrapper[4744]: E0311 00:56:47.974740 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 11 00:56:47 crc kubenswrapper[4744]: E0311 00:56:47.974904 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 11 00:56:47 crc kubenswrapper[4744]: E0311 00:56:47.975111 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 11 00:56:48 crc kubenswrapper[4744]: I0311 00:56:48.974284 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-tdnf7" Mar 11 00:56:48 crc kubenswrapper[4744]: E0311 00:56:48.974438 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-tdnf7" podUID="7aeb1578-fe93-4bec-8f43-17d0923fa5c0" Mar 11 00:56:49 crc kubenswrapper[4744]: E0311 00:56:49.086115 4744 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 11 00:56:49 crc kubenswrapper[4744]: I0311 00:56:49.974335 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 11 00:56:49 crc kubenswrapper[4744]: I0311 00:56:49.974393 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 11 00:56:49 crc kubenswrapper[4744]: I0311 00:56:49.974393 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 11 00:56:49 crc kubenswrapper[4744]: E0311 00:56:49.974659 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 11 00:56:49 crc kubenswrapper[4744]: E0311 00:56:49.974831 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 11 00:56:49 crc kubenswrapper[4744]: E0311 00:56:49.975086 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 11 00:56:50 crc kubenswrapper[4744]: I0311 00:56:50.974742 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-tdnf7" Mar 11 00:56:50 crc kubenswrapper[4744]: E0311 00:56:50.974942 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-tdnf7" podUID="7aeb1578-fe93-4bec-8f43-17d0923fa5c0" Mar 11 00:56:50 crc kubenswrapper[4744]: I0311 00:56:50.976352 4744 scope.go:117] "RemoveContainer" containerID="92b1336c58d3eb24389590807921be9410d32af1b1416d2091b4d515cfd6dfcc" Mar 11 00:56:50 crc kubenswrapper[4744]: E0311 00:56:50.976640 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-78fcc_openshift-ovn-kubernetes(6ff04e11-e747-44c5-b049-371a5d422157)\"" pod="openshift-ovn-kubernetes/ovnkube-node-78fcc" podUID="6ff04e11-e747-44c5-b049-371a5d422157" Mar 11 00:56:51 crc kubenswrapper[4744]: I0311 00:56:51.974752 4744 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 11 00:56:51 crc kubenswrapper[4744]: I0311 00:56:51.974815 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 11 00:56:51 crc kubenswrapper[4744]: E0311 00:56:51.974945 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 11 00:56:51 crc kubenswrapper[4744]: I0311 00:56:51.974853 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 11 00:56:51 crc kubenswrapper[4744]: E0311 00:56:51.975029 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 11 00:56:51 crc kubenswrapper[4744]: E0311 00:56:51.975151 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 11 00:56:52 crc kubenswrapper[4744]: I0311 00:56:52.973804 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-tdnf7" Mar 11 00:56:52 crc kubenswrapper[4744]: E0311 00:56:52.974012 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-tdnf7" podUID="7aeb1578-fe93-4bec-8f43-17d0923fa5c0" Mar 11 00:56:53 crc kubenswrapper[4744]: I0311 00:56:53.538573 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 00:56:53 crc kubenswrapper[4744]: I0311 00:56:53.538642 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 00:56:53 crc kubenswrapper[4744]: I0311 00:56:53.538661 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 00:56:53 crc kubenswrapper[4744]: I0311 00:56:53.538686 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 00:56:53 crc kubenswrapper[4744]: I0311 00:56:53.538704 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T00:56:53Z","lastTransitionTime":"2026-03-11T00:56:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 11 00:56:53 crc kubenswrapper[4744]: E0311 00:56:53.559093 4744 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-11T00:56:53Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:53Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-11T00:56:53Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:53Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-11T00:56:53Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:53Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-11T00:56:53Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:53Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"51e320bc-e184-46b0-b151-baf1fef55472\\\",\\\"systemUUID\\\":\\\"b27597af-c36d-4084-a073-6dfdbb017181\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T00:56:53Z is after 2025-08-24T17:21:41Z" Mar 11 00:56:53 crc kubenswrapper[4744]: I0311 00:56:53.564800 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 00:56:53 crc kubenswrapper[4744]: I0311 00:56:53.564852 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 00:56:53 crc kubenswrapper[4744]: I0311 00:56:53.564872 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 00:56:53 crc kubenswrapper[4744]: I0311 00:56:53.564893 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 00:56:53 crc kubenswrapper[4744]: I0311 00:56:53.564910 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T00:56:53Z","lastTransitionTime":"2026-03-11T00:56:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 11 00:56:53 crc kubenswrapper[4744]: E0311 00:56:53.582958 4744 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-11T00:56:53Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:53Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-11T00:56:53Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:53Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-11T00:56:53Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:53Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-11T00:56:53Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:53Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"51e320bc-e184-46b0-b151-baf1fef55472\\\",\\\"systemUUID\\\":\\\"b27597af-c36d-4084-a073-6dfdbb017181\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T00:56:53Z is after 2025-08-24T17:21:41Z" Mar 11 00:56:53 crc kubenswrapper[4744]: I0311 00:56:53.589360 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 00:56:53 crc kubenswrapper[4744]: I0311 00:56:53.589445 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 00:56:53 crc kubenswrapper[4744]: I0311 00:56:53.589466 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 00:56:53 crc kubenswrapper[4744]: I0311 00:56:53.589496 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 00:56:53 crc kubenswrapper[4744]: I0311 00:56:53.589552 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T00:56:53Z","lastTransitionTime":"2026-03-11T00:56:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 11 00:56:53 crc kubenswrapper[4744]: E0311 00:56:53.610802 4744 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{...}\" [status patch identical to the previous attempt] for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T00:56:53Z is after 2025-08-24T17:21:41Z" Mar 11 00:56:53 crc kubenswrapper[4744]: I0311 00:56:53.616417 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 00:56:53 crc kubenswrapper[4744]: I0311 00:56:53.616477 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 00:56:53 crc kubenswrapper[4744]: I0311 00:56:53.616489 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 00:56:53 crc kubenswrapper[4744]: I0311 00:56:53.616531 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 00:56:53 crc kubenswrapper[4744]: I0311 00:56:53.616547 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T00:56:53Z","lastTransitionTime":"2026-03-11T00:56:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 11 00:56:53 crc kubenswrapper[4744]: E0311 00:56:53.634774 4744 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{...}\" [status patch identical to the previous attempt] for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T00:56:53Z is after 2025-08-24T17:21:41Z" Mar 11 00:56:53 crc kubenswrapper[4744]: I0311 00:56:53.641080 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 00:56:53 crc kubenswrapper[4744]: I0311 00:56:53.641164 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 00:56:53 crc kubenswrapper[4744]: I0311 00:56:53.641183 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 00:56:53 crc kubenswrapper[4744]: I0311 00:56:53.641209 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 00:56:53 crc kubenswrapper[4744]: I0311 00:56:53.641227 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T00:56:53Z","lastTransitionTime":"2026-03-11T00:56:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 11 00:56:53 crc kubenswrapper[4744]: E0311 00:56:53.663424 4744 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-11T00:56:53Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:53Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-11T00:56:53Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:53Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-11T00:56:53Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:53Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-11T00:56:53Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:53Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"51e320bc-e184-46b0-b151-baf1fef55472\\\",\\\"systemUUID\\\":\\\"b27597af-c36d-4084-a073-6dfdbb017181\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T00:56:53Z is after 2025-08-24T17:21:41Z" Mar 11 00:56:53 crc kubenswrapper[4744]: E0311 00:56:53.663681 4744 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 11 00:56:53 crc kubenswrapper[4744]: I0311 00:56:53.974836 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 11 00:56:53 crc kubenswrapper[4744]: I0311 00:56:53.974916 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 11 00:56:53 crc kubenswrapper[4744]: I0311 00:56:53.974993 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 11 00:56:53 crc kubenswrapper[4744]: E0311 00:56:53.975260 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 11 00:56:53 crc kubenswrapper[4744]: E0311 00:56:53.975439 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 11 00:56:53 crc kubenswrapper[4744]: E0311 00:56:53.975577 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 11 00:56:53 crc kubenswrapper[4744]: I0311 00:56:53.996107 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:16Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:16Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bff1516676048ec93cb3569f8720eb042036b92afc1c65f1c58b15fb604f58b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":tru
e,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T00:56:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T00:56:53Z is after 2025-08-24T17:21:41Z" Mar 11 00:56:54 crc kubenswrapper[4744]: I0311 00:56:54.019099 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:11Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T00:56:54Z is after 2025-08-24T17:21:41Z" Mar 11 00:56:54 crc kubenswrapper[4744]: I0311 00:56:54.038328 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-8z8gf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ad8329c6-d511-446f-b617-99778d12b878\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://27268ce5ab229a693eeb80e4dfc38c14f18ad32dc201ace682b5516070c52fde\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T00:56:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7gczk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T00:56:11Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-8z8gf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T00:56:54Z is after 2025-08-24T17:21:41Z" Mar 11 00:56:54 crc kubenswrapper[4744]: I0311 00:56:54.059053 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-678nx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a15dc7ac-7c34-4135-b6eb-a85122800ce9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d3d16f6053ae793fbf194d5cbb556a8603bae74b52c9f07aa937dbfd654e3894\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshif
t-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T00:56:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d9t5q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9d5c92e7037925f24a5b88f538d4822fae94b65ea5f07960f3e465148975af4e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T00:56:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d9t5q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T00:56:11Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-678nx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call 
webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T00:56:54Z is after 2025-08-24T17:21:41Z" Mar 11 00:56:54 crc kubenswrapper[4744]: I0311 00:56:54.087420 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1a87b2ce-541f-40ba-a568-93914e7dea62\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:54:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:54:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:55:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:55:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:54:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://19989b1018d06da0230a62c4c231865e9d16267d0eb570a409da006b246005b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://832be2ac0e5b4b69727f022a0663974a495e008d3a781fa4d237d3347d39c154\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-11T00:54:37Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss 
-Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager -v=2\\\\nI0311 00:54:06.849622 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. Worst graceful lease acquisition is {26s}.\\\\nI0311 00:54:06.852083 1 observer_polling.go:159] Starting file observer\\\\nI0311 00:54:06.910593 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0311 00:54:06.914052 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nF0311 00:54:37.381234 1 cmd.go:179] failed checking apiserver connectivity: Get \\\\\\\"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/openshift-kube-controller-manager/leases/cluster-policy-controller-lock\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T00:54:36Z is after 
2026-02-23T05:33:13Z\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-11T00:54:05Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T00:54:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://eb1e3ef7609aa323e622249594eb4ac5cd5cacfa496d8189b3f612e29cb11773\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T00:54:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://82831ac0a065cf2ea80ca05df203fc6de89ad279faae95364ab2519b689453d8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T00:54:05Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ca4d36508e362c9b7f7365ade7dda3dec0c70f9df1be158a034aeaaa772c797e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T00:54:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T00:54:04Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T00:56:54Z is after 2025-08-24T17:21:41Z" Mar 11 00:56:54 crc kubenswrapper[4744]: E0311 00:56:54.088191 4744 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in 
/etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 11 00:56:54 crc kubenswrapper[4744]: I0311 00:56:54.103919 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e8d341ff-c992-4426-acea-a34e4a61fa6f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:54:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:54:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:54:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:54:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:54:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://01519860f21aa252a579dd461cd32233ed13bfb46d851c1c19410e2632ad4389\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T00:54:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0b3
b81c668c0cd9ccdc04d171476dbbbe2bca625de84ad8aff48be69bbaa46b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T00:54:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7fba0fd93d322bd73e9140c40efe4dd1469525a5d6d26a0d1e32ba8e0920a2a4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T00:54:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9690b251bd56f0d091b5ae52e71e94528ba6afd0e61919398662fdadd4a9f89f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshi
ft-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9690b251bd56f0d091b5ae52e71e94528ba6afd0e61919398662fdadd4a9f89f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T00:54:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T00:54:05Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T00:54:04Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T00:56:54Z is after 2025-08-24T17:21:41Z" Mar 11 00:56:54 crc kubenswrapper[4744]: I0311 00:56:54.124037 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:11Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T00:56:54Z is after 2025-08-24T17:21:41Z" Mar 11 00:56:54 crc kubenswrapper[4744]: I0311 00:56:54.144941 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-xlclh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e16bf0f3-533b-4114-89c6-195a85273e98\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3cf2ef2701f6c6e8715d3c41c3d7c4d704d6f03989831a0c3d34e92ab7a6dd50\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T00:56:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pgqmk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T00:56:11Z\\\"}}\" for pod \"openshift-multus\"/\"multus-xlclh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T00:56:54Z is after 2025-08-24T17:21:41Z" Mar 11 00:56:54 crc kubenswrapper[4744]: I0311 00:56:54.160243 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-6ghqv" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e146399d-0685-4ea8-96e4-c76c9478a23a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9fd8ff476d73855cbb3da64ae999f54bad2263813a70dba1aa61507b53102d42\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T00:56:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8h2vm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T00:56:11Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-6ghqv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T00:56:54Z is after 2025-08-24T17:21:41Z" Mar 11 00:56:54 crc kubenswrapper[4744]: I0311 00:56:54.181333 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-sj4cl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ac70f681-9bcf-4f8c-a175-ed7d4e9da471\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fafb45aea7f19d6db5ddf62115fb453eebda33f0be934b59e40083cd8387c896\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97
f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T00:56:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqrjx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://09583de802dbeffc06642379e30e628cf6541f82d5a11caa554cdf17e408476b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://09583de802dbeffc06642379e30e628cf6541f82d5a11caa554cdf17e408476b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T00:56:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T00:56:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqrjx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9a370bc02fc3d53e08d44ce24a85d1e5b20f06a4933b61c8fef04aff8cfe6081\\\",\\\"image\\\":\
\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9a370bc02fc3d53e08d44ce24a85d1e5b20f06a4933b61c8fef04aff8cfe6081\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T00:56:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T00:56:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqrjx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6ad4e761e6f478df27c55306ecb235f8cbbe0954433cbe7a029d585d609a08db\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6ad4e761e6f478df27c55306ecb235f8cbbe0954433cbe7a029d585d609a08db\\\",\\\"exitCode\\\
":0,\\\"finishedAt\\\":\\\"2026-03-11T00:56:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T00:56:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqrjx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://96ebe5cbd3477966cb49d08397fee1954529001d0bede507c64ca71c25786a87\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://96ebe5cbd3477966cb49d08397fee1954529001d0bede507c64ca71c25786a87\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T00:56:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T00:56:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqrjx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cdbd8c4d24a706d746c84147c617f511
da67e77e20235bb94b3b365552ef8c23\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cdbd8c4d24a706d746c84147c617f511da67e77e20235bb94b3b365552ef8c23\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T00:56:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T00:56:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqrjx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://81090de2003b39fdaee05a64219b7ebefa57068b2682e821d55ea9615ba67d4b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://81090de2003b39fdaee05a64219b7ebefa57068b2682e821d55ea9615ba67d4b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T00:56:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\"
:\\\"2026-03-11T00:56:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqrjx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T00:56:11Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-sj4cl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T00:56:54Z is after 2025-08-24T17:21:41Z" Mar 11 00:56:54 crc kubenswrapper[4744]: I0311 00:56:54.212304 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-78fcc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6ff04e11-e747-44c5-b049-371a5d422157\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:11Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:11Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8132bd89cd281c66b2735b3fa85c3fc773e7140d6aaeaefe0c0a2c3e3f743080\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T00:56:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fr9zj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e37e770ef43eabf7e0bb162d7e700eaa2d61c7ceca9737de5872c1caeec06774\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T00:56:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fr9zj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://98ba07651ae84a42f92411fee85a15562e325c61ff2d8fcba4962d21c51fce73\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T00:56:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fr9zj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d7cd308cadb08ea56bbc709c6401ab1d3eec24f0414a7ee1c57188c5e60adffe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d20994829
19d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T00:56:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fr9zj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://84d4ada431e7a91cc86b5cb9bba4316c233c5f666f21bce1d8b6df0019b6ad76\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T00:56:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fr9zj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://27bc44383b34aea31ffcebae0f1f464c31918edc0c37f01d6bd66166e34fd4a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cd
d47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T00:56:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fr9zj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://92b1336c58d3eb24389590807921be9410d32af1b1416d2091b4d515cfd6dfcc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://92b1336c58d3eb24389590807921be9410d32af1b1416d2091b4d515cfd6dfcc\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-11T00:56:38Z\\\",\\\"message\\\":\\\"hift-dns/dns-default_UDP_node_router+switch_crc options:{GoMap:map[event:false 
hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[udp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.4.10:53:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {4c1be812-05d3-4f45-91b5-a853a5c8de71}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0311 00:56:37.981791 7053 services_controller.go:356] Processing sync for service default/openshift for network=default\\\\nI0311 00:56:37.982673 7053 services_controller.go:360] Finished syncing service openshift on namespace default for network=default : 880.837µs\\\\nI0311 00:56:37.982410 7053 factory.go:1336] Added *v1.EgressIP event handler 8\\\\nI0311 00:56:37.982645 7053 services_controller.go:360] Finished syncing service kube-controller-manager on namespace openshift-kube-controller-manager for network=default : 4.472262ms\\\\nI0311 00:56:37.983070 7053 factory.go:1336] Added *v1.EgressFirewall event handler 9\\\\nI0311 00:56:37.983162 7053 controller.go:132] Adding controller ef_node_controller event handlers\\\\nI0311 00:56:37.983204 7053 ovnkube.go:599] Stopped ovnkube\\\\nI0311 00:56:37.983237 7053 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF0311 00:56:37.983343 7053 ovnkube.go:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-11T00:56:37Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-78fcc_openshift-ovn-kubernetes(6ff04e11-e747-44c5-b049-371a5d422157)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fr9zj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://742e3b6ac0accae72e9285636ab0920db1d3e0625d6093adf1606f57b484b139\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T00:56:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fr9zj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://524c79ee3bb53b187166cbc6bf19cbb3311ab561608a1d3c819288c1fb684481\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://524c79ee3bb53b1871
66cbc6bf19cbb3311ab561608a1d3c819288c1fb684481\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T00:56:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T00:56:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fr9zj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T00:56:11Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-78fcc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T00:56:54Z is after 2025-08-24T17:21:41Z" Mar 11 00:56:54 crc kubenswrapper[4744]: I0311 00:56:54.238770 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:12Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:12Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://80f7b8faf68f817bd772c7988a46ffcc286551806463dc59776e5c6e18deb8f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T00:56:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-11T00:56:54Z is after 2025-08-24T17:21:41Z" Mar 11 00:56:54 crc kubenswrapper[4744]: I0311 00:56:54.259123 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:11Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T00:56:54Z is after 2025-08-24T17:21:41Z" Mar 11 00:56:54 crc kubenswrapper[4744]: I0311 00:56:54.276483 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-rplbw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"92f4c9df-1087-4820-8e07-1120f02df454\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://34eabb0e2af63857d40e20376f7979131fef683885949c1ea75e0a4949968872\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T00:56:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lkpvs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2bfa60a94390dd23da621dd81523dde8107fe
6f3beba45ad1b7a96c0d6cd4f56\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T00:56:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lkpvs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T00:56:11Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-rplbw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T00:56:54Z is after 2025-08-24T17:21:41Z" Mar 11 00:56:54 crc kubenswrapper[4744]: I0311 00:56:54.300127 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e522f1d8-5329-414c-88d5-79e6f3b615be\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:54:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:54:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:54:04Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:54:04Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:54:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://31713fedc28cf71b8df25e22000768f5d00e6de1b228a632aa3c51aea11eea7c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T00:54:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e702350d7b926ec4df6853a20470ea7515f1e1c98720db1a4d672ea4e03e82fa\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T00:54:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2d93a4815010e9ee724175731b03bf82804921c5bbb9fe8be236057f90057f28\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T00:54:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4b2b952a5490bff5028c83fdfb0942e84bac782d470cede57ec6a31f483a407e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4b2b952a5490bff5028c83fdfb0942e84bac782d470cede57ec6a31f483a407e\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-11T00:55:54Z\\\",\\\"message\\\":\\\"le observer\\\\nW0311 00:55:54.578734 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0311 00:55:54.579118 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0311 00:55:54.580623 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3889625742/tls.crt::/tmp/serving-cert-3889625742/tls.key\\\\\\\"\\\\nI0311 00:55:54.806131 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0311 00:55:54.812332 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0311 00:55:54.812369 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0311 00:55:54.812401 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0311 00:55:54.812411 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0311 00:55:54.819161 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0311 00:55:54.819196 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0311 00:55:54.819211 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0311 00:55:54.819223 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0311 00:55:54.819235 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0311 
00:55:54.819243 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0311 00:55:54.819248 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0311 00:55:54.819254 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0311 00:55:54.821179 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-11T00:55:54Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 1m20s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c9f3a6d71079f1717c07e0b026f34d1bfd6391a5bb6745bcdab535d5290e2aa3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T00:54:07Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://edfc2acc25990d1ccbca879e186fb9dfc0a6664fa1df5cee62958ff52b217748\\\",\\\"image\\\":\\\"quay.io/openshift-release-d
ev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://edfc2acc25990d1ccbca879e186fb9dfc0a6664fa1df5cee62958ff52b217748\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T00:54:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T00:54:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T00:54:04Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T00:56:54Z is after 2025-08-24T17:21:41Z" Mar 11 00:56:54 crc kubenswrapper[4744]: I0311 00:56:54.317916 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f6d4e3bc-7c73-40d6-9bde-5178882a794d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:54:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:54:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:54:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:54:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:54:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e3db74ff70ae81764297d0f56ae2c56f33d40d9e0025aacba1f56045eae524b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T00:54:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://779c97b306c3647569a5bade5ab4a8ca49f1fa84ded5fa830235ab1d4a039038\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962
a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://779c97b306c3647569a5bade5ab4a8ca49f1fa84ded5fa830235ab1d4a039038\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T00:54:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T00:54:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T00:54:04Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T00:56:54Z is after 2025-08-24T17:21:41Z" Mar 11 00:56:54 crc kubenswrapper[4744]: I0311 00:56:54.340155 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:13Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:13Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://064a29fa0ed4ecddcf11aa3175a3fc935147341c372af46f94eee470a9ba363c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T00:56:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7c4d16fc0af970b13be0d202345567993d215bf342174b1b568c4b4ac26c0017\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T00:56:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T00:56:54Z is after 2025-08-24T17:21:41Z" Mar 11 00:56:54 crc kubenswrapper[4744]: I0311 00:56:54.361769 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-tdnf7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7aeb1578-fe93-4bec-8f43-17d0923fa5c0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:11Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:11Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-th54b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-th54b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T00:56:11Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-tdnf7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T00:56:54Z is after 2025-08-24T17:21:41Z" Mar 11 00:56:54 crc 
kubenswrapper[4744]: I0311 00:56:54.973931 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-tdnf7" Mar 11 00:56:54 crc kubenswrapper[4744]: E0311 00:56:54.974179 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-tdnf7" podUID="7aeb1578-fe93-4bec-8f43-17d0923fa5c0" Mar 11 00:56:55 crc kubenswrapper[4744]: I0311 00:56:55.973961 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 11 00:56:55 crc kubenswrapper[4744]: I0311 00:56:55.974139 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 11 00:56:55 crc kubenswrapper[4744]: E0311 00:56:55.974191 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 11 00:56:55 crc kubenswrapper[4744]: I0311 00:56:55.974257 4744 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 11 00:56:55 crc kubenswrapper[4744]: E0311 00:56:55.974415 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 11 00:56:55 crc kubenswrapper[4744]: E0311 00:56:55.974623 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 11 00:56:56 crc kubenswrapper[4744]: I0311 00:56:56.974733 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-tdnf7" Mar 11 00:56:56 crc kubenswrapper[4744]: E0311 00:56:56.974936 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-tdnf7" podUID="7aeb1578-fe93-4bec-8f43-17d0923fa5c0" Mar 11 00:56:56 crc kubenswrapper[4744]: I0311 00:56:56.975686 4744 scope.go:117] "RemoveContainer" containerID="4b2b952a5490bff5028c83fdfb0942e84bac782d470cede57ec6a31f483a407e" Mar 11 00:56:56 crc kubenswrapper[4744]: E0311 00:56:56.975938 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 1m20s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 11 00:56:57 crc kubenswrapper[4744]: I0311 00:56:57.974070 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 11 00:56:57 crc kubenswrapper[4744]: I0311 00:56:57.974155 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 11 00:56:57 crc kubenswrapper[4744]: E0311 00:56:57.974276 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 11 00:56:57 crc kubenswrapper[4744]: E0311 00:56:57.974503 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 11 00:56:57 crc kubenswrapper[4744]: I0311 00:56:57.974090 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 11 00:56:57 crc kubenswrapper[4744]: E0311 00:56:57.974693 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 11 00:56:58 crc kubenswrapper[4744]: I0311 00:56:58.973916 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-tdnf7" Mar 11 00:56:58 crc kubenswrapper[4744]: E0311 00:56:58.974149 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-tdnf7" podUID="7aeb1578-fe93-4bec-8f43-17d0923fa5c0" Mar 11 00:56:59 crc kubenswrapper[4744]: E0311 00:56:59.089650 4744 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 11 00:56:59 crc kubenswrapper[4744]: I0311 00:56:59.974139 4744 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 11 00:56:59 crc kubenswrapper[4744]: E0311 00:56:59.974846 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 11 00:56:59 crc kubenswrapper[4744]: I0311 00:56:59.974277 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 11 00:56:59 crc kubenswrapper[4744]: E0311 00:56:59.975422 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 11 00:56:59 crc kubenswrapper[4744]: I0311 00:56:59.974139 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 11 00:56:59 crc kubenswrapper[4744]: E0311 00:56:59.975967 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 11 00:56:59 crc kubenswrapper[4744]: I0311 00:56:59.984032 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-xlclh_e16bf0f3-533b-4114-89c6-195a85273e98/kube-multus/0.log" Mar 11 00:56:59 crc kubenswrapper[4744]: I0311 00:56:59.984132 4744 generic.go:334] "Generic (PLEG): container finished" podID="e16bf0f3-533b-4114-89c6-195a85273e98" containerID="3cf2ef2701f6c6e8715d3c41c3d7c4d704d6f03989831a0c3d34e92ab7a6dd50" exitCode=1 Mar 11 00:56:59 crc kubenswrapper[4744]: I0311 00:56:59.984193 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-xlclh" event={"ID":"e16bf0f3-533b-4114-89c6-195a85273e98","Type":"ContainerDied","Data":"3cf2ef2701f6c6e8715d3c41c3d7c4d704d6f03989831a0c3d34e92ab7a6dd50"} Mar 11 00:56:59 crc kubenswrapper[4744]: I0311 00:56:59.984998 4744 scope.go:117] "RemoveContainer" containerID="3cf2ef2701f6c6e8715d3c41c3d7c4d704d6f03989831a0c3d34e92ab7a6dd50" Mar 11 00:57:00 crc kubenswrapper[4744]: I0311 00:57:00.012206 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:13Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:13Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://064a29fa0ed4ecddcf11aa3175a3fc935147341c372af46f94eee470a9ba363c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T00:56:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7c4d16fc0af970b13be0d202345567993d215bf342174b1b568c4b4ac26c0017\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T00:56:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T00:57:00Z is after 2025-08-24T17:21:41Z" Mar 11 00:57:00 crc kubenswrapper[4744]: I0311 00:57:00.049564 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-tdnf7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7aeb1578-fe93-4bec-8f43-17d0923fa5c0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:11Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:11Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-th54b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-th54b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T00:56:11Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-tdnf7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T00:57:00Z is after 2025-08-24T17:21:41Z" Mar 11 00:57:00 crc 
kubenswrapper[4744]: I0311 00:57:00.074179 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1a87b2ce-541f-40ba-a568-93914e7dea62\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:54:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:54:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:55:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:55:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:54:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://19989b1018d06da0230a62c4c231865e9d16267d0eb570a409da006b246005b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://832be2ac0e5b4b69727f022a0663974a495e008d3a781fa4d237d3347d39c154\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-11T00:54:37Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start 
--config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager -v=2\\\\nI0311 00:54:06.849622 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. Worst graceful lease acquisition is {26s}.\\\\nI0311 00:54:06.852083 1 observer_polling.go:159] Starting file observer\\\\nI0311 00:54:06.910593 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0311 00:54:06.914052 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nF0311 00:54:37.381234 1 cmd.go:179] failed checking apiserver connectivity: Get \\\\\\\"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/openshift-kube-controller-manager/leases/cluster-policy-controller-lock\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T00:54:36Z is after 
2026-02-23T05:33:13Z\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-11T00:54:05Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T00:54:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://eb1e3ef7609aa323e622249594eb4ac5cd5cacfa496d8189b3f612e29cb11773\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T00:54:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://82831ac0a065cf2ea80ca05df203fc6de89ad279faae95364ab2519b689453d8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T00:54:05Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ca4d36508e362c9b7f7365ade7dda3dec0c70f9df1be158a034aeaaa772c797e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T00:54:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T00:54:04Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T00:57:00Z is after 2025-08-24T17:21:41Z" Mar 11 00:57:00 crc kubenswrapper[4744]: I0311 00:57:00.095851 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e8d341ff-c992-4426-acea-a34e4a61fa6f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:54:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:54:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:54:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:54:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:54:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://01519860f21aa252a579dd461cd32233ed13bfb46d851c1c19410e2632ad4389\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T00:54:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0b3b81c668c0cd9ccdc04d171476dbbbe2bca625de84ad8aff48be69bbaa46b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T00:54:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7fba0fd93d322bd73e9140c40efe4dd1469525a5d6d26a0d1e32ba8e0920a2a4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T00:54:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9690b251bd56f0d091b5ae52e71e94528ba6afd0e61919398662fdadd4a9f89f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://9690b251bd56f0d091b5ae52e71e94528ba6afd0e61919398662fdadd4a9f89f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T00:54:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T00:54:05Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T00:54:04Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T00:57:00Z is after 2025-08-24T17:21:41Z" Mar 11 00:57:00 crc kubenswrapper[4744]: I0311 00:57:00.114187 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:16Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:16Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bff1516676048ec93cb3569f8720eb042036b92afc1c65f1c58b15fb604f58b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T00:56:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-11T00:57:00Z is after 2025-08-24T17:21:41Z" Mar 11 00:57:00 crc kubenswrapper[4744]: I0311 00:57:00.132271 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:11Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T00:57:00Z is after 2025-08-24T17:21:41Z" Mar 11 00:57:00 crc kubenswrapper[4744]: I0311 00:57:00.146972 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-8z8gf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ad8329c6-d511-446f-b617-99778d12b878\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://27268ce5ab229a693eeb80e4dfc38c14f18ad32dc201ace682b5516070c52fde\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T00:56:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7gczk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T00:56:11Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-8z8gf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T00:57:00Z is after 2025-08-24T17:21:41Z" Mar 11 00:57:00 crc kubenswrapper[4744]: I0311 00:57:00.160955 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-678nx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a15dc7ac-7c34-4135-b6eb-a85122800ce9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d3d16f6053ae793fbf194d5cbb556a8603bae74b52c9f07aa937dbfd654e3894\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshif
t-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T00:56:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d9t5q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9d5c92e7037925f24a5b88f538d4822fae94b65ea5f07960f3e465148975af4e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T00:56:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d9t5q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T00:56:11Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-678nx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call 
webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T00:57:00Z is after 2025-08-24T17:21:41Z" Mar 11 00:57:00 crc kubenswrapper[4744]: I0311 00:57:00.176719 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:12Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:12Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://80f7b8faf68f817bd772c7988a46ffcc286551806463dc59776e5c6e18deb8f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T00:56:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kub
ernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T00:57:00Z is after 2025-08-24T17:21:41Z" Mar 11 00:57:00 crc kubenswrapper[4744]: I0311 00:57:00.193096 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:11Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T00:57:00Z is after 2025-08-24T17:21:41Z" Mar 11 00:57:00 crc kubenswrapper[4744]: I0311 00:57:00.208812 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-xlclh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e16bf0f3-533b-4114-89c6-195a85273e98\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:59Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:59Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3cf2ef2701f6c6e8715d3c41c3d7c4d704d6f03989831a0c3d34e92ab7a6dd50\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3cf2ef2701f6c6e8715d3c41c3d7c4d704d6f03989831a0c3d34e92ab7a6dd50\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-11T00:56:59Z\\\",\\\"message\\\":\\\"2026-03-11T00:56:13+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_f2f2d951-393c-4c01-b807-b3f13342c8c8\\\\n2026-03-11T00:56:13+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_f2f2d951-393c-4c01-b807-b3f13342c8c8 to /host/opt/cni/bin/\\\\n2026-03-11T00:56:14Z [verbose] multus-daemon started\\\\n2026-03-11T00:56:14Z [verbose] Readiness Indicator file check\\\\n2026-03-11T00:56:59Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. 
pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-11T00:56:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pgqmk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\
\":\\\"2026-03-11T00:56:11Z\\\"}}\" for pod \"openshift-multus\"/\"multus-xlclh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T00:57:00Z is after 2025-08-24T17:21:41Z" Mar 11 00:57:00 crc kubenswrapper[4744]: I0311 00:57:00.221697 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-6ghqv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e146399d-0685-4ea8-96e4-c76c9478a23a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9fd8ff476d73855cbb3da64ae999f54bad2263813a70dba1aa61507b53102d42\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\
\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T00:56:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8h2vm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T00:56:11Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-6ghqv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T00:57:00Z is after 2025-08-24T17:21:41Z" Mar 11 00:57:00 crc kubenswrapper[4744]: I0311 00:57:00.238932 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-sj4cl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ac70f681-9bcf-4f8c-a175-ed7d4e9da471\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fafb45aea7f19d6db5ddf62115fb453eebda33f0be934b59e40083cd8387c896\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T00:56:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqrjx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://09583de802dbeffc06642379e30e628cf6541f82d5a11caa554cdf17e408476b\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://09583de802dbeffc06642379e30e628cf6541f82d5a11caa554cdf17e408476b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T00:56:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T00:56:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqrjx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9a370bc02fc3d53e08d44ce24a85d1e5b20f06a4933b61c8fef04aff8cfe6081\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9a370bc02fc3d53e08d44ce24a85d1e5b20f06a4933b61c8fef04aff8cfe6081\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T00:56:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T00:56:14Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqrjx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6ad4e761e6f478df27c55306ecb235f8cbbe0954433cbe7a029d585d609a08db\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6ad4e761e6f478df27c55306ecb235f8cbbe0954433cbe7a029d585d609a08db\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T00:56:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T00:56:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqrjx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://96ebe
5cbd3477966cb49d08397fee1954529001d0bede507c64ca71c25786a87\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://96ebe5cbd3477966cb49d08397fee1954529001d0bede507c64ca71c25786a87\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T00:56:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T00:56:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqrjx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cdbd8c4d24a706d746c84147c617f511da67e77e20235bb94b3b365552ef8c23\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cdbd8c4d24a706d746c84147c617f511da67e77e20235bb94b3b365552ef8c23\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T00:56:18Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-03-11T00:56:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqrjx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://81090de2003b39fdaee05a64219b7ebefa57068b2682e821d55ea9615ba67d4b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://81090de2003b39fdaee05a64219b7ebefa57068b2682e821d55ea9615ba67d4b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T00:56:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T00:56:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqrjx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T00:56:11Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-sj4cl\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T00:57:00Z is after 2025-08-24T17:21:41Z" Mar 11 00:57:00 crc kubenswrapper[4744]: I0311 00:57:00.270924 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-78fcc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6ff04e11-e747-44c5-b049-371a5d422157\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:11Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:11Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8132bd89cd281c66b2735b3fa85c3fc773e7140d6aaeaefe0c0a2c3e3f743080\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T00:56:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fr9zj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e37e770ef43eabf7e0bb162d7e700eaa2d61c7ceca9737de5872c1caeec06774\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T00:56:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fr9zj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://98ba07651ae84a42f92411fee85a15562e325c61ff2d8fcba4962d21c51fce73\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T00:56:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fr9zj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d7cd308cadb08ea56bbc709c6401ab1d3eec24f0414a7ee1c57188c5e60adffe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T00:56:14Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fr9zj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://84d4ada431e7a91cc86b5cb9bba4316c233c5f666f21bce1d8b6df0019b6ad76\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T00:56:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fr9zj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://27bc44383b34aea31ffcebae0f1f464c31918edc0c37f01d6bd66166e34fd4a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T00:56:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fr9zj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://92b1336c58d3eb24389590807921be9410d32af1b1416d2091b4d515cfd6dfcc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://92b1336c58d3eb24389590807921be9410d32af1b1416d2091b4d515cfd6dfcc\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-11T00:56:38Z\\\",\\\"message\\\":\\\"hift-dns/dns-default_UDP_node_router+switch_crc options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[udp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.4.10:53:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == 
{4c1be812-05d3-4f45-91b5-a853a5c8de71}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0311 00:56:37.981791 7053 services_controller.go:356] Processing sync for service default/openshift for network=default\\\\nI0311 00:56:37.982673 7053 services_controller.go:360] Finished syncing service openshift on namespace default for network=default : 880.837µs\\\\nI0311 00:56:37.982410 7053 factory.go:1336] Added *v1.EgressIP event handler 8\\\\nI0311 00:56:37.982645 7053 services_controller.go:360] Finished syncing service kube-controller-manager on namespace openshift-kube-controller-manager for network=default : 4.472262ms\\\\nI0311 00:56:37.983070 7053 factory.go:1336] Added *v1.EgressFirewall event handler 9\\\\nI0311 00:56:37.983162 7053 controller.go:132] Adding controller ef_node_controller event handlers\\\\nI0311 00:56:37.983204 7053 ovnkube.go:599] Stopped ovnkube\\\\nI0311 00:56:37.983237 7053 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF0311 00:56:37.983343 7053 ovnkube.go:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-11T00:56:37Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-78fcc_openshift-ovn-kubernetes(6ff04e11-e747-44c5-b049-371a5d422157)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fr9zj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://742e3b6ac0accae72e9285636ab0920db1d3e0625d6093adf1606f57b484b139\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T00:56:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fr9zj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://524c79ee3bb53b187166cbc6bf19cbb3311ab561608a1d3c819288c1fb684481\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://524c79ee3bb53b1871
66cbc6bf19cbb3311ab561608a1d3c819288c1fb684481\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T00:56:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T00:56:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fr9zj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T00:56:11Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-78fcc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T00:57:00Z is after 2025-08-24T17:21:41Z" Mar 11 00:57:00 crc kubenswrapper[4744]: I0311 00:57:00.290068 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e522f1d8-5329-414c-88d5-79e6f3b615be\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:54:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:54:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:54:04Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:54:04Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:54:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://31713fedc28cf71b8df25e22000768f5d00e6de1b228a632aa3c51aea11eea7c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T00:54:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e702350d7b926ec4df6853a20470ea7515f1e1c98720db1a4d672ea4e03e82fa\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"
restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T00:54:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2d93a4815010e9ee724175731b03bf82804921c5bbb9fe8be236057f90057f28\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T00:54:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4b2b952a5490bff5028c83fdfb0942e84bac782d470cede57ec6a31f483a407e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4b2b952a5490bff5028c83fdfb0942e84bac782d470cede57ec6a31f483a407e\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-11T00:55:54Z\\\",\\\"message\\\":\\\"le observer\\\\nW0311 00:55:54.578734 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0311 00:55:54.579118 1 builder.go:304] check-endpoints version 
4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0311 00:55:54.580623 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3889625742/tls.crt::/tmp/serving-cert-3889625742/tls.key\\\\\\\"\\\\nI0311 00:55:54.806131 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0311 00:55:54.812332 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0311 00:55:54.812369 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0311 00:55:54.812401 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0311 00:55:54.812411 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0311 00:55:54.819161 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0311 00:55:54.819196 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0311 00:55:54.819211 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0311 00:55:54.819223 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0311 00:55:54.819235 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0311 00:55:54.819243 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0311 00:55:54.819248 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0311 00:55:54.819254 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0311 00:55:54.821179 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-11T00:55:54Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 1m20s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c9f3a6d71079f1717c07e0b026f34d1bfd6391a5bb6745bcdab535d5290e2aa3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T00:54:07Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://edfc2acc25990d1ccbca879e186fb9dfc0a6664fa1df5cee62958ff52b217748\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e
dfc2acc25990d1ccbca879e186fb9dfc0a6664fa1df5cee62958ff52b217748\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T00:54:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T00:54:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T00:54:04Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T00:57:00Z is after 2025-08-24T17:21:41Z" Mar 11 00:57:00 crc kubenswrapper[4744]: I0311 00:57:00.307315 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f6d4e3bc-7c73-40d6-9bde-5178882a794d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:54:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:54:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:54:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:54:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:54:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e3db74ff70ae81764297d0f56ae2c56f33d40d9e0025aacba1f56045eae524b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T00:54:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://779c97b306c3647569a5bade5ab4a8ca49f1fa84ded5fa830235ab1d4a039038\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962
a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://779c97b306c3647569a5bade5ab4a8ca49f1fa84ded5fa830235ab1d4a039038\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T00:54:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T00:54:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T00:54:04Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T00:57:00Z is after 2025-08-24T17:21:41Z" Mar 11 00:57:00 crc kubenswrapper[4744]: I0311 00:57:00.326083 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:11Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T00:57:00Z is after 2025-08-24T17:21:41Z" Mar 11 00:57:00 crc kubenswrapper[4744]: I0311 00:57:00.343075 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-rplbw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"92f4c9df-1087-4820-8e07-1120f02df454\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://34eabb0e2af63857d40e20376f7979131fef683885949c1ea75e0a4949968872\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T00:56:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lkpvs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2bfa60a94390dd23da621dd81523dde8107fe
6f3beba45ad1b7a96c0d6cd4f56\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T00:56:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lkpvs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T00:56:11Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-rplbw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T00:57:00Z is after 2025-08-24T17:21:41Z" Mar 11 00:57:00 crc kubenswrapper[4744]: I0311 00:57:00.974401 4744 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-tdnf7" Mar 11 00:57:00 crc kubenswrapper[4744]: E0311 00:57:00.974642 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-tdnf7" podUID="7aeb1578-fe93-4bec-8f43-17d0923fa5c0" Mar 11 00:57:00 crc kubenswrapper[4744]: I0311 00:57:00.992113 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-xlclh_e16bf0f3-533b-4114-89c6-195a85273e98/kube-multus/0.log" Mar 11 00:57:00 crc kubenswrapper[4744]: I0311 00:57:00.992209 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-xlclh" event={"ID":"e16bf0f3-533b-4114-89c6-195a85273e98","Type":"ContainerStarted","Data":"0289bdbe344d516cb0576b0b739e5752ec2f754bd78918b1d81f20d880ebd1f5"} Mar 11 00:57:01 crc kubenswrapper[4744]: I0311 00:57:01.019141 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e522f1d8-5329-414c-88d5-79e6f3b615be\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:54:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:54:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:54:04Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:54:04Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:54:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://31713fedc28cf71b8df25e22000768f5d00e6de1b228a632aa3c51aea11eea7c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T00:54:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e702350d7b926ec4df6853a20470ea7515f1e1c98720db1a4d672ea4e03e82fa\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"
restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T00:54:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2d93a4815010e9ee724175731b03bf82804921c5bbb9fe8be236057f90057f28\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T00:54:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4b2b952a5490bff5028c83fdfb0942e84bac782d470cede57ec6a31f483a407e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4b2b952a5490bff5028c83fdfb0942e84bac782d470cede57ec6a31f483a407e\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-11T00:55:54Z\\\",\\\"message\\\":\\\"le observer\\\\nW0311 00:55:54.578734 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0311 00:55:54.579118 1 builder.go:304] check-endpoints version 
4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0311 00:55:54.580623 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3889625742/tls.crt::/tmp/serving-cert-3889625742/tls.key\\\\\\\"\\\\nI0311 00:55:54.806131 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0311 00:55:54.812332 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0311 00:55:54.812369 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0311 00:55:54.812401 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0311 00:55:54.812411 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0311 00:55:54.819161 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0311 00:55:54.819196 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0311 00:55:54.819211 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0311 00:55:54.819223 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0311 00:55:54.819235 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0311 00:55:54.819243 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0311 00:55:54.819248 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0311 00:55:54.819254 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0311 00:55:54.821179 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-11T00:55:54Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 1m20s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c9f3a6d71079f1717c07e0b026f34d1bfd6391a5bb6745bcdab535d5290e2aa3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T00:54:07Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://edfc2acc25990d1ccbca879e186fb9dfc0a6664fa1df5cee62958ff52b217748\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e
dfc2acc25990d1ccbca879e186fb9dfc0a6664fa1df5cee62958ff52b217748\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T00:54:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T00:54:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T00:54:04Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T00:57:01Z is after 2025-08-24T17:21:41Z" Mar 11 00:57:01 crc kubenswrapper[4744]: I0311 00:57:01.036289 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f6d4e3bc-7c73-40d6-9bde-5178882a794d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:54:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:54:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:54:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:54:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:54:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e3db74ff70ae81764297d0f56ae2c56f33d40d9e0025aacba1f56045eae524b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T00:54:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://779c97b306c3647569a5bade5ab4a8ca49f1fa84ded5fa830235ab1d4a039038\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962
a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://779c97b306c3647569a5bade5ab4a8ca49f1fa84ded5fa830235ab1d4a039038\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T00:54:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T00:54:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T00:54:04Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T00:57:01Z is after 2025-08-24T17:21:41Z" Mar 11 00:57:01 crc kubenswrapper[4744]: I0311 00:57:01.060966 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:11Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T00:57:01Z is after 2025-08-24T17:21:41Z" Mar 11 00:57:01 crc kubenswrapper[4744]: I0311 00:57:01.079970 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-rplbw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"92f4c9df-1087-4820-8e07-1120f02df454\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://34eabb0e2af63857d40e20376f7979131fef683885949c1ea75e0a4949968872\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T00:56:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lkpvs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2bfa60a94390dd23da621dd81523dde8107fe
6f3beba45ad1b7a96c0d6cd4f56\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T00:56:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lkpvs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T00:56:11Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-rplbw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T00:57:01Z is after 2025-08-24T17:21:41Z" Mar 11 00:57:01 crc kubenswrapper[4744]: I0311 00:57:01.100052 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:13Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:13Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://064a29fa0ed4ecddcf11aa3175a3fc935147341c372af46f94eee470a9ba363c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T00:56:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7c4d16fc0af970b13be0d202345567993d215bf342174b1b568c4b4ac26c0017\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T00:56:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T00:57:01Z is after 2025-08-24T17:21:41Z" Mar 11 00:57:01 crc kubenswrapper[4744]: I0311 00:57:01.117391 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-tdnf7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7aeb1578-fe93-4bec-8f43-17d0923fa5c0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:11Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:11Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-th54b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-th54b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T00:56:11Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-tdnf7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T00:57:01Z is after 2025-08-24T17:21:41Z" Mar 11 00:57:01 crc 
kubenswrapper[4744]: I0311 00:57:01.135724 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1a87b2ce-541f-40ba-a568-93914e7dea62\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:54:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:54:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:55:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:55:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:54:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://19989b1018d06da0230a62c4c231865e9d16267d0eb570a409da006b246005b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://832be2ac0e5b4b69727f022a0663974a495e008d3a781fa4d237d3347d39c154\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-11T00:54:37Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start 
--config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager -v=2\\\\nI0311 00:54:06.849622 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. Worst graceful lease acquisition is {26s}.\\\\nI0311 00:54:06.852083 1 observer_polling.go:159] Starting file observer\\\\nI0311 00:54:06.910593 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0311 00:54:06.914052 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nF0311 00:54:37.381234 1 cmd.go:179] failed checking apiserver connectivity: Get \\\\\\\"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/openshift-kube-controller-manager/leases/cluster-policy-controller-lock\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T00:54:36Z is after 
2026-02-23T05:33:13Z\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-11T00:54:05Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T00:54:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://eb1e3ef7609aa323e622249594eb4ac5cd5cacfa496d8189b3f612e29cb11773\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T00:54:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://82831ac0a065cf2ea80ca05df203fc6de89ad279faae95364ab2519b689453d8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T00:54:05Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ca4d36508e362c9b7f7365ade7dda3dec0c70f9df1be158a034aeaaa772c797e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T00:54:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T00:54:04Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T00:57:01Z is after 2025-08-24T17:21:41Z" Mar 11 00:57:01 crc kubenswrapper[4744]: I0311 00:57:01.151805 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e8d341ff-c992-4426-acea-a34e4a61fa6f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:54:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:54:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:54:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:54:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:54:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://01519860f21aa252a579dd461cd32233ed13bfb46d851c1c19410e2632ad4389\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T00:54:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0b3b81c668c0cd9ccdc04d171476dbbbe2bca625de84ad8aff48be69bbaa46b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T00:54:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7fba0fd93d322bd73e9140c40efe4dd1469525a5d6d26a0d1e32ba8e0920a2a4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T00:54:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9690b251bd56f0d091b5ae52e71e94528ba6afd0e61919398662fdadd4a9f89f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://9690b251bd56f0d091b5ae52e71e94528ba6afd0e61919398662fdadd4a9f89f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T00:54:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T00:54:05Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T00:54:04Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T00:57:01Z is after 2025-08-24T17:21:41Z" Mar 11 00:57:01 crc kubenswrapper[4744]: I0311 00:57:01.167658 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:16Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:16Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bff1516676048ec93cb3569f8720eb042036b92afc1c65f1c58b15fb604f58b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T00:56:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-11T00:57:01Z is after 2025-08-24T17:21:41Z" Mar 11 00:57:01 crc kubenswrapper[4744]: I0311 00:57:01.187573 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:11Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T00:57:01Z is after 2025-08-24T17:21:41Z" Mar 11 00:57:01 crc kubenswrapper[4744]: I0311 00:57:01.201006 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-8z8gf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ad8329c6-d511-446f-b617-99778d12b878\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://27268ce5ab229a693eeb80e4dfc38c14f18ad32dc201ace682b5516070c52fde\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T00:56:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7gczk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T00:56:11Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-8z8gf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T00:57:01Z is after 2025-08-24T17:21:41Z" Mar 11 00:57:01 crc kubenswrapper[4744]: I0311 00:57:01.216142 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-678nx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a15dc7ac-7c34-4135-b6eb-a85122800ce9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d3d16f6053ae793fbf194d5cbb556a8603bae74b52c9f07aa937dbfd654e3894\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshif
t-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T00:56:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d9t5q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9d5c92e7037925f24a5b88f538d4822fae94b65ea5f07960f3e465148975af4e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T00:56:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d9t5q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T00:56:11Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-678nx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call 
webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T00:57:01Z is after 2025-08-24T17:21:41Z" Mar 11 00:57:01 crc kubenswrapper[4744]: I0311 00:57:01.237803 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:12Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:12Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://80f7b8faf68f817bd772c7988a46ffcc286551806463dc59776e5c6e18deb8f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T00:56:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kub
ernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T00:57:01Z is after 2025-08-24T17:21:41Z" Mar 11 00:57:01 crc kubenswrapper[4744]: I0311 00:57:01.256854 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:11Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T00:57:01Z is after 2025-08-24T17:21:41Z" Mar 11 00:57:01 crc kubenswrapper[4744]: I0311 00:57:01.275686 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-xlclh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e16bf0f3-533b-4114-89c6-195a85273e98\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:57:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:57:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0289bdbe344d516cb0576b0b739e5752ec2f754bd78918b1d81f20d880ebd1f5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3cf2ef2701f6c6e8715d3c41c3d7c4d704d6f03989831a0c3d34e92ab7a6dd50\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-11T00:56:59Z\\\",\\\"message\\\":\\\"2026-03-11T00:56:13+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_f2f2d951-393c-4c01-b807-b3f13342c8c8\\\\n2026-03-11T00:56:13+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_f2f2d951-393c-4c01-b807-b3f13342c8c8 to /host/opt/cni/bin/\\\\n2026-03-11T00:56:14Z [verbose] multus-daemon started\\\\n2026-03-11T00:56:14Z [verbose] 
Readiness Indicator file check\\\\n2026-03-11T00:56:59Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-11T00:56:12Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T00:57:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/
var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pgqmk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T00:56:11Z\\\"}}\" for pod \"openshift-multus\"/\"multus-xlclh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T00:57:01Z is after 2025-08-24T17:21:41Z" Mar 11 00:57:01 crc kubenswrapper[4744]: I0311 00:57:01.292229 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-6ghqv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e146399d-0685-4ea8-96e4-c76c9478a23a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9fd8ff476d73855cbb3da6
4ae999f54bad2263813a70dba1aa61507b53102d42\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T00:56:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8h2vm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T00:56:11Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-6ghqv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T00:57:01Z is after 2025-08-24T17:21:41Z" Mar 11 00:57:01 crc kubenswrapper[4744]: I0311 00:57:01.316487 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-sj4cl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ac70f681-9bcf-4f8c-a175-ed7d4e9da471\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fafb45aea7f19d6db5ddf62115fb453eebda33f0be934b59e40083cd8387c896\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T00:56:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqrjx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://09583de802dbeffc06642379e30e628cf6541f82d5a11caa554cdf17e408476b\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://09583de802dbeffc06642379e30e628cf6541f82d5a11caa554cdf17e408476b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T00:56:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T00:56:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqrjx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9a370bc02fc3d53e08d44ce24a85d1e5b20f06a4933b61c8fef04aff8cfe6081\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9a370bc02fc3d53e08d44ce24a85d1e5b20f06a4933b61c8fef04aff8cfe6081\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T00:56:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T00:56:14Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqrjx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6ad4e761e6f478df27c55306ecb235f8cbbe0954433cbe7a029d585d609a08db\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6ad4e761e6f478df27c55306ecb235f8cbbe0954433cbe7a029d585d609a08db\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T00:56:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T00:56:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqrjx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://96ebe
5cbd3477966cb49d08397fee1954529001d0bede507c64ca71c25786a87\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://96ebe5cbd3477966cb49d08397fee1954529001d0bede507c64ca71c25786a87\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T00:56:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T00:56:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqrjx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cdbd8c4d24a706d746c84147c617f511da67e77e20235bb94b3b365552ef8c23\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cdbd8c4d24a706d746c84147c617f511da67e77e20235bb94b3b365552ef8c23\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T00:56:18Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-03-11T00:56:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqrjx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://81090de2003b39fdaee05a64219b7ebefa57068b2682e821d55ea9615ba67d4b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://81090de2003b39fdaee05a64219b7ebefa57068b2682e821d55ea9615ba67d4b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T00:56:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T00:56:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqrjx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T00:56:11Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-sj4cl\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T00:57:01Z is after 2025-08-24T17:21:41Z" Mar 11 00:57:01 crc kubenswrapper[4744]: I0311 00:57:01.348851 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-78fcc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6ff04e11-e747-44c5-b049-371a5d422157\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:11Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:11Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8132bd89cd281c66b2735b3fa85c3fc773e7140d6aaeaefe0c0a2c3e3f743080\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T00:56:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fr9zj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e37e770ef43eabf7e0bb162d7e700eaa2d61c7ceca9737de5872c1caeec06774\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T00:56:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fr9zj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://98ba07651ae84a42f92411fee85a15562e325c61ff2d8fcba4962d21c51fce73\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T00:56:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fr9zj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d7cd308cadb08ea56bbc709c6401ab1d3eec24f0414a7ee1c57188c5e60adffe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T00:56:14Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fr9zj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://84d4ada431e7a91cc86b5cb9bba4316c233c5f666f21bce1d8b6df0019b6ad76\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T00:56:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fr9zj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://27bc44383b34aea31ffcebae0f1f464c31918edc0c37f01d6bd66166e34fd4a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T00:56:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fr9zj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://92b1336c58d3eb24389590807921be9410d32af1b1416d2091b4d515cfd6dfcc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://92b1336c58d3eb24389590807921be9410d32af1b1416d2091b4d515cfd6dfcc\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-11T00:56:38Z\\\",\\\"message\\\":\\\"hift-dns/dns-default_UDP_node_router+switch_crc options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[udp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.4.10:53:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == 
{4c1be812-05d3-4f45-91b5-a853a5c8de71}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0311 00:56:37.981791 7053 services_controller.go:356] Processing sync for service default/openshift for network=default\\\\nI0311 00:56:37.982673 7053 services_controller.go:360] Finished syncing service openshift on namespace default for network=default : 880.837µs\\\\nI0311 00:56:37.982410 7053 factory.go:1336] Added *v1.EgressIP event handler 8\\\\nI0311 00:56:37.982645 7053 services_controller.go:360] Finished syncing service kube-controller-manager on namespace openshift-kube-controller-manager for network=default : 4.472262ms\\\\nI0311 00:56:37.983070 7053 factory.go:1336] Added *v1.EgressFirewall event handler 9\\\\nI0311 00:56:37.983162 7053 controller.go:132] Adding controller ef_node_controller event handlers\\\\nI0311 00:56:37.983204 7053 ovnkube.go:599] Stopped ovnkube\\\\nI0311 00:56:37.983237 7053 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF0311 00:56:37.983343 7053 ovnkube.go:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-11T00:56:37Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-78fcc_openshift-ovn-kubernetes(6ff04e11-e747-44c5-b049-371a5d422157)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fr9zj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://742e3b6ac0accae72e9285636ab0920db1d3e0625d6093adf1606f57b484b139\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T00:56:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fr9zj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://524c79ee3bb53b187166cbc6bf19cbb3311ab561608a1d3c819288c1fb684481\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://524c79ee3bb53b1871
66cbc6bf19cbb3311ab561608a1d3c819288c1fb684481\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T00:56:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T00:56:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fr9zj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T00:56:11Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-78fcc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T00:57:01Z is after 2025-08-24T17:21:41Z" Mar 11 00:57:01 crc kubenswrapper[4744]: I0311 00:57:01.974888 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 11 00:57:01 crc kubenswrapper[4744]: I0311 00:57:01.975014 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 11 00:57:01 crc kubenswrapper[4744]: I0311 00:57:01.974888 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 11 00:57:01 crc kubenswrapper[4744]: E0311 00:57:01.975145 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 11 00:57:01 crc kubenswrapper[4744]: E0311 00:57:01.975249 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 11 00:57:01 crc kubenswrapper[4744]: E0311 00:57:01.975403 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 11 00:57:02 crc kubenswrapper[4744]: I0311 00:57:02.974374 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-tdnf7" Mar 11 00:57:02 crc kubenswrapper[4744]: E0311 00:57:02.974846 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-tdnf7" podUID="7aeb1578-fe93-4bec-8f43-17d0923fa5c0" Mar 11 00:57:02 crc kubenswrapper[4744]: I0311 00:57:02.975039 4744 scope.go:117] "RemoveContainer" containerID="92b1336c58d3eb24389590807921be9410d32af1b1416d2091b4d515cfd6dfcc" Mar 11 00:57:03 crc kubenswrapper[4744]: I0311 00:57:03.895481 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 00:57:03 crc kubenswrapper[4744]: I0311 00:57:03.895964 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 00:57:03 crc kubenswrapper[4744]: I0311 00:57:03.895994 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 00:57:03 crc kubenswrapper[4744]: I0311 00:57:03.896071 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 00:57:03 crc kubenswrapper[4744]: I0311 00:57:03.896160 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T00:57:03Z","lastTransitionTime":"2026-03-11T00:57:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 11 00:57:03 crc kubenswrapper[4744]: E0311 00:57:03.922636 4744 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-11T00:57:03Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-11T00:57:03Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-11T00:57:03Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-11T00:57:03Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-11T00:57:03Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-11T00:57:03Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-11T00:57:03Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-11T00:57:03Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"51e320bc-e184-46b0-b151-baf1fef55472\\\",\\\"systemUUID\\\":\\\"b27597af-c36d-4084-a073-6dfdbb017181\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T00:57:03Z is after 2025-08-24T17:21:41Z" Mar 11 00:57:03 crc kubenswrapper[4744]: I0311 00:57:03.926997 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 00:57:03 crc kubenswrapper[4744]: I0311 00:57:03.927046 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 00:57:03 crc kubenswrapper[4744]: I0311 00:57:03.927059 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 00:57:03 crc kubenswrapper[4744]: I0311 00:57:03.927078 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 00:57:03 crc kubenswrapper[4744]: I0311 00:57:03.927091 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T00:57:03Z","lastTransitionTime":"2026-03-11T00:57:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 11 00:57:03 crc kubenswrapper[4744]: E0311 00:57:03.950470 4744 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-11T00:57:03Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-11T00:57:03Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-11T00:57:03Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-11T00:57:03Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-11T00:57:03Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-11T00:57:03Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-11T00:57:03Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-11T00:57:03Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"51e320bc-e184-46b0-b151-baf1fef55472\\\",\\\"systemUUID\\\":\\\"b27597af-c36d-4084-a073-6dfdbb017181\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T00:57:03Z is after 2025-08-24T17:21:41Z" Mar 11 00:57:03 crc kubenswrapper[4744]: I0311 00:57:03.954844 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 00:57:03 crc kubenswrapper[4744]: I0311 00:57:03.954882 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 00:57:03 crc kubenswrapper[4744]: I0311 00:57:03.954908 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 00:57:03 crc kubenswrapper[4744]: I0311 00:57:03.954928 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 00:57:03 crc kubenswrapper[4744]: I0311 00:57:03.954941 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T00:57:03Z","lastTransitionTime":"2026-03-11T00:57:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 11 00:57:03 crc kubenswrapper[4744]: E0311 00:57:03.967181 4744 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-11T00:57:03Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-11T00:57:03Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-11T00:57:03Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-11T00:57:03Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-11T00:57:03Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-11T00:57:03Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-11T00:57:03Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-11T00:57:03Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"51e320bc-e184-46b0-b151-baf1fef55472\\\",\\\"systemUUID\\\":\\\"b27597af-c36d-4084-a073-6dfdbb017181\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T00:57:03Z is after 2025-08-24T17:21:41Z" Mar 11 00:57:03 crc kubenswrapper[4744]: I0311 00:57:03.971793 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 00:57:03 crc kubenswrapper[4744]: I0311 00:57:03.971833 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 00:57:03 crc kubenswrapper[4744]: I0311 00:57:03.971845 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 00:57:03 crc kubenswrapper[4744]: I0311 00:57:03.971864 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 00:57:03 crc kubenswrapper[4744]: I0311 00:57:03.971875 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T00:57:03Z","lastTransitionTime":"2026-03-11T00:57:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 11 00:57:03 crc kubenswrapper[4744]: I0311 00:57:03.973996 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 11 00:57:03 crc kubenswrapper[4744]: I0311 00:57:03.974074 4744 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 11 00:57:03 crc kubenswrapper[4744]: E0311 00:57:03.974108 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 11 00:57:03 crc kubenswrapper[4744]: I0311 00:57:03.974142 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 11 00:57:03 crc kubenswrapper[4744]: E0311 00:57:03.974263 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 11 00:57:03 crc kubenswrapper[4744]: E0311 00:57:03.974386 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 11 00:57:03 crc kubenswrapper[4744]: E0311 00:57:03.988148 4744 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-11T00:57:03Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-11T00:57:03Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-11T00:57:03Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-11T00:57:03Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-11T00:57:03Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-11T00:57:03Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-11T00:57:03Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-11T00:57:03Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"51e320bc-e184-46b0-b151-baf1fef55472\\\",\\\"systemUUID\\\":\\\"b27597af-c36d-4084-a073-6dfdbb017181\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T00:57:03Z is after 2025-08-24T17:21:41Z" Mar 11 00:57:03 crc kubenswrapper[4744]: I0311 00:57:03.993831 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:13Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:13Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://064a29fa0ed4ecddcf11aa3175a3fc935147341c372af46f94eee470a9ba363c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T00:56:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"}
,{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7c4d16fc0af970b13be0d202345567993d215bf342174b1b568c4b4ac26c0017\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T00:56:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T00:57:03Z is after 2025-08-24T17:21:41Z" Mar 11 00:57:03 crc kubenswrapper[4744]: I0311 00:57:03.995327 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 00:57:03 crc kubenswrapper[4744]: I0311 00:57:03.995367 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 00:57:03 crc 
kubenswrapper[4744]: I0311 00:57:03.995378 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 00:57:03 crc kubenswrapper[4744]: I0311 00:57:03.995400 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 00:57:03 crc kubenswrapper[4744]: I0311 00:57:03.995417 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T00:57:03Z","lastTransitionTime":"2026-03-11T00:57:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 11 00:57:04 crc kubenswrapper[4744]: I0311 00:57:04.005689 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-etcd/etcd-crc"] Mar 11 00:57:04 crc kubenswrapper[4744]: I0311 00:57:04.011747 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-78fcc_6ff04e11-e747-44c5-b049-371a5d422157/ovnkube-controller/2.log" Mar 11 00:57:04 crc kubenswrapper[4744]: E0311 00:57:04.012288 4744 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-11T00:57:03Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-11T00:57:03Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-11T00:57:03Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-11T00:57:03Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-11T00:57:03Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-11T00:57:03Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-11T00:57:03Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-11T00:57:03Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"51e320bc-e184-46b0-b151-baf1fef55472\\\",\\\"systemUUID\\\":\\\"b27597af-c36d-4084-a073-6dfdbb017181\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T00:57:04Z is after 2025-08-24T17:21:41Z" Mar 11 00:57:04 crc kubenswrapper[4744]: E0311 00:57:04.012714 4744 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 11 00:57:04 crc kubenswrapper[4744]: I0311 00:57:04.014221 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-tdnf7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7aeb1578-fe93-4bec-8f43-17d0923fa5c0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:11Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:11Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-th54b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-th54b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T00:56:11Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-tdnf7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T00:57:04Z is after 2025-08-24T17:21:41Z" Mar 11 00:57:04 crc 
kubenswrapper[4744]: I0311 00:57:04.015871 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-78fcc" event={"ID":"6ff04e11-e747-44c5-b049-371a5d422157","Type":"ContainerStarted","Data":"4df93d4dc2ebaa7aed01d39e44292956a190da5fd454d61f8e46a33114a1ccd0"} Mar 11 00:57:04 crc kubenswrapper[4744]: I0311 00:57:04.016461 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-78fcc" Mar 11 00:57:04 crc kubenswrapper[4744]: I0311 00:57:04.025939 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-678nx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a15dc7ac-7c34-4135-b6eb-a85122800ce9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d3d16f6053ae793fbf194d5cbb556a8603bae74b52c9f07aa937dbfd654e3894\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c427
45f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T00:56:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d9t5q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9d5c92e7037925f24a5b88f538d4822fae94b65ea5f07960f3e465148975af4e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T00:56:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d9t5q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T00:56:11Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-678nx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T00:57:04Z is after 2025-08-24T17:21:41Z" Mar 11 00:57:04 crc kubenswrapper[4744]: I0311 00:57:04.039741 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1a87b2ce-541f-40ba-a568-93914e7dea62\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:54:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:54:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:55:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:55:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:54:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://19989b1018d06da0230a62c4c231865e9d16267d0eb570a409da006b246005b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://832be2ac0e5b4b69727f022a0663974a495e008d3a781fa4d237d3347d39c154\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-11T00:54:37Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop 
\\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager -v=2\\\\nI0311 00:54:06.849622 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. Worst graceful lease acquisition is {26s}.\\\\nI0311 00:54:06.852083 1 observer_polling.go:159] Starting file observer\\\\nI0311 00:54:06.910593 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0311 00:54:06.914052 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nF0311 00:54:37.381234 1 cmd.go:179] failed checking apiserver connectivity: Get \\\\\\\"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/openshift-kube-controller-manager/leases/cluster-policy-controller-lock\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T00:54:36Z is after 
2026-02-23T05:33:13Z\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-11T00:54:05Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T00:54:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://eb1e3ef7609aa323e622249594eb4ac5cd5cacfa496d8189b3f612e29cb11773\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T00:54:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://82831ac0a065cf2ea80ca05df203fc6de89ad279faae95364ab2519b689453d8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T00:54:05Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ca4d36508e362c9b7f7365ade7dda3dec0c70f9df1be158a034aeaaa772c797e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T00:54:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T00:54:04Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T00:57:04Z is after 2025-08-24T17:21:41Z" Mar 11 00:57:04 crc kubenswrapper[4744]: I0311 00:57:04.052890 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e8d341ff-c992-4426-acea-a34e4a61fa6f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:54:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:54:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:54:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:54:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:54:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://01519860f21aa252a579dd461cd32233ed13bfb46d851c1c19410e2632ad4389\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T00:54:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0b3b81c668c0cd9ccdc04d171476dbbbe2bca625de84ad8aff48be69bbaa46b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T00:54:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7fba0fd93d322bd73e9140c40efe4dd1469525a5d6d26a0d1e32ba8e0920a2a4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T00:54:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9690b251bd56f0d091b5ae52e71e94528ba6afd0e61919398662fdadd4a9f89f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://9690b251bd56f0d091b5ae52e71e94528ba6afd0e61919398662fdadd4a9f89f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T00:54:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T00:54:05Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T00:54:04Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T00:57:04Z is after 2025-08-24T17:21:41Z" Mar 11 00:57:04 crc kubenswrapper[4744]: I0311 00:57:04.069555 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:16Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:16Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bff1516676048ec93cb3569f8720eb042036b92afc1c65f1c58b15fb604f58b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T00:56:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-11T00:57:04Z is after 2025-08-24T17:21:41Z" Mar 11 00:57:04 crc kubenswrapper[4744]: I0311 00:57:04.089254 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:11Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T00:57:04Z is after 2025-08-24T17:21:41Z" Mar 11 00:57:04 crc kubenswrapper[4744]: E0311 00:57:04.090769 4744 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
Mar 11 00:57:04 crc kubenswrapper[4744]: I0311 00:57:04.101483 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-8z8gf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ad8329c6-d511-446f-b617-99778d12b878\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://27268ce5ab229a693eeb80e4dfc38c14f18ad32dc201ace682b5516070c52fde\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T00:56:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\
\\"kube-api-access-7gczk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T00:56:11Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-8z8gf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T00:57:04Z is after 2025-08-24T17:21:41Z" Mar 11 00:57:04 crc kubenswrapper[4744]: I0311 00:57:04.117399 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-sj4cl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ac70f681-9bcf-4f8c-a175-ed7d4e9da471\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fafb45aea7f19d6db5ddf62115fb453eebda33f0be934b59e40083cd8
387c896\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T00:56:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqrjx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://09583de802dbeffc06642379e30e628cf6541f82d5a11caa554cdf17e408476b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://09583de802dbeffc06642379e30e628cf6541f82d5a11caa554cdf17e408476b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T00:56:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T00:56:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run
/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqrjx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9a370bc02fc3d53e08d44ce24a85d1e5b20f06a4933b61c8fef04aff8cfe6081\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9a370bc02fc3d53e08d44ce24a85d1e5b20f06a4933b61c8fef04aff8cfe6081\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T00:56:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T00:56:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqrjx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6ad4e761e6f478df27c55306ecb235f8cbbe0954433cbe7a029d585d609a08db\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\
"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6ad4e761e6f478df27c55306ecb235f8cbbe0954433cbe7a029d585d609a08db\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T00:56:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T00:56:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqrjx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://96ebe5cbd3477966cb49d08397fee1954529001d0bede507c64ca71c25786a87\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://96ebe5cbd3477966cb49d08397fee1954529001d0bede507c64ca71c25786a87\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T00:56:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T00:56:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\
\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqrjx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cdbd8c4d24a706d746c84147c617f511da67e77e20235bb94b3b365552ef8c23\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cdbd8c4d24a706d746c84147c617f511da67e77e20235bb94b3b365552ef8c23\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T00:56:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T00:56:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqrjx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://81090de2003b39fdaee05a64219b7ebefa57068b2682e821d55ea9615ba67d4b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"s
tarted\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://81090de2003b39fdaee05a64219b7ebefa57068b2682e821d55ea9615ba67d4b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T00:56:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T00:56:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqrjx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T00:56:11Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-sj4cl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T00:57:04Z is after 2025-08-24T17:21:41Z" Mar 11 00:57:04 crc kubenswrapper[4744]: I0311 00:57:04.140353 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-78fcc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6ff04e11-e747-44c5-b049-371a5d422157\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:11Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:11Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8132bd89cd281c66b2735b3fa85c3fc773e7140d6aaeaefe0c0a2c3e3f743080\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T00:56:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fr9zj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e37e770ef43eabf7e0bb162d7e700eaa2d61c7ceca9737de5872c1caeec06774\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T00:56:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fr9zj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://98ba07651ae84a42f92411fee85a15562e325c61ff2d8fcba4962d21c51fce73\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T00:56:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fr9zj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d7cd308cadb08ea56bbc709c6401ab1d3eec24f0414a7ee1c57188c5e60adffe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T00:56:14Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fr9zj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://84d4ada431e7a91cc86b5cb9bba4316c233c5f666f21bce1d8b6df0019b6ad76\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T00:56:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fr9zj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://27bc44383b34aea31ffcebae0f1f464c31918edc0c37f01d6bd66166e34fd4a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T00:56:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fr9zj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://92b1336c58d3eb24389590807921be9410d32af1b1416d2091b4d515cfd6dfcc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://92b1336c58d3eb24389590807921be9410d32af1b1416d2091b4d515cfd6dfcc\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-11T00:56:38Z\\\",\\\"message\\\":\\\"hift-dns/dns-default_UDP_node_router+switch_crc options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[udp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.4.10:53:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == 
{4c1be812-05d3-4f45-91b5-a853a5c8de71}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0311 00:56:37.981791 7053 services_controller.go:356] Processing sync for service default/openshift for network=default\\\\nI0311 00:56:37.982673 7053 services_controller.go:360] Finished syncing service openshift on namespace default for network=default : 880.837µs\\\\nI0311 00:56:37.982410 7053 factory.go:1336] Added *v1.EgressIP event handler 8\\\\nI0311 00:56:37.982645 7053 services_controller.go:360] Finished syncing service kube-controller-manager on namespace openshift-kube-controller-manager for network=default : 4.472262ms\\\\nI0311 00:56:37.983070 7053 factory.go:1336] Added *v1.EgressFirewall event handler 9\\\\nI0311 00:56:37.983162 7053 controller.go:132] Adding controller ef_node_controller event handlers\\\\nI0311 00:56:37.983204 7053 ovnkube.go:599] Stopped ovnkube\\\\nI0311 00:56:37.983237 7053 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF0311 00:56:37.983343 7053 ovnkube.go:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-11T00:56:37Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-78fcc_openshift-ovn-kubernetes(6ff04e11-e747-44c5-b049-371a5d422157)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fr9zj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://742e3b6ac0accae72e9285636ab0920db1d3e0625d6093adf1606f57b484b139\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T00:56:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fr9zj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://524c79ee3bb53b187166cbc6bf19cbb3311ab561608a1d3c819288c1fb684481\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://524c79ee3bb53b1871
66cbc6bf19cbb3311ab561608a1d3c819288c1fb684481\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T00:56:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T00:56:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fr9zj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T00:56:11Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-78fcc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T00:57:04Z is after 2025-08-24T17:21:41Z" Mar 11 00:57:04 crc kubenswrapper[4744]: I0311 00:57:04.157362 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:12Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:12Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://80f7b8faf68f817bd772c7988a46ffcc286551806463dc59776e5c6e18deb8f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T00:56:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-11T00:57:04Z is after 2025-08-24T17:21:41Z" Mar 11 00:57:04 crc kubenswrapper[4744]: I0311 00:57:04.171543 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:11Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T00:57:04Z is after 2025-08-24T17:21:41Z" Mar 11 00:57:04 crc kubenswrapper[4744]: I0311 00:57:04.192208 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-xlclh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e16bf0f3-533b-4114-89c6-195a85273e98\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:57:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:57:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0289bdbe344d516cb0576b0b739e5752ec2f754bd78918b1d81f20d880ebd1f5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3cf2ef2701f6c6e8715d3c41c3d7c4d704d6f03989831a0c3d34e92ab7a6dd50\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-11T00:56:59Z\\\",\\\"message\\\":\\\"2026-03-11T00:56:13+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_f2f2d951-393c-4c01-b807-b3f13342c8c8\\\\n2026-03-11T00:56:13+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_f2f2d951-393c-4c01-b807-b3f13342c8c8 to /host/opt/cni/bin/\\\\n2026-03-11T00:56:14Z [verbose] multus-daemon started\\\\n2026-03-11T00:56:14Z [verbose] 
Readiness Indicator file check\\\\n2026-03-11T00:56:59Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-11T00:56:12Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T00:57:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/
var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pgqmk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T00:56:11Z\\\"}}\" for pod \"openshift-multus\"/\"multus-xlclh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T00:57:04Z is after 2025-08-24T17:21:41Z" Mar 11 00:57:04 crc kubenswrapper[4744]: I0311 00:57:04.207154 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-6ghqv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e146399d-0685-4ea8-96e4-c76c9478a23a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9fd8ff476d73855cbb3da6
4ae999f54bad2263813a70dba1aa61507b53102d42\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T00:56:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8h2vm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T00:56:11Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-6ghqv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T00:57:04Z is after 2025-08-24T17:21:41Z" Mar 11 00:57:04 crc kubenswrapper[4744]: I0311 00:57:04.232059 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e522f1d8-5329-414c-88d5-79e6f3b615be\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:54:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:54:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:54:04Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:54:04Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:54:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://31713fedc28cf71b8df25e22000768f5d00e6de1b228a632aa3c51aea11eea7c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T00:54:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e702350d7b926ec4df6853a20470ea7515f1e1c98720db1a4d672ea4e03e82fa\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T00:54:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2d93a4815010e9ee724175731b03bf82804921c5bbb9fe8be236057f90057f28\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T00:54:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4b2b952a5490bff5028c83fdfb0942e84bac782d470cede57ec6a31f483a407e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4b2b952a5490bff5028c83fdfb0942e84bac782d470cede57ec6a31f483a407e\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-11T00:55:54Z\\\",\\\"message\\\":\\\"le observer\\\\nW0311 00:55:54.578734 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0311 00:55:54.579118 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0311 00:55:54.580623 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3889625742/tls.crt::/tmp/serving-cert-3889625742/tls.key\\\\\\\"\\\\nI0311 00:55:54.806131 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0311 00:55:54.812332 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0311 00:55:54.812369 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0311 00:55:54.812401 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0311 00:55:54.812411 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0311 00:55:54.819161 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0311 00:55:54.819196 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0311 00:55:54.819211 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0311 00:55:54.819223 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0311 00:55:54.819235 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0311 
00:55:54.819243 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0311 00:55:54.819248 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0311 00:55:54.819254 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0311 00:55:54.821179 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-11T00:55:54Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 1m20s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c9f3a6d71079f1717c07e0b026f34d1bfd6391a5bb6745bcdab535d5290e2aa3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T00:54:07Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://edfc2acc25990d1ccbca879e186fb9dfc0a6664fa1df5cee62958ff52b217748\\\",\\\"image\\\":\\\"quay.io/openshift-release-d
ev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://edfc2acc25990d1ccbca879e186fb9dfc0a6664fa1df5cee62958ff52b217748\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T00:54:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T00:54:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T00:54:04Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T00:57:04Z is after 2025-08-24T17:21:41Z" Mar 11 00:57:04 crc kubenswrapper[4744]: I0311 00:57:04.248932 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f6d4e3bc-7c73-40d6-9bde-5178882a794d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:54:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:54:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:54:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:54:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:54:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e3db74ff70ae81764297d0f56ae2c56f33d40d9e0025aacba1f56045eae524b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T00:54:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://779c97b306c3647569a5bade5ab4a8ca49f1fa84ded5fa830235ab1d4a039038\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962
a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://779c97b306c3647569a5bade5ab4a8ca49f1fa84ded5fa830235ab1d4a039038\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T00:54:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T00:54:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T00:54:04Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T00:57:04Z is after 2025-08-24T17:21:41Z" Mar 11 00:57:04 crc kubenswrapper[4744]: I0311 00:57:04.269343 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:11Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T00:57:04Z is after 2025-08-24T17:21:41Z" Mar 11 00:57:04 crc kubenswrapper[4744]: I0311 00:57:04.289896 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-rplbw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"92f4c9df-1087-4820-8e07-1120f02df454\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://34eabb0e2af63857d40e20376f7979131fef683885949c1ea75e0a4949968872\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T00:56:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lkpvs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2bfa60a94390dd23da621dd81523dde8107fe
6f3beba45ad1b7a96c0d6cd4f56\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T00:56:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lkpvs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T00:56:11Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-rplbw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T00:57:04Z is after 2025-08-24T17:21:41Z" Mar 11 00:57:04 crc kubenswrapper[4744]: I0311 00:57:04.310955 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:11Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T00:57:04Z is after 2025-08-24T17:21:41Z" Mar 11 00:57:04 crc kubenswrapper[4744]: I0311 00:57:04.331618 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-rplbw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"92f4c9df-1087-4820-8e07-1120f02df454\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://34eabb0e2af63857d40e20376f7979131fef683885949c1ea75e0a4949968872\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T00:56:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lkpvs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2bfa60a94390dd23da621dd81523dde8107fe
6f3beba45ad1b7a96c0d6cd4f56\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T00:56:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lkpvs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T00:56:11Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-rplbw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T00:57:04Z is after 2025-08-24T17:21:41Z" Mar 11 00:57:04 crc kubenswrapper[4744]: I0311 00:57:04.355719 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e522f1d8-5329-414c-88d5-79e6f3b615be\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:54:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:54:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:54:04Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:54:04Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:54:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://31713fedc28cf71b8df25e22000768f5d00e6de1b228a632aa3c51aea11eea7c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T00:54:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e702350d7b926ec4df6853a20470ea7515f1e1c98720db1a4d672ea4e03e82fa\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T00:54:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2d93a4815010e9ee724175731b03bf82804921c5bbb9fe8be236057f90057f28\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T00:54:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4b2b952a5490bff5028c83fdfb0942e84bac782d470cede57ec6a31f483a407e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4b2b952a5490bff5028c83fdfb0942e84bac782d470cede57ec6a31f483a407e\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-11T00:55:54Z\\\",\\\"message\\\":\\\"le observer\\\\nW0311 00:55:54.578734 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0311 00:55:54.579118 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0311 00:55:54.580623 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3889625742/tls.crt::/tmp/serving-cert-3889625742/tls.key\\\\\\\"\\\\nI0311 00:55:54.806131 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0311 00:55:54.812332 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0311 00:55:54.812369 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0311 00:55:54.812401 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0311 00:55:54.812411 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0311 00:55:54.819161 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0311 00:55:54.819196 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0311 00:55:54.819211 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0311 00:55:54.819223 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0311 00:55:54.819235 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0311 
00:55:54.819243 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0311 00:55:54.819248 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0311 00:55:54.819254 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0311 00:55:54.821179 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-11T00:55:54Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 1m20s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c9f3a6d71079f1717c07e0b026f34d1bfd6391a5bb6745bcdab535d5290e2aa3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T00:54:07Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://edfc2acc25990d1ccbca879e186fb9dfc0a6664fa1df5cee62958ff52b217748\\\",\\\"image\\\":\\\"quay.io/openshift-release-d
ev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://edfc2acc25990d1ccbca879e186fb9dfc0a6664fa1df5cee62958ff52b217748\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T00:54:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T00:54:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T00:54:04Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T00:57:04Z is after 2025-08-24T17:21:41Z" Mar 11 00:57:04 crc kubenswrapper[4744]: I0311 00:57:04.371952 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f6d4e3bc-7c73-40d6-9bde-5178882a794d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:54:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:54:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:54:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:54:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:54:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e3db74ff70ae81764297d0f56ae2c56f33d40d9e0025aacba1f56045eae524b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T00:54:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://779c97b306c3647569a5bade5ab4a8ca49f1fa84ded5fa830235ab1d4a039038\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962
a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://779c97b306c3647569a5bade5ab4a8ca49f1fa84ded5fa830235ab1d4a039038\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T00:54:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T00:54:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T00:54:04Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T00:57:04Z is after 2025-08-24T17:21:41Z" Mar 11 00:57:04 crc kubenswrapper[4744]: I0311 00:57:04.394718 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:13Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:13Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://064a29fa0ed4ecddcf11aa3175a3fc935147341c372af46f94eee470a9ba363c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T00:56:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7c4d16fc0af970b13be0d202345567993d215bf342174b1b568c4b4ac26c0017\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T00:56:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T00:57:04Z is after 2025-08-24T17:21:41Z" Mar 11 00:57:04 crc kubenswrapper[4744]: I0311 00:57:04.412236 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-tdnf7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7aeb1578-fe93-4bec-8f43-17d0923fa5c0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:11Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:11Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-th54b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-th54b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T00:56:11Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-tdnf7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T00:57:04Z is after 2025-08-24T17:21:41Z" Mar 11 00:57:04 crc 
kubenswrapper[4744]: I0311 00:57:04.432184 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:16Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:16Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bff1516676048ec93cb3569f8720eb042036b92afc1c65f1c58b15fb604f58b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T00:56:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling 
webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T00:57:04Z is after 2025-08-24T17:21:41Z" Mar 11 00:57:04 crc kubenswrapper[4744]: I0311 00:57:04.449700 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:11Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T00:57:04Z is after 2025-08-24T17:21:41Z" Mar 11 00:57:04 crc kubenswrapper[4744]: I0311 00:57:04.466228 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-8z8gf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ad8329c6-d511-446f-b617-99778d12b878\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://27268ce5ab229a693eeb80e4dfc38c14f18ad32dc201ace682b5516070c52fde\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T00:56:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7gczk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T00:56:11Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-8z8gf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T00:57:04Z is after 2025-08-24T17:21:41Z" Mar 11 00:57:04 crc kubenswrapper[4744]: I0311 00:57:04.483603 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-678nx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a15dc7ac-7c34-4135-b6eb-a85122800ce9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d3d16f6053ae793fbf194d5cbb556a8603bae74b52c9f07aa937dbfd654e3894\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshif
t-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T00:56:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d9t5q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9d5c92e7037925f24a5b88f538d4822fae94b65ea5f07960f3e465148975af4e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T00:56:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d9t5q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T00:56:11Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-678nx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call 
webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T00:57:04Z is after 2025-08-24T17:21:41Z" Mar 11 00:57:04 crc kubenswrapper[4744]: I0311 00:57:04.504441 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1a87b2ce-541f-40ba-a568-93914e7dea62\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:54:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:54:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:55:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:55:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:54:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://19989b1018d06da0230a62c4c231865e9d16267d0eb570a409da006b246005b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://832be2ac0e5b4b69727f022a0663974a495e008d3a781fa4d237d3347d39c154\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-11T00:54:37Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss 
-Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager -v=2\\\\nI0311 00:54:06.849622 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. Worst graceful lease acquisition is {26s}.\\\\nI0311 00:54:06.852083 1 observer_polling.go:159] Starting file observer\\\\nI0311 00:54:06.910593 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0311 00:54:06.914052 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nF0311 00:54:37.381234 1 cmd.go:179] failed checking apiserver connectivity: Get \\\\\\\"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/openshift-kube-controller-manager/leases/cluster-policy-controller-lock\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T00:54:36Z is after 
2026-02-23T05:33:13Z\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-11T00:54:05Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T00:54:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://eb1e3ef7609aa323e622249594eb4ac5cd5cacfa496d8189b3f612e29cb11773\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T00:54:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://82831ac0a065cf2ea80ca05df203fc6de89ad279faae95364ab2519b689453d8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T00:54:05Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ca4d36508e362c9b7f7365ade7dda3dec0c70f9df1be158a034aeaaa772c797e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T00:54:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T00:54:04Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T00:57:04Z is after 2025-08-24T17:21:41Z" Mar 11 00:57:04 crc kubenswrapper[4744]: I0311 00:57:04.526772 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e8d341ff-c992-4426-acea-a34e4a61fa6f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:54:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:54:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:54:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:54:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:54:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://01519860f21aa252a579dd461cd32233ed13bfb46d851c1c19410e2632ad4389\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T00:54:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0b3b81c668c0cd9ccdc04d171476dbbbe2bca625de84ad8aff48be69bbaa46b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T00:54:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7fba0fd93d322bd73e9140c40efe4dd1469525a5d6d26a0d1e32ba8e0920a2a4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T00:54:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9690b251bd56f0d091b5ae52e71e94528ba6afd0e61919398662fdadd4a9f89f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://9690b251bd56f0d091b5ae52e71e94528ba6afd0e61919398662fdadd4a9f89f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T00:54:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T00:54:05Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T00:54:04Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T00:57:04Z is after 2025-08-24T17:21:41Z" Mar 11 00:57:04 crc kubenswrapper[4744]: I0311 00:57:04.546597 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:11Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T00:57:04Z is after 2025-08-24T17:21:41Z" Mar 11 00:57:04 crc kubenswrapper[4744]: I0311 00:57:04.568344 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-xlclh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e16bf0f3-533b-4114-89c6-195a85273e98\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:57:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:57:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0289bdbe344d516cb0576b0b739e5752ec2f754bd78918b1d81f20d880ebd1f5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3cf2ef2701f6c6e8715d3c41c3d7c4d704d6f03989831a0c3d34e92ab7a6dd50\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-11T00:56:59Z\\\",\\\"message\\\":\\\"2026-03-11T00:56:13+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_f2f2d951-393c-4c01-b807-b3f13342c8c8\\\\n2026-03-11T00:56:13+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_f2f2d951-393c-4c01-b807-b3f13342c8c8 to /host/opt/cni/bin/\\\\n2026-03-11T00:56:14Z [verbose] multus-daemon started\\\\n2026-03-11T00:56:14Z [verbose] 
Readiness Indicator file check\\\\n2026-03-11T00:56:59Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-11T00:56:12Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T00:57:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/
var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pgqmk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T00:56:11Z\\\"}}\" for pod \"openshift-multus\"/\"multus-xlclh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T00:57:04Z is after 2025-08-24T17:21:41Z" Mar 11 00:57:04 crc kubenswrapper[4744]: I0311 00:57:04.603920 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-6ghqv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e146399d-0685-4ea8-96e4-c76c9478a23a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9fd8ff476d73855cbb3da6
4ae999f54bad2263813a70dba1aa61507b53102d42\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T00:56:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8h2vm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T00:56:11Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-6ghqv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T00:57:04Z is after 2025-08-24T17:21:41Z" Mar 11 00:57:04 crc kubenswrapper[4744]: I0311 00:57:04.633033 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-sj4cl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ac70f681-9bcf-4f8c-a175-ed7d4e9da471\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fafb45aea7f19d6db5ddf62115fb453eebda33f0be934b59e40083cd8387c896\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T00:56:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqrjx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://09583de802dbeffc06642379e30e628cf6541f82d5a11caa554cdf17e408476b\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://09583de802dbeffc06642379e30e628cf6541f82d5a11caa554cdf17e408476b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T00:56:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T00:56:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqrjx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9a370bc02fc3d53e08d44ce24a85d1e5b20f06a4933b61c8fef04aff8cfe6081\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9a370bc02fc3d53e08d44ce24a85d1e5b20f06a4933b61c8fef04aff8cfe6081\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T00:56:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T00:56:14Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqrjx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6ad4e761e6f478df27c55306ecb235f8cbbe0954433cbe7a029d585d609a08db\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6ad4e761e6f478df27c55306ecb235f8cbbe0954433cbe7a029d585d609a08db\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T00:56:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T00:56:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqrjx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://96ebe
5cbd3477966cb49d08397fee1954529001d0bede507c64ca71c25786a87\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://96ebe5cbd3477966cb49d08397fee1954529001d0bede507c64ca71c25786a87\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T00:56:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T00:56:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqrjx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cdbd8c4d24a706d746c84147c617f511da67e77e20235bb94b3b365552ef8c23\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cdbd8c4d24a706d746c84147c617f511da67e77e20235bb94b3b365552ef8c23\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T00:56:18Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-03-11T00:56:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqrjx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://81090de2003b39fdaee05a64219b7ebefa57068b2682e821d55ea9615ba67d4b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://81090de2003b39fdaee05a64219b7ebefa57068b2682e821d55ea9615ba67d4b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T00:56:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T00:56:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqrjx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T00:56:11Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-sj4cl\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T00:57:04Z is after 2025-08-24T17:21:41Z" Mar 11 00:57:04 crc kubenswrapper[4744]: I0311 00:57:04.658575 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-78fcc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6ff04e11-e747-44c5-b049-371a5d422157\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:11Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:11Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8132bd89cd281c66b2735b3fa85c3fc773e7140d6aaeaefe0c0a2c3e3f743080\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T00:56:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fr9zj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e37e770ef43eabf7e0bb162d7e700eaa2d61c7ceca9737de5872c1caeec06774\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T00:56:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fr9zj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://98ba07651ae84a42f92411fee85a15562e325c61ff2d8fcba4962d21c51fce73\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T00:56:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fr9zj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d7cd308cadb08ea56bbc709c6401ab1d3eec24f0414a7ee1c57188c5e60adffe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T00:56:14Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fr9zj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://84d4ada431e7a91cc86b5cb9bba4316c233c5f666f21bce1d8b6df0019b6ad76\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T00:56:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fr9zj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://27bc44383b34aea31ffcebae0f1f464c31918edc0c37f01d6bd66166e34fd4a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T00:56:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fr9zj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4df93d4dc2ebaa7aed01d39e44292956a190da5fd454d61f8e46a33114a1ccd0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://92b1336c58d3eb24389590807921be9410d32af1b1416d2091b4d515cfd6dfcc\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-11T00:56:38Z\\\",\\\"message\\\":\\\"hift-dns/dns-default_UDP_node_router+switch_crc options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[udp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.4.10:53:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == 
{4c1be812-05d3-4f45-91b5-a853a5c8de71}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0311 00:56:37.981791 7053 services_controller.go:356] Processing sync for service default/openshift for network=default\\\\nI0311 00:56:37.982673 7053 services_controller.go:360] Finished syncing service openshift on namespace default for network=default : 880.837µs\\\\nI0311 00:56:37.982410 7053 factory.go:1336] Added *v1.EgressIP event handler 8\\\\nI0311 00:56:37.982645 7053 services_controller.go:360] Finished syncing service kube-controller-manager on namespace openshift-kube-controller-manager for network=default : 4.472262ms\\\\nI0311 00:56:37.983070 7053 factory.go:1336] Added *v1.EgressFirewall event handler 9\\\\nI0311 00:56:37.983162 7053 controller.go:132] Adding controller ef_node_controller event handlers\\\\nI0311 00:56:37.983204 7053 ovnkube.go:599] Stopped ovnkube\\\\nI0311 00:56:37.983237 7053 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF0311 00:56:37.983343 7053 
ovnkube.go:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-11T00:56:37Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T00:57:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\
\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fr9zj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://742e3b6ac0accae72e9285636ab0920db1d3e0625d6093adf1606f57b484b139\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T00:56:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fr9zj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://524c79ee3bb53b187166cbc6bf19cbb3311ab561608a1d3c819288c1fb684481\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\
\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://524c79ee3bb53b187166cbc6bf19cbb3311ab561608a1d3c819288c1fb684481\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T00:56:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T00:56:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fr9zj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T00:56:11Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-78fcc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T00:57:04Z is after 2025-08-24T17:21:41Z" Mar 11 00:57:04 crc kubenswrapper[4744]: I0311 00:57:04.685232 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"423996ee-605e-4e25-8f5c-d4b717716b5d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:54:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:54:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:54:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:54:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:54:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9c1cfbe4f99dbf1bd16e0cd6732534be630dc6127842dcbf6ee0efd6d3b9d673\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T00:54:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://852392ca4fef569658b15e593c3c30b1079a7f59b765ca2d9fc658c42249586a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T00:54:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6ee528bcd7bebc942b3ac8d59654402bb3629e259d6ba425f763feb6103f0255\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T00:54:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://eba7e9bfb431d0fee5860ee8602254d4c66bac6529691d8f2639598cf30509d9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T00:54:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cdb483b36dc9e6541351c051052844f192758b53d5ef7363a0b924be57ce9ca0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T00:54:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://28d9226ae39a5ef12c0715e064e540ee7f14b855833284b8a5af039f2c663a36\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://28d9226ae39a5ef12c0715e064e540ee7f14b855833284b8a5af039f2c663a36\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-11T00:54:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T00:54:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fd9cf6603d67dc75a034fdbdfa82ce4551e42b5ef1c535f8d42c4cb3bc7b3d30\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fd9cf6603d67dc75a034fdbdfa82ce4551e42b5ef1c535f8d42c4cb3bc7b3d30\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T00:54:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T00:54:06Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://bccf1928ef633d8209fbff4bb71b38f77361ca43e852c5cfa8529a4b4031e245\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bccf1928ef633d8209fbff4bb71b38f77361ca43e852c5cfa8529a4b4031e245\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T00:54:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T00:54:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T00:54:04Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T00:57:04Z is after 2025-08-24T17:21:41Z" Mar 11 00:57:04 crc kubenswrapper[4744]: I0311 00:57:04.714684 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:12Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:12Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://80f7b8faf68f817bd772c7988a46ffcc286551806463dc59776e5c6e18deb8f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"res
tartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T00:56:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T00:57:04Z is after 2025-08-24T17:21:41Z" Mar 11 00:57:04 crc kubenswrapper[4744]: I0311 00:57:04.973863 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-tdnf7" Mar 11 00:57:04 crc kubenswrapper[4744]: E0311 00:57:04.974126 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-tdnf7" podUID="7aeb1578-fe93-4bec-8f43-17d0923fa5c0" Mar 11 00:57:05 crc kubenswrapper[4744]: I0311 00:57:05.022381 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-78fcc_6ff04e11-e747-44c5-b049-371a5d422157/ovnkube-controller/3.log" Mar 11 00:57:05 crc kubenswrapper[4744]: I0311 00:57:05.023711 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-78fcc_6ff04e11-e747-44c5-b049-371a5d422157/ovnkube-controller/2.log" Mar 11 00:57:05 crc kubenswrapper[4744]: I0311 00:57:05.028994 4744 generic.go:334] "Generic (PLEG): container finished" podID="6ff04e11-e747-44c5-b049-371a5d422157" containerID="4df93d4dc2ebaa7aed01d39e44292956a190da5fd454d61f8e46a33114a1ccd0" exitCode=1 Mar 11 00:57:05 crc kubenswrapper[4744]: I0311 00:57:05.029086 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-78fcc" event={"ID":"6ff04e11-e747-44c5-b049-371a5d422157","Type":"ContainerDied","Data":"4df93d4dc2ebaa7aed01d39e44292956a190da5fd454d61f8e46a33114a1ccd0"} Mar 11 00:57:05 crc kubenswrapper[4744]: I0311 00:57:05.029174 4744 scope.go:117] "RemoveContainer" containerID="92b1336c58d3eb24389590807921be9410d32af1b1416d2091b4d515cfd6dfcc" Mar 11 00:57:05 crc kubenswrapper[4744]: I0311 00:57:05.030258 4744 scope.go:117] "RemoveContainer" containerID="4df93d4dc2ebaa7aed01d39e44292956a190da5fd454d61f8e46a33114a1ccd0" Mar 11 00:57:05 crc kubenswrapper[4744]: E0311 00:57:05.030795 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-78fcc_openshift-ovn-kubernetes(6ff04e11-e747-44c5-b049-371a5d422157)\"" pod="openshift-ovn-kubernetes/ovnkube-node-78fcc" podUID="6ff04e11-e747-44c5-b049-371a5d422157" Mar 11 00:57:05 crc kubenswrapper[4744]: I0311 
00:57:05.045705 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-678nx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a15dc7ac-7c34-4135-b6eb-a85122800ce9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d3d16f6053ae793fbf194d5cbb556a8603bae74b52c9f07aa937dbfd654e3894\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T00:56:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\
\":\\\"kube-api-access-d9t5q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9d5c92e7037925f24a5b88f538d4822fae94b65ea5f07960f3e465148975af4e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T00:56:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d9t5q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T00:56:11Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-678nx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T00:57:05Z is after 2025-08-24T17:21:41Z" Mar 11 00:57:05 crc kubenswrapper[4744]: I0311 00:57:05.062071 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1a87b2ce-541f-40ba-a568-93914e7dea62\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:54:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:54:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:55:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:55:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:54:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://19989b1018d06da0230a62c4c231865e9d16267d0eb570a409da006b246005b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://832be2ac0e5b4b69727f022a0663974a495e008d3a781fa4d237d3347d39c154\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-11T00:54:37Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager 
-v=2\\\\nI0311 00:54:06.849622 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. Worst graceful lease acquisition is {26s}.\\\\nI0311 00:54:06.852083 1 observer_polling.go:159] Starting file observer\\\\nI0311 00:54:06.910593 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0311 00:54:06.914052 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nF0311 00:54:37.381234 1 cmd.go:179] failed checking apiserver connectivity: Get \\\\\\\"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/openshift-kube-controller-manager/leases/cluster-policy-controller-lock\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T00:54:36Z is after 
2026-02-23T05:33:13Z\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-11T00:54:05Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T00:54:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://eb1e3ef7609aa323e622249594eb4ac5cd5cacfa496d8189b3f612e29cb11773\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T00:54:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://82831ac0a065cf2ea80ca05df203fc6de89ad279faae95364ab2519b689453d8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T00:54:05Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ca4d36508e362c9b7f7365ade7dda3dec0c70f9df1be158a034aeaaa772c797e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T00:54:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T00:54:04Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T00:57:05Z is after 2025-08-24T17:21:41Z" Mar 11 00:57:05 crc kubenswrapper[4744]: I0311 00:57:05.080301 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e8d341ff-c992-4426-acea-a34e4a61fa6f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:54:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:54:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:54:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:54:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:54:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://01519860f21aa252a579dd461cd32233ed13bfb46d851c1c19410e2632ad4389\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T00:54:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0b3b81c668c0cd9ccdc04d171476dbbbe2bca625de84ad8aff48be69bbaa46b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T00:54:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7fba0fd93d322bd73e9140c40efe4dd1469525a5d6d26a0d1e32ba8e0920a2a4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T00:54:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9690b251bd56f0d091b5ae52e71e94528ba6afd0e61919398662fdadd4a9f89f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://9690b251bd56f0d091b5ae52e71e94528ba6afd0e61919398662fdadd4a9f89f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T00:54:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T00:54:05Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T00:54:04Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T00:57:05Z is after 2025-08-24T17:21:41Z" Mar 11 00:57:05 crc kubenswrapper[4744]: I0311 00:57:05.100666 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:16Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:16Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bff1516676048ec93cb3569f8720eb042036b92afc1c65f1c58b15fb604f58b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T00:56:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-11T00:57:05Z is after 2025-08-24T17:21:41Z" Mar 11 00:57:05 crc kubenswrapper[4744]: I0311 00:57:05.120305 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:11Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T00:57:05Z is after 2025-08-24T17:21:41Z" Mar 11 00:57:05 crc kubenswrapper[4744]: I0311 00:57:05.138908 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-8z8gf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ad8329c6-d511-446f-b617-99778d12b878\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://27268ce5ab229a693eeb80e4dfc38c14f18ad32dc201ace682b5516070c52fde\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T00:56:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7gczk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T00:56:11Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-8z8gf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T00:57:05Z is after 2025-08-24T17:21:41Z" Mar 11 00:57:05 crc kubenswrapper[4744]: I0311 00:57:05.164627 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-sj4cl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ac70f681-9bcf-4f8c-a175-ed7d4e9da471\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fafb45aea7f19d6db5ddf62115fb453eebda33f0be934b59e40083cd8387c896\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release
-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T00:56:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqrjx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://09583de802dbeffc06642379e30e628cf6541f82d5a11caa554cdf17e408476b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://09583de802dbeffc06642379e30e628cf6541f82d5a11caa554cdf17e408476b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T00:56:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T00:56:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqrjx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9a370bc02fc3
d53e08d44ce24a85d1e5b20f06a4933b61c8fef04aff8cfe6081\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9a370bc02fc3d53e08d44ce24a85d1e5b20f06a4933b61c8fef04aff8cfe6081\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T00:56:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T00:56:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqrjx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6ad4e761e6f478df27c55306ecb235f8cbbe0954433cbe7a029d585d609a08db\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6ad4e761e6f4
78df27c55306ecb235f8cbbe0954433cbe7a029d585d609a08db\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T00:56:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T00:56:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqrjx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://96ebe5cbd3477966cb49d08397fee1954529001d0bede507c64ca71c25786a87\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://96ebe5cbd3477966cb49d08397fee1954529001d0bede507c64ca71c25786a87\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T00:56:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T00:56:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqrjx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\
\\"}]},{\\\"containerID\\\":\\\"cri-o://cdbd8c4d24a706d746c84147c617f511da67e77e20235bb94b3b365552ef8c23\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cdbd8c4d24a706d746c84147c617f511da67e77e20235bb94b3b365552ef8c23\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T00:56:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T00:56:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqrjx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://81090de2003b39fdaee05a64219b7ebefa57068b2682e821d55ea9615ba67d4b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://81090de2003b39fdaee05a64219b7ebefa57068b2682e821d55ea9615ba67d4b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"202
6-03-11T00:56:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T00:56:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqrjx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T00:56:11Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-sj4cl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T00:57:05Z is after 2025-08-24T17:21:41Z" Mar 11 00:57:05 crc kubenswrapper[4744]: I0311 00:57:05.197331 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-78fcc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6ff04e11-e747-44c5-b049-371a5d422157\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:11Z\\\",\\\"message\\\":\\\"containers with 
unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:11Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8132bd89cd281c66b2735b3fa85c3fc773e7140d6aaeaefe0c0a2c3e3f743080\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T00:56:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fr9zj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e37e770ef43eabf7e0bb162d7e700eaa2d61c7ceca9737de5872c1caeec06774\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\
\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T00:56:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fr9zj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://98ba07651ae84a42f92411fee85a15562e325c61ff2d8fcba4962d21c51fce73\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T00:56:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fr9zj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d7cd308cadb08ea56bbc709c6401ab1d3eec24f0414a7ee1c57188c5e60adffe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd
47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T00:56:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fr9zj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://84d4ada431e7a91cc86b5cb9bba4316c233c5f666f21bce1d8b6df0019b6ad76\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T00:56:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fr9zj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://27bc44383b34aea31ffcebae0f1f464c31918edc0c37f01d6bd66166e34fd4a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev
@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T00:56:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fr9zj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4df93d4dc2ebaa7aed01d39e44292956a190da5fd454d61f8e46a33114a1ccd0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://92b1336c58d3eb24389590807921be9410d32af1b1416d2091b4d515cfd6dfcc\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-11T00:56:38Z\\\",\\\"message\\\":\\\"hift-dns/dns-default_UDP_node_router+switch_crc 
options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[udp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.4.10:53:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {4c1be812-05d3-4f45-91b5-a853a5c8de71}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0311 00:56:37.981791 7053 services_controller.go:356] Processing sync for service default/openshift for network=default\\\\nI0311 00:56:37.982673 7053 services_controller.go:360] Finished syncing service openshift on namespace default for network=default : 880.837µs\\\\nI0311 00:56:37.982410 7053 factory.go:1336] Added *v1.EgressIP event handler 8\\\\nI0311 00:56:37.982645 7053 services_controller.go:360] Finished syncing service kube-controller-manager on namespace openshift-kube-controller-manager for network=default : 4.472262ms\\\\nI0311 00:56:37.983070 7053 factory.go:1336] Added *v1.EgressFirewall event handler 9\\\\nI0311 00:56:37.983162 7053 controller.go:132] Adding controller ef_node_controller event handlers\\\\nI0311 00:56:37.983204 7053 ovnkube.go:599] Stopped ovnkube\\\\nI0311 00:56:37.983237 7053 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF0311 00:56:37.983343 7053 ovnkube.go:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-11T00:56:37Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4df93d4dc2ebaa7aed01d39e44292956a190da5fd454d61f8e46a33114a1ccd0\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-11T00:57:04Z\\\",\\\"message\\\":\\\".978646 7326 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0311 00:57:03.978789 7326 reflector.go:311] Stopping reflector 
*v1.Service (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0311 00:57:03.978970 7326 reflector.go:311] Stopping reflector *v1.UserDefinedNetwork (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/userdefinednetwork/v1/apis/informers/externalversions/factory.go:140\\\\nI0311 00:57:03.979253 7326 reflector.go:311] Stopping reflector *v1.EndpointSlice (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0311 00:57:03.979372 7326 reflector.go:311] Stopping reflector *v1.NetworkPolicy (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0311 00:57:03.979446 7326 reflector.go:311] Stopping reflector *v1alpha1.BaselineAdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI0311 00:57:03.980617 7326 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0311 00:57:03.980642 7326 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0311 00:57:03.980675 7326 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0311 00:57:03.980679 7326 factory.go:656] Stopping watch factory\\\\nI0311 00:57:03.980695 7326 handler.go:208] Removed *v1.NetworkPolicy 
ev\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-11T00:57:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"k
ube-api-access-fr9zj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://742e3b6ac0accae72e9285636ab0920db1d3e0625d6093adf1606f57b484b139\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T00:56:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fr9zj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://524c79ee3bb53b187166cbc6bf19cbb3311ab561608a1d3c819288c1fb684481\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://524c79ee3bb53b187166cbc6bf19cbb3311ab561608a1d3c819288c1fb684481\
\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T00:56:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T00:56:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fr9zj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T00:56:11Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-78fcc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T00:57:05Z is after 2025-08-24T17:21:41Z" Mar 11 00:57:05 crc kubenswrapper[4744]: I0311 00:57:05.230477 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"423996ee-605e-4e25-8f5c-d4b717716b5d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:54:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:54:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:54:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:54:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:54:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9c1cfbe4f99dbf1bd16e0cd6732534be630dc6127842dcbf6ee0efd6d3b9d673\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T00:54:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://852392ca4fef569658b15e593c3c30b1079a7f59b765ca2d9fc658c42249586a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T00:54:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6ee528bcd7bebc942b3ac8d59654402bb3629e259d6ba425f763feb6103f0255\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T00:54:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://eba7e9bfb431d0fee5860ee8602254d4c66bac6529691d8f2639598cf30509d9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T00:54:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cdb483b36dc9e6541351c051052844f192758b53d5ef7363a0b924be57ce9ca0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T00:54:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://28d9226ae39a5ef12c0715e064e540ee7f14b855833284b8a5af039f2c663a36\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://28d9226ae39a5ef12c0715e064e540ee7f14b855833284b8a5af039f2c663a36\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-11T00:54:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T00:54:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fd9cf6603d67dc75a034fdbdfa82ce4551e42b5ef1c535f8d42c4cb3bc7b3d30\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fd9cf6603d67dc75a034fdbdfa82ce4551e42b5ef1c535f8d42c4cb3bc7b3d30\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T00:54:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T00:54:06Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://bccf1928ef633d8209fbff4bb71b38f77361ca43e852c5cfa8529a4b4031e245\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bccf1928ef633d8209fbff4bb71b38f77361ca43e852c5cfa8529a4b4031e245\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T00:54:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T00:54:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T00:54:04Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T00:57:05Z is after 2025-08-24T17:21:41Z" Mar 11 00:57:05 crc kubenswrapper[4744]: I0311 00:57:05.250207 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:12Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:12Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://80f7b8faf68f817bd772c7988a46ffcc286551806463dc59776e5c6e18deb8f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"res
tartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T00:56:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T00:57:05Z is after 2025-08-24T17:21:41Z" Mar 11 00:57:05 crc kubenswrapper[4744]: I0311 00:57:05.270024 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:11Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T00:57:05Z is after 2025-08-24T17:21:41Z" Mar 11 00:57:05 crc kubenswrapper[4744]: I0311 00:57:05.293482 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-xlclh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e16bf0f3-533b-4114-89c6-195a85273e98\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:57:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:57:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0289bdbe344d516cb0576b0b739e5752ec2f754bd78918b1d81f20d880ebd1f5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3cf2ef2701f6c6e8715d3c41c3d7c4d704d6f03989831a0c3d34e92ab7a6dd50\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-11T00:56:59Z\\\",\\\"message\\\":\\\"2026-03-11T00:56:13+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_f2f2d951-393c-4c01-b807-b3f13342c8c8\\\\n2026-03-11T00:56:13+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_f2f2d951-393c-4c01-b807-b3f13342c8c8 to /host/opt/cni/bin/\\\\n2026-03-11T00:56:14Z [verbose] multus-daemon started\\\\n2026-03-11T00:56:14Z [verbose] 
Readiness Indicator file check\\\\n2026-03-11T00:56:59Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-11T00:56:12Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T00:57:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/
var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pgqmk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T00:56:11Z\\\"}}\" for pod \"openshift-multus\"/\"multus-xlclh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T00:57:05Z is after 2025-08-24T17:21:41Z" Mar 11 00:57:05 crc kubenswrapper[4744]: I0311 00:57:05.309829 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-6ghqv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e146399d-0685-4ea8-96e4-c76c9478a23a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9fd8ff476d73855cbb3da6
4ae999f54bad2263813a70dba1aa61507b53102d42\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T00:56:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8h2vm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T00:56:11Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-6ghqv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T00:57:05Z is after 2025-08-24T17:21:41Z" Mar 11 00:57:05 crc kubenswrapper[4744]: I0311 00:57:05.331857 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e522f1d8-5329-414c-88d5-79e6f3b615be\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:54:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:54:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:54:04Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:54:04Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:54:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://31713fedc28cf71b8df25e22000768f5d00e6de1b228a632aa3c51aea11eea7c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T00:54:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e702350d7b926ec4df6853a20470ea7515f1e1c98720db1a4d672ea4e03e82fa\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T00:54:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2d93a4815010e9ee724175731b03bf82804921c5bbb9fe8be236057f90057f28\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T00:54:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4b2b952a5490bff5028c83fdfb0942e84bac782d470cede57ec6a31f483a407e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4b2b952a5490bff5028c83fdfb0942e84bac782d470cede57ec6a31f483a407e\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-11T00:55:54Z\\\",\\\"message\\\":\\\"le observer\\\\nW0311 00:55:54.578734 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0311 00:55:54.579118 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0311 00:55:54.580623 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3889625742/tls.crt::/tmp/serving-cert-3889625742/tls.key\\\\\\\"\\\\nI0311 00:55:54.806131 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0311 00:55:54.812332 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0311 00:55:54.812369 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0311 00:55:54.812401 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0311 00:55:54.812411 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0311 00:55:54.819161 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0311 00:55:54.819196 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0311 00:55:54.819211 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0311 00:55:54.819223 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0311 00:55:54.819235 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0311 
00:55:54.819243 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0311 00:55:54.819248 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0311 00:55:54.819254 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0311 00:55:54.821179 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-11T00:55:54Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 1m20s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c9f3a6d71079f1717c07e0b026f34d1bfd6391a5bb6745bcdab535d5290e2aa3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T00:54:07Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://edfc2acc25990d1ccbca879e186fb9dfc0a6664fa1df5cee62958ff52b217748\\\",\\\"image\\\":\\\"quay.io/openshift-release-d
ev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://edfc2acc25990d1ccbca879e186fb9dfc0a6664fa1df5cee62958ff52b217748\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T00:54:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T00:54:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T00:54:04Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T00:57:05Z is after 2025-08-24T17:21:41Z" Mar 11 00:57:05 crc kubenswrapper[4744]: I0311 00:57:05.350062 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f6d4e3bc-7c73-40d6-9bde-5178882a794d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:54:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:54:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:54:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:54:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:54:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e3db74ff70ae81764297d0f56ae2c56f33d40d9e0025aacba1f56045eae524b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T00:54:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://779c97b306c3647569a5bade5ab4a8ca49f1fa84ded5fa830235ab1d4a039038\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962
a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://779c97b306c3647569a5bade5ab4a8ca49f1fa84ded5fa830235ab1d4a039038\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T00:54:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T00:54:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T00:54:04Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T00:57:05Z is after 2025-08-24T17:21:41Z" Mar 11 00:57:05 crc kubenswrapper[4744]: I0311 00:57:05.370247 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:11Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T00:57:05Z is after 2025-08-24T17:21:41Z" Mar 11 00:57:05 crc kubenswrapper[4744]: I0311 00:57:05.389844 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-rplbw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"92f4c9df-1087-4820-8e07-1120f02df454\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://34eabb0e2af63857d40e20376f7979131fef683885949c1ea75e0a4949968872\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T00:56:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lkpvs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2bfa60a94390dd23da621dd81523dde8107fe
6f3beba45ad1b7a96c0d6cd4f56\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T00:56:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lkpvs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T00:56:11Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-rplbw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T00:57:05Z is after 2025-08-24T17:21:41Z" Mar 11 00:57:05 crc kubenswrapper[4744]: I0311 00:57:05.411038 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:13Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:13Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://064a29fa0ed4ecddcf11aa3175a3fc935147341c372af46f94eee470a9ba363c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T00:56:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7c4d16fc0af970b13be0d202345567993d215bf342174b1b568c4b4ac26c0017\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T00:56:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T00:57:05Z is after 2025-08-24T17:21:41Z" Mar 11 00:57:05 crc kubenswrapper[4744]: I0311 00:57:05.427805 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-tdnf7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7aeb1578-fe93-4bec-8f43-17d0923fa5c0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:11Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:11Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-th54b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-th54b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T00:56:11Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-tdnf7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T00:57:05Z is after 2025-08-24T17:21:41Z" Mar 11 00:57:05 crc 
kubenswrapper[4744]: I0311 00:57:05.974263 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 11 00:57:05 crc kubenswrapper[4744]: I0311 00:57:05.974458 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 11 00:57:05 crc kubenswrapper[4744]: I0311 00:57:05.974602 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 11 00:57:05 crc kubenswrapper[4744]: E0311 00:57:05.974626 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 11 00:57:05 crc kubenswrapper[4744]: E0311 00:57:05.974700 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 11 00:57:05 crc kubenswrapper[4744]: E0311 00:57:05.974819 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 11 00:57:06 crc kubenswrapper[4744]: I0311 00:57:06.035469 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-78fcc_6ff04e11-e747-44c5-b049-371a5d422157/ovnkube-controller/3.log" Mar 11 00:57:06 crc kubenswrapper[4744]: I0311 00:57:06.042152 4744 scope.go:117] "RemoveContainer" containerID="4df93d4dc2ebaa7aed01d39e44292956a190da5fd454d61f8e46a33114a1ccd0" Mar 11 00:57:06 crc kubenswrapper[4744]: E0311 00:57:06.042703 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-78fcc_openshift-ovn-kubernetes(6ff04e11-e747-44c5-b049-371a5d422157)\"" pod="openshift-ovn-kubernetes/ovnkube-node-78fcc" podUID="6ff04e11-e747-44c5-b049-371a5d422157" Mar 11 00:57:06 crc kubenswrapper[4744]: I0311 00:57:06.076145 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"423996ee-605e-4e25-8f5c-d4b717716b5d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:54:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:54:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:54:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:54:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:54:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9c1cfbe4f99dbf1bd16e0cd6732534be630dc6127842dcbf6ee0efd6d3b9d673\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T00:54:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://852392ca4fef569658b15e593c3c30b1079a7f59b765ca2d9fc658c42249586a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T00:54:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6ee528bcd7bebc942b3ac8d59654402bb3629e259d6ba425f763feb6103f0255\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T00:54:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://eba7e9bfb431d0fee5860ee8602254d4c66bac6529691d8f2639598cf30509d9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T00:54:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cdb483b36dc9e6541351c051052844f192758b53d5ef7363a0b924be57ce9ca0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T00:54:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://28d9226ae39a5ef12c0715e064e540ee7f14b855833284b8a5af039f2c663a36\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://28d9226ae39a5ef12c0715e064e540ee7f14b855833284b8a5af039f2c663a36\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-11T00:54:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T00:54:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fd9cf6603d67dc75a034fdbdfa82ce4551e42b5ef1c535f8d42c4cb3bc7b3d30\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fd9cf6603d67dc75a034fdbdfa82ce4551e42b5ef1c535f8d42c4cb3bc7b3d30\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T00:54:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T00:54:06Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://bccf1928ef633d8209fbff4bb71b38f77361ca43e852c5cfa8529a4b4031e245\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bccf1928ef633d8209fbff4bb71b38f77361ca43e852c5cfa8529a4b4031e245\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T00:54:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T00:54:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T00:54:04Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T00:57:06Z is after 2025-08-24T17:21:41Z" Mar 11 00:57:06 crc kubenswrapper[4744]: I0311 00:57:06.099249 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:12Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:12Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://80f7b8faf68f817bd772c7988a46ffcc286551806463dc59776e5c6e18deb8f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"res
tartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T00:56:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T00:57:06Z is after 2025-08-24T17:21:41Z" Mar 11 00:57:06 crc kubenswrapper[4744]: I0311 00:57:06.120493 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:11Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T00:57:06Z is after 2025-08-24T17:21:41Z" Mar 11 00:57:06 crc kubenswrapper[4744]: I0311 00:57:06.138708 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-xlclh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e16bf0f3-533b-4114-89c6-195a85273e98\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:57:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:57:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0289bdbe344d516cb0576b0b739e5752ec2f754bd78918b1d81f20d880ebd1f5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3cf2ef2701f6c6e8715d3c41c3d7c4d704d6f03989831a0c3d34e92ab7a6dd50\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-11T00:56:59Z\\\",\\\"message\\\":\\\"2026-03-11T00:56:13+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_f2f2d951-393c-4c01-b807-b3f13342c8c8\\\\n2026-03-11T00:56:13+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_f2f2d951-393c-4c01-b807-b3f13342c8c8 to /host/opt/cni/bin/\\\\n2026-03-11T00:56:14Z [verbose] multus-daemon started\\\\n2026-03-11T00:56:14Z [verbose] 
Readiness Indicator file check\\\\n2026-03-11T00:56:59Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-11T00:56:12Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T00:57:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/
var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pgqmk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T00:56:11Z\\\"}}\" for pod \"openshift-multus\"/\"multus-xlclh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T00:57:06Z is after 2025-08-24T17:21:41Z" Mar 11 00:57:06 crc kubenswrapper[4744]: I0311 00:57:06.154477 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-6ghqv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e146399d-0685-4ea8-96e4-c76c9478a23a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9fd8ff476d73855cbb3da6
4ae999f54bad2263813a70dba1aa61507b53102d42\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T00:56:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8h2vm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T00:56:11Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-6ghqv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T00:57:06Z is after 2025-08-24T17:21:41Z" Mar 11 00:57:06 crc kubenswrapper[4744]: I0311 00:57:06.172999 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-sj4cl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ac70f681-9bcf-4f8c-a175-ed7d4e9da471\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fafb45aea7f19d6db5ddf62115fb453eebda33f0be934b59e40083cd8387c896\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T00:56:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqrjx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://09583de802dbeffc06642379e30e628cf6541f82d5a11caa554cdf17e408476b\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://09583de802dbeffc06642379e30e628cf6541f82d5a11caa554cdf17e408476b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T00:56:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T00:56:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqrjx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9a370bc02fc3d53e08d44ce24a85d1e5b20f06a4933b61c8fef04aff8cfe6081\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9a370bc02fc3d53e08d44ce24a85d1e5b20f06a4933b61c8fef04aff8cfe6081\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T00:56:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T00:56:14Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqrjx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6ad4e761e6f478df27c55306ecb235f8cbbe0954433cbe7a029d585d609a08db\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6ad4e761e6f478df27c55306ecb235f8cbbe0954433cbe7a029d585d609a08db\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T00:56:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T00:56:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqrjx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://96ebe
5cbd3477966cb49d08397fee1954529001d0bede507c64ca71c25786a87\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://96ebe5cbd3477966cb49d08397fee1954529001d0bede507c64ca71c25786a87\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T00:56:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T00:56:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqrjx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cdbd8c4d24a706d746c84147c617f511da67e77e20235bb94b3b365552ef8c23\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cdbd8c4d24a706d746c84147c617f511da67e77e20235bb94b3b365552ef8c23\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T00:56:18Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-03-11T00:56:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqrjx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://81090de2003b39fdaee05a64219b7ebefa57068b2682e821d55ea9615ba67d4b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://81090de2003b39fdaee05a64219b7ebefa57068b2682e821d55ea9615ba67d4b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T00:56:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T00:56:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqrjx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T00:56:11Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-sj4cl\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T00:57:06Z is after 2025-08-24T17:21:41Z" Mar 11 00:57:06 crc kubenswrapper[4744]: I0311 00:57:06.195672 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-78fcc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6ff04e11-e747-44c5-b049-371a5d422157\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:11Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:11Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8132bd89cd281c66b2735b3fa85c3fc773e7140d6aaeaefe0c0a2c3e3f743080\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T00:56:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fr9zj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e37e770ef43eabf7e0bb162d7e700eaa2d61c7ceca9737de5872c1caeec06774\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T00:56:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fr9zj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://98ba07651ae84a42f92411fee85a15562e325c61ff2d8fcba4962d21c51fce73\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T00:56:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fr9zj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d7cd308cadb08ea56bbc709c6401ab1d3eec24f0414a7ee1c57188c5e60adffe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T00:56:14Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fr9zj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://84d4ada431e7a91cc86b5cb9bba4316c233c5f666f21bce1d8b6df0019b6ad76\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T00:56:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fr9zj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://27bc44383b34aea31ffcebae0f1f464c31918edc0c37f01d6bd66166e34fd4a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T00:56:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fr9zj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4df93d4dc2ebaa7aed01d39e44292956a190da5fd454d61f8e46a33114a1ccd0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4df93d4dc2ebaa7aed01d39e44292956a190da5fd454d61f8e46a33114a1ccd0\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-11T00:57:04Z\\\",\\\"message\\\":\\\".978646 7326 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0311 00:57:03.978789 7326 reflector.go:311] Stopping reflector *v1.Service (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0311 00:57:03.978970 7326 reflector.go:311] Stopping reflector *v1.UserDefinedNetwork (0s) from 
github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/userdefinednetwork/v1/apis/informers/externalversions/factory.go:140\\\\nI0311 00:57:03.979253 7326 reflector.go:311] Stopping reflector *v1.EndpointSlice (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0311 00:57:03.979372 7326 reflector.go:311] Stopping reflector *v1.NetworkPolicy (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0311 00:57:03.979446 7326 reflector.go:311] Stopping reflector *v1alpha1.BaselineAdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI0311 00:57:03.980617 7326 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0311 00:57:03.980642 7326 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0311 00:57:03.980675 7326 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0311 00:57:03.980679 7326 factory.go:656] Stopping watch factory\\\\nI0311 00:57:03.980695 7326 handler.go:208] Removed *v1.NetworkPolicy ev\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-11T00:57:03Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=ovnkube-controller 
pod=ovnkube-node-78fcc_openshift-ovn-kubernetes(6ff04e11-e747-44c5-b049-371a5d422157)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fr9zj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://742e3b6ac0accae72e9285636ab0920db1d3e0625d6093adf1606f57b484b139\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T00:56:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fr9zj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://524c79ee3bb53b187166cbc6bf19cbb3311ab561608a1d3c819288c1fb684481\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://524c79ee3bb53b1871
66cbc6bf19cbb3311ab561608a1d3c819288c1fb684481\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T00:56:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T00:56:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fr9zj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T00:56:11Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-78fcc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T00:57:06Z is after 2025-08-24T17:21:41Z" Mar 11 00:57:06 crc kubenswrapper[4744]: I0311 00:57:06.213947 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e522f1d8-5329-414c-88d5-79e6f3b615be\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:54:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:54:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:54:04Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:54:04Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:54:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://31713fedc28cf71b8df25e22000768f5d00e6de1b228a632aa3c51aea11eea7c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T00:54:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e702350d7b926ec4df6853a20470ea7515f1e1c98720db1a4d672ea4e03e82fa\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"
restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T00:54:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2d93a4815010e9ee724175731b03bf82804921c5bbb9fe8be236057f90057f28\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T00:54:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4b2b952a5490bff5028c83fdfb0942e84bac782d470cede57ec6a31f483a407e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4b2b952a5490bff5028c83fdfb0942e84bac782d470cede57ec6a31f483a407e\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-11T00:55:54Z\\\",\\\"message\\\":\\\"le observer\\\\nW0311 00:55:54.578734 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0311 00:55:54.579118 1 builder.go:304] check-endpoints version 
4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0311 00:55:54.580623 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3889625742/tls.crt::/tmp/serving-cert-3889625742/tls.key\\\\\\\"\\\\nI0311 00:55:54.806131 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0311 00:55:54.812332 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0311 00:55:54.812369 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0311 00:55:54.812401 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0311 00:55:54.812411 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0311 00:55:54.819161 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0311 00:55:54.819196 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0311 00:55:54.819211 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0311 00:55:54.819223 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0311 00:55:54.819235 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0311 00:55:54.819243 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0311 00:55:54.819248 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0311 00:55:54.819254 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0311 00:55:54.821179 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-11T00:55:54Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 1m20s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c9f3a6d71079f1717c07e0b026f34d1bfd6391a5bb6745bcdab535d5290e2aa3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T00:54:07Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://edfc2acc25990d1ccbca879e186fb9dfc0a6664fa1df5cee62958ff52b217748\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e
dfc2acc25990d1ccbca879e186fb9dfc0a6664fa1df5cee62958ff52b217748\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T00:54:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T00:54:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T00:54:04Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T00:57:06Z is after 2025-08-24T17:21:41Z" Mar 11 00:57:06 crc kubenswrapper[4744]: I0311 00:57:06.228103 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f6d4e3bc-7c73-40d6-9bde-5178882a794d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:54:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:54:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:54:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:54:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:54:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e3db74ff70ae81764297d0f56ae2c56f33d40d9e0025aacba1f56045eae524b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T00:54:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://779c97b306c3647569a5bade5ab4a8ca49f1fa84ded5fa830235ab1d4a039038\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962
a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://779c97b306c3647569a5bade5ab4a8ca49f1fa84ded5fa830235ab1d4a039038\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T00:54:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T00:54:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T00:54:04Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T00:57:06Z is after 2025-08-24T17:21:41Z" Mar 11 00:57:06 crc kubenswrapper[4744]: I0311 00:57:06.246814 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:11Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T00:57:06Z is after 2025-08-24T17:21:41Z" Mar 11 00:57:06 crc kubenswrapper[4744]: I0311 00:57:06.263989 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-rplbw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"92f4c9df-1087-4820-8e07-1120f02df454\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://34eabb0e2af63857d40e20376f7979131fef683885949c1ea75e0a4949968872\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T00:56:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lkpvs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2bfa60a94390dd23da621dd81523dde8107fe
6f3beba45ad1b7a96c0d6cd4f56\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T00:56:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lkpvs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T00:56:11Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-rplbw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T00:57:06Z is after 2025-08-24T17:21:41Z" Mar 11 00:57:06 crc kubenswrapper[4744]: I0311 00:57:06.285378 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:13Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:13Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://064a29fa0ed4ecddcf11aa3175a3fc935147341c372af46f94eee470a9ba363c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T00:56:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7c4d16fc0af970b13be0d202345567993d215bf342174b1b568c4b4ac26c0017\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T00:56:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T00:57:06Z is after 2025-08-24T17:21:41Z" Mar 11 00:57:06 crc kubenswrapper[4744]: I0311 00:57:06.300845 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-tdnf7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7aeb1578-fe93-4bec-8f43-17d0923fa5c0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:11Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:11Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-th54b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-th54b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T00:56:11Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-tdnf7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T00:57:06Z is after 2025-08-24T17:21:41Z" Mar 11 00:57:06 crc 
kubenswrapper[4744]: I0311 00:57:06.321123 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1a87b2ce-541f-40ba-a568-93914e7dea62\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:54:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:54:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:55:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:55:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:54:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://19989b1018d06da0230a62c4c231865e9d16267d0eb570a409da006b246005b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://832be2ac0e5b4b69727f022a0663974a495e008d3a781fa4d237d3347d39c154\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-11T00:54:37Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start 
--config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager -v=2\\\\nI0311 00:54:06.849622 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. Worst graceful lease acquisition is {26s}.\\\\nI0311 00:54:06.852083 1 observer_polling.go:159] Starting file observer\\\\nI0311 00:54:06.910593 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0311 00:54:06.914052 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nF0311 00:54:37.381234 1 cmd.go:179] failed checking apiserver connectivity: Get \\\\\\\"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/openshift-kube-controller-manager/leases/cluster-policy-controller-lock\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T00:54:36Z is after 
2026-02-23T05:33:13Z\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-11T00:54:05Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T00:54:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://eb1e3ef7609aa323e622249594eb4ac5cd5cacfa496d8189b3f612e29cb11773\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T00:54:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://82831ac0a065cf2ea80ca05df203fc6de89ad279faae95364ab2519b689453d8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T00:54:05Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ca4d36508e362c9b7f7365ade7dda3dec0c70f9df1be158a034aeaaa772c797e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T00:54:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T00:54:04Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T00:57:06Z is after 2025-08-24T17:21:41Z" Mar 11 00:57:06 crc kubenswrapper[4744]: I0311 00:57:06.340015 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e8d341ff-c992-4426-acea-a34e4a61fa6f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:54:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:54:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:54:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:54:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:54:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://01519860f21aa252a579dd461cd32233ed13bfb46d851c1c19410e2632ad4389\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T00:54:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0b3b81c668c0cd9ccdc04d171476dbbbe2bca625de84ad8aff48be69bbaa46b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T00:54:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7fba0fd93d322bd73e9140c40efe4dd1469525a5d6d26a0d1e32ba8e0920a2a4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T00:54:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9690b251bd56f0d091b5ae52e71e94528ba6afd0e61919398662fdadd4a9f89f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://9690b251bd56f0d091b5ae52e71e94528ba6afd0e61919398662fdadd4a9f89f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T00:54:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T00:54:05Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T00:54:04Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T00:57:06Z is after 2025-08-24T17:21:41Z" Mar 11 00:57:06 crc kubenswrapper[4744]: I0311 00:57:06.359249 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:16Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:16Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bff1516676048ec93cb3569f8720eb042036b92afc1c65f1c58b15fb604f58b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T00:56:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-11T00:57:06Z is after 2025-08-24T17:21:41Z" Mar 11 00:57:06 crc kubenswrapper[4744]: I0311 00:57:06.385357 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:11Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T00:57:06Z is after 2025-08-24T17:21:41Z" Mar 11 00:57:06 crc kubenswrapper[4744]: I0311 00:57:06.401928 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-8z8gf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ad8329c6-d511-446f-b617-99778d12b878\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://27268ce5ab229a693eeb80e4dfc38c14f18ad32dc201ace682b5516070c52fde\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T00:56:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7gczk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T00:56:11Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-8z8gf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T00:57:06Z is after 2025-08-24T17:21:41Z" Mar 11 00:57:06 crc kubenswrapper[4744]: I0311 00:57:06.422807 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-678nx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a15dc7ac-7c34-4135-b6eb-a85122800ce9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d3d16f6053ae793fbf194d5cbb556a8603bae74b52c9f07aa937dbfd654e3894\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshif
t-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T00:56:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d9t5q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9d5c92e7037925f24a5b88f538d4822fae94b65ea5f07960f3e465148975af4e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T00:56:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d9t5q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T00:56:11Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-678nx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call 
webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T00:57:06Z is after 2025-08-24T17:21:41Z" Mar 11 00:57:06 crc kubenswrapper[4744]: I0311 00:57:06.974322 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-tdnf7" Mar 11 00:57:06 crc kubenswrapper[4744]: E0311 00:57:06.974569 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-tdnf7" podUID="7aeb1578-fe93-4bec-8f43-17d0923fa5c0" Mar 11 00:57:07 crc kubenswrapper[4744]: I0311 00:57:07.974710 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 11 00:57:07 crc kubenswrapper[4744]: I0311 00:57:07.974754 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 11 00:57:07 crc kubenswrapper[4744]: E0311 00:57:07.974939 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 11 00:57:07 crc kubenswrapper[4744]: I0311 00:57:07.974755 4744 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 11 00:57:07 crc kubenswrapper[4744]: E0311 00:57:07.975068 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 11 00:57:07 crc kubenswrapper[4744]: E0311 00:57:07.975182 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 11 00:57:08 crc kubenswrapper[4744]: I0311 00:57:08.974250 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-tdnf7" Mar 11 00:57:08 crc kubenswrapper[4744]: E0311 00:57:08.974484 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-tdnf7" podUID="7aeb1578-fe93-4bec-8f43-17d0923fa5c0" Mar 11 00:57:09 crc kubenswrapper[4744]: E0311 00:57:09.093041 4744 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" Mar 11 00:57:09 crc kubenswrapper[4744]: I0311 00:57:09.974358 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 11 00:57:09 crc kubenswrapper[4744]: E0311 00:57:09.974625 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 11 00:57:09 crc kubenswrapper[4744]: I0311 00:57:09.974643 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 11 00:57:09 crc kubenswrapper[4744]: I0311 00:57:09.974692 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 11 00:57:09 crc kubenswrapper[4744]: E0311 00:57:09.974869 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 11 00:57:09 crc kubenswrapper[4744]: E0311 00:57:09.975095 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 11 00:57:09 crc kubenswrapper[4744]: I0311 00:57:09.975967 4744 scope.go:117] "RemoveContainer" containerID="4b2b952a5490bff5028c83fdfb0942e84bac782d470cede57ec6a31f483a407e" Mar 11 00:57:09 crc kubenswrapper[4744]: E0311 00:57:09.976264 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 1m20s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 11 00:57:10 crc kubenswrapper[4744]: I0311 00:57:10.974565 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-tdnf7" Mar 11 00:57:10 crc kubenswrapper[4744]: E0311 00:57:10.974788 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-tdnf7" podUID="7aeb1578-fe93-4bec-8f43-17d0923fa5c0" Mar 11 00:57:11 crc kubenswrapper[4744]: I0311 00:57:11.974742 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 11 00:57:11 crc kubenswrapper[4744]: I0311 00:57:11.974810 4744 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 11 00:57:11 crc kubenswrapper[4744]: E0311 00:57:11.975020 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 11 00:57:11 crc kubenswrapper[4744]: I0311 00:57:11.975164 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 11 00:57:11 crc kubenswrapper[4744]: E0311 00:57:11.975349 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 11 00:57:11 crc kubenswrapper[4744]: E0311 00:57:11.975486 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 11 00:57:12 crc kubenswrapper[4744]: I0311 00:57:12.974390 4744 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-tdnf7" Mar 11 00:57:12 crc kubenswrapper[4744]: E0311 00:57:12.975120 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-tdnf7" podUID="7aeb1578-fe93-4bec-8f43-17d0923fa5c0" Mar 11 00:57:13 crc kubenswrapper[4744]: I0311 00:57:13.978864 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 11 00:57:13 crc kubenswrapper[4744]: I0311 00:57:13.978980 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 11 00:57:13 crc kubenswrapper[4744]: E0311 00:57:13.979104 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 11 00:57:13 crc kubenswrapper[4744]: I0311 00:57:13.979221 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 11 00:57:13 crc kubenswrapper[4744]: E0311 00:57:13.979437 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 11 00:57:13 crc kubenswrapper[4744]: E0311 00:57:13.979665 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 11 00:57:13 crc kubenswrapper[4744]: I0311 00:57:13.998485 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-6ghqv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e146399d-0685-4ea8-96e4-c76c9478a23a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9fd8ff476d73855cbb3da64ae999f54bad2263813a70dba1aa61507b53102d42\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1
b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T00:56:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8h2vm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T00:56:11Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-6ghqv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T00:57:13Z is after 2025-08-24T17:21:41Z" Mar 11 00:57:14 crc kubenswrapper[4744]: I0311 00:57:14.024760 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-sj4cl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ac70f681-9bcf-4f8c-a175-ed7d4e9da471\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fafb45aea7f19d6db5ddf62115fb453eebda33f0be934b59e40083cd8387c896\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T00:56:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqrjx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://09583de802dbeffc06642379e30e628cf6541f82d5a11caa554cdf17e408476b\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://09583de802dbeffc06642379e30e628cf6541f82d5a11caa554cdf17e408476b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T00:56:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T00:56:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqrjx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9a370bc02fc3d53e08d44ce24a85d1e5b20f06a4933b61c8fef04aff8cfe6081\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9a370bc02fc3d53e08d44ce24a85d1e5b20f06a4933b61c8fef04aff8cfe6081\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T00:56:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T00:56:14Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqrjx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6ad4e761e6f478df27c55306ecb235f8cbbe0954433cbe7a029d585d609a08db\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6ad4e761e6f478df27c55306ecb235f8cbbe0954433cbe7a029d585d609a08db\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T00:56:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T00:56:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqrjx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://96ebe
5cbd3477966cb49d08397fee1954529001d0bede507c64ca71c25786a87\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://96ebe5cbd3477966cb49d08397fee1954529001d0bede507c64ca71c25786a87\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T00:56:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T00:56:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqrjx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cdbd8c4d24a706d746c84147c617f511da67e77e20235bb94b3b365552ef8c23\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cdbd8c4d24a706d746c84147c617f511da67e77e20235bb94b3b365552ef8c23\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T00:56:18Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-03-11T00:56:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqrjx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://81090de2003b39fdaee05a64219b7ebefa57068b2682e821d55ea9615ba67d4b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://81090de2003b39fdaee05a64219b7ebefa57068b2682e821d55ea9615ba67d4b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T00:56:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T00:56:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqrjx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T00:56:11Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-sj4cl\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T00:57:14Z is after 2025-08-24T17:21:41Z" Mar 11 00:57:14 crc kubenswrapper[4744]: I0311 00:57:14.063142 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-78fcc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6ff04e11-e747-44c5-b049-371a5d422157\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:11Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:11Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8132bd89cd281c66b2735b3fa85c3fc773e7140d6aaeaefe0c0a2c3e3f743080\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T00:56:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fr9zj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e37e770ef43eabf7e0bb162d7e700eaa2d61c7ceca9737de5872c1caeec06774\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T00:56:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fr9zj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://98ba07651ae84a42f92411fee85a15562e325c61ff2d8fcba4962d21c51fce73\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T00:56:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fr9zj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d7cd308cadb08ea56bbc709c6401ab1d3eec24f0414a7ee1c57188c5e60adffe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T00:56:14Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fr9zj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://84d4ada431e7a91cc86b5cb9bba4316c233c5f666f21bce1d8b6df0019b6ad76\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T00:56:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fr9zj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://27bc44383b34aea31ffcebae0f1f464c31918edc0c37f01d6bd66166e34fd4a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T00:56:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fr9zj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4df93d4dc2ebaa7aed01d39e44292956a190da5fd454d61f8e46a33114a1ccd0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4df93d4dc2ebaa7aed01d39e44292956a190da5fd454d61f8e46a33114a1ccd0\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-11T00:57:04Z\\\",\\\"message\\\":\\\".978646 7326 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0311 00:57:03.978789 7326 reflector.go:311] Stopping reflector *v1.Service (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0311 00:57:03.978970 7326 reflector.go:311] Stopping reflector *v1.UserDefinedNetwork (0s) from 
github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/userdefinednetwork/v1/apis/informers/externalversions/factory.go:140\\\\nI0311 00:57:03.979253 7326 reflector.go:311] Stopping reflector *v1.EndpointSlice (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0311 00:57:03.979372 7326 reflector.go:311] Stopping reflector *v1.NetworkPolicy (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0311 00:57:03.979446 7326 reflector.go:311] Stopping reflector *v1alpha1.BaselineAdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI0311 00:57:03.980617 7326 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0311 00:57:03.980642 7326 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0311 00:57:03.980675 7326 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0311 00:57:03.980679 7326 factory.go:656] Stopping watch factory\\\\nI0311 00:57:03.980695 7326 handler.go:208] Removed *v1.NetworkPolicy ev\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-11T00:57:03Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=ovnkube-controller 
pod=ovnkube-node-78fcc_openshift-ovn-kubernetes(6ff04e11-e747-44c5-b049-371a5d422157)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fr9zj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://742e3b6ac0accae72e9285636ab0920db1d3e0625d6093adf1606f57b484b139\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T00:56:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fr9zj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://524c79ee3bb53b187166cbc6bf19cbb3311ab561608a1d3c819288c1fb684481\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://524c79ee3bb53b1871
66cbc6bf19cbb3311ab561608a1d3c819288c1fb684481\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T00:56:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T00:56:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fr9zj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T00:56:11Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-78fcc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T00:57:14Z is after 2025-08-24T17:21:41Z" Mar 11 00:57:14 crc kubenswrapper[4744]: E0311 00:57:14.094285 4744 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
Mar 11 00:57:14 crc kubenswrapper[4744]: I0311 00:57:14.100990 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"423996ee-605e-4e25-8f5c-d4b717716b5d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:54:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:54:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:54:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:54:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:54:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9c1cfbe4f99dbf1bd16e0cd6732534be630dc6127842dcbf6ee0efd6d3b9d673\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T00:54:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\
\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://852392ca4fef569658b15e593c3c30b1079a7f59b765ca2d9fc658c42249586a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T00:54:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6ee528bcd7bebc942b3ac8d59654402bb3629e259d6ba425f763feb6103f0255\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T00:54:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://eba7e9bfb431d0fee5860ee8602254d4c66bac6529691d8f2639598cf30509d9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee
1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T00:54:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cdb483b36dc9e6541351c051052844f192758b53d5ef7363a0b924be57ce9ca0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T00:54:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://28d9226ae39a5ef12c0715e064e540ee7f14b855833284b8a5af039f2c663a36\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"start
ed\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://28d9226ae39a5ef12c0715e064e540ee7f14b855833284b8a5af039f2c663a36\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T00:54:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T00:54:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fd9cf6603d67dc75a034fdbdfa82ce4551e42b5ef1c535f8d42c4cb3bc7b3d30\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fd9cf6603d67dc75a034fdbdfa82ce4551e42b5ef1c535f8d42c4cb3bc7b3d30\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T00:54:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T00:54:06Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://bccf1928ef633d8209fbff4bb71b38f77361ca43e852c5cfa8529a4b4031e245\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bccf1928ef633d8209fbff4bb71b38f77361ca43e852c5cfa8529a4b4031e245\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T00:54:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T00:54:07Z\\\"}},\\\"volumeM
ounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T00:54:04Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T00:57:14Z is after 2025-08-24T17:21:41Z" Mar 11 00:57:14 crc kubenswrapper[4744]: I0311 00:57:14.108254 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 00:57:14 crc kubenswrapper[4744]: I0311 00:57:14.108312 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 00:57:14 crc kubenswrapper[4744]: I0311 00:57:14.108325 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 00:57:14 crc kubenswrapper[4744]: I0311 00:57:14.108345 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 00:57:14 crc kubenswrapper[4744]: I0311 00:57:14.108359 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T00:57:14Z","lastTransitionTime":"2026-03-11T00:57:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 11 00:57:14 crc kubenswrapper[4744]: E0311 00:57:14.124272 4744 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-11T00:57:14Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-11T00:57:14Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-11T00:57:14Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-11T00:57:14Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-11T00:57:14Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-11T00:57:14Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-11T00:57:14Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-11T00:57:14Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"51e320bc-e184-46b0-b151-baf1fef55472\\\",\\\"systemUUID\\\":\\\"b27597af-c36d-4084-a073-6dfdbb017181\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T00:57:14Z is after 2025-08-24T17:21:41Z" Mar 11 00:57:14 crc kubenswrapper[4744]: I0311 00:57:14.127841 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:12Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:12Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://80f7b8faf68f817bd772c7988a46ffcc286551806463dc59776e5c6e18deb8f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T00:56:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\
\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T00:57:14Z is after 2025-08-24T17:21:41Z" Mar 11 00:57:14 crc kubenswrapper[4744]: I0311 00:57:14.130015 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 00:57:14 crc kubenswrapper[4744]: I0311 00:57:14.130098 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 00:57:14 crc kubenswrapper[4744]: I0311 00:57:14.130126 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 00:57:14 crc kubenswrapper[4744]: I0311 00:57:14.130156 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 00:57:14 crc kubenswrapper[4744]: I0311 00:57:14.130176 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T00:57:14Z","lastTransitionTime":"2026-03-11T00:57:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 11 00:57:14 crc kubenswrapper[4744]: I0311 00:57:14.148121 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:11Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T00:57:14Z is after 2025-08-24T17:21:41Z" Mar 11 00:57:14 crc kubenswrapper[4744]: E0311 00:57:14.152693 4744 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-11T00:57:14Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-11T00:57:14Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-11T00:57:14Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-11T00:57:14Z\\\",\\\"message\\\":\\\"kubelet 
has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-11T00:57:14Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-11T00:57:14Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-11T00:57:14Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-11T00:57:14Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800
f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sh
a256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\
":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256
:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc300
5909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"51e320bc-e184-46b0-b151-baf1fef55472\\\",\\\"systemUUID\\\":\\\"b27597af-c36d-4084-a073-6dfdbb017181\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T00:57:14Z is after 2025-08-24T17:21:41Z" Mar 11 00:57:14 crc kubenswrapper[4744]: I0311 00:57:14.161461 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 00:57:14 crc kubenswrapper[4744]: I0311 00:57:14.161552 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 00:57:14 crc kubenswrapper[4744]: I0311 00:57:14.161573 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 00:57:14 crc kubenswrapper[4744]: I0311 00:57:14.161600 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 00:57:14 crc kubenswrapper[4744]: I0311 00:57:14.161618 4744 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T00:57:14Z","lastTransitionTime":"2026-03-11T00:57:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 11 00:57:14 crc kubenswrapper[4744]: I0311 00:57:14.173203 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-xlclh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e16bf0f3-533b-4114-89c6-195a85273e98\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:57:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:57:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0289bdbe344d516cb0576b0b739e5752ec2f754bd78918b1d81f20d880ebd1f5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3cf2
ef2701f6c6e8715d3c41c3d7c4d704d6f03989831a0c3d34e92ab7a6dd50\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-11T00:56:59Z\\\",\\\"message\\\":\\\"2026-03-11T00:56:13+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_f2f2d951-393c-4c01-b807-b3f13342c8c8\\\\n2026-03-11T00:56:13+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_f2f2d951-393c-4c01-b807-b3f13342c8c8 to /host/opt/cni/bin/\\\\n2026-03-11T00:56:14Z [verbose] multus-daemon started\\\\n2026-03-11T00:56:14Z [verbose] Readiness Indicator file check\\\\n2026-03-11T00:56:59Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-11T00:56:12Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T00:57:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/k
ubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pgqmk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T00:56:11Z\\\"}}\" for pod \"openshift-multus\"/\"multus-xlclh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T00:57:14Z is after 2025-08-24T17:21:41Z" Mar 11 00:57:14 crc kubenswrapper[4744]: E0311 00:57:14.179771 4744 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status 
\"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-11T00:57:14Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-11T00:57:14Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-11T00:57:14Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-11T00:57:14Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-11T00:57:14Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-11T00:57:14Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-11T00:57:14Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-11T00:57:14Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"51e320bc-e184-46b0-b151-baf1fef55472\\\",\\\"systemUUID\\\":\\\"b27597af-c36d-4084-a073-6dfdbb017181\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T00:57:14Z is after 2025-08-24T17:21:41Z" Mar 11 00:57:14 crc kubenswrapper[4744]: I0311 00:57:14.184849 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 00:57:14 crc kubenswrapper[4744]: I0311 00:57:14.184914 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 00:57:14 crc kubenswrapper[4744]: I0311 00:57:14.184932 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 00:57:14 crc kubenswrapper[4744]: I0311 00:57:14.184960 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 00:57:14 crc kubenswrapper[4744]: I0311 00:57:14.184977 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T00:57:14Z","lastTransitionTime":"2026-03-11T00:57:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 11 00:57:14 crc kubenswrapper[4744]: I0311 00:57:14.196418 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e522f1d8-5329-414c-88d5-79e6f3b615be\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:54:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:54:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:54:04Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:54:04Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:54:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://31713fedc28cf71b8df25e22000768f5d00e6de1b228a632aa3c51aea11eea7c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T00:54:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e702350d7b926ec4df6853a20470ea7515f1e1c98720db1a4d672ea4e03e82fa\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T00:54:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://2d93a4815010e9ee724175731b03bf82804921c5bbb9fe8be236057f90057f28\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T00:54:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4b2b952a5490bff5028c83fdfb0942e84bac782d470cede57ec6a31f483a407e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4b2b952a5490bff5028c83fdfb0942e84bac782d470cede57ec6a31f483a407e\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-11T00:55:54Z\\\",\\\"message\\\":\\\"le observer\\\\nW0311 00:55:54.578734 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0311 00:55:54.579118 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0311 00:55:54.580623 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-3889625742/tls.crt::/tmp/serving-cert-3889625742/tls.key\\\\\\\"\\\\nI0311 00:55:54.806131 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0311 00:55:54.812332 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0311 00:55:54.812369 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0311 00:55:54.812401 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0311 00:55:54.812411 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0311 00:55:54.819161 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0311 00:55:54.819196 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0311 00:55:54.819211 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0311 00:55:54.819223 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0311 00:55:54.819235 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0311 00:55:54.819243 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0311 00:55:54.819248 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0311 00:55:54.819254 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0311 00:55:54.821179 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-11T00:55:54Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 1m20s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c9f3a6d71079f1717c07e0b026f34d1bfd6391a5bb6745bcdab535d5290e2aa3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T00:54:07Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://edfc2acc25990d1ccbca879e186fb9dfc0a6664fa1df5cee62958ff52b217748\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://edfc2acc25990d1ccbca879e186fb9dfc0a6664fa1df5cee62958ff52b217748\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T00:54:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T00:54:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T00:54:04Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T00:57:14Z is after 2025-08-24T17:21:41Z" Mar 11 00:57:14 crc kubenswrapper[4744]: E0311 00:57:14.206238 4744 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-11T00:57:14Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-11T00:57:14Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-11T00:57:14Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-11T00:57:14Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-11T00:57:14Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-11T00:57:14Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-11T00:57:14Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-11T00:57:14Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"
registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\
"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb617
3ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"reg
istry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@s
ha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"51e320bc-e184-46b0-b151-baf1fef55472\\\",\\\"systemUUID\\\":\\\"b27597af-c36d-4084-a073-6dfdbb017181\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T00:57:14Z is after 2025-08-24T17:21:41Z" Mar 11 00:57:14 crc kubenswrapper[4744]: I0311 00:57:14.210960 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 00:57:14 crc kubenswrapper[4744]: I0311 00:57:14.211028 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 00:57:14 crc kubenswrapper[4744]: I0311 00:57:14.211051 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 00:57:14 crc kubenswrapper[4744]: I0311 00:57:14.211079 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 00:57:14 crc kubenswrapper[4744]: I0311 00:57:14.211100 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T00:57:14Z","lastTransitionTime":"2026-03-11T00:57:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 11 00:57:14 crc kubenswrapper[4744]: I0311 00:57:14.216587 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f6d4e3bc-7c73-40d6-9bde-5178882a794d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:54:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:54:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:54:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:54:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:54:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e3db74ff70ae81764297d0f56ae2c56f33d40d9e0025aacba1f56045eae524b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T00:54:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11
\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://779c97b306c3647569a5bade5ab4a8ca49f1fa84ded5fa830235ab1d4a039038\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://779c97b306c3647569a5bade5ab4a8ca49f1fa84ded5fa830235ab1d4a039038\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T00:54:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T00:54:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T00:54:04Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T00:57:14Z is after 2025-08-24T17:21:41Z" Mar 11 00:57:14 crc kubenswrapper[4744]: E0311 00:57:14.228292 4744 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status 
\"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-11T00:57:14Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-11T00:57:14Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-11T00:57:14Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-11T00:57:14Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-11T00:57:14Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-11T00:57:14Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-11T00:57:14Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-11T00:57:14Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"51e320bc-e184-46b0-b151-baf1fef55472\\\",\\\"systemUUID\\\":\\\"b27597af-c36d-4084-a073-6dfdbb017181\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T00:57:14Z is after 2025-08-24T17:21:41Z" Mar 11 00:57:14 crc kubenswrapper[4744]: E0311 00:57:14.228559 4744 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 11 00:57:14 crc kubenswrapper[4744]: I0311 00:57:14.236259 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:11Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T00:57:14Z is after 2025-08-24T17:21:41Z" Mar 11 00:57:14 crc kubenswrapper[4744]: I0311 00:57:14.253146 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-rplbw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"92f4c9df-1087-4820-8e07-1120f02df454\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://34eabb0e2af63857d40e20376f7979131fef683885949c1ea75e0a4949968872\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T00:56:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lkpvs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2bfa60a94390dd23da621dd81523dde8107fe
6f3beba45ad1b7a96c0d6cd4f56\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T00:56:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lkpvs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T00:56:11Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-rplbw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T00:57:14Z is after 2025-08-24T17:21:41Z" Mar 11 00:57:14 crc kubenswrapper[4744]: I0311 00:57:14.272861 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:13Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:13Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://064a29fa0ed4ecddcf11aa3175a3fc935147341c372af46f94eee470a9ba363c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T00:56:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7c4d16fc0af970b13be0d202345567993d215bf342174b1b568c4b4ac26c0017\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T00:56:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T00:57:14Z is after 2025-08-24T17:21:41Z" Mar 11 00:57:14 crc kubenswrapper[4744]: I0311 00:57:14.290376 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-tdnf7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7aeb1578-fe93-4bec-8f43-17d0923fa5c0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:11Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:11Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-th54b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-th54b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T00:56:11Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-tdnf7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T00:57:14Z is after 2025-08-24T17:21:41Z" Mar 11 00:57:14 crc 
kubenswrapper[4744]: I0311 00:57:14.307437 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-8z8gf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ad8329c6-d511-446f-b617-99778d12b878\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://27268ce5ab229a693eeb80e4dfc38c14f18ad32dc201ace682b5516070c52fde\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T00:56:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7
gczk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T00:56:11Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-8z8gf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T00:57:14Z is after 2025-08-24T17:21:41Z" Mar 11 00:57:14 crc kubenswrapper[4744]: I0311 00:57:14.325633 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-678nx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a15dc7ac-7c34-4135-b6eb-a85122800ce9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d3d16f6053ae793fbf194d5cbb556a8603bae74b52c9f07aa937dbfd654e3894\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T00:56:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d9t5q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9d5c92e7037925f24a5b88f538d4822fae94b65ea5f07960f3e465148975af4e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T00:56:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d9t5q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T00:56:11Z\\\"}}\" for pod 
\"openshift-machine-config-operator\"/\"machine-config-daemon-678nx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T00:57:14Z is after 2025-08-24T17:21:41Z" Mar 11 00:57:14 crc kubenswrapper[4744]: I0311 00:57:14.354938 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1a87b2ce-541f-40ba-a568-93914e7dea62\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:54:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:54:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:55:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:55:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:54:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://19989b1018d06da0230a62c4c231865e9d16267d0eb570a409da006b246005b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://832be2ac0e5b4b69727f022a0663974a495e008d3a781fa4d
237d3347d39c154\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-11T00:54:37Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager -v=2\\\\nI0311 00:54:06.849622 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. Worst graceful lease acquisition is {26s}.\\\\nI0311 00:54:06.852083 1 observer_polling.go:159] Starting file observer\\\\nI0311 00:54:06.910593 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0311 00:54:06.914052 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nF0311 00:54:37.381234 1 cmd.go:179] failed checking apiserver connectivity: Get \\\\\\\"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/openshift-kube-controller-manager/leases/cluster-policy-controller-lock\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T00:54:36Z is after 
2026-02-23T05:33:13Z\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-11T00:54:05Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T00:54:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://eb1e3ef7609aa323e622249594eb4ac5cd5cacfa496d8189b3f612e29cb11773\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T00:54:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://82831ac0a065cf2ea80ca05df203fc6de89ad279faae95364ab2519b689453d8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T00:54:05Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ca4d36508e362c9b7f7365ade7dda3dec0c70f9df1be158a034aeaaa772c797e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T00:54:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T00:54:04Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T00:57:14Z is after 2025-08-24T17:21:41Z" Mar 11 00:57:14 crc kubenswrapper[4744]: I0311 00:57:14.373593 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e8d341ff-c992-4426-acea-a34e4a61fa6f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:54:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:54:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:54:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:54:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T00:54:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://01519860f21aa252a579dd461cd32233ed13bfb46d851c1c19410e2632ad4389\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T00:54:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0b3b81c668c0cd9ccdc04d171476dbbbe2bca625de84ad8aff48be69bbaa46b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T00:54:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7fba0fd93d322bd73e9140c40efe4dd1469525a5d6d26a0d1e32ba8e0920a2a4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T00:54:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9690b251bd56f0d091b5ae52e71e94528ba6afd0e61919398662fdadd4a9f89f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://9690b251bd56f0d091b5ae52e71e94528ba6afd0e61919398662fdadd4a9f89f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T00:54:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T00:54:05Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T00:54:04Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T00:57:14Z is after 2025-08-24T17:21:41Z" Mar 11 00:57:14 crc kubenswrapper[4744]: I0311 00:57:14.391985 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:16Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:16Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bff1516676048ec93cb3569f8720eb042036b92afc1c65f1c58b15fb604f58b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T00:56:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-11T00:57:14Z is after 2025-08-24T17:21:41Z" Mar 11 00:57:14 crc kubenswrapper[4744]: I0311 00:57:14.411986 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T00:56:11Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T00:57:14Z is after 2025-08-24T17:21:41Z" Mar 11 00:57:14 crc kubenswrapper[4744]: I0311 00:57:14.973843 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-tdnf7" Mar 11 00:57:14 crc kubenswrapper[4744]: E0311 00:57:14.974182 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-tdnf7" podUID="7aeb1578-fe93-4bec-8f43-17d0923fa5c0" Mar 11 00:57:15 crc kubenswrapper[4744]: E0311 00:57:15.762357 4744 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-11 00:58:19.762328048 +0000 UTC m=+256.566545693 (durationBeforeRetry 1m4s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 11 00:57:15 crc kubenswrapper[4744]: I0311 00:57:15.762207 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 11 00:57:15 crc kubenswrapper[4744]: I0311 00:57:15.762604 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 11 00:57:15 crc kubenswrapper[4744]: E0311 00:57:15.762813 4744 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 11 
00:57:15 crc kubenswrapper[4744]: E0311 00:57:15.762838 4744 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 11 00:57:15 crc kubenswrapper[4744]: E0311 00:57:15.762858 4744 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 11 00:57:15 crc kubenswrapper[4744]: E0311 00:57:15.762932 4744 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-03-11 00:58:19.762914346 +0000 UTC m=+256.567132011 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 11 00:57:15 crc kubenswrapper[4744]: I0311 00:57:15.763240 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 11 00:57:15 crc kubenswrapper[4744]: E0311 00:57:15.763378 4744 secret.go:188] Couldn't get secret 
openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 11 00:57:15 crc kubenswrapper[4744]: E0311 00:57:15.763442 4744 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-11 00:58:19.763425081 +0000 UTC m=+256.567642726 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 11 00:57:15 crc kubenswrapper[4744]: I0311 00:57:15.763756 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 11 00:57:15 crc kubenswrapper[4744]: E0311 00:57:15.763866 4744 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Mar 11 00:57:15 crc kubenswrapper[4744]: E0311 00:57:15.763926 4744 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-11 00:58:19.763911175 +0000 UTC m=+256.568128820 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Mar 11 00:57:15 crc kubenswrapper[4744]: E0311 00:57:15.865871 4744 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 11 00:57:15 crc kubenswrapper[4744]: E0311 00:57:15.865934 4744 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 11 00:57:15 crc kubenswrapper[4744]: E0311 00:57:15.865962 4744 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 11 00:57:15 crc kubenswrapper[4744]: E0311 00:57:15.866058 4744 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-03-11 00:58:19.866028724 +0000 UTC m=+256.670246359 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 11 00:57:15 crc kubenswrapper[4744]: I0311 00:57:15.865668 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 11 00:57:15 crc kubenswrapper[4744]: I0311 00:57:15.866480 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/7aeb1578-fe93-4bec-8f43-17d0923fa5c0-metrics-certs\") pod \"network-metrics-daemon-tdnf7\" (UID: \"7aeb1578-fe93-4bec-8f43-17d0923fa5c0\") " pod="openshift-multus/network-metrics-daemon-tdnf7" Mar 11 00:57:15 crc kubenswrapper[4744]: E0311 00:57:15.866627 4744 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Mar 11 00:57:15 crc kubenswrapper[4744]: E0311 00:57:15.866683 4744 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7aeb1578-fe93-4bec-8f43-17d0923fa5c0-metrics-certs podName:7aeb1578-fe93-4bec-8f43-17d0923fa5c0 nodeName:}" failed. No retries permitted until 2026-03-11 00:58:19.866667143 +0000 UTC m=+256.670884788 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/7aeb1578-fe93-4bec-8f43-17d0923fa5c0-metrics-certs") pod "network-metrics-daemon-tdnf7" (UID: "7aeb1578-fe93-4bec-8f43-17d0923fa5c0") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 11 00:57:15 crc kubenswrapper[4744]: I0311 00:57:15.974289 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 11 00:57:15 crc kubenswrapper[4744]: I0311 00:57:15.974418 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 11 00:57:15 crc kubenswrapper[4744]: E0311 00:57:15.974646 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 11 00:57:15 crc kubenswrapper[4744]: I0311 00:57:15.974736 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 11 00:57:15 crc kubenswrapper[4744]: E0311 00:57:15.974908 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 11 00:57:15 crc kubenswrapper[4744]: E0311 00:57:15.975033 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 11 00:57:16 crc kubenswrapper[4744]: I0311 00:57:16.974422 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-tdnf7" Mar 11 00:57:16 crc kubenswrapper[4744]: E0311 00:57:16.974751 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-tdnf7" podUID="7aeb1578-fe93-4bec-8f43-17d0923fa5c0" Mar 11 00:57:17 crc kubenswrapper[4744]: I0311 00:57:17.974264 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 11 00:57:17 crc kubenswrapper[4744]: I0311 00:57:17.974360 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 11 00:57:17 crc kubenswrapper[4744]: I0311 00:57:17.974306 4744 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 11 00:57:17 crc kubenswrapper[4744]: E0311 00:57:17.974615 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 11 00:57:17 crc kubenswrapper[4744]: E0311 00:57:17.974793 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 11 00:57:17 crc kubenswrapper[4744]: E0311 00:57:17.974940 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 11 00:57:18 crc kubenswrapper[4744]: I0311 00:57:18.974736 4744 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-tdnf7" Mar 11 00:57:18 crc kubenswrapper[4744]: E0311 00:57:18.974954 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-tdnf7" podUID="7aeb1578-fe93-4bec-8f43-17d0923fa5c0" Mar 11 00:57:19 crc kubenswrapper[4744]: E0311 00:57:19.096543 4744 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 11 00:57:19 crc kubenswrapper[4744]: I0311 00:57:19.974285 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 11 00:57:19 crc kubenswrapper[4744]: I0311 00:57:19.974394 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 11 00:57:19 crc kubenswrapper[4744]: I0311 00:57:19.974474 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 11 00:57:19 crc kubenswrapper[4744]: E0311 00:57:19.974739 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 11 00:57:19 crc kubenswrapper[4744]: E0311 00:57:19.974875 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 11 00:57:19 crc kubenswrapper[4744]: E0311 00:57:19.975044 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 11 00:57:20 crc kubenswrapper[4744]: I0311 00:57:20.974307 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-tdnf7" Mar 11 00:57:20 crc kubenswrapper[4744]: E0311 00:57:20.974745 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-tdnf7" podUID="7aeb1578-fe93-4bec-8f43-17d0923fa5c0" Mar 11 00:57:20 crc kubenswrapper[4744]: I0311 00:57:20.976413 4744 scope.go:117] "RemoveContainer" containerID="4df93d4dc2ebaa7aed01d39e44292956a190da5fd454d61f8e46a33114a1ccd0" Mar 11 00:57:20 crc kubenswrapper[4744]: E0311 00:57:20.976849 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-78fcc_openshift-ovn-kubernetes(6ff04e11-e747-44c5-b049-371a5d422157)\"" pod="openshift-ovn-kubernetes/ovnkube-node-78fcc" podUID="6ff04e11-e747-44c5-b049-371a5d422157" Mar 11 00:57:21 crc kubenswrapper[4744]: I0311 00:57:21.974735 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 11 00:57:21 crc kubenswrapper[4744]: I0311 00:57:21.974799 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 11 00:57:21 crc kubenswrapper[4744]: I0311 00:57:21.974799 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 11 00:57:21 crc kubenswrapper[4744]: E0311 00:57:21.975056 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 11 00:57:21 crc kubenswrapper[4744]: E0311 00:57:21.975164 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 11 00:57:21 crc kubenswrapper[4744]: E0311 00:57:21.975358 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 11 00:57:22 crc kubenswrapper[4744]: I0311 00:57:22.974798 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-tdnf7" Mar 11 00:57:22 crc kubenswrapper[4744]: E0311 00:57:22.975722 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-tdnf7" podUID="7aeb1578-fe93-4bec-8f43-17d0923fa5c0" Mar 11 00:57:23 crc kubenswrapper[4744]: I0311 00:57:23.973789 4744 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 11 00:57:23 crc kubenswrapper[4744]: E0311 00:57:23.973979 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 11 00:57:23 crc kubenswrapper[4744]: I0311 00:57:23.974082 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 11 00:57:23 crc kubenswrapper[4744]: I0311 00:57:23.974093 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 11 00:57:23 crc kubenswrapper[4744]: E0311 00:57:23.974631 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 11 00:57:23 crc kubenswrapper[4744]: E0311 00:57:23.974729 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 11 00:57:23 crc kubenswrapper[4744]: I0311 00:57:23.976239 4744 scope.go:117] "RemoveContainer" containerID="4b2b952a5490bff5028c83fdfb0942e84bac782d470cede57ec6a31f483a407e" Mar 11 00:57:24 crc kubenswrapper[4744]: I0311 00:57:24.036136 4744 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-etcd/etcd-crc" podStartSLOduration=21.03610843 podStartE2EDuration="21.03610843s" podCreationTimestamp="2026-03-11 00:57:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-11 00:57:24.03413414 +0000 UTC m=+200.838351785" watchObservedRunningTime="2026-03-11 00:57:24.03610843 +0000 UTC m=+200.840326075" Mar 11 00:57:24 crc kubenswrapper[4744]: E0311 00:57:24.098371 4744 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
Mar 11 00:57:24 crc kubenswrapper[4744]: I0311 00:57:24.114822 4744 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-xlclh" podStartSLOduration=139.114804583 podStartE2EDuration="2m19.114804583s" podCreationTimestamp="2026-03-11 00:55:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-11 00:57:24.108998905 +0000 UTC m=+200.913216510" watchObservedRunningTime="2026-03-11 00:57:24.114804583 +0000 UTC m=+200.919022198" Mar 11 00:57:24 crc kubenswrapper[4744]: I0311 00:57:24.126354 4744 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-6ghqv" podStartSLOduration=139.126338249 podStartE2EDuration="2m19.126338249s" podCreationTimestamp="2026-03-11 00:55:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-11 00:57:24.126020729 +0000 UTC m=+200.930238334" watchObservedRunningTime="2026-03-11 00:57:24.126338249 +0000 UTC m=+200.930555864" Mar 11 00:57:24 crc kubenswrapper[4744]: I0311 00:57:24.154036 4744 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-sj4cl" podStartSLOduration=139.154018631 podStartE2EDuration="2m19.154018631s" podCreationTimestamp="2026-03-11 00:55:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-11 00:57:24.152128052 +0000 UTC m=+200.956345667" watchObservedRunningTime="2026-03-11 00:57:24.154018631 +0000 UTC m=+200.958236236" Mar 11 00:57:24 crc kubenswrapper[4744]: I0311 00:57:24.212940 4744 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" podStartSLOduration=54.212912474 podStartE2EDuration="54.212912474s" 
podCreationTimestamp="2026-03-11 00:56:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-11 00:57:24.211974695 +0000 UTC m=+201.016192380" watchObservedRunningTime="2026-03-11 00:57:24.212912474 +0000 UTC m=+201.017130119" Mar 11 00:57:24 crc kubenswrapper[4744]: I0311 00:57:24.244227 4744 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-rplbw" podStartSLOduration=138.244205678 podStartE2EDuration="2m18.244205678s" podCreationTimestamp="2026-03-11 00:55:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-11 00:57:24.243784845 +0000 UTC m=+201.048002450" watchObservedRunningTime="2026-03-11 00:57:24.244205678 +0000 UTC m=+201.048423283" Mar 11 00:57:24 crc kubenswrapper[4744]: I0311 00:57:24.294471 4744 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podStartSLOduration=73.294449355 podStartE2EDuration="1m13.294449355s" podCreationTimestamp="2026-03-11 00:56:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-11 00:57:24.293823456 +0000 UTC m=+201.098041091" watchObservedRunningTime="2026-03-11 00:57:24.294449355 +0000 UTC m=+201.098666970" Mar 11 00:57:24 crc kubenswrapper[4744]: I0311 00:57:24.310923 4744 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" podStartSLOduration=62.310894661 podStartE2EDuration="1m2.310894661s" podCreationTimestamp="2026-03-11 00:56:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-11 00:57:24.310235661 +0000 UTC 
m=+201.114453276" watchObservedRunningTime="2026-03-11 00:57:24.310894661 +0000 UTC m=+201.115112276" Mar 11 00:57:24 crc kubenswrapper[4744]: I0311 00:57:24.361368 4744 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-8z8gf" podStartSLOduration=139.361346875 podStartE2EDuration="2m19.361346875s" podCreationTimestamp="2026-03-11 00:55:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-11 00:57:24.360094246 +0000 UTC m=+201.164311851" watchObservedRunningTime="2026-03-11 00:57:24.361346875 +0000 UTC m=+201.165564490" Mar 11 00:57:24 crc kubenswrapper[4744]: I0311 00:57:24.375242 4744 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-daemon-678nx" podStartSLOduration=139.375221092 podStartE2EDuration="2m19.375221092s" podCreationTimestamp="2026-03-11 00:55:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-11 00:57:24.374312684 +0000 UTC m=+201.178530319" watchObservedRunningTime="2026-03-11 00:57:24.375221092 +0000 UTC m=+201.179438707" Mar 11 00:57:24 crc kubenswrapper[4744]: I0311 00:57:24.481571 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 00:57:24 crc kubenswrapper[4744]: I0311 00:57:24.481666 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 00:57:24 crc kubenswrapper[4744]: I0311 00:57:24.481685 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 00:57:24 crc kubenswrapper[4744]: I0311 00:57:24.481744 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 00:57:24 crc kubenswrapper[4744]: I0311 
00:57:24.481763 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T00:57:24Z","lastTransitionTime":"2026-03-11T00:57:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 11 00:57:24 crc kubenswrapper[4744]: I0311 00:57:24.554711 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-version/cluster-version-operator-5c965bbfc6-c275q"] Mar 11 00:57:24 crc kubenswrapper[4744]: I0311 00:57:24.555899 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-c275q" Mar 11 00:57:24 crc kubenswrapper[4744]: I0311 00:57:24.559568 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"default-dockercfg-gxtc4" Mar 11 00:57:24 crc kubenswrapper[4744]: I0311 00:57:24.560850 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"cluster-version-operator-serving-cert" Mar 11 00:57:24 crc kubenswrapper[4744]: I0311 00:57:24.563816 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"kube-root-ca.crt" Mar 11 00:57:24 crc kubenswrapper[4744]: I0311 00:57:24.564047 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"openshift-service-ca.crt" Mar 11 00:57:24 crc kubenswrapper[4744]: I0311 00:57:24.677657 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/b3216516-0566-403a-ad05-8a14e1f52bf4-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-c275q\" (UID: \"b3216516-0566-403a-ad05-8a14e1f52bf4\") " 
pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-c275q" Mar 11 00:57:24 crc kubenswrapper[4744]: I0311 00:57:24.677710 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/b3216516-0566-403a-ad05-8a14e1f52bf4-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-c275q\" (UID: \"b3216516-0566-403a-ad05-8a14e1f52bf4\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-c275q" Mar 11 00:57:24 crc kubenswrapper[4744]: I0311 00:57:24.677758 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b3216516-0566-403a-ad05-8a14e1f52bf4-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-c275q\" (UID: \"b3216516-0566-403a-ad05-8a14e1f52bf4\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-c275q" Mar 11 00:57:24 crc kubenswrapper[4744]: I0311 00:57:24.677816 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/b3216516-0566-403a-ad05-8a14e1f52bf4-service-ca\") pod \"cluster-version-operator-5c965bbfc6-c275q\" (UID: \"b3216516-0566-403a-ad05-8a14e1f52bf4\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-c275q" Mar 11 00:57:24 crc kubenswrapper[4744]: I0311 00:57:24.677830 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/b3216516-0566-403a-ad05-8a14e1f52bf4-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-c275q\" (UID: \"b3216516-0566-403a-ad05-8a14e1f52bf4\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-c275q" Mar 11 00:57:24 crc kubenswrapper[4744]: I0311 00:57:24.779715 4744 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/b3216516-0566-403a-ad05-8a14e1f52bf4-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-c275q\" (UID: \"b3216516-0566-403a-ad05-8a14e1f52bf4\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-c275q" Mar 11 00:57:24 crc kubenswrapper[4744]: I0311 00:57:24.779823 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/b3216516-0566-403a-ad05-8a14e1f52bf4-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-c275q\" (UID: \"b3216516-0566-403a-ad05-8a14e1f52bf4\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-c275q" Mar 11 00:57:24 crc kubenswrapper[4744]: I0311 00:57:24.779899 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/b3216516-0566-403a-ad05-8a14e1f52bf4-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-c275q\" (UID: \"b3216516-0566-403a-ad05-8a14e1f52bf4\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-c275q" Mar 11 00:57:24 crc kubenswrapper[4744]: I0311 00:57:24.779976 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b3216516-0566-403a-ad05-8a14e1f52bf4-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-c275q\" (UID: \"b3216516-0566-403a-ad05-8a14e1f52bf4\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-c275q" Mar 11 00:57:24 crc kubenswrapper[4744]: I0311 00:57:24.780109 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/b3216516-0566-403a-ad05-8a14e1f52bf4-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-c275q\" (UID: \"b3216516-0566-403a-ad05-8a14e1f52bf4\") " 
pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-c275q" Mar 11 00:57:24 crc kubenswrapper[4744]: I0311 00:57:24.780164 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/b3216516-0566-403a-ad05-8a14e1f52bf4-service-ca\") pod \"cluster-version-operator-5c965bbfc6-c275q\" (UID: \"b3216516-0566-403a-ad05-8a14e1f52bf4\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-c275q" Mar 11 00:57:24 crc kubenswrapper[4744]: I0311 00:57:24.780243 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/b3216516-0566-403a-ad05-8a14e1f52bf4-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-c275q\" (UID: \"b3216516-0566-403a-ad05-8a14e1f52bf4\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-c275q" Mar 11 00:57:24 crc kubenswrapper[4744]: I0311 00:57:24.783459 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/b3216516-0566-403a-ad05-8a14e1f52bf4-service-ca\") pod \"cluster-version-operator-5c965bbfc6-c275q\" (UID: \"b3216516-0566-403a-ad05-8a14e1f52bf4\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-c275q" Mar 11 00:57:24 crc kubenswrapper[4744]: I0311 00:57:24.792688 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b3216516-0566-403a-ad05-8a14e1f52bf4-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-c275q\" (UID: \"b3216516-0566-403a-ad05-8a14e1f52bf4\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-c275q" Mar 11 00:57:24 crc kubenswrapper[4744]: I0311 00:57:24.803319 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: 
\"kubernetes.io/projected/b3216516-0566-403a-ad05-8a14e1f52bf4-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-c275q\" (UID: \"b3216516-0566-403a-ad05-8a14e1f52bf4\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-c275q" Mar 11 00:57:24 crc kubenswrapper[4744]: I0311 00:57:24.881532 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-c275q" Mar 11 00:57:24 crc kubenswrapper[4744]: W0311 00:57:24.897684 4744 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb3216516_0566_403a_ad05_8a14e1f52bf4.slice/crio-1cf038f0bb95ebab883f85544f7403e2d7aa46cff8fc9b18f9bd27530130a780 WatchSource:0}: Error finding container 1cf038f0bb95ebab883f85544f7403e2d7aa46cff8fc9b18f9bd27530130a780: Status 404 returned error can't find the container with id 1cf038f0bb95ebab883f85544f7403e2d7aa46cff8fc9b18f9bd27530130a780 Mar 11 00:57:24 crc kubenswrapper[4744]: I0311 00:57:24.974571 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-tdnf7" Mar 11 00:57:24 crc kubenswrapper[4744]: E0311 00:57:24.974783 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-tdnf7" podUID="7aeb1578-fe93-4bec-8f43-17d0923fa5c0" Mar 11 00:57:25 crc kubenswrapper[4744]: I0311 00:57:25.001126 4744 certificate_manager.go:356] kubernetes.io/kubelet-serving: Rotating certificates Mar 11 00:57:25 crc kubenswrapper[4744]: I0311 00:57:25.010924 4744 reflector.go:368] Caches populated for *v1.CertificateSigningRequest from k8s.io/client-go/tools/watch/informerwatcher.go:146 Mar 11 00:57:25 crc kubenswrapper[4744]: I0311 00:57:25.126660 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-c275q" event={"ID":"b3216516-0566-403a-ad05-8a14e1f52bf4","Type":"ContainerStarted","Data":"34eecde1cff647bd0d9d2b9a7cdad00cc7ae36fea81297861d89a25d34e9353c"} Mar 11 00:57:25 crc kubenswrapper[4744]: I0311 00:57:25.126741 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-c275q" event={"ID":"b3216516-0566-403a-ad05-8a14e1f52bf4","Type":"ContainerStarted","Data":"1cf038f0bb95ebab883f85544f7403e2d7aa46cff8fc9b18f9bd27530130a780"} Mar 11 00:57:25 crc kubenswrapper[4744]: I0311 00:57:25.131810 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/4.log" Mar 11 00:57:25 crc kubenswrapper[4744]: I0311 00:57:25.135999 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"6dd956673c1f46da2ead777e897aa77847f3ad0108c48a521efb8dd8f4d286ab"} Mar 11 00:57:25 crc kubenswrapper[4744]: I0311 00:57:25.136574 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 11 00:57:25 crc kubenswrapper[4744]: I0311 00:57:25.152271 4744 pod_startup_latency_tracker.go:104] "Observed pod startup 
duration" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-c275q" podStartSLOduration=140.152240137 podStartE2EDuration="2m20.152240137s" podCreationTimestamp="2026-03-11 00:55:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-11 00:57:25.151218106 +0000 UTC m=+201.955435751" watchObservedRunningTime="2026-03-11 00:57:25.152240137 +0000 UTC m=+201.956457772" Mar 11 00:57:25 crc kubenswrapper[4744]: I0311 00:57:25.179995 4744 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-crc" podStartSLOduration=67.179975501 podStartE2EDuration="1m7.179975501s" podCreationTimestamp="2026-03-11 00:56:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-11 00:57:25.17894138 +0000 UTC m=+201.983159025" watchObservedRunningTime="2026-03-11 00:57:25.179975501 +0000 UTC m=+201.984193126" Mar 11 00:57:25 crc kubenswrapper[4744]: I0311 00:57:25.974456 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 11 00:57:25 crc kubenswrapper[4744]: I0311 00:57:25.974456 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 11 00:57:25 crc kubenswrapper[4744]: E0311 00:57:25.974705 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 11 00:57:25 crc kubenswrapper[4744]: E0311 00:57:25.974811 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 11 00:57:25 crc kubenswrapper[4744]: I0311 00:57:25.975109 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 11 00:57:25 crc kubenswrapper[4744]: E0311 00:57:25.975241 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 11 00:57:26 crc kubenswrapper[4744]: I0311 00:57:26.974318 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-tdnf7" Mar 11 00:57:26 crc kubenswrapper[4744]: E0311 00:57:26.974575 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-tdnf7" podUID="7aeb1578-fe93-4bec-8f43-17d0923fa5c0" Mar 11 00:57:27 crc kubenswrapper[4744]: I0311 00:57:27.973814 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 11 00:57:27 crc kubenswrapper[4744]: I0311 00:57:27.973923 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 11 00:57:27 crc kubenswrapper[4744]: I0311 00:57:27.973814 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 11 00:57:27 crc kubenswrapper[4744]: E0311 00:57:27.974031 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 11 00:57:27 crc kubenswrapper[4744]: E0311 00:57:27.974147 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 11 00:57:27 crc kubenswrapper[4744]: E0311 00:57:27.974248 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 11 00:57:28 crc kubenswrapper[4744]: I0311 00:57:28.974798 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-tdnf7" Mar 11 00:57:28 crc kubenswrapper[4744]: E0311 00:57:28.975027 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-tdnf7" podUID="7aeb1578-fe93-4bec-8f43-17d0923fa5c0" Mar 11 00:57:29 crc kubenswrapper[4744]: E0311 00:57:29.099457 4744 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 11 00:57:29 crc kubenswrapper[4744]: I0311 00:57:29.973987 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 11 00:57:29 crc kubenswrapper[4744]: I0311 00:57:29.974135 4744 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 11 00:57:29 crc kubenswrapper[4744]: E0311 00:57:29.974162 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 11 00:57:29 crc kubenswrapper[4744]: E0311 00:57:29.974480 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 11 00:57:29 crc kubenswrapper[4744]: I0311 00:57:29.974614 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 11 00:57:29 crc kubenswrapper[4744]: E0311 00:57:29.974921 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 11 00:57:30 crc kubenswrapper[4744]: I0311 00:57:30.973739 4744 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-tdnf7" Mar 11 00:57:30 crc kubenswrapper[4744]: E0311 00:57:30.973965 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-tdnf7" podUID="7aeb1578-fe93-4bec-8f43-17d0923fa5c0" Mar 11 00:57:31 crc kubenswrapper[4744]: I0311 00:57:31.974484 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 11 00:57:31 crc kubenswrapper[4744]: I0311 00:57:31.974865 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 11 00:57:31 crc kubenswrapper[4744]: I0311 00:57:31.975004 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 11 00:57:31 crc kubenswrapper[4744]: E0311 00:57:31.975092 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 11 00:57:31 crc kubenswrapper[4744]: E0311 00:57:31.974949 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 11 00:57:31 crc kubenswrapper[4744]: E0311 00:57:31.975302 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 11 00:57:31 crc kubenswrapper[4744]: I0311 00:57:31.976683 4744 scope.go:117] "RemoveContainer" containerID="4df93d4dc2ebaa7aed01d39e44292956a190da5fd454d61f8e46a33114a1ccd0" Mar 11 00:57:31 crc kubenswrapper[4744]: E0311 00:57:31.976956 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-78fcc_openshift-ovn-kubernetes(6ff04e11-e747-44c5-b049-371a5d422157)\"" pod="openshift-ovn-kubernetes/ovnkube-node-78fcc" podUID="6ff04e11-e747-44c5-b049-371a5d422157" Mar 11 00:57:32 crc kubenswrapper[4744]: I0311 00:57:32.974130 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-tdnf7" Mar 11 00:57:32 crc kubenswrapper[4744]: E0311 00:57:32.974852 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-tdnf7" podUID="7aeb1578-fe93-4bec-8f43-17d0923fa5c0" Mar 11 00:57:33 crc kubenswrapper[4744]: I0311 00:57:33.974136 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 11 00:57:33 crc kubenswrapper[4744]: I0311 00:57:33.974226 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 11 00:57:33 crc kubenswrapper[4744]: I0311 00:57:33.974269 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 11 00:57:33 crc kubenswrapper[4744]: E0311 00:57:33.976092 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 11 00:57:33 crc kubenswrapper[4744]: E0311 00:57:33.976254 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 11 00:57:33 crc kubenswrapper[4744]: E0311 00:57:33.976410 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 11 00:57:34 crc kubenswrapper[4744]: E0311 00:57:34.100723 4744 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 11 00:57:34 crc kubenswrapper[4744]: I0311 00:57:34.974007 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-tdnf7" Mar 11 00:57:34 crc kubenswrapper[4744]: E0311 00:57:34.974222 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-tdnf7" podUID="7aeb1578-fe93-4bec-8f43-17d0923fa5c0" Mar 11 00:57:35 crc kubenswrapper[4744]: I0311 00:57:35.974002 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 11 00:57:35 crc kubenswrapper[4744]: I0311 00:57:35.974128 4744 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 11 00:57:35 crc kubenswrapper[4744]: E0311 00:57:35.974217 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 11 00:57:35 crc kubenswrapper[4744]: I0311 00:57:35.974128 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 11 00:57:35 crc kubenswrapper[4744]: E0311 00:57:35.974417 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 11 00:57:35 crc kubenswrapper[4744]: E0311 00:57:35.974692 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 11 00:57:36 crc kubenswrapper[4744]: I0311 00:57:36.973781 4744 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-tdnf7" Mar 11 00:57:36 crc kubenswrapper[4744]: E0311 00:57:36.974036 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-tdnf7" podUID="7aeb1578-fe93-4bec-8f43-17d0923fa5c0" Mar 11 00:57:37 crc kubenswrapper[4744]: I0311 00:57:37.974350 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 11 00:57:37 crc kubenswrapper[4744]: E0311 00:57:37.974598 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 11 00:57:37 crc kubenswrapper[4744]: I0311 00:57:37.974901 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 11 00:57:37 crc kubenswrapper[4744]: E0311 00:57:37.974965 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 11 00:57:37 crc kubenswrapper[4744]: I0311 00:57:37.975213 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 11 00:57:37 crc kubenswrapper[4744]: E0311 00:57:37.975277 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 11 00:57:38 crc kubenswrapper[4744]: I0311 00:57:38.974400 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-tdnf7" Mar 11 00:57:38 crc kubenswrapper[4744]: E0311 00:57:38.974871 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-tdnf7" podUID="7aeb1578-fe93-4bec-8f43-17d0923fa5c0" Mar 11 00:57:39 crc kubenswrapper[4744]: E0311 00:57:39.102202 4744 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 11 00:57:39 crc kubenswrapper[4744]: I0311 00:57:39.974690 4744 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 11 00:57:39 crc kubenswrapper[4744]: E0311 00:57:39.974903 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 11 00:57:39 crc kubenswrapper[4744]: I0311 00:57:39.975006 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 11 00:57:39 crc kubenswrapper[4744]: E0311 00:57:39.975196 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 11 00:57:39 crc kubenswrapper[4744]: I0311 00:57:39.975271 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 11 00:57:39 crc kubenswrapper[4744]: E0311 00:57:39.975479 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 11 00:57:40 crc kubenswrapper[4744]: I0311 00:57:40.974400 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-tdnf7" Mar 11 00:57:40 crc kubenswrapper[4744]: E0311 00:57:40.974725 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-tdnf7" podUID="7aeb1578-fe93-4bec-8f43-17d0923fa5c0" Mar 11 00:57:41 crc kubenswrapper[4744]: I0311 00:57:41.974820 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 11 00:57:41 crc kubenswrapper[4744]: I0311 00:57:41.974845 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 11 00:57:41 crc kubenswrapper[4744]: E0311 00:57:41.975073 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 11 00:57:41 crc kubenswrapper[4744]: I0311 00:57:41.974845 4744 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 11 00:57:41 crc kubenswrapper[4744]: E0311 00:57:41.975199 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 11 00:57:41 crc kubenswrapper[4744]: E0311 00:57:41.975267 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 11 00:57:42 crc kubenswrapper[4744]: I0311 00:57:42.661135 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 11 00:57:42 crc kubenswrapper[4744]: I0311 00:57:42.974280 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-tdnf7" Mar 11 00:57:42 crc kubenswrapper[4744]: E0311 00:57:42.974507 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-tdnf7" podUID="7aeb1578-fe93-4bec-8f43-17d0923fa5c0" Mar 11 00:57:43 crc kubenswrapper[4744]: I0311 00:57:43.974345 4744 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 11 00:57:43 crc kubenswrapper[4744]: I0311 00:57:43.974426 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 11 00:57:43 crc kubenswrapper[4744]: E0311 00:57:43.977597 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 11 00:57:43 crc kubenswrapper[4744]: I0311 00:57:43.977630 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 11 00:57:43 crc kubenswrapper[4744]: E0311 00:57:43.978058 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 11 00:57:43 crc kubenswrapper[4744]: E0311 00:57:43.977748 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 11 00:57:44 crc kubenswrapper[4744]: E0311 00:57:44.103443 4744 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 11 00:57:44 crc kubenswrapper[4744]: I0311 00:57:44.974030 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-tdnf7" Mar 11 00:57:44 crc kubenswrapper[4744]: E0311 00:57:44.974251 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-tdnf7" podUID="7aeb1578-fe93-4bec-8f43-17d0923fa5c0" Mar 11 00:57:44 crc kubenswrapper[4744]: I0311 00:57:44.975348 4744 scope.go:117] "RemoveContainer" containerID="4df93d4dc2ebaa7aed01d39e44292956a190da5fd454d61f8e46a33114a1ccd0" Mar 11 00:57:45 crc kubenswrapper[4744]: I0311 00:57:45.228302 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-78fcc_6ff04e11-e747-44c5-b049-371a5d422157/ovnkube-controller/3.log" Mar 11 00:57:45 crc kubenswrapper[4744]: I0311 00:57:45.231682 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-78fcc" event={"ID":"6ff04e11-e747-44c5-b049-371a5d422157","Type":"ContainerStarted","Data":"b54b226b6e5d010f5666be32c5e89383a99ed40a1a63d82cfb4e37b9337437ce"} Mar 11 00:57:45 crc kubenswrapper[4744]: I0311 00:57:45.233241 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openshift-ovn-kubernetes/ovnkube-node-78fcc" Mar 11 00:57:45 crc kubenswrapper[4744]: I0311 00:57:45.925020 4744 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-78fcc" podStartSLOduration=160.924998878 podStartE2EDuration="2m40.924998878s" podCreationTimestamp="2026-03-11 00:55:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-11 00:57:45.273772106 +0000 UTC m=+222.077989801" watchObservedRunningTime="2026-03-11 00:57:45.924998878 +0000 UTC m=+222.729216493" Mar 11 00:57:45 crc kubenswrapper[4744]: I0311 00:57:45.925392 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-tdnf7"] Mar 11 00:57:45 crc kubenswrapper[4744]: I0311 00:57:45.925539 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-tdnf7" Mar 11 00:57:45 crc kubenswrapper[4744]: E0311 00:57:45.925649 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-tdnf7" podUID="7aeb1578-fe93-4bec-8f43-17d0923fa5c0" Mar 11 00:57:45 crc kubenswrapper[4744]: I0311 00:57:45.974372 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 11 00:57:45 crc kubenswrapper[4744]: I0311 00:57:45.974452 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 11 00:57:45 crc kubenswrapper[4744]: I0311 00:57:45.974379 4744 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 11 00:57:45 crc kubenswrapper[4744]: E0311 00:57:45.974612 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 11 00:57:45 crc kubenswrapper[4744]: E0311 00:57:45.974773 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 11 00:57:45 crc kubenswrapper[4744]: E0311 00:57:45.974896 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 11 00:57:46 crc kubenswrapper[4744]: I0311 00:57:46.240407 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-xlclh_e16bf0f3-533b-4114-89c6-195a85273e98/kube-multus/1.log" Mar 11 00:57:46 crc kubenswrapper[4744]: I0311 00:57:46.241257 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-xlclh_e16bf0f3-533b-4114-89c6-195a85273e98/kube-multus/0.log" Mar 11 00:57:46 crc kubenswrapper[4744]: I0311 00:57:46.241325 4744 generic.go:334] "Generic (PLEG): container finished" podID="e16bf0f3-533b-4114-89c6-195a85273e98" containerID="0289bdbe344d516cb0576b0b739e5752ec2f754bd78918b1d81f20d880ebd1f5" exitCode=1 Mar 11 00:57:46 crc kubenswrapper[4744]: I0311 00:57:46.241428 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-xlclh" event={"ID":"e16bf0f3-533b-4114-89c6-195a85273e98","Type":"ContainerDied","Data":"0289bdbe344d516cb0576b0b739e5752ec2f754bd78918b1d81f20d880ebd1f5"} Mar 11 00:57:46 crc kubenswrapper[4744]: I0311 00:57:46.241538 4744 scope.go:117] "RemoveContainer" containerID="3cf2ef2701f6c6e8715d3c41c3d7c4d704d6f03989831a0c3d34e92ab7a6dd50" Mar 11 00:57:46 crc kubenswrapper[4744]: I0311 00:57:46.242041 4744 scope.go:117] "RemoveContainer" containerID="0289bdbe344d516cb0576b0b739e5752ec2f754bd78918b1d81f20d880ebd1f5" Mar 11 00:57:46 crc kubenswrapper[4744]: E0311 00:57:46.242336 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-multus\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-multus pod=multus-xlclh_openshift-multus(e16bf0f3-533b-4114-89c6-195a85273e98)\"" pod="openshift-multus/multus-xlclh" podUID="e16bf0f3-533b-4114-89c6-195a85273e98" Mar 11 00:57:47 crc kubenswrapper[4744]: I0311 00:57:47.247990 4744 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-multus_multus-xlclh_e16bf0f3-533b-4114-89c6-195a85273e98/kube-multus/1.log" Mar 11 00:57:47 crc kubenswrapper[4744]: I0311 00:57:47.974371 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 11 00:57:47 crc kubenswrapper[4744]: I0311 00:57:47.974507 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 11 00:57:47 crc kubenswrapper[4744]: E0311 00:57:47.974635 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 11 00:57:47 crc kubenswrapper[4744]: I0311 00:57:47.974684 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-tdnf7" Mar 11 00:57:47 crc kubenswrapper[4744]: E0311 00:57:47.974866 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 11 00:57:47 crc kubenswrapper[4744]: I0311 00:57:47.975082 4744 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 11 00:57:47 crc kubenswrapper[4744]: E0311 00:57:47.975064 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-tdnf7" podUID="7aeb1578-fe93-4bec-8f43-17d0923fa5c0" Mar 11 00:57:47 crc kubenswrapper[4744]: E0311 00:57:47.975181 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 11 00:57:49 crc kubenswrapper[4744]: E0311 00:57:49.105470 4744 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 11 00:57:49 crc kubenswrapper[4744]: I0311 00:57:49.974433 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 11 00:57:49 crc kubenswrapper[4744]: I0311 00:57:49.974562 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-tdnf7" Mar 11 00:57:49 crc kubenswrapper[4744]: I0311 00:57:49.974569 4744 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 11 00:57:49 crc kubenswrapper[4744]: I0311 00:57:49.974460 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 11 00:57:49 crc kubenswrapper[4744]: E0311 00:57:49.974735 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 11 00:57:49 crc kubenswrapper[4744]: E0311 00:57:49.974901 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-tdnf7" podUID="7aeb1578-fe93-4bec-8f43-17d0923fa5c0" Mar 11 00:57:49 crc kubenswrapper[4744]: E0311 00:57:49.974984 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 11 00:57:49 crc kubenswrapper[4744]: E0311 00:57:49.975139 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 11 00:57:51 crc kubenswrapper[4744]: I0311 00:57:51.974372 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-tdnf7" Mar 11 00:57:51 crc kubenswrapper[4744]: I0311 00:57:51.974392 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 11 00:57:51 crc kubenswrapper[4744]: E0311 00:57:51.975139 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-tdnf7" podUID="7aeb1578-fe93-4bec-8f43-17d0923fa5c0" Mar 11 00:57:51 crc kubenswrapper[4744]: I0311 00:57:51.974641 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 11 00:57:51 crc kubenswrapper[4744]: I0311 00:57:51.974466 4744 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 11 00:57:51 crc kubenswrapper[4744]: E0311 00:57:51.975347 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 11 00:57:51 crc kubenswrapper[4744]: E0311 00:57:51.975488 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 11 00:57:51 crc kubenswrapper[4744]: E0311 00:57:51.975718 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 11 00:57:53 crc kubenswrapper[4744]: I0311 00:57:53.973892 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 11 00:57:53 crc kubenswrapper[4744]: I0311 00:57:53.973943 4744 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 11 00:57:53 crc kubenswrapper[4744]: I0311 00:57:53.973979 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 11 00:57:53 crc kubenswrapper[4744]: E0311 00:57:53.975830 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 11 00:57:53 crc kubenswrapper[4744]: I0311 00:57:53.975925 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-tdnf7" Mar 11 00:57:53 crc kubenswrapper[4744]: E0311 00:57:53.976218 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 11 00:57:53 crc kubenswrapper[4744]: E0311 00:57:53.976266 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-tdnf7" podUID="7aeb1578-fe93-4bec-8f43-17d0923fa5c0" Mar 11 00:57:53 crc kubenswrapper[4744]: E0311 00:57:53.976354 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 11 00:57:54 crc kubenswrapper[4744]: E0311 00:57:54.107252 4744 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 11 00:57:55 crc kubenswrapper[4744]: I0311 00:57:55.974223 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 11 00:57:55 crc kubenswrapper[4744]: I0311 00:57:55.974382 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 11 00:57:55 crc kubenswrapper[4744]: E0311 00:57:55.974507 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 11 00:57:55 crc kubenswrapper[4744]: I0311 00:57:55.974650 4744 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 11 00:57:55 crc kubenswrapper[4744]: E0311 00:57:55.975022 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 11 00:57:55 crc kubenswrapper[4744]: I0311 00:57:55.975109 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-tdnf7" Mar 11 00:57:55 crc kubenswrapper[4744]: E0311 00:57:55.975169 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 11 00:57:55 crc kubenswrapper[4744]: E0311 00:57:55.975286 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-tdnf7" podUID="7aeb1578-fe93-4bec-8f43-17d0923fa5c0" Mar 11 00:57:56 crc kubenswrapper[4744]: I0311 00:57:56.974813 4744 scope.go:117] "RemoveContainer" containerID="0289bdbe344d516cb0576b0b739e5752ec2f754bd78918b1d81f20d880ebd1f5" Mar 11 00:57:57 crc kubenswrapper[4744]: I0311 00:57:57.303809 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-xlclh_e16bf0f3-533b-4114-89c6-195a85273e98/kube-multus/1.log" Mar 11 00:57:57 crc kubenswrapper[4744]: I0311 00:57:57.303923 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-xlclh" event={"ID":"e16bf0f3-533b-4114-89c6-195a85273e98","Type":"ContainerStarted","Data":"7af721d68eedfb76de378529ad9a2fb23d33e7a1d6d37b9abb8763fe0d9087f1"} Mar 11 00:57:57 crc kubenswrapper[4744]: I0311 00:57:57.974734 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 11 00:57:57 crc kubenswrapper[4744]: I0311 00:57:57.974794 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 11 00:57:57 crc kubenswrapper[4744]: E0311 00:57:57.975340 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 11 00:57:57 crc kubenswrapper[4744]: I0311 00:57:57.974883 4744 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 11 00:57:57 crc kubenswrapper[4744]: I0311 00:57:57.974853 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-tdnf7" Mar 11 00:57:57 crc kubenswrapper[4744]: E0311 00:57:57.975545 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 11 00:57:57 crc kubenswrapper[4744]: E0311 00:57:57.975692 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 11 00:57:57 crc kubenswrapper[4744]: E0311 00:57:57.975817 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-tdnf7" podUID="7aeb1578-fe93-4bec-8f43-17d0923fa5c0" Mar 11 00:57:59 crc kubenswrapper[4744]: I0311 00:57:59.974719 4744 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 11 00:57:59 crc kubenswrapper[4744]: I0311 00:57:59.974791 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 11 00:57:59 crc kubenswrapper[4744]: I0311 00:57:59.974803 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 11 00:57:59 crc kubenswrapper[4744]: I0311 00:57:59.975977 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-tdnf7" Mar 11 00:57:59 crc kubenswrapper[4744]: I0311 00:57:59.977767 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"openshift-service-ca.crt" Mar 11 00:57:59 crc kubenswrapper[4744]: I0311 00:57:59.977774 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"kube-root-ca.crt" Mar 11 00:57:59 crc kubenswrapper[4744]: I0311 00:57:59.980636 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-console"/"networking-console-plugin" Mar 11 00:57:59 crc kubenswrapper[4744]: I0311 00:57:59.980639 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-sa-dockercfg-d427c" Mar 11 00:57:59 crc kubenswrapper[4744]: I0311 00:57:59.981232 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-console"/"networking-console-plugin-cert" Mar 11 00:57:59 crc kubenswrapper[4744]: I0311 00:57:59.981499 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-secret" Mar 11 00:58:04 crc kubenswrapper[4744]: I0311 00:58:04.814484 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeReady" Mar 11 00:58:04 crc 
kubenswrapper[4744]: I0311 00:58:04.886809 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-xfdgm"] Mar 11 00:58:04 crc kubenswrapper[4744]: I0311 00:58:04.888072 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-xfdgm" Mar 11 00:58:04 crc kubenswrapper[4744]: I0311 00:58:04.889247 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-4tcts"] Mar 11 00:58:04 crc kubenswrapper[4744]: I0311 00:58:04.890017 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-4tcts" Mar 11 00:58:04 crc kubenswrapper[4744]: I0311 00:58:04.893647 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-pruner-29553120-6zw87"] Mar 11 00:58:04 crc kubenswrapper[4744]: I0311 00:58:04.894626 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-pruner-29553120-6zw87" Mar 11 00:58:04 crc kubenswrapper[4744]: I0311 00:58:04.895313 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-txbh2"] Mar 11 00:58:04 crc kubenswrapper[4744]: I0311 00:58:04.906858 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-4jlsm"] Mar 11 00:58:04 crc kubenswrapper[4744]: I0311 00:58:04.907105 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver/apiserver-76f77b778f-txbh2" Mar 11 00:58:04 crc kubenswrapper[4744]: I0311 00:58:04.907700 4744 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns-operator/dns-operator-744455d44c-4jlsm" Mar 11 00:58:04 crc kubenswrapper[4744]: I0311 00:58:04.922238 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/console-f9d7485db-msd9d"] Mar 11 00:58:04 crc kubenswrapper[4744]: I0311 00:58:04.923394 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-msd9d" Mar 11 00:58:04 crc kubenswrapper[4744]: I0311 00:58:04.923440 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-machine-approver/machine-approver-56656f9798-btvmr"] Mar 11 00:58:04 crc kubenswrapper[4744]: I0311 00:58:04.924249 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-btvmr" Mar 11 00:58:04 crc kubenswrapper[4744]: I0311 00:58:04.930694 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-pn84h"] Mar 11 00:58:04 crc kubenswrapper[4744]: I0311 00:58:04.931435 4744 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-pn84h" Mar 11 00:58:04 crc kubenswrapper[4744]: I0311 00:58:04.947852 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-config" Mar 11 00:58:04 crc kubenswrapper[4744]: I0311 00:58:04.948237 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-dockercfg-vw8fw" Mar 11 00:58:04 crc kubenswrapper[4744]: I0311 00:58:04.948349 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"encryption-config-1" Mar 11 00:58:04 crc kubenswrapper[4744]: I0311 00:58:04.948482 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"config" Mar 11 00:58:04 crc kubenswrapper[4744]: I0311 00:58:04.948629 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"dns-operator-dockercfg-9mqw5" Mar 11 00:58:04 crc kubenswrapper[4744]: I0311 00:58:04.948729 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-serving-cert" Mar 11 00:58:04 crc kubenswrapper[4744]: I0311 00:58:04.948839 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"openshift-apiserver-sa-dockercfg-djjff" Mar 11 00:58:04 crc kubenswrapper[4744]: I0311 00:58:04.948931 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"audit-1" Mar 11 00:58:04 crc kubenswrapper[4744]: I0311 00:58:04.949026 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"openshift-service-ca.crt" Mar 11 00:58:04 crc kubenswrapper[4744]: I0311 00:58:04.949130 4744 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-authentication"/"v4-0-config-user-template-provider-selection" Mar 11 00:58:04 crc kubenswrapper[4744]: I0311 00:58:04.949229 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"metrics-tls" Mar 11 00:58:04 crc kubenswrapper[4744]: I0311 00:58:04.950754 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"etcd-client" Mar 11 00:58:04 crc kubenswrapper[4744]: I0311 00:58:04.951022 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"kube-root-ca.crt" Mar 11 00:58:04 crc kubenswrapper[4744]: I0311 00:58:04.951233 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"etcd-serving-ca" Mar 11 00:58:04 crc kubenswrapper[4744]: I0311 00:58:04.951441 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"image-import-ca" Mar 11 00:58:04 crc kubenswrapper[4744]: I0311 00:58:04.951639 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"kube-root-ca.crt" Mar 11 00:58:04 crc kubenswrapper[4744]: I0311 00:58:04.951744 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-router-certs" Mar 11 00:58:04 crc kubenswrapper[4744]: I0311 00:58:04.952150 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"kube-root-ca.crt" Mar 11 00:58:04 crc kubenswrapper[4744]: I0311 00:58:04.952732 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-serving-cert" Mar 11 00:58:04 crc kubenswrapper[4744]: I0311 00:58:04.953021 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"openshift-service-ca.crt" Mar 11 00:58:04 crc kubenswrapper[4744]: I0311 00:58:04.953240 4744 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-console"/"kube-root-ca.crt" Mar 11 00:58:04 crc kubenswrapper[4744]: I0311 00:58:04.953401 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-cliconfig" Mar 11 00:58:04 crc kubenswrapper[4744]: I0311 00:58:04.953034 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"serviceca" Mar 11 00:58:04 crc kubenswrapper[4744]: I0311 00:58:04.953667 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-dockercfg-f62pw" Mar 11 00:58:04 crc kubenswrapper[4744]: I0311 00:58:04.953826 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-oauth-config" Mar 11 00:58:04 crc kubenswrapper[4744]: I0311 00:58:04.954107 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"openshift-service-ca.crt" Mar 11 00:58:04 crc kubenswrapper[4744]: I0311 00:58:04.954199 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"openshift-service-ca.crt" Mar 11 00:58:04 crc kubenswrapper[4744]: I0311 00:58:04.954283 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"pruner-dockercfg-p7bcw" Mar 11 00:58:04 crc kubenswrapper[4744]: I0311 00:58:04.954313 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-serving-cert" Mar 11 00:58:04 crc kubenswrapper[4744]: I0311 00:58:04.954390 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"console-config" Mar 11 00:58:04 crc kubenswrapper[4744]: I0311 00:58:04.954454 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-session" Mar 11 00:58:04 crc kubenswrapper[4744]: I0311 00:58:04.954633 4744 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-controller-manager-operator"/"openshift-service-ca.crt" Mar 11 00:58:04 crc kubenswrapper[4744]: I0311 00:58:04.953596 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"audit" Mar 11 00:58:04 crc kubenswrapper[4744]: I0311 00:58:04.954805 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"oauth-openshift-dockercfg-znhcc" Mar 11 00:58:04 crc kubenswrapper[4744]: I0311 00:58:04.954927 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"openshift-service-ca.crt" Mar 11 00:58:04 crc kubenswrapper[4744]: I0311 00:58:04.955022 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"kube-root-ca.crt" Mar 11 00:58:04 crc kubenswrapper[4744]: I0311 00:58:04.955124 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-service-ca" Mar 11 00:58:04 crc kubenswrapper[4744]: I0311 00:58:04.955200 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"oauth-serving-cert" Mar 11 00:58:04 crc kubenswrapper[4744]: I0311 00:58:04.955308 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"serving-cert" Mar 11 00:58:04 crc kubenswrapper[4744]: I0311 00:58:04.955383 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-error" Mar 11 00:58:04 crc kubenswrapper[4744]: I0311 00:58:04.954463 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-idp-0-file-data" Mar 11 00:58:04 crc kubenswrapper[4744]: I0311 00:58:04.958235 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-286h7"] Mar 11 00:58:04 crc kubenswrapper[4744]: I0311 00:58:04.958683 4744 
kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-l2fh5"] Mar 11 00:58:04 crc kubenswrapper[4744]: I0311 00:58:04.958908 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-wm2rt"] Mar 11 00:58:04 crc kubenswrapper[4744]: I0311 00:58:04.959228 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd-operator/etcd-operator-b45778765-wm2rt" Mar 11 00:58:04 crc kubenswrapper[4744]: I0311 00:58:04.960003 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication-operator/authentication-operator-69f744f599-286h7" Mar 11 00:58:04 crc kubenswrapper[4744]: I0311 00:58:04.960485 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-l2fh5" Mar 11 00:58:04 crc kubenswrapper[4744]: I0311 00:58:04.986561 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-9gtcq"] Mar 11 00:58:04 crc kubenswrapper[4744]: I0311 00:58:04.987662 4744 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-9gtcq" Mar 11 00:58:04 crc kubenswrapper[4744]: I0311 00:58:04.991791 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-lws9c"] Mar 11 00:58:05 crc kubenswrapper[4744]: I0311 00:58:05.011380 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-login" Mar 11 00:58:05 crc kubenswrapper[4744]: I0311 00:58:05.014380 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-trusted-ca-bundle" Mar 11 00:58:05 crc kubenswrapper[4744]: I0311 00:58:05.015183 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console-operator/console-operator-58897d9998-ptbbd"] Mar 11 00:58:05 crc kubenswrapper[4744]: I0311 00:58:05.015677 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-58897d9998-ptbbd" Mar 11 00:58:05 crc kubenswrapper[4744]: I0311 00:58:05.017327 4744 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-api/machine-api-operator-5694c8668f-lws9c" Mar 11 00:58:05 crc kubenswrapper[4744]: I0311 00:58:05.020480 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k5wkr\" (UniqueName: \"kubernetes.io/projected/51238090-1fbe-446b-a63d-5ec9c3137c61-kube-api-access-k5wkr\") pod \"openshift-controller-manager-operator-756b6f6bc6-xfdgm\" (UID: \"51238090-1fbe-446b-a63d-5ec9c3137c61\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-xfdgm" Mar 11 00:58:05 crc kubenswrapper[4744]: I0311 00:58:05.020761 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-ocp-branding-template" Mar 11 00:58:05 crc kubenswrapper[4744]: I0311 00:58:05.021388 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"trusted-ca-bundle" Mar 11 00:58:05 crc kubenswrapper[4744]: I0311 00:58:05.021469 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/7bd67414-ec33-4963-b5d8-ac374bd28a6a-auth-proxy-config\") pod \"machine-approver-56656f9798-btvmr\" (UID: \"7bd67414-ec33-4963-b5d8-ac374bd28a6a\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-btvmr" Mar 11 00:58:05 crc kubenswrapper[4744]: I0311 00:58:05.021525 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/24ac204b-1627-404f-b33c-fc77ded356d1-image-import-ca\") pod \"apiserver-76f77b778f-txbh2\" (UID: \"24ac204b-1627-404f-b33c-fc77ded356d1\") " pod="openshift-apiserver/apiserver-76f77b778f-txbh2" Mar 11 00:58:05 crc kubenswrapper[4744]: I0311 00:58:05.021563 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/74a61e39-2210-4bb1-96c9-509eda04c4c7-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-4tcts\" (UID: \"74a61e39-2210-4bb1-96c9-509eda04c4c7\") " pod="openshift-authentication/oauth-openshift-558db77b4-4tcts" Mar 11 00:58:05 crc kubenswrapper[4744]: I0311 00:58:05.021605 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0d55c2f3-2eef-42eb-8627-ccd40e21f4d0-service-ca\") pod \"console-f9d7485db-msd9d\" (UID: \"0d55c2f3-2eef-42eb-8627-ccd40e21f4d0\") " pod="openshift-console/console-f9d7485db-msd9d" Mar 11 00:58:05 crc kubenswrapper[4744]: I0311 00:58:05.021632 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/24ac204b-1627-404f-b33c-fc77ded356d1-audit\") pod \"apiserver-76f77b778f-txbh2\" (UID: \"24ac204b-1627-404f-b33c-fc77ded356d1\") " pod="openshift-apiserver/apiserver-76f77b778f-txbh2" Mar 11 00:58:05 crc kubenswrapper[4744]: I0311 00:58:05.021661 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/74a61e39-2210-4bb1-96c9-509eda04c4c7-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-4tcts\" (UID: \"74a61e39-2210-4bb1-96c9-509eda04c4c7\") " pod="openshift-authentication/oauth-openshift-558db77b4-4tcts" Mar 11 00:58:05 crc kubenswrapper[4744]: I0311 00:58:05.021685 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/24ac204b-1627-404f-b33c-fc77ded356d1-serving-cert\") pod \"apiserver-76f77b778f-txbh2\" (UID: \"24ac204b-1627-404f-b33c-fc77ded356d1\") " pod="openshift-apiserver/apiserver-76f77b778f-txbh2" Mar 11 00:58:05 crc kubenswrapper[4744]: 
I0311 00:58:05.021713 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/74a61e39-2210-4bb1-96c9-509eda04c4c7-audit-policies\") pod \"oauth-openshift-558db77b4-4tcts\" (UID: \"74a61e39-2210-4bb1-96c9-509eda04c4c7\") " pod="openshift-authentication/oauth-openshift-558db77b4-4tcts" Mar 11 00:58:05 crc kubenswrapper[4744]: I0311 00:58:05.021740 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/24ac204b-1627-404f-b33c-fc77ded356d1-node-pullsecrets\") pod \"apiserver-76f77b778f-txbh2\" (UID: \"24ac204b-1627-404f-b33c-fc77ded356d1\") " pod="openshift-apiserver/apiserver-76f77b778f-txbh2" Mar 11 00:58:05 crc kubenswrapper[4744]: I0311 00:58:05.021764 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/51238090-1fbe-446b-a63d-5ec9c3137c61-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-xfdgm\" (UID: \"51238090-1fbe-446b-a63d-5ec9c3137c61\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-xfdgm" Mar 11 00:58:05 crc kubenswrapper[4744]: I0311 00:58:05.021789 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/0d55c2f3-2eef-42eb-8627-ccd40e21f4d0-console-config\") pod \"console-f9d7485db-msd9d\" (UID: \"0d55c2f3-2eef-42eb-8627-ccd40e21f4d0\") " pod="openshift-console/console-f9d7485db-msd9d" Mar 11 00:58:05 crc kubenswrapper[4744]: I0311 00:58:05.021820 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/74a61e39-2210-4bb1-96c9-509eda04c4c7-v4-0-config-system-cliconfig\") 
pod \"oauth-openshift-558db77b4-4tcts\" (UID: \"74a61e39-2210-4bb1-96c9-509eda04c4c7\") " pod="openshift-authentication/oauth-openshift-558db77b4-4tcts" Mar 11 00:58:05 crc kubenswrapper[4744]: I0311 00:58:05.021844 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/51238090-1fbe-446b-a63d-5ec9c3137c61-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-xfdgm\" (UID: \"51238090-1fbe-446b-a63d-5ec9c3137c61\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-xfdgm" Mar 11 00:58:05 crc kubenswrapper[4744]: I0311 00:58:05.021868 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/0d55c2f3-2eef-42eb-8627-ccd40e21f4d0-console-serving-cert\") pod \"console-f9d7485db-msd9d\" (UID: \"0d55c2f3-2eef-42eb-8627-ccd40e21f4d0\") " pod="openshift-console/console-f9d7485db-msd9d" Mar 11 00:58:05 crc kubenswrapper[4744]: I0311 00:58:05.021891 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/24ac204b-1627-404f-b33c-fc77ded356d1-audit-dir\") pod \"apiserver-76f77b778f-txbh2\" (UID: \"24ac204b-1627-404f-b33c-fc77ded356d1\") " pod="openshift-apiserver/apiserver-76f77b778f-txbh2" Mar 11 00:58:05 crc kubenswrapper[4744]: I0311 00:58:05.021928 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/74a61e39-2210-4bb1-96c9-509eda04c4c7-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-4tcts\" (UID: \"74a61e39-2210-4bb1-96c9-509eda04c4c7\") " pod="openshift-authentication/oauth-openshift-558db77b4-4tcts" Mar 11 00:58:05 crc kubenswrapper[4744]: I0311 00:58:05.021955 4744 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/74a61e39-2210-4bb1-96c9-509eda04c4c7-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-4tcts\" (UID: \"74a61e39-2210-4bb1-96c9-509eda04c4c7\") " pod="openshift-authentication/oauth-openshift-558db77b4-4tcts" Mar 11 00:58:05 crc kubenswrapper[4744]: I0311 00:58:05.021983 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qmsqr\" (UniqueName: \"kubernetes.io/projected/0d55c2f3-2eef-42eb-8627-ccd40e21f4d0-kube-api-access-qmsqr\") pod \"console-f9d7485db-msd9d\" (UID: \"0d55c2f3-2eef-42eb-8627-ccd40e21f4d0\") " pod="openshift-console/console-f9d7485db-msd9d" Mar 11 00:58:05 crc kubenswrapper[4744]: I0311 00:58:05.022008 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/74a61e39-2210-4bb1-96c9-509eda04c4c7-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-4tcts\" (UID: \"74a61e39-2210-4bb1-96c9-509eda04c4c7\") " pod="openshift-authentication/oauth-openshift-558db77b4-4tcts" Mar 11 00:58:05 crc kubenswrapper[4744]: I0311 00:58:05.022034 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/74a61e39-2210-4bb1-96c9-509eda04c4c7-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-4tcts\" (UID: \"74a61e39-2210-4bb1-96c9-509eda04c4c7\") " pod="openshift-authentication/oauth-openshift-558db77b4-4tcts" Mar 11 00:58:05 crc kubenswrapper[4744]: I0311 00:58:05.022060 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: 
\"kubernetes.io/secret/74a61e39-2210-4bb1-96c9-509eda04c4c7-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-4tcts\" (UID: \"74a61e39-2210-4bb1-96c9-509eda04c4c7\") " pod="openshift-authentication/oauth-openshift-558db77b4-4tcts" Mar 11 00:58:05 crc kubenswrapper[4744]: I0311 00:58:05.022084 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/24ac204b-1627-404f-b33c-fc77ded356d1-etcd-client\") pod \"apiserver-76f77b778f-txbh2\" (UID: \"24ac204b-1627-404f-b33c-fc77ded356d1\") " pod="openshift-apiserver/apiserver-76f77b778f-txbh2" Mar 11 00:58:05 crc kubenswrapper[4744]: I0311 00:58:05.022110 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/24ac204b-1627-404f-b33c-fc77ded356d1-trusted-ca-bundle\") pod \"apiserver-76f77b778f-txbh2\" (UID: \"24ac204b-1627-404f-b33c-fc77ded356d1\") " pod="openshift-apiserver/apiserver-76f77b778f-txbh2" Mar 11 00:58:05 crc kubenswrapper[4744]: I0311 00:58:05.022148 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tdjgb\" (UniqueName: \"kubernetes.io/projected/7bd67414-ec33-4963-b5d8-ac374bd28a6a-kube-api-access-tdjgb\") pod \"machine-approver-56656f9798-btvmr\" (UID: \"7bd67414-ec33-4963-b5d8-ac374bd28a6a\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-btvmr" Mar 11 00:58:05 crc kubenswrapper[4744]: I0311 00:58:05.022174 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/24ac204b-1627-404f-b33c-fc77ded356d1-etcd-serving-ca\") pod \"apiserver-76f77b778f-txbh2\" (UID: \"24ac204b-1627-404f-b33c-fc77ded356d1\") " pod="openshift-apiserver/apiserver-76f77b778f-txbh2" Mar 11 00:58:05 crc kubenswrapper[4744]: I0311 
00:58:05.022205 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mhzxz\" (UniqueName: \"kubernetes.io/projected/8d83a64e-bd7c-43b4-aac4-8fdc807059f5-kube-api-access-mhzxz\") pod \"image-pruner-29553120-6zw87\" (UID: \"8d83a64e-bd7c-43b4-aac4-8fdc807059f5\") " pod="openshift-image-registry/image-pruner-29553120-6zw87" Mar 11 00:58:05 crc kubenswrapper[4744]: I0311 00:58:05.022231 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/74a61e39-2210-4bb1-96c9-509eda04c4c7-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-4tcts\" (UID: \"74a61e39-2210-4bb1-96c9-509eda04c4c7\") " pod="openshift-authentication/oauth-openshift-558db77b4-4tcts" Mar 11 00:58:05 crc kubenswrapper[4744]: I0311 00:58:05.022252 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/acc45881-56ff-4010-8eda-103f41f90bc5-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-pn84h\" (UID: \"acc45881-56ff-4010-8eda-103f41f90bc5\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-pn84h" Mar 11 00:58:05 crc kubenswrapper[4744]: I0311 00:58:05.022276 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h4fz4\" (UniqueName: \"kubernetes.io/projected/47244264-b9e7-4f86-85d7-5406ed8d8833-kube-api-access-h4fz4\") pod \"dns-operator-744455d44c-4jlsm\" (UID: \"47244264-b9e7-4f86-85d7-5406ed8d8833\") " pod="openshift-dns-operator/dns-operator-744455d44c-4jlsm" Mar 11 00:58:05 crc kubenswrapper[4744]: I0311 00:58:05.022309 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: 
\"kubernetes.io/configmap/0d55c2f3-2eef-42eb-8627-ccd40e21f4d0-oauth-serving-cert\") pod \"console-f9d7485db-msd9d\" (UID: \"0d55c2f3-2eef-42eb-8627-ccd40e21f4d0\") " pod="openshift-console/console-f9d7485db-msd9d" Mar 11 00:58:05 crc kubenswrapper[4744]: I0311 00:58:05.022590 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"service-ca" Mar 11 00:58:05 crc kubenswrapper[4744]: I0311 00:58:05.022861 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/7bd67414-ec33-4963-b5d8-ac374bd28a6a-machine-approver-tls\") pod \"machine-approver-56656f9798-btvmr\" (UID: \"7bd67414-ec33-4963-b5d8-ac374bd28a6a\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-btvmr" Mar 11 00:58:05 crc kubenswrapper[4744]: I0311 00:58:05.022890 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/0d55c2f3-2eef-42eb-8627-ccd40e21f4d0-console-oauth-config\") pod \"console-f9d7485db-msd9d\" (UID: \"0d55c2f3-2eef-42eb-8627-ccd40e21f4d0\") " pod="openshift-console/console-f9d7485db-msd9d" Mar 11 00:58:05 crc kubenswrapper[4744]: I0311 00:58:05.022933 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/74a61e39-2210-4bb1-96c9-509eda04c4c7-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-4tcts\" (UID: \"74a61e39-2210-4bb1-96c9-509eda04c4c7\") " pod="openshift-authentication/oauth-openshift-558db77b4-4tcts" Mar 11 00:58:05 crc kubenswrapper[4744]: I0311 00:58:05.022958 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bkf84\" (UniqueName: \"kubernetes.io/projected/74a61e39-2210-4bb1-96c9-509eda04c4c7-kube-api-access-bkf84\") 
pod \"oauth-openshift-558db77b4-4tcts\" (UID: \"74a61e39-2210-4bb1-96c9-509eda04c4c7\") " pod="openshift-authentication/oauth-openshift-558db77b4-4tcts" Mar 11 00:58:05 crc kubenswrapper[4744]: I0311 00:58:05.022986 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/8d83a64e-bd7c-43b4-aac4-8fdc807059f5-serviceca\") pod \"image-pruner-29553120-6zw87\" (UID: \"8d83a64e-bd7c-43b4-aac4-8fdc807059f5\") " pod="openshift-image-registry/image-pruner-29553120-6zw87" Mar 11 00:58:05 crc kubenswrapper[4744]: I0311 00:58:05.023014 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7bd67414-ec33-4963-b5d8-ac374bd28a6a-config\") pod \"machine-approver-56656f9798-btvmr\" (UID: \"7bd67414-ec33-4963-b5d8-ac374bd28a6a\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-btvmr" Mar 11 00:58:05 crc kubenswrapper[4744]: I0311 00:58:05.023049 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/47244264-b9e7-4f86-85d7-5406ed8d8833-metrics-tls\") pod \"dns-operator-744455d44c-4jlsm\" (UID: \"47244264-b9e7-4f86-85d7-5406ed8d8833\") " pod="openshift-dns-operator/dns-operator-744455d44c-4jlsm" Mar 11 00:58:05 crc kubenswrapper[4744]: I0311 00:58:05.023074 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k4m7z\" (UniqueName: \"kubernetes.io/projected/acc45881-56ff-4010-8eda-103f41f90bc5-kube-api-access-k4m7z\") pod \"cluster-samples-operator-665b6dd947-pn84h\" (UID: \"acc45881-56ff-4010-8eda-103f41f90bc5\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-pn84h" Mar 11 00:58:05 crc kubenswrapper[4744]: I0311 00:58:05.023104 4744 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/74a61e39-2210-4bb1-96c9-509eda04c4c7-audit-dir\") pod \"oauth-openshift-558db77b4-4tcts\" (UID: \"74a61e39-2210-4bb1-96c9-509eda04c4c7\") " pod="openshift-authentication/oauth-openshift-558db77b4-4tcts" Mar 11 00:58:05 crc kubenswrapper[4744]: I0311 00:58:05.023129 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0d55c2f3-2eef-42eb-8627-ccd40e21f4d0-trusted-ca-bundle\") pod \"console-f9d7485db-msd9d\" (UID: \"0d55c2f3-2eef-42eb-8627-ccd40e21f4d0\") " pod="openshift-console/console-f9d7485db-msd9d" Mar 11 00:58:05 crc kubenswrapper[4744]: I0311 00:58:05.023154 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/24ac204b-1627-404f-b33c-fc77ded356d1-config\") pod \"apiserver-76f77b778f-txbh2\" (UID: \"24ac204b-1627-404f-b33c-fc77ded356d1\") " pod="openshift-apiserver/apiserver-76f77b778f-txbh2" Mar 11 00:58:05 crc kubenswrapper[4744]: I0311 00:58:05.023203 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/74a61e39-2210-4bb1-96c9-509eda04c4c7-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-4tcts\" (UID: \"74a61e39-2210-4bb1-96c9-509eda04c4c7\") " pod="openshift-authentication/oauth-openshift-558db77b4-4tcts" Mar 11 00:58:05 crc kubenswrapper[4744]: I0311 00:58:05.023249 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5fpmz\" (UniqueName: \"kubernetes.io/projected/24ac204b-1627-404f-b33c-fc77ded356d1-kube-api-access-5fpmz\") pod \"apiserver-76f77b778f-txbh2\" (UID: \"24ac204b-1627-404f-b33c-fc77ded356d1\") " 
pod="openshift-apiserver/apiserver-76f77b778f-txbh2" Mar 11 00:58:05 crc kubenswrapper[4744]: I0311 00:58:05.023316 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/24ac204b-1627-404f-b33c-fc77ded356d1-encryption-config\") pod \"apiserver-76f77b778f-txbh2\" (UID: \"24ac204b-1627-404f-b33c-fc77ded356d1\") " pod="openshift-apiserver/apiserver-76f77b778f-txbh2" Mar 11 00:58:05 crc kubenswrapper[4744]: I0311 00:58:05.023326 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"openshift-service-ca.crt" Mar 11 00:58:05 crc kubenswrapper[4744]: I0311 00:58:05.023500 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-rbac-proxy" Mar 11 00:58:05 crc kubenswrapper[4744]: I0311 00:58:05.023645 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-operator-config" Mar 11 00:58:05 crc kubenswrapper[4744]: I0311 00:58:05.023760 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-ca-bundle" Mar 11 00:58:05 crc kubenswrapper[4744]: I0311 00:58:05.023787 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"kube-root-ca.crt" Mar 11 00:58:05 crc kubenswrapper[4744]: I0311 00:58:05.023866 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-root-ca.crt" Mar 11 00:58:05 crc kubenswrapper[4744]: I0311 00:58:05.023888 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-74tmz"] Mar 11 00:58:05 crc kubenswrapper[4744]: I0311 00:58:05.023968 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"openshift-service-ca.crt" Mar 11 00:58:05 crc 
kubenswrapper[4744]: I0311 00:58:05.024060 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-service-ca-bundle" Mar 11 00:58:05 crc kubenswrapper[4744]: I0311 00:58:05.024157 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"samples-operator-tls" Mar 11 00:58:05 crc kubenswrapper[4744]: I0311 00:58:05.024258 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-tls" Mar 11 00:58:05 crc kubenswrapper[4744]: I0311 00:58:05.024351 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-client" Mar 11 00:58:05 crc kubenswrapper[4744]: I0311 00:58:05.024438 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"openshift-service-ca.crt" Mar 11 00:58:05 crc kubenswrapper[4744]: I0311 00:58:05.024731 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"trusted-ca-bundle" Mar 11 00:58:05 crc kubenswrapper[4744]: I0311 00:58:05.024890 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"kube-root-ca.crt" Mar 11 00:58:05 crc kubenswrapper[4744]: I0311 00:58:05.024985 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-service-ca.crt" Mar 11 00:58:05 crc kubenswrapper[4744]: I0311 00:58:05.025063 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"machine-approver-config" Mar 11 00:58:05 crc kubenswrapper[4744]: I0311 00:58:05.028988 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-serving-cert" Mar 11 00:58:05 crc kubenswrapper[4744]: I0311 00:58:05.029176 4744 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-cluster-samples-operator"/"cluster-samples-operator-dockercfg-xpp9w" Mar 11 00:58:05 crc kubenswrapper[4744]: I0311 00:58:05.029457 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-sa-dockercfg-nl2j4" Mar 11 00:58:05 crc kubenswrapper[4744]: I0311 00:58:05.030975 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-dockercfg-r9srn" Mar 11 00:58:05 crc kubenswrapper[4744]: I0311 00:58:05.032607 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-chlpc"] Mar 11 00:58:05 crc kubenswrapper[4744]: I0311 00:58:05.033074 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-chlpc" Mar 11 00:58:05 crc kubenswrapper[4744]: I0311 00:58:05.033496 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-74tmz" Mar 11 00:58:05 crc kubenswrapper[4744]: I0311 00:58:05.034283 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-tqzmz"] Mar 11 00:58:05 crc kubenswrapper[4744]: I0311 00:58:05.040358 4744 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-tqzmz" Mar 11 00:58:05 crc kubenswrapper[4744]: I0311 00:58:05.044182 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"authentication-operator-config" Mar 11 00:58:05 crc kubenswrapper[4744]: I0311 00:58:05.044495 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"service-ca-bundle" Mar 11 00:58:05 crc kubenswrapper[4744]: I0311 00:58:05.050339 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"kube-root-ca.crt" Mar 11 00:58:05 crc kubenswrapper[4744]: I0311 00:58:05.052810 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Mar 11 00:58:05 crc kubenswrapper[4744]: I0311 00:58:05.052986 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Mar 11 00:58:05 crc kubenswrapper[4744]: I0311 00:58:05.053032 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"authentication-operator-dockercfg-mz9bj" Mar 11 00:58:05 crc kubenswrapper[4744]: I0311 00:58:05.053075 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"openshift-service-ca.crt" Mar 11 00:58:05 crc kubenswrapper[4744]: I0311 00:58:05.053159 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"serving-cert" Mar 11 00:58:05 crc kubenswrapper[4744]: I0311 00:58:05.053217 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-config" Mar 11 00:58:05 crc kubenswrapper[4744]: I0311 00:58:05.053253 4744 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-apiserver-operator"/"openshift-apiserver-operator-serving-cert" Mar 11 00:58:05 crc kubenswrapper[4744]: I0311 00:58:05.053344 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Mar 11 00:58:05 crc kubenswrapper[4744]: I0311 00:58:05.053410 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-dockercfg-mfbb7" Mar 11 00:58:05 crc kubenswrapper[4744]: I0311 00:58:05.053604 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"kube-scheduler-operator-serving-cert" Mar 11 00:58:05 crc kubenswrapper[4744]: I0311 00:58:05.053689 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-config" Mar 11 00:58:05 crc kubenswrapper[4744]: I0311 00:58:05.053696 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"console-operator-dockercfg-4xjcr" Mar 11 00:58:05 crc kubenswrapper[4744]: I0311 00:58:05.053769 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-rbac-proxy" Mar 11 00:58:05 crc kubenswrapper[4744]: I0311 00:58:05.053849 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"console-operator-config" Mar 11 00:58:05 crc kubenswrapper[4744]: I0311 00:58:05.053902 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-root-ca.crt" Mar 11 00:58:05 crc kubenswrapper[4744]: I0311 00:58:05.053933 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"machine-api-operator-images" Mar 11 00:58:05 crc kubenswrapper[4744]: I0311 00:58:05.054004 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-tls" Mar 11 00:58:05 crc 
kubenswrapper[4744]: I0311 00:58:05.054101 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"kube-root-ca.crt" Mar 11 00:58:05 crc kubenswrapper[4744]: I0311 00:58:05.054109 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Mar 11 00:58:05 crc kubenswrapper[4744]: I0311 00:58:05.054190 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Mar 11 00:58:05 crc kubenswrapper[4744]: I0311 00:58:05.054305 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-operator-tls" Mar 11 00:58:05 crc kubenswrapper[4744]: I0311 00:58:05.054412 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"serving-cert" Mar 11 00:58:05 crc kubenswrapper[4744]: I0311 00:58:05.054560 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Mar 11 00:58:05 crc kubenswrapper[4744]: I0311 00:58:05.056492 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"openshift-service-ca.crt" Mar 11 00:58:05 crc kubenswrapper[4744]: I0311 00:58:05.040210 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-h49lh"] Mar 11 00:58:05 crc kubenswrapper[4744]: I0311 00:58:05.063806 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-dockercfg-qt55r" Mar 11 00:58:05 crc kubenswrapper[4744]: I0311 00:58:05.064274 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-hf5rf"] Mar 11 00:58:05 crc kubenswrapper[4744]: I0311 00:58:05.064651 4744 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-route-controller-manager"/"client-ca" Mar 11 00:58:05 crc kubenswrapper[4744]: I0311 00:58:05.064911 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-cg2gg"] Mar 11 00:58:05 crc kubenswrapper[4744]: I0311 00:58:05.064991 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-h49lh" Mar 11 00:58:05 crc kubenswrapper[4744]: I0311 00:58:05.065261 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-k6qjd"] Mar 11 00:58:05 crc kubenswrapper[4744]: I0311 00:58:05.065477 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-config-operator/openshift-config-operator-7777fb866f-hf5rf" Mar 11 00:58:05 crc kubenswrapper[4744]: I0311 00:58:05.065632 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-94dsr"] Mar 11 00:58:05 crc kubenswrapper[4744]: I0311 00:58:05.065725 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-cg2gg" Mar 11 00:58:05 crc kubenswrapper[4744]: I0311 00:58:05.065928 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Mar 11 00:58:05 crc kubenswrapper[4744]: I0311 00:58:05.066097 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-dockercfg-xtcjv" Mar 11 00:58:05 crc kubenswrapper[4744]: I0311 00:58:05.066310 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"trusted-ca-bundle" Mar 11 00:58:05 crc kubenswrapper[4744]: I0311 00:58:05.066424 4744 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-94dsr" Mar 11 00:58:05 crc kubenswrapper[4744]: I0311 00:58:05.066462 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-k6qjd" Mar 11 00:58:05 crc kubenswrapper[4744]: I0311 00:58:05.067132 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Mar 11 00:58:05 crc kubenswrapper[4744]: I0311 00:58:05.067231 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"kube-root-ca.crt" Mar 11 00:58:05 crc kubenswrapper[4744]: I0311 00:58:05.067311 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"cluster-image-registry-operator-dockercfg-m4qtx" Mar 11 00:58:05 crc kubenswrapper[4744]: I0311 00:58:05.083246 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"kube-root-ca.crt" Mar 11 00:58:05 crc kubenswrapper[4744]: I0311 00:58:05.084620 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-bbjh8"] Mar 11 00:58:05 crc kubenswrapper[4744]: I0311 00:58:05.086726 4744 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-bbjh8" Mar 11 00:58:05 crc kubenswrapper[4744]: I0311 00:58:05.093360 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Mar 11 00:58:05 crc kubenswrapper[4744]: I0311 00:58:05.095897 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"trusted-ca" Mar 11 00:58:05 crc kubenswrapper[4744]: I0311 00:58:05.096261 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Mar 11 00:58:05 crc kubenswrapper[4744]: I0311 00:58:05.096481 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"trusted-ca" Mar 11 00:58:05 crc kubenswrapper[4744]: I0311 00:58:05.097764 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Mar 11 00:58:05 crc kubenswrapper[4744]: I0311 00:58:05.098962 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-v4b66"] Mar 11 00:58:05 crc kubenswrapper[4744]: I0311 00:58:05.100551 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-v4b66" Mar 11 00:58:05 crc kubenswrapper[4744]: I0311 00:58:05.101026 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-gnfcr"] Mar 11 00:58:05 crc kubenswrapper[4744]: I0311 00:58:05.101665 4744 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-gnfcr" Mar 11 00:58:05 crc kubenswrapper[4744]: I0311 00:58:05.102689 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-56d4r"] Mar 11 00:58:05 crc kubenswrapper[4744]: I0311 00:58:05.104859 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-56d4r" Mar 11 00:58:05 crc kubenswrapper[4744]: I0311 00:58:05.105261 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-6k4hw"] Mar 11 00:58:05 crc kubenswrapper[4744]: I0311 00:58:05.105900 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-6k4hw" Mar 11 00:58:05 crc kubenswrapper[4744]: I0311 00:58:05.110672 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress/router-default-5444994796-9dcwq"] Mar 11 00:58:05 crc kubenswrapper[4744]: I0311 00:58:05.111778 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Mar 11 00:58:05 crc kubenswrapper[4744]: I0311 00:58:05.111921 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-kgl9z"] Mar 11 00:58:05 crc kubenswrapper[4744]: I0311 00:58:05.112363 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-85z54"] Mar 11 00:58:05 crc kubenswrapper[4744]: I0311 00:58:05.112371 4744 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress/router-default-5444994796-9dcwq" Mar 11 00:58:05 crc kubenswrapper[4744]: I0311 00:58:05.112420 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-kgl9z" Mar 11 00:58:05 crc kubenswrapper[4744]: I0311 00:58:05.113278 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-t54km"] Mar 11 00:58:05 crc kubenswrapper[4744]: I0311 00:58:05.113642 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-admission-controller-857f4d67dd-85z54" Mar 11 00:58:05 crc kubenswrapper[4744]: I0311 00:58:05.113908 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-xfdgm"] Mar 11 00:58:05 crc kubenswrapper[4744]: I0311 00:58:05.113976 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-t54km" Mar 11 00:58:05 crc kubenswrapper[4744]: I0311 00:58:05.114876 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/downloads-7954f5f757-7qgjq"] Mar 11 00:58:05 crc kubenswrapper[4744]: I0311 00:58:05.115315 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/downloads-7954f5f757-7qgjq" Mar 11 00:58:05 crc kubenswrapper[4744]: I0311 00:58:05.116846 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-slpdp"] Mar 11 00:58:05 crc kubenswrapper[4744]: I0311 00:58:05.118685 4744 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-slpdp" Mar 11 00:58:05 crc kubenswrapper[4744]: I0311 00:58:05.118718 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29553176-m7tmh"] Mar 11 00:58:05 crc kubenswrapper[4744]: I0311 00:58:05.119460 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29553176-m7tmh" Mar 11 00:58:05 crc kubenswrapper[4744]: I0311 00:58:05.120615 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29553165-4gpb9"] Mar 11 00:58:05 crc kubenswrapper[4744]: I0311 00:58:05.122732 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-kx277"] Mar 11 00:58:05 crc kubenswrapper[4744]: I0311 00:58:05.122888 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29553165-4gpb9" Mar 11 00:58:05 crc kubenswrapper[4744]: I0311 00:58:05.123860 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v9njb\" (UniqueName: \"kubernetes.io/projected/de688bff-78ce-4d0f-ad7e-548ca640887a-kube-api-access-v9njb\") pod \"etcd-operator-b45778765-wm2rt\" (UID: \"de688bff-78ce-4d0f-ad7e-548ca640887a\") " pod="openshift-etcd-operator/etcd-operator-b45778765-wm2rt" Mar 11 00:58:05 crc kubenswrapper[4744]: I0311 00:58:05.123903 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/74a61e39-2210-4bb1-96c9-509eda04c4c7-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-4tcts\" (UID: \"74a61e39-2210-4bb1-96c9-509eda04c4c7\") " pod="openshift-authentication/oauth-openshift-558db77b4-4tcts" Mar 11 00:58:05 crc kubenswrapper[4744]: I0311 00:58:05.123929 4744 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/74a61e39-2210-4bb1-96c9-509eda04c4c7-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-4tcts\" (UID: \"74a61e39-2210-4bb1-96c9-509eda04c4c7\") " pod="openshift-authentication/oauth-openshift-558db77b4-4tcts" Mar 11 00:58:05 crc kubenswrapper[4744]: I0311 00:58:05.123955 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qmsqr\" (UniqueName: \"kubernetes.io/projected/0d55c2f3-2eef-42eb-8627-ccd40e21f4d0-kube-api-access-qmsqr\") pod \"console-f9d7485db-msd9d\" (UID: \"0d55c2f3-2eef-42eb-8627-ccd40e21f4d0\") " pod="openshift-console/console-f9d7485db-msd9d" Mar 11 00:58:05 crc kubenswrapper[4744]: I0311 00:58:05.123977 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/24ac204b-1627-404f-b33c-fc77ded356d1-trusted-ca-bundle\") pod \"apiserver-76f77b778f-txbh2\" (UID: \"24ac204b-1627-404f-b33c-fc77ded356d1\") " pod="openshift-apiserver/apiserver-76f77b778f-txbh2" Mar 11 00:58:05 crc kubenswrapper[4744]: I0311 00:58:05.123998 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/74a61e39-2210-4bb1-96c9-509eda04c4c7-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-4tcts\" (UID: \"74a61e39-2210-4bb1-96c9-509eda04c4c7\") " pod="openshift-authentication/oauth-openshift-558db77b4-4tcts" Mar 11 00:58:05 crc kubenswrapper[4744]: I0311 00:58:05.124021 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/74a61e39-2210-4bb1-96c9-509eda04c4c7-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-4tcts\" (UID: 
\"74a61e39-2210-4bb1-96c9-509eda04c4c7\") " pod="openshift-authentication/oauth-openshift-558db77b4-4tcts" Mar 11 00:58:05 crc kubenswrapper[4744]: I0311 00:58:05.124043 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/74a61e39-2210-4bb1-96c9-509eda04c4c7-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-4tcts\" (UID: \"74a61e39-2210-4bb1-96c9-509eda04c4c7\") " pod="openshift-authentication/oauth-openshift-558db77b4-4tcts" Mar 11 00:58:05 crc kubenswrapper[4744]: I0311 00:58:05.124061 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/24ac204b-1627-404f-b33c-fc77ded356d1-etcd-client\") pod \"apiserver-76f77b778f-txbh2\" (UID: \"24ac204b-1627-404f-b33c-fc77ded356d1\") " pod="openshift-apiserver/apiserver-76f77b778f-txbh2" Mar 11 00:58:05 crc kubenswrapper[4744]: I0311 00:58:05.124093 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8whhv\" (UniqueName: \"kubernetes.io/projected/61dac64e-6219-46fe-80b0-420098bb260b-kube-api-access-8whhv\") pod \"openshift-apiserver-operator-796bbdcf4f-l2fh5\" (UID: \"61dac64e-6219-46fe-80b0-420098bb260b\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-l2fh5" Mar 11 00:58:05 crc kubenswrapper[4744]: I0311 00:58:05.124113 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/fa83e422-8374-4da6-a356-ae7feadfe282-serving-cert\") pod \"route-controller-manager-6576b87f9c-tqzmz\" (UID: \"fa83e422-8374-4da6-a356-ae7feadfe282\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-tqzmz" Mar 11 00:58:05 crc kubenswrapper[4744]: I0311 00:58:05.124133 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"kube-api-access-tdjgb\" (UniqueName: \"kubernetes.io/projected/7bd67414-ec33-4963-b5d8-ac374bd28a6a-kube-api-access-tdjgb\") pod \"machine-approver-56656f9798-btvmr\" (UID: \"7bd67414-ec33-4963-b5d8-ac374bd28a6a\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-btvmr" Mar 11 00:58:05 crc kubenswrapper[4744]: I0311 00:58:05.124154 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/24ac204b-1627-404f-b33c-fc77ded356d1-etcd-serving-ca\") pod \"apiserver-76f77b778f-txbh2\" (UID: \"24ac204b-1627-404f-b33c-fc77ded356d1\") " pod="openshift-apiserver/apiserver-76f77b778f-txbh2" Mar 11 00:58:05 crc kubenswrapper[4744]: I0311 00:58:05.124173 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7d1c92dd-43a7-4311-90b1-54441f84787e-config\") pod \"machine-api-operator-5694c8668f-lws9c\" (UID: \"7d1c92dd-43a7-4311-90b1-54441f84787e\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-lws9c" Mar 11 00:58:05 crc kubenswrapper[4744]: I0311 00:58:05.124195 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6dp2v\" (UniqueName: \"kubernetes.io/projected/4d6c2f5e-32a4-484f-b6fa-a240cf8bb90d-kube-api-access-6dp2v\") pod \"cluster-image-registry-operator-dc59b4c8b-74tmz\" (UID: \"4d6c2f5e-32a4-484f-b6fa-a240cf8bb90d\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-74tmz" Mar 11 00:58:05 crc kubenswrapper[4744]: I0311 00:58:05.124212 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/c284cbf1-b5e2-4f77-b14c-c0030f140a91-trusted-ca\") pod \"console-operator-58897d9998-ptbbd\" (UID: \"c284cbf1-b5e2-4f77-b14c-c0030f140a91\") " 
pod="openshift-console-operator/console-operator-58897d9998-ptbbd" Mar 11 00:58:05 crc kubenswrapper[4744]: I0311 00:58:05.124230 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/acc45881-56ff-4010-8eda-103f41f90bc5-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-pn84h\" (UID: \"acc45881-56ff-4010-8eda-103f41f90bc5\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-pn84h" Mar 11 00:58:05 crc kubenswrapper[4744]: I0311 00:58:05.124250 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/de688bff-78ce-4d0f-ad7e-548ca640887a-serving-cert\") pod \"etcd-operator-b45778765-wm2rt\" (UID: \"de688bff-78ce-4d0f-ad7e-548ca640887a\") " pod="openshift-etcd-operator/etcd-operator-b45778765-wm2rt" Mar 11 00:58:05 crc kubenswrapper[4744]: I0311 00:58:05.124273 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mhzxz\" (UniqueName: \"kubernetes.io/projected/8d83a64e-bd7c-43b4-aac4-8fdc807059f5-kube-api-access-mhzxz\") pod \"image-pruner-29553120-6zw87\" (UID: \"8d83a64e-bd7c-43b4-aac4-8fdc807059f5\") " pod="openshift-image-registry/image-pruner-29553120-6zw87" Mar 11 00:58:05 crc kubenswrapper[4744]: I0311 00:58:05.124291 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/74a61e39-2210-4bb1-96c9-509eda04c4c7-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-4tcts\" (UID: \"74a61e39-2210-4bb1-96c9-509eda04c4c7\") " pod="openshift-authentication/oauth-openshift-558db77b4-4tcts" Mar 11 00:58:05 crc kubenswrapper[4744]: I0311 00:58:05.124308 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" 
(UniqueName: \"kubernetes.io/secret/de688bff-78ce-4d0f-ad7e-548ca640887a-etcd-client\") pod \"etcd-operator-b45778765-wm2rt\" (UID: \"de688bff-78ce-4d0f-ad7e-548ca640887a\") " pod="openshift-etcd-operator/etcd-operator-b45778765-wm2rt" Mar 11 00:58:05 crc kubenswrapper[4744]: I0311 00:58:05.124324 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ab6c95e5-e2c0-4cb3-a647-930b4c4d2927-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-286h7\" (UID: \"ab6c95e5-e2c0-4cb3-a647-930b4c4d2927\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-286h7" Mar 11 00:58:05 crc kubenswrapper[4744]: I0311 00:58:05.124342 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/61dac64e-6219-46fe-80b0-420098bb260b-config\") pod \"openshift-apiserver-operator-796bbdcf4f-l2fh5\" (UID: \"61dac64e-6219-46fe-80b0-420098bb260b\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-l2fh5" Mar 11 00:58:05 crc kubenswrapper[4744]: I0311 00:58:05.124360 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c284cbf1-b5e2-4f77-b14c-c0030f140a91-config\") pod \"console-operator-58897d9998-ptbbd\" (UID: \"c284cbf1-b5e2-4f77-b14c-c0030f140a91\") " pod="openshift-console-operator/console-operator-58897d9998-ptbbd" Mar 11 00:58:05 crc kubenswrapper[4744]: I0311 00:58:05.124378 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h4fz4\" (UniqueName: \"kubernetes.io/projected/47244264-b9e7-4f86-85d7-5406ed8d8833-kube-api-access-h4fz4\") pod \"dns-operator-744455d44c-4jlsm\" (UID: \"47244264-b9e7-4f86-85d7-5406ed8d8833\") " pod="openshift-dns-operator/dns-operator-744455d44c-4jlsm" Mar 11 00:58:05 crc 
kubenswrapper[4744]: I0311 00:58:05.124395 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/de688bff-78ce-4d0f-ad7e-548ca640887a-etcd-service-ca\") pod \"etcd-operator-b45778765-wm2rt\" (UID: \"de688bff-78ce-4d0f-ad7e-548ca640887a\") " pod="openshift-etcd-operator/etcd-operator-b45778765-wm2rt" Mar 11 00:58:05 crc kubenswrapper[4744]: I0311 00:58:05.124414 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rk88x\" (UniqueName: \"kubernetes.io/projected/ab6c95e5-e2c0-4cb3-a647-930b4c4d2927-kube-api-access-rk88x\") pod \"authentication-operator-69f744f599-286h7\" (UID: \"ab6c95e5-e2c0-4cb3-a647-930b4c4d2927\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-286h7" Mar 11 00:58:05 crc kubenswrapper[4744]: I0311 00:58:05.124420 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-wcmgb"] Mar 11 00:58:05 crc kubenswrapper[4744]: I0311 00:58:05.124438 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/0d55c2f3-2eef-42eb-8627-ccd40e21f4d0-oauth-serving-cert\") pod \"console-f9d7485db-msd9d\" (UID: \"0d55c2f3-2eef-42eb-8627-ccd40e21f4d0\") " pod="openshift-console/console-f9d7485db-msd9d" Mar 11 00:58:05 crc kubenswrapper[4744]: I0311 00:58:05.124461 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ab6c95e5-e2c0-4cb3-a647-930b4c4d2927-service-ca-bundle\") pod \"authentication-operator-69f744f599-286h7\" (UID: \"ab6c95e5-e2c0-4cb3-a647-930b4c4d2927\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-286h7" Mar 11 00:58:05 crc kubenswrapper[4744]: I0311 00:58:05.124481 4744 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ab6c95e5-e2c0-4cb3-a647-930b4c4d2927-config\") pod \"authentication-operator-69f744f599-286h7\" (UID: \"ab6c95e5-e2c0-4cb3-a647-930b4c4d2927\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-286h7" Mar 11 00:58:05 crc kubenswrapper[4744]: I0311 00:58:05.124502 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/7bd67414-ec33-4963-b5d8-ac374bd28a6a-machine-approver-tls\") pod \"machine-approver-56656f9798-btvmr\" (UID: \"7bd67414-ec33-4963-b5d8-ac374bd28a6a\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-btvmr" Mar 11 00:58:05 crc kubenswrapper[4744]: I0311 00:58:05.124542 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/0d55c2f3-2eef-42eb-8627-ccd40e21f4d0-console-oauth-config\") pod \"console-f9d7485db-msd9d\" (UID: \"0d55c2f3-2eef-42eb-8627-ccd40e21f4d0\") " pod="openshift-console/console-f9d7485db-msd9d" Mar 11 00:58:05 crc kubenswrapper[4744]: I0311 00:58:05.124561 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0359d6fc-1139-4dbc-a50f-55fa91607935-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-chlpc\" (UID: \"0359d6fc-1139-4dbc-a50f-55fa91607935\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-chlpc" Mar 11 00:58:05 crc kubenswrapper[4744]: I0311 00:58:05.124593 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bkf84\" (UniqueName: \"kubernetes.io/projected/74a61e39-2210-4bb1-96c9-509eda04c4c7-kube-api-access-bkf84\") pod \"oauth-openshift-558db77b4-4tcts\" (UID: 
\"74a61e39-2210-4bb1-96c9-509eda04c4c7\") " pod="openshift-authentication/oauth-openshift-558db77b4-4tcts" Mar 11 00:58:05 crc kubenswrapper[4744]: I0311 00:58:05.124619 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/74a61e39-2210-4bb1-96c9-509eda04c4c7-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-4tcts\" (UID: \"74a61e39-2210-4bb1-96c9-509eda04c4c7\") " pod="openshift-authentication/oauth-openshift-558db77b4-4tcts" Mar 11 00:58:05 crc kubenswrapper[4744]: I0311 00:58:05.124639 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/8d83a64e-bd7c-43b4-aac4-8fdc807059f5-serviceca\") pod \"image-pruner-29553120-6zw87\" (UID: \"8d83a64e-bd7c-43b4-aac4-8fdc807059f5\") " pod="openshift-image-registry/image-pruner-29553120-6zw87" Mar 11 00:58:05 crc kubenswrapper[4744]: I0311 00:58:05.124657 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7bd67414-ec33-4963-b5d8-ac374bd28a6a-config\") pod \"machine-approver-56656f9798-btvmr\" (UID: \"7bd67414-ec33-4963-b5d8-ac374bd28a6a\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-btvmr" Mar 11 00:58:05 crc kubenswrapper[4744]: I0311 00:58:05.124677 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/7d1c92dd-43a7-4311-90b1-54441f84787e-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-lws9c\" (UID: \"7d1c92dd-43a7-4311-90b1-54441f84787e\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-lws9c" Mar 11 00:58:05 crc kubenswrapper[4744]: I0311 00:58:05.124697 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: 
\"kubernetes.io/secret/47244264-b9e7-4f86-85d7-5406ed8d8833-metrics-tls\") pod \"dns-operator-744455d44c-4jlsm\" (UID: \"47244264-b9e7-4f86-85d7-5406ed8d8833\") " pod="openshift-dns-operator/dns-operator-744455d44c-4jlsm" Mar 11 00:58:05 crc kubenswrapper[4744]: I0311 00:58:05.124718 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k4m7z\" (UniqueName: \"kubernetes.io/projected/acc45881-56ff-4010-8eda-103f41f90bc5-kube-api-access-k4m7z\") pod \"cluster-samples-operator-665b6dd947-pn84h\" (UID: \"acc45881-56ff-4010-8eda-103f41f90bc5\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-pn84h" Mar 11 00:58:05 crc kubenswrapper[4744]: I0311 00:58:05.124740 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/74a61e39-2210-4bb1-96c9-509eda04c4c7-audit-dir\") pod \"oauth-openshift-558db77b4-4tcts\" (UID: \"74a61e39-2210-4bb1-96c9-509eda04c4c7\") " pod="openshift-authentication/oauth-openshift-558db77b4-4tcts" Mar 11 00:58:05 crc kubenswrapper[4744]: I0311 00:58:05.124757 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/de688bff-78ce-4d0f-ad7e-548ca640887a-config\") pod \"etcd-operator-b45778765-wm2rt\" (UID: \"de688bff-78ce-4d0f-ad7e-548ca640887a\") " pod="openshift-etcd-operator/etcd-operator-b45778765-wm2rt" Mar 11 00:58:05 crc kubenswrapper[4744]: I0311 00:58:05.125742 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29553178-56m2v"] Mar 11 00:58:05 crc kubenswrapper[4744]: I0311 00:58:05.126193 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0d55c2f3-2eef-42eb-8627-ccd40e21f4d0-trusted-ca-bundle\") pod \"console-f9d7485db-msd9d\" (UID: \"0d55c2f3-2eef-42eb-8627-ccd40e21f4d0\") " 
pod="openshift-console/console-f9d7485db-msd9d" Mar 11 00:58:05 crc kubenswrapper[4744]: I0311 00:58:05.127032 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca/service-ca-9c57cc56f-kx277" Mar 11 00:58:05 crc kubenswrapper[4744]: I0311 00:58:05.127787 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/24ac204b-1627-404f-b33c-fc77ded356d1-trusted-ca-bundle\") pod \"apiserver-76f77b778f-txbh2\" (UID: \"24ac204b-1627-404f-b33c-fc77ded356d1\") " pod="openshift-apiserver/apiserver-76f77b778f-txbh2" Mar 11 00:58:05 crc kubenswrapper[4744]: I0311 00:58:05.128759 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/24ac204b-1627-404f-b33c-fc77ded356d1-etcd-serving-ca\") pod \"apiserver-76f77b778f-txbh2\" (UID: \"24ac204b-1627-404f-b33c-fc77ded356d1\") " pod="openshift-apiserver/apiserver-76f77b778f-txbh2" Mar 11 00:58:05 crc kubenswrapper[4744]: I0311 00:58:05.126223 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t2wdr\" (UniqueName: \"kubernetes.io/projected/fa83e422-8374-4da6-a356-ae7feadfe282-kube-api-access-t2wdr\") pod \"route-controller-manager-6576b87f9c-tqzmz\" (UID: \"fa83e422-8374-4da6-a356-ae7feadfe282\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-tqzmz" Mar 11 00:58:05 crc kubenswrapper[4744]: I0311 00:58:05.129377 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/24ac204b-1627-404f-b33c-fc77ded356d1-config\") pod \"apiserver-76f77b778f-txbh2\" (UID: \"24ac204b-1627-404f-b33c-fc77ded356d1\") " pod="openshift-apiserver/apiserver-76f77b778f-txbh2" Mar 11 00:58:05 crc kubenswrapper[4744]: I0311 00:58:05.129869 4744 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/74a61e39-2210-4bb1-96c9-509eda04c4c7-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-4tcts\" (UID: \"74a61e39-2210-4bb1-96c9-509eda04c4c7\") " pod="openshift-authentication/oauth-openshift-558db77b4-4tcts" Mar 11 00:58:05 crc kubenswrapper[4744]: I0311 00:58:05.129927 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5fpmz\" (UniqueName: \"kubernetes.io/projected/24ac204b-1627-404f-b33c-fc77ded356d1-kube-api-access-5fpmz\") pod \"apiserver-76f77b778f-txbh2\" (UID: \"24ac204b-1627-404f-b33c-fc77ded356d1\") " pod="openshift-apiserver/apiserver-76f77b778f-txbh2" Mar 11 00:58:05 crc kubenswrapper[4744]: I0311 00:58:05.129953 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/fa83e422-8374-4da6-a356-ae7feadfe282-client-ca\") pod \"route-controller-manager-6576b87f9c-tqzmz\" (UID: \"fa83e422-8374-4da6-a356-ae7feadfe282\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-tqzmz" Mar 11 00:58:05 crc kubenswrapper[4744]: I0311 00:58:05.130014 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0359d6fc-1139-4dbc-a50f-55fa91607935-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-chlpc\" (UID: \"0359d6fc-1139-4dbc-a50f-55fa91607935\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-chlpc" Mar 11 00:58:05 crc kubenswrapper[4744]: I0311 00:58:05.130037 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/24ac204b-1627-404f-b33c-fc77ded356d1-encryption-config\") pod \"apiserver-76f77b778f-txbh2\" (UID: \"24ac204b-1627-404f-b33c-fc77ded356d1\") " 
pod="openshift-apiserver/apiserver-76f77b778f-txbh2" Mar 11 00:58:05 crc kubenswrapper[4744]: I0311 00:58:05.130056 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/8495f637-031c-4280-be13-d5aae9c99eca-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-9gtcq\" (UID: \"8495f637-031c-4280-be13-d5aae9c99eca\") " pod="openshift-controller-manager/controller-manager-879f6c89f-9gtcq" Mar 11 00:58:05 crc kubenswrapper[4744]: I0311 00:58:05.130100 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/7d1c92dd-43a7-4311-90b1-54441f84787e-images\") pod \"machine-api-operator-5694c8668f-lws9c\" (UID: \"7d1c92dd-43a7-4311-90b1-54441f84787e\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-lws9c" Mar 11 00:58:05 crc kubenswrapper[4744]: I0311 00:58:05.130139 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k5wkr\" (UniqueName: \"kubernetes.io/projected/51238090-1fbe-446b-a63d-5ec9c3137c61-kube-api-access-k5wkr\") pod \"openshift-controller-manager-operator-756b6f6bc6-xfdgm\" (UID: \"51238090-1fbe-446b-a63d-5ec9c3137c61\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-xfdgm" Mar 11 00:58:05 crc kubenswrapper[4744]: I0311 00:58:05.130180 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/7bd67414-ec33-4963-b5d8-ac374bd28a6a-auth-proxy-config\") pod \"machine-approver-56656f9798-btvmr\" (UID: \"7bd67414-ec33-4963-b5d8-ac374bd28a6a\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-btvmr" Mar 11 00:58:05 crc kubenswrapper[4744]: I0311 00:58:05.130202 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-import-ca\" 
(UniqueName: \"kubernetes.io/configmap/24ac204b-1627-404f-b33c-fc77ded356d1-image-import-ca\") pod \"apiserver-76f77b778f-txbh2\" (UID: \"24ac204b-1627-404f-b33c-fc77ded356d1\") " pod="openshift-apiserver/apiserver-76f77b778f-txbh2" Mar 11 00:58:05 crc kubenswrapper[4744]: I0311 00:58:05.130221 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ab6c95e5-e2c0-4cb3-a647-930b4c4d2927-serving-cert\") pod \"authentication-operator-69f744f599-286h7\" (UID: \"ab6c95e5-e2c0-4cb3-a647-930b4c4d2927\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-286h7" Mar 11 00:58:05 crc kubenswrapper[4744]: I0311 00:58:05.130224 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/24ac204b-1627-404f-b33c-fc77ded356d1-config\") pod \"apiserver-76f77b778f-txbh2\" (UID: \"24ac204b-1627-404f-b33c-fc77ded356d1\") " pod="openshift-apiserver/apiserver-76f77b778f-txbh2" Mar 11 00:58:05 crc kubenswrapper[4744]: I0311 00:58:05.130258 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/8495f637-031c-4280-be13-d5aae9c99eca-client-ca\") pod \"controller-manager-879f6c89f-9gtcq\" (UID: \"8495f637-031c-4280-be13-d5aae9c99eca\") " pod="openshift-controller-manager/controller-manager-879f6c89f-9gtcq" Mar 11 00:58:05 crc kubenswrapper[4744]: I0311 00:58:05.130277 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fa83e422-8374-4da6-a356-ae7feadfe282-config\") pod \"route-controller-manager-6576b87f9c-tqzmz\" (UID: \"fa83e422-8374-4da6-a356-ae7feadfe282\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-tqzmz" Mar 11 00:58:05 crc kubenswrapper[4744]: I0311 00:58:05.130296 4744 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qg5qq\" (UniqueName: \"kubernetes.io/projected/c284cbf1-b5e2-4f77-b14c-c0030f140a91-kube-api-access-qg5qq\") pod \"console-operator-58897d9998-ptbbd\" (UID: \"c284cbf1-b5e2-4f77-b14c-c0030f140a91\") " pod="openshift-console-operator/console-operator-58897d9998-ptbbd" Mar 11 00:58:05 crc kubenswrapper[4744]: I0311 00:58:05.130334 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/74a61e39-2210-4bb1-96c9-509eda04c4c7-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-4tcts\" (UID: \"74a61e39-2210-4bb1-96c9-509eda04c4c7\") " pod="openshift-authentication/oauth-openshift-558db77b4-4tcts" Mar 11 00:58:05 crc kubenswrapper[4744]: I0311 00:58:05.130371 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/de688bff-78ce-4d0f-ad7e-548ca640887a-etcd-ca\") pod \"etcd-operator-b45778765-wm2rt\" (UID: \"de688bff-78ce-4d0f-ad7e-548ca640887a\") " pod="openshift-etcd-operator/etcd-operator-b45778765-wm2rt" Mar 11 00:58:05 crc kubenswrapper[4744]: I0311 00:58:05.130749 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c284cbf1-b5e2-4f77-b14c-c0030f140a91-serving-cert\") pod \"console-operator-58897d9998-ptbbd\" (UID: \"c284cbf1-b5e2-4f77-b14c-c0030f140a91\") " pod="openshift-console-operator/console-operator-58897d9998-ptbbd" Mar 11 00:58:05 crc kubenswrapper[4744]: I0311 00:58:05.130807 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0d55c2f3-2eef-42eb-8627-ccd40e21f4d0-service-ca\") pod \"console-f9d7485db-msd9d\" (UID: \"0d55c2f3-2eef-42eb-8627-ccd40e21f4d0\") " 
pod="openshift-console/console-f9d7485db-msd9d" Mar 11 00:58:05 crc kubenswrapper[4744]: I0311 00:58:05.130828 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/24ac204b-1627-404f-b33c-fc77ded356d1-audit\") pod \"apiserver-76f77b778f-txbh2\" (UID: \"24ac204b-1627-404f-b33c-fc77ded356d1\") " pod="openshift-apiserver/apiserver-76f77b778f-txbh2" Mar 11 00:58:05 crc kubenswrapper[4744]: I0311 00:58:05.130857 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/74a61e39-2210-4bb1-96c9-509eda04c4c7-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-4tcts\" (UID: \"74a61e39-2210-4bb1-96c9-509eda04c4c7\") " pod="openshift-authentication/oauth-openshift-558db77b4-4tcts" Mar 11 00:58:05 crc kubenswrapper[4744]: I0311 00:58:05.130878 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/24ac204b-1627-404f-b33c-fc77ded356d1-serving-cert\") pod \"apiserver-76f77b778f-txbh2\" (UID: \"24ac204b-1627-404f-b33c-fc77ded356d1\") " pod="openshift-apiserver/apiserver-76f77b778f-txbh2" Mar 11 00:58:05 crc kubenswrapper[4744]: I0311 00:58:05.130898 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/74a61e39-2210-4bb1-96c9-509eda04c4c7-audit-policies\") pod \"oauth-openshift-558db77b4-4tcts\" (UID: \"74a61e39-2210-4bb1-96c9-509eda04c4c7\") " pod="openshift-authentication/oauth-openshift-558db77b4-4tcts" Mar 11 00:58:05 crc kubenswrapper[4744]: I0311 00:58:05.130925 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/24ac204b-1627-404f-b33c-fc77ded356d1-node-pullsecrets\") pod \"apiserver-76f77b778f-txbh2\" (UID: 
\"24ac204b-1627-404f-b33c-fc77ded356d1\") " pod="openshift-apiserver/apiserver-76f77b778f-txbh2" Mar 11 00:58:05 crc kubenswrapper[4744]: I0311 00:58:05.130946 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/4d6c2f5e-32a4-484f-b6fa-a240cf8bb90d-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-74tmz\" (UID: \"4d6c2f5e-32a4-484f-b6fa-a240cf8bb90d\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-74tmz" Mar 11 00:58:05 crc kubenswrapper[4744]: I0311 00:58:05.130974 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/51238090-1fbe-446b-a63d-5ec9c3137c61-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-xfdgm\" (UID: \"51238090-1fbe-446b-a63d-5ec9c3137c61\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-xfdgm" Mar 11 00:58:05 crc kubenswrapper[4744]: I0311 00:58:05.131009 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/0d55c2f3-2eef-42eb-8627-ccd40e21f4d0-console-config\") pod \"console-f9d7485db-msd9d\" (UID: \"0d55c2f3-2eef-42eb-8627-ccd40e21f4d0\") " pod="openshift-console/console-f9d7485db-msd9d" Mar 11 00:58:05 crc kubenswrapper[4744]: I0311 00:58:05.131033 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/74a61e39-2210-4bb1-96c9-509eda04c4c7-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-4tcts\" (UID: \"74a61e39-2210-4bb1-96c9-509eda04c4c7\") " pod="openshift-authentication/oauth-openshift-558db77b4-4tcts" Mar 11 00:58:05 crc kubenswrapper[4744]: I0311 00:58:05.131064 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"serving-cert\" (UniqueName: \"kubernetes.io/secret/8495f637-031c-4280-be13-d5aae9c99eca-serving-cert\") pod \"controller-manager-879f6c89f-9gtcq\" (UID: \"8495f637-031c-4280-be13-d5aae9c99eca\") " pod="openshift-controller-manager/controller-manager-879f6c89f-9gtcq" Mar 11 00:58:05 crc kubenswrapper[4744]: I0311 00:58:05.131086 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-82zqb\" (UniqueName: \"kubernetes.io/projected/8495f637-031c-4280-be13-d5aae9c99eca-kube-api-access-82zqb\") pod \"controller-manager-879f6c89f-9gtcq\" (UID: \"8495f637-031c-4280-be13-d5aae9c99eca\") " pod="openshift-controller-manager/controller-manager-879f6c89f-9gtcq" Mar 11 00:58:05 crc kubenswrapper[4744]: I0311 00:58:05.131105 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/4d6c2f5e-32a4-484f-b6fa-a240cf8bb90d-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-74tmz\" (UID: \"4d6c2f5e-32a4-484f-b6fa-a240cf8bb90d\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-74tmz" Mar 11 00:58:05 crc kubenswrapper[4744]: I0311 00:58:05.132634 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/61dac64e-6219-46fe-80b0-420098bb260b-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-l2fh5\" (UID: \"61dac64e-6219-46fe-80b0-420098bb260b\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-l2fh5" Mar 11 00:58:05 crc kubenswrapper[4744]: I0311 00:58:05.132670 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/51238090-1fbe-446b-a63d-5ec9c3137c61-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-xfdgm\" (UID: \"51238090-1fbe-446b-a63d-5ec9c3137c61\") " 
pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-xfdgm" Mar 11 00:58:05 crc kubenswrapper[4744]: I0311 00:58:05.132694 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0359d6fc-1139-4dbc-a50f-55fa91607935-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-chlpc\" (UID: \"0359d6fc-1139-4dbc-a50f-55fa91607935\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-chlpc" Mar 11 00:58:05 crc kubenswrapper[4744]: I0311 00:58:05.132719 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/0d55c2f3-2eef-42eb-8627-ccd40e21f4d0-console-serving-cert\") pod \"console-f9d7485db-msd9d\" (UID: \"0d55c2f3-2eef-42eb-8627-ccd40e21f4d0\") " pod="openshift-console/console-f9d7485db-msd9d" Mar 11 00:58:05 crc kubenswrapper[4744]: I0311 00:58:05.132742 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lj7nl\" (UniqueName: \"kubernetes.io/projected/7d1c92dd-43a7-4311-90b1-54441f84787e-kube-api-access-lj7nl\") pod \"machine-api-operator-5694c8668f-lws9c\" (UID: \"7d1c92dd-43a7-4311-90b1-54441f84787e\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-lws9c" Mar 11 00:58:05 crc kubenswrapper[4744]: I0311 00:58:05.132766 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8495f637-031c-4280-be13-d5aae9c99eca-config\") pod \"controller-manager-879f6c89f-9gtcq\" (UID: \"8495f637-031c-4280-be13-d5aae9c99eca\") " pod="openshift-controller-manager/controller-manager-879f6c89f-9gtcq" Mar 11 00:58:05 crc kubenswrapper[4744]: I0311 00:58:05.132786 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/4d6c2f5e-32a4-484f-b6fa-a240cf8bb90d-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-74tmz\" (UID: \"4d6c2f5e-32a4-484f-b6fa-a240cf8bb90d\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-74tmz" Mar 11 00:58:05 crc kubenswrapper[4744]: I0311 00:58:05.132817 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/24ac204b-1627-404f-b33c-fc77ded356d1-audit-dir\") pod \"apiserver-76f77b778f-txbh2\" (UID: \"24ac204b-1627-404f-b33c-fc77ded356d1\") " pod="openshift-apiserver/apiserver-76f77b778f-txbh2" Mar 11 00:58:05 crc kubenswrapper[4744]: I0311 00:58:05.133121 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/74a61e39-2210-4bb1-96c9-509eda04c4c7-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-4tcts\" (UID: \"74a61e39-2210-4bb1-96c9-509eda04c4c7\") " pod="openshift-authentication/oauth-openshift-558db77b4-4tcts" Mar 11 00:58:05 crc kubenswrapper[4744]: I0311 00:58:05.130201 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7bd67414-ec33-4963-b5d8-ac374bd28a6a-config\") pod \"machine-approver-56656f9798-btvmr\" (UID: \"7bd67414-ec33-4963-b5d8-ac374bd28a6a\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-btvmr" Mar 11 00:58:05 crc kubenswrapper[4744]: I0311 00:58:05.133539 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/74a61e39-2210-4bb1-96c9-509eda04c4c7-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-4tcts\" (UID: \"74a61e39-2210-4bb1-96c9-509eda04c4c7\") " pod="openshift-authentication/oauth-openshift-558db77b4-4tcts" Mar 11 00:58:05 crc 
kubenswrapper[4744]: I0311 00:58:05.133672 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/74a61e39-2210-4bb1-96c9-509eda04c4c7-audit-dir\") pod \"oauth-openshift-558db77b4-4tcts\" (UID: \"74a61e39-2210-4bb1-96c9-509eda04c4c7\") " pod="openshift-authentication/oauth-openshift-558db77b4-4tcts" Mar 11 00:58:05 crc kubenswrapper[4744]: I0311 00:58:05.133725 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/24ac204b-1627-404f-b33c-fc77ded356d1-audit-dir\") pod \"apiserver-76f77b778f-txbh2\" (UID: \"24ac204b-1627-404f-b33c-fc77ded356d1\") " pod="openshift-apiserver/apiserver-76f77b778f-txbh2" Mar 11 00:58:05 crc kubenswrapper[4744]: I0311 00:58:05.134447 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/24ac204b-1627-404f-b33c-fc77ded356d1-image-import-ca\") pod \"apiserver-76f77b778f-txbh2\" (UID: \"24ac204b-1627-404f-b33c-fc77ded356d1\") " pod="openshift-apiserver/apiserver-76f77b778f-txbh2" Mar 11 00:58:05 crc kubenswrapper[4744]: I0311 00:58:05.134571 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/0d55c2f3-2eef-42eb-8627-ccd40e21f4d0-console-config\") pod \"console-f9d7485db-msd9d\" (UID: \"0d55c2f3-2eef-42eb-8627-ccd40e21f4d0\") " pod="openshift-console/console-f9d7485db-msd9d" Mar 11 00:58:05 crc kubenswrapper[4744]: I0311 00:58:05.134681 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/24ac204b-1627-404f-b33c-fc77ded356d1-encryption-config\") pod \"apiserver-76f77b778f-txbh2\" (UID: \"24ac204b-1627-404f-b33c-fc77ded356d1\") " pod="openshift-apiserver/apiserver-76f77b778f-txbh2" Mar 11 00:58:05 crc kubenswrapper[4744]: I0311 00:58:05.129653 4744 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/74a61e39-2210-4bb1-96c9-509eda04c4c7-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-4tcts\" (UID: \"74a61e39-2210-4bb1-96c9-509eda04c4c7\") " pod="openshift-authentication/oauth-openshift-558db77b4-4tcts" Mar 11 00:58:05 crc kubenswrapper[4744]: I0311 00:58:05.135061 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0d55c2f3-2eef-42eb-8627-ccd40e21f4d0-trusted-ca-bundle\") pod \"console-f9d7485db-msd9d\" (UID: \"0d55c2f3-2eef-42eb-8627-ccd40e21f4d0\") " pod="openshift-console/console-f9d7485db-msd9d" Mar 11 00:58:05 crc kubenswrapper[4744]: I0311 00:58:05.135102 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/74a61e39-2210-4bb1-96c9-509eda04c4c7-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-4tcts\" (UID: \"74a61e39-2210-4bb1-96c9-509eda04c4c7\") " pod="openshift-authentication/oauth-openshift-558db77b4-4tcts" Mar 11 00:58:05 crc kubenswrapper[4744]: I0311 00:58:05.135314 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/0d55c2f3-2eef-42eb-8627-ccd40e21f4d0-oauth-serving-cert\") pod \"console-f9d7485db-msd9d\" (UID: \"0d55c2f3-2eef-42eb-8627-ccd40e21f4d0\") " pod="openshift-console/console-f9d7485db-msd9d" Mar 11 00:58:05 crc kubenswrapper[4744]: I0311 00:58:05.135316 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/7bd67414-ec33-4963-b5d8-ac374bd28a6a-auth-proxy-config\") pod \"machine-approver-56656f9798-btvmr\" (UID: \"7bd67414-ec33-4963-b5d8-ac374bd28a6a\") " 
pod="openshift-cluster-machine-approver/machine-approver-56656f9798-btvmr" Mar 11 00:58:05 crc kubenswrapper[4744]: I0311 00:58:05.135451 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/74a61e39-2210-4bb1-96c9-509eda04c4c7-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-4tcts\" (UID: \"74a61e39-2210-4bb1-96c9-509eda04c4c7\") " pod="openshift-authentication/oauth-openshift-558db77b4-4tcts" Mar 11 00:58:05 crc kubenswrapper[4744]: I0311 00:58:05.135599 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-wcmgb" Mar 11 00:58:05 crc kubenswrapper[4744]: I0311 00:58:05.135683 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/0d55c2f3-2eef-42eb-8627-ccd40e21f4d0-console-oauth-config\") pod \"console-f9d7485db-msd9d\" (UID: \"0d55c2f3-2eef-42eb-8627-ccd40e21f4d0\") " pod="openshift-console/console-f9d7485db-msd9d" Mar 11 00:58:05 crc kubenswrapper[4744]: I0311 00:58:05.135949 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/24ac204b-1627-404f-b33c-fc77ded356d1-audit\") pod \"apiserver-76f77b778f-txbh2\" (UID: \"24ac204b-1627-404f-b33c-fc77ded356d1\") " pod="openshift-apiserver/apiserver-76f77b778f-txbh2" Mar 11 00:58:05 crc kubenswrapper[4744]: I0311 00:58:05.136356 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/24ac204b-1627-404f-b33c-fc77ded356d1-node-pullsecrets\") pod \"apiserver-76f77b778f-txbh2\" (UID: \"24ac204b-1627-404f-b33c-fc77ded356d1\") " pod="openshift-apiserver/apiserver-76f77b778f-txbh2" Mar 11 00:58:05 crc kubenswrapper[4744]: I0311 00:58:05.136449 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"config\" (UniqueName: \"kubernetes.io/configmap/51238090-1fbe-446b-a63d-5ec9c3137c61-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-xfdgm\" (UID: \"51238090-1fbe-446b-a63d-5ec9c3137c61\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-xfdgm" Mar 11 00:58:05 crc kubenswrapper[4744]: I0311 00:58:05.136483 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0d55c2f3-2eef-42eb-8627-ccd40e21f4d0-service-ca\") pod \"console-f9d7485db-msd9d\" (UID: \"0d55c2f3-2eef-42eb-8627-ccd40e21f4d0\") " pod="openshift-console/console-f9d7485db-msd9d" Mar 11 00:58:05 crc kubenswrapper[4744]: I0311 00:58:05.136914 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/74a61e39-2210-4bb1-96c9-509eda04c4c7-audit-policies\") pod \"oauth-openshift-558db77b4-4tcts\" (UID: \"74a61e39-2210-4bb1-96c9-509eda04c4c7\") " pod="openshift-authentication/oauth-openshift-558db77b4-4tcts" Mar 11 00:58:05 crc kubenswrapper[4744]: I0311 00:58:05.137200 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-fhbhq"] Mar 11 00:58:05 crc kubenswrapper[4744]: I0311 00:58:05.137378 4744 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29553178-56m2v" Mar 11 00:58:05 crc kubenswrapper[4744]: I0311 00:58:05.138038 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/74a61e39-2210-4bb1-96c9-509eda04c4c7-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-4tcts\" (UID: \"74a61e39-2210-4bb1-96c9-509eda04c4c7\") " pod="openshift-authentication/oauth-openshift-558db77b4-4tcts" Mar 11 00:58:05 crc kubenswrapper[4744]: I0311 00:58:05.138065 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-n8wns"] Mar 11 00:58:05 crc kubenswrapper[4744]: I0311 00:58:05.138391 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-f9d7485db-msd9d"] Mar 11 00:58:05 crc kubenswrapper[4744]: I0311 00:58:05.138413 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-9gtcq"] Mar 11 00:58:05 crc kubenswrapper[4744]: I0311 00:58:05.138436 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/74a61e39-2210-4bb1-96c9-509eda04c4c7-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-4tcts\" (UID: \"74a61e39-2210-4bb1-96c9-509eda04c4c7\") " pod="openshift-authentication/oauth-openshift-558db77b4-4tcts" Mar 11 00:58:05 crc kubenswrapper[4744]: I0311 00:58:05.138459 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-777779d784-n8wns" Mar 11 00:58:05 crc kubenswrapper[4744]: I0311 00:58:05.138627 4744 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-fhbhq" Mar 11 00:58:05 crc kubenswrapper[4744]: I0311 00:58:05.138952 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-server-h2stx"] Mar 11 00:58:05 crc kubenswrapper[4744]: I0311 00:58:05.138983 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/51238090-1fbe-446b-a63d-5ec9c3137c61-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-xfdgm\" (UID: \"51238090-1fbe-446b-a63d-5ec9c3137c61\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-xfdgm" Mar 11 00:58:05 crc kubenswrapper[4744]: I0311 00:58:05.139455 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/acc45881-56ff-4010-8eda-103f41f90bc5-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-pn84h\" (UID: \"acc45881-56ff-4010-8eda-103f41f90bc5\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-pn84h" Mar 11 00:58:05 crc kubenswrapper[4744]: I0311 00:58:05.140468 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-286h7"] Mar 11 00:58:05 crc kubenswrapper[4744]: I0311 00:58:05.140608 4744 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-server-h2stx" Mar 11 00:58:05 crc kubenswrapper[4744]: I0311 00:58:05.140738 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/74a61e39-2210-4bb1-96c9-509eda04c4c7-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-4tcts\" (UID: \"74a61e39-2210-4bb1-96c9-509eda04c4c7\") " pod="openshift-authentication/oauth-openshift-558db77b4-4tcts" Mar 11 00:58:05 crc kubenswrapper[4744]: I0311 00:58:05.141594 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/8d83a64e-bd7c-43b4-aac4-8fdc807059f5-serviceca\") pod \"image-pruner-29553120-6zw87\" (UID: \"8d83a64e-bd7c-43b4-aac4-8fdc807059f5\") " pod="openshift-image-registry/image-pruner-29553120-6zw87" Mar 11 00:58:05 crc kubenswrapper[4744]: I0311 00:58:05.142585 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/24ac204b-1627-404f-b33c-fc77ded356d1-serving-cert\") pod \"apiserver-76f77b778f-txbh2\" (UID: \"24ac204b-1627-404f-b33c-fc77ded356d1\") " pod="openshift-apiserver/apiserver-76f77b778f-txbh2" Mar 11 00:58:05 crc kubenswrapper[4744]: I0311 00:58:05.142655 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/0d55c2f3-2eef-42eb-8627-ccd40e21f4d0-console-serving-cert\") pod \"console-f9d7485db-msd9d\" (UID: \"0d55c2f3-2eef-42eb-8627-ccd40e21f4d0\") " pod="openshift-console/console-f9d7485db-msd9d" Mar 11 00:58:05 crc kubenswrapper[4744]: I0311 00:58:05.143623 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-pruner-29553120-6zw87"] Mar 11 00:58:05 crc kubenswrapper[4744]: I0311 00:58:05.144618 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openshift-authentication/oauth-openshift-558db77b4-4tcts"] Mar 11 00:58:05 crc kubenswrapper[4744]: I0311 00:58:05.147651 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/47244264-b9e7-4f86-85d7-5406ed8d8833-metrics-tls\") pod \"dns-operator-744455d44c-4jlsm\" (UID: \"47244264-b9e7-4f86-85d7-5406ed8d8833\") " pod="openshift-dns-operator/dns-operator-744455d44c-4jlsm" Mar 11 00:58:05 crc kubenswrapper[4744]: I0311 00:58:05.148524 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/74a61e39-2210-4bb1-96c9-509eda04c4c7-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-4tcts\" (UID: \"74a61e39-2210-4bb1-96c9-509eda04c4c7\") " pod="openshift-authentication/oauth-openshift-558db77b4-4tcts" Mar 11 00:58:05 crc kubenswrapper[4744]: I0311 00:58:05.149426 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-wm2rt"] Mar 11 00:58:05 crc kubenswrapper[4744]: I0311 00:58:05.149855 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/24ac204b-1627-404f-b33c-fc77ded356d1-etcd-client\") pod \"apiserver-76f77b778f-txbh2\" (UID: \"24ac204b-1627-404f-b33c-fc77ded356d1\") " pod="openshift-apiserver/apiserver-76f77b778f-txbh2" Mar 11 00:58:05 crc kubenswrapper[4744]: I0311 00:58:05.150035 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"trusted-ca-bundle" Mar 11 00:58:05 crc kubenswrapper[4744]: I0311 00:58:05.150068 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/74a61e39-2210-4bb1-96c9-509eda04c4c7-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-4tcts\" (UID: 
\"74a61e39-2210-4bb1-96c9-509eda04c4c7\") " pod="openshift-authentication/oauth-openshift-558db77b4-4tcts" Mar 11 00:58:05 crc kubenswrapper[4744]: I0311 00:58:05.150904 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/7bd67414-ec33-4963-b5d8-ac374bd28a6a-machine-approver-tls\") pod \"machine-approver-56656f9798-btvmr\" (UID: \"7bd67414-ec33-4963-b5d8-ac374bd28a6a\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-btvmr" Mar 11 00:58:05 crc kubenswrapper[4744]: I0311 00:58:05.151093 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-f8rpg"] Mar 11 00:58:05 crc kubenswrapper[4744]: I0311 00:58:05.152218 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-f8rpg" Mar 11 00:58:05 crc kubenswrapper[4744]: I0311 00:58:05.152594 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-tqzmz"] Mar 11 00:58:05 crc kubenswrapper[4744]: I0311 00:58:05.154200 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-txbh2"] Mar 11 00:58:05 crc kubenswrapper[4744]: I0311 00:58:05.154621 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-kgl9z"] Mar 11 00:58:05 crc kubenswrapper[4744]: I0311 00:58:05.157479 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/74a61e39-2210-4bb1-96c9-509eda04c4c7-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-4tcts\" (UID: \"74a61e39-2210-4bb1-96c9-509eda04c4c7\") " pod="openshift-authentication/oauth-openshift-558db77b4-4tcts" Mar 11 00:58:05 crc 
kubenswrapper[4744]: I0311 00:58:05.162248 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-chlpc"] Mar 11 00:58:05 crc kubenswrapper[4744]: I0311 00:58:05.166702 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-cg2gg"] Mar 11 00:58:05 crc kubenswrapper[4744]: I0311 00:58:05.166757 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-85z54"] Mar 11 00:58:05 crc kubenswrapper[4744]: I0311 00:58:05.166770 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-bbjh8"] Mar 11 00:58:05 crc kubenswrapper[4744]: I0311 00:58:05.170708 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-pn84h"] Mar 11 00:58:05 crc kubenswrapper[4744]: I0311 00:58:05.172447 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"encryption-config-1" Mar 11 00:58:05 crc kubenswrapper[4744]: I0311 00:58:05.175059 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-gnfcr"] Mar 11 00:58:05 crc kubenswrapper[4744]: I0311 00:58:05.176805 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-h49lh"] Mar 11 00:58:05 crc kubenswrapper[4744]: I0311 00:58:05.178314 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-74tmz"] Mar 11 00:58:05 crc kubenswrapper[4744]: I0311 00:58:05.180631 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-56d4r"] Mar 11 00:58:05 crc kubenswrapper[4744]: I0311 00:58:05.184393 
4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-58897d9998-ptbbd"] Mar 11 00:58:05 crc kubenswrapper[4744]: I0311 00:58:05.185825 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-lws9c"] Mar 11 00:58:05 crc kubenswrapper[4744]: I0311 00:58:05.187874 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-l2fh5"] Mar 11 00:58:05 crc kubenswrapper[4744]: I0311 00:58:05.189412 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-k6qjd"] Mar 11 00:58:05 crc kubenswrapper[4744]: I0311 00:58:05.190837 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"oauth-apiserver-sa-dockercfg-6r2bq" Mar 11 00:58:05 crc kubenswrapper[4744]: I0311 00:58:05.191367 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-t54km"] Mar 11 00:58:05 crc kubenswrapper[4744]: I0311 00:58:05.192807 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-f8rpg"] Mar 11 00:58:05 crc kubenswrapper[4744]: I0311 00:58:05.193995 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-6k4hw"] Mar 11 00:58:05 crc kubenswrapper[4744]: I0311 00:58:05.195053 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-4jlsm"] Mar 11 00:58:05 crc kubenswrapper[4744]: I0311 00:58:05.196177 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-94dsr"] Mar 11 00:58:05 crc kubenswrapper[4744]: I0311 00:58:05.197337 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-v4b66"] Mar 11 00:58:05 crc kubenswrapper[4744]: I0311 00:58:05.198994 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-rq8sg"] Mar 11 00:58:05 crc kubenswrapper[4744]: I0311 00:58:05.199923 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-rq8sg" Mar 11 00:58:05 crc kubenswrapper[4744]: I0311 00:58:05.200194 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-4rmzp"] Mar 11 00:58:05 crc kubenswrapper[4744]: I0311 00:58:05.202678 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29553165-4gpb9"] Mar 11 00:58:05 crc kubenswrapper[4744]: I0311 00:58:05.202800 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="hostpath-provisioner/csi-hostpathplugin-4rmzp" Mar 11 00:58:05 crc kubenswrapper[4744]: I0311 00:58:05.203095 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-wcmgb"] Mar 11 00:58:05 crc kubenswrapper[4744]: I0311 00:58:05.204578 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29553178-56m2v"] Mar 11 00:58:05 crc kubenswrapper[4744]: I0311 00:58:05.205259 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-n8wns"] Mar 11 00:58:05 crc kubenswrapper[4744]: I0311 00:58:05.206702 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-rq8sg"] Mar 11 00:58:05 crc kubenswrapper[4744]: I0311 00:58:05.207290 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-fhbhq"] Mar 11 00:58:05 crc kubenswrapper[4744]: I0311 00:58:05.208749 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-slpdp"] Mar 11 00:58:05 crc kubenswrapper[4744]: I0311 00:58:05.209775 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-4rmzp"] Mar 11 00:58:05 crc kubenswrapper[4744]: I0311 00:58:05.210251 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"etcd-client" Mar 11 00:58:05 crc kubenswrapper[4744]: I0311 00:58:05.211303 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-hf5rf"] Mar 11 00:58:05 crc kubenswrapper[4744]: I0311 00:58:05.211910 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29553176-m7tmh"] Mar 11 00:58:05 crc kubenswrapper[4744]: I0311 00:58:05.213204 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-kx277"] Mar 11 00:58:05 crc kubenswrapper[4744]: I0311 00:58:05.213953 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-7954f5f757-7qgjq"] Mar 11 00:58:05 crc kubenswrapper[4744]: I0311 00:58:05.214881 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-6jqrc"] Mar 11 00:58:05 crc kubenswrapper[4744]: I0311 00:58:05.215642 4744 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-canary/ingress-canary-6jqrc" Mar 11 00:58:05 crc kubenswrapper[4744]: I0311 00:58:05.215892 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-6jqrc"] Mar 11 00:58:05 crc kubenswrapper[4744]: I0311 00:58:05.230667 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"serving-cert" Mar 11 00:58:05 crc kubenswrapper[4744]: I0311 00:58:05.233790 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/de688bff-78ce-4d0f-ad7e-548ca640887a-etcd-ca\") pod \"etcd-operator-b45778765-wm2rt\" (UID: \"de688bff-78ce-4d0f-ad7e-548ca640887a\") " pod="openshift-etcd-operator/etcd-operator-b45778765-wm2rt" Mar 11 00:58:05 crc kubenswrapper[4744]: I0311 00:58:05.233826 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/48b9bd28-7713-4475-92f0-b7b741e6337e-srv-cert\") pod \"olm-operator-6b444d44fb-slpdp\" (UID: \"48b9bd28-7713-4475-92f0-b7b741e6337e\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-slpdp" Mar 11 00:58:05 crc kubenswrapper[4744]: I0311 00:58:05.233875 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/acc8a04b-d619-4e9a-b2b0-f08250f329e0-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-bbjh8\" (UID: \"acc8a04b-d619-4e9a-b2b0-f08250f329e0\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-bbjh8" Mar 11 00:58:05 crc kubenswrapper[4744]: I0311 00:58:05.233900 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lc8kp\" (UniqueName: \"kubernetes.io/projected/84d53914-9806-4bc3-80ed-19cd8ff6e625-kube-api-access-lc8kp\") 
pod \"service-ca-9c57cc56f-kx277\" (UID: \"84d53914-9806-4bc3-80ed-19cd8ff6e625\") " pod="openshift-service-ca/service-ca-9c57cc56f-kx277" Mar 11 00:58:05 crc kubenswrapper[4744]: I0311 00:58:05.233924 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/4d6c2f5e-32a4-484f-b6fa-a240cf8bb90d-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-74tmz\" (UID: \"4d6c2f5e-32a4-484f-b6fa-a240cf8bb90d\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-74tmz" Mar 11 00:58:05 crc kubenswrapper[4744]: I0311 00:58:05.233944 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/eeb963e8-d683-4564-9ccd-d26a6a755e94-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-94dsr\" (UID: \"eeb963e8-d683-4564-9ccd-d26a6a755e94\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-94dsr" Mar 11 00:58:05 crc kubenswrapper[4744]: I0311 00:58:05.233966 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/acc8a04b-d619-4e9a-b2b0-f08250f329e0-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-bbjh8\" (UID: \"acc8a04b-d619-4e9a-b2b0-f08250f329e0\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-bbjh8" Mar 11 00:58:05 crc kubenswrapper[4744]: I0311 00:58:05.233995 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c2a93df7-e24f-4681-8336-ef07295f1d09-serving-cert\") pod \"openshift-config-operator-7777fb866f-hf5rf\" (UID: \"c2a93df7-e24f-4681-8336-ef07295f1d09\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-hf5rf" Mar 11 00:58:05 crc 
kubenswrapper[4744]: I0311 00:58:05.234017 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-82zqb\" (UniqueName: \"kubernetes.io/projected/8495f637-031c-4280-be13-d5aae9c99eca-kube-api-access-82zqb\") pod \"controller-manager-879f6c89f-9gtcq\" (UID: \"8495f637-031c-4280-be13-d5aae9c99eca\") " pod="openshift-controller-manager/controller-manager-879f6c89f-9gtcq" Mar 11 00:58:05 crc kubenswrapper[4744]: I0311 00:58:05.234035 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/4d6c2f5e-32a4-484f-b6fa-a240cf8bb90d-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-74tmz\" (UID: \"4d6c2f5e-32a4-484f-b6fa-a240cf8bb90d\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-74tmz" Mar 11 00:58:05 crc kubenswrapper[4744]: I0311 00:58:05.234051 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9b35ce65-26df-4169-a69a-a06c2420da9b-config\") pod \"kube-apiserver-operator-766d6c64bb-gnfcr\" (UID: \"9b35ce65-26df-4169-a69a-a06c2420da9b\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-gnfcr" Mar 11 00:58:05 crc kubenswrapper[4744]: I0311 00:58:05.234072 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/bef5e0c2-5b78-4f32-affd-aec245c27db1-profile-collector-cert\") pod \"catalog-operator-68c6474976-fhbhq\" (UID: \"bef5e0c2-5b78-4f32-affd-aec245c27db1\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-fhbhq" Mar 11 00:58:05 crc kubenswrapper[4744]: I0311 00:58:05.234092 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: 
\"kubernetes.io/configmap/5f720e42-33ed-4144-88d6-5fb8c4befac2-audit-policies\") pod \"apiserver-7bbb656c7d-h49lh\" (UID: \"5f720e42-33ed-4144-88d6-5fb8c4befac2\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-h49lh" Mar 11 00:58:05 crc kubenswrapper[4744]: I0311 00:58:05.234111 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/4d6c2f5e-32a4-484f-b6fa-a240cf8bb90d-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-74tmz\" (UID: \"4d6c2f5e-32a4-484f-b6fa-a240cf8bb90d\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-74tmz" Mar 11 00:58:05 crc kubenswrapper[4744]: I0311 00:58:05.234129 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8kckr\" (UniqueName: \"kubernetes.io/projected/ca7c15de-4f71-45fd-b1b0-4c451ce9724e-kube-api-access-8kckr\") pod \"machine-config-server-h2stx\" (UID: \"ca7c15de-4f71-45fd-b1b0-4c451ce9724e\") " pod="openshift-machine-config-operator/machine-config-server-h2stx" Mar 11 00:58:05 crc kubenswrapper[4744]: I0311 00:58:05.234148 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/66fb3c39-3d00-453f-a282-a04584652a8b-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-85z54\" (UID: \"66fb3c39-3d00-453f-a282-a04584652a8b\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-85z54" Mar 11 00:58:05 crc kubenswrapper[4744]: I0311 00:58:05.234171 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v9njb\" (UniqueName: \"kubernetes.io/projected/de688bff-78ce-4d0f-ad7e-548ca640887a-kube-api-access-v9njb\") pod \"etcd-operator-b45778765-wm2rt\" (UID: \"de688bff-78ce-4d0f-ad7e-548ca640887a\") " pod="openshift-etcd-operator/etcd-operator-b45778765-wm2rt" Mar 
11 00:58:05 crc kubenswrapper[4744]: I0311 00:58:05.234188 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/ca7c15de-4f71-45fd-b1b0-4c451ce9724e-certs\") pod \"machine-config-server-h2stx\" (UID: \"ca7c15de-4f71-45fd-b1b0-4c451ce9724e\") " pod="openshift-machine-config-operator/machine-config-server-h2stx" Mar 11 00:58:05 crc kubenswrapper[4744]: I0311 00:58:05.234206 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rd9jp\" (UniqueName: \"kubernetes.io/projected/c2a93df7-e24f-4681-8336-ef07295f1d09-kube-api-access-rd9jp\") pod \"openshift-config-operator-7777fb866f-hf5rf\" (UID: \"c2a93df7-e24f-4681-8336-ef07295f1d09\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-hf5rf" Mar 11 00:58:05 crc kubenswrapper[4744]: I0311 00:58:05.234304 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/7fb8ca9a-b3e7-4fce-b173-f8b2519962da-trusted-ca\") pod \"ingress-operator-5b745b69d9-t54km\" (UID: \"7fb8ca9a-b3e7-4fce-b173-f8b2519962da\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-t54km" Mar 11 00:58:05 crc kubenswrapper[4744]: I0311 00:58:05.234445 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z5r4x\" (UniqueName: \"kubernetes.io/projected/a8e11ddd-81e3-40f9-8ada-12abfacedca9-kube-api-access-z5r4x\") pod \"machine-config-operator-74547568cd-v4b66\" (UID: \"a8e11ddd-81e3-40f9-8ada-12abfacedca9\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-v4b66" Mar 11 00:58:05 crc kubenswrapper[4744]: I0311 00:58:05.234505 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/fa83e422-8374-4da6-a356-ae7feadfe282-serving-cert\") pod \"route-controller-manager-6576b87f9c-tqzmz\" (UID: \"fa83e422-8374-4da6-a356-ae7feadfe282\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-tqzmz" Mar 11 00:58:05 crc kubenswrapper[4744]: I0311 00:58:05.234570 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6dp2v\" (UniqueName: \"kubernetes.io/projected/4d6c2f5e-32a4-484f-b6fa-a240cf8bb90d-kube-api-access-6dp2v\") pod \"cluster-image-registry-operator-dc59b4c8b-74tmz\" (UID: \"4d6c2f5e-32a4-484f-b6fa-a240cf8bb90d\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-74tmz" Mar 11 00:58:05 crc kubenswrapper[4744]: I0311 00:58:05.234696 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/de688bff-78ce-4d0f-ad7e-548ca640887a-etcd-ca\") pod \"etcd-operator-b45778765-wm2rt\" (UID: \"de688bff-78ce-4d0f-ad7e-548ca640887a\") " pod="openshift-etcd-operator/etcd-operator-b45778765-wm2rt" Mar 11 00:58:05 crc kubenswrapper[4744]: I0311 00:58:05.234690 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/ca7c15de-4f71-45fd-b1b0-4c451ce9724e-node-bootstrap-token\") pod \"machine-config-server-h2stx\" (UID: \"ca7c15de-4f71-45fd-b1b0-4c451ce9724e\") " pod="openshift-machine-config-operator/machine-config-server-h2stx" Mar 11 00:58:05 crc kubenswrapper[4744]: I0311 00:58:05.234780 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/acc8a04b-d619-4e9a-b2b0-f08250f329e0-config\") pod \"kube-controller-manager-operator-78b949d7b-bbjh8\" (UID: \"acc8a04b-d619-4e9a-b2b0-f08250f329e0\") " 
pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-bbjh8" Mar 11 00:58:05 crc kubenswrapper[4744]: I0311 00:58:05.234837 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/de688bff-78ce-4d0f-ad7e-548ca640887a-etcd-client\") pod \"etcd-operator-b45778765-wm2rt\" (UID: \"de688bff-78ce-4d0f-ad7e-548ca640887a\") " pod="openshift-etcd-operator/etcd-operator-b45778765-wm2rt" Mar 11 00:58:05 crc kubenswrapper[4744]: I0311 00:58:05.234864 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c284cbf1-b5e2-4f77-b14c-c0030f140a91-config\") pod \"console-operator-58897d9998-ptbbd\" (UID: \"c284cbf1-b5e2-4f77-b14c-c0030f140a91\") " pod="openshift-console-operator/console-operator-58897d9998-ptbbd" Mar 11 00:58:05 crc kubenswrapper[4744]: I0311 00:58:05.234893 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/de688bff-78ce-4d0f-ad7e-548ca640887a-etcd-service-ca\") pod \"etcd-operator-b45778765-wm2rt\" (UID: \"de688bff-78ce-4d0f-ad7e-548ca640887a\") " pod="openshift-etcd-operator/etcd-operator-b45778765-wm2rt" Mar 11 00:58:05 crc kubenswrapper[4744]: I0311 00:58:05.234917 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/4d6c2f5e-32a4-484f-b6fa-a240cf8bb90d-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-74tmz\" (UID: \"4d6c2f5e-32a4-484f-b6fa-a240cf8bb90d\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-74tmz" Mar 11 00:58:05 crc kubenswrapper[4744]: I0311 00:58:05.234922 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jkl5n\" (UniqueName: 
\"kubernetes.io/projected/7fb8ca9a-b3e7-4fce-b173-f8b2519962da-kube-api-access-jkl5n\") pod \"ingress-operator-5b745b69d9-t54km\" (UID: \"7fb8ca9a-b3e7-4fce-b173-f8b2519962da\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-t54km" Mar 11 00:58:05 crc kubenswrapper[4744]: I0311 00:58:05.234971 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ab6c95e5-e2c0-4cb3-a647-930b4c4d2927-service-ca-bundle\") pod \"authentication-operator-69f744f599-286h7\" (UID: \"ab6c95e5-e2c0-4cb3-a647-930b4c4d2927\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-286h7" Mar 11 00:58:05 crc kubenswrapper[4744]: I0311 00:58:05.234997 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ba3d981c-13e1-49dc-80db-4081ca811778-webhook-cert\") pod \"packageserver-d55dfcdfc-kgl9z\" (UID: \"ba3d981c-13e1-49dc-80db-4081ca811778\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-kgl9z" Mar 11 00:58:05 crc kubenswrapper[4744]: I0311 00:58:05.235051 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/84d53914-9806-4bc3-80ed-19cd8ff6e625-signing-key\") pod \"service-ca-9c57cc56f-kx277\" (UID: \"84d53914-9806-4bc3-80ed-19cd8ff6e625\") " pod="openshift-service-ca/service-ca-9c57cc56f-kx277" Mar 11 00:58:05 crc kubenswrapper[4744]: I0311 00:58:05.235108 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/7d1c92dd-43a7-4311-90b1-54441f84787e-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-lws9c\" (UID: \"7d1c92dd-43a7-4311-90b1-54441f84787e\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-lws9c" Mar 11 00:58:05 crc 
kubenswrapper[4744]: I0311 00:58:05.235157 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/5f720e42-33ed-4144-88d6-5fb8c4befac2-encryption-config\") pod \"apiserver-7bbb656c7d-h49lh\" (UID: \"5f720e42-33ed-4144-88d6-5fb8c4befac2\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-h49lh" Mar 11 00:58:05 crc kubenswrapper[4744]: I0311 00:58:05.235180 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-brvwt\" (UniqueName: \"kubernetes.io/projected/66fb3c39-3d00-453f-a282-a04584652a8b-kube-api-access-brvwt\") pod \"multus-admission-controller-857f4d67dd-85z54\" (UID: \"66fb3c39-3d00-453f-a282-a04584652a8b\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-85z54" Mar 11 00:58:05 crc kubenswrapper[4744]: I0311 00:58:05.235209 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0359d6fc-1139-4dbc-a50f-55fa91607935-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-chlpc\" (UID: \"0359d6fc-1139-4dbc-a50f-55fa91607935\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-chlpc" Mar 11 00:58:05 crc kubenswrapper[4744]: I0311 00:58:05.235232 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dchkc\" (UniqueName: \"kubernetes.io/projected/7c454621-190e-4962-abed-72c0ec0613de-kube-api-access-dchkc\") pod \"auto-csr-approver-29553176-m7tmh\" (UID: \"7c454621-190e-4962-abed-72c0ec0613de\") " pod="openshift-infra/auto-csr-approver-29553176-m7tmh" Mar 11 00:58:05 crc kubenswrapper[4744]: I0311 00:58:05.235262 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: 
\"kubernetes.io/configmap/8495f637-031c-4280-be13-d5aae9c99eca-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-9gtcq\" (UID: \"8495f637-031c-4280-be13-d5aae9c99eca\") " pod="openshift-controller-manager/controller-manager-879f6c89f-9gtcq" Mar 11 00:58:05 crc kubenswrapper[4744]: I0311 00:58:05.235283 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tlvcs\" (UniqueName: \"kubernetes.io/projected/940c3a71-e220-417d-8ac4-cc70a4a5afae-kube-api-access-tlvcs\") pod \"package-server-manager-789f6589d5-56d4r\" (UID: \"940c3a71-e220-417d-8ac4-cc70a4a5afae\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-56d4r" Mar 11 00:58:05 crc kubenswrapper[4744]: I0311 00:58:05.235440 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/940c3a71-e220-417d-8ac4-cc70a4a5afae-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-56d4r\" (UID: \"940c3a71-e220-417d-8ac4-cc70a4a5afae\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-56d4r" Mar 11 00:58:05 crc kubenswrapper[4744]: I0311 00:58:05.235531 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c284cbf1-b5e2-4f77-b14c-c0030f140a91-serving-cert\") pod \"console-operator-58897d9998-ptbbd\" (UID: \"c284cbf1-b5e2-4f77-b14c-c0030f140a91\") " pod="openshift-console-operator/console-operator-58897d9998-ptbbd" Mar 11 00:58:05 crc kubenswrapper[4744]: I0311 00:58:05.235694 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xp799\" (UniqueName: \"kubernetes.io/projected/48b9bd28-7713-4475-92f0-b7b741e6337e-kube-api-access-xp799\") pod \"olm-operator-6b444d44fb-slpdp\" (UID: \"48b9bd28-7713-4475-92f0-b7b741e6337e\") " 
pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-slpdp" Mar 11 00:58:05 crc kubenswrapper[4744]: I0311 00:58:05.235760 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/c2a93df7-e24f-4681-8336-ef07295f1d09-available-featuregates\") pod \"openshift-config-operator-7777fb866f-hf5rf\" (UID: \"c2a93df7-e24f-4681-8336-ef07295f1d09\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-hf5rf" Mar 11 00:58:05 crc kubenswrapper[4744]: I0311 00:58:05.235799 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/48b9bd28-7713-4475-92f0-b7b741e6337e-profile-collector-cert\") pod \"olm-operator-6b444d44fb-slpdp\" (UID: \"48b9bd28-7713-4475-92f0-b7b741e6337e\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-slpdp" Mar 11 00:58:05 crc kubenswrapper[4744]: I0311 00:58:05.235847 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/a8e11ddd-81e3-40f9-8ada-12abfacedca9-images\") pod \"machine-config-operator-74547568cd-v4b66\" (UID: \"a8e11ddd-81e3-40f9-8ada-12abfacedca9\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-v4b66" Mar 11 00:58:05 crc kubenswrapper[4744]: I0311 00:58:05.235904 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-px76q\" (UniqueName: \"kubernetes.io/projected/eeb963e8-d683-4564-9ccd-d26a6a755e94-kube-api-access-px76q\") pod \"machine-config-controller-84d6567774-94dsr\" (UID: \"eeb963e8-d683-4564-9ccd-d26a6a755e94\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-94dsr" Mar 11 00:58:05 crc kubenswrapper[4744]: I0311 00:58:05.235903 4744 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ab6c95e5-e2c0-4cb3-a647-930b4c4d2927-service-ca-bundle\") pod \"authentication-operator-69f744f599-286h7\" (UID: \"ab6c95e5-e2c0-4cb3-a647-930b4c4d2927\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-286h7" Mar 11 00:58:05 crc kubenswrapper[4744]: I0311 00:58:05.235969 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/a8e11ddd-81e3-40f9-8ada-12abfacedca9-auth-proxy-config\") pod \"machine-config-operator-74547568cd-v4b66\" (UID: \"a8e11ddd-81e3-40f9-8ada-12abfacedca9\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-v4b66" Mar 11 00:58:05 crc kubenswrapper[4744]: I0311 00:58:05.236083 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/5f720e42-33ed-4144-88d6-5fb8c4befac2-audit-dir\") pod \"apiserver-7bbb656c7d-h49lh\" (UID: \"5f720e42-33ed-4144-88d6-5fb8c4befac2\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-h49lh" Mar 11 00:58:05 crc kubenswrapper[4744]: I0311 00:58:05.236160 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/61dac64e-6219-46fe-80b0-420098bb260b-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-l2fh5\" (UID: \"61dac64e-6219-46fe-80b0-420098bb260b\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-l2fh5" Mar 11 00:58:05 crc kubenswrapper[4744]: I0311 00:58:05.236212 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/de688bff-78ce-4d0f-ad7e-548ca640887a-etcd-service-ca\") pod \"etcd-operator-b45778765-wm2rt\" (UID: 
\"de688bff-78ce-4d0f-ad7e-548ca640887a\") " pod="openshift-etcd-operator/etcd-operator-b45778765-wm2rt" Mar 11 00:58:05 crc kubenswrapper[4744]: I0311 00:58:05.236224 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8495f637-031c-4280-be13-d5aae9c99eca-serving-cert\") pod \"controller-manager-879f6c89f-9gtcq\" (UID: \"8495f637-031c-4280-be13-d5aae9c99eca\") " pod="openshift-controller-manager/controller-manager-879f6c89f-9gtcq" Mar 11 00:58:05 crc kubenswrapper[4744]: I0311 00:58:05.236389 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0359d6fc-1139-4dbc-a50f-55fa91607935-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-chlpc\" (UID: \"0359d6fc-1139-4dbc-a50f-55fa91607935\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-chlpc" Mar 11 00:58:05 crc kubenswrapper[4744]: I0311 00:58:05.236445 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lj7nl\" (UniqueName: \"kubernetes.io/projected/7d1c92dd-43a7-4311-90b1-54441f84787e-kube-api-access-lj7nl\") pod \"machine-api-operator-5694c8668f-lws9c\" (UID: \"7d1c92dd-43a7-4311-90b1-54441f84787e\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-lws9c" Mar 11 00:58:05 crc kubenswrapper[4744]: I0311 00:58:05.236498 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8495f637-031c-4280-be13-d5aae9c99eca-config\") pod \"controller-manager-879f6c89f-9gtcq\" (UID: \"8495f637-031c-4280-be13-d5aae9c99eca\") " pod="openshift-controller-manager/controller-manager-879f6c89f-9gtcq" Mar 11 00:58:05 crc kubenswrapper[4744]: I0311 00:58:05.236551 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d5d8q\" (UniqueName: 
\"kubernetes.io/projected/309dded2-bab3-4166-8342-11d3ded619dc-kube-api-access-d5d8q\") pod \"service-ca-operator-777779d784-n8wns\" (UID: \"309dded2-bab3-4166-8342-11d3ded619dc\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-n8wns" Mar 11 00:58:05 crc kubenswrapper[4744]: I0311 00:58:05.236575 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/5f720e42-33ed-4144-88d6-5fb8c4befac2-etcd-client\") pod \"apiserver-7bbb656c7d-h49lh\" (UID: \"5f720e42-33ed-4144-88d6-5fb8c4befac2\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-h49lh" Mar 11 00:58:05 crc kubenswrapper[4744]: I0311 00:58:05.236600 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/8495f637-031c-4280-be13-d5aae9c99eca-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-9gtcq\" (UID: \"8495f637-031c-4280-be13-d5aae9c99eca\") " pod="openshift-controller-manager/controller-manager-879f6c89f-9gtcq" Mar 11 00:58:05 crc kubenswrapper[4744]: I0311 00:58:05.236624 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kgksc\" (UniqueName: \"kubernetes.io/projected/bef5e0c2-5b78-4f32-affd-aec245c27db1-kube-api-access-kgksc\") pod \"catalog-operator-68c6474976-fhbhq\" (UID: \"bef5e0c2-5b78-4f32-affd-aec245c27db1\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-fhbhq" Mar 11 00:58:05 crc kubenswrapper[4744]: I0311 00:58:05.236666 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8whhv\" (UniqueName: \"kubernetes.io/projected/61dac64e-6219-46fe-80b0-420098bb260b-kube-api-access-8whhv\") pod \"openshift-apiserver-operator-796bbdcf4f-l2fh5\" (UID: \"61dac64e-6219-46fe-80b0-420098bb260b\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-l2fh5" 
Mar 11 00:58:05 crc kubenswrapper[4744]: I0311 00:58:05.236707 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5f720e42-33ed-4144-88d6-5fb8c4befac2-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-h49lh\" (UID: \"5f720e42-33ed-4144-88d6-5fb8c4befac2\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-h49lh" Mar 11 00:58:05 crc kubenswrapper[4744]: I0311 00:58:05.236734 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7d1c92dd-43a7-4311-90b1-54441f84787e-config\") pod \"machine-api-operator-5694c8668f-lws9c\" (UID: \"7d1c92dd-43a7-4311-90b1-54441f84787e\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-lws9c" Mar 11 00:58:05 crc kubenswrapper[4744]: I0311 00:58:05.236756 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/c284cbf1-b5e2-4f77-b14c-c0030f140a91-trusted-ca\") pod \"console-operator-58897d9998-ptbbd\" (UID: \"c284cbf1-b5e2-4f77-b14c-c0030f140a91\") " pod="openshift-console-operator/console-operator-58897d9998-ptbbd" Mar 11 00:58:05 crc kubenswrapper[4744]: I0311 00:58:05.236888 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/5f720e42-33ed-4144-88d6-5fb8c4befac2-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-h49lh\" (UID: \"5f720e42-33ed-4144-88d6-5fb8c4befac2\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-h49lh" Mar 11 00:58:05 crc kubenswrapper[4744]: I0311 00:58:05.236906 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5f720e42-33ed-4144-88d6-5fb8c4befac2-serving-cert\") pod \"apiserver-7bbb656c7d-h49lh\" (UID: 
\"5f720e42-33ed-4144-88d6-5fb8c4befac2\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-h49lh" Mar 11 00:58:05 crc kubenswrapper[4744]: I0311 00:58:05.237026 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/eeb963e8-d683-4564-9ccd-d26a6a755e94-proxy-tls\") pod \"machine-config-controller-84d6567774-94dsr\" (UID: \"eeb963e8-d683-4564-9ccd-d26a6a755e94\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-94dsr" Mar 11 00:58:05 crc kubenswrapper[4744]: I0311 00:58:05.237064 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/ba3d981c-13e1-49dc-80db-4081ca811778-tmpfs\") pod \"packageserver-d55dfcdfc-kgl9z\" (UID: \"ba3d981c-13e1-49dc-80db-4081ca811778\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-kgl9z" Mar 11 00:58:05 crc kubenswrapper[4744]: I0311 00:58:05.237183 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/de688bff-78ce-4d0f-ad7e-548ca640887a-serving-cert\") pod \"etcd-operator-b45778765-wm2rt\" (UID: \"de688bff-78ce-4d0f-ad7e-548ca640887a\") " pod="openshift-etcd-operator/etcd-operator-b45778765-wm2rt" Mar 11 00:58:05 crc kubenswrapper[4744]: I0311 00:58:05.237211 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ab6c95e5-e2c0-4cb3-a647-930b4c4d2927-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-286h7\" (UID: \"ab6c95e5-e2c0-4cb3-a647-930b4c4d2927\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-286h7" Mar 11 00:58:05 crc kubenswrapper[4744]: I0311 00:58:05.237230 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/61dac64e-6219-46fe-80b0-420098bb260b-config\") pod \"openshift-apiserver-operator-796bbdcf4f-l2fh5\" (UID: \"61dac64e-6219-46fe-80b0-420098bb260b\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-l2fh5" Mar 11 00:58:05 crc kubenswrapper[4744]: I0311 00:58:05.237265 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9b35ce65-26df-4169-a69a-a06c2420da9b-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-gnfcr\" (UID: \"9b35ce65-26df-4169-a69a-a06c2420da9b\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-gnfcr" Mar 11 00:58:05 crc kubenswrapper[4744]: I0311 00:58:05.237290 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/7fb8ca9a-b3e7-4fce-b173-f8b2519962da-metrics-tls\") pod \"ingress-operator-5b745b69d9-t54km\" (UID: \"7fb8ca9a-b3e7-4fce-b173-f8b2519962da\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-t54km" Mar 11 00:58:05 crc kubenswrapper[4744]: I0311 00:58:05.237316 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/84d53914-9806-4bc3-80ed-19cd8ff6e625-signing-cabundle\") pod \"service-ca-9c57cc56f-kx277\" (UID: \"84d53914-9806-4bc3-80ed-19cd8ff6e625\") " pod="openshift-service-ca/service-ca-9c57cc56f-kx277" Mar 11 00:58:05 crc kubenswrapper[4744]: I0311 00:58:05.237881 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rk88x\" (UniqueName: \"kubernetes.io/projected/ab6c95e5-e2c0-4cb3-a647-930b4c4d2927-kube-api-access-rk88x\") pod \"authentication-operator-69f744f599-286h7\" (UID: \"ab6c95e5-e2c0-4cb3-a647-930b4c4d2927\") " 
pod="openshift-authentication-operator/authentication-operator-69f744f599-286h7" Mar 11 00:58:05 crc kubenswrapper[4744]: I0311 00:58:05.237918 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ab6c95e5-e2c0-4cb3-a647-930b4c4d2927-config\") pod \"authentication-operator-69f744f599-286h7\" (UID: \"ab6c95e5-e2c0-4cb3-a647-930b4c4d2927\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-286h7" Mar 11 00:58:05 crc kubenswrapper[4744]: I0311 00:58:05.237957 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0359d6fc-1139-4dbc-a50f-55fa91607935-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-chlpc\" (UID: \"0359d6fc-1139-4dbc-a50f-55fa91607935\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-chlpc" Mar 11 00:58:05 crc kubenswrapper[4744]: I0311 00:58:05.237982 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6dxjw\" (UniqueName: \"kubernetes.io/projected/5f720e42-33ed-4144-88d6-5fb8c4befac2-kube-api-access-6dxjw\") pod \"apiserver-7bbb656c7d-h49lh\" (UID: \"5f720e42-33ed-4144-88d6-5fb8c4befac2\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-h49lh" Mar 11 00:58:05 crc kubenswrapper[4744]: I0311 00:58:05.238002 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/bef5e0c2-5b78-4f32-affd-aec245c27db1-srv-cert\") pod \"catalog-operator-68c6474976-fhbhq\" (UID: \"bef5e0c2-5b78-4f32-affd-aec245c27db1\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-fhbhq" Mar 11 00:58:05 crc kubenswrapper[4744]: I0311 00:58:05.238023 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" 
(UniqueName: \"kubernetes.io/configmap/309dded2-bab3-4166-8342-11d3ded619dc-config\") pod \"service-ca-operator-777779d784-n8wns\" (UID: \"309dded2-bab3-4166-8342-11d3ded619dc\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-n8wns" Mar 11 00:58:05 crc kubenswrapper[4744]: I0311 00:58:05.238049 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/309dded2-bab3-4166-8342-11d3ded619dc-serving-cert\") pod \"service-ca-operator-777779d784-n8wns\" (UID: \"309dded2-bab3-4166-8342-11d3ded619dc\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-n8wns" Mar 11 00:58:05 crc kubenswrapper[4744]: I0311 00:58:05.238072 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/de688bff-78ce-4d0f-ad7e-548ca640887a-config\") pod \"etcd-operator-b45778765-wm2rt\" (UID: \"de688bff-78ce-4d0f-ad7e-548ca640887a\") " pod="openshift-etcd-operator/etcd-operator-b45778765-wm2rt" Mar 11 00:58:05 crc kubenswrapper[4744]: I0311 00:58:05.237982 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/4d6c2f5e-32a4-484f-b6fa-a240cf8bb90d-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-74tmz\" (UID: \"4d6c2f5e-32a4-484f-b6fa-a240cf8bb90d\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-74tmz" Mar 11 00:58:05 crc kubenswrapper[4744]: I0311 00:58:05.238095 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/9b35ce65-26df-4169-a69a-a06c2420da9b-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-gnfcr\" (UID: \"9b35ce65-26df-4169-a69a-a06c2420da9b\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-gnfcr" Mar 
11 00:58:05 crc kubenswrapper[4744]: I0311 00:58:05.238115 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/a8e11ddd-81e3-40f9-8ada-12abfacedca9-proxy-tls\") pod \"machine-config-operator-74547568cd-v4b66\" (UID: \"a8e11ddd-81e3-40f9-8ada-12abfacedca9\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-v4b66" Mar 11 00:58:05 crc kubenswrapper[4744]: I0311 00:58:05.238137 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t2wdr\" (UniqueName: \"kubernetes.io/projected/fa83e422-8374-4da6-a356-ae7feadfe282-kube-api-access-t2wdr\") pod \"route-controller-manager-6576b87f9c-tqzmz\" (UID: \"fa83e422-8374-4da6-a356-ae7feadfe282\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-tqzmz" Mar 11 00:58:05 crc kubenswrapper[4744]: I0311 00:58:05.238042 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0359d6fc-1139-4dbc-a50f-55fa91607935-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-chlpc\" (UID: \"0359d6fc-1139-4dbc-a50f-55fa91607935\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-chlpc" Mar 11 00:58:05 crc kubenswrapper[4744]: I0311 00:58:05.238158 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-npt42\" (UniqueName: \"kubernetes.io/projected/ba3d981c-13e1-49dc-80db-4081ca811778-kube-api-access-npt42\") pod \"packageserver-d55dfcdfc-kgl9z\" (UID: \"ba3d981c-13e1-49dc-80db-4081ca811778\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-kgl9z" Mar 11 00:58:05 crc kubenswrapper[4744]: I0311 00:58:05.238112 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/c284cbf1-b5e2-4f77-b14c-c0030f140a91-config\") pod \"console-operator-58897d9998-ptbbd\" (UID: \"c284cbf1-b5e2-4f77-b14c-c0030f140a91\") " pod="openshift-console-operator/console-operator-58897d9998-ptbbd" Mar 11 00:58:05 crc kubenswrapper[4744]: I0311 00:58:05.238353 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/fa83e422-8374-4da6-a356-ae7feadfe282-client-ca\") pod \"route-controller-manager-6576b87f9c-tqzmz\" (UID: \"fa83e422-8374-4da6-a356-ae7feadfe282\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-tqzmz" Mar 11 00:58:05 crc kubenswrapper[4744]: I0311 00:58:05.238418 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/ba3d981c-13e1-49dc-80db-4081ca811778-apiservice-cert\") pod \"packageserver-d55dfcdfc-kgl9z\" (UID: \"ba3d981c-13e1-49dc-80db-4081ca811778\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-kgl9z" Mar 11 00:58:05 crc kubenswrapper[4744]: I0311 00:58:05.238570 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pzv6s\" (UniqueName: \"kubernetes.io/projected/6fafcf3b-e706-47a8-9501-beb09a76d0bf-kube-api-access-pzv6s\") pod \"migrator-59844c95c7-wcmgb\" (UID: \"6fafcf3b-e706-47a8-9501-beb09a76d0bf\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-wcmgb" Mar 11 00:58:05 crc kubenswrapper[4744]: I0311 00:58:05.238631 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/7d1c92dd-43a7-4311-90b1-54441f84787e-images\") pod \"machine-api-operator-5694c8668f-lws9c\" (UID: \"7d1c92dd-43a7-4311-90b1-54441f84787e\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-lws9c" Mar 11 00:58:05 crc kubenswrapper[4744]: I0311 
00:58:05.238689 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ab6c95e5-e2c0-4cb3-a647-930b4c4d2927-serving-cert\") pod \"authentication-operator-69f744f599-286h7\" (UID: \"ab6c95e5-e2c0-4cb3-a647-930b4c4d2927\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-286h7" Mar 11 00:58:05 crc kubenswrapper[4744]: I0311 00:58:05.238760 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fa83e422-8374-4da6-a356-ae7feadfe282-config\") pod \"route-controller-manager-6576b87f9c-tqzmz\" (UID: \"fa83e422-8374-4da6-a356-ae7feadfe282\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-tqzmz" Mar 11 00:58:05 crc kubenswrapper[4744]: I0311 00:58:05.238816 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qg5qq\" (UniqueName: \"kubernetes.io/projected/c284cbf1-b5e2-4f77-b14c-c0030f140a91-kube-api-access-qg5qq\") pod \"console-operator-58897d9998-ptbbd\" (UID: \"c284cbf1-b5e2-4f77-b14c-c0030f140a91\") " pod="openshift-console-operator/console-operator-58897d9998-ptbbd" Mar 11 00:58:05 crc kubenswrapper[4744]: I0311 00:58:05.238856 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/7fb8ca9a-b3e7-4fce-b173-f8b2519962da-bound-sa-token\") pod \"ingress-operator-5b745b69d9-t54km\" (UID: \"7fb8ca9a-b3e7-4fce-b173-f8b2519962da\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-t54km" Mar 11 00:58:05 crc kubenswrapper[4744]: I0311 00:58:05.238891 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/8495f637-031c-4280-be13-d5aae9c99eca-client-ca\") pod \"controller-manager-879f6c89f-9gtcq\" (UID: 
\"8495f637-031c-4280-be13-d5aae9c99eca\") " pod="openshift-controller-manager/controller-manager-879f6c89f-9gtcq" Mar 11 00:58:05 crc kubenswrapper[4744]: I0311 00:58:05.239295 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ab6c95e5-e2c0-4cb3-a647-930b4c4d2927-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-286h7\" (UID: \"ab6c95e5-e2c0-4cb3-a647-930b4c4d2927\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-286h7" Mar 11 00:58:05 crc kubenswrapper[4744]: I0311 00:58:05.239371 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/c284cbf1-b5e2-4f77-b14c-c0030f140a91-trusted-ca\") pod \"console-operator-58897d9998-ptbbd\" (UID: \"c284cbf1-b5e2-4f77-b14c-c0030f140a91\") " pod="openshift-console-operator/console-operator-58897d9998-ptbbd" Mar 11 00:58:05 crc kubenswrapper[4744]: I0311 00:58:05.239488 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8495f637-031c-4280-be13-d5aae9c99eca-config\") pod \"controller-manager-879f6c89f-9gtcq\" (UID: \"8495f637-031c-4280-be13-d5aae9c99eca\") " pod="openshift-controller-manager/controller-manager-879f6c89f-9gtcq" Mar 11 00:58:05 crc kubenswrapper[4744]: I0311 00:58:05.239677 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/de688bff-78ce-4d0f-ad7e-548ca640887a-config\") pod \"etcd-operator-b45778765-wm2rt\" (UID: \"de688bff-78ce-4d0f-ad7e-548ca640887a\") " pod="openshift-etcd-operator/etcd-operator-b45778765-wm2rt" Mar 11 00:58:05 crc kubenswrapper[4744]: I0311 00:58:05.239811 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/61dac64e-6219-46fe-80b0-420098bb260b-config\") pod 
\"openshift-apiserver-operator-796bbdcf4f-l2fh5\" (UID: \"61dac64e-6219-46fe-80b0-420098bb260b\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-l2fh5" Mar 11 00:58:05 crc kubenswrapper[4744]: I0311 00:58:05.240118 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7d1c92dd-43a7-4311-90b1-54441f84787e-config\") pod \"machine-api-operator-5694c8668f-lws9c\" (UID: \"7d1c92dd-43a7-4311-90b1-54441f84787e\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-lws9c" Mar 11 00:58:05 crc kubenswrapper[4744]: I0311 00:58:05.240344 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/de688bff-78ce-4d0f-ad7e-548ca640887a-etcd-client\") pod \"etcd-operator-b45778765-wm2rt\" (UID: \"de688bff-78ce-4d0f-ad7e-548ca640887a\") " pod="openshift-etcd-operator/etcd-operator-b45778765-wm2rt" Mar 11 00:58:05 crc kubenswrapper[4744]: I0311 00:58:05.240463 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8495f637-031c-4280-be13-d5aae9c99eca-serving-cert\") pod \"controller-manager-879f6c89f-9gtcq\" (UID: \"8495f637-031c-4280-be13-d5aae9c99eca\") " pod="openshift-controller-manager/controller-manager-879f6c89f-9gtcq" Mar 11 00:58:05 crc kubenswrapper[4744]: I0311 00:58:05.240715 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/61dac64e-6219-46fe-80b0-420098bb260b-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-l2fh5\" (UID: \"61dac64e-6219-46fe-80b0-420098bb260b\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-l2fh5" Mar 11 00:58:05 crc kubenswrapper[4744]: I0311 00:58:05.240730 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: 
\"kubernetes.io/configmap/fa83e422-8374-4da6-a356-ae7feadfe282-client-ca\") pod \"route-controller-manager-6576b87f9c-tqzmz\" (UID: \"fa83e422-8374-4da6-a356-ae7feadfe282\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-tqzmz" Mar 11 00:58:05 crc kubenswrapper[4744]: I0311 00:58:05.240827 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: \"kubernetes.io/configmap/7d1c92dd-43a7-4311-90b1-54441f84787e-images\") pod \"machine-api-operator-5694c8668f-lws9c\" (UID: \"7d1c92dd-43a7-4311-90b1-54441f84787e\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-lws9c" Mar 11 00:58:05 crc kubenswrapper[4744]: I0311 00:58:05.240961 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/fa83e422-8374-4da6-a356-ae7feadfe282-serving-cert\") pod \"route-controller-manager-6576b87f9c-tqzmz\" (UID: \"fa83e422-8374-4da6-a356-ae7feadfe282\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-tqzmz" Mar 11 00:58:05 crc kubenswrapper[4744]: I0311 00:58:05.241255 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0359d6fc-1139-4dbc-a50f-55fa91607935-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-chlpc\" (UID: \"0359d6fc-1139-4dbc-a50f-55fa91607935\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-chlpc" Mar 11 00:58:05 crc kubenswrapper[4744]: I0311 00:58:05.241478 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c284cbf1-b5e2-4f77-b14c-c0030f140a91-serving-cert\") pod \"console-operator-58897d9998-ptbbd\" (UID: \"c284cbf1-b5e2-4f77-b14c-c0030f140a91\") " pod="openshift-console-operator/console-operator-58897d9998-ptbbd" Mar 11 00:58:05 crc kubenswrapper[4744]: I0311 00:58:05.241715 4744 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/7d1c92dd-43a7-4311-90b1-54441f84787e-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-lws9c\" (UID: \"7d1c92dd-43a7-4311-90b1-54441f84787e\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-lws9c" Mar 11 00:58:05 crc kubenswrapper[4744]: I0311 00:58:05.241949 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fa83e422-8374-4da6-a356-ae7feadfe282-config\") pod \"route-controller-manager-6576b87f9c-tqzmz\" (UID: \"fa83e422-8374-4da6-a356-ae7feadfe282\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-tqzmz" Mar 11 00:58:05 crc kubenswrapper[4744]: I0311 00:58:05.242928 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ab6c95e5-e2c0-4cb3-a647-930b4c4d2927-config\") pod \"authentication-operator-69f744f599-286h7\" (UID: \"ab6c95e5-e2c0-4cb3-a647-930b4c4d2927\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-286h7" Mar 11 00:58:05 crc kubenswrapper[4744]: I0311 00:58:05.242982 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/8495f637-031c-4280-be13-d5aae9c99eca-client-ca\") pod \"controller-manager-879f6c89f-9gtcq\" (UID: \"8495f637-031c-4280-be13-d5aae9c99eca\") " pod="openshift-controller-manager/controller-manager-879f6c89f-9gtcq" Mar 11 00:58:05 crc kubenswrapper[4744]: I0311 00:58:05.243527 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ab6c95e5-e2c0-4cb3-a647-930b4c4d2927-serving-cert\") pod \"authentication-operator-69f744f599-286h7\" (UID: \"ab6c95e5-e2c0-4cb3-a647-930b4c4d2927\") " 
pod="openshift-authentication-operator/authentication-operator-69f744f599-286h7" Mar 11 00:58:05 crc kubenswrapper[4744]: I0311 00:58:05.243752 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/de688bff-78ce-4d0f-ad7e-548ca640887a-serving-cert\") pod \"etcd-operator-b45778765-wm2rt\" (UID: \"de688bff-78ce-4d0f-ad7e-548ca640887a\") " pod="openshift-etcd-operator/etcd-operator-b45778765-wm2rt" Mar 11 00:58:05 crc kubenswrapper[4744]: I0311 00:58:05.257690 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"kube-root-ca.crt" Mar 11 00:58:05 crc kubenswrapper[4744]: I0311 00:58:05.270592 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"openshift-service-ca.crt" Mar 11 00:58:05 crc kubenswrapper[4744]: I0311 00:58:05.290031 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"audit-1" Mar 11 00:58:05 crc kubenswrapper[4744]: I0311 00:58:05.310344 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"etcd-serving-ca" Mar 11 00:58:05 crc kubenswrapper[4744]: I0311 00:58:05.331109 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"openshift-service-ca.crt" Mar 11 00:58:05 crc kubenswrapper[4744]: I0311 00:58:05.340125 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/5f720e42-33ed-4144-88d6-5fb8c4befac2-encryption-config\") pod \"apiserver-7bbb656c7d-h49lh\" (UID: \"5f720e42-33ed-4144-88d6-5fb8c4befac2\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-h49lh" Mar 11 00:58:05 crc kubenswrapper[4744]: I0311 00:58:05.340162 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-brvwt\" (UniqueName: 
\"kubernetes.io/projected/66fb3c39-3d00-453f-a282-a04584652a8b-kube-api-access-brvwt\") pod \"multus-admission-controller-857f4d67dd-85z54\" (UID: \"66fb3c39-3d00-453f-a282-a04584652a8b\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-85z54" Mar 11 00:58:05 crc kubenswrapper[4744]: I0311 00:58:05.340197 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dchkc\" (UniqueName: \"kubernetes.io/projected/7c454621-190e-4962-abed-72c0ec0613de-kube-api-access-dchkc\") pod \"auto-csr-approver-29553176-m7tmh\" (UID: \"7c454621-190e-4962-abed-72c0ec0613de\") " pod="openshift-infra/auto-csr-approver-29553176-m7tmh" Mar 11 00:58:05 crc kubenswrapper[4744]: I0311 00:58:05.340226 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tlvcs\" (UniqueName: \"kubernetes.io/projected/940c3a71-e220-417d-8ac4-cc70a4a5afae-kube-api-access-tlvcs\") pod \"package-server-manager-789f6589d5-56d4r\" (UID: \"940c3a71-e220-417d-8ac4-cc70a4a5afae\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-56d4r" Mar 11 00:58:05 crc kubenswrapper[4744]: I0311 00:58:05.340243 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/940c3a71-e220-417d-8ac4-cc70a4a5afae-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-56d4r\" (UID: \"940c3a71-e220-417d-8ac4-cc70a4a5afae\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-56d4r" Mar 11 00:58:05 crc kubenswrapper[4744]: I0311 00:58:05.340265 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xp799\" (UniqueName: \"kubernetes.io/projected/48b9bd28-7713-4475-92f0-b7b741e6337e-kube-api-access-xp799\") pod \"olm-operator-6b444d44fb-slpdp\" (UID: \"48b9bd28-7713-4475-92f0-b7b741e6337e\") " 
pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-slpdp" Mar 11 00:58:05 crc kubenswrapper[4744]: I0311 00:58:05.340286 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/c2a93df7-e24f-4681-8336-ef07295f1d09-available-featuregates\") pod \"openshift-config-operator-7777fb866f-hf5rf\" (UID: \"c2a93df7-e24f-4681-8336-ef07295f1d09\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-hf5rf" Mar 11 00:58:05 crc kubenswrapper[4744]: I0311 00:58:05.340306 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/48b9bd28-7713-4475-92f0-b7b741e6337e-profile-collector-cert\") pod \"olm-operator-6b444d44fb-slpdp\" (UID: \"48b9bd28-7713-4475-92f0-b7b741e6337e\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-slpdp" Mar 11 00:58:05 crc kubenswrapper[4744]: I0311 00:58:05.340325 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/a8e11ddd-81e3-40f9-8ada-12abfacedca9-images\") pod \"machine-config-operator-74547568cd-v4b66\" (UID: \"a8e11ddd-81e3-40f9-8ada-12abfacedca9\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-v4b66" Mar 11 00:58:05 crc kubenswrapper[4744]: I0311 00:58:05.340345 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-px76q\" (UniqueName: \"kubernetes.io/projected/eeb963e8-d683-4564-9ccd-d26a6a755e94-kube-api-access-px76q\") pod \"machine-config-controller-84d6567774-94dsr\" (UID: \"eeb963e8-d683-4564-9ccd-d26a6a755e94\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-94dsr" Mar 11 00:58:05 crc kubenswrapper[4744]: I0311 00:58:05.340363 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" 
(UniqueName: \"kubernetes.io/configmap/a8e11ddd-81e3-40f9-8ada-12abfacedca9-auth-proxy-config\") pod \"machine-config-operator-74547568cd-v4b66\" (UID: \"a8e11ddd-81e3-40f9-8ada-12abfacedca9\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-v4b66" Mar 11 00:58:05 crc kubenswrapper[4744]: I0311 00:58:05.340380 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/5f720e42-33ed-4144-88d6-5fb8c4befac2-audit-dir\") pod \"apiserver-7bbb656c7d-h49lh\" (UID: \"5f720e42-33ed-4144-88d6-5fb8c4befac2\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-h49lh" Mar 11 00:58:05 crc kubenswrapper[4744]: I0311 00:58:05.344088 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/c2a93df7-e24f-4681-8336-ef07295f1d09-available-featuregates\") pod \"openshift-config-operator-7777fb866f-hf5rf\" (UID: \"c2a93df7-e24f-4681-8336-ef07295f1d09\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-hf5rf" Mar 11 00:58:05 crc kubenswrapper[4744]: I0311 00:58:05.344294 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/5f720e42-33ed-4144-88d6-5fb8c4befac2-audit-dir\") pod \"apiserver-7bbb656c7d-h49lh\" (UID: \"5f720e42-33ed-4144-88d6-5fb8c4befac2\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-h49lh" Mar 11 00:58:05 crc kubenswrapper[4744]: I0311 00:58:05.344285 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/5f720e42-33ed-4144-88d6-5fb8c4befac2-etcd-client\") pod \"apiserver-7bbb656c7d-h49lh\" (UID: \"5f720e42-33ed-4144-88d6-5fb8c4befac2\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-h49lh" Mar 11 00:58:05 crc kubenswrapper[4744]: I0311 00:58:05.344362 4744 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-d5d8q\" (UniqueName: \"kubernetes.io/projected/309dded2-bab3-4166-8342-11d3ded619dc-kube-api-access-d5d8q\") pod \"service-ca-operator-777779d784-n8wns\" (UID: \"309dded2-bab3-4166-8342-11d3ded619dc\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-n8wns" Mar 11 00:58:05 crc kubenswrapper[4744]: I0311 00:58:05.344417 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kgksc\" (UniqueName: \"kubernetes.io/projected/bef5e0c2-5b78-4f32-affd-aec245c27db1-kube-api-access-kgksc\") pod \"catalog-operator-68c6474976-fhbhq\" (UID: \"bef5e0c2-5b78-4f32-affd-aec245c27db1\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-fhbhq" Mar 11 00:58:05 crc kubenswrapper[4744]: I0311 00:58:05.344451 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5f720e42-33ed-4144-88d6-5fb8c4befac2-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-h49lh\" (UID: \"5f720e42-33ed-4144-88d6-5fb8c4befac2\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-h49lh" Mar 11 00:58:05 crc kubenswrapper[4744]: I0311 00:58:05.344485 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/5f720e42-33ed-4144-88d6-5fb8c4befac2-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-h49lh\" (UID: \"5f720e42-33ed-4144-88d6-5fb8c4befac2\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-h49lh" Mar 11 00:58:05 crc kubenswrapper[4744]: I0311 00:58:05.344532 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5f720e42-33ed-4144-88d6-5fb8c4befac2-serving-cert\") pod \"apiserver-7bbb656c7d-h49lh\" (UID: \"5f720e42-33ed-4144-88d6-5fb8c4befac2\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-h49lh" Mar 11 
00:58:05 crc kubenswrapper[4744]: I0311 00:58:05.344556 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/eeb963e8-d683-4564-9ccd-d26a6a755e94-proxy-tls\") pod \"machine-config-controller-84d6567774-94dsr\" (UID: \"eeb963e8-d683-4564-9ccd-d26a6a755e94\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-94dsr" Mar 11 00:58:05 crc kubenswrapper[4744]: I0311 00:58:05.344576 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/ba3d981c-13e1-49dc-80db-4081ca811778-tmpfs\") pod \"packageserver-d55dfcdfc-kgl9z\" (UID: \"ba3d981c-13e1-49dc-80db-4081ca811778\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-kgl9z" Mar 11 00:58:05 crc kubenswrapper[4744]: I0311 00:58:05.344615 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9b35ce65-26df-4169-a69a-a06c2420da9b-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-gnfcr\" (UID: \"9b35ce65-26df-4169-a69a-a06c2420da9b\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-gnfcr" Mar 11 00:58:05 crc kubenswrapper[4744]: I0311 00:58:05.344635 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/7fb8ca9a-b3e7-4fce-b173-f8b2519962da-metrics-tls\") pod \"ingress-operator-5b745b69d9-t54km\" (UID: \"7fb8ca9a-b3e7-4fce-b173-f8b2519962da\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-t54km" Mar 11 00:58:05 crc kubenswrapper[4744]: I0311 00:58:05.344658 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/84d53914-9806-4bc3-80ed-19cd8ff6e625-signing-cabundle\") pod \"service-ca-9c57cc56f-kx277\" (UID: 
\"84d53914-9806-4bc3-80ed-19cd8ff6e625\") " pod="openshift-service-ca/service-ca-9c57cc56f-kx277" Mar 11 00:58:05 crc kubenswrapper[4744]: I0311 00:58:05.344718 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6dxjw\" (UniqueName: \"kubernetes.io/projected/5f720e42-33ed-4144-88d6-5fb8c4befac2-kube-api-access-6dxjw\") pod \"apiserver-7bbb656c7d-h49lh\" (UID: \"5f720e42-33ed-4144-88d6-5fb8c4befac2\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-h49lh" Mar 11 00:58:05 crc kubenswrapper[4744]: I0311 00:58:05.344742 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/bef5e0c2-5b78-4f32-affd-aec245c27db1-srv-cert\") pod \"catalog-operator-68c6474976-fhbhq\" (UID: \"bef5e0c2-5b78-4f32-affd-aec245c27db1\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-fhbhq" Mar 11 00:58:05 crc kubenswrapper[4744]: I0311 00:58:05.344767 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/309dded2-bab3-4166-8342-11d3ded619dc-config\") pod \"service-ca-operator-777779d784-n8wns\" (UID: \"309dded2-bab3-4166-8342-11d3ded619dc\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-n8wns" Mar 11 00:58:05 crc kubenswrapper[4744]: I0311 00:58:05.344796 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/309dded2-bab3-4166-8342-11d3ded619dc-serving-cert\") pod \"service-ca-operator-777779d784-n8wns\" (UID: \"309dded2-bab3-4166-8342-11d3ded619dc\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-n8wns" Mar 11 00:58:05 crc kubenswrapper[4744]: I0311 00:58:05.344821 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: 
\"kubernetes.io/projected/9b35ce65-26df-4169-a69a-a06c2420da9b-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-gnfcr\" (UID: \"9b35ce65-26df-4169-a69a-a06c2420da9b\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-gnfcr" Mar 11 00:58:05 crc kubenswrapper[4744]: I0311 00:58:05.344845 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/a8e11ddd-81e3-40f9-8ada-12abfacedca9-proxy-tls\") pod \"machine-config-operator-74547568cd-v4b66\" (UID: \"a8e11ddd-81e3-40f9-8ada-12abfacedca9\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-v4b66" Mar 11 00:58:05 crc kubenswrapper[4744]: I0311 00:58:05.344875 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-npt42\" (UniqueName: \"kubernetes.io/projected/ba3d981c-13e1-49dc-80db-4081ca811778-kube-api-access-npt42\") pod \"packageserver-d55dfcdfc-kgl9z\" (UID: \"ba3d981c-13e1-49dc-80db-4081ca811778\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-kgl9z" Mar 11 00:58:05 crc kubenswrapper[4744]: I0311 00:58:05.344897 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/ba3d981c-13e1-49dc-80db-4081ca811778-apiservice-cert\") pod \"packageserver-d55dfcdfc-kgl9z\" (UID: \"ba3d981c-13e1-49dc-80db-4081ca811778\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-kgl9z" Mar 11 00:58:05 crc kubenswrapper[4744]: I0311 00:58:05.344918 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pzv6s\" (UniqueName: \"kubernetes.io/projected/6fafcf3b-e706-47a8-9501-beb09a76d0bf-kube-api-access-pzv6s\") pod \"migrator-59844c95c7-wcmgb\" (UID: \"6fafcf3b-e706-47a8-9501-beb09a76d0bf\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-wcmgb" Mar 11 00:58:05 crc 
kubenswrapper[4744]: I0311 00:58:05.344961 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/7fb8ca9a-b3e7-4fce-b173-f8b2519962da-bound-sa-token\") pod \"ingress-operator-5b745b69d9-t54km\" (UID: \"7fb8ca9a-b3e7-4fce-b173-f8b2519962da\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-t54km" Mar 11 00:58:05 crc kubenswrapper[4744]: I0311 00:58:05.344995 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/48b9bd28-7713-4475-92f0-b7b741e6337e-srv-cert\") pod \"olm-operator-6b444d44fb-slpdp\" (UID: \"48b9bd28-7713-4475-92f0-b7b741e6337e\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-slpdp" Mar 11 00:58:05 crc kubenswrapper[4744]: I0311 00:58:05.345020 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/acc8a04b-d619-4e9a-b2b0-f08250f329e0-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-bbjh8\" (UID: \"acc8a04b-d619-4e9a-b2b0-f08250f329e0\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-bbjh8" Mar 11 00:58:05 crc kubenswrapper[4744]: I0311 00:58:05.345045 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lc8kp\" (UniqueName: \"kubernetes.io/projected/84d53914-9806-4bc3-80ed-19cd8ff6e625-kube-api-access-lc8kp\") pod \"service-ca-9c57cc56f-kx277\" (UID: \"84d53914-9806-4bc3-80ed-19cd8ff6e625\") " pod="openshift-service-ca/service-ca-9c57cc56f-kx277" Mar 11 00:58:05 crc kubenswrapper[4744]: I0311 00:58:05.345070 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/eeb963e8-d683-4564-9ccd-d26a6a755e94-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-94dsr\" (UID: 
\"eeb963e8-d683-4564-9ccd-d26a6a755e94\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-94dsr" Mar 11 00:58:05 crc kubenswrapper[4744]: I0311 00:58:05.345091 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c2a93df7-e24f-4681-8336-ef07295f1d09-serving-cert\") pod \"openshift-config-operator-7777fb866f-hf5rf\" (UID: \"c2a93df7-e24f-4681-8336-ef07295f1d09\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-hf5rf" Mar 11 00:58:05 crc kubenswrapper[4744]: I0311 00:58:05.345112 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/acc8a04b-d619-4e9a-b2b0-f08250f329e0-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-bbjh8\" (UID: \"acc8a04b-d619-4e9a-b2b0-f08250f329e0\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-bbjh8" Mar 11 00:58:05 crc kubenswrapper[4744]: I0311 00:58:05.345151 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9b35ce65-26df-4169-a69a-a06c2420da9b-config\") pod \"kube-apiserver-operator-766d6c64bb-gnfcr\" (UID: \"9b35ce65-26df-4169-a69a-a06c2420da9b\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-gnfcr" Mar 11 00:58:05 crc kubenswrapper[4744]: I0311 00:58:05.345170 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/5f720e42-33ed-4144-88d6-5fb8c4befac2-audit-policies\") pod \"apiserver-7bbb656c7d-h49lh\" (UID: \"5f720e42-33ed-4144-88d6-5fb8c4befac2\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-h49lh" Mar 11 00:58:05 crc kubenswrapper[4744]: I0311 00:58:05.345192 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/bef5e0c2-5b78-4f32-affd-aec245c27db1-profile-collector-cert\") pod \"catalog-operator-68c6474976-fhbhq\" (UID: \"bef5e0c2-5b78-4f32-affd-aec245c27db1\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-fhbhq" Mar 11 00:58:05 crc kubenswrapper[4744]: I0311 00:58:05.345212 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8kckr\" (UniqueName: \"kubernetes.io/projected/ca7c15de-4f71-45fd-b1b0-4c451ce9724e-kube-api-access-8kckr\") pod \"machine-config-server-h2stx\" (UID: \"ca7c15de-4f71-45fd-b1b0-4c451ce9724e\") " pod="openshift-machine-config-operator/machine-config-server-h2stx" Mar 11 00:58:05 crc kubenswrapper[4744]: I0311 00:58:05.345236 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/66fb3c39-3d00-453f-a282-a04584652a8b-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-85z54\" (UID: \"66fb3c39-3d00-453f-a282-a04584652a8b\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-85z54" Mar 11 00:58:05 crc kubenswrapper[4744]: I0311 00:58:05.345262 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/ca7c15de-4f71-45fd-b1b0-4c451ce9724e-certs\") pod \"machine-config-server-h2stx\" (UID: \"ca7c15de-4f71-45fd-b1b0-4c451ce9724e\") " pod="openshift-machine-config-operator/machine-config-server-h2stx" Mar 11 00:58:05 crc kubenswrapper[4744]: I0311 00:58:05.345287 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rd9jp\" (UniqueName: \"kubernetes.io/projected/c2a93df7-e24f-4681-8336-ef07295f1d09-kube-api-access-rd9jp\") pod \"openshift-config-operator-7777fb866f-hf5rf\" (UID: \"c2a93df7-e24f-4681-8336-ef07295f1d09\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-hf5rf" Mar 11 00:58:05 crc 
kubenswrapper[4744]: I0311 00:58:05.345317 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/7fb8ca9a-b3e7-4fce-b173-f8b2519962da-trusted-ca\") pod \"ingress-operator-5b745b69d9-t54km\" (UID: \"7fb8ca9a-b3e7-4fce-b173-f8b2519962da\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-t54km" Mar 11 00:58:05 crc kubenswrapper[4744]: I0311 00:58:05.345339 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z5r4x\" (UniqueName: \"kubernetes.io/projected/a8e11ddd-81e3-40f9-8ada-12abfacedca9-kube-api-access-z5r4x\") pod \"machine-config-operator-74547568cd-v4b66\" (UID: \"a8e11ddd-81e3-40f9-8ada-12abfacedca9\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-v4b66" Mar 11 00:58:05 crc kubenswrapper[4744]: I0311 00:58:05.345365 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/ca7c15de-4f71-45fd-b1b0-4c451ce9724e-node-bootstrap-token\") pod \"machine-config-server-h2stx\" (UID: \"ca7c15de-4f71-45fd-b1b0-4c451ce9724e\") " pod="openshift-machine-config-operator/machine-config-server-h2stx" Mar 11 00:58:05 crc kubenswrapper[4744]: I0311 00:58:05.345369 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/5f720e42-33ed-4144-88d6-5fb8c4befac2-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-h49lh\" (UID: \"5f720e42-33ed-4144-88d6-5fb8c4befac2\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-h49lh" Mar 11 00:58:05 crc kubenswrapper[4744]: I0311 00:58:05.345390 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/acc8a04b-d619-4e9a-b2b0-f08250f329e0-config\") pod \"kube-controller-manager-operator-78b949d7b-bbjh8\" (UID: \"acc8a04b-d619-4e9a-b2b0-f08250f329e0\") 
" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-bbjh8" Mar 11 00:58:05 crc kubenswrapper[4744]: I0311 00:58:05.345581 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jkl5n\" (UniqueName: \"kubernetes.io/projected/7fb8ca9a-b3e7-4fce-b173-f8b2519962da-kube-api-access-jkl5n\") pod \"ingress-operator-5b745b69d9-t54km\" (UID: \"7fb8ca9a-b3e7-4fce-b173-f8b2519962da\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-t54km" Mar 11 00:58:05 crc kubenswrapper[4744]: I0311 00:58:05.345620 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ba3d981c-13e1-49dc-80db-4081ca811778-webhook-cert\") pod \"packageserver-d55dfcdfc-kgl9z\" (UID: \"ba3d981c-13e1-49dc-80db-4081ca811778\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-kgl9z" Mar 11 00:58:05 crc kubenswrapper[4744]: I0311 00:58:05.345651 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/84d53914-9806-4bc3-80ed-19cd8ff6e625-signing-key\") pod \"service-ca-9c57cc56f-kx277\" (UID: \"84d53914-9806-4bc3-80ed-19cd8ff6e625\") " pod="openshift-service-ca/service-ca-9c57cc56f-kx277" Mar 11 00:58:05 crc kubenswrapper[4744]: I0311 00:58:05.346149 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5f720e42-33ed-4144-88d6-5fb8c4befac2-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-h49lh\" (UID: \"5f720e42-33ed-4144-88d6-5fb8c4befac2\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-h49lh" Mar 11 00:58:05 crc kubenswrapper[4744]: I0311 00:58:05.347121 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/ba3d981c-13e1-49dc-80db-4081ca811778-tmpfs\") pod \"packageserver-d55dfcdfc-kgl9z\" 
(UID: \"ba3d981c-13e1-49dc-80db-4081ca811778\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-kgl9z" Mar 11 00:58:05 crc kubenswrapper[4744]: I0311 00:58:05.348274 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/a8e11ddd-81e3-40f9-8ada-12abfacedca9-auth-proxy-config\") pod \"machine-config-operator-74547568cd-v4b66\" (UID: \"a8e11ddd-81e3-40f9-8ada-12abfacedca9\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-v4b66" Mar 11 00:58:05 crc kubenswrapper[4744]: I0311 00:58:05.349451 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/eeb963e8-d683-4564-9ccd-d26a6a755e94-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-94dsr\" (UID: \"eeb963e8-d683-4564-9ccd-d26a6a755e94\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-94dsr" Mar 11 00:58:05 crc kubenswrapper[4744]: I0311 00:58:05.349722 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/5f720e42-33ed-4144-88d6-5fb8c4befac2-audit-policies\") pod \"apiserver-7bbb656c7d-h49lh\" (UID: \"5f720e42-33ed-4144-88d6-5fb8c4befac2\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-h49lh" Mar 11 00:58:05 crc kubenswrapper[4744]: I0311 00:58:05.351692 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"openshift-config-operator-dockercfg-7pc5z" Mar 11 00:58:05 crc kubenswrapper[4744]: I0311 00:58:05.351799 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/5f720e42-33ed-4144-88d6-5fb8c4befac2-etcd-client\") pod \"apiserver-7bbb656c7d-h49lh\" (UID: \"5f720e42-33ed-4144-88d6-5fb8c4befac2\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-h49lh" Mar 11 00:58:05 
crc kubenswrapper[4744]: I0311 00:58:05.358481 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/5f720e42-33ed-4144-88d6-5fb8c4befac2-encryption-config\") pod \"apiserver-7bbb656c7d-h49lh\" (UID: \"5f720e42-33ed-4144-88d6-5fb8c4befac2\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-h49lh" Mar 11 00:58:05 crc kubenswrapper[4744]: I0311 00:58:05.360716 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5f720e42-33ed-4144-88d6-5fb8c4befac2-serving-cert\") pod \"apiserver-7bbb656c7d-h49lh\" (UID: \"5f720e42-33ed-4144-88d6-5fb8c4befac2\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-h49lh" Mar 11 00:58:05 crc kubenswrapper[4744]: I0311 00:58:05.371187 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"config-operator-serving-cert" Mar 11 00:58:05 crc kubenswrapper[4744]: I0311 00:58:05.384005 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c2a93df7-e24f-4681-8336-ef07295f1d09-serving-cert\") pod \"openshift-config-operator-7777fb866f-hf5rf\" (UID: \"c2a93df7-e24f-4681-8336-ef07295f1d09\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-hf5rf" Mar 11 00:58:05 crc kubenswrapper[4744]: I0311 00:58:05.390467 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"kube-root-ca.crt" Mar 11 00:58:05 crc kubenswrapper[4744]: I0311 00:58:05.411370 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"installation-pull-secrets" Mar 11 00:58:05 crc kubenswrapper[4744]: I0311 00:58:05.430972 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"registry-dockercfg-kzzsd" Mar 11 00:58:05 crc kubenswrapper[4744]: I0311 00:58:05.451498 4744 
reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-tls" Mar 11 00:58:05 crc kubenswrapper[4744]: I0311 00:58:05.472550 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mcc-proxy-tls" Mar 11 00:58:05 crc kubenswrapper[4744]: I0311 00:58:05.482230 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/eeb963e8-d683-4564-9ccd-d26a6a755e94-proxy-tls\") pod \"machine-config-controller-84d6567774-94dsr\" (UID: \"eeb963e8-d683-4564-9ccd-d26a6a755e94\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-94dsr" Mar 11 00:58:05 crc kubenswrapper[4744]: I0311 00:58:05.491695 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-controller-dockercfg-c2lfx" Mar 11 00:58:05 crc kubenswrapper[4744]: I0311 00:58:05.511484 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-metrics" Mar 11 00:58:05 crc kubenswrapper[4744]: I0311 00:58:05.540410 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"marketplace-trusted-ca" Mar 11 00:58:05 crc kubenswrapper[4744]: I0311 00:58:05.552421 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-dockercfg-5nsgg" Mar 11 00:58:05 crc kubenswrapper[4744]: I0311 00:58:05.570308 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"openshift-service-ca.crt" Mar 11 00:58:05 crc kubenswrapper[4744]: I0311 00:58:05.590272 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"kube-root-ca.crt" Mar 11 00:58:05 crc kubenswrapper[4744]: I0311 00:58:05.610306 4744 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-kube-controller-manager-operator"/"kube-root-ca.crt" Mar 11 00:58:05 crc kubenswrapper[4744]: I0311 00:58:05.633055 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-dockercfg-gkqpw" Mar 11 00:58:05 crc kubenswrapper[4744]: I0311 00:58:05.650184 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-serving-cert" Mar 11 00:58:05 crc kubenswrapper[4744]: I0311 00:58:05.662230 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/acc8a04b-d619-4e9a-b2b0-f08250f329e0-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-bbjh8\" (UID: \"acc8a04b-d619-4e9a-b2b0-f08250f329e0\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-bbjh8" Mar 11 00:58:05 crc kubenswrapper[4744]: I0311 00:58:05.670768 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-config" Mar 11 00:58:05 crc kubenswrapper[4744]: I0311 00:58:05.677392 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/acc8a04b-d619-4e9a-b2b0-f08250f329e0-config\") pod \"kube-controller-manager-operator-78b949d7b-bbjh8\" (UID: \"acc8a04b-d619-4e9a-b2b0-f08250f329e0\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-bbjh8" Mar 11 00:58:05 crc kubenswrapper[4744]: I0311 00:58:05.691117 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-operator-dockercfg-98p87" Mar 11 00:58:05 crc kubenswrapper[4744]: I0311 00:58:05.711947 4744 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-machine-config-operator"/"mco-proxy-tls" Mar 11 00:58:05 crc kubenswrapper[4744]: I0311 00:58:05.722822 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/a8e11ddd-81e3-40f9-8ada-12abfacedca9-proxy-tls\") pod \"machine-config-operator-74547568cd-v4b66\" (UID: \"a8e11ddd-81e3-40f9-8ada-12abfacedca9\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-v4b66" Mar 11 00:58:05 crc kubenswrapper[4744]: I0311 00:58:05.731138 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"machine-config-operator-images" Mar 11 00:58:05 crc kubenswrapper[4744]: I0311 00:58:05.732232 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: \"kubernetes.io/configmap/a8e11ddd-81e3-40f9-8ada-12abfacedca9-images\") pod \"machine-config-operator-74547568cd-v4b66\" (UID: \"a8e11ddd-81e3-40f9-8ada-12abfacedca9\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-v4b66" Mar 11 00:58:05 crc kubenswrapper[4744]: I0311 00:58:05.750701 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-root-ca.crt" Mar 11 00:58:05 crc kubenswrapper[4744]: I0311 00:58:05.770980 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-dockercfg-x57mr" Mar 11 00:58:05 crc kubenswrapper[4744]: I0311 00:58:05.790828 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-serving-cert" Mar 11 00:58:05 crc kubenswrapper[4744]: I0311 00:58:05.801737 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9b35ce65-26df-4169-a69a-a06c2420da9b-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-gnfcr\" (UID: 
\"9b35ce65-26df-4169-a69a-a06c2420da9b\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-gnfcr" Mar 11 00:58:05 crc kubenswrapper[4744]: I0311 00:58:05.810946 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-config" Mar 11 00:58:05 crc kubenswrapper[4744]: I0311 00:58:05.820242 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9b35ce65-26df-4169-a69a-a06c2420da9b-config\") pod \"kube-apiserver-operator-766d6c64bb-gnfcr\" (UID: \"9b35ce65-26df-4169-a69a-a06c2420da9b\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-gnfcr" Mar 11 00:58:05 crc kubenswrapper[4744]: I0311 00:58:05.831582 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"package-server-manager-serving-cert" Mar 11 00:58:05 crc kubenswrapper[4744]: I0311 00:58:05.846886 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/940c3a71-e220-417d-8ac4-cc70a4a5afae-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-56d4r\" (UID: \"940c3a71-e220-417d-8ac4-cc70a4a5afae\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-56d4r" Mar 11 00:58:05 crc kubenswrapper[4744]: I0311 00:58:05.850760 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serviceaccount-dockercfg-rq7zk" Mar 11 00:58:05 crc kubenswrapper[4744]: I0311 00:58:05.871844 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"kube-root-ca.crt" Mar 11 00:58:05 crc kubenswrapper[4744]: I0311 00:58:05.892573 4744 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-operator-lifecycle-manager"/"openshift-service-ca.crt" Mar 11 00:58:05 crc kubenswrapper[4744]: I0311 00:58:05.911919 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"openshift-service-ca.crt" Mar 11 00:58:05 crc kubenswrapper[4744]: I0311 00:58:05.932639 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"kube-storage-version-migrator-operator-dockercfg-2bh8d" Mar 11 00:58:05 crc kubenswrapper[4744]: I0311 00:58:05.952031 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"serving-cert" Mar 11 00:58:05 crc kubenswrapper[4744]: I0311 00:58:05.970740 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"config" Mar 11 00:58:05 crc kubenswrapper[4744]: I0311 00:58:05.991536 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"kube-root-ca.crt" Mar 11 00:58:06 crc kubenswrapper[4744]: I0311 00:58:06.011151 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"openshift-service-ca.crt" Mar 11 00:58:06 crc kubenswrapper[4744]: I0311 00:58:06.031589 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-metrics-certs-default" Mar 11 00:58:06 crc kubenswrapper[4744]: I0311 00:58:06.051833 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-dockercfg-zdk86" Mar 11 00:58:06 crc kubenswrapper[4744]: I0311 00:58:06.071989 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-certs-default" Mar 11 00:58:06 crc kubenswrapper[4744]: I0311 00:58:06.090647 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-stats-default" Mar 
11 00:58:06 crc kubenswrapper[4744]: I0311 00:58:06.111111 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"packageserver-service-cert" Mar 11 00:58:06 crc kubenswrapper[4744]: I0311 00:58:06.121968 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ba3d981c-13e1-49dc-80db-4081ca811778-webhook-cert\") pod \"packageserver-d55dfcdfc-kgl9z\" (UID: \"ba3d981c-13e1-49dc-80db-4081ca811778\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-kgl9z" Mar 11 00:58:06 crc kubenswrapper[4744]: I0311 00:58:06.123111 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/ba3d981c-13e1-49dc-80db-4081ca811778-apiservice-cert\") pod \"packageserver-d55dfcdfc-kgl9z\" (UID: \"ba3d981c-13e1-49dc-80db-4081ca811778\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-kgl9z" Mar 11 00:58:06 crc kubenswrapper[4744]: I0311 00:58:06.128906 4744 request.go:700] Waited for 1.015844799s due to client-side throttling, not priority and fairness, request: GET:https://api-int.crc.testing:6443/api/v1/namespaces/openshift-ingress/configmaps?fieldSelector=metadata.name%3Dservice-ca-bundle&limit=500&resourceVersion=0 Mar 11 00:58:06 crc kubenswrapper[4744]: I0311 00:58:06.130850 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"service-ca-bundle" Mar 11 00:58:06 crc kubenswrapper[4744]: I0311 00:58:06.151850 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"kube-root-ca.crt" Mar 11 00:58:06 crc kubenswrapper[4744]: I0311 00:58:06.172621 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ac-dockercfg-9lkdf" Mar 11 00:58:06 crc kubenswrapper[4744]: I0311 00:58:06.191727 4744 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-multus"/"multus-admission-controller-secret" Mar 11 00:58:06 crc kubenswrapper[4744]: I0311 00:58:06.206697 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/66fb3c39-3d00-453f-a282-a04584652a8b-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-85z54\" (UID: \"66fb3c39-3d00-453f-a282-a04584652a8b\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-85z54" Mar 11 00:58:06 crc kubenswrapper[4744]: I0311 00:58:06.212279 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"metrics-tls" Mar 11 00:58:06 crc kubenswrapper[4744]: I0311 00:58:06.225347 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/7fb8ca9a-b3e7-4fce-b173-f8b2519962da-metrics-tls\") pod \"ingress-operator-5b745b69d9-t54km\" (UID: \"7fb8ca9a-b3e7-4fce-b173-f8b2519962da\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-t54km" Mar 11 00:58:06 crc kubenswrapper[4744]: I0311 00:58:06.232047 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"ingress-operator-dockercfg-7lnqk" Mar 11 00:58:06 crc kubenswrapper[4744]: I0311 00:58:06.263056 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"trusted-ca" Mar 11 00:58:06 crc kubenswrapper[4744]: I0311 00:58:06.271105 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"kube-root-ca.crt" Mar 11 00:58:06 crc kubenswrapper[4744]: I0311 00:58:06.271271 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/7fb8ca9a-b3e7-4fce-b173-f8b2519962da-trusted-ca\") pod \"ingress-operator-5b745b69d9-t54km\" (UID: \"7fb8ca9a-b3e7-4fce-b173-f8b2519962da\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-t54km" 
Mar 11 00:58:06 crc kubenswrapper[4744]: I0311 00:58:06.291333 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"openshift-service-ca.crt" Mar 11 00:58:06 crc kubenswrapper[4744]: I0311 00:58:06.311154 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"default-dockercfg-chnjx" Mar 11 00:58:06 crc kubenswrapper[4744]: I0311 00:58:06.331365 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"pprof-cert" Mar 11 00:58:06 crc kubenswrapper[4744]: I0311 00:58:06.341211 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/48b9bd28-7713-4475-92f0-b7b741e6337e-profile-collector-cert\") pod \"olm-operator-6b444d44fb-slpdp\" (UID: \"48b9bd28-7713-4475-92f0-b7b741e6337e\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-slpdp" Mar 11 00:58:06 crc kubenswrapper[4744]: I0311 00:58:06.344199 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/bef5e0c2-5b78-4f32-affd-aec245c27db1-profile-collector-cert\") pod \"catalog-operator-68c6474976-fhbhq\" (UID: \"bef5e0c2-5b78-4f32-affd-aec245c27db1\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-fhbhq" Mar 11 00:58:06 crc kubenswrapper[4744]: E0311 00:58:06.346419 4744 secret.go:188] Couldn't get secret openshift-service-ca/signing-key: failed to sync secret cache: timed out waiting for the condition Mar 11 00:58:06 crc kubenswrapper[4744]: E0311 00:58:06.346473 4744 secret.go:188] Couldn't get secret openshift-operator-lifecycle-manager/olm-operator-serving-cert: failed to sync secret cache: timed out waiting for the condition Mar 11 00:58:06 crc kubenswrapper[4744]: E0311 00:58:06.346577 4744 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/secret/84d53914-9806-4bc3-80ed-19cd8ff6e625-signing-key podName:84d53914-9806-4bc3-80ed-19cd8ff6e625 nodeName:}" failed. No retries permitted until 2026-03-11 00:58:06.84654207 +0000 UTC m=+243.650759725 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "signing-key" (UniqueName: "kubernetes.io/secret/84d53914-9806-4bc3-80ed-19cd8ff6e625-signing-key") pod "service-ca-9c57cc56f-kx277" (UID: "84d53914-9806-4bc3-80ed-19cd8ff6e625") : failed to sync secret cache: timed out waiting for the condition Mar 11 00:58:06 crc kubenswrapper[4744]: E0311 00:58:06.346728 4744 configmap.go:193] Couldn't get configMap openshift-service-ca-operator/service-ca-operator-config: failed to sync configmap cache: timed out waiting for the condition Mar 11 00:58:06 crc kubenswrapper[4744]: E0311 00:58:06.346802 4744 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/309dded2-bab3-4166-8342-11d3ded619dc-config podName:309dded2-bab3-4166-8342-11d3ded619dc nodeName:}" failed. No retries permitted until 2026-03-11 00:58:06.846770097 +0000 UTC m=+243.650987732 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "config" (UniqueName: "kubernetes.io/configmap/309dded2-bab3-4166-8342-11d3ded619dc-config") pod "service-ca-operator-777779d784-n8wns" (UID: "309dded2-bab3-4166-8342-11d3ded619dc") : failed to sync configmap cache: timed out waiting for the condition Mar 11 00:58:06 crc kubenswrapper[4744]: E0311 00:58:06.346845 4744 secret.go:188] Couldn't get secret openshift-service-ca-operator/serving-cert: failed to sync secret cache: timed out waiting for the condition Mar 11 00:58:06 crc kubenswrapper[4744]: E0311 00:58:06.346888 4744 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/309dded2-bab3-4166-8342-11d3ded619dc-serving-cert podName:309dded2-bab3-4166-8342-11d3ded619dc nodeName:}" failed. 
No retries permitted until 2026-03-11 00:58:06.84687493 +0000 UTC m=+243.651092575 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "serving-cert" (UniqueName: "kubernetes.io/secret/309dded2-bab3-4166-8342-11d3ded619dc-serving-cert") pod "service-ca-operator-777779d784-n8wns" (UID: "309dded2-bab3-4166-8342-11d3ded619dc") : failed to sync secret cache: timed out waiting for the condition Mar 11 00:58:06 crc kubenswrapper[4744]: E0311 00:58:06.346941 4744 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/48b9bd28-7713-4475-92f0-b7b741e6337e-srv-cert podName:48b9bd28-7713-4475-92f0-b7b741e6337e nodeName:}" failed. No retries permitted until 2026-03-11 00:58:06.846925732 +0000 UTC m=+243.651143377 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "srv-cert" (UniqueName: "kubernetes.io/secret/48b9bd28-7713-4475-92f0-b7b741e6337e-srv-cert") pod "olm-operator-6b444d44fb-slpdp" (UID: "48b9bd28-7713-4475-92f0-b7b741e6337e") : failed to sync secret cache: timed out waiting for the condition Mar 11 00:58:06 crc kubenswrapper[4744]: E0311 00:58:06.348742 4744 configmap.go:193] Couldn't get configMap openshift-service-ca/signing-cabundle: failed to sync configmap cache: timed out waiting for the condition Mar 11 00:58:06 crc kubenswrapper[4744]: E0311 00:58:06.348835 4744 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/84d53914-9806-4bc3-80ed-19cd8ff6e625-signing-cabundle podName:84d53914-9806-4bc3-80ed-19cd8ff6e625 nodeName:}" failed. No retries permitted until 2026-03-11 00:58:06.848808709 +0000 UTC m=+243.653026354 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "signing-cabundle" (UniqueName: "kubernetes.io/configmap/84d53914-9806-4bc3-80ed-19cd8ff6e625-signing-cabundle") pod "service-ca-9c57cc56f-kx277" (UID: "84d53914-9806-4bc3-80ed-19cd8ff6e625") : failed to sync configmap cache: timed out waiting for the condition Mar 11 00:58:06 crc kubenswrapper[4744]: E0311 00:58:06.348873 4744 secret.go:188] Couldn't get secret openshift-machine-config-operator/machine-config-server-tls: failed to sync secret cache: timed out waiting for the condition Mar 11 00:58:06 crc kubenswrapper[4744]: E0311 00:58:06.348973 4744 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ca7c15de-4f71-45fd-b1b0-4c451ce9724e-certs podName:ca7c15de-4f71-45fd-b1b0-4c451ce9724e nodeName:}" failed. No retries permitted until 2026-03-11 00:58:06.848956024 +0000 UTC m=+243.653173739 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "certs" (UniqueName: "kubernetes.io/secret/ca7c15de-4f71-45fd-b1b0-4c451ce9724e-certs") pod "machine-config-server-h2stx" (UID: "ca7c15de-4f71-45fd-b1b0-4c451ce9724e") : failed to sync secret cache: timed out waiting for the condition Mar 11 00:58:06 crc kubenswrapper[4744]: E0311 00:58:06.351330 4744 secret.go:188] Couldn't get secret openshift-operator-lifecycle-manager/catalog-operator-serving-cert: failed to sync secret cache: timed out waiting for the condition Mar 11 00:58:06 crc kubenswrapper[4744]: E0311 00:58:06.351401 4744 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/bef5e0c2-5b78-4f32-affd-aec245c27db1-srv-cert podName:bef5e0c2-5b78-4f32-affd-aec245c27db1 nodeName:}" failed. No retries permitted until 2026-03-11 00:58:06.851376968 +0000 UTC m=+243.655594603 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "srv-cert" (UniqueName: "kubernetes.io/secret/bef5e0c2-5b78-4f32-affd-aec245c27db1-srv-cert") pod "catalog-operator-68c6474976-fhbhq" (UID: "bef5e0c2-5b78-4f32-affd-aec245c27db1") : failed to sync secret cache: timed out waiting for the condition Mar 11 00:58:06 crc kubenswrapper[4744]: E0311 00:58:06.351451 4744 secret.go:188] Couldn't get secret openshift-machine-config-operator/node-bootstrapper-token: failed to sync secret cache: timed out waiting for the condition Mar 11 00:58:06 crc kubenswrapper[4744]: E0311 00:58:06.351506 4744 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ca7c15de-4f71-45fd-b1b0-4c451ce9724e-node-bootstrap-token podName:ca7c15de-4f71-45fd-b1b0-4c451ce9724e nodeName:}" failed. No retries permitted until 2026-03-11 00:58:06.851493462 +0000 UTC m=+243.655711097 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "node-bootstrap-token" (UniqueName: "kubernetes.io/secret/ca7c15de-4f71-45fd-b1b0-4c451ce9724e-node-bootstrap-token") pod "machine-config-server-h2stx" (UID: "ca7c15de-4f71-45fd-b1b0-4c451ce9724e") : failed to sync secret cache: timed out waiting for the condition Mar 11 00:58:06 crc kubenswrapper[4744]: I0311 00:58:06.352270 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serving-cert" Mar 11 00:58:06 crc kubenswrapper[4744]: I0311 00:58:06.371282 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 11 00:58:06 crc kubenswrapper[4744]: I0311 00:58:06.391307 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 11 00:58:06 crc kubenswrapper[4744]: I0311 00:58:06.431414 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Mar 11 00:58:06 crc kubenswrapper[4744]: I0311 
00:58:06.432977 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qmsqr\" (UniqueName: \"kubernetes.io/projected/0d55c2f3-2eef-42eb-8627-ccd40e21f4d0-kube-api-access-qmsqr\") pod \"console-f9d7485db-msd9d\" (UID: \"0d55c2f3-2eef-42eb-8627-ccd40e21f4d0\") " pod="openshift-console/console-f9d7485db-msd9d" Mar 11 00:58:06 crc kubenswrapper[4744]: I0311 00:58:06.460223 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Mar 11 00:58:06 crc kubenswrapper[4744]: I0311 00:58:06.472074 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"signing-cabundle" Mar 11 00:58:06 crc kubenswrapper[4744]: I0311 00:58:06.490563 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"service-ca-dockercfg-pn86c" Mar 11 00:58:06 crc kubenswrapper[4744]: I0311 00:58:06.510872 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"signing-key" Mar 11 00:58:06 crc kubenswrapper[4744]: I0311 00:58:06.531258 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"kube-root-ca.crt" Mar 11 00:58:06 crc kubenswrapper[4744]: I0311 00:58:06.551990 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"openshift-service-ca.crt" Mar 11 00:58:06 crc kubenswrapper[4744]: I0311 00:58:06.584034 4744 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-f9d7485db-msd9d" Mar 11 00:58:06 crc kubenswrapper[4744]: I0311 00:58:06.619916 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tdjgb\" (UniqueName: \"kubernetes.io/projected/7bd67414-ec33-4963-b5d8-ac374bd28a6a-kube-api-access-tdjgb\") pod \"machine-approver-56656f9798-btvmr\" (UID: \"7bd67414-ec33-4963-b5d8-ac374bd28a6a\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-btvmr" Mar 11 00:58:06 crc kubenswrapper[4744]: I0311 00:58:06.639851 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bkf84\" (UniqueName: \"kubernetes.io/projected/74a61e39-2210-4bb1-96c9-509eda04c4c7-kube-api-access-bkf84\") pod \"oauth-openshift-558db77b4-4tcts\" (UID: \"74a61e39-2210-4bb1-96c9-509eda04c4c7\") " pod="openshift-authentication/oauth-openshift-558db77b4-4tcts" Mar 11 00:58:06 crc kubenswrapper[4744]: I0311 00:58:06.711757 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"openshift-service-ca.crt" Mar 11 00:58:06 crc kubenswrapper[4744]: I0311 00:58:06.730847 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator"/"kube-storage-version-migrator-sa-dockercfg-5xfcg" Mar 11 00:58:06 crc kubenswrapper[4744]: I0311 00:58:06.751425 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"kube-root-ca.crt" Mar 11 00:58:06 crc kubenswrapper[4744]: I0311 00:58:06.911958 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-dockercfg-qx5rd" Mar 11 00:58:07 crc kubenswrapper[4744]: I0311 00:58:07.229795 4744 request.go:700] Waited for 1.995516594s due to client-side throttling, not priority and fairness, request: 
POST:https://api-int.crc.testing:6443/api/v1/namespaces/openshift-controller-manager/serviceaccounts/openshift-controller-manager-sa/token Mar 11 00:58:07 crc kubenswrapper[4744]: I0311 00:58:07.242986 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/84d53914-9806-4bc3-80ed-19cd8ff6e625-signing-cabundle\") pod \"service-ca-9c57cc56f-kx277\" (UID: \"84d53914-9806-4bc3-80ed-19cd8ff6e625\") " pod="openshift-service-ca/service-ca-9c57cc56f-kx277" Mar 11 00:58:07 crc kubenswrapper[4744]: I0311 00:58:07.243094 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/bef5e0c2-5b78-4f32-affd-aec245c27db1-srv-cert\") pod \"catalog-operator-68c6474976-fhbhq\" (UID: \"bef5e0c2-5b78-4f32-affd-aec245c27db1\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-fhbhq" Mar 11 00:58:07 crc kubenswrapper[4744]: I0311 00:58:07.243149 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/309dded2-bab3-4166-8342-11d3ded619dc-config\") pod \"service-ca-operator-777779d784-n8wns\" (UID: \"309dded2-bab3-4166-8342-11d3ded619dc\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-n8wns" Mar 11 00:58:07 crc kubenswrapper[4744]: I0311 00:58:07.243190 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/309dded2-bab3-4166-8342-11d3ded619dc-serving-cert\") pod \"service-ca-operator-777779d784-n8wns\" (UID: \"309dded2-bab3-4166-8342-11d3ded619dc\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-n8wns" Mar 11 00:58:07 crc kubenswrapper[4744]: I0311 00:58:07.243318 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: 
\"kubernetes.io/secret/48b9bd28-7713-4475-92f0-b7b741e6337e-srv-cert\") pod \"olm-operator-6b444d44fb-slpdp\" (UID: \"48b9bd28-7713-4475-92f0-b7b741e6337e\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-slpdp" Mar 11 00:58:07 crc kubenswrapper[4744]: I0311 00:58:07.243423 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/ca7c15de-4f71-45fd-b1b0-4c451ce9724e-certs\") pod \"machine-config-server-h2stx\" (UID: \"ca7c15de-4f71-45fd-b1b0-4c451ce9724e\") " pod="openshift-machine-config-operator/machine-config-server-h2stx" Mar 11 00:58:07 crc kubenswrapper[4744]: I0311 00:58:07.243500 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/ca7c15de-4f71-45fd-b1b0-4c451ce9724e-node-bootstrap-token\") pod \"machine-config-server-h2stx\" (UID: \"ca7c15de-4f71-45fd-b1b0-4c451ce9724e\") " pod="openshift-machine-config-operator/machine-config-server-h2stx" Mar 11 00:58:07 crc kubenswrapper[4744]: I0311 00:58:07.243586 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/84d53914-9806-4bc3-80ed-19cd8ff6e625-signing-key\") pod \"service-ca-9c57cc56f-kx277\" (UID: \"84d53914-9806-4bc3-80ed-19cd8ff6e625\") " pod="openshift-service-ca/service-ca-9c57cc56f-kx277" Mar 11 00:58:07 crc kubenswrapper[4744]: I0311 00:58:07.249043 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-4tcts" Mar 11 00:58:07 crc kubenswrapper[4744]: I0311 00:58:07.251664 4744 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-btvmr" Mar 11 00:58:07 crc kubenswrapper[4744]: I0311 00:58:07.255345 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/84d53914-9806-4bc3-80ed-19cd8ff6e625-signing-cabundle\") pod \"service-ca-9c57cc56f-kx277\" (UID: \"84d53914-9806-4bc3-80ed-19cd8ff6e625\") " pod="openshift-service-ca/service-ca-9c57cc56f-kx277" Mar 11 00:58:07 crc kubenswrapper[4744]: I0311 00:58:07.261325 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h4fz4\" (UniqueName: \"kubernetes.io/projected/47244264-b9e7-4f86-85d7-5406ed8d8833-kube-api-access-h4fz4\") pod \"dns-operator-744455d44c-4jlsm\" (UID: \"47244264-b9e7-4f86-85d7-5406ed8d8833\") " pod="openshift-dns-operator/dns-operator-744455d44c-4jlsm" Mar 11 00:58:07 crc kubenswrapper[4744]: I0311 00:58:07.265546 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-dockercfg-jwfmh" Mar 11 00:58:07 crc kubenswrapper[4744]: I0311 00:58:07.265893 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-default-metrics-tls" Mar 11 00:58:07 crc kubenswrapper[4744]: I0311 00:58:07.266155 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"openshift-service-ca.crt" Mar 11 00:58:07 crc kubenswrapper[4744]: I0311 00:58:07.266402 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-tls" Mar 11 00:58:07 crc kubenswrapper[4744]: I0311 00:58:07.268054 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"node-bootstrapper-token" Mar 11 00:58:07 crc kubenswrapper[4744]: I0311 00:58:07.268362 4744 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-service-ca-operator"/"service-ca-operator-config" Mar 11 00:58:07 crc kubenswrapper[4744]: I0311 00:58:07.268600 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"dns-default" Mar 11 00:58:07 crc kubenswrapper[4744]: I0311 00:58:07.268833 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"kube-root-ca.crt" Mar 11 00:58:07 crc kubenswrapper[4744]: I0311 00:58:07.268990 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"openshift-service-ca.crt" Mar 11 00:58:07 crc kubenswrapper[4744]: I0311 00:58:07.269067 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"service-ca-operator-dockercfg-rg9jl" Mar 11 00:58:07 crc kubenswrapper[4744]: I0311 00:58:07.269255 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-tls" Mar 11 00:58:07 crc kubenswrapper[4744]: I0311 00:58:07.269383 4744 reflector.go:368] Caches populated for *v1.Secret from object-"hostpath-provisioner"/"csi-hostpath-provisioner-sa-dockercfg-qd74k" Mar 11 00:58:07 crc kubenswrapper[4744]: I0311 00:58:07.269426 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"serving-cert" Mar 11 00:58:07 crc kubenswrapper[4744]: I0311 00:58:07.269638 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"openshift-service-ca.crt" Mar 11 00:58:07 crc kubenswrapper[4744]: I0311 00:58:07.269746 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"kube-root-ca.crt" Mar 11 00:58:07 crc kubenswrapper[4744]: I0311 00:58:07.269013 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"catalog-operator-serving-cert" Mar 11 00:58:07 crc kubenswrapper[4744]: I0311 
00:58:07.269869 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-dockercfg-k9rxt" Mar 11 00:58:07 crc kubenswrapper[4744]: I0311 00:58:07.270067 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"kube-root-ca.crt" Mar 11 00:58:07 crc kubenswrapper[4744]: I0311 00:58:07.270208 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"canary-serving-cert" Mar 11 00:58:07 crc kubenswrapper[4744]: I0311 00:58:07.270277 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"default-dockercfg-2llfx" Mar 11 00:58:07 crc kubenswrapper[4744]: I0311 00:58:07.271279 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/84d53914-9806-4bc3-80ed-19cd8ff6e625-signing-key\") pod \"service-ca-9c57cc56f-kx277\" (UID: \"84d53914-9806-4bc3-80ed-19cd8ff6e625\") " pod="openshift-service-ca/service-ca-9c57cc56f-kx277" Mar 11 00:58:07 crc kubenswrapper[4744]: I0311 00:58:07.276286 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/309dded2-bab3-4166-8342-11d3ded619dc-config\") pod \"service-ca-operator-777779d784-n8wns\" (UID: \"309dded2-bab3-4166-8342-11d3ded619dc\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-n8wns" Mar 11 00:58:07 crc kubenswrapper[4744]: I0311 00:58:07.277317 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/48b9bd28-7713-4475-92f0-b7b741e6337e-srv-cert\") pod \"olm-operator-6b444d44fb-slpdp\" (UID: \"48b9bd28-7713-4475-92f0-b7b741e6337e\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-slpdp" Mar 11 00:58:07 crc kubenswrapper[4744]: I0311 00:58:07.283061 4744 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/ca7c15de-4f71-45fd-b1b0-4c451ce9724e-node-bootstrap-token\") pod \"machine-config-server-h2stx\" (UID: \"ca7c15de-4f71-45fd-b1b0-4c451ce9724e\") " pod="openshift-machine-config-operator/machine-config-server-h2stx" Mar 11 00:58:07 crc kubenswrapper[4744]: I0311 00:58:07.283432 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"certs\" (UniqueName: \"kubernetes.io/secret/ca7c15de-4f71-45fd-b1b0-4c451ce9724e-certs\") pod \"machine-config-server-h2stx\" (UID: \"ca7c15de-4f71-45fd-b1b0-4c451ce9724e\") " pod="openshift-machine-config-operator/machine-config-server-h2stx" Mar 11 00:58:07 crc kubenswrapper[4744]: I0311 00:58:07.286905 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/4d6c2f5e-32a4-484f-b6fa-a240cf8bb90d-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-74tmz\" (UID: \"4d6c2f5e-32a4-484f-b6fa-a240cf8bb90d\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-74tmz" Mar 11 00:58:07 crc kubenswrapper[4744]: I0311 00:58:07.288309 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/309dded2-bab3-4166-8342-11d3ded619dc-serving-cert\") pod \"service-ca-operator-777779d784-n8wns\" (UID: \"309dded2-bab3-4166-8342-11d3ded619dc\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-n8wns" Mar 11 00:58:07 crc kubenswrapper[4744]: I0311 00:58:07.290500 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k5wkr\" (UniqueName: \"kubernetes.io/projected/51238090-1fbe-446b-a63d-5ec9c3137c61-kube-api-access-k5wkr\") pod \"openshift-controller-manager-operator-756b6f6bc6-xfdgm\" (UID: \"51238090-1fbe-446b-a63d-5ec9c3137c61\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-xfdgm" 
Mar 11 00:58:07 crc kubenswrapper[4744]: I0311 00:58:07.292080 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k4m7z\" (UniqueName: \"kubernetes.io/projected/acc45881-56ff-4010-8eda-103f41f90bc5-kube-api-access-k4m7z\") pod \"cluster-samples-operator-665b6dd947-pn84h\" (UID: \"acc45881-56ff-4010-8eda-103f41f90bc5\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-pn84h" Mar 11 00:58:07 crc kubenswrapper[4744]: I0311 00:58:07.293110 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-82zqb\" (UniqueName: \"kubernetes.io/projected/8495f637-031c-4280-be13-d5aae9c99eca-kube-api-access-82zqb\") pod \"controller-manager-879f6c89f-9gtcq\" (UID: \"8495f637-031c-4280-be13-d5aae9c99eca\") " pod="openshift-controller-manager/controller-manager-879f6c89f-9gtcq" Mar 11 00:58:07 crc kubenswrapper[4744]: I0311 00:58:07.294739 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5fpmz\" (UniqueName: \"kubernetes.io/projected/24ac204b-1627-404f-b33c-fc77ded356d1-kube-api-access-5fpmz\") pod \"apiserver-76f77b778f-txbh2\" (UID: \"24ac204b-1627-404f-b33c-fc77ded356d1\") " pod="openshift-apiserver/apiserver-76f77b778f-txbh2" Mar 11 00:58:07 crc kubenswrapper[4744]: I0311 00:58:07.297124 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/bef5e0c2-5b78-4f32-affd-aec245c27db1-srv-cert\") pod \"catalog-operator-68c6474976-fhbhq\" (UID: \"bef5e0c2-5b78-4f32-affd-aec245c27db1\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-fhbhq" Mar 11 00:58:07 crc kubenswrapper[4744]: I0311 00:58:07.306049 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mhzxz\" (UniqueName: \"kubernetes.io/projected/8d83a64e-bd7c-43b4-aac4-8fdc807059f5-kube-api-access-mhzxz\") pod \"image-pruner-29553120-6zw87\" (UID: 
\"8d83a64e-bd7c-43b4-aac4-8fdc807059f5\") " pod="openshift-image-registry/image-pruner-29553120-6zw87" Mar 11 00:58:07 crc kubenswrapper[4744]: I0311 00:58:07.309981 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v9njb\" (UniqueName: \"kubernetes.io/projected/de688bff-78ce-4d0f-ad7e-548ca640887a-kube-api-access-v9njb\") pod \"etcd-operator-b45778765-wm2rt\" (UID: \"de688bff-78ce-4d0f-ad7e-548ca640887a\") " pod="openshift-etcd-operator/etcd-operator-b45778765-wm2rt" Mar 11 00:58:07 crc kubenswrapper[4744]: I0311 00:58:07.316351 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6dp2v\" (UniqueName: \"kubernetes.io/projected/4d6c2f5e-32a4-484f-b6fa-a240cf8bb90d-kube-api-access-6dp2v\") pod \"cluster-image-registry-operator-dc59b4c8b-74tmz\" (UID: \"4d6c2f5e-32a4-484f-b6fa-a240cf8bb90d\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-74tmz" Mar 11 00:58:07 crc kubenswrapper[4744]: W0311 00:58:07.328798 4744 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7bd67414_ec33_4963_b5d8_ac374bd28a6a.slice/crio-55ac7d43a3c4a7da7e20c7d540d6e013e9e414a15aaa88ddb5522479c5712ac5 WatchSource:0}: Error finding container 55ac7d43a3c4a7da7e20c7d540d6e013e9e414a15aaa88ddb5522479c5712ac5: Status 404 returned error can't find the container with id 55ac7d43a3c4a7da7e20c7d540d6e013e9e414a15aaa88ddb5522479c5712ac5 Mar 11 00:58:07 crc kubenswrapper[4744]: I0311 00:58:07.337923 4744 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-xfdgm" Mar 11 00:58:07 crc kubenswrapper[4744]: I0311 00:58:07.346489 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0359d6fc-1139-4dbc-a50f-55fa91607935-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-chlpc\" (UID: \"0359d6fc-1139-4dbc-a50f-55fa91607935\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-chlpc" Mar 11 00:58:07 crc kubenswrapper[4744]: I0311 00:58:07.367645 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-btvmr" event={"ID":"7bd67414-ec33-4963-b5d8-ac374bd28a6a","Type":"ContainerStarted","Data":"55ac7d43a3c4a7da7e20c7d540d6e013e9e414a15aaa88ddb5522479c5712ac5"} Mar 11 00:58:07 crc kubenswrapper[4744]: I0311 00:58:07.368261 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8whhv\" (UniqueName: \"kubernetes.io/projected/61dac64e-6219-46fe-80b0-420098bb260b-kube-api-access-8whhv\") pod \"openshift-apiserver-operator-796bbdcf4f-l2fh5\" (UID: \"61dac64e-6219-46fe-80b0-420098bb260b\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-l2fh5" Mar 11 00:58:07 crc kubenswrapper[4744]: I0311 00:58:07.378016 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lj7nl\" (UniqueName: \"kubernetes.io/projected/7d1c92dd-43a7-4311-90b1-54441f84787e-kube-api-access-lj7nl\") pod \"machine-api-operator-5694c8668f-lws9c\" (UID: \"7d1c92dd-43a7-4311-90b1-54441f84787e\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-lws9c" Mar 11 00:58:07 crc kubenswrapper[4744]: I0311 00:58:07.396192 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rk88x\" (UniqueName: 
\"kubernetes.io/projected/ab6c95e5-e2c0-4cb3-a647-930b4c4d2927-kube-api-access-rk88x\") pod \"authentication-operator-69f744f599-286h7\" (UID: \"ab6c95e5-e2c0-4cb3-a647-930b4c4d2927\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-286h7" Mar 11 00:58:07 crc kubenswrapper[4744]: I0311 00:58:07.415012 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t2wdr\" (UniqueName: \"kubernetes.io/projected/fa83e422-8374-4da6-a356-ae7feadfe282-kube-api-access-t2wdr\") pod \"route-controller-manager-6576b87f9c-tqzmz\" (UID: \"fa83e422-8374-4da6-a356-ae7feadfe282\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-tqzmz" Mar 11 00:58:07 crc kubenswrapper[4744]: I0311 00:58:07.424533 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-pruner-29553120-6zw87" Mar 11 00:58:07 crc kubenswrapper[4744]: I0311 00:58:07.437231 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qg5qq\" (UniqueName: \"kubernetes.io/projected/c284cbf1-b5e2-4f77-b14c-c0030f140a91-kube-api-access-qg5qq\") pod \"console-operator-58897d9998-ptbbd\" (UID: \"c284cbf1-b5e2-4f77-b14c-c0030f140a91\") " pod="openshift-console-operator/console-operator-58897d9998-ptbbd" Mar 11 00:58:07 crc kubenswrapper[4744]: I0311 00:58:07.441709 4744 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-apiserver/apiserver-76f77b778f-txbh2" Mar 11 00:58:07 crc kubenswrapper[4744]: I0311 00:58:07.446970 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xp799\" (UniqueName: \"kubernetes.io/projected/48b9bd28-7713-4475-92f0-b7b741e6337e-kube-api-access-xp799\") pod \"olm-operator-6b444d44fb-slpdp\" (UID: \"48b9bd28-7713-4475-92f0-b7b741e6337e\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-slpdp" Mar 11 00:58:07 crc kubenswrapper[4744]: I0311 00:58:07.450730 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns-operator/dns-operator-744455d44c-4jlsm" Mar 11 00:58:07 crc kubenswrapper[4744]: I0311 00:58:07.462054 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-slpdp" Mar 11 00:58:07 crc kubenswrapper[4744]: I0311 00:58:07.469141 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-brvwt\" (UniqueName: \"kubernetes.io/projected/66fb3c39-3d00-453f-a282-a04584652a8b-kube-api-access-brvwt\") pod \"multus-admission-controller-857f4d67dd-85z54\" (UID: \"66fb3c39-3d00-453f-a282-a04584652a8b\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-85z54" Mar 11 00:58:07 crc kubenswrapper[4744]: I0311 00:58:07.469293 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-f9d7485db-msd9d"] Mar 11 00:58:07 crc kubenswrapper[4744]: W0311 00:58:07.488017 4744 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0d55c2f3_2eef_42eb_8627_ccd40e21f4d0.slice/crio-5b1437991fe6641c1aee417825e707901e69a59a9c2e19ee5e372209c97a5570 WatchSource:0}: Error finding container 5b1437991fe6641c1aee417825e707901e69a59a9c2e19ee5e372209c97a5570: Status 404 returned error can't find the container with id 
5b1437991fe6641c1aee417825e707901e69a59a9c2e19ee5e372209c97a5570 Mar 11 00:58:07 crc kubenswrapper[4744]: I0311 00:58:07.488632 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dchkc\" (UniqueName: \"kubernetes.io/projected/7c454621-190e-4962-abed-72c0ec0613de-kube-api-access-dchkc\") pod \"auto-csr-approver-29553176-m7tmh\" (UID: \"7c454621-190e-4962-abed-72c0ec0613de\") " pod="openshift-infra/auto-csr-approver-29553176-m7tmh" Mar 11 00:58:07 crc kubenswrapper[4744]: I0311 00:58:07.503941 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29553176-m7tmh" Mar 11 00:58:07 crc kubenswrapper[4744]: I0311 00:58:07.510872 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-pn84h" Mar 11 00:58:07 crc kubenswrapper[4744]: I0311 00:58:07.520191 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd-operator/etcd-operator-b45778765-wm2rt" Mar 11 00:58:07 crc kubenswrapper[4744]: I0311 00:58:07.520354 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-px76q\" (UniqueName: \"kubernetes.io/projected/eeb963e8-d683-4564-9ccd-d26a6a755e94-kube-api-access-px76q\") pod \"machine-config-controller-84d6567774-94dsr\" (UID: \"eeb963e8-d683-4564-9ccd-d26a6a755e94\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-94dsr" Mar 11 00:58:07 crc kubenswrapper[4744]: I0311 00:58:07.533602 4744 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication-operator/authentication-operator-69f744f599-286h7" Mar 11 00:58:07 crc kubenswrapper[4744]: I0311 00:58:07.546772 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tlvcs\" (UniqueName: \"kubernetes.io/projected/940c3a71-e220-417d-8ac4-cc70a4a5afae-kube-api-access-tlvcs\") pod \"package-server-manager-789f6589d5-56d4r\" (UID: \"940c3a71-e220-417d-8ac4-cc70a4a5afae\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-56d4r" Mar 11 00:58:07 crc kubenswrapper[4744]: I0311 00:58:07.548811 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d5d8q\" (UniqueName: \"kubernetes.io/projected/309dded2-bab3-4166-8342-11d3ded619dc-kube-api-access-d5d8q\") pod \"service-ca-operator-777779d784-n8wns\" (UID: \"309dded2-bab3-4166-8342-11d3ded619dc\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-n8wns" Mar 11 00:58:07 crc kubenswrapper[4744]: I0311 00:58:07.553901 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-777779d784-n8wns" Mar 11 00:58:07 crc kubenswrapper[4744]: I0311 00:58:07.556176 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-4tcts"] Mar 11 00:58:07 crc kubenswrapper[4744]: I0311 00:58:07.566889 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kgksc\" (UniqueName: \"kubernetes.io/projected/bef5e0c2-5b78-4f32-affd-aec245c27db1-kube-api-access-kgksc\") pod \"catalog-operator-68c6474976-fhbhq\" (UID: \"bef5e0c2-5b78-4f32-affd-aec245c27db1\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-fhbhq" Mar 11 00:58:07 crc kubenswrapper[4744]: I0311 00:58:07.584588 4744 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-l2fh5" Mar 11 00:58:07 crc kubenswrapper[4744]: I0311 00:58:07.588042 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jkl5n\" (UniqueName: \"kubernetes.io/projected/7fb8ca9a-b3e7-4fce-b173-f8b2519962da-kube-api-access-jkl5n\") pod \"ingress-operator-5b745b69d9-t54km\" (UID: \"7fb8ca9a-b3e7-4fce-b173-f8b2519962da\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-t54km" Mar 11 00:58:07 crc kubenswrapper[4744]: I0311 00:58:07.592458 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-9gtcq" Mar 11 00:58:07 crc kubenswrapper[4744]: I0311 00:58:07.596416 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-58897d9998-ptbbd" Mar 11 00:58:07 crc kubenswrapper[4744]: W0311 00:58:07.599665 4744 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod74a61e39_2210_4bb1_96c9_509eda04c4c7.slice/crio-509584436b00c9a81ac31a7f4f18c1243a6264d7856713053e14130a82fe9cb5 WatchSource:0}: Error finding container 509584436b00c9a81ac31a7f4f18c1243a6264d7856713053e14130a82fe9cb5: Status 404 returned error can't find the container with id 509584436b00c9a81ac31a7f4f18c1243a6264d7856713053e14130a82fe9cb5 Mar 11 00:58:07 crc kubenswrapper[4744]: I0311 00:58:07.602997 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/machine-api-operator-5694c8668f-lws9c" Mar 11 00:58:07 crc kubenswrapper[4744]: I0311 00:58:07.609674 4744 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-chlpc" Mar 11 00:58:07 crc kubenswrapper[4744]: I0311 00:58:07.610100 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-npt42\" (UniqueName: \"kubernetes.io/projected/ba3d981c-13e1-49dc-80db-4081ca811778-kube-api-access-npt42\") pod \"packageserver-d55dfcdfc-kgl9z\" (UID: \"ba3d981c-13e1-49dc-80db-4081ca811778\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-kgl9z" Mar 11 00:58:07 crc kubenswrapper[4744]: I0311 00:58:07.615981 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-74tmz" Mar 11 00:58:07 crc kubenswrapper[4744]: I0311 00:58:07.621931 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-tqzmz" Mar 11 00:58:07 crc kubenswrapper[4744]: I0311 00:58:07.636192 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/9b35ce65-26df-4169-a69a-a06c2420da9b-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-gnfcr\" (UID: \"9b35ce65-26df-4169-a69a-a06c2420da9b\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-gnfcr" Mar 11 00:58:07 crc kubenswrapper[4744]: I0311 00:58:07.650583 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pzv6s\" (UniqueName: \"kubernetes.io/projected/6fafcf3b-e706-47a8-9501-beb09a76d0bf-kube-api-access-pzv6s\") pod \"migrator-59844c95c7-wcmgb\" (UID: \"6fafcf3b-e706-47a8-9501-beb09a76d0bf\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-wcmgb" Mar 11 00:58:07 crc kubenswrapper[4744]: I0311 00:58:07.663799 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-xfdgm"] Mar 11 00:58:07 crc kubenswrapper[4744]: I0311 00:58:07.671783 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-94dsr" Mar 11 00:58:07 crc kubenswrapper[4744]: I0311 00:58:07.675830 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/7fb8ca9a-b3e7-4fce-b173-f8b2519962da-bound-sa-token\") pod \"ingress-operator-5b745b69d9-t54km\" (UID: \"7fb8ca9a-b3e7-4fce-b173-f8b2519962da\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-t54km" Mar 11 00:58:07 crc kubenswrapper[4744]: I0311 00:58:07.694565 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rd9jp\" (UniqueName: \"kubernetes.io/projected/c2a93df7-e24f-4681-8336-ef07295f1d09-kube-api-access-rd9jp\") pod \"openshift-config-operator-7777fb866f-hf5rf\" (UID: \"c2a93df7-e24f-4681-8336-ef07295f1d09\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-hf5rf" Mar 11 00:58:07 crc kubenswrapper[4744]: I0311 00:58:07.695884 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-pruner-29553120-6zw87"] Mar 11 00:58:07 crc kubenswrapper[4744]: I0311 00:58:07.699926 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-gnfcr" Mar 11 00:58:07 crc kubenswrapper[4744]: I0311 00:58:07.705599 4744 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-56d4r" Mar 11 00:58:07 crc kubenswrapper[4744]: I0311 00:58:07.709625 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8kckr\" (UniqueName: \"kubernetes.io/projected/ca7c15de-4f71-45fd-b1b0-4c451ce9724e-kube-api-access-8kckr\") pod \"machine-config-server-h2stx\" (UID: \"ca7c15de-4f71-45fd-b1b0-4c451ce9724e\") " pod="openshift-machine-config-operator/machine-config-server-h2stx" Mar 11 00:58:07 crc kubenswrapper[4744]: I0311 00:58:07.728827 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lc8kp\" (UniqueName: \"kubernetes.io/projected/84d53914-9806-4bc3-80ed-19cd8ff6e625-kube-api-access-lc8kp\") pod \"service-ca-9c57cc56f-kx277\" (UID: \"84d53914-9806-4bc3-80ed-19cd8ff6e625\") " pod="openshift-service-ca/service-ca-9c57cc56f-kx277" Mar 11 00:58:07 crc kubenswrapper[4744]: I0311 00:58:07.728903 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-kgl9z" Mar 11 00:58:07 crc kubenswrapper[4744]: I0311 00:58:07.734728 4744 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-admission-controller-857f4d67dd-85z54" Mar 11 00:58:07 crc kubenswrapper[4744]: I0311 00:58:07.735259 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-4jlsm"] Mar 11 00:58:07 crc kubenswrapper[4744]: I0311 00:58:07.746921 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z5r4x\" (UniqueName: \"kubernetes.io/projected/a8e11ddd-81e3-40f9-8ada-12abfacedca9-kube-api-access-z5r4x\") pod \"machine-config-operator-74547568cd-v4b66\" (UID: \"a8e11ddd-81e3-40f9-8ada-12abfacedca9\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-v4b66" Mar 11 00:58:07 crc kubenswrapper[4744]: I0311 00:58:07.762061 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-t54km" Mar 11 00:58:07 crc kubenswrapper[4744]: I0311 00:58:07.766346 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/acc8a04b-d619-4e9a-b2b0-f08250f329e0-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-bbjh8\" (UID: \"acc8a04b-d619-4e9a-b2b0-f08250f329e0\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-bbjh8" Mar 11 00:58:07 crc kubenswrapper[4744]: I0311 00:58:07.797550 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6dxjw\" (UniqueName: \"kubernetes.io/projected/5f720e42-33ed-4144-88d6-5fb8c4befac2-kube-api-access-6dxjw\") pod \"apiserver-7bbb656c7d-h49lh\" (UID: \"5f720e42-33ed-4144-88d6-5fb8c4befac2\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-h49lh" Mar 11 00:58:07 crc kubenswrapper[4744]: I0311 00:58:07.806852 4744 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-service-ca/service-ca-9c57cc56f-kx277" Mar 11 00:58:07 crc kubenswrapper[4744]: I0311 00:58:07.818793 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-wcmgb" Mar 11 00:58:07 crc kubenswrapper[4744]: I0311 00:58:07.853746 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-txbh2"] Mar 11 00:58:07 crc kubenswrapper[4744]: I0311 00:58:07.854173 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-slpdp"] Mar 11 00:58:07 crc kubenswrapper[4744]: I0311 00:58:07.862349 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-fhbhq" Mar 11 00:58:07 crc kubenswrapper[4744]: I0311 00:58:07.868610 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-server-h2stx" Mar 11 00:58:07 crc kubenswrapper[4744]: I0311 00:58:07.870898 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/9b886295-15e1-4478-9ecb-ab71e77b99eb-default-certificate\") pod \"router-default-5444994796-9dcwq\" (UID: \"9b886295-15e1-4478-9ecb-ab71e77b99eb\") " pod="openshift-ingress/router-default-5444994796-9dcwq" Mar 11 00:58:07 crc kubenswrapper[4744]: I0311 00:58:07.870932 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/3834cb5e-8777-40cb-9a72-75c4d6fb5638-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-k6qjd\" (UID: \"3834cb5e-8777-40cb-9a72-75c4d6fb5638\") " pod="openshift-marketplace/marketplace-operator-79b997595-k6qjd" Mar 11 00:58:07 crc kubenswrapper[4744]: 
I0311 00:58:07.870959 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/97b9dce4-e1bd-400e-a4c2-848e9703db45-registry-tls\") pod \"image-registry-697d97f7c8-cg2gg\" (UID: \"97b9dce4-e1bd-400e-a4c2-848e9703db45\") " pod="openshift-image-registry/image-registry-697d97f7c8-cg2gg" Mar 11 00:58:07 crc kubenswrapper[4744]: I0311 00:58:07.870980 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-csqx7\" (UniqueName: \"kubernetes.io/projected/9b886295-15e1-4478-9ecb-ab71e77b99eb-kube-api-access-csqx7\") pod \"router-default-5444994796-9dcwq\" (UID: \"9b886295-15e1-4478-9ecb-ab71e77b99eb\") " pod="openshift-ingress/router-default-5444994796-9dcwq" Mar 11 00:58:07 crc kubenswrapper[4744]: I0311 00:58:07.871092 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/97b9dce4-e1bd-400e-a4c2-848e9703db45-installation-pull-secrets\") pod \"image-registry-697d97f7c8-cg2gg\" (UID: \"97b9dce4-e1bd-400e-a4c2-848e9703db45\") " pod="openshift-image-registry/image-registry-697d97f7c8-cg2gg" Mar 11 00:58:07 crc kubenswrapper[4744]: I0311 00:58:07.871131 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/7886a95d-050e-4d58-baf1-e65310e95e4f-config-volume\") pod \"collect-profiles-29553165-4gpb9\" (UID: \"7886a95d-050e-4d58-baf1-e65310e95e4f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29553165-4gpb9" Mar 11 00:58:07 crc kubenswrapper[4744]: I0311 00:58:07.871149 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/97b9dce4-e1bd-400e-a4c2-848e9703db45-bound-sa-token\") pod 
\"image-registry-697d97f7c8-cg2gg\" (UID: \"97b9dce4-e1bd-400e-a4c2-848e9703db45\") " pod="openshift-image-registry/image-registry-697d97f7c8-cg2gg" Mar 11 00:58:07 crc kubenswrapper[4744]: I0311 00:58:07.871165 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3ca02643-7e55-4bab-be3a-781a8017f11b-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-6k4hw\" (UID: \"3ca02643-7e55-4bab-be3a-781a8017f11b\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-6k4hw" Mar 11 00:58:07 crc kubenswrapper[4744]: I0311 00:58:07.871189 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/97b9dce4-e1bd-400e-a4c2-848e9703db45-registry-certificates\") pod \"image-registry-697d97f7c8-cg2gg\" (UID: \"97b9dce4-e1bd-400e-a4c2-848e9703db45\") " pod="openshift-image-registry/image-registry-697d97f7c8-cg2gg" Mar 11 00:58:07 crc kubenswrapper[4744]: I0311 00:58:07.871214 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/9b886295-15e1-4478-9ecb-ab71e77b99eb-metrics-certs\") pod \"router-default-5444994796-9dcwq\" (UID: \"9b886295-15e1-4478-9ecb-ab71e77b99eb\") " pod="openshift-ingress/router-default-5444994796-9dcwq" Mar 11 00:58:07 crc kubenswrapper[4744]: I0311 00:58:07.871240 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9b886295-15e1-4478-9ecb-ab71e77b99eb-service-ca-bundle\") pod \"router-default-5444994796-9dcwq\" (UID: \"9b886295-15e1-4478-9ecb-ab71e77b99eb\") " pod="openshift-ingress/router-default-5444994796-9dcwq" Mar 11 00:58:07 crc kubenswrapper[4744]: I0311 00:58:07.871262 4744 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lh2ck\" (UniqueName: \"kubernetes.io/projected/97b9dce4-e1bd-400e-a4c2-848e9703db45-kube-api-access-lh2ck\") pod \"image-registry-697d97f7c8-cg2gg\" (UID: \"97b9dce4-e1bd-400e-a4c2-848e9703db45\") " pod="openshift-image-registry/image-registry-697d97f7c8-cg2gg" Mar 11 00:58:07 crc kubenswrapper[4744]: I0311 00:58:07.871290 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/9b886295-15e1-4478-9ecb-ab71e77b99eb-stats-auth\") pod \"router-default-5444994796-9dcwq\" (UID: \"9b886295-15e1-4478-9ecb-ab71e77b99eb\") " pod="openshift-ingress/router-default-5444994796-9dcwq" Mar 11 00:58:07 crc kubenswrapper[4744]: I0311 00:58:07.871335 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-cg2gg\" (UID: \"97b9dce4-e1bd-400e-a4c2-848e9703db45\") " pod="openshift-image-registry/image-registry-697d97f7c8-cg2gg" Mar 11 00:58:07 crc kubenswrapper[4744]: I0311 00:58:07.871351 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/97b9dce4-e1bd-400e-a4c2-848e9703db45-trusted-ca\") pod \"image-registry-697d97f7c8-cg2gg\" (UID: \"97b9dce4-e1bd-400e-a4c2-848e9703db45\") " pod="openshift-image-registry/image-registry-697d97f7c8-cg2gg" Mar 11 00:58:07 crc kubenswrapper[4744]: I0311 00:58:07.871366 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/97b9dce4-e1bd-400e-a4c2-848e9703db45-ca-trust-extracted\") pod \"image-registry-697d97f7c8-cg2gg\" (UID: 
\"97b9dce4-e1bd-400e-a4c2-848e9703db45\") " pod="openshift-image-registry/image-registry-697d97f7c8-cg2gg" Mar 11 00:58:07 crc kubenswrapper[4744]: I0311 00:58:07.871385 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/7886a95d-050e-4d58-baf1-e65310e95e4f-secret-volume\") pod \"collect-profiles-29553165-4gpb9\" (UID: \"7886a95d-050e-4d58-baf1-e65310e95e4f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29553165-4gpb9" Mar 11 00:58:07 crc kubenswrapper[4744]: I0311 00:58:07.871402 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zxn94\" (UniqueName: \"kubernetes.io/projected/3ca02643-7e55-4bab-be3a-781a8017f11b-kube-api-access-zxn94\") pod \"kube-storage-version-migrator-operator-b67b599dd-6k4hw\" (UID: \"3ca02643-7e55-4bab-be3a-781a8017f11b\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-6k4hw" Mar 11 00:58:07 crc kubenswrapper[4744]: I0311 00:58:07.871418 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/3834cb5e-8777-40cb-9a72-75c4d6fb5638-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-k6qjd\" (UID: \"3834cb5e-8777-40cb-9a72-75c4d6fb5638\") " pod="openshift-marketplace/marketplace-operator-79b997595-k6qjd" Mar 11 00:58:07 crc kubenswrapper[4744]: I0311 00:58:07.871435 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3ca02643-7e55-4bab-be3a-781a8017f11b-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-6k4hw\" (UID: \"3ca02643-7e55-4bab-be3a-781a8017f11b\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-6k4hw" 
Mar 11 00:58:07 crc kubenswrapper[4744]: I0311 00:58:07.871452 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rcql9\" (UniqueName: \"kubernetes.io/projected/95897da0-81a7-4656-9787-808f64d7aa9d-kube-api-access-rcql9\") pod \"auto-csr-approver-29553178-56m2v\" (UID: \"95897da0-81a7-4656-9787-808f64d7aa9d\") " pod="openshift-infra/auto-csr-approver-29553178-56m2v" Mar 11 00:58:07 crc kubenswrapper[4744]: I0311 00:58:07.871470 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4gj2s\" (UniqueName: \"kubernetes.io/projected/ea1125a6-231a-4f21-b9d6-8cdb2a51e482-kube-api-access-4gj2s\") pod \"downloads-7954f5f757-7qgjq\" (UID: \"ea1125a6-231a-4f21-b9d6-8cdb2a51e482\") " pod="openshift-console/downloads-7954f5f757-7qgjq" Mar 11 00:58:07 crc kubenswrapper[4744]: I0311 00:58:07.871527 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b97n2\" (UniqueName: \"kubernetes.io/projected/3834cb5e-8777-40cb-9a72-75c4d6fb5638-kube-api-access-b97n2\") pod \"marketplace-operator-79b997595-k6qjd\" (UID: \"3834cb5e-8777-40cb-9a72-75c4d6fb5638\") " pod="openshift-marketplace/marketplace-operator-79b997595-k6qjd" Mar 11 00:58:07 crc kubenswrapper[4744]: I0311 00:58:07.871601 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g8zvc\" (UniqueName: \"kubernetes.io/projected/7886a95d-050e-4d58-baf1-e65310e95e4f-kube-api-access-g8zvc\") pod \"collect-profiles-29553165-4gpb9\" (UID: \"7886a95d-050e-4d58-baf1-e65310e95e4f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29553165-4gpb9" Mar 11 00:58:07 crc kubenswrapper[4744]: E0311 00:58:07.871800 4744 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: 
nodeName:}" failed. No retries permitted until 2026-03-11 00:58:08.371786493 +0000 UTC m=+245.176004098 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-cg2gg" (UID: "97b9dce4-e1bd-400e-a4c2-848e9703db45") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 11 00:58:07 crc kubenswrapper[4744]: I0311 00:58:07.916415 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-9gtcq"] Mar 11 00:58:07 crc kubenswrapper[4744]: W0311 00:58:07.927010 4744 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod48b9bd28_7713_4475_92f0_b7b741e6337e.slice/crio-0d1188be210a0c19b27596bb3880f2f77a48fab61f78841be44b36454c6e0988 WatchSource:0}: Error finding container 0d1188be210a0c19b27596bb3880f2f77a48fab61f78841be44b36454c6e0988: Status 404 returned error can't find the container with id 0d1188be210a0c19b27596bb3880f2f77a48fab61f78841be44b36454c6e0988 Mar 11 00:58:07 crc kubenswrapper[4744]: I0311 00:58:07.928584 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-h49lh" Mar 11 00:58:07 crc kubenswrapper[4744]: I0311 00:58:07.937048 4744 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-config-operator/openshift-config-operator-7777fb866f-hf5rf" Mar 11 00:58:07 crc kubenswrapper[4744]: I0311 00:58:07.939033 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29553176-m7tmh"] Mar 11 00:58:07 crc kubenswrapper[4744]: I0311 00:58:07.974739 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 11 00:58:07 crc kubenswrapper[4744]: I0311 00:58:07.974920 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/97b9dce4-e1bd-400e-a4c2-848e9703db45-trusted-ca\") pod \"image-registry-697d97f7c8-cg2gg\" (UID: \"97b9dce4-e1bd-400e-a4c2-848e9703db45\") " pod="openshift-image-registry/image-registry-697d97f7c8-cg2gg" Mar 11 00:58:07 crc kubenswrapper[4744]: I0311 00:58:07.974965 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/97b9dce4-e1bd-400e-a4c2-848e9703db45-ca-trust-extracted\") pod \"image-registry-697d97f7c8-cg2gg\" (UID: \"97b9dce4-e1bd-400e-a4c2-848e9703db45\") " pod="openshift-image-registry/image-registry-697d97f7c8-cg2gg" Mar 11 00:58:07 crc kubenswrapper[4744]: I0311 00:58:07.974987 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/7886a95d-050e-4d58-baf1-e65310e95e4f-secret-volume\") pod \"collect-profiles-29553165-4gpb9\" (UID: \"7886a95d-050e-4d58-baf1-e65310e95e4f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29553165-4gpb9" Mar 11 00:58:07 crc kubenswrapper[4744]: I0311 00:58:07.975679 4744 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/3834cb5e-8777-40cb-9a72-75c4d6fb5638-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-k6qjd\" (UID: \"3834cb5e-8777-40cb-9a72-75c4d6fb5638\") " pod="openshift-marketplace/marketplace-operator-79b997595-k6qjd" Mar 11 00:58:07 crc kubenswrapper[4744]: I0311 00:58:07.975784 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zxn94\" (UniqueName: \"kubernetes.io/projected/3ca02643-7e55-4bab-be3a-781a8017f11b-kube-api-access-zxn94\") pod \"kube-storage-version-migrator-operator-b67b599dd-6k4hw\" (UID: \"3ca02643-7e55-4bab-be3a-781a8017f11b\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-6k4hw" Mar 11 00:58:07 crc kubenswrapper[4744]: I0311 00:58:07.975804 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3ca02643-7e55-4bab-be3a-781a8017f11b-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-6k4hw\" (UID: \"3ca02643-7e55-4bab-be3a-781a8017f11b\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-6k4hw" Mar 11 00:58:07 crc kubenswrapper[4744]: I0311 00:58:07.975864 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rcql9\" (UniqueName: \"kubernetes.io/projected/95897da0-81a7-4656-9787-808f64d7aa9d-kube-api-access-rcql9\") pod \"auto-csr-approver-29553178-56m2v\" (UID: \"95897da0-81a7-4656-9787-808f64d7aa9d\") " pod="openshift-infra/auto-csr-approver-29553178-56m2v" Mar 11 00:58:07 crc kubenswrapper[4744]: I0311 00:58:07.975902 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4gj2s\" (UniqueName: 
\"kubernetes.io/projected/ea1125a6-231a-4f21-b9d6-8cdb2a51e482-kube-api-access-4gj2s\") pod \"downloads-7954f5f757-7qgjq\" (UID: \"ea1125a6-231a-4f21-b9d6-8cdb2a51e482\") " pod="openshift-console/downloads-7954f5f757-7qgjq" Mar 11 00:58:07 crc kubenswrapper[4744]: I0311 00:58:07.975923 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b97n2\" (UniqueName: \"kubernetes.io/projected/3834cb5e-8777-40cb-9a72-75c4d6fb5638-kube-api-access-b97n2\") pod \"marketplace-operator-79b997595-k6qjd\" (UID: \"3834cb5e-8777-40cb-9a72-75c4d6fb5638\") " pod="openshift-marketplace/marketplace-operator-79b997595-k6qjd" Mar 11 00:58:07 crc kubenswrapper[4744]: I0311 00:58:07.975947 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mzxk9\" (UniqueName: \"kubernetes.io/projected/969d9403-4eed-4c8b-b790-cae357bc60eb-kube-api-access-mzxk9\") pod \"ingress-canary-6jqrc\" (UID: \"969d9403-4eed-4c8b-b790-cae357bc60eb\") " pod="openshift-ingress-canary/ingress-canary-6jqrc" Mar 11 00:58:07 crc kubenswrapper[4744]: I0311 00:58:07.975979 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g8zvc\" (UniqueName: \"kubernetes.io/projected/7886a95d-050e-4d58-baf1-e65310e95e4f-kube-api-access-g8zvc\") pod \"collect-profiles-29553165-4gpb9\" (UID: \"7886a95d-050e-4d58-baf1-e65310e95e4f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29553165-4gpb9" Mar 11 00:58:07 crc kubenswrapper[4744]: I0311 00:58:07.976098 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/c14e937f-6acb-4f36-9432-eb77464ce9c9-registration-dir\") pod \"csi-hostpathplugin-4rmzp\" (UID: \"c14e937f-6acb-4f36-9432-eb77464ce9c9\") " pod="hostpath-provisioner/csi-hostpathplugin-4rmzp" Mar 11 00:58:07 crc kubenswrapper[4744]: I0311 00:58:07.976133 4744 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/9b886295-15e1-4478-9ecb-ab71e77b99eb-default-certificate\") pod \"router-default-5444994796-9dcwq\" (UID: \"9b886295-15e1-4478-9ecb-ab71e77b99eb\") " pod="openshift-ingress/router-default-5444994796-9dcwq" Mar 11 00:58:07 crc kubenswrapper[4744]: I0311 00:58:07.976155 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/97b9dce4-e1bd-400e-a4c2-848e9703db45-registry-tls\") pod \"image-registry-697d97f7c8-cg2gg\" (UID: \"97b9dce4-e1bd-400e-a4c2-848e9703db45\") " pod="openshift-image-registry/image-registry-697d97f7c8-cg2gg" Mar 11 00:58:07 crc kubenswrapper[4744]: I0311 00:58:07.976178 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/3834cb5e-8777-40cb-9a72-75c4d6fb5638-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-k6qjd\" (UID: \"3834cb5e-8777-40cb-9a72-75c4d6fb5638\") " pod="openshift-marketplace/marketplace-operator-79b997595-k6qjd" Mar 11 00:58:07 crc kubenswrapper[4744]: I0311 00:58:07.976237 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/4e1cff98-4344-4952-8d6c-d4f5c8d58628-config-volume\") pod \"dns-default-rq8sg\" (UID: \"4e1cff98-4344-4952-8d6c-d4f5c8d58628\") " pod="openshift-dns/dns-default-rq8sg" Mar 11 00:58:07 crc kubenswrapper[4744]: I0311 00:58:07.976258 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/c14e937f-6acb-4f36-9432-eb77464ce9c9-socket-dir\") pod \"csi-hostpathplugin-4rmzp\" (UID: \"c14e937f-6acb-4f36-9432-eb77464ce9c9\") " pod="hostpath-provisioner/csi-hostpathplugin-4rmzp" Mar 11 00:58:07 crc 
kubenswrapper[4744]: I0311 00:58:07.976283 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6xsq9\" (UniqueName: \"kubernetes.io/projected/1ab34621-2907-423a-81ef-36fb8377874d-kube-api-access-6xsq9\") pod \"control-plane-machine-set-operator-78cbb6b69f-f8rpg\" (UID: \"1ab34621-2907-423a-81ef-36fb8377874d\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-f8rpg" Mar 11 00:58:07 crc kubenswrapper[4744]: I0311 00:58:07.976315 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-csqx7\" (UniqueName: \"kubernetes.io/projected/9b886295-15e1-4478-9ecb-ab71e77b99eb-kube-api-access-csqx7\") pod \"router-default-5444994796-9dcwq\" (UID: \"9b886295-15e1-4478-9ecb-ab71e77b99eb\") " pod="openshift-ingress/router-default-5444994796-9dcwq" Mar 11 00:58:07 crc kubenswrapper[4744]: I0311 00:58:07.976334 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/c14e937f-6acb-4f36-9432-eb77464ce9c9-csi-data-dir\") pod \"csi-hostpathplugin-4rmzp\" (UID: \"c14e937f-6acb-4f36-9432-eb77464ce9c9\") " pod="hostpath-provisioner/csi-hostpathplugin-4rmzp" Mar 11 00:58:07 crc kubenswrapper[4744]: I0311 00:58:07.976426 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/97b9dce4-e1bd-400e-a4c2-848e9703db45-installation-pull-secrets\") pod \"image-registry-697d97f7c8-cg2gg\" (UID: \"97b9dce4-e1bd-400e-a4c2-848e9703db45\") " pod="openshift-image-registry/image-registry-697d97f7c8-cg2gg" Mar 11 00:58:07 crc kubenswrapper[4744]: I0311 00:58:07.976469 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/7886a95d-050e-4d58-baf1-e65310e95e4f-config-volume\") pod 
\"collect-profiles-29553165-4gpb9\" (UID: \"7886a95d-050e-4d58-baf1-e65310e95e4f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29553165-4gpb9" Mar 11 00:58:07 crc kubenswrapper[4744]: I0311 00:58:07.976523 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/4e1cff98-4344-4952-8d6c-d4f5c8d58628-metrics-tls\") pod \"dns-default-rq8sg\" (UID: \"4e1cff98-4344-4952-8d6c-d4f5c8d58628\") " pod="openshift-dns/dns-default-rq8sg" Mar 11 00:58:07 crc kubenswrapper[4744]: I0311 00:58:07.976560 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/97b9dce4-e1bd-400e-a4c2-848e9703db45-bound-sa-token\") pod \"image-registry-697d97f7c8-cg2gg\" (UID: \"97b9dce4-e1bd-400e-a4c2-848e9703db45\") " pod="openshift-image-registry/image-registry-697d97f7c8-cg2gg" Mar 11 00:58:07 crc kubenswrapper[4744]: I0311 00:58:07.976575 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3ca02643-7e55-4bab-be3a-781a8017f11b-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-6k4hw\" (UID: \"3ca02643-7e55-4bab-be3a-781a8017f11b\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-6k4hw" Mar 11 00:58:07 crc kubenswrapper[4744]: I0311 00:58:07.976607 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/1ab34621-2907-423a-81ef-36fb8377874d-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-f8rpg\" (UID: \"1ab34621-2907-423a-81ef-36fb8377874d\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-f8rpg" Mar 11 00:58:07 crc kubenswrapper[4744]: I0311 00:58:07.976644 
4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/97b9dce4-e1bd-400e-a4c2-848e9703db45-registry-certificates\") pod \"image-registry-697d97f7c8-cg2gg\" (UID: \"97b9dce4-e1bd-400e-a4c2-848e9703db45\") " pod="openshift-image-registry/image-registry-697d97f7c8-cg2gg" Mar 11 00:58:07 crc kubenswrapper[4744]: I0311 00:58:07.976712 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hpgnm\" (UniqueName: \"kubernetes.io/projected/c14e937f-6acb-4f36-9432-eb77464ce9c9-kube-api-access-hpgnm\") pod \"csi-hostpathplugin-4rmzp\" (UID: \"c14e937f-6acb-4f36-9432-eb77464ce9c9\") " pod="hostpath-provisioner/csi-hostpathplugin-4rmzp" Mar 11 00:58:07 crc kubenswrapper[4744]: I0311 00:58:07.976750 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/9b886295-15e1-4478-9ecb-ab71e77b99eb-metrics-certs\") pod \"router-default-5444994796-9dcwq\" (UID: \"9b886295-15e1-4478-9ecb-ab71e77b99eb\") " pod="openshift-ingress/router-default-5444994796-9dcwq" Mar 11 00:58:07 crc kubenswrapper[4744]: I0311 00:58:07.976789 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9b886295-15e1-4478-9ecb-ab71e77b99eb-service-ca-bundle\") pod \"router-default-5444994796-9dcwq\" (UID: \"9b886295-15e1-4478-9ecb-ab71e77b99eb\") " pod="openshift-ingress/router-default-5444994796-9dcwq" Mar 11 00:58:07 crc kubenswrapper[4744]: I0311 00:58:07.976836 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/969d9403-4eed-4c8b-b790-cae357bc60eb-cert\") pod \"ingress-canary-6jqrc\" (UID: \"969d9403-4eed-4c8b-b790-cae357bc60eb\") " pod="openshift-ingress-canary/ingress-canary-6jqrc" Mar 11 00:58:07 crc 
kubenswrapper[4744]: I0311 00:58:07.976899 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7gcvk\" (UniqueName: \"kubernetes.io/projected/4e1cff98-4344-4952-8d6c-d4f5c8d58628-kube-api-access-7gcvk\") pod \"dns-default-rq8sg\" (UID: \"4e1cff98-4344-4952-8d6c-d4f5c8d58628\") " pod="openshift-dns/dns-default-rq8sg" Mar 11 00:58:07 crc kubenswrapper[4744]: I0311 00:58:07.976962 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lh2ck\" (UniqueName: \"kubernetes.io/projected/97b9dce4-e1bd-400e-a4c2-848e9703db45-kube-api-access-lh2ck\") pod \"image-registry-697d97f7c8-cg2gg\" (UID: \"97b9dce4-e1bd-400e-a4c2-848e9703db45\") " pod="openshift-image-registry/image-registry-697d97f7c8-cg2gg" Mar 11 00:58:07 crc kubenswrapper[4744]: I0311 00:58:07.976980 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/9b886295-15e1-4478-9ecb-ab71e77b99eb-stats-auth\") pod \"router-default-5444994796-9dcwq\" (UID: \"9b886295-15e1-4478-9ecb-ab71e77b99eb\") " pod="openshift-ingress/router-default-5444994796-9dcwq" Mar 11 00:58:07 crc kubenswrapper[4744]: I0311 00:58:07.977103 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/c14e937f-6acb-4f36-9432-eb77464ce9c9-mountpoint-dir\") pod \"csi-hostpathplugin-4rmzp\" (UID: \"c14e937f-6acb-4f36-9432-eb77464ce9c9\") " pod="hostpath-provisioner/csi-hostpathplugin-4rmzp" Mar 11 00:58:07 crc kubenswrapper[4744]: I0311 00:58:07.977207 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/c14e937f-6acb-4f36-9432-eb77464ce9c9-plugins-dir\") pod \"csi-hostpathplugin-4rmzp\" (UID: \"c14e937f-6acb-4f36-9432-eb77464ce9c9\") " 
pod="hostpath-provisioner/csi-hostpathplugin-4rmzp" Mar 11 00:58:07 crc kubenswrapper[4744]: E0311 00:58:07.978502 4744 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-11 00:58:08.478471088 +0000 UTC m=+245.282688693 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 11 00:58:07 crc kubenswrapper[4744]: I0311 00:58:07.979635 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/97b9dce4-e1bd-400e-a4c2-848e9703db45-trusted-ca\") pod \"image-registry-697d97f7c8-cg2gg\" (UID: \"97b9dce4-e1bd-400e-a4c2-848e9703db45\") " pod="openshift-image-registry/image-registry-697d97f7c8-cg2gg" Mar 11 00:58:07 crc kubenswrapper[4744]: I0311 00:58:07.980851 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/97b9dce4-e1bd-400e-a4c2-848e9703db45-ca-trust-extracted\") pod \"image-registry-697d97f7c8-cg2gg\" (UID: \"97b9dce4-e1bd-400e-a4c2-848e9703db45\") " pod="openshift-image-registry/image-registry-697d97f7c8-cg2gg" Mar 11 00:58:07 crc kubenswrapper[4744]: I0311 00:58:07.981098 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3ca02643-7e55-4bab-be3a-781a8017f11b-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-6k4hw\" 
(UID: \"3ca02643-7e55-4bab-be3a-781a8017f11b\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-6k4hw" Mar 11 00:58:07 crc kubenswrapper[4744]: I0311 00:58:07.983402 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/97b9dce4-e1bd-400e-a4c2-848e9703db45-registry-certificates\") pod \"image-registry-697d97f7c8-cg2gg\" (UID: \"97b9dce4-e1bd-400e-a4c2-848e9703db45\") " pod="openshift-image-registry/image-registry-697d97f7c8-cg2gg" Mar 11 00:58:07 crc kubenswrapper[4744]: I0311 00:58:07.985726 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/7886a95d-050e-4d58-baf1-e65310e95e4f-config-volume\") pod \"collect-profiles-29553165-4gpb9\" (UID: \"7886a95d-050e-4d58-baf1-e65310e95e4f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29553165-4gpb9" Mar 11 00:58:07 crc kubenswrapper[4744]: I0311 00:58:07.990142 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9b886295-15e1-4478-9ecb-ab71e77b99eb-service-ca-bundle\") pod \"router-default-5444994796-9dcwq\" (UID: \"9b886295-15e1-4478-9ecb-ab71e77b99eb\") " pod="openshift-ingress/router-default-5444994796-9dcwq" Mar 11 00:58:07 crc kubenswrapper[4744]: I0311 00:58:07.990315 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-bbjh8" Mar 11 00:58:07 crc kubenswrapper[4744]: I0311 00:58:07.993239 4744 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-v4b66" Mar 11 00:58:07 crc kubenswrapper[4744]: I0311 00:58:07.995264 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/3834cb5e-8777-40cb-9a72-75c4d6fb5638-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-k6qjd\" (UID: \"3834cb5e-8777-40cb-9a72-75c4d6fb5638\") " pod="openshift-marketplace/marketplace-operator-79b997595-k6qjd" Mar 11 00:58:07 crc kubenswrapper[4744]: I0311 00:58:07.996947 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3ca02643-7e55-4bab-be3a-781a8017f11b-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-6k4hw\" (UID: \"3ca02643-7e55-4bab-be3a-781a8017f11b\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-6k4hw" Mar 11 00:58:07 crc kubenswrapper[4744]: I0311 00:58:07.997727 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/9b886295-15e1-4478-9ecb-ab71e77b99eb-default-certificate\") pod \"router-default-5444994796-9dcwq\" (UID: \"9b886295-15e1-4478-9ecb-ab71e77b99eb\") " pod="openshift-ingress/router-default-5444994796-9dcwq" Mar 11 00:58:07 crc kubenswrapper[4744]: I0311 00:58:07.997733 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/7886a95d-050e-4d58-baf1-e65310e95e4f-secret-volume\") pod \"collect-profiles-29553165-4gpb9\" (UID: \"7886a95d-050e-4d58-baf1-e65310e95e4f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29553165-4gpb9" Mar 11 00:58:07 crc kubenswrapper[4744]: I0311 00:58:07.998021 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"stats-auth\" (UniqueName: 
\"kubernetes.io/secret/9b886295-15e1-4478-9ecb-ab71e77b99eb-stats-auth\") pod \"router-default-5444994796-9dcwq\" (UID: \"9b886295-15e1-4478-9ecb-ab71e77b99eb\") " pod="openshift-ingress/router-default-5444994796-9dcwq" Mar 11 00:58:07 crc kubenswrapper[4744]: I0311 00:58:07.998430 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/97b9dce4-e1bd-400e-a4c2-848e9703db45-registry-tls\") pod \"image-registry-697d97f7c8-cg2gg\" (UID: \"97b9dce4-e1bd-400e-a4c2-848e9703db45\") " pod="openshift-image-registry/image-registry-697d97f7c8-cg2gg" Mar 11 00:58:08 crc kubenswrapper[4744]: I0311 00:58:08.007105 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/9b886295-15e1-4478-9ecb-ab71e77b99eb-metrics-certs\") pod \"router-default-5444994796-9dcwq\" (UID: \"9b886295-15e1-4478-9ecb-ab71e77b99eb\") " pod="openshift-ingress/router-default-5444994796-9dcwq" Mar 11 00:58:08 crc kubenswrapper[4744]: I0311 00:58:08.007708 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/3834cb5e-8777-40cb-9a72-75c4d6fb5638-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-k6qjd\" (UID: \"3834cb5e-8777-40cb-9a72-75c4d6fb5638\") " pod="openshift-marketplace/marketplace-operator-79b997595-k6qjd" Mar 11 00:58:08 crc kubenswrapper[4744]: I0311 00:58:08.008929 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/97b9dce4-e1bd-400e-a4c2-848e9703db45-installation-pull-secrets\") pod \"image-registry-697d97f7c8-cg2gg\" (UID: \"97b9dce4-e1bd-400e-a4c2-848e9703db45\") " pod="openshift-image-registry/image-registry-697d97f7c8-cg2gg" Mar 11 00:58:08 crc kubenswrapper[4744]: I0311 00:58:08.015803 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/97b9dce4-e1bd-400e-a4c2-848e9703db45-bound-sa-token\") pod \"image-registry-697d97f7c8-cg2gg\" (UID: \"97b9dce4-e1bd-400e-a4c2-848e9703db45\") " pod="openshift-image-registry/image-registry-697d97f7c8-cg2gg" Mar 11 00:58:08 crc kubenswrapper[4744]: I0311 00:58:08.032141 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rcql9\" (UniqueName: \"kubernetes.io/projected/95897da0-81a7-4656-9787-808f64d7aa9d-kube-api-access-rcql9\") pod \"auto-csr-approver-29553178-56m2v\" (UID: \"95897da0-81a7-4656-9787-808f64d7aa9d\") " pod="openshift-infra/auto-csr-approver-29553178-56m2v" Mar 11 00:58:08 crc kubenswrapper[4744]: I0311 00:58:08.037395 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-wm2rt"] Mar 11 00:58:08 crc kubenswrapper[4744]: I0311 00:58:08.046416 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4gj2s\" (UniqueName: \"kubernetes.io/projected/ea1125a6-231a-4f21-b9d6-8cdb2a51e482-kube-api-access-4gj2s\") pod \"downloads-7954f5f757-7qgjq\" (UID: \"ea1125a6-231a-4f21-b9d6-8cdb2a51e482\") " pod="openshift-console/downloads-7954f5f757-7qgjq" Mar 11 00:58:08 crc kubenswrapper[4744]: I0311 00:58:08.051983 4744 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/downloads-7954f5f757-7qgjq" Mar 11 00:58:08 crc kubenswrapper[4744]: I0311 00:58:08.066094 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b97n2\" (UniqueName: \"kubernetes.io/projected/3834cb5e-8777-40cb-9a72-75c4d6fb5638-kube-api-access-b97n2\") pod \"marketplace-operator-79b997595-k6qjd\" (UID: \"3834cb5e-8777-40cb-9a72-75c4d6fb5638\") " pod="openshift-marketplace/marketplace-operator-79b997595-k6qjd" Mar 11 00:58:08 crc kubenswrapper[4744]: I0311 00:58:08.079196 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hpgnm\" (UniqueName: \"kubernetes.io/projected/c14e937f-6acb-4f36-9432-eb77464ce9c9-kube-api-access-hpgnm\") pod \"csi-hostpathplugin-4rmzp\" (UID: \"c14e937f-6acb-4f36-9432-eb77464ce9c9\") " pod="hostpath-provisioner/csi-hostpathplugin-4rmzp" Mar 11 00:58:08 crc kubenswrapper[4744]: I0311 00:58:08.079255 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/969d9403-4eed-4c8b-b790-cae357bc60eb-cert\") pod \"ingress-canary-6jqrc\" (UID: \"969d9403-4eed-4c8b-b790-cae357bc60eb\") " pod="openshift-ingress-canary/ingress-canary-6jqrc" Mar 11 00:58:08 crc kubenswrapper[4744]: I0311 00:58:08.079281 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7gcvk\" (UniqueName: \"kubernetes.io/projected/4e1cff98-4344-4952-8d6c-d4f5c8d58628-kube-api-access-7gcvk\") pod \"dns-default-rq8sg\" (UID: \"4e1cff98-4344-4952-8d6c-d4f5c8d58628\") " pod="openshift-dns/dns-default-rq8sg" Mar 11 00:58:08 crc kubenswrapper[4744]: I0311 00:58:08.079327 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/c14e937f-6acb-4f36-9432-eb77464ce9c9-mountpoint-dir\") pod \"csi-hostpathplugin-4rmzp\" (UID: \"c14e937f-6acb-4f36-9432-eb77464ce9c9\") " 
pod="hostpath-provisioner/csi-hostpathplugin-4rmzp" Mar 11 00:58:08 crc kubenswrapper[4744]: I0311 00:58:08.079354 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/c14e937f-6acb-4f36-9432-eb77464ce9c9-plugins-dir\") pod \"csi-hostpathplugin-4rmzp\" (UID: \"c14e937f-6acb-4f36-9432-eb77464ce9c9\") " pod="hostpath-provisioner/csi-hostpathplugin-4rmzp" Mar 11 00:58:08 crc kubenswrapper[4744]: I0311 00:58:08.079388 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-cg2gg\" (UID: \"97b9dce4-e1bd-400e-a4c2-848e9703db45\") " pod="openshift-image-registry/image-registry-697d97f7c8-cg2gg" Mar 11 00:58:08 crc kubenswrapper[4744]: I0311 00:58:08.079420 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mzxk9\" (UniqueName: \"kubernetes.io/projected/969d9403-4eed-4c8b-b790-cae357bc60eb-kube-api-access-mzxk9\") pod \"ingress-canary-6jqrc\" (UID: \"969d9403-4eed-4c8b-b790-cae357bc60eb\") " pod="openshift-ingress-canary/ingress-canary-6jqrc" Mar 11 00:58:08 crc kubenswrapper[4744]: I0311 00:58:08.079451 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/c14e937f-6acb-4f36-9432-eb77464ce9c9-registration-dir\") pod \"csi-hostpathplugin-4rmzp\" (UID: \"c14e937f-6acb-4f36-9432-eb77464ce9c9\") " pod="hostpath-provisioner/csi-hostpathplugin-4rmzp" Mar 11 00:58:08 crc kubenswrapper[4744]: I0311 00:58:08.079478 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/4e1cff98-4344-4952-8d6c-d4f5c8d58628-config-volume\") pod \"dns-default-rq8sg\" (UID: 
\"4e1cff98-4344-4952-8d6c-d4f5c8d58628\") " pod="openshift-dns/dns-default-rq8sg" Mar 11 00:58:08 crc kubenswrapper[4744]: I0311 00:58:08.079496 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6xsq9\" (UniqueName: \"kubernetes.io/projected/1ab34621-2907-423a-81ef-36fb8377874d-kube-api-access-6xsq9\") pod \"control-plane-machine-set-operator-78cbb6b69f-f8rpg\" (UID: \"1ab34621-2907-423a-81ef-36fb8377874d\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-f8rpg" Mar 11 00:58:08 crc kubenswrapper[4744]: W0311 00:58:08.079721 4744 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podca7c15de_4f71_45fd_b1b0_4c451ce9724e.slice/crio-b25ea2bb363976ba8798771c80ab8dbcfde9b5c7ea13bad7a646374a9cb85370 WatchSource:0}: Error finding container b25ea2bb363976ba8798771c80ab8dbcfde9b5c7ea13bad7a646374a9cb85370: Status 404 returned error can't find the container with id b25ea2bb363976ba8798771c80ab8dbcfde9b5c7ea13bad7a646374a9cb85370 Mar 11 00:58:08 crc kubenswrapper[4744]: I0311 00:58:08.080066 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/c14e937f-6acb-4f36-9432-eb77464ce9c9-plugins-dir\") pod \"csi-hostpathplugin-4rmzp\" (UID: \"c14e937f-6acb-4f36-9432-eb77464ce9c9\") " pod="hostpath-provisioner/csi-hostpathplugin-4rmzp" Mar 11 00:58:08 crc kubenswrapper[4744]: I0311 00:58:08.080460 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/c14e937f-6acb-4f36-9432-eb77464ce9c9-socket-dir\") pod \"csi-hostpathplugin-4rmzp\" (UID: \"c14e937f-6acb-4f36-9432-eb77464ce9c9\") " pod="hostpath-provisioner/csi-hostpathplugin-4rmzp" Mar 11 00:58:08 crc kubenswrapper[4744]: I0311 00:58:08.080497 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/c14e937f-6acb-4f36-9432-eb77464ce9c9-csi-data-dir\") pod \"csi-hostpathplugin-4rmzp\" (UID: \"c14e937f-6acb-4f36-9432-eb77464ce9c9\") " pod="hostpath-provisioner/csi-hostpathplugin-4rmzp" Mar 11 00:58:08 crc kubenswrapper[4744]: I0311 00:58:08.080547 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/4e1cff98-4344-4952-8d6c-d4f5c8d58628-metrics-tls\") pod \"dns-default-rq8sg\" (UID: \"4e1cff98-4344-4952-8d6c-d4f5c8d58628\") " pod="openshift-dns/dns-default-rq8sg" Mar 11 00:58:08 crc kubenswrapper[4744]: I0311 00:58:08.080568 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/1ab34621-2907-423a-81ef-36fb8377874d-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-f8rpg\" (UID: \"1ab34621-2907-423a-81ef-36fb8377874d\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-f8rpg" Mar 11 00:58:08 crc kubenswrapper[4744]: I0311 00:58:08.082918 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/c14e937f-6acb-4f36-9432-eb77464ce9c9-socket-dir\") pod \"csi-hostpathplugin-4rmzp\" (UID: \"c14e937f-6acb-4f36-9432-eb77464ce9c9\") " pod="hostpath-provisioner/csi-hostpathplugin-4rmzp" Mar 11 00:58:08 crc kubenswrapper[4744]: I0311 00:58:08.083141 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/c14e937f-6acb-4f36-9432-eb77464ce9c9-csi-data-dir\") pod \"csi-hostpathplugin-4rmzp\" (UID: \"c14e937f-6acb-4f36-9432-eb77464ce9c9\") " pod="hostpath-provisioner/csi-hostpathplugin-4rmzp" Mar 11 00:58:08 crc kubenswrapper[4744]: I0311 00:58:08.083260 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"config-volume\" (UniqueName: \"kubernetes.io/configmap/4e1cff98-4344-4952-8d6c-d4f5c8d58628-config-volume\") pod \"dns-default-rq8sg\" (UID: \"4e1cff98-4344-4952-8d6c-d4f5c8d58628\") " pod="openshift-dns/dns-default-rq8sg" Mar 11 00:58:08 crc kubenswrapper[4744]: I0311 00:58:08.083373 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/969d9403-4eed-4c8b-b790-cae357bc60eb-cert\") pod \"ingress-canary-6jqrc\" (UID: \"969d9403-4eed-4c8b-b790-cae357bc60eb\") " pod="openshift-ingress-canary/ingress-canary-6jqrc" Mar 11 00:58:08 crc kubenswrapper[4744]: I0311 00:58:08.083542 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/c14e937f-6acb-4f36-9432-eb77464ce9c9-mountpoint-dir\") pod \"csi-hostpathplugin-4rmzp\" (UID: \"c14e937f-6acb-4f36-9432-eb77464ce9c9\") " pod="hostpath-provisioner/csi-hostpathplugin-4rmzp" Mar 11 00:58:08 crc kubenswrapper[4744]: E0311 00:58:08.083556 4744 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-11 00:58:08.583541433 +0000 UTC m=+245.387759038 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-cg2gg" (UID: "97b9dce4-e1bd-400e-a4c2-848e9703db45") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 11 00:58:08 crc kubenswrapper[4744]: I0311 00:58:08.083605 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/c14e937f-6acb-4f36-9432-eb77464ce9c9-registration-dir\") pod \"csi-hostpathplugin-4rmzp\" (UID: \"c14e937f-6acb-4f36-9432-eb77464ce9c9\") " pod="hostpath-provisioner/csi-hostpathplugin-4rmzp" Mar 11 00:58:08 crc kubenswrapper[4744]: I0311 00:58:08.084863 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/1ab34621-2907-423a-81ef-36fb8377874d-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-f8rpg\" (UID: \"1ab34621-2907-423a-81ef-36fb8377874d\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-f8rpg" Mar 11 00:58:08 crc kubenswrapper[4744]: I0311 00:58:08.087890 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/4e1cff98-4344-4952-8d6c-d4f5c8d58628-metrics-tls\") pod \"dns-default-rq8sg\" (UID: \"4e1cff98-4344-4952-8d6c-d4f5c8d58628\") " pod="openshift-dns/dns-default-rq8sg" Mar 11 00:58:08 crc kubenswrapper[4744]: I0311 00:58:08.088166 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g8zvc\" (UniqueName: \"kubernetes.io/projected/7886a95d-050e-4d58-baf1-e65310e95e4f-kube-api-access-g8zvc\") pod \"collect-profiles-29553165-4gpb9\" (UID: 
\"7886a95d-050e-4d58-baf1-e65310e95e4f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29553165-4gpb9" Mar 11 00:58:08 crc kubenswrapper[4744]: I0311 00:58:08.108606 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zxn94\" (UniqueName: \"kubernetes.io/projected/3ca02643-7e55-4bab-be3a-781a8017f11b-kube-api-access-zxn94\") pod \"kube-storage-version-migrator-operator-b67b599dd-6k4hw\" (UID: \"3ca02643-7e55-4bab-be3a-781a8017f11b\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-6k4hw" Mar 11 00:58:08 crc kubenswrapper[4744]: I0311 00:58:08.112170 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29553165-4gpb9" Mar 11 00:58:08 crc kubenswrapper[4744]: I0311 00:58:08.128152 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29553178-56m2v" Mar 11 00:58:08 crc kubenswrapper[4744]: I0311 00:58:08.128627 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lh2ck\" (UniqueName: \"kubernetes.io/projected/97b9dce4-e1bd-400e-a4c2-848e9703db45-kube-api-access-lh2ck\") pod \"image-registry-697d97f7c8-cg2gg\" (UID: \"97b9dce4-e1bd-400e-a4c2-848e9703db45\") " pod="openshift-image-registry/image-registry-697d97f7c8-cg2gg" Mar 11 00:58:08 crc kubenswrapper[4744]: I0311 00:58:08.181262 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 11 00:58:08 crc kubenswrapper[4744]: E0311 00:58:08.181702 4744 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-11 00:58:08.681684055 +0000 UTC m=+245.485901650 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 11 00:58:08 crc kubenswrapper[4744]: I0311 00:58:08.184342 4744 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 11 00:58:08 crc kubenswrapper[4744]: I0311 00:58:08.202823 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-csqx7\" (UniqueName: \"kubernetes.io/projected/9b886295-15e1-4478-9ecb-ab71e77b99eb-kube-api-access-csqx7\") pod \"router-default-5444994796-9dcwq\" (UID: \"9b886295-15e1-4478-9ecb-ab71e77b99eb\") " pod="openshift-ingress/router-default-5444994796-9dcwq" Mar 11 00:58:08 crc kubenswrapper[4744]: I0311 00:58:08.207274 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hpgnm\" (UniqueName: \"kubernetes.io/projected/c14e937f-6acb-4f36-9432-eb77464ce9c9-kube-api-access-hpgnm\") pod \"csi-hostpathplugin-4rmzp\" (UID: \"c14e937f-6acb-4f36-9432-eb77464ce9c9\") " pod="hostpath-provisioner/csi-hostpathplugin-4rmzp" Mar 11 00:58:08 crc kubenswrapper[4744]: I0311 00:58:08.212492 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6xsq9\" (UniqueName: \"kubernetes.io/projected/1ab34621-2907-423a-81ef-36fb8377874d-kube-api-access-6xsq9\") pod 
\"control-plane-machine-set-operator-78cbb6b69f-f8rpg\" (UID: \"1ab34621-2907-423a-81ef-36fb8377874d\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-f8rpg" Mar 11 00:58:08 crc kubenswrapper[4744]: I0311 00:58:08.224873 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7gcvk\" (UniqueName: \"kubernetes.io/projected/4e1cff98-4344-4952-8d6c-d4f5c8d58628-kube-api-access-7gcvk\") pod \"dns-default-rq8sg\" (UID: \"4e1cff98-4344-4952-8d6c-d4f5c8d58628\") " pod="openshift-dns/dns-default-rq8sg" Mar 11 00:58:08 crc kubenswrapper[4744]: I0311 00:58:08.250435 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mzxk9\" (UniqueName: \"kubernetes.io/projected/969d9403-4eed-4c8b-b790-cae357bc60eb-kube-api-access-mzxk9\") pod \"ingress-canary-6jqrc\" (UID: \"969d9403-4eed-4c8b-b790-cae357bc60eb\") " pod="openshift-ingress-canary/ingress-canary-6jqrc" Mar 11 00:58:08 crc kubenswrapper[4744]: I0311 00:58:08.278967 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-k6qjd" Mar 11 00:58:08 crc kubenswrapper[4744]: I0311 00:58:08.282645 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-cg2gg\" (UID: \"97b9dce4-e1bd-400e-a4c2-848e9703db45\") " pod="openshift-image-registry/image-registry-697d97f7c8-cg2gg" Mar 11 00:58:08 crc kubenswrapper[4744]: E0311 00:58:08.283024 4744 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-11 00:58:08.783005315 +0000 UTC m=+245.587222920 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-cg2gg" (UID: "97b9dce4-e1bd-400e-a4c2-848e9703db45") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 11 00:58:08 crc kubenswrapper[4744]: I0311 00:58:08.312209 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-6k4hw" Mar 11 00:58:08 crc kubenswrapper[4744]: I0311 00:58:08.320165 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-5444994796-9dcwq" Mar 11 00:58:08 crc kubenswrapper[4744]: I0311 00:58:08.383729 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 11 00:58:08 crc kubenswrapper[4744]: E0311 00:58:08.384132 4744 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-11 00:58:08.884105358 +0000 UTC m=+245.688322973 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 11 00:58:08 crc kubenswrapper[4744]: I0311 00:58:08.469146 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-286h7"] Mar 11 00:58:08 crc kubenswrapper[4744]: I0311 00:58:08.470669 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-n8wns"] Mar 11 00:58:08 crc kubenswrapper[4744]: I0311 00:58:08.473850 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-58897d9998-ptbbd"] Mar 11 00:58:08 crc kubenswrapper[4744]: I0311 00:58:08.476806 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-f8rpg" Mar 11 00:58:08 crc kubenswrapper[4744]: I0311 00:58:08.484557 4744 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/dns-default-rq8sg" Mar 11 00:58:08 crc kubenswrapper[4744]: I0311 00:58:08.485920 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-cg2gg\" (UID: \"97b9dce4-e1bd-400e-a4c2-848e9703db45\") " pod="openshift-image-registry/image-registry-697d97f7c8-cg2gg" Mar 11 00:58:08 crc kubenswrapper[4744]: E0311 00:58:08.486263 4744 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-11 00:58:08.986247323 +0000 UTC m=+245.790464928 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-cg2gg" (UID: "97b9dce4-e1bd-400e-a4c2-848e9703db45") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 11 00:58:08 crc kubenswrapper[4744]: I0311 00:58:08.490287 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="hostpath-provisioner/csi-hostpathplugin-4rmzp" Mar 11 00:58:08 crc kubenswrapper[4744]: I0311 00:58:08.518854 4744 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-canary/ingress-canary-6jqrc" Mar 11 00:58:08 crc kubenswrapper[4744]: I0311 00:58:08.525527 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-btvmr" event={"ID":"7bd67414-ec33-4963-b5d8-ac374bd28a6a","Type":"ContainerStarted","Data":"2670a4b9a6c1343a3b449ded9333d363072106db7610d0fa413e7849511d2023"} Mar 11 00:58:08 crc kubenswrapper[4744]: I0311 00:58:08.527791 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29553176-m7tmh" event={"ID":"7c454621-190e-4962-abed-72c0ec0613de","Type":"ContainerStarted","Data":"48a9bc05b7bcbb0d30ebd3c256d0b20a51f77b6070ef776ebfcca242dcc7d7c1"} Mar 11 00:58:08 crc kubenswrapper[4744]: I0311 00:58:08.530540 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-4jlsm" event={"ID":"47244264-b9e7-4f86-85d7-5406ed8d8833","Type":"ContainerStarted","Data":"4600ce4865eb073b678c22439ddbdc822e88e50575e8f79591699897e81a9a76"} Mar 11 00:58:08 crc kubenswrapper[4744]: I0311 00:58:08.531938 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-xfdgm" event={"ID":"51238090-1fbe-446b-a63d-5ec9c3137c61","Type":"ContainerStarted","Data":"d1a912e88f338d10f2b12f00f65563a078632f251d49f40bd17001ab2944e4c5"} Mar 11 00:58:08 crc kubenswrapper[4744]: I0311 00:58:08.533863 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-server-h2stx" event={"ID":"ca7c15de-4f71-45fd-b1b0-4c451ce9724e","Type":"ContainerStarted","Data":"b25ea2bb363976ba8798771c80ab8dbcfde9b5c7ea13bad7a646374a9cb85370"} Mar 11 00:58:08 crc kubenswrapper[4744]: I0311 00:58:08.536912 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-4tcts" 
event={"ID":"74a61e39-2210-4bb1-96c9-509eda04c4c7","Type":"ContainerStarted","Data":"509584436b00c9a81ac31a7f4f18c1243a6264d7856713053e14130a82fe9cb5"} Mar 11 00:58:08 crc kubenswrapper[4744]: I0311 00:58:08.540075 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-9gtcq" event={"ID":"8495f637-031c-4280-be13-d5aae9c99eca","Type":"ContainerStarted","Data":"1dafcd2dab172541bc2474afc4c6eb93613d4c87fd732e0897b5697b82962078"} Mar 11 00:58:08 crc kubenswrapper[4744]: I0311 00:58:08.542072 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-pruner-29553120-6zw87" event={"ID":"8d83a64e-bd7c-43b4-aac4-8fdc807059f5","Type":"ContainerStarted","Data":"06aa8cf535683f95557ed9ec3a36658a240464fedb09c1198012477ffe68b408"} Mar 11 00:58:08 crc kubenswrapper[4744]: I0311 00:58:08.547403 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-txbh2" event={"ID":"24ac204b-1627-404f-b33c-fc77ded356d1","Type":"ContainerStarted","Data":"3695fba9ac643186c091f1b23340a3e254e364d6d8d44156e773ce8567d4d843"} Mar 11 00:58:08 crc kubenswrapper[4744]: I0311 00:58:08.549063 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd-operator/etcd-operator-b45778765-wm2rt" event={"ID":"de688bff-78ce-4d0f-ad7e-548ca640887a","Type":"ContainerStarted","Data":"a4276ad5caa39be8a215683f54929320002f51302b06dceede14030f8a6ffbc0"} Mar 11 00:58:08 crc kubenswrapper[4744]: I0311 00:58:08.552654 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-msd9d" event={"ID":"0d55c2f3-2eef-42eb-8627-ccd40e21f4d0","Type":"ContainerStarted","Data":"aea920f6b334048eb7438150345cab9fae13e3724c414aabe885b3c926bed773"} Mar 11 00:58:08 crc kubenswrapper[4744]: I0311 00:58:08.552691 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-msd9d" 
event={"ID":"0d55c2f3-2eef-42eb-8627-ccd40e21f4d0","Type":"ContainerStarted","Data":"5b1437991fe6641c1aee417825e707901e69a59a9c2e19ee5e372209c97a5570"} Mar 11 00:58:08 crc kubenswrapper[4744]: I0311 00:58:08.554723 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-slpdp" event={"ID":"48b9bd28-7713-4475-92f0-b7b741e6337e","Type":"ContainerStarted","Data":"0d1188be210a0c19b27596bb3880f2f77a48fab61f78841be44b36454c6e0988"} Mar 11 00:58:08 crc kubenswrapper[4744]: W0311 00:58:08.566774 4744 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod309dded2_bab3_4166_8342_11d3ded619dc.slice/crio-a8b8871f84969ed1ca63e108fb7aad8e33883bdaa446b86678f0747814092917 WatchSource:0}: Error finding container a8b8871f84969ed1ca63e108fb7aad8e33883bdaa446b86678f0747814092917: Status 404 returned error can't find the container with id a8b8871f84969ed1ca63e108fb7aad8e33883bdaa446b86678f0747814092917 Mar 11 00:58:08 crc kubenswrapper[4744]: W0311 00:58:08.570645 4744 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc284cbf1_b5e2_4f77_b14c_c0030f140a91.slice/crio-9c9a637a2eb1df65d12ccc55dcde13caf2ad9754cf05667c6177ef55153c56d6 WatchSource:0}: Error finding container 9c9a637a2eb1df65d12ccc55dcde13caf2ad9754cf05667c6177ef55153c56d6: Status 404 returned error can't find the container with id 9c9a637a2eb1df65d12ccc55dcde13caf2ad9754cf05667c6177ef55153c56d6 Mar 11 00:58:08 crc kubenswrapper[4744]: I0311 00:58:08.587869 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 11 00:58:08 crc 
kubenswrapper[4744]: E0311 00:58:08.588679 4744 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-11 00:58:09.088658756 +0000 UTC m=+245.892876361 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 11 00:58:08 crc kubenswrapper[4744]: I0311 00:58:08.649079 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-74tmz"] Mar 11 00:58:08 crc kubenswrapper[4744]: I0311 00:58:08.659792 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-kgl9z"] Mar 11 00:58:08 crc kubenswrapper[4744]: I0311 00:58:08.664682 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-pn84h"] Mar 11 00:58:08 crc kubenswrapper[4744]: I0311 00:58:08.680458 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-tqzmz"] Mar 11 00:58:08 crc kubenswrapper[4744]: I0311 00:58:08.690234 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-l2fh5"] Mar 11 00:58:08 crc kubenswrapper[4744]: I0311 00:58:08.691748 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-cg2gg\" (UID: \"97b9dce4-e1bd-400e-a4c2-848e9703db45\") " pod="openshift-image-registry/image-registry-697d97f7c8-cg2gg" Mar 11 00:58:08 crc kubenswrapper[4744]: E0311 00:58:08.693236 4744 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-11 00:58:09.193217855 +0000 UTC m=+245.997435450 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-cg2gg" (UID: "97b9dce4-e1bd-400e-a4c2-848e9703db45") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 11 00:58:08 crc kubenswrapper[4744]: I0311 00:58:08.757266 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-94dsr"] Mar 11 00:58:08 crc kubenswrapper[4744]: I0311 00:58:08.764688 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-85z54"] Mar 11 00:58:08 crc kubenswrapper[4744]: I0311 00:58:08.774142 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-56d4r"] Mar 11 00:58:08 crc kubenswrapper[4744]: I0311 00:58:08.778725 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-lws9c"] Mar 11 00:58:08 crc kubenswrapper[4744]: I0311 00:58:08.795489 4744 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 11 00:58:08 crc kubenswrapper[4744]: E0311 00:58:08.796921 4744 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-11 00:58:09.296887947 +0000 UTC m=+246.101105552 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 11 00:58:08 crc kubenswrapper[4744]: I0311 00:58:08.811779 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-chlpc"] Mar 11 00:58:08 crc kubenswrapper[4744]: I0311 00:58:08.811835 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-gnfcr"] Mar 11 00:58:08 crc kubenswrapper[4744]: I0311 00:58:08.819129 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-t54km"] Mar 11 00:58:08 crc kubenswrapper[4744]: I0311 00:58:08.821240 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-kx277"] Mar 11 00:58:08 crc kubenswrapper[4744]: W0311 00:58:08.830338 4744 
manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod940c3a71_e220_417d_8ac4_cc70a4a5afae.slice/crio-fc252fa939962db437c7318dd18901324d0cf7ea9ba96c218c9ab80fbc080af7 WatchSource:0}: Error finding container fc252fa939962db437c7318dd18901324d0cf7ea9ba96c218c9ab80fbc080af7: Status 404 returned error can't find the container with id fc252fa939962db437c7318dd18901324d0cf7ea9ba96c218c9ab80fbc080af7 Mar 11 00:58:08 crc kubenswrapper[4744]: W0311 00:58:08.849539 4744 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7d1c92dd_43a7_4311_90b1_54441f84787e.slice/crio-8106632e392de14f3cda2e7814019c381851f25c89bd80e60b67dbadd522f855 WatchSource:0}: Error finding container 8106632e392de14f3cda2e7814019c381851f25c89bd80e60b67dbadd522f855: Status 404 returned error can't find the container with id 8106632e392de14f3cda2e7814019c381851f25c89bd80e60b67dbadd522f855 Mar 11 00:58:08 crc kubenswrapper[4744]: W0311 00:58:08.858506 4744 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0359d6fc_1139_4dbc_a50f_55fa91607935.slice/crio-7c9d96f27b182f5b122e14c6dd843a13e97f6ac65f06057304c5073b18a67bff WatchSource:0}: Error finding container 7c9d96f27b182f5b122e14c6dd843a13e97f6ac65f06057304c5073b18a67bff: Status 404 returned error can't find the container with id 7c9d96f27b182f5b122e14c6dd843a13e97f6ac65f06057304c5073b18a67bff Mar 11 00:58:08 crc kubenswrapper[4744]: I0311 00:58:08.897890 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-cg2gg\" (UID: \"97b9dce4-e1bd-400e-a4c2-848e9703db45\") " 
pod="openshift-image-registry/image-registry-697d97f7c8-cg2gg" Mar 11 00:58:08 crc kubenswrapper[4744]: E0311 00:58:08.899353 4744 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-11 00:58:09.399311522 +0000 UTC m=+246.203529137 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-cg2gg" (UID: "97b9dce4-e1bd-400e-a4c2-848e9703db45") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 11 00:58:08 crc kubenswrapper[4744]: I0311 00:58:08.941673 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29553165-4gpb9"] Mar 11 00:58:08 crc kubenswrapper[4744]: I0311 00:58:08.952975 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-v4b66"] Mar 11 00:58:08 crc kubenswrapper[4744]: I0311 00:58:08.957997 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-h49lh"] Mar 11 00:58:09 crc kubenswrapper[4744]: I0311 00:58:08.995492 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29553178-56m2v"] Mar 11 00:58:09 crc kubenswrapper[4744]: I0311 00:58:09.003136 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: 
\"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 11 00:58:09 crc kubenswrapper[4744]: E0311 00:58:09.014987 4744 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-11 00:58:09.514941842 +0000 UTC m=+246.319159447 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 11 00:58:09 crc kubenswrapper[4744]: I0311 00:58:09.071665 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-fhbhq"] Mar 11 00:58:09 crc kubenswrapper[4744]: W0311 00:58:09.079187 4744 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5f720e42_33ed_4144_88d6_5fb8c4befac2.slice/crio-13b5010875155e1413fffef7441a5a2f641fdc6014d4d4fce329dd388c842992 WatchSource:0}: Error finding container 13b5010875155e1413fffef7441a5a2f641fdc6014d4d4fce329dd388c842992: Status 404 returned error can't find the container with id 13b5010875155e1413fffef7441a5a2f641fdc6014d4d4fce329dd388c842992 Mar 11 00:58:09 crc kubenswrapper[4744]: I0311 00:58:09.084326 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-7954f5f757-7qgjq"] Mar 11 00:58:09 crc kubenswrapper[4744]: I0311 00:58:09.089125 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-bbjh8"] Mar 11 00:58:09 crc kubenswrapper[4744]: I0311 00:58:09.091751 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-wcmgb"] Mar 11 00:58:09 crc kubenswrapper[4744]: I0311 00:58:09.095152 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-hf5rf"] Mar 11 00:58:09 crc kubenswrapper[4744]: I0311 00:58:09.097160 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-f8rpg"] Mar 11 00:58:09 crc kubenswrapper[4744]: I0311 00:58:09.105456 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-cg2gg\" (UID: \"97b9dce4-e1bd-400e-a4c2-848e9703db45\") " pod="openshift-image-registry/image-registry-697d97f7c8-cg2gg" Mar 11 00:58:09 crc kubenswrapper[4744]: E0311 00:58:09.105926 4744 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-11 00:58:09.605913943 +0000 UTC m=+246.410131548 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-cg2gg" (UID: "97b9dce4-e1bd-400e-a4c2-848e9703db45") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 11 00:58:09 crc kubenswrapper[4744]: I0311 00:58:09.144793 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-rq8sg"] Mar 11 00:58:09 crc kubenswrapper[4744]: W0311 00:58:09.188281 4744 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podea1125a6_231a_4f21_b9d6_8cdb2a51e482.slice/crio-6b9947bc7ded01bd30466398a5cb27d131b50bef8d51c2dc378008d5d5c54bfa WatchSource:0}: Error finding container 6b9947bc7ded01bd30466398a5cb27d131b50bef8d51c2dc378008d5d5c54bfa: Status 404 returned error can't find the container with id 6b9947bc7ded01bd30466398a5cb27d131b50bef8d51c2dc378008d5d5c54bfa Mar 11 00:58:09 crc kubenswrapper[4744]: W0311 00:58:09.190409 4744 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podacc8a04b_d619_4e9a_b2b0_f08250f329e0.slice/crio-e6f2871ae9248f53e8131d367f4b01336c9d352ea81dac33ee8df8f34f5cdb27 WatchSource:0}: Error finding container e6f2871ae9248f53e8131d367f4b01336c9d352ea81dac33ee8df8f34f5cdb27: Status 404 returned error can't find the container with id e6f2871ae9248f53e8131d367f4b01336c9d352ea81dac33ee8df8f34f5cdb27 Mar 11 00:58:09 crc kubenswrapper[4744]: I0311 00:58:09.199724 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-4rmzp"] Mar 11 00:58:09 crc kubenswrapper[4744]: I0311 00:58:09.206451 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started 
for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 11 00:58:09 crc kubenswrapper[4744]: E0311 00:58:09.206988 4744 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-11 00:58:09.706909183 +0000 UTC m=+246.511126788 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 11 00:58:09 crc kubenswrapper[4744]: I0311 00:58:09.233629 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-6jqrc"] Mar 11 00:58:09 crc kubenswrapper[4744]: W0311 00:58:09.241339 4744 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc2a93df7_e24f_4681_8336_ef07295f1d09.slice/crio-130427030cfcde642f17617a2f5e8524383ee3fc1fc5470fa4f4804d841103ec WatchSource:0}: Error finding container 130427030cfcde642f17617a2f5e8524383ee3fc1fc5470fa4f4804d841103ec: Status 404 returned error can't find the container with id 130427030cfcde642f17617a2f5e8524383ee3fc1fc5470fa4f4804d841103ec Mar 11 00:58:09 crc kubenswrapper[4744]: I0311 00:58:09.266928 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-6k4hw"] Mar 11 00:58:09 crc kubenswrapper[4744]: W0311 00:58:09.291867 4744 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc14e937f_6acb_4f36_9432_eb77464ce9c9.slice/crio-ebf67c01653bd04950b609c0e16060df327abb0a30da02725d9ccd9e4cf56bdc WatchSource:0}: Error finding container ebf67c01653bd04950b609c0e16060df327abb0a30da02725d9ccd9e4cf56bdc: Status 404 returned error can't find the container with id ebf67c01653bd04950b609c0e16060df327abb0a30da02725d9ccd9e4cf56bdc Mar 11 00:58:09 crc kubenswrapper[4744]: I0311 00:58:09.308685 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-cg2gg\" (UID: \"97b9dce4-e1bd-400e-a4c2-848e9703db45\") " pod="openshift-image-registry/image-registry-697d97f7c8-cg2gg" Mar 11 00:58:09 crc kubenswrapper[4744]: E0311 00:58:09.309229 4744 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-11 00:58:09.809173181 +0000 UTC m=+246.613390886 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-cg2gg" (UID: "97b9dce4-e1bd-400e-a4c2-848e9703db45") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 11 00:58:09 crc kubenswrapper[4744]: I0311 00:58:09.395564 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-k6qjd"] Mar 11 00:58:09 crc kubenswrapper[4744]: I0311 00:58:09.411172 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 11 00:58:09 crc kubenswrapper[4744]: E0311 00:58:09.412282 4744 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-11 00:58:09.912229335 +0000 UTC m=+246.716446940 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 11 00:58:09 crc kubenswrapper[4744]: I0311 00:58:09.413703 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-cg2gg\" (UID: \"97b9dce4-e1bd-400e-a4c2-848e9703db45\") " pod="openshift-image-registry/image-registry-697d97f7c8-cg2gg" Mar 11 00:58:09 crc kubenswrapper[4744]: E0311 00:58:09.414108 4744 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-11 00:58:09.914095622 +0000 UTC m=+246.718313227 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-cg2gg" (UID: "97b9dce4-e1bd-400e-a4c2-848e9703db45") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 11 00:58:09 crc kubenswrapper[4744]: I0311 00:58:09.515442 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 11 00:58:09 crc kubenswrapper[4744]: E0311 00:58:09.515933 4744 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-11 00:58:10.015914348 +0000 UTC m=+246.820131953 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 11 00:58:09 crc kubenswrapper[4744]: I0311 00:58:09.596775 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-9gtcq" event={"ID":"8495f637-031c-4280-be13-d5aae9c99eca","Type":"ContainerStarted","Data":"ba9d905404c5f00fbca28dcf442ba28796ae4d0593acd3e3da2353598f146b00"} Mar 11 00:58:09 crc kubenswrapper[4744]: I0311 00:58:09.624110 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-fhbhq" event={"ID":"bef5e0c2-5b78-4f32-affd-aec245c27db1","Type":"ContainerStarted","Data":"142587b5881a6780204f78415ec39442180f446de725a832db03fd7ad906db45"} Mar 11 00:58:09 crc kubenswrapper[4744]: I0311 00:58:09.624462 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-cg2gg\" (UID: \"97b9dce4-e1bd-400e-a4c2-848e9703db45\") " pod="openshift-image-registry/image-registry-697d97f7c8-cg2gg" Mar 11 00:58:09 crc kubenswrapper[4744]: E0311 00:58:09.625025 4744 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-11 00:58:10.125010297 +0000 UTC m=+246.929227902 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-cg2gg" (UID: "97b9dce4-e1bd-400e-a4c2-848e9703db45") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 11 00:58:09 crc kubenswrapper[4744]: I0311 00:58:09.625913 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-lws9c" event={"ID":"7d1c92dd-43a7-4311-90b1-54441f84787e","Type":"ContainerStarted","Data":"b9db62e5fd4c948a9711c703043057e9ff0de5309ffe1bb06bc9bc581a049c60"} Mar 11 00:58:09 crc kubenswrapper[4744]: I0311 00:58:09.625946 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-lws9c" event={"ID":"7d1c92dd-43a7-4311-90b1-54441f84787e","Type":"ContainerStarted","Data":"8106632e392de14f3cda2e7814019c381851f25c89bd80e60b67dbadd522f855"} Mar 11 00:58:09 crc kubenswrapper[4744]: I0311 00:58:09.630977 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29553178-56m2v" event={"ID":"95897da0-81a7-4656-9787-808f64d7aa9d","Type":"ContainerStarted","Data":"e06843b6821d9669c3d6d83536b36f8e8c1958a0229a416209e722e09fa6e113"} Mar 11 00:58:09 crc kubenswrapper[4744]: I0311 00:58:09.639129 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-gnfcr" event={"ID":"9b35ce65-26df-4169-a69a-a06c2420da9b","Type":"ContainerStarted","Data":"8cb5450e357c68c14faae284da1699960d85e62174030355b01e1f46cfc0eb61"} Mar 11 00:58:09 crc kubenswrapper[4744]: I0311 00:58:09.659057 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-tqzmz" event={"ID":"fa83e422-8374-4da6-a356-ae7feadfe282","Type":"ContainerStarted","Data":"1ac0bd62633a42c5cbeb7e5e2b54cf8b838860f7c3a82c4b2d9fd3a969f4de47"} Mar 11 00:58:09 crc kubenswrapper[4744]: I0311 00:58:09.659171 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-tqzmz" event={"ID":"fa83e422-8374-4da6-a356-ae7feadfe282","Type":"ContainerStarted","Data":"f7d2345f249dbaf91c2c8a5a01c5410713a46fd54a5b82dd8d2bfc2612078219"} Mar 11 00:58:09 crc kubenswrapper[4744]: I0311 00:58:09.671587 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-btvmr" event={"ID":"7bd67414-ec33-4963-b5d8-ac374bd28a6a","Type":"ContainerStarted","Data":"ddb4b99a5a2393ddeb25e4bdf3d4a75dc20440cef9e5754d4e0a4ef6003b322a"} Mar 11 00:58:09 crc kubenswrapper[4744]: I0311 00:58:09.679644 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-4rmzp" event={"ID":"c14e937f-6acb-4f36-9432-eb77464ce9c9","Type":"ContainerStarted","Data":"ebf67c01653bd04950b609c0e16060df327abb0a30da02725d9ccd9e4cf56bdc"} Mar 11 00:58:09 crc kubenswrapper[4744]: I0311 00:58:09.683252 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-pruner-29553120-6zw87" event={"ID":"8d83a64e-bd7c-43b4-aac4-8fdc807059f5","Type":"ContainerStarted","Data":"d4e3bb5b14629021056ad0f4d46c236a1c2f2aab672c6d18f7df9c13e7fad9d8"} Mar 11 00:58:09 crc kubenswrapper[4744]: I0311 00:58:09.686419 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-5444994796-9dcwq" event={"ID":"9b886295-15e1-4478-9ecb-ab71e77b99eb","Type":"ContainerStarted","Data":"99990ff23642bbb99e9f3d0bf7dc3e8103378ac1ee2a49db5ed295b9b8661623"} Mar 11 00:58:09 crc kubenswrapper[4744]: I0311 00:58:09.692715 4744 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-hf5rf" event={"ID":"c2a93df7-e24f-4681-8336-ef07295f1d09","Type":"ContainerStarted","Data":"130427030cfcde642f17617a2f5e8524383ee3fc1fc5470fa4f4804d841103ec"} Mar 11 00:58:09 crc kubenswrapper[4744]: I0311 00:58:09.699821 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-t54km" event={"ID":"7fb8ca9a-b3e7-4fce-b173-f8b2519962da","Type":"ContainerStarted","Data":"73b7c7ff9d422035e85775548ed05d4b24446c87ce4c95e73e0c62afc7ab386b"} Mar 11 00:58:09 crc kubenswrapper[4744]: I0311 00:58:09.699880 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-t54km" event={"ID":"7fb8ca9a-b3e7-4fce-b173-f8b2519962da","Type":"ContainerStarted","Data":"307baae4861b5a9dfec2faa6ced0ce237cede3653c3565c0b121b5726fd4105b"} Mar 11 00:58:09 crc kubenswrapper[4744]: I0311 00:58:09.710252 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-6jqrc" event={"ID":"969d9403-4eed-4c8b-b790-cae357bc60eb","Type":"ContainerStarted","Data":"3951f4aaa3865631e0b200f8450f5950f6b1d377e177a9613038b348bf7a6a86"} Mar 11 00:58:09 crc kubenswrapper[4744]: I0311 00:58:09.716624 4744 generic.go:334] "Generic (PLEG): container finished" podID="24ac204b-1627-404f-b33c-fc77ded356d1" containerID="937b1cc514489ed01922be75b30fab38b8f7298aea15229dd66ea381125c36b8" exitCode=0 Mar 11 00:58:09 crc kubenswrapper[4744]: I0311 00:58:09.720851 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-txbh2" event={"ID":"24ac204b-1627-404f-b33c-fc77ded356d1","Type":"ContainerDied","Data":"937b1cc514489ed01922be75b30fab38b8f7298aea15229dd66ea381125c36b8"} Mar 11 00:58:09 crc kubenswrapper[4744]: I0311 00:58:09.724911 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 11 00:58:09 crc kubenswrapper[4744]: E0311 00:58:09.725730 4744 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-11 00:58:10.225697367 +0000 UTC m=+247.029914962 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 11 00:58:09 crc kubenswrapper[4744]: I0311 00:58:09.728241 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-cg2gg\" (UID: \"97b9dce4-e1bd-400e-a4c2-848e9703db45\") " pod="openshift-image-registry/image-registry-697d97f7c8-cg2gg" Mar 11 00:58:09 crc kubenswrapper[4744]: I0311 00:58:09.728261 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-7954f5f757-7qgjq" event={"ID":"ea1125a6-231a-4f21-b9d6-8cdb2a51e482","Type":"ContainerStarted","Data":"6b9947bc7ded01bd30466398a5cb27d131b50bef8d51c2dc378008d5d5c54bfa"} Mar 11 00:58:09 crc kubenswrapper[4744]: E0311 00:58:09.728885 4744 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-11 00:58:10.228868205 +0000 UTC m=+247.033085810 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-cg2gg" (UID: "97b9dce4-e1bd-400e-a4c2-848e9703db45") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 11 00:58:09 crc kubenswrapper[4744]: I0311 00:58:09.732309 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-58897d9998-ptbbd" event={"ID":"c284cbf1-b5e2-4f77-b14c-c0030f140a91","Type":"ContainerStarted","Data":"75aa10e653681f6f775a1ca100b7b0b81554b5ef659dc2034faf2d1d0c8312de"} Mar 11 00:58:09 crc kubenswrapper[4744]: I0311 00:58:09.732439 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-58897d9998-ptbbd" event={"ID":"c284cbf1-b5e2-4f77-b14c-c0030f140a91","Type":"ContainerStarted","Data":"9c9a637a2eb1df65d12ccc55dcde13caf2ad9754cf05667c6177ef55153c56d6"} Mar 11 00:58:09 crc kubenswrapper[4744]: I0311 00:58:09.733962 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console-operator/console-operator-58897d9998-ptbbd" Mar 11 00:58:09 crc kubenswrapper[4744]: I0311 00:58:09.738032 4744 patch_prober.go:28] interesting pod/console-operator-58897d9998-ptbbd container/console-operator namespace/openshift-console-operator: Readiness probe status=failure output="Get \"https://10.217.0.22:8443/readyz\": dial tcp 10.217.0.22:8443: connect: connection refused" start-of-body= Mar 11 00:58:09 crc kubenswrapper[4744]: I0311 00:58:09.738087 4744 
prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console-operator/console-operator-58897d9998-ptbbd" podUID="c284cbf1-b5e2-4f77-b14c-c0030f140a91" containerName="console-operator" probeResult="failure" output="Get \"https://10.217.0.22:8443/readyz\": dial tcp 10.217.0.22:8443: connect: connection refused" Mar 11 00:58:09 crc kubenswrapper[4744]: I0311 00:58:09.746859 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-k6qjd" event={"ID":"3834cb5e-8777-40cb-9a72-75c4d6fb5638","Type":"ContainerStarted","Data":"20a9cb47ecdd46ca355b6e393cc5ecd6784ef9253e63f71224493b2c3e126a25"} Mar 11 00:58:09 crc kubenswrapper[4744]: I0311 00:58:09.760023 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-69f744f599-286h7" event={"ID":"ab6c95e5-e2c0-4cb3-a647-930b4c4d2927","Type":"ContainerStarted","Data":"77774a19b0ad25944e27832abe837e72b689823bcf2a6b241449fadb957710a4"} Mar 11 00:58:09 crc kubenswrapper[4744]: I0311 00:58:09.760064 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-69f744f599-286h7" event={"ID":"ab6c95e5-e2c0-4cb3-a647-930b4c4d2927","Type":"ContainerStarted","Data":"6a830821e9bb6e2c4a8a54f8d4dee9e9fcb486cf305bd9ea675a572a61d050c1"} Mar 11 00:58:09 crc kubenswrapper[4744]: I0311 00:58:09.764416 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-rq8sg" event={"ID":"4e1cff98-4344-4952-8d6c-d4f5c8d58628","Type":"ContainerStarted","Data":"85e8d4fdbba98c629389b5c83e8a07a1e3ae9922efc49542eea8dad2819146c9"} Mar 11 00:58:09 crc kubenswrapper[4744]: I0311 00:58:09.781876 4744 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-btvmr" podStartSLOduration=184.781852416 podStartE2EDuration="3m4.781852416s" podCreationTimestamp="2026-03-11 00:55:05 +0000 
UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-11 00:58:09.779988408 +0000 UTC m=+246.584206013" watchObservedRunningTime="2026-03-11 00:58:09.781852416 +0000 UTC m=+246.586070021" Mar 11 00:58:09 crc kubenswrapper[4744]: I0311 00:58:09.783637 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-9c57cc56f-kx277" event={"ID":"84d53914-9806-4bc3-80ed-19cd8ff6e625","Type":"ContainerStarted","Data":"5b1247ae9603e93f9f17de65de4e419ff0d18f52de489e92e38c1c558aebbabc"} Mar 11 00:58:09 crc kubenswrapper[4744]: I0311 00:58:09.791766 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-6k4hw" event={"ID":"3ca02643-7e55-4bab-be3a-781a8017f11b","Type":"ContainerStarted","Data":"1f0c7aef6f54afd62a61ac81c6ae80df2b3366af1925e5788301bd7b36ab9a92"} Mar 11 00:58:09 crc kubenswrapper[4744]: I0311 00:58:09.805548 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-94dsr" event={"ID":"eeb963e8-d683-4564-9ccd-d26a6a755e94","Type":"ContainerStarted","Data":"1435d4d118763530af60caa2ee19e0ec4b266d0b741f8b3a10e0689e680328e9"} Mar 11 00:58:09 crc kubenswrapper[4744]: I0311 00:58:09.805603 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-94dsr" event={"ID":"eeb963e8-d683-4564-9ccd-d26a6a755e94","Type":"ContainerStarted","Data":"9bde4c87307bb9290c989c485f4e6092800541f3c6ebb9be4fd94087092b03bf"} Mar 11 00:58:09 crc kubenswrapper[4744]: I0311 00:58:09.818757 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-kgl9z" 
event={"ID":"ba3d981c-13e1-49dc-80db-4081ca811778","Type":"ContainerStarted","Data":"a7f82fe510bd3b2613194f055759ef486a731aeccf3f773362027ba9f68062b1"} Mar 11 00:58:09 crc kubenswrapper[4744]: I0311 00:58:09.818808 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-kgl9z" event={"ID":"ba3d981c-13e1-49dc-80db-4081ca811778","Type":"ContainerStarted","Data":"01dd443990da2936a0014d6a8e95c6658a36c396696548b28c0d86544696799e"} Mar 11 00:58:09 crc kubenswrapper[4744]: I0311 00:58:09.819246 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-kgl9z" Mar 11 00:58:09 crc kubenswrapper[4744]: I0311 00:58:09.823640 4744 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-pruner-29553120-6zw87" podStartSLOduration=184.823619292 podStartE2EDuration="3m4.823619292s" podCreationTimestamp="2026-03-11 00:55:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-11 00:58:09.822612271 +0000 UTC m=+246.626829876" watchObservedRunningTime="2026-03-11 00:58:09.823619292 +0000 UTC m=+246.627836897" Mar 11 00:58:09 crc kubenswrapper[4744]: I0311 00:58:09.825950 4744 patch_prober.go:28] interesting pod/packageserver-d55dfcdfc-kgl9z container/packageserver namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.19:5443/healthz\": dial tcp 10.217.0.19:5443: connect: connection refused" start-of-body= Mar 11 00:58:09 crc kubenswrapper[4744]: I0311 00:58:09.826010 4744 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-kgl9z" podUID="ba3d981c-13e1-49dc-80db-4081ca811778" containerName="packageserver" probeResult="failure" output="Get \"https://10.217.0.19:5443/healthz\": dial tcp 
10.217.0.19:5443: connect: connection refused" Mar 11 00:58:09 crc kubenswrapper[4744]: I0311 00:58:09.829625 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 11 00:58:09 crc kubenswrapper[4744]: E0311 00:58:09.838309 4744 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-11 00:58:10.338282493 +0000 UTC m=+247.142500098 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 11 00:58:09 crc kubenswrapper[4744]: I0311 00:58:09.843580 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29553165-4gpb9" event={"ID":"7886a95d-050e-4d58-baf1-e65310e95e4f","Type":"ContainerStarted","Data":"5d3aa7ce76e9b8ec4175eacfa5a162b45c45d11aa2824d20033cfd6fb7b29db1"} Mar 11 00:58:09 crc kubenswrapper[4744]: I0311 00:58:09.850706 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-bbjh8" 
event={"ID":"acc8a04b-d619-4e9a-b2b0-f08250f329e0","Type":"ContainerStarted","Data":"e6f2871ae9248f53e8131d367f4b01336c9d352ea81dac33ee8df8f34f5cdb27"} Mar 11 00:58:09 crc kubenswrapper[4744]: I0311 00:58:09.870623 4744 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-879f6c89f-9gtcq" podStartSLOduration=184.870604199 podStartE2EDuration="3m4.870604199s" podCreationTimestamp="2026-03-11 00:55:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-11 00:58:09.867000868 +0000 UTC m=+246.671218473" watchObservedRunningTime="2026-03-11 00:58:09.870604199 +0000 UTC m=+246.674821804" Mar 11 00:58:09 crc kubenswrapper[4744]: I0311 00:58:09.879080 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-56d4r" event={"ID":"940c3a71-e220-417d-8ac4-cc70a4a5afae","Type":"ContainerStarted","Data":"fc252fa939962db437c7318dd18901324d0cf7ea9ba96c218c9ab80fbc080af7"} Mar 11 00:58:09 crc kubenswrapper[4744]: I0311 00:58:09.882498 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-h49lh" event={"ID":"5f720e42-33ed-4144-88d6-5fb8c4befac2","Type":"ContainerStarted","Data":"13b5010875155e1413fffef7441a5a2f641fdc6014d4d4fce329dd388c842992"} Mar 11 00:58:09 crc kubenswrapper[4744]: I0311 00:58:09.886427 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-4tcts" event={"ID":"74a61e39-2210-4bb1-96c9-509eda04c4c7","Type":"ContainerStarted","Data":"e604778b27de583f441bbc0c7365c48b9e027a61c73c490e9714df20eb53d75a"} Mar 11 00:58:09 crc kubenswrapper[4744]: I0311 00:58:09.887675 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-authentication/oauth-openshift-558db77b4-4tcts" Mar 11 00:58:09 crc kubenswrapper[4744]: 
I0311 00:58:09.889433 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-chlpc" event={"ID":"0359d6fc-1139-4dbc-a50f-55fa91607935","Type":"ContainerStarted","Data":"7c9d96f27b182f5b122e14c6dd843a13e97f6ac65f06057304c5073b18a67bff"} Mar 11 00:58:09 crc kubenswrapper[4744]: I0311 00:58:09.894077 4744 patch_prober.go:28] interesting pod/oauth-openshift-558db77b4-4tcts container/oauth-openshift namespace/openshift-authentication: Readiness probe status=failure output="Get \"https://10.217.0.6:6443/healthz\": dial tcp 10.217.0.6:6443: connect: connection refused" start-of-body= Mar 11 00:58:09 crc kubenswrapper[4744]: I0311 00:58:09.894162 4744 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-authentication/oauth-openshift-558db77b4-4tcts" podUID="74a61e39-2210-4bb1-96c9-509eda04c4c7" containerName="oauth-openshift" probeResult="failure" output="Get \"https://10.217.0.6:6443/healthz\": dial tcp 10.217.0.6:6443: connect: connection refused" Mar 11 00:58:09 crc kubenswrapper[4744]: I0311 00:58:09.918538 4744 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-etcd-operator/etcd-operator-b45778765-wm2rt" podStartSLOduration=184.918485623 podStartE2EDuration="3m4.918485623s" podCreationTimestamp="2026-03-11 00:55:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-11 00:58:09.910911889 +0000 UTC m=+246.715129494" watchObservedRunningTime="2026-03-11 00:58:09.918485623 +0000 UTC m=+246.722703228" Mar 11 00:58:09 crc kubenswrapper[4744]: I0311 00:58:09.933091 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-cg2gg\" (UID: 
\"97b9dce4-e1bd-400e-a4c2-848e9703db45\") " pod="openshift-image-registry/image-registry-697d97f7c8-cg2gg" Mar 11 00:58:09 crc kubenswrapper[4744]: E0311 00:58:09.933412 4744 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-11 00:58:10.433399332 +0000 UTC m=+247.237616937 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-cg2gg" (UID: "97b9dce4-e1bd-400e-a4c2-848e9703db45") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 11 00:58:09 crc kubenswrapper[4744]: I0311 00:58:09.935289 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-f8rpg" event={"ID":"1ab34621-2907-423a-81ef-36fb8377874d","Type":"ContainerStarted","Data":"1620da2f3e33476de129e17c534401a46741e08d0664e2a2b1a84e0be5b52344"} Mar 11 00:58:10 crc kubenswrapper[4744]: I0311 00:58:10.028934 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-85z54" event={"ID":"66fb3c39-3d00-453f-a282-a04584652a8b","Type":"ContainerStarted","Data":"86bd878e01237a51c1aa056a97f052156d7ac5d2750c333dbf40ea16fa5fc7fb"} Mar 11 00:58:10 crc kubenswrapper[4744]: I0311 00:58:10.028977 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-server-h2stx" event={"ID":"ca7c15de-4f71-45fd-b1b0-4c451ce9724e","Type":"ContainerStarted","Data":"98829dbb7741d683707e8425388a6ae896dc69bb1378b041d48848529b801c15"} Mar 11 00:58:10 crc kubenswrapper[4744]: I0311 
00:58:10.030475 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-v4b66" event={"ID":"a8e11ddd-81e3-40f9-8ada-12abfacedca9","Type":"ContainerStarted","Data":"2c464dff9f9fb3b4e6821c9b65054b020ea271e985bed6877c0ba8852b0bbb9e"} Mar 11 00:58:10 crc kubenswrapper[4744]: I0311 00:58:10.032186 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-slpdp" event={"ID":"48b9bd28-7713-4475-92f0-b7b741e6337e","Type":"ContainerStarted","Data":"3d5f13359de88e4c614680bfb0fff1220fc2c6342c1c302d63641b2c1e614e74"} Mar 11 00:58:10 crc kubenswrapper[4744]: I0311 00:58:10.033540 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-slpdp" Mar 11 00:58:10 crc kubenswrapper[4744]: I0311 00:58:10.033909 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 11 00:58:10 crc kubenswrapper[4744]: E0311 00:58:10.035374 4744 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-11 00:58:10.535353321 +0000 UTC m=+247.339570926 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 11 00:58:10 crc kubenswrapper[4744]: I0311 00:58:10.047502 4744 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-f9d7485db-msd9d" podStartSLOduration=185.047471095 podStartE2EDuration="3m5.047471095s" podCreationTimestamp="2026-03-11 00:55:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-11 00:58:09.976473188 +0000 UTC m=+246.780690803" watchObservedRunningTime="2026-03-11 00:58:10.047471095 +0000 UTC m=+246.851688710" Mar 11 00:58:10 crc kubenswrapper[4744]: I0311 00:58:10.062961 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-slpdp" Mar 11 00:58:10 crc kubenswrapper[4744]: I0311 00:58:10.095810 4744 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-tqzmz" podStartSLOduration=184.095768811 podStartE2EDuration="3m4.095768811s" podCreationTimestamp="2026-03-11 00:55:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-11 00:58:10.078688316 +0000 UTC m=+246.882905921" watchObservedRunningTime="2026-03-11 00:58:10.095768811 +0000 UTC m=+246.899986416" Mar 11 00:58:10 crc kubenswrapper[4744]: I0311 00:58:10.110288 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-xfdgm" event={"ID":"51238090-1fbe-446b-a63d-5ec9c3137c61","Type":"ContainerStarted","Data":"d452eafab1432819efc95e7df9591f69126347ccf9429ca2c4642cc367b52c03"} Mar 11 00:58:10 crc kubenswrapper[4744]: I0311 00:58:10.111991 4744 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console-operator/console-operator-58897d9998-ptbbd" podStartSLOduration=185.111978581 podStartE2EDuration="3m5.111978581s" podCreationTimestamp="2026-03-11 00:55:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-11 00:58:10.103995436 +0000 UTC m=+246.908213041" watchObservedRunningTime="2026-03-11 00:58:10.111978581 +0000 UTC m=+246.916196186" Mar 11 00:58:10 crc kubenswrapper[4744]: I0311 00:58:10.140051 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-cg2gg\" (UID: \"97b9dce4-e1bd-400e-a4c2-848e9703db45\") " pod="openshift-image-registry/image-registry-697d97f7c8-cg2gg" Mar 11 00:58:10 crc kubenswrapper[4744]: E0311 00:58:10.141989 4744 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-11 00:58:10.641972104 +0000 UTC m=+247.446189709 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-cg2gg" (UID: "97b9dce4-e1bd-400e-a4c2-848e9703db45") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 11 00:58:10 crc kubenswrapper[4744]: I0311 00:58:10.149475 4744 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/router-default-5444994796-9dcwq" podStartSLOduration=185.149424514 podStartE2EDuration="3m5.149424514s" podCreationTimestamp="2026-03-11 00:55:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-11 00:58:10.146814313 +0000 UTC m=+246.951031918" watchObservedRunningTime="2026-03-11 00:58:10.149424514 +0000 UTC m=+246.953642139" Mar 11 00:58:10 crc kubenswrapper[4744]: I0311 00:58:10.180890 4744 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-kgl9z" podStartSLOduration=184.180867392 podStartE2EDuration="3m4.180867392s" podCreationTimestamp="2026-03-11 00:55:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-11 00:58:10.179024016 +0000 UTC m=+246.983241621" watchObservedRunningTime="2026-03-11 00:58:10.180867392 +0000 UTC m=+246.985084997" Mar 11 00:58:10 crc kubenswrapper[4744]: I0311 00:58:10.243476 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: 
\"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 11 00:58:10 crc kubenswrapper[4744]: E0311 00:58:10.244954 4744 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-11 00:58:10.744920344 +0000 UTC m=+247.549137949 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 11 00:58:10 crc kubenswrapper[4744]: I0311 00:58:10.247386 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-wcmgb" event={"ID":"6fafcf3b-e706-47a8-9501-beb09a76d0bf","Type":"ContainerStarted","Data":"136e314dd03bef8c1e6ad2b6b097abc7544453952e1f22da0024bc06bdf329f8"} Mar 11 00:58:10 crc kubenswrapper[4744]: I0311 00:58:10.254983 4744 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-slpdp" podStartSLOduration=185.254964804 podStartE2EDuration="3m5.254964804s" podCreationTimestamp="2026-03-11 00:55:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-11 00:58:10.254269172 +0000 UTC m=+247.058486777" watchObservedRunningTime="2026-03-11 00:58:10.254964804 +0000 UTC m=+247.059182409" Mar 11 00:58:10 crc kubenswrapper[4744]: I0311 00:58:10.256380 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-dns-operator/dns-operator-744455d44c-4jlsm" event={"ID":"47244264-b9e7-4f86-85d7-5406ed8d8833","Type":"ContainerStarted","Data":"064351b6a768c8bc54c29aa89cc9fb44d4a1644e023c9487cd9475e466a5882e"} Mar 11 00:58:10 crc kubenswrapper[4744]: I0311 00:58:10.257972 4744 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication-operator/authentication-operator-69f744f599-286h7" podStartSLOduration=185.257963466 podStartE2EDuration="3m5.257963466s" podCreationTimestamp="2026-03-11 00:55:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-11 00:58:10.233646237 +0000 UTC m=+247.037863842" watchObservedRunningTime="2026-03-11 00:58:10.257963466 +0000 UTC m=+247.062181071" Mar 11 00:58:10 crc kubenswrapper[4744]: I0311 00:58:10.277989 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-777779d784-n8wns" event={"ID":"309dded2-bab3-4166-8342-11d3ded619dc","Type":"ContainerStarted","Data":"509b0f019652ad2649bc35e3650536fba1f3e475b946069ee04e6672646b9c3f"} Mar 11 00:58:10 crc kubenswrapper[4744]: I0311 00:58:10.278046 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-777779d784-n8wns" event={"ID":"309dded2-bab3-4166-8342-11d3ded619dc","Type":"ContainerStarted","Data":"a8b8871f84969ed1ca63e108fb7aad8e33883bdaa446b86678f0747814092917"} Mar 11 00:58:10 crc kubenswrapper[4744]: I0311 00:58:10.292414 4744 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-server-h2stx" podStartSLOduration=6.292391916 podStartE2EDuration="6.292391916s" podCreationTimestamp="2026-03-11 00:58:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-11 00:58:10.279619213 +0000 UTC 
m=+247.083836818" watchObservedRunningTime="2026-03-11 00:58:10.292391916 +0000 UTC m=+247.096609511" Mar 11 00:58:10 crc kubenswrapper[4744]: I0311 00:58:10.296163 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-74tmz" event={"ID":"4d6c2f5e-32a4-484f-b6fa-a240cf8bb90d","Type":"ContainerStarted","Data":"b00daf2a1420ac4bb9720d7b4d1a496d15c21992ecfe27ce9b356eb89567e5de"} Mar 11 00:58:10 crc kubenswrapper[4744]: I0311 00:58:10.296212 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-74tmz" event={"ID":"4d6c2f5e-32a4-484f-b6fa-a240cf8bb90d","Type":"ContainerStarted","Data":"a379b3f5dbac7666c1b7d63860d8c5993c62d152338e99527d5a17539047c467"} Mar 11 00:58:10 crc kubenswrapper[4744]: I0311 00:58:10.309675 4744 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-xfdgm" podStartSLOduration=185.309655858 podStartE2EDuration="3m5.309655858s" podCreationTimestamp="2026-03-11 00:55:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-11 00:58:10.30842212 +0000 UTC m=+247.112639725" watchObservedRunningTime="2026-03-11 00:58:10.309655858 +0000 UTC m=+247.113873453" Mar 11 00:58:10 crc kubenswrapper[4744]: I0311 00:58:10.320101 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-pn84h" event={"ID":"acc45881-56ff-4010-8eda-103f41f90bc5","Type":"ContainerStarted","Data":"9a3737c6e78af65a5373b87c5ce4da2f7ca8c6d8ddc89870f19fcec1886ce519"} Mar 11 00:58:10 crc kubenswrapper[4744]: I0311 00:58:10.328999 4744 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-ingress/router-default-5444994796-9dcwq" Mar 11 00:58:10 crc 
kubenswrapper[4744]: I0311 00:58:10.329095 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-l2fh5" event={"ID":"61dac64e-6219-46fe-80b0-420098bb260b","Type":"ContainerStarted","Data":"718b4db56ae03958a62fbcaab6bdade1dee85f7fbdde2651336cf5d412656a4e"} Mar 11 00:58:10 crc kubenswrapper[4744]: I0311 00:58:10.329129 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-l2fh5" event={"ID":"61dac64e-6219-46fe-80b0-420098bb260b","Type":"ContainerStarted","Data":"47f141cb2ad03496047557c4dae8cd783f75f325b31c27e1463bf23299af1b34"} Mar 11 00:58:10 crc kubenswrapper[4744]: I0311 00:58:10.340767 4744 patch_prober.go:28] interesting pod/router-default-5444994796-9dcwq container/router namespace/openshift-ingress: Startup probe status=failure output="Get \"http://localhost:1936/healthz/ready\": dial tcp [::1]:1936: connect: connection refused" start-of-body= Mar 11 00:58:10 crc kubenswrapper[4744]: I0311 00:58:10.340842 4744 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-9dcwq" podUID="9b886295-15e1-4478-9ecb-ab71e77b99eb" containerName="router" probeResult="failure" output="Get \"http://localhost:1936/healthz/ready\": dial tcp [::1]:1936: connect: connection refused" Mar 11 00:58:10 crc kubenswrapper[4744]: I0311 00:58:10.346917 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-cg2gg\" (UID: \"97b9dce4-e1bd-400e-a4c2-848e9703db45\") " pod="openshift-image-registry/image-registry-697d97f7c8-cg2gg" Mar 11 00:58:10 crc kubenswrapper[4744]: E0311 00:58:10.347313 4744 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-11 00:58:10.847298267 +0000 UTC m=+247.651515862 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-cg2gg" (UID: "97b9dce4-e1bd-400e-a4c2-848e9703db45") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 11 00:58:10 crc kubenswrapper[4744]: I0311 00:58:10.357545 4744 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca/service-ca-9c57cc56f-kx277" podStartSLOduration=184.357524752 podStartE2EDuration="3m4.357524752s" podCreationTimestamp="2026-03-11 00:55:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-11 00:58:10.352377223 +0000 UTC m=+247.156594828" watchObservedRunningTime="2026-03-11 00:58:10.357524752 +0000 UTC m=+247.161742367" Mar 11 00:58:10 crc kubenswrapper[4744]: I0311 00:58:10.400875 4744 ???:1] "http: TLS handshake error from 192.168.126.11:46982: no serving certificate available for the kubelet" Mar 11 00:58:10 crc kubenswrapper[4744]: I0311 00:58:10.413803 4744 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication/oauth-openshift-558db77b4-4tcts" podStartSLOduration=185.413780364 podStartE2EDuration="3m5.413780364s" podCreationTimestamp="2026-03-11 00:55:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-11 00:58:10.406224571 +0000 UTC m=+247.210442176" 
watchObservedRunningTime="2026-03-11 00:58:10.413780364 +0000 UTC m=+247.217997969" Mar 11 00:58:10 crc kubenswrapper[4744]: I0311 00:58:10.430895 4744 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca-operator/service-ca-operator-777779d784-n8wns" podStartSLOduration=184.43087403 podStartE2EDuration="3m4.43087403s" podCreationTimestamp="2026-03-11 00:55:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-11 00:58:10.428376133 +0000 UTC m=+247.232593738" watchObservedRunningTime="2026-03-11 00:58:10.43087403 +0000 UTC m=+247.235091635" Mar 11 00:58:10 crc kubenswrapper[4744]: I0311 00:58:10.449245 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 11 00:58:10 crc kubenswrapper[4744]: E0311 00:58:10.450853 4744 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-11 00:58:10.950835425 +0000 UTC m=+247.755053030 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 11 00:58:10 crc kubenswrapper[4744]: I0311 00:58:10.503743 4744 ???:1] "http: TLS handshake error from 192.168.126.11:46986: no serving certificate available for the kubelet" Mar 11 00:58:10 crc kubenswrapper[4744]: I0311 00:58:10.515195 4744 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns-operator/dns-operator-744455d44c-4jlsm" podStartSLOduration=185.515177616 podStartE2EDuration="3m5.515177616s" podCreationTimestamp="2026-03-11 00:55:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-11 00:58:10.46692795 +0000 UTC m=+247.271145555" watchObservedRunningTime="2026-03-11 00:58:10.515177616 +0000 UTC m=+247.319395221" Mar 11 00:58:10 crc kubenswrapper[4744]: I0311 00:58:10.516085 4744 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-l2fh5" podStartSLOduration=185.516078993 podStartE2EDuration="3m5.516078993s" podCreationTimestamp="2026-03-11 00:55:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-11 00:58:10.514921978 +0000 UTC m=+247.319139583" watchObservedRunningTime="2026-03-11 00:58:10.516078993 +0000 UTC m=+247.320296598" Mar 11 00:58:10 crc kubenswrapper[4744]: I0311 00:58:10.554049 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-cg2gg\" (UID: \"97b9dce4-e1bd-400e-a4c2-848e9703db45\") " pod="openshift-image-registry/image-registry-697d97f7c8-cg2gg" Mar 11 00:58:10 crc kubenswrapper[4744]: E0311 00:58:10.554397 4744 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-11 00:58:11.054384083 +0000 UTC m=+247.858601678 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-cg2gg" (UID: "97b9dce4-e1bd-400e-a4c2-848e9703db45") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 11 00:58:10 crc kubenswrapper[4744]: I0311 00:58:10.558741 4744 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-74tmz" podStartSLOduration=185.558723356 podStartE2EDuration="3m5.558723356s" podCreationTimestamp="2026-03-11 00:55:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-11 00:58:10.557453248 +0000 UTC m=+247.361670873" watchObservedRunningTime="2026-03-11 00:58:10.558723356 +0000 UTC m=+247.362940961" Mar 11 00:58:10 crc kubenswrapper[4744]: I0311 00:58:10.598834 4744 ???:1] "http: TLS handshake error from 192.168.126.11:46996: no serving certificate available for the kubelet" Mar 11 00:58:10 crc kubenswrapper[4744]: I0311 00:58:10.655538 4744 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 11 00:58:10 crc kubenswrapper[4744]: E0311 00:58:10.655842 4744 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-11 00:58:11.155810046 +0000 UTC m=+247.960027641 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 11 00:58:10 crc kubenswrapper[4744]: I0311 00:58:10.656208 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-cg2gg\" (UID: \"97b9dce4-e1bd-400e-a4c2-848e9703db45\") " pod="openshift-image-registry/image-registry-697d97f7c8-cg2gg" Mar 11 00:58:10 crc kubenswrapper[4744]: E0311 00:58:10.656688 4744 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-11 00:58:11.156673642 +0000 UTC m=+247.960891247 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-cg2gg" (UID: "97b9dce4-e1bd-400e-a4c2-848e9703db45") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 11 00:58:10 crc kubenswrapper[4744]: I0311 00:58:10.741779 4744 ???:1] "http: TLS handshake error from 192.168.126.11:46998: no serving certificate available for the kubelet" Mar 11 00:58:10 crc kubenswrapper[4744]: I0311 00:58:10.757495 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 11 00:58:10 crc kubenswrapper[4744]: E0311 00:58:10.763380 4744 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-11 00:58:11.263342047 +0000 UTC m=+248.067559652 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 11 00:58:10 crc kubenswrapper[4744]: I0311 00:58:10.801951 4744 ???:1] "http: TLS handshake error from 192.168.126.11:47000: no serving certificate available for the kubelet" Mar 11 00:58:10 crc kubenswrapper[4744]: I0311 00:58:10.859425 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-cg2gg\" (UID: \"97b9dce4-e1bd-400e-a4c2-848e9703db45\") " pod="openshift-image-registry/image-registry-697d97f7c8-cg2gg" Mar 11 00:58:10 crc kubenswrapper[4744]: E0311 00:58:10.860121 4744 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-11 00:58:11.360106726 +0000 UTC m=+248.164324331 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-cg2gg" (UID: "97b9dce4-e1bd-400e-a4c2-848e9703db45") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 11 00:58:10 crc kubenswrapper[4744]: I0311 00:58:10.935645 4744 ???:1] "http: TLS handshake error from 192.168.126.11:47006: no serving certificate available for the kubelet" Mar 11 00:58:10 crc kubenswrapper[4744]: I0311 00:58:10.963299 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 11 00:58:10 crc kubenswrapper[4744]: E0311 00:58:10.963865 4744 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-11 00:58:11.46384711 +0000 UTC m=+248.268064715 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 11 00:58:11 crc kubenswrapper[4744]: I0311 00:58:11.020133 4744 ???:1] "http: TLS handshake error from 192.168.126.11:47016: no serving certificate available for the kubelet" Mar 11 00:58:11 crc kubenswrapper[4744]: I0311 00:58:11.065313 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-cg2gg\" (UID: \"97b9dce4-e1bd-400e-a4c2-848e9703db45\") " pod="openshift-image-registry/image-registry-697d97f7c8-cg2gg" Mar 11 00:58:11 crc kubenswrapper[4744]: E0311 00:58:11.066083 4744 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-11 00:58:11.566069788 +0000 UTC m=+248.370287393 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-cg2gg" (UID: "97b9dce4-e1bd-400e-a4c2-848e9703db45") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 11 00:58:11 crc kubenswrapper[4744]: I0311 00:58:11.170136 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 11 00:58:11 crc kubenswrapper[4744]: E0311 00:58:11.170527 4744 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-11 00:58:11.670491974 +0000 UTC m=+248.474709579 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 11 00:58:11 crc kubenswrapper[4744]: I0311 00:58:11.227432 4744 ???:1] "http: TLS handshake error from 192.168.126.11:47030: no serving certificate available for the kubelet" Mar 11 00:58:11 crc kubenswrapper[4744]: I0311 00:58:11.274643 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-cg2gg\" (UID: \"97b9dce4-e1bd-400e-a4c2-848e9703db45\") " pod="openshift-image-registry/image-registry-697d97f7c8-cg2gg" Mar 11 00:58:11 crc kubenswrapper[4744]: E0311 00:58:11.275055 4744 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-11 00:58:11.775042423 +0000 UTC m=+248.579260018 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-cg2gg" (UID: "97b9dce4-e1bd-400e-a4c2-848e9703db45") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 11 00:58:11 crc kubenswrapper[4744]: I0311 00:58:11.329630 4744 patch_prober.go:28] interesting pod/router-default-5444994796-9dcwq container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 11 00:58:11 crc kubenswrapper[4744]: [-]has-synced failed: reason withheld Mar 11 00:58:11 crc kubenswrapper[4744]: [+]process-running ok Mar 11 00:58:11 crc kubenswrapper[4744]: healthz check failed Mar 11 00:58:11 crc kubenswrapper[4744]: I0311 00:58:11.329704 4744 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-9dcwq" podUID="9b886295-15e1-4478-9ecb-ab71e77b99eb" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 11 00:58:11 crc kubenswrapper[4744]: I0311 00:58:11.376561 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 11 00:58:11 crc kubenswrapper[4744]: E0311 00:58:11.376681 4744 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2026-03-11 00:58:11.876662912 +0000 UTC m=+248.680880517 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 11 00:58:11 crc kubenswrapper[4744]: I0311 00:58:11.376942 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-cg2gg\" (UID: \"97b9dce4-e1bd-400e-a4c2-848e9703db45\") " pod="openshift-image-registry/image-registry-697d97f7c8-cg2gg" Mar 11 00:58:11 crc kubenswrapper[4744]: E0311 00:58:11.377361 4744 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-11 00:58:11.877352603 +0000 UTC m=+248.681570208 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-cg2gg" (UID: "97b9dce4-e1bd-400e-a4c2-848e9703db45") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 11 00:58:11 crc kubenswrapper[4744]: I0311 00:58:11.384585 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-pn84h" event={"ID":"acc45881-56ff-4010-8eda-103f41f90bc5","Type":"ContainerStarted","Data":"c1472d1a466b2d2e5238f5be8b7394abfe758550a25251c7be459739811c29fd"} Mar 11 00:58:11 crc kubenswrapper[4744]: I0311 00:58:11.384645 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-pn84h" event={"ID":"acc45881-56ff-4010-8eda-103f41f90bc5","Type":"ContainerStarted","Data":"a94b6b9da91cc8fd5dab157bccd81e13db027123bceb2f868cb7b83e914cc0f5"} Mar 11 00:58:11 crc kubenswrapper[4744]: I0311 00:58:11.396931 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-6jqrc" event={"ID":"969d9403-4eed-4c8b-b790-cae357bc60eb","Type":"ContainerStarted","Data":"6a41ed06bc4791c14ab423b1963e2b85409d24e098e75f6fe9f753d39cf835a8"} Mar 11 00:58:11 crc kubenswrapper[4744]: I0311 00:58:11.420181 4744 generic.go:334] "Generic (PLEG): container finished" podID="c2a93df7-e24f-4681-8336-ef07295f1d09" containerID="d6804c40789608748116dbdc7744a20f91753307d35ac3d8389fafe5b58aa8ff" exitCode=0 Mar 11 00:58:11 crc kubenswrapper[4744]: I0311 00:58:11.420290 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-hf5rf" 
event={"ID":"c2a93df7-e24f-4681-8336-ef07295f1d09","Type":"ContainerDied","Data":"d6804c40789608748116dbdc7744a20f91753307d35ac3d8389fafe5b58aa8ff"} Mar 11 00:58:11 crc kubenswrapper[4744]: I0311 00:58:11.448167 4744 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-pn84h" podStartSLOduration=186.448148873 podStartE2EDuration="3m6.448148873s" podCreationTimestamp="2026-03-11 00:55:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-11 00:58:11.414560808 +0000 UTC m=+248.218778413" watchObservedRunningTime="2026-03-11 00:58:11.448148873 +0000 UTC m=+248.252366478" Mar 11 00:58:11 crc kubenswrapper[4744]: I0311 00:58:11.449887 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29553165-4gpb9" event={"ID":"7886a95d-050e-4d58-baf1-e65310e95e4f","Type":"ContainerStarted","Data":"027fe3efb836951232e1afa930e3f986b4fabc10c31a5c88ae0a8c227eca79d0"} Mar 11 00:58:11 crc kubenswrapper[4744]: I0311 00:58:11.456171 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-9c57cc56f-kx277" event={"ID":"84d53914-9806-4bc3-80ed-19cd8ff6e625","Type":"ContainerStarted","Data":"59d78aef4c2af90528be843aaa9dc07258c1ebb70c80dcb6ebfc9311e67594e5"} Mar 11 00:58:11 crc kubenswrapper[4744]: I0311 00:58:11.478278 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 11 00:58:11 crc kubenswrapper[4744]: E0311 00:58:11.479167 4744 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-11 00:58:11.979133457 +0000 UTC m=+248.783351062 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 11 00:58:11 crc kubenswrapper[4744]: I0311 00:58:11.485714 4744 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-6jqrc" podStartSLOduration=6.485696169 podStartE2EDuration="6.485696169s" podCreationTimestamp="2026-03-11 00:58:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-11 00:58:11.447912275 +0000 UTC m=+248.252129880" watchObservedRunningTime="2026-03-11 00:58:11.485696169 +0000 UTC m=+248.289913774" Mar 11 00:58:11 crc kubenswrapper[4744]: I0311 00:58:11.518649 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-6k4hw" event={"ID":"3ca02643-7e55-4bab-be3a-781a8017f11b","Type":"ContainerStarted","Data":"a77b8962887eb86a0c8d17aadba352055bd113cb1663de1b2f3d660b90fdb0e9"} Mar 11 00:58:11 crc kubenswrapper[4744]: I0311 00:58:11.521074 4744 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-9gtcq"] Mar 11 00:58:11 crc kubenswrapper[4744]: I0311 00:58:11.526469 4744 pod_startup_latency_tracker.go:104] "Observed pod 
startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29553165-4gpb9" podStartSLOduration=186.526451284 podStartE2EDuration="3m6.526451284s" podCreationTimestamp="2026-03-11 00:55:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-11 00:58:11.525226176 +0000 UTC m=+248.329443781" watchObservedRunningTime="2026-03-11 00:58:11.526451284 +0000 UTC m=+248.330668889" Mar 11 00:58:11 crc kubenswrapper[4744]: I0311 00:58:11.543262 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-7954f5f757-7qgjq" event={"ID":"ea1125a6-231a-4f21-b9d6-8cdb2a51e482","Type":"ContainerStarted","Data":"3c88aa6f67406efdf705dfdc9b68dca3b4032a0ad79ccfdf856ee1cb7c9a3359"} Mar 11 00:58:11 crc kubenswrapper[4744]: I0311 00:58:11.544448 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/downloads-7954f5f757-7qgjq" Mar 11 00:58:11 crc kubenswrapper[4744]: I0311 00:58:11.553759 4744 patch_prober.go:28] interesting pod/downloads-7954f5f757-7qgjq container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.45:8080/\": dial tcp 10.217.0.45:8080: connect: connection refused" start-of-body= Mar 11 00:58:11 crc kubenswrapper[4744]: I0311 00:58:11.553810 4744 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-7qgjq" podUID="ea1125a6-231a-4f21-b9d6-8cdb2a51e482" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.45:8080/\": dial tcp 10.217.0.45:8080: connect: connection refused" Mar 11 00:58:11 crc kubenswrapper[4744]: I0311 00:58:11.557793 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-gnfcr" 
event={"ID":"9b35ce65-26df-4169-a69a-a06c2420da9b","Type":"ContainerStarted","Data":"c7cdd5f3a90eba9ee6e80ed572a25e7b22c3cdd960e0b16e540ca885e782ac96"} Mar 11 00:58:11 crc kubenswrapper[4744]: I0311 00:58:11.583489 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-cg2gg\" (UID: \"97b9dce4-e1bd-400e-a4c2-848e9703db45\") " pod="openshift-image-registry/image-registry-697d97f7c8-cg2gg" Mar 11 00:58:11 crc kubenswrapper[4744]: E0311 00:58:11.585276 4744 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-11 00:58:12.085263474 +0000 UTC m=+248.889481079 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-cg2gg" (UID: "97b9dce4-e1bd-400e-a4c2-848e9703db45") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 11 00:58:11 crc kubenswrapper[4744]: I0311 00:58:11.588410 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-5444994796-9dcwq" event={"ID":"9b886295-15e1-4478-9ecb-ab71e77b99eb","Type":"ContainerStarted","Data":"06f1a61ee6bc87b792faa483f5beea7c15614797067045abb6d64a068b85c171"} Mar 11 00:58:11 crc kubenswrapper[4744]: I0311 00:58:11.611373 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-fhbhq" 
event={"ID":"bef5e0c2-5b78-4f32-affd-aec245c27db1","Type":"ContainerStarted","Data":"a0a5453ba0e58007752134e0c1bcc51f1e78fe9d0a7ff269d83aa46705941825"} Mar 11 00:58:11 crc kubenswrapper[4744]: I0311 00:58:11.611956 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-fhbhq" Mar 11 00:58:11 crc kubenswrapper[4744]: I0311 00:58:11.617697 4744 patch_prober.go:28] interesting pod/catalog-operator-68c6474976-fhbhq container/catalog-operator namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.35:8443/healthz\": dial tcp 10.217.0.35:8443: connect: connection refused" start-of-body= Mar 11 00:58:11 crc kubenswrapper[4744]: I0311 00:58:11.617764 4744 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-fhbhq" podUID="bef5e0c2-5b78-4f32-affd-aec245c27db1" containerName="catalog-operator" probeResult="failure" output="Get \"https://10.217.0.35:8443/healthz\": dial tcp 10.217.0.35:8443: connect: connection refused" Mar 11 00:58:11 crc kubenswrapper[4744]: I0311 00:58:11.655569 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-94dsr" event={"ID":"eeb963e8-d683-4564-9ccd-d26a6a755e94","Type":"ContainerStarted","Data":"ee7d5bbf4a77e9f2d94551f22848444b8546e9e8cfcdf86f1fd6dce6b3b6d3e9"} Mar 11 00:58:11 crc kubenswrapper[4744]: I0311 00:58:11.661895 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-f8rpg" event={"ID":"1ab34621-2907-423a-81ef-36fb8377874d","Type":"ContainerStarted","Data":"363d81e0c4bd1ccd933fce714485cf8485fc234c798494b4b0bee4d120c56e56"} Mar 11 00:58:11 crc kubenswrapper[4744]: I0311 00:58:11.684222 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 11 00:58:11 crc kubenswrapper[4744]: E0311 00:58:11.685532 4744 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-11 00:58:12.185420429 +0000 UTC m=+248.989638034 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 11 00:58:11 crc kubenswrapper[4744]: I0311 00:58:11.718758 4744 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-6k4hw" podStartSLOduration=186.718737455 podStartE2EDuration="3m6.718737455s" podCreationTimestamp="2026-03-11 00:55:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-11 00:58:11.717487126 +0000 UTC m=+248.521704731" watchObservedRunningTime="2026-03-11 00:58:11.718737455 +0000 UTC m=+248.522955060" Mar 11 00:58:11 crc kubenswrapper[4744]: I0311 00:58:11.734699 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-85z54" 
event={"ID":"66fb3c39-3d00-453f-a282-a04584652a8b","Type":"ContainerStarted","Data":"33349a80ddc81640cedaca7eee05c89ba9e879149874e74f4fe5deafd01e80ee"} Mar 11 00:58:11 crc kubenswrapper[4744]: I0311 00:58:11.734778 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-85z54" event={"ID":"66fb3c39-3d00-453f-a282-a04584652a8b","Type":"ContainerStarted","Data":"a56c667a44d684d082f427f84155e97c2af12aa8fc5ceeef6c8d9d03bc810102"} Mar 11 00:58:11 crc kubenswrapper[4744]: I0311 00:58:11.745712 4744 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-tqzmz"] Mar 11 00:58:11 crc kubenswrapper[4744]: I0311 00:58:11.770114 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd-operator/etcd-operator-b45778765-wm2rt" event={"ID":"de688bff-78ce-4d0f-ad7e-548ca640887a","Type":"ContainerStarted","Data":"6c3d67a983e140ad83ce41fc998e82a8f743eb23c9c92b120586507f799f33a0"} Mar 11 00:58:11 crc kubenswrapper[4744]: I0311 00:58:11.774714 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-k6qjd" event={"ID":"3834cb5e-8777-40cb-9a72-75c4d6fb5638","Type":"ContainerStarted","Data":"22739c3d7ec61b0021dfcc7c27c696c6baf34286150f9cc1160ce673f1eaa0c6"} Mar 11 00:58:11 crc kubenswrapper[4744]: I0311 00:58:11.775881 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-k6qjd" Mar 11 00:58:11 crc kubenswrapper[4744]: I0311 00:58:11.787161 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-cg2gg\" (UID: \"97b9dce4-e1bd-400e-a4c2-848e9703db45\") " 
pod="openshift-image-registry/image-registry-697d97f7c8-cg2gg" Mar 11 00:58:11 crc kubenswrapper[4744]: I0311 00:58:11.788884 4744 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-k6qjd container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.23:8080/healthz\": dial tcp 10.217.0.23:8080: connect: connection refused" start-of-body= Mar 11 00:58:11 crc kubenswrapper[4744]: I0311 00:58:11.788944 4744 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-k6qjd" podUID="3834cb5e-8777-40cb-9a72-75c4d6fb5638" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.23:8080/healthz\": dial tcp 10.217.0.23:8080: connect: connection refused" Mar 11 00:58:11 crc kubenswrapper[4744]: E0311 00:58:11.790109 4744 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-11 00:58:12.290094402 +0000 UTC m=+249.094312007 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-cg2gg" (UID: "97b9dce4-e1bd-400e-a4c2-848e9703db45") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 11 00:58:11 crc kubenswrapper[4744]: I0311 00:58:11.801521 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-56d4r" event={"ID":"940c3a71-e220-417d-8ac4-cc70a4a5afae","Type":"ContainerStarted","Data":"245c4b7291d15403ba833b335c99bdfcbedde6d7891f3161fb8d63fe294edca9"} Mar 11 00:58:11 crc kubenswrapper[4744]: I0311 00:58:11.801597 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-56d4r" event={"ID":"940c3a71-e220-417d-8ac4-cc70a4a5afae","Type":"ContainerStarted","Data":"653be9d1eca6fd6b95e423b4409a27f61467ff9a517c0d5110985102306dbdf9"} Mar 11 00:58:11 crc kubenswrapper[4744]: I0311 00:58:11.801714 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-56d4r" Mar 11 00:58:11 crc kubenswrapper[4744]: I0311 00:58:11.805369 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-chlpc" event={"ID":"0359d6fc-1139-4dbc-a50f-55fa91607935","Type":"ContainerStarted","Data":"c1dde7cdd8fd0405297eef3037e1addc2ac0e14e16cc291b49c838b607c3d397"} Mar 11 00:58:11 crc kubenswrapper[4744]: I0311 00:58:11.841845 4744 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-fhbhq" podStartSLOduration=186.841815405 
podStartE2EDuration="3m6.841815405s" podCreationTimestamp="2026-03-11 00:55:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-11 00:58:11.841778813 +0000 UTC m=+248.645996418" watchObservedRunningTime="2026-03-11 00:58:11.841815405 +0000 UTC m=+248.646033010" Mar 11 00:58:11 crc kubenswrapper[4744]: I0311 00:58:11.846731 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-4jlsm" event={"ID":"47244264-b9e7-4f86-85d7-5406ed8d8833","Type":"ContainerStarted","Data":"a7e9e8fd8f7ca56f11dd769385d8f6ccc87492aed8cf6ab15aee10e498589882"} Mar 11 00:58:11 crc kubenswrapper[4744]: I0311 00:58:11.852269 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-wcmgb" event={"ID":"6fafcf3b-e706-47a8-9501-beb09a76d0bf","Type":"ContainerStarted","Data":"567271e0ffe77872623f985b414d765770ec9c1f0c78b8c42e2ee37a767feb6e"} Mar 11 00:58:11 crc kubenswrapper[4744]: I0311 00:58:11.852346 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-wcmgb" event={"ID":"6fafcf3b-e706-47a8-9501-beb09a76d0bf","Type":"ContainerStarted","Data":"182289030ceda3676f2717c1969328d2b2228c50e71e427bf53cd5cc22334d17"} Mar 11 00:58:11 crc kubenswrapper[4744]: I0311 00:58:11.883703 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-t54km" event={"ID":"7fb8ca9a-b3e7-4fce-b173-f8b2519962da","Type":"ContainerStarted","Data":"9f3689a77d70e9153290522a5973b695f906bc4d14841961aae1220ae84d3961"} Mar 11 00:58:11 crc kubenswrapper[4744]: I0311 00:58:11.888573 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 11 00:58:11 crc kubenswrapper[4744]: E0311 00:58:11.889654 4744 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-11 00:58:12.389635667 +0000 UTC m=+249.193853272 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 11 00:58:11 crc kubenswrapper[4744]: I0311 00:58:11.911157 4744 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-94dsr" podStartSLOduration=186.911137439 podStartE2EDuration="3m6.911137439s" podCreationTimestamp="2026-03-11 00:55:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-11 00:58:11.908152177 +0000 UTC m=+248.712369782" watchObservedRunningTime="2026-03-11 00:58:11.911137439 +0000 UTC m=+248.715355054" Mar 11 00:58:11 crc kubenswrapper[4744]: I0311 00:58:11.921212 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-rq8sg" event={"ID":"4e1cff98-4344-4952-8d6c-d4f5c8d58628","Type":"ContainerStarted","Data":"d92da5bc98e3a90b982354c2d5dcc06e7cfb331a5d1d6561a1bbe2c5ff5bcdbd"} Mar 11 00:58:11 crc kubenswrapper[4744]: 
I0311 00:58:11.921985 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-dns/dns-default-rq8sg" Mar 11 00:58:11 crc kubenswrapper[4744]: I0311 00:58:11.967285 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-txbh2" event={"ID":"24ac204b-1627-404f-b33c-fc77ded356d1","Type":"ContainerStarted","Data":"e54d2dba443c55d5600d84d02c8fffbfdc7521668d7f0591b5fc7af32c5cb1e9"} Mar 11 00:58:11 crc kubenswrapper[4744]: I0311 00:58:11.977203 4744 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-f8rpg" podStartSLOduration=186.977181932 podStartE2EDuration="3m6.977181932s" podCreationTimestamp="2026-03-11 00:55:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-11 00:58:11.952038428 +0000 UTC m=+248.756256043" watchObservedRunningTime="2026-03-11 00:58:11.977181932 +0000 UTC m=+248.781399537" Mar 11 00:58:11 crc kubenswrapper[4744]: I0311 00:58:11.991614 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-cg2gg\" (UID: \"97b9dce4-e1bd-400e-a4c2-848e9703db45\") " pod="openshift-image-registry/image-registry-697d97f7c8-cg2gg" Mar 11 00:58:11 crc kubenswrapper[4744]: E0311 00:58:11.993360 4744 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-11 00:58:12.49334499 +0000 UTC m=+249.297562595 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-cg2gg" (UID: "97b9dce4-e1bd-400e-a4c2-848e9703db45") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 11 00:58:12 crc kubenswrapper[4744]: I0311 00:58:12.010950 4744 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/downloads-7954f5f757-7qgjq" podStartSLOduration=187.010930591 podStartE2EDuration="3m7.010930591s" podCreationTimestamp="2026-03-11 00:55:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-11 00:58:12.010857119 +0000 UTC m=+248.815074724" watchObservedRunningTime="2026-03-11 00:58:12.010930591 +0000 UTC m=+248.815148196" Mar 11 00:58:12 crc kubenswrapper[4744]: I0311 00:58:12.036072 4744 ???:1] "http: TLS handshake error from 192.168.126.11:47038: no serving certificate available for the kubelet" Mar 11 00:58:12 crc kubenswrapper[4744]: I0311 00:58:12.049223 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-lws9c" event={"ID":"7d1c92dd-43a7-4311-90b1-54441f84787e","Type":"ContainerStarted","Data":"665230a660fa06de0099e56b6bf1f5c4819c885f1e1964b437a913d738d29a4b"} Mar 11 00:58:12 crc kubenswrapper[4744]: I0311 00:58:12.080101 4744 generic.go:334] "Generic (PLEG): container finished" podID="5f720e42-33ed-4144-88d6-5fb8c4befac2" containerID="22939e4e39c9f840c33a2600719ccf362c2355aa9c8dd0917031f98c42c4ea6b" exitCode=0 Mar 11 00:58:12 crc kubenswrapper[4744]: I0311 00:58:12.082452 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-h49lh" 
event={"ID":"5f720e42-33ed-4144-88d6-5fb8c4befac2","Type":"ContainerDied","Data":"22939e4e39c9f840c33a2600719ccf362c2355aa9c8dd0917031f98c42c4ea6b"} Mar 11 00:58:12 crc kubenswrapper[4744]: I0311 00:58:12.092924 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 11 00:58:12 crc kubenswrapper[4744]: E0311 00:58:12.093198 4744 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-11 00:58:12.593167544 +0000 UTC m=+249.397385149 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 11 00:58:12 crc kubenswrapper[4744]: I0311 00:58:12.093549 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-cg2gg\" (UID: \"97b9dce4-e1bd-400e-a4c2-848e9703db45\") " pod="openshift-image-registry/image-registry-697d97f7c8-cg2gg" Mar 11 00:58:12 crc kubenswrapper[4744]: E0311 00:58:12.094049 4744 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-11 00:58:12.594038031 +0000 UTC m=+249.398255636 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-cg2gg" (UID: "97b9dce4-e1bd-400e-a4c2-848e9703db45") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 11 00:58:12 crc kubenswrapper[4744]: I0311 00:58:12.122283 4744 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-gnfcr" podStartSLOduration=187.12225469 podStartE2EDuration="3m7.12225469s" podCreationTimestamp="2026-03-11 00:55:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-11 00:58:12.03003193 +0000 UTC m=+248.834249535" watchObservedRunningTime="2026-03-11 00:58:12.12225469 +0000 UTC m=+248.926472295" Mar 11 00:58:12 crc kubenswrapper[4744]: I0311 00:58:12.122812 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-v4b66" event={"ID":"a8e11ddd-81e3-40f9-8ada-12abfacedca9","Type":"ContainerStarted","Data":"4f69ef6c77a38a3d59baffb623a8ca8b19b3c2db2b6323907099926e0034d2f5"} Mar 11 00:58:12 crc kubenswrapper[4744]: I0311 00:58:12.122927 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-v4b66" 
event={"ID":"a8e11ddd-81e3-40f9-8ada-12abfacedca9","Type":"ContainerStarted","Data":"d09d00811b42c636da98be2e3c2dc7e62b825364104f7023eb20d06a1a17799c"} Mar 11 00:58:12 crc kubenswrapper[4744]: I0311 00:58:12.162780 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-bbjh8" event={"ID":"acc8a04b-d619-4e9a-b2b0-f08250f329e0","Type":"ContainerStarted","Data":"4fb1c2daf838d69e874b29d5d1b2058469d7299c280ea1317d6fd2ba735f09ec"} Mar 11 00:58:12 crc kubenswrapper[4744]: I0311 00:58:12.165814 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-879f6c89f-9gtcq" Mar 11 00:58:12 crc kubenswrapper[4744]: I0311 00:58:12.166682 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-tqzmz" Mar 11 00:58:12 crc kubenswrapper[4744]: I0311 00:58:12.181653 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-879f6c89f-9gtcq" Mar 11 00:58:12 crc kubenswrapper[4744]: I0311 00:58:12.181892 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-authentication/oauth-openshift-558db77b4-4tcts" Mar 11 00:58:12 crc kubenswrapper[4744]: I0311 00:58:12.183275 4744 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-56d4r" podStartSLOduration=187.183252338 podStartE2EDuration="3m7.183252338s" podCreationTimestamp="2026-03-11 00:55:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-11 00:58:12.124071815 +0000 UTC m=+248.928289420" watchObservedRunningTime="2026-03-11 00:58:12.183252338 +0000 UTC m=+248.987469933" Mar 11 00:58:12 crc kubenswrapper[4744]: 
I0311 00:58:12.185121 4744 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/marketplace-operator-79b997595-k6qjd" podStartSLOduration=186.185115495 podStartE2EDuration="3m6.185115495s" podCreationTimestamp="2026-03-11 00:55:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-11 00:58:12.182324319 +0000 UTC m=+248.986541924" watchObservedRunningTime="2026-03-11 00:58:12.185115495 +0000 UTC m=+248.989333100" Mar 11 00:58:12 crc kubenswrapper[4744]: I0311 00:58:12.195911 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-tqzmz" Mar 11 00:58:12 crc kubenswrapper[4744]: I0311 00:58:12.197689 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 11 00:58:12 crc kubenswrapper[4744]: E0311 00:58:12.198726 4744 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-11 00:58:12.698707603 +0000 UTC m=+249.502925208 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 11 00:58:12 crc kubenswrapper[4744]: I0311 00:58:12.273047 4744 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-wcmgb" podStartSLOduration=187.273018642 podStartE2EDuration="3m7.273018642s" podCreationTimestamp="2026-03-11 00:55:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-11 00:58:12.219097991 +0000 UTC m=+249.023315596" watchObservedRunningTime="2026-03-11 00:58:12.273018642 +0000 UTC m=+249.077236247" Mar 11 00:58:12 crc kubenswrapper[4744]: I0311 00:58:12.306930 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-cg2gg\" (UID: \"97b9dce4-e1bd-400e-a4c2-848e9703db45\") " pod="openshift-image-registry/image-registry-697d97f7c8-cg2gg" Mar 11 00:58:12 crc kubenswrapper[4744]: E0311 00:58:12.311343 4744 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-11 00:58:12.811328881 +0000 UTC m=+249.615546486 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-cg2gg" (UID: "97b9dce4-e1bd-400e-a4c2-848e9703db45") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 11 00:58:12 crc kubenswrapper[4744]: I0311 00:58:12.328269 4744 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-rq8sg" podStartSLOduration=8.328235812 podStartE2EDuration="8.328235812s" podCreationTimestamp="2026-03-11 00:58:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-11 00:58:12.274566219 +0000 UTC m=+249.078783824" watchObservedRunningTime="2026-03-11 00:58:12.328235812 +0000 UTC m=+249.132453417" Mar 11 00:58:12 crc kubenswrapper[4744]: I0311 00:58:12.330116 4744 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-chlpc" podStartSLOduration=187.330111339 podStartE2EDuration="3m7.330111339s" podCreationTimestamp="2026-03-11 00:55:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-11 00:58:12.327607252 +0000 UTC m=+249.131824857" watchObservedRunningTime="2026-03-11 00:58:12.330111339 +0000 UTC m=+249.134328944" Mar 11 00:58:12 crc kubenswrapper[4744]: I0311 00:58:12.345038 4744 patch_prober.go:28] interesting pod/router-default-5444994796-9dcwq container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 11 00:58:12 crc kubenswrapper[4744]: 
[-]has-synced failed: reason withheld Mar 11 00:58:12 crc kubenswrapper[4744]: [+]process-running ok Mar 11 00:58:12 crc kubenswrapper[4744]: healthz check failed Mar 11 00:58:12 crc kubenswrapper[4744]: I0311 00:58:12.345113 4744 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-9dcwq" podUID="9b886295-15e1-4478-9ecb-ab71e77b99eb" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 11 00:58:12 crc kubenswrapper[4744]: I0311 00:58:12.412287 4744 patch_prober.go:28] interesting pod/machine-config-daemon-678nx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 11 00:58:12 crc kubenswrapper[4744]: I0311 00:58:12.412354 4744 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-678nx" podUID="a15dc7ac-7c34-4135-b6eb-a85122800ce9" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 11 00:58:12 crc kubenswrapper[4744]: I0311 00:58:12.412800 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 11 00:58:12 crc kubenswrapper[4744]: E0311 00:58:12.413263 4744 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2026-03-11 00:58:12.913177987 +0000 UTC m=+249.717395582 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 11 00:58:12 crc kubenswrapper[4744]: I0311 00:58:12.514327 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-cg2gg\" (UID: \"97b9dce4-e1bd-400e-a4c2-848e9703db45\") " pod="openshift-image-registry/image-registry-697d97f7c8-cg2gg" Mar 11 00:58:12 crc kubenswrapper[4744]: E0311 00:58:12.514756 4744 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-11 00:58:13.014742634 +0000 UTC m=+249.818960239 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-cg2gg" (UID: "97b9dce4-e1bd-400e-a4c2-848e9703db45") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 11 00:58:12 crc kubenswrapper[4744]: I0311 00:58:12.570131 4744 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-admission-controller-857f4d67dd-85z54" podStartSLOduration=187.57011213 podStartE2EDuration="3m7.57011213s" podCreationTimestamp="2026-03-11 00:55:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-11 00:58:12.384813284 +0000 UTC m=+249.189030889" watchObservedRunningTime="2026-03-11 00:58:12.57011213 +0000 UTC m=+249.374329735" Mar 11 00:58:12 crc kubenswrapper[4744]: I0311 00:58:12.570760 4744 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-t54km" podStartSLOduration=187.570756229 podStartE2EDuration="3m7.570756229s" podCreationTimestamp="2026-03-11 00:55:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-11 00:58:12.569684426 +0000 UTC m=+249.373902031" watchObservedRunningTime="2026-03-11 00:58:12.570756229 +0000 UTC m=+249.374973834" Mar 11 00:58:12 crc kubenswrapper[4744]: I0311 00:58:12.632962 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: 
\"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 11 00:58:12 crc kubenswrapper[4744]: E0311 00:58:12.633601 4744 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-11 00:58:13.133569873 +0000 UTC m=+249.937787478 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 11 00:58:12 crc kubenswrapper[4744]: I0311 00:58:12.643699 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-78fcc" Mar 11 00:58:12 crc kubenswrapper[4744]: I0311 00:58:12.735381 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-cg2gg\" (UID: \"97b9dce4-e1bd-400e-a4c2-848e9703db45\") " pod="openshift-image-registry/image-registry-697d97f7c8-cg2gg" Mar 11 00:58:12 crc kubenswrapper[4744]: E0311 00:58:12.736434 4744 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-11 00:58:13.23641972 +0000 UTC m=+250.040637325 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-cg2gg" (UID: "97b9dce4-e1bd-400e-a4c2-848e9703db45") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 11 00:58:12 crc kubenswrapper[4744]: I0311 00:58:12.771635 4744 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-bbjh8" podStartSLOduration=187.771618374 podStartE2EDuration="3m7.771618374s" podCreationTimestamp="2026-03-11 00:55:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-11 00:58:12.768269671 +0000 UTC m=+249.572487276" watchObservedRunningTime="2026-03-11 00:58:12.771618374 +0000 UTC m=+249.575835979" Mar 11 00:58:12 crc kubenswrapper[4744]: I0311 00:58:12.806597 4744 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-v4b66" podStartSLOduration=187.80657263 podStartE2EDuration="3m7.80657263s" podCreationTimestamp="2026-03-11 00:55:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-11 00:58:12.806188458 +0000 UTC m=+249.610406063" watchObservedRunningTime="2026-03-11 00:58:12.80657263 +0000 UTC m=+249.610790235" Mar 11 00:58:12 crc kubenswrapper[4744]: I0311 00:58:12.824022 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-kgl9z" Mar 11 00:58:12 crc kubenswrapper[4744]: I0311 00:58:12.837096 4744 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 11 00:58:12 crc kubenswrapper[4744]: E0311 00:58:12.837583 4744 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-11 00:58:13.337564314 +0000 UTC m=+250.141781919 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 11 00:58:12 crc kubenswrapper[4744]: I0311 00:58:12.940812 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-cg2gg\" (UID: \"97b9dce4-e1bd-400e-a4c2-848e9703db45\") " pod="openshift-image-registry/image-registry-697d97f7c8-cg2gg" Mar 11 00:58:12 crc kubenswrapper[4744]: E0311 00:58:12.941242 4744 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-11 00:58:13.441220796 +0000 UTC m=+250.245438401 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-cg2gg" (UID: "97b9dce4-e1bd-400e-a4c2-848e9703db45") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 11 00:58:12 crc kubenswrapper[4744]: I0311 00:58:12.949118 4744 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-api/machine-api-operator-5694c8668f-lws9c" podStartSLOduration=187.949096478 podStartE2EDuration="3m7.949096478s" podCreationTimestamp="2026-03-11 00:55:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-11 00:58:12.903099683 +0000 UTC m=+249.707317278" watchObservedRunningTime="2026-03-11 00:58:12.949096478 +0000 UTC m=+249.753314083" Mar 11 00:58:13 crc kubenswrapper[4744]: I0311 00:58:13.042124 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 11 00:58:13 crc kubenswrapper[4744]: E0311 00:58:13.042481 4744 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-11 00:58:13.542463844 +0000 UTC m=+250.346681449 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 11 00:58:13 crc kubenswrapper[4744]: I0311 00:58:13.143728 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-cg2gg\" (UID: \"97b9dce4-e1bd-400e-a4c2-848e9703db45\") " pod="openshift-image-registry/image-registry-697d97f7c8-cg2gg" Mar 11 00:58:13 crc kubenswrapper[4744]: E0311 00:58:13.144178 4744 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-11 00:58:13.644158725 +0000 UTC m=+250.448376330 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-cg2gg" (UID: "97b9dce4-e1bd-400e-a4c2-848e9703db45") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 11 00:58:13 crc kubenswrapper[4744]: I0311 00:58:13.165410 4744 patch_prober.go:28] interesting pod/console-operator-58897d9998-ptbbd container/console-operator namespace/openshift-console-operator: Readiness probe status=failure output="Get \"https://10.217.0.22:8443/readyz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 11 00:58:13 crc kubenswrapper[4744]: I0311 00:58:13.165470 4744 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console-operator/console-operator-58897d9998-ptbbd" podUID="c284cbf1-b5e2-4f77-b14c-c0030f140a91" containerName="console-operator" probeResult="failure" output="Get \"https://10.217.0.22:8443/readyz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 11 00:58:13 crc kubenswrapper[4744]: I0311 00:58:13.225431 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-rq8sg" event={"ID":"4e1cff98-4344-4952-8d6c-d4f5c8d58628","Type":"ContainerStarted","Data":"c88cca960c9a5d02bda2f6bf98a410aaa6ef6e21ef4e89dac096fe90ba2678fb"} Mar 11 00:58:13 crc kubenswrapper[4744]: I0311 00:58:13.244928 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: 
\"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 11 00:58:13 crc kubenswrapper[4744]: E0311 00:58:13.245162 4744 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-11 00:58:13.745130874 +0000 UTC m=+250.549348479 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 11 00:58:13 crc kubenswrapper[4744]: I0311 00:58:13.245350 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-cg2gg\" (UID: \"97b9dce4-e1bd-400e-a4c2-848e9703db45\") " pod="openshift-image-registry/image-registry-697d97f7c8-cg2gg" Mar 11 00:58:13 crc kubenswrapper[4744]: E0311 00:58:13.245787 4744 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-11 00:58:13.745773723 +0000 UTC m=+250.549991328 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-cg2gg" (UID: "97b9dce4-e1bd-400e-a4c2-848e9703db45") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 11 00:58:13 crc kubenswrapper[4744]: I0311 00:58:13.271149 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-txbh2" event={"ID":"24ac204b-1627-404f-b33c-fc77ded356d1","Type":"ContainerStarted","Data":"37e34d1d18ddbdfb387209f7aaafed9b3dd36bd6936969478329141e21029576"} Mar 11 00:58:13 crc kubenswrapper[4744]: I0311 00:58:13.286613 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-h49lh" event={"ID":"5f720e42-33ed-4144-88d6-5fb8c4befac2","Type":"ContainerStarted","Data":"0445b6293d9e283abfbd9e784e891616de2fc6690684e9f8ecb0168a83142c4a"} Mar 11 00:58:13 crc kubenswrapper[4744]: I0311 00:58:13.331636 4744 patch_prober.go:28] interesting pod/router-default-5444994796-9dcwq container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 11 00:58:13 crc kubenswrapper[4744]: [-]has-synced failed: reason withheld Mar 11 00:58:13 crc kubenswrapper[4744]: [+]process-running ok Mar 11 00:58:13 crc kubenswrapper[4744]: healthz check failed Mar 11 00:58:13 crc kubenswrapper[4744]: I0311 00:58:13.331713 4744 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-9dcwq" podUID="9b886295-15e1-4478-9ecb-ab71e77b99eb" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 11 00:58:13 crc kubenswrapper[4744]: I0311 
00:58:13.331827 4744 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-apiserver/apiserver-76f77b778f-txbh2" podStartSLOduration=188.331806553 podStartE2EDuration="3m8.331806553s" podCreationTimestamp="2026-03-11 00:55:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-11 00:58:13.328902934 +0000 UTC m=+250.133120539" watchObservedRunningTime="2026-03-11 00:58:13.331806553 +0000 UTC m=+250.136024158" Mar 11 00:58:13 crc kubenswrapper[4744]: I0311 00:58:13.341782 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-4rmzp" event={"ID":"c14e937f-6acb-4f36-9432-eb77464ce9c9","Type":"ContainerStarted","Data":"b9fd35eedbaf806ff9e7532812655bb013d31c4c6a5542ba84cf699b7b3d8edd"} Mar 11 00:58:13 crc kubenswrapper[4744]: I0311 00:58:13.346522 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 11 00:58:13 crc kubenswrapper[4744]: E0311 00:58:13.347622 4744 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-11 00:58:13.847571098 +0000 UTC m=+250.651788703 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 11 00:58:13 crc kubenswrapper[4744]: I0311 00:58:13.366542 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-hf5rf" event={"ID":"c2a93df7-e24f-4681-8336-ef07295f1d09","Type":"ContainerStarted","Data":"f3d9bf944c025612fe14160e18626e9b07648cebe9459f8277d45072eb82e8a1"} Mar 11 00:58:13 crc kubenswrapper[4744]: I0311 00:58:13.368025 4744 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-k6qjd container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.23:8080/healthz\": dial tcp 10.217.0.23:8080: connect: connection refused" start-of-body= Mar 11 00:58:13 crc kubenswrapper[4744]: I0311 00:58:13.368019 4744 patch_prober.go:28] interesting pod/downloads-7954f5f757-7qgjq container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.45:8080/\": dial tcp 10.217.0.45:8080: connect: connection refused" start-of-body= Mar 11 00:58:13 crc kubenswrapper[4744]: I0311 00:58:13.368110 4744 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-7qgjq" podUID="ea1125a6-231a-4f21-b9d6-8cdb2a51e482" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.45:8080/\": dial tcp 10.217.0.45:8080: connect: connection refused" Mar 11 00:58:13 crc kubenswrapper[4744]: I0311 00:58:13.368071 4744 prober.go:107] "Probe failed" probeType="Readiness" 
pod="openshift-marketplace/marketplace-operator-79b997595-k6qjd" podUID="3834cb5e-8777-40cb-9a72-75c4d6fb5638" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.23:8080/healthz\": dial tcp 10.217.0.23:8080: connect: connection refused" Mar 11 00:58:13 crc kubenswrapper[4744]: I0311 00:58:13.369387 4744 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-879f6c89f-9gtcq" podUID="8495f637-031c-4280-be13-d5aae9c99eca" containerName="controller-manager" containerID="cri-o://ba9d905404c5f00fbca28dcf442ba28796ae4d0593acd3e3da2353598f146b00" gracePeriod=30 Mar 11 00:58:13 crc kubenswrapper[4744]: I0311 00:58:13.369756 4744 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-tqzmz" podUID="fa83e422-8374-4da6-a356-ae7feadfe282" containerName="route-controller-manager" containerID="cri-o://1ac0bd62633a42c5cbeb7e5e2b54cf8b838860f7c3a82c4b2d9fd3a969f4de47" gracePeriod=30 Mar 11 00:58:13 crc kubenswrapper[4744]: I0311 00:58:13.375974 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-fhbhq" Mar 11 00:58:13 crc kubenswrapper[4744]: I0311 00:58:13.411778 4744 ???:1] "http: TLS handshake error from 192.168.126.11:47042: no serving certificate available for the kubelet" Mar 11 00:58:13 crc kubenswrapper[4744]: I0311 00:58:13.422796 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console-operator/console-operator-58897d9998-ptbbd" Mar 11 00:58:13 crc kubenswrapper[4744]: I0311 00:58:13.446965 4744 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-h49lh" podStartSLOduration=187.446922307 podStartE2EDuration="3m7.446922307s" podCreationTimestamp="2026-03-11 00:55:06 +0000 UTC" 
firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-11 00:58:13.388181058 +0000 UTC m=+250.192398663" watchObservedRunningTime="2026-03-11 00:58:13.446922307 +0000 UTC m=+250.251139902" Mar 11 00:58:13 crc kubenswrapper[4744]: I0311 00:58:13.449020 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-cg2gg\" (UID: \"97b9dce4-e1bd-400e-a4c2-848e9703db45\") " pod="openshift-image-registry/image-registry-697d97f7c8-cg2gg" Mar 11 00:58:13 crc kubenswrapper[4744]: E0311 00:58:13.458412 4744 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-11 00:58:13.95839612 +0000 UTC m=+250.762613725 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-cg2gg" (UID: "97b9dce4-e1bd-400e-a4c2-848e9703db45") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 11 00:58:13 crc kubenswrapper[4744]: I0311 00:58:13.499194 4744 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-config-operator/openshift-config-operator-7777fb866f-hf5rf" podStartSLOduration=188.499172986 podStartE2EDuration="3m8.499172986s" podCreationTimestamp="2026-03-11 00:55:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-11 00:58:13.456656717 +0000 UTC m=+250.260874322" watchObservedRunningTime="2026-03-11 00:58:13.499172986 +0000 UTC m=+250.303390591" Mar 11 00:58:13 crc kubenswrapper[4744]: I0311 00:58:13.556896 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 11 00:58:13 crc kubenswrapper[4744]: E0311 00:58:13.558962 4744 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-11 00:58:14.058930906 +0000 UTC m=+250.863148511 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 11 00:58:13 crc kubenswrapper[4744]: I0311 00:58:13.661762 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-cg2gg\" (UID: \"97b9dce4-e1bd-400e-a4c2-848e9703db45\") " pod="openshift-image-registry/image-registry-697d97f7c8-cg2gg" Mar 11 00:58:13 crc kubenswrapper[4744]: E0311 00:58:13.662143 4744 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-11 00:58:14.162129614 +0000 UTC m=+250.966347219 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-cg2gg" (UID: "97b9dce4-e1bd-400e-a4c2-848e9703db45") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 11 00:58:13 crc kubenswrapper[4744]: I0311 00:58:13.765804 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 11 00:58:13 crc kubenswrapper[4744]: E0311 00:58:13.766671 4744 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-11 00:58:14.266649512 +0000 UTC m=+251.070867117 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 11 00:58:13 crc kubenswrapper[4744]: I0311 00:58:13.769039 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-n9qr2"] Mar 11 00:58:13 crc kubenswrapper[4744]: I0311 00:58:13.770127 4744 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-n9qr2" Mar 11 00:58:13 crc kubenswrapper[4744]: I0311 00:58:13.774610 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g" Mar 11 00:58:13 crc kubenswrapper[4744]: I0311 00:58:13.825939 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-n9qr2"] Mar 11 00:58:13 crc kubenswrapper[4744]: I0311 00:58:13.868129 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-cg2gg\" (UID: \"97b9dce4-e1bd-400e-a4c2-848e9703db45\") " pod="openshift-image-registry/image-registry-697d97f7c8-cg2gg" Mar 11 00:58:13 crc kubenswrapper[4744]: I0311 00:58:13.868215 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kctvp\" (UniqueName: \"kubernetes.io/projected/a870760b-88e5-4526-8f91-ef89201e2a13-kube-api-access-kctvp\") pod \"certified-operators-n9qr2\" (UID: \"a870760b-88e5-4526-8f91-ef89201e2a13\") " pod="openshift-marketplace/certified-operators-n9qr2" Mar 11 00:58:13 crc kubenswrapper[4744]: I0311 00:58:13.868245 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a870760b-88e5-4526-8f91-ef89201e2a13-utilities\") pod \"certified-operators-n9qr2\" (UID: \"a870760b-88e5-4526-8f91-ef89201e2a13\") " pod="openshift-marketplace/certified-operators-n9qr2" Mar 11 00:58:13 crc kubenswrapper[4744]: I0311 00:58:13.868262 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/a870760b-88e5-4526-8f91-ef89201e2a13-catalog-content\") pod \"certified-operators-n9qr2\" (UID: \"a870760b-88e5-4526-8f91-ef89201e2a13\") " pod="openshift-marketplace/certified-operators-n9qr2" Mar 11 00:58:13 crc kubenswrapper[4744]: E0311 00:58:13.868629 4744 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-11 00:58:14.368602251 +0000 UTC m=+251.172819856 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-cg2gg" (UID: "97b9dce4-e1bd-400e-a4c2-848e9703db45") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 11 00:58:13 crc kubenswrapper[4744]: I0311 00:58:13.939343 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-config-operator/openshift-config-operator-7777fb866f-hf5rf" Mar 11 00:58:13 crc kubenswrapper[4744]: I0311 00:58:13.972325 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-kk7lx"] Mar 11 00:58:13 crc kubenswrapper[4744]: I0311 00:58:13.973368 4744 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-kk7lx" Mar 11 00:58:13 crc kubenswrapper[4744]: I0311 00:58:13.974305 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 11 00:58:13 crc kubenswrapper[4744]: I0311 00:58:13.974602 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kctvp\" (UniqueName: \"kubernetes.io/projected/a870760b-88e5-4526-8f91-ef89201e2a13-kube-api-access-kctvp\") pod \"certified-operators-n9qr2\" (UID: \"a870760b-88e5-4526-8f91-ef89201e2a13\") " pod="openshift-marketplace/certified-operators-n9qr2" Mar 11 00:58:13 crc kubenswrapper[4744]: I0311 00:58:13.974647 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a870760b-88e5-4526-8f91-ef89201e2a13-utilities\") pod \"certified-operators-n9qr2\" (UID: \"a870760b-88e5-4526-8f91-ef89201e2a13\") " pod="openshift-marketplace/certified-operators-n9qr2" Mar 11 00:58:13 crc kubenswrapper[4744]: I0311 00:58:13.974661 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a870760b-88e5-4526-8f91-ef89201e2a13-catalog-content\") pod \"certified-operators-n9qr2\" (UID: \"a870760b-88e5-4526-8f91-ef89201e2a13\") " pod="openshift-marketplace/certified-operators-n9qr2" Mar 11 00:58:13 crc kubenswrapper[4744]: I0311 00:58:13.975178 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a870760b-88e5-4526-8f91-ef89201e2a13-catalog-content\") pod \"certified-operators-n9qr2\" (UID: \"a870760b-88e5-4526-8f91-ef89201e2a13\") 
" pod="openshift-marketplace/certified-operators-n9qr2" Mar 11 00:58:13 crc kubenswrapper[4744]: E0311 00:58:13.975244 4744 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-11 00:58:14.475231095 +0000 UTC m=+251.279448700 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 11 00:58:13 crc kubenswrapper[4744]: I0311 00:58:13.975735 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a870760b-88e5-4526-8f91-ef89201e2a13-utilities\") pod \"certified-operators-n9qr2\" (UID: \"a870760b-88e5-4526-8f91-ef89201e2a13\") " pod="openshift-marketplace/certified-operators-n9qr2" Mar 11 00:58:13 crc kubenswrapper[4744]: I0311 00:58:13.977874 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl" Mar 11 00:58:13 crc kubenswrapper[4744]: I0311 00:58:13.992065 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-kk7lx"] Mar 11 00:58:14 crc kubenswrapper[4744]: I0311 00:58:14.028089 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kctvp\" (UniqueName: \"kubernetes.io/projected/a870760b-88e5-4526-8f91-ef89201e2a13-kube-api-access-kctvp\") pod \"certified-operators-n9qr2\" (UID: \"a870760b-88e5-4526-8f91-ef89201e2a13\") " 
pod="openshift-marketplace/certified-operators-n9qr2" Mar 11 00:58:14 crc kubenswrapper[4744]: I0311 00:58:14.076384 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tx9dj\" (UniqueName: \"kubernetes.io/projected/9c559a48-ac87-4cab-848d-f2f647f8396b-kube-api-access-tx9dj\") pod \"community-operators-kk7lx\" (UID: \"9c559a48-ac87-4cab-848d-f2f647f8396b\") " pod="openshift-marketplace/community-operators-kk7lx" Mar 11 00:58:14 crc kubenswrapper[4744]: I0311 00:58:14.076428 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9c559a48-ac87-4cab-848d-f2f647f8396b-utilities\") pod \"community-operators-kk7lx\" (UID: \"9c559a48-ac87-4cab-848d-f2f647f8396b\") " pod="openshift-marketplace/community-operators-kk7lx" Mar 11 00:58:14 crc kubenswrapper[4744]: I0311 00:58:14.076483 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9c559a48-ac87-4cab-848d-f2f647f8396b-catalog-content\") pod \"community-operators-kk7lx\" (UID: \"9c559a48-ac87-4cab-848d-f2f647f8396b\") " pod="openshift-marketplace/community-operators-kk7lx" Mar 11 00:58:14 crc kubenswrapper[4744]: I0311 00:58:14.076562 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-cg2gg\" (UID: \"97b9dce4-e1bd-400e-a4c2-848e9703db45\") " pod="openshift-image-registry/image-registry-697d97f7c8-cg2gg" Mar 11 00:58:14 crc kubenswrapper[4744]: E0311 00:58:14.076898 4744 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" 
failed. No retries permitted until 2026-03-11 00:58:14.576884664 +0000 UTC m=+251.381102269 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-cg2gg" (UID: "97b9dce4-e1bd-400e-a4c2-848e9703db45") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 11 00:58:14 crc kubenswrapper[4744]: I0311 00:58:14.099942 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-n9qr2" Mar 11 00:58:14 crc kubenswrapper[4744]: I0311 00:58:14.177271 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-9zdvr"] Mar 11 00:58:14 crc kubenswrapper[4744]: I0311 00:58:14.177973 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 11 00:58:14 crc kubenswrapper[4744]: I0311 00:58:14.178226 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9c559a48-ac87-4cab-848d-f2f647f8396b-catalog-content\") pod \"community-operators-kk7lx\" (UID: \"9c559a48-ac87-4cab-848d-f2f647f8396b\") " pod="openshift-marketplace/community-operators-kk7lx" Mar 11 00:58:14 crc kubenswrapper[4744]: I0311 00:58:14.178328 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tx9dj\" (UniqueName: 
\"kubernetes.io/projected/9c559a48-ac87-4cab-848d-f2f647f8396b-kube-api-access-tx9dj\") pod \"community-operators-kk7lx\" (UID: \"9c559a48-ac87-4cab-848d-f2f647f8396b\") " pod="openshift-marketplace/community-operators-kk7lx" Mar 11 00:58:14 crc kubenswrapper[4744]: I0311 00:58:14.178367 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9c559a48-ac87-4cab-848d-f2f647f8396b-utilities\") pod \"community-operators-kk7lx\" (UID: \"9c559a48-ac87-4cab-848d-f2f647f8396b\") " pod="openshift-marketplace/community-operators-kk7lx" Mar 11 00:58:14 crc kubenswrapper[4744]: I0311 00:58:14.178498 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-9zdvr" Mar 11 00:58:14 crc kubenswrapper[4744]: I0311 00:58:14.178858 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9c559a48-ac87-4cab-848d-f2f647f8396b-utilities\") pod \"community-operators-kk7lx\" (UID: \"9c559a48-ac87-4cab-848d-f2f647f8396b\") " pod="openshift-marketplace/community-operators-kk7lx" Mar 11 00:58:14 crc kubenswrapper[4744]: E0311 00:58:14.178961 4744 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-11 00:58:14.678942207 +0000 UTC m=+251.483159812 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 11 00:58:14 crc kubenswrapper[4744]: I0311 00:58:14.179163 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9c559a48-ac87-4cab-848d-f2f647f8396b-catalog-content\") pod \"community-operators-kk7lx\" (UID: \"9c559a48-ac87-4cab-848d-f2f647f8396b\") " pod="openshift-marketplace/community-operators-kk7lx" Mar 11 00:58:14 crc kubenswrapper[4744]: I0311 00:58:14.200271 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-9zdvr"] Mar 11 00:58:14 crc kubenswrapper[4744]: I0311 00:58:14.206660 4744 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-tqzmz" Mar 11 00:58:14 crc kubenswrapper[4744]: I0311 00:58:14.228338 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tx9dj\" (UniqueName: \"kubernetes.io/projected/9c559a48-ac87-4cab-848d-f2f647f8396b-kube-api-access-tx9dj\") pod \"community-operators-kk7lx\" (UID: \"9c559a48-ac87-4cab-848d-f2f647f8396b\") " pod="openshift-marketplace/community-operators-kk7lx" Mar 11 00:58:14 crc kubenswrapper[4744]: I0311 00:58:14.257079 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-5bccdc78db-64drh"] Mar 11 00:58:14 crc kubenswrapper[4744]: E0311 00:58:14.257333 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fa83e422-8374-4da6-a356-ae7feadfe282" containerName="route-controller-manager" Mar 11 00:58:14 crc kubenswrapper[4744]: I0311 00:58:14.257351 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="fa83e422-8374-4da6-a356-ae7feadfe282" containerName="route-controller-manager" Mar 11 00:58:14 crc kubenswrapper[4744]: I0311 00:58:14.257443 4744 memory_manager.go:354] "RemoveStaleState removing state" podUID="fa83e422-8374-4da6-a356-ae7feadfe282" containerName="route-controller-manager" Mar 11 00:58:14 crc kubenswrapper[4744]: I0311 00:58:14.257882 4744 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-5bccdc78db-64drh" Mar 11 00:58:14 crc kubenswrapper[4744]: I0311 00:58:14.273181 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-5bccdc78db-64drh"] Mar 11 00:58:14 crc kubenswrapper[4744]: I0311 00:58:14.279020 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/fa83e422-8374-4da6-a356-ae7feadfe282-client-ca\") pod \"fa83e422-8374-4da6-a356-ae7feadfe282\" (UID: \"fa83e422-8374-4da6-a356-ae7feadfe282\") " Mar 11 00:58:14 crc kubenswrapper[4744]: I0311 00:58:14.279229 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-t2wdr\" (UniqueName: \"kubernetes.io/projected/fa83e422-8374-4da6-a356-ae7feadfe282-kube-api-access-t2wdr\") pod \"fa83e422-8374-4da6-a356-ae7feadfe282\" (UID: \"fa83e422-8374-4da6-a356-ae7feadfe282\") " Mar 11 00:58:14 crc kubenswrapper[4744]: I0311 00:58:14.279352 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fa83e422-8374-4da6-a356-ae7feadfe282-config\") pod \"fa83e422-8374-4da6-a356-ae7feadfe282\" (UID: \"fa83e422-8374-4da6-a356-ae7feadfe282\") " Mar 11 00:58:14 crc kubenswrapper[4744]: I0311 00:58:14.279419 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/fa83e422-8374-4da6-a356-ae7feadfe282-serving-cert\") pod \"fa83e422-8374-4da6-a356-ae7feadfe282\" (UID: \"fa83e422-8374-4da6-a356-ae7feadfe282\") " Mar 11 00:58:14 crc kubenswrapper[4744]: I0311 00:58:14.279673 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cbsj6\" (UniqueName: \"kubernetes.io/projected/19ed9886-b3a9-4ea4-ac2a-2abd59a4b1fe-kube-api-access-cbsj6\") 
pod \"certified-operators-9zdvr\" (UID: \"19ed9886-b3a9-4ea4-ac2a-2abd59a4b1fe\") " pod="openshift-marketplace/certified-operators-9zdvr" Mar 11 00:58:14 crc kubenswrapper[4744]: I0311 00:58:14.279740 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-cg2gg\" (UID: \"97b9dce4-e1bd-400e-a4c2-848e9703db45\") " pod="openshift-image-registry/image-registry-697d97f7c8-cg2gg" Mar 11 00:58:14 crc kubenswrapper[4744]: I0311 00:58:14.279782 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/19ed9886-b3a9-4ea4-ac2a-2abd59a4b1fe-utilities\") pod \"certified-operators-9zdvr\" (UID: \"19ed9886-b3a9-4ea4-ac2a-2abd59a4b1fe\") " pod="openshift-marketplace/certified-operators-9zdvr" Mar 11 00:58:14 crc kubenswrapper[4744]: I0311 00:58:14.279816 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/19ed9886-b3a9-4ea4-ac2a-2abd59a4b1fe-catalog-content\") pod \"certified-operators-9zdvr\" (UID: \"19ed9886-b3a9-4ea4-ac2a-2abd59a4b1fe\") " pod="openshift-marketplace/certified-operators-9zdvr" Mar 11 00:58:14 crc kubenswrapper[4744]: I0311 00:58:14.281023 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fa83e422-8374-4da6-a356-ae7feadfe282-client-ca" (OuterVolumeSpecName: "client-ca") pod "fa83e422-8374-4da6-a356-ae7feadfe282" (UID: "fa83e422-8374-4da6-a356-ae7feadfe282"). InnerVolumeSpecName "client-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 11 00:58:14 crc kubenswrapper[4744]: I0311 00:58:14.281219 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fa83e422-8374-4da6-a356-ae7feadfe282-config" (OuterVolumeSpecName: "config") pod "fa83e422-8374-4da6-a356-ae7feadfe282" (UID: "fa83e422-8374-4da6-a356-ae7feadfe282"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 11 00:58:14 crc kubenswrapper[4744]: E0311 00:58:14.282089 4744 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-11 00:58:14.782059812 +0000 UTC m=+251.586277417 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-cg2gg" (UID: "97b9dce4-e1bd-400e-a4c2-848e9703db45") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 11 00:58:14 crc kubenswrapper[4744]: I0311 00:58:14.297962 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fa83e422-8374-4da6-a356-ae7feadfe282-kube-api-access-t2wdr" (OuterVolumeSpecName: "kube-api-access-t2wdr") pod "fa83e422-8374-4da6-a356-ae7feadfe282" (UID: "fa83e422-8374-4da6-a356-ae7feadfe282"). InnerVolumeSpecName "kube-api-access-t2wdr". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 00:58:14 crc kubenswrapper[4744]: I0311 00:58:14.298411 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fa83e422-8374-4da6-a356-ae7feadfe282-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "fa83e422-8374-4da6-a356-ae7feadfe282" (UID: "fa83e422-8374-4da6-a356-ae7feadfe282"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 00:58:14 crc kubenswrapper[4744]: I0311 00:58:14.340373 4744 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-9gtcq" Mar 11 00:58:14 crc kubenswrapper[4744]: I0311 00:58:14.346772 4744 patch_prober.go:28] interesting pod/router-default-5444994796-9dcwq container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 11 00:58:14 crc kubenswrapper[4744]: [-]has-synced failed: reason withheld Mar 11 00:58:14 crc kubenswrapper[4744]: [+]process-running ok Mar 11 00:58:14 crc kubenswrapper[4744]: healthz check failed Mar 11 00:58:14 crc kubenswrapper[4744]: I0311 00:58:14.346844 4744 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-9dcwq" podUID="9b886295-15e1-4478-9ecb-ab71e77b99eb" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 11 00:58:14 crc kubenswrapper[4744]: I0311 00:58:14.376977 4744 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-kk7lx" Mar 11 00:58:14 crc kubenswrapper[4744]: I0311 00:58:14.383625 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 11 00:58:14 crc kubenswrapper[4744]: I0311 00:58:14.383834 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dsthj\" (UniqueName: \"kubernetes.io/projected/cce9002e-ff71-445e-96e6-2d0e7eae739a-kube-api-access-dsthj\") pod \"route-controller-manager-5bccdc78db-64drh\" (UID: \"cce9002e-ff71-445e-96e6-2d0e7eae739a\") " pod="openshift-route-controller-manager/route-controller-manager-5bccdc78db-64drh" Mar 11 00:58:14 crc kubenswrapper[4744]: I0311 00:58:14.383878 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/cce9002e-ff71-445e-96e6-2d0e7eae739a-client-ca\") pod \"route-controller-manager-5bccdc78db-64drh\" (UID: \"cce9002e-ff71-445e-96e6-2d0e7eae739a\") " pod="openshift-route-controller-manager/route-controller-manager-5bccdc78db-64drh" Mar 11 00:58:14 crc kubenswrapper[4744]: I0311 00:58:14.383932 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/19ed9886-b3a9-4ea4-ac2a-2abd59a4b1fe-catalog-content\") pod \"certified-operators-9zdvr\" (UID: \"19ed9886-b3a9-4ea4-ac2a-2abd59a4b1fe\") " pod="openshift-marketplace/certified-operators-9zdvr" Mar 11 00:58:14 crc kubenswrapper[4744]: E0311 00:58:14.385144 4744 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-11 00:58:14.885113825 +0000 UTC m=+251.689331430 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 11 00:58:14 crc kubenswrapper[4744]: I0311 00:58:14.385998 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/19ed9886-b3a9-4ea4-ac2a-2abd59a4b1fe-catalog-content\") pod \"certified-operators-9zdvr\" (UID: \"19ed9886-b3a9-4ea4-ac2a-2abd59a4b1fe\") " pod="openshift-marketplace/certified-operators-9zdvr" Mar 11 00:58:14 crc kubenswrapper[4744]: I0311 00:58:14.384857 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/cce9002e-ff71-445e-96e6-2d0e7eae739a-serving-cert\") pod \"route-controller-manager-5bccdc78db-64drh\" (UID: \"cce9002e-ff71-445e-96e6-2d0e7eae739a\") " pod="openshift-route-controller-manager/route-controller-manager-5bccdc78db-64drh" Mar 11 00:58:14 crc kubenswrapper[4744]: I0311 00:58:14.386294 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cce9002e-ff71-445e-96e6-2d0e7eae739a-config\") pod \"route-controller-manager-5bccdc78db-64drh\" (UID: \"cce9002e-ff71-445e-96e6-2d0e7eae739a\") " 
pod="openshift-route-controller-manager/route-controller-manager-5bccdc78db-64drh" Mar 11 00:58:14 crc kubenswrapper[4744]: I0311 00:58:14.386357 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cbsj6\" (UniqueName: \"kubernetes.io/projected/19ed9886-b3a9-4ea4-ac2a-2abd59a4b1fe-kube-api-access-cbsj6\") pod \"certified-operators-9zdvr\" (UID: \"19ed9886-b3a9-4ea4-ac2a-2abd59a4b1fe\") " pod="openshift-marketplace/certified-operators-9zdvr" Mar 11 00:58:14 crc kubenswrapper[4744]: I0311 00:58:14.386429 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-cg2gg\" (UID: \"97b9dce4-e1bd-400e-a4c2-848e9703db45\") " pod="openshift-image-registry/image-registry-697d97f7c8-cg2gg" Mar 11 00:58:14 crc kubenswrapper[4744]: I0311 00:58:14.386484 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/19ed9886-b3a9-4ea4-ac2a-2abd59a4b1fe-utilities\") pod \"certified-operators-9zdvr\" (UID: \"19ed9886-b3a9-4ea4-ac2a-2abd59a4b1fe\") " pod="openshift-marketplace/certified-operators-9zdvr" Mar 11 00:58:14 crc kubenswrapper[4744]: I0311 00:58:14.386552 4744 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fa83e422-8374-4da6-a356-ae7feadfe282-config\") on node \"crc\" DevicePath \"\"" Mar 11 00:58:14 crc kubenswrapper[4744]: I0311 00:58:14.386573 4744 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/fa83e422-8374-4da6-a356-ae7feadfe282-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 11 00:58:14 crc kubenswrapper[4744]: I0311 00:58:14.386587 4744 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: 
\"kubernetes.io/configmap/fa83e422-8374-4da6-a356-ae7feadfe282-client-ca\") on node \"crc\" DevicePath \"\"" Mar 11 00:58:14 crc kubenswrapper[4744]: I0311 00:58:14.386600 4744 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-t2wdr\" (UniqueName: \"kubernetes.io/projected/fa83e422-8374-4da6-a356-ae7feadfe282-kube-api-access-t2wdr\") on node \"crc\" DevicePath \"\"" Mar 11 00:58:14 crc kubenswrapper[4744]: I0311 00:58:14.386887 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/19ed9886-b3a9-4ea4-ac2a-2abd59a4b1fe-utilities\") pod \"certified-operators-9zdvr\" (UID: \"19ed9886-b3a9-4ea4-ac2a-2abd59a4b1fe\") " pod="openshift-marketplace/certified-operators-9zdvr" Mar 11 00:58:14 crc kubenswrapper[4744]: E0311 00:58:14.387190 4744 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-11 00:58:14.887173709 +0000 UTC m=+251.691391554 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-cg2gg" (UID: "97b9dce4-e1bd-400e-a4c2-848e9703db45") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 11 00:58:14 crc kubenswrapper[4744]: I0311 00:58:14.400024 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-qqw9j"] Mar 11 00:58:14 crc kubenswrapper[4744]: E0311 00:58:14.400272 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8495f637-031c-4280-be13-d5aae9c99eca" containerName="controller-manager" Mar 11 00:58:14 crc kubenswrapper[4744]: I0311 00:58:14.400289 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="8495f637-031c-4280-be13-d5aae9c99eca" containerName="controller-manager" Mar 11 00:58:14 crc kubenswrapper[4744]: I0311 00:58:14.400416 4744 memory_manager.go:354] "RemoveStaleState removing state" podUID="8495f637-031c-4280-be13-d5aae9c99eca" containerName="controller-manager" Mar 11 00:58:14 crc kubenswrapper[4744]: I0311 00:58:14.401094 4744 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-qqw9j" Mar 11 00:58:14 crc kubenswrapper[4744]: I0311 00:58:14.429343 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cbsj6\" (UniqueName: \"kubernetes.io/projected/19ed9886-b3a9-4ea4-ac2a-2abd59a4b1fe-kube-api-access-cbsj6\") pod \"certified-operators-9zdvr\" (UID: \"19ed9886-b3a9-4ea4-ac2a-2abd59a4b1fe\") " pod="openshift-marketplace/certified-operators-9zdvr" Mar 11 00:58:14 crc kubenswrapper[4744]: I0311 00:58:14.432256 4744 generic.go:334] "Generic (PLEG): container finished" podID="fa83e422-8374-4da6-a356-ae7feadfe282" containerID="1ac0bd62633a42c5cbeb7e5e2b54cf8b838860f7c3a82c4b2d9fd3a969f4de47" exitCode=0 Mar 11 00:58:14 crc kubenswrapper[4744]: I0311 00:58:14.432331 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-tqzmz" event={"ID":"fa83e422-8374-4da6-a356-ae7feadfe282","Type":"ContainerDied","Data":"1ac0bd62633a42c5cbeb7e5e2b54cf8b838860f7c3a82c4b2d9fd3a969f4de47"} Mar 11 00:58:14 crc kubenswrapper[4744]: I0311 00:58:14.432365 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-tqzmz" event={"ID":"fa83e422-8374-4da6-a356-ae7feadfe282","Type":"ContainerDied","Data":"f7d2345f249dbaf91c2c8a5a01c5410713a46fd54a5b82dd8d2bfc2612078219"} Mar 11 00:58:14 crc kubenswrapper[4744]: I0311 00:58:14.432383 4744 scope.go:117] "RemoveContainer" containerID="1ac0bd62633a42c5cbeb7e5e2b54cf8b838860f7c3a82c4b2d9fd3a969f4de47" Mar 11 00:58:14 crc kubenswrapper[4744]: I0311 00:58:14.432501 4744 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-tqzmz" Mar 11 00:58:14 crc kubenswrapper[4744]: I0311 00:58:14.444185 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-qqw9j"] Mar 11 00:58:14 crc kubenswrapper[4744]: I0311 00:58:14.450982 4744 generic.go:334] "Generic (PLEG): container finished" podID="8495f637-031c-4280-be13-d5aae9c99eca" containerID="ba9d905404c5f00fbca28dcf442ba28796ae4d0593acd3e3da2353598f146b00" exitCode=0 Mar 11 00:58:14 crc kubenswrapper[4744]: I0311 00:58:14.452165 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-9gtcq" event={"ID":"8495f637-031c-4280-be13-d5aae9c99eca","Type":"ContainerDied","Data":"ba9d905404c5f00fbca28dcf442ba28796ae4d0593acd3e3da2353598f146b00"} Mar 11 00:58:14 crc kubenswrapper[4744]: I0311 00:58:14.452204 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-9gtcq" event={"ID":"8495f637-031c-4280-be13-d5aae9c99eca","Type":"ContainerDied","Data":"1dafcd2dab172541bc2474afc4c6eb93613d4c87fd732e0897b5697b82962078"} Mar 11 00:58:14 crc kubenswrapper[4744]: I0311 00:58:14.452257 4744 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-9gtcq" Mar 11 00:58:14 crc kubenswrapper[4744]: I0311 00:58:14.454025 4744 patch_prober.go:28] interesting pod/downloads-7954f5f757-7qgjq container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.45:8080/\": dial tcp 10.217.0.45:8080: connect: connection refused" start-of-body= Mar 11 00:58:14 crc kubenswrapper[4744]: I0311 00:58:14.454099 4744 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-7qgjq" podUID="ea1125a6-231a-4f21-b9d6-8cdb2a51e482" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.45:8080/\": dial tcp 10.217.0.45:8080: connect: connection refused" Mar 11 00:58:14 crc kubenswrapper[4744]: I0311 00:58:14.483358 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-79b997595-k6qjd" Mar 11 00:58:14 crc kubenswrapper[4744]: I0311 00:58:14.488721 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/8495f637-031c-4280-be13-d5aae9c99eca-client-ca\") pod \"8495f637-031c-4280-be13-d5aae9c99eca\" (UID: \"8495f637-031c-4280-be13-d5aae9c99eca\") " Mar 11 00:58:14 crc kubenswrapper[4744]: I0311 00:58:14.488770 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8495f637-031c-4280-be13-d5aae9c99eca-config\") pod \"8495f637-031c-4280-be13-d5aae9c99eca\" (UID: \"8495f637-031c-4280-be13-d5aae9c99eca\") " Mar 11 00:58:14 crc kubenswrapper[4744]: I0311 00:58:14.488793 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8495f637-031c-4280-be13-d5aae9c99eca-serving-cert\") pod \"8495f637-031c-4280-be13-d5aae9c99eca\" (UID: 
\"8495f637-031c-4280-be13-d5aae9c99eca\") " Mar 11 00:58:14 crc kubenswrapper[4744]: I0311 00:58:14.488930 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 11 00:58:14 crc kubenswrapper[4744]: I0311 00:58:14.488965 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/8495f637-031c-4280-be13-d5aae9c99eca-proxy-ca-bundles\") pod \"8495f637-031c-4280-be13-d5aae9c99eca\" (UID: \"8495f637-031c-4280-be13-d5aae9c99eca\") " Mar 11 00:58:14 crc kubenswrapper[4744]: I0311 00:58:14.488999 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-82zqb\" (UniqueName: \"kubernetes.io/projected/8495f637-031c-4280-be13-d5aae9c99eca-kube-api-access-82zqb\") pod \"8495f637-031c-4280-be13-d5aae9c99eca\" (UID: \"8495f637-031c-4280-be13-d5aae9c99eca\") " Mar 11 00:58:14 crc kubenswrapper[4744]: I0311 00:58:14.489225 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dsthj\" (UniqueName: \"kubernetes.io/projected/cce9002e-ff71-445e-96e6-2d0e7eae739a-kube-api-access-dsthj\") pod \"route-controller-manager-5bccdc78db-64drh\" (UID: \"cce9002e-ff71-445e-96e6-2d0e7eae739a\") " pod="openshift-route-controller-manager/route-controller-manager-5bccdc78db-64drh" Mar 11 00:58:14 crc kubenswrapper[4744]: I0311 00:58:14.489256 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jbm2q\" (UniqueName: \"kubernetes.io/projected/dc99af5e-bf41-49ac-8e4a-416f565cbfc9-kube-api-access-jbm2q\") pod \"community-operators-qqw9j\" (UID: \"dc99af5e-bf41-49ac-8e4a-416f565cbfc9\") " 
pod="openshift-marketplace/community-operators-qqw9j" Mar 11 00:58:14 crc kubenswrapper[4744]: I0311 00:58:14.489275 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/cce9002e-ff71-445e-96e6-2d0e7eae739a-client-ca\") pod \"route-controller-manager-5bccdc78db-64drh\" (UID: \"cce9002e-ff71-445e-96e6-2d0e7eae739a\") " pod="openshift-route-controller-manager/route-controller-manager-5bccdc78db-64drh" Mar 11 00:58:14 crc kubenswrapper[4744]: I0311 00:58:14.489316 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/cce9002e-ff71-445e-96e6-2d0e7eae739a-serving-cert\") pod \"route-controller-manager-5bccdc78db-64drh\" (UID: \"cce9002e-ff71-445e-96e6-2d0e7eae739a\") " pod="openshift-route-controller-manager/route-controller-manager-5bccdc78db-64drh" Mar 11 00:58:14 crc kubenswrapper[4744]: I0311 00:58:14.489333 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cce9002e-ff71-445e-96e6-2d0e7eae739a-config\") pod \"route-controller-manager-5bccdc78db-64drh\" (UID: \"cce9002e-ff71-445e-96e6-2d0e7eae739a\") " pod="openshift-route-controller-manager/route-controller-manager-5bccdc78db-64drh" Mar 11 00:58:14 crc kubenswrapper[4744]: I0311 00:58:14.489358 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/dc99af5e-bf41-49ac-8e4a-416f565cbfc9-catalog-content\") pod \"community-operators-qqw9j\" (UID: \"dc99af5e-bf41-49ac-8e4a-416f565cbfc9\") " pod="openshift-marketplace/community-operators-qqw9j" Mar 11 00:58:14 crc kubenswrapper[4744]: I0311 00:58:14.489395 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/dc99af5e-bf41-49ac-8e4a-416f565cbfc9-utilities\") pod \"community-operators-qqw9j\" (UID: \"dc99af5e-bf41-49ac-8e4a-416f565cbfc9\") " pod="openshift-marketplace/community-operators-qqw9j" Mar 11 00:58:14 crc kubenswrapper[4744]: I0311 00:58:14.490765 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8495f637-031c-4280-be13-d5aae9c99eca-config" (OuterVolumeSpecName: "config") pod "8495f637-031c-4280-be13-d5aae9c99eca" (UID: "8495f637-031c-4280-be13-d5aae9c99eca"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 11 00:58:14 crc kubenswrapper[4744]: I0311 00:58:14.491439 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8495f637-031c-4280-be13-d5aae9c99eca-client-ca" (OuterVolumeSpecName: "client-ca") pod "8495f637-031c-4280-be13-d5aae9c99eca" (UID: "8495f637-031c-4280-be13-d5aae9c99eca"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 11 00:58:14 crc kubenswrapper[4744]: I0311 00:58:14.503960 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cce9002e-ff71-445e-96e6-2d0e7eae739a-config\") pod \"route-controller-manager-5bccdc78db-64drh\" (UID: \"cce9002e-ff71-445e-96e6-2d0e7eae739a\") " pod="openshift-route-controller-manager/route-controller-manager-5bccdc78db-64drh" Mar 11 00:58:14 crc kubenswrapper[4744]: I0311 00:58:14.512946 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8495f637-031c-4280-be13-d5aae9c99eca-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "8495f637-031c-4280-be13-d5aae9c99eca" (UID: "8495f637-031c-4280-be13-d5aae9c99eca"). InnerVolumeSpecName "proxy-ca-bundles". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 11 00:58:14 crc kubenswrapper[4744]: I0311 00:58:14.513439 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/cce9002e-ff71-445e-96e6-2d0e7eae739a-client-ca\") pod \"route-controller-manager-5bccdc78db-64drh\" (UID: \"cce9002e-ff71-445e-96e6-2d0e7eae739a\") " pod="openshift-route-controller-manager/route-controller-manager-5bccdc78db-64drh" Mar 11 00:58:14 crc kubenswrapper[4744]: E0311 00:58:14.519205 4744 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-11 00:58:15.019158972 +0000 UTC m=+251.823376577 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 11 00:58:14 crc kubenswrapper[4744]: I0311 00:58:14.523634 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8495f637-031c-4280-be13-d5aae9c99eca-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "8495f637-031c-4280-be13-d5aae9c99eca" (UID: "8495f637-031c-4280-be13-d5aae9c99eca"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 00:58:14 crc kubenswrapper[4744]: I0311 00:58:14.528734 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8495f637-031c-4280-be13-d5aae9c99eca-kube-api-access-82zqb" (OuterVolumeSpecName: "kube-api-access-82zqb") pod "8495f637-031c-4280-be13-d5aae9c99eca" (UID: "8495f637-031c-4280-be13-d5aae9c99eca"). InnerVolumeSpecName "kube-api-access-82zqb". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 00:58:14 crc kubenswrapper[4744]: I0311 00:58:14.529197 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-9zdvr" Mar 11 00:58:14 crc kubenswrapper[4744]: I0311 00:58:14.531678 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-config-operator/openshift-config-operator-7777fb866f-hf5rf" Mar 11 00:58:14 crc kubenswrapper[4744]: I0311 00:58:14.533049 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/cce9002e-ff71-445e-96e6-2d0e7eae739a-serving-cert\") pod \"route-controller-manager-5bccdc78db-64drh\" (UID: \"cce9002e-ff71-445e-96e6-2d0e7eae739a\") " pod="openshift-route-controller-manager/route-controller-manager-5bccdc78db-64drh" Mar 11 00:58:14 crc kubenswrapper[4744]: I0311 00:58:14.542969 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dsthj\" (UniqueName: \"kubernetes.io/projected/cce9002e-ff71-445e-96e6-2d0e7eae739a-kube-api-access-dsthj\") pod \"route-controller-manager-5bccdc78db-64drh\" (UID: \"cce9002e-ff71-445e-96e6-2d0e7eae739a\") " pod="openshift-route-controller-manager/route-controller-manager-5bccdc78db-64drh" Mar 11 00:58:14 crc kubenswrapper[4744]: I0311 00:58:14.590633 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-cg2gg\" (UID: \"97b9dce4-e1bd-400e-a4c2-848e9703db45\") " pod="openshift-image-registry/image-registry-697d97f7c8-cg2gg" Mar 11 00:58:14 crc kubenswrapper[4744]: I0311 00:58:14.590709 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jbm2q\" (UniqueName: \"kubernetes.io/projected/dc99af5e-bf41-49ac-8e4a-416f565cbfc9-kube-api-access-jbm2q\") pod \"community-operators-qqw9j\" (UID: \"dc99af5e-bf41-49ac-8e4a-416f565cbfc9\") " pod="openshift-marketplace/community-operators-qqw9j" Mar 11 00:58:14 crc kubenswrapper[4744]: I0311 00:58:14.590777 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/dc99af5e-bf41-49ac-8e4a-416f565cbfc9-catalog-content\") pod \"community-operators-qqw9j\" (UID: \"dc99af5e-bf41-49ac-8e4a-416f565cbfc9\") " pod="openshift-marketplace/community-operators-qqw9j" Mar 11 00:58:14 crc kubenswrapper[4744]: I0311 00:58:14.590813 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/dc99af5e-bf41-49ac-8e4a-416f565cbfc9-utilities\") pod \"community-operators-qqw9j\" (UID: \"dc99af5e-bf41-49ac-8e4a-416f565cbfc9\") " pod="openshift-marketplace/community-operators-qqw9j" Mar 11 00:58:14 crc kubenswrapper[4744]: I0311 00:58:14.590855 4744 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/8495f637-031c-4280-be13-d5aae9c99eca-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Mar 11 00:58:14 crc kubenswrapper[4744]: I0311 00:58:14.590866 4744 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-82zqb\" (UniqueName: \"kubernetes.io/projected/8495f637-031c-4280-be13-d5aae9c99eca-kube-api-access-82zqb\") on node \"crc\" DevicePath \"\"" Mar 
11 00:58:14 crc kubenswrapper[4744]: I0311 00:58:14.590877 4744 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/8495f637-031c-4280-be13-d5aae9c99eca-client-ca\") on node \"crc\" DevicePath \"\"" Mar 11 00:58:14 crc kubenswrapper[4744]: I0311 00:58:14.590887 4744 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8495f637-031c-4280-be13-d5aae9c99eca-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 11 00:58:14 crc kubenswrapper[4744]: I0311 00:58:14.590897 4744 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8495f637-031c-4280-be13-d5aae9c99eca-config\") on node \"crc\" DevicePath \"\"" Mar 11 00:58:14 crc kubenswrapper[4744]: I0311 00:58:14.591430 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/dc99af5e-bf41-49ac-8e4a-416f565cbfc9-utilities\") pod \"community-operators-qqw9j\" (UID: \"dc99af5e-bf41-49ac-8e4a-416f565cbfc9\") " pod="openshift-marketplace/community-operators-qqw9j" Mar 11 00:58:14 crc kubenswrapper[4744]: E0311 00:58:14.591745 4744 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-11 00:58:15.091730868 +0000 UTC m=+251.895948473 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-cg2gg" (UID: "97b9dce4-e1bd-400e-a4c2-848e9703db45") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 11 00:58:14 crc kubenswrapper[4744]: I0311 00:58:14.592086 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/dc99af5e-bf41-49ac-8e4a-416f565cbfc9-catalog-content\") pod \"community-operators-qqw9j\" (UID: \"dc99af5e-bf41-49ac-8e4a-416f565cbfc9\") " pod="openshift-marketplace/community-operators-qqw9j" Mar 11 00:58:14 crc kubenswrapper[4744]: I0311 00:58:14.618346 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jbm2q\" (UniqueName: \"kubernetes.io/projected/dc99af5e-bf41-49ac-8e4a-416f565cbfc9-kube-api-access-jbm2q\") pod \"community-operators-qqw9j\" (UID: \"dc99af5e-bf41-49ac-8e4a-416f565cbfc9\") " pod="openshift-marketplace/community-operators-qqw9j" Mar 11 00:58:14 crc kubenswrapper[4744]: I0311 00:58:14.637932 4744 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-5bccdc78db-64drh" Mar 11 00:58:14 crc kubenswrapper[4744]: I0311 00:58:14.692616 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 11 00:58:14 crc kubenswrapper[4744]: E0311 00:58:14.694758 4744 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-11 00:58:15.194724329 +0000 UTC m=+251.998942134 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 11 00:58:14 crc kubenswrapper[4744]: I0311 00:58:14.733634 4744 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-qqw9j" Mar 11 00:58:14 crc kubenswrapper[4744]: I0311 00:58:14.777444 4744 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-tqzmz"] Mar 11 00:58:14 crc kubenswrapper[4744]: I0311 00:58:14.794545 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-cg2gg\" (UID: \"97b9dce4-e1bd-400e-a4c2-848e9703db45\") " pod="openshift-image-registry/image-registry-697d97f7c8-cg2gg" Mar 11 00:58:14 crc kubenswrapper[4744]: E0311 00:58:14.795010 4744 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-11 00:58:15.294995626 +0000 UTC m=+252.099213231 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-cg2gg" (UID: "97b9dce4-e1bd-400e-a4c2-848e9703db45") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 11 00:58:14 crc kubenswrapper[4744]: I0311 00:58:14.800331 4744 scope.go:117] "RemoveContainer" containerID="1ac0bd62633a42c5cbeb7e5e2b54cf8b838860f7c3a82c4b2d9fd3a969f4de47" Mar 11 00:58:14 crc kubenswrapper[4744]: I0311 00:58:14.805377 4744 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-tqzmz"] Mar 11 00:58:14 crc kubenswrapper[4744]: E0311 00:58:14.807864 4744 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1ac0bd62633a42c5cbeb7e5e2b54cf8b838860f7c3a82c4b2d9fd3a969f4de47\": container with ID starting with 1ac0bd62633a42c5cbeb7e5e2b54cf8b838860f7c3a82c4b2d9fd3a969f4de47 not found: ID does not exist" containerID="1ac0bd62633a42c5cbeb7e5e2b54cf8b838860f7c3a82c4b2d9fd3a969f4de47" Mar 11 00:58:14 crc kubenswrapper[4744]: I0311 00:58:14.807923 4744 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1ac0bd62633a42c5cbeb7e5e2b54cf8b838860f7c3a82c4b2d9fd3a969f4de47"} err="failed to get container status \"1ac0bd62633a42c5cbeb7e5e2b54cf8b838860f7c3a82c4b2d9fd3a969f4de47\": rpc error: code = NotFound desc = could not find container \"1ac0bd62633a42c5cbeb7e5e2b54cf8b838860f7c3a82c4b2d9fd3a969f4de47\": container with ID starting with 1ac0bd62633a42c5cbeb7e5e2b54cf8b838860f7c3a82c4b2d9fd3a969f4de47 not found: ID does not exist" Mar 11 00:58:14 crc kubenswrapper[4744]: I0311 00:58:14.807956 4744 scope.go:117] "RemoveContainer" 
containerID="ba9d905404c5f00fbca28dcf442ba28796ae4d0593acd3e3da2353598f146b00" Mar 11 00:58:14 crc kubenswrapper[4744]: I0311 00:58:14.849175 4744 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-9gtcq"] Mar 11 00:58:14 crc kubenswrapper[4744]: I0311 00:58:14.851202 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-n9qr2"] Mar 11 00:58:14 crc kubenswrapper[4744]: I0311 00:58:14.853902 4744 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-9gtcq"] Mar 11 00:58:14 crc kubenswrapper[4744]: I0311 00:58:14.898791 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 11 00:58:14 crc kubenswrapper[4744]: E0311 00:58:14.899154 4744 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-11 00:58:15.399132423 +0000 UTC m=+252.203350018 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 11 00:58:14 crc kubenswrapper[4744]: I0311 00:58:14.972055 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-kk7lx"] Mar 11 00:58:15 crc kubenswrapper[4744]: I0311 00:58:15.012258 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-cg2gg\" (UID: \"97b9dce4-e1bd-400e-a4c2-848e9703db45\") " pod="openshift-image-registry/image-registry-697d97f7c8-cg2gg" Mar 11 00:58:15 crc kubenswrapper[4744]: E0311 00:58:15.012808 4744 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-11 00:58:15.512788452 +0000 UTC m=+252.317006057 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-cg2gg" (UID: "97b9dce4-e1bd-400e-a4c2-848e9703db45") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 11 00:58:15 crc kubenswrapper[4744]: I0311 00:58:15.099724 4744 scope.go:117] "RemoveContainer" containerID="ba9d905404c5f00fbca28dcf442ba28796ae4d0593acd3e3da2353598f146b00" Mar 11 00:58:15 crc kubenswrapper[4744]: E0311 00:58:15.111355 4744 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ba9d905404c5f00fbca28dcf442ba28796ae4d0593acd3e3da2353598f146b00\": container with ID starting with ba9d905404c5f00fbca28dcf442ba28796ae4d0593acd3e3da2353598f146b00 not found: ID does not exist" containerID="ba9d905404c5f00fbca28dcf442ba28796ae4d0593acd3e3da2353598f146b00" Mar 11 00:58:15 crc kubenswrapper[4744]: I0311 00:58:15.111409 4744 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ba9d905404c5f00fbca28dcf442ba28796ae4d0593acd3e3da2353598f146b00"} err="failed to get container status \"ba9d905404c5f00fbca28dcf442ba28796ae4d0593acd3e3da2353598f146b00\": rpc error: code = NotFound desc = could not find container \"ba9d905404c5f00fbca28dcf442ba28796ae4d0593acd3e3da2353598f146b00\": container with ID starting with ba9d905404c5f00fbca28dcf442ba28796ae4d0593acd3e3da2353598f146b00 not found: ID does not exist" Mar 11 00:58:15 crc kubenswrapper[4744]: I0311 00:58:15.113772 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod 
\"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 11 00:58:15 crc kubenswrapper[4744]: E0311 00:58:15.114235 4744 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-11 00:58:15.614215585 +0000 UTC m=+252.418433190 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 11 00:58:15 crc kubenswrapper[4744]: I0311 00:58:15.128150 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-5bccdc78db-64drh"] Mar 11 00:58:15 crc kubenswrapper[4744]: I0311 00:58:15.171808 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-qqw9j"] Mar 11 00:58:15 crc kubenswrapper[4744]: I0311 00:58:15.181002 4744 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/kubevirt.io.hostpath-provisioner-reg.sock" Mar 11 00:58:15 crc kubenswrapper[4744]: W0311 00:58:15.204091 4744 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poddc99af5e_bf41_49ac_8e4a_416f565cbfc9.slice/crio-9ba30dae4eade289d2d3a632a8d4670c27925cc289f4f5ba34d51c8926b7dc67 WatchSource:0}: Error finding container 9ba30dae4eade289d2d3a632a8d4670c27925cc289f4f5ba34d51c8926b7dc67: Status 404 returned error can't 
find the container with id 9ba30dae4eade289d2d3a632a8d4670c27925cc289f4f5ba34d51c8926b7dc67 Mar 11 00:58:15 crc kubenswrapper[4744]: I0311 00:58:15.215901 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-cg2gg\" (UID: \"97b9dce4-e1bd-400e-a4c2-848e9703db45\") " pod="openshift-image-registry/image-registry-697d97f7c8-cg2gg" Mar 11 00:58:15 crc kubenswrapper[4744]: E0311 00:58:15.217210 4744 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-11 00:58:15.717187406 +0000 UTC m=+252.521405011 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-cg2gg" (UID: "97b9dce4-e1bd-400e-a4c2-848e9703db45") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 11 00:58:15 crc kubenswrapper[4744]: I0311 00:58:15.224289 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-9zdvr"] Mar 11 00:58:15 crc kubenswrapper[4744]: W0311 00:58:15.247916 4744 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod19ed9886_b3a9_4ea4_ac2a_2abd59a4b1fe.slice/crio-2df54e782d73ddaed638d5d19d951ca97d7e3f6c2f3de135e0a5204ecd1ee8ef WatchSource:0}: Error finding container 2df54e782d73ddaed638d5d19d951ca97d7e3f6c2f3de135e0a5204ecd1ee8ef: Status 404 returned error can't find the container 
with id 2df54e782d73ddaed638d5d19d951ca97d7e3f6c2f3de135e0a5204ecd1ee8ef Mar 11 00:58:15 crc kubenswrapper[4744]: I0311 00:58:15.251075 4744 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/kubevirt.io.hostpath-provisioner-reg.sock","Timestamp":"2026-03-11T00:58:15.181038973Z","Handler":null,"Name":""} Mar 11 00:58:15 crc kubenswrapper[4744]: I0311 00:58:15.258237 4744 csi_plugin.go:100] kubernetes.io/csi: Trying to validate a new CSI Driver with name: kubevirt.io.hostpath-provisioner endpoint: /var/lib/kubelet/plugins/csi-hostpath/csi.sock versions: 1.0.0 Mar 11 00:58:15 crc kubenswrapper[4744]: I0311 00:58:15.258290 4744 csi_plugin.go:113] kubernetes.io/csi: Register new plugin with name: kubevirt.io.hostpath-provisioner at endpoint: /var/lib/kubelet/plugins/csi-hostpath/csi.sock Mar 11 00:58:15 crc kubenswrapper[4744]: I0311 00:58:15.317038 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 11 00:58:15 crc kubenswrapper[4744]: I0311 00:58:15.324166 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (OuterVolumeSpecName: "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8". 
PluginName "kubernetes.io/csi", VolumeGidValue "" Mar 11 00:58:15 crc kubenswrapper[4744]: I0311 00:58:15.328121 4744 patch_prober.go:28] interesting pod/router-default-5444994796-9dcwq container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 11 00:58:15 crc kubenswrapper[4744]: [-]has-synced failed: reason withheld Mar 11 00:58:15 crc kubenswrapper[4744]: [+]process-running ok Mar 11 00:58:15 crc kubenswrapper[4744]: healthz check failed Mar 11 00:58:15 crc kubenswrapper[4744]: I0311 00:58:15.328193 4744 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-9dcwq" podUID="9b886295-15e1-4478-9ecb-ab71e77b99eb" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 11 00:58:15 crc kubenswrapper[4744]: I0311 00:58:15.419709 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-cg2gg\" (UID: \"97b9dce4-e1bd-400e-a4c2-848e9703db45\") " pod="openshift-image-registry/image-registry-697d97f7c8-cg2gg" Mar 11 00:58:15 crc kubenswrapper[4744]: I0311 00:58:15.445472 4744 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Mar 11 00:58:15 crc kubenswrapper[4744]: I0311 00:58:15.445563 4744 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-cg2gg\" (UID: \"97b9dce4-e1bd-400e-a4c2-848e9703db45\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount\"" pod="openshift-image-registry/image-registry-697d97f7c8-cg2gg" Mar 11 00:58:15 crc kubenswrapper[4744]: I0311 00:58:15.466102 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-9zdvr" event={"ID":"19ed9886-b3a9-4ea4-ac2a-2abd59a4b1fe","Type":"ContainerStarted","Data":"2df54e782d73ddaed638d5d19d951ca97d7e3f6c2f3de135e0a5204ecd1ee8ef"} Mar 11 00:58:15 crc kubenswrapper[4744]: I0311 00:58:15.468355 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-qqw9j" event={"ID":"dc99af5e-bf41-49ac-8e4a-416f565cbfc9","Type":"ContainerStarted","Data":"631b1eb69f9b4638de5ac0d6d82ee56d7f16a5b14b07fd079958d79533493fac"} Mar 11 00:58:15 crc kubenswrapper[4744]: I0311 00:58:15.468378 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-qqw9j" event={"ID":"dc99af5e-bf41-49ac-8e4a-416f565cbfc9","Type":"ContainerStarted","Data":"9ba30dae4eade289d2d3a632a8d4670c27925cc289f4f5ba34d51c8926b7dc67"} Mar 11 00:58:15 crc kubenswrapper[4744]: I0311 00:58:15.477854 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-cg2gg\" (UID: \"97b9dce4-e1bd-400e-a4c2-848e9703db45\") " 
pod="openshift-image-registry/image-registry-697d97f7c8-cg2gg" Mar 11 00:58:15 crc kubenswrapper[4744]: I0311 00:58:15.479391 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-4rmzp" event={"ID":"c14e937f-6acb-4f36-9432-eb77464ce9c9","Type":"ContainerStarted","Data":"56c27d1906b07f1e7a32f3695aff1055d132131512ee49c7b1e1ff3921966e9c"} Mar 11 00:58:15 crc kubenswrapper[4744]: I0311 00:58:15.486208 4744 generic.go:334] "Generic (PLEG): container finished" podID="9c559a48-ac87-4cab-848d-f2f647f8396b" containerID="a5f7eba49add5482c30c4ab4ff51c226cc5c3b4200c906eeba3263a45b164bea" exitCode=0 Mar 11 00:58:15 crc kubenswrapper[4744]: I0311 00:58:15.486261 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-kk7lx" event={"ID":"9c559a48-ac87-4cab-848d-f2f647f8396b","Type":"ContainerDied","Data":"a5f7eba49add5482c30c4ab4ff51c226cc5c3b4200c906eeba3263a45b164bea"} Mar 11 00:58:15 crc kubenswrapper[4744]: I0311 00:58:15.486288 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-kk7lx" event={"ID":"9c559a48-ac87-4cab-848d-f2f647f8396b","Type":"ContainerStarted","Data":"2c0ee2ddde362c1eb039f7a433a35d9f76ff924ff3dcc975c503639371f62178"} Mar 11 00:58:15 crc kubenswrapper[4744]: I0311 00:58:15.491481 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-5bccdc78db-64drh" event={"ID":"cce9002e-ff71-445e-96e6-2d0e7eae739a","Type":"ContainerStarted","Data":"0c77f4b460433a157870b1e8fcc50a0e7ddc17993ec7d2a08e4d59cb594575a4"} Mar 11 00:58:15 crc kubenswrapper[4744]: I0311 00:58:15.491557 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-5bccdc78db-64drh" event={"ID":"cce9002e-ff71-445e-96e6-2d0e7eae739a","Type":"ContainerStarted","Data":"048419a197c455d19087d9c82dd7f096dafce676f617d8819bbc53ced8ff3d69"} Mar 11 
00:58:15 crc kubenswrapper[4744]: I0311 00:58:15.492073 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-5bccdc78db-64drh" Mar 11 00:58:15 crc kubenswrapper[4744]: I0311 00:58:15.494280 4744 generic.go:334] "Generic (PLEG): container finished" podID="7886a95d-050e-4d58-baf1-e65310e95e4f" containerID="027fe3efb836951232e1afa930e3f986b4fabc10c31a5c88ae0a8c227eca79d0" exitCode=0 Mar 11 00:58:15 crc kubenswrapper[4744]: I0311 00:58:15.494375 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29553165-4gpb9" event={"ID":"7886a95d-050e-4d58-baf1-e65310e95e4f","Type":"ContainerDied","Data":"027fe3efb836951232e1afa930e3f986b4fabc10c31a5c88ae0a8c227eca79d0"} Mar 11 00:58:15 crc kubenswrapper[4744]: I0311 00:58:15.497891 4744 generic.go:334] "Generic (PLEG): container finished" podID="a870760b-88e5-4526-8f91-ef89201e2a13" containerID="60c38e06b2f2dc61dd76556c58a00114b591caf5bdfc08fc8f1f2c02180d8811" exitCode=0 Mar 11 00:58:15 crc kubenswrapper[4744]: I0311 00:58:15.499048 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-n9qr2" event={"ID":"a870760b-88e5-4526-8f91-ef89201e2a13","Type":"ContainerDied","Data":"60c38e06b2f2dc61dd76556c58a00114b591caf5bdfc08fc8f1f2c02180d8811"} Mar 11 00:58:15 crc kubenswrapper[4744]: I0311 00:58:15.499077 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-n9qr2" event={"ID":"a870760b-88e5-4526-8f91-ef89201e2a13","Type":"ContainerStarted","Data":"3a13bf641beff9c011ba04f462dfb99d16ccb2ab1b0cefb982fcb7a0f0aaadd5"} Mar 11 00:58:15 crc kubenswrapper[4744]: I0311 00:58:15.558367 4744 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-5bccdc78db-64drh" podStartSLOduration=3.55834803 podStartE2EDuration="3.55834803s" 
podCreationTimestamp="2026-03-11 00:58:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-11 00:58:15.553617765 +0000 UTC m=+252.357835370" watchObservedRunningTime="2026-03-11 00:58:15.55834803 +0000 UTC m=+252.362565635" Mar 11 00:58:15 crc kubenswrapper[4744]: I0311 00:58:15.761733 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-pfjjz"] Mar 11 00:58:15 crc kubenswrapper[4744]: I0311 00:58:15.763415 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-pfjjz" Mar 11 00:58:15 crc kubenswrapper[4744]: I0311 00:58:15.765123 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb" Mar 11 00:58:15 crc kubenswrapper[4744]: I0311 00:58:15.767836 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-cg2gg" Mar 11 00:58:15 crc kubenswrapper[4744]: I0311 00:58:15.770253 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-pfjjz"] Mar 11 00:58:15 crc kubenswrapper[4744]: I0311 00:58:15.782018 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-5bccdc78db-64drh" Mar 11 00:58:15 crc kubenswrapper[4744]: I0311 00:58:15.831379 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9b7117a9-f857-4f17-a2e8-13bd999e4fe2-catalog-content\") pod \"redhat-marketplace-pfjjz\" (UID: \"9b7117a9-f857-4f17-a2e8-13bd999e4fe2\") " pod="openshift-marketplace/redhat-marketplace-pfjjz" Mar 11 00:58:15 crc kubenswrapper[4744]: I0311 00:58:15.831480 4744 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9b7117a9-f857-4f17-a2e8-13bd999e4fe2-utilities\") pod \"redhat-marketplace-pfjjz\" (UID: \"9b7117a9-f857-4f17-a2e8-13bd999e4fe2\") " pod="openshift-marketplace/redhat-marketplace-pfjjz" Mar 11 00:58:15 crc kubenswrapper[4744]: I0311 00:58:15.831529 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9drz7\" (UniqueName: \"kubernetes.io/projected/9b7117a9-f857-4f17-a2e8-13bd999e4fe2-kube-api-access-9drz7\") pod \"redhat-marketplace-pfjjz\" (UID: \"9b7117a9-f857-4f17-a2e8-13bd999e4fe2\") " pod="openshift-marketplace/redhat-marketplace-pfjjz" Mar 11 00:58:15 crc kubenswrapper[4744]: I0311 00:58:15.933154 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9drz7\" (UniqueName: \"kubernetes.io/projected/9b7117a9-f857-4f17-a2e8-13bd999e4fe2-kube-api-access-9drz7\") pod \"redhat-marketplace-pfjjz\" (UID: \"9b7117a9-f857-4f17-a2e8-13bd999e4fe2\") " pod="openshift-marketplace/redhat-marketplace-pfjjz" Mar 11 00:58:15 crc kubenswrapper[4744]: I0311 00:58:15.933266 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9b7117a9-f857-4f17-a2e8-13bd999e4fe2-catalog-content\") pod \"redhat-marketplace-pfjjz\" (UID: \"9b7117a9-f857-4f17-a2e8-13bd999e4fe2\") " pod="openshift-marketplace/redhat-marketplace-pfjjz" Mar 11 00:58:15 crc kubenswrapper[4744]: I0311 00:58:15.933333 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9b7117a9-f857-4f17-a2e8-13bd999e4fe2-utilities\") pod \"redhat-marketplace-pfjjz\" (UID: \"9b7117a9-f857-4f17-a2e8-13bd999e4fe2\") " pod="openshift-marketplace/redhat-marketplace-pfjjz" Mar 11 00:58:15 crc kubenswrapper[4744]: I0311 00:58:15.933867 4744 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9b7117a9-f857-4f17-a2e8-13bd999e4fe2-utilities\") pod \"redhat-marketplace-pfjjz\" (UID: \"9b7117a9-f857-4f17-a2e8-13bd999e4fe2\") " pod="openshift-marketplace/redhat-marketplace-pfjjz" Mar 11 00:58:15 crc kubenswrapper[4744]: I0311 00:58:15.934140 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9b7117a9-f857-4f17-a2e8-13bd999e4fe2-catalog-content\") pod \"redhat-marketplace-pfjjz\" (UID: \"9b7117a9-f857-4f17-a2e8-13bd999e4fe2\") " pod="openshift-marketplace/redhat-marketplace-pfjjz" Mar 11 00:58:15 crc kubenswrapper[4744]: I0311 00:58:15.963334 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9drz7\" (UniqueName: \"kubernetes.io/projected/9b7117a9-f857-4f17-a2e8-13bd999e4fe2-kube-api-access-9drz7\") pod \"redhat-marketplace-pfjjz\" (UID: \"9b7117a9-f857-4f17-a2e8-13bd999e4fe2\") " pod="openshift-marketplace/redhat-marketplace-pfjjz" Mar 11 00:58:15 crc kubenswrapper[4744]: I0311 00:58:15.983611 4744 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8495f637-031c-4280-be13-d5aae9c99eca" path="/var/lib/kubelet/pods/8495f637-031c-4280-be13-d5aae9c99eca/volumes" Mar 11 00:58:15 crc kubenswrapper[4744]: I0311 00:58:15.984369 4744 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8f668bae-612b-4b75-9490-919e737c6a3b" path="/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes" Mar 11 00:58:15 crc kubenswrapper[4744]: I0311 00:58:15.985008 4744 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fa83e422-8374-4da6-a356-ae7feadfe282" path="/var/lib/kubelet/pods/fa83e422-8374-4da6-a356-ae7feadfe282/volumes" Mar 11 00:58:16 crc kubenswrapper[4744]: I0311 00:58:16.004003 4744 ???:1] "http: TLS handshake error from 192.168.126.11:47056: no serving certificate 
available for the kubelet" Mar 11 00:58:16 crc kubenswrapper[4744]: I0311 00:58:16.055574 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-cg2gg"] Mar 11 00:58:16 crc kubenswrapper[4744]: W0311 00:58:16.074908 4744 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod97b9dce4_e1bd_400e_a4c2_848e9703db45.slice/crio-1e33b8f987bb93b408f4eb2457682589f6a79902d41a193b38f0639969f25e4f WatchSource:0}: Error finding container 1e33b8f987bb93b408f4eb2457682589f6a79902d41a193b38f0639969f25e4f: Status 404 returned error can't find the container with id 1e33b8f987bb93b408f4eb2457682589f6a79902d41a193b38f0639969f25e4f Mar 11 00:58:16 crc kubenswrapper[4744]: I0311 00:58:16.106952 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-pfjjz" Mar 11 00:58:16 crc kubenswrapper[4744]: I0311 00:58:16.160771 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-nvxtd"] Mar 11 00:58:16 crc kubenswrapper[4744]: I0311 00:58:16.161977 4744 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-nvxtd" Mar 11 00:58:16 crc kubenswrapper[4744]: I0311 00:58:16.170802 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-nvxtd"] Mar 11 00:58:16 crc kubenswrapper[4744]: I0311 00:58:16.240668 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5ce17bbe-ec69-4349-acbc-4e99fcfb894f-catalog-content\") pod \"redhat-marketplace-nvxtd\" (UID: \"5ce17bbe-ec69-4349-acbc-4e99fcfb894f\") " pod="openshift-marketplace/redhat-marketplace-nvxtd" Mar 11 00:58:16 crc kubenswrapper[4744]: I0311 00:58:16.241217 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c69df\" (UniqueName: \"kubernetes.io/projected/5ce17bbe-ec69-4349-acbc-4e99fcfb894f-kube-api-access-c69df\") pod \"redhat-marketplace-nvxtd\" (UID: \"5ce17bbe-ec69-4349-acbc-4e99fcfb894f\") " pod="openshift-marketplace/redhat-marketplace-nvxtd" Mar 11 00:58:16 crc kubenswrapper[4744]: I0311 00:58:16.241732 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5ce17bbe-ec69-4349-acbc-4e99fcfb894f-utilities\") pod \"redhat-marketplace-nvxtd\" (UID: \"5ce17bbe-ec69-4349-acbc-4e99fcfb894f\") " pod="openshift-marketplace/redhat-marketplace-nvxtd" Mar 11 00:58:16 crc kubenswrapper[4744]: I0311 00:58:16.263699 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-6d4ddbbf5f-rmqxc"] Mar 11 00:58:16 crc kubenswrapper[4744]: I0311 00:58:16.277594 4744 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-6d4ddbbf5f-rmqxc" Mar 11 00:58:16 crc kubenswrapper[4744]: I0311 00:58:16.281614 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-6d4ddbbf5f-rmqxc"] Mar 11 00:58:16 crc kubenswrapper[4744]: I0311 00:58:16.285838 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Mar 11 00:58:16 crc kubenswrapper[4744]: I0311 00:58:16.286138 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Mar 11 00:58:16 crc kubenswrapper[4744]: I0311 00:58:16.293444 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Mar 11 00:58:16 crc kubenswrapper[4744]: I0311 00:58:16.293707 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Mar 11 00:58:16 crc kubenswrapper[4744]: I0311 00:58:16.294683 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Mar 11 00:58:16 crc kubenswrapper[4744]: I0311 00:58:16.301355 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Mar 11 00:58:16 crc kubenswrapper[4744]: I0311 00:58:16.302561 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Mar 11 00:58:16 crc kubenswrapper[4744]: I0311 00:58:16.330331 4744 patch_prober.go:28] interesting pod/router-default-5444994796-9dcwq container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 11 00:58:16 crc kubenswrapper[4744]: [-]has-synced failed: reason withheld Mar 11 00:58:16 crc 
kubenswrapper[4744]: [+]process-running ok Mar 11 00:58:16 crc kubenswrapper[4744]: healthz check failed Mar 11 00:58:16 crc kubenswrapper[4744]: I0311 00:58:16.330392 4744 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-9dcwq" podUID="9b886295-15e1-4478-9ecb-ab71e77b99eb" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 11 00:58:16 crc kubenswrapper[4744]: I0311 00:58:16.344278 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5ce17bbe-ec69-4349-acbc-4e99fcfb894f-utilities\") pod \"redhat-marketplace-nvxtd\" (UID: \"5ce17bbe-ec69-4349-acbc-4e99fcfb894f\") " pod="openshift-marketplace/redhat-marketplace-nvxtd" Mar 11 00:58:16 crc kubenswrapper[4744]: I0311 00:58:16.344339 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1c15dd40-8f1a-4be4-8801-c1d23566a6ec-config\") pod \"controller-manager-6d4ddbbf5f-rmqxc\" (UID: \"1c15dd40-8f1a-4be4-8801-c1d23566a6ec\") " pod="openshift-controller-manager/controller-manager-6d4ddbbf5f-rmqxc" Mar 11 00:58:16 crc kubenswrapper[4744]: I0311 00:58:16.344363 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/1c15dd40-8f1a-4be4-8801-c1d23566a6ec-client-ca\") pod \"controller-manager-6d4ddbbf5f-rmqxc\" (UID: \"1c15dd40-8f1a-4be4-8801-c1d23566a6ec\") " pod="openshift-controller-manager/controller-manager-6d4ddbbf5f-rmqxc" Mar 11 00:58:16 crc kubenswrapper[4744]: I0311 00:58:16.344397 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1c15dd40-8f1a-4be4-8801-c1d23566a6ec-serving-cert\") pod \"controller-manager-6d4ddbbf5f-rmqxc\" (UID: 
\"1c15dd40-8f1a-4be4-8801-c1d23566a6ec\") " pod="openshift-controller-manager/controller-manager-6d4ddbbf5f-rmqxc" Mar 11 00:58:16 crc kubenswrapper[4744]: I0311 00:58:16.344421 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5ce17bbe-ec69-4349-acbc-4e99fcfb894f-catalog-content\") pod \"redhat-marketplace-nvxtd\" (UID: \"5ce17bbe-ec69-4349-acbc-4e99fcfb894f\") " pod="openshift-marketplace/redhat-marketplace-nvxtd" Mar 11 00:58:16 crc kubenswrapper[4744]: I0311 00:58:16.344640 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c69df\" (UniqueName: \"kubernetes.io/projected/5ce17bbe-ec69-4349-acbc-4e99fcfb894f-kube-api-access-c69df\") pod \"redhat-marketplace-nvxtd\" (UID: \"5ce17bbe-ec69-4349-acbc-4e99fcfb894f\") " pod="openshift-marketplace/redhat-marketplace-nvxtd" Mar 11 00:58:16 crc kubenswrapper[4744]: I0311 00:58:16.344780 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r7cpr\" (UniqueName: \"kubernetes.io/projected/1c15dd40-8f1a-4be4-8801-c1d23566a6ec-kube-api-access-r7cpr\") pod \"controller-manager-6d4ddbbf5f-rmqxc\" (UID: \"1c15dd40-8f1a-4be4-8801-c1d23566a6ec\") " pod="openshift-controller-manager/controller-manager-6d4ddbbf5f-rmqxc" Mar 11 00:58:16 crc kubenswrapper[4744]: I0311 00:58:16.344821 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/1c15dd40-8f1a-4be4-8801-c1d23566a6ec-proxy-ca-bundles\") pod \"controller-manager-6d4ddbbf5f-rmqxc\" (UID: \"1c15dd40-8f1a-4be4-8801-c1d23566a6ec\") " pod="openshift-controller-manager/controller-manager-6d4ddbbf5f-rmqxc" Mar 11 00:58:16 crc kubenswrapper[4744]: I0311 00:58:16.345368 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/5ce17bbe-ec69-4349-acbc-4e99fcfb894f-utilities\") pod \"redhat-marketplace-nvxtd\" (UID: \"5ce17bbe-ec69-4349-acbc-4e99fcfb894f\") " pod="openshift-marketplace/redhat-marketplace-nvxtd" Mar 11 00:58:16 crc kubenswrapper[4744]: I0311 00:58:16.350073 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5ce17bbe-ec69-4349-acbc-4e99fcfb894f-catalog-content\") pod \"redhat-marketplace-nvxtd\" (UID: \"5ce17bbe-ec69-4349-acbc-4e99fcfb894f\") " pod="openshift-marketplace/redhat-marketplace-nvxtd" Mar 11 00:58:16 crc kubenswrapper[4744]: I0311 00:58:16.367797 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c69df\" (UniqueName: \"kubernetes.io/projected/5ce17bbe-ec69-4349-acbc-4e99fcfb894f-kube-api-access-c69df\") pod \"redhat-marketplace-nvxtd\" (UID: \"5ce17bbe-ec69-4349-acbc-4e99fcfb894f\") " pod="openshift-marketplace/redhat-marketplace-nvxtd" Mar 11 00:58:16 crc kubenswrapper[4744]: I0311 00:58:16.406395 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Mar 11 00:58:16 crc kubenswrapper[4744]: I0311 00:58:16.414822 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-pfjjz"] Mar 11 00:58:16 crc kubenswrapper[4744]: I0311 00:58:16.414868 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Mar 11 00:58:16 crc kubenswrapper[4744]: I0311 00:58:16.414975 4744 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 11 00:58:16 crc kubenswrapper[4744]: I0311 00:58:16.420987 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver"/"installer-sa-dockercfg-5pr6n" Mar 11 00:58:16 crc kubenswrapper[4744]: I0311 00:58:16.421141 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver"/"kube-root-ca.crt" Mar 11 00:58:16 crc kubenswrapper[4744]: I0311 00:58:16.445794 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1c15dd40-8f1a-4be4-8801-c1d23566a6ec-serving-cert\") pod \"controller-manager-6d4ddbbf5f-rmqxc\" (UID: \"1c15dd40-8f1a-4be4-8801-c1d23566a6ec\") " pod="openshift-controller-manager/controller-manager-6d4ddbbf5f-rmqxc" Mar 11 00:58:16 crc kubenswrapper[4744]: I0311 00:58:16.445905 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r7cpr\" (UniqueName: \"kubernetes.io/projected/1c15dd40-8f1a-4be4-8801-c1d23566a6ec-kube-api-access-r7cpr\") pod \"controller-manager-6d4ddbbf5f-rmqxc\" (UID: \"1c15dd40-8f1a-4be4-8801-c1d23566a6ec\") " pod="openshift-controller-manager/controller-manager-6d4ddbbf5f-rmqxc" Mar 11 00:58:16 crc kubenswrapper[4744]: I0311 00:58:16.445934 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/1c15dd40-8f1a-4be4-8801-c1d23566a6ec-proxy-ca-bundles\") pod \"controller-manager-6d4ddbbf5f-rmqxc\" (UID: \"1c15dd40-8f1a-4be4-8801-c1d23566a6ec\") " pod="openshift-controller-manager/controller-manager-6d4ddbbf5f-rmqxc" Mar 11 00:58:16 crc kubenswrapper[4744]: I0311 00:58:16.446021 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1c15dd40-8f1a-4be4-8801-c1d23566a6ec-config\") pod 
\"controller-manager-6d4ddbbf5f-rmqxc\" (UID: \"1c15dd40-8f1a-4be4-8801-c1d23566a6ec\") " pod="openshift-controller-manager/controller-manager-6d4ddbbf5f-rmqxc" Mar 11 00:58:16 crc kubenswrapper[4744]: I0311 00:58:16.446051 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/1c15dd40-8f1a-4be4-8801-c1d23566a6ec-client-ca\") pod \"controller-manager-6d4ddbbf5f-rmqxc\" (UID: \"1c15dd40-8f1a-4be4-8801-c1d23566a6ec\") " pod="openshift-controller-manager/controller-manager-6d4ddbbf5f-rmqxc" Mar 11 00:58:16 crc kubenswrapper[4744]: I0311 00:58:16.447605 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/1c15dd40-8f1a-4be4-8801-c1d23566a6ec-client-ca\") pod \"controller-manager-6d4ddbbf5f-rmqxc\" (UID: \"1c15dd40-8f1a-4be4-8801-c1d23566a6ec\") " pod="openshift-controller-manager/controller-manager-6d4ddbbf5f-rmqxc" Mar 11 00:58:16 crc kubenswrapper[4744]: I0311 00:58:16.447657 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/1c15dd40-8f1a-4be4-8801-c1d23566a6ec-proxy-ca-bundles\") pod \"controller-manager-6d4ddbbf5f-rmqxc\" (UID: \"1c15dd40-8f1a-4be4-8801-c1d23566a6ec\") " pod="openshift-controller-manager/controller-manager-6d4ddbbf5f-rmqxc" Mar 11 00:58:16 crc kubenswrapper[4744]: I0311 00:58:16.447883 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1c15dd40-8f1a-4be4-8801-c1d23566a6ec-config\") pod \"controller-manager-6d4ddbbf5f-rmqxc\" (UID: \"1c15dd40-8f1a-4be4-8801-c1d23566a6ec\") " pod="openshift-controller-manager/controller-manager-6d4ddbbf5f-rmqxc" Mar 11 00:58:16 crc kubenswrapper[4744]: I0311 00:58:16.454484 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/1c15dd40-8f1a-4be4-8801-c1d23566a6ec-serving-cert\") pod \"controller-manager-6d4ddbbf5f-rmqxc\" (UID: \"1c15dd40-8f1a-4be4-8801-c1d23566a6ec\") " pod="openshift-controller-manager/controller-manager-6d4ddbbf5f-rmqxc" Mar 11 00:58:16 crc kubenswrapper[4744]: I0311 00:58:16.465718 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r7cpr\" (UniqueName: \"kubernetes.io/projected/1c15dd40-8f1a-4be4-8801-c1d23566a6ec-kube-api-access-r7cpr\") pod \"controller-manager-6d4ddbbf5f-rmqxc\" (UID: \"1c15dd40-8f1a-4be4-8801-c1d23566a6ec\") " pod="openshift-controller-manager/controller-manager-6d4ddbbf5f-rmqxc" Mar 11 00:58:16 crc kubenswrapper[4744]: I0311 00:58:16.494320 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-nvxtd" Mar 11 00:58:16 crc kubenswrapper[4744]: I0311 00:58:16.511899 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-4rmzp" event={"ID":"c14e937f-6acb-4f36-9432-eb77464ce9c9","Type":"ContainerStarted","Data":"0f2c6f88ce469eb7e57c5d85e0d55fb13db357c0a62249cf3806ad5da9508736"} Mar 11 00:58:16 crc kubenswrapper[4744]: I0311 00:58:16.512006 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-4rmzp" event={"ID":"c14e937f-6acb-4f36-9432-eb77464ce9c9","Type":"ContainerStarted","Data":"b6ff529282dc7fcd9896d67594c69dc793412dd9bba799adc87e32cd96b40ab5"} Mar 11 00:58:16 crc kubenswrapper[4744]: I0311 00:58:16.517921 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-cg2gg" event={"ID":"97b9dce4-e1bd-400e-a4c2-848e9703db45","Type":"ContainerStarted","Data":"d8dea71970254c298c0b408453f7d301b1cf2f85bdfa1afdce6f1492e5ba261d"} Mar 11 00:58:16 crc kubenswrapper[4744]: I0311 00:58:16.517974 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-image-registry/image-registry-697d97f7c8-cg2gg" event={"ID":"97b9dce4-e1bd-400e-a4c2-848e9703db45","Type":"ContainerStarted","Data":"1e33b8f987bb93b408f4eb2457682589f6a79902d41a193b38f0639969f25e4f"} Mar 11 00:58:16 crc kubenswrapper[4744]: I0311 00:58:16.518933 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-image-registry/image-registry-697d97f7c8-cg2gg" Mar 11 00:58:16 crc kubenswrapper[4744]: I0311 00:58:16.521314 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-9zdvr" event={"ID":"19ed9886-b3a9-4ea4-ac2a-2abd59a4b1fe","Type":"ContainerDied","Data":"f02016587b8ba1e3ccf46458c3752076907458feb5ab8d6ccf9c6b8063e296b4"} Mar 11 00:58:16 crc kubenswrapper[4744]: I0311 00:58:16.521180 4744 generic.go:334] "Generic (PLEG): container finished" podID="19ed9886-b3a9-4ea4-ac2a-2abd59a4b1fe" containerID="f02016587b8ba1e3ccf46458c3752076907458feb5ab8d6ccf9c6b8063e296b4" exitCode=0 Mar 11 00:58:16 crc kubenswrapper[4744]: I0311 00:58:16.532649 4744 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="hostpath-provisioner/csi-hostpathplugin-4rmzp" podStartSLOduration=12.532631279 podStartE2EDuration="12.532631279s" podCreationTimestamp="2026-03-11 00:58:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-11 00:58:16.529778212 +0000 UTC m=+253.333995817" watchObservedRunningTime="2026-03-11 00:58:16.532631279 +0000 UTC m=+253.336848874" Mar 11 00:58:16 crc kubenswrapper[4744]: I0311 00:58:16.542051 4744 generic.go:334] "Generic (PLEG): container finished" podID="dc99af5e-bf41-49ac-8e4a-416f565cbfc9" containerID="631b1eb69f9b4638de5ac0d6d82ee56d7f16a5b14b07fd079958d79533493fac" exitCode=0 Mar 11 00:58:16 crc kubenswrapper[4744]: I0311 00:58:16.542221 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-qqw9j" 
event={"ID":"dc99af5e-bf41-49ac-8e4a-416f565cbfc9","Type":"ContainerDied","Data":"631b1eb69f9b4638de5ac0d6d82ee56d7f16a5b14b07fd079958d79533493fac"} Mar 11 00:58:16 crc kubenswrapper[4744]: I0311 00:58:16.545602 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-pfjjz" event={"ID":"9b7117a9-f857-4f17-a2e8-13bd999e4fe2","Type":"ContainerStarted","Data":"1af9a53f908bfb9cc7404a73ae630bb10f87ed42c63b3e881e282449ebc0e473"} Mar 11 00:58:16 crc kubenswrapper[4744]: I0311 00:58:16.546959 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/5f27b816-d380-4c87-a7ac-2ef9005b712d-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"5f27b816-d380-4c87-a7ac-2ef9005b712d\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 11 00:58:16 crc kubenswrapper[4744]: I0311 00:58:16.547023 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/5f27b816-d380-4c87-a7ac-2ef9005b712d-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"5f27b816-d380-4c87-a7ac-2ef9005b712d\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 11 00:58:16 crc kubenswrapper[4744]: I0311 00:58:16.584758 4744 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-f9d7485db-msd9d" Mar 11 00:58:16 crc kubenswrapper[4744]: I0311 00:58:16.585114 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/console-f9d7485db-msd9d" Mar 11 00:58:16 crc kubenswrapper[4744]: I0311 00:58:16.590647 4744 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-697d97f7c8-cg2gg" podStartSLOduration=191.590618286 podStartE2EDuration="3m11.590618286s" podCreationTimestamp="2026-03-11 00:55:05 +0000 UTC" 
firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-11 00:58:16.568321658 +0000 UTC m=+253.372539263" watchObservedRunningTime="2026-03-11 00:58:16.590618286 +0000 UTC m=+253.394835891" Mar 11 00:58:16 crc kubenswrapper[4744]: I0311 00:58:16.602799 4744 patch_prober.go:28] interesting pod/console-f9d7485db-msd9d container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.217.0.9:8443/health\": dial tcp 10.217.0.9:8443: connect: connection refused" start-of-body= Mar 11 00:58:16 crc kubenswrapper[4744]: I0311 00:58:16.602876 4744 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-f9d7485db-msd9d" podUID="0d55c2f3-2eef-42eb-8627-ccd40e21f4d0" containerName="console" probeResult="failure" output="Get \"https://10.217.0.9:8443/health\": dial tcp 10.217.0.9:8443: connect: connection refused" Mar 11 00:58:16 crc kubenswrapper[4744]: I0311 00:58:16.633058 4744 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-6d4ddbbf5f-rmqxc" Mar 11 00:58:16 crc kubenswrapper[4744]: I0311 00:58:16.654928 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/5f27b816-d380-4c87-a7ac-2ef9005b712d-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"5f27b816-d380-4c87-a7ac-2ef9005b712d\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 11 00:58:16 crc kubenswrapper[4744]: I0311 00:58:16.662405 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/5f27b816-d380-4c87-a7ac-2ef9005b712d-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"5f27b816-d380-4c87-a7ac-2ef9005b712d\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 11 00:58:16 crc kubenswrapper[4744]: I0311 00:58:16.665109 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/5f27b816-d380-4c87-a7ac-2ef9005b712d-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"5f27b816-d380-4c87-a7ac-2ef9005b712d\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 11 00:58:16 crc kubenswrapper[4744]: I0311 00:58:16.682561 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/5f27b816-d380-4c87-a7ac-2ef9005b712d-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"5f27b816-d380-4c87-a7ac-2ef9005b712d\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 11 00:58:16 crc kubenswrapper[4744]: I0311 00:58:16.743748 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 11 00:58:16 crc kubenswrapper[4744]: I0311 00:58:16.847704 4744 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29553165-4gpb9" Mar 11 00:58:16 crc kubenswrapper[4744]: I0311 00:58:16.953589 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-6d4ddbbf5f-rmqxc"] Mar 11 00:58:16 crc kubenswrapper[4744]: I0311 00:58:16.968753 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/7886a95d-050e-4d58-baf1-e65310e95e4f-secret-volume\") pod \"7886a95d-050e-4d58-baf1-e65310e95e4f\" (UID: \"7886a95d-050e-4d58-baf1-e65310e95e4f\") " Mar 11 00:58:16 crc kubenswrapper[4744]: I0311 00:58:16.968867 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-g8zvc\" (UniqueName: \"kubernetes.io/projected/7886a95d-050e-4d58-baf1-e65310e95e4f-kube-api-access-g8zvc\") pod \"7886a95d-050e-4d58-baf1-e65310e95e4f\" (UID: \"7886a95d-050e-4d58-baf1-e65310e95e4f\") " Mar 11 00:58:16 crc kubenswrapper[4744]: I0311 00:58:16.968912 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/7886a95d-050e-4d58-baf1-e65310e95e4f-config-volume\") pod \"7886a95d-050e-4d58-baf1-e65310e95e4f\" (UID: \"7886a95d-050e-4d58-baf1-e65310e95e4f\") " Mar 11 00:58:16 crc kubenswrapper[4744]: I0311 00:58:16.970358 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7886a95d-050e-4d58-baf1-e65310e95e4f-config-volume" (OuterVolumeSpecName: "config-volume") pod "7886a95d-050e-4d58-baf1-e65310e95e4f" (UID: "7886a95d-050e-4d58-baf1-e65310e95e4f"). InnerVolumeSpecName "config-volume". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 11 00:58:16 crc kubenswrapper[4744]: W0311 00:58:16.975113 4744 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1c15dd40_8f1a_4be4_8801_c1d23566a6ec.slice/crio-4fd08c49dbbe53e85c0a3c3233e4fced5e25e0fed53cdee8c38475f1ac0328ae WatchSource:0}: Error finding container 4fd08c49dbbe53e85c0a3c3233e4fced5e25e0fed53cdee8c38475f1ac0328ae: Status 404 returned error can't find the container with id 4fd08c49dbbe53e85c0a3c3233e4fced5e25e0fed53cdee8c38475f1ac0328ae Mar 11 00:58:16 crc kubenswrapper[4744]: I0311 00:58:16.983108 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7886a95d-050e-4d58-baf1-e65310e95e4f-kube-api-access-g8zvc" (OuterVolumeSpecName: "kube-api-access-g8zvc") pod "7886a95d-050e-4d58-baf1-e65310e95e4f" (UID: "7886a95d-050e-4d58-baf1-e65310e95e4f"). InnerVolumeSpecName "kube-api-access-g8zvc". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 00:58:16 crc kubenswrapper[4744]: I0311 00:58:16.986378 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7886a95d-050e-4d58-baf1-e65310e95e4f-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "7886a95d-050e-4d58-baf1-e65310e95e4f" (UID: "7886a95d-050e-4d58-baf1-e65310e95e4f"). InnerVolumeSpecName "secret-volume". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 00:58:17 crc kubenswrapper[4744]: I0311 00:58:17.034284 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-nvxtd"] Mar 11 00:58:17 crc kubenswrapper[4744]: W0311 00:58:17.057728 4744 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5ce17bbe_ec69_4349_acbc_4e99fcfb894f.slice/crio-4f3a26561bc7016404142867234ef57100a2c8c3bc19a8eeb6817b0d7cd5c502 WatchSource:0}: Error finding container 4f3a26561bc7016404142867234ef57100a2c8c3bc19a8eeb6817b0d7cd5c502: Status 404 returned error can't find the container with id 4f3a26561bc7016404142867234ef57100a2c8c3bc19a8eeb6817b0d7cd5c502 Mar 11 00:58:17 crc kubenswrapper[4744]: I0311 00:58:17.071464 4744 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/7886a95d-050e-4d58-baf1-e65310e95e4f-secret-volume\") on node \"crc\" DevicePath \"\"" Mar 11 00:58:17 crc kubenswrapper[4744]: I0311 00:58:17.071500 4744 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-g8zvc\" (UniqueName: \"kubernetes.io/projected/7886a95d-050e-4d58-baf1-e65310e95e4f-kube-api-access-g8zvc\") on node \"crc\" DevicePath \"\"" Mar 11 00:58:17 crc kubenswrapper[4744]: I0311 00:58:17.071526 4744 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/7886a95d-050e-4d58-baf1-e65310e95e4f-config-volume\") on node \"crc\" DevicePath \"\"" Mar 11 00:58:17 crc kubenswrapper[4744]: I0311 00:58:17.079451 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Mar 11 00:58:17 crc kubenswrapper[4744]: W0311 00:58:17.083390 4744 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-pod5f27b816_d380_4c87_a7ac_2ef9005b712d.slice/crio-500b1fc03cca4ec4998333195fe3ecb4d7536df92487b4eb8fcba34dd559cd69 WatchSource:0}: Error finding container 500b1fc03cca4ec4998333195fe3ecb4d7536df92487b4eb8fcba34dd559cd69: Status 404 returned error can't find the container with id 500b1fc03cca4ec4998333195fe3ecb4d7536df92487b4eb8fcba34dd559cd69 Mar 11 00:58:17 crc kubenswrapper[4744]: I0311 00:58:17.167012 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-wr9vs"] Mar 11 00:58:17 crc kubenswrapper[4744]: E0311 00:58:17.167246 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7886a95d-050e-4d58-baf1-e65310e95e4f" containerName="collect-profiles" Mar 11 00:58:17 crc kubenswrapper[4744]: I0311 00:58:17.167259 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="7886a95d-050e-4d58-baf1-e65310e95e4f" containerName="collect-profiles" Mar 11 00:58:17 crc kubenswrapper[4744]: I0311 00:58:17.167371 4744 memory_manager.go:354] "RemoveStaleState removing state" podUID="7886a95d-050e-4d58-baf1-e65310e95e4f" containerName="collect-profiles" Mar 11 00:58:17 crc kubenswrapper[4744]: I0311 00:58:17.168109 4744 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-wr9vs" Mar 11 00:58:17 crc kubenswrapper[4744]: I0311 00:58:17.172704 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Mar 11 00:58:17 crc kubenswrapper[4744]: I0311 00:58:17.177828 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-wr9vs"] Mar 11 00:58:17 crc kubenswrapper[4744]: I0311 00:58:17.274623 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b57e3e22-ee77-4a48-b62a-1a5ff5394362-catalog-content\") pod \"redhat-operators-wr9vs\" (UID: \"b57e3e22-ee77-4a48-b62a-1a5ff5394362\") " pod="openshift-marketplace/redhat-operators-wr9vs" Mar 11 00:58:17 crc kubenswrapper[4744]: I0311 00:58:17.274744 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b57e3e22-ee77-4a48-b62a-1a5ff5394362-utilities\") pod \"redhat-operators-wr9vs\" (UID: \"b57e3e22-ee77-4a48-b62a-1a5ff5394362\") " pod="openshift-marketplace/redhat-operators-wr9vs" Mar 11 00:58:17 crc kubenswrapper[4744]: I0311 00:58:17.274809 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4tdd2\" (UniqueName: \"kubernetes.io/projected/b57e3e22-ee77-4a48-b62a-1a5ff5394362-kube-api-access-4tdd2\") pod \"redhat-operators-wr9vs\" (UID: \"b57e3e22-ee77-4a48-b62a-1a5ff5394362\") " pod="openshift-marketplace/redhat-operators-wr9vs" Mar 11 00:58:17 crc kubenswrapper[4744]: I0311 00:58:17.326570 4744 patch_prober.go:28] interesting pod/router-default-5444994796-9dcwq container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 11 00:58:17 crc 
kubenswrapper[4744]: [-]has-synced failed: reason withheld Mar 11 00:58:17 crc kubenswrapper[4744]: [+]process-running ok Mar 11 00:58:17 crc kubenswrapper[4744]: healthz check failed Mar 11 00:58:17 crc kubenswrapper[4744]: I0311 00:58:17.326659 4744 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-9dcwq" podUID="9b886295-15e1-4478-9ecb-ab71e77b99eb" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 11 00:58:17 crc kubenswrapper[4744]: I0311 00:58:17.376370 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b57e3e22-ee77-4a48-b62a-1a5ff5394362-catalog-content\") pod \"redhat-operators-wr9vs\" (UID: \"b57e3e22-ee77-4a48-b62a-1a5ff5394362\") " pod="openshift-marketplace/redhat-operators-wr9vs" Mar 11 00:58:17 crc kubenswrapper[4744]: I0311 00:58:17.376438 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b57e3e22-ee77-4a48-b62a-1a5ff5394362-utilities\") pod \"redhat-operators-wr9vs\" (UID: \"b57e3e22-ee77-4a48-b62a-1a5ff5394362\") " pod="openshift-marketplace/redhat-operators-wr9vs" Mar 11 00:58:17 crc kubenswrapper[4744]: I0311 00:58:17.376470 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4tdd2\" (UniqueName: \"kubernetes.io/projected/b57e3e22-ee77-4a48-b62a-1a5ff5394362-kube-api-access-4tdd2\") pod \"redhat-operators-wr9vs\" (UID: \"b57e3e22-ee77-4a48-b62a-1a5ff5394362\") " pod="openshift-marketplace/redhat-operators-wr9vs" Mar 11 00:58:17 crc kubenswrapper[4744]: I0311 00:58:17.377931 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b57e3e22-ee77-4a48-b62a-1a5ff5394362-catalog-content\") pod \"redhat-operators-wr9vs\" (UID: \"b57e3e22-ee77-4a48-b62a-1a5ff5394362\") 
" pod="openshift-marketplace/redhat-operators-wr9vs" Mar 11 00:58:17 crc kubenswrapper[4744]: I0311 00:58:17.378170 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b57e3e22-ee77-4a48-b62a-1a5ff5394362-utilities\") pod \"redhat-operators-wr9vs\" (UID: \"b57e3e22-ee77-4a48-b62a-1a5ff5394362\") " pod="openshift-marketplace/redhat-operators-wr9vs" Mar 11 00:58:17 crc kubenswrapper[4744]: I0311 00:58:17.411608 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4tdd2\" (UniqueName: \"kubernetes.io/projected/b57e3e22-ee77-4a48-b62a-1a5ff5394362-kube-api-access-4tdd2\") pod \"redhat-operators-wr9vs\" (UID: \"b57e3e22-ee77-4a48-b62a-1a5ff5394362\") " pod="openshift-marketplace/redhat-operators-wr9vs" Mar 11 00:58:17 crc kubenswrapper[4744]: I0311 00:58:17.442357 4744 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-apiserver/apiserver-76f77b778f-txbh2" Mar 11 00:58:17 crc kubenswrapper[4744]: I0311 00:58:17.442396 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-apiserver/apiserver-76f77b778f-txbh2" Mar 11 00:58:17 crc kubenswrapper[4744]: I0311 00:58:17.453781 4744 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-apiserver/apiserver-76f77b778f-txbh2" Mar 11 00:58:17 crc kubenswrapper[4744]: I0311 00:58:17.519019 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-wr9vs" Mar 11 00:58:17 crc kubenswrapper[4744]: I0311 00:58:17.561105 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-6jf7l"] Mar 11 00:58:17 crc kubenswrapper[4744]: I0311 00:58:17.562581 4744 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-6jf7l" Mar 11 00:58:17 crc kubenswrapper[4744]: I0311 00:58:17.582389 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-6jf7l"] Mar 11 00:58:17 crc kubenswrapper[4744]: I0311 00:58:17.589533 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"5f27b816-d380-4c87-a7ac-2ef9005b712d","Type":"ContainerStarted","Data":"500b1fc03cca4ec4998333195fe3ecb4d7536df92487b4eb8fcba34dd559cd69"} Mar 11 00:58:17 crc kubenswrapper[4744]: I0311 00:58:17.608135 4744 generic.go:334] "Generic (PLEG): container finished" podID="5ce17bbe-ec69-4349-acbc-4e99fcfb894f" containerID="3f3fdb196688f3c356efd3fced9d1e4c1de9d42609a258bcff6a706cf617e82e" exitCode=0 Mar 11 00:58:17 crc kubenswrapper[4744]: I0311 00:58:17.608216 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-nvxtd" event={"ID":"5ce17bbe-ec69-4349-acbc-4e99fcfb894f","Type":"ContainerDied","Data":"3f3fdb196688f3c356efd3fced9d1e4c1de9d42609a258bcff6a706cf617e82e"} Mar 11 00:58:17 crc kubenswrapper[4744]: I0311 00:58:17.608247 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-nvxtd" event={"ID":"5ce17bbe-ec69-4349-acbc-4e99fcfb894f","Type":"ContainerStarted","Data":"4f3a26561bc7016404142867234ef57100a2c8c3bc19a8eeb6817b0d7cd5c502"} Mar 11 00:58:17 crc kubenswrapper[4744]: I0311 00:58:17.611783 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-6d4ddbbf5f-rmqxc" event={"ID":"1c15dd40-8f1a-4be4-8801-c1d23566a6ec","Type":"ContainerStarted","Data":"68eef2c7cedda0dfd8d895dd1d52d290e7bab9e354a813d22973831e0c3a8191"} Mar 11 00:58:17 crc kubenswrapper[4744]: I0311 00:58:17.611810 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-6d4ddbbf5f-rmqxc" 
event={"ID":"1c15dd40-8f1a-4be4-8801-c1d23566a6ec","Type":"ContainerStarted","Data":"4fd08c49dbbe53e85c0a3c3233e4fced5e25e0fed53cdee8c38475f1ac0328ae"} Mar 11 00:58:17 crc kubenswrapper[4744]: I0311 00:58:17.612897 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-6d4ddbbf5f-rmqxc" Mar 11 00:58:17 crc kubenswrapper[4744]: I0311 00:58:17.620289 4744 generic.go:334] "Generic (PLEG): container finished" podID="9b7117a9-f857-4f17-a2e8-13bd999e4fe2" containerID="9e8b7db773aaeb11369a4fdc1ed5dd8aa8678f7ee5ff70b758e217dbc800e9b1" exitCode=0 Mar 11 00:58:17 crc kubenswrapper[4744]: I0311 00:58:17.620351 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-pfjjz" event={"ID":"9b7117a9-f857-4f17-a2e8-13bd999e4fe2","Type":"ContainerDied","Data":"9e8b7db773aaeb11369a4fdc1ed5dd8aa8678f7ee5ff70b758e217dbc800e9b1"} Mar 11 00:58:17 crc kubenswrapper[4744]: I0311 00:58:17.630257 4744 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29553165-4gpb9" Mar 11 00:58:17 crc kubenswrapper[4744]: I0311 00:58:17.630776 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29553165-4gpb9" event={"ID":"7886a95d-050e-4d58-baf1-e65310e95e4f","Type":"ContainerDied","Data":"5d3aa7ce76e9b8ec4175eacfa5a162b45c45d11aa2824d20033cfd6fb7b29db1"} Mar 11 00:58:17 crc kubenswrapper[4744]: I0311 00:58:17.630804 4744 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5d3aa7ce76e9b8ec4175eacfa5a162b45c45d11aa2824d20033cfd6fb7b29db1" Mar 11 00:58:17 crc kubenswrapper[4744]: I0311 00:58:17.630836 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-6d4ddbbf5f-rmqxc" Mar 11 00:58:17 crc kubenswrapper[4744]: I0311 00:58:17.634640 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-apiserver/apiserver-76f77b778f-txbh2" Mar 11 00:58:17 crc kubenswrapper[4744]: I0311 00:58:17.639334 4744 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-6d4ddbbf5f-rmqxc" podStartSLOduration=5.6393155759999996 podStartE2EDuration="5.639315576s" podCreationTimestamp="2026-03-11 00:58:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-11 00:58:17.636500419 +0000 UTC m=+254.440718024" watchObservedRunningTime="2026-03-11 00:58:17.639315576 +0000 UTC m=+254.443533181" Mar 11 00:58:17 crc kubenswrapper[4744]: I0311 00:58:17.688074 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-skfcr\" (UniqueName: \"kubernetes.io/projected/e1b5d764-9e1d-4be7-b365-85482c4e0def-kube-api-access-skfcr\") pod \"redhat-operators-6jf7l\" (UID: 
\"e1b5d764-9e1d-4be7-b365-85482c4e0def\") " pod="openshift-marketplace/redhat-operators-6jf7l" Mar 11 00:58:17 crc kubenswrapper[4744]: I0311 00:58:17.688230 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e1b5d764-9e1d-4be7-b365-85482c4e0def-utilities\") pod \"redhat-operators-6jf7l\" (UID: \"e1b5d764-9e1d-4be7-b365-85482c4e0def\") " pod="openshift-marketplace/redhat-operators-6jf7l" Mar 11 00:58:17 crc kubenswrapper[4744]: I0311 00:58:17.688264 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e1b5d764-9e1d-4be7-b365-85482c4e0def-catalog-content\") pod \"redhat-operators-6jf7l\" (UID: \"e1b5d764-9e1d-4be7-b365-85482c4e0def\") " pod="openshift-marketplace/redhat-operators-6jf7l" Mar 11 00:58:17 crc kubenswrapper[4744]: I0311 00:58:17.789774 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-skfcr\" (UniqueName: \"kubernetes.io/projected/e1b5d764-9e1d-4be7-b365-85482c4e0def-kube-api-access-skfcr\") pod \"redhat-operators-6jf7l\" (UID: \"e1b5d764-9e1d-4be7-b365-85482c4e0def\") " pod="openshift-marketplace/redhat-operators-6jf7l" Mar 11 00:58:17 crc kubenswrapper[4744]: I0311 00:58:17.790038 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e1b5d764-9e1d-4be7-b365-85482c4e0def-utilities\") pod \"redhat-operators-6jf7l\" (UID: \"e1b5d764-9e1d-4be7-b365-85482c4e0def\") " pod="openshift-marketplace/redhat-operators-6jf7l" Mar 11 00:58:17 crc kubenswrapper[4744]: I0311 00:58:17.790061 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e1b5d764-9e1d-4be7-b365-85482c4e0def-catalog-content\") pod \"redhat-operators-6jf7l\" (UID: 
\"e1b5d764-9e1d-4be7-b365-85482c4e0def\") " pod="openshift-marketplace/redhat-operators-6jf7l" Mar 11 00:58:17 crc kubenswrapper[4744]: I0311 00:58:17.794740 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e1b5d764-9e1d-4be7-b365-85482c4e0def-catalog-content\") pod \"redhat-operators-6jf7l\" (UID: \"e1b5d764-9e1d-4be7-b365-85482c4e0def\") " pod="openshift-marketplace/redhat-operators-6jf7l" Mar 11 00:58:17 crc kubenswrapper[4744]: I0311 00:58:17.795092 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e1b5d764-9e1d-4be7-b365-85482c4e0def-utilities\") pod \"redhat-operators-6jf7l\" (UID: \"e1b5d764-9e1d-4be7-b365-85482c4e0def\") " pod="openshift-marketplace/redhat-operators-6jf7l" Mar 11 00:58:17 crc kubenswrapper[4744]: I0311 00:58:17.830625 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-skfcr\" (UniqueName: \"kubernetes.io/projected/e1b5d764-9e1d-4be7-b365-85482c4e0def-kube-api-access-skfcr\") pod \"redhat-operators-6jf7l\" (UID: \"e1b5d764-9e1d-4be7-b365-85482c4e0def\") " pod="openshift-marketplace/redhat-operators-6jf7l" Mar 11 00:58:17 crc kubenswrapper[4744]: I0311 00:58:17.902936 4744 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-6jf7l" Mar 11 00:58:17 crc kubenswrapper[4744]: I0311 00:58:17.909896 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-wr9vs"] Mar 11 00:58:17 crc kubenswrapper[4744]: I0311 00:58:17.930464 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-h49lh" Mar 11 00:58:17 crc kubenswrapper[4744]: I0311 00:58:17.931697 4744 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-h49lh" Mar 11 00:58:17 crc kubenswrapper[4744]: W0311 00:58:17.982392 4744 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb57e3e22_ee77_4a48_b62a_1a5ff5394362.slice/crio-d5211369217e613b95f128fddef57146760818d2d3c920a464cf55cf0910a3cd WatchSource:0}: Error finding container d5211369217e613b95f128fddef57146760818d2d3c920a464cf55cf0910a3cd: Status 404 returned error can't find the container with id d5211369217e613b95f128fddef57146760818d2d3c920a464cf55cf0910a3cd Mar 11 00:58:18 crc kubenswrapper[4744]: I0311 00:58:18.007815 4744 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-h49lh" Mar 11 00:58:18 crc kubenswrapper[4744]: I0311 00:58:18.055375 4744 patch_prober.go:28] interesting pod/downloads-7954f5f757-7qgjq container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.45:8080/\": dial tcp 10.217.0.45:8080: connect: connection refused" start-of-body= Mar 11 00:58:18 crc kubenswrapper[4744]: I0311 00:58:18.055447 4744 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-7qgjq" podUID="ea1125a6-231a-4f21-b9d6-8cdb2a51e482" containerName="download-server" probeResult="failure" output="Get 
\"http://10.217.0.45:8080/\": dial tcp 10.217.0.45:8080: connect: connection refused" Mar 11 00:58:18 crc kubenswrapper[4744]: I0311 00:58:18.067034 4744 patch_prober.go:28] interesting pod/downloads-7954f5f757-7qgjq container/download-server namespace/openshift-console: Liveness probe status=failure output="Get \"http://10.217.0.45:8080/\": dial tcp 10.217.0.45:8080: connect: connection refused" start-of-body= Mar 11 00:58:18 crc kubenswrapper[4744]: I0311 00:58:18.067148 4744 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-console/downloads-7954f5f757-7qgjq" podUID="ea1125a6-231a-4f21-b9d6-8cdb2a51e482" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.45:8080/\": dial tcp 10.217.0.45:8080: connect: connection refused" Mar 11 00:58:18 crc kubenswrapper[4744]: I0311 00:58:18.323977 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ingress/router-default-5444994796-9dcwq" Mar 11 00:58:18 crc kubenswrapper[4744]: I0311 00:58:18.327361 4744 patch_prober.go:28] interesting pod/router-default-5444994796-9dcwq container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 11 00:58:18 crc kubenswrapper[4744]: [-]has-synced failed: reason withheld Mar 11 00:58:18 crc kubenswrapper[4744]: [+]process-running ok Mar 11 00:58:18 crc kubenswrapper[4744]: healthz check failed Mar 11 00:58:18 crc kubenswrapper[4744]: I0311 00:58:18.327542 4744 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-9dcwq" podUID="9b886295-15e1-4478-9ecb-ab71e77b99eb" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 11 00:58:18 crc kubenswrapper[4744]: I0311 00:58:18.329368 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-6jf7l"] Mar 11 00:58:18 crc 
kubenswrapper[4744]: W0311 00:58:18.391440 4744 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode1b5d764_9e1d_4be7_b365_85482c4e0def.slice/crio-ee5dc7a229a0c02d55a2b693776c3bdd528527fad3a6740c76561164873af473 WatchSource:0}: Error finding container ee5dc7a229a0c02d55a2b693776c3bdd528527fad3a6740c76561164873af473: Status 404 returned error can't find the container with id ee5dc7a229a0c02d55a2b693776c3bdd528527fad3a6740c76561164873af473 Mar 11 00:58:18 crc kubenswrapper[4744]: I0311 00:58:18.417106 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Mar 11 00:58:18 crc kubenswrapper[4744]: I0311 00:58:18.418219 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 11 00:58:18 crc kubenswrapper[4744]: I0311 00:58:18.427039 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager"/"kube-root-ca.crt" Mar 11 00:58:18 crc kubenswrapper[4744]: I0311 00:58:18.427262 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager"/"installer-sa-dockercfg-kjl2n" Mar 11 00:58:18 crc kubenswrapper[4744]: I0311 00:58:18.430386 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Mar 11 00:58:18 crc kubenswrapper[4744]: I0311 00:58:18.513724 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/8d189add-4d6d-4799-9f03-f6683d1e95a5-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"8d189add-4d6d-4799-9f03-f6683d1e95a5\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 11 00:58:18 crc kubenswrapper[4744]: I0311 00:58:18.513820 4744 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/8d189add-4d6d-4799-9f03-f6683d1e95a5-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"8d189add-4d6d-4799-9f03-f6683d1e95a5\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 11 00:58:18 crc kubenswrapper[4744]: I0311 00:58:18.615959 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/8d189add-4d6d-4799-9f03-f6683d1e95a5-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"8d189add-4d6d-4799-9f03-f6683d1e95a5\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 11 00:58:18 crc kubenswrapper[4744]: I0311 00:58:18.616067 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/8d189add-4d6d-4799-9f03-f6683d1e95a5-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"8d189add-4d6d-4799-9f03-f6683d1e95a5\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 11 00:58:18 crc kubenswrapper[4744]: I0311 00:58:18.616452 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/8d189add-4d6d-4799-9f03-f6683d1e95a5-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"8d189add-4d6d-4799-9f03-f6683d1e95a5\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 11 00:58:18 crc kubenswrapper[4744]: I0311 00:58:18.652161 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/8d189add-4d6d-4799-9f03-f6683d1e95a5-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"8d189add-4d6d-4799-9f03-f6683d1e95a5\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 11 00:58:18 crc kubenswrapper[4744]: I0311 00:58:18.655026 4744 generic.go:334] "Generic (PLEG): 
container finished" podID="b57e3e22-ee77-4a48-b62a-1a5ff5394362" containerID="5db7c2e3fdb66779881687aab5abe03def61d9a88793adc84fdd9eeb9ece9ad9" exitCode=0 Mar 11 00:58:18 crc kubenswrapper[4744]: I0311 00:58:18.655092 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-wr9vs" event={"ID":"b57e3e22-ee77-4a48-b62a-1a5ff5394362","Type":"ContainerDied","Data":"5db7c2e3fdb66779881687aab5abe03def61d9a88793adc84fdd9eeb9ece9ad9"} Mar 11 00:58:18 crc kubenswrapper[4744]: I0311 00:58:18.655144 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-wr9vs" event={"ID":"b57e3e22-ee77-4a48-b62a-1a5ff5394362","Type":"ContainerStarted","Data":"d5211369217e613b95f128fddef57146760818d2d3c920a464cf55cf0910a3cd"} Mar 11 00:58:18 crc kubenswrapper[4744]: I0311 00:58:18.712122 4744 generic.go:334] "Generic (PLEG): container finished" podID="5f27b816-d380-4c87-a7ac-2ef9005b712d" containerID="b81b57b28fa1542df095004e3918f099ffe28daf920a4cc902077cf6a3f87816" exitCode=0 Mar 11 00:58:18 crc kubenswrapper[4744]: I0311 00:58:18.712574 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"5f27b816-d380-4c87-a7ac-2ef9005b712d","Type":"ContainerDied","Data":"b81b57b28fa1542df095004e3918f099ffe28daf920a4cc902077cf6a3f87816"} Mar 11 00:58:18 crc kubenswrapper[4744]: I0311 00:58:18.723415 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-6jf7l" event={"ID":"e1b5d764-9e1d-4be7-b365-85482c4e0def","Type":"ContainerStarted","Data":"ee5dc7a229a0c02d55a2b693776c3bdd528527fad3a6740c76561164873af473"} Mar 11 00:58:18 crc kubenswrapper[4744]: I0311 00:58:18.735331 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-h49lh" Mar 11 00:58:18 crc kubenswrapper[4744]: I0311 00:58:18.760067 4744 util.go:30] "No sandbox for pod can be 
found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 11 00:58:19 crc kubenswrapper[4744]: I0311 00:58:19.344883 4744 patch_prober.go:28] interesting pod/router-default-5444994796-9dcwq container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 11 00:58:19 crc kubenswrapper[4744]: [-]has-synced failed: reason withheld Mar 11 00:58:19 crc kubenswrapper[4744]: [+]process-running ok Mar 11 00:58:19 crc kubenswrapper[4744]: healthz check failed Mar 11 00:58:19 crc kubenswrapper[4744]: I0311 00:58:19.344960 4744 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-9dcwq" podUID="9b886295-15e1-4478-9ecb-ab71e77b99eb" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 11 00:58:19 crc kubenswrapper[4744]: I0311 00:58:19.410252 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Mar 11 00:58:19 crc kubenswrapper[4744]: I0311 00:58:19.810602 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"8d189add-4d6d-4799-9f03-f6683d1e95a5","Type":"ContainerStarted","Data":"918f1c49ef6675d81191c63e570ac1e3fba3120bdc68b69d7b0d1e030cd1d136"} Mar 11 00:58:19 crc kubenswrapper[4744]: I0311 00:58:19.827486 4744 generic.go:334] "Generic (PLEG): container finished" podID="e1b5d764-9e1d-4be7-b365-85482c4e0def" containerID="89550981a6b83f61728d9ecd73ef1bba91bc16b9da46bbf6d3079e4c4b1ef00d" exitCode=0 Mar 11 00:58:19 crc kubenswrapper[4744]: I0311 00:58:19.827636 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-6jf7l" 
event={"ID":"e1b5d764-9e1d-4be7-b365-85482c4e0def","Type":"ContainerDied","Data":"89550981a6b83f61728d9ecd73ef1bba91bc16b9da46bbf6d3079e4c4b1ef00d"} Mar 11 00:58:19 crc kubenswrapper[4744]: I0311 00:58:19.859789 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 11 00:58:19 crc kubenswrapper[4744]: I0311 00:58:19.860130 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 11 00:58:19 crc kubenswrapper[4744]: I0311 00:58:19.860183 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 11 00:58:19 crc kubenswrapper[4744]: I0311 00:58:19.863758 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"kube-root-ca.crt" Mar 11 00:58:19 crc kubenswrapper[4744]: I0311 00:58:19.863802 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-console"/"networking-console-plugin" Mar 11 00:58:19 crc kubenswrapper[4744]: I0311 00:58:19.877231 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: 
\"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 11 00:58:19 crc kubenswrapper[4744]: I0311 00:58:19.877435 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-console"/"networking-console-plugin-cert" Mar 11 00:58:19 crc kubenswrapper[4744]: I0311 00:58:19.877729 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"openshift-service-ca.crt" Mar 11 00:58:19 crc kubenswrapper[4744]: I0311 00:58:19.905091 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 11 00:58:19 crc kubenswrapper[4744]: I0311 00:58:19.929699 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 11 00:58:19 crc kubenswrapper[4744]: I0311 00:58:19.961585 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 11 00:58:19 crc kubenswrapper[4744]: I0311 
00:58:19.961711 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/7aeb1578-fe93-4bec-8f43-17d0923fa5c0-metrics-certs\") pod \"network-metrics-daemon-tdnf7\" (UID: \"7aeb1578-fe93-4bec-8f43-17d0923fa5c0\") " pod="openshift-multus/network-metrics-daemon-tdnf7" Mar 11 00:58:19 crc kubenswrapper[4744]: I0311 00:58:19.965370 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-secret" Mar 11 00:58:19 crc kubenswrapper[4744]: I0311 00:58:19.970322 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 11 00:58:19 crc kubenswrapper[4744]: I0311 00:58:19.982329 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/7aeb1578-fe93-4bec-8f43-17d0923fa5c0-metrics-certs\") pod \"network-metrics-daemon-tdnf7\" (UID: \"7aeb1578-fe93-4bec-8f43-17d0923fa5c0\") " pod="openshift-multus/network-metrics-daemon-tdnf7" Mar 11 00:58:20 crc kubenswrapper[4744]: I0311 00:58:20.082349 4744 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 11 00:58:20 crc kubenswrapper[4744]: I0311 00:58:20.103241 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 11 00:58:20 crc kubenswrapper[4744]: I0311 00:58:20.116761 4744 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 11 00:58:20 crc kubenswrapper[4744]: I0311 00:58:20.137269 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 11 00:58:20 crc kubenswrapper[4744]: I0311 00:58:20.153831 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-sa-dockercfg-d427c" Mar 11 00:58:20 crc kubenswrapper[4744]: I0311 00:58:20.160772 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-tdnf7" Mar 11 00:58:20 crc kubenswrapper[4744]: I0311 00:58:20.163935 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/5f27b816-d380-4c87-a7ac-2ef9005b712d-kube-api-access\") pod \"5f27b816-d380-4c87-a7ac-2ef9005b712d\" (UID: \"5f27b816-d380-4c87-a7ac-2ef9005b712d\") " Mar 11 00:58:20 crc kubenswrapper[4744]: I0311 00:58:20.164053 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/5f27b816-d380-4c87-a7ac-2ef9005b712d-kubelet-dir\") pod \"5f27b816-d380-4c87-a7ac-2ef9005b712d\" (UID: \"5f27b816-d380-4c87-a7ac-2ef9005b712d\") " Mar 11 00:58:20 crc kubenswrapper[4744]: I0311 00:58:20.164175 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/5f27b816-d380-4c87-a7ac-2ef9005b712d-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "5f27b816-d380-4c87-a7ac-2ef9005b712d" (UID: "5f27b816-d380-4c87-a7ac-2ef9005b712d"). InnerVolumeSpecName "kubelet-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 11 00:58:20 crc kubenswrapper[4744]: I0311 00:58:20.164501 4744 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/5f27b816-d380-4c87-a7ac-2ef9005b712d-kubelet-dir\") on node \"crc\" DevicePath \"\"" Mar 11 00:58:20 crc kubenswrapper[4744]: I0311 00:58:20.169475 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5f27b816-d380-4c87-a7ac-2ef9005b712d-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "5f27b816-d380-4c87-a7ac-2ef9005b712d" (UID: "5f27b816-d380-4c87-a7ac-2ef9005b712d"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 00:58:20 crc kubenswrapper[4744]: I0311 00:58:20.265410 4744 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/5f27b816-d380-4c87-a7ac-2ef9005b712d-kube-api-access\") on node \"crc\" DevicePath \"\"" Mar 11 00:58:20 crc kubenswrapper[4744]: I0311 00:58:20.326363 4744 patch_prober.go:28] interesting pod/router-default-5444994796-9dcwq container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 11 00:58:20 crc kubenswrapper[4744]: [-]has-synced failed: reason withheld Mar 11 00:58:20 crc kubenswrapper[4744]: [+]process-running ok Mar 11 00:58:20 crc kubenswrapper[4744]: healthz check failed Mar 11 00:58:20 crc kubenswrapper[4744]: I0311 00:58:20.326438 4744 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-9dcwq" podUID="9b886295-15e1-4478-9ecb-ab71e77b99eb" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 11 00:58:20 crc kubenswrapper[4744]: I0311 00:58:20.491337 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openshift-dns/dns-default-rq8sg" Mar 11 00:58:20 crc kubenswrapper[4744]: I0311 00:58:20.865430 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"5f27b816-d380-4c87-a7ac-2ef9005b712d","Type":"ContainerDied","Data":"500b1fc03cca4ec4998333195fe3ecb4d7536df92487b4eb8fcba34dd559cd69"} Mar 11 00:58:20 crc kubenswrapper[4744]: I0311 00:58:20.865868 4744 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="500b1fc03cca4ec4998333195fe3ecb4d7536df92487b4eb8fcba34dd559cd69" Mar 11 00:58:20 crc kubenswrapper[4744]: I0311 00:58:20.865696 4744 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 11 00:58:20 crc kubenswrapper[4744]: I0311 00:58:20.884666 4744 generic.go:334] "Generic (PLEG): container finished" podID="8d189add-4d6d-4799-9f03-f6683d1e95a5" containerID="a3dd9e960fab748327a6f7ba020c15eb2e93ed310da32de988edb38ad2d199ee" exitCode=0 Mar 11 00:58:20 crc kubenswrapper[4744]: I0311 00:58:20.885664 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"8d189add-4d6d-4799-9f03-f6683d1e95a5","Type":"ContainerDied","Data":"a3dd9e960fab748327a6f7ba020c15eb2e93ed310da32de988edb38ad2d199ee"} Mar 11 00:58:21 crc kubenswrapper[4744]: I0311 00:58:21.023675 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-tdnf7"] Mar 11 00:58:21 crc kubenswrapper[4744]: I0311 00:58:21.155325 4744 ???:1] "http: TLS handshake error from 192.168.126.11:54564: no serving certificate available for the kubelet" Mar 11 00:58:21 crc kubenswrapper[4744]: I0311 00:58:21.326871 4744 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-ingress/router-default-5444994796-9dcwq" Mar 11 00:58:21 crc kubenswrapper[4744]: I0311 00:58:21.330588 4744 kubelet.go:2542] "SyncLoop 
(probe)" probe="readiness" status="ready" pod="openshift-ingress/router-default-5444994796-9dcwq" Mar 11 00:58:21 crc kubenswrapper[4744]: I0311 00:58:21.902377 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" event={"ID":"9d751cbb-f2e2-430d-9754-c882a5e924a5","Type":"ContainerStarted","Data":"0e26319ff421eb8d2fb9ab7ffbe25f795151306057506cf27d9100235e1773e2"} Mar 11 00:58:23 crc kubenswrapper[4744]: I0311 00:58:23.124816 4744 ???:1] "http: TLS handshake error from 192.168.126.11:54568: no serving certificate available for the kubelet" Mar 11 00:58:26 crc kubenswrapper[4744]: I0311 00:58:26.597002 4744 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-f9d7485db-msd9d" Mar 11 00:58:26 crc kubenswrapper[4744]: I0311 00:58:26.601769 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-f9d7485db-msd9d" Mar 11 00:58:28 crc kubenswrapper[4744]: I0311 00:58:28.071992 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/downloads-7954f5f757-7qgjq" Mar 11 00:58:31 crc kubenswrapper[4744]: I0311 00:58:31.061031 4744 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-6d4ddbbf5f-rmqxc"] Mar 11 00:58:31 crc kubenswrapper[4744]: I0311 00:58:31.061468 4744 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-6d4ddbbf5f-rmqxc" podUID="1c15dd40-8f1a-4be4-8801-c1d23566a6ec" containerName="controller-manager" containerID="cri-o://68eef2c7cedda0dfd8d895dd1d52d290e7bab9e354a813d22973831e0c3a8191" gracePeriod=30 Mar 11 00:58:31 crc kubenswrapper[4744]: I0311 00:58:31.069233 4744 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-5bccdc78db-64drh"] Mar 11 00:58:31 crc 
kubenswrapper[4744]: I0311 00:58:31.069654 4744 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-5bccdc78db-64drh" podUID="cce9002e-ff71-445e-96e6-2d0e7eae739a" containerName="route-controller-manager" containerID="cri-o://0c77f4b460433a157870b1e8fcc50a0e7ddc17993ec7d2a08e4d59cb594575a4" gracePeriod=30 Mar 11 00:58:31 crc kubenswrapper[4744]: I0311 00:58:31.419568 4744 ???:1] "http: TLS handshake error from 192.168.126.11:39150: no serving certificate available for the kubelet" Mar 11 00:58:31 crc kubenswrapper[4744]: I0311 00:58:31.992802 4744 generic.go:334] "Generic (PLEG): container finished" podID="1c15dd40-8f1a-4be4-8801-c1d23566a6ec" containerID="68eef2c7cedda0dfd8d895dd1d52d290e7bab9e354a813d22973831e0c3a8191" exitCode=0 Mar 11 00:58:31 crc kubenswrapper[4744]: I0311 00:58:31.992870 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-6d4ddbbf5f-rmqxc" event={"ID":"1c15dd40-8f1a-4be4-8801-c1d23566a6ec","Type":"ContainerDied","Data":"68eef2c7cedda0dfd8d895dd1d52d290e7bab9e354a813d22973831e0c3a8191"} Mar 11 00:58:31 crc kubenswrapper[4744]: I0311 00:58:31.998045 4744 generic.go:334] "Generic (PLEG): container finished" podID="cce9002e-ff71-445e-96e6-2d0e7eae739a" containerID="0c77f4b460433a157870b1e8fcc50a0e7ddc17993ec7d2a08e4d59cb594575a4" exitCode=0 Mar 11 00:58:31 crc kubenswrapper[4744]: I0311 00:58:31.998113 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-5bccdc78db-64drh" event={"ID":"cce9002e-ff71-445e-96e6-2d0e7eae739a","Type":"ContainerDied","Data":"0c77f4b460433a157870b1e8fcc50a0e7ddc17993ec7d2a08e4d59cb594575a4"} Mar 11 00:58:34 crc kubenswrapper[4744]: I0311 00:58:34.639436 4744 patch_prober.go:28] interesting pod/route-controller-manager-5bccdc78db-64drh container/route-controller-manager namespace/openshift-route-controller-manager: 
Readiness probe status=failure output="Get \"https://10.217.0.50:8443/healthz\": dial tcp 10.217.0.50:8443: connect: connection refused" start-of-body= Mar 11 00:58:34 crc kubenswrapper[4744]: I0311 00:58:34.639775 4744 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-5bccdc78db-64drh" podUID="cce9002e-ff71-445e-96e6-2d0e7eae739a" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.50:8443/healthz\": dial tcp 10.217.0.50:8443: connect: connection refused" Mar 11 00:58:35 crc kubenswrapper[4744]: W0311 00:58:35.320104 4744 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5fe485a1_e14f_4c09_b5b9_f252bc42b7e8.slice/crio-ab9e3e449fae0c6d0baed60f51362b92ca66a102fb542b9efd04f94af5b82128 WatchSource:0}: Error finding container ab9e3e449fae0c6d0baed60f51362b92ca66a102fb542b9efd04f94af5b82128: Status 404 returned error can't find the container with id ab9e3e449fae0c6d0baed60f51362b92ca66a102fb542b9efd04f94af5b82128 Mar 11 00:58:35 crc kubenswrapper[4744]: W0311 00:58:35.327203 4744 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7aeb1578_fe93_4bec_8f43_17d0923fa5c0.slice/crio-6d3e4f3d31aa6aa1647e5254521347202e711436443a9c9765fdf6735c8685fc WatchSource:0}: Error finding container 6d3e4f3d31aa6aa1647e5254521347202e711436443a9c9765fdf6735c8685fc: Status 404 returned error can't find the container with id 6d3e4f3d31aa6aa1647e5254521347202e711436443a9c9765fdf6735c8685fc Mar 11 00:58:35 crc kubenswrapper[4744]: I0311 00:58:35.373734 4744 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 11 00:58:35 crc kubenswrapper[4744]: I0311 00:58:35.437672 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/8d189add-4d6d-4799-9f03-f6683d1e95a5-kube-api-access\") pod \"8d189add-4d6d-4799-9f03-f6683d1e95a5\" (UID: \"8d189add-4d6d-4799-9f03-f6683d1e95a5\") " Mar 11 00:58:35 crc kubenswrapper[4744]: I0311 00:58:35.437941 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/8d189add-4d6d-4799-9f03-f6683d1e95a5-kubelet-dir\") pod \"8d189add-4d6d-4799-9f03-f6683d1e95a5\" (UID: \"8d189add-4d6d-4799-9f03-f6683d1e95a5\") " Mar 11 00:58:35 crc kubenswrapper[4744]: I0311 00:58:35.438032 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/8d189add-4d6d-4799-9f03-f6683d1e95a5-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "8d189add-4d6d-4799-9f03-f6683d1e95a5" (UID: "8d189add-4d6d-4799-9f03-f6683d1e95a5"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 11 00:58:35 crc kubenswrapper[4744]: I0311 00:58:35.438358 4744 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/8d189add-4d6d-4799-9f03-f6683d1e95a5-kubelet-dir\") on node \"crc\" DevicePath \"\"" Mar 11 00:58:35 crc kubenswrapper[4744]: I0311 00:58:35.449239 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8d189add-4d6d-4799-9f03-f6683d1e95a5-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "8d189add-4d6d-4799-9f03-f6683d1e95a5" (UID: "8d189add-4d6d-4799-9f03-f6683d1e95a5"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 00:58:35 crc kubenswrapper[4744]: I0311 00:58:35.540050 4744 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/8d189add-4d6d-4799-9f03-f6683d1e95a5-kube-api-access\") on node \"crc\" DevicePath \"\"" Mar 11 00:58:35 crc kubenswrapper[4744]: I0311 00:58:35.773476 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-697d97f7c8-cg2gg" Mar 11 00:58:36 crc kubenswrapper[4744]: I0311 00:58:36.025027 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" event={"ID":"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8","Type":"ContainerStarted","Data":"ab9e3e449fae0c6d0baed60f51362b92ca66a102fb542b9efd04f94af5b82128"} Mar 11 00:58:36 crc kubenswrapper[4744]: I0311 00:58:36.027570 4744 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 11 00:58:36 crc kubenswrapper[4744]: I0311 00:58:36.027554 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"8d189add-4d6d-4799-9f03-f6683d1e95a5","Type":"ContainerDied","Data":"918f1c49ef6675d81191c63e570ac1e3fba3120bdc68b69d7b0d1e030cd1d136"} Mar 11 00:58:36 crc kubenswrapper[4744]: I0311 00:58:36.027654 4744 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="918f1c49ef6675d81191c63e570ac1e3fba3120bdc68b69d7b0d1e030cd1d136" Mar 11 00:58:36 crc kubenswrapper[4744]: I0311 00:58:36.028856 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-tdnf7" event={"ID":"7aeb1578-fe93-4bec-8f43-17d0923fa5c0","Type":"ContainerStarted","Data":"6d3e4f3d31aa6aa1647e5254521347202e711436443a9c9765fdf6735c8685fc"} Mar 11 00:58:36 crc kubenswrapper[4744]: E0311 
00:58:36.532489 4744 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/openshift4/ose-cli:latest" Mar 11 00:58:36 crc kubenswrapper[4744]: E0311 00:58:36.532708 4744 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 11 00:58:36 crc kubenswrapper[4744]: container &Container{Name:oc,Image:registry.redhat.io/openshift4/ose-cli:latest,Command:[/bin/bash -c oc get csr -o go-template='{{range .items}}{{if not .status}}{{.metadata.name}}{{"\n"}}{{end}}{{end}}' | xargs --no-run-if-empty oc adm certificate approve Mar 11 00:58:36 crc kubenswrapper[4744]: ],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-rcql9,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:nil,Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod auto-csr-approver-29553178-56m2v_openshift-infra(95897da0-81a7-4656-9787-808f64d7aa9d): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled Mar 11 00:58:36 crc kubenswrapper[4744]: > logger="UnhandledError" Mar 11 00:58:36 crc kubenswrapper[4744]: E0311 00:58:36.533890 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"oc\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" 
pod="openshift-infra/auto-csr-approver-29553178-56m2v" podUID="95897da0-81a7-4656-9787-808f64d7aa9d" Mar 11 00:58:36 crc kubenswrapper[4744]: E0311 00:58:36.552211 4744 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/openshift4/ose-cli:latest" Mar 11 00:58:36 crc kubenswrapper[4744]: E0311 00:58:36.552371 4744 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 11 00:58:36 crc kubenswrapper[4744]: container &Container{Name:oc,Image:registry.redhat.io/openshift4/ose-cli:latest,Command:[/bin/bash -c oc get csr -o go-template='{{range .items}}{{if not .status}}{{.metadata.name}}{{"\n"}}{{end}}{{end}}' | xargs --no-run-if-empty oc adm certificate approve Mar 11 00:58:36 crc kubenswrapper[4744]: ],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-dchkc,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:nil,Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod auto-csr-approver-29553176-m7tmh_openshift-infra(7c454621-190e-4962-abed-72c0ec0613de): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled Mar 11 00:58:36 crc kubenswrapper[4744]: > logger="UnhandledError" Mar 11 00:58:36 crc kubenswrapper[4744]: E0311 00:58:36.553975 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to 
\"StartContainer\" for \"oc\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-infra/auto-csr-approver-29553176-m7tmh" podUID="7c454621-190e-4962-abed-72c0ec0613de" Mar 11 00:58:36 crc kubenswrapper[4744]: I0311 00:58:36.947061 4744 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-6d4ddbbf5f-rmqxc" Mar 11 00:58:36 crc kubenswrapper[4744]: I0311 00:58:36.952110 4744 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-5bccdc78db-64drh" Mar 11 00:58:37 crc kubenswrapper[4744]: I0311 00:58:37.035732 4744 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-5bccdc78db-64drh" Mar 11 00:58:37 crc kubenswrapper[4744]: I0311 00:58:37.035746 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-5bccdc78db-64drh" event={"ID":"cce9002e-ff71-445e-96e6-2d0e7eae739a","Type":"ContainerDied","Data":"048419a197c455d19087d9c82dd7f096dafce676f617d8819bbc53ced8ff3d69"} Mar 11 00:58:37 crc kubenswrapper[4744]: I0311 00:58:37.035854 4744 scope.go:117] "RemoveContainer" containerID="0c77f4b460433a157870b1e8fcc50a0e7ddc17993ec7d2a08e4d59cb594575a4" Mar 11 00:58:37 crc kubenswrapper[4744]: I0311 00:58:37.038417 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" event={"ID":"3b6479f0-333b-4a96-9adf-2099afdc2447","Type":"ContainerStarted","Data":"5390934c60a55752eff3104b188752876972835a3d2c61130b9cd0b9cb88a6fb"} Mar 11 00:58:37 crc kubenswrapper[4744]: I0311 00:58:37.040906 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-6d4ddbbf5f-rmqxc" 
event={"ID":"1c15dd40-8f1a-4be4-8801-c1d23566a6ec","Type":"ContainerDied","Data":"4fd08c49dbbe53e85c0a3c3233e4fced5e25e0fed53cdee8c38475f1ac0328ae"} Mar 11 00:58:37 crc kubenswrapper[4744]: I0311 00:58:37.041027 4744 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-6d4ddbbf5f-rmqxc" Mar 11 00:58:37 crc kubenswrapper[4744]: E0311 00:58:37.041906 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"oc\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/openshift4/ose-cli:latest\\\"\"" pod="openshift-infra/auto-csr-approver-29553178-56m2v" podUID="95897da0-81a7-4656-9787-808f64d7aa9d" Mar 11 00:58:37 crc kubenswrapper[4744]: E0311 00:58:37.042752 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"oc\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/openshift4/ose-cli:latest\\\"\"" pod="openshift-infra/auto-csr-approver-29553176-m7tmh" podUID="7c454621-190e-4962-abed-72c0ec0613de" Mar 11 00:58:37 crc kubenswrapper[4744]: I0311 00:58:37.060435 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1c15dd40-8f1a-4be4-8801-c1d23566a6ec-config\") pod \"1c15dd40-8f1a-4be4-8801-c1d23566a6ec\" (UID: \"1c15dd40-8f1a-4be4-8801-c1d23566a6ec\") " Mar 11 00:58:37 crc kubenswrapper[4744]: I0311 00:58:37.060533 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/1c15dd40-8f1a-4be4-8801-c1d23566a6ec-client-ca\") pod \"1c15dd40-8f1a-4be4-8801-c1d23566a6ec\" (UID: \"1c15dd40-8f1a-4be4-8801-c1d23566a6ec\") " Mar 11 00:58:37 crc kubenswrapper[4744]: I0311 00:58:37.060567 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/cce9002e-ff71-445e-96e6-2d0e7eae739a-serving-cert\") pod \"cce9002e-ff71-445e-96e6-2d0e7eae739a\" (UID: \"cce9002e-ff71-445e-96e6-2d0e7eae739a\") " Mar 11 00:58:37 crc kubenswrapper[4744]: I0311 00:58:37.060632 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dsthj\" (UniqueName: \"kubernetes.io/projected/cce9002e-ff71-445e-96e6-2d0e7eae739a-kube-api-access-dsthj\") pod \"cce9002e-ff71-445e-96e6-2d0e7eae739a\" (UID: \"cce9002e-ff71-445e-96e6-2d0e7eae739a\") " Mar 11 00:58:37 crc kubenswrapper[4744]: I0311 00:58:37.060659 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/cce9002e-ff71-445e-96e6-2d0e7eae739a-client-ca\") pod \"cce9002e-ff71-445e-96e6-2d0e7eae739a\" (UID: \"cce9002e-ff71-445e-96e6-2d0e7eae739a\") " Mar 11 00:58:37 crc kubenswrapper[4744]: I0311 00:58:37.060692 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1c15dd40-8f1a-4be4-8801-c1d23566a6ec-serving-cert\") pod \"1c15dd40-8f1a-4be4-8801-c1d23566a6ec\" (UID: \"1c15dd40-8f1a-4be4-8801-c1d23566a6ec\") " Mar 11 00:58:37 crc kubenswrapper[4744]: I0311 00:58:37.060722 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-r7cpr\" (UniqueName: \"kubernetes.io/projected/1c15dd40-8f1a-4be4-8801-c1d23566a6ec-kube-api-access-r7cpr\") pod \"1c15dd40-8f1a-4be4-8801-c1d23566a6ec\" (UID: \"1c15dd40-8f1a-4be4-8801-c1d23566a6ec\") " Mar 11 00:58:37 crc kubenswrapper[4744]: I0311 00:58:37.060787 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/1c15dd40-8f1a-4be4-8801-c1d23566a6ec-proxy-ca-bundles\") pod \"1c15dd40-8f1a-4be4-8801-c1d23566a6ec\" (UID: \"1c15dd40-8f1a-4be4-8801-c1d23566a6ec\") " Mar 11 00:58:37 crc kubenswrapper[4744]: 
I0311 00:58:37.060864 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cce9002e-ff71-445e-96e6-2d0e7eae739a-config\") pod \"cce9002e-ff71-445e-96e6-2d0e7eae739a\" (UID: \"cce9002e-ff71-445e-96e6-2d0e7eae739a\") " Mar 11 00:58:37 crc kubenswrapper[4744]: I0311 00:58:37.062296 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cce9002e-ff71-445e-96e6-2d0e7eae739a-client-ca" (OuterVolumeSpecName: "client-ca") pod "cce9002e-ff71-445e-96e6-2d0e7eae739a" (UID: "cce9002e-ff71-445e-96e6-2d0e7eae739a"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 11 00:58:37 crc kubenswrapper[4744]: I0311 00:58:37.062320 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cce9002e-ff71-445e-96e6-2d0e7eae739a-config" (OuterVolumeSpecName: "config") pod "cce9002e-ff71-445e-96e6-2d0e7eae739a" (UID: "cce9002e-ff71-445e-96e6-2d0e7eae739a"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 11 00:58:37 crc kubenswrapper[4744]: I0311 00:58:37.062537 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1c15dd40-8f1a-4be4-8801-c1d23566a6ec-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "1c15dd40-8f1a-4be4-8801-c1d23566a6ec" (UID: "1c15dd40-8f1a-4be4-8801-c1d23566a6ec"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 11 00:58:37 crc kubenswrapper[4744]: I0311 00:58:37.062759 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1c15dd40-8f1a-4be4-8801-c1d23566a6ec-config" (OuterVolumeSpecName: "config") pod "1c15dd40-8f1a-4be4-8801-c1d23566a6ec" (UID: "1c15dd40-8f1a-4be4-8801-c1d23566a6ec"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 11 00:58:37 crc kubenswrapper[4744]: I0311 00:58:37.064243 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1c15dd40-8f1a-4be4-8801-c1d23566a6ec-client-ca" (OuterVolumeSpecName: "client-ca") pod "1c15dd40-8f1a-4be4-8801-c1d23566a6ec" (UID: "1c15dd40-8f1a-4be4-8801-c1d23566a6ec"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 11 00:58:37 crc kubenswrapper[4744]: I0311 00:58:37.066146 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cce9002e-ff71-445e-96e6-2d0e7eae739a-kube-api-access-dsthj" (OuterVolumeSpecName: "kube-api-access-dsthj") pod "cce9002e-ff71-445e-96e6-2d0e7eae739a" (UID: "cce9002e-ff71-445e-96e6-2d0e7eae739a"). InnerVolumeSpecName "kube-api-access-dsthj". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 00:58:37 crc kubenswrapper[4744]: I0311 00:58:37.066278 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1c15dd40-8f1a-4be4-8801-c1d23566a6ec-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "1c15dd40-8f1a-4be4-8801-c1d23566a6ec" (UID: "1c15dd40-8f1a-4be4-8801-c1d23566a6ec"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 00:58:37 crc kubenswrapper[4744]: I0311 00:58:37.068217 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1c15dd40-8f1a-4be4-8801-c1d23566a6ec-kube-api-access-r7cpr" (OuterVolumeSpecName: "kube-api-access-r7cpr") pod "1c15dd40-8f1a-4be4-8801-c1d23566a6ec" (UID: "1c15dd40-8f1a-4be4-8801-c1d23566a6ec"). InnerVolumeSpecName "kube-api-access-r7cpr". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 00:58:37 crc kubenswrapper[4744]: I0311 00:58:37.079676 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cce9002e-ff71-445e-96e6-2d0e7eae739a-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "cce9002e-ff71-445e-96e6-2d0e7eae739a" (UID: "cce9002e-ff71-445e-96e6-2d0e7eae739a"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 00:58:37 crc kubenswrapper[4744]: I0311 00:58:37.162381 4744 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/cce9002e-ff71-445e-96e6-2d0e7eae739a-client-ca\") on node \"crc\" DevicePath \"\"" Mar 11 00:58:37 crc kubenswrapper[4744]: I0311 00:58:37.162423 4744 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1c15dd40-8f1a-4be4-8801-c1d23566a6ec-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 11 00:58:37 crc kubenswrapper[4744]: I0311 00:58:37.162434 4744 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-r7cpr\" (UniqueName: \"kubernetes.io/projected/1c15dd40-8f1a-4be4-8801-c1d23566a6ec-kube-api-access-r7cpr\") on node \"crc\" DevicePath \"\"" Mar 11 00:58:37 crc kubenswrapper[4744]: I0311 00:58:37.162445 4744 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/1c15dd40-8f1a-4be4-8801-c1d23566a6ec-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Mar 11 00:58:37 crc kubenswrapper[4744]: I0311 00:58:37.162455 4744 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cce9002e-ff71-445e-96e6-2d0e7eae739a-config\") on node \"crc\" DevicePath \"\"" Mar 11 00:58:37 crc kubenswrapper[4744]: I0311 00:58:37.162463 4744 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/1c15dd40-8f1a-4be4-8801-c1d23566a6ec-config\") on node \"crc\" DevicePath \"\"" Mar 11 00:58:37 crc kubenswrapper[4744]: I0311 00:58:37.162470 4744 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/1c15dd40-8f1a-4be4-8801-c1d23566a6ec-client-ca\") on node \"crc\" DevicePath \"\"" Mar 11 00:58:37 crc kubenswrapper[4744]: I0311 00:58:37.162481 4744 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/cce9002e-ff71-445e-96e6-2d0e7eae739a-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 11 00:58:37 crc kubenswrapper[4744]: I0311 00:58:37.162491 4744 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dsthj\" (UniqueName: \"kubernetes.io/projected/cce9002e-ff71-445e-96e6-2d0e7eae739a-kube-api-access-dsthj\") on node \"crc\" DevicePath \"\"" Mar 11 00:58:37 crc kubenswrapper[4744]: I0311 00:58:37.363135 4744 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-5bccdc78db-64drh"] Mar 11 00:58:37 crc kubenswrapper[4744]: I0311 00:58:37.365922 4744 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-5bccdc78db-64drh"] Mar 11 00:58:37 crc kubenswrapper[4744]: I0311 00:58:37.375900 4744 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-6d4ddbbf5f-rmqxc"] Mar 11 00:58:37 crc kubenswrapper[4744]: I0311 00:58:37.379657 4744 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-6d4ddbbf5f-rmqxc"] Mar 11 00:58:37 crc kubenswrapper[4744]: I0311 00:58:37.634102 4744 patch_prober.go:28] interesting pod/controller-manager-6d4ddbbf5f-rmqxc container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.54:8443/healthz\": dial tcp 
10.217.0.54:8443: i/o timeout" start-of-body= Mar 11 00:58:37 crc kubenswrapper[4744]: I0311 00:58:37.634174 4744 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-6d4ddbbf5f-rmqxc" podUID="1c15dd40-8f1a-4be4-8801-c1d23566a6ec" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.54:8443/healthz\": dial tcp 10.217.0.54:8443: i/o timeout" Mar 11 00:58:37 crc kubenswrapper[4744]: I0311 00:58:37.982243 4744 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1c15dd40-8f1a-4be4-8801-c1d23566a6ec" path="/var/lib/kubelet/pods/1c15dd40-8f1a-4be4-8801-c1d23566a6ec/volumes" Mar 11 00:58:37 crc kubenswrapper[4744]: I0311 00:58:37.982922 4744 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cce9002e-ff71-445e-96e6-2d0e7eae739a" path="/var/lib/kubelet/pods/cce9002e-ff71-445e-96e6-2d0e7eae739a/volumes" Mar 11 00:58:38 crc kubenswrapper[4744]: I0311 00:58:38.285296 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-677589fdb8-qqrb8"] Mar 11 00:58:38 crc kubenswrapper[4744]: E0311 00:58:38.288993 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8d189add-4d6d-4799-9f03-f6683d1e95a5" containerName="pruner" Mar 11 00:58:38 crc kubenswrapper[4744]: I0311 00:58:38.289025 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="8d189add-4d6d-4799-9f03-f6683d1e95a5" containerName="pruner" Mar 11 00:58:38 crc kubenswrapper[4744]: E0311 00:58:38.289041 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5f27b816-d380-4c87-a7ac-2ef9005b712d" containerName="pruner" Mar 11 00:58:38 crc kubenswrapper[4744]: I0311 00:58:38.289049 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="5f27b816-d380-4c87-a7ac-2ef9005b712d" containerName="pruner" Mar 11 00:58:38 crc kubenswrapper[4744]: E0311 00:58:38.289060 4744 cpu_manager.go:410] "RemoveStaleState: 
removing container" podUID="1c15dd40-8f1a-4be4-8801-c1d23566a6ec" containerName="controller-manager" Mar 11 00:58:38 crc kubenswrapper[4744]: I0311 00:58:38.289066 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="1c15dd40-8f1a-4be4-8801-c1d23566a6ec" containerName="controller-manager" Mar 11 00:58:38 crc kubenswrapper[4744]: E0311 00:58:38.289076 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cce9002e-ff71-445e-96e6-2d0e7eae739a" containerName="route-controller-manager" Mar 11 00:58:38 crc kubenswrapper[4744]: I0311 00:58:38.289467 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="cce9002e-ff71-445e-96e6-2d0e7eae739a" containerName="route-controller-manager" Mar 11 00:58:38 crc kubenswrapper[4744]: I0311 00:58:38.289604 4744 memory_manager.go:354] "RemoveStaleState removing state" podUID="8d189add-4d6d-4799-9f03-f6683d1e95a5" containerName="pruner" Mar 11 00:58:38 crc kubenswrapper[4744]: I0311 00:58:38.289614 4744 memory_manager.go:354] "RemoveStaleState removing state" podUID="5f27b816-d380-4c87-a7ac-2ef9005b712d" containerName="pruner" Mar 11 00:58:38 crc kubenswrapper[4744]: I0311 00:58:38.289622 4744 memory_manager.go:354] "RemoveStaleState removing state" podUID="cce9002e-ff71-445e-96e6-2d0e7eae739a" containerName="route-controller-manager" Mar 11 00:58:38 crc kubenswrapper[4744]: I0311 00:58:38.289631 4744 memory_manager.go:354] "RemoveStaleState removing state" podUID="1c15dd40-8f1a-4be4-8801-c1d23566a6ec" containerName="controller-manager" Mar 11 00:58:38 crc kubenswrapper[4744]: I0311 00:58:38.290342 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-7cc9895f8f-mm6b7"] Mar 11 00:58:38 crc kubenswrapper[4744]: I0311 00:58:38.294414 4744 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-677589fdb8-qqrb8" Mar 11 00:58:38 crc kubenswrapper[4744]: I0311 00:58:38.299119 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Mar 11 00:58:38 crc kubenswrapper[4744]: I0311 00:58:38.301854 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Mar 11 00:58:38 crc kubenswrapper[4744]: I0311 00:58:38.302308 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Mar 11 00:58:38 crc kubenswrapper[4744]: I0311 00:58:38.302527 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Mar 11 00:58:38 crc kubenswrapper[4744]: I0311 00:58:38.302734 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Mar 11 00:58:38 crc kubenswrapper[4744]: I0311 00:58:38.308807 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Mar 11 00:58:38 crc kubenswrapper[4744]: I0311 00:58:38.310262 4744 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-7cc9895f8f-mm6b7" Mar 11 00:58:38 crc kubenswrapper[4744]: I0311 00:58:38.313114 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Mar 11 00:58:38 crc kubenswrapper[4744]: I0311 00:58:38.316429 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Mar 11 00:58:38 crc kubenswrapper[4744]: I0311 00:58:38.316824 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Mar 11 00:58:38 crc kubenswrapper[4744]: I0311 00:58:38.316857 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Mar 11 00:58:38 crc kubenswrapper[4744]: I0311 00:58:38.316995 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-677589fdb8-qqrb8"] Mar 11 00:58:38 crc kubenswrapper[4744]: I0311 00:58:38.318372 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Mar 11 00:58:38 crc kubenswrapper[4744]: I0311 00:58:38.318642 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Mar 11 00:58:38 crc kubenswrapper[4744]: I0311 00:58:38.326181 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-7cc9895f8f-mm6b7"] Mar 11 00:58:38 crc kubenswrapper[4744]: I0311 00:58:38.330041 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Mar 11 00:58:38 crc kubenswrapper[4744]: I0311 00:58:38.382033 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rvvhr\" (UniqueName: 
\"kubernetes.io/projected/552c19f6-b704-4e27-b31d-2e6a091d1b06-kube-api-access-rvvhr\") pod \"route-controller-manager-677589fdb8-qqrb8\" (UID: \"552c19f6-b704-4e27-b31d-2e6a091d1b06\") " pod="openshift-route-controller-manager/route-controller-manager-677589fdb8-qqrb8" Mar 11 00:58:38 crc kubenswrapper[4744]: I0311 00:58:38.382110 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pkglg\" (UniqueName: \"kubernetes.io/projected/2790408a-d33b-4746-bab6-7a3f131802f0-kube-api-access-pkglg\") pod \"controller-manager-7cc9895f8f-mm6b7\" (UID: \"2790408a-d33b-4746-bab6-7a3f131802f0\") " pod="openshift-controller-manager/controller-manager-7cc9895f8f-mm6b7" Mar 11 00:58:38 crc kubenswrapper[4744]: I0311 00:58:38.382197 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/552c19f6-b704-4e27-b31d-2e6a091d1b06-client-ca\") pod \"route-controller-manager-677589fdb8-qqrb8\" (UID: \"552c19f6-b704-4e27-b31d-2e6a091d1b06\") " pod="openshift-route-controller-manager/route-controller-manager-677589fdb8-qqrb8" Mar 11 00:58:38 crc kubenswrapper[4744]: I0311 00:58:38.382311 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2790408a-d33b-4746-bab6-7a3f131802f0-config\") pod \"controller-manager-7cc9895f8f-mm6b7\" (UID: \"2790408a-d33b-4746-bab6-7a3f131802f0\") " pod="openshift-controller-manager/controller-manager-7cc9895f8f-mm6b7" Mar 11 00:58:38 crc kubenswrapper[4744]: I0311 00:58:38.382337 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/2790408a-d33b-4746-bab6-7a3f131802f0-client-ca\") pod \"controller-manager-7cc9895f8f-mm6b7\" (UID: \"2790408a-d33b-4746-bab6-7a3f131802f0\") " 
pod="openshift-controller-manager/controller-manager-7cc9895f8f-mm6b7" Mar 11 00:58:38 crc kubenswrapper[4744]: I0311 00:58:38.382393 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/552c19f6-b704-4e27-b31d-2e6a091d1b06-serving-cert\") pod \"route-controller-manager-677589fdb8-qqrb8\" (UID: \"552c19f6-b704-4e27-b31d-2e6a091d1b06\") " pod="openshift-route-controller-manager/route-controller-manager-677589fdb8-qqrb8" Mar 11 00:58:38 crc kubenswrapper[4744]: I0311 00:58:38.382431 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/552c19f6-b704-4e27-b31d-2e6a091d1b06-config\") pod \"route-controller-manager-677589fdb8-qqrb8\" (UID: \"552c19f6-b704-4e27-b31d-2e6a091d1b06\") " pod="openshift-route-controller-manager/route-controller-manager-677589fdb8-qqrb8" Mar 11 00:58:38 crc kubenswrapper[4744]: I0311 00:58:38.383532 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2790408a-d33b-4746-bab6-7a3f131802f0-serving-cert\") pod \"controller-manager-7cc9895f8f-mm6b7\" (UID: \"2790408a-d33b-4746-bab6-7a3f131802f0\") " pod="openshift-controller-manager/controller-manager-7cc9895f8f-mm6b7" Mar 11 00:58:38 crc kubenswrapper[4744]: I0311 00:58:38.383583 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/2790408a-d33b-4746-bab6-7a3f131802f0-proxy-ca-bundles\") pod \"controller-manager-7cc9895f8f-mm6b7\" (UID: \"2790408a-d33b-4746-bab6-7a3f131802f0\") " pod="openshift-controller-manager/controller-manager-7cc9895f8f-mm6b7" Mar 11 00:58:38 crc kubenswrapper[4744]: I0311 00:58:38.491347 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" 
(UniqueName: \"kubernetes.io/configmap/552c19f6-b704-4e27-b31d-2e6a091d1b06-config\") pod \"route-controller-manager-677589fdb8-qqrb8\" (UID: \"552c19f6-b704-4e27-b31d-2e6a091d1b06\") " pod="openshift-route-controller-manager/route-controller-manager-677589fdb8-qqrb8" Mar 11 00:58:38 crc kubenswrapper[4744]: I0311 00:58:38.491765 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2790408a-d33b-4746-bab6-7a3f131802f0-serving-cert\") pod \"controller-manager-7cc9895f8f-mm6b7\" (UID: \"2790408a-d33b-4746-bab6-7a3f131802f0\") " pod="openshift-controller-manager/controller-manager-7cc9895f8f-mm6b7" Mar 11 00:58:38 crc kubenswrapper[4744]: I0311 00:58:38.491822 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/2790408a-d33b-4746-bab6-7a3f131802f0-proxy-ca-bundles\") pod \"controller-manager-7cc9895f8f-mm6b7\" (UID: \"2790408a-d33b-4746-bab6-7a3f131802f0\") " pod="openshift-controller-manager/controller-manager-7cc9895f8f-mm6b7" Mar 11 00:58:38 crc kubenswrapper[4744]: I0311 00:58:38.491894 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rvvhr\" (UniqueName: \"kubernetes.io/projected/552c19f6-b704-4e27-b31d-2e6a091d1b06-kube-api-access-rvvhr\") pod \"route-controller-manager-677589fdb8-qqrb8\" (UID: \"552c19f6-b704-4e27-b31d-2e6a091d1b06\") " pod="openshift-route-controller-manager/route-controller-manager-677589fdb8-qqrb8" Mar 11 00:58:38 crc kubenswrapper[4744]: I0311 00:58:38.491966 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pkglg\" (UniqueName: \"kubernetes.io/projected/2790408a-d33b-4746-bab6-7a3f131802f0-kube-api-access-pkglg\") pod \"controller-manager-7cc9895f8f-mm6b7\" (UID: \"2790408a-d33b-4746-bab6-7a3f131802f0\") " pod="openshift-controller-manager/controller-manager-7cc9895f8f-mm6b7" 
Mar 11 00:58:38 crc kubenswrapper[4744]: I0311 00:58:38.492078 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/552c19f6-b704-4e27-b31d-2e6a091d1b06-client-ca\") pod \"route-controller-manager-677589fdb8-qqrb8\" (UID: \"552c19f6-b704-4e27-b31d-2e6a091d1b06\") " pod="openshift-route-controller-manager/route-controller-manager-677589fdb8-qqrb8" Mar 11 00:58:38 crc kubenswrapper[4744]: I0311 00:58:38.492133 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2790408a-d33b-4746-bab6-7a3f131802f0-config\") pod \"controller-manager-7cc9895f8f-mm6b7\" (UID: \"2790408a-d33b-4746-bab6-7a3f131802f0\") " pod="openshift-controller-manager/controller-manager-7cc9895f8f-mm6b7" Mar 11 00:58:38 crc kubenswrapper[4744]: I0311 00:58:38.492169 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/2790408a-d33b-4746-bab6-7a3f131802f0-client-ca\") pod \"controller-manager-7cc9895f8f-mm6b7\" (UID: \"2790408a-d33b-4746-bab6-7a3f131802f0\") " pod="openshift-controller-manager/controller-manager-7cc9895f8f-mm6b7" Mar 11 00:58:38 crc kubenswrapper[4744]: I0311 00:58:38.492250 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/552c19f6-b704-4e27-b31d-2e6a091d1b06-serving-cert\") pod \"route-controller-manager-677589fdb8-qqrb8\" (UID: \"552c19f6-b704-4e27-b31d-2e6a091d1b06\") " pod="openshift-route-controller-manager/route-controller-manager-677589fdb8-qqrb8" Mar 11 00:58:38 crc kubenswrapper[4744]: I0311 00:58:38.493740 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/2790408a-d33b-4746-bab6-7a3f131802f0-client-ca\") pod \"controller-manager-7cc9895f8f-mm6b7\" (UID: 
\"2790408a-d33b-4746-bab6-7a3f131802f0\") " pod="openshift-controller-manager/controller-manager-7cc9895f8f-mm6b7" Mar 11 00:58:38 crc kubenswrapper[4744]: I0311 00:58:38.496851 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/552c19f6-b704-4e27-b31d-2e6a091d1b06-config\") pod \"route-controller-manager-677589fdb8-qqrb8\" (UID: \"552c19f6-b704-4e27-b31d-2e6a091d1b06\") " pod="openshift-route-controller-manager/route-controller-manager-677589fdb8-qqrb8" Mar 11 00:58:38 crc kubenswrapper[4744]: I0311 00:58:38.498460 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/2790408a-d33b-4746-bab6-7a3f131802f0-proxy-ca-bundles\") pod \"controller-manager-7cc9895f8f-mm6b7\" (UID: \"2790408a-d33b-4746-bab6-7a3f131802f0\") " pod="openshift-controller-manager/controller-manager-7cc9895f8f-mm6b7" Mar 11 00:58:38 crc kubenswrapper[4744]: I0311 00:58:38.498586 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2790408a-d33b-4746-bab6-7a3f131802f0-config\") pod \"controller-manager-7cc9895f8f-mm6b7\" (UID: \"2790408a-d33b-4746-bab6-7a3f131802f0\") " pod="openshift-controller-manager/controller-manager-7cc9895f8f-mm6b7" Mar 11 00:58:38 crc kubenswrapper[4744]: I0311 00:58:38.504926 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2790408a-d33b-4746-bab6-7a3f131802f0-serving-cert\") pod \"controller-manager-7cc9895f8f-mm6b7\" (UID: \"2790408a-d33b-4746-bab6-7a3f131802f0\") " pod="openshift-controller-manager/controller-manager-7cc9895f8f-mm6b7" Mar 11 00:58:38 crc kubenswrapper[4744]: I0311 00:58:38.506265 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/552c19f6-b704-4e27-b31d-2e6a091d1b06-client-ca\") pod 
\"route-controller-manager-677589fdb8-qqrb8\" (UID: \"552c19f6-b704-4e27-b31d-2e6a091d1b06\") " pod="openshift-route-controller-manager/route-controller-manager-677589fdb8-qqrb8" Mar 11 00:58:38 crc kubenswrapper[4744]: I0311 00:58:38.509034 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rvvhr\" (UniqueName: \"kubernetes.io/projected/552c19f6-b704-4e27-b31d-2e6a091d1b06-kube-api-access-rvvhr\") pod \"route-controller-manager-677589fdb8-qqrb8\" (UID: \"552c19f6-b704-4e27-b31d-2e6a091d1b06\") " pod="openshift-route-controller-manager/route-controller-manager-677589fdb8-qqrb8" Mar 11 00:58:38 crc kubenswrapper[4744]: I0311 00:58:38.512834 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pkglg\" (UniqueName: \"kubernetes.io/projected/2790408a-d33b-4746-bab6-7a3f131802f0-kube-api-access-pkglg\") pod \"controller-manager-7cc9895f8f-mm6b7\" (UID: \"2790408a-d33b-4746-bab6-7a3f131802f0\") " pod="openshift-controller-manager/controller-manager-7cc9895f8f-mm6b7" Mar 11 00:58:38 crc kubenswrapper[4744]: I0311 00:58:38.519231 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/552c19f6-b704-4e27-b31d-2e6a091d1b06-serving-cert\") pod \"route-controller-manager-677589fdb8-qqrb8\" (UID: \"552c19f6-b704-4e27-b31d-2e6a091d1b06\") " pod="openshift-route-controller-manager/route-controller-manager-677589fdb8-qqrb8" Mar 11 00:58:38 crc kubenswrapper[4744]: I0311 00:58:38.639987 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-677589fdb8-qqrb8" Mar 11 00:58:38 crc kubenswrapper[4744]: I0311 00:58:38.650197 4744 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-7cc9895f8f-mm6b7" Mar 11 00:58:42 crc kubenswrapper[4744]: I0311 00:58:42.083247 4744 generic.go:334] "Generic (PLEG): container finished" podID="8d83a64e-bd7c-43b4-aac4-8fdc807059f5" containerID="d4e3bb5b14629021056ad0f4d46c236a1c2f2aab672c6d18f7df9c13e7fad9d8" exitCode=0 Mar 11 00:58:42 crc kubenswrapper[4744]: I0311 00:58:42.083361 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-pruner-29553120-6zw87" event={"ID":"8d83a64e-bd7c-43b4-aac4-8fdc807059f5","Type":"ContainerDied","Data":"d4e3bb5b14629021056ad0f4d46c236a1c2f2aab672c6d18f7df9c13e7fad9d8"} Mar 11 00:58:42 crc kubenswrapper[4744]: I0311 00:58:42.410750 4744 patch_prober.go:28] interesting pod/machine-config-daemon-678nx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 11 00:58:42 crc kubenswrapper[4744]: I0311 00:58:42.410850 4744 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-678nx" podUID="a15dc7ac-7c34-4135-b6eb-a85122800ce9" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 11 00:58:45 crc kubenswrapper[4744]: I0311 00:58:45.957327 4744 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-4tcts"] Mar 11 00:58:47 crc kubenswrapper[4744]: E0311 00:58:47.356587 4744 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/community-operator-index:v4.18" Mar 11 00:58:47 crc kubenswrapper[4744]: E0311 00:58:47.357106 4744 kuberuntime_manager.go:1274] 
"Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/community-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-tx9dj,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod community-operators-kk7lx_openshift-marketplace(9c559a48-ac87-4cab-848d-f2f647f8396b): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Mar 11 00:58:47 crc kubenswrapper[4744]: E0311 00:58:47.359619 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for 
\"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/community-operators-kk7lx" podUID="9c559a48-ac87-4cab-848d-f2f647f8396b" Mar 11 00:58:47 crc kubenswrapper[4744]: E0311 00:58:47.430414 4744 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/community-operator-index:v4.18" Mar 11 00:58:47 crc kubenswrapper[4744]: E0311 00:58:47.430597 4744 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/community-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-jbm2q,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},St
din:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod community-operators-qqw9j_openshift-marketplace(dc99af5e-bf41-49ac-8e4a-416f565cbfc9): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Mar 11 00:58:47 crc kubenswrapper[4744]: E0311 00:58:47.431326 4744 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-operator-index:v4.18" Mar 11 00:58:47 crc kubenswrapper[4744]: E0311 00:58:47.431423 4744 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-4tdd2,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-operators-wr9vs_openshift-marketplace(b57e3e22-ee77-4a48-b62a-1a5ff5394362): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Mar 11 00:58:47 crc kubenswrapper[4744]: E0311 00:58:47.432649 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/community-operators-qqw9j" podUID="dc99af5e-bf41-49ac-8e4a-416f565cbfc9" Mar 11 00:58:47 crc 
kubenswrapper[4744]: E0311 00:58:47.432677 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-operators-wr9vs" podUID="b57e3e22-ee77-4a48-b62a-1a5ff5394362" Mar 11 00:58:47 crc kubenswrapper[4744]: I0311 00:58:47.716777 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-56d4r" Mar 11 00:58:49 crc kubenswrapper[4744]: E0311 00:58:49.591306 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"\"" pod="openshift-marketplace/community-operators-kk7lx" podUID="9c559a48-ac87-4cab-848d-f2f647f8396b" Mar 11 00:58:49 crc kubenswrapper[4744]: E0311 00:58:49.591487 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-operators-wr9vs" podUID="b57e3e22-ee77-4a48-b62a-1a5ff5394362" Mar 11 00:58:49 crc kubenswrapper[4744]: E0311 00:58:49.591556 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"\"" pod="openshift-marketplace/community-operators-qqw9j" podUID="dc99af5e-bf41-49ac-8e4a-416f565cbfc9" Mar 11 00:58:49 crc kubenswrapper[4744]: I0311 00:58:49.614206 4744 scope.go:117] "RemoveContainer" containerID="68eef2c7cedda0dfd8d895dd1d52d290e7bab9e354a813d22973831e0c3a8191" Mar 11 00:58:49 crc kubenswrapper[4744]: 
E0311 00:58:49.716892 4744 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/certified-operator-index:v4.18" Mar 11 00:58:49 crc kubenswrapper[4744]: E0311 00:58:49.717044 4744 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/certified-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-kctvp,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod 
certified-operators-n9qr2_openshift-marketplace(a870760b-88e5-4526-8f91-ef89201e2a13): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Mar 11 00:58:49 crc kubenswrapper[4744]: E0311 00:58:49.718148 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/certified-operators-n9qr2" podUID="a870760b-88e5-4526-8f91-ef89201e2a13" Mar 11 00:58:49 crc kubenswrapper[4744]: E0311 00:58:49.783534 4744 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/certified-operator-index:v4.18" Mar 11 00:58:49 crc kubenswrapper[4744]: E0311 00:58:49.784054 4744 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/certified-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-cbsj6,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod certified-operators-9zdvr_openshift-marketplace(19ed9886-b3a9-4ea4-ac2a-2abd59a4b1fe): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Mar 11 00:58:49 crc kubenswrapper[4744]: E0311 00:58:49.785102 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/certified-operators-9zdvr" podUID="19ed9886-b3a9-4ea4-ac2a-2abd59a4b1fe" Mar 11 00:58:49 crc 
kubenswrapper[4744]: I0311 00:58:49.790022 4744 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-pruner-29553120-6zw87" Mar 11 00:58:49 crc kubenswrapper[4744]: I0311 00:58:49.886623 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mhzxz\" (UniqueName: \"kubernetes.io/projected/8d83a64e-bd7c-43b4-aac4-8fdc807059f5-kube-api-access-mhzxz\") pod \"8d83a64e-bd7c-43b4-aac4-8fdc807059f5\" (UID: \"8d83a64e-bd7c-43b4-aac4-8fdc807059f5\") " Mar 11 00:58:49 crc kubenswrapper[4744]: I0311 00:58:49.886688 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/8d83a64e-bd7c-43b4-aac4-8fdc807059f5-serviceca\") pod \"8d83a64e-bd7c-43b4-aac4-8fdc807059f5\" (UID: \"8d83a64e-bd7c-43b4-aac4-8fdc807059f5\") " Mar 11 00:58:49 crc kubenswrapper[4744]: I0311 00:58:49.889383 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8d83a64e-bd7c-43b4-aac4-8fdc807059f5-serviceca" (OuterVolumeSpecName: "serviceca") pod "8d83a64e-bd7c-43b4-aac4-8fdc807059f5" (UID: "8d83a64e-bd7c-43b4-aac4-8fdc807059f5"). InnerVolumeSpecName "serviceca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 11 00:58:49 crc kubenswrapper[4744]: I0311 00:58:49.895777 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8d83a64e-bd7c-43b4-aac4-8fdc807059f5-kube-api-access-mhzxz" (OuterVolumeSpecName: "kube-api-access-mhzxz") pod "8d83a64e-bd7c-43b4-aac4-8fdc807059f5" (UID: "8d83a64e-bd7c-43b4-aac4-8fdc807059f5"). InnerVolumeSpecName "kube-api-access-mhzxz". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 00:58:49 crc kubenswrapper[4744]: I0311 00:58:49.946271 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-677589fdb8-qqrb8"] Mar 11 00:58:49 crc kubenswrapper[4744]: I0311 00:58:49.988580 4744 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mhzxz\" (UniqueName: \"kubernetes.io/projected/8d83a64e-bd7c-43b4-aac4-8fdc807059f5-kube-api-access-mhzxz\") on node \"crc\" DevicePath \"\"" Mar 11 00:58:49 crc kubenswrapper[4744]: I0311 00:58:49.988652 4744 reconciler_common.go:293] "Volume detached for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/8d83a64e-bd7c-43b4-aac4-8fdc807059f5-serviceca\") on node \"crc\" DevicePath \"\"" Mar 11 00:58:49 crc kubenswrapper[4744]: I0311 00:58:49.993580 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-7cc9895f8f-mm6b7"] Mar 11 00:58:50 crc kubenswrapper[4744]: I0311 00:58:50.130456 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-pfjjz" event={"ID":"9b7117a9-f857-4f17-a2e8-13bd999e4fe2","Type":"ContainerStarted","Data":"e5bd59cef7318c1e9fc1636da0ce90cc03482102976abc45e7ebaa68ee9960c4"} Mar 11 00:58:50 crc kubenswrapper[4744]: I0311 00:58:50.131388 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-677589fdb8-qqrb8" event={"ID":"552c19f6-b704-4e27-b31d-2e6a091d1b06","Type":"ContainerStarted","Data":"fccfe8befa9c08cd529643c21f4f7cbcfdd58f6d3267e9304a7823519e20db2a"} Mar 11 00:58:50 crc kubenswrapper[4744]: I0311 00:58:50.132647 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-pruner-29553120-6zw87" event={"ID":"8d83a64e-bd7c-43b4-aac4-8fdc807059f5","Type":"ContainerDied","Data":"06aa8cf535683f95557ed9ec3a36658a240464fedb09c1198012477ffe68b408"} Mar 11 
00:58:50 crc kubenswrapper[4744]: I0311 00:58:50.132685 4744 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="06aa8cf535683f95557ed9ec3a36658a240464fedb09c1198012477ffe68b408" Mar 11 00:58:50 crc kubenswrapper[4744]: I0311 00:58:50.132740 4744 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-pruner-29553120-6zw87" Mar 11 00:58:50 crc kubenswrapper[4744]: I0311 00:58:50.137061 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-6jf7l" event={"ID":"e1b5d764-9e1d-4be7-b365-85482c4e0def","Type":"ContainerStarted","Data":"10878c6bbf9561df6232073cb2911599e404187c087ba631ea0762979fd139d6"} Mar 11 00:58:50 crc kubenswrapper[4744]: I0311 00:58:50.138557 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-nvxtd" event={"ID":"5ce17bbe-ec69-4349-acbc-4e99fcfb894f","Type":"ContainerStarted","Data":"5476f8854ea589094226ebafae12607214bf479f0d115b333a9c1cc5b8115f4e"} Mar 11 00:58:50 crc kubenswrapper[4744]: I0311 00:58:50.141608 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" event={"ID":"9d751cbb-f2e2-430d-9754-c882a5e924a5","Type":"ContainerStarted","Data":"cd849d95f350355dee0a1a0c0a8616f46a6340f241a1038108e018e7f78b0ed7"} Mar 11 00:58:50 crc kubenswrapper[4744]: I0311 00:58:50.143650 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" event={"ID":"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8","Type":"ContainerStarted","Data":"d7380e0c2a1dfac2f7a3147766890cca38b812a24f1987f6c5085c10685ccd26"} Mar 11 00:58:50 crc kubenswrapper[4744]: I0311 00:58:50.144577 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" 
event={"ID":"3b6479f0-333b-4a96-9adf-2099afdc2447","Type":"ContainerStarted","Data":"f0ffc828941ad05daed1da5a5e6ae34327aad36ba34959e9f16e6499232ba25e"} Mar 11 00:58:50 crc kubenswrapper[4744]: I0311 00:58:50.144926 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 11 00:58:50 crc kubenswrapper[4744]: I0311 00:58:50.148953 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-tdnf7" event={"ID":"7aeb1578-fe93-4bec-8f43-17d0923fa5c0","Type":"ContainerStarted","Data":"c70b69a4ab7f6b6a321f504247f0a46fd4213964a64da2a3d980b054c4114560"} Mar 11 00:58:51 crc kubenswrapper[4744]: I0311 00:58:51.004374 4744 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-7cc9895f8f-mm6b7"] Mar 11 00:58:51 crc kubenswrapper[4744]: I0311 00:58:51.101539 4744 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-677589fdb8-qqrb8"] Mar 11 00:58:51 crc kubenswrapper[4744]: E0311 00:58:51.133365 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"\"" pod="openshift-marketplace/certified-operators-n9qr2" podUID="a870760b-88e5-4526-8f91-ef89201e2a13" Mar 11 00:58:51 crc kubenswrapper[4744]: E0311 00:58:51.134168 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"\"" pod="openshift-marketplace/certified-operators-9zdvr" podUID="19ed9886-b3a9-4ea4-ac2a-2abd59a4b1fe" Mar 11 00:58:51 crc kubenswrapper[4744]: I0311 00:58:51.172271 4744 generic.go:334] "Generic (PLEG): container finished" 
podID="e1b5d764-9e1d-4be7-b365-85482c4e0def" containerID="10878c6bbf9561df6232073cb2911599e404187c087ba631ea0762979fd139d6" exitCode=0 Mar 11 00:58:51 crc kubenswrapper[4744]: I0311 00:58:51.172467 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-6jf7l" event={"ID":"e1b5d764-9e1d-4be7-b365-85482c4e0def","Type":"ContainerDied","Data":"10878c6bbf9561df6232073cb2911599e404187c087ba631ea0762979fd139d6"} Mar 11 00:58:51 crc kubenswrapper[4744]: I0311 00:58:51.176579 4744 generic.go:334] "Generic (PLEG): container finished" podID="5ce17bbe-ec69-4349-acbc-4e99fcfb894f" containerID="5476f8854ea589094226ebafae12607214bf479f0d115b333a9c1cc5b8115f4e" exitCode=0 Mar 11 00:58:51 crc kubenswrapper[4744]: I0311 00:58:51.176929 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-nvxtd" event={"ID":"5ce17bbe-ec69-4349-acbc-4e99fcfb894f","Type":"ContainerDied","Data":"5476f8854ea589094226ebafae12607214bf479f0d115b333a9c1cc5b8115f4e"} Mar 11 00:58:51 crc kubenswrapper[4744]: I0311 00:58:51.185949 4744 generic.go:334] "Generic (PLEG): container finished" podID="9b7117a9-f857-4f17-a2e8-13bd999e4fe2" containerID="e5bd59cef7318c1e9fc1636da0ce90cc03482102976abc45e7ebaa68ee9960c4" exitCode=0 Mar 11 00:58:51 crc kubenswrapper[4744]: I0311 00:58:51.186011 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-pfjjz" event={"ID":"9b7117a9-f857-4f17-a2e8-13bd999e4fe2","Type":"ContainerDied","Data":"e5bd59cef7318c1e9fc1636da0ce90cc03482102976abc45e7ebaa68ee9960c4"} Mar 11 00:58:51 crc kubenswrapper[4744]: I0311 00:58:51.192994 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-7cc9895f8f-mm6b7" event={"ID":"2790408a-d33b-4746-bab6-7a3f131802f0","Type":"ContainerStarted","Data":"4c08bd8a6bd9c558a5e96f7eb3659b93417b635e55b581f11ea58cab8cea115d"} Mar 11 00:58:52 crc kubenswrapper[4744]: I0311 
00:58:52.199100 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-tdnf7" event={"ID":"7aeb1578-fe93-4bec-8f43-17d0923fa5c0","Type":"ContainerStarted","Data":"6eeb767f7a1fa261cec04d8df57478598434ac982fd9e03befd70d55ce245594"} Mar 11 00:58:52 crc kubenswrapper[4744]: I0311 00:58:52.200617 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29553178-56m2v" event={"ID":"95897da0-81a7-4656-9787-808f64d7aa9d","Type":"ContainerStarted","Data":"f4500f053249a9107662e0962cc87e100738c7ccadb7e113b4acb2d44153988f"} Mar 11 00:58:52 crc kubenswrapper[4744]: I0311 00:58:52.202216 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-7cc9895f8f-mm6b7" event={"ID":"2790408a-d33b-4746-bab6-7a3f131802f0","Type":"ContainerStarted","Data":"f627eb963b026b9ed0c404ffcb57c787e09a1379cbf73df802a25cc7cb862a78"} Mar 11 00:58:52 crc kubenswrapper[4744]: I0311 00:58:52.202290 4744 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-7cc9895f8f-mm6b7" podUID="2790408a-d33b-4746-bab6-7a3f131802f0" containerName="controller-manager" containerID="cri-o://f627eb963b026b9ed0c404ffcb57c787e09a1379cbf73df802a25cc7cb862a78" gracePeriod=30 Mar 11 00:58:52 crc kubenswrapper[4744]: I0311 00:58:52.202406 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-7cc9895f8f-mm6b7" Mar 11 00:58:52 crc kubenswrapper[4744]: I0311 00:58:52.204613 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-677589fdb8-qqrb8" event={"ID":"552c19f6-b704-4e27-b31d-2e6a091d1b06","Type":"ContainerStarted","Data":"e3887719f8b01941812abe7807617bca905de3fd7169e6d57f4d7726f3a90bb2"} Mar 11 00:58:52 crc kubenswrapper[4744]: I0311 00:58:52.204746 4744 kuberuntime_container.go:808] "Killing 
container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-677589fdb8-qqrb8" podUID="552c19f6-b704-4e27-b31d-2e6a091d1b06" containerName="route-controller-manager" containerID="cri-o://e3887719f8b01941812abe7807617bca905de3fd7169e6d57f4d7726f3a90bb2" gracePeriod=30 Mar 11 00:58:52 crc kubenswrapper[4744]: I0311 00:58:52.204851 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-677589fdb8-qqrb8" Mar 11 00:58:52 crc kubenswrapper[4744]: I0311 00:58:52.208313 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-7cc9895f8f-mm6b7" Mar 11 00:58:52 crc kubenswrapper[4744]: I0311 00:58:52.214320 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-677589fdb8-qqrb8" Mar 11 00:58:52 crc kubenswrapper[4744]: I0311 00:58:52.219333 4744 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-tdnf7" podStartSLOduration=227.219315508 podStartE2EDuration="3m47.219315508s" podCreationTimestamp="2026-03-11 00:55:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-11 00:58:52.217887445 +0000 UTC m=+289.022105060" watchObservedRunningTime="2026-03-11 00:58:52.219315508 +0000 UTC m=+289.023533113" Mar 11 00:58:52 crc kubenswrapper[4744]: I0311 00:58:52.235361 4744 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29553178-56m2v" podStartSLOduration=10.171772153 podStartE2EDuration="52.235342299s" podCreationTimestamp="2026-03-11 00:58:00 +0000 UTC" firstStartedPulling="2026-03-11 00:58:09.116016974 +0000 UTC m=+245.920234579" lastFinishedPulling="2026-03-11 00:58:51.17958707 +0000 UTC m=+287.983804725" 
observedRunningTime="2026-03-11 00:58:52.231328389 +0000 UTC m=+289.035545984" watchObservedRunningTime="2026-03-11 00:58:52.235342299 +0000 UTC m=+289.039559904" Mar 11 00:58:52 crc kubenswrapper[4744]: I0311 00:58:52.253974 4744 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-7cc9895f8f-mm6b7" podStartSLOduration=21.253956649 podStartE2EDuration="21.253956649s" podCreationTimestamp="2026-03-11 00:58:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-11 00:58:52.2529284 +0000 UTC m=+289.057146005" watchObservedRunningTime="2026-03-11 00:58:52.253956649 +0000 UTC m=+289.058174254" Mar 11 00:58:52 crc kubenswrapper[4744]: I0311 00:58:52.341769 4744 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-677589fdb8-qqrb8" podStartSLOduration=21.341754082 podStartE2EDuration="21.341754082s" podCreationTimestamp="2026-03-11 00:58:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-11 00:58:52.340644591 +0000 UTC m=+289.144862196" watchObservedRunningTime="2026-03-11 00:58:52.341754082 +0000 UTC m=+289.145971687" Mar 11 00:58:52 crc kubenswrapper[4744]: I0311 00:58:52.368520 4744 csr.go:261] certificate signing request csr-8zb6x is approved, waiting to be issued Mar 11 00:58:52 crc kubenswrapper[4744]: I0311 00:58:52.372231 4744 csr.go:257] certificate signing request csr-8zb6x is issued Mar 11 00:58:52 crc kubenswrapper[4744]: I0311 00:58:52.604971 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/revision-pruner-9-crc"] Mar 11 00:58:52 crc kubenswrapper[4744]: E0311 00:58:52.605205 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8d83a64e-bd7c-43b4-aac4-8fdc807059f5" 
containerName="image-pruner" Mar 11 00:58:52 crc kubenswrapper[4744]: I0311 00:58:52.605217 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="8d83a64e-bd7c-43b4-aac4-8fdc807059f5" containerName="image-pruner" Mar 11 00:58:52 crc kubenswrapper[4744]: I0311 00:58:52.605304 4744 memory_manager.go:354] "RemoveStaleState removing state" podUID="8d83a64e-bd7c-43b4-aac4-8fdc807059f5" containerName="image-pruner" Mar 11 00:58:52 crc kubenswrapper[4744]: I0311 00:58:52.605655 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 11 00:58:52 crc kubenswrapper[4744]: I0311 00:58:52.608649 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver"/"installer-sa-dockercfg-5pr6n" Mar 11 00:58:52 crc kubenswrapper[4744]: I0311 00:58:52.608884 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver"/"kube-root-ca.crt" Mar 11 00:58:52 crc kubenswrapper[4744]: I0311 00:58:52.611364 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-9-crc"] Mar 11 00:58:52 crc kubenswrapper[4744]: I0311 00:58:52.724285 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/ea7cd693-24e8-47b0-922e-ebf91592b69f-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"ea7cd693-24e8-47b0-922e-ebf91592b69f\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 11 00:58:52 crc kubenswrapper[4744]: I0311 00:58:52.724460 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/ea7cd693-24e8-47b0-922e-ebf91592b69f-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"ea7cd693-24e8-47b0-922e-ebf91592b69f\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 11 00:58:52 crc 
kubenswrapper[4744]: I0311 00:58:52.826572 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/ea7cd693-24e8-47b0-922e-ebf91592b69f-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"ea7cd693-24e8-47b0-922e-ebf91592b69f\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 11 00:58:52 crc kubenswrapper[4744]: I0311 00:58:52.826747 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/ea7cd693-24e8-47b0-922e-ebf91592b69f-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"ea7cd693-24e8-47b0-922e-ebf91592b69f\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 11 00:58:52 crc kubenswrapper[4744]: I0311 00:58:52.826899 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/ea7cd693-24e8-47b0-922e-ebf91592b69f-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"ea7cd693-24e8-47b0-922e-ebf91592b69f\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 11 00:58:52 crc kubenswrapper[4744]: I0311 00:58:52.855825 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/ea7cd693-24e8-47b0-922e-ebf91592b69f-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"ea7cd693-24e8-47b0-922e-ebf91592b69f\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 11 00:58:52 crc kubenswrapper[4744]: I0311 00:58:52.929248 4744 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 11 00:58:53 crc kubenswrapper[4744]: I0311 00:58:53.215009 4744 generic.go:334] "Generic (PLEG): container finished" podID="2790408a-d33b-4746-bab6-7a3f131802f0" containerID="f627eb963b026b9ed0c404ffcb57c787e09a1379cbf73df802a25cc7cb862a78" exitCode=0 Mar 11 00:58:53 crc kubenswrapper[4744]: I0311 00:58:53.215371 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-7cc9895f8f-mm6b7" event={"ID":"2790408a-d33b-4746-bab6-7a3f131802f0","Type":"ContainerDied","Data":"f627eb963b026b9ed0c404ffcb57c787e09a1379cbf73df802a25cc7cb862a78"} Mar 11 00:58:53 crc kubenswrapper[4744]: I0311 00:58:53.217076 4744 generic.go:334] "Generic (PLEG): container finished" podID="552c19f6-b704-4e27-b31d-2e6a091d1b06" containerID="e3887719f8b01941812abe7807617bca905de3fd7169e6d57f4d7726f3a90bb2" exitCode=0 Mar 11 00:58:53 crc kubenswrapper[4744]: I0311 00:58:53.217113 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-677589fdb8-qqrb8" event={"ID":"552c19f6-b704-4e27-b31d-2e6a091d1b06","Type":"ContainerDied","Data":"e3887719f8b01941812abe7807617bca905de3fd7169e6d57f4d7726f3a90bb2"} Mar 11 00:58:53 crc kubenswrapper[4744]: I0311 00:58:53.220675 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-nvxtd" event={"ID":"5ce17bbe-ec69-4349-acbc-4e99fcfb894f","Type":"ContainerStarted","Data":"2517478864c24ae0329cb1005883b82dc6238aca8e9fa25f4e111b50df13f641"} Mar 11 00:58:53 crc kubenswrapper[4744]: I0311 00:58:53.222403 4744 generic.go:334] "Generic (PLEG): container finished" podID="95897da0-81a7-4656-9787-808f64d7aa9d" containerID="f4500f053249a9107662e0962cc87e100738c7ccadb7e113b4acb2d44153988f" exitCode=0 Mar 11 00:58:53 crc kubenswrapper[4744]: I0311 00:58:53.222440 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-infra/auto-csr-approver-29553178-56m2v" event={"ID":"95897da0-81a7-4656-9787-808f64d7aa9d","Type":"ContainerDied","Data":"f4500f053249a9107662e0962cc87e100738c7ccadb7e113b4acb2d44153988f"} Mar 11 00:58:53 crc kubenswrapper[4744]: I0311 00:58:53.246013 4744 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-nvxtd" podStartSLOduration=2.4905308 podStartE2EDuration="37.245990855s" podCreationTimestamp="2026-03-11 00:58:16 +0000 UTC" firstStartedPulling="2026-03-11 00:58:17.619744743 +0000 UTC m=+254.423962338" lastFinishedPulling="2026-03-11 00:58:52.375204788 +0000 UTC m=+289.179422393" observedRunningTime="2026-03-11 00:58:53.24238405 +0000 UTC m=+290.046601655" watchObservedRunningTime="2026-03-11 00:58:53.245990855 +0000 UTC m=+290.050208480" Mar 11 00:58:53 crc kubenswrapper[4744]: I0311 00:58:53.373266 4744 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2027-02-24 05:54:36 +0000 UTC, rotation deadline is 2027-01-01 00:41:17.973307406 +0000 UTC Mar 11 00:58:53 crc kubenswrapper[4744]: I0311 00:58:53.373307 4744 certificate_manager.go:356] kubernetes.io/kubelet-serving: Waiting 7103h42m24.600004121s for next certificate rotation Mar 11 00:58:53 crc kubenswrapper[4744]: I0311 00:58:53.437188 4744 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-7cc9895f8f-mm6b7" Mar 11 00:58:53 crc kubenswrapper[4744]: I0311 00:58:53.437602 4744 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-677589fdb8-qqrb8" Mar 11 00:58:53 crc kubenswrapper[4744]: I0311 00:58:53.507728 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-84b5546df4-ckw47"] Mar 11 00:58:53 crc kubenswrapper[4744]: E0311 00:58:53.508111 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2790408a-d33b-4746-bab6-7a3f131802f0" containerName="controller-manager" Mar 11 00:58:53 crc kubenswrapper[4744]: I0311 00:58:53.508204 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="2790408a-d33b-4746-bab6-7a3f131802f0" containerName="controller-manager" Mar 11 00:58:53 crc kubenswrapper[4744]: E0311 00:58:53.508262 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="552c19f6-b704-4e27-b31d-2e6a091d1b06" containerName="route-controller-manager" Mar 11 00:58:53 crc kubenswrapper[4744]: I0311 00:58:53.508313 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="552c19f6-b704-4e27-b31d-2e6a091d1b06" containerName="route-controller-manager" Mar 11 00:58:53 crc kubenswrapper[4744]: I0311 00:58:53.508455 4744 memory_manager.go:354] "RemoveStaleState removing state" podUID="2790408a-d33b-4746-bab6-7a3f131802f0" containerName="controller-manager" Mar 11 00:58:53 crc kubenswrapper[4744]: I0311 00:58:53.509045 4744 memory_manager.go:354] "RemoveStaleState removing state" podUID="552c19f6-b704-4e27-b31d-2e6a091d1b06" containerName="route-controller-manager" Mar 11 00:58:53 crc kubenswrapper[4744]: I0311 00:58:53.512859 4744 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-84b5546df4-ckw47" Mar 11 00:58:53 crc kubenswrapper[4744]: I0311 00:58:53.515832 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-84b5546df4-ckw47"] Mar 11 00:58:53 crc kubenswrapper[4744]: I0311 00:58:53.542192 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2790408a-d33b-4746-bab6-7a3f131802f0-config\") pod \"2790408a-d33b-4746-bab6-7a3f131802f0\" (UID: \"2790408a-d33b-4746-bab6-7a3f131802f0\") " Mar 11 00:58:53 crc kubenswrapper[4744]: I0311 00:58:53.542285 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pkglg\" (UniqueName: \"kubernetes.io/projected/2790408a-d33b-4746-bab6-7a3f131802f0-kube-api-access-pkglg\") pod \"2790408a-d33b-4746-bab6-7a3f131802f0\" (UID: \"2790408a-d33b-4746-bab6-7a3f131802f0\") " Mar 11 00:58:53 crc kubenswrapper[4744]: I0311 00:58:53.542332 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/2790408a-d33b-4746-bab6-7a3f131802f0-proxy-ca-bundles\") pod \"2790408a-d33b-4746-bab6-7a3f131802f0\" (UID: \"2790408a-d33b-4746-bab6-7a3f131802f0\") " Mar 11 00:58:53 crc kubenswrapper[4744]: I0311 00:58:53.542359 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/552c19f6-b704-4e27-b31d-2e6a091d1b06-serving-cert\") pod \"552c19f6-b704-4e27-b31d-2e6a091d1b06\" (UID: \"552c19f6-b704-4e27-b31d-2e6a091d1b06\") " Mar 11 00:58:53 crc kubenswrapper[4744]: I0311 00:58:53.542396 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2790408a-d33b-4746-bab6-7a3f131802f0-serving-cert\") pod \"2790408a-d33b-4746-bab6-7a3f131802f0\" (UID: 
\"2790408a-d33b-4746-bab6-7a3f131802f0\") " Mar 11 00:58:53 crc kubenswrapper[4744]: I0311 00:58:53.542433 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/2790408a-d33b-4746-bab6-7a3f131802f0-client-ca\") pod \"2790408a-d33b-4746-bab6-7a3f131802f0\" (UID: \"2790408a-d33b-4746-bab6-7a3f131802f0\") " Mar 11 00:58:53 crc kubenswrapper[4744]: I0311 00:58:53.542468 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rvvhr\" (UniqueName: \"kubernetes.io/projected/552c19f6-b704-4e27-b31d-2e6a091d1b06-kube-api-access-rvvhr\") pod \"552c19f6-b704-4e27-b31d-2e6a091d1b06\" (UID: \"552c19f6-b704-4e27-b31d-2e6a091d1b06\") " Mar 11 00:58:53 crc kubenswrapper[4744]: I0311 00:58:53.542499 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/552c19f6-b704-4e27-b31d-2e6a091d1b06-config\") pod \"552c19f6-b704-4e27-b31d-2e6a091d1b06\" (UID: \"552c19f6-b704-4e27-b31d-2e6a091d1b06\") " Mar 11 00:58:53 crc kubenswrapper[4744]: I0311 00:58:53.542545 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/552c19f6-b704-4e27-b31d-2e6a091d1b06-client-ca\") pod \"552c19f6-b704-4e27-b31d-2e6a091d1b06\" (UID: \"552c19f6-b704-4e27-b31d-2e6a091d1b06\") " Mar 11 00:58:53 crc kubenswrapper[4744]: I0311 00:58:53.543442 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2790408a-d33b-4746-bab6-7a3f131802f0-config" (OuterVolumeSpecName: "config") pod "2790408a-d33b-4746-bab6-7a3f131802f0" (UID: "2790408a-d33b-4746-bab6-7a3f131802f0"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 11 00:58:53 crc kubenswrapper[4744]: I0311 00:58:53.543640 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2790408a-d33b-4746-bab6-7a3f131802f0-client-ca" (OuterVolumeSpecName: "client-ca") pod "2790408a-d33b-4746-bab6-7a3f131802f0" (UID: "2790408a-d33b-4746-bab6-7a3f131802f0"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 11 00:58:53 crc kubenswrapper[4744]: I0311 00:58:53.543976 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2790408a-d33b-4746-bab6-7a3f131802f0-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "2790408a-d33b-4746-bab6-7a3f131802f0" (UID: "2790408a-d33b-4746-bab6-7a3f131802f0"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 11 00:58:53 crc kubenswrapper[4744]: I0311 00:58:53.544764 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/552c19f6-b704-4e27-b31d-2e6a091d1b06-client-ca" (OuterVolumeSpecName: "client-ca") pod "552c19f6-b704-4e27-b31d-2e6a091d1b06" (UID: "552c19f6-b704-4e27-b31d-2e6a091d1b06"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 11 00:58:53 crc kubenswrapper[4744]: I0311 00:58:53.544766 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/552c19f6-b704-4e27-b31d-2e6a091d1b06-config" (OuterVolumeSpecName: "config") pod "552c19f6-b704-4e27-b31d-2e6a091d1b06" (UID: "552c19f6-b704-4e27-b31d-2e6a091d1b06"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 11 00:58:53 crc kubenswrapper[4744]: I0311 00:58:53.547404 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/552c19f6-b704-4e27-b31d-2e6a091d1b06-kube-api-access-rvvhr" (OuterVolumeSpecName: "kube-api-access-rvvhr") pod "552c19f6-b704-4e27-b31d-2e6a091d1b06" (UID: "552c19f6-b704-4e27-b31d-2e6a091d1b06"). InnerVolumeSpecName "kube-api-access-rvvhr". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 00:58:53 crc kubenswrapper[4744]: I0311 00:58:53.547501 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2790408a-d33b-4746-bab6-7a3f131802f0-kube-api-access-pkglg" (OuterVolumeSpecName: "kube-api-access-pkglg") pod "2790408a-d33b-4746-bab6-7a3f131802f0" (UID: "2790408a-d33b-4746-bab6-7a3f131802f0"). InnerVolumeSpecName "kube-api-access-pkglg". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 00:58:53 crc kubenswrapper[4744]: I0311 00:58:53.547603 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2790408a-d33b-4746-bab6-7a3f131802f0-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "2790408a-d33b-4746-bab6-7a3f131802f0" (UID: "2790408a-d33b-4746-bab6-7a3f131802f0"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 00:58:53 crc kubenswrapper[4744]: I0311 00:58:53.548781 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/552c19f6-b704-4e27-b31d-2e6a091d1b06-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "552c19f6-b704-4e27-b31d-2e6a091d1b06" (UID: "552c19f6-b704-4e27-b31d-2e6a091d1b06"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 00:58:53 crc kubenswrapper[4744]: I0311 00:58:53.643624 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r87wr\" (UniqueName: \"kubernetes.io/projected/dd0cbcba-6518-4fdf-be7f-ec2ec7dbbaf6-kube-api-access-r87wr\") pod \"controller-manager-84b5546df4-ckw47\" (UID: \"dd0cbcba-6518-4fdf-be7f-ec2ec7dbbaf6\") " pod="openshift-controller-manager/controller-manager-84b5546df4-ckw47" Mar 11 00:58:53 crc kubenswrapper[4744]: I0311 00:58:53.643903 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/dd0cbcba-6518-4fdf-be7f-ec2ec7dbbaf6-proxy-ca-bundles\") pod \"controller-manager-84b5546df4-ckw47\" (UID: \"dd0cbcba-6518-4fdf-be7f-ec2ec7dbbaf6\") " pod="openshift-controller-manager/controller-manager-84b5546df4-ckw47" Mar 11 00:58:53 crc kubenswrapper[4744]: I0311 00:58:53.643947 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/dd0cbcba-6518-4fdf-be7f-ec2ec7dbbaf6-config\") pod \"controller-manager-84b5546df4-ckw47\" (UID: \"dd0cbcba-6518-4fdf-be7f-ec2ec7dbbaf6\") " pod="openshift-controller-manager/controller-manager-84b5546df4-ckw47" Mar 11 00:58:53 crc kubenswrapper[4744]: I0311 00:58:53.643968 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/dd0cbcba-6518-4fdf-be7f-ec2ec7dbbaf6-serving-cert\") pod \"controller-manager-84b5546df4-ckw47\" (UID: \"dd0cbcba-6518-4fdf-be7f-ec2ec7dbbaf6\") " pod="openshift-controller-manager/controller-manager-84b5546df4-ckw47" Mar 11 00:58:53 crc kubenswrapper[4744]: I0311 00:58:53.643983 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" 
(UniqueName: \"kubernetes.io/configmap/dd0cbcba-6518-4fdf-be7f-ec2ec7dbbaf6-client-ca\") pod \"controller-manager-84b5546df4-ckw47\" (UID: \"dd0cbcba-6518-4fdf-be7f-ec2ec7dbbaf6\") " pod="openshift-controller-manager/controller-manager-84b5546df4-ckw47" Mar 11 00:58:53 crc kubenswrapper[4744]: I0311 00:58:53.644043 4744 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2790408a-d33b-4746-bab6-7a3f131802f0-config\") on node \"crc\" DevicePath \"\"" Mar 11 00:58:53 crc kubenswrapper[4744]: I0311 00:58:53.644055 4744 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pkglg\" (UniqueName: \"kubernetes.io/projected/2790408a-d33b-4746-bab6-7a3f131802f0-kube-api-access-pkglg\") on node \"crc\" DevicePath \"\"" Mar 11 00:58:53 crc kubenswrapper[4744]: I0311 00:58:53.644065 4744 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/2790408a-d33b-4746-bab6-7a3f131802f0-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Mar 11 00:58:53 crc kubenswrapper[4744]: I0311 00:58:53.644073 4744 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/552c19f6-b704-4e27-b31d-2e6a091d1b06-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 11 00:58:53 crc kubenswrapper[4744]: I0311 00:58:53.644083 4744 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2790408a-d33b-4746-bab6-7a3f131802f0-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 11 00:58:53 crc kubenswrapper[4744]: I0311 00:58:53.644091 4744 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/2790408a-d33b-4746-bab6-7a3f131802f0-client-ca\") on node \"crc\" DevicePath \"\"" Mar 11 00:58:53 crc kubenswrapper[4744]: I0311 00:58:53.644099 4744 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rvvhr\" 
(UniqueName: \"kubernetes.io/projected/552c19f6-b704-4e27-b31d-2e6a091d1b06-kube-api-access-rvvhr\") on node \"crc\" DevicePath \"\"" Mar 11 00:58:53 crc kubenswrapper[4744]: I0311 00:58:53.644108 4744 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/552c19f6-b704-4e27-b31d-2e6a091d1b06-config\") on node \"crc\" DevicePath \"\"" Mar 11 00:58:53 crc kubenswrapper[4744]: I0311 00:58:53.644116 4744 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/552c19f6-b704-4e27-b31d-2e6a091d1b06-client-ca\") on node \"crc\" DevicePath \"\"" Mar 11 00:58:53 crc kubenswrapper[4744]: I0311 00:58:53.745281 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/dd0cbcba-6518-4fdf-be7f-ec2ec7dbbaf6-config\") pod \"controller-manager-84b5546df4-ckw47\" (UID: \"dd0cbcba-6518-4fdf-be7f-ec2ec7dbbaf6\") " pod="openshift-controller-manager/controller-manager-84b5546df4-ckw47" Mar 11 00:58:53 crc kubenswrapper[4744]: I0311 00:58:53.745657 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/dd0cbcba-6518-4fdf-be7f-ec2ec7dbbaf6-serving-cert\") pod \"controller-manager-84b5546df4-ckw47\" (UID: \"dd0cbcba-6518-4fdf-be7f-ec2ec7dbbaf6\") " pod="openshift-controller-manager/controller-manager-84b5546df4-ckw47" Mar 11 00:58:53 crc kubenswrapper[4744]: I0311 00:58:53.745680 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/dd0cbcba-6518-4fdf-be7f-ec2ec7dbbaf6-client-ca\") pod \"controller-manager-84b5546df4-ckw47\" (UID: \"dd0cbcba-6518-4fdf-be7f-ec2ec7dbbaf6\") " pod="openshift-controller-manager/controller-manager-84b5546df4-ckw47" Mar 11 00:58:53 crc kubenswrapper[4744]: I0311 00:58:53.745727 4744 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-r87wr\" (UniqueName: \"kubernetes.io/projected/dd0cbcba-6518-4fdf-be7f-ec2ec7dbbaf6-kube-api-access-r87wr\") pod \"controller-manager-84b5546df4-ckw47\" (UID: \"dd0cbcba-6518-4fdf-be7f-ec2ec7dbbaf6\") " pod="openshift-controller-manager/controller-manager-84b5546df4-ckw47" Mar 11 00:58:53 crc kubenswrapper[4744]: I0311 00:58:53.745763 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/dd0cbcba-6518-4fdf-be7f-ec2ec7dbbaf6-proxy-ca-bundles\") pod \"controller-manager-84b5546df4-ckw47\" (UID: \"dd0cbcba-6518-4fdf-be7f-ec2ec7dbbaf6\") " pod="openshift-controller-manager/controller-manager-84b5546df4-ckw47" Mar 11 00:58:53 crc kubenswrapper[4744]: I0311 00:58:53.746751 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/dd0cbcba-6518-4fdf-be7f-ec2ec7dbbaf6-proxy-ca-bundles\") pod \"controller-manager-84b5546df4-ckw47\" (UID: \"dd0cbcba-6518-4fdf-be7f-ec2ec7dbbaf6\") " pod="openshift-controller-manager/controller-manager-84b5546df4-ckw47" Mar 11 00:58:53 crc kubenswrapper[4744]: I0311 00:58:53.747497 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/dd0cbcba-6518-4fdf-be7f-ec2ec7dbbaf6-client-ca\") pod \"controller-manager-84b5546df4-ckw47\" (UID: \"dd0cbcba-6518-4fdf-be7f-ec2ec7dbbaf6\") " pod="openshift-controller-manager/controller-manager-84b5546df4-ckw47" Mar 11 00:58:53 crc kubenswrapper[4744]: I0311 00:58:53.747622 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/dd0cbcba-6518-4fdf-be7f-ec2ec7dbbaf6-config\") pod \"controller-manager-84b5546df4-ckw47\" (UID: \"dd0cbcba-6518-4fdf-be7f-ec2ec7dbbaf6\") " pod="openshift-controller-manager/controller-manager-84b5546df4-ckw47" Mar 11 
00:58:53 crc kubenswrapper[4744]: I0311 00:58:53.750792 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/dd0cbcba-6518-4fdf-be7f-ec2ec7dbbaf6-serving-cert\") pod \"controller-manager-84b5546df4-ckw47\" (UID: \"dd0cbcba-6518-4fdf-be7f-ec2ec7dbbaf6\") " pod="openshift-controller-manager/controller-manager-84b5546df4-ckw47" Mar 11 00:58:53 crc kubenswrapper[4744]: I0311 00:58:53.768216 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r87wr\" (UniqueName: \"kubernetes.io/projected/dd0cbcba-6518-4fdf-be7f-ec2ec7dbbaf6-kube-api-access-r87wr\") pod \"controller-manager-84b5546df4-ckw47\" (UID: \"dd0cbcba-6518-4fdf-be7f-ec2ec7dbbaf6\") " pod="openshift-controller-manager/controller-manager-84b5546df4-ckw47" Mar 11 00:58:53 crc kubenswrapper[4744]: I0311 00:58:53.843133 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-84b5546df4-ckw47" Mar 11 00:58:54 crc kubenswrapper[4744]: I0311 00:58:54.005352 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-9-crc"] Mar 11 00:58:54 crc kubenswrapper[4744]: W0311 00:58:54.021104 4744 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-podea7cd693_24e8_47b0_922e_ebf91592b69f.slice/crio-0f6f51aaa47027b2253a4cbf6980dc8104c0fcdb428a851aa35600abf9f2b91f WatchSource:0}: Error finding container 0f6f51aaa47027b2253a4cbf6980dc8104c0fcdb428a851aa35600abf9f2b91f: Status 404 returned error can't find the container with id 0f6f51aaa47027b2253a4cbf6980dc8104c0fcdb428a851aa35600abf9f2b91f Mar 11 00:58:54 crc kubenswrapper[4744]: I0311 00:58:54.221253 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-84b5546df4-ckw47"] Mar 11 00:58:54 crc kubenswrapper[4744]: I0311 00:58:54.228870 4744 util.go:48] "No ready 
sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-7cc9895f8f-mm6b7" Mar 11 00:58:54 crc kubenswrapper[4744]: I0311 00:58:54.229256 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-7cc9895f8f-mm6b7" event={"ID":"2790408a-d33b-4746-bab6-7a3f131802f0","Type":"ContainerDied","Data":"4c08bd8a6bd9c558a5e96f7eb3659b93417b635e55b581f11ea58cab8cea115d"} Mar 11 00:58:54 crc kubenswrapper[4744]: I0311 00:58:54.229299 4744 scope.go:117] "RemoveContainer" containerID="f627eb963b026b9ed0c404ffcb57c787e09a1379cbf73df802a25cc7cb862a78" Mar 11 00:58:54 crc kubenswrapper[4744]: W0311 00:58:54.229740 4744 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poddd0cbcba_6518_4fdf_be7f_ec2ec7dbbaf6.slice/crio-3b6fc0f95a612b2dbc8b1b25caaf7862d5f634d4e7850d2cb995d76f41eae72d WatchSource:0}: Error finding container 3b6fc0f95a612b2dbc8b1b25caaf7862d5f634d4e7850d2cb995d76f41eae72d: Status 404 returned error can't find the container with id 3b6fc0f95a612b2dbc8b1b25caaf7862d5f634d4e7850d2cb995d76f41eae72d Mar 11 00:58:54 crc kubenswrapper[4744]: I0311 00:58:54.231960 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-677589fdb8-qqrb8" event={"ID":"552c19f6-b704-4e27-b31d-2e6a091d1b06","Type":"ContainerDied","Data":"fccfe8befa9c08cd529643c21f4f7cbcfdd58f6d3267e9304a7823519e20db2a"} Mar 11 00:58:54 crc kubenswrapper[4744]: I0311 00:58:54.231973 4744 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-677589fdb8-qqrb8" Mar 11 00:58:54 crc kubenswrapper[4744]: I0311 00:58:54.235166 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-6jf7l" event={"ID":"e1b5d764-9e1d-4be7-b365-85482c4e0def","Type":"ContainerStarted","Data":"e0968aa306b02f8263ae0a7506feb8acfadbf47e0075227db7d4516dd4775148"} Mar 11 00:58:54 crc kubenswrapper[4744]: I0311 00:58:54.235882 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"ea7cd693-24e8-47b0-922e-ebf91592b69f","Type":"ContainerStarted","Data":"0f6f51aaa47027b2253a4cbf6980dc8104c0fcdb428a851aa35600abf9f2b91f"} Mar 11 00:58:54 crc kubenswrapper[4744]: I0311 00:58:54.283808 4744 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-677589fdb8-qqrb8"] Mar 11 00:58:54 crc kubenswrapper[4744]: I0311 00:58:54.286550 4744 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-677589fdb8-qqrb8"] Mar 11 00:58:54 crc kubenswrapper[4744]: I0311 00:58:54.297611 4744 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-7cc9895f8f-mm6b7"] Mar 11 00:58:54 crc kubenswrapper[4744]: I0311 00:58:54.299811 4744 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-7cc9895f8f-mm6b7"] Mar 11 00:58:54 crc kubenswrapper[4744]: I0311 00:58:54.373990 4744 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2027-02-24 05:54:36 +0000 UTC, rotation deadline is 2027-01-17 08:53:06.875073938 +0000 UTC Mar 11 00:58:54 crc kubenswrapper[4744]: I0311 00:58:54.374021 4744 certificate_manager.go:356] kubernetes.io/kubelet-serving: Waiting 7495h54m12.501055751s for next certificate rotation Mar 11 00:58:54 crc 
kubenswrapper[4744]: I0311 00:58:54.465339 4744 scope.go:117] "RemoveContainer" containerID="e3887719f8b01941812abe7807617bca905de3fd7169e6d57f4d7726f3a90bb2" Mar 11 00:58:54 crc kubenswrapper[4744]: I0311 00:58:54.505273 4744 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29553178-56m2v" Mar 11 00:58:54 crc kubenswrapper[4744]: I0311 00:58:54.655804 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rcql9\" (UniqueName: \"kubernetes.io/projected/95897da0-81a7-4656-9787-808f64d7aa9d-kube-api-access-rcql9\") pod \"95897da0-81a7-4656-9787-808f64d7aa9d\" (UID: \"95897da0-81a7-4656-9787-808f64d7aa9d\") " Mar 11 00:58:54 crc kubenswrapper[4744]: I0311 00:58:54.662414 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/95897da0-81a7-4656-9787-808f64d7aa9d-kube-api-access-rcql9" (OuterVolumeSpecName: "kube-api-access-rcql9") pod "95897da0-81a7-4656-9787-808f64d7aa9d" (UID: "95897da0-81a7-4656-9787-808f64d7aa9d"). InnerVolumeSpecName "kube-api-access-rcql9". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 00:58:54 crc kubenswrapper[4744]: I0311 00:58:54.758296 4744 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rcql9\" (UniqueName: \"kubernetes.io/projected/95897da0-81a7-4656-9787-808f64d7aa9d-kube-api-access-rcql9\") on node \"crc\" DevicePath \"\"" Mar 11 00:58:55 crc kubenswrapper[4744]: I0311 00:58:55.245673 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-84b5546df4-ckw47" event={"ID":"dd0cbcba-6518-4fdf-be7f-ec2ec7dbbaf6","Type":"ContainerStarted","Data":"3b6fc0f95a612b2dbc8b1b25caaf7862d5f634d4e7850d2cb995d76f41eae72d"} Mar 11 00:58:55 crc kubenswrapper[4744]: I0311 00:58:55.247653 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29553178-56m2v" event={"ID":"95897da0-81a7-4656-9787-808f64d7aa9d","Type":"ContainerDied","Data":"e06843b6821d9669c3d6d83536b36f8e8c1958a0229a416209e722e09fa6e113"} Mar 11 00:58:55 crc kubenswrapper[4744]: I0311 00:58:55.247678 4744 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e06843b6821d9669c3d6d83536b36f8e8c1958a0229a416209e722e09fa6e113" Mar 11 00:58:55 crc kubenswrapper[4744]: I0311 00:58:55.247712 4744 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29553178-56m2v" Mar 11 00:58:55 crc kubenswrapper[4744]: I0311 00:58:55.268184 4744 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-6jf7l" podStartSLOduration=4.454889325 podStartE2EDuration="38.268162049s" podCreationTimestamp="2026-03-11 00:58:17 +0000 UTC" firstStartedPulling="2026-03-11 00:58:19.830327859 +0000 UTC m=+256.634545464" lastFinishedPulling="2026-03-11 00:58:53.643600583 +0000 UTC m=+290.447818188" observedRunningTime="2026-03-11 00:58:55.26554202 +0000 UTC m=+292.069759626" watchObservedRunningTime="2026-03-11 00:58:55.268162049 +0000 UTC m=+292.072379674" Mar 11 00:58:55 crc kubenswrapper[4744]: I0311 00:58:55.981902 4744 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2790408a-d33b-4746-bab6-7a3f131802f0" path="/var/lib/kubelet/pods/2790408a-d33b-4746-bab6-7a3f131802f0/volumes" Mar 11 00:58:55 crc kubenswrapper[4744]: I0311 00:58:55.982922 4744 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="552c19f6-b704-4e27-b31d-2e6a091d1b06" path="/var/lib/kubelet/pods/552c19f6-b704-4e27-b31d-2e6a091d1b06/volumes" Mar 11 00:58:56 crc kubenswrapper[4744]: I0311 00:58:56.255227 4744 generic.go:334] "Generic (PLEG): container finished" podID="ea7cd693-24e8-47b0-922e-ebf91592b69f" containerID="fac5eadcc5399e1f7a61d226b81c744061008a605f2392c10c0b4e662541600a" exitCode=0 Mar 11 00:58:56 crc kubenswrapper[4744]: I0311 00:58:56.255493 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"ea7cd693-24e8-47b0-922e-ebf91592b69f","Type":"ContainerDied","Data":"fac5eadcc5399e1f7a61d226b81c744061008a605f2392c10c0b4e662541600a"} Mar 11 00:58:56 crc kubenswrapper[4744]: I0311 00:58:56.257810 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-pfjjz" 
event={"ID":"9b7117a9-f857-4f17-a2e8-13bd999e4fe2","Type":"ContainerStarted","Data":"d67f0f6e8bfeebb833b490f9ee4451b24587ed964289fc769c049cfb42a6eda8"} Mar 11 00:58:56 crc kubenswrapper[4744]: I0311 00:58:56.258864 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-84b5546df4-ckw47" event={"ID":"dd0cbcba-6518-4fdf-be7f-ec2ec7dbbaf6","Type":"ContainerStarted","Data":"67ef97523c51d73e79e5d0e53184c3ea1b093c3a361d11977b27894fae53a3b3"} Mar 11 00:58:56 crc kubenswrapper[4744]: I0311 00:58:56.259126 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-84b5546df4-ckw47" Mar 11 00:58:56 crc kubenswrapper[4744]: I0311 00:58:56.263502 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-84b5546df4-ckw47" Mar 11 00:58:56 crc kubenswrapper[4744]: I0311 00:58:56.293496 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-bf64b8966-dgsll"] Mar 11 00:58:56 crc kubenswrapper[4744]: E0311 00:58:56.293859 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="95897da0-81a7-4656-9787-808f64d7aa9d" containerName="oc" Mar 11 00:58:56 crc kubenswrapper[4744]: I0311 00:58:56.293872 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="95897da0-81a7-4656-9787-808f64d7aa9d" containerName="oc" Mar 11 00:58:56 crc kubenswrapper[4744]: I0311 00:58:56.294012 4744 memory_manager.go:354] "RemoveStaleState removing state" podUID="95897da0-81a7-4656-9787-808f64d7aa9d" containerName="oc" Mar 11 00:58:56 crc kubenswrapper[4744]: I0311 00:58:56.294381 4744 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-bf64b8966-dgsll" Mar 11 00:58:56 crc kubenswrapper[4744]: I0311 00:58:56.295420 4744 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-pfjjz" podStartSLOduration=3.761933555 podStartE2EDuration="41.295397667s" podCreationTimestamp="2026-03-11 00:58:15 +0000 UTC" firstStartedPulling="2026-03-11 00:58:17.633359132 +0000 UTC m=+254.437576747" lastFinishedPulling="2026-03-11 00:58:55.166823254 +0000 UTC m=+291.971040859" observedRunningTime="2026-03-11 00:58:56.293552408 +0000 UTC m=+293.097770013" watchObservedRunningTime="2026-03-11 00:58:56.295397667 +0000 UTC m=+293.099615272" Mar 11 00:58:56 crc kubenswrapper[4744]: I0311 00:58:56.297581 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Mar 11 00:58:56 crc kubenswrapper[4744]: I0311 00:58:56.297730 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Mar 11 00:58:56 crc kubenswrapper[4744]: I0311 00:58:56.297743 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Mar 11 00:58:56 crc kubenswrapper[4744]: I0311 00:58:56.297875 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Mar 11 00:58:56 crc kubenswrapper[4744]: I0311 00:58:56.299254 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Mar 11 00:58:56 crc kubenswrapper[4744]: I0311 00:58:56.309396 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Mar 11 00:58:56 crc kubenswrapper[4744]: I0311 00:58:56.314700 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openshift-route-controller-manager/route-controller-manager-bf64b8966-dgsll"] Mar 11 00:58:56 crc kubenswrapper[4744]: I0311 00:58:56.368810 4744 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-84b5546df4-ckw47" podStartSLOduration=5.368777692 podStartE2EDuration="5.368777692s" podCreationTimestamp="2026-03-11 00:58:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-11 00:58:56.334654621 +0000 UTC m=+293.138872226" watchObservedRunningTime="2026-03-11 00:58:56.368777692 +0000 UTC m=+293.172995287" Mar 11 00:58:56 crc kubenswrapper[4744]: I0311 00:58:56.382494 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/1352afda-75ac-4f8a-af81-28cb34d11373-client-ca\") pod \"route-controller-manager-bf64b8966-dgsll\" (UID: \"1352afda-75ac-4f8a-af81-28cb34d11373\") " pod="openshift-route-controller-manager/route-controller-manager-bf64b8966-dgsll" Mar 11 00:58:56 crc kubenswrapper[4744]: I0311 00:58:56.382776 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7zj7s\" (UniqueName: \"kubernetes.io/projected/1352afda-75ac-4f8a-af81-28cb34d11373-kube-api-access-7zj7s\") pod \"route-controller-manager-bf64b8966-dgsll\" (UID: \"1352afda-75ac-4f8a-af81-28cb34d11373\") " pod="openshift-route-controller-manager/route-controller-manager-bf64b8966-dgsll" Mar 11 00:58:56 crc kubenswrapper[4744]: I0311 00:58:56.382891 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1352afda-75ac-4f8a-af81-28cb34d11373-config\") pod \"route-controller-manager-bf64b8966-dgsll\" (UID: \"1352afda-75ac-4f8a-af81-28cb34d11373\") " 
pod="openshift-route-controller-manager/route-controller-manager-bf64b8966-dgsll" Mar 11 00:58:56 crc kubenswrapper[4744]: I0311 00:58:56.382970 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1352afda-75ac-4f8a-af81-28cb34d11373-serving-cert\") pod \"route-controller-manager-bf64b8966-dgsll\" (UID: \"1352afda-75ac-4f8a-af81-28cb34d11373\") " pod="openshift-route-controller-manager/route-controller-manager-bf64b8966-dgsll" Mar 11 00:58:56 crc kubenswrapper[4744]: I0311 00:58:56.484206 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1352afda-75ac-4f8a-af81-28cb34d11373-config\") pod \"route-controller-manager-bf64b8966-dgsll\" (UID: \"1352afda-75ac-4f8a-af81-28cb34d11373\") " pod="openshift-route-controller-manager/route-controller-manager-bf64b8966-dgsll" Mar 11 00:58:56 crc kubenswrapper[4744]: I0311 00:58:56.484248 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1352afda-75ac-4f8a-af81-28cb34d11373-serving-cert\") pod \"route-controller-manager-bf64b8966-dgsll\" (UID: \"1352afda-75ac-4f8a-af81-28cb34d11373\") " pod="openshift-route-controller-manager/route-controller-manager-bf64b8966-dgsll" Mar 11 00:58:56 crc kubenswrapper[4744]: I0311 00:58:56.484283 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/1352afda-75ac-4f8a-af81-28cb34d11373-client-ca\") pod \"route-controller-manager-bf64b8966-dgsll\" (UID: \"1352afda-75ac-4f8a-af81-28cb34d11373\") " pod="openshift-route-controller-manager/route-controller-manager-bf64b8966-dgsll" Mar 11 00:58:56 crc kubenswrapper[4744]: I0311 00:58:56.484330 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7zj7s\" (UniqueName: 
\"kubernetes.io/projected/1352afda-75ac-4f8a-af81-28cb34d11373-kube-api-access-7zj7s\") pod \"route-controller-manager-bf64b8966-dgsll\" (UID: \"1352afda-75ac-4f8a-af81-28cb34d11373\") " pod="openshift-route-controller-manager/route-controller-manager-bf64b8966-dgsll" Mar 11 00:58:56 crc kubenswrapper[4744]: I0311 00:58:56.485743 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1352afda-75ac-4f8a-af81-28cb34d11373-config\") pod \"route-controller-manager-bf64b8966-dgsll\" (UID: \"1352afda-75ac-4f8a-af81-28cb34d11373\") " pod="openshift-route-controller-manager/route-controller-manager-bf64b8966-dgsll" Mar 11 00:58:56 crc kubenswrapper[4744]: I0311 00:58:56.485774 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/1352afda-75ac-4f8a-af81-28cb34d11373-client-ca\") pod \"route-controller-manager-bf64b8966-dgsll\" (UID: \"1352afda-75ac-4f8a-af81-28cb34d11373\") " pod="openshift-route-controller-manager/route-controller-manager-bf64b8966-dgsll" Mar 11 00:58:56 crc kubenswrapper[4744]: I0311 00:58:56.494656 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-nvxtd" Mar 11 00:58:56 crc kubenswrapper[4744]: I0311 00:58:56.494686 4744 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-nvxtd" Mar 11 00:58:56 crc kubenswrapper[4744]: I0311 00:58:56.501085 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7zj7s\" (UniqueName: \"kubernetes.io/projected/1352afda-75ac-4f8a-af81-28cb34d11373-kube-api-access-7zj7s\") pod \"route-controller-manager-bf64b8966-dgsll\" (UID: \"1352afda-75ac-4f8a-af81-28cb34d11373\") " pod="openshift-route-controller-manager/route-controller-manager-bf64b8966-dgsll" Mar 11 00:58:56 crc kubenswrapper[4744]: I0311 00:58:56.501723 4744 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1352afda-75ac-4f8a-af81-28cb34d11373-serving-cert\") pod \"route-controller-manager-bf64b8966-dgsll\" (UID: \"1352afda-75ac-4f8a-af81-28cb34d11373\") " pod="openshift-route-controller-manager/route-controller-manager-bf64b8966-dgsll" Mar 11 00:58:56 crc kubenswrapper[4744]: I0311 00:58:56.609153 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-bf64b8966-dgsll" Mar 11 00:58:56 crc kubenswrapper[4744]: I0311 00:58:56.873284 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-bf64b8966-dgsll"] Mar 11 00:58:57 crc kubenswrapper[4744]: I0311 00:58:57.265828 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-bf64b8966-dgsll" event={"ID":"1352afda-75ac-4f8a-af81-28cb34d11373","Type":"ContainerStarted","Data":"2ae15322d2f83fc05b4f04924512996418b2e3f9cf00d651d3519297adcaf6c3"} Mar 11 00:58:57 crc kubenswrapper[4744]: I0311 00:58:57.266112 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-bf64b8966-dgsll" event={"ID":"1352afda-75ac-4f8a-af81-28cb34d11373","Type":"ContainerStarted","Data":"31eb3785b0ab1b00f6d0e4b1b6162d7ed15a97f0c475578bb354ecf2bc293966"} Mar 11 00:58:57 crc kubenswrapper[4744]: I0311 00:58:57.266132 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-bf64b8966-dgsll" Mar 11 00:58:57 crc kubenswrapper[4744]: I0311 00:58:57.267743 4744 generic.go:334] "Generic (PLEG): container finished" podID="7c454621-190e-4962-abed-72c0ec0613de" containerID="f345e8a8f84ce6cefafb28ca43e1fdf81eff4b948e882887f03dc2198bb68c8d" exitCode=0 Mar 11 00:58:57 crc kubenswrapper[4744]: I0311 
00:58:57.268291 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29553176-m7tmh" event={"ID":"7c454621-190e-4962-abed-72c0ec0613de","Type":"ContainerDied","Data":"f345e8a8f84ce6cefafb28ca43e1fdf81eff4b948e882887f03dc2198bb68c8d"} Mar 11 00:58:57 crc kubenswrapper[4744]: I0311 00:58:57.284953 4744 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-bf64b8966-dgsll" podStartSLOduration=6.28493929 podStartE2EDuration="6.28493929s" podCreationTimestamp="2026-03-11 00:58:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-11 00:58:57.284328637 +0000 UTC m=+294.088546242" watchObservedRunningTime="2026-03-11 00:58:57.28493929 +0000 UTC m=+294.089156885" Mar 11 00:58:57 crc kubenswrapper[4744]: I0311 00:58:57.403665 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-bf64b8966-dgsll" Mar 11 00:58:57 crc kubenswrapper[4744]: I0311 00:58:57.615353 4744 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 11 00:58:57 crc kubenswrapper[4744]: I0311 00:58:57.623643 4744 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-marketplace-nvxtd" podUID="5ce17bbe-ec69-4349-acbc-4e99fcfb894f" containerName="registry-server" probeResult="failure" output=< Mar 11 00:58:57 crc kubenswrapper[4744]: timeout: failed to connect service ":50051" within 1s Mar 11 00:58:57 crc kubenswrapper[4744]: > Mar 11 00:58:57 crc kubenswrapper[4744]: I0311 00:58:57.707926 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/ea7cd693-24e8-47b0-922e-ebf91592b69f-kubelet-dir\") pod \"ea7cd693-24e8-47b0-922e-ebf91592b69f\" (UID: \"ea7cd693-24e8-47b0-922e-ebf91592b69f\") " Mar 11 00:58:57 crc kubenswrapper[4744]: I0311 00:58:57.708244 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/ea7cd693-24e8-47b0-922e-ebf91592b69f-kube-api-access\") pod \"ea7cd693-24e8-47b0-922e-ebf91592b69f\" (UID: \"ea7cd693-24e8-47b0-922e-ebf91592b69f\") " Mar 11 00:58:57 crc kubenswrapper[4744]: I0311 00:58:57.708053 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/ea7cd693-24e8-47b0-922e-ebf91592b69f-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "ea7cd693-24e8-47b0-922e-ebf91592b69f" (UID: "ea7cd693-24e8-47b0-922e-ebf91592b69f"). InnerVolumeSpecName "kubelet-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 11 00:58:57 crc kubenswrapper[4744]: I0311 00:58:57.708561 4744 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/ea7cd693-24e8-47b0-922e-ebf91592b69f-kubelet-dir\") on node \"crc\" DevicePath \"\"" Mar 11 00:58:57 crc kubenswrapper[4744]: I0311 00:58:57.716045 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ea7cd693-24e8-47b0-922e-ebf91592b69f-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "ea7cd693-24e8-47b0-922e-ebf91592b69f" (UID: "ea7cd693-24e8-47b0-922e-ebf91592b69f"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 00:58:57 crc kubenswrapper[4744]: I0311 00:58:57.809378 4744 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/ea7cd693-24e8-47b0-922e-ebf91592b69f-kube-api-access\") on node \"crc\" DevicePath \"\"" Mar 11 00:58:57 crc kubenswrapper[4744]: I0311 00:58:57.904794 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-6jf7l" Mar 11 00:58:57 crc kubenswrapper[4744]: I0311 00:58:57.904840 4744 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-6jf7l" Mar 11 00:58:58 crc kubenswrapper[4744]: I0311 00:58:58.275265 4744 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 11 00:58:58 crc kubenswrapper[4744]: I0311 00:58:58.275315 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"ea7cd693-24e8-47b0-922e-ebf91592b69f","Type":"ContainerDied","Data":"0f6f51aaa47027b2253a4cbf6980dc8104c0fcdb428a851aa35600abf9f2b91f"} Mar 11 00:58:58 crc kubenswrapper[4744]: I0311 00:58:58.275382 4744 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0f6f51aaa47027b2253a4cbf6980dc8104c0fcdb428a851aa35600abf9f2b91f" Mar 11 00:58:58 crc kubenswrapper[4744]: I0311 00:58:58.405365 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/installer-9-crc"] Mar 11 00:58:58 crc kubenswrapper[4744]: E0311 00:58:58.405854 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ea7cd693-24e8-47b0-922e-ebf91592b69f" containerName="pruner" Mar 11 00:58:58 crc kubenswrapper[4744]: I0311 00:58:58.405885 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="ea7cd693-24e8-47b0-922e-ebf91592b69f" containerName="pruner" Mar 11 00:58:58 crc kubenswrapper[4744]: I0311 00:58:58.406087 4744 memory_manager.go:354] "RemoveStaleState removing state" podUID="ea7cd693-24e8-47b0-922e-ebf91592b69f" containerName="pruner" Mar 11 00:58:58 crc kubenswrapper[4744]: I0311 00:58:58.406820 4744 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Mar 11 00:58:58 crc kubenswrapper[4744]: I0311 00:58:58.411990 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver"/"installer-sa-dockercfg-5pr6n" Mar 11 00:58:58 crc kubenswrapper[4744]: I0311 00:58:58.420927 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/installer-9-crc"] Mar 11 00:58:58 crc kubenswrapper[4744]: I0311 00:58:58.428468 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver"/"kube-root-ca.crt" Mar 11 00:58:58 crc kubenswrapper[4744]: I0311 00:58:58.519576 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/516c0d23-985b-4de5-9b7c-c7651922d5d1-kubelet-dir\") pod \"installer-9-crc\" (UID: \"516c0d23-985b-4de5-9b7c-c7651922d5d1\") " pod="openshift-kube-apiserver/installer-9-crc" Mar 11 00:58:58 crc kubenswrapper[4744]: I0311 00:58:58.519838 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/516c0d23-985b-4de5-9b7c-c7651922d5d1-kube-api-access\") pod \"installer-9-crc\" (UID: \"516c0d23-985b-4de5-9b7c-c7651922d5d1\") " pod="openshift-kube-apiserver/installer-9-crc" Mar 11 00:58:58 crc kubenswrapper[4744]: I0311 00:58:58.519906 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/516c0d23-985b-4de5-9b7c-c7651922d5d1-var-lock\") pod \"installer-9-crc\" (UID: \"516c0d23-985b-4de5-9b7c-c7651922d5d1\") " pod="openshift-kube-apiserver/installer-9-crc" Mar 11 00:58:58 crc kubenswrapper[4744]: I0311 00:58:58.566667 4744 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29553176-m7tmh" Mar 11 00:58:58 crc kubenswrapper[4744]: I0311 00:58:58.621376 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/516c0d23-985b-4de5-9b7c-c7651922d5d1-kubelet-dir\") pod \"installer-9-crc\" (UID: \"516c0d23-985b-4de5-9b7c-c7651922d5d1\") " pod="openshift-kube-apiserver/installer-9-crc" Mar 11 00:58:58 crc kubenswrapper[4744]: I0311 00:58:58.621460 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/516c0d23-985b-4de5-9b7c-c7651922d5d1-kube-api-access\") pod \"installer-9-crc\" (UID: \"516c0d23-985b-4de5-9b7c-c7651922d5d1\") " pod="openshift-kube-apiserver/installer-9-crc" Mar 11 00:58:58 crc kubenswrapper[4744]: I0311 00:58:58.621502 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/516c0d23-985b-4de5-9b7c-c7651922d5d1-var-lock\") pod \"installer-9-crc\" (UID: \"516c0d23-985b-4de5-9b7c-c7651922d5d1\") " pod="openshift-kube-apiserver/installer-9-crc" Mar 11 00:58:58 crc kubenswrapper[4744]: I0311 00:58:58.621652 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/516c0d23-985b-4de5-9b7c-c7651922d5d1-var-lock\") pod \"installer-9-crc\" (UID: \"516c0d23-985b-4de5-9b7c-c7651922d5d1\") " pod="openshift-kube-apiserver/installer-9-crc" Mar 11 00:58:58 crc kubenswrapper[4744]: I0311 00:58:58.621700 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/516c0d23-985b-4de5-9b7c-c7651922d5d1-kubelet-dir\") pod \"installer-9-crc\" (UID: \"516c0d23-985b-4de5-9b7c-c7651922d5d1\") " pod="openshift-kube-apiserver/installer-9-crc" Mar 11 00:58:58 crc kubenswrapper[4744]: I0311 00:58:58.649610 4744 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/516c0d23-985b-4de5-9b7c-c7651922d5d1-kube-api-access\") pod \"installer-9-crc\" (UID: \"516c0d23-985b-4de5-9b7c-c7651922d5d1\") " pod="openshift-kube-apiserver/installer-9-crc" Mar 11 00:58:58 crc kubenswrapper[4744]: I0311 00:58:58.722334 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dchkc\" (UniqueName: \"kubernetes.io/projected/7c454621-190e-4962-abed-72c0ec0613de-kube-api-access-dchkc\") pod \"7c454621-190e-4962-abed-72c0ec0613de\" (UID: \"7c454621-190e-4962-abed-72c0ec0613de\") " Mar 11 00:58:58 crc kubenswrapper[4744]: I0311 00:58:58.728954 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7c454621-190e-4962-abed-72c0ec0613de-kube-api-access-dchkc" (OuterVolumeSpecName: "kube-api-access-dchkc") pod "7c454621-190e-4962-abed-72c0ec0613de" (UID: "7c454621-190e-4962-abed-72c0ec0613de"). InnerVolumeSpecName "kube-api-access-dchkc". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 00:58:58 crc kubenswrapper[4744]: I0311 00:58:58.741140 4744 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Mar 11 00:58:58 crc kubenswrapper[4744]: I0311 00:58:58.824378 4744 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dchkc\" (UniqueName: \"kubernetes.io/projected/7c454621-190e-4962-abed-72c0ec0613de-kube-api-access-dchkc\") on node \"crc\" DevicePath \"\"" Mar 11 00:58:58 crc kubenswrapper[4744]: I0311 00:58:58.950264 4744 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-6jf7l" podUID="e1b5d764-9e1d-4be7-b365-85482c4e0def" containerName="registry-server" probeResult="failure" output=< Mar 11 00:58:58 crc kubenswrapper[4744]: timeout: failed to connect service ":50051" within 1s Mar 11 00:58:58 crc kubenswrapper[4744]: > Mar 11 00:58:59 crc kubenswrapper[4744]: I0311 00:58:59.171812 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/installer-9-crc"] Mar 11 00:58:59 crc kubenswrapper[4744]: W0311 00:58:59.184874 4744 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod516c0d23_985b_4de5_9b7c_c7651922d5d1.slice/crio-ad89c494731c6635289d395edc076681cbc26bffab17b54a68ec1d003a784e99 WatchSource:0}: Error finding container ad89c494731c6635289d395edc076681cbc26bffab17b54a68ec1d003a784e99: Status 404 returned error can't find the container with id ad89c494731c6635289d395edc076681cbc26bffab17b54a68ec1d003a784e99 Mar 11 00:58:59 crc kubenswrapper[4744]: I0311 00:58:59.286527 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"516c0d23-985b-4de5-9b7c-c7651922d5d1","Type":"ContainerStarted","Data":"ad89c494731c6635289d395edc076681cbc26bffab17b54a68ec1d003a784e99"} Mar 11 00:58:59 crc kubenswrapper[4744]: I0311 00:58:59.289009 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29553176-m7tmh" 
event={"ID":"7c454621-190e-4962-abed-72c0ec0613de","Type":"ContainerDied","Data":"48a9bc05b7bcbb0d30ebd3c256d0b20a51f77b6070ef776ebfcca242dcc7d7c1"} Mar 11 00:58:59 crc kubenswrapper[4744]: I0311 00:58:59.289061 4744 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="48a9bc05b7bcbb0d30ebd3c256d0b20a51f77b6070ef776ebfcca242dcc7d7c1" Mar 11 00:58:59 crc kubenswrapper[4744]: I0311 00:58:59.289016 4744 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29553176-m7tmh" Mar 11 00:59:00 crc kubenswrapper[4744]: I0311 00:59:00.296998 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"516c0d23-985b-4de5-9b7c-c7651922d5d1","Type":"ContainerStarted","Data":"7821ae34742746b27397c69d882161b112c63a34c60bd33d3db066b2a792afdf"} Mar 11 00:59:00 crc kubenswrapper[4744]: I0311 00:59:00.332095 4744 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/installer-9-crc" podStartSLOduration=2.332068386 podStartE2EDuration="2.332068386s" podCreationTimestamp="2026-03-11 00:58:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-11 00:59:00.326964525 +0000 UTC m=+297.131182170" watchObservedRunningTime="2026-03-11 00:59:00.332068386 +0000 UTC m=+297.136286031" Mar 11 00:59:03 crc kubenswrapper[4744]: I0311 00:59:03.318772 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-wr9vs" event={"ID":"b57e3e22-ee77-4a48-b62a-1a5ff5394362","Type":"ContainerStarted","Data":"90c8abcdc6ae4516a76318070f74e18a838b1f10028b2809123d2b48042159f2"} Mar 11 00:59:03 crc kubenswrapper[4744]: I0311 00:59:03.321851 4744 generic.go:334] "Generic (PLEG): container finished" podID="9c559a48-ac87-4cab-848d-f2f647f8396b" 
containerID="f3bc07a72311c21e3f1810c1662b7a11d908ebce4207cf275097752baa2138ef" exitCode=0 Mar 11 00:59:03 crc kubenswrapper[4744]: I0311 00:59:03.321891 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-kk7lx" event={"ID":"9c559a48-ac87-4cab-848d-f2f647f8396b","Type":"ContainerDied","Data":"f3bc07a72311c21e3f1810c1662b7a11d908ebce4207cf275097752baa2138ef"} Mar 11 00:59:04 crc kubenswrapper[4744]: I0311 00:59:04.339721 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-n9qr2" event={"ID":"a870760b-88e5-4526-8f91-ef89201e2a13","Type":"ContainerStarted","Data":"76bbf0d81633fc0743ac72563b97b34cc469b656a84794d503b82ed5223e9d0c"} Mar 11 00:59:04 crc kubenswrapper[4744]: I0311 00:59:04.341991 4744 generic.go:334] "Generic (PLEG): container finished" podID="b57e3e22-ee77-4a48-b62a-1a5ff5394362" containerID="90c8abcdc6ae4516a76318070f74e18a838b1f10028b2809123d2b48042159f2" exitCode=0 Mar 11 00:59:04 crc kubenswrapper[4744]: I0311 00:59:04.342046 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-wr9vs" event={"ID":"b57e3e22-ee77-4a48-b62a-1a5ff5394362","Type":"ContainerDied","Data":"90c8abcdc6ae4516a76318070f74e18a838b1f10028b2809123d2b48042159f2"} Mar 11 00:59:04 crc kubenswrapper[4744]: I0311 00:59:04.347497 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-kk7lx" event={"ID":"9c559a48-ac87-4cab-848d-f2f647f8396b","Type":"ContainerStarted","Data":"d4d3fb4bb84b3def82fa38eb030f935e21e6a5a538100ba085ab3ef0c4bd0d91"} Mar 11 00:59:04 crc kubenswrapper[4744]: I0311 00:59:04.378567 4744 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-kk7lx" Mar 11 00:59:04 crc kubenswrapper[4744]: I0311 00:59:04.379076 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openshift-marketplace/community-operators-kk7lx" Mar 11 00:59:04 crc kubenswrapper[4744]: I0311 00:59:04.419411 4744 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-kk7lx" podStartSLOduration=3.138079784 podStartE2EDuration="51.419386517s" podCreationTimestamp="2026-03-11 00:58:13 +0000 UTC" firstStartedPulling="2026-03-11 00:58:15.49046148 +0000 UTC m=+252.294679085" lastFinishedPulling="2026-03-11 00:59:03.771768173 +0000 UTC m=+300.575985818" observedRunningTime="2026-03-11 00:59:04.41865264 +0000 UTC m=+301.222870315" watchObservedRunningTime="2026-03-11 00:59:04.419386517 +0000 UTC m=+301.223604152" Mar 11 00:59:05 crc kubenswrapper[4744]: I0311 00:59:05.356673 4744 generic.go:334] "Generic (PLEG): container finished" podID="a870760b-88e5-4526-8f91-ef89201e2a13" containerID="76bbf0d81633fc0743ac72563b97b34cc469b656a84794d503b82ed5223e9d0c" exitCode=0 Mar 11 00:59:05 crc kubenswrapper[4744]: I0311 00:59:05.356774 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-n9qr2" event={"ID":"a870760b-88e5-4526-8f91-ef89201e2a13","Type":"ContainerDied","Data":"76bbf0d81633fc0743ac72563b97b34cc469b656a84794d503b82ed5223e9d0c"} Mar 11 00:59:05 crc kubenswrapper[4744]: I0311 00:59:05.375886 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-wr9vs" event={"ID":"b57e3e22-ee77-4a48-b62a-1a5ff5394362","Type":"ContainerStarted","Data":"1ab9d6584560076f288dfa56b28d5eabe20ebe8ff95425e792784058c0b8232d"} Mar 11 00:59:05 crc kubenswrapper[4744]: I0311 00:59:05.412316 4744 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-wr9vs" podStartSLOduration=2.344662224 podStartE2EDuration="48.412287177s" podCreationTimestamp="2026-03-11 00:58:17 +0000 UTC" firstStartedPulling="2026-03-11 00:58:18.677168122 +0000 UTC m=+255.481385727" 
lastFinishedPulling="2026-03-11 00:59:04.744793035 +0000 UTC m=+301.549010680" observedRunningTime="2026-03-11 00:59:05.41104298 +0000 UTC m=+302.215260655" watchObservedRunningTime="2026-03-11 00:59:05.412287177 +0000 UTC m=+302.216504812" Mar 11 00:59:05 crc kubenswrapper[4744]: I0311 00:59:05.454398 4744 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/community-operators-kk7lx" podUID="9c559a48-ac87-4cab-848d-f2f647f8396b" containerName="registry-server" probeResult="failure" output=< Mar 11 00:59:05 crc kubenswrapper[4744]: timeout: failed to connect service ":50051" within 1s Mar 11 00:59:05 crc kubenswrapper[4744]: > Mar 11 00:59:06 crc kubenswrapper[4744]: I0311 00:59:06.108377 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-pfjjz" Mar 11 00:59:06 crc kubenswrapper[4744]: I0311 00:59:06.108442 4744 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-pfjjz" Mar 11 00:59:06 crc kubenswrapper[4744]: I0311 00:59:06.162908 4744 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-pfjjz" Mar 11 00:59:06 crc kubenswrapper[4744]: I0311 00:59:06.367030 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-n9qr2" event={"ID":"a870760b-88e5-4526-8f91-ef89201e2a13","Type":"ContainerStarted","Data":"178240e20dab3856c62382a7c28316fbefb796cf3563e16a8a442385c705f64c"} Mar 11 00:59:06 crc kubenswrapper[4744]: I0311 00:59:06.368312 4744 generic.go:334] "Generic (PLEG): container finished" podID="dc99af5e-bf41-49ac-8e4a-416f565cbfc9" containerID="8c97fba161ce219ac7c1bf9345d946770d408b3ce38db603d5963787e3fe71bf" exitCode=0 Mar 11 00:59:06 crc kubenswrapper[4744]: I0311 00:59:06.368381 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-qqw9j" 
event={"ID":"dc99af5e-bf41-49ac-8e4a-416f565cbfc9","Type":"ContainerDied","Data":"8c97fba161ce219ac7c1bf9345d946770d408b3ce38db603d5963787e3fe71bf"} Mar 11 00:59:06 crc kubenswrapper[4744]: I0311 00:59:06.370642 4744 generic.go:334] "Generic (PLEG): container finished" podID="19ed9886-b3a9-4ea4-ac2a-2abd59a4b1fe" containerID="548ce83ca4de4ebf04137990470ec3c6f6d28d7aa274a3652d717450d6f4732a" exitCode=0 Mar 11 00:59:06 crc kubenswrapper[4744]: I0311 00:59:06.370708 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-9zdvr" event={"ID":"19ed9886-b3a9-4ea4-ac2a-2abd59a4b1fe","Type":"ContainerDied","Data":"548ce83ca4de4ebf04137990470ec3c6f6d28d7aa274a3652d717450d6f4732a"} Mar 11 00:59:06 crc kubenswrapper[4744]: I0311 00:59:06.390788 4744 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-n9qr2" podStartSLOduration=3.107750253 podStartE2EDuration="53.390771654s" podCreationTimestamp="2026-03-11 00:58:13 +0000 UTC" firstStartedPulling="2026-03-11 00:58:15.499796477 +0000 UTC m=+252.304014082" lastFinishedPulling="2026-03-11 00:59:05.782817858 +0000 UTC m=+302.587035483" observedRunningTime="2026-03-11 00:59:06.388907585 +0000 UTC m=+303.193125190" watchObservedRunningTime="2026-03-11 00:59:06.390771654 +0000 UTC m=+303.194989259" Mar 11 00:59:06 crc kubenswrapper[4744]: I0311 00:59:06.414853 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-pfjjz" Mar 11 00:59:06 crc kubenswrapper[4744]: I0311 00:59:06.529712 4744 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-nvxtd" Mar 11 00:59:06 crc kubenswrapper[4744]: I0311 00:59:06.696444 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-nvxtd" Mar 11 00:59:07 crc kubenswrapper[4744]: I0311 00:59:07.378922 4744 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-9zdvr" event={"ID":"19ed9886-b3a9-4ea4-ac2a-2abd59a4b1fe","Type":"ContainerStarted","Data":"ffde99533538961db65ad7391ecd183011646d72ace5fcdb7e82034f86833ffc"} Mar 11 00:59:07 crc kubenswrapper[4744]: I0311 00:59:07.383913 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-qqw9j" event={"ID":"dc99af5e-bf41-49ac-8e4a-416f565cbfc9","Type":"ContainerStarted","Data":"ef263e117796496eafdfb8df295579b3dc85286077039f7acc4363ddde363286"} Mar 11 00:59:07 crc kubenswrapper[4744]: I0311 00:59:07.403928 4744 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-9zdvr" podStartSLOduration=3.125705567 podStartE2EDuration="53.403902373s" podCreationTimestamp="2026-03-11 00:58:14 +0000 UTC" firstStartedPulling="2026-03-11 00:58:16.531061191 +0000 UTC m=+253.335278786" lastFinishedPulling="2026-03-11 00:59:06.809257977 +0000 UTC m=+303.613475592" observedRunningTime="2026-03-11 00:59:07.401271794 +0000 UTC m=+304.205489389" watchObservedRunningTime="2026-03-11 00:59:07.403902373 +0000 UTC m=+304.208120008" Mar 11 00:59:07 crc kubenswrapper[4744]: I0311 00:59:07.426955 4744 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-qqw9j" podStartSLOduration=3.186100711 podStartE2EDuration="53.426935388s" podCreationTimestamp="2026-03-11 00:58:14 +0000 UTC" firstStartedPulling="2026-03-11 00:58:16.546468315 +0000 UTC m=+253.350685910" lastFinishedPulling="2026-03-11 00:59:06.787302952 +0000 UTC m=+303.591520587" observedRunningTime="2026-03-11 00:59:07.424831719 +0000 UTC m=+304.229049354" watchObservedRunningTime="2026-03-11 00:59:07.426935388 +0000 UTC m=+304.231152993" Mar 11 00:59:07 crc kubenswrapper[4744]: I0311 00:59:07.520249 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openshift-marketplace/redhat-operators-wr9vs" Mar 11 00:59:07 crc kubenswrapper[4744]: I0311 00:59:07.520652 4744 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-wr9vs" Mar 11 00:59:07 crc kubenswrapper[4744]: I0311 00:59:07.954778 4744 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-6jf7l" Mar 11 00:59:08 crc kubenswrapper[4744]: I0311 00:59:08.005645 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-6jf7l" Mar 11 00:59:08 crc kubenswrapper[4744]: I0311 00:59:08.578772 4744 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-wr9vs" podUID="b57e3e22-ee77-4a48-b62a-1a5ff5394362" containerName="registry-server" probeResult="failure" output=< Mar 11 00:59:08 crc kubenswrapper[4744]: timeout: failed to connect service ":50051" within 1s Mar 11 00:59:08 crc kubenswrapper[4744]: > Mar 11 00:59:09 crc kubenswrapper[4744]: I0311 00:59:09.622931 4744 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-nvxtd"] Mar 11 00:59:09 crc kubenswrapper[4744]: I0311 00:59:09.623168 4744 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-nvxtd" podUID="5ce17bbe-ec69-4349-acbc-4e99fcfb894f" containerName="registry-server" containerID="cri-o://2517478864c24ae0329cb1005883b82dc6238aca8e9fa25f4e111b50df13f641" gracePeriod=2 Mar 11 00:59:10 crc kubenswrapper[4744]: I0311 00:59:10.168847 4744 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-nvxtd" Mar 11 00:59:10 crc kubenswrapper[4744]: I0311 00:59:10.280686 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5ce17bbe-ec69-4349-acbc-4e99fcfb894f-catalog-content\") pod \"5ce17bbe-ec69-4349-acbc-4e99fcfb894f\" (UID: \"5ce17bbe-ec69-4349-acbc-4e99fcfb894f\") " Mar 11 00:59:10 crc kubenswrapper[4744]: I0311 00:59:10.280746 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-c69df\" (UniqueName: \"kubernetes.io/projected/5ce17bbe-ec69-4349-acbc-4e99fcfb894f-kube-api-access-c69df\") pod \"5ce17bbe-ec69-4349-acbc-4e99fcfb894f\" (UID: \"5ce17bbe-ec69-4349-acbc-4e99fcfb894f\") " Mar 11 00:59:10 crc kubenswrapper[4744]: I0311 00:59:10.280814 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5ce17bbe-ec69-4349-acbc-4e99fcfb894f-utilities\") pod \"5ce17bbe-ec69-4349-acbc-4e99fcfb894f\" (UID: \"5ce17bbe-ec69-4349-acbc-4e99fcfb894f\") " Mar 11 00:59:10 crc kubenswrapper[4744]: I0311 00:59:10.282813 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5ce17bbe-ec69-4349-acbc-4e99fcfb894f-utilities" (OuterVolumeSpecName: "utilities") pod "5ce17bbe-ec69-4349-acbc-4e99fcfb894f" (UID: "5ce17bbe-ec69-4349-acbc-4e99fcfb894f"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 11 00:59:10 crc kubenswrapper[4744]: I0311 00:59:10.290141 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5ce17bbe-ec69-4349-acbc-4e99fcfb894f-kube-api-access-c69df" (OuterVolumeSpecName: "kube-api-access-c69df") pod "5ce17bbe-ec69-4349-acbc-4e99fcfb894f" (UID: "5ce17bbe-ec69-4349-acbc-4e99fcfb894f"). InnerVolumeSpecName "kube-api-access-c69df". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 00:59:10 crc kubenswrapper[4744]: I0311 00:59:10.316764 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5ce17bbe-ec69-4349-acbc-4e99fcfb894f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "5ce17bbe-ec69-4349-acbc-4e99fcfb894f" (UID: "5ce17bbe-ec69-4349-acbc-4e99fcfb894f"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 11 00:59:10 crc kubenswrapper[4744]: I0311 00:59:10.382781 4744 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5ce17bbe-ec69-4349-acbc-4e99fcfb894f-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 11 00:59:10 crc kubenswrapper[4744]: I0311 00:59:10.382817 4744 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-c69df\" (UniqueName: \"kubernetes.io/projected/5ce17bbe-ec69-4349-acbc-4e99fcfb894f-kube-api-access-c69df\") on node \"crc\" DevicePath \"\"" Mar 11 00:59:10 crc kubenswrapper[4744]: I0311 00:59:10.382829 4744 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5ce17bbe-ec69-4349-acbc-4e99fcfb894f-utilities\") on node \"crc\" DevicePath \"\"" Mar 11 00:59:10 crc kubenswrapper[4744]: I0311 00:59:10.406721 4744 generic.go:334] "Generic (PLEG): container finished" podID="5ce17bbe-ec69-4349-acbc-4e99fcfb894f" containerID="2517478864c24ae0329cb1005883b82dc6238aca8e9fa25f4e111b50df13f641" exitCode=0 Mar 11 00:59:10 crc kubenswrapper[4744]: I0311 00:59:10.406795 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-nvxtd" event={"ID":"5ce17bbe-ec69-4349-acbc-4e99fcfb894f","Type":"ContainerDied","Data":"2517478864c24ae0329cb1005883b82dc6238aca8e9fa25f4e111b50df13f641"} Mar 11 00:59:10 crc kubenswrapper[4744]: I0311 00:59:10.406839 4744 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openshift-marketplace/redhat-marketplace-nvxtd" event={"ID":"5ce17bbe-ec69-4349-acbc-4e99fcfb894f","Type":"ContainerDied","Data":"4f3a26561bc7016404142867234ef57100a2c8c3bc19a8eeb6817b0d7cd5c502"} Mar 11 00:59:10 crc kubenswrapper[4744]: I0311 00:59:10.406890 4744 scope.go:117] "RemoveContainer" containerID="2517478864c24ae0329cb1005883b82dc6238aca8e9fa25f4e111b50df13f641" Mar 11 00:59:10 crc kubenswrapper[4744]: I0311 00:59:10.407156 4744 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-nvxtd" Mar 11 00:59:10 crc kubenswrapper[4744]: I0311 00:59:10.433273 4744 scope.go:117] "RemoveContainer" containerID="5476f8854ea589094226ebafae12607214bf479f0d115b333a9c1cc5b8115f4e" Mar 11 00:59:10 crc kubenswrapper[4744]: I0311 00:59:10.452629 4744 scope.go:117] "RemoveContainer" containerID="3f3fdb196688f3c356efd3fced9d1e4c1de9d42609a258bcff6a706cf617e82e" Mar 11 00:59:10 crc kubenswrapper[4744]: I0311 00:59:10.453814 4744 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-nvxtd"] Mar 11 00:59:10 crc kubenswrapper[4744]: I0311 00:59:10.456774 4744 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-nvxtd"] Mar 11 00:59:10 crc kubenswrapper[4744]: I0311 00:59:10.488833 4744 scope.go:117] "RemoveContainer" containerID="2517478864c24ae0329cb1005883b82dc6238aca8e9fa25f4e111b50df13f641" Mar 11 00:59:10 crc kubenswrapper[4744]: E0311 00:59:10.489320 4744 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2517478864c24ae0329cb1005883b82dc6238aca8e9fa25f4e111b50df13f641\": container with ID starting with 2517478864c24ae0329cb1005883b82dc6238aca8e9fa25f4e111b50df13f641 not found: ID does not exist" containerID="2517478864c24ae0329cb1005883b82dc6238aca8e9fa25f4e111b50df13f641" Mar 11 00:59:10 crc kubenswrapper[4744]: I0311 00:59:10.489358 4744 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2517478864c24ae0329cb1005883b82dc6238aca8e9fa25f4e111b50df13f641"} err="failed to get container status \"2517478864c24ae0329cb1005883b82dc6238aca8e9fa25f4e111b50df13f641\": rpc error: code = NotFound desc = could not find container \"2517478864c24ae0329cb1005883b82dc6238aca8e9fa25f4e111b50df13f641\": container with ID starting with 2517478864c24ae0329cb1005883b82dc6238aca8e9fa25f4e111b50df13f641 not found: ID does not exist" Mar 11 00:59:10 crc kubenswrapper[4744]: I0311 00:59:10.489424 4744 scope.go:117] "RemoveContainer" containerID="5476f8854ea589094226ebafae12607214bf479f0d115b333a9c1cc5b8115f4e" Mar 11 00:59:10 crc kubenswrapper[4744]: E0311 00:59:10.489930 4744 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5476f8854ea589094226ebafae12607214bf479f0d115b333a9c1cc5b8115f4e\": container with ID starting with 5476f8854ea589094226ebafae12607214bf479f0d115b333a9c1cc5b8115f4e not found: ID does not exist" containerID="5476f8854ea589094226ebafae12607214bf479f0d115b333a9c1cc5b8115f4e" Mar 11 00:59:10 crc kubenswrapper[4744]: I0311 00:59:10.489950 4744 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5476f8854ea589094226ebafae12607214bf479f0d115b333a9c1cc5b8115f4e"} err="failed to get container status \"5476f8854ea589094226ebafae12607214bf479f0d115b333a9c1cc5b8115f4e\": rpc error: code = NotFound desc = could not find container \"5476f8854ea589094226ebafae12607214bf479f0d115b333a9c1cc5b8115f4e\": container with ID starting with 5476f8854ea589094226ebafae12607214bf479f0d115b333a9c1cc5b8115f4e not found: ID does not exist" Mar 11 00:59:10 crc kubenswrapper[4744]: I0311 00:59:10.489964 4744 scope.go:117] "RemoveContainer" containerID="3f3fdb196688f3c356efd3fced9d1e4c1de9d42609a258bcff6a706cf617e82e" Mar 11 00:59:10 crc kubenswrapper[4744]: E0311 
00:59:10.490403 4744 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3f3fdb196688f3c356efd3fced9d1e4c1de9d42609a258bcff6a706cf617e82e\": container with ID starting with 3f3fdb196688f3c356efd3fced9d1e4c1de9d42609a258bcff6a706cf617e82e not found: ID does not exist" containerID="3f3fdb196688f3c356efd3fced9d1e4c1de9d42609a258bcff6a706cf617e82e" Mar 11 00:59:10 crc kubenswrapper[4744]: I0311 00:59:10.490447 4744 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3f3fdb196688f3c356efd3fced9d1e4c1de9d42609a258bcff6a706cf617e82e"} err="failed to get container status \"3f3fdb196688f3c356efd3fced9d1e4c1de9d42609a258bcff6a706cf617e82e\": rpc error: code = NotFound desc = could not find container \"3f3fdb196688f3c356efd3fced9d1e4c1de9d42609a258bcff6a706cf617e82e\": container with ID starting with 3f3fdb196688f3c356efd3fced9d1e4c1de9d42609a258bcff6a706cf617e82e not found: ID does not exist" Mar 11 00:59:10 crc kubenswrapper[4744]: I0311 00:59:10.998137 4744 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-authentication/oauth-openshift-558db77b4-4tcts" podUID="74a61e39-2210-4bb1-96c9-509eda04c4c7" containerName="oauth-openshift" containerID="cri-o://e604778b27de583f441bbc0c7365c48b9e027a61c73c490e9714df20eb53d75a" gracePeriod=15 Mar 11 00:59:11 crc kubenswrapper[4744]: I0311 00:59:11.068864 4744 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-84b5546df4-ckw47"] Mar 11 00:59:11 crc kubenswrapper[4744]: I0311 00:59:11.069264 4744 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-84b5546df4-ckw47" podUID="dd0cbcba-6518-4fdf-be7f-ec2ec7dbbaf6" containerName="controller-manager" containerID="cri-o://67ef97523c51d73e79e5d0e53184c3ea1b093c3a361d11977b27894fae53a3b3" gracePeriod=30 Mar 11 00:59:11 crc 
kubenswrapper[4744]: I0311 00:59:11.085435 4744 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-bf64b8966-dgsll"] Mar 11 00:59:11 crc kubenswrapper[4744]: I0311 00:59:11.085662 4744 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-bf64b8966-dgsll" podUID="1352afda-75ac-4f8a-af81-28cb34d11373" containerName="route-controller-manager" containerID="cri-o://2ae15322d2f83fc05b4f04924512996418b2e3f9cf00d651d3519297adcaf6c3" gracePeriod=30 Mar 11 00:59:11 crc kubenswrapper[4744]: I0311 00:59:11.422903 4744 generic.go:334] "Generic (PLEG): container finished" podID="dd0cbcba-6518-4fdf-be7f-ec2ec7dbbaf6" containerID="67ef97523c51d73e79e5d0e53184c3ea1b093c3a361d11977b27894fae53a3b3" exitCode=0 Mar 11 00:59:11 crc kubenswrapper[4744]: I0311 00:59:11.423109 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-84b5546df4-ckw47" event={"ID":"dd0cbcba-6518-4fdf-be7f-ec2ec7dbbaf6","Type":"ContainerDied","Data":"67ef97523c51d73e79e5d0e53184c3ea1b093c3a361d11977b27894fae53a3b3"} Mar 11 00:59:11 crc kubenswrapper[4744]: I0311 00:59:11.432857 4744 generic.go:334] "Generic (PLEG): container finished" podID="1352afda-75ac-4f8a-af81-28cb34d11373" containerID="2ae15322d2f83fc05b4f04924512996418b2e3f9cf00d651d3519297adcaf6c3" exitCode=0 Mar 11 00:59:11 crc kubenswrapper[4744]: I0311 00:59:11.432942 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-bf64b8966-dgsll" event={"ID":"1352afda-75ac-4f8a-af81-28cb34d11373","Type":"ContainerDied","Data":"2ae15322d2f83fc05b4f04924512996418b2e3f9cf00d651d3519297adcaf6c3"} Mar 11 00:59:11 crc kubenswrapper[4744]: I0311 00:59:11.434770 4744 generic.go:334] "Generic (PLEG): container finished" podID="74a61e39-2210-4bb1-96c9-509eda04c4c7" 
containerID="e604778b27de583f441bbc0c7365c48b9e027a61c73c490e9714df20eb53d75a" exitCode=0 Mar 11 00:59:11 crc kubenswrapper[4744]: I0311 00:59:11.434794 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-4tcts" event={"ID":"74a61e39-2210-4bb1-96c9-509eda04c4c7","Type":"ContainerDied","Data":"e604778b27de583f441bbc0c7365c48b9e027a61c73c490e9714df20eb53d75a"} Mar 11 00:59:11 crc kubenswrapper[4744]: I0311 00:59:11.479384 4744 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-4tcts" Mar 11 00:59:11 crc kubenswrapper[4744]: I0311 00:59:11.603029 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/74a61e39-2210-4bb1-96c9-509eda04c4c7-v4-0-config-system-cliconfig\") pod \"74a61e39-2210-4bb1-96c9-509eda04c4c7\" (UID: \"74a61e39-2210-4bb1-96c9-509eda04c4c7\") " Mar 11 00:59:11 crc kubenswrapper[4744]: I0311 00:59:11.603083 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/74a61e39-2210-4bb1-96c9-509eda04c4c7-v4-0-config-user-template-error\") pod \"74a61e39-2210-4bb1-96c9-509eda04c4c7\" (UID: \"74a61e39-2210-4bb1-96c9-509eda04c4c7\") " Mar 11 00:59:11 crc kubenswrapper[4744]: I0311 00:59:11.603114 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/74a61e39-2210-4bb1-96c9-509eda04c4c7-v4-0-config-user-template-provider-selection\") pod \"74a61e39-2210-4bb1-96c9-509eda04c4c7\" (UID: \"74a61e39-2210-4bb1-96c9-509eda04c4c7\") " Mar 11 00:59:11 crc kubenswrapper[4744]: I0311 00:59:11.603148 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" 
(UniqueName: \"kubernetes.io/configmap/74a61e39-2210-4bb1-96c9-509eda04c4c7-v4-0-config-system-trusted-ca-bundle\") pod \"74a61e39-2210-4bb1-96c9-509eda04c4c7\" (UID: \"74a61e39-2210-4bb1-96c9-509eda04c4c7\") " Mar 11 00:59:11 crc kubenswrapper[4744]: I0311 00:59:11.603193 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/74a61e39-2210-4bb1-96c9-509eda04c4c7-v4-0-config-system-session\") pod \"74a61e39-2210-4bb1-96c9-509eda04c4c7\" (UID: \"74a61e39-2210-4bb1-96c9-509eda04c4c7\") " Mar 11 00:59:11 crc kubenswrapper[4744]: I0311 00:59:11.603215 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/74a61e39-2210-4bb1-96c9-509eda04c4c7-v4-0-config-user-idp-0-file-data\") pod \"74a61e39-2210-4bb1-96c9-509eda04c4c7\" (UID: \"74a61e39-2210-4bb1-96c9-509eda04c4c7\") " Mar 11 00:59:11 crc kubenswrapper[4744]: I0311 00:59:11.603248 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/74a61e39-2210-4bb1-96c9-509eda04c4c7-v4-0-config-system-service-ca\") pod \"74a61e39-2210-4bb1-96c9-509eda04c4c7\" (UID: \"74a61e39-2210-4bb1-96c9-509eda04c4c7\") " Mar 11 00:59:11 crc kubenswrapper[4744]: I0311 00:59:11.603273 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bkf84\" (UniqueName: \"kubernetes.io/projected/74a61e39-2210-4bb1-96c9-509eda04c4c7-kube-api-access-bkf84\") pod \"74a61e39-2210-4bb1-96c9-509eda04c4c7\" (UID: \"74a61e39-2210-4bb1-96c9-509eda04c4c7\") " Mar 11 00:59:11 crc kubenswrapper[4744]: I0311 00:59:11.603308 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: 
\"kubernetes.io/secret/74a61e39-2210-4bb1-96c9-509eda04c4c7-v4-0-config-system-router-certs\") pod \"74a61e39-2210-4bb1-96c9-509eda04c4c7\" (UID: \"74a61e39-2210-4bb1-96c9-509eda04c4c7\") " Mar 11 00:59:11 crc kubenswrapper[4744]: I0311 00:59:11.603338 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/74a61e39-2210-4bb1-96c9-509eda04c4c7-audit-dir\") pod \"74a61e39-2210-4bb1-96c9-509eda04c4c7\" (UID: \"74a61e39-2210-4bb1-96c9-509eda04c4c7\") " Mar 11 00:59:11 crc kubenswrapper[4744]: I0311 00:59:11.603370 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/74a61e39-2210-4bb1-96c9-509eda04c4c7-v4-0-config-system-ocp-branding-template\") pod \"74a61e39-2210-4bb1-96c9-509eda04c4c7\" (UID: \"74a61e39-2210-4bb1-96c9-509eda04c4c7\") " Mar 11 00:59:11 crc kubenswrapper[4744]: I0311 00:59:11.603392 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/74a61e39-2210-4bb1-96c9-509eda04c4c7-v4-0-config-system-serving-cert\") pod \"74a61e39-2210-4bb1-96c9-509eda04c4c7\" (UID: \"74a61e39-2210-4bb1-96c9-509eda04c4c7\") " Mar 11 00:59:11 crc kubenswrapper[4744]: I0311 00:59:11.603414 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/74a61e39-2210-4bb1-96c9-509eda04c4c7-v4-0-config-user-template-login\") pod \"74a61e39-2210-4bb1-96c9-509eda04c4c7\" (UID: \"74a61e39-2210-4bb1-96c9-509eda04c4c7\") " Mar 11 00:59:11 crc kubenswrapper[4744]: I0311 00:59:11.603455 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/74a61e39-2210-4bb1-96c9-509eda04c4c7-audit-policies\") pod 
\"74a61e39-2210-4bb1-96c9-509eda04c4c7\" (UID: \"74a61e39-2210-4bb1-96c9-509eda04c4c7\") " Mar 11 00:59:11 crc kubenswrapper[4744]: I0311 00:59:11.603883 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/74a61e39-2210-4bb1-96c9-509eda04c4c7-v4-0-config-system-cliconfig" (OuterVolumeSpecName: "v4-0-config-system-cliconfig") pod "74a61e39-2210-4bb1-96c9-509eda04c4c7" (UID: "74a61e39-2210-4bb1-96c9-509eda04c4c7"). InnerVolumeSpecName "v4-0-config-system-cliconfig". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 11 00:59:11 crc kubenswrapper[4744]: I0311 00:59:11.604157 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/74a61e39-2210-4bb1-96c9-509eda04c4c7-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "74a61e39-2210-4bb1-96c9-509eda04c4c7" (UID: "74a61e39-2210-4bb1-96c9-509eda04c4c7"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 11 00:59:11 crc kubenswrapper[4744]: I0311 00:59:11.605104 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/74a61e39-2210-4bb1-96c9-509eda04c4c7-v4-0-config-system-trusted-ca-bundle" (OuterVolumeSpecName: "v4-0-config-system-trusted-ca-bundle") pod "74a61e39-2210-4bb1-96c9-509eda04c4c7" (UID: "74a61e39-2210-4bb1-96c9-509eda04c4c7"). InnerVolumeSpecName "v4-0-config-system-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 11 00:59:11 crc kubenswrapper[4744]: I0311 00:59:11.605386 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/74a61e39-2210-4bb1-96c9-509eda04c4c7-audit-dir" (OuterVolumeSpecName: "audit-dir") pod "74a61e39-2210-4bb1-96c9-509eda04c4c7" (UID: "74a61e39-2210-4bb1-96c9-509eda04c4c7"). InnerVolumeSpecName "audit-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 11 00:59:11 crc kubenswrapper[4744]: I0311 00:59:11.605774 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/74a61e39-2210-4bb1-96c9-509eda04c4c7-v4-0-config-system-service-ca" (OuterVolumeSpecName: "v4-0-config-system-service-ca") pod "74a61e39-2210-4bb1-96c9-509eda04c4c7" (UID: "74a61e39-2210-4bb1-96c9-509eda04c4c7"). InnerVolumeSpecName "v4-0-config-system-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 11 00:59:11 crc kubenswrapper[4744]: I0311 00:59:11.611587 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/74a61e39-2210-4bb1-96c9-509eda04c4c7-v4-0-config-user-template-error" (OuterVolumeSpecName: "v4-0-config-user-template-error") pod "74a61e39-2210-4bb1-96c9-509eda04c4c7" (UID: "74a61e39-2210-4bb1-96c9-509eda04c4c7"). InnerVolumeSpecName "v4-0-config-user-template-error". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 00:59:11 crc kubenswrapper[4744]: I0311 00:59:11.612358 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/74a61e39-2210-4bb1-96c9-509eda04c4c7-kube-api-access-bkf84" (OuterVolumeSpecName: "kube-api-access-bkf84") pod "74a61e39-2210-4bb1-96c9-509eda04c4c7" (UID: "74a61e39-2210-4bb1-96c9-509eda04c4c7"). InnerVolumeSpecName "kube-api-access-bkf84". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 00:59:11 crc kubenswrapper[4744]: I0311 00:59:11.612461 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/74a61e39-2210-4bb1-96c9-509eda04c4c7-v4-0-config-system-ocp-branding-template" (OuterVolumeSpecName: "v4-0-config-system-ocp-branding-template") pod "74a61e39-2210-4bb1-96c9-509eda04c4c7" (UID: "74a61e39-2210-4bb1-96c9-509eda04c4c7"). InnerVolumeSpecName "v4-0-config-system-ocp-branding-template". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 00:59:11 crc kubenswrapper[4744]: I0311 00:59:11.612697 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/74a61e39-2210-4bb1-96c9-509eda04c4c7-v4-0-config-system-router-certs" (OuterVolumeSpecName: "v4-0-config-system-router-certs") pod "74a61e39-2210-4bb1-96c9-509eda04c4c7" (UID: "74a61e39-2210-4bb1-96c9-509eda04c4c7"). InnerVolumeSpecName "v4-0-config-system-router-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 00:59:11 crc kubenswrapper[4744]: I0311 00:59:11.613837 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/74a61e39-2210-4bb1-96c9-509eda04c4c7-v4-0-config-system-session" (OuterVolumeSpecName: "v4-0-config-system-session") pod "74a61e39-2210-4bb1-96c9-509eda04c4c7" (UID: "74a61e39-2210-4bb1-96c9-509eda04c4c7"). InnerVolumeSpecName "v4-0-config-system-session". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 00:59:11 crc kubenswrapper[4744]: I0311 00:59:11.614155 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/74a61e39-2210-4bb1-96c9-509eda04c4c7-v4-0-config-user-template-login" (OuterVolumeSpecName: "v4-0-config-user-template-login") pod "74a61e39-2210-4bb1-96c9-509eda04c4c7" (UID: "74a61e39-2210-4bb1-96c9-509eda04c4c7"). InnerVolumeSpecName "v4-0-config-user-template-login". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 00:59:11 crc kubenswrapper[4744]: I0311 00:59:11.615352 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/74a61e39-2210-4bb1-96c9-509eda04c4c7-v4-0-config-system-serving-cert" (OuterVolumeSpecName: "v4-0-config-system-serving-cert") pod "74a61e39-2210-4bb1-96c9-509eda04c4c7" (UID: "74a61e39-2210-4bb1-96c9-509eda04c4c7"). InnerVolumeSpecName "v4-0-config-system-serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 00:59:11 crc kubenswrapper[4744]: I0311 00:59:11.620764 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/74a61e39-2210-4bb1-96c9-509eda04c4c7-v4-0-config-user-idp-0-file-data" (OuterVolumeSpecName: "v4-0-config-user-idp-0-file-data") pod "74a61e39-2210-4bb1-96c9-509eda04c4c7" (UID: "74a61e39-2210-4bb1-96c9-509eda04c4c7"). InnerVolumeSpecName "v4-0-config-user-idp-0-file-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 00:59:11 crc kubenswrapper[4744]: I0311 00:59:11.622060 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/74a61e39-2210-4bb1-96c9-509eda04c4c7-v4-0-config-user-template-provider-selection" (OuterVolumeSpecName: "v4-0-config-user-template-provider-selection") pod "74a61e39-2210-4bb1-96c9-509eda04c4c7" (UID: "74a61e39-2210-4bb1-96c9-509eda04c4c7"). InnerVolumeSpecName "v4-0-config-user-template-provider-selection". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 00:59:11 crc kubenswrapper[4744]: I0311 00:59:11.699294 4744 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-bf64b8966-dgsll" Mar 11 00:59:11 crc kubenswrapper[4744]: I0311 00:59:11.704917 4744 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/74a61e39-2210-4bb1-96c9-509eda04c4c7-v4-0-config-system-cliconfig\") on node \"crc\" DevicePath \"\"" Mar 11 00:59:11 crc kubenswrapper[4744]: I0311 00:59:11.704979 4744 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/74a61e39-2210-4bb1-96c9-509eda04c4c7-v4-0-config-user-template-error\") on node \"crc\" DevicePath \"\"" Mar 11 00:59:11 crc kubenswrapper[4744]: I0311 00:59:11.705012 4744 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/74a61e39-2210-4bb1-96c9-509eda04c4c7-v4-0-config-user-template-provider-selection\") on node \"crc\" DevicePath \"\"" Mar 11 00:59:11 crc kubenswrapper[4744]: I0311 00:59:11.705042 4744 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/74a61e39-2210-4bb1-96c9-509eda04c4c7-v4-0-config-system-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 11 00:59:11 crc kubenswrapper[4744]: I0311 00:59:11.705066 4744 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/74a61e39-2210-4bb1-96c9-509eda04c4c7-v4-0-config-user-idp-0-file-data\") on node \"crc\" DevicePath \"\"" Mar 11 00:59:11 crc kubenswrapper[4744]: I0311 00:59:11.705120 4744 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/74a61e39-2210-4bb1-96c9-509eda04c4c7-v4-0-config-system-session\") on node \"crc\" DevicePath \"\"" Mar 11 00:59:11 crc kubenswrapper[4744]: I0311 00:59:11.705139 4744 
reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/74a61e39-2210-4bb1-96c9-509eda04c4c7-v4-0-config-system-service-ca\") on node \"crc\" DevicePath \"\"" Mar 11 00:59:11 crc kubenswrapper[4744]: I0311 00:59:11.705160 4744 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bkf84\" (UniqueName: \"kubernetes.io/projected/74a61e39-2210-4bb1-96c9-509eda04c4c7-kube-api-access-bkf84\") on node \"crc\" DevicePath \"\"" Mar 11 00:59:11 crc kubenswrapper[4744]: I0311 00:59:11.705178 4744 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/74a61e39-2210-4bb1-96c9-509eda04c4c7-v4-0-config-system-router-certs\") on node \"crc\" DevicePath \"\"" Mar 11 00:59:11 crc kubenswrapper[4744]: I0311 00:59:11.705196 4744 reconciler_common.go:293] "Volume detached for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/74a61e39-2210-4bb1-96c9-509eda04c4c7-audit-dir\") on node \"crc\" DevicePath \"\"" Mar 11 00:59:11 crc kubenswrapper[4744]: I0311 00:59:11.705215 4744 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/74a61e39-2210-4bb1-96c9-509eda04c4c7-v4-0-config-system-ocp-branding-template\") on node \"crc\" DevicePath \"\"" Mar 11 00:59:11 crc kubenswrapper[4744]: I0311 00:59:11.705235 4744 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/74a61e39-2210-4bb1-96c9-509eda04c4c7-v4-0-config-system-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 11 00:59:11 crc kubenswrapper[4744]: I0311 00:59:11.705257 4744 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/74a61e39-2210-4bb1-96c9-509eda04c4c7-v4-0-config-user-template-login\") on node \"crc\" DevicePath \"\"" Mar 11 00:59:11 
crc kubenswrapper[4744]: I0311 00:59:11.705275 4744 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/74a61e39-2210-4bb1-96c9-509eda04c4c7-audit-policies\") on node \"crc\" DevicePath \"\"" Mar 11 00:59:11 crc kubenswrapper[4744]: I0311 00:59:11.806186 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/1352afda-75ac-4f8a-af81-28cb34d11373-client-ca\") pod \"1352afda-75ac-4f8a-af81-28cb34d11373\" (UID: \"1352afda-75ac-4f8a-af81-28cb34d11373\") " Mar 11 00:59:11 crc kubenswrapper[4744]: I0311 00:59:11.806239 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7zj7s\" (UniqueName: \"kubernetes.io/projected/1352afda-75ac-4f8a-af81-28cb34d11373-kube-api-access-7zj7s\") pod \"1352afda-75ac-4f8a-af81-28cb34d11373\" (UID: \"1352afda-75ac-4f8a-af81-28cb34d11373\") " Mar 11 00:59:11 crc kubenswrapper[4744]: I0311 00:59:11.806316 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1352afda-75ac-4f8a-af81-28cb34d11373-serving-cert\") pod \"1352afda-75ac-4f8a-af81-28cb34d11373\" (UID: \"1352afda-75ac-4f8a-af81-28cb34d11373\") " Mar 11 00:59:11 crc kubenswrapper[4744]: I0311 00:59:11.806336 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1352afda-75ac-4f8a-af81-28cb34d11373-config\") pod \"1352afda-75ac-4f8a-af81-28cb34d11373\" (UID: \"1352afda-75ac-4f8a-af81-28cb34d11373\") " Mar 11 00:59:11 crc kubenswrapper[4744]: I0311 00:59:11.807132 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1352afda-75ac-4f8a-af81-28cb34d11373-config" (OuterVolumeSpecName: "config") pod "1352afda-75ac-4f8a-af81-28cb34d11373" (UID: "1352afda-75ac-4f8a-af81-28cb34d11373"). 
InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 11 00:59:11 crc kubenswrapper[4744]: I0311 00:59:11.807637 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1352afda-75ac-4f8a-af81-28cb34d11373-client-ca" (OuterVolumeSpecName: "client-ca") pod "1352afda-75ac-4f8a-af81-28cb34d11373" (UID: "1352afda-75ac-4f8a-af81-28cb34d11373"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 11 00:59:11 crc kubenswrapper[4744]: I0311 00:59:11.810728 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1352afda-75ac-4f8a-af81-28cb34d11373-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "1352afda-75ac-4f8a-af81-28cb34d11373" (UID: "1352afda-75ac-4f8a-af81-28cb34d11373"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 00:59:11 crc kubenswrapper[4744]: I0311 00:59:11.813329 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1352afda-75ac-4f8a-af81-28cb34d11373-kube-api-access-7zj7s" (OuterVolumeSpecName: "kube-api-access-7zj7s") pod "1352afda-75ac-4f8a-af81-28cb34d11373" (UID: "1352afda-75ac-4f8a-af81-28cb34d11373"). InnerVolumeSpecName "kube-api-access-7zj7s". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 00:59:11 crc kubenswrapper[4744]: I0311 00:59:11.839136 4744 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-84b5546df4-ckw47" Mar 11 00:59:11 crc kubenswrapper[4744]: I0311 00:59:11.910859 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-r87wr\" (UniqueName: \"kubernetes.io/projected/dd0cbcba-6518-4fdf-be7f-ec2ec7dbbaf6-kube-api-access-r87wr\") pod \"dd0cbcba-6518-4fdf-be7f-ec2ec7dbbaf6\" (UID: \"dd0cbcba-6518-4fdf-be7f-ec2ec7dbbaf6\") " Mar 11 00:59:11 crc kubenswrapper[4744]: I0311 00:59:11.910929 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/dd0cbcba-6518-4fdf-be7f-ec2ec7dbbaf6-serving-cert\") pod \"dd0cbcba-6518-4fdf-be7f-ec2ec7dbbaf6\" (UID: \"dd0cbcba-6518-4fdf-be7f-ec2ec7dbbaf6\") " Mar 11 00:59:11 crc kubenswrapper[4744]: I0311 00:59:11.910985 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/dd0cbcba-6518-4fdf-be7f-ec2ec7dbbaf6-client-ca\") pod \"dd0cbcba-6518-4fdf-be7f-ec2ec7dbbaf6\" (UID: \"dd0cbcba-6518-4fdf-be7f-ec2ec7dbbaf6\") " Mar 11 00:59:11 crc kubenswrapper[4744]: I0311 00:59:11.911013 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/dd0cbcba-6518-4fdf-be7f-ec2ec7dbbaf6-proxy-ca-bundles\") pod \"dd0cbcba-6518-4fdf-be7f-ec2ec7dbbaf6\" (UID: \"dd0cbcba-6518-4fdf-be7f-ec2ec7dbbaf6\") " Mar 11 00:59:11 crc kubenswrapper[4744]: I0311 00:59:11.911099 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/dd0cbcba-6518-4fdf-be7f-ec2ec7dbbaf6-config\") pod \"dd0cbcba-6518-4fdf-be7f-ec2ec7dbbaf6\" (UID: \"dd0cbcba-6518-4fdf-be7f-ec2ec7dbbaf6\") " Mar 11 00:59:11 crc kubenswrapper[4744]: I0311 00:59:11.911312 4744 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" 
(UniqueName: \"kubernetes.io/secret/1352afda-75ac-4f8a-af81-28cb34d11373-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 11 00:59:11 crc kubenswrapper[4744]: I0311 00:59:11.911333 4744 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1352afda-75ac-4f8a-af81-28cb34d11373-config\") on node \"crc\" DevicePath \"\"" Mar 11 00:59:11 crc kubenswrapper[4744]: I0311 00:59:11.911348 4744 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/1352afda-75ac-4f8a-af81-28cb34d11373-client-ca\") on node \"crc\" DevicePath \"\"" Mar 11 00:59:11 crc kubenswrapper[4744]: I0311 00:59:11.911360 4744 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7zj7s\" (UniqueName: \"kubernetes.io/projected/1352afda-75ac-4f8a-af81-28cb34d11373-kube-api-access-7zj7s\") on node \"crc\" DevicePath \"\"" Mar 11 00:59:11 crc kubenswrapper[4744]: I0311 00:59:11.911869 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/dd0cbcba-6518-4fdf-be7f-ec2ec7dbbaf6-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "dd0cbcba-6518-4fdf-be7f-ec2ec7dbbaf6" (UID: "dd0cbcba-6518-4fdf-be7f-ec2ec7dbbaf6"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 11 00:59:11 crc kubenswrapper[4744]: I0311 00:59:11.911953 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/dd0cbcba-6518-4fdf-be7f-ec2ec7dbbaf6-client-ca" (OuterVolumeSpecName: "client-ca") pod "dd0cbcba-6518-4fdf-be7f-ec2ec7dbbaf6" (UID: "dd0cbcba-6518-4fdf-be7f-ec2ec7dbbaf6"). InnerVolumeSpecName "client-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 11 00:59:11 crc kubenswrapper[4744]: I0311 00:59:11.912013 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/dd0cbcba-6518-4fdf-be7f-ec2ec7dbbaf6-config" (OuterVolumeSpecName: "config") pod "dd0cbcba-6518-4fdf-be7f-ec2ec7dbbaf6" (UID: "dd0cbcba-6518-4fdf-be7f-ec2ec7dbbaf6"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 11 00:59:11 crc kubenswrapper[4744]: I0311 00:59:11.914562 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dd0cbcba-6518-4fdf-be7f-ec2ec7dbbaf6-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "dd0cbcba-6518-4fdf-be7f-ec2ec7dbbaf6" (UID: "dd0cbcba-6518-4fdf-be7f-ec2ec7dbbaf6"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 00:59:11 crc kubenswrapper[4744]: I0311 00:59:11.918059 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dd0cbcba-6518-4fdf-be7f-ec2ec7dbbaf6-kube-api-access-r87wr" (OuterVolumeSpecName: "kube-api-access-r87wr") pod "dd0cbcba-6518-4fdf-be7f-ec2ec7dbbaf6" (UID: "dd0cbcba-6518-4fdf-be7f-ec2ec7dbbaf6"). InnerVolumeSpecName "kube-api-access-r87wr". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 00:59:11 crc kubenswrapper[4744]: I0311 00:59:11.982676 4744 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5ce17bbe-ec69-4349-acbc-4e99fcfb894f" path="/var/lib/kubelet/pods/5ce17bbe-ec69-4349-acbc-4e99fcfb894f/volumes" Mar 11 00:59:12 crc kubenswrapper[4744]: I0311 00:59:12.012207 4744 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/dd0cbcba-6518-4fdf-be7f-ec2ec7dbbaf6-config\") on node \"crc\" DevicePath \"\"" Mar 11 00:59:12 crc kubenswrapper[4744]: I0311 00:59:12.012256 4744 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-r87wr\" (UniqueName: \"kubernetes.io/projected/dd0cbcba-6518-4fdf-be7f-ec2ec7dbbaf6-kube-api-access-r87wr\") on node \"crc\" DevicePath \"\"" Mar 11 00:59:12 crc kubenswrapper[4744]: I0311 00:59:12.012275 4744 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/dd0cbcba-6518-4fdf-be7f-ec2ec7dbbaf6-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 11 00:59:12 crc kubenswrapper[4744]: I0311 00:59:12.012291 4744 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/dd0cbcba-6518-4fdf-be7f-ec2ec7dbbaf6-client-ca\") on node \"crc\" DevicePath \"\"" Mar 11 00:59:12 crc kubenswrapper[4744]: I0311 00:59:12.012308 4744 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/dd0cbcba-6518-4fdf-be7f-ec2ec7dbbaf6-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Mar 11 00:59:12 crc kubenswrapper[4744]: I0311 00:59:12.021040 4744 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-6jf7l"] Mar 11 00:59:12 crc kubenswrapper[4744]: I0311 00:59:12.021433 4744 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-6jf7l" 
podUID="e1b5d764-9e1d-4be7-b365-85482c4e0def" containerName="registry-server" containerID="cri-o://e0968aa306b02f8263ae0a7506feb8acfadbf47e0075227db7d4516dd4775148" gracePeriod=2 Mar 11 00:59:12 crc kubenswrapper[4744]: I0311 00:59:12.312446 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-6459b565c7-n9rsh"] Mar 11 00:59:12 crc kubenswrapper[4744]: E0311 00:59:12.312848 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1352afda-75ac-4f8a-af81-28cb34d11373" containerName="route-controller-manager" Mar 11 00:59:12 crc kubenswrapper[4744]: I0311 00:59:12.312868 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="1352afda-75ac-4f8a-af81-28cb34d11373" containerName="route-controller-manager" Mar 11 00:59:12 crc kubenswrapper[4744]: E0311 00:59:12.312893 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7c454621-190e-4962-abed-72c0ec0613de" containerName="oc" Mar 11 00:59:12 crc kubenswrapper[4744]: I0311 00:59:12.312905 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="7c454621-190e-4962-abed-72c0ec0613de" containerName="oc" Mar 11 00:59:12 crc kubenswrapper[4744]: E0311 00:59:12.312921 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="74a61e39-2210-4bb1-96c9-509eda04c4c7" containerName="oauth-openshift" Mar 11 00:59:12 crc kubenswrapper[4744]: I0311 00:59:12.312934 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="74a61e39-2210-4bb1-96c9-509eda04c4c7" containerName="oauth-openshift" Mar 11 00:59:12 crc kubenswrapper[4744]: E0311 00:59:12.312954 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dd0cbcba-6518-4fdf-be7f-ec2ec7dbbaf6" containerName="controller-manager" Mar 11 00:59:12 crc kubenswrapper[4744]: I0311 00:59:12.312967 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="dd0cbcba-6518-4fdf-be7f-ec2ec7dbbaf6" containerName="controller-manager" Mar 11 00:59:12 crc kubenswrapper[4744]: E0311 
00:59:12.312984 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5ce17bbe-ec69-4349-acbc-4e99fcfb894f" containerName="registry-server" Mar 11 00:59:12 crc kubenswrapper[4744]: I0311 00:59:12.312997 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="5ce17bbe-ec69-4349-acbc-4e99fcfb894f" containerName="registry-server" Mar 11 00:59:12 crc kubenswrapper[4744]: E0311 00:59:12.313018 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5ce17bbe-ec69-4349-acbc-4e99fcfb894f" containerName="extract-content" Mar 11 00:59:12 crc kubenswrapper[4744]: I0311 00:59:12.313029 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="5ce17bbe-ec69-4349-acbc-4e99fcfb894f" containerName="extract-content" Mar 11 00:59:12 crc kubenswrapper[4744]: E0311 00:59:12.313050 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5ce17bbe-ec69-4349-acbc-4e99fcfb894f" containerName="extract-utilities" Mar 11 00:59:12 crc kubenswrapper[4744]: I0311 00:59:12.313062 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="5ce17bbe-ec69-4349-acbc-4e99fcfb894f" containerName="extract-utilities" Mar 11 00:59:12 crc kubenswrapper[4744]: I0311 00:59:12.313220 4744 memory_manager.go:354] "RemoveStaleState removing state" podUID="74a61e39-2210-4bb1-96c9-509eda04c4c7" containerName="oauth-openshift" Mar 11 00:59:12 crc kubenswrapper[4744]: I0311 00:59:12.313248 4744 memory_manager.go:354] "RemoveStaleState removing state" podUID="5ce17bbe-ec69-4349-acbc-4e99fcfb894f" containerName="registry-server" Mar 11 00:59:12 crc kubenswrapper[4744]: I0311 00:59:12.313270 4744 memory_manager.go:354] "RemoveStaleState removing state" podUID="dd0cbcba-6518-4fdf-be7f-ec2ec7dbbaf6" containerName="controller-manager" Mar 11 00:59:12 crc kubenswrapper[4744]: I0311 00:59:12.313283 4744 memory_manager.go:354] "RemoveStaleState removing state" podUID="7c454621-190e-4962-abed-72c0ec0613de" containerName="oc" Mar 11 00:59:12 crc kubenswrapper[4744]: I0311 
00:59:12.313300 4744 memory_manager.go:354] "RemoveStaleState removing state" podUID="1352afda-75ac-4f8a-af81-28cb34d11373" containerName="route-controller-manager" Mar 11 00:59:12 crc kubenswrapper[4744]: I0311 00:59:12.313887 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-6459b565c7-n9rsh" Mar 11 00:59:12 crc kubenswrapper[4744]: I0311 00:59:12.322174 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-7fcdb7cb78-69xd7"] Mar 11 00:59:12 crc kubenswrapper[4744]: I0311 00:59:12.322948 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-7fcdb7cb78-69xd7" Mar 11 00:59:12 crc kubenswrapper[4744]: I0311 00:59:12.331697 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-6459b565c7-n9rsh"] Mar 11 00:59:12 crc kubenswrapper[4744]: I0311 00:59:12.338146 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-7fcdb7cb78-69xd7"] Mar 11 00:59:12 crc kubenswrapper[4744]: I0311 00:59:12.407893 4744 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-6jf7l" Mar 11 00:59:12 crc kubenswrapper[4744]: I0311 00:59:12.408851 4744 patch_prober.go:28] interesting pod/machine-config-daemon-678nx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 11 00:59:12 crc kubenswrapper[4744]: I0311 00:59:12.408914 4744 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-678nx" podUID="a15dc7ac-7c34-4135-b6eb-a85122800ce9" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 11 00:59:12 crc kubenswrapper[4744]: I0311 00:59:12.408962 4744 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-678nx" Mar 11 00:59:12 crc kubenswrapper[4744]: I0311 00:59:12.409543 4744 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"9d5c92e7037925f24a5b88f538d4822fae94b65ea5f07960f3e465148975af4e"} pod="openshift-machine-config-operator/machine-config-daemon-678nx" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 11 00:59:12 crc kubenswrapper[4744]: I0311 00:59:12.409623 4744 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-678nx" podUID="a15dc7ac-7c34-4135-b6eb-a85122800ce9" containerName="machine-config-daemon" containerID="cri-o://9d5c92e7037925f24a5b88f538d4822fae94b65ea5f07960f3e465148975af4e" gracePeriod=600 Mar 11 00:59:12 crc kubenswrapper[4744]: I0311 00:59:12.419948 4744 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/89e625d0-03f2-4be3-b6dc-d1bf1816446f-config\") pod \"route-controller-manager-7fcdb7cb78-69xd7\" (UID: \"89e625d0-03f2-4be3-b6dc-d1bf1816446f\") " pod="openshift-route-controller-manager/route-controller-manager-7fcdb7cb78-69xd7" Mar 11 00:59:12 crc kubenswrapper[4744]: I0311 00:59:12.420011 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dcz9b\" (UniqueName: \"kubernetes.io/projected/89e625d0-03f2-4be3-b6dc-d1bf1816446f-kube-api-access-dcz9b\") pod \"route-controller-manager-7fcdb7cb78-69xd7\" (UID: \"89e625d0-03f2-4be3-b6dc-d1bf1816446f\") " pod="openshift-route-controller-manager/route-controller-manager-7fcdb7cb78-69xd7" Mar 11 00:59:12 crc kubenswrapper[4744]: I0311 00:59:12.420050 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4bff2396-e50e-4314-8616-3af64254e86c-config\") pod \"controller-manager-6459b565c7-n9rsh\" (UID: \"4bff2396-e50e-4314-8616-3af64254e86c\") " pod="openshift-controller-manager/controller-manager-6459b565c7-n9rsh" Mar 11 00:59:12 crc kubenswrapper[4744]: I0311 00:59:12.420080 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/89e625d0-03f2-4be3-b6dc-d1bf1816446f-client-ca\") pod \"route-controller-manager-7fcdb7cb78-69xd7\" (UID: \"89e625d0-03f2-4be3-b6dc-d1bf1816446f\") " pod="openshift-route-controller-manager/route-controller-manager-7fcdb7cb78-69xd7" Mar 11 00:59:12 crc kubenswrapper[4744]: I0311 00:59:12.420156 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4bff2396-e50e-4314-8616-3af64254e86c-serving-cert\") pod 
\"controller-manager-6459b565c7-n9rsh\" (UID: \"4bff2396-e50e-4314-8616-3af64254e86c\") " pod="openshift-controller-manager/controller-manager-6459b565c7-n9rsh" Mar 11 00:59:12 crc kubenswrapper[4744]: I0311 00:59:12.420172 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/4bff2396-e50e-4314-8616-3af64254e86c-proxy-ca-bundles\") pod \"controller-manager-6459b565c7-n9rsh\" (UID: \"4bff2396-e50e-4314-8616-3af64254e86c\") " pod="openshift-controller-manager/controller-manager-6459b565c7-n9rsh" Mar 11 00:59:12 crc kubenswrapper[4744]: I0311 00:59:12.420191 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-77svw\" (UniqueName: \"kubernetes.io/projected/4bff2396-e50e-4314-8616-3af64254e86c-kube-api-access-77svw\") pod \"controller-manager-6459b565c7-n9rsh\" (UID: \"4bff2396-e50e-4314-8616-3af64254e86c\") " pod="openshift-controller-manager/controller-manager-6459b565c7-n9rsh" Mar 11 00:59:12 crc kubenswrapper[4744]: I0311 00:59:12.420206 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/89e625d0-03f2-4be3-b6dc-d1bf1816446f-serving-cert\") pod \"route-controller-manager-7fcdb7cb78-69xd7\" (UID: \"89e625d0-03f2-4be3-b6dc-d1bf1816446f\") " pod="openshift-route-controller-manager/route-controller-manager-7fcdb7cb78-69xd7" Mar 11 00:59:12 crc kubenswrapper[4744]: I0311 00:59:12.420230 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/4bff2396-e50e-4314-8616-3af64254e86c-client-ca\") pod \"controller-manager-6459b565c7-n9rsh\" (UID: \"4bff2396-e50e-4314-8616-3af64254e86c\") " pod="openshift-controller-manager/controller-manager-6459b565c7-n9rsh" Mar 11 00:59:12 crc kubenswrapper[4744]: I0311 
00:59:12.453603 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-bf64b8966-dgsll" event={"ID":"1352afda-75ac-4f8a-af81-28cb34d11373","Type":"ContainerDied","Data":"31eb3785b0ab1b00f6d0e4b1b6162d7ed15a97f0c475578bb354ecf2bc293966"} Mar 11 00:59:12 crc kubenswrapper[4744]: I0311 00:59:12.453634 4744 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-bf64b8966-dgsll" Mar 11 00:59:12 crc kubenswrapper[4744]: I0311 00:59:12.453675 4744 scope.go:117] "RemoveContainer" containerID="2ae15322d2f83fc05b4f04924512996418b2e3f9cf00d651d3519297adcaf6c3" Mar 11 00:59:12 crc kubenswrapper[4744]: I0311 00:59:12.460484 4744 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-4tcts" Mar 11 00:59:12 crc kubenswrapper[4744]: I0311 00:59:12.460478 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-4tcts" event={"ID":"74a61e39-2210-4bb1-96c9-509eda04c4c7","Type":"ContainerDied","Data":"509584436b00c9a81ac31a7f4f18c1243a6264d7856713053e14130a82fe9cb5"} Mar 11 00:59:12 crc kubenswrapper[4744]: I0311 00:59:12.480031 4744 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-6jf7l" Mar 11 00:59:12 crc kubenswrapper[4744]: I0311 00:59:12.480482 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-6jf7l" event={"ID":"e1b5d764-9e1d-4be7-b365-85482c4e0def","Type":"ContainerDied","Data":"e0968aa306b02f8263ae0a7506feb8acfadbf47e0075227db7d4516dd4775148"} Mar 11 00:59:12 crc kubenswrapper[4744]: I0311 00:59:12.479802 4744 generic.go:334] "Generic (PLEG): container finished" podID="e1b5d764-9e1d-4be7-b365-85482c4e0def" containerID="e0968aa306b02f8263ae0a7506feb8acfadbf47e0075227db7d4516dd4775148" exitCode=0 Mar 11 00:59:12 crc kubenswrapper[4744]: I0311 00:59:12.481617 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-6jf7l" event={"ID":"e1b5d764-9e1d-4be7-b365-85482c4e0def","Type":"ContainerDied","Data":"ee5dc7a229a0c02d55a2b693776c3bdd528527fad3a6740c76561164873af473"} Mar 11 00:59:12 crc kubenswrapper[4744]: I0311 00:59:12.500247 4744 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-bf64b8966-dgsll"] Mar 11 00:59:12 crc kubenswrapper[4744]: I0311 00:59:12.507267 4744 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-bf64b8966-dgsll"] Mar 11 00:59:12 crc kubenswrapper[4744]: I0311 00:59:12.509856 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-84b5546df4-ckw47" event={"ID":"dd0cbcba-6518-4fdf-be7f-ec2ec7dbbaf6","Type":"ContainerDied","Data":"3b6fc0f95a612b2dbc8b1b25caaf7862d5f634d4e7850d2cb995d76f41eae72d"} Mar 11 00:59:12 crc kubenswrapper[4744]: I0311 00:59:12.509917 4744 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-84b5546df4-ckw47" Mar 11 00:59:12 crc kubenswrapper[4744]: I0311 00:59:12.512521 4744 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-4tcts"] Mar 11 00:59:12 crc kubenswrapper[4744]: I0311 00:59:12.515939 4744 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-4tcts"] Mar 11 00:59:12 crc kubenswrapper[4744]: I0311 00:59:12.519277 4744 scope.go:117] "RemoveContainer" containerID="e604778b27de583f441bbc0c7365c48b9e027a61c73c490e9714df20eb53d75a" Mar 11 00:59:12 crc kubenswrapper[4744]: I0311 00:59:12.521619 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e1b5d764-9e1d-4be7-b365-85482c4e0def-catalog-content\") pod \"e1b5d764-9e1d-4be7-b365-85482c4e0def\" (UID: \"e1b5d764-9e1d-4be7-b365-85482c4e0def\") " Mar 11 00:59:12 crc kubenswrapper[4744]: I0311 00:59:12.521747 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e1b5d764-9e1d-4be7-b365-85482c4e0def-utilities\") pod \"e1b5d764-9e1d-4be7-b365-85482c4e0def\" (UID: \"e1b5d764-9e1d-4be7-b365-85482c4e0def\") " Mar 11 00:59:12 crc kubenswrapper[4744]: I0311 00:59:12.521778 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-skfcr\" (UniqueName: \"kubernetes.io/projected/e1b5d764-9e1d-4be7-b365-85482c4e0def-kube-api-access-skfcr\") pod \"e1b5d764-9e1d-4be7-b365-85482c4e0def\" (UID: \"e1b5d764-9e1d-4be7-b365-85482c4e0def\") " Mar 11 00:59:12 crc kubenswrapper[4744]: I0311 00:59:12.521958 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dcz9b\" (UniqueName: \"kubernetes.io/projected/89e625d0-03f2-4be3-b6dc-d1bf1816446f-kube-api-access-dcz9b\") pod 
\"route-controller-manager-7fcdb7cb78-69xd7\" (UID: \"89e625d0-03f2-4be3-b6dc-d1bf1816446f\") " pod="openshift-route-controller-manager/route-controller-manager-7fcdb7cb78-69xd7" Mar 11 00:59:12 crc kubenswrapper[4744]: I0311 00:59:12.522002 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4bff2396-e50e-4314-8616-3af64254e86c-config\") pod \"controller-manager-6459b565c7-n9rsh\" (UID: \"4bff2396-e50e-4314-8616-3af64254e86c\") " pod="openshift-controller-manager/controller-manager-6459b565c7-n9rsh" Mar 11 00:59:12 crc kubenswrapper[4744]: I0311 00:59:12.522034 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/89e625d0-03f2-4be3-b6dc-d1bf1816446f-client-ca\") pod \"route-controller-manager-7fcdb7cb78-69xd7\" (UID: \"89e625d0-03f2-4be3-b6dc-d1bf1816446f\") " pod="openshift-route-controller-manager/route-controller-manager-7fcdb7cb78-69xd7" Mar 11 00:59:12 crc kubenswrapper[4744]: I0311 00:59:12.522121 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4bff2396-e50e-4314-8616-3af64254e86c-serving-cert\") pod \"controller-manager-6459b565c7-n9rsh\" (UID: \"4bff2396-e50e-4314-8616-3af64254e86c\") " pod="openshift-controller-manager/controller-manager-6459b565c7-n9rsh" Mar 11 00:59:12 crc kubenswrapper[4744]: I0311 00:59:12.522148 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/4bff2396-e50e-4314-8616-3af64254e86c-proxy-ca-bundles\") pod \"controller-manager-6459b565c7-n9rsh\" (UID: \"4bff2396-e50e-4314-8616-3af64254e86c\") " pod="openshift-controller-manager/controller-manager-6459b565c7-n9rsh" Mar 11 00:59:12 crc kubenswrapper[4744]: I0311 00:59:12.522172 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-77svw\" (UniqueName: \"kubernetes.io/projected/4bff2396-e50e-4314-8616-3af64254e86c-kube-api-access-77svw\") pod \"controller-manager-6459b565c7-n9rsh\" (UID: \"4bff2396-e50e-4314-8616-3af64254e86c\") " pod="openshift-controller-manager/controller-manager-6459b565c7-n9rsh" Mar 11 00:59:12 crc kubenswrapper[4744]: I0311 00:59:12.522195 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/89e625d0-03f2-4be3-b6dc-d1bf1816446f-serving-cert\") pod \"route-controller-manager-7fcdb7cb78-69xd7\" (UID: \"89e625d0-03f2-4be3-b6dc-d1bf1816446f\") " pod="openshift-route-controller-manager/route-controller-manager-7fcdb7cb78-69xd7" Mar 11 00:59:12 crc kubenswrapper[4744]: I0311 00:59:12.522223 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/4bff2396-e50e-4314-8616-3af64254e86c-client-ca\") pod \"controller-manager-6459b565c7-n9rsh\" (UID: \"4bff2396-e50e-4314-8616-3af64254e86c\") " pod="openshift-controller-manager/controller-manager-6459b565c7-n9rsh" Mar 11 00:59:12 crc kubenswrapper[4744]: I0311 00:59:12.522270 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/89e625d0-03f2-4be3-b6dc-d1bf1816446f-config\") pod \"route-controller-manager-7fcdb7cb78-69xd7\" (UID: \"89e625d0-03f2-4be3-b6dc-d1bf1816446f\") " pod="openshift-route-controller-manager/route-controller-manager-7fcdb7cb78-69xd7" Mar 11 00:59:12 crc kubenswrapper[4744]: I0311 00:59:12.525151 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/89e625d0-03f2-4be3-b6dc-d1bf1816446f-config\") pod \"route-controller-manager-7fcdb7cb78-69xd7\" (UID: \"89e625d0-03f2-4be3-b6dc-d1bf1816446f\") " pod="openshift-route-controller-manager/route-controller-manager-7fcdb7cb78-69xd7" Mar 11 00:59:12 crc 
kubenswrapper[4744]: I0311 00:59:12.526083 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/4bff2396-e50e-4314-8616-3af64254e86c-client-ca\") pod \"controller-manager-6459b565c7-n9rsh\" (UID: \"4bff2396-e50e-4314-8616-3af64254e86c\") " pod="openshift-controller-manager/controller-manager-6459b565c7-n9rsh" Mar 11 00:59:12 crc kubenswrapper[4744]: I0311 00:59:12.528252 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/4bff2396-e50e-4314-8616-3af64254e86c-proxy-ca-bundles\") pod \"controller-manager-6459b565c7-n9rsh\" (UID: \"4bff2396-e50e-4314-8616-3af64254e86c\") " pod="openshift-controller-manager/controller-manager-6459b565c7-n9rsh" Mar 11 00:59:12 crc kubenswrapper[4744]: I0311 00:59:12.528601 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/89e625d0-03f2-4be3-b6dc-d1bf1816446f-serving-cert\") pod \"route-controller-manager-7fcdb7cb78-69xd7\" (UID: \"89e625d0-03f2-4be3-b6dc-d1bf1816446f\") " pod="openshift-route-controller-manager/route-controller-manager-7fcdb7cb78-69xd7" Mar 11 00:59:12 crc kubenswrapper[4744]: I0311 00:59:12.529445 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4bff2396-e50e-4314-8616-3af64254e86c-config\") pod \"controller-manager-6459b565c7-n9rsh\" (UID: \"4bff2396-e50e-4314-8616-3af64254e86c\") " pod="openshift-controller-manager/controller-manager-6459b565c7-n9rsh" Mar 11 00:59:12 crc kubenswrapper[4744]: I0311 00:59:12.531090 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e1b5d764-9e1d-4be7-b365-85482c4e0def-utilities" (OuterVolumeSpecName: "utilities") pod "e1b5d764-9e1d-4be7-b365-85482c4e0def" (UID: "e1b5d764-9e1d-4be7-b365-85482c4e0def"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 11 00:59:12 crc kubenswrapper[4744]: I0311 00:59:12.531929 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/89e625d0-03f2-4be3-b6dc-d1bf1816446f-client-ca\") pod \"route-controller-manager-7fcdb7cb78-69xd7\" (UID: \"89e625d0-03f2-4be3-b6dc-d1bf1816446f\") " pod="openshift-route-controller-manager/route-controller-manager-7fcdb7cb78-69xd7" Mar 11 00:59:12 crc kubenswrapper[4744]: I0311 00:59:12.533194 4744 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-84b5546df4-ckw47"] Mar 11 00:59:12 crc kubenswrapper[4744]: I0311 00:59:12.534533 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4bff2396-e50e-4314-8616-3af64254e86c-serving-cert\") pod \"controller-manager-6459b565c7-n9rsh\" (UID: \"4bff2396-e50e-4314-8616-3af64254e86c\") " pod="openshift-controller-manager/controller-manager-6459b565c7-n9rsh" Mar 11 00:59:12 crc kubenswrapper[4744]: I0311 00:59:12.537751 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e1b5d764-9e1d-4be7-b365-85482c4e0def-kube-api-access-skfcr" (OuterVolumeSpecName: "kube-api-access-skfcr") pod "e1b5d764-9e1d-4be7-b365-85482c4e0def" (UID: "e1b5d764-9e1d-4be7-b365-85482c4e0def"). InnerVolumeSpecName "kube-api-access-skfcr". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 00:59:12 crc kubenswrapper[4744]: I0311 00:59:12.541453 4744 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-84b5546df4-ckw47"] Mar 11 00:59:12 crc kubenswrapper[4744]: I0311 00:59:12.543922 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dcz9b\" (UniqueName: \"kubernetes.io/projected/89e625d0-03f2-4be3-b6dc-d1bf1816446f-kube-api-access-dcz9b\") pod \"route-controller-manager-7fcdb7cb78-69xd7\" (UID: \"89e625d0-03f2-4be3-b6dc-d1bf1816446f\") " pod="openshift-route-controller-manager/route-controller-manager-7fcdb7cb78-69xd7" Mar 11 00:59:12 crc kubenswrapper[4744]: I0311 00:59:12.544894 4744 scope.go:117] "RemoveContainer" containerID="e0968aa306b02f8263ae0a7506feb8acfadbf47e0075227db7d4516dd4775148" Mar 11 00:59:12 crc kubenswrapper[4744]: I0311 00:59:12.546262 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-77svw\" (UniqueName: \"kubernetes.io/projected/4bff2396-e50e-4314-8616-3af64254e86c-kube-api-access-77svw\") pod \"controller-manager-6459b565c7-n9rsh\" (UID: \"4bff2396-e50e-4314-8616-3af64254e86c\") " pod="openshift-controller-manager/controller-manager-6459b565c7-n9rsh" Mar 11 00:59:12 crc kubenswrapper[4744]: I0311 00:59:12.559213 4744 scope.go:117] "RemoveContainer" containerID="10878c6bbf9561df6232073cb2911599e404187c087ba631ea0762979fd139d6" Mar 11 00:59:12 crc kubenswrapper[4744]: I0311 00:59:12.576818 4744 scope.go:117] "RemoveContainer" containerID="89550981a6b83f61728d9ecd73ef1bba91bc16b9da46bbf6d3079e4c4b1ef00d" Mar 11 00:59:12 crc kubenswrapper[4744]: I0311 00:59:12.592558 4744 scope.go:117] "RemoveContainer" containerID="e0968aa306b02f8263ae0a7506feb8acfadbf47e0075227db7d4516dd4775148" Mar 11 00:59:12 crc kubenswrapper[4744]: E0311 00:59:12.592826 4744 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = 
could not find container \"e0968aa306b02f8263ae0a7506feb8acfadbf47e0075227db7d4516dd4775148\": container with ID starting with e0968aa306b02f8263ae0a7506feb8acfadbf47e0075227db7d4516dd4775148 not found: ID does not exist" containerID="e0968aa306b02f8263ae0a7506feb8acfadbf47e0075227db7d4516dd4775148" Mar 11 00:59:12 crc kubenswrapper[4744]: I0311 00:59:12.592864 4744 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e0968aa306b02f8263ae0a7506feb8acfadbf47e0075227db7d4516dd4775148"} err="failed to get container status \"e0968aa306b02f8263ae0a7506feb8acfadbf47e0075227db7d4516dd4775148\": rpc error: code = NotFound desc = could not find container \"e0968aa306b02f8263ae0a7506feb8acfadbf47e0075227db7d4516dd4775148\": container with ID starting with e0968aa306b02f8263ae0a7506feb8acfadbf47e0075227db7d4516dd4775148 not found: ID does not exist" Mar 11 00:59:12 crc kubenswrapper[4744]: I0311 00:59:12.592890 4744 scope.go:117] "RemoveContainer" containerID="10878c6bbf9561df6232073cb2911599e404187c087ba631ea0762979fd139d6" Mar 11 00:59:12 crc kubenswrapper[4744]: E0311 00:59:12.593148 4744 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"10878c6bbf9561df6232073cb2911599e404187c087ba631ea0762979fd139d6\": container with ID starting with 10878c6bbf9561df6232073cb2911599e404187c087ba631ea0762979fd139d6 not found: ID does not exist" containerID="10878c6bbf9561df6232073cb2911599e404187c087ba631ea0762979fd139d6" Mar 11 00:59:12 crc kubenswrapper[4744]: I0311 00:59:12.593174 4744 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"10878c6bbf9561df6232073cb2911599e404187c087ba631ea0762979fd139d6"} err="failed to get container status \"10878c6bbf9561df6232073cb2911599e404187c087ba631ea0762979fd139d6\": rpc error: code = NotFound desc = could not find container 
\"10878c6bbf9561df6232073cb2911599e404187c087ba631ea0762979fd139d6\": container with ID starting with 10878c6bbf9561df6232073cb2911599e404187c087ba631ea0762979fd139d6 not found: ID does not exist" Mar 11 00:59:12 crc kubenswrapper[4744]: I0311 00:59:12.593198 4744 scope.go:117] "RemoveContainer" containerID="89550981a6b83f61728d9ecd73ef1bba91bc16b9da46bbf6d3079e4c4b1ef00d" Mar 11 00:59:12 crc kubenswrapper[4744]: E0311 00:59:12.593448 4744 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"89550981a6b83f61728d9ecd73ef1bba91bc16b9da46bbf6d3079e4c4b1ef00d\": container with ID starting with 89550981a6b83f61728d9ecd73ef1bba91bc16b9da46bbf6d3079e4c4b1ef00d not found: ID does not exist" containerID="89550981a6b83f61728d9ecd73ef1bba91bc16b9da46bbf6d3079e4c4b1ef00d" Mar 11 00:59:12 crc kubenswrapper[4744]: I0311 00:59:12.593470 4744 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"89550981a6b83f61728d9ecd73ef1bba91bc16b9da46bbf6d3079e4c4b1ef00d"} err="failed to get container status \"89550981a6b83f61728d9ecd73ef1bba91bc16b9da46bbf6d3079e4c4b1ef00d\": rpc error: code = NotFound desc = could not find container \"89550981a6b83f61728d9ecd73ef1bba91bc16b9da46bbf6d3079e4c4b1ef00d\": container with ID starting with 89550981a6b83f61728d9ecd73ef1bba91bc16b9da46bbf6d3079e4c4b1ef00d not found: ID does not exist" Mar 11 00:59:12 crc kubenswrapper[4744]: I0311 00:59:12.593484 4744 scope.go:117] "RemoveContainer" containerID="67ef97523c51d73e79e5d0e53184c3ea1b093c3a361d11977b27894fae53a3b3" Mar 11 00:59:12 crc kubenswrapper[4744]: I0311 00:59:12.624684 4744 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e1b5d764-9e1d-4be7-b365-85482c4e0def-utilities\") on node \"crc\" DevicePath \"\"" Mar 11 00:59:12 crc kubenswrapper[4744]: I0311 00:59:12.624718 4744 reconciler_common.go:293] "Volume detached for volume 
\"kube-api-access-skfcr\" (UniqueName: \"kubernetes.io/projected/e1b5d764-9e1d-4be7-b365-85482c4e0def-kube-api-access-skfcr\") on node \"crc\" DevicePath \"\"" Mar 11 00:59:12 crc kubenswrapper[4744]: I0311 00:59:12.653641 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-6459b565c7-n9rsh" Mar 11 00:59:12 crc kubenswrapper[4744]: I0311 00:59:12.653784 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e1b5d764-9e1d-4be7-b365-85482c4e0def-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "e1b5d764-9e1d-4be7-b365-85482c4e0def" (UID: "e1b5d764-9e1d-4be7-b365-85482c4e0def"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 11 00:59:12 crc kubenswrapper[4744]: I0311 00:59:12.669112 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-7fcdb7cb78-69xd7" Mar 11 00:59:12 crc kubenswrapper[4744]: I0311 00:59:12.726195 4744 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e1b5d764-9e1d-4be7-b365-85482c4e0def-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 11 00:59:12 crc kubenswrapper[4744]: I0311 00:59:12.845035 4744 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-6jf7l"] Mar 11 00:59:12 crc kubenswrapper[4744]: I0311 00:59:12.849579 4744 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-6jf7l"] Mar 11 00:59:12 crc kubenswrapper[4744]: I0311 00:59:12.925745 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-7fcdb7cb78-69xd7"] Mar 11 00:59:12 crc kubenswrapper[4744]: W0311 00:59:12.943853 4744 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod89e625d0_03f2_4be3_b6dc_d1bf1816446f.slice/crio-4e30ebd5c8fb53ee41cc57da0b54f7c6ced66aeddbcc71510fce93d0bea6eadc WatchSource:0}: Error finding container 4e30ebd5c8fb53ee41cc57da0b54f7c6ced66aeddbcc71510fce93d0bea6eadc: Status 404 returned error can't find the container with id 4e30ebd5c8fb53ee41cc57da0b54f7c6ced66aeddbcc71510fce93d0bea6eadc Mar 11 00:59:13 crc kubenswrapper[4744]: I0311 00:59:13.098103 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-6459b565c7-n9rsh"] Mar 11 00:59:13 crc kubenswrapper[4744]: I0311 00:59:13.518051 4744 generic.go:334] "Generic (PLEG): container finished" podID="a15dc7ac-7c34-4135-b6eb-a85122800ce9" containerID="9d5c92e7037925f24a5b88f538d4822fae94b65ea5f07960f3e465148975af4e" exitCode=0 Mar 11 00:59:13 crc kubenswrapper[4744]: I0311 00:59:13.518128 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-678nx" event={"ID":"a15dc7ac-7c34-4135-b6eb-a85122800ce9","Type":"ContainerDied","Data":"9d5c92e7037925f24a5b88f538d4822fae94b65ea5f07960f3e465148975af4e"} Mar 11 00:59:13 crc kubenswrapper[4744]: I0311 00:59:13.518677 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-678nx" event={"ID":"a15dc7ac-7c34-4135-b6eb-a85122800ce9","Type":"ContainerStarted","Data":"5ee5d96431a32414b69d9b9b50318c6a1ccb5a06fc8087509e71724d04b86732"} Mar 11 00:59:13 crc kubenswrapper[4744]: I0311 00:59:13.521776 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-7fcdb7cb78-69xd7" event={"ID":"89e625d0-03f2-4be3-b6dc-d1bf1816446f","Type":"ContainerStarted","Data":"90741114cb016aab09333410e28affa1db13bd1b0755fec70568506331ac4f0d"} Mar 11 00:59:13 crc kubenswrapper[4744]: I0311 00:59:13.521806 4744 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openshift-route-controller-manager/route-controller-manager-7fcdb7cb78-69xd7" event={"ID":"89e625d0-03f2-4be3-b6dc-d1bf1816446f","Type":"ContainerStarted","Data":"4e30ebd5c8fb53ee41cc57da0b54f7c6ced66aeddbcc71510fce93d0bea6eadc"} Mar 11 00:59:13 crc kubenswrapper[4744]: I0311 00:59:13.522060 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-7fcdb7cb78-69xd7" Mar 11 00:59:13 crc kubenswrapper[4744]: I0311 00:59:13.524866 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-6459b565c7-n9rsh" event={"ID":"4bff2396-e50e-4314-8616-3af64254e86c","Type":"ContainerStarted","Data":"e7099edda32293d8de039ab23baa253ecc0b793cd2565b8a577c386eb8dda392"} Mar 11 00:59:13 crc kubenswrapper[4744]: I0311 00:59:13.524941 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-6459b565c7-n9rsh" event={"ID":"4bff2396-e50e-4314-8616-3af64254e86c","Type":"ContainerStarted","Data":"5bcb39b8761beece6d1138f30110722e52a99577a2e20ab469c376d0a8755b41"} Mar 11 00:59:13 crc kubenswrapper[4744]: I0311 00:59:13.525108 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-6459b565c7-n9rsh" Mar 11 00:59:13 crc kubenswrapper[4744]: I0311 00:59:13.531462 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-6459b565c7-n9rsh" Mar 11 00:59:13 crc kubenswrapper[4744]: I0311 00:59:13.558877 4744 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-7fcdb7cb78-69xd7" podStartSLOduration=2.558855465 podStartE2EDuration="2.558855465s" podCreationTimestamp="2026-03-11 00:59:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" 
observedRunningTime="2026-03-11 00:59:13.551865932 +0000 UTC m=+310.356083547" watchObservedRunningTime="2026-03-11 00:59:13.558855465 +0000 UTC m=+310.363073070" Mar 11 00:59:13 crc kubenswrapper[4744]: I0311 00:59:13.586491 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-7fcdb7cb78-69xd7" Mar 11 00:59:13 crc kubenswrapper[4744]: I0311 00:59:13.617401 4744 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-6459b565c7-n9rsh" podStartSLOduration=2.617378272 podStartE2EDuration="2.617378272s" podCreationTimestamp="2026-03-11 00:59:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-11 00:59:13.57765504 +0000 UTC m=+310.381872655" watchObservedRunningTime="2026-03-11 00:59:13.617378272 +0000 UTC m=+310.421595887" Mar 11 00:59:13 crc kubenswrapper[4744]: I0311 00:59:13.985802 4744 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1352afda-75ac-4f8a-af81-28cb34d11373" path="/var/lib/kubelet/pods/1352afda-75ac-4f8a-af81-28cb34d11373/volumes" Mar 11 00:59:13 crc kubenswrapper[4744]: I0311 00:59:13.986336 4744 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="74a61e39-2210-4bb1-96c9-509eda04c4c7" path="/var/lib/kubelet/pods/74a61e39-2210-4bb1-96c9-509eda04c4c7/volumes" Mar 11 00:59:13 crc kubenswrapper[4744]: I0311 00:59:13.986970 4744 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="dd0cbcba-6518-4fdf-be7f-ec2ec7dbbaf6" path="/var/lib/kubelet/pods/dd0cbcba-6518-4fdf-be7f-ec2ec7dbbaf6/volumes" Mar 11 00:59:13 crc kubenswrapper[4744]: I0311 00:59:13.987422 4744 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e1b5d764-9e1d-4be7-b365-85482c4e0def" path="/var/lib/kubelet/pods/e1b5d764-9e1d-4be7-b365-85482c4e0def/volumes" Mar 11 00:59:14 crc 
kubenswrapper[4744]: I0311 00:59:14.101885 4744 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-n9qr2" Mar 11 00:59:14 crc kubenswrapper[4744]: I0311 00:59:14.101976 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-n9qr2" Mar 11 00:59:14 crc kubenswrapper[4744]: I0311 00:59:14.181099 4744 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-n9qr2" Mar 11 00:59:14 crc kubenswrapper[4744]: I0311 00:59:14.454917 4744 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-kk7lx" Mar 11 00:59:14 crc kubenswrapper[4744]: I0311 00:59:14.511967 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-kk7lx" Mar 11 00:59:14 crc kubenswrapper[4744]: I0311 00:59:14.530026 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-9zdvr" Mar 11 00:59:14 crc kubenswrapper[4744]: I0311 00:59:14.530492 4744 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-9zdvr" Mar 11 00:59:14 crc kubenswrapper[4744]: I0311 00:59:14.595046 4744 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-9zdvr" Mar 11 00:59:14 crc kubenswrapper[4744]: I0311 00:59:14.608317 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-n9qr2" Mar 11 00:59:14 crc kubenswrapper[4744]: I0311 00:59:14.734469 4744 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-qqw9j" Mar 11 00:59:14 crc kubenswrapper[4744]: I0311 00:59:14.734882 4744 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="" pod="openshift-marketplace/community-operators-qqw9j" Mar 11 00:59:14 crc kubenswrapper[4744]: I0311 00:59:14.783130 4744 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-qqw9j" Mar 11 00:59:15 crc kubenswrapper[4744]: I0311 00:59:15.598113 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-qqw9j" Mar 11 00:59:15 crc kubenswrapper[4744]: I0311 00:59:15.613337 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-9zdvr" Mar 11 00:59:16 crc kubenswrapper[4744]: I0311 00:59:16.317022 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication/oauth-openshift-75566f9bd7-8fqpp"] Mar 11 00:59:16 crc kubenswrapper[4744]: E0311 00:59:16.317609 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e1b5d764-9e1d-4be7-b365-85482c4e0def" containerName="extract-utilities" Mar 11 00:59:16 crc kubenswrapper[4744]: I0311 00:59:16.317624 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="e1b5d764-9e1d-4be7-b365-85482c4e0def" containerName="extract-utilities" Mar 11 00:59:16 crc kubenswrapper[4744]: E0311 00:59:16.317639 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e1b5d764-9e1d-4be7-b365-85482c4e0def" containerName="extract-content" Mar 11 00:59:16 crc kubenswrapper[4744]: I0311 00:59:16.317648 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="e1b5d764-9e1d-4be7-b365-85482c4e0def" containerName="extract-content" Mar 11 00:59:16 crc kubenswrapper[4744]: E0311 00:59:16.317664 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e1b5d764-9e1d-4be7-b365-85482c4e0def" containerName="registry-server" Mar 11 00:59:16 crc kubenswrapper[4744]: I0311 00:59:16.317672 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="e1b5d764-9e1d-4be7-b365-85482c4e0def" 
containerName="registry-server" Mar 11 00:59:16 crc kubenswrapper[4744]: I0311 00:59:16.317791 4744 memory_manager.go:354] "RemoveStaleState removing state" podUID="e1b5d764-9e1d-4be7-b365-85482c4e0def" containerName="registry-server" Mar 11 00:59:16 crc kubenswrapper[4744]: I0311 00:59:16.318283 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-75566f9bd7-8fqpp" Mar 11 00:59:16 crc kubenswrapper[4744]: I0311 00:59:16.321607 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-provider-selection" Mar 11 00:59:16 crc kubenswrapper[4744]: I0311 00:59:16.322245 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"audit" Mar 11 00:59:16 crc kubenswrapper[4744]: I0311 00:59:16.323287 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"kube-root-ca.crt" Mar 11 00:59:16 crc kubenswrapper[4744]: I0311 00:59:16.323938 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-cliconfig" Mar 11 00:59:16 crc kubenswrapper[4744]: I0311 00:59:16.323960 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"openshift-service-ca.crt" Mar 11 00:59:16 crc kubenswrapper[4744]: I0311 00:59:16.324725 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-session" Mar 11 00:59:16 crc kubenswrapper[4744]: I0311 00:59:16.324894 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-idp-0-file-data" Mar 11 00:59:16 crc kubenswrapper[4744]: I0311 00:59:16.324841 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"oauth-openshift-dockercfg-znhcc" Mar 11 00:59:16 crc kubenswrapper[4744]: I0311 00:59:16.325737 
4744 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-router-certs" Mar 11 00:59:16 crc kubenswrapper[4744]: I0311 00:59:16.327139 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-error" Mar 11 00:59:16 crc kubenswrapper[4744]: I0311 00:59:16.327441 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-service-ca" Mar 11 00:59:16 crc kubenswrapper[4744]: I0311 00:59:16.327821 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-serving-cert" Mar 11 00:59:16 crc kubenswrapper[4744]: I0311 00:59:16.337561 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-75566f9bd7-8fqpp"] Mar 11 00:59:16 crc kubenswrapper[4744]: I0311 00:59:16.339616 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-login" Mar 11 00:59:16 crc kubenswrapper[4744]: I0311 00:59:16.343297 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-trusted-ca-bundle" Mar 11 00:59:16 crc kubenswrapper[4744]: I0311 00:59:16.367154 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-ocp-branding-template" Mar 11 00:59:16 crc kubenswrapper[4744]: I0311 00:59:16.375439 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/0224e8f7-f768-4b9f-816a-13bf7bf8250f-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-75566f9bd7-8fqpp\" (UID: \"0224e8f7-f768-4b9f-816a-13bf7bf8250f\") " pod="openshift-authentication/oauth-openshift-75566f9bd7-8fqpp" Mar 11 00:59:16 crc 
kubenswrapper[4744]: I0311 00:59:16.375787 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/0224e8f7-f768-4b9f-816a-13bf7bf8250f-v4-0-config-system-service-ca\") pod \"oauth-openshift-75566f9bd7-8fqpp\" (UID: \"0224e8f7-f768-4b9f-816a-13bf7bf8250f\") " pod="openshift-authentication/oauth-openshift-75566f9bd7-8fqpp" Mar 11 00:59:16 crc kubenswrapper[4744]: I0311 00:59:16.375894 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0224e8f7-f768-4b9f-816a-13bf7bf8250f-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-75566f9bd7-8fqpp\" (UID: \"0224e8f7-f768-4b9f-816a-13bf7bf8250f\") " pod="openshift-authentication/oauth-openshift-75566f9bd7-8fqpp" Mar 11 00:59:16 crc kubenswrapper[4744]: I0311 00:59:16.376024 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/0224e8f7-f768-4b9f-816a-13bf7bf8250f-audit-policies\") pod \"oauth-openshift-75566f9bd7-8fqpp\" (UID: \"0224e8f7-f768-4b9f-816a-13bf7bf8250f\") " pod="openshift-authentication/oauth-openshift-75566f9bd7-8fqpp" Mar 11 00:59:16 crc kubenswrapper[4744]: I0311 00:59:16.376119 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/0224e8f7-f768-4b9f-816a-13bf7bf8250f-v4-0-config-system-session\") pod \"oauth-openshift-75566f9bd7-8fqpp\" (UID: \"0224e8f7-f768-4b9f-816a-13bf7bf8250f\") " pod="openshift-authentication/oauth-openshift-75566f9bd7-8fqpp" Mar 11 00:59:16 crc kubenswrapper[4744]: I0311 00:59:16.376178 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/0224e8f7-f768-4b9f-816a-13bf7bf8250f-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-75566f9bd7-8fqpp\" (UID: \"0224e8f7-f768-4b9f-816a-13bf7bf8250f\") " pod="openshift-authentication/oauth-openshift-75566f9bd7-8fqpp" Mar 11 00:59:16 crc kubenswrapper[4744]: I0311 00:59:16.376247 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/0224e8f7-f768-4b9f-816a-13bf7bf8250f-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-75566f9bd7-8fqpp\" (UID: \"0224e8f7-f768-4b9f-816a-13bf7bf8250f\") " pod="openshift-authentication/oauth-openshift-75566f9bd7-8fqpp" Mar 11 00:59:16 crc kubenswrapper[4744]: I0311 00:59:16.376404 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/0224e8f7-f768-4b9f-816a-13bf7bf8250f-audit-dir\") pod \"oauth-openshift-75566f9bd7-8fqpp\" (UID: \"0224e8f7-f768-4b9f-816a-13bf7bf8250f\") " pod="openshift-authentication/oauth-openshift-75566f9bd7-8fqpp" Mar 11 00:59:16 crc kubenswrapper[4744]: I0311 00:59:16.376459 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/0224e8f7-f768-4b9f-816a-13bf7bf8250f-v4-0-config-system-serving-cert\") pod \"oauth-openshift-75566f9bd7-8fqpp\" (UID: \"0224e8f7-f768-4b9f-816a-13bf7bf8250f\") " pod="openshift-authentication/oauth-openshift-75566f9bd7-8fqpp" Mar 11 00:59:16 crc kubenswrapper[4744]: I0311 00:59:16.376623 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/0224e8f7-f768-4b9f-816a-13bf7bf8250f-v4-0-config-system-router-certs\") pod 
\"oauth-openshift-75566f9bd7-8fqpp\" (UID: \"0224e8f7-f768-4b9f-816a-13bf7bf8250f\") " pod="openshift-authentication/oauth-openshift-75566f9bd7-8fqpp" Mar 11 00:59:16 crc kubenswrapper[4744]: I0311 00:59:16.376673 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vdrtm\" (UniqueName: \"kubernetes.io/projected/0224e8f7-f768-4b9f-816a-13bf7bf8250f-kube-api-access-vdrtm\") pod \"oauth-openshift-75566f9bd7-8fqpp\" (UID: \"0224e8f7-f768-4b9f-816a-13bf7bf8250f\") " pod="openshift-authentication/oauth-openshift-75566f9bd7-8fqpp" Mar 11 00:59:16 crc kubenswrapper[4744]: I0311 00:59:16.376733 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/0224e8f7-f768-4b9f-816a-13bf7bf8250f-v4-0-config-user-template-error\") pod \"oauth-openshift-75566f9bd7-8fqpp\" (UID: \"0224e8f7-f768-4b9f-816a-13bf7bf8250f\") " pod="openshift-authentication/oauth-openshift-75566f9bd7-8fqpp" Mar 11 00:59:16 crc kubenswrapper[4744]: I0311 00:59:16.376823 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/0224e8f7-f768-4b9f-816a-13bf7bf8250f-v4-0-config-user-template-login\") pod \"oauth-openshift-75566f9bd7-8fqpp\" (UID: \"0224e8f7-f768-4b9f-816a-13bf7bf8250f\") " pod="openshift-authentication/oauth-openshift-75566f9bd7-8fqpp" Mar 11 00:59:16 crc kubenswrapper[4744]: I0311 00:59:16.376931 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/0224e8f7-f768-4b9f-816a-13bf7bf8250f-v4-0-config-system-cliconfig\") pod \"oauth-openshift-75566f9bd7-8fqpp\" (UID: \"0224e8f7-f768-4b9f-816a-13bf7bf8250f\") " pod="openshift-authentication/oauth-openshift-75566f9bd7-8fqpp" Mar 11 00:59:16 crc 
kubenswrapper[4744]: I0311 00:59:16.427009 4744 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-9zdvr"] Mar 11 00:59:16 crc kubenswrapper[4744]: I0311 00:59:16.478647 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/0224e8f7-f768-4b9f-816a-13bf7bf8250f-audit-dir\") pod \"oauth-openshift-75566f9bd7-8fqpp\" (UID: \"0224e8f7-f768-4b9f-816a-13bf7bf8250f\") " pod="openshift-authentication/oauth-openshift-75566f9bd7-8fqpp" Mar 11 00:59:16 crc kubenswrapper[4744]: I0311 00:59:16.478703 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/0224e8f7-f768-4b9f-816a-13bf7bf8250f-v4-0-config-system-serving-cert\") pod \"oauth-openshift-75566f9bd7-8fqpp\" (UID: \"0224e8f7-f768-4b9f-816a-13bf7bf8250f\") " pod="openshift-authentication/oauth-openshift-75566f9bd7-8fqpp" Mar 11 00:59:16 crc kubenswrapper[4744]: I0311 00:59:16.478741 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/0224e8f7-f768-4b9f-816a-13bf7bf8250f-v4-0-config-system-router-certs\") pod \"oauth-openshift-75566f9bd7-8fqpp\" (UID: \"0224e8f7-f768-4b9f-816a-13bf7bf8250f\") " pod="openshift-authentication/oauth-openshift-75566f9bd7-8fqpp" Mar 11 00:59:16 crc kubenswrapper[4744]: I0311 00:59:16.478768 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vdrtm\" (UniqueName: \"kubernetes.io/projected/0224e8f7-f768-4b9f-816a-13bf7bf8250f-kube-api-access-vdrtm\") pod \"oauth-openshift-75566f9bd7-8fqpp\" (UID: \"0224e8f7-f768-4b9f-816a-13bf7bf8250f\") " pod="openshift-authentication/oauth-openshift-75566f9bd7-8fqpp" Mar 11 00:59:16 crc kubenswrapper[4744]: I0311 00:59:16.478770 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/0224e8f7-f768-4b9f-816a-13bf7bf8250f-audit-dir\") pod \"oauth-openshift-75566f9bd7-8fqpp\" (UID: \"0224e8f7-f768-4b9f-816a-13bf7bf8250f\") " pod="openshift-authentication/oauth-openshift-75566f9bd7-8fqpp" Mar 11 00:59:16 crc kubenswrapper[4744]: I0311 00:59:16.478792 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/0224e8f7-f768-4b9f-816a-13bf7bf8250f-v4-0-config-user-template-error\") pod \"oauth-openshift-75566f9bd7-8fqpp\" (UID: \"0224e8f7-f768-4b9f-816a-13bf7bf8250f\") " pod="openshift-authentication/oauth-openshift-75566f9bd7-8fqpp" Mar 11 00:59:16 crc kubenswrapper[4744]: I0311 00:59:16.478818 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/0224e8f7-f768-4b9f-816a-13bf7bf8250f-v4-0-config-user-template-login\") pod \"oauth-openshift-75566f9bd7-8fqpp\" (UID: \"0224e8f7-f768-4b9f-816a-13bf7bf8250f\") " pod="openshift-authentication/oauth-openshift-75566f9bd7-8fqpp" Mar 11 00:59:16 crc kubenswrapper[4744]: I0311 00:59:16.478850 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/0224e8f7-f768-4b9f-816a-13bf7bf8250f-v4-0-config-system-cliconfig\") pod \"oauth-openshift-75566f9bd7-8fqpp\" (UID: \"0224e8f7-f768-4b9f-816a-13bf7bf8250f\") " pod="openshift-authentication/oauth-openshift-75566f9bd7-8fqpp" Mar 11 00:59:16 crc kubenswrapper[4744]: I0311 00:59:16.478882 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/0224e8f7-f768-4b9f-816a-13bf7bf8250f-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-75566f9bd7-8fqpp\" (UID: \"0224e8f7-f768-4b9f-816a-13bf7bf8250f\") " 
pod="openshift-authentication/oauth-openshift-75566f9bd7-8fqpp" Mar 11 00:59:16 crc kubenswrapper[4744]: I0311 00:59:16.478918 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/0224e8f7-f768-4b9f-816a-13bf7bf8250f-v4-0-config-system-service-ca\") pod \"oauth-openshift-75566f9bd7-8fqpp\" (UID: \"0224e8f7-f768-4b9f-816a-13bf7bf8250f\") " pod="openshift-authentication/oauth-openshift-75566f9bd7-8fqpp" Mar 11 00:59:16 crc kubenswrapper[4744]: I0311 00:59:16.478939 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0224e8f7-f768-4b9f-816a-13bf7bf8250f-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-75566f9bd7-8fqpp\" (UID: \"0224e8f7-f768-4b9f-816a-13bf7bf8250f\") " pod="openshift-authentication/oauth-openshift-75566f9bd7-8fqpp" Mar 11 00:59:16 crc kubenswrapper[4744]: I0311 00:59:16.478964 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/0224e8f7-f768-4b9f-816a-13bf7bf8250f-audit-policies\") pod \"oauth-openshift-75566f9bd7-8fqpp\" (UID: \"0224e8f7-f768-4b9f-816a-13bf7bf8250f\") " pod="openshift-authentication/oauth-openshift-75566f9bd7-8fqpp" Mar 11 00:59:16 crc kubenswrapper[4744]: I0311 00:59:16.478996 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/0224e8f7-f768-4b9f-816a-13bf7bf8250f-v4-0-config-system-session\") pod \"oauth-openshift-75566f9bd7-8fqpp\" (UID: \"0224e8f7-f768-4b9f-816a-13bf7bf8250f\") " pod="openshift-authentication/oauth-openshift-75566f9bd7-8fqpp" Mar 11 00:59:16 crc kubenswrapper[4744]: I0311 00:59:16.479019 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-ocp-branding-template\" 
(UniqueName: \"kubernetes.io/secret/0224e8f7-f768-4b9f-816a-13bf7bf8250f-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-75566f9bd7-8fqpp\" (UID: \"0224e8f7-f768-4b9f-816a-13bf7bf8250f\") " pod="openshift-authentication/oauth-openshift-75566f9bd7-8fqpp" Mar 11 00:59:16 crc kubenswrapper[4744]: I0311 00:59:16.479042 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/0224e8f7-f768-4b9f-816a-13bf7bf8250f-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-75566f9bd7-8fqpp\" (UID: \"0224e8f7-f768-4b9f-816a-13bf7bf8250f\") " pod="openshift-authentication/oauth-openshift-75566f9bd7-8fqpp" Mar 11 00:59:16 crc kubenswrapper[4744]: I0311 00:59:16.480721 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/0224e8f7-f768-4b9f-816a-13bf7bf8250f-v4-0-config-system-service-ca\") pod \"oauth-openshift-75566f9bd7-8fqpp\" (UID: \"0224e8f7-f768-4b9f-816a-13bf7bf8250f\") " pod="openshift-authentication/oauth-openshift-75566f9bd7-8fqpp" Mar 11 00:59:16 crc kubenswrapper[4744]: I0311 00:59:16.481725 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0224e8f7-f768-4b9f-816a-13bf7bf8250f-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-75566f9bd7-8fqpp\" (UID: \"0224e8f7-f768-4b9f-816a-13bf7bf8250f\") " pod="openshift-authentication/oauth-openshift-75566f9bd7-8fqpp" Mar 11 00:59:16 crc kubenswrapper[4744]: I0311 00:59:16.482220 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/0224e8f7-f768-4b9f-816a-13bf7bf8250f-v4-0-config-system-cliconfig\") pod \"oauth-openshift-75566f9bd7-8fqpp\" (UID: \"0224e8f7-f768-4b9f-816a-13bf7bf8250f\") " 
pod="openshift-authentication/oauth-openshift-75566f9bd7-8fqpp" Mar 11 00:59:16 crc kubenswrapper[4744]: I0311 00:59:16.484891 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/0224e8f7-f768-4b9f-816a-13bf7bf8250f-audit-policies\") pod \"oauth-openshift-75566f9bd7-8fqpp\" (UID: \"0224e8f7-f768-4b9f-816a-13bf7bf8250f\") " pod="openshift-authentication/oauth-openshift-75566f9bd7-8fqpp" Mar 11 00:59:16 crc kubenswrapper[4744]: I0311 00:59:16.487912 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/0224e8f7-f768-4b9f-816a-13bf7bf8250f-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-75566f9bd7-8fqpp\" (UID: \"0224e8f7-f768-4b9f-816a-13bf7bf8250f\") " pod="openshift-authentication/oauth-openshift-75566f9bd7-8fqpp" Mar 11 00:59:16 crc kubenswrapper[4744]: I0311 00:59:16.488052 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/0224e8f7-f768-4b9f-816a-13bf7bf8250f-v4-0-config-user-template-error\") pod \"oauth-openshift-75566f9bd7-8fqpp\" (UID: \"0224e8f7-f768-4b9f-816a-13bf7bf8250f\") " pod="openshift-authentication/oauth-openshift-75566f9bd7-8fqpp" Mar 11 00:59:16 crc kubenswrapper[4744]: I0311 00:59:16.488800 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/0224e8f7-f768-4b9f-816a-13bf7bf8250f-v4-0-config-system-session\") pod \"oauth-openshift-75566f9bd7-8fqpp\" (UID: \"0224e8f7-f768-4b9f-816a-13bf7bf8250f\") " pod="openshift-authentication/oauth-openshift-75566f9bd7-8fqpp" Mar 11 00:59:16 crc kubenswrapper[4744]: I0311 00:59:16.489076 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-serving-cert\" (UniqueName: 
\"kubernetes.io/secret/0224e8f7-f768-4b9f-816a-13bf7bf8250f-v4-0-config-system-serving-cert\") pod \"oauth-openshift-75566f9bd7-8fqpp\" (UID: \"0224e8f7-f768-4b9f-816a-13bf7bf8250f\") " pod="openshift-authentication/oauth-openshift-75566f9bd7-8fqpp" Mar 11 00:59:16 crc kubenswrapper[4744]: I0311 00:59:16.489751 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/0224e8f7-f768-4b9f-816a-13bf7bf8250f-v4-0-config-user-template-login\") pod \"oauth-openshift-75566f9bd7-8fqpp\" (UID: \"0224e8f7-f768-4b9f-816a-13bf7bf8250f\") " pod="openshift-authentication/oauth-openshift-75566f9bd7-8fqpp" Mar 11 00:59:16 crc kubenswrapper[4744]: I0311 00:59:16.490699 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/0224e8f7-f768-4b9f-816a-13bf7bf8250f-v4-0-config-system-router-certs\") pod \"oauth-openshift-75566f9bd7-8fqpp\" (UID: \"0224e8f7-f768-4b9f-816a-13bf7bf8250f\") " pod="openshift-authentication/oauth-openshift-75566f9bd7-8fqpp" Mar 11 00:59:16 crc kubenswrapper[4744]: I0311 00:59:16.491816 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/0224e8f7-f768-4b9f-816a-13bf7bf8250f-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-75566f9bd7-8fqpp\" (UID: \"0224e8f7-f768-4b9f-816a-13bf7bf8250f\") " pod="openshift-authentication/oauth-openshift-75566f9bd7-8fqpp" Mar 11 00:59:16 crc kubenswrapper[4744]: I0311 00:59:16.501578 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/0224e8f7-f768-4b9f-816a-13bf7bf8250f-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-75566f9bd7-8fqpp\" (UID: \"0224e8f7-f768-4b9f-816a-13bf7bf8250f\") " 
pod="openshift-authentication/oauth-openshift-75566f9bd7-8fqpp" Mar 11 00:59:16 crc kubenswrapper[4744]: I0311 00:59:16.505703 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vdrtm\" (UniqueName: \"kubernetes.io/projected/0224e8f7-f768-4b9f-816a-13bf7bf8250f-kube-api-access-vdrtm\") pod \"oauth-openshift-75566f9bd7-8fqpp\" (UID: \"0224e8f7-f768-4b9f-816a-13bf7bf8250f\") " pod="openshift-authentication/oauth-openshift-75566f9bd7-8fqpp" Mar 11 00:59:16 crc kubenswrapper[4744]: I0311 00:59:16.656136 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-75566f9bd7-8fqpp" Mar 11 00:59:17 crc kubenswrapper[4744]: I0311 00:59:17.021588 4744 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-qqw9j"] Mar 11 00:59:17 crc kubenswrapper[4744]: I0311 00:59:17.158041 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-75566f9bd7-8fqpp"] Mar 11 00:59:17 crc kubenswrapper[4744]: I0311 00:59:17.563289 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-75566f9bd7-8fqpp" event={"ID":"0224e8f7-f768-4b9f-816a-13bf7bf8250f","Type":"ContainerStarted","Data":"c28625e943a98aef073bfd46cd894369efdc58adcdb7b9c9fae316c55a1fe7cb"} Mar 11 00:59:17 crc kubenswrapper[4744]: I0311 00:59:17.563502 4744 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-qqw9j" podUID="dc99af5e-bf41-49ac-8e4a-416f565cbfc9" containerName="registry-server" containerID="cri-o://ef263e117796496eafdfb8df295579b3dc85286077039f7acc4363ddde363286" gracePeriod=2 Mar 11 00:59:17 crc kubenswrapper[4744]: I0311 00:59:17.563685 4744 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-9zdvr" podUID="19ed9886-b3a9-4ea4-ac2a-2abd59a4b1fe" 
containerName="registry-server" containerID="cri-o://ffde99533538961db65ad7391ecd183011646d72ace5fcdb7e82034f86833ffc" gracePeriod=2 Mar 11 00:59:17 crc kubenswrapper[4744]: I0311 00:59:17.580219 4744 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-wr9vs" Mar 11 00:59:17 crc kubenswrapper[4744]: I0311 00:59:17.647504 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-wr9vs" Mar 11 00:59:18 crc kubenswrapper[4744]: I0311 00:59:18.574428 4744 generic.go:334] "Generic (PLEG): container finished" podID="dc99af5e-bf41-49ac-8e4a-416f565cbfc9" containerID="ef263e117796496eafdfb8df295579b3dc85286077039f7acc4363ddde363286" exitCode=0 Mar 11 00:59:18 crc kubenswrapper[4744]: I0311 00:59:18.574482 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-qqw9j" event={"ID":"dc99af5e-bf41-49ac-8e4a-416f565cbfc9","Type":"ContainerDied","Data":"ef263e117796496eafdfb8df295579b3dc85286077039f7acc4363ddde363286"} Mar 11 00:59:18 crc kubenswrapper[4744]: I0311 00:59:18.578074 4744 generic.go:334] "Generic (PLEG): container finished" podID="19ed9886-b3a9-4ea4-ac2a-2abd59a4b1fe" containerID="ffde99533538961db65ad7391ecd183011646d72ace5fcdb7e82034f86833ffc" exitCode=0 Mar 11 00:59:18 crc kubenswrapper[4744]: I0311 00:59:18.578161 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-9zdvr" event={"ID":"19ed9886-b3a9-4ea4-ac2a-2abd59a4b1fe","Type":"ContainerDied","Data":"ffde99533538961db65ad7391ecd183011646d72ace5fcdb7e82034f86833ffc"} Mar 11 00:59:18 crc kubenswrapper[4744]: I0311 00:59:18.580049 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-75566f9bd7-8fqpp" event={"ID":"0224e8f7-f768-4b9f-816a-13bf7bf8250f","Type":"ContainerStarted","Data":"fab366ea1584016cfa64644c2cd53c763ce69e88f342a07f875772729072d0bb"} 
Mar 11 00:59:18 crc kubenswrapper[4744]: I0311 00:59:18.622631 4744 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication/oauth-openshift-75566f9bd7-8fqpp" podStartSLOduration=33.622602266 podStartE2EDuration="33.622602266s" podCreationTimestamp="2026-03-11 00:58:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-11 00:59:18.609438272 +0000 UTC m=+315.413655887" watchObservedRunningTime="2026-03-11 00:59:18.622602266 +0000 UTC m=+315.426819891" Mar 11 00:59:18 crc kubenswrapper[4744]: I0311 00:59:18.859938 4744 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-9zdvr" Mar 11 00:59:18 crc kubenswrapper[4744]: I0311 00:59:18.916633 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/19ed9886-b3a9-4ea4-ac2a-2abd59a4b1fe-utilities\") pod \"19ed9886-b3a9-4ea4-ac2a-2abd59a4b1fe\" (UID: \"19ed9886-b3a9-4ea4-ac2a-2abd59a4b1fe\") " Mar 11 00:59:18 crc kubenswrapper[4744]: I0311 00:59:18.916676 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/19ed9886-b3a9-4ea4-ac2a-2abd59a4b1fe-catalog-content\") pod \"19ed9886-b3a9-4ea4-ac2a-2abd59a4b1fe\" (UID: \"19ed9886-b3a9-4ea4-ac2a-2abd59a4b1fe\") " Mar 11 00:59:18 crc kubenswrapper[4744]: I0311 00:59:18.916751 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cbsj6\" (UniqueName: \"kubernetes.io/projected/19ed9886-b3a9-4ea4-ac2a-2abd59a4b1fe-kube-api-access-cbsj6\") pod \"19ed9886-b3a9-4ea4-ac2a-2abd59a4b1fe\" (UID: \"19ed9886-b3a9-4ea4-ac2a-2abd59a4b1fe\") " Mar 11 00:59:18 crc kubenswrapper[4744]: I0311 00:59:18.917350 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/19ed9886-b3a9-4ea4-ac2a-2abd59a4b1fe-utilities" (OuterVolumeSpecName: "utilities") pod "19ed9886-b3a9-4ea4-ac2a-2abd59a4b1fe" (UID: "19ed9886-b3a9-4ea4-ac2a-2abd59a4b1fe"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 11 00:59:18 crc kubenswrapper[4744]: I0311 00:59:18.921444 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/19ed9886-b3a9-4ea4-ac2a-2abd59a4b1fe-kube-api-access-cbsj6" (OuterVolumeSpecName: "kube-api-access-cbsj6") pod "19ed9886-b3a9-4ea4-ac2a-2abd59a4b1fe" (UID: "19ed9886-b3a9-4ea4-ac2a-2abd59a4b1fe"). InnerVolumeSpecName "kube-api-access-cbsj6". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 00:59:18 crc kubenswrapper[4744]: I0311 00:59:18.959058 4744 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-qqw9j" Mar 11 00:59:18 crc kubenswrapper[4744]: I0311 00:59:18.973073 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/19ed9886-b3a9-4ea4-ac2a-2abd59a4b1fe-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "19ed9886-b3a9-4ea4-ac2a-2abd59a4b1fe" (UID: "19ed9886-b3a9-4ea4-ac2a-2abd59a4b1fe"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 11 00:59:19 crc kubenswrapper[4744]: I0311 00:59:19.017912 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/dc99af5e-bf41-49ac-8e4a-416f565cbfc9-utilities\") pod \"dc99af5e-bf41-49ac-8e4a-416f565cbfc9\" (UID: \"dc99af5e-bf41-49ac-8e4a-416f565cbfc9\") " Mar 11 00:59:19 crc kubenswrapper[4744]: I0311 00:59:19.018022 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jbm2q\" (UniqueName: \"kubernetes.io/projected/dc99af5e-bf41-49ac-8e4a-416f565cbfc9-kube-api-access-jbm2q\") pod \"dc99af5e-bf41-49ac-8e4a-416f565cbfc9\" (UID: \"dc99af5e-bf41-49ac-8e4a-416f565cbfc9\") " Mar 11 00:59:19 crc kubenswrapper[4744]: I0311 00:59:19.018059 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/dc99af5e-bf41-49ac-8e4a-416f565cbfc9-catalog-content\") pod \"dc99af5e-bf41-49ac-8e4a-416f565cbfc9\" (UID: \"dc99af5e-bf41-49ac-8e4a-416f565cbfc9\") " Mar 11 00:59:19 crc kubenswrapper[4744]: I0311 00:59:19.018249 4744 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cbsj6\" (UniqueName: \"kubernetes.io/projected/19ed9886-b3a9-4ea4-ac2a-2abd59a4b1fe-kube-api-access-cbsj6\") on node \"crc\" DevicePath \"\"" Mar 11 00:59:19 crc kubenswrapper[4744]: I0311 00:59:19.018261 4744 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/19ed9886-b3a9-4ea4-ac2a-2abd59a4b1fe-utilities\") on node \"crc\" DevicePath \"\"" Mar 11 00:59:19 crc kubenswrapper[4744]: I0311 00:59:19.018272 4744 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/19ed9886-b3a9-4ea4-ac2a-2abd59a4b1fe-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 11 00:59:19 crc kubenswrapper[4744]: I0311 00:59:19.018677 
4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/dc99af5e-bf41-49ac-8e4a-416f565cbfc9-utilities" (OuterVolumeSpecName: "utilities") pod "dc99af5e-bf41-49ac-8e4a-416f565cbfc9" (UID: "dc99af5e-bf41-49ac-8e4a-416f565cbfc9"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 11 00:59:19 crc kubenswrapper[4744]: I0311 00:59:19.026629 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dc99af5e-bf41-49ac-8e4a-416f565cbfc9-kube-api-access-jbm2q" (OuterVolumeSpecName: "kube-api-access-jbm2q") pod "dc99af5e-bf41-49ac-8e4a-416f565cbfc9" (UID: "dc99af5e-bf41-49ac-8e4a-416f565cbfc9"). InnerVolumeSpecName "kube-api-access-jbm2q". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 00:59:19 crc kubenswrapper[4744]: I0311 00:59:19.069770 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/dc99af5e-bf41-49ac-8e4a-416f565cbfc9-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "dc99af5e-bf41-49ac-8e4a-416f565cbfc9" (UID: "dc99af5e-bf41-49ac-8e4a-416f565cbfc9"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 11 00:59:19 crc kubenswrapper[4744]: I0311 00:59:19.119197 4744 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/dc99af5e-bf41-49ac-8e4a-416f565cbfc9-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 11 00:59:19 crc kubenswrapper[4744]: I0311 00:59:19.119230 4744 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/dc99af5e-bf41-49ac-8e4a-416f565cbfc9-utilities\") on node \"crc\" DevicePath \"\"" Mar 11 00:59:19 crc kubenswrapper[4744]: I0311 00:59:19.119241 4744 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jbm2q\" (UniqueName: \"kubernetes.io/projected/dc99af5e-bf41-49ac-8e4a-416f565cbfc9-kube-api-access-jbm2q\") on node \"crc\" DevicePath \"\"" Mar 11 00:59:19 crc kubenswrapper[4744]: I0311 00:59:19.587145 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-qqw9j" event={"ID":"dc99af5e-bf41-49ac-8e4a-416f565cbfc9","Type":"ContainerDied","Data":"9ba30dae4eade289d2d3a632a8d4670c27925cc289f4f5ba34d51c8926b7dc67"} Mar 11 00:59:19 crc kubenswrapper[4744]: I0311 00:59:19.587179 4744 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-qqw9j" Mar 11 00:59:19 crc kubenswrapper[4744]: I0311 00:59:19.587213 4744 scope.go:117] "RemoveContainer" containerID="ef263e117796496eafdfb8df295579b3dc85286077039f7acc4363ddde363286" Mar 11 00:59:19 crc kubenswrapper[4744]: I0311 00:59:19.590067 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-9zdvr" event={"ID":"19ed9886-b3a9-4ea4-ac2a-2abd59a4b1fe","Type":"ContainerDied","Data":"2df54e782d73ddaed638d5d19d951ca97d7e3f6c2f3de135e0a5204ecd1ee8ef"} Mar 11 00:59:19 crc kubenswrapper[4744]: I0311 00:59:19.590089 4744 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-9zdvr" Mar 11 00:59:19 crc kubenswrapper[4744]: I0311 00:59:19.590309 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-authentication/oauth-openshift-75566f9bd7-8fqpp" Mar 11 00:59:19 crc kubenswrapper[4744]: I0311 00:59:19.602989 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-authentication/oauth-openshift-75566f9bd7-8fqpp" Mar 11 00:59:19 crc kubenswrapper[4744]: I0311 00:59:19.606897 4744 scope.go:117] "RemoveContainer" containerID="8c97fba161ce219ac7c1bf9345d946770d408b3ce38db603d5963787e3fe71bf" Mar 11 00:59:19 crc kubenswrapper[4744]: I0311 00:59:19.644075 4744 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-qqw9j"] Mar 11 00:59:19 crc kubenswrapper[4744]: I0311 00:59:19.644963 4744 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-qqw9j"] Mar 11 00:59:19 crc kubenswrapper[4744]: I0311 00:59:19.660268 4744 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-9zdvr"] Mar 11 00:59:19 crc kubenswrapper[4744]: I0311 00:59:19.663913 4744 kubelet.go:2431] "SyncLoop REMOVE" source="api" 
pods=["openshift-marketplace/certified-operators-9zdvr"] Mar 11 00:59:19 crc kubenswrapper[4744]: I0311 00:59:19.688259 4744 scope.go:117] "RemoveContainer" containerID="631b1eb69f9b4638de5ac0d6d82ee56d7f16a5b14b07fd079958d79533493fac" Mar 11 00:59:19 crc kubenswrapper[4744]: I0311 00:59:19.713138 4744 scope.go:117] "RemoveContainer" containerID="ffde99533538961db65ad7391ecd183011646d72ace5fcdb7e82034f86833ffc" Mar 11 00:59:19 crc kubenswrapper[4744]: I0311 00:59:19.730694 4744 scope.go:117] "RemoveContainer" containerID="548ce83ca4de4ebf04137990470ec3c6f6d28d7aa274a3652d717450d6f4732a" Mar 11 00:59:19 crc kubenswrapper[4744]: I0311 00:59:19.754195 4744 scope.go:117] "RemoveContainer" containerID="f02016587b8ba1e3ccf46458c3752076907458feb5ab8d6ccf9c6b8063e296b4" Mar 11 00:59:19 crc kubenswrapper[4744]: I0311 00:59:19.984254 4744 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="19ed9886-b3a9-4ea4-ac2a-2abd59a4b1fe" path="/var/lib/kubelet/pods/19ed9886-b3a9-4ea4-ac2a-2abd59a4b1fe/volumes" Mar 11 00:59:19 crc kubenswrapper[4744]: I0311 00:59:19.985046 4744 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="dc99af5e-bf41-49ac-8e4a-416f565cbfc9" path="/var/lib/kubelet/pods/dc99af5e-bf41-49ac-8e4a-416f565cbfc9/volumes" Mar 11 00:59:20 crc kubenswrapper[4744]: I0311 00:59:20.125573 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 11 00:59:31 crc kubenswrapper[4744]: I0311 00:59:31.030410 4744 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-6459b565c7-n9rsh"] Mar 11 00:59:31 crc kubenswrapper[4744]: I0311 00:59:31.031119 4744 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-6459b565c7-n9rsh" podUID="4bff2396-e50e-4314-8616-3af64254e86c" containerName="controller-manager" 
containerID="cri-o://e7099edda32293d8de039ab23baa253ecc0b793cd2565b8a577c386eb8dda392" gracePeriod=30 Mar 11 00:59:31 crc kubenswrapper[4744]: I0311 00:59:31.131610 4744 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-7fcdb7cb78-69xd7"] Mar 11 00:59:31 crc kubenswrapper[4744]: I0311 00:59:31.131825 4744 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-7fcdb7cb78-69xd7" podUID="89e625d0-03f2-4be3-b6dc-d1bf1816446f" containerName="route-controller-manager" containerID="cri-o://90741114cb016aab09333410e28affa1db13bd1b0755fec70568506331ac4f0d" gracePeriod=30 Mar 11 00:59:31 crc kubenswrapper[4744]: I0311 00:59:31.633781 4744 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-7fcdb7cb78-69xd7" Mar 11 00:59:31 crc kubenswrapper[4744]: I0311 00:59:31.654125 4744 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-6459b565c7-n9rsh" Mar 11 00:59:31 crc kubenswrapper[4744]: I0311 00:59:31.684415 4744 generic.go:334] "Generic (PLEG): container finished" podID="89e625d0-03f2-4be3-b6dc-d1bf1816446f" containerID="90741114cb016aab09333410e28affa1db13bd1b0755fec70568506331ac4f0d" exitCode=0 Mar 11 00:59:31 crc kubenswrapper[4744]: I0311 00:59:31.684718 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-7fcdb7cb78-69xd7" event={"ID":"89e625d0-03f2-4be3-b6dc-d1bf1816446f","Type":"ContainerDied","Data":"90741114cb016aab09333410e28affa1db13bd1b0755fec70568506331ac4f0d"} Mar 11 00:59:31 crc kubenswrapper[4744]: I0311 00:59:31.684859 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-7fcdb7cb78-69xd7" event={"ID":"89e625d0-03f2-4be3-b6dc-d1bf1816446f","Type":"ContainerDied","Data":"4e30ebd5c8fb53ee41cc57da0b54f7c6ced66aeddbcc71510fce93d0bea6eadc"} Mar 11 00:59:31 crc kubenswrapper[4744]: I0311 00:59:31.684947 4744 scope.go:117] "RemoveContainer" containerID="90741114cb016aab09333410e28affa1db13bd1b0755fec70568506331ac4f0d" Mar 11 00:59:31 crc kubenswrapper[4744]: I0311 00:59:31.685129 4744 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-7fcdb7cb78-69xd7" Mar 11 00:59:31 crc kubenswrapper[4744]: I0311 00:59:31.688746 4744 generic.go:334] "Generic (PLEG): container finished" podID="4bff2396-e50e-4314-8616-3af64254e86c" containerID="e7099edda32293d8de039ab23baa253ecc0b793cd2565b8a577c386eb8dda392" exitCode=0 Mar 11 00:59:31 crc kubenswrapper[4744]: I0311 00:59:31.688845 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-6459b565c7-n9rsh" event={"ID":"4bff2396-e50e-4314-8616-3af64254e86c","Type":"ContainerDied","Data":"e7099edda32293d8de039ab23baa253ecc0b793cd2565b8a577c386eb8dda392"} Mar 11 00:59:31 crc kubenswrapper[4744]: I0311 00:59:31.688960 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-6459b565c7-n9rsh" event={"ID":"4bff2396-e50e-4314-8616-3af64254e86c","Type":"ContainerDied","Data":"5bcb39b8761beece6d1138f30110722e52a99577a2e20ab469c376d0a8755b41"} Mar 11 00:59:31 crc kubenswrapper[4744]: I0311 00:59:31.688933 4744 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-6459b565c7-n9rsh" Mar 11 00:59:31 crc kubenswrapper[4744]: I0311 00:59:31.700188 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/89e625d0-03f2-4be3-b6dc-d1bf1816446f-client-ca\") pod \"89e625d0-03f2-4be3-b6dc-d1bf1816446f\" (UID: \"89e625d0-03f2-4be3-b6dc-d1bf1816446f\") " Mar 11 00:59:31 crc kubenswrapper[4744]: I0311 00:59:31.700345 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/89e625d0-03f2-4be3-b6dc-d1bf1816446f-serving-cert\") pod \"89e625d0-03f2-4be3-b6dc-d1bf1816446f\" (UID: \"89e625d0-03f2-4be3-b6dc-d1bf1816446f\") " Mar 11 00:59:31 crc kubenswrapper[4744]: I0311 00:59:31.700440 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/89e625d0-03f2-4be3-b6dc-d1bf1816446f-config\") pod \"89e625d0-03f2-4be3-b6dc-d1bf1816446f\" (UID: \"89e625d0-03f2-4be3-b6dc-d1bf1816446f\") " Mar 11 00:59:31 crc kubenswrapper[4744]: I0311 00:59:31.700608 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dcz9b\" (UniqueName: \"kubernetes.io/projected/89e625d0-03f2-4be3-b6dc-d1bf1816446f-kube-api-access-dcz9b\") pod \"89e625d0-03f2-4be3-b6dc-d1bf1816446f\" (UID: \"89e625d0-03f2-4be3-b6dc-d1bf1816446f\") " Mar 11 00:59:31 crc kubenswrapper[4744]: I0311 00:59:31.701196 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/89e625d0-03f2-4be3-b6dc-d1bf1816446f-client-ca" (OuterVolumeSpecName: "client-ca") pod "89e625d0-03f2-4be3-b6dc-d1bf1816446f" (UID: "89e625d0-03f2-4be3-b6dc-d1bf1816446f"). InnerVolumeSpecName "client-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 11 00:59:31 crc kubenswrapper[4744]: I0311 00:59:31.702830 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/89e625d0-03f2-4be3-b6dc-d1bf1816446f-config" (OuterVolumeSpecName: "config") pod "89e625d0-03f2-4be3-b6dc-d1bf1816446f" (UID: "89e625d0-03f2-4be3-b6dc-d1bf1816446f"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 11 00:59:31 crc kubenswrapper[4744]: I0311 00:59:31.705333 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/89e625d0-03f2-4be3-b6dc-d1bf1816446f-kube-api-access-dcz9b" (OuterVolumeSpecName: "kube-api-access-dcz9b") pod "89e625d0-03f2-4be3-b6dc-d1bf1816446f" (UID: "89e625d0-03f2-4be3-b6dc-d1bf1816446f"). InnerVolumeSpecName "kube-api-access-dcz9b". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 00:59:31 crc kubenswrapper[4744]: I0311 00:59:31.706169 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/89e625d0-03f2-4be3-b6dc-d1bf1816446f-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "89e625d0-03f2-4be3-b6dc-d1bf1816446f" (UID: "89e625d0-03f2-4be3-b6dc-d1bf1816446f"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 00:59:31 crc kubenswrapper[4744]: I0311 00:59:31.707575 4744 scope.go:117] "RemoveContainer" containerID="90741114cb016aab09333410e28affa1db13bd1b0755fec70568506331ac4f0d" Mar 11 00:59:31 crc kubenswrapper[4744]: E0311 00:59:31.708199 4744 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"90741114cb016aab09333410e28affa1db13bd1b0755fec70568506331ac4f0d\": container with ID starting with 90741114cb016aab09333410e28affa1db13bd1b0755fec70568506331ac4f0d not found: ID does not exist" containerID="90741114cb016aab09333410e28affa1db13bd1b0755fec70568506331ac4f0d" Mar 11 00:59:31 crc kubenswrapper[4744]: I0311 00:59:31.708244 4744 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"90741114cb016aab09333410e28affa1db13bd1b0755fec70568506331ac4f0d"} err="failed to get container status \"90741114cb016aab09333410e28affa1db13bd1b0755fec70568506331ac4f0d\": rpc error: code = NotFound desc = could not find container \"90741114cb016aab09333410e28affa1db13bd1b0755fec70568506331ac4f0d\": container with ID starting with 90741114cb016aab09333410e28affa1db13bd1b0755fec70568506331ac4f0d not found: ID does not exist" Mar 11 00:59:31 crc kubenswrapper[4744]: I0311 00:59:31.708274 4744 scope.go:117] "RemoveContainer" containerID="e7099edda32293d8de039ab23baa253ecc0b793cd2565b8a577c386eb8dda392" Mar 11 00:59:31 crc kubenswrapper[4744]: I0311 00:59:31.730964 4744 scope.go:117] "RemoveContainer" containerID="e7099edda32293d8de039ab23baa253ecc0b793cd2565b8a577c386eb8dda392" Mar 11 00:59:31 crc kubenswrapper[4744]: E0311 00:59:31.731464 4744 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e7099edda32293d8de039ab23baa253ecc0b793cd2565b8a577c386eb8dda392\": container with ID starting with 
e7099edda32293d8de039ab23baa253ecc0b793cd2565b8a577c386eb8dda392 not found: ID does not exist" containerID="e7099edda32293d8de039ab23baa253ecc0b793cd2565b8a577c386eb8dda392" Mar 11 00:59:31 crc kubenswrapper[4744]: I0311 00:59:31.731506 4744 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e7099edda32293d8de039ab23baa253ecc0b793cd2565b8a577c386eb8dda392"} err="failed to get container status \"e7099edda32293d8de039ab23baa253ecc0b793cd2565b8a577c386eb8dda392\": rpc error: code = NotFound desc = could not find container \"e7099edda32293d8de039ab23baa253ecc0b793cd2565b8a577c386eb8dda392\": container with ID starting with e7099edda32293d8de039ab23baa253ecc0b793cd2565b8a577c386eb8dda392 not found: ID does not exist" Mar 11 00:59:31 crc kubenswrapper[4744]: I0311 00:59:31.801957 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-77svw\" (UniqueName: \"kubernetes.io/projected/4bff2396-e50e-4314-8616-3af64254e86c-kube-api-access-77svw\") pod \"4bff2396-e50e-4314-8616-3af64254e86c\" (UID: \"4bff2396-e50e-4314-8616-3af64254e86c\") " Mar 11 00:59:31 crc kubenswrapper[4744]: I0311 00:59:31.802036 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/4bff2396-e50e-4314-8616-3af64254e86c-client-ca\") pod \"4bff2396-e50e-4314-8616-3af64254e86c\" (UID: \"4bff2396-e50e-4314-8616-3af64254e86c\") " Mar 11 00:59:31 crc kubenswrapper[4744]: I0311 00:59:31.802084 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4bff2396-e50e-4314-8616-3af64254e86c-config\") pod \"4bff2396-e50e-4314-8616-3af64254e86c\" (UID: \"4bff2396-e50e-4314-8616-3af64254e86c\") " Mar 11 00:59:31 crc kubenswrapper[4744]: I0311 00:59:31.802111 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: 
\"kubernetes.io/configmap/4bff2396-e50e-4314-8616-3af64254e86c-proxy-ca-bundles\") pod \"4bff2396-e50e-4314-8616-3af64254e86c\" (UID: \"4bff2396-e50e-4314-8616-3af64254e86c\") " Mar 11 00:59:31 crc kubenswrapper[4744]: I0311 00:59:31.802144 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4bff2396-e50e-4314-8616-3af64254e86c-serving-cert\") pod \"4bff2396-e50e-4314-8616-3af64254e86c\" (UID: \"4bff2396-e50e-4314-8616-3af64254e86c\") " Mar 11 00:59:31 crc kubenswrapper[4744]: I0311 00:59:31.803177 4744 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/89e625d0-03f2-4be3-b6dc-d1bf1816446f-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 11 00:59:31 crc kubenswrapper[4744]: I0311 00:59:31.803229 4744 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/89e625d0-03f2-4be3-b6dc-d1bf1816446f-config\") on node \"crc\" DevicePath \"\"" Mar 11 00:59:31 crc kubenswrapper[4744]: I0311 00:59:31.803243 4744 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dcz9b\" (UniqueName: \"kubernetes.io/projected/89e625d0-03f2-4be3-b6dc-d1bf1816446f-kube-api-access-dcz9b\") on node \"crc\" DevicePath \"\"" Mar 11 00:59:31 crc kubenswrapper[4744]: I0311 00:59:31.803263 4744 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/89e625d0-03f2-4be3-b6dc-d1bf1816446f-client-ca\") on node \"crc\" DevicePath \"\"" Mar 11 00:59:31 crc kubenswrapper[4744]: I0311 00:59:31.803465 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4bff2396-e50e-4314-8616-3af64254e86c-client-ca" (OuterVolumeSpecName: "client-ca") pod "4bff2396-e50e-4314-8616-3af64254e86c" (UID: "4bff2396-e50e-4314-8616-3af64254e86c"). InnerVolumeSpecName "client-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 11 00:59:31 crc kubenswrapper[4744]: I0311 00:59:31.803681 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4bff2396-e50e-4314-8616-3af64254e86c-config" (OuterVolumeSpecName: "config") pod "4bff2396-e50e-4314-8616-3af64254e86c" (UID: "4bff2396-e50e-4314-8616-3af64254e86c"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 11 00:59:31 crc kubenswrapper[4744]: I0311 00:59:31.803676 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4bff2396-e50e-4314-8616-3af64254e86c-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "4bff2396-e50e-4314-8616-3af64254e86c" (UID: "4bff2396-e50e-4314-8616-3af64254e86c"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 11 00:59:31 crc kubenswrapper[4744]: I0311 00:59:31.806092 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4bff2396-e50e-4314-8616-3af64254e86c-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "4bff2396-e50e-4314-8616-3af64254e86c" (UID: "4bff2396-e50e-4314-8616-3af64254e86c"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 00:59:31 crc kubenswrapper[4744]: I0311 00:59:31.806449 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4bff2396-e50e-4314-8616-3af64254e86c-kube-api-access-77svw" (OuterVolumeSpecName: "kube-api-access-77svw") pod "4bff2396-e50e-4314-8616-3af64254e86c" (UID: "4bff2396-e50e-4314-8616-3af64254e86c"). InnerVolumeSpecName "kube-api-access-77svw". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 00:59:31 crc kubenswrapper[4744]: I0311 00:59:31.905091 4744 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4bff2396-e50e-4314-8616-3af64254e86c-config\") on node \"crc\" DevicePath \"\"" Mar 11 00:59:31 crc kubenswrapper[4744]: I0311 00:59:31.905150 4744 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/4bff2396-e50e-4314-8616-3af64254e86c-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Mar 11 00:59:31 crc kubenswrapper[4744]: I0311 00:59:31.905173 4744 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4bff2396-e50e-4314-8616-3af64254e86c-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 11 00:59:31 crc kubenswrapper[4744]: I0311 00:59:31.905195 4744 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-77svw\" (UniqueName: \"kubernetes.io/projected/4bff2396-e50e-4314-8616-3af64254e86c-kube-api-access-77svw\") on node \"crc\" DevicePath \"\"" Mar 11 00:59:31 crc kubenswrapper[4744]: I0311 00:59:31.905214 4744 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/4bff2396-e50e-4314-8616-3af64254e86c-client-ca\") on node \"crc\" DevicePath \"\"" Mar 11 00:59:32 crc kubenswrapper[4744]: I0311 00:59:32.042736 4744 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-7fcdb7cb78-69xd7"] Mar 11 00:59:32 crc kubenswrapper[4744]: I0311 00:59:32.047325 4744 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-7fcdb7cb78-69xd7"] Mar 11 00:59:32 crc kubenswrapper[4744]: I0311 00:59:32.057244 4744 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-6459b565c7-n9rsh"] Mar 11 00:59:32 crc 
kubenswrapper[4744]: I0311 00:59:32.062484 4744 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-6459b565c7-n9rsh"] Mar 11 00:59:32 crc kubenswrapper[4744]: I0311 00:59:32.335052 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-7997d67969-rsdkt"] Mar 11 00:59:32 crc kubenswrapper[4744]: E0311 00:59:32.335348 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="89e625d0-03f2-4be3-b6dc-d1bf1816446f" containerName="route-controller-manager" Mar 11 00:59:32 crc kubenswrapper[4744]: I0311 00:59:32.335363 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="89e625d0-03f2-4be3-b6dc-d1bf1816446f" containerName="route-controller-manager" Mar 11 00:59:32 crc kubenswrapper[4744]: E0311 00:59:32.335380 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dc99af5e-bf41-49ac-8e4a-416f565cbfc9" containerName="registry-server" Mar 11 00:59:32 crc kubenswrapper[4744]: I0311 00:59:32.335390 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="dc99af5e-bf41-49ac-8e4a-416f565cbfc9" containerName="registry-server" Mar 11 00:59:32 crc kubenswrapper[4744]: E0311 00:59:32.335405 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4bff2396-e50e-4314-8616-3af64254e86c" containerName="controller-manager" Mar 11 00:59:32 crc kubenswrapper[4744]: I0311 00:59:32.335414 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="4bff2396-e50e-4314-8616-3af64254e86c" containerName="controller-manager" Mar 11 00:59:32 crc kubenswrapper[4744]: E0311 00:59:32.335424 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="19ed9886-b3a9-4ea4-ac2a-2abd59a4b1fe" containerName="extract-content" Mar 11 00:59:32 crc kubenswrapper[4744]: I0311 00:59:32.335432 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="19ed9886-b3a9-4ea4-ac2a-2abd59a4b1fe" containerName="extract-content" Mar 11 00:59:32 crc 
kubenswrapper[4744]: E0311 00:59:32.335450 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="19ed9886-b3a9-4ea4-ac2a-2abd59a4b1fe" containerName="registry-server" Mar 11 00:59:32 crc kubenswrapper[4744]: I0311 00:59:32.335458 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="19ed9886-b3a9-4ea4-ac2a-2abd59a4b1fe" containerName="registry-server" Mar 11 00:59:32 crc kubenswrapper[4744]: E0311 00:59:32.335472 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="19ed9886-b3a9-4ea4-ac2a-2abd59a4b1fe" containerName="extract-utilities" Mar 11 00:59:32 crc kubenswrapper[4744]: I0311 00:59:32.335480 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="19ed9886-b3a9-4ea4-ac2a-2abd59a4b1fe" containerName="extract-utilities" Mar 11 00:59:32 crc kubenswrapper[4744]: E0311 00:59:32.335492 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dc99af5e-bf41-49ac-8e4a-416f565cbfc9" containerName="extract-content" Mar 11 00:59:32 crc kubenswrapper[4744]: I0311 00:59:32.335502 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="dc99af5e-bf41-49ac-8e4a-416f565cbfc9" containerName="extract-content" Mar 11 00:59:32 crc kubenswrapper[4744]: E0311 00:59:32.335541 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dc99af5e-bf41-49ac-8e4a-416f565cbfc9" containerName="extract-utilities" Mar 11 00:59:32 crc kubenswrapper[4744]: I0311 00:59:32.335549 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="dc99af5e-bf41-49ac-8e4a-416f565cbfc9" containerName="extract-utilities" Mar 11 00:59:32 crc kubenswrapper[4744]: I0311 00:59:32.335660 4744 memory_manager.go:354] "RemoveStaleState removing state" podUID="dc99af5e-bf41-49ac-8e4a-416f565cbfc9" containerName="registry-server" Mar 11 00:59:32 crc kubenswrapper[4744]: I0311 00:59:32.335675 4744 memory_manager.go:354] "RemoveStaleState removing state" podUID="89e625d0-03f2-4be3-b6dc-d1bf1816446f" containerName="route-controller-manager" Mar 11 
00:59:32 crc kubenswrapper[4744]: I0311 00:59:32.335689 4744 memory_manager.go:354] "RemoveStaleState removing state" podUID="19ed9886-b3a9-4ea4-ac2a-2abd59a4b1fe" containerName="registry-server" Mar 11 00:59:32 crc kubenswrapper[4744]: I0311 00:59:32.335701 4744 memory_manager.go:354] "RemoveStaleState removing state" podUID="4bff2396-e50e-4314-8616-3af64254e86c" containerName="controller-manager" Mar 11 00:59:32 crc kubenswrapper[4744]: I0311 00:59:32.336142 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-7997d67969-rsdkt" Mar 11 00:59:32 crc kubenswrapper[4744]: I0311 00:59:32.340553 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Mar 11 00:59:32 crc kubenswrapper[4744]: I0311 00:59:32.340800 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Mar 11 00:59:32 crc kubenswrapper[4744]: I0311 00:59:32.342965 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Mar 11 00:59:32 crc kubenswrapper[4744]: I0311 00:59:32.344144 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-7cddd5cdb6-pbkp2"] Mar 11 00:59:32 crc kubenswrapper[4744]: I0311 00:59:32.345562 4744 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-7cddd5cdb6-pbkp2" Mar 11 00:59:32 crc kubenswrapper[4744]: I0311 00:59:32.359251 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Mar 11 00:59:32 crc kubenswrapper[4744]: I0311 00:59:32.359281 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Mar 11 00:59:32 crc kubenswrapper[4744]: I0311 00:59:32.359468 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Mar 11 00:59:32 crc kubenswrapper[4744]: I0311 00:59:32.359724 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Mar 11 00:59:32 crc kubenswrapper[4744]: I0311 00:59:32.359867 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Mar 11 00:59:32 crc kubenswrapper[4744]: I0311 00:59:32.361264 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Mar 11 00:59:32 crc kubenswrapper[4744]: I0311 00:59:32.361378 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Mar 11 00:59:32 crc kubenswrapper[4744]: I0311 00:59:32.361730 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Mar 11 00:59:32 crc kubenswrapper[4744]: I0311 00:59:32.361857 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Mar 11 00:59:32 crc kubenswrapper[4744]: I0311 00:59:32.368074 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Mar 11 00:59:32 crc 
kubenswrapper[4744]: I0311 00:59:32.371567 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-7997d67969-rsdkt"] Mar 11 00:59:32 crc kubenswrapper[4744]: I0311 00:59:32.380051 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-7cddd5cdb6-pbkp2"] Mar 11 00:59:32 crc kubenswrapper[4744]: I0311 00:59:32.413023 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x9dtf\" (UniqueName: \"kubernetes.io/projected/56fb8d15-ae54-4941-b15d-128d0622ce22-kube-api-access-x9dtf\") pod \"route-controller-manager-7cddd5cdb6-pbkp2\" (UID: \"56fb8d15-ae54-4941-b15d-128d0622ce22\") " pod="openshift-route-controller-manager/route-controller-manager-7cddd5cdb6-pbkp2" Mar 11 00:59:32 crc kubenswrapper[4744]: I0311 00:59:32.413107 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tzd9b\" (UniqueName: \"kubernetes.io/projected/ae3c2671-f3d5-407b-809e-1e8a741eff89-kube-api-access-tzd9b\") pod \"controller-manager-7997d67969-rsdkt\" (UID: \"ae3c2671-f3d5-407b-809e-1e8a741eff89\") " pod="openshift-controller-manager/controller-manager-7997d67969-rsdkt" Mar 11 00:59:32 crc kubenswrapper[4744]: I0311 00:59:32.413170 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/56fb8d15-ae54-4941-b15d-128d0622ce22-serving-cert\") pod \"route-controller-manager-7cddd5cdb6-pbkp2\" (UID: \"56fb8d15-ae54-4941-b15d-128d0622ce22\") " pod="openshift-route-controller-manager/route-controller-manager-7cddd5cdb6-pbkp2" Mar 11 00:59:32 crc kubenswrapper[4744]: I0311 00:59:32.413207 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/ae3c2671-f3d5-407b-809e-1e8a741eff89-serving-cert\") pod \"controller-manager-7997d67969-rsdkt\" (UID: \"ae3c2671-f3d5-407b-809e-1e8a741eff89\") " pod="openshift-controller-manager/controller-manager-7997d67969-rsdkt" Mar 11 00:59:32 crc kubenswrapper[4744]: I0311 00:59:32.413247 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ae3c2671-f3d5-407b-809e-1e8a741eff89-config\") pod \"controller-manager-7997d67969-rsdkt\" (UID: \"ae3c2671-f3d5-407b-809e-1e8a741eff89\") " pod="openshift-controller-manager/controller-manager-7997d67969-rsdkt" Mar 11 00:59:32 crc kubenswrapper[4744]: I0311 00:59:32.413307 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/56fb8d15-ae54-4941-b15d-128d0622ce22-config\") pod \"route-controller-manager-7cddd5cdb6-pbkp2\" (UID: \"56fb8d15-ae54-4941-b15d-128d0622ce22\") " pod="openshift-route-controller-manager/route-controller-manager-7cddd5cdb6-pbkp2" Mar 11 00:59:32 crc kubenswrapper[4744]: I0311 00:59:32.413366 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/56fb8d15-ae54-4941-b15d-128d0622ce22-client-ca\") pod \"route-controller-manager-7cddd5cdb6-pbkp2\" (UID: \"56fb8d15-ae54-4941-b15d-128d0622ce22\") " pod="openshift-route-controller-manager/route-controller-manager-7cddd5cdb6-pbkp2" Mar 11 00:59:32 crc kubenswrapper[4744]: I0311 00:59:32.413430 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/ae3c2671-f3d5-407b-809e-1e8a741eff89-client-ca\") pod \"controller-manager-7997d67969-rsdkt\" (UID: \"ae3c2671-f3d5-407b-809e-1e8a741eff89\") " pod="openshift-controller-manager/controller-manager-7997d67969-rsdkt" Mar 11 
00:59:32 crc kubenswrapper[4744]: I0311 00:59:32.413491 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/ae3c2671-f3d5-407b-809e-1e8a741eff89-proxy-ca-bundles\") pod \"controller-manager-7997d67969-rsdkt\" (UID: \"ae3c2671-f3d5-407b-809e-1e8a741eff89\") " pod="openshift-controller-manager/controller-manager-7997d67969-rsdkt" Mar 11 00:59:32 crc kubenswrapper[4744]: I0311 00:59:32.514254 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ae3c2671-f3d5-407b-809e-1e8a741eff89-config\") pod \"controller-manager-7997d67969-rsdkt\" (UID: \"ae3c2671-f3d5-407b-809e-1e8a741eff89\") " pod="openshift-controller-manager/controller-manager-7997d67969-rsdkt" Mar 11 00:59:32 crc kubenswrapper[4744]: I0311 00:59:32.514756 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/56fb8d15-ae54-4941-b15d-128d0622ce22-config\") pod \"route-controller-manager-7cddd5cdb6-pbkp2\" (UID: \"56fb8d15-ae54-4941-b15d-128d0622ce22\") " pod="openshift-route-controller-manager/route-controller-manager-7cddd5cdb6-pbkp2" Mar 11 00:59:32 crc kubenswrapper[4744]: I0311 00:59:32.514926 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/56fb8d15-ae54-4941-b15d-128d0622ce22-client-ca\") pod \"route-controller-manager-7cddd5cdb6-pbkp2\" (UID: \"56fb8d15-ae54-4941-b15d-128d0622ce22\") " pod="openshift-route-controller-manager/route-controller-manager-7cddd5cdb6-pbkp2" Mar 11 00:59:32 crc kubenswrapper[4744]: I0311 00:59:32.515132 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/ae3c2671-f3d5-407b-809e-1e8a741eff89-client-ca\") pod \"controller-manager-7997d67969-rsdkt\" (UID: 
\"ae3c2671-f3d5-407b-809e-1e8a741eff89\") " pod="openshift-controller-manager/controller-manager-7997d67969-rsdkt" Mar 11 00:59:32 crc kubenswrapper[4744]: I0311 00:59:32.515336 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/ae3c2671-f3d5-407b-809e-1e8a741eff89-proxy-ca-bundles\") pod \"controller-manager-7997d67969-rsdkt\" (UID: \"ae3c2671-f3d5-407b-809e-1e8a741eff89\") " pod="openshift-controller-manager/controller-manager-7997d67969-rsdkt" Mar 11 00:59:32 crc kubenswrapper[4744]: I0311 00:59:32.515556 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x9dtf\" (UniqueName: \"kubernetes.io/projected/56fb8d15-ae54-4941-b15d-128d0622ce22-kube-api-access-x9dtf\") pod \"route-controller-manager-7cddd5cdb6-pbkp2\" (UID: \"56fb8d15-ae54-4941-b15d-128d0622ce22\") " pod="openshift-route-controller-manager/route-controller-manager-7cddd5cdb6-pbkp2" Mar 11 00:59:32 crc kubenswrapper[4744]: I0311 00:59:32.515732 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tzd9b\" (UniqueName: \"kubernetes.io/projected/ae3c2671-f3d5-407b-809e-1e8a741eff89-kube-api-access-tzd9b\") pod \"controller-manager-7997d67969-rsdkt\" (UID: \"ae3c2671-f3d5-407b-809e-1e8a741eff89\") " pod="openshift-controller-manager/controller-manager-7997d67969-rsdkt" Mar 11 00:59:32 crc kubenswrapper[4744]: I0311 00:59:32.515912 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/56fb8d15-ae54-4941-b15d-128d0622ce22-serving-cert\") pod \"route-controller-manager-7cddd5cdb6-pbkp2\" (UID: \"56fb8d15-ae54-4941-b15d-128d0622ce22\") " pod="openshift-route-controller-manager/route-controller-manager-7cddd5cdb6-pbkp2" Mar 11 00:59:32 crc kubenswrapper[4744]: I0311 00:59:32.517833 4744 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ae3c2671-f3d5-407b-809e-1e8a741eff89-serving-cert\") pod \"controller-manager-7997d67969-rsdkt\" (UID: \"ae3c2671-f3d5-407b-809e-1e8a741eff89\") " pod="openshift-controller-manager/controller-manager-7997d67969-rsdkt" Mar 11 00:59:32 crc kubenswrapper[4744]: I0311 00:59:32.516843 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/56fb8d15-ae54-4941-b15d-128d0622ce22-client-ca\") pod \"route-controller-manager-7cddd5cdb6-pbkp2\" (UID: \"56fb8d15-ae54-4941-b15d-128d0622ce22\") " pod="openshift-route-controller-manager/route-controller-manager-7cddd5cdb6-pbkp2" Mar 11 00:59:32 crc kubenswrapper[4744]: I0311 00:59:32.517253 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ae3c2671-f3d5-407b-809e-1e8a741eff89-config\") pod \"controller-manager-7997d67969-rsdkt\" (UID: \"ae3c2671-f3d5-407b-809e-1e8a741eff89\") " pod="openshift-controller-manager/controller-manager-7997d67969-rsdkt" Mar 11 00:59:32 crc kubenswrapper[4744]: I0311 00:59:32.516819 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/ae3c2671-f3d5-407b-809e-1e8a741eff89-client-ca\") pod \"controller-manager-7997d67969-rsdkt\" (UID: \"ae3c2671-f3d5-407b-809e-1e8a741eff89\") " pod="openshift-controller-manager/controller-manager-7997d67969-rsdkt" Mar 11 00:59:32 crc kubenswrapper[4744]: I0311 00:59:32.518458 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/56fb8d15-ae54-4941-b15d-128d0622ce22-config\") pod \"route-controller-manager-7cddd5cdb6-pbkp2\" (UID: \"56fb8d15-ae54-4941-b15d-128d0622ce22\") " pod="openshift-route-controller-manager/route-controller-manager-7cddd5cdb6-pbkp2" Mar 11 00:59:32 crc kubenswrapper[4744]: I0311 00:59:32.518715 4744 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/ae3c2671-f3d5-407b-809e-1e8a741eff89-proxy-ca-bundles\") pod \"controller-manager-7997d67969-rsdkt\" (UID: \"ae3c2671-f3d5-407b-809e-1e8a741eff89\") " pod="openshift-controller-manager/controller-manager-7997d67969-rsdkt" Mar 11 00:59:32 crc kubenswrapper[4744]: I0311 00:59:32.524696 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ae3c2671-f3d5-407b-809e-1e8a741eff89-serving-cert\") pod \"controller-manager-7997d67969-rsdkt\" (UID: \"ae3c2671-f3d5-407b-809e-1e8a741eff89\") " pod="openshift-controller-manager/controller-manager-7997d67969-rsdkt" Mar 11 00:59:32 crc kubenswrapper[4744]: I0311 00:59:32.526414 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/56fb8d15-ae54-4941-b15d-128d0622ce22-serving-cert\") pod \"route-controller-manager-7cddd5cdb6-pbkp2\" (UID: \"56fb8d15-ae54-4941-b15d-128d0622ce22\") " pod="openshift-route-controller-manager/route-controller-manager-7cddd5cdb6-pbkp2" Mar 11 00:59:32 crc kubenswrapper[4744]: I0311 00:59:32.544568 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x9dtf\" (UniqueName: \"kubernetes.io/projected/56fb8d15-ae54-4941-b15d-128d0622ce22-kube-api-access-x9dtf\") pod \"route-controller-manager-7cddd5cdb6-pbkp2\" (UID: \"56fb8d15-ae54-4941-b15d-128d0622ce22\") " pod="openshift-route-controller-manager/route-controller-manager-7cddd5cdb6-pbkp2" Mar 11 00:59:32 crc kubenswrapper[4744]: I0311 00:59:32.547118 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tzd9b\" (UniqueName: \"kubernetes.io/projected/ae3c2671-f3d5-407b-809e-1e8a741eff89-kube-api-access-tzd9b\") pod \"controller-manager-7997d67969-rsdkt\" (UID: \"ae3c2671-f3d5-407b-809e-1e8a741eff89\") " 
pod="openshift-controller-manager/controller-manager-7997d67969-rsdkt" Mar 11 00:59:32 crc kubenswrapper[4744]: I0311 00:59:32.685476 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-7997d67969-rsdkt" Mar 11 00:59:32 crc kubenswrapper[4744]: I0311 00:59:32.698958 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-7cddd5cdb6-pbkp2" Mar 11 00:59:32 crc kubenswrapper[4744]: I0311 00:59:32.982255 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-7997d67969-rsdkt"] Mar 11 00:59:32 crc kubenswrapper[4744]: W0311 00:59:32.987301 4744 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podae3c2671_f3d5_407b_809e_1e8a741eff89.slice/crio-308fccc90f191b857ced3ff2d3cce7a49d1f548f2ab3f2aa8645d4c8f2030465 WatchSource:0}: Error finding container 308fccc90f191b857ced3ff2d3cce7a49d1f548f2ab3f2aa8645d4c8f2030465: Status 404 returned error can't find the container with id 308fccc90f191b857ced3ff2d3cce7a49d1f548f2ab3f2aa8645d4c8f2030465 Mar 11 00:59:33 crc kubenswrapper[4744]: I0311 00:59:33.026524 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-7cddd5cdb6-pbkp2"] Mar 11 00:59:33 crc kubenswrapper[4744]: W0311 00:59:33.043943 4744 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod56fb8d15_ae54_4941_b15d_128d0622ce22.slice/crio-1346b82c5badbf22c080597ed0d4a5da8c6a4b9bcc2b50e875552a47733050f4 WatchSource:0}: Error finding container 1346b82c5badbf22c080597ed0d4a5da8c6a4b9bcc2b50e875552a47733050f4: Status 404 returned error can't find the container with id 1346b82c5badbf22c080597ed0d4a5da8c6a4b9bcc2b50e875552a47733050f4 Mar 11 00:59:33 crc 
kubenswrapper[4744]: I0311 00:59:33.706549 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-7cddd5cdb6-pbkp2" event={"ID":"56fb8d15-ae54-4941-b15d-128d0622ce22","Type":"ContainerStarted","Data":"ab33bb66b1f76ae9797c056c78e7892c8b8502aef7b27472cad66fb42abd8686"} Mar 11 00:59:33 crc kubenswrapper[4744]: I0311 00:59:33.706589 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-7cddd5cdb6-pbkp2" event={"ID":"56fb8d15-ae54-4941-b15d-128d0622ce22","Type":"ContainerStarted","Data":"1346b82c5badbf22c080597ed0d4a5da8c6a4b9bcc2b50e875552a47733050f4"} Mar 11 00:59:33 crc kubenswrapper[4744]: I0311 00:59:33.706888 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-7cddd5cdb6-pbkp2" Mar 11 00:59:33 crc kubenswrapper[4744]: I0311 00:59:33.708269 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-7997d67969-rsdkt" event={"ID":"ae3c2671-f3d5-407b-809e-1e8a741eff89","Type":"ContainerStarted","Data":"9fe902b73b767ede4d5bce57d1ed9bdb788c4b38b6ca8e243be88f0a4cb689d3"} Mar 11 00:59:33 crc kubenswrapper[4744]: I0311 00:59:33.708319 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-7997d67969-rsdkt" event={"ID":"ae3c2671-f3d5-407b-809e-1e8a741eff89","Type":"ContainerStarted","Data":"308fccc90f191b857ced3ff2d3cce7a49d1f548f2ab3f2aa8645d4c8f2030465"} Mar 11 00:59:33 crc kubenswrapper[4744]: I0311 00:59:33.708468 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-7997d67969-rsdkt" Mar 11 00:59:33 crc kubenswrapper[4744]: I0311 00:59:33.714943 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openshift-route-controller-manager/route-controller-manager-7cddd5cdb6-pbkp2" Mar 11 00:59:33 crc kubenswrapper[4744]: I0311 00:59:33.716645 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-7997d67969-rsdkt" Mar 11 00:59:33 crc kubenswrapper[4744]: I0311 00:59:33.726040 4744 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-7cddd5cdb6-pbkp2" podStartSLOduration=2.726029602 podStartE2EDuration="2.726029602s" podCreationTimestamp="2026-03-11 00:59:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-11 00:59:33.725863476 +0000 UTC m=+330.530081081" watchObservedRunningTime="2026-03-11 00:59:33.726029602 +0000 UTC m=+330.530247207" Mar 11 00:59:33 crc kubenswrapper[4744]: I0311 00:59:33.751723 4744 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-7997d67969-rsdkt" podStartSLOduration=2.751706579 podStartE2EDuration="2.751706579s" podCreationTimestamp="2026-03-11 00:59:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-11 00:59:33.750493736 +0000 UTC m=+330.554711341" watchObservedRunningTime="2026-03-11 00:59:33.751706579 +0000 UTC m=+330.555924194" Mar 11 00:59:33 crc kubenswrapper[4744]: I0311 00:59:33.986895 4744 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4bff2396-e50e-4314-8616-3af64254e86c" path="/var/lib/kubelet/pods/4bff2396-e50e-4314-8616-3af64254e86c/volumes" Mar 11 00:59:33 crc kubenswrapper[4744]: I0311 00:59:33.988255 4744 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="89e625d0-03f2-4be3-b6dc-d1bf1816446f" path="/var/lib/kubelet/pods/89e625d0-03f2-4be3-b6dc-d1bf1816446f/volumes" Mar 11 
00:59:37 crc kubenswrapper[4744]: I0311 00:59:37.210050 4744 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Mar 11 00:59:37 crc kubenswrapper[4744]: I0311 00:59:37.211971 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 11 00:59:37 crc kubenswrapper[4744]: I0311 00:59:37.266367 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Mar 11 00:59:37 crc kubenswrapper[4744]: I0311 00:59:37.275337 4744 kubelet.go:2431] "SyncLoop REMOVE" source="file" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Mar 11 00:59:37 crc kubenswrapper[4744]: I0311 00:59:37.275814 4744 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" containerID="cri-o://31713fedc28cf71b8df25e22000768f5d00e6de1b228a632aa3c51aea11eea7c" gracePeriod=15 Mar 11 00:59:37 crc kubenswrapper[4744]: I0311 00:59:37.275957 4744 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" containerID="cri-o://c9f3a6d71079f1717c07e0b026f34d1bfd6391a5bb6745bcdab535d5290e2aa3" gracePeriod=15 Mar 11 00:59:37 crc kubenswrapper[4744]: I0311 00:59:37.276000 4744 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" containerID="cri-o://e702350d7b926ec4df6853a20470ea7515f1e1c98720db1a4d672ea4e03e82fa" gracePeriod=15 Mar 11 00:59:37 crc kubenswrapper[4744]: I0311 00:59:37.276066 4744 kuberuntime_container.go:808] "Killing container with a grace 
period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" containerID="cri-o://2d93a4815010e9ee724175731b03bf82804921c5bbb9fe8be236057f90057f28" gracePeriod=15 Mar 11 00:59:37 crc kubenswrapper[4744]: I0311 00:59:37.276064 4744 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" containerID="cri-o://6dd956673c1f46da2ead777e897aa77847f3ad0108c48a521efb8dd8f4d286ab" gracePeriod=15 Mar 11 00:59:37 crc kubenswrapper[4744]: I0311 00:59:37.283874 4744 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Mar 11 00:59:37 crc kubenswrapper[4744]: E0311 00:59:37.284322 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" Mar 11 00:59:37 crc kubenswrapper[4744]: I0311 00:59:37.284344 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" Mar 11 00:59:37 crc kubenswrapper[4744]: E0311 00:59:37.284361 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" Mar 11 00:59:37 crc kubenswrapper[4744]: I0311 00:59:37.284373 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" Mar 11 00:59:37 crc kubenswrapper[4744]: E0311 00:59:37.284394 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="setup" Mar 11 00:59:37 crc kubenswrapper[4744]: I0311 00:59:37.284406 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="setup" Mar 11 
00:59:37 crc kubenswrapper[4744]: E0311 00:59:37.284441 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 11 00:59:37 crc kubenswrapper[4744]: I0311 00:59:37.284453 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 11 00:59:37 crc kubenswrapper[4744]: E0311 00:59:37.284469 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 11 00:59:37 crc kubenswrapper[4744]: I0311 00:59:37.284481 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 11 00:59:37 crc kubenswrapper[4744]: E0311 00:59:37.284496 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" Mar 11 00:59:37 crc kubenswrapper[4744]: I0311 00:59:37.284508 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" Mar 11 00:59:37 crc kubenswrapper[4744]: E0311 00:59:37.284559 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" Mar 11 00:59:37 crc kubenswrapper[4744]: I0311 00:59:37.284571 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" Mar 11 00:59:37 crc kubenswrapper[4744]: E0311 00:59:37.284589 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 11 00:59:37 crc kubenswrapper[4744]: I0311 00:59:37.284603 4744 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 11 00:59:37 crc kubenswrapper[4744]: E0311 00:59:37.284621 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 11 00:59:37 crc kubenswrapper[4744]: I0311 00:59:37.284633 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 11 00:59:37 crc kubenswrapper[4744]: I0311 00:59:37.284838 4744 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 11 00:59:37 crc kubenswrapper[4744]: I0311 00:59:37.284860 4744 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" Mar 11 00:59:37 crc kubenswrapper[4744]: I0311 00:59:37.284880 4744 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 11 00:59:37 crc kubenswrapper[4744]: I0311 00:59:37.284904 4744 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 11 00:59:37 crc kubenswrapper[4744]: I0311 00:59:37.284928 4744 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 11 00:59:37 crc kubenswrapper[4744]: I0311 00:59:37.284944 4744 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" Mar 11 00:59:37 crc kubenswrapper[4744]: I0311 00:59:37.284963 4744 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" 
containerName="kube-apiserver" Mar 11 00:59:37 crc kubenswrapper[4744]: I0311 00:59:37.284981 4744 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" Mar 11 00:59:37 crc kubenswrapper[4744]: E0311 00:59:37.287041 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 11 00:59:37 crc kubenswrapper[4744]: I0311 00:59:37.287090 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 11 00:59:37 crc kubenswrapper[4744]: E0311 00:59:37.287152 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 11 00:59:37 crc kubenswrapper[4744]: I0311 00:59:37.287165 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 11 00:59:37 crc kubenswrapper[4744]: I0311 00:59:37.287345 4744 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 11 00:59:37 crc kubenswrapper[4744]: I0311 00:59:37.287363 4744 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 11 00:59:37 crc kubenswrapper[4744]: I0311 00:59:37.288898 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 11 00:59:37 crc kubenswrapper[4744]: I0311 00:59:37.288995 4744 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 11 00:59:37 crc kubenswrapper[4744]: I0311 00:59:37.289068 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 11 00:59:37 crc kubenswrapper[4744]: I0311 00:59:37.289136 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 11 00:59:37 crc kubenswrapper[4744]: I0311 00:59:37.289247 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 11 00:59:37 crc kubenswrapper[4744]: I0311 00:59:37.390370 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 11 
00:59:37 crc kubenswrapper[4744]: I0311 00:59:37.390450 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 11 00:59:37 crc kubenswrapper[4744]: I0311 00:59:37.390504 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 11 00:59:37 crc kubenswrapper[4744]: I0311 00:59:37.390564 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 11 00:59:37 crc kubenswrapper[4744]: I0311 00:59:37.390630 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 11 00:59:37 crc kubenswrapper[4744]: I0311 00:59:37.390692 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" 
Mar 11 00:59:37 crc kubenswrapper[4744]: I0311 00:59:37.390723 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 11 00:59:37 crc kubenswrapper[4744]: I0311 00:59:37.390850 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 11 00:59:37 crc kubenswrapper[4744]: I0311 00:59:37.390957 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 11 00:59:37 crc kubenswrapper[4744]: I0311 00:59:37.391019 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 11 00:59:37 crc kubenswrapper[4744]: I0311 00:59:37.390738 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 11 00:59:37 crc 
kubenswrapper[4744]: I0311 00:59:37.391269 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 11 00:59:37 crc kubenswrapper[4744]: I0311 00:59:37.391343 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 11 00:59:37 crc kubenswrapper[4744]: I0311 00:59:37.492379 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 11 00:59:37 crc kubenswrapper[4744]: I0311 00:59:37.492438 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 11 00:59:37 crc kubenswrapper[4744]: I0311 00:59:37.492475 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 11 00:59:37 crc kubenswrapper[4744]: I0311 00:59:37.492503 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"cert-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 11 00:59:37 crc kubenswrapper[4744]: I0311 00:59:37.492554 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 11 00:59:37 crc kubenswrapper[4744]: I0311 00:59:37.492602 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 11 00:59:37 crc kubenswrapper[4744]: I0311 00:59:37.554712 4744 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 11 00:59:37 crc kubenswrapper[4744]: W0311 00:59:37.579438 4744 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf85e55b1a89d02b0cb034b1ea31ed45a.slice/crio-a23787193db99011f859dd93a00b4b2cd03470afab3e59866371d0cff8a36a5d WatchSource:0}: Error finding container a23787193db99011f859dd93a00b4b2cd03470afab3e59866371d0cff8a36a5d: Status 404 returned error can't find the container with id a23787193db99011f859dd93a00b4b2cd03470afab3e59866371d0cff8a36a5d Mar 11 00:59:37 crc kubenswrapper[4744]: E0311 00:59:37.584745 4744 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/events\": dial tcp 38.102.83.58:6443: connect: connection refused" event="&Event{ObjectMeta:{kube-apiserver-startup-monitor-crc.189ba39e10df8ebb openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-startup-monitor-crc,UID:f85e55b1a89d02b0cb034b1ea31ed45a,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{startup-monitor},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-11 00:59:37.583668923 +0000 UTC m=+334.387886538,LastTimestamp:2026-03-11 00:59:37.583668923 +0000 UTC m=+334.387886538,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 11 00:59:37 crc kubenswrapper[4744]: I0311 00:59:37.742205 4744 generic.go:334] "Generic (PLEG): container finished" 
podID="516c0d23-985b-4de5-9b7c-c7651922d5d1" containerID="7821ae34742746b27397c69d882161b112c63a34c60bd33d3db066b2a792afdf" exitCode=0 Mar 11 00:59:37 crc kubenswrapper[4744]: I0311 00:59:37.742309 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"516c0d23-985b-4de5-9b7c-c7651922d5d1","Type":"ContainerDied","Data":"7821ae34742746b27397c69d882161b112c63a34c60bd33d3db066b2a792afdf"} Mar 11 00:59:37 crc kubenswrapper[4744]: I0311 00:59:37.744350 4744 status_manager.go:851] "Failed to get status for pod" podUID="516c0d23-985b-4de5-9b7c-c7651922d5d1" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.58:6443: connect: connection refused" Mar 11 00:59:37 crc kubenswrapper[4744]: I0311 00:59:37.744994 4744 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.58:6443: connect: connection refused" Mar 11 00:59:37 crc kubenswrapper[4744]: I0311 00:59:37.745233 4744 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.58:6443: connect: connection refused" Mar 11 00:59:37 crc kubenswrapper[4744]: I0311 00:59:37.746465 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" event={"ID":"f85e55b1a89d02b0cb034b1ea31ed45a","Type":"ContainerStarted","Data":"a23787193db99011f859dd93a00b4b2cd03470afab3e59866371d0cff8a36a5d"} Mar 11 00:59:37 crc 
kubenswrapper[4744]: I0311 00:59:37.748315 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/4.log" Mar 11 00:59:37 crc kubenswrapper[4744]: I0311 00:59:37.749446 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Mar 11 00:59:37 crc kubenswrapper[4744]: I0311 00:59:37.750295 4744 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="6dd956673c1f46da2ead777e897aa77847f3ad0108c48a521efb8dd8f4d286ab" exitCode=0 Mar 11 00:59:37 crc kubenswrapper[4744]: I0311 00:59:37.750313 4744 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="c9f3a6d71079f1717c07e0b026f34d1bfd6391a5bb6745bcdab535d5290e2aa3" exitCode=0 Mar 11 00:59:37 crc kubenswrapper[4744]: I0311 00:59:37.750321 4744 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="e702350d7b926ec4df6853a20470ea7515f1e1c98720db1a4d672ea4e03e82fa" exitCode=0 Mar 11 00:59:37 crc kubenswrapper[4744]: I0311 00:59:37.750328 4744 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="2d93a4815010e9ee724175731b03bf82804921c5bbb9fe8be236057f90057f28" exitCode=2 Mar 11 00:59:37 crc kubenswrapper[4744]: I0311 00:59:37.750348 4744 scope.go:117] "RemoveContainer" containerID="4b2b952a5490bff5028c83fdfb0942e84bac782d470cede57ec6a31f483a407e" Mar 11 00:59:38 crc kubenswrapper[4744]: I0311 00:59:38.767014 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" event={"ID":"f85e55b1a89d02b0cb034b1ea31ed45a","Type":"ContainerStarted","Data":"2adc76989ea79bc4b3ba27a5a7c3e41f7ec56593e840c74754b6b77c54414a91"} Mar 11 00:59:38 crc 
kubenswrapper[4744]: I0311 00:59:38.768289 4744 status_manager.go:851] "Failed to get status for pod" podUID="516c0d23-985b-4de5-9b7c-c7651922d5d1" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.58:6443: connect: connection refused" Mar 11 00:59:38 crc kubenswrapper[4744]: I0311 00:59:38.769154 4744 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.58:6443: connect: connection refused" Mar 11 00:59:38 crc kubenswrapper[4744]: I0311 00:59:38.772955 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Mar 11 00:59:39 crc kubenswrapper[4744]: I0311 00:59:39.215185 4744 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Mar 11 00:59:39 crc kubenswrapper[4744]: I0311 00:59:39.216236 4744 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.58:6443: connect: connection refused" Mar 11 00:59:39 crc kubenswrapper[4744]: I0311 00:59:39.216688 4744 status_manager.go:851] "Failed to get status for pod" podUID="516c0d23-985b-4de5-9b7c-c7651922d5d1" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.58:6443: connect: connection refused" Mar 11 00:59:39 crc kubenswrapper[4744]: I0311 00:59:39.321385 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/516c0d23-985b-4de5-9b7c-c7651922d5d1-kube-api-access\") pod \"516c0d23-985b-4de5-9b7c-c7651922d5d1\" (UID: \"516c0d23-985b-4de5-9b7c-c7651922d5d1\") " Mar 11 00:59:39 crc kubenswrapper[4744]: I0311 00:59:39.321546 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/516c0d23-985b-4de5-9b7c-c7651922d5d1-kubelet-dir\") pod \"516c0d23-985b-4de5-9b7c-c7651922d5d1\" (UID: \"516c0d23-985b-4de5-9b7c-c7651922d5d1\") " Mar 11 00:59:39 crc kubenswrapper[4744]: I0311 00:59:39.321585 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/516c0d23-985b-4de5-9b7c-c7651922d5d1-var-lock\") pod \"516c0d23-985b-4de5-9b7c-c7651922d5d1\" (UID: \"516c0d23-985b-4de5-9b7c-c7651922d5d1\") " Mar 11 00:59:39 crc kubenswrapper[4744]: I0311 00:59:39.321678 4744 
operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/516c0d23-985b-4de5-9b7c-c7651922d5d1-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "516c0d23-985b-4de5-9b7c-c7651922d5d1" (UID: "516c0d23-985b-4de5-9b7c-c7651922d5d1"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 11 00:59:39 crc kubenswrapper[4744]: I0311 00:59:39.321718 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/516c0d23-985b-4de5-9b7c-c7651922d5d1-var-lock" (OuterVolumeSpecName: "var-lock") pod "516c0d23-985b-4de5-9b7c-c7651922d5d1" (UID: "516c0d23-985b-4de5-9b7c-c7651922d5d1"). InnerVolumeSpecName "var-lock". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 11 00:59:39 crc kubenswrapper[4744]: I0311 00:59:39.321956 4744 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/516c0d23-985b-4de5-9b7c-c7651922d5d1-kubelet-dir\") on node \"crc\" DevicePath \"\"" Mar 11 00:59:39 crc kubenswrapper[4744]: I0311 00:59:39.321975 4744 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/516c0d23-985b-4de5-9b7c-c7651922d5d1-var-lock\") on node \"crc\" DevicePath \"\"" Mar 11 00:59:39 crc kubenswrapper[4744]: I0311 00:59:39.330147 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/516c0d23-985b-4de5-9b7c-c7651922d5d1-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "516c0d23-985b-4de5-9b7c-c7651922d5d1" (UID: "516c0d23-985b-4de5-9b7c-c7651922d5d1"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 00:59:39 crc kubenswrapper[4744]: I0311 00:59:39.433309 4744 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/516c0d23-985b-4de5-9b7c-c7651922d5d1-kube-api-access\") on node \"crc\" DevicePath \"\"" Mar 11 00:59:39 crc kubenswrapper[4744]: I0311 00:59:39.705550 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Mar 11 00:59:39 crc kubenswrapper[4744]: I0311 00:59:39.706615 4744 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 11 00:59:39 crc kubenswrapper[4744]: I0311 00:59:39.707364 4744 status_manager.go:851] "Failed to get status for pod" podUID="516c0d23-985b-4de5-9b7c-c7651922d5d1" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.58:6443: connect: connection refused" Mar 11 00:59:39 crc kubenswrapper[4744]: I0311 00:59:39.707921 4744 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.58:6443: connect: connection refused" Mar 11 00:59:39 crc kubenswrapper[4744]: I0311 00:59:39.708590 4744 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.58:6443: connect: connection refused" Mar 11 00:59:39 crc kubenswrapper[4744]: I0311 
00:59:39.786343 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Mar 11 00:59:39 crc kubenswrapper[4744]: I0311 00:59:39.787643 4744 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="31713fedc28cf71b8df25e22000768f5d00e6de1b228a632aa3c51aea11eea7c" exitCode=0 Mar 11 00:59:39 crc kubenswrapper[4744]: I0311 00:59:39.787776 4744 scope.go:117] "RemoveContainer" containerID="6dd956673c1f46da2ead777e897aa77847f3ad0108c48a521efb8dd8f4d286ab" Mar 11 00:59:39 crc kubenswrapper[4744]: I0311 00:59:39.787825 4744 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 11 00:59:39 crc kubenswrapper[4744]: I0311 00:59:39.790055 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"516c0d23-985b-4de5-9b7c-c7651922d5d1","Type":"ContainerDied","Data":"ad89c494731c6635289d395edc076681cbc26bffab17b54a68ec1d003a784e99"} Mar 11 00:59:39 crc kubenswrapper[4744]: I0311 00:59:39.790104 4744 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Mar 11 00:59:39 crc kubenswrapper[4744]: I0311 00:59:39.790128 4744 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ad89c494731c6635289d395edc076681cbc26bffab17b54a68ec1d003a784e99" Mar 11 00:59:39 crc kubenswrapper[4744]: I0311 00:59:39.816227 4744 scope.go:117] "RemoveContainer" containerID="c9f3a6d71079f1717c07e0b026f34d1bfd6391a5bb6745bcdab535d5290e2aa3" Mar 11 00:59:39 crc kubenswrapper[4744]: I0311 00:59:39.817078 4744 status_manager.go:851] "Failed to get status for pod" podUID="516c0d23-985b-4de5-9b7c-c7651922d5d1" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.58:6443: connect: connection refused" Mar 11 00:59:39 crc kubenswrapper[4744]: I0311 00:59:39.817653 4744 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.58:6443: connect: connection refused" Mar 11 00:59:39 crc kubenswrapper[4744]: I0311 00:59:39.818298 4744 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.58:6443: connect: connection refused" Mar 11 00:59:39 crc kubenswrapper[4744]: I0311 00:59:39.839028 4744 scope.go:117] "RemoveContainer" containerID="e702350d7b926ec4df6853a20470ea7515f1e1c98720db1a4d672ea4e03e82fa" Mar 11 00:59:39 crc kubenswrapper[4744]: I0311 00:59:39.839110 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir" (OuterVolumeSpecName: "audit-dir") pod "f4b27818a5e8e43d0dc095d08835c792" (UID: "f4b27818a5e8e43d0dc095d08835c792"). InnerVolumeSpecName "audit-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 11 00:59:39 crc kubenswrapper[4744]: I0311 00:59:39.839054 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"f4b27818a5e8e43d0dc095d08835c792\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " Mar 11 00:59:39 crc kubenswrapper[4744]: I0311 00:59:39.839310 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"f4b27818a5e8e43d0dc095d08835c792\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " Mar 11 00:59:39 crc kubenswrapper[4744]: I0311 00:59:39.839371 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"f4b27818a5e8e43d0dc095d08835c792\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " Mar 11 00:59:39 crc kubenswrapper[4744]: I0311 00:59:39.839448 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir" (OuterVolumeSpecName: "resource-dir") pod "f4b27818a5e8e43d0dc095d08835c792" (UID: "f4b27818a5e8e43d0dc095d08835c792"). InnerVolumeSpecName "resource-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 11 00:59:39 crc kubenswrapper[4744]: I0311 00:59:39.839447 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir" (OuterVolumeSpecName: "cert-dir") pod "f4b27818a5e8e43d0dc095d08835c792" (UID: "f4b27818a5e8e43d0dc095d08835c792"). InnerVolumeSpecName "cert-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 11 00:59:39 crc kubenswrapper[4744]: I0311 00:59:39.839724 4744 reconciler_common.go:293] "Volume detached for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") on node \"crc\" DevicePath \"\"" Mar 11 00:59:39 crc kubenswrapper[4744]: I0311 00:59:39.839746 4744 reconciler_common.go:293] "Volume detached for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") on node \"crc\" DevicePath \"\"" Mar 11 00:59:39 crc kubenswrapper[4744]: I0311 00:59:39.839764 4744 reconciler_common.go:293] "Volume detached for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") on node \"crc\" DevicePath \"\"" Mar 11 00:59:39 crc kubenswrapper[4744]: I0311 00:59:39.865079 4744 scope.go:117] "RemoveContainer" containerID="2d93a4815010e9ee724175731b03bf82804921c5bbb9fe8be236057f90057f28" Mar 11 00:59:39 crc kubenswrapper[4744]: I0311 00:59:39.882729 4744 scope.go:117] "RemoveContainer" containerID="31713fedc28cf71b8df25e22000768f5d00e6de1b228a632aa3c51aea11eea7c" Mar 11 00:59:39 crc kubenswrapper[4744]: I0311 00:59:39.899583 4744 scope.go:117] "RemoveContainer" containerID="edfc2acc25990d1ccbca879e186fb9dfc0a6664fa1df5cee62958ff52b217748" Mar 11 00:59:39 crc kubenswrapper[4744]: I0311 00:59:39.926792 4744 scope.go:117] "RemoveContainer" containerID="6dd956673c1f46da2ead777e897aa77847f3ad0108c48a521efb8dd8f4d286ab" Mar 11 00:59:39 crc kubenswrapper[4744]: E0311 
00:59:39.927446 4744 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6dd956673c1f46da2ead777e897aa77847f3ad0108c48a521efb8dd8f4d286ab\": container with ID starting with 6dd956673c1f46da2ead777e897aa77847f3ad0108c48a521efb8dd8f4d286ab not found: ID does not exist" containerID="6dd956673c1f46da2ead777e897aa77847f3ad0108c48a521efb8dd8f4d286ab" Mar 11 00:59:39 crc kubenswrapper[4744]: I0311 00:59:39.927528 4744 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6dd956673c1f46da2ead777e897aa77847f3ad0108c48a521efb8dd8f4d286ab"} err="failed to get container status \"6dd956673c1f46da2ead777e897aa77847f3ad0108c48a521efb8dd8f4d286ab\": rpc error: code = NotFound desc = could not find container \"6dd956673c1f46da2ead777e897aa77847f3ad0108c48a521efb8dd8f4d286ab\": container with ID starting with 6dd956673c1f46da2ead777e897aa77847f3ad0108c48a521efb8dd8f4d286ab not found: ID does not exist" Mar 11 00:59:39 crc kubenswrapper[4744]: I0311 00:59:39.927581 4744 scope.go:117] "RemoveContainer" containerID="c9f3a6d71079f1717c07e0b026f34d1bfd6391a5bb6745bcdab535d5290e2aa3" Mar 11 00:59:39 crc kubenswrapper[4744]: E0311 00:59:39.928279 4744 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c9f3a6d71079f1717c07e0b026f34d1bfd6391a5bb6745bcdab535d5290e2aa3\": container with ID starting with c9f3a6d71079f1717c07e0b026f34d1bfd6391a5bb6745bcdab535d5290e2aa3 not found: ID does not exist" containerID="c9f3a6d71079f1717c07e0b026f34d1bfd6391a5bb6745bcdab535d5290e2aa3" Mar 11 00:59:39 crc kubenswrapper[4744]: I0311 00:59:39.928325 4744 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c9f3a6d71079f1717c07e0b026f34d1bfd6391a5bb6745bcdab535d5290e2aa3"} err="failed to get container status \"c9f3a6d71079f1717c07e0b026f34d1bfd6391a5bb6745bcdab535d5290e2aa3\": rpc 
error: code = NotFound desc = could not find container \"c9f3a6d71079f1717c07e0b026f34d1bfd6391a5bb6745bcdab535d5290e2aa3\": container with ID starting with c9f3a6d71079f1717c07e0b026f34d1bfd6391a5bb6745bcdab535d5290e2aa3 not found: ID does not exist" Mar 11 00:59:39 crc kubenswrapper[4744]: I0311 00:59:39.928356 4744 scope.go:117] "RemoveContainer" containerID="e702350d7b926ec4df6853a20470ea7515f1e1c98720db1a4d672ea4e03e82fa" Mar 11 00:59:39 crc kubenswrapper[4744]: E0311 00:59:39.928707 4744 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e702350d7b926ec4df6853a20470ea7515f1e1c98720db1a4d672ea4e03e82fa\": container with ID starting with e702350d7b926ec4df6853a20470ea7515f1e1c98720db1a4d672ea4e03e82fa not found: ID does not exist" containerID="e702350d7b926ec4df6853a20470ea7515f1e1c98720db1a4d672ea4e03e82fa" Mar 11 00:59:39 crc kubenswrapper[4744]: I0311 00:59:39.928765 4744 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e702350d7b926ec4df6853a20470ea7515f1e1c98720db1a4d672ea4e03e82fa"} err="failed to get container status \"e702350d7b926ec4df6853a20470ea7515f1e1c98720db1a4d672ea4e03e82fa\": rpc error: code = NotFound desc = could not find container \"e702350d7b926ec4df6853a20470ea7515f1e1c98720db1a4d672ea4e03e82fa\": container with ID starting with e702350d7b926ec4df6853a20470ea7515f1e1c98720db1a4d672ea4e03e82fa not found: ID does not exist" Mar 11 00:59:39 crc kubenswrapper[4744]: I0311 00:59:39.928802 4744 scope.go:117] "RemoveContainer" containerID="2d93a4815010e9ee724175731b03bf82804921c5bbb9fe8be236057f90057f28" Mar 11 00:59:39 crc kubenswrapper[4744]: E0311 00:59:39.929359 4744 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2d93a4815010e9ee724175731b03bf82804921c5bbb9fe8be236057f90057f28\": container with ID starting with 
2d93a4815010e9ee724175731b03bf82804921c5bbb9fe8be236057f90057f28 not found: ID does not exist" containerID="2d93a4815010e9ee724175731b03bf82804921c5bbb9fe8be236057f90057f28" Mar 11 00:59:39 crc kubenswrapper[4744]: I0311 00:59:39.929388 4744 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2d93a4815010e9ee724175731b03bf82804921c5bbb9fe8be236057f90057f28"} err="failed to get container status \"2d93a4815010e9ee724175731b03bf82804921c5bbb9fe8be236057f90057f28\": rpc error: code = NotFound desc = could not find container \"2d93a4815010e9ee724175731b03bf82804921c5bbb9fe8be236057f90057f28\": container with ID starting with 2d93a4815010e9ee724175731b03bf82804921c5bbb9fe8be236057f90057f28 not found: ID does not exist" Mar 11 00:59:39 crc kubenswrapper[4744]: I0311 00:59:39.929411 4744 scope.go:117] "RemoveContainer" containerID="31713fedc28cf71b8df25e22000768f5d00e6de1b228a632aa3c51aea11eea7c" Mar 11 00:59:39 crc kubenswrapper[4744]: E0311 00:59:39.929801 4744 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"31713fedc28cf71b8df25e22000768f5d00e6de1b228a632aa3c51aea11eea7c\": container with ID starting with 31713fedc28cf71b8df25e22000768f5d00e6de1b228a632aa3c51aea11eea7c not found: ID does not exist" containerID="31713fedc28cf71b8df25e22000768f5d00e6de1b228a632aa3c51aea11eea7c" Mar 11 00:59:39 crc kubenswrapper[4744]: I0311 00:59:39.929884 4744 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"31713fedc28cf71b8df25e22000768f5d00e6de1b228a632aa3c51aea11eea7c"} err="failed to get container status \"31713fedc28cf71b8df25e22000768f5d00e6de1b228a632aa3c51aea11eea7c\": rpc error: code = NotFound desc = could not find container \"31713fedc28cf71b8df25e22000768f5d00e6de1b228a632aa3c51aea11eea7c\": container with ID starting with 31713fedc28cf71b8df25e22000768f5d00e6de1b228a632aa3c51aea11eea7c not found: ID does not 
exist" Mar 11 00:59:39 crc kubenswrapper[4744]: I0311 00:59:39.929948 4744 scope.go:117] "RemoveContainer" containerID="edfc2acc25990d1ccbca879e186fb9dfc0a6664fa1df5cee62958ff52b217748" Mar 11 00:59:39 crc kubenswrapper[4744]: E0311 00:59:39.930454 4744 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"edfc2acc25990d1ccbca879e186fb9dfc0a6664fa1df5cee62958ff52b217748\": container with ID starting with edfc2acc25990d1ccbca879e186fb9dfc0a6664fa1df5cee62958ff52b217748 not found: ID does not exist" containerID="edfc2acc25990d1ccbca879e186fb9dfc0a6664fa1df5cee62958ff52b217748" Mar 11 00:59:39 crc kubenswrapper[4744]: I0311 00:59:39.930502 4744 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"edfc2acc25990d1ccbca879e186fb9dfc0a6664fa1df5cee62958ff52b217748"} err="failed to get container status \"edfc2acc25990d1ccbca879e186fb9dfc0a6664fa1df5cee62958ff52b217748\": rpc error: code = NotFound desc = could not find container \"edfc2acc25990d1ccbca879e186fb9dfc0a6664fa1df5cee62958ff52b217748\": container with ID starting with edfc2acc25990d1ccbca879e186fb9dfc0a6664fa1df5cee62958ff52b217748 not found: ID does not exist" Mar 11 00:59:39 crc kubenswrapper[4744]: I0311 00:59:39.990149 4744 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f4b27818a5e8e43d0dc095d08835c792" path="/var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/volumes" Mar 11 00:59:40 crc kubenswrapper[4744]: I0311 00:59:40.094305 4744 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.58:6443: connect: connection refused" Mar 11 00:59:40 crc kubenswrapper[4744]: I0311 00:59:40.094973 4744 status_manager.go:851] "Failed to get status for pod" 
podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.58:6443: connect: connection refused" Mar 11 00:59:40 crc kubenswrapper[4744]: I0311 00:59:40.095588 4744 status_manager.go:851] "Failed to get status for pod" podUID="516c0d23-985b-4de5-9b7c-c7651922d5d1" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.58:6443: connect: connection refused" Mar 11 00:59:43 crc kubenswrapper[4744]: I0311 00:59:43.980255 4744 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.58:6443: connect: connection refused" Mar 11 00:59:43 crc kubenswrapper[4744]: I0311 00:59:43.982747 4744 status_manager.go:851] "Failed to get status for pod" podUID="516c0d23-985b-4de5-9b7c-c7651922d5d1" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.58:6443: connect: connection refused" Mar 11 00:59:45 crc kubenswrapper[4744]: E0311 00:59:45.540367 4744 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/events\": dial tcp 38.102.83.58:6443: connect: connection refused" event="&Event{ObjectMeta:{kube-apiserver-startup-monitor-crc.189ba39e10df8ebb openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-startup-monitor-crc,UID:f85e55b1a89d02b0cb034b1ea31ed45a,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{startup-monitor},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-11 00:59:37.583668923 +0000 UTC m=+334.387886538,LastTimestamp:2026-03-11 00:59:37.583668923 +0000 UTC m=+334.387886538,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 11 00:59:45 crc kubenswrapper[4744]: E0311 00:59:45.705100 4744 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.58:6443: connect: connection refused" Mar 11 00:59:45 crc kubenswrapper[4744]: E0311 00:59:45.705892 4744 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.58:6443: connect: connection refused" Mar 11 00:59:45 crc kubenswrapper[4744]: E0311 00:59:45.706569 4744 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.58:6443: connect: connection refused" Mar 11 00:59:45 crc kubenswrapper[4744]: E0311 00:59:45.706987 4744 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.58:6443: connect: connection refused" Mar 11 00:59:45 crc kubenswrapper[4744]: E0311 
00:59:45.707395 4744 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.58:6443: connect: connection refused" Mar 11 00:59:45 crc kubenswrapper[4744]: I0311 00:59:45.707448 4744 controller.go:115] "failed to update lease using latest lease, fallback to ensure lease" err="failed 5 attempts to update lease" Mar 11 00:59:45 crc kubenswrapper[4744]: E0311 00:59:45.707940 4744 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.58:6443: connect: connection refused" interval="200ms" Mar 11 00:59:45 crc kubenswrapper[4744]: E0311 00:59:45.909394 4744 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.58:6443: connect: connection refused" interval="400ms" Mar 11 00:59:46 crc kubenswrapper[4744]: E0311 00:59:46.310147 4744 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.58:6443: connect: connection refused" interval="800ms" Mar 11 00:59:47 crc kubenswrapper[4744]: E0311 00:59:47.111771 4744 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.58:6443: connect: connection refused" interval="1.6s" Mar 11 00:59:48 crc kubenswrapper[4744]: E0311 00:59:48.713626 4744 controller.go:145] "Failed to ensure lease exists, will retry" err="Get 
\"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.58:6443: connect: connection refused" interval="3.2s" Mar 11 00:59:49 crc kubenswrapper[4744]: I0311 00:59:49.974359 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 11 00:59:49 crc kubenswrapper[4744]: I0311 00:59:49.976895 4744 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.58:6443: connect: connection refused" Mar 11 00:59:49 crc kubenswrapper[4744]: I0311 00:59:49.977335 4744 status_manager.go:851] "Failed to get status for pod" podUID="516c0d23-985b-4de5-9b7c-c7651922d5d1" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.58:6443: connect: connection refused" Mar 11 00:59:50 crc kubenswrapper[4744]: I0311 00:59:50.001986 4744 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="e522f1d8-5329-414c-88d5-79e6f3b615be" Mar 11 00:59:50 crc kubenswrapper[4744]: I0311 00:59:50.002038 4744 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="e522f1d8-5329-414c-88d5-79e6f3b615be" Mar 11 00:59:50 crc kubenswrapper[4744]: E0311 00:59:50.002780 4744 mirror_client.go:138] "Failed deleting a mirror pod" err="Delete \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.58:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 11 00:59:50 crc kubenswrapper[4744]: I0311 00:59:50.003473 
4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 11 00:59:50 crc kubenswrapper[4744]: E0311 00:59:50.019795 4744 desired_state_of_world_populator.go:312] "Error processing volume" err="error processing PVC openshift-image-registry/crc-image-registry-storage: failed to fetch PVC from API server: Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-image-registry/persistentvolumeclaims/crc-image-registry-storage\": dial tcp 38.102.83.58:6443: connect: connection refused" pod="openshift-image-registry/image-registry-697d97f7c8-cg2gg" volumeName="registry-storage" Mar 11 00:59:50 crc kubenswrapper[4744]: W0311 00:59:50.047900 4744 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod71bb4a3aecc4ba5b26c4b7318770ce13.slice/crio-7289332147ad879335f314cef1ea9f4a5b0e24cd57310a1523b8556d5a64a308 WatchSource:0}: Error finding container 7289332147ad879335f314cef1ea9f4a5b0e24cd57310a1523b8556d5a64a308: Status 404 returned error can't find the container with id 7289332147ad879335f314cef1ea9f4a5b0e24cd57310a1523b8556d5a64a308 Mar 11 00:59:50 crc kubenswrapper[4744]: I0311 00:59:50.870906 4744 generic.go:334] "Generic (PLEG): container finished" podID="71bb4a3aecc4ba5b26c4b7318770ce13" containerID="9e6e9a2cdc27c9f62e5968c795d26d8623155807fdc2236b26bf3e47dd80635b" exitCode=0 Mar 11 00:59:50 crc kubenswrapper[4744]: I0311 00:59:50.871045 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerDied","Data":"9e6e9a2cdc27c9f62e5968c795d26d8623155807fdc2236b26bf3e47dd80635b"} Mar 11 00:59:50 crc kubenswrapper[4744]: I0311 00:59:50.871333 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" 
event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"7289332147ad879335f314cef1ea9f4a5b0e24cd57310a1523b8556d5a64a308"} Mar 11 00:59:50 crc kubenswrapper[4744]: I0311 00:59:50.871775 4744 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="e522f1d8-5329-414c-88d5-79e6f3b615be" Mar 11 00:59:50 crc kubenswrapper[4744]: I0311 00:59:50.871798 4744 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="e522f1d8-5329-414c-88d5-79e6f3b615be" Mar 11 00:59:50 crc kubenswrapper[4744]: I0311 00:59:50.872449 4744 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.58:6443: connect: connection refused" Mar 11 00:59:50 crc kubenswrapper[4744]: E0311 00:59:50.872704 4744 mirror_client.go:138] "Failed deleting a mirror pod" err="Delete \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.58:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 11 00:59:50 crc kubenswrapper[4744]: I0311 00:59:50.873249 4744 status_manager.go:851] "Failed to get status for pod" podUID="516c0d23-985b-4de5-9b7c-c7651922d5d1" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.58:6443: connect: connection refused" Mar 11 00:59:50 crc kubenswrapper[4744]: I0311 00:59:50.876861 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/cluster-policy-controller/0.log" Mar 11 00:59:50 crc 
kubenswrapper[4744]: I0311 00:59:50.877933 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager/0.log" Mar 11 00:59:50 crc kubenswrapper[4744]: I0311 00:59:50.878016 4744 generic.go:334] "Generic (PLEG): container finished" podID="f614b9022728cf315e60c057852e563e" containerID="eb1e3ef7609aa323e622249594eb4ac5cd5cacfa496d8189b3f612e29cb11773" exitCode=1 Mar 11 00:59:50 crc kubenswrapper[4744]: I0311 00:59:50.878060 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerDied","Data":"eb1e3ef7609aa323e622249594eb4ac5cd5cacfa496d8189b3f612e29cb11773"} Mar 11 00:59:50 crc kubenswrapper[4744]: I0311 00:59:50.878792 4744 scope.go:117] "RemoveContainer" containerID="eb1e3ef7609aa323e622249594eb4ac5cd5cacfa496d8189b3f612e29cb11773" Mar 11 00:59:50 crc kubenswrapper[4744]: I0311 00:59:50.878857 4744 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.58:6443: connect: connection refused" Mar 11 00:59:50 crc kubenswrapper[4744]: I0311 00:59:50.879220 4744 status_manager.go:851] "Failed to get status for pod" podUID="516c0d23-985b-4de5-9b7c-c7651922d5d1" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.58:6443: connect: connection refused" Mar 11 00:59:50 crc kubenswrapper[4744]: I0311 00:59:50.879685 4744 status_manager.go:851] "Failed to get status for pod" podUID="f614b9022728cf315e60c057852e563e" 
pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-controller-manager/pods/kube-controller-manager-crc\": dial tcp 38.102.83.58:6443: connect: connection refused" Mar 11 00:59:51 crc kubenswrapper[4744]: I0311 00:59:51.923327 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"f407a20728717001eadf5989cfde0e84062e676fb010b9a786870bd1670db77c"} Mar 11 00:59:51 crc kubenswrapper[4744]: I0311 00:59:51.923427 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"0ac4d8fa2bf400a0e420e3656ebc8b0d515cf04d0691b4a4ec5a64a08287ac9f"} Mar 11 00:59:51 crc kubenswrapper[4744]: I0311 00:59:51.936797 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/cluster-policy-controller/0.log" Mar 11 00:59:51 crc kubenswrapper[4744]: I0311 00:59:51.938066 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager/0.log" Mar 11 00:59:51 crc kubenswrapper[4744]: I0311 00:59:51.938148 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"6eb976a5969c2b9254c4ef8d5cebe6452055b5546ec04a806196b810965e3c4f"} Mar 11 00:59:52 crc kubenswrapper[4744]: I0311 00:59:52.946969 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" 
event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"4a32e74a932f9250c39f67a2e87ce1371ba50e4a6573556515794d1724b60a88"} Mar 11 00:59:52 crc kubenswrapper[4744]: I0311 00:59:52.947222 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 11 00:59:52 crc kubenswrapper[4744]: I0311 00:59:52.947234 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"90f73cc6bc3b1667c4a27021dc719618b74b021f1def4bc070afde5bd3dfa57e"} Mar 11 00:59:52 crc kubenswrapper[4744]: I0311 00:59:52.947243 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"655f2b8a3aa308643d533f74f91fde25784508bc2f6a1f8a1ea001d7071288fa"} Mar 11 00:59:52 crc kubenswrapper[4744]: I0311 00:59:52.947339 4744 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="e522f1d8-5329-414c-88d5-79e6f3b615be" Mar 11 00:59:52 crc kubenswrapper[4744]: I0311 00:59:52.947375 4744 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="e522f1d8-5329-414c-88d5-79e6f3b615be" Mar 11 00:59:54 crc kubenswrapper[4744]: I0311 00:59:54.132793 4744 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 11 00:59:54 crc kubenswrapper[4744]: I0311 00:59:54.136959 4744 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 11 00:59:54 crc kubenswrapper[4744]: I0311 00:59:54.959927 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 11 
00:59:55 crc kubenswrapper[4744]: I0311 00:59:55.004114 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 11 00:59:55 crc kubenswrapper[4744]: I0311 00:59:55.004258 4744 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 11 00:59:55 crc kubenswrapper[4744]: I0311 00:59:55.013728 4744 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 11 00:59:57 crc kubenswrapper[4744]: I0311 00:59:57.971809 4744 kubelet.go:1914] "Deleted mirror pod because it is outdated" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 11 00:59:58 crc kubenswrapper[4744]: I0311 00:59:58.046105 4744 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-apiserver/kube-apiserver-crc" oldPodUID="71bb4a3aecc4ba5b26c4b7318770ce13" podUID="3944a8c2-3ed3-4550-baca-afffde8144fe" Mar 11 00:59:58 crc kubenswrapper[4744]: I0311 00:59:58.992502 4744 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="e522f1d8-5329-414c-88d5-79e6f3b615be" Mar 11 00:59:58 crc kubenswrapper[4744]: I0311 00:59:58.992615 4744 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="e522f1d8-5329-414c-88d5-79e6f3b615be" Mar 11 00:59:58 crc kubenswrapper[4744]: I0311 00:59:58.997195 4744 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-apiserver/kube-apiserver-crc" oldPodUID="71bb4a3aecc4ba5b26c4b7318770ce13" podUID="3944a8c2-3ed3-4550-baca-afffde8144fe" Mar 11 00:59:59 crc kubenswrapper[4744]: I0311 00:59:59.001422 4744 status_manager.go:308] "Container readiness changed before pod has synced" pod="openshift-kube-apiserver/kube-apiserver-crc" 
containerID="cri-o://0ac4d8fa2bf400a0e420e3656ebc8b0d515cf04d0691b4a4ec5a64a08287ac9f" Mar 11 00:59:59 crc kubenswrapper[4744]: I0311 00:59:59.001459 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 11 00:59:59 crc kubenswrapper[4744]: I0311 00:59:59.999584 4744 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="e522f1d8-5329-414c-88d5-79e6f3b615be" Mar 11 00:59:59 crc kubenswrapper[4744]: I0311 00:59:59.999636 4744 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="e522f1d8-5329-414c-88d5-79e6f3b615be" Mar 11 01:00:00 crc kubenswrapper[4744]: I0311 01:00:00.003387 4744 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-apiserver/kube-apiserver-crc" oldPodUID="71bb4a3aecc4ba5b26c4b7318770ce13" podUID="3944a8c2-3ed3-4550-baca-afffde8144fe" Mar 11 01:00:06 crc kubenswrapper[4744]: I0311 01:00:06.707830 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 11 01:00:07 crc kubenswrapper[4744]: I0311 01:00:07.878605 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"service-ca" Mar 11 01:00:07 crc kubenswrapper[4744]: I0311 01:00:07.936428 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"openshift-service-ca.crt" Mar 11 01:00:08 crc kubenswrapper[4744]: I0311 01:00:08.301453 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"image-import-ca" Mar 11 01:00:08 crc kubenswrapper[4744]: I0311 01:00:08.824623 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-config" Mar 11 01:00:08 crc kubenswrapper[4744]: I0311 01:00:08.875936 4744 reflector.go:368] 
Caches populated for *v1.Secret from object-"openshift-dns-operator"/"dns-operator-dockercfg-9mqw5" Mar 11 01:00:09 crc kubenswrapper[4744]: I0311 01:00:09.479853 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"kube-root-ca.crt" Mar 11 01:00:09 crc kubenswrapper[4744]: I0311 01:00:09.569501 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"openshift-service-ca.crt" Mar 11 01:00:09 crc kubenswrapper[4744]: I0311 01:00:09.668671 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-console"/"networking-console-plugin-cert" Mar 11 01:00:09 crc kubenswrapper[4744]: I0311 01:00:09.725031 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-control-plane-dockercfg-gs7dd" Mar 11 01:00:09 crc kubenswrapper[4744]: I0311 01:00:09.778131 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"trusted-ca-bundle" Mar 11 01:00:09 crc kubenswrapper[4744]: I0311 01:00:09.874870 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-service-ca.crt" Mar 11 01:00:10 crc kubenswrapper[4744]: I0311 01:00:10.396250 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"authentication-operator-dockercfg-mz9bj" Mar 11 01:00:10 crc kubenswrapper[4744]: I0311 01:00:10.403154 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-root-ca.crt" Mar 11 01:00:10 crc kubenswrapper[4744]: I0311 01:00:10.403333 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator"/"kube-storage-version-migrator-sa-dockercfg-5xfcg" Mar 11 01:00:10 crc kubenswrapper[4744]: I0311 01:00:10.505065 4744 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-operator-lifecycle-manager"/"olm-operator-serving-cert" Mar 11 01:00:10 crc kubenswrapper[4744]: I0311 01:00:10.554881 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"default-cni-sysctl-allowlist" Mar 11 01:00:10 crc kubenswrapper[4744]: I0311 01:00:10.585331 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"serving-cert" Mar 11 01:00:10 crc kubenswrapper[4744]: I0311 01:00:10.692154 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"openshift-service-ca.crt" Mar 11 01:00:10 crc kubenswrapper[4744]: I0311 01:00:10.718821 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"machine-api-operator-images" Mar 11 01:00:10 crc kubenswrapper[4744]: I0311 01:00:10.765917 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"service-ca-operator-dockercfg-rg9jl" Mar 11 01:00:10 crc kubenswrapper[4744]: I0311 01:00:10.929975 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"serving-cert" Mar 11 01:00:11 crc kubenswrapper[4744]: I0311 01:00:11.128083 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-control-plane-metrics-cert" Mar 11 01:00:11 crc kubenswrapper[4744]: I0311 01:00:11.141446 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"kube-root-ca.crt" Mar 11 01:00:11 crc kubenswrapper[4744]: I0311 01:00:11.286953 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"serving-cert" Mar 11 01:00:11 crc kubenswrapper[4744]: I0311 01:00:11.360334 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-operator-dockercfg-98p87" Mar 11 01:00:11 crc kubenswrapper[4744]: I0311 01:00:11.532817 
4744 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-dockercfg-r9srn" Mar 11 01:00:11 crc kubenswrapper[4744]: I0311 01:00:11.568053 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"openshift-service-ca.crt" Mar 11 01:00:11 crc kubenswrapper[4744]: I0311 01:00:11.569770 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-serving-cert" Mar 11 01:00:11 crc kubenswrapper[4744]: I0311 01:00:11.606560 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Mar 11 01:00:11 crc kubenswrapper[4744]: I0311 01:00:11.624492 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"cluster-samples-operator-dockercfg-xpp9w" Mar 11 01:00:11 crc kubenswrapper[4744]: I0311 01:00:11.743463 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"openshift-service-ca.crt" Mar 11 01:00:11 crc kubenswrapper[4744]: I0311 01:00:11.784890 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"kube-root-ca.crt" Mar 11 01:00:11 crc kubenswrapper[4744]: I0311 01:00:11.798399 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"kube-root-ca.crt" Mar 11 01:00:11 crc kubenswrapper[4744]: I0311 01:00:11.798439 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-controller-dockercfg-c2lfx" Mar 11 01:00:11 crc kubenswrapper[4744]: I0311 01:00:11.985433 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"ovnkube-identity-cm" Mar 11 01:00:12 crc kubenswrapper[4744]: I0311 01:00:12.023289 4744 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-authentication"/"v4-0-config-system-ocp-branding-template" Mar 11 01:00:12 crc kubenswrapper[4744]: I0311 01:00:12.046550 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"kube-root-ca.crt" Mar 11 01:00:12 crc kubenswrapper[4744]: I0311 01:00:12.081570 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-rbac-proxy" Mar 11 01:00:12 crc kubenswrapper[4744]: I0311 01:00:12.088215 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"trusted-ca" Mar 11 01:00:12 crc kubenswrapper[4744]: I0311 01:00:12.116271 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"openshift-service-ca.crt" Mar 11 01:00:12 crc kubenswrapper[4744]: I0311 01:00:12.118560 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-operator-config" Mar 11 01:00:12 crc kubenswrapper[4744]: I0311 01:00:12.130453 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-config" Mar 11 01:00:12 crc kubenswrapper[4744]: I0311 01:00:12.323145 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"canary-serving-cert" Mar 11 01:00:12 crc kubenswrapper[4744]: I0311 01:00:12.325013 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"cluster-image-registry-operator-dockercfg-m4qtx" Mar 11 01:00:12 crc kubenswrapper[4744]: I0311 01:00:12.394872 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"serving-cert" Mar 11 01:00:12 crc kubenswrapper[4744]: I0311 01:00:12.398681 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-dockercfg-qt55r" Mar 
11 01:00:12 crc kubenswrapper[4744]: I0311 01:00:12.421098 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-config" Mar 11 01:00:12 crc kubenswrapper[4744]: I0311 01:00:12.426040 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"audit" Mar 11 01:00:12 crc kubenswrapper[4744]: I0311 01:00:12.461073 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"cluster-version-operator-serving-cert" Mar 11 01:00:12 crc kubenswrapper[4744]: I0311 01:00:12.495378 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"trusted-ca-bundle" Mar 11 01:00:12 crc kubenswrapper[4744]: I0311 01:00:12.503027 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-provider-selection" Mar 11 01:00:12 crc kubenswrapper[4744]: I0311 01:00:12.541940 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"trusted-ca-bundle" Mar 11 01:00:12 crc kubenswrapper[4744]: I0311 01:00:12.562143 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-secret" Mar 11 01:00:12 crc kubenswrapper[4744]: I0311 01:00:12.606239 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"node-resolver-dockercfg-kz9s7" Mar 11 01:00:12 crc kubenswrapper[4744]: I0311 01:00:12.645442 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"marketplace-trusted-ca" Mar 11 01:00:12 crc kubenswrapper[4744]: I0311 01:00:12.661686 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"openshift-service-ca.crt" Mar 11 01:00:12 crc kubenswrapper[4744]: I0311 01:00:12.689703 4744 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-ingress-operator"/"metrics-tls" Mar 11 01:00:12 crc kubenswrapper[4744]: I0311 01:00:12.691700 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"catalog-operator-serving-cert" Mar 11 01:00:12 crc kubenswrapper[4744]: I0311 01:00:12.850324 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-default-metrics-tls" Mar 11 01:00:12 crc kubenswrapper[4744]: I0311 01:00:12.949139 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"image-registry-certificates" Mar 11 01:00:12 crc kubenswrapper[4744]: I0311 01:00:12.966683 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-script-lib" Mar 11 01:00:13 crc kubenswrapper[4744]: I0311 01:00:13.000305 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"kube-root-ca.crt" Mar 11 01:00:13 crc kubenswrapper[4744]: I0311 01:00:13.054906 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"kube-root-ca.crt" Mar 11 01:00:13 crc kubenswrapper[4744]: I0311 01:00:13.103528 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-login" Mar 11 01:00:13 crc kubenswrapper[4744]: I0311 01:00:13.132292 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"encryption-config-1" Mar 11 01:00:13 crc kubenswrapper[4744]: I0311 01:00:13.231570 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-tls" Mar 11 01:00:13 crc kubenswrapper[4744]: I0311 01:00:13.389903 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"openshift-service-ca.crt" Mar 11 01:00:13 crc kubenswrapper[4744]: I0311 01:00:13.417954 4744 reflector.go:368] Caches populated for 
*v1.ConfigMap from object-"openshift-network-node-identity"/"env-overrides" Mar 11 01:00:13 crc kubenswrapper[4744]: I0311 01:00:13.431546 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Mar 11 01:00:13 crc kubenswrapper[4744]: I0311 01:00:13.443577 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl" Mar 11 01:00:13 crc kubenswrapper[4744]: I0311 01:00:13.633058 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"kube-root-ca.crt" Mar 11 01:00:13 crc kubenswrapper[4744]: I0311 01:00:13.637839 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"openshift-service-ca.crt" Mar 11 01:00:13 crc kubenswrapper[4744]: I0311 01:00:13.648396 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-idp-0-file-data" Mar 11 01:00:13 crc kubenswrapper[4744]: I0311 01:00:13.714317 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-serving-cert" Mar 11 01:00:13 crc kubenswrapper[4744]: I0311 01:00:13.718418 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"metrics-tls" Mar 11 01:00:13 crc kubenswrapper[4744]: I0311 01:00:13.727085 4744 reflector.go:368] Caches populated for *v1.Node from k8s.io/client-go/informers/factory.go:160 Mar 11 01:00:13 crc kubenswrapper[4744]: I0311 01:00:13.759047 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-certs-default" Mar 11 01:00:13 crc kubenswrapper[4744]: I0311 01:00:13.878257 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mco-proxy-tls" Mar 11 01:00:13 crc kubenswrapper[4744]: I0311 01:00:13.988887 4744 reflector.go:368] Caches 
populated for *v1.ConfigMap from object-"openshift-service-ca"/"kube-root-ca.crt" Mar 11 01:00:14 crc kubenswrapper[4744]: I0311 01:00:14.064412 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-metrics" Mar 11 01:00:14 crc kubenswrapper[4744]: I0311 01:00:14.200419 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"default-dockercfg-2llfx" Mar 11 01:00:14 crc kubenswrapper[4744]: I0311 01:00:14.239407 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"config-operator-serving-cert" Mar 11 01:00:14 crc kubenswrapper[4744]: I0311 01:00:14.297806 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"kube-root-ca.crt" Mar 11 01:00:14 crc kubenswrapper[4744]: I0311 01:00:14.404659 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"oauth-serving-cert" Mar 11 01:00:14 crc kubenswrapper[4744]: I0311 01:00:14.415558 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"openshift-service-ca.crt" Mar 11 01:00:14 crc kubenswrapper[4744]: I0311 01:00:14.475483 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Mar 11 01:00:14 crc kubenswrapper[4744]: I0311 01:00:14.524329 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Mar 11 01:00:14 crc kubenswrapper[4744]: I0311 01:00:14.542583 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"openshift-service-ca.crt" Mar 11 01:00:14 crc kubenswrapper[4744]: I0311 01:00:14.568221 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"openshift-service-ca.crt" Mar 11 01:00:14 crc kubenswrapper[4744]: I0311 01:00:14.648630 4744 reflector.go:368] Caches 
populated for *v1.ConfigMap from object-"openshift-apiserver"/"etcd-serving-ca" Mar 11 01:00:14 crc kubenswrapper[4744]: I0311 01:00:14.661784 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"kube-root-ca.crt" Mar 11 01:00:14 crc kubenswrapper[4744]: I0311 01:00:14.726423 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-rbac-proxy" Mar 11 01:00:14 crc kubenswrapper[4744]: I0311 01:00:14.798292 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"openshift-service-ca.crt" Mar 11 01:00:14 crc kubenswrapper[4744]: I0311 01:00:14.811179 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-oauth-config" Mar 11 01:00:14 crc kubenswrapper[4744]: I0311 01:00:14.840302 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"service-ca-bundle" Mar 11 01:00:14 crc kubenswrapper[4744]: I0311 01:00:14.904141 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-dockercfg-jwfmh" Mar 11 01:00:14 crc kubenswrapper[4744]: I0311 01:00:14.986616 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"config" Mar 11 01:00:15 crc kubenswrapper[4744]: I0311 01:00:15.057214 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"samples-operator-tls" Mar 11 01:00:15 crc kubenswrapper[4744]: I0311 01:00:15.128215 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"etcd-client" Mar 11 01:00:15 crc kubenswrapper[4744]: I0311 01:00:15.195062 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"console-config" Mar 11 01:00:15 crc kubenswrapper[4744]: I0311 01:00:15.240138 4744 reflector.go:368] Caches populated for *v1.RuntimeClass from 
k8s.io/client-go/informers/factory.go:160 Mar 11 01:00:15 crc kubenswrapper[4744]: I0311 01:00:15.328118 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"oauth-apiserver-sa-dockercfg-6r2bq" Mar 11 01:00:15 crc kubenswrapper[4744]: I0311 01:00:15.510480 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-node-metrics-cert" Mar 11 01:00:15 crc kubenswrapper[4744]: I0311 01:00:15.520721 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-root-ca.crt" Mar 11 01:00:15 crc kubenswrapper[4744]: I0311 01:00:15.681904 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-root-ca.crt" Mar 11 01:00:15 crc kubenswrapper[4744]: I0311 01:00:15.719141 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-dockercfg-vw8fw" Mar 11 01:00:15 crc kubenswrapper[4744]: I0311 01:00:15.811787 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb" Mar 11 01:00:15 crc kubenswrapper[4744]: I0311 01:00:15.953886 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-daemon-dockercfg-r5tcq" Mar 11 01:00:16 crc kubenswrapper[4744]: I0311 01:00:16.029125 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"openshift-service-ca.crt" Mar 11 01:00:16 crc kubenswrapper[4744]: I0311 01:00:16.040198 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-session" Mar 11 01:00:16 crc kubenswrapper[4744]: I0311 01:00:16.086197 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Mar 11 01:00:16 crc 
kubenswrapper[4744]: I0311 01:00:16.090333 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-config" Mar 11 01:00:16 crc kubenswrapper[4744]: I0311 01:00:16.105293 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"openshift-service-ca.crt" Mar 11 01:00:16 crc kubenswrapper[4744]: I0311 01:00:16.209874 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"kube-root-ca.crt" Mar 11 01:00:16 crc kubenswrapper[4744]: I0311 01:00:16.282089 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-node-dockercfg-pwtwl" Mar 11 01:00:16 crc kubenswrapper[4744]: I0311 01:00:16.294070 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"kube-scheduler-operator-serving-cert" Mar 11 01:00:16 crc kubenswrapper[4744]: I0311 01:00:16.299007 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"trusted-ca" Mar 11 01:00:16 crc kubenswrapper[4744]: I0311 01:00:16.326179 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Mar 11 01:00:16 crc kubenswrapper[4744]: I0311 01:00:16.415758 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"cni-copy-resources" Mar 11 01:00:16 crc kubenswrapper[4744]: I0311 01:00:16.504172 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-serving-cert" Mar 11 01:00:16 crc kubenswrapper[4744]: I0311 01:00:16.513933 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Mar 11 01:00:16 crc kubenswrapper[4744]: I0311 01:00:16.573489 4744 reflector.go:368] Caches 
populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"openshift-service-ca.crt" Mar 11 01:00:16 crc kubenswrapper[4744]: I0311 01:00:16.685938 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-serving-cert" Mar 11 01:00:16 crc kubenswrapper[4744]: I0311 01:00:16.738457 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-dockercfg-xtcjv" Mar 11 01:00:16 crc kubenswrapper[4744]: I0311 01:00:16.744319 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-rbac-proxy" Mar 11 01:00:16 crc kubenswrapper[4744]: I0311 01:00:16.859555 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-ca-bundle" Mar 11 01:00:17 crc kubenswrapper[4744]: I0311 01:00:17.046372 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"proxy-tls" Mar 11 01:00:17 crc kubenswrapper[4744]: I0311 01:00:17.048194 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"kube-root-ca.crt" Mar 11 01:00:17 crc kubenswrapper[4744]: I0311 01:00:17.051128 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"machine-approver-config" Mar 11 01:00:17 crc kubenswrapper[4744]: I0311 01:00:17.191339 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-dockercfg-mfbb7" Mar 11 01:00:17 crc kubenswrapper[4744]: I0311 01:00:17.401020 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"openshift-service-ca.crt" Mar 11 01:00:17 crc kubenswrapper[4744]: I0311 01:00:17.583114 4744 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-authentication-operator"/"kube-root-ca.crt" Mar 11 01:00:17 crc kubenswrapper[4744]: I0311 01:00:17.644002 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-service-ca.crt" Mar 11 01:00:17 crc kubenswrapper[4744]: I0311 01:00:17.661111 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"kube-root-ca.crt" Mar 11 01:00:17 crc kubenswrapper[4744]: I0311 01:00:17.758766 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"default-dockercfg-gxtc4" Mar 11 01:00:17 crc kubenswrapper[4744]: I0311 01:00:17.775346 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"kube-root-ca.crt" Mar 11 01:00:17 crc kubenswrapper[4744]: I0311 01:00:17.778979 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-operator"/"metrics-tls" Mar 11 01:00:17 crc kubenswrapper[4744]: I0311 01:00:17.879560 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mcc-proxy-tls" Mar 11 01:00:18 crc kubenswrapper[4744]: I0311 01:00:18.026247 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-admission-controller-secret" Mar 11 01:00:18 crc kubenswrapper[4744]: I0311 01:00:18.057822 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"node-bootstrapper-token" Mar 11 01:00:18 crc kubenswrapper[4744]: I0311 01:00:18.064006 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-sa-dockercfg-nl2j4" Mar 11 01:00:18 crc kubenswrapper[4744]: I0311 01:00:18.077503 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"kube-root-ca.crt" Mar 11 01:00:18 crc kubenswrapper[4744]: I0311 01:00:18.161782 
4744 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"encryption-config-1" Mar 11 01:00:18 crc kubenswrapper[4744]: I0311 01:00:18.252702 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-dockercfg-f62pw" Mar 11 01:00:18 crc kubenswrapper[4744]: I0311 01:00:18.294217 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-tls" Mar 11 01:00:18 crc kubenswrapper[4744]: I0311 01:00:18.319711 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"kube-root-ca.crt" Mar 11 01:00:18 crc kubenswrapper[4744]: I0311 01:00:18.384458 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serviceaccount-dockercfg-rq7zk" Mar 11 01:00:18 crc kubenswrapper[4744]: I0311 01:00:18.427859 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"etcd-serving-ca" Mar 11 01:00:18 crc kubenswrapper[4744]: I0311 01:00:18.521818 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"installation-pull-secrets" Mar 11 01:00:18 crc kubenswrapper[4744]: I0311 01:00:18.537006 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-dockercfg-qx5rd" Mar 11 01:00:18 crc kubenswrapper[4744]: I0311 01:00:18.578048 4744 reflector.go:368] Caches populated for *v1.CSIDriver from k8s.io/client-go/informers/factory.go:160 Mar 11 01:00:18 crc kubenswrapper[4744]: I0311 01:00:18.606270 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ac-dockercfg-9lkdf" Mar 11 01:00:18 crc kubenswrapper[4744]: I0311 01:00:18.626155 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"kube-root-ca.crt" Mar 11 01:00:18 crc kubenswrapper[4744]: I0311 
01:00:18.643428 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Mar 11 01:00:18 crc kubenswrapper[4744]: I0311 01:00:18.722470 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"service-ca-dockercfg-pn86c" Mar 11 01:00:18 crc kubenswrapper[4744]: I0311 01:00:18.740951 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g" Mar 11 01:00:18 crc kubenswrapper[4744]: I0311 01:00:18.770086 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-sa-dockercfg-d427c" Mar 11 01:00:18 crc kubenswrapper[4744]: I0311 01:00:18.866680 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-root-ca.crt" Mar 11 01:00:18 crc kubenswrapper[4744]: I0311 01:00:18.904568 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Mar 11 01:00:18 crc kubenswrapper[4744]: I0311 01:00:18.907438 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-cliconfig" Mar 11 01:00:18 crc kubenswrapper[4744]: I0311 01:00:18.976270 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-dockercfg-k9rxt" Mar 11 01:00:18 crc kubenswrapper[4744]: I0311 01:00:18.985373 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-trusted-ca-bundle" Mar 11 01:00:19 crc kubenswrapper[4744]: I0311 01:00:19.149114 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-client" Mar 11 01:00:19 crc kubenswrapper[4744]: I0311 01:00:19.172282 4744 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-ingress-operator"/"openshift-service-ca.crt" Mar 11 01:00:19 crc kubenswrapper[4744]: I0311 01:00:19.174076 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-dockercfg-gkqpw" Mar 11 01:00:19 crc kubenswrapper[4744]: I0311 01:00:19.192746 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"audit-1" Mar 11 01:00:19 crc kubenswrapper[4744]: I0311 01:00:19.248443 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-tls" Mar 11 01:00:19 crc kubenswrapper[4744]: I0311 01:00:19.305213 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-dockercfg-5nsgg" Mar 11 01:00:19 crc kubenswrapper[4744]: I0311 01:00:19.368160 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"default-dockercfg-2q5b6" Mar 11 01:00:19 crc kubenswrapper[4744]: I0311 01:00:19.387000 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-serving-cert" Mar 11 01:00:19 crc kubenswrapper[4744]: I0311 01:00:19.468963 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"machine-config-operator-images" Mar 11 01:00:19 crc kubenswrapper[4744]: I0311 01:00:19.492786 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-serving-cert" Mar 11 01:00:19 crc kubenswrapper[4744]: I0311 01:00:19.582928 4744 reflector.go:368] Caches populated for *v1.Secret from object-"hostpath-provisioner"/"csi-hostpath-provisioner-sa-dockercfg-qd74k" Mar 11 01:00:19 crc kubenswrapper[4744]: I0311 01:00:19.639721 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"kube-root-ca.crt" Mar 11 
01:00:19 crc kubenswrapper[4744]: I0311 01:00:19.686457 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"kube-root-ca.crt" Mar 11 01:00:19 crc kubenswrapper[4744]: I0311 01:00:19.765975 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-serving-cert" Mar 11 01:00:19 crc kubenswrapper[4744]: I0311 01:00:19.796854 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"iptables-alerter-script" Mar 11 01:00:19 crc kubenswrapper[4744]: I0311 01:00:19.987179 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-root-ca.crt" Mar 11 01:00:20 crc kubenswrapper[4744]: I0311 01:00:20.016458 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-operator-tls" Mar 11 01:00:20 crc kubenswrapper[4744]: I0311 01:00:20.107435 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-console"/"networking-console-plugin" Mar 11 01:00:20 crc kubenswrapper[4744]: I0311 01:00:20.119300 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"kube-root-ca.crt" Mar 11 01:00:20 crc kubenswrapper[4744]: I0311 01:00:20.200589 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"package-server-manager-serving-cert" Mar 11 01:00:20 crc kubenswrapper[4744]: I0311 01:00:20.250217 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"signing-cabundle" Mar 11 01:00:20 crc kubenswrapper[4744]: I0311 01:00:20.257222 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"default-dockercfg-chnjx" Mar 11 01:00:20 crc kubenswrapper[4744]: I0311 01:00:20.345137 4744 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-console-operator"/"console-operator-dockercfg-4xjcr" Mar 11 01:00:20 crc kubenswrapper[4744]: I0311 01:00:20.396432 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-router-certs" Mar 11 01:00:20 crc kubenswrapper[4744]: I0311 01:00:20.404049 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"openshift-service-ca.crt" Mar 11 01:00:20 crc kubenswrapper[4744]: I0311 01:00:20.414525 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"node-ca-dockercfg-4777p" Mar 11 01:00:20 crc kubenswrapper[4744]: I0311 01:00:20.516830 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"audit-1" Mar 11 01:00:20 crc kubenswrapper[4744]: I0311 01:00:20.545343 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Mar 11 01:00:20 crc kubenswrapper[4744]: I0311 01:00:20.573681 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Mar 11 01:00:20 crc kubenswrapper[4744]: I0311 01:00:20.776623 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-config" Mar 11 01:00:20 crc kubenswrapper[4744]: I0311 01:00:20.821230 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"oauth-openshift-dockercfg-znhcc" Mar 11 01:00:20 crc kubenswrapper[4744]: I0311 01:00:20.835691 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"trusted-ca-bundle" Mar 11 01:00:20 crc kubenswrapper[4744]: I0311 01:00:20.881668 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-node-identity"/"network-node-identity-cert" Mar 11 01:00:20 crc kubenswrapper[4744]: I0311 01:00:20.960585 4744 
reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"serving-cert" Mar 11 01:00:20 crc kubenswrapper[4744]: I0311 01:00:20.973566 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"trusted-ca" Mar 11 01:00:20 crc kubenswrapper[4744]: I0311 01:00:20.985744 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"kube-root-ca.crt" Mar 11 01:00:21 crc kubenswrapper[4744]: I0311 01:00:21.102366 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"openshift-config-operator-dockercfg-7pc5z" Mar 11 01:00:21 crc kubenswrapper[4744]: I0311 01:00:21.188759 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-dockercfg-x57mr" Mar 11 01:00:21 crc kubenswrapper[4744]: I0311 01:00:21.214374 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"openshift-service-ca.crt" Mar 11 01:00:21 crc kubenswrapper[4744]: I0311 01:00:21.309600 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"service-ca-operator-config" Mar 11 01:00:21 crc kubenswrapper[4744]: I0311 01:00:21.340216 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-service-ca-bundle" Mar 11 01:00:21 crc kubenswrapper[4744]: I0311 01:00:21.429358 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"openshift-service-ca.crt" Mar 11 01:00:21 crc kubenswrapper[4744]: I0311 01:00:21.448912 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"ingress-operator-dockercfg-7lnqk" Mar 11 01:00:21 crc kubenswrapper[4744]: I0311 01:00:21.464282 4744 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-authentication"/"v4-0-config-user-template-error" Mar 11 01:00:21 crc kubenswrapper[4744]: I0311 01:00:21.489038 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"kube-root-ca.crt" Mar 11 01:00:21 crc kubenswrapper[4744]: I0311 01:00:21.544462 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"env-overrides" Mar 11 01:00:21 crc kubenswrapper[4744]: I0311 01:00:21.550491 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"pprof-cert" Mar 11 01:00:21 crc kubenswrapper[4744]: I0311 01:00:21.588689 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"console-operator-config" Mar 11 01:00:21 crc kubenswrapper[4744]: I0311 01:00:21.611416 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"openshift-service-ca.crt" Mar 11 01:00:21 crc kubenswrapper[4744]: I0311 01:00:21.662145 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"openshift-service-ca.crt" Mar 11 01:00:21 crc kubenswrapper[4744]: I0311 01:00:21.669192 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-metrics-certs-default" Mar 11 01:00:21 crc kubenswrapper[4744]: I0311 01:00:21.692257 4744 reflector.go:368] Caches populated for *v1.Pod from pkg/kubelet/config/apiserver.go:66 Mar 11 01:00:21 crc kubenswrapper[4744]: I0311 01:00:21.697239 4744 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" podStartSLOduration=44.697216637 podStartE2EDuration="44.697216637s" podCreationTimestamp="2026-03-11 00:59:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-11 00:59:57.926970778 +0000 UTC 
m=+354.731188413" watchObservedRunningTime="2026-03-11 01:00:21.697216637 +0000 UTC m=+378.501434282" Mar 11 01:00:21 crc kubenswrapper[4744]: I0311 01:00:21.699985 4744 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Mar 11 01:00:21 crc kubenswrapper[4744]: I0311 01:00:21.700052 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Mar 11 01:00:21 crc kubenswrapper[4744]: I0311 01:00:21.700450 4744 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="e522f1d8-5329-414c-88d5-79e6f3b615be" Mar 11 01:00:21 crc kubenswrapper[4744]: I0311 01:00:21.700481 4744 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="e522f1d8-5329-414c-88d5-79e6f3b615be" Mar 11 01:00:21 crc kubenswrapper[4744]: I0311 01:00:21.705443 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 11 01:00:21 crc kubenswrapper[4744]: I0311 01:00:21.721160 4744 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-crc" podStartSLOduration=24.721128912 podStartE2EDuration="24.721128912s" podCreationTimestamp="2026-03-11 00:59:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-11 01:00:21.717756665 +0000 UTC m=+378.521974310" watchObservedRunningTime="2026-03-11 01:00:21.721128912 +0000 UTC m=+378.525346557" Mar 11 01:00:21 crc kubenswrapper[4744]: I0311 01:00:21.763189 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Mar 11 01:00:22 crc kubenswrapper[4744]: I0311 01:00:22.035548 4744 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-operator-lifecycle-manager"/"kube-root-ca.crt" Mar 11 01:00:22 crc kubenswrapper[4744]: I0311 01:00:22.075431 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"service-ca-bundle" Mar 11 01:00:22 crc kubenswrapper[4744]: I0311 01:00:22.082872 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"packageserver-service-cert" Mar 11 01:00:22 crc kubenswrapper[4744]: I0311 01:00:22.190649 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"openshift-service-ca.crt" Mar 11 01:00:22 crc kubenswrapper[4744]: I0311 01:00:22.343349 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"kube-root-ca.crt" Mar 11 01:00:22 crc kubenswrapper[4744]: I0311 01:00:22.345806 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"kube-root-ca.crt" Mar 11 01:00:22 crc kubenswrapper[4744]: I0311 01:00:22.499745 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"kube-root-ca.crt" Mar 11 01:00:22 crc kubenswrapper[4744]: I0311 01:00:22.534022 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-dockercfg-zdk86" Mar 11 01:00:22 crc kubenswrapper[4744]: I0311 01:00:22.659567 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"openshift-service-ca.crt" Mar 11 01:00:22 crc kubenswrapper[4744]: I0311 01:00:22.684788 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"openshift-service-ca.crt" Mar 11 01:00:22 crc kubenswrapper[4744]: I0311 01:00:22.688379 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"signing-key" Mar 11 01:00:22 crc kubenswrapper[4744]: I0311 01:00:22.887439 4744 
reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Mar 11 01:00:22 crc kubenswrapper[4744]: I0311 01:00:22.888490 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"config" Mar 11 01:00:22 crc kubenswrapper[4744]: I0311 01:00:22.908197 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"openshift-service-ca.crt" Mar 11 01:00:23 crc kubenswrapper[4744]: I0311 01:00:23.128046 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"openshift-service-ca.crt" Mar 11 01:00:23 crc kubenswrapper[4744]: I0311 01:00:23.132751 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"etcd-client" Mar 11 01:00:23 crc kubenswrapper[4744]: I0311 01:00:23.203223 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"serving-cert" Mar 11 01:00:23 crc kubenswrapper[4744]: I0311 01:00:23.238275 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-tls" Mar 11 01:00:23 crc kubenswrapper[4744]: I0311 01:00:23.266917 4744 reflector.go:368] Caches populated for *v1.Service from k8s.io/client-go/informers/factory.go:160 Mar 11 01:00:23 crc kubenswrapper[4744]: I0311 01:00:23.412601 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Mar 11 01:00:23 crc kubenswrapper[4744]: I0311 01:00:23.650801 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"openshift-service-ca.crt" Mar 11 01:00:23 crc kubenswrapper[4744]: I0311 01:00:23.683891 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-service-ca" Mar 11 01:00:23 crc kubenswrapper[4744]: I0311 01:00:23.747888 4744 
reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"registry-dockercfg-kzzsd" Mar 11 01:00:23 crc kubenswrapper[4744]: I0311 01:00:23.751268 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-tls" Mar 11 01:00:23 crc kubenswrapper[4744]: I0311 01:00:23.856672 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-stats-default" Mar 11 01:00:24 crc kubenswrapper[4744]: I0311 01:00:24.392919 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"openshift-apiserver-sa-dockercfg-djjff" Mar 11 01:00:24 crc kubenswrapper[4744]: I0311 01:00:24.455872 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"openshift-service-ca.crt" Mar 11 01:00:24 crc kubenswrapper[4744]: I0311 01:00:24.554961 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"kube-root-ca.crt" Mar 11 01:00:25 crc kubenswrapper[4744]: I0311 01:00:25.057750 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"multus-daemon-config" Mar 11 01:00:25 crc kubenswrapper[4744]: I0311 01:00:25.084770 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"authentication-operator-config" Mar 11 01:00:25 crc kubenswrapper[4744]: I0311 01:00:25.109054 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29553180-89p78"] Mar 11 01:00:25 crc kubenswrapper[4744]: E0311 01:00:25.109405 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="516c0d23-985b-4de5-9b7c-c7651922d5d1" containerName="installer" Mar 11 01:00:25 crc kubenswrapper[4744]: I0311 01:00:25.109433 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="516c0d23-985b-4de5-9b7c-c7651922d5d1" containerName="installer" Mar 11 01:00:25 crc 
kubenswrapper[4744]: I0311 01:00:25.109674 4744 memory_manager.go:354] "RemoveStaleState removing state" podUID="516c0d23-985b-4de5-9b7c-c7651922d5d1" containerName="installer" Mar 11 01:00:25 crc kubenswrapper[4744]: I0311 01:00:25.110324 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29553180-89p78" Mar 11 01:00:25 crc kubenswrapper[4744]: I0311 01:00:25.113166 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-rs77s" Mar 11 01:00:25 crc kubenswrapper[4744]: I0311 01:00:25.114406 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 11 01:00:25 crc kubenswrapper[4744]: I0311 01:00:25.125293 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29553180-h7hsb"] Mar 11 01:00:25 crc kubenswrapper[4744]: I0311 01:00:25.125936 4744 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29553180-h7hsb" Mar 11 01:00:25 crc kubenswrapper[4744]: I0311 01:00:25.126359 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 11 01:00:25 crc kubenswrapper[4744]: I0311 01:00:25.127060 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Mar 11 01:00:25 crc kubenswrapper[4744]: I0311 01:00:25.127503 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Mar 11 01:00:25 crc kubenswrapper[4744]: I0311 01:00:25.129597 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29553180-89p78"] Mar 11 01:00:25 crc kubenswrapper[4744]: I0311 01:00:25.163497 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ancillary-tools-dockercfg-vnmsz" Mar 11 01:00:25 crc kubenswrapper[4744]: I0311 01:00:25.177753 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p2wtk\" (UniqueName: \"kubernetes.io/projected/d246e126-a4f4-41d4-83b1-115cb6f674ec-kube-api-access-p2wtk\") pod \"auto-csr-approver-29553180-89p78\" (UID: \"d246e126-a4f4-41d4-83b1-115cb6f674ec\") " pod="openshift-infra/auto-csr-approver-29553180-89p78" Mar 11 01:00:25 crc kubenswrapper[4744]: I0311 01:00:25.190000 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29553180-h7hsb"] Mar 11 01:00:25 crc kubenswrapper[4744]: I0311 01:00:25.269103 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"openshift-service-ca.crt" Mar 11 01:00:25 crc kubenswrapper[4744]: I0311 01:00:25.279799 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-p2wtk\" (UniqueName: \"kubernetes.io/projected/d246e126-a4f4-41d4-83b1-115cb6f674ec-kube-api-access-p2wtk\") pod \"auto-csr-approver-29553180-89p78\" (UID: \"d246e126-a4f4-41d4-83b1-115cb6f674ec\") " pod="openshift-infra/auto-csr-approver-29553180-89p78" Mar 11 01:00:25 crc kubenswrapper[4744]: I0311 01:00:25.279897 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tv2fb\" (UniqueName: \"kubernetes.io/projected/dde8d0af-13f0-4eda-93e5-1bb4a99ac0d6-kube-api-access-tv2fb\") pod \"collect-profiles-29553180-h7hsb\" (UID: \"dde8d0af-13f0-4eda-93e5-1bb4a99ac0d6\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29553180-h7hsb" Mar 11 01:00:25 crc kubenswrapper[4744]: I0311 01:00:25.280203 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/dde8d0af-13f0-4eda-93e5-1bb4a99ac0d6-config-volume\") pod \"collect-profiles-29553180-h7hsb\" (UID: \"dde8d0af-13f0-4eda-93e5-1bb4a99ac0d6\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29553180-h7hsb" Mar 11 01:00:25 crc kubenswrapper[4744]: I0311 01:00:25.280354 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/dde8d0af-13f0-4eda-93e5-1bb4a99ac0d6-secret-volume\") pod \"collect-profiles-29553180-h7hsb\" (UID: \"dde8d0af-13f0-4eda-93e5-1bb4a99ac0d6\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29553180-h7hsb" Mar 11 01:00:25 crc kubenswrapper[4744]: I0311 01:00:25.311559 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p2wtk\" (UniqueName: \"kubernetes.io/projected/d246e126-a4f4-41d4-83b1-115cb6f674ec-kube-api-access-p2wtk\") pod \"auto-csr-approver-29553180-89p78\" (UID: \"d246e126-a4f4-41d4-83b1-115cb6f674ec\") " 
pod="openshift-infra/auto-csr-approver-29553180-89p78" Mar 11 01:00:25 crc kubenswrapper[4744]: I0311 01:00:25.311917 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-config" Mar 11 01:00:25 crc kubenswrapper[4744]: I0311 01:00:25.345273 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"dns-default" Mar 11 01:00:25 crc kubenswrapper[4744]: I0311 01:00:25.381361 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/dde8d0af-13f0-4eda-93e5-1bb4a99ac0d6-config-volume\") pod \"collect-profiles-29553180-h7hsb\" (UID: \"dde8d0af-13f0-4eda-93e5-1bb4a99ac0d6\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29553180-h7hsb" Mar 11 01:00:25 crc kubenswrapper[4744]: I0311 01:00:25.381457 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/dde8d0af-13f0-4eda-93e5-1bb4a99ac0d6-secret-volume\") pod \"collect-profiles-29553180-h7hsb\" (UID: \"dde8d0af-13f0-4eda-93e5-1bb4a99ac0d6\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29553180-h7hsb" Mar 11 01:00:25 crc kubenswrapper[4744]: I0311 01:00:25.381563 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tv2fb\" (UniqueName: \"kubernetes.io/projected/dde8d0af-13f0-4eda-93e5-1bb4a99ac0d6-kube-api-access-tv2fb\") pod \"collect-profiles-29553180-h7hsb\" (UID: \"dde8d0af-13f0-4eda-93e5-1bb4a99ac0d6\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29553180-h7hsb" Mar 11 01:00:25 crc kubenswrapper[4744]: I0311 01:00:25.382205 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/dde8d0af-13f0-4eda-93e5-1bb4a99ac0d6-config-volume\") pod \"collect-profiles-29553180-h7hsb\" (UID: 
\"dde8d0af-13f0-4eda-93e5-1bb4a99ac0d6\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29553180-h7hsb" Mar 11 01:00:25 crc kubenswrapper[4744]: I0311 01:00:25.390403 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/dde8d0af-13f0-4eda-93e5-1bb4a99ac0d6-secret-volume\") pod \"collect-profiles-29553180-h7hsb\" (UID: \"dde8d0af-13f0-4eda-93e5-1bb4a99ac0d6\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29553180-h7hsb" Mar 11 01:00:25 crc kubenswrapper[4744]: I0311 01:00:25.410557 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tv2fb\" (UniqueName: \"kubernetes.io/projected/dde8d0af-13f0-4eda-93e5-1bb4a99ac0d6-kube-api-access-tv2fb\") pod \"collect-profiles-29553180-h7hsb\" (UID: \"dde8d0af-13f0-4eda-93e5-1bb4a99ac0d6\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29553180-h7hsb" Mar 11 01:00:25 crc kubenswrapper[4744]: I0311 01:00:25.426086 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"kube-root-ca.crt" Mar 11 01:00:25 crc kubenswrapper[4744]: I0311 01:00:25.478092 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29553180-89p78" Mar 11 01:00:25 crc kubenswrapper[4744]: I0311 01:00:25.488743 4744 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29553180-h7hsb" Mar 11 01:00:25 crc kubenswrapper[4744]: I0311 01:00:25.776438 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29553180-89p78"] Mar 11 01:00:25 crc kubenswrapper[4744]: I0311 01:00:25.822074 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29553180-h7hsb"] Mar 11 01:00:25 crc kubenswrapper[4744]: W0311 01:00:25.826897 4744 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poddde8d0af_13f0_4eda_93e5_1bb4a99ac0d6.slice/crio-d67f1d489ed1cc3acaa59feb26f5bf0839b836f12e3de21ceadc54e18206e87e WatchSource:0}: Error finding container d67f1d489ed1cc3acaa59feb26f5bf0839b836f12e3de21ceadc54e18206e87e: Status 404 returned error can't find the container with id d67f1d489ed1cc3acaa59feb26f5bf0839b836f12e3de21ceadc54e18206e87e Mar 11 01:00:25 crc kubenswrapper[4744]: I0311 01:00:25.980430 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"kube-storage-version-migrator-operator-dockercfg-2bh8d" Mar 11 01:00:26 crc kubenswrapper[4744]: I0311 01:00:26.180525 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29553180-h7hsb" event={"ID":"dde8d0af-13f0-4eda-93e5-1bb4a99ac0d6","Type":"ContainerStarted","Data":"b7de441c4af0cf24e698bfb4e00dc6fb117b9da86dde2ec839f5525f03ee4ccb"} Mar 11 01:00:26 crc kubenswrapper[4744]: I0311 01:00:26.180813 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29553180-h7hsb" event={"ID":"dde8d0af-13f0-4eda-93e5-1bb4a99ac0d6","Type":"ContainerStarted","Data":"d67f1d489ed1cc3acaa59feb26f5bf0839b836f12e3de21ceadc54e18206e87e"} Mar 11 01:00:26 crc kubenswrapper[4744]: I0311 01:00:26.183273 
4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29553180-89p78" event={"ID":"d246e126-a4f4-41d4-83b1-115cb6f674ec","Type":"ContainerStarted","Data":"32281b7f0e3c5f7bcbd26f4b01bc4086c978a385452bc18684c02028d8ec18a6"} Mar 11 01:00:26 crc kubenswrapper[4744]: I0311 01:00:26.198858 4744 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29553180-h7hsb" podStartSLOduration=12.198841085 podStartE2EDuration="12.198841085s" podCreationTimestamp="2026-03-11 01:00:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-11 01:00:26.197483727 +0000 UTC m=+383.001701332" watchObservedRunningTime="2026-03-11 01:00:26.198841085 +0000 UTC m=+383.003058690" Mar 11 01:00:26 crc kubenswrapper[4744]: I0311 01:00:26.282612 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"kube-root-ca.crt" Mar 11 01:00:27 crc kubenswrapper[4744]: I0311 01:00:27.190388 4744 generic.go:334] "Generic (PLEG): container finished" podID="dde8d0af-13f0-4eda-93e5-1bb4a99ac0d6" containerID="b7de441c4af0cf24e698bfb4e00dc6fb117b9da86dde2ec839f5525f03ee4ccb" exitCode=0 Mar 11 01:00:27 crc kubenswrapper[4744]: I0311 01:00:27.190442 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29553180-h7hsb" event={"ID":"dde8d0af-13f0-4eda-93e5-1bb4a99ac0d6","Type":"ContainerDied","Data":"b7de441c4af0cf24e698bfb4e00dc6fb117b9da86dde2ec839f5525f03ee4ccb"} Mar 11 01:00:28 crc kubenswrapper[4744]: I0311 01:00:28.571476 4744 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29553180-h7hsb" Mar 11 01:00:28 crc kubenswrapper[4744]: I0311 01:00:28.728986 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/dde8d0af-13f0-4eda-93e5-1bb4a99ac0d6-secret-volume\") pod \"dde8d0af-13f0-4eda-93e5-1bb4a99ac0d6\" (UID: \"dde8d0af-13f0-4eda-93e5-1bb4a99ac0d6\") " Mar 11 01:00:28 crc kubenswrapper[4744]: I0311 01:00:28.729088 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/dde8d0af-13f0-4eda-93e5-1bb4a99ac0d6-config-volume\") pod \"dde8d0af-13f0-4eda-93e5-1bb4a99ac0d6\" (UID: \"dde8d0af-13f0-4eda-93e5-1bb4a99ac0d6\") " Mar 11 01:00:28 crc kubenswrapper[4744]: I0311 01:00:28.729843 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/dde8d0af-13f0-4eda-93e5-1bb4a99ac0d6-config-volume" (OuterVolumeSpecName: "config-volume") pod "dde8d0af-13f0-4eda-93e5-1bb4a99ac0d6" (UID: "dde8d0af-13f0-4eda-93e5-1bb4a99ac0d6"). InnerVolumeSpecName "config-volume". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 11 01:00:28 crc kubenswrapper[4744]: I0311 01:00:28.729226 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tv2fb\" (UniqueName: \"kubernetes.io/projected/dde8d0af-13f0-4eda-93e5-1bb4a99ac0d6-kube-api-access-tv2fb\") pod \"dde8d0af-13f0-4eda-93e5-1bb4a99ac0d6\" (UID: \"dde8d0af-13f0-4eda-93e5-1bb4a99ac0d6\") " Mar 11 01:00:28 crc kubenswrapper[4744]: I0311 01:00:28.730740 4744 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/dde8d0af-13f0-4eda-93e5-1bb4a99ac0d6-config-volume\") on node \"crc\" DevicePath \"\"" Mar 11 01:00:28 crc kubenswrapper[4744]: I0311 01:00:28.737291 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dde8d0af-13f0-4eda-93e5-1bb4a99ac0d6-kube-api-access-tv2fb" (OuterVolumeSpecName: "kube-api-access-tv2fb") pod "dde8d0af-13f0-4eda-93e5-1bb4a99ac0d6" (UID: "dde8d0af-13f0-4eda-93e5-1bb4a99ac0d6"). InnerVolumeSpecName "kube-api-access-tv2fb". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 01:00:28 crc kubenswrapper[4744]: I0311 01:00:28.737500 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dde8d0af-13f0-4eda-93e5-1bb4a99ac0d6-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "dde8d0af-13f0-4eda-93e5-1bb4a99ac0d6" (UID: "dde8d0af-13f0-4eda-93e5-1bb4a99ac0d6"). InnerVolumeSpecName "secret-volume". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 01:00:28 crc kubenswrapper[4744]: I0311 01:00:28.832930 4744 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/dde8d0af-13f0-4eda-93e5-1bb4a99ac0d6-secret-volume\") on node \"crc\" DevicePath \"\"" Mar 11 01:00:28 crc kubenswrapper[4744]: I0311 01:00:28.833065 4744 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tv2fb\" (UniqueName: \"kubernetes.io/projected/dde8d0af-13f0-4eda-93e5-1bb4a99ac0d6-kube-api-access-tv2fb\") on node \"crc\" DevicePath \"\"" Mar 11 01:00:29 crc kubenswrapper[4744]: I0311 01:00:29.202672 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29553180-h7hsb" event={"ID":"dde8d0af-13f0-4eda-93e5-1bb4a99ac0d6","Type":"ContainerDied","Data":"d67f1d489ed1cc3acaa59feb26f5bf0839b836f12e3de21ceadc54e18206e87e"} Mar 11 01:00:29 crc kubenswrapper[4744]: I0311 01:00:29.202708 4744 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29553180-h7hsb" Mar 11 01:00:29 crc kubenswrapper[4744]: I0311 01:00:29.202712 4744 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d67f1d489ed1cc3acaa59feb26f5bf0839b836f12e3de21ceadc54e18206e87e" Mar 11 01:00:31 crc kubenswrapper[4744]: I0311 01:00:31.965480 4744 kubelet.go:2431] "SyncLoop REMOVE" source="file" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Mar 11 01:00:31 crc kubenswrapper[4744]: I0311 01:00:31.966325 4744 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" containerID="cri-o://2adc76989ea79bc4b3ba27a5a7c3e41f7ec56593e840c74754b6b77c54414a91" gracePeriod=5 Mar 11 01:00:37 crc kubenswrapper[4744]: I0311 01:00:37.260973 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-startup-monitor-crc_f85e55b1a89d02b0cb034b1ea31ed45a/startup-monitor/0.log" Mar 11 01:00:37 crc kubenswrapper[4744]: I0311 01:00:37.261809 4744 generic.go:334] "Generic (PLEG): container finished" podID="f85e55b1a89d02b0cb034b1ea31ed45a" containerID="2adc76989ea79bc4b3ba27a5a7c3e41f7ec56593e840c74754b6b77c54414a91" exitCode=137 Mar 11 01:00:37 crc kubenswrapper[4744]: I0311 01:00:37.569328 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-startup-monitor-crc_f85e55b1a89d02b0cb034b1ea31ed45a/startup-monitor/0.log" Mar 11 01:00:37 crc kubenswrapper[4744]: I0311 01:00:37.569435 4744 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 11 01:00:37 crc kubenswrapper[4744]: I0311 01:00:37.663894 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Mar 11 01:00:37 crc kubenswrapper[4744]: I0311 01:00:37.663969 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Mar 11 01:00:37 crc kubenswrapper[4744]: I0311 01:00:37.664013 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock" (OuterVolumeSpecName: "var-lock") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "var-lock". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 11 01:00:37 crc kubenswrapper[4744]: I0311 01:00:37.664051 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Mar 11 01:00:37 crc kubenswrapper[4744]: I0311 01:00:37.664150 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Mar 11 01:00:37 crc kubenswrapper[4744]: I0311 01:00:37.664171 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir" (OuterVolumeSpecName: "resource-dir") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "resource-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 11 01:00:37 crc kubenswrapper[4744]: I0311 01:00:37.664270 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Mar 11 01:00:37 crc kubenswrapper[4744]: I0311 01:00:37.664279 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests" (OuterVolumeSpecName: "manifests") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "manifests". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 11 01:00:37 crc kubenswrapper[4744]: I0311 01:00:37.664449 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log" (OuterVolumeSpecName: "var-log") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "var-log". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 11 01:00:37 crc kubenswrapper[4744]: I0311 01:00:37.664813 4744 reconciler_common.go:293] "Volume detached for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") on node \"crc\" DevicePath \"\"" Mar 11 01:00:37 crc kubenswrapper[4744]: I0311 01:00:37.664847 4744 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") on node \"crc\" DevicePath \"\"" Mar 11 01:00:37 crc kubenswrapper[4744]: I0311 01:00:37.664871 4744 reconciler_common.go:293] "Volume detached for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") on node \"crc\" DevicePath \"\"" Mar 11 01:00:37 crc kubenswrapper[4744]: I0311 01:00:37.664896 4744 reconciler_common.go:293] "Volume detached for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") on node \"crc\" DevicePath \"\"" Mar 11 01:00:37 crc kubenswrapper[4744]: I0311 01:00:37.676967 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir" (OuterVolumeSpecName: "pod-resource-dir") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "pod-resource-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 11 01:00:37 crc kubenswrapper[4744]: I0311 01:00:37.766857 4744 reconciler_common.go:293] "Volume detached for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") on node \"crc\" DevicePath \"\"" Mar 11 01:00:37 crc kubenswrapper[4744]: I0311 01:00:37.986401 4744 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" path="/var/lib/kubelet/pods/f85e55b1a89d02b0cb034b1ea31ed45a/volumes" Mar 11 01:00:37 crc kubenswrapper[4744]: I0311 01:00:37.986871 4744 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" podUID="" Mar 11 01:00:38 crc kubenswrapper[4744]: I0311 01:00:38.000300 4744 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Mar 11 01:00:38 crc kubenswrapper[4744]: I0311 01:00:38.000362 4744 kubelet.go:2649] "Unable to find pod for mirror pod, skipping" mirrorPod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" mirrorPodUID="9e51c683-d457-46fa-afab-6630dbe2e123" Mar 11 01:00:38 crc kubenswrapper[4744]: I0311 01:00:38.006401 4744 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Mar 11 01:00:38 crc kubenswrapper[4744]: I0311 01:00:38.006446 4744 kubelet.go:2673] "Unable to find pod for mirror pod, skipping" mirrorPod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" mirrorPodUID="9e51c683-d457-46fa-afab-6630dbe2e123" Mar 11 01:00:38 crc kubenswrapper[4744]: I0311 01:00:38.271421 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-startup-monitor-crc_f85e55b1a89d02b0cb034b1ea31ed45a/startup-monitor/0.log" Mar 11 01:00:38 crc kubenswrapper[4744]: I0311 01:00:38.271934 4744 scope.go:117] "RemoveContainer" 
containerID="2adc76989ea79bc4b3ba27a5a7c3e41f7ec56593e840c74754b6b77c54414a91" Mar 11 01:00:38 crc kubenswrapper[4744]: I0311 01:00:38.272045 4744 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 11 01:00:48 crc kubenswrapper[4744]: I0311 01:00:48.336157 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29553180-89p78" event={"ID":"d246e126-a4f4-41d4-83b1-115cb6f674ec","Type":"ContainerStarted","Data":"3741a4b098ab9f06f8a342cf1d6e93b93d68d1bc46c5ebd60097f1532fc283bd"} Mar 11 01:00:48 crc kubenswrapper[4744]: I0311 01:00:48.360269 4744 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29553180-89p78" podStartSLOduration=12.265480342 podStartE2EDuration="34.360247856s" podCreationTimestamp="2026-03-11 01:00:14 +0000 UTC" firstStartedPulling="2026-03-11 01:00:25.780266823 +0000 UTC m=+382.584484468" lastFinishedPulling="2026-03-11 01:00:47.875034327 +0000 UTC m=+404.679251982" observedRunningTime="2026-03-11 01:00:48.356415685 +0000 UTC m=+405.160633290" watchObservedRunningTime="2026-03-11 01:00:48.360247856 +0000 UTC m=+405.164465471" Mar 11 01:00:49 crc kubenswrapper[4744]: I0311 01:00:49.344238 4744 generic.go:334] "Generic (PLEG): container finished" podID="d246e126-a4f4-41d4-83b1-115cb6f674ec" containerID="3741a4b098ab9f06f8a342cf1d6e93b93d68d1bc46c5ebd60097f1532fc283bd" exitCode=0 Mar 11 01:00:49 crc kubenswrapper[4744]: I0311 01:00:49.344281 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29553180-89p78" event={"ID":"d246e126-a4f4-41d4-83b1-115cb6f674ec","Type":"ContainerDied","Data":"3741a4b098ab9f06f8a342cf1d6e93b93d68d1bc46c5ebd60097f1532fc283bd"} Mar 11 01:00:50 crc kubenswrapper[4744]: I0311 01:00:50.717625 4744 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29553180-89p78" Mar 11 01:00:50 crc kubenswrapper[4744]: I0311 01:00:50.851958 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-p2wtk\" (UniqueName: \"kubernetes.io/projected/d246e126-a4f4-41d4-83b1-115cb6f674ec-kube-api-access-p2wtk\") pod \"d246e126-a4f4-41d4-83b1-115cb6f674ec\" (UID: \"d246e126-a4f4-41d4-83b1-115cb6f674ec\") " Mar 11 01:00:50 crc kubenswrapper[4744]: I0311 01:00:50.860759 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d246e126-a4f4-41d4-83b1-115cb6f674ec-kube-api-access-p2wtk" (OuterVolumeSpecName: "kube-api-access-p2wtk") pod "d246e126-a4f4-41d4-83b1-115cb6f674ec" (UID: "d246e126-a4f4-41d4-83b1-115cb6f674ec"). InnerVolumeSpecName "kube-api-access-p2wtk". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 01:00:50 crc kubenswrapper[4744]: I0311 01:00:50.953912 4744 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-p2wtk\" (UniqueName: \"kubernetes.io/projected/d246e126-a4f4-41d4-83b1-115cb6f674ec-kube-api-access-p2wtk\") on node \"crc\" DevicePath \"\"" Mar 11 01:00:51 crc kubenswrapper[4744]: I0311 01:00:51.359815 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29553180-89p78" event={"ID":"d246e126-a4f4-41d4-83b1-115cb6f674ec","Type":"ContainerDied","Data":"32281b7f0e3c5f7bcbd26f4b01bc4086c978a385452bc18684c02028d8ec18a6"} Mar 11 01:00:51 crc kubenswrapper[4744]: I0311 01:00:51.360173 4744 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="32281b7f0e3c5f7bcbd26f4b01bc4086c978a385452bc18684c02028d8ec18a6" Mar 11 01:00:51 crc kubenswrapper[4744]: I0311 01:00:51.359871 4744 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29553180-89p78" Mar 11 01:01:12 crc kubenswrapper[4744]: I0311 01:01:12.409196 4744 patch_prober.go:28] interesting pod/machine-config-daemon-678nx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 11 01:01:12 crc kubenswrapper[4744]: I0311 01:01:12.410216 4744 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-678nx" podUID="a15dc7ac-7c34-4135-b6eb-a85122800ce9" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 11 01:01:38 crc kubenswrapper[4744]: I0311 01:01:38.818811 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-l594w"] Mar 11 01:01:38 crc kubenswrapper[4744]: E0311 01:01:38.819607 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d246e126-a4f4-41d4-83b1-115cb6f674ec" containerName="oc" Mar 11 01:01:38 crc kubenswrapper[4744]: I0311 01:01:38.819623 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="d246e126-a4f4-41d4-83b1-115cb6f674ec" containerName="oc" Mar 11 01:01:38 crc kubenswrapper[4744]: E0311 01:01:38.819634 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" Mar 11 01:01:38 crc kubenswrapper[4744]: I0311 01:01:38.819642 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" Mar 11 01:01:38 crc kubenswrapper[4744]: E0311 01:01:38.819651 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dde8d0af-13f0-4eda-93e5-1bb4a99ac0d6" containerName="collect-profiles" Mar 11 01:01:38 crc 
kubenswrapper[4744]: I0311 01:01:38.819660 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="dde8d0af-13f0-4eda-93e5-1bb4a99ac0d6" containerName="collect-profiles" Mar 11 01:01:38 crc kubenswrapper[4744]: I0311 01:01:38.819768 4744 memory_manager.go:354] "RemoveStaleState removing state" podUID="d246e126-a4f4-41d4-83b1-115cb6f674ec" containerName="oc" Mar 11 01:01:38 crc kubenswrapper[4744]: I0311 01:01:38.819779 4744 memory_manager.go:354] "RemoveStaleState removing state" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" Mar 11 01:01:38 crc kubenswrapper[4744]: I0311 01:01:38.819792 4744 memory_manager.go:354] "RemoveStaleState removing state" podUID="dde8d0af-13f0-4eda-93e5-1bb4a99ac0d6" containerName="collect-profiles" Mar 11 01:01:38 crc kubenswrapper[4744]: I0311 01:01:38.820196 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-66df7c8f76-l594w" Mar 11 01:01:38 crc kubenswrapper[4744]: I0311 01:01:38.833153 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-l594w"] Mar 11 01:01:38 crc kubenswrapper[4744]: I0311 01:01:38.941875 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/67831b5f-16ab-4c9a-881b-1ebb6a6876e0-registry-tls\") pod \"image-registry-66df7c8f76-l594w\" (UID: \"67831b5f-16ab-4c9a-881b-1ebb6a6876e0\") " pod="openshift-image-registry/image-registry-66df7c8f76-l594w" Mar 11 01:01:38 crc kubenswrapper[4744]: I0311 01:01:38.941949 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p5xk9\" (UniqueName: \"kubernetes.io/projected/67831b5f-16ab-4c9a-881b-1ebb6a6876e0-kube-api-access-p5xk9\") pod \"image-registry-66df7c8f76-l594w\" (UID: \"67831b5f-16ab-4c9a-881b-1ebb6a6876e0\") " 
pod="openshift-image-registry/image-registry-66df7c8f76-l594w" Mar 11 01:01:38 crc kubenswrapper[4744]: I0311 01:01:38.942039 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/67831b5f-16ab-4c9a-881b-1ebb6a6876e0-installation-pull-secrets\") pod \"image-registry-66df7c8f76-l594w\" (UID: \"67831b5f-16ab-4c9a-881b-1ebb6a6876e0\") " pod="openshift-image-registry/image-registry-66df7c8f76-l594w" Mar 11 01:01:38 crc kubenswrapper[4744]: I0311 01:01:38.942076 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/67831b5f-16ab-4c9a-881b-1ebb6a6876e0-ca-trust-extracted\") pod \"image-registry-66df7c8f76-l594w\" (UID: \"67831b5f-16ab-4c9a-881b-1ebb6a6876e0\") " pod="openshift-image-registry/image-registry-66df7c8f76-l594w" Mar 11 01:01:38 crc kubenswrapper[4744]: I0311 01:01:38.942171 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-66df7c8f76-l594w\" (UID: \"67831b5f-16ab-4c9a-881b-1ebb6a6876e0\") " pod="openshift-image-registry/image-registry-66df7c8f76-l594w" Mar 11 01:01:38 crc kubenswrapper[4744]: I0311 01:01:38.942249 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/67831b5f-16ab-4c9a-881b-1ebb6a6876e0-bound-sa-token\") pod \"image-registry-66df7c8f76-l594w\" (UID: \"67831b5f-16ab-4c9a-881b-1ebb6a6876e0\") " pod="openshift-image-registry/image-registry-66df7c8f76-l594w" Mar 11 01:01:38 crc kubenswrapper[4744]: I0311 01:01:38.942290 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/67831b5f-16ab-4c9a-881b-1ebb6a6876e0-trusted-ca\") pod \"image-registry-66df7c8f76-l594w\" (UID: \"67831b5f-16ab-4c9a-881b-1ebb6a6876e0\") " pod="openshift-image-registry/image-registry-66df7c8f76-l594w" Mar 11 01:01:38 crc kubenswrapper[4744]: I0311 01:01:38.942322 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/67831b5f-16ab-4c9a-881b-1ebb6a6876e0-registry-certificates\") pod \"image-registry-66df7c8f76-l594w\" (UID: \"67831b5f-16ab-4c9a-881b-1ebb6a6876e0\") " pod="openshift-image-registry/image-registry-66df7c8f76-l594w" Mar 11 01:01:38 crc kubenswrapper[4744]: I0311 01:01:38.979059 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-66df7c8f76-l594w\" (UID: \"67831b5f-16ab-4c9a-881b-1ebb6a6876e0\") " pod="openshift-image-registry/image-registry-66df7c8f76-l594w" Mar 11 01:01:39 crc kubenswrapper[4744]: I0311 01:01:39.043346 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/67831b5f-16ab-4c9a-881b-1ebb6a6876e0-registry-tls\") pod \"image-registry-66df7c8f76-l594w\" (UID: \"67831b5f-16ab-4c9a-881b-1ebb6a6876e0\") " pod="openshift-image-registry/image-registry-66df7c8f76-l594w" Mar 11 01:01:39 crc kubenswrapper[4744]: I0311 01:01:39.043445 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p5xk9\" (UniqueName: \"kubernetes.io/projected/67831b5f-16ab-4c9a-881b-1ebb6a6876e0-kube-api-access-p5xk9\") pod \"image-registry-66df7c8f76-l594w\" (UID: \"67831b5f-16ab-4c9a-881b-1ebb6a6876e0\") " pod="openshift-image-registry/image-registry-66df7c8f76-l594w" Mar 11 01:01:39 crc 
kubenswrapper[4744]: I0311 01:01:39.043560 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/67831b5f-16ab-4c9a-881b-1ebb6a6876e0-ca-trust-extracted\") pod \"image-registry-66df7c8f76-l594w\" (UID: \"67831b5f-16ab-4c9a-881b-1ebb6a6876e0\") " pod="openshift-image-registry/image-registry-66df7c8f76-l594w" Mar 11 01:01:39 crc kubenswrapper[4744]: I0311 01:01:39.043600 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/67831b5f-16ab-4c9a-881b-1ebb6a6876e0-installation-pull-secrets\") pod \"image-registry-66df7c8f76-l594w\" (UID: \"67831b5f-16ab-4c9a-881b-1ebb6a6876e0\") " pod="openshift-image-registry/image-registry-66df7c8f76-l594w" Mar 11 01:01:39 crc kubenswrapper[4744]: I0311 01:01:39.043713 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/67831b5f-16ab-4c9a-881b-1ebb6a6876e0-bound-sa-token\") pod \"image-registry-66df7c8f76-l594w\" (UID: \"67831b5f-16ab-4c9a-881b-1ebb6a6876e0\") " pod="openshift-image-registry/image-registry-66df7c8f76-l594w" Mar 11 01:01:39 crc kubenswrapper[4744]: I0311 01:01:39.043763 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/67831b5f-16ab-4c9a-881b-1ebb6a6876e0-trusted-ca\") pod \"image-registry-66df7c8f76-l594w\" (UID: \"67831b5f-16ab-4c9a-881b-1ebb6a6876e0\") " pod="openshift-image-registry/image-registry-66df7c8f76-l594w" Mar 11 01:01:39 crc kubenswrapper[4744]: I0311 01:01:39.043810 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/67831b5f-16ab-4c9a-881b-1ebb6a6876e0-registry-certificates\") pod \"image-registry-66df7c8f76-l594w\" (UID: \"67831b5f-16ab-4c9a-881b-1ebb6a6876e0\") " 
pod="openshift-image-registry/image-registry-66df7c8f76-l594w" Mar 11 01:01:39 crc kubenswrapper[4744]: I0311 01:01:39.044845 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/67831b5f-16ab-4c9a-881b-1ebb6a6876e0-ca-trust-extracted\") pod \"image-registry-66df7c8f76-l594w\" (UID: \"67831b5f-16ab-4c9a-881b-1ebb6a6876e0\") " pod="openshift-image-registry/image-registry-66df7c8f76-l594w" Mar 11 01:01:39 crc kubenswrapper[4744]: I0311 01:01:39.045966 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/67831b5f-16ab-4c9a-881b-1ebb6a6876e0-trusted-ca\") pod \"image-registry-66df7c8f76-l594w\" (UID: \"67831b5f-16ab-4c9a-881b-1ebb6a6876e0\") " pod="openshift-image-registry/image-registry-66df7c8f76-l594w" Mar 11 01:01:39 crc kubenswrapper[4744]: I0311 01:01:39.046839 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/67831b5f-16ab-4c9a-881b-1ebb6a6876e0-registry-certificates\") pod \"image-registry-66df7c8f76-l594w\" (UID: \"67831b5f-16ab-4c9a-881b-1ebb6a6876e0\") " pod="openshift-image-registry/image-registry-66df7c8f76-l594w" Mar 11 01:01:39 crc kubenswrapper[4744]: I0311 01:01:39.052382 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/67831b5f-16ab-4c9a-881b-1ebb6a6876e0-registry-tls\") pod \"image-registry-66df7c8f76-l594w\" (UID: \"67831b5f-16ab-4c9a-881b-1ebb6a6876e0\") " pod="openshift-image-registry/image-registry-66df7c8f76-l594w" Mar 11 01:01:39 crc kubenswrapper[4744]: I0311 01:01:39.052744 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/67831b5f-16ab-4c9a-881b-1ebb6a6876e0-installation-pull-secrets\") pod \"image-registry-66df7c8f76-l594w\" (UID: 
\"67831b5f-16ab-4c9a-881b-1ebb6a6876e0\") " pod="openshift-image-registry/image-registry-66df7c8f76-l594w" Mar 11 01:01:39 crc kubenswrapper[4744]: I0311 01:01:39.063979 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p5xk9\" (UniqueName: \"kubernetes.io/projected/67831b5f-16ab-4c9a-881b-1ebb6a6876e0-kube-api-access-p5xk9\") pod \"image-registry-66df7c8f76-l594w\" (UID: \"67831b5f-16ab-4c9a-881b-1ebb6a6876e0\") " pod="openshift-image-registry/image-registry-66df7c8f76-l594w" Mar 11 01:01:39 crc kubenswrapper[4744]: I0311 01:01:39.081591 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/67831b5f-16ab-4c9a-881b-1ebb6a6876e0-bound-sa-token\") pod \"image-registry-66df7c8f76-l594w\" (UID: \"67831b5f-16ab-4c9a-881b-1ebb6a6876e0\") " pod="openshift-image-registry/image-registry-66df7c8f76-l594w" Mar 11 01:01:39 crc kubenswrapper[4744]: I0311 01:01:39.135188 4744 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-66df7c8f76-l594w" Mar 11 01:01:39 crc kubenswrapper[4744]: I0311 01:01:39.426824 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-l594w"] Mar 11 01:01:39 crc kubenswrapper[4744]: I0311 01:01:39.701682 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-66df7c8f76-l594w" event={"ID":"67831b5f-16ab-4c9a-881b-1ebb6a6876e0","Type":"ContainerStarted","Data":"689afce45fdf18600da3e0c87bd6b578d855ff049a9912cc054ceb33ca75ed90"} Mar 11 01:01:39 crc kubenswrapper[4744]: I0311 01:01:39.702106 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-image-registry/image-registry-66df7c8f76-l594w" Mar 11 01:01:39 crc kubenswrapper[4744]: I0311 01:01:39.702127 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-66df7c8f76-l594w" event={"ID":"67831b5f-16ab-4c9a-881b-1ebb6a6876e0","Type":"ContainerStarted","Data":"d90aba573adcd590afa37d517bcec271a18be97182f675e16485de3ef1047a3c"} Mar 11 01:01:39 crc kubenswrapper[4744]: I0311 01:01:39.727428 4744 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-66df7c8f76-l594w" podStartSLOduration=1.727400289 podStartE2EDuration="1.727400289s" podCreationTimestamp="2026-03-11 01:01:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-11 01:01:39.722207263 +0000 UTC m=+456.526424878" watchObservedRunningTime="2026-03-11 01:01:39.727400289 +0000 UTC m=+456.531617924" Mar 11 01:01:42 crc kubenswrapper[4744]: I0311 01:01:42.409367 4744 patch_prober.go:28] interesting pod/machine-config-daemon-678nx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get 
\"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 11 01:01:42 crc kubenswrapper[4744]: I0311 01:01:42.409446 4744 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-678nx" podUID="a15dc7ac-7c34-4135-b6eb-a85122800ce9" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 11 01:01:56 crc kubenswrapper[4744]: I0311 01:01:56.829871 4744 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-n9qr2"] Mar 11 01:01:56 crc kubenswrapper[4744]: I0311 01:01:56.830835 4744 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-n9qr2" podUID="a870760b-88e5-4526-8f91-ef89201e2a13" containerName="registry-server" containerID="cri-o://178240e20dab3856c62382a7c28316fbefb796cf3563e16a8a442385c705f64c" gracePeriod=30 Mar 11 01:01:56 crc kubenswrapper[4744]: I0311 01:01:56.877325 4744 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-kk7lx"] Mar 11 01:01:56 crc kubenswrapper[4744]: I0311 01:01:56.877845 4744 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-kk7lx" podUID="9c559a48-ac87-4cab-848d-f2f647f8396b" containerName="registry-server" containerID="cri-o://d4d3fb4bb84b3def82fa38eb030f935e21e6a5a538100ba085ab3ef0c4bd0d91" gracePeriod=30 Mar 11 01:01:56 crc kubenswrapper[4744]: I0311 01:01:56.890883 4744 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-k6qjd"] Mar 11 01:01:56 crc kubenswrapper[4744]: I0311 01:01:56.891162 4744 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/marketplace-operator-79b997595-k6qjd" 
podUID="3834cb5e-8777-40cb-9a72-75c4d6fb5638" containerName="marketplace-operator" containerID="cri-o://22739c3d7ec61b0021dfcc7c27c696c6baf34286150f9cc1160ce673f1eaa0c6" gracePeriod=30 Mar 11 01:01:56 crc kubenswrapper[4744]: I0311 01:01:56.897579 4744 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-pfjjz"] Mar 11 01:01:56 crc kubenswrapper[4744]: I0311 01:01:56.897822 4744 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-pfjjz" podUID="9b7117a9-f857-4f17-a2e8-13bd999e4fe2" containerName="registry-server" containerID="cri-o://d67f0f6e8bfeebb833b490f9ee4451b24587ed964289fc769c049cfb42a6eda8" gracePeriod=30 Mar 11 01:01:56 crc kubenswrapper[4744]: I0311 01:01:56.906820 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-mbjqk"] Mar 11 01:01:56 crc kubenswrapper[4744]: I0311 01:01:56.907655 4744 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-mbjqk" Mar 11 01:01:56 crc kubenswrapper[4744]: I0311 01:01:56.915660 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-mbjqk"] Mar 11 01:01:56 crc kubenswrapper[4744]: I0311 01:01:56.918961 4744 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-wr9vs"] Mar 11 01:01:56 crc kubenswrapper[4744]: I0311 01:01:56.919201 4744 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-wr9vs" podUID="b57e3e22-ee77-4a48-b62a-1a5ff5394362" containerName="registry-server" containerID="cri-o://1ab9d6584560076f288dfa56b28d5eabe20ebe8ff95425e792784058c0b8232d" gracePeriod=30 Mar 11 01:01:57 crc kubenswrapper[4744]: I0311 01:01:57.007766 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/14ecea06-1017-42af-b26b-2859e4f4db7f-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-mbjqk\" (UID: \"14ecea06-1017-42af-b26b-2859e4f4db7f\") " pod="openshift-marketplace/marketplace-operator-79b997595-mbjqk" Mar 11 01:01:57 crc kubenswrapper[4744]: I0311 01:01:57.007877 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6v62j\" (UniqueName: \"kubernetes.io/projected/14ecea06-1017-42af-b26b-2859e4f4db7f-kube-api-access-6v62j\") pod \"marketplace-operator-79b997595-mbjqk\" (UID: \"14ecea06-1017-42af-b26b-2859e4f4db7f\") " pod="openshift-marketplace/marketplace-operator-79b997595-mbjqk" Mar 11 01:01:57 crc kubenswrapper[4744]: I0311 01:01:57.007948 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-operator-metrics\" (UniqueName: 
\"kubernetes.io/secret/14ecea06-1017-42af-b26b-2859e4f4db7f-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-mbjqk\" (UID: \"14ecea06-1017-42af-b26b-2859e4f4db7f\") " pod="openshift-marketplace/marketplace-operator-79b997595-mbjqk" Mar 11 01:01:57 crc kubenswrapper[4744]: I0311 01:01:57.109191 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/14ecea06-1017-42af-b26b-2859e4f4db7f-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-mbjqk\" (UID: \"14ecea06-1017-42af-b26b-2859e4f4db7f\") " pod="openshift-marketplace/marketplace-operator-79b997595-mbjqk" Mar 11 01:01:57 crc kubenswrapper[4744]: I0311 01:01:57.109259 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6v62j\" (UniqueName: \"kubernetes.io/projected/14ecea06-1017-42af-b26b-2859e4f4db7f-kube-api-access-6v62j\") pod \"marketplace-operator-79b997595-mbjqk\" (UID: \"14ecea06-1017-42af-b26b-2859e4f4db7f\") " pod="openshift-marketplace/marketplace-operator-79b997595-mbjqk" Mar 11 01:01:57 crc kubenswrapper[4744]: I0311 01:01:57.109293 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/14ecea06-1017-42af-b26b-2859e4f4db7f-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-mbjqk\" (UID: \"14ecea06-1017-42af-b26b-2859e4f4db7f\") " pod="openshift-marketplace/marketplace-operator-79b997595-mbjqk" Mar 11 01:01:57 crc kubenswrapper[4744]: I0311 01:01:57.112138 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/14ecea06-1017-42af-b26b-2859e4f4db7f-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-mbjqk\" (UID: \"14ecea06-1017-42af-b26b-2859e4f4db7f\") " pod="openshift-marketplace/marketplace-operator-79b997595-mbjqk" Mar 11 01:01:57 
crc kubenswrapper[4744]: I0311 01:01:57.117112 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/14ecea06-1017-42af-b26b-2859e4f4db7f-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-mbjqk\" (UID: \"14ecea06-1017-42af-b26b-2859e4f4db7f\") " pod="openshift-marketplace/marketplace-operator-79b997595-mbjqk" Mar 11 01:01:57 crc kubenswrapper[4744]: I0311 01:01:57.129244 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6v62j\" (UniqueName: \"kubernetes.io/projected/14ecea06-1017-42af-b26b-2859e4f4db7f-kube-api-access-6v62j\") pod \"marketplace-operator-79b997595-mbjqk\" (UID: \"14ecea06-1017-42af-b26b-2859e4f4db7f\") " pod="openshift-marketplace/marketplace-operator-79b997595-mbjqk" Mar 11 01:01:57 crc kubenswrapper[4744]: I0311 01:01:57.298599 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-mbjqk" Mar 11 01:01:57 crc kubenswrapper[4744]: I0311 01:01:57.305356 4744 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-n9qr2" Mar 11 01:01:57 crc kubenswrapper[4744]: I0311 01:01:57.311204 4744 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-k6qjd" Mar 11 01:01:57 crc kubenswrapper[4744]: I0311 01:01:57.329781 4744 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-kk7lx" Mar 11 01:01:57 crc kubenswrapper[4744]: I0311 01:01:57.335477 4744 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-wr9vs" Mar 11 01:01:57 crc kubenswrapper[4744]: I0311 01:01:57.383355 4744 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-pfjjz" Mar 11 01:01:57 crc kubenswrapper[4744]: I0311 01:01:57.413753 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-b97n2\" (UniqueName: \"kubernetes.io/projected/3834cb5e-8777-40cb-9a72-75c4d6fb5638-kube-api-access-b97n2\") pod \"3834cb5e-8777-40cb-9a72-75c4d6fb5638\" (UID: \"3834cb5e-8777-40cb-9a72-75c4d6fb5638\") " Mar 11 01:01:57 crc kubenswrapper[4744]: I0311 01:01:57.413813 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kctvp\" (UniqueName: \"kubernetes.io/projected/a870760b-88e5-4526-8f91-ef89201e2a13-kube-api-access-kctvp\") pod \"a870760b-88e5-4526-8f91-ef89201e2a13\" (UID: \"a870760b-88e5-4526-8f91-ef89201e2a13\") " Mar 11 01:01:57 crc kubenswrapper[4744]: I0311 01:01:57.413835 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9c559a48-ac87-4cab-848d-f2f647f8396b-catalog-content\") pod \"9c559a48-ac87-4cab-848d-f2f647f8396b\" (UID: \"9c559a48-ac87-4cab-848d-f2f647f8396b\") " Mar 11 01:01:57 crc kubenswrapper[4744]: I0311 01:01:57.413859 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4tdd2\" (UniqueName: \"kubernetes.io/projected/b57e3e22-ee77-4a48-b62a-1a5ff5394362-kube-api-access-4tdd2\") pod \"b57e3e22-ee77-4a48-b62a-1a5ff5394362\" (UID: \"b57e3e22-ee77-4a48-b62a-1a5ff5394362\") " Mar 11 01:01:57 crc kubenswrapper[4744]: I0311 01:01:57.413890 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tx9dj\" (UniqueName: \"kubernetes.io/projected/9c559a48-ac87-4cab-848d-f2f647f8396b-kube-api-access-tx9dj\") pod \"9c559a48-ac87-4cab-848d-f2f647f8396b\" (UID: \"9c559a48-ac87-4cab-848d-f2f647f8396b\") " Mar 11 01:01:57 crc kubenswrapper[4744]: I0311 01:01:57.413917 4744 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/3834cb5e-8777-40cb-9a72-75c4d6fb5638-marketplace-operator-metrics\") pod \"3834cb5e-8777-40cb-9a72-75c4d6fb5638\" (UID: \"3834cb5e-8777-40cb-9a72-75c4d6fb5638\") " Mar 11 01:01:57 crc kubenswrapper[4744]: I0311 01:01:57.413940 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/3834cb5e-8777-40cb-9a72-75c4d6fb5638-marketplace-trusted-ca\") pod \"3834cb5e-8777-40cb-9a72-75c4d6fb5638\" (UID: \"3834cb5e-8777-40cb-9a72-75c4d6fb5638\") " Mar 11 01:01:57 crc kubenswrapper[4744]: I0311 01:01:57.413967 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b57e3e22-ee77-4a48-b62a-1a5ff5394362-utilities\") pod \"b57e3e22-ee77-4a48-b62a-1a5ff5394362\" (UID: \"b57e3e22-ee77-4a48-b62a-1a5ff5394362\") " Mar 11 01:01:57 crc kubenswrapper[4744]: I0311 01:01:57.413992 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9c559a48-ac87-4cab-848d-f2f647f8396b-utilities\") pod \"9c559a48-ac87-4cab-848d-f2f647f8396b\" (UID: \"9c559a48-ac87-4cab-848d-f2f647f8396b\") " Mar 11 01:01:57 crc kubenswrapper[4744]: I0311 01:01:57.414030 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a870760b-88e5-4526-8f91-ef89201e2a13-catalog-content\") pod \"a870760b-88e5-4526-8f91-ef89201e2a13\" (UID: \"a870760b-88e5-4526-8f91-ef89201e2a13\") " Mar 11 01:01:57 crc kubenswrapper[4744]: I0311 01:01:57.414047 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b57e3e22-ee77-4a48-b62a-1a5ff5394362-catalog-content\") pod 
\"b57e3e22-ee77-4a48-b62a-1a5ff5394362\" (UID: \"b57e3e22-ee77-4a48-b62a-1a5ff5394362\") " Mar 11 01:01:57 crc kubenswrapper[4744]: I0311 01:01:57.414064 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a870760b-88e5-4526-8f91-ef89201e2a13-utilities\") pod \"a870760b-88e5-4526-8f91-ef89201e2a13\" (UID: \"a870760b-88e5-4526-8f91-ef89201e2a13\") " Mar 11 01:01:57 crc kubenswrapper[4744]: I0311 01:01:57.418580 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b57e3e22-ee77-4a48-b62a-1a5ff5394362-kube-api-access-4tdd2" (OuterVolumeSpecName: "kube-api-access-4tdd2") pod "b57e3e22-ee77-4a48-b62a-1a5ff5394362" (UID: "b57e3e22-ee77-4a48-b62a-1a5ff5394362"). InnerVolumeSpecName "kube-api-access-4tdd2". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 01:01:57 crc kubenswrapper[4744]: I0311 01:01:57.418627 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a870760b-88e5-4526-8f91-ef89201e2a13-kube-api-access-kctvp" (OuterVolumeSpecName: "kube-api-access-kctvp") pod "a870760b-88e5-4526-8f91-ef89201e2a13" (UID: "a870760b-88e5-4526-8f91-ef89201e2a13"). InnerVolumeSpecName "kube-api-access-kctvp". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 01:01:57 crc kubenswrapper[4744]: I0311 01:01:57.418742 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3834cb5e-8777-40cb-9a72-75c4d6fb5638-marketplace-operator-metrics" (OuterVolumeSpecName: "marketplace-operator-metrics") pod "3834cb5e-8777-40cb-9a72-75c4d6fb5638" (UID: "3834cb5e-8777-40cb-9a72-75c4d6fb5638"). InnerVolumeSpecName "marketplace-operator-metrics". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 01:01:57 crc kubenswrapper[4744]: I0311 01:01:57.419048 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3834cb5e-8777-40cb-9a72-75c4d6fb5638-kube-api-access-b97n2" (OuterVolumeSpecName: "kube-api-access-b97n2") pod "3834cb5e-8777-40cb-9a72-75c4d6fb5638" (UID: "3834cb5e-8777-40cb-9a72-75c4d6fb5638"). InnerVolumeSpecName "kube-api-access-b97n2". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 01:01:57 crc kubenswrapper[4744]: I0311 01:01:57.419109 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3834cb5e-8777-40cb-9a72-75c4d6fb5638-marketplace-trusted-ca" (OuterVolumeSpecName: "marketplace-trusted-ca") pod "3834cb5e-8777-40cb-9a72-75c4d6fb5638" (UID: "3834cb5e-8777-40cb-9a72-75c4d6fb5638"). InnerVolumeSpecName "marketplace-trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 11 01:01:57 crc kubenswrapper[4744]: I0311 01:01:57.420060 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9c559a48-ac87-4cab-848d-f2f647f8396b-kube-api-access-tx9dj" (OuterVolumeSpecName: "kube-api-access-tx9dj") pod "9c559a48-ac87-4cab-848d-f2f647f8396b" (UID: "9c559a48-ac87-4cab-848d-f2f647f8396b"). InnerVolumeSpecName "kube-api-access-tx9dj". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 01:01:57 crc kubenswrapper[4744]: I0311 01:01:57.420294 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9c559a48-ac87-4cab-848d-f2f647f8396b-utilities" (OuterVolumeSpecName: "utilities") pod "9c559a48-ac87-4cab-848d-f2f647f8396b" (UID: "9c559a48-ac87-4cab-848d-f2f647f8396b"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 11 01:01:57 crc kubenswrapper[4744]: I0311 01:01:57.427832 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b57e3e22-ee77-4a48-b62a-1a5ff5394362-utilities" (OuterVolumeSpecName: "utilities") pod "b57e3e22-ee77-4a48-b62a-1a5ff5394362" (UID: "b57e3e22-ee77-4a48-b62a-1a5ff5394362"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 11 01:01:57 crc kubenswrapper[4744]: I0311 01:01:57.430448 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a870760b-88e5-4526-8f91-ef89201e2a13-utilities" (OuterVolumeSpecName: "utilities") pod "a870760b-88e5-4526-8f91-ef89201e2a13" (UID: "a870760b-88e5-4526-8f91-ef89201e2a13"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 11 01:01:57 crc kubenswrapper[4744]: I0311 01:01:57.489578 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a870760b-88e5-4526-8f91-ef89201e2a13-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "a870760b-88e5-4526-8f91-ef89201e2a13" (UID: "a870760b-88e5-4526-8f91-ef89201e2a13"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 11 01:01:57 crc kubenswrapper[4744]: I0311 01:01:57.490140 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9c559a48-ac87-4cab-848d-f2f647f8396b-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "9c559a48-ac87-4cab-848d-f2f647f8396b" (UID: "9c559a48-ac87-4cab-848d-f2f647f8396b"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 11 01:01:57 crc kubenswrapper[4744]: I0311 01:01:57.515363 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9b7117a9-f857-4f17-a2e8-13bd999e4fe2-utilities\") pod \"9b7117a9-f857-4f17-a2e8-13bd999e4fe2\" (UID: \"9b7117a9-f857-4f17-a2e8-13bd999e4fe2\") " Mar 11 01:01:57 crc kubenswrapper[4744]: I0311 01:01:57.515496 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9b7117a9-f857-4f17-a2e8-13bd999e4fe2-catalog-content\") pod \"9b7117a9-f857-4f17-a2e8-13bd999e4fe2\" (UID: \"9b7117a9-f857-4f17-a2e8-13bd999e4fe2\") " Mar 11 01:01:57 crc kubenswrapper[4744]: I0311 01:01:57.515587 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9drz7\" (UniqueName: \"kubernetes.io/projected/9b7117a9-f857-4f17-a2e8-13bd999e4fe2-kube-api-access-9drz7\") pod \"9b7117a9-f857-4f17-a2e8-13bd999e4fe2\" (UID: \"9b7117a9-f857-4f17-a2e8-13bd999e4fe2\") " Mar 11 01:01:57 crc kubenswrapper[4744]: I0311 01:01:57.515808 4744 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kctvp\" (UniqueName: \"kubernetes.io/projected/a870760b-88e5-4526-8f91-ef89201e2a13-kube-api-access-kctvp\") on node \"crc\" DevicePath \"\"" Mar 11 01:01:57 crc kubenswrapper[4744]: I0311 01:01:57.515825 4744 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9c559a48-ac87-4cab-848d-f2f647f8396b-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 11 01:01:57 crc kubenswrapper[4744]: I0311 01:01:57.515834 4744 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4tdd2\" (UniqueName: \"kubernetes.io/projected/b57e3e22-ee77-4a48-b62a-1a5ff5394362-kube-api-access-4tdd2\") on node \"crc\" DevicePath \"\"" Mar 11 01:01:57 crc kubenswrapper[4744]: 
I0311 01:01:57.515843 4744 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tx9dj\" (UniqueName: \"kubernetes.io/projected/9c559a48-ac87-4cab-848d-f2f647f8396b-kube-api-access-tx9dj\") on node \"crc\" DevicePath \"\"" Mar 11 01:01:57 crc kubenswrapper[4744]: I0311 01:01:57.515852 4744 reconciler_common.go:293] "Volume detached for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/3834cb5e-8777-40cb-9a72-75c4d6fb5638-marketplace-operator-metrics\") on node \"crc\" DevicePath \"\"" Mar 11 01:01:57 crc kubenswrapper[4744]: I0311 01:01:57.515861 4744 reconciler_common.go:293] "Volume detached for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/3834cb5e-8777-40cb-9a72-75c4d6fb5638-marketplace-trusted-ca\") on node \"crc\" DevicePath \"\"" Mar 11 01:01:57 crc kubenswrapper[4744]: I0311 01:01:57.515868 4744 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b57e3e22-ee77-4a48-b62a-1a5ff5394362-utilities\") on node \"crc\" DevicePath \"\"" Mar 11 01:01:57 crc kubenswrapper[4744]: I0311 01:01:57.515876 4744 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9c559a48-ac87-4cab-848d-f2f647f8396b-utilities\") on node \"crc\" DevicePath \"\"" Mar 11 01:01:57 crc kubenswrapper[4744]: I0311 01:01:57.515884 4744 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a870760b-88e5-4526-8f91-ef89201e2a13-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 11 01:01:57 crc kubenswrapper[4744]: I0311 01:01:57.515892 4744 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a870760b-88e5-4526-8f91-ef89201e2a13-utilities\") on node \"crc\" DevicePath \"\"" Mar 11 01:01:57 crc kubenswrapper[4744]: I0311 01:01:57.515901 4744 reconciler_common.go:293] "Volume detached for volume 
\"kube-api-access-b97n2\" (UniqueName: \"kubernetes.io/projected/3834cb5e-8777-40cb-9a72-75c4d6fb5638-kube-api-access-b97n2\") on node \"crc\" DevicePath \"\"" Mar 11 01:01:57 crc kubenswrapper[4744]: I0311 01:01:57.516097 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9b7117a9-f857-4f17-a2e8-13bd999e4fe2-utilities" (OuterVolumeSpecName: "utilities") pod "9b7117a9-f857-4f17-a2e8-13bd999e4fe2" (UID: "9b7117a9-f857-4f17-a2e8-13bd999e4fe2"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 11 01:01:57 crc kubenswrapper[4744]: I0311 01:01:57.534727 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9b7117a9-f857-4f17-a2e8-13bd999e4fe2-kube-api-access-9drz7" (OuterVolumeSpecName: "kube-api-access-9drz7") pod "9b7117a9-f857-4f17-a2e8-13bd999e4fe2" (UID: "9b7117a9-f857-4f17-a2e8-13bd999e4fe2"). InnerVolumeSpecName "kube-api-access-9drz7". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 01:01:57 crc kubenswrapper[4744]: I0311 01:01:57.539780 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9b7117a9-f857-4f17-a2e8-13bd999e4fe2-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "9b7117a9-f857-4f17-a2e8-13bd999e4fe2" (UID: "9b7117a9-f857-4f17-a2e8-13bd999e4fe2"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 11 01:01:57 crc kubenswrapper[4744]: I0311 01:01:57.555397 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b57e3e22-ee77-4a48-b62a-1a5ff5394362-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b57e3e22-ee77-4a48-b62a-1a5ff5394362" (UID: "b57e3e22-ee77-4a48-b62a-1a5ff5394362"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 11 01:01:57 crc kubenswrapper[4744]: I0311 01:01:57.617096 4744 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9drz7\" (UniqueName: \"kubernetes.io/projected/9b7117a9-f857-4f17-a2e8-13bd999e4fe2-kube-api-access-9drz7\") on node \"crc\" DevicePath \"\"" Mar 11 01:01:57 crc kubenswrapper[4744]: I0311 01:01:57.617136 4744 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9b7117a9-f857-4f17-a2e8-13bd999e4fe2-utilities\") on node \"crc\" DevicePath \"\"" Mar 11 01:01:57 crc kubenswrapper[4744]: I0311 01:01:57.617151 4744 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b57e3e22-ee77-4a48-b62a-1a5ff5394362-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 11 01:01:57 crc kubenswrapper[4744]: I0311 01:01:57.617162 4744 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9b7117a9-f857-4f17-a2e8-13bd999e4fe2-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 11 01:01:57 crc kubenswrapper[4744]: I0311 01:01:57.736778 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-mbjqk"] Mar 11 01:01:57 crc kubenswrapper[4744]: I0311 01:01:57.865854 4744 generic.go:334] "Generic (PLEG): container finished" podID="9b7117a9-f857-4f17-a2e8-13bd999e4fe2" containerID="d67f0f6e8bfeebb833b490f9ee4451b24587ed964289fc769c049cfb42a6eda8" exitCode=0 Mar 11 01:01:57 crc kubenswrapper[4744]: I0311 01:01:57.865917 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-pfjjz" event={"ID":"9b7117a9-f857-4f17-a2e8-13bd999e4fe2","Type":"ContainerDied","Data":"d67f0f6e8bfeebb833b490f9ee4451b24587ed964289fc769c049cfb42a6eda8"} Mar 11 01:01:57 crc kubenswrapper[4744]: I0311 01:01:57.865981 4744 util.go:48] "No ready sandbox for 
pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-pfjjz" Mar 11 01:01:57 crc kubenswrapper[4744]: I0311 01:01:57.866361 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-pfjjz" event={"ID":"9b7117a9-f857-4f17-a2e8-13bd999e4fe2","Type":"ContainerDied","Data":"1af9a53f908bfb9cc7404a73ae630bb10f87ed42c63b3e881e282449ebc0e473"} Mar 11 01:01:57 crc kubenswrapper[4744]: I0311 01:01:57.866408 4744 scope.go:117] "RemoveContainer" containerID="d67f0f6e8bfeebb833b490f9ee4451b24587ed964289fc769c049cfb42a6eda8" Mar 11 01:01:57 crc kubenswrapper[4744]: I0311 01:01:57.877373 4744 generic.go:334] "Generic (PLEG): container finished" podID="9c559a48-ac87-4cab-848d-f2f647f8396b" containerID="d4d3fb4bb84b3def82fa38eb030f935e21e6a5a538100ba085ab3ef0c4bd0d91" exitCode=0 Mar 11 01:01:57 crc kubenswrapper[4744]: I0311 01:01:57.877467 4744 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-kk7lx" Mar 11 01:01:57 crc kubenswrapper[4744]: I0311 01:01:57.877550 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-kk7lx" event={"ID":"9c559a48-ac87-4cab-848d-f2f647f8396b","Type":"ContainerDied","Data":"d4d3fb4bb84b3def82fa38eb030f935e21e6a5a538100ba085ab3ef0c4bd0d91"} Mar 11 01:01:57 crc kubenswrapper[4744]: I0311 01:01:57.877609 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-kk7lx" event={"ID":"9c559a48-ac87-4cab-848d-f2f647f8396b","Type":"ContainerDied","Data":"2c0ee2ddde362c1eb039f7a433a35d9f76ff924ff3dcc975c503639371f62178"} Mar 11 01:01:57 crc kubenswrapper[4744]: I0311 01:01:57.880177 4744 generic.go:334] "Generic (PLEG): container finished" podID="3834cb5e-8777-40cb-9a72-75c4d6fb5638" containerID="22739c3d7ec61b0021dfcc7c27c696c6baf34286150f9cc1160ce673f1eaa0c6" exitCode=0 Mar 11 01:01:57 crc kubenswrapper[4744]: I0311 
01:01:57.880234 4744 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-k6qjd" Mar 11 01:01:57 crc kubenswrapper[4744]: I0311 01:01:57.880255 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-k6qjd" event={"ID":"3834cb5e-8777-40cb-9a72-75c4d6fb5638","Type":"ContainerDied","Data":"22739c3d7ec61b0021dfcc7c27c696c6baf34286150f9cc1160ce673f1eaa0c6"} Mar 11 01:01:57 crc kubenswrapper[4744]: I0311 01:01:57.880318 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-k6qjd" event={"ID":"3834cb5e-8777-40cb-9a72-75c4d6fb5638","Type":"ContainerDied","Data":"20a9cb47ecdd46ca355b6e393cc5ecd6784ef9253e63f71224493b2c3e126a25"} Mar 11 01:01:57 crc kubenswrapper[4744]: I0311 01:01:57.884873 4744 generic.go:334] "Generic (PLEG): container finished" podID="a870760b-88e5-4526-8f91-ef89201e2a13" containerID="178240e20dab3856c62382a7c28316fbefb796cf3563e16a8a442385c705f64c" exitCode=0 Mar 11 01:01:57 crc kubenswrapper[4744]: I0311 01:01:57.884954 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-n9qr2" event={"ID":"a870760b-88e5-4526-8f91-ef89201e2a13","Type":"ContainerDied","Data":"178240e20dab3856c62382a7c28316fbefb796cf3563e16a8a442385c705f64c"} Mar 11 01:01:57 crc kubenswrapper[4744]: I0311 01:01:57.884990 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-n9qr2" event={"ID":"a870760b-88e5-4526-8f91-ef89201e2a13","Type":"ContainerDied","Data":"3a13bf641beff9c011ba04f462dfb99d16ccb2ab1b0cefb982fcb7a0f0aaadd5"} Mar 11 01:01:57 crc kubenswrapper[4744]: I0311 01:01:57.885076 4744 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-n9qr2" Mar 11 01:01:57 crc kubenswrapper[4744]: I0311 01:01:57.892405 4744 scope.go:117] "RemoveContainer" containerID="e5bd59cef7318c1e9fc1636da0ce90cc03482102976abc45e7ebaa68ee9960c4" Mar 11 01:01:57 crc kubenswrapper[4744]: I0311 01:01:57.892572 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-mbjqk" event={"ID":"14ecea06-1017-42af-b26b-2859e4f4db7f","Type":"ContainerStarted","Data":"dc5cadfbbc44208f87f3f485c5c4cf089c10b26bb1d264646c888625f837c719"} Mar 11 01:01:57 crc kubenswrapper[4744]: I0311 01:01:57.900451 4744 generic.go:334] "Generic (PLEG): container finished" podID="b57e3e22-ee77-4a48-b62a-1a5ff5394362" containerID="1ab9d6584560076f288dfa56b28d5eabe20ebe8ff95425e792784058c0b8232d" exitCode=0 Mar 11 01:01:57 crc kubenswrapper[4744]: I0311 01:01:57.900485 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-wr9vs" event={"ID":"b57e3e22-ee77-4a48-b62a-1a5ff5394362","Type":"ContainerDied","Data":"1ab9d6584560076f288dfa56b28d5eabe20ebe8ff95425e792784058c0b8232d"} Mar 11 01:01:57 crc kubenswrapper[4744]: I0311 01:01:57.900536 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-wr9vs" event={"ID":"b57e3e22-ee77-4a48-b62a-1a5ff5394362","Type":"ContainerDied","Data":"d5211369217e613b95f128fddef57146760818d2d3c920a464cf55cf0910a3cd"} Mar 11 01:01:57 crc kubenswrapper[4744]: I0311 01:01:57.900538 4744 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-wr9vs" Mar 11 01:01:57 crc kubenswrapper[4744]: I0311 01:01:57.926007 4744 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-pfjjz"] Mar 11 01:01:57 crc kubenswrapper[4744]: I0311 01:01:57.933654 4744 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-pfjjz"] Mar 11 01:01:57 crc kubenswrapper[4744]: I0311 01:01:57.937596 4744 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-kk7lx"] Mar 11 01:01:57 crc kubenswrapper[4744]: I0311 01:01:57.938727 4744 scope.go:117] "RemoveContainer" containerID="9e8b7db773aaeb11369a4fdc1ed5dd8aa8678f7ee5ff70b758e217dbc800e9b1" Mar 11 01:01:57 crc kubenswrapper[4744]: I0311 01:01:57.946332 4744 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-kk7lx"] Mar 11 01:01:57 crc kubenswrapper[4744]: I0311 01:01:57.958338 4744 scope.go:117] "RemoveContainer" containerID="d67f0f6e8bfeebb833b490f9ee4451b24587ed964289fc769c049cfb42a6eda8" Mar 11 01:01:57 crc kubenswrapper[4744]: E0311 01:01:57.958951 4744 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d67f0f6e8bfeebb833b490f9ee4451b24587ed964289fc769c049cfb42a6eda8\": container with ID starting with d67f0f6e8bfeebb833b490f9ee4451b24587ed964289fc769c049cfb42a6eda8 not found: ID does not exist" containerID="d67f0f6e8bfeebb833b490f9ee4451b24587ed964289fc769c049cfb42a6eda8" Mar 11 01:01:57 crc kubenswrapper[4744]: I0311 01:01:57.958995 4744 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d67f0f6e8bfeebb833b490f9ee4451b24587ed964289fc769c049cfb42a6eda8"} err="failed to get container status \"d67f0f6e8bfeebb833b490f9ee4451b24587ed964289fc769c049cfb42a6eda8\": rpc error: code = NotFound desc = could not find container 
\"d67f0f6e8bfeebb833b490f9ee4451b24587ed964289fc769c049cfb42a6eda8\": container with ID starting with d67f0f6e8bfeebb833b490f9ee4451b24587ed964289fc769c049cfb42a6eda8 not found: ID does not exist" Mar 11 01:01:57 crc kubenswrapper[4744]: I0311 01:01:57.959032 4744 scope.go:117] "RemoveContainer" containerID="e5bd59cef7318c1e9fc1636da0ce90cc03482102976abc45e7ebaa68ee9960c4" Mar 11 01:01:57 crc kubenswrapper[4744]: E0311 01:01:57.959964 4744 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e5bd59cef7318c1e9fc1636da0ce90cc03482102976abc45e7ebaa68ee9960c4\": container with ID starting with e5bd59cef7318c1e9fc1636da0ce90cc03482102976abc45e7ebaa68ee9960c4 not found: ID does not exist" containerID="e5bd59cef7318c1e9fc1636da0ce90cc03482102976abc45e7ebaa68ee9960c4" Mar 11 01:01:57 crc kubenswrapper[4744]: I0311 01:01:57.960048 4744 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e5bd59cef7318c1e9fc1636da0ce90cc03482102976abc45e7ebaa68ee9960c4"} err="failed to get container status \"e5bd59cef7318c1e9fc1636da0ce90cc03482102976abc45e7ebaa68ee9960c4\": rpc error: code = NotFound desc = could not find container \"e5bd59cef7318c1e9fc1636da0ce90cc03482102976abc45e7ebaa68ee9960c4\": container with ID starting with e5bd59cef7318c1e9fc1636da0ce90cc03482102976abc45e7ebaa68ee9960c4 not found: ID does not exist" Mar 11 01:01:57 crc kubenswrapper[4744]: I0311 01:01:57.960089 4744 scope.go:117] "RemoveContainer" containerID="9e8b7db773aaeb11369a4fdc1ed5dd8aa8678f7ee5ff70b758e217dbc800e9b1" Mar 11 01:01:57 crc kubenswrapper[4744]: E0311 01:01:57.960949 4744 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9e8b7db773aaeb11369a4fdc1ed5dd8aa8678f7ee5ff70b758e217dbc800e9b1\": container with ID starting with 9e8b7db773aaeb11369a4fdc1ed5dd8aa8678f7ee5ff70b758e217dbc800e9b1 not found: ID does not exist" 
containerID="9e8b7db773aaeb11369a4fdc1ed5dd8aa8678f7ee5ff70b758e217dbc800e9b1" Mar 11 01:01:57 crc kubenswrapper[4744]: I0311 01:01:57.961023 4744 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9e8b7db773aaeb11369a4fdc1ed5dd8aa8678f7ee5ff70b758e217dbc800e9b1"} err="failed to get container status \"9e8b7db773aaeb11369a4fdc1ed5dd8aa8678f7ee5ff70b758e217dbc800e9b1\": rpc error: code = NotFound desc = could not find container \"9e8b7db773aaeb11369a4fdc1ed5dd8aa8678f7ee5ff70b758e217dbc800e9b1\": container with ID starting with 9e8b7db773aaeb11369a4fdc1ed5dd8aa8678f7ee5ff70b758e217dbc800e9b1 not found: ID does not exist" Mar 11 01:01:57 crc kubenswrapper[4744]: I0311 01:01:57.961061 4744 scope.go:117] "RemoveContainer" containerID="d4d3fb4bb84b3def82fa38eb030f935e21e6a5a538100ba085ab3ef0c4bd0d91" Mar 11 01:01:57 crc kubenswrapper[4744]: I0311 01:01:57.998745 4744 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9b7117a9-f857-4f17-a2e8-13bd999e4fe2" path="/var/lib/kubelet/pods/9b7117a9-f857-4f17-a2e8-13bd999e4fe2/volumes" Mar 11 01:01:58 crc kubenswrapper[4744]: I0311 01:01:58.002129 4744 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9c559a48-ac87-4cab-848d-f2f647f8396b" path="/var/lib/kubelet/pods/9c559a48-ac87-4cab-848d-f2f647f8396b/volumes" Mar 11 01:01:58 crc kubenswrapper[4744]: I0311 01:01:58.002755 4744 scope.go:117] "RemoveContainer" containerID="f3bc07a72311c21e3f1810c1662b7a11d908ebce4207cf275097752baa2138ef" Mar 11 01:01:58 crc kubenswrapper[4744]: I0311 01:01:58.003044 4744 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-k6qjd"] Mar 11 01:01:58 crc kubenswrapper[4744]: I0311 01:01:58.003156 4744 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-k6qjd"] Mar 11 01:01:58 crc kubenswrapper[4744]: I0311 01:01:58.003251 4744 kubelet.go:2437] "SyncLoop 
DELETE" source="api" pods=["openshift-marketplace/redhat-operators-wr9vs"] Mar 11 01:01:58 crc kubenswrapper[4744]: I0311 01:01:58.003345 4744 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-wr9vs"] Mar 11 01:01:58 crc kubenswrapper[4744]: I0311 01:01:58.003451 4744 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-n9qr2"] Mar 11 01:01:58 crc kubenswrapper[4744]: I0311 01:01:58.003561 4744 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-n9qr2"] Mar 11 01:01:58 crc kubenswrapper[4744]: I0311 01:01:58.018068 4744 scope.go:117] "RemoveContainer" containerID="a5f7eba49add5482c30c4ab4ff51c226cc5c3b4200c906eeba3263a45b164bea" Mar 11 01:01:58 crc kubenswrapper[4744]: I0311 01:01:58.036817 4744 scope.go:117] "RemoveContainer" containerID="d4d3fb4bb84b3def82fa38eb030f935e21e6a5a538100ba085ab3ef0c4bd0d91" Mar 11 01:01:58 crc kubenswrapper[4744]: E0311 01:01:58.037575 4744 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d4d3fb4bb84b3def82fa38eb030f935e21e6a5a538100ba085ab3ef0c4bd0d91\": container with ID starting with d4d3fb4bb84b3def82fa38eb030f935e21e6a5a538100ba085ab3ef0c4bd0d91 not found: ID does not exist" containerID="d4d3fb4bb84b3def82fa38eb030f935e21e6a5a538100ba085ab3ef0c4bd0d91" Mar 11 01:01:58 crc kubenswrapper[4744]: I0311 01:01:58.037604 4744 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d4d3fb4bb84b3def82fa38eb030f935e21e6a5a538100ba085ab3ef0c4bd0d91"} err="failed to get container status \"d4d3fb4bb84b3def82fa38eb030f935e21e6a5a538100ba085ab3ef0c4bd0d91\": rpc error: code = NotFound desc = could not find container \"d4d3fb4bb84b3def82fa38eb030f935e21e6a5a538100ba085ab3ef0c4bd0d91\": container with ID starting with d4d3fb4bb84b3def82fa38eb030f935e21e6a5a538100ba085ab3ef0c4bd0d91 not found: ID does not 
exist" Mar 11 01:01:58 crc kubenswrapper[4744]: I0311 01:01:58.037626 4744 scope.go:117] "RemoveContainer" containerID="f3bc07a72311c21e3f1810c1662b7a11d908ebce4207cf275097752baa2138ef" Mar 11 01:01:58 crc kubenswrapper[4744]: E0311 01:01:58.038107 4744 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f3bc07a72311c21e3f1810c1662b7a11d908ebce4207cf275097752baa2138ef\": container with ID starting with f3bc07a72311c21e3f1810c1662b7a11d908ebce4207cf275097752baa2138ef not found: ID does not exist" containerID="f3bc07a72311c21e3f1810c1662b7a11d908ebce4207cf275097752baa2138ef" Mar 11 01:01:58 crc kubenswrapper[4744]: I0311 01:01:58.038240 4744 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f3bc07a72311c21e3f1810c1662b7a11d908ebce4207cf275097752baa2138ef"} err="failed to get container status \"f3bc07a72311c21e3f1810c1662b7a11d908ebce4207cf275097752baa2138ef\": rpc error: code = NotFound desc = could not find container \"f3bc07a72311c21e3f1810c1662b7a11d908ebce4207cf275097752baa2138ef\": container with ID starting with f3bc07a72311c21e3f1810c1662b7a11d908ebce4207cf275097752baa2138ef not found: ID does not exist" Mar 11 01:01:58 crc kubenswrapper[4744]: I0311 01:01:58.038360 4744 scope.go:117] "RemoveContainer" containerID="a5f7eba49add5482c30c4ab4ff51c226cc5c3b4200c906eeba3263a45b164bea" Mar 11 01:01:58 crc kubenswrapper[4744]: E0311 01:01:58.038734 4744 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a5f7eba49add5482c30c4ab4ff51c226cc5c3b4200c906eeba3263a45b164bea\": container with ID starting with a5f7eba49add5482c30c4ab4ff51c226cc5c3b4200c906eeba3263a45b164bea not found: ID does not exist" containerID="a5f7eba49add5482c30c4ab4ff51c226cc5c3b4200c906eeba3263a45b164bea" Mar 11 01:01:58 crc kubenswrapper[4744]: I0311 01:01:58.038760 4744 pod_container_deletor.go:53] 
"DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a5f7eba49add5482c30c4ab4ff51c226cc5c3b4200c906eeba3263a45b164bea"} err="failed to get container status \"a5f7eba49add5482c30c4ab4ff51c226cc5c3b4200c906eeba3263a45b164bea\": rpc error: code = NotFound desc = could not find container \"a5f7eba49add5482c30c4ab4ff51c226cc5c3b4200c906eeba3263a45b164bea\": container with ID starting with a5f7eba49add5482c30c4ab4ff51c226cc5c3b4200c906eeba3263a45b164bea not found: ID does not exist" Mar 11 01:01:58 crc kubenswrapper[4744]: I0311 01:01:58.038773 4744 scope.go:117] "RemoveContainer" containerID="22739c3d7ec61b0021dfcc7c27c696c6baf34286150f9cc1160ce673f1eaa0c6" Mar 11 01:01:58 crc kubenswrapper[4744]: I0311 01:01:58.050073 4744 scope.go:117] "RemoveContainer" containerID="22739c3d7ec61b0021dfcc7c27c696c6baf34286150f9cc1160ce673f1eaa0c6" Mar 11 01:01:58 crc kubenswrapper[4744]: E0311 01:01:58.050558 4744 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"22739c3d7ec61b0021dfcc7c27c696c6baf34286150f9cc1160ce673f1eaa0c6\": container with ID starting with 22739c3d7ec61b0021dfcc7c27c696c6baf34286150f9cc1160ce673f1eaa0c6 not found: ID does not exist" containerID="22739c3d7ec61b0021dfcc7c27c696c6baf34286150f9cc1160ce673f1eaa0c6" Mar 11 01:01:58 crc kubenswrapper[4744]: I0311 01:01:58.050586 4744 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"22739c3d7ec61b0021dfcc7c27c696c6baf34286150f9cc1160ce673f1eaa0c6"} err="failed to get container status \"22739c3d7ec61b0021dfcc7c27c696c6baf34286150f9cc1160ce673f1eaa0c6\": rpc error: code = NotFound desc = could not find container \"22739c3d7ec61b0021dfcc7c27c696c6baf34286150f9cc1160ce673f1eaa0c6\": container with ID starting with 22739c3d7ec61b0021dfcc7c27c696c6baf34286150f9cc1160ce673f1eaa0c6 not found: ID does not exist" Mar 11 01:01:58 crc kubenswrapper[4744]: I0311 01:01:58.050603 4744 scope.go:117] 
"RemoveContainer" containerID="178240e20dab3856c62382a7c28316fbefb796cf3563e16a8a442385c705f64c" Mar 11 01:01:58 crc kubenswrapper[4744]: I0311 01:01:58.069768 4744 scope.go:117] "RemoveContainer" containerID="76bbf0d81633fc0743ac72563b97b34cc469b656a84794d503b82ed5223e9d0c" Mar 11 01:01:58 crc kubenswrapper[4744]: I0311 01:01:58.084074 4744 scope.go:117] "RemoveContainer" containerID="60c38e06b2f2dc61dd76556c58a00114b591caf5bdfc08fc8f1f2c02180d8811" Mar 11 01:01:58 crc kubenswrapper[4744]: I0311 01:01:58.101866 4744 scope.go:117] "RemoveContainer" containerID="178240e20dab3856c62382a7c28316fbefb796cf3563e16a8a442385c705f64c" Mar 11 01:01:58 crc kubenswrapper[4744]: E0311 01:01:58.102151 4744 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"178240e20dab3856c62382a7c28316fbefb796cf3563e16a8a442385c705f64c\": container with ID starting with 178240e20dab3856c62382a7c28316fbefb796cf3563e16a8a442385c705f64c not found: ID does not exist" containerID="178240e20dab3856c62382a7c28316fbefb796cf3563e16a8a442385c705f64c" Mar 11 01:01:58 crc kubenswrapper[4744]: I0311 01:01:58.102177 4744 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"178240e20dab3856c62382a7c28316fbefb796cf3563e16a8a442385c705f64c"} err="failed to get container status \"178240e20dab3856c62382a7c28316fbefb796cf3563e16a8a442385c705f64c\": rpc error: code = NotFound desc = could not find container \"178240e20dab3856c62382a7c28316fbefb796cf3563e16a8a442385c705f64c\": container with ID starting with 178240e20dab3856c62382a7c28316fbefb796cf3563e16a8a442385c705f64c not found: ID does not exist" Mar 11 01:01:58 crc kubenswrapper[4744]: I0311 01:01:58.102198 4744 scope.go:117] "RemoveContainer" containerID="76bbf0d81633fc0743ac72563b97b34cc469b656a84794d503b82ed5223e9d0c" Mar 11 01:01:58 crc kubenswrapper[4744]: E0311 01:01:58.103377 4744 log.go:32] "ContainerStatus from runtime service failed" 
err="rpc error: code = NotFound desc = could not find container \"76bbf0d81633fc0743ac72563b97b34cc469b656a84794d503b82ed5223e9d0c\": container with ID starting with 76bbf0d81633fc0743ac72563b97b34cc469b656a84794d503b82ed5223e9d0c not found: ID does not exist" containerID="76bbf0d81633fc0743ac72563b97b34cc469b656a84794d503b82ed5223e9d0c" Mar 11 01:01:58 crc kubenswrapper[4744]: I0311 01:01:58.103489 4744 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"76bbf0d81633fc0743ac72563b97b34cc469b656a84794d503b82ed5223e9d0c"} err="failed to get container status \"76bbf0d81633fc0743ac72563b97b34cc469b656a84794d503b82ed5223e9d0c\": rpc error: code = NotFound desc = could not find container \"76bbf0d81633fc0743ac72563b97b34cc469b656a84794d503b82ed5223e9d0c\": container with ID starting with 76bbf0d81633fc0743ac72563b97b34cc469b656a84794d503b82ed5223e9d0c not found: ID does not exist" Mar 11 01:01:58 crc kubenswrapper[4744]: I0311 01:01:58.103601 4744 scope.go:117] "RemoveContainer" containerID="60c38e06b2f2dc61dd76556c58a00114b591caf5bdfc08fc8f1f2c02180d8811" Mar 11 01:01:58 crc kubenswrapper[4744]: E0311 01:01:58.104020 4744 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"60c38e06b2f2dc61dd76556c58a00114b591caf5bdfc08fc8f1f2c02180d8811\": container with ID starting with 60c38e06b2f2dc61dd76556c58a00114b591caf5bdfc08fc8f1f2c02180d8811 not found: ID does not exist" containerID="60c38e06b2f2dc61dd76556c58a00114b591caf5bdfc08fc8f1f2c02180d8811" Mar 11 01:01:58 crc kubenswrapper[4744]: I0311 01:01:58.104105 4744 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"60c38e06b2f2dc61dd76556c58a00114b591caf5bdfc08fc8f1f2c02180d8811"} err="failed to get container status \"60c38e06b2f2dc61dd76556c58a00114b591caf5bdfc08fc8f1f2c02180d8811\": rpc error: code = NotFound desc = could not find container 
\"60c38e06b2f2dc61dd76556c58a00114b591caf5bdfc08fc8f1f2c02180d8811\": container with ID starting with 60c38e06b2f2dc61dd76556c58a00114b591caf5bdfc08fc8f1f2c02180d8811 not found: ID does not exist" Mar 11 01:01:58 crc kubenswrapper[4744]: I0311 01:01:58.104169 4744 scope.go:117] "RemoveContainer" containerID="1ab9d6584560076f288dfa56b28d5eabe20ebe8ff95425e792784058c0b8232d" Mar 11 01:01:58 crc kubenswrapper[4744]: I0311 01:01:58.116937 4744 scope.go:117] "RemoveContainer" containerID="90c8abcdc6ae4516a76318070f74e18a838b1f10028b2809123d2b48042159f2" Mar 11 01:01:58 crc kubenswrapper[4744]: I0311 01:01:58.130493 4744 scope.go:117] "RemoveContainer" containerID="5db7c2e3fdb66779881687aab5abe03def61d9a88793adc84fdd9eeb9ece9ad9" Mar 11 01:01:58 crc kubenswrapper[4744]: I0311 01:01:58.148708 4744 scope.go:117] "RemoveContainer" containerID="1ab9d6584560076f288dfa56b28d5eabe20ebe8ff95425e792784058c0b8232d" Mar 11 01:01:58 crc kubenswrapper[4744]: E0311 01:01:58.149058 4744 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1ab9d6584560076f288dfa56b28d5eabe20ebe8ff95425e792784058c0b8232d\": container with ID starting with 1ab9d6584560076f288dfa56b28d5eabe20ebe8ff95425e792784058c0b8232d not found: ID does not exist" containerID="1ab9d6584560076f288dfa56b28d5eabe20ebe8ff95425e792784058c0b8232d" Mar 11 01:01:58 crc kubenswrapper[4744]: I0311 01:01:58.149098 4744 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1ab9d6584560076f288dfa56b28d5eabe20ebe8ff95425e792784058c0b8232d"} err="failed to get container status \"1ab9d6584560076f288dfa56b28d5eabe20ebe8ff95425e792784058c0b8232d\": rpc error: code = NotFound desc = could not find container \"1ab9d6584560076f288dfa56b28d5eabe20ebe8ff95425e792784058c0b8232d\": container with ID starting with 1ab9d6584560076f288dfa56b28d5eabe20ebe8ff95425e792784058c0b8232d not found: ID does not exist" Mar 11 01:01:58 crc 
kubenswrapper[4744]: I0311 01:01:58.149128 4744 scope.go:117] "RemoveContainer" containerID="90c8abcdc6ae4516a76318070f74e18a838b1f10028b2809123d2b48042159f2" Mar 11 01:01:58 crc kubenswrapper[4744]: E0311 01:01:58.149412 4744 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"90c8abcdc6ae4516a76318070f74e18a838b1f10028b2809123d2b48042159f2\": container with ID starting with 90c8abcdc6ae4516a76318070f74e18a838b1f10028b2809123d2b48042159f2 not found: ID does not exist" containerID="90c8abcdc6ae4516a76318070f74e18a838b1f10028b2809123d2b48042159f2" Mar 11 01:01:58 crc kubenswrapper[4744]: I0311 01:01:58.149442 4744 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"90c8abcdc6ae4516a76318070f74e18a838b1f10028b2809123d2b48042159f2"} err="failed to get container status \"90c8abcdc6ae4516a76318070f74e18a838b1f10028b2809123d2b48042159f2\": rpc error: code = NotFound desc = could not find container \"90c8abcdc6ae4516a76318070f74e18a838b1f10028b2809123d2b48042159f2\": container with ID starting with 90c8abcdc6ae4516a76318070f74e18a838b1f10028b2809123d2b48042159f2 not found: ID does not exist" Mar 11 01:01:58 crc kubenswrapper[4744]: I0311 01:01:58.149463 4744 scope.go:117] "RemoveContainer" containerID="5db7c2e3fdb66779881687aab5abe03def61d9a88793adc84fdd9eeb9ece9ad9" Mar 11 01:01:58 crc kubenswrapper[4744]: E0311 01:01:58.149764 4744 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5db7c2e3fdb66779881687aab5abe03def61d9a88793adc84fdd9eeb9ece9ad9\": container with ID starting with 5db7c2e3fdb66779881687aab5abe03def61d9a88793adc84fdd9eeb9ece9ad9 not found: ID does not exist" containerID="5db7c2e3fdb66779881687aab5abe03def61d9a88793adc84fdd9eeb9ece9ad9" Mar 11 01:01:58 crc kubenswrapper[4744]: I0311 01:01:58.149862 4744 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"5db7c2e3fdb66779881687aab5abe03def61d9a88793adc84fdd9eeb9ece9ad9"} err="failed to get container status \"5db7c2e3fdb66779881687aab5abe03def61d9a88793adc84fdd9eeb9ece9ad9\": rpc error: code = NotFound desc = could not find container \"5db7c2e3fdb66779881687aab5abe03def61d9a88793adc84fdd9eeb9ece9ad9\": container with ID starting with 5db7c2e3fdb66779881687aab5abe03def61d9a88793adc84fdd9eeb9ece9ad9 not found: ID does not exist" Mar 11 01:01:58 crc kubenswrapper[4744]: I0311 01:01:58.917701 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-mbjqk" event={"ID":"14ecea06-1017-42af-b26b-2859e4f4db7f","Type":"ContainerStarted","Data":"25e42d26fcbd3011f825f03a84aa8a692d90c8b5b23639fc658b2cfce52341cf"} Mar 11 01:01:58 crc kubenswrapper[4744]: I0311 01:01:58.918101 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-mbjqk" Mar 11 01:01:58 crc kubenswrapper[4744]: I0311 01:01:58.920326 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-79b997595-mbjqk" Mar 11 01:01:58 crc kubenswrapper[4744]: I0311 01:01:58.936891 4744 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/marketplace-operator-79b997595-mbjqk" podStartSLOduration=2.936862894 podStartE2EDuration="2.936862894s" podCreationTimestamp="2026-03-11 01:01:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-11 01:01:58.935432347 +0000 UTC m=+475.739649982" watchObservedRunningTime="2026-03-11 01:01:58.936862894 +0000 UTC m=+475.741080499" Mar 11 01:01:59 crc kubenswrapper[4744]: I0311 01:01:59.148960 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-66df7c8f76-l594w" Mar 11 
01:01:59 crc kubenswrapper[4744]: I0311 01:01:59.204914 4744 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-cg2gg"] Mar 11 01:01:59 crc kubenswrapper[4744]: I0311 01:01:59.990111 4744 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3834cb5e-8777-40cb-9a72-75c4d6fb5638" path="/var/lib/kubelet/pods/3834cb5e-8777-40cb-9a72-75c4d6fb5638/volumes" Mar 11 01:01:59 crc kubenswrapper[4744]: I0311 01:01:59.990744 4744 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a870760b-88e5-4526-8f91-ef89201e2a13" path="/var/lib/kubelet/pods/a870760b-88e5-4526-8f91-ef89201e2a13/volumes" Mar 11 01:01:59 crc kubenswrapper[4744]: I0311 01:01:59.991480 4744 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b57e3e22-ee77-4a48-b62a-1a5ff5394362" path="/var/lib/kubelet/pods/b57e3e22-ee77-4a48-b62a-1a5ff5394362/volumes" Mar 11 01:02:00 crc kubenswrapper[4744]: I0311 01:02:00.042232 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-4z9p9"] Mar 11 01:02:00 crc kubenswrapper[4744]: E0311 01:02:00.042624 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a870760b-88e5-4526-8f91-ef89201e2a13" containerName="extract-utilities" Mar 11 01:02:00 crc kubenswrapper[4744]: I0311 01:02:00.042655 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="a870760b-88e5-4526-8f91-ef89201e2a13" containerName="extract-utilities" Mar 11 01:02:00 crc kubenswrapper[4744]: E0311 01:02:00.042674 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9b7117a9-f857-4f17-a2e8-13bd999e4fe2" containerName="extract-content" Mar 11 01:02:00 crc kubenswrapper[4744]: I0311 01:02:00.042686 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="9b7117a9-f857-4f17-a2e8-13bd999e4fe2" containerName="extract-content" Mar 11 01:02:00 crc kubenswrapper[4744]: E0311 01:02:00.042708 4744 cpu_manager.go:410] "RemoveStaleState: 
removing container" podUID="a870760b-88e5-4526-8f91-ef89201e2a13" containerName="registry-server" Mar 11 01:02:00 crc kubenswrapper[4744]: I0311 01:02:00.042720 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="a870760b-88e5-4526-8f91-ef89201e2a13" containerName="registry-server" Mar 11 01:02:00 crc kubenswrapper[4744]: E0311 01:02:00.042738 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b57e3e22-ee77-4a48-b62a-1a5ff5394362" containerName="registry-server" Mar 11 01:02:00 crc kubenswrapper[4744]: I0311 01:02:00.042751 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="b57e3e22-ee77-4a48-b62a-1a5ff5394362" containerName="registry-server" Mar 11 01:02:00 crc kubenswrapper[4744]: E0311 01:02:00.042768 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9c559a48-ac87-4cab-848d-f2f647f8396b" containerName="registry-server" Mar 11 01:02:00 crc kubenswrapper[4744]: I0311 01:02:00.042779 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="9c559a48-ac87-4cab-848d-f2f647f8396b" containerName="registry-server" Mar 11 01:02:00 crc kubenswrapper[4744]: E0311 01:02:00.042796 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9c559a48-ac87-4cab-848d-f2f647f8396b" containerName="extract-utilities" Mar 11 01:02:00 crc kubenswrapper[4744]: I0311 01:02:00.042808 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="9c559a48-ac87-4cab-848d-f2f647f8396b" containerName="extract-utilities" Mar 11 01:02:00 crc kubenswrapper[4744]: E0311 01:02:00.042826 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9c559a48-ac87-4cab-848d-f2f647f8396b" containerName="extract-content" Mar 11 01:02:00 crc kubenswrapper[4744]: I0311 01:02:00.042838 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="9c559a48-ac87-4cab-848d-f2f647f8396b" containerName="extract-content" Mar 11 01:02:00 crc kubenswrapper[4744]: E0311 01:02:00.042856 4744 cpu_manager.go:410] "RemoveStaleState: removing 
container" podUID="a870760b-88e5-4526-8f91-ef89201e2a13" containerName="extract-content" Mar 11 01:02:00 crc kubenswrapper[4744]: I0311 01:02:00.042869 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="a870760b-88e5-4526-8f91-ef89201e2a13" containerName="extract-content" Mar 11 01:02:00 crc kubenswrapper[4744]: E0311 01:02:00.042884 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9b7117a9-f857-4f17-a2e8-13bd999e4fe2" containerName="extract-utilities" Mar 11 01:02:00 crc kubenswrapper[4744]: I0311 01:02:00.042896 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="9b7117a9-f857-4f17-a2e8-13bd999e4fe2" containerName="extract-utilities" Mar 11 01:02:00 crc kubenswrapper[4744]: E0311 01:02:00.042920 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9b7117a9-f857-4f17-a2e8-13bd999e4fe2" containerName="registry-server" Mar 11 01:02:00 crc kubenswrapper[4744]: I0311 01:02:00.042932 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="9b7117a9-f857-4f17-a2e8-13bd999e4fe2" containerName="registry-server" Mar 11 01:02:00 crc kubenswrapper[4744]: E0311 01:02:00.042949 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3834cb5e-8777-40cb-9a72-75c4d6fb5638" containerName="marketplace-operator" Mar 11 01:02:00 crc kubenswrapper[4744]: I0311 01:02:00.042961 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="3834cb5e-8777-40cb-9a72-75c4d6fb5638" containerName="marketplace-operator" Mar 11 01:02:00 crc kubenswrapper[4744]: E0311 01:02:00.042978 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b57e3e22-ee77-4a48-b62a-1a5ff5394362" containerName="extract-content" Mar 11 01:02:00 crc kubenswrapper[4744]: I0311 01:02:00.042993 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="b57e3e22-ee77-4a48-b62a-1a5ff5394362" containerName="extract-content" Mar 11 01:02:00 crc kubenswrapper[4744]: E0311 01:02:00.043006 4744 cpu_manager.go:410] "RemoveStaleState: removing 
container" podUID="b57e3e22-ee77-4a48-b62a-1a5ff5394362" containerName="extract-utilities" Mar 11 01:02:00 crc kubenswrapper[4744]: I0311 01:02:00.043018 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="b57e3e22-ee77-4a48-b62a-1a5ff5394362" containerName="extract-utilities" Mar 11 01:02:00 crc kubenswrapper[4744]: I0311 01:02:00.043188 4744 memory_manager.go:354] "RemoveStaleState removing state" podUID="b57e3e22-ee77-4a48-b62a-1a5ff5394362" containerName="registry-server" Mar 11 01:02:00 crc kubenswrapper[4744]: I0311 01:02:00.043210 4744 memory_manager.go:354] "RemoveStaleState removing state" podUID="a870760b-88e5-4526-8f91-ef89201e2a13" containerName="registry-server" Mar 11 01:02:00 crc kubenswrapper[4744]: I0311 01:02:00.043229 4744 memory_manager.go:354] "RemoveStaleState removing state" podUID="9c559a48-ac87-4cab-848d-f2f647f8396b" containerName="registry-server" Mar 11 01:02:00 crc kubenswrapper[4744]: I0311 01:02:00.043247 4744 memory_manager.go:354] "RemoveStaleState removing state" podUID="3834cb5e-8777-40cb-9a72-75c4d6fb5638" containerName="marketplace-operator" Mar 11 01:02:00 crc kubenswrapper[4744]: I0311 01:02:00.043268 4744 memory_manager.go:354] "RemoveStaleState removing state" podUID="9b7117a9-f857-4f17-a2e8-13bd999e4fe2" containerName="registry-server" Mar 11 01:02:00 crc kubenswrapper[4744]: I0311 01:02:00.044509 4744 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-4z9p9" Mar 11 01:02:00 crc kubenswrapper[4744]: I0311 01:02:00.046988 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb" Mar 11 01:02:00 crc kubenswrapper[4744]: I0311 01:02:00.049622 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-4z9p9"] Mar 11 01:02:00 crc kubenswrapper[4744]: I0311 01:02:00.131988 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29553182-pvwjp"] Mar 11 01:02:00 crc kubenswrapper[4744]: I0311 01:02:00.132573 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29553182-pvwjp" Mar 11 01:02:00 crc kubenswrapper[4744]: I0311 01:02:00.134798 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 11 01:02:00 crc kubenswrapper[4744]: I0311 01:02:00.134861 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 11 01:02:00 crc kubenswrapper[4744]: I0311 01:02:00.137296 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-rs77s" Mar 11 01:02:00 crc kubenswrapper[4744]: I0311 01:02:00.148067 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29553182-pvwjp"] Mar 11 01:02:00 crc kubenswrapper[4744]: I0311 01:02:00.158268 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d72sp\" (UniqueName: \"kubernetes.io/projected/7ea4c0d9-6a60-4dbf-b768-4a2c39b36f24-kube-api-access-d72sp\") pod \"redhat-marketplace-4z9p9\" (UID: \"7ea4c0d9-6a60-4dbf-b768-4a2c39b36f24\") " pod="openshift-marketplace/redhat-marketplace-4z9p9" Mar 11 01:02:00 crc kubenswrapper[4744]: I0311 01:02:00.158315 
4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j6ll7\" (UniqueName: \"kubernetes.io/projected/ca8c60ea-f39f-4095-b939-f2723480055f-kube-api-access-j6ll7\") pod \"auto-csr-approver-29553182-pvwjp\" (UID: \"ca8c60ea-f39f-4095-b939-f2723480055f\") " pod="openshift-infra/auto-csr-approver-29553182-pvwjp" Mar 11 01:02:00 crc kubenswrapper[4744]: I0311 01:02:00.158396 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7ea4c0d9-6a60-4dbf-b768-4a2c39b36f24-utilities\") pod \"redhat-marketplace-4z9p9\" (UID: \"7ea4c0d9-6a60-4dbf-b768-4a2c39b36f24\") " pod="openshift-marketplace/redhat-marketplace-4z9p9" Mar 11 01:02:00 crc kubenswrapper[4744]: I0311 01:02:00.158439 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7ea4c0d9-6a60-4dbf-b768-4a2c39b36f24-catalog-content\") pod \"redhat-marketplace-4z9p9\" (UID: \"7ea4c0d9-6a60-4dbf-b768-4a2c39b36f24\") " pod="openshift-marketplace/redhat-marketplace-4z9p9" Mar 11 01:02:00 crc kubenswrapper[4744]: I0311 01:02:00.237113 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-9h79z"] Mar 11 01:02:00 crc kubenswrapper[4744]: I0311 01:02:00.238336 4744 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-9h79z" Mar 11 01:02:00 crc kubenswrapper[4744]: I0311 01:02:00.242252 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Mar 11 01:02:00 crc kubenswrapper[4744]: I0311 01:02:00.248059 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-9h79z"] Mar 11 01:02:00 crc kubenswrapper[4744]: I0311 01:02:00.259700 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7ea4c0d9-6a60-4dbf-b768-4a2c39b36f24-utilities\") pod \"redhat-marketplace-4z9p9\" (UID: \"7ea4c0d9-6a60-4dbf-b768-4a2c39b36f24\") " pod="openshift-marketplace/redhat-marketplace-4z9p9" Mar 11 01:02:00 crc kubenswrapper[4744]: I0311 01:02:00.259749 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7ea4c0d9-6a60-4dbf-b768-4a2c39b36f24-catalog-content\") pod \"redhat-marketplace-4z9p9\" (UID: \"7ea4c0d9-6a60-4dbf-b768-4a2c39b36f24\") " pod="openshift-marketplace/redhat-marketplace-4z9p9" Mar 11 01:02:00 crc kubenswrapper[4744]: I0311 01:02:00.259771 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3b2fa563-23f0-4670-a9db-c24f901242ba-catalog-content\") pod \"redhat-operators-9h79z\" (UID: \"3b2fa563-23f0-4670-a9db-c24f901242ba\") " pod="openshift-marketplace/redhat-operators-9h79z" Mar 11 01:02:00 crc kubenswrapper[4744]: I0311 01:02:00.259794 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lxpwm\" (UniqueName: \"kubernetes.io/projected/3b2fa563-23f0-4670-a9db-c24f901242ba-kube-api-access-lxpwm\") pod \"redhat-operators-9h79z\" (UID: \"3b2fa563-23f0-4670-a9db-c24f901242ba\") " 
pod="openshift-marketplace/redhat-operators-9h79z" Mar 11 01:02:00 crc kubenswrapper[4744]: I0311 01:02:00.259820 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3b2fa563-23f0-4670-a9db-c24f901242ba-utilities\") pod \"redhat-operators-9h79z\" (UID: \"3b2fa563-23f0-4670-a9db-c24f901242ba\") " pod="openshift-marketplace/redhat-operators-9h79z" Mar 11 01:02:00 crc kubenswrapper[4744]: I0311 01:02:00.259841 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d72sp\" (UniqueName: \"kubernetes.io/projected/7ea4c0d9-6a60-4dbf-b768-4a2c39b36f24-kube-api-access-d72sp\") pod \"redhat-marketplace-4z9p9\" (UID: \"7ea4c0d9-6a60-4dbf-b768-4a2c39b36f24\") " pod="openshift-marketplace/redhat-marketplace-4z9p9" Mar 11 01:02:00 crc kubenswrapper[4744]: I0311 01:02:00.259857 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j6ll7\" (UniqueName: \"kubernetes.io/projected/ca8c60ea-f39f-4095-b939-f2723480055f-kube-api-access-j6ll7\") pod \"auto-csr-approver-29553182-pvwjp\" (UID: \"ca8c60ea-f39f-4095-b939-f2723480055f\") " pod="openshift-infra/auto-csr-approver-29553182-pvwjp" Mar 11 01:02:00 crc kubenswrapper[4744]: I0311 01:02:00.260466 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7ea4c0d9-6a60-4dbf-b768-4a2c39b36f24-utilities\") pod \"redhat-marketplace-4z9p9\" (UID: \"7ea4c0d9-6a60-4dbf-b768-4a2c39b36f24\") " pod="openshift-marketplace/redhat-marketplace-4z9p9" Mar 11 01:02:00 crc kubenswrapper[4744]: I0311 01:02:00.260692 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7ea4c0d9-6a60-4dbf-b768-4a2c39b36f24-catalog-content\") pod \"redhat-marketplace-4z9p9\" (UID: \"7ea4c0d9-6a60-4dbf-b768-4a2c39b36f24\") " 
pod="openshift-marketplace/redhat-marketplace-4z9p9" Mar 11 01:02:00 crc kubenswrapper[4744]: I0311 01:02:00.283877 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d72sp\" (UniqueName: \"kubernetes.io/projected/7ea4c0d9-6a60-4dbf-b768-4a2c39b36f24-kube-api-access-d72sp\") pod \"redhat-marketplace-4z9p9\" (UID: \"7ea4c0d9-6a60-4dbf-b768-4a2c39b36f24\") " pod="openshift-marketplace/redhat-marketplace-4z9p9" Mar 11 01:02:00 crc kubenswrapper[4744]: I0311 01:02:00.289001 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j6ll7\" (UniqueName: \"kubernetes.io/projected/ca8c60ea-f39f-4095-b939-f2723480055f-kube-api-access-j6ll7\") pod \"auto-csr-approver-29553182-pvwjp\" (UID: \"ca8c60ea-f39f-4095-b939-f2723480055f\") " pod="openshift-infra/auto-csr-approver-29553182-pvwjp" Mar 11 01:02:00 crc kubenswrapper[4744]: I0311 01:02:00.360549 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3b2fa563-23f0-4670-a9db-c24f901242ba-catalog-content\") pod \"redhat-operators-9h79z\" (UID: \"3b2fa563-23f0-4670-a9db-c24f901242ba\") " pod="openshift-marketplace/redhat-operators-9h79z" Mar 11 01:02:00 crc kubenswrapper[4744]: I0311 01:02:00.360596 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lxpwm\" (UniqueName: \"kubernetes.io/projected/3b2fa563-23f0-4670-a9db-c24f901242ba-kube-api-access-lxpwm\") pod \"redhat-operators-9h79z\" (UID: \"3b2fa563-23f0-4670-a9db-c24f901242ba\") " pod="openshift-marketplace/redhat-operators-9h79z" Mar 11 01:02:00 crc kubenswrapper[4744]: I0311 01:02:00.360634 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3b2fa563-23f0-4670-a9db-c24f901242ba-utilities\") pod \"redhat-operators-9h79z\" (UID: \"3b2fa563-23f0-4670-a9db-c24f901242ba\") " 
pod="openshift-marketplace/redhat-operators-9h79z" Mar 11 01:02:00 crc kubenswrapper[4744]: I0311 01:02:00.361040 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3b2fa563-23f0-4670-a9db-c24f901242ba-catalog-content\") pod \"redhat-operators-9h79z\" (UID: \"3b2fa563-23f0-4670-a9db-c24f901242ba\") " pod="openshift-marketplace/redhat-operators-9h79z" Mar 11 01:02:00 crc kubenswrapper[4744]: I0311 01:02:00.361076 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3b2fa563-23f0-4670-a9db-c24f901242ba-utilities\") pod \"redhat-operators-9h79z\" (UID: \"3b2fa563-23f0-4670-a9db-c24f901242ba\") " pod="openshift-marketplace/redhat-operators-9h79z" Mar 11 01:02:00 crc kubenswrapper[4744]: I0311 01:02:00.367820 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-4z9p9" Mar 11 01:02:00 crc kubenswrapper[4744]: I0311 01:02:00.379597 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lxpwm\" (UniqueName: \"kubernetes.io/projected/3b2fa563-23f0-4670-a9db-c24f901242ba-kube-api-access-lxpwm\") pod \"redhat-operators-9h79z\" (UID: \"3b2fa563-23f0-4670-a9db-c24f901242ba\") " pod="openshift-marketplace/redhat-operators-9h79z" Mar 11 01:02:00 crc kubenswrapper[4744]: I0311 01:02:00.468950 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29553182-pvwjp" Mar 11 01:02:00 crc kubenswrapper[4744]: I0311 01:02:00.614059 4744 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-9h79z" Mar 11 01:02:00 crc kubenswrapper[4744]: I0311 01:02:00.631013 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-4z9p9"] Mar 11 01:02:00 crc kubenswrapper[4744]: W0311 01:02:00.637166 4744 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7ea4c0d9_6a60_4dbf_b768_4a2c39b36f24.slice/crio-f7cefa4792db4c8aa4439ec208fadc1aaa0024ffc58e36e367ef217b63dbbda2 WatchSource:0}: Error finding container f7cefa4792db4c8aa4439ec208fadc1aaa0024ffc58e36e367ef217b63dbbda2: Status 404 returned error can't find the container with id f7cefa4792db4c8aa4439ec208fadc1aaa0024ffc58e36e367ef217b63dbbda2 Mar 11 01:02:00 crc kubenswrapper[4744]: I0311 01:02:00.677876 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29553182-pvwjp"] Mar 11 01:02:00 crc kubenswrapper[4744]: W0311 01:02:00.694571 4744 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podca8c60ea_f39f_4095_b939_f2723480055f.slice/crio-d20a2ce2aeabb156dcbec9559e0e65e6cb4ddd41e00e3f62e222bbb7b101bef4 WatchSource:0}: Error finding container d20a2ce2aeabb156dcbec9559e0e65e6cb4ddd41e00e3f62e222bbb7b101bef4: Status 404 returned error can't find the container with id d20a2ce2aeabb156dcbec9559e0e65e6cb4ddd41e00e3f62e222bbb7b101bef4 Mar 11 01:02:00 crc kubenswrapper[4744]: I0311 01:02:00.811206 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-9h79z"] Mar 11 01:02:00 crc kubenswrapper[4744]: I0311 01:02:00.941710 4744 generic.go:334] "Generic (PLEG): container finished" podID="7ea4c0d9-6a60-4dbf-b768-4a2c39b36f24" containerID="011c734b449c57265020ae91862f0c4e2a4e5549222f339c2bf289c29493a805" exitCode=0 Mar 11 01:02:00 crc kubenswrapper[4744]: I0311 01:02:00.941820 4744 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-4z9p9" event={"ID":"7ea4c0d9-6a60-4dbf-b768-4a2c39b36f24","Type":"ContainerDied","Data":"011c734b449c57265020ae91862f0c4e2a4e5549222f339c2bf289c29493a805"} Mar 11 01:02:00 crc kubenswrapper[4744]: I0311 01:02:00.942120 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-4z9p9" event={"ID":"7ea4c0d9-6a60-4dbf-b768-4a2c39b36f24","Type":"ContainerStarted","Data":"f7cefa4792db4c8aa4439ec208fadc1aaa0024ffc58e36e367ef217b63dbbda2"} Mar 11 01:02:00 crc kubenswrapper[4744]: I0311 01:02:00.946213 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29553182-pvwjp" event={"ID":"ca8c60ea-f39f-4095-b939-f2723480055f","Type":"ContainerStarted","Data":"d20a2ce2aeabb156dcbec9559e0e65e6cb4ddd41e00e3f62e222bbb7b101bef4"} Mar 11 01:02:00 crc kubenswrapper[4744]: I0311 01:02:00.951718 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-9h79z" event={"ID":"3b2fa563-23f0-4670-a9db-c24f901242ba","Type":"ContainerStarted","Data":"bc07352af7f71ecc13e7b0e6e933983ba69376ff394d215465972dc44578c54c"} Mar 11 01:02:01 crc kubenswrapper[4744]: I0311 01:02:01.959438 4744 generic.go:334] "Generic (PLEG): container finished" podID="3b2fa563-23f0-4670-a9db-c24f901242ba" containerID="4c1097b1d90316e86be97deb83e4874922fea55ac61bb88e85166445f4eb32a7" exitCode=0 Mar 11 01:02:01 crc kubenswrapper[4744]: I0311 01:02:01.959497 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-9h79z" event={"ID":"3b2fa563-23f0-4670-a9db-c24f901242ba","Type":"ContainerDied","Data":"4c1097b1d90316e86be97deb83e4874922fea55ac61bb88e85166445f4eb32a7"} Mar 11 01:02:02 crc kubenswrapper[4744]: I0311 01:02:02.448685 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-ng6wp"] Mar 11 01:02:02 crc 
kubenswrapper[4744]: I0311 01:02:02.451283 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-ng6wp" Mar 11 01:02:02 crc kubenswrapper[4744]: I0311 01:02:02.455467 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl" Mar 11 01:02:02 crc kubenswrapper[4744]: I0311 01:02:02.459803 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-ng6wp"] Mar 11 01:02:02 crc kubenswrapper[4744]: I0311 01:02:02.485970 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v2xl8\" (UniqueName: \"kubernetes.io/projected/34fd0e84-9ac8-4c64-94e2-9e774f709cda-kube-api-access-v2xl8\") pod \"community-operators-ng6wp\" (UID: \"34fd0e84-9ac8-4c64-94e2-9e774f709cda\") " pod="openshift-marketplace/community-operators-ng6wp" Mar 11 01:02:02 crc kubenswrapper[4744]: I0311 01:02:02.486284 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/34fd0e84-9ac8-4c64-94e2-9e774f709cda-utilities\") pod \"community-operators-ng6wp\" (UID: \"34fd0e84-9ac8-4c64-94e2-9e774f709cda\") " pod="openshift-marketplace/community-operators-ng6wp" Mar 11 01:02:02 crc kubenswrapper[4744]: I0311 01:02:02.486369 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/34fd0e84-9ac8-4c64-94e2-9e774f709cda-catalog-content\") pod \"community-operators-ng6wp\" (UID: \"34fd0e84-9ac8-4c64-94e2-9e774f709cda\") " pod="openshift-marketplace/community-operators-ng6wp" Mar 11 01:02:02 crc kubenswrapper[4744]: I0311 01:02:02.587328 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/34fd0e84-9ac8-4c64-94e2-9e774f709cda-utilities\") pod \"community-operators-ng6wp\" (UID: \"34fd0e84-9ac8-4c64-94e2-9e774f709cda\") " pod="openshift-marketplace/community-operators-ng6wp" Mar 11 01:02:02 crc kubenswrapper[4744]: I0311 01:02:02.587365 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/34fd0e84-9ac8-4c64-94e2-9e774f709cda-catalog-content\") pod \"community-operators-ng6wp\" (UID: \"34fd0e84-9ac8-4c64-94e2-9e774f709cda\") " pod="openshift-marketplace/community-operators-ng6wp" Mar 11 01:02:02 crc kubenswrapper[4744]: I0311 01:02:02.587394 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v2xl8\" (UniqueName: \"kubernetes.io/projected/34fd0e84-9ac8-4c64-94e2-9e774f709cda-kube-api-access-v2xl8\") pod \"community-operators-ng6wp\" (UID: \"34fd0e84-9ac8-4c64-94e2-9e774f709cda\") " pod="openshift-marketplace/community-operators-ng6wp" Mar 11 01:02:02 crc kubenswrapper[4744]: I0311 01:02:02.588155 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/34fd0e84-9ac8-4c64-94e2-9e774f709cda-utilities\") pod \"community-operators-ng6wp\" (UID: \"34fd0e84-9ac8-4c64-94e2-9e774f709cda\") " pod="openshift-marketplace/community-operators-ng6wp" Mar 11 01:02:02 crc kubenswrapper[4744]: I0311 01:02:02.588189 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/34fd0e84-9ac8-4c64-94e2-9e774f709cda-catalog-content\") pod \"community-operators-ng6wp\" (UID: \"34fd0e84-9ac8-4c64-94e2-9e774f709cda\") " pod="openshift-marketplace/community-operators-ng6wp" Mar 11 01:02:02 crc kubenswrapper[4744]: I0311 01:02:02.623331 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v2xl8\" (UniqueName: 
\"kubernetes.io/projected/34fd0e84-9ac8-4c64-94e2-9e774f709cda-kube-api-access-v2xl8\") pod \"community-operators-ng6wp\" (UID: \"34fd0e84-9ac8-4c64-94e2-9e774f709cda\") " pod="openshift-marketplace/community-operators-ng6wp" Mar 11 01:02:02 crc kubenswrapper[4744]: I0311 01:02:02.637822 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-tf2dx"] Mar 11 01:02:02 crc kubenswrapper[4744]: I0311 01:02:02.639378 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-tf2dx" Mar 11 01:02:02 crc kubenswrapper[4744]: I0311 01:02:02.648090 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-tf2dx"] Mar 11 01:02:02 crc kubenswrapper[4744]: I0311 01:02:02.649024 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g" Mar 11 01:02:02 crc kubenswrapper[4744]: I0311 01:02:02.688351 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/62146385-9b56-4dcc-9698-f63685b49374-catalog-content\") pod \"certified-operators-tf2dx\" (UID: \"62146385-9b56-4dcc-9698-f63685b49374\") " pod="openshift-marketplace/certified-operators-tf2dx" Mar 11 01:02:02 crc kubenswrapper[4744]: I0311 01:02:02.688403 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/62146385-9b56-4dcc-9698-f63685b49374-utilities\") pod \"certified-operators-tf2dx\" (UID: \"62146385-9b56-4dcc-9698-f63685b49374\") " pod="openshift-marketplace/certified-operators-tf2dx" Mar 11 01:02:02 crc kubenswrapper[4744]: I0311 01:02:02.688437 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qbcmh\" (UniqueName: 
\"kubernetes.io/projected/62146385-9b56-4dcc-9698-f63685b49374-kube-api-access-qbcmh\") pod \"certified-operators-tf2dx\" (UID: \"62146385-9b56-4dcc-9698-f63685b49374\") " pod="openshift-marketplace/certified-operators-tf2dx" Mar 11 01:02:02 crc kubenswrapper[4744]: I0311 01:02:02.780481 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-ng6wp" Mar 11 01:02:02 crc kubenswrapper[4744]: I0311 01:02:02.789323 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qbcmh\" (UniqueName: \"kubernetes.io/projected/62146385-9b56-4dcc-9698-f63685b49374-kube-api-access-qbcmh\") pod \"certified-operators-tf2dx\" (UID: \"62146385-9b56-4dcc-9698-f63685b49374\") " pod="openshift-marketplace/certified-operators-tf2dx" Mar 11 01:02:02 crc kubenswrapper[4744]: I0311 01:02:02.789393 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/62146385-9b56-4dcc-9698-f63685b49374-catalog-content\") pod \"certified-operators-tf2dx\" (UID: \"62146385-9b56-4dcc-9698-f63685b49374\") " pod="openshift-marketplace/certified-operators-tf2dx" Mar 11 01:02:02 crc kubenswrapper[4744]: I0311 01:02:02.789435 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/62146385-9b56-4dcc-9698-f63685b49374-utilities\") pod \"certified-operators-tf2dx\" (UID: \"62146385-9b56-4dcc-9698-f63685b49374\") " pod="openshift-marketplace/certified-operators-tf2dx" Mar 11 01:02:02 crc kubenswrapper[4744]: I0311 01:02:02.790271 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/62146385-9b56-4dcc-9698-f63685b49374-utilities\") pod \"certified-operators-tf2dx\" (UID: \"62146385-9b56-4dcc-9698-f63685b49374\") " pod="openshift-marketplace/certified-operators-tf2dx" Mar 11 
01:02:02 crc kubenswrapper[4744]: I0311 01:02:02.790369 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/62146385-9b56-4dcc-9698-f63685b49374-catalog-content\") pod \"certified-operators-tf2dx\" (UID: \"62146385-9b56-4dcc-9698-f63685b49374\") " pod="openshift-marketplace/certified-operators-tf2dx" Mar 11 01:02:02 crc kubenswrapper[4744]: I0311 01:02:02.810862 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qbcmh\" (UniqueName: \"kubernetes.io/projected/62146385-9b56-4dcc-9698-f63685b49374-kube-api-access-qbcmh\") pod \"certified-operators-tf2dx\" (UID: \"62146385-9b56-4dcc-9698-f63685b49374\") " pod="openshift-marketplace/certified-operators-tf2dx" Mar 11 01:02:02 crc kubenswrapper[4744]: I0311 01:02:02.968288 4744 generic.go:334] "Generic (PLEG): container finished" podID="7ea4c0d9-6a60-4dbf-b768-4a2c39b36f24" containerID="2a0aa129d44c275ee3cbee34be725eabaa1bdd51453b2a1a9eb2b88dc8dd76c9" exitCode=0 Mar 11 01:02:02 crc kubenswrapper[4744]: I0311 01:02:02.968457 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-4z9p9" event={"ID":"7ea4c0d9-6a60-4dbf-b768-4a2c39b36f24","Type":"ContainerDied","Data":"2a0aa129d44c275ee3cbee34be725eabaa1bdd51453b2a1a9eb2b88dc8dd76c9"} Mar 11 01:02:03 crc kubenswrapper[4744]: I0311 01:02:03.011940 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-ng6wp"] Mar 11 01:02:03 crc kubenswrapper[4744]: I0311 01:02:03.068654 4744 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-tf2dx" Mar 11 01:02:03 crc kubenswrapper[4744]: I0311 01:02:03.251208 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-tf2dx"] Mar 11 01:02:03 crc kubenswrapper[4744]: W0311 01:02:03.307351 4744 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod62146385_9b56_4dcc_9698_f63685b49374.slice/crio-790bed110d481362314a97d520507d9d65f653550b34c1246e371cd466b96104 WatchSource:0}: Error finding container 790bed110d481362314a97d520507d9d65f653550b34c1246e371cd466b96104: Status 404 returned error can't find the container with id 790bed110d481362314a97d520507d9d65f653550b34c1246e371cd466b96104 Mar 11 01:02:03 crc kubenswrapper[4744]: I0311 01:02:03.980491 4744 generic.go:334] "Generic (PLEG): container finished" podID="62146385-9b56-4dcc-9698-f63685b49374" containerID="953c9c91c2c218b9f959d7ddb6a32e345f2f833999cf40d7810819568382dd01" exitCode=0 Mar 11 01:02:03 crc kubenswrapper[4744]: I0311 01:02:03.988360 4744 generic.go:334] "Generic (PLEG): container finished" podID="34fd0e84-9ac8-4c64-94e2-9e774f709cda" containerID="1c762ba2fb50be95c57267cb0c79b8f711ec2f4682d70695bda6d869c57ee4c2" exitCode=0 Mar 11 01:02:03 crc kubenswrapper[4744]: I0311 01:02:03.995980 4744 generic.go:334] "Generic (PLEG): container finished" podID="ca8c60ea-f39f-4095-b939-f2723480055f" containerID="68572562ea8e88035d1531ad81c9551532026ffa6088df3126f7f356cf4f8adb" exitCode=0 Mar 11 01:02:04 crc kubenswrapper[4744]: I0311 01:02:03.999583 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-tf2dx" event={"ID":"62146385-9b56-4dcc-9698-f63685b49374","Type":"ContainerDied","Data":"953c9c91c2c218b9f959d7ddb6a32e345f2f833999cf40d7810819568382dd01"} Mar 11 01:02:04 crc kubenswrapper[4744]: I0311 01:02:03.999615 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/certified-operators-tf2dx" event={"ID":"62146385-9b56-4dcc-9698-f63685b49374","Type":"ContainerStarted","Data":"790bed110d481362314a97d520507d9d65f653550b34c1246e371cd466b96104"} Mar 11 01:02:04 crc kubenswrapper[4744]: I0311 01:02:03.999634 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-9h79z" event={"ID":"3b2fa563-23f0-4670-a9db-c24f901242ba","Type":"ContainerStarted","Data":"d73704f7c1423d184e0819fb26f5caa737c6d43ccacb048d26a4fd2b61d8f10c"} Mar 11 01:02:04 crc kubenswrapper[4744]: I0311 01:02:03.999644 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-ng6wp" event={"ID":"34fd0e84-9ac8-4c64-94e2-9e774f709cda","Type":"ContainerDied","Data":"1c762ba2fb50be95c57267cb0c79b8f711ec2f4682d70695bda6d869c57ee4c2"} Mar 11 01:02:04 crc kubenswrapper[4744]: I0311 01:02:03.999653 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-ng6wp" event={"ID":"34fd0e84-9ac8-4c64-94e2-9e774f709cda","Type":"ContainerStarted","Data":"1d0d383adf44e25284fda364f8eb8d2cb6e0c37c2f940f66444771d45c93c389"} Mar 11 01:02:04 crc kubenswrapper[4744]: I0311 01:02:03.999663 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-4z9p9" event={"ID":"7ea4c0d9-6a60-4dbf-b768-4a2c39b36f24","Type":"ContainerStarted","Data":"5c6002835aef29d52416e6da142d72874cbcdd9e9b29952ed32e328deec4f950"} Mar 11 01:02:04 crc kubenswrapper[4744]: I0311 01:02:03.999673 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29553182-pvwjp" event={"ID":"ca8c60ea-f39f-4095-b939-f2723480055f","Type":"ContainerDied","Data":"68572562ea8e88035d1531ad81c9551532026ffa6088df3126f7f356cf4f8adb"} Mar 11 01:02:04 crc kubenswrapper[4744]: I0311 01:02:04.073892 4744 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-marketplace/redhat-marketplace-4z9p9" podStartSLOduration=1.633368206 podStartE2EDuration="4.073874758s" podCreationTimestamp="2026-03-11 01:02:00 +0000 UTC" firstStartedPulling="2026-03-11 01:02:00.943211802 +0000 UTC m=+477.747429407" lastFinishedPulling="2026-03-11 01:02:03.383718354 +0000 UTC m=+480.187935959" observedRunningTime="2026-03-11 01:02:04.070348792 +0000 UTC m=+480.874566387" watchObservedRunningTime="2026-03-11 01:02:04.073874758 +0000 UTC m=+480.878092373" Mar 11 01:02:05 crc kubenswrapper[4744]: I0311 01:02:05.003149 4744 generic.go:334] "Generic (PLEG): container finished" podID="3b2fa563-23f0-4670-a9db-c24f901242ba" containerID="d73704f7c1423d184e0819fb26f5caa737c6d43ccacb048d26a4fd2b61d8f10c" exitCode=0 Mar 11 01:02:05 crc kubenswrapper[4744]: I0311 01:02:05.004302 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-9h79z" event={"ID":"3b2fa563-23f0-4670-a9db-c24f901242ba","Type":"ContainerDied","Data":"d73704f7c1423d184e0819fb26f5caa737c6d43ccacb048d26a4fd2b61d8f10c"} Mar 11 01:02:05 crc kubenswrapper[4744]: I0311 01:02:05.314119 4744 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29553182-pvwjp" Mar 11 01:02:05 crc kubenswrapper[4744]: I0311 01:02:05.424766 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-j6ll7\" (UniqueName: \"kubernetes.io/projected/ca8c60ea-f39f-4095-b939-f2723480055f-kube-api-access-j6ll7\") pod \"ca8c60ea-f39f-4095-b939-f2723480055f\" (UID: \"ca8c60ea-f39f-4095-b939-f2723480055f\") " Mar 11 01:02:05 crc kubenswrapper[4744]: I0311 01:02:05.429192 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ca8c60ea-f39f-4095-b939-f2723480055f-kube-api-access-j6ll7" (OuterVolumeSpecName: "kube-api-access-j6ll7") pod "ca8c60ea-f39f-4095-b939-f2723480055f" (UID: "ca8c60ea-f39f-4095-b939-f2723480055f"). 
InnerVolumeSpecName "kube-api-access-j6ll7". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 01:02:05 crc kubenswrapper[4744]: I0311 01:02:05.526483 4744 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-j6ll7\" (UniqueName: \"kubernetes.io/projected/ca8c60ea-f39f-4095-b939-f2723480055f-kube-api-access-j6ll7\") on node \"crc\" DevicePath \"\"" Mar 11 01:02:06 crc kubenswrapper[4744]: I0311 01:02:06.019829 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-9h79z" event={"ID":"3b2fa563-23f0-4670-a9db-c24f901242ba","Type":"ContainerStarted","Data":"52a48f3b56e3d2415451ad95835b6bb7acaa3d6851d124ba1a6efd04cd909153"} Mar 11 01:02:06 crc kubenswrapper[4744]: I0311 01:02:06.023642 4744 generic.go:334] "Generic (PLEG): container finished" podID="34fd0e84-9ac8-4c64-94e2-9e774f709cda" containerID="11bdb67b52f9057015ab2504a6a3fdc7513c5533e1e84ce61bf897e99caccf49" exitCode=0 Mar 11 01:02:06 crc kubenswrapper[4744]: I0311 01:02:06.023710 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-ng6wp" event={"ID":"34fd0e84-9ac8-4c64-94e2-9e774f709cda","Type":"ContainerDied","Data":"11bdb67b52f9057015ab2504a6a3fdc7513c5533e1e84ce61bf897e99caccf49"} Mar 11 01:02:06 crc kubenswrapper[4744]: I0311 01:02:06.026593 4744 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29553182-pvwjp" Mar 11 01:02:06 crc kubenswrapper[4744]: I0311 01:02:06.027005 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29553182-pvwjp" event={"ID":"ca8c60ea-f39f-4095-b939-f2723480055f","Type":"ContainerDied","Data":"d20a2ce2aeabb156dcbec9559e0e65e6cb4ddd41e00e3f62e222bbb7b101bef4"} Mar 11 01:02:06 crc kubenswrapper[4744]: I0311 01:02:06.027025 4744 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d20a2ce2aeabb156dcbec9559e0e65e6cb4ddd41e00e3f62e222bbb7b101bef4" Mar 11 01:02:06 crc kubenswrapper[4744]: I0311 01:02:06.034507 4744 generic.go:334] "Generic (PLEG): container finished" podID="62146385-9b56-4dcc-9698-f63685b49374" containerID="c385d48ead3c29fdfbb8d4cf0155265ab032794ab795689693a87fa4c0646454" exitCode=0 Mar 11 01:02:06 crc kubenswrapper[4744]: I0311 01:02:06.034565 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-tf2dx" event={"ID":"62146385-9b56-4dcc-9698-f63685b49374","Type":"ContainerDied","Data":"c385d48ead3c29fdfbb8d4cf0155265ab032794ab795689693a87fa4c0646454"} Mar 11 01:02:06 crc kubenswrapper[4744]: I0311 01:02:06.049046 4744 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-9h79z" podStartSLOduration=2.582899104 podStartE2EDuration="6.04903037s" podCreationTimestamp="2026-03-11 01:02:00 +0000 UTC" firstStartedPulling="2026-03-11 01:02:01.961980328 +0000 UTC m=+478.766197963" lastFinishedPulling="2026-03-11 01:02:05.428111624 +0000 UTC m=+482.232329229" observedRunningTime="2026-03-11 01:02:06.043208927 +0000 UTC m=+482.847426542" watchObservedRunningTime="2026-03-11 01:02:06.04903037 +0000 UTC m=+482.853247985" Mar 11 01:02:06 crc kubenswrapper[4744]: I0311 01:02:06.380837 4744 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29553176-m7tmh"] 
Mar 11 01:02:06 crc kubenswrapper[4744]: I0311 01:02:06.387830 4744 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29553176-m7tmh"]
Mar 11 01:02:07 crc kubenswrapper[4744]: I0311 01:02:07.046440 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-tf2dx" event={"ID":"62146385-9b56-4dcc-9698-f63685b49374","Type":"ContainerStarted","Data":"98696e63b0cc083b239d990e167e7f60a7481c6eecf51b3a7b48cb17d9d0e4c1"}
Mar 11 01:02:07 crc kubenswrapper[4744]: I0311 01:02:07.074788 4744 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-tf2dx" podStartSLOduration=2.611738277 podStartE2EDuration="5.074773647s" podCreationTimestamp="2026-03-11 01:02:02 +0000 UTC" firstStartedPulling="2026-03-11 01:02:04.001768444 +0000 UTC m=+480.805986069" lastFinishedPulling="2026-03-11 01:02:06.464803794 +0000 UTC m=+483.269021439" observedRunningTime="2026-03-11 01:02:07.071288242 +0000 UTC m=+483.875505857" watchObservedRunningTime="2026-03-11 01:02:07.074773647 +0000 UTC m=+483.878991262"
Mar 11 01:02:07 crc kubenswrapper[4744]: I0311 01:02:07.987713 4744 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7c454621-190e-4962-abed-72c0ec0613de" path="/var/lib/kubelet/pods/7c454621-190e-4962-abed-72c0ec0613de/volumes"
Mar 11 01:02:10 crc kubenswrapper[4744]: I0311 01:02:10.368624 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-4z9p9"
Mar 11 01:02:10 crc kubenswrapper[4744]: I0311 01:02:10.369068 4744 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-4z9p9"
Mar 11 01:02:10 crc kubenswrapper[4744]: I0311 01:02:10.446505 4744 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-4z9p9"
Mar 11 01:02:10 crc kubenswrapper[4744]: I0311 01:02:10.615500 4744 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-9h79z"
Mar 11 01:02:10 crc kubenswrapper[4744]: I0311 01:02:10.615560 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-9h79z"
Mar 11 01:02:11 crc kubenswrapper[4744]: I0311 01:02:11.148694 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-4z9p9"
Mar 11 01:02:11 crc kubenswrapper[4744]: I0311 01:02:11.677242 4744 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-9h79z" podUID="3b2fa563-23f0-4670-a9db-c24f901242ba" containerName="registry-server" probeResult="failure" output=<
Mar 11 01:02:11 crc kubenswrapper[4744]: timeout: failed to connect service ":50051" within 1s
Mar 11 01:02:11 crc kubenswrapper[4744]: >
Mar 11 01:02:12 crc kubenswrapper[4744]: I0311 01:02:12.079319 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-ng6wp" event={"ID":"34fd0e84-9ac8-4c64-94e2-9e774f709cda","Type":"ContainerStarted","Data":"5620f79b91f68371f3b547f8bdf212204eb27afaa10807744d05cf6e3be4793d"}
Mar 11 01:02:12 crc kubenswrapper[4744]: I0311 01:02:12.408889 4744 patch_prober.go:28] interesting pod/machine-config-daemon-678nx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 11 01:02:12 crc kubenswrapper[4744]: I0311 01:02:12.408967 4744 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-678nx" podUID="a15dc7ac-7c34-4135-b6eb-a85122800ce9" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 11 01:02:12 crc kubenswrapper[4744]: I0311 01:02:12.409052 4744 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-678nx"
Mar 11 01:02:12 crc kubenswrapper[4744]: I0311 01:02:12.409819 4744 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"5ee5d96431a32414b69d9b9b50318c6a1ccb5a06fc8087509e71724d04b86732"} pod="openshift-machine-config-operator/machine-config-daemon-678nx" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Mar 11 01:02:12 crc kubenswrapper[4744]: I0311 01:02:12.409894 4744 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-678nx" podUID="a15dc7ac-7c34-4135-b6eb-a85122800ce9" containerName="machine-config-daemon" containerID="cri-o://5ee5d96431a32414b69d9b9b50318c6a1ccb5a06fc8087509e71724d04b86732" gracePeriod=600
Mar 11 01:02:12 crc kubenswrapper[4744]: I0311 01:02:12.781204 4744 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-ng6wp"
Mar 11 01:02:12 crc kubenswrapper[4744]: I0311 01:02:12.781699 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-ng6wp"
Mar 11 01:02:13 crc kubenswrapper[4744]: I0311 01:02:13.069348 4744 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-tf2dx"
Mar 11 01:02:13 crc kubenswrapper[4744]: I0311 01:02:13.069414 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-tf2dx"
Mar 11 01:02:13 crc kubenswrapper[4744]: I0311 01:02:13.090821 4744 generic.go:334] "Generic (PLEG): container finished" podID="a15dc7ac-7c34-4135-b6eb-a85122800ce9" containerID="5ee5d96431a32414b69d9b9b50318c6a1ccb5a06fc8087509e71724d04b86732" exitCode=0
Mar 11 01:02:13 crc kubenswrapper[4744]: I0311 01:02:13.091773 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-678nx" event={"ID":"a15dc7ac-7c34-4135-b6eb-a85122800ce9","Type":"ContainerDied","Data":"5ee5d96431a32414b69d9b9b50318c6a1ccb5a06fc8087509e71724d04b86732"}
Mar 11 01:02:13 crc kubenswrapper[4744]: I0311 01:02:13.091816 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-678nx" event={"ID":"a15dc7ac-7c34-4135-b6eb-a85122800ce9","Type":"ContainerStarted","Data":"691f1b56c7a52ada877a74c90e388d4d56b2f8e2efd59d3ebdd8fb4d55d33c80"}
Mar 11 01:02:13 crc kubenswrapper[4744]: I0311 01:02:13.091844 4744 scope.go:117] "RemoveContainer" containerID="9d5c92e7037925f24a5b88f538d4822fae94b65ea5f07960f3e465148975af4e"
Mar 11 01:02:13 crc kubenswrapper[4744]: I0311 01:02:13.113714 4744 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-ng6wp" podStartSLOduration=4.21288262 podStartE2EDuration="11.113694227s" podCreationTimestamp="2026-03-11 01:02:02 +0000 UTC" firstStartedPulling="2026-03-11 01:02:04.001276668 +0000 UTC m=+480.805494333" lastFinishedPulling="2026-03-11 01:02:10.902088335 +0000 UTC m=+487.706305940" observedRunningTime="2026-03-11 01:02:12.106670441 +0000 UTC m=+488.910888126" watchObservedRunningTime="2026-03-11 01:02:13.113694227 +0000 UTC m=+489.917911842"
Mar 11 01:02:13 crc kubenswrapper[4744]: I0311 01:02:13.118376 4744 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-tf2dx"
Mar 11 01:02:13 crc kubenswrapper[4744]: I0311 01:02:13.837086 4744 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/community-operators-ng6wp" podUID="34fd0e84-9ac8-4c64-94e2-9e774f709cda" containerName="registry-server" probeResult="failure" output=<
Mar 11 01:02:13 crc kubenswrapper[4744]: timeout: failed to connect service ":50051" within 1s
Mar 11 01:02:13 crc kubenswrapper[4744]: >
Mar 11 01:02:14 crc kubenswrapper[4744]: I0311 01:02:14.156113 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-tf2dx"
Mar 11 01:02:20 crc kubenswrapper[4744]: I0311 01:02:20.679194 4744 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-9h79z"
Mar 11 01:02:20 crc kubenswrapper[4744]: I0311 01:02:20.754901 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-9h79z"
Mar 11 01:02:22 crc kubenswrapper[4744]: I0311 01:02:22.837071 4744 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-ng6wp"
Mar 11 01:02:22 crc kubenswrapper[4744]: I0311 01:02:22.899948 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-ng6wp"
Mar 11 01:02:24 crc kubenswrapper[4744]: I0311 01:02:24.258072 4744 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-image-registry/image-registry-697d97f7c8-cg2gg" podUID="97b9dce4-e1bd-400e-a4c2-848e9703db45" containerName="registry" containerID="cri-o://d8dea71970254c298c0b408453f7d301b1cf2f85bdfa1afdce6f1492e5ba261d" gracePeriod=30
Mar 11 01:02:24 crc kubenswrapper[4744]: I0311 01:02:24.642484 4744 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-cg2gg"
Mar 11 01:02:24 crc kubenswrapper[4744]: I0311 01:02:24.752418 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/97b9dce4-e1bd-400e-a4c2-848e9703db45-installation-pull-secrets\") pod \"97b9dce4-e1bd-400e-a4c2-848e9703db45\" (UID: \"97b9dce4-e1bd-400e-a4c2-848e9703db45\") "
Mar 11 01:02:24 crc kubenswrapper[4744]: I0311 01:02:24.752467 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/97b9dce4-e1bd-400e-a4c2-848e9703db45-registry-certificates\") pod \"97b9dce4-e1bd-400e-a4c2-848e9703db45\" (UID: \"97b9dce4-e1bd-400e-a4c2-848e9703db45\") "
Mar 11 01:02:24 crc kubenswrapper[4744]: I0311 01:02:24.752543 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lh2ck\" (UniqueName: \"kubernetes.io/projected/97b9dce4-e1bd-400e-a4c2-848e9703db45-kube-api-access-lh2ck\") pod \"97b9dce4-e1bd-400e-a4c2-848e9703db45\" (UID: \"97b9dce4-e1bd-400e-a4c2-848e9703db45\") "
Mar 11 01:02:24 crc kubenswrapper[4744]: I0311 01:02:24.752599 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/97b9dce4-e1bd-400e-a4c2-848e9703db45-trusted-ca\") pod \"97b9dce4-e1bd-400e-a4c2-848e9703db45\" (UID: \"97b9dce4-e1bd-400e-a4c2-848e9703db45\") "
Mar 11 01:02:24 crc kubenswrapper[4744]: I0311 01:02:24.752820 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-storage\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"97b9dce4-e1bd-400e-a4c2-848e9703db45\" (UID: \"97b9dce4-e1bd-400e-a4c2-848e9703db45\") "
Mar 11 01:02:24 crc kubenswrapper[4744]: I0311 01:02:24.752869 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/97b9dce4-e1bd-400e-a4c2-848e9703db45-ca-trust-extracted\") pod \"97b9dce4-e1bd-400e-a4c2-848e9703db45\" (UID: \"97b9dce4-e1bd-400e-a4c2-848e9703db45\") "
Mar 11 01:02:24 crc kubenswrapper[4744]: I0311 01:02:24.752903 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/97b9dce4-e1bd-400e-a4c2-848e9703db45-bound-sa-token\") pod \"97b9dce4-e1bd-400e-a4c2-848e9703db45\" (UID: \"97b9dce4-e1bd-400e-a4c2-848e9703db45\") "
Mar 11 01:02:24 crc kubenswrapper[4744]: I0311 01:02:24.753047 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/97b9dce4-e1bd-400e-a4c2-848e9703db45-registry-tls\") pod \"97b9dce4-e1bd-400e-a4c2-848e9703db45\" (UID: \"97b9dce4-e1bd-400e-a4c2-848e9703db45\") "
Mar 11 01:02:24 crc kubenswrapper[4744]: I0311 01:02:24.753615 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/97b9dce4-e1bd-400e-a4c2-848e9703db45-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "97b9dce4-e1bd-400e-a4c2-848e9703db45" (UID: "97b9dce4-e1bd-400e-a4c2-848e9703db45"). InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 11 01:02:24 crc kubenswrapper[4744]: I0311 01:02:24.753998 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/97b9dce4-e1bd-400e-a4c2-848e9703db45-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "97b9dce4-e1bd-400e-a4c2-848e9703db45" (UID: "97b9dce4-e1bd-400e-a4c2-848e9703db45"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 11 01:02:24 crc kubenswrapper[4744]: I0311 01:02:24.760500 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/97b9dce4-e1bd-400e-a4c2-848e9703db45-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "97b9dce4-e1bd-400e-a4c2-848e9703db45" (UID: "97b9dce4-e1bd-400e-a4c2-848e9703db45"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 11 01:02:24 crc kubenswrapper[4744]: I0311 01:02:24.761865 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/97b9dce4-e1bd-400e-a4c2-848e9703db45-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "97b9dce4-e1bd-400e-a4c2-848e9703db45" (UID: "97b9dce4-e1bd-400e-a4c2-848e9703db45"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 11 01:02:24 crc kubenswrapper[4744]: I0311 01:02:24.765998 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/97b9dce4-e1bd-400e-a4c2-848e9703db45-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "97b9dce4-e1bd-400e-a4c2-848e9703db45" (UID: "97b9dce4-e1bd-400e-a4c2-848e9703db45"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 11 01:02:24 crc kubenswrapper[4744]: I0311 01:02:24.766338 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/97b9dce4-e1bd-400e-a4c2-848e9703db45-kube-api-access-lh2ck" (OuterVolumeSpecName: "kube-api-access-lh2ck") pod "97b9dce4-e1bd-400e-a4c2-848e9703db45" (UID: "97b9dce4-e1bd-400e-a4c2-848e9703db45"). InnerVolumeSpecName "kube-api-access-lh2ck". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 11 01:02:24 crc kubenswrapper[4744]: I0311 01:02:24.769234 4744 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/97b9dce4-e1bd-400e-a4c2-848e9703db45-bound-sa-token\") on node \"crc\" DevicePath \"\""
Mar 11 01:02:24 crc kubenswrapper[4744]: I0311 01:02:24.769260 4744 reconciler_common.go:293] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/97b9dce4-e1bd-400e-a4c2-848e9703db45-registry-tls\") on node \"crc\" DevicePath \"\""
Mar 11 01:02:24 crc kubenswrapper[4744]: I0311 01:02:24.769273 4744 reconciler_common.go:293] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/97b9dce4-e1bd-400e-a4c2-848e9703db45-installation-pull-secrets\") on node \"crc\" DevicePath \"\""
Mar 11 01:02:24 crc kubenswrapper[4744]: I0311 01:02:24.769288 4744 reconciler_common.go:293] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/97b9dce4-e1bd-400e-a4c2-848e9703db45-registry-certificates\") on node \"crc\" DevicePath \"\""
Mar 11 01:02:24 crc kubenswrapper[4744]: I0311 01:02:24.769300 4744 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lh2ck\" (UniqueName: \"kubernetes.io/projected/97b9dce4-e1bd-400e-a4c2-848e9703db45-kube-api-access-lh2ck\") on node \"crc\" DevicePath \"\""
Mar 11 01:02:24 crc kubenswrapper[4744]: I0311 01:02:24.769311 4744 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/97b9dce4-e1bd-400e-a4c2-848e9703db45-trusted-ca\") on node \"crc\" DevicePath \"\""
Mar 11 01:02:24 crc kubenswrapper[4744]: I0311 01:02:24.769719 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/97b9dce4-e1bd-400e-a4c2-848e9703db45-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "97b9dce4-e1bd-400e-a4c2-848e9703db45" (UID: "97b9dce4-e1bd-400e-a4c2-848e9703db45"). InnerVolumeSpecName "ca-trust-extracted". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 11 01:02:24 crc kubenswrapper[4744]: I0311 01:02:24.770116 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (OuterVolumeSpecName: "registry-storage") pod "97b9dce4-e1bd-400e-a4c2-848e9703db45" (UID: "97b9dce4-e1bd-400e-a4c2-848e9703db45"). InnerVolumeSpecName "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8". PluginName "kubernetes.io/csi", VolumeGidValue ""
Mar 11 01:02:24 crc kubenswrapper[4744]: I0311 01:02:24.870174 4744 reconciler_common.go:293] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/97b9dce4-e1bd-400e-a4c2-848e9703db45-ca-trust-extracted\") on node \"crc\" DevicePath \"\""
Mar 11 01:02:25 crc kubenswrapper[4744]: I0311 01:02:25.172012 4744 generic.go:334] "Generic (PLEG): container finished" podID="97b9dce4-e1bd-400e-a4c2-848e9703db45" containerID="d8dea71970254c298c0b408453f7d301b1cf2f85bdfa1afdce6f1492e5ba261d" exitCode=0
Mar 11 01:02:25 crc kubenswrapper[4744]: I0311 01:02:25.172062 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-cg2gg" event={"ID":"97b9dce4-e1bd-400e-a4c2-848e9703db45","Type":"ContainerDied","Data":"d8dea71970254c298c0b408453f7d301b1cf2f85bdfa1afdce6f1492e5ba261d"}
Mar 11 01:02:25 crc kubenswrapper[4744]: I0311 01:02:25.172087 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-cg2gg" event={"ID":"97b9dce4-e1bd-400e-a4c2-848e9703db45","Type":"ContainerDied","Data":"1e33b8f987bb93b408f4eb2457682589f6a79902d41a193b38f0639969f25e4f"}
Mar 11 01:02:25 crc kubenswrapper[4744]: I0311 01:02:25.172104 4744 scope.go:117] "RemoveContainer" containerID="d8dea71970254c298c0b408453f7d301b1cf2f85bdfa1afdce6f1492e5ba261d"
Mar 11 01:02:25 crc kubenswrapper[4744]: I0311 01:02:25.172195 4744 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-cg2gg"
Mar 11 01:02:25 crc kubenswrapper[4744]: I0311 01:02:25.194368 4744 scope.go:117] "RemoveContainer" containerID="d8dea71970254c298c0b408453f7d301b1cf2f85bdfa1afdce6f1492e5ba261d"
Mar 11 01:02:25 crc kubenswrapper[4744]: E0311 01:02:25.194993 4744 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d8dea71970254c298c0b408453f7d301b1cf2f85bdfa1afdce6f1492e5ba261d\": container with ID starting with d8dea71970254c298c0b408453f7d301b1cf2f85bdfa1afdce6f1492e5ba261d not found: ID does not exist" containerID="d8dea71970254c298c0b408453f7d301b1cf2f85bdfa1afdce6f1492e5ba261d"
Mar 11 01:02:25 crc kubenswrapper[4744]: I0311 01:02:25.195036 4744 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d8dea71970254c298c0b408453f7d301b1cf2f85bdfa1afdce6f1492e5ba261d"} err="failed to get container status \"d8dea71970254c298c0b408453f7d301b1cf2f85bdfa1afdce6f1492e5ba261d\": rpc error: code = NotFound desc = could not find container \"d8dea71970254c298c0b408453f7d301b1cf2f85bdfa1afdce6f1492e5ba261d\": container with ID starting with d8dea71970254c298c0b408453f7d301b1cf2f85bdfa1afdce6f1492e5ba261d not found: ID does not exist"
Mar 11 01:02:25 crc kubenswrapper[4744]: I0311 01:02:25.199001 4744 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-cg2gg"]
Mar 11 01:02:25 crc kubenswrapper[4744]: I0311 01:02:25.201734 4744 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-cg2gg"]
Mar 11 01:02:25 crc kubenswrapper[4744]: I0311 01:02:25.980696 4744 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="97b9dce4-e1bd-400e-a4c2-848e9703db45" path="/var/lib/kubelet/pods/97b9dce4-e1bd-400e-a4c2-848e9703db45/volumes"
Mar 11 01:04:00 crc kubenswrapper[4744]: I0311 01:04:00.147185 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29553184-gvb9b"]
Mar 11 01:04:00 crc kubenswrapper[4744]: E0311 01:04:00.148100 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="97b9dce4-e1bd-400e-a4c2-848e9703db45" containerName="registry"
Mar 11 01:04:00 crc kubenswrapper[4744]: I0311 01:04:00.148123 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="97b9dce4-e1bd-400e-a4c2-848e9703db45" containerName="registry"
Mar 11 01:04:00 crc kubenswrapper[4744]: E0311 01:04:00.148152 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ca8c60ea-f39f-4095-b939-f2723480055f" containerName="oc"
Mar 11 01:04:00 crc kubenswrapper[4744]: I0311 01:04:00.148166 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="ca8c60ea-f39f-4095-b939-f2723480055f" containerName="oc"
Mar 11 01:04:00 crc kubenswrapper[4744]: I0311 01:04:00.148359 4744 memory_manager.go:354] "RemoveStaleState removing state" podUID="ca8c60ea-f39f-4095-b939-f2723480055f" containerName="oc"
Mar 11 01:04:00 crc kubenswrapper[4744]: I0311 01:04:00.148383 4744 memory_manager.go:354] "RemoveStaleState removing state" podUID="97b9dce4-e1bd-400e-a4c2-848e9703db45" containerName="registry"
Mar 11 01:04:00 crc kubenswrapper[4744]: I0311 01:04:00.149083 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29553184-gvb9b"
Mar 11 01:04:00 crc kubenswrapper[4744]: I0311 01:04:00.151706 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-rs77s"
Mar 11 01:04:00 crc kubenswrapper[4744]: I0311 01:04:00.153015 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Mar 11 01:04:00 crc kubenswrapper[4744]: I0311 01:04:00.153980 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Mar 11 01:04:00 crc kubenswrapper[4744]: I0311 01:04:00.155546 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29553184-gvb9b"]
Mar 11 01:04:00 crc kubenswrapper[4744]: I0311 01:04:00.251128 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rkgjm\" (UniqueName: \"kubernetes.io/projected/a9acbc58-725d-49da-8110-54a476725dbc-kube-api-access-rkgjm\") pod \"auto-csr-approver-29553184-gvb9b\" (UID: \"a9acbc58-725d-49da-8110-54a476725dbc\") " pod="openshift-infra/auto-csr-approver-29553184-gvb9b"
Mar 11 01:04:00 crc kubenswrapper[4744]: I0311 01:04:00.351983 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rkgjm\" (UniqueName: \"kubernetes.io/projected/a9acbc58-725d-49da-8110-54a476725dbc-kube-api-access-rkgjm\") pod \"auto-csr-approver-29553184-gvb9b\" (UID: \"a9acbc58-725d-49da-8110-54a476725dbc\") " pod="openshift-infra/auto-csr-approver-29553184-gvb9b"
Mar 11 01:04:00 crc kubenswrapper[4744]: I0311 01:04:00.385055 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rkgjm\" (UniqueName: \"kubernetes.io/projected/a9acbc58-725d-49da-8110-54a476725dbc-kube-api-access-rkgjm\") pod \"auto-csr-approver-29553184-gvb9b\" (UID: \"a9acbc58-725d-49da-8110-54a476725dbc\") " pod="openshift-infra/auto-csr-approver-29553184-gvb9b"
Mar 11 01:04:00 crc kubenswrapper[4744]: I0311 01:04:00.501408 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29553184-gvb9b"
Mar 11 01:04:01 crc kubenswrapper[4744]: I0311 01:04:01.353090 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29553184-gvb9b"]
Mar 11 01:04:01 crc kubenswrapper[4744]: W0311 01:04:01.359776 4744 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda9acbc58_725d_49da_8110_54a476725dbc.slice/crio-eef47a2674498a6afdf728aee52c6b47d24e3da26bbc035c22b09f48b9dd55f5 WatchSource:0}: Error finding container eef47a2674498a6afdf728aee52c6b47d24e3da26bbc035c22b09f48b9dd55f5: Status 404 returned error can't find the container with id eef47a2674498a6afdf728aee52c6b47d24e3da26bbc035c22b09f48b9dd55f5
Mar 11 01:04:01 crc kubenswrapper[4744]: I0311 01:04:01.364639 4744 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Mar 11 01:04:01 crc kubenswrapper[4744]: I0311 01:04:01.852129 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29553184-gvb9b" event={"ID":"a9acbc58-725d-49da-8110-54a476725dbc","Type":"ContainerStarted","Data":"eef47a2674498a6afdf728aee52c6b47d24e3da26bbc035c22b09f48b9dd55f5"}
Mar 11 01:04:03 crc kubenswrapper[4744]: I0311 01:04:03.870925 4744 generic.go:334] "Generic (PLEG): container finished" podID="a9acbc58-725d-49da-8110-54a476725dbc" containerID="679ec66311498cc17d79de9a1a18a3ff6aefef33eb1f688889b4239fd05ebe7f" exitCode=0
Mar 11 01:04:03 crc kubenswrapper[4744]: I0311 01:04:03.871012 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29553184-gvb9b" event={"ID":"a9acbc58-725d-49da-8110-54a476725dbc","Type":"ContainerDied","Data":"679ec66311498cc17d79de9a1a18a3ff6aefef33eb1f688889b4239fd05ebe7f"}
Mar 11 01:04:05 crc kubenswrapper[4744]: I0311 01:04:05.103143 4744 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29553184-gvb9b"
Mar 11 01:04:05 crc kubenswrapper[4744]: I0311 01:04:05.223572 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rkgjm\" (UniqueName: \"kubernetes.io/projected/a9acbc58-725d-49da-8110-54a476725dbc-kube-api-access-rkgjm\") pod \"a9acbc58-725d-49da-8110-54a476725dbc\" (UID: \"a9acbc58-725d-49da-8110-54a476725dbc\") "
Mar 11 01:04:05 crc kubenswrapper[4744]: I0311 01:04:05.230912 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a9acbc58-725d-49da-8110-54a476725dbc-kube-api-access-rkgjm" (OuterVolumeSpecName: "kube-api-access-rkgjm") pod "a9acbc58-725d-49da-8110-54a476725dbc" (UID: "a9acbc58-725d-49da-8110-54a476725dbc"). InnerVolumeSpecName "kube-api-access-rkgjm". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 11 01:04:05 crc kubenswrapper[4744]: I0311 01:04:05.325767 4744 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rkgjm\" (UniqueName: \"kubernetes.io/projected/a9acbc58-725d-49da-8110-54a476725dbc-kube-api-access-rkgjm\") on node \"crc\" DevicePath \"\""
Mar 11 01:04:05 crc kubenswrapper[4744]: I0311 01:04:05.885097 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29553184-gvb9b" event={"ID":"a9acbc58-725d-49da-8110-54a476725dbc","Type":"ContainerDied","Data":"eef47a2674498a6afdf728aee52c6b47d24e3da26bbc035c22b09f48b9dd55f5"}
Mar 11 01:04:05 crc kubenswrapper[4744]: I0311 01:04:05.885142 4744 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="eef47a2674498a6afdf728aee52c6b47d24e3da26bbc035c22b09f48b9dd55f5"
Mar 11 01:04:05 crc kubenswrapper[4744]: I0311 01:04:05.885170 4744 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29553184-gvb9b"
Mar 11 01:04:06 crc kubenswrapper[4744]: I0311 01:04:06.174387 4744 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29553178-56m2v"]
Mar 11 01:04:06 crc kubenswrapper[4744]: I0311 01:04:06.180876 4744 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29553178-56m2v"]
Mar 11 01:04:07 crc kubenswrapper[4744]: I0311 01:04:07.986747 4744 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="95897da0-81a7-4656-9787-808f64d7aa9d" path="/var/lib/kubelet/pods/95897da0-81a7-4656-9787-808f64d7aa9d/volumes"
Mar 11 01:04:12 crc kubenswrapper[4744]: I0311 01:04:12.409899 4744 patch_prober.go:28] interesting pod/machine-config-daemon-678nx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 11 01:04:12 crc kubenswrapper[4744]: I0311 01:04:12.410677 4744 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-678nx" podUID="a15dc7ac-7c34-4135-b6eb-a85122800ce9" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 11 01:04:42 crc kubenswrapper[4744]: I0311 01:04:42.409138 4744 patch_prober.go:28] interesting pod/machine-config-daemon-678nx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 11 01:04:42 crc kubenswrapper[4744]: I0311 01:04:42.409914 4744 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-678nx" podUID="a15dc7ac-7c34-4135-b6eb-a85122800ce9" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 11 01:05:04 crc kubenswrapper[4744]: I0311 01:05:04.295082 4744 scope.go:117] "RemoveContainer" containerID="f4500f053249a9107662e0962cc87e100738c7ccadb7e113b4acb2d44153988f"
Mar 11 01:05:04 crc kubenswrapper[4744]: I0311 01:05:04.346292 4744 scope.go:117] "RemoveContainer" containerID="f345e8a8f84ce6cefafb28ca43e1fdf81eff4b948e882887f03dc2198bb68c8d"
Mar 11 01:05:12 crc kubenswrapper[4744]: I0311 01:05:12.409477 4744 patch_prober.go:28] interesting pod/machine-config-daemon-678nx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 11 01:05:12 crc kubenswrapper[4744]: I0311 01:05:12.409924 4744 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-678nx" podUID="a15dc7ac-7c34-4135-b6eb-a85122800ce9" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 11 01:05:12 crc kubenswrapper[4744]: I0311 01:05:12.410019 4744 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-678nx"
Mar 11 01:05:12 crc kubenswrapper[4744]: I0311 01:05:12.411441 4744 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"691f1b56c7a52ada877a74c90e388d4d56b2f8e2efd59d3ebdd8fb4d55d33c80"} pod="openshift-machine-config-operator/machine-config-daemon-678nx" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Mar 11 01:05:12 crc kubenswrapper[4744]: I0311 01:05:12.411636 4744 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-678nx" podUID="a15dc7ac-7c34-4135-b6eb-a85122800ce9" containerName="machine-config-daemon" containerID="cri-o://691f1b56c7a52ada877a74c90e388d4d56b2f8e2efd59d3ebdd8fb4d55d33c80" gracePeriod=600
Mar 11 01:05:13 crc kubenswrapper[4744]: I0311 01:05:13.380857 4744 generic.go:334] "Generic (PLEG): container finished" podID="a15dc7ac-7c34-4135-b6eb-a85122800ce9" containerID="691f1b56c7a52ada877a74c90e388d4d56b2f8e2efd59d3ebdd8fb4d55d33c80" exitCode=0
Mar 11 01:05:13 crc kubenswrapper[4744]: I0311 01:05:13.380949 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-678nx" event={"ID":"a15dc7ac-7c34-4135-b6eb-a85122800ce9","Type":"ContainerDied","Data":"691f1b56c7a52ada877a74c90e388d4d56b2f8e2efd59d3ebdd8fb4d55d33c80"}
Mar 11 01:05:13 crc kubenswrapper[4744]: I0311 01:05:13.381487 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-678nx" event={"ID":"a15dc7ac-7c34-4135-b6eb-a85122800ce9","Type":"ContainerStarted","Data":"fe891bfaabad039e5e00538c290ba658f2b03ec87ceb617b0877366c7d611971"}
Mar 11 01:05:13 crc kubenswrapper[4744]: I0311 01:05:13.381549 4744 scope.go:117] "RemoveContainer" containerID="5ee5d96431a32414b69d9b9b50318c6a1ccb5a06fc8087509e71724d04b86732"
Mar 11 01:06:00 crc kubenswrapper[4744]: I0311 01:06:00.152049 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29553186-q5zpz"]
Mar 11 01:06:00 crc kubenswrapper[4744]: E0311 01:06:00.154064 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a9acbc58-725d-49da-8110-54a476725dbc" containerName="oc"
Mar 11 01:06:00 crc kubenswrapper[4744]: I0311 01:06:00.154106 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="a9acbc58-725d-49da-8110-54a476725dbc" containerName="oc"
Mar 11 01:06:00 crc kubenswrapper[4744]: I0311 01:06:00.154294 4744 memory_manager.go:354] "RemoveStaleState removing state" podUID="a9acbc58-725d-49da-8110-54a476725dbc" containerName="oc"
Mar 11 01:06:00 crc kubenswrapper[4744]: I0311 01:06:00.154950 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29553186-q5zpz"
Mar 11 01:06:00 crc kubenswrapper[4744]: I0311 01:06:00.158478 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Mar 11 01:06:00 crc kubenswrapper[4744]: I0311 01:06:00.158564 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Mar 11 01:06:00 crc kubenswrapper[4744]: I0311 01:06:00.162235 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-rs77s"
Mar 11 01:06:00 crc kubenswrapper[4744]: I0311 01:06:00.164583 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29553186-q5zpz"]
Mar 11 01:06:00 crc kubenswrapper[4744]: I0311 01:06:00.177592 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zsq9m\" (UniqueName: \"kubernetes.io/projected/f0d8650d-fb33-4ea1-8433-cd108c110664-kube-api-access-zsq9m\") pod \"auto-csr-approver-29553186-q5zpz\" (UID: \"f0d8650d-fb33-4ea1-8433-cd108c110664\") " pod="openshift-infra/auto-csr-approver-29553186-q5zpz"
Mar 11 01:06:00 crc kubenswrapper[4744]: I0311 01:06:00.279060 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zsq9m\" (UniqueName: \"kubernetes.io/projected/f0d8650d-fb33-4ea1-8433-cd108c110664-kube-api-access-zsq9m\") pod \"auto-csr-approver-29553186-q5zpz\" (UID: \"f0d8650d-fb33-4ea1-8433-cd108c110664\") " pod="openshift-infra/auto-csr-approver-29553186-q5zpz"
Mar 11 01:06:00 crc kubenswrapper[4744]: I0311 01:06:00.321896 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zsq9m\" (UniqueName: \"kubernetes.io/projected/f0d8650d-fb33-4ea1-8433-cd108c110664-kube-api-access-zsq9m\") pod \"auto-csr-approver-29553186-q5zpz\" (UID: \"f0d8650d-fb33-4ea1-8433-cd108c110664\") " pod="openshift-infra/auto-csr-approver-29553186-q5zpz"
Mar 11 01:06:00 crc kubenswrapper[4744]: I0311 01:06:00.501717 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29553186-q5zpz"
Mar 11 01:06:00 crc kubenswrapper[4744]: I0311 01:06:00.752364 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29553186-q5zpz"]
Mar 11 01:06:00 crc kubenswrapper[4744]: W0311 01:06:00.760063 4744 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf0d8650d_fb33_4ea1_8433_cd108c110664.slice/crio-1d2bb47e49b11079606ed936b88d83caa841eca0e338c62262603b73b3c797d2 WatchSource:0}: Error finding container 1d2bb47e49b11079606ed936b88d83caa841eca0e338c62262603b73b3c797d2: Status 404 returned error can't find the container with id 1d2bb47e49b11079606ed936b88d83caa841eca0e338c62262603b73b3c797d2
Mar 11 01:06:01 crc kubenswrapper[4744]: I0311 01:06:01.093617 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29553186-q5zpz" event={"ID":"f0d8650d-fb33-4ea1-8433-cd108c110664","Type":"ContainerStarted","Data":"1d2bb47e49b11079606ed936b88d83caa841eca0e338c62262603b73b3c797d2"}
Mar 11 01:06:02 crc kubenswrapper[4744]: I0311 01:06:02.106118 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29553186-q5zpz" event={"ID":"f0d8650d-fb33-4ea1-8433-cd108c110664","Type":"ContainerStarted","Data":"b1d97507e646888ad42be397ac17323b3164949551d9beb6c43aab54c70713f0"}
Mar 11 01:06:02 crc kubenswrapper[4744]: I0311 01:06:02.132994 4744 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29553186-q5zpz" podStartSLOduration=1.199052405 podStartE2EDuration="2.132970122s" podCreationTimestamp="2026-03-11 01:06:00 +0000 UTC" firstStartedPulling="2026-03-11 01:06:00.763503141 +0000 UTC m=+717.567720756" lastFinishedPulling="2026-03-11 01:06:01.697420838 +0000 UTC m=+718.501638473" observedRunningTime="2026-03-11 01:06:02.12725036 +0000 UTC m=+718.931468045" watchObservedRunningTime="2026-03-11 01:06:02.132970122 +0000 UTC m=+718.937187757"
Mar 11 01:06:03 crc kubenswrapper[4744]: I0311 01:06:03.116957 4744 generic.go:334] "Generic (PLEG): container finished" podID="f0d8650d-fb33-4ea1-8433-cd108c110664" containerID="b1d97507e646888ad42be397ac17323b3164949551d9beb6c43aab54c70713f0" exitCode=0
Mar 11 01:06:03 crc kubenswrapper[4744]: I0311 01:06:03.117028 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29553186-q5zpz" event={"ID":"f0d8650d-fb33-4ea1-8433-cd108c110664","Type":"ContainerDied","Data":"b1d97507e646888ad42be397ac17323b3164949551d9beb6c43aab54c70713f0"}
Mar 11 01:06:04 crc kubenswrapper[4744]: I0311 01:06:04.426746 4744 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29553186-q5zpz"
Mar 11 01:06:04 crc kubenswrapper[4744]: I0311 01:06:04.531479 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zsq9m\" (UniqueName: \"kubernetes.io/projected/f0d8650d-fb33-4ea1-8433-cd108c110664-kube-api-access-zsq9m\") pod \"f0d8650d-fb33-4ea1-8433-cd108c110664\" (UID: \"f0d8650d-fb33-4ea1-8433-cd108c110664\") "
Mar 11 01:06:04 crc kubenswrapper[4744]: I0311 01:06:04.539368 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f0d8650d-fb33-4ea1-8433-cd108c110664-kube-api-access-zsq9m" (OuterVolumeSpecName: "kube-api-access-zsq9m") pod "f0d8650d-fb33-4ea1-8433-cd108c110664" (UID: "f0d8650d-fb33-4ea1-8433-cd108c110664"). InnerVolumeSpecName "kube-api-access-zsq9m".
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 01:06:04 crc kubenswrapper[4744]: I0311 01:06:04.632618 4744 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zsq9m\" (UniqueName: \"kubernetes.io/projected/f0d8650d-fb33-4ea1-8433-cd108c110664-kube-api-access-zsq9m\") on node \"crc\" DevicePath \"\"" Mar 11 01:06:05 crc kubenswrapper[4744]: I0311 01:06:05.134867 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29553186-q5zpz" event={"ID":"f0d8650d-fb33-4ea1-8433-cd108c110664","Type":"ContainerDied","Data":"1d2bb47e49b11079606ed936b88d83caa841eca0e338c62262603b73b3c797d2"} Mar 11 01:06:05 crc kubenswrapper[4744]: I0311 01:06:05.135229 4744 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1d2bb47e49b11079606ed936b88d83caa841eca0e338c62262603b73b3c797d2" Mar 11 01:06:05 crc kubenswrapper[4744]: I0311 01:06:05.134936 4744 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29553186-q5zpz" Mar 11 01:06:05 crc kubenswrapper[4744]: I0311 01:06:05.193271 4744 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29553180-89p78"] Mar 11 01:06:05 crc kubenswrapper[4744]: I0311 01:06:05.199597 4744 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29553180-89p78"] Mar 11 01:06:05 crc kubenswrapper[4744]: I0311 01:06:05.986570 4744 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d246e126-a4f4-41d4-83b1-115cb6f674ec" path="/var/lib/kubelet/pods/d246e126-a4f4-41d4-83b1-115cb6f674ec/volumes" Mar 11 01:07:04 crc kubenswrapper[4744]: I0311 01:07:04.451107 4744 scope.go:117] "RemoveContainer" containerID="3741a4b098ab9f06f8a342cf1d6e93b93d68d1bc46c5ebd60097f1532fc283bd" Mar 11 01:07:12 crc kubenswrapper[4744]: I0311 01:07:12.409376 4744 patch_prober.go:28] interesting pod/machine-config-daemon-678nx 
container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 11 01:07:12 crc kubenswrapper[4744]: I0311 01:07:12.410018 4744 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-678nx" podUID="a15dc7ac-7c34-4135-b6eb-a85122800ce9" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 11 01:07:42 crc kubenswrapper[4744]: I0311 01:07:42.409458 4744 patch_prober.go:28] interesting pod/machine-config-daemon-678nx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 11 01:07:42 crc kubenswrapper[4744]: I0311 01:07:42.411550 4744 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-678nx" podUID="a15dc7ac-7c34-4135-b6eb-a85122800ce9" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 11 01:08:00 crc kubenswrapper[4744]: I0311 01:08:00.149216 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29553188-qb5tx"] Mar 11 01:08:00 crc kubenswrapper[4744]: E0311 01:08:00.150160 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f0d8650d-fb33-4ea1-8433-cd108c110664" containerName="oc" Mar 11 01:08:00 crc kubenswrapper[4744]: I0311 01:08:00.150180 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="f0d8650d-fb33-4ea1-8433-cd108c110664" containerName="oc" Mar 11 01:08:00 crc kubenswrapper[4744]: I0311 01:08:00.150353 4744 memory_manager.go:354] 
"RemoveStaleState removing state" podUID="f0d8650d-fb33-4ea1-8433-cd108c110664" containerName="oc" Mar 11 01:08:00 crc kubenswrapper[4744]: I0311 01:08:00.150968 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29553188-qb5tx" Mar 11 01:08:00 crc kubenswrapper[4744]: I0311 01:08:00.155087 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 11 01:08:00 crc kubenswrapper[4744]: I0311 01:08:00.155724 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 11 01:08:00 crc kubenswrapper[4744]: I0311 01:08:00.156210 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-rs77s" Mar 11 01:08:00 crc kubenswrapper[4744]: I0311 01:08:00.166665 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29553188-qb5tx"] Mar 11 01:08:00 crc kubenswrapper[4744]: I0311 01:08:00.243172 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g97nd\" (UniqueName: \"kubernetes.io/projected/682147c3-ee15-4c94-801a-d40279ffbb5b-kube-api-access-g97nd\") pod \"auto-csr-approver-29553188-qb5tx\" (UID: \"682147c3-ee15-4c94-801a-d40279ffbb5b\") " pod="openshift-infra/auto-csr-approver-29553188-qb5tx" Mar 11 01:08:00 crc kubenswrapper[4744]: I0311 01:08:00.344682 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g97nd\" (UniqueName: \"kubernetes.io/projected/682147c3-ee15-4c94-801a-d40279ffbb5b-kube-api-access-g97nd\") pod \"auto-csr-approver-29553188-qb5tx\" (UID: \"682147c3-ee15-4c94-801a-d40279ffbb5b\") " pod="openshift-infra/auto-csr-approver-29553188-qb5tx" Mar 11 01:08:00 crc kubenswrapper[4744]: I0311 01:08:00.381881 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-g97nd\" (UniqueName: \"kubernetes.io/projected/682147c3-ee15-4c94-801a-d40279ffbb5b-kube-api-access-g97nd\") pod \"auto-csr-approver-29553188-qb5tx\" (UID: \"682147c3-ee15-4c94-801a-d40279ffbb5b\") " pod="openshift-infra/auto-csr-approver-29553188-qb5tx" Mar 11 01:08:00 crc kubenswrapper[4744]: I0311 01:08:00.479763 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29553188-qb5tx" Mar 11 01:08:00 crc kubenswrapper[4744]: I0311 01:08:00.905638 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29553188-qb5tx"] Mar 11 01:08:00 crc kubenswrapper[4744]: I0311 01:08:00.942192 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29553188-qb5tx" event={"ID":"682147c3-ee15-4c94-801a-d40279ffbb5b","Type":"ContainerStarted","Data":"09b3440bc406df61a2c0bd7f40108acb75b82e60c5b1c50a62201b4e758d0061"} Mar 11 01:08:02 crc kubenswrapper[4744]: I0311 01:08:02.956842 4744 generic.go:334] "Generic (PLEG): container finished" podID="682147c3-ee15-4c94-801a-d40279ffbb5b" containerID="70710e4f74cdd2d5659fed13d5c78e26a14bc6f4243e778718bbd292280e80a2" exitCode=0 Mar 11 01:08:02 crc kubenswrapper[4744]: I0311 01:08:02.956961 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29553188-qb5tx" event={"ID":"682147c3-ee15-4c94-801a-d40279ffbb5b","Type":"ContainerDied","Data":"70710e4f74cdd2d5659fed13d5c78e26a14bc6f4243e778718bbd292280e80a2"} Mar 11 01:08:04 crc kubenswrapper[4744]: I0311 01:08:04.277169 4744 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29553188-qb5tx" Mar 11 01:08:04 crc kubenswrapper[4744]: I0311 01:08:04.297735 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-g97nd\" (UniqueName: \"kubernetes.io/projected/682147c3-ee15-4c94-801a-d40279ffbb5b-kube-api-access-g97nd\") pod \"682147c3-ee15-4c94-801a-d40279ffbb5b\" (UID: \"682147c3-ee15-4c94-801a-d40279ffbb5b\") " Mar 11 01:08:04 crc kubenswrapper[4744]: I0311 01:08:04.303818 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/682147c3-ee15-4c94-801a-d40279ffbb5b-kube-api-access-g97nd" (OuterVolumeSpecName: "kube-api-access-g97nd") pod "682147c3-ee15-4c94-801a-d40279ffbb5b" (UID: "682147c3-ee15-4c94-801a-d40279ffbb5b"). InnerVolumeSpecName "kube-api-access-g97nd". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 01:08:04 crc kubenswrapper[4744]: I0311 01:08:04.398811 4744 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-g97nd\" (UniqueName: \"kubernetes.io/projected/682147c3-ee15-4c94-801a-d40279ffbb5b-kube-api-access-g97nd\") on node \"crc\" DevicePath \"\"" Mar 11 01:08:04 crc kubenswrapper[4744]: I0311 01:08:04.975762 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29553188-qb5tx" event={"ID":"682147c3-ee15-4c94-801a-d40279ffbb5b","Type":"ContainerDied","Data":"09b3440bc406df61a2c0bd7f40108acb75b82e60c5b1c50a62201b4e758d0061"} Mar 11 01:08:04 crc kubenswrapper[4744]: I0311 01:08:04.976165 4744 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="09b3440bc406df61a2c0bd7f40108acb75b82e60c5b1c50a62201b4e758d0061" Mar 11 01:08:04 crc kubenswrapper[4744]: I0311 01:08:04.975827 4744 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29553188-qb5tx" Mar 11 01:08:05 crc kubenswrapper[4744]: I0311 01:08:05.346422 4744 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29553182-pvwjp"] Mar 11 01:08:05 crc kubenswrapper[4744]: I0311 01:08:05.353602 4744 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29553182-pvwjp"] Mar 11 01:08:05 crc kubenswrapper[4744]: I0311 01:08:05.984077 4744 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ca8c60ea-f39f-4095-b939-f2723480055f" path="/var/lib/kubelet/pods/ca8c60ea-f39f-4095-b939-f2723480055f/volumes" Mar 11 01:08:12 crc kubenswrapper[4744]: I0311 01:08:12.409885 4744 patch_prober.go:28] interesting pod/machine-config-daemon-678nx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 11 01:08:12 crc kubenswrapper[4744]: I0311 01:08:12.410290 4744 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-678nx" podUID="a15dc7ac-7c34-4135-b6eb-a85122800ce9" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 11 01:08:12 crc kubenswrapper[4744]: I0311 01:08:12.410354 4744 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-678nx" Mar 11 01:08:12 crc kubenswrapper[4744]: I0311 01:08:12.411120 4744 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"fe891bfaabad039e5e00538c290ba658f2b03ec87ceb617b0877366c7d611971"} pod="openshift-machine-config-operator/machine-config-daemon-678nx" 
containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 11 01:08:12 crc kubenswrapper[4744]: I0311 01:08:12.411210 4744 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-678nx" podUID="a15dc7ac-7c34-4135-b6eb-a85122800ce9" containerName="machine-config-daemon" containerID="cri-o://fe891bfaabad039e5e00538c290ba658f2b03ec87ceb617b0877366c7d611971" gracePeriod=600 Mar 11 01:08:13 crc kubenswrapper[4744]: I0311 01:08:13.030691 4744 generic.go:334] "Generic (PLEG): container finished" podID="a15dc7ac-7c34-4135-b6eb-a85122800ce9" containerID="fe891bfaabad039e5e00538c290ba658f2b03ec87ceb617b0877366c7d611971" exitCode=0 Mar 11 01:08:13 crc kubenswrapper[4744]: I0311 01:08:13.030777 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-678nx" event={"ID":"a15dc7ac-7c34-4135-b6eb-a85122800ce9","Type":"ContainerDied","Data":"fe891bfaabad039e5e00538c290ba658f2b03ec87ceb617b0877366c7d611971"} Mar 11 01:08:13 crc kubenswrapper[4744]: I0311 01:08:13.031886 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-678nx" event={"ID":"a15dc7ac-7c34-4135-b6eb-a85122800ce9","Type":"ContainerStarted","Data":"88beedd13bd5f368264b1a447a212f87b19111c8ac2dcc24499088c4608c67da"} Mar 11 01:08:13 crc kubenswrapper[4744]: I0311 01:08:13.032002 4744 scope.go:117] "RemoveContainer" containerID="691f1b56c7a52ada877a74c90e388d4d56b2f8e2efd59d3ebdd8fb4d55d33c80" Mar 11 01:08:20 crc kubenswrapper[4744]: I0311 01:08:20.238586 4744 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Mar 11 01:08:28 crc kubenswrapper[4744]: I0311 01:08:28.881866 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-qg8qn"] Mar 11 01:08:28 crc kubenswrapper[4744]: 
E0311 01:08:28.882672 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="682147c3-ee15-4c94-801a-d40279ffbb5b" containerName="oc" Mar 11 01:08:28 crc kubenswrapper[4744]: I0311 01:08:28.882692 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="682147c3-ee15-4c94-801a-d40279ffbb5b" containerName="oc" Mar 11 01:08:28 crc kubenswrapper[4744]: I0311 01:08:28.882865 4744 memory_manager.go:354] "RemoveStaleState removing state" podUID="682147c3-ee15-4c94-801a-d40279ffbb5b" containerName="oc" Mar 11 01:08:28 crc kubenswrapper[4744]: I0311 01:08:28.884275 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-qg8qn" Mar 11 01:08:28 crc kubenswrapper[4744]: I0311 01:08:28.908293 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-qg8qn"] Mar 11 01:08:29 crc kubenswrapper[4744]: I0311 01:08:29.040206 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1879ad6e-72b6-4c58-a771-264acbcdcb34-catalog-content\") pod \"certified-operators-qg8qn\" (UID: \"1879ad6e-72b6-4c58-a771-264acbcdcb34\") " pod="openshift-marketplace/certified-operators-qg8qn" Mar 11 01:08:29 crc kubenswrapper[4744]: I0311 01:08:29.040271 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zkbmp\" (UniqueName: \"kubernetes.io/projected/1879ad6e-72b6-4c58-a771-264acbcdcb34-kube-api-access-zkbmp\") pod \"certified-operators-qg8qn\" (UID: \"1879ad6e-72b6-4c58-a771-264acbcdcb34\") " pod="openshift-marketplace/certified-operators-qg8qn" Mar 11 01:08:29 crc kubenswrapper[4744]: I0311 01:08:29.040475 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1879ad6e-72b6-4c58-a771-264acbcdcb34-utilities\") pod 
\"certified-operators-qg8qn\" (UID: \"1879ad6e-72b6-4c58-a771-264acbcdcb34\") " pod="openshift-marketplace/certified-operators-qg8qn" Mar 11 01:08:29 crc kubenswrapper[4744]: I0311 01:08:29.141844 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1879ad6e-72b6-4c58-a771-264acbcdcb34-catalog-content\") pod \"certified-operators-qg8qn\" (UID: \"1879ad6e-72b6-4c58-a771-264acbcdcb34\") " pod="openshift-marketplace/certified-operators-qg8qn" Mar 11 01:08:29 crc kubenswrapper[4744]: I0311 01:08:29.141943 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zkbmp\" (UniqueName: \"kubernetes.io/projected/1879ad6e-72b6-4c58-a771-264acbcdcb34-kube-api-access-zkbmp\") pod \"certified-operators-qg8qn\" (UID: \"1879ad6e-72b6-4c58-a771-264acbcdcb34\") " pod="openshift-marketplace/certified-operators-qg8qn" Mar 11 01:08:29 crc kubenswrapper[4744]: I0311 01:08:29.142033 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1879ad6e-72b6-4c58-a771-264acbcdcb34-utilities\") pod \"certified-operators-qg8qn\" (UID: \"1879ad6e-72b6-4c58-a771-264acbcdcb34\") " pod="openshift-marketplace/certified-operators-qg8qn" Mar 11 01:08:29 crc kubenswrapper[4744]: I0311 01:08:29.143646 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1879ad6e-72b6-4c58-a771-264acbcdcb34-utilities\") pod \"certified-operators-qg8qn\" (UID: \"1879ad6e-72b6-4c58-a771-264acbcdcb34\") " pod="openshift-marketplace/certified-operators-qg8qn" Mar 11 01:08:29 crc kubenswrapper[4744]: I0311 01:08:29.143838 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1879ad6e-72b6-4c58-a771-264acbcdcb34-catalog-content\") pod \"certified-operators-qg8qn\" (UID: 
\"1879ad6e-72b6-4c58-a771-264acbcdcb34\") " pod="openshift-marketplace/certified-operators-qg8qn" Mar 11 01:08:29 crc kubenswrapper[4744]: I0311 01:08:29.179366 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zkbmp\" (UniqueName: \"kubernetes.io/projected/1879ad6e-72b6-4c58-a771-264acbcdcb34-kube-api-access-zkbmp\") pod \"certified-operators-qg8qn\" (UID: \"1879ad6e-72b6-4c58-a771-264acbcdcb34\") " pod="openshift-marketplace/certified-operators-qg8qn" Mar 11 01:08:29 crc kubenswrapper[4744]: I0311 01:08:29.209877 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-qg8qn" Mar 11 01:08:29 crc kubenswrapper[4744]: I0311 01:08:29.442518 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-qg8qn"] Mar 11 01:08:30 crc kubenswrapper[4744]: I0311 01:08:30.174722 4744 generic.go:334] "Generic (PLEG): container finished" podID="1879ad6e-72b6-4c58-a771-264acbcdcb34" containerID="6eae3b9bcc9ba2ca44f8da3ac194f62412ae1585d667791e5089db95896956cd" exitCode=0 Mar 11 01:08:30 crc kubenswrapper[4744]: I0311 01:08:30.174792 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-qg8qn" event={"ID":"1879ad6e-72b6-4c58-a771-264acbcdcb34","Type":"ContainerDied","Data":"6eae3b9bcc9ba2ca44f8da3ac194f62412ae1585d667791e5089db95896956cd"} Mar 11 01:08:30 crc kubenswrapper[4744]: I0311 01:08:30.175042 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-qg8qn" event={"ID":"1879ad6e-72b6-4c58-a771-264acbcdcb34","Type":"ContainerStarted","Data":"b79a5da1e46821fdcb78b60f85473ceb7ad483808c90964505ba3fe7a36b12f7"} Mar 11 01:08:31 crc kubenswrapper[4744]: I0311 01:08:31.186467 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-qg8qn" 
event={"ID":"1879ad6e-72b6-4c58-a771-264acbcdcb34","Type":"ContainerStarted","Data":"3937dd75a53c40c97263bb3c8c77aa21d73207359270c4b42291d80991844a9d"} Mar 11 01:08:32 crc kubenswrapper[4744]: I0311 01:08:32.196455 4744 generic.go:334] "Generic (PLEG): container finished" podID="1879ad6e-72b6-4c58-a771-264acbcdcb34" containerID="3937dd75a53c40c97263bb3c8c77aa21d73207359270c4b42291d80991844a9d" exitCode=0 Mar 11 01:08:32 crc kubenswrapper[4744]: I0311 01:08:32.196575 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-qg8qn" event={"ID":"1879ad6e-72b6-4c58-a771-264acbcdcb34","Type":"ContainerDied","Data":"3937dd75a53c40c97263bb3c8c77aa21d73207359270c4b42291d80991844a9d"} Mar 11 01:08:33 crc kubenswrapper[4744]: I0311 01:08:33.183139 4744 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-78fcc"] Mar 11 01:08:33 crc kubenswrapper[4744]: I0311 01:08:33.184300 4744 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-78fcc" podUID="6ff04e11-e747-44c5-b049-371a5d422157" containerName="ovn-controller" containerID="cri-o://27bc44383b34aea31ffcebae0f1f464c31918edc0c37f01d6bd66166e34fd4a8" gracePeriod=30 Mar 11 01:08:33 crc kubenswrapper[4744]: I0311 01:08:33.184413 4744 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-78fcc" podUID="6ff04e11-e747-44c5-b049-371a5d422157" containerName="sbdb" containerID="cri-o://742e3b6ac0accae72e9285636ab0920db1d3e0625d6093adf1606f57b484b139" gracePeriod=30 Mar 11 01:08:33 crc kubenswrapper[4744]: I0311 01:08:33.184569 4744 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-78fcc" podUID="6ff04e11-e747-44c5-b049-371a5d422157" containerName="kube-rbac-proxy-ovn-metrics" containerID="cri-o://e37e770ef43eabf7e0bb162d7e700eaa2d61c7ceca9737de5872c1caeec06774" 
gracePeriod=30 Mar 11 01:08:33 crc kubenswrapper[4744]: I0311 01:08:33.184657 4744 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-78fcc" podUID="6ff04e11-e747-44c5-b049-371a5d422157" containerName="nbdb" containerID="cri-o://98ba07651ae84a42f92411fee85a15562e325c61ff2d8fcba4962d21c51fce73" gracePeriod=30 Mar 11 01:08:33 crc kubenswrapper[4744]: I0311 01:08:33.184605 4744 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-78fcc" podUID="6ff04e11-e747-44c5-b049-371a5d422157" containerName="ovn-acl-logging" containerID="cri-o://84d4ada431e7a91cc86b5cb9bba4316c233c5f666f21bce1d8b6df0019b6ad76" gracePeriod=30 Mar 11 01:08:33 crc kubenswrapper[4744]: I0311 01:08:33.184569 4744 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-78fcc" podUID="6ff04e11-e747-44c5-b049-371a5d422157" containerName="kube-rbac-proxy-node" containerID="cri-o://8132bd89cd281c66b2735b3fa85c3fc773e7140d6aaeaefe0c0a2c3e3f743080" gracePeriod=30 Mar 11 01:08:33 crc kubenswrapper[4744]: I0311 01:08:33.184613 4744 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-78fcc" podUID="6ff04e11-e747-44c5-b049-371a5d422157" containerName="northd" containerID="cri-o://d7cd308cadb08ea56bbc709c6401ab1d3eec24f0414a7ee1c57188c5e60adffe" gracePeriod=30 Mar 11 01:08:33 crc kubenswrapper[4744]: I0311 01:08:33.218879 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-qg8qn" event={"ID":"1879ad6e-72b6-4c58-a771-264acbcdcb34","Type":"ContainerStarted","Data":"6c9de4789455f63b4c132f30da91a8d080a54af94f42d758b116c6cef7c892e5"} Mar 11 01:08:33 crc kubenswrapper[4744]: I0311 01:08:33.239305 4744 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-78fcc" 
podUID="6ff04e11-e747-44c5-b049-371a5d422157" containerName="ovnkube-controller" containerID="cri-o://b54b226b6e5d010f5666be32c5e89383a99ed40a1a63d82cfb4e37b9337437ce" gracePeriod=30 Mar 11 01:08:33 crc kubenswrapper[4744]: I0311 01:08:33.260594 4744 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-qg8qn" podStartSLOduration=2.7217426319999998 podStartE2EDuration="5.260560586s" podCreationTimestamp="2026-03-11 01:08:28 +0000 UTC" firstStartedPulling="2026-03-11 01:08:30.177032926 +0000 UTC m=+866.981250571" lastFinishedPulling="2026-03-11 01:08:32.71585088 +0000 UTC m=+869.520068525" observedRunningTime="2026-03-11 01:08:33.252706874 +0000 UTC m=+870.056924479" watchObservedRunningTime="2026-03-11 01:08:33.260560586 +0000 UTC m=+870.064778191" Mar 11 01:08:33 crc kubenswrapper[4744]: I0311 01:08:33.496617 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-78fcc_6ff04e11-e747-44c5-b049-371a5d422157/ovnkube-controller/3.log" Mar 11 01:08:33 crc kubenswrapper[4744]: I0311 01:08:33.498837 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-78fcc_6ff04e11-e747-44c5-b049-371a5d422157/ovn-acl-logging/0.log" Mar 11 01:08:33 crc kubenswrapper[4744]: I0311 01:08:33.499208 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-78fcc_6ff04e11-e747-44c5-b049-371a5d422157/ovn-controller/0.log" Mar 11 01:08:33 crc kubenswrapper[4744]: I0311 01:08:33.499747 4744 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-78fcc" Mar 11 01:08:33 crc kubenswrapper[4744]: I0311 01:08:33.557872 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-wk26z"] Mar 11 01:08:33 crc kubenswrapper[4744]: E0311 01:08:33.558177 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6ff04e11-e747-44c5-b049-371a5d422157" containerName="ovnkube-controller" Mar 11 01:08:33 crc kubenswrapper[4744]: I0311 01:08:33.558204 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="6ff04e11-e747-44c5-b049-371a5d422157" containerName="ovnkube-controller" Mar 11 01:08:33 crc kubenswrapper[4744]: E0311 01:08:33.558223 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6ff04e11-e747-44c5-b049-371a5d422157" containerName="ovn-acl-logging" Mar 11 01:08:33 crc kubenswrapper[4744]: I0311 01:08:33.558234 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="6ff04e11-e747-44c5-b049-371a5d422157" containerName="ovn-acl-logging" Mar 11 01:08:33 crc kubenswrapper[4744]: E0311 01:08:33.558257 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6ff04e11-e747-44c5-b049-371a5d422157" containerName="sbdb" Mar 11 01:08:33 crc kubenswrapper[4744]: I0311 01:08:33.558269 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="6ff04e11-e747-44c5-b049-371a5d422157" containerName="sbdb" Mar 11 01:08:33 crc kubenswrapper[4744]: E0311 01:08:33.558283 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6ff04e11-e747-44c5-b049-371a5d422157" containerName="ovn-controller" Mar 11 01:08:33 crc kubenswrapper[4744]: I0311 01:08:33.558295 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="6ff04e11-e747-44c5-b049-371a5d422157" containerName="ovn-controller" Mar 11 01:08:33 crc kubenswrapper[4744]: E0311 01:08:33.558312 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6ff04e11-e747-44c5-b049-371a5d422157" 
containerName="ovnkube-controller" Mar 11 01:08:33 crc kubenswrapper[4744]: I0311 01:08:33.558325 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="6ff04e11-e747-44c5-b049-371a5d422157" containerName="ovnkube-controller" Mar 11 01:08:33 crc kubenswrapper[4744]: E0311 01:08:33.558338 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6ff04e11-e747-44c5-b049-371a5d422157" containerName="ovnkube-controller" Mar 11 01:08:33 crc kubenswrapper[4744]: I0311 01:08:33.558349 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="6ff04e11-e747-44c5-b049-371a5d422157" containerName="ovnkube-controller" Mar 11 01:08:33 crc kubenswrapper[4744]: E0311 01:08:33.558368 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6ff04e11-e747-44c5-b049-371a5d422157" containerName="ovnkube-controller" Mar 11 01:08:33 crc kubenswrapper[4744]: I0311 01:08:33.558379 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="6ff04e11-e747-44c5-b049-371a5d422157" containerName="ovnkube-controller" Mar 11 01:08:33 crc kubenswrapper[4744]: E0311 01:08:33.558391 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6ff04e11-e747-44c5-b049-371a5d422157" containerName="northd" Mar 11 01:08:33 crc kubenswrapper[4744]: I0311 01:08:33.558402 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="6ff04e11-e747-44c5-b049-371a5d422157" containerName="northd" Mar 11 01:08:33 crc kubenswrapper[4744]: E0311 01:08:33.558427 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6ff04e11-e747-44c5-b049-371a5d422157" containerName="kubecfg-setup" Mar 11 01:08:33 crc kubenswrapper[4744]: I0311 01:08:33.558438 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="6ff04e11-e747-44c5-b049-371a5d422157" containerName="kubecfg-setup" Mar 11 01:08:33 crc kubenswrapper[4744]: E0311 01:08:33.558449 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6ff04e11-e747-44c5-b049-371a5d422157" 
containerName="kube-rbac-proxy-ovn-metrics" Mar 11 01:08:33 crc kubenswrapper[4744]: I0311 01:08:33.558460 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="6ff04e11-e747-44c5-b049-371a5d422157" containerName="kube-rbac-proxy-ovn-metrics" Mar 11 01:08:33 crc kubenswrapper[4744]: E0311 01:08:33.558478 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6ff04e11-e747-44c5-b049-371a5d422157" containerName="kube-rbac-proxy-node" Mar 11 01:08:33 crc kubenswrapper[4744]: I0311 01:08:33.558488 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="6ff04e11-e747-44c5-b049-371a5d422157" containerName="kube-rbac-proxy-node" Mar 11 01:08:33 crc kubenswrapper[4744]: E0311 01:08:33.558503 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6ff04e11-e747-44c5-b049-371a5d422157" containerName="nbdb" Mar 11 01:08:33 crc kubenswrapper[4744]: I0311 01:08:33.558539 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="6ff04e11-e747-44c5-b049-371a5d422157" containerName="nbdb" Mar 11 01:08:33 crc kubenswrapper[4744]: I0311 01:08:33.558712 4744 memory_manager.go:354] "RemoveStaleState removing state" podUID="6ff04e11-e747-44c5-b049-371a5d422157" containerName="ovnkube-controller" Mar 11 01:08:33 crc kubenswrapper[4744]: I0311 01:08:33.558733 4744 memory_manager.go:354] "RemoveStaleState removing state" podUID="6ff04e11-e747-44c5-b049-371a5d422157" containerName="ovnkube-controller" Mar 11 01:08:33 crc kubenswrapper[4744]: I0311 01:08:33.558748 4744 memory_manager.go:354] "RemoveStaleState removing state" podUID="6ff04e11-e747-44c5-b049-371a5d422157" containerName="ovnkube-controller" Mar 11 01:08:33 crc kubenswrapper[4744]: I0311 01:08:33.558765 4744 memory_manager.go:354] "RemoveStaleState removing state" podUID="6ff04e11-e747-44c5-b049-371a5d422157" containerName="ovnkube-controller" Mar 11 01:08:33 crc kubenswrapper[4744]: I0311 01:08:33.558776 4744 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="6ff04e11-e747-44c5-b049-371a5d422157" containerName="ovn-controller" Mar 11 01:08:33 crc kubenswrapper[4744]: I0311 01:08:33.558792 4744 memory_manager.go:354] "RemoveStaleState removing state" podUID="6ff04e11-e747-44c5-b049-371a5d422157" containerName="ovn-acl-logging" Mar 11 01:08:33 crc kubenswrapper[4744]: I0311 01:08:33.558802 4744 memory_manager.go:354] "RemoveStaleState removing state" podUID="6ff04e11-e747-44c5-b049-371a5d422157" containerName="kube-rbac-proxy-node" Mar 11 01:08:33 crc kubenswrapper[4744]: I0311 01:08:33.558815 4744 memory_manager.go:354] "RemoveStaleState removing state" podUID="6ff04e11-e747-44c5-b049-371a5d422157" containerName="sbdb" Mar 11 01:08:33 crc kubenswrapper[4744]: I0311 01:08:33.558825 4744 memory_manager.go:354] "RemoveStaleState removing state" podUID="6ff04e11-e747-44c5-b049-371a5d422157" containerName="nbdb" Mar 11 01:08:33 crc kubenswrapper[4744]: I0311 01:08:33.558845 4744 memory_manager.go:354] "RemoveStaleState removing state" podUID="6ff04e11-e747-44c5-b049-371a5d422157" containerName="kube-rbac-proxy-ovn-metrics" Mar 11 01:08:33 crc kubenswrapper[4744]: I0311 01:08:33.558857 4744 memory_manager.go:354] "RemoveStaleState removing state" podUID="6ff04e11-e747-44c5-b049-371a5d422157" containerName="northd" Mar 11 01:08:33 crc kubenswrapper[4744]: E0311 01:08:33.559035 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6ff04e11-e747-44c5-b049-371a5d422157" containerName="ovnkube-controller" Mar 11 01:08:33 crc kubenswrapper[4744]: I0311 01:08:33.559051 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="6ff04e11-e747-44c5-b049-371a5d422157" containerName="ovnkube-controller" Mar 11 01:08:33 crc kubenswrapper[4744]: I0311 01:08:33.559216 4744 memory_manager.go:354] "RemoveStaleState removing state" podUID="6ff04e11-e747-44c5-b049-371a5d422157" containerName="ovnkube-controller" Mar 11 01:08:33 crc kubenswrapper[4744]: I0311 01:08:33.561709 4744 util.go:30] "No sandbox for pod can be 
found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-wk26z" Mar 11 01:08:33 crc kubenswrapper[4744]: I0311 01:08:33.599207 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/6ff04e11-e747-44c5-b049-371a5d422157-host-run-ovn-kubernetes\") pod \"6ff04e11-e747-44c5-b049-371a5d422157\" (UID: \"6ff04e11-e747-44c5-b049-371a5d422157\") " Mar 11 01:08:33 crc kubenswrapper[4744]: I0311 01:08:33.599273 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/6ff04e11-e747-44c5-b049-371a5d422157-node-log\") pod \"6ff04e11-e747-44c5-b049-371a5d422157\" (UID: \"6ff04e11-e747-44c5-b049-371a5d422157\") " Mar 11 01:08:33 crc kubenswrapper[4744]: I0311 01:08:33.599313 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/6ff04e11-e747-44c5-b049-371a5d422157-host-slash\") pod \"6ff04e11-e747-44c5-b049-371a5d422157\" (UID: \"6ff04e11-e747-44c5-b049-371a5d422157\") " Mar 11 01:08:33 crc kubenswrapper[4744]: I0311 01:08:33.599329 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/6ff04e11-e747-44c5-b049-371a5d422157-host-run-ovn-kubernetes" (OuterVolumeSpecName: "host-run-ovn-kubernetes") pod "6ff04e11-e747-44c5-b049-371a5d422157" (UID: "6ff04e11-e747-44c5-b049-371a5d422157"). InnerVolumeSpecName "host-run-ovn-kubernetes". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 11 01:08:33 crc kubenswrapper[4744]: I0311 01:08:33.599356 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/6ff04e11-e747-44c5-b049-371a5d422157-host-kubelet\") pod \"6ff04e11-e747-44c5-b049-371a5d422157\" (UID: \"6ff04e11-e747-44c5-b049-371a5d422157\") " Mar 11 01:08:33 crc kubenswrapper[4744]: I0311 01:08:33.599412 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/6ff04e11-e747-44c5-b049-371a5d422157-host-slash" (OuterVolumeSpecName: "host-slash") pod "6ff04e11-e747-44c5-b049-371a5d422157" (UID: "6ff04e11-e747-44c5-b049-371a5d422157"). InnerVolumeSpecName "host-slash". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 11 01:08:33 crc kubenswrapper[4744]: I0311 01:08:33.599421 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/6ff04e11-e747-44c5-b049-371a5d422157-node-log" (OuterVolumeSpecName: "node-log") pod "6ff04e11-e747-44c5-b049-371a5d422157" (UID: "6ff04e11-e747-44c5-b049-371a5d422157"). InnerVolumeSpecName "node-log". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 11 01:08:33 crc kubenswrapper[4744]: I0311 01:08:33.599429 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6ff04e11-e747-44c5-b049-371a5d422157-ovnkube-config\") pod \"6ff04e11-e747-44c5-b049-371a5d422157\" (UID: \"6ff04e11-e747-44c5-b049-371a5d422157\") " Mar 11 01:08:33 crc kubenswrapper[4744]: I0311 01:08:33.599444 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/6ff04e11-e747-44c5-b049-371a5d422157-host-kubelet" (OuterVolumeSpecName: "host-kubelet") pod "6ff04e11-e747-44c5-b049-371a5d422157" (UID: "6ff04e11-e747-44c5-b049-371a5d422157"). InnerVolumeSpecName "host-kubelet". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 11 01:08:33 crc kubenswrapper[4744]: I0311 01:08:33.599480 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6ff04e11-e747-44c5-b049-371a5d422157-env-overrides\") pod \"6ff04e11-e747-44c5-b049-371a5d422157\" (UID: \"6ff04e11-e747-44c5-b049-371a5d422157\") " Mar 11 01:08:33 crc kubenswrapper[4744]: I0311 01:08:33.599531 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/6ff04e11-e747-44c5-b049-371a5d422157-host-cni-bin\") pod \"6ff04e11-e747-44c5-b049-371a5d422157\" (UID: \"6ff04e11-e747-44c5-b049-371a5d422157\") " Mar 11 01:08:33 crc kubenswrapper[4744]: I0311 01:08:33.599571 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/6ff04e11-e747-44c5-b049-371a5d422157-host-run-netns\") pod \"6ff04e11-e747-44c5-b049-371a5d422157\" (UID: \"6ff04e11-e747-44c5-b049-371a5d422157\") " Mar 11 01:08:33 crc kubenswrapper[4744]: I0311 01:08:33.599602 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/6ff04e11-e747-44c5-b049-371a5d422157-etc-openvswitch\") pod \"6ff04e11-e747-44c5-b049-371a5d422157\" (UID: \"6ff04e11-e747-44c5-b049-371a5d422157\") " Mar 11 01:08:33 crc kubenswrapper[4744]: I0311 01:08:33.599653 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/6ff04e11-e747-44c5-b049-371a5d422157-run-openvswitch\") pod \"6ff04e11-e747-44c5-b049-371a5d422157\" (UID: \"6ff04e11-e747-44c5-b049-371a5d422157\") " Mar 11 01:08:33 crc kubenswrapper[4744]: I0311 01:08:33.599680 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lib-openvswitch\" 
(UniqueName: \"kubernetes.io/host-path/6ff04e11-e747-44c5-b049-371a5d422157-var-lib-openvswitch\") pod \"6ff04e11-e747-44c5-b049-371a5d422157\" (UID: \"6ff04e11-e747-44c5-b049-371a5d422157\") " Mar 11 01:08:33 crc kubenswrapper[4744]: I0311 01:08:33.599705 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/6ff04e11-e747-44c5-b049-371a5d422157-host-cni-netd\") pod \"6ff04e11-e747-44c5-b049-371a5d422157\" (UID: \"6ff04e11-e747-44c5-b049-371a5d422157\") " Mar 11 01:08:33 crc kubenswrapper[4744]: I0311 01:08:33.599736 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fr9zj\" (UniqueName: \"kubernetes.io/projected/6ff04e11-e747-44c5-b049-371a5d422157-kube-api-access-fr9zj\") pod \"6ff04e11-e747-44c5-b049-371a5d422157\" (UID: \"6ff04e11-e747-44c5-b049-371a5d422157\") " Mar 11 01:08:33 crc kubenswrapper[4744]: I0311 01:08:33.599764 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/6ff04e11-e747-44c5-b049-371a5d422157-log-socket\") pod \"6ff04e11-e747-44c5-b049-371a5d422157\" (UID: \"6ff04e11-e747-44c5-b049-371a5d422157\") " Mar 11 01:08:33 crc kubenswrapper[4744]: I0311 01:08:33.599789 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/6ff04e11-e747-44c5-b049-371a5d422157-run-systemd\") pod \"6ff04e11-e747-44c5-b049-371a5d422157\" (UID: \"6ff04e11-e747-44c5-b049-371a5d422157\") " Mar 11 01:08:33 crc kubenswrapper[4744]: I0311 01:08:33.599821 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/6ff04e11-e747-44c5-b049-371a5d422157-run-ovn\") pod \"6ff04e11-e747-44c5-b049-371a5d422157\" (UID: \"6ff04e11-e747-44c5-b049-371a5d422157\") " Mar 11 01:08:33 crc kubenswrapper[4744]: I0311 
01:08:33.599825 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/6ff04e11-e747-44c5-b049-371a5d422157-run-openvswitch" (OuterVolumeSpecName: "run-openvswitch") pod "6ff04e11-e747-44c5-b049-371a5d422157" (UID: "6ff04e11-e747-44c5-b049-371a5d422157"). InnerVolumeSpecName "run-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 11 01:08:33 crc kubenswrapper[4744]: I0311 01:08:33.599888 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ff04e11-e747-44c5-b049-371a5d422157-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "6ff04e11-e747-44c5-b049-371a5d422157" (UID: "6ff04e11-e747-44c5-b049-371a5d422157"). InnerVolumeSpecName "ovnkube-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 11 01:08:33 crc kubenswrapper[4744]: I0311 01:08:33.599853 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/6ff04e11-e747-44c5-b049-371a5d422157-host-run-netns" (OuterVolumeSpecName: "host-run-netns") pod "6ff04e11-e747-44c5-b049-371a5d422157" (UID: "6ff04e11-e747-44c5-b049-371a5d422157"). InnerVolumeSpecName "host-run-netns". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 11 01:08:33 crc kubenswrapper[4744]: I0311 01:08:33.599848 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/6ff04e11-e747-44c5-b049-371a5d422157-etc-openvswitch" (OuterVolumeSpecName: "etc-openvswitch") pod "6ff04e11-e747-44c5-b049-371a5d422157" (UID: "6ff04e11-e747-44c5-b049-371a5d422157"). InnerVolumeSpecName "etc-openvswitch". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 11 01:08:33 crc kubenswrapper[4744]: I0311 01:08:33.599873 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/6ff04e11-e747-44c5-b049-371a5d422157-host-cni-bin" (OuterVolumeSpecName: "host-cni-bin") pod "6ff04e11-e747-44c5-b049-371a5d422157" (UID: "6ff04e11-e747-44c5-b049-371a5d422157"). InnerVolumeSpecName "host-cni-bin". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 11 01:08:33 crc kubenswrapper[4744]: I0311 01:08:33.599919 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/6ff04e11-e747-44c5-b049-371a5d422157-var-lib-openvswitch" (OuterVolumeSpecName: "var-lib-openvswitch") pod "6ff04e11-e747-44c5-b049-371a5d422157" (UID: "6ff04e11-e747-44c5-b049-371a5d422157"). InnerVolumeSpecName "var-lib-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 11 01:08:33 crc kubenswrapper[4744]: I0311 01:08:33.599917 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6ff04e11-e747-44c5-b049-371a5d422157-ovn-node-metrics-cert\") pod \"6ff04e11-e747-44c5-b049-371a5d422157\" (UID: \"6ff04e11-e747-44c5-b049-371a5d422157\") " Mar 11 01:08:33 crc kubenswrapper[4744]: I0311 01:08:33.599959 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/6ff04e11-e747-44c5-b049-371a5d422157-ovnkube-script-lib\") pod \"6ff04e11-e747-44c5-b049-371a5d422157\" (UID: \"6ff04e11-e747-44c5-b049-371a5d422157\") " Mar 11 01:08:33 crc kubenswrapper[4744]: I0311 01:08:33.599958 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/6ff04e11-e747-44c5-b049-371a5d422157-log-socket" (OuterVolumeSpecName: "log-socket") pod "6ff04e11-e747-44c5-b049-371a5d422157" (UID: 
"6ff04e11-e747-44c5-b049-371a5d422157"). InnerVolumeSpecName "log-socket". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 11 01:08:33 crc kubenswrapper[4744]: I0311 01:08:33.599979 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/6ff04e11-e747-44c5-b049-371a5d422157-systemd-units\") pod \"6ff04e11-e747-44c5-b049-371a5d422157\" (UID: \"6ff04e11-e747-44c5-b049-371a5d422157\") " Mar 11 01:08:33 crc kubenswrapper[4744]: I0311 01:08:33.599998 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/6ff04e11-e747-44c5-b049-371a5d422157-host-var-lib-cni-networks-ovn-kubernetes\") pod \"6ff04e11-e747-44c5-b049-371a5d422157\" (UID: \"6ff04e11-e747-44c5-b049-371a5d422157\") " Mar 11 01:08:33 crc kubenswrapper[4744]: I0311 01:08:33.600105 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ff04e11-e747-44c5-b049-371a5d422157-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "6ff04e11-e747-44c5-b049-371a5d422157" (UID: "6ff04e11-e747-44c5-b049-371a5d422157"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 11 01:08:33 crc kubenswrapper[4744]: I0311 01:08:33.600134 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/6ff04e11-e747-44c5-b049-371a5d422157-host-var-lib-cni-networks-ovn-kubernetes" (OuterVolumeSpecName: "host-var-lib-cni-networks-ovn-kubernetes") pod "6ff04e11-e747-44c5-b049-371a5d422157" (UID: "6ff04e11-e747-44c5-b049-371a5d422157"). InnerVolumeSpecName "host-var-lib-cni-networks-ovn-kubernetes". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 11 01:08:33 crc kubenswrapper[4744]: I0311 01:08:33.600308 4744 reconciler_common.go:293] "Volume detached for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/6ff04e11-e747-44c5-b049-371a5d422157-host-kubelet\") on node \"crc\" DevicePath \"\"" Mar 11 01:08:33 crc kubenswrapper[4744]: I0311 01:08:33.600324 4744 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6ff04e11-e747-44c5-b049-371a5d422157-ovnkube-config\") on node \"crc\" DevicePath \"\"" Mar 11 01:08:33 crc kubenswrapper[4744]: I0311 01:08:33.600336 4744 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6ff04e11-e747-44c5-b049-371a5d422157-env-overrides\") on node \"crc\" DevicePath \"\"" Mar 11 01:08:33 crc kubenswrapper[4744]: I0311 01:08:33.600348 4744 reconciler_common.go:293] "Volume detached for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/6ff04e11-e747-44c5-b049-371a5d422157-host-cni-bin\") on node \"crc\" DevicePath \"\"" Mar 11 01:08:33 crc kubenswrapper[4744]: I0311 01:08:33.600359 4744 reconciler_common.go:293] "Volume detached for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/6ff04e11-e747-44c5-b049-371a5d422157-host-run-netns\") on node \"crc\" DevicePath \"\"" Mar 11 01:08:33 crc kubenswrapper[4744]: I0311 01:08:33.600374 4744 reconciler_common.go:293] "Volume detached for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/6ff04e11-e747-44c5-b049-371a5d422157-etc-openvswitch\") on node \"crc\" DevicePath \"\"" Mar 11 01:08:33 crc kubenswrapper[4744]: I0311 01:08:33.600384 4744 reconciler_common.go:293] "Volume detached for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/6ff04e11-e747-44c5-b049-371a5d422157-run-openvswitch\") on node \"crc\" DevicePath \"\"" Mar 11 01:08:33 crc kubenswrapper[4744]: I0311 01:08:33.600396 4744 
reconciler_common.go:293] "Volume detached for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/6ff04e11-e747-44c5-b049-371a5d422157-var-lib-openvswitch\") on node \"crc\" DevicePath \"\"" Mar 11 01:08:33 crc kubenswrapper[4744]: I0311 01:08:33.600407 4744 reconciler_common.go:293] "Volume detached for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/6ff04e11-e747-44c5-b049-371a5d422157-log-socket\") on node \"crc\" DevicePath \"\"" Mar 11 01:08:33 crc kubenswrapper[4744]: I0311 01:08:33.600421 4744 reconciler_common.go:293] "Volume detached for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/6ff04e11-e747-44c5-b049-371a5d422157-host-var-lib-cni-networks-ovn-kubernetes\") on node \"crc\" DevicePath \"\"" Mar 11 01:08:33 crc kubenswrapper[4744]: I0311 01:08:33.600434 4744 reconciler_common.go:293] "Volume detached for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/6ff04e11-e747-44c5-b049-371a5d422157-host-run-ovn-kubernetes\") on node \"crc\" DevicePath \"\"" Mar 11 01:08:33 crc kubenswrapper[4744]: I0311 01:08:33.600447 4744 reconciler_common.go:293] "Volume detached for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/6ff04e11-e747-44c5-b049-371a5d422157-node-log\") on node \"crc\" DevicePath \"\"" Mar 11 01:08:33 crc kubenswrapper[4744]: I0311 01:08:33.600458 4744 reconciler_common.go:293] "Volume detached for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/6ff04e11-e747-44c5-b049-371a5d422157-host-slash\") on node \"crc\" DevicePath \"\"" Mar 11 01:08:33 crc kubenswrapper[4744]: I0311 01:08:33.600489 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/6ff04e11-e747-44c5-b049-371a5d422157-systemd-units" (OuterVolumeSpecName: "systemd-units") pod "6ff04e11-e747-44c5-b049-371a5d422157" (UID: "6ff04e11-e747-44c5-b049-371a5d422157"). InnerVolumeSpecName "systemd-units". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 11 01:08:33 crc kubenswrapper[4744]: I0311 01:08:33.600537 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/6ff04e11-e747-44c5-b049-371a5d422157-host-cni-netd" (OuterVolumeSpecName: "host-cni-netd") pod "6ff04e11-e747-44c5-b049-371a5d422157" (UID: "6ff04e11-e747-44c5-b049-371a5d422157"). InnerVolumeSpecName "host-cni-netd". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 11 01:08:33 crc kubenswrapper[4744]: I0311 01:08:33.600567 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/6ff04e11-e747-44c5-b049-371a5d422157-run-ovn" (OuterVolumeSpecName: "run-ovn") pod "6ff04e11-e747-44c5-b049-371a5d422157" (UID: "6ff04e11-e747-44c5-b049-371a5d422157"). InnerVolumeSpecName "run-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 11 01:08:33 crc kubenswrapper[4744]: I0311 01:08:33.600841 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ff04e11-e747-44c5-b049-371a5d422157-ovnkube-script-lib" (OuterVolumeSpecName: "ovnkube-script-lib") pod "6ff04e11-e747-44c5-b049-371a5d422157" (UID: "6ff04e11-e747-44c5-b049-371a5d422157"). InnerVolumeSpecName "ovnkube-script-lib". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 11 01:08:33 crc kubenswrapper[4744]: I0311 01:08:33.604721 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6ff04e11-e747-44c5-b049-371a5d422157-ovn-node-metrics-cert" (OuterVolumeSpecName: "ovn-node-metrics-cert") pod "6ff04e11-e747-44c5-b049-371a5d422157" (UID: "6ff04e11-e747-44c5-b049-371a5d422157"). InnerVolumeSpecName "ovn-node-metrics-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 01:08:33 crc kubenswrapper[4744]: I0311 01:08:33.605492 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6ff04e11-e747-44c5-b049-371a5d422157-kube-api-access-fr9zj" (OuterVolumeSpecName: "kube-api-access-fr9zj") pod "6ff04e11-e747-44c5-b049-371a5d422157" (UID: "6ff04e11-e747-44c5-b049-371a5d422157"). InnerVolumeSpecName "kube-api-access-fr9zj". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 01:08:33 crc kubenswrapper[4744]: I0311 01:08:33.612059 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/6ff04e11-e747-44c5-b049-371a5d422157-run-systemd" (OuterVolumeSpecName: "run-systemd") pod "6ff04e11-e747-44c5-b049-371a5d422157" (UID: "6ff04e11-e747-44c5-b049-371a5d422157"). InnerVolumeSpecName "run-systemd". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 11 01:08:33 crc kubenswrapper[4744]: I0311 01:08:33.701878 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/b88aed14-45de-493c-9cb1-e463c398aa02-var-lib-openvswitch\") pod \"ovnkube-node-wk26z\" (UID: \"b88aed14-45de-493c-9cb1-e463c398aa02\") " pod="openshift-ovn-kubernetes/ovnkube-node-wk26z" Mar 11 01:08:33 crc kubenswrapper[4744]: I0311 01:08:33.701948 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/b88aed14-45de-493c-9cb1-e463c398aa02-run-openvswitch\") pod \"ovnkube-node-wk26z\" (UID: \"b88aed14-45de-493c-9cb1-e463c398aa02\") " pod="openshift-ovn-kubernetes/ovnkube-node-wk26z" Mar 11 01:08:33 crc kubenswrapper[4744]: I0311 01:08:33.701981 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: 
\"kubernetes.io/configmap/b88aed14-45de-493c-9cb1-e463c398aa02-ovnkube-script-lib\") pod \"ovnkube-node-wk26z\" (UID: \"b88aed14-45de-493c-9cb1-e463c398aa02\") " pod="openshift-ovn-kubernetes/ovnkube-node-wk26z" Mar 11 01:08:33 crc kubenswrapper[4744]: I0311 01:08:33.701998 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/b88aed14-45de-493c-9cb1-e463c398aa02-host-slash\") pod \"ovnkube-node-wk26z\" (UID: \"b88aed14-45de-493c-9cb1-e463c398aa02\") " pod="openshift-ovn-kubernetes/ovnkube-node-wk26z" Mar 11 01:08:33 crc kubenswrapper[4744]: I0311 01:08:33.702016 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zm8p6\" (UniqueName: \"kubernetes.io/projected/b88aed14-45de-493c-9cb1-e463c398aa02-kube-api-access-zm8p6\") pod \"ovnkube-node-wk26z\" (UID: \"b88aed14-45de-493c-9cb1-e463c398aa02\") " pod="openshift-ovn-kubernetes/ovnkube-node-wk26z" Mar 11 01:08:33 crc kubenswrapper[4744]: I0311 01:08:33.702041 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/b88aed14-45de-493c-9cb1-e463c398aa02-log-socket\") pod \"ovnkube-node-wk26z\" (UID: \"b88aed14-45de-493c-9cb1-e463c398aa02\") " pod="openshift-ovn-kubernetes/ovnkube-node-wk26z" Mar 11 01:08:33 crc kubenswrapper[4744]: I0311 01:08:33.702120 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/b88aed14-45de-493c-9cb1-e463c398aa02-run-systemd\") pod \"ovnkube-node-wk26z\" (UID: \"b88aed14-45de-493c-9cb1-e463c398aa02\") " pod="openshift-ovn-kubernetes/ovnkube-node-wk26z" Mar 11 01:08:33 crc kubenswrapper[4744]: I0311 01:08:33.702161 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" 
(UniqueName: \"kubernetes.io/host-path/b88aed14-45de-493c-9cb1-e463c398aa02-host-cni-bin\") pod \"ovnkube-node-wk26z\" (UID: \"b88aed14-45de-493c-9cb1-e463c398aa02\") " pod="openshift-ovn-kubernetes/ovnkube-node-wk26z" Mar 11 01:08:33 crc kubenswrapper[4744]: I0311 01:08:33.702181 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/b88aed14-45de-493c-9cb1-e463c398aa02-etc-openvswitch\") pod \"ovnkube-node-wk26z\" (UID: \"b88aed14-45de-493c-9cb1-e463c398aa02\") " pod="openshift-ovn-kubernetes/ovnkube-node-wk26z" Mar 11 01:08:33 crc kubenswrapper[4744]: I0311 01:08:33.702203 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/b88aed14-45de-493c-9cb1-e463c398aa02-systemd-units\") pod \"ovnkube-node-wk26z\" (UID: \"b88aed14-45de-493c-9cb1-e463c398aa02\") " pod="openshift-ovn-kubernetes/ovnkube-node-wk26z" Mar 11 01:08:33 crc kubenswrapper[4744]: I0311 01:08:33.702227 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/b88aed14-45de-493c-9cb1-e463c398aa02-host-kubelet\") pod \"ovnkube-node-wk26z\" (UID: \"b88aed14-45de-493c-9cb1-e463c398aa02\") " pod="openshift-ovn-kubernetes/ovnkube-node-wk26z" Mar 11 01:08:33 crc kubenswrapper[4744]: I0311 01:08:33.702245 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/b88aed14-45de-493c-9cb1-e463c398aa02-env-overrides\") pod \"ovnkube-node-wk26z\" (UID: \"b88aed14-45de-493c-9cb1-e463c398aa02\") " pod="openshift-ovn-kubernetes/ovnkube-node-wk26z" Mar 11 01:08:33 crc kubenswrapper[4744]: I0311 01:08:33.702304 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/b88aed14-45de-493c-9cb1-e463c398aa02-ovn-node-metrics-cert\") pod \"ovnkube-node-wk26z\" (UID: \"b88aed14-45de-493c-9cb1-e463c398aa02\") " pod="openshift-ovn-kubernetes/ovnkube-node-wk26z" Mar 11 01:08:33 crc kubenswrapper[4744]: I0311 01:08:33.702381 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/b88aed14-45de-493c-9cb1-e463c398aa02-run-ovn\") pod \"ovnkube-node-wk26z\" (UID: \"b88aed14-45de-493c-9cb1-e463c398aa02\") " pod="openshift-ovn-kubernetes/ovnkube-node-wk26z" Mar 11 01:08:33 crc kubenswrapper[4744]: I0311 01:08:33.702407 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/b88aed14-45de-493c-9cb1-e463c398aa02-host-run-netns\") pod \"ovnkube-node-wk26z\" (UID: \"b88aed14-45de-493c-9cb1-e463c398aa02\") " pod="openshift-ovn-kubernetes/ovnkube-node-wk26z" Mar 11 01:08:33 crc kubenswrapper[4744]: I0311 01:08:33.702438 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/b88aed14-45de-493c-9cb1-e463c398aa02-host-run-ovn-kubernetes\") pod \"ovnkube-node-wk26z\" (UID: \"b88aed14-45de-493c-9cb1-e463c398aa02\") " pod="openshift-ovn-kubernetes/ovnkube-node-wk26z" Mar 11 01:08:33 crc kubenswrapper[4744]: I0311 01:08:33.702459 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/b88aed14-45de-493c-9cb1-e463c398aa02-host-cni-netd\") pod \"ovnkube-node-wk26z\" (UID: \"b88aed14-45de-493c-9cb1-e463c398aa02\") " pod="openshift-ovn-kubernetes/ovnkube-node-wk26z" Mar 11 01:08:33 crc kubenswrapper[4744]: I0311 01:08:33.702481 4744 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/b88aed14-45de-493c-9cb1-e463c398aa02-ovnkube-config\") pod \"ovnkube-node-wk26z\" (UID: \"b88aed14-45de-493c-9cb1-e463c398aa02\") " pod="openshift-ovn-kubernetes/ovnkube-node-wk26z" Mar 11 01:08:33 crc kubenswrapper[4744]: I0311 01:08:33.702504 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/b88aed14-45de-493c-9cb1-e463c398aa02-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-wk26z\" (UID: \"b88aed14-45de-493c-9cb1-e463c398aa02\") " pod="openshift-ovn-kubernetes/ovnkube-node-wk26z" Mar 11 01:08:33 crc kubenswrapper[4744]: I0311 01:08:33.702544 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/b88aed14-45de-493c-9cb1-e463c398aa02-node-log\") pod \"ovnkube-node-wk26z\" (UID: \"b88aed14-45de-493c-9cb1-e463c398aa02\") " pod="openshift-ovn-kubernetes/ovnkube-node-wk26z" Mar 11 01:08:33 crc kubenswrapper[4744]: I0311 01:08:33.702594 4744 reconciler_common.go:293] "Volume detached for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/6ff04e11-e747-44c5-b049-371a5d422157-run-systemd\") on node \"crc\" DevicePath \"\"" Mar 11 01:08:33 crc kubenswrapper[4744]: I0311 01:08:33.702609 4744 reconciler_common.go:293] "Volume detached for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/6ff04e11-e747-44c5-b049-371a5d422157-run-ovn\") on node \"crc\" DevicePath \"\"" Mar 11 01:08:33 crc kubenswrapper[4744]: I0311 01:08:33.702621 4744 reconciler_common.go:293] "Volume detached for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6ff04e11-e747-44c5-b049-371a5d422157-ovn-node-metrics-cert\") on node \"crc\" DevicePath \"\"" Mar 11 01:08:33 crc kubenswrapper[4744]: I0311 
01:08:33.702634 4744 reconciler_common.go:293] "Volume detached for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/6ff04e11-e747-44c5-b049-371a5d422157-ovnkube-script-lib\") on node \"crc\" DevicePath \"\"" Mar 11 01:08:33 crc kubenswrapper[4744]: I0311 01:08:33.702646 4744 reconciler_common.go:293] "Volume detached for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/6ff04e11-e747-44c5-b049-371a5d422157-systemd-units\") on node \"crc\" DevicePath \"\"" Mar 11 01:08:33 crc kubenswrapper[4744]: I0311 01:08:33.702660 4744 reconciler_common.go:293] "Volume detached for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/6ff04e11-e747-44c5-b049-371a5d422157-host-cni-netd\") on node \"crc\" DevicePath \"\"" Mar 11 01:08:33 crc kubenswrapper[4744]: I0311 01:08:33.702672 4744 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fr9zj\" (UniqueName: \"kubernetes.io/projected/6ff04e11-e747-44c5-b049-371a5d422157-kube-api-access-fr9zj\") on node \"crc\" DevicePath \"\"" Mar 11 01:08:33 crc kubenswrapper[4744]: I0311 01:08:33.804007 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/b88aed14-45de-493c-9cb1-e463c398aa02-env-overrides\") pod \"ovnkube-node-wk26z\" (UID: \"b88aed14-45de-493c-9cb1-e463c398aa02\") " pod="openshift-ovn-kubernetes/ovnkube-node-wk26z" Mar 11 01:08:33 crc kubenswrapper[4744]: I0311 01:08:33.804047 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/b88aed14-45de-493c-9cb1-e463c398aa02-ovn-node-metrics-cert\") pod \"ovnkube-node-wk26z\" (UID: \"b88aed14-45de-493c-9cb1-e463c398aa02\") " pod="openshift-ovn-kubernetes/ovnkube-node-wk26z" Mar 11 01:08:33 crc kubenswrapper[4744]: I0311 01:08:33.804080 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: 
\"kubernetes.io/host-path/b88aed14-45de-493c-9cb1-e463c398aa02-run-ovn\") pod \"ovnkube-node-wk26z\" (UID: \"b88aed14-45de-493c-9cb1-e463c398aa02\") " pod="openshift-ovn-kubernetes/ovnkube-node-wk26z" Mar 11 01:08:33 crc kubenswrapper[4744]: I0311 01:08:33.804097 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/b88aed14-45de-493c-9cb1-e463c398aa02-host-run-netns\") pod \"ovnkube-node-wk26z\" (UID: \"b88aed14-45de-493c-9cb1-e463c398aa02\") " pod="openshift-ovn-kubernetes/ovnkube-node-wk26z" Mar 11 01:08:33 crc kubenswrapper[4744]: I0311 01:08:33.804118 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/b88aed14-45de-493c-9cb1-e463c398aa02-host-run-ovn-kubernetes\") pod \"ovnkube-node-wk26z\" (UID: \"b88aed14-45de-493c-9cb1-e463c398aa02\") " pod="openshift-ovn-kubernetes/ovnkube-node-wk26z" Mar 11 01:08:33 crc kubenswrapper[4744]: I0311 01:08:33.804132 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/b88aed14-45de-493c-9cb1-e463c398aa02-host-cni-netd\") pod \"ovnkube-node-wk26z\" (UID: \"b88aed14-45de-493c-9cb1-e463c398aa02\") " pod="openshift-ovn-kubernetes/ovnkube-node-wk26z" Mar 11 01:08:33 crc kubenswrapper[4744]: I0311 01:08:33.804145 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/b88aed14-45de-493c-9cb1-e463c398aa02-ovnkube-config\") pod \"ovnkube-node-wk26z\" (UID: \"b88aed14-45de-493c-9cb1-e463c398aa02\") " pod="openshift-ovn-kubernetes/ovnkube-node-wk26z" Mar 11 01:08:33 crc kubenswrapper[4744]: I0311 01:08:33.804161 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: 
\"kubernetes.io/host-path/b88aed14-45de-493c-9cb1-e463c398aa02-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-wk26z\" (UID: \"b88aed14-45de-493c-9cb1-e463c398aa02\") " pod="openshift-ovn-kubernetes/ovnkube-node-wk26z" Mar 11 01:08:33 crc kubenswrapper[4744]: I0311 01:08:33.804179 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/b88aed14-45de-493c-9cb1-e463c398aa02-node-log\") pod \"ovnkube-node-wk26z\" (UID: \"b88aed14-45de-493c-9cb1-e463c398aa02\") " pod="openshift-ovn-kubernetes/ovnkube-node-wk26z" Mar 11 01:08:33 crc kubenswrapper[4744]: I0311 01:08:33.804202 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/b88aed14-45de-493c-9cb1-e463c398aa02-var-lib-openvswitch\") pod \"ovnkube-node-wk26z\" (UID: \"b88aed14-45de-493c-9cb1-e463c398aa02\") " pod="openshift-ovn-kubernetes/ovnkube-node-wk26z" Mar 11 01:08:33 crc kubenswrapper[4744]: I0311 01:08:33.804216 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/b88aed14-45de-493c-9cb1-e463c398aa02-run-openvswitch\") pod \"ovnkube-node-wk26z\" (UID: \"b88aed14-45de-493c-9cb1-e463c398aa02\") " pod="openshift-ovn-kubernetes/ovnkube-node-wk26z" Mar 11 01:08:33 crc kubenswrapper[4744]: I0311 01:08:33.804225 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/b88aed14-45de-493c-9cb1-e463c398aa02-run-ovn\") pod \"ovnkube-node-wk26z\" (UID: \"b88aed14-45de-493c-9cb1-e463c398aa02\") " pod="openshift-ovn-kubernetes/ovnkube-node-wk26z" Mar 11 01:08:33 crc kubenswrapper[4744]: I0311 01:08:33.804257 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/b88aed14-45de-493c-9cb1-e463c398aa02-host-cni-netd\") pod 
\"ovnkube-node-wk26z\" (UID: \"b88aed14-45de-493c-9cb1-e463c398aa02\") " pod="openshift-ovn-kubernetes/ovnkube-node-wk26z" Mar 11 01:08:33 crc kubenswrapper[4744]: I0311 01:08:33.804234 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/b88aed14-45de-493c-9cb1-e463c398aa02-ovnkube-script-lib\") pod \"ovnkube-node-wk26z\" (UID: \"b88aed14-45de-493c-9cb1-e463c398aa02\") " pod="openshift-ovn-kubernetes/ovnkube-node-wk26z" Mar 11 01:08:33 crc kubenswrapper[4744]: I0311 01:08:33.804348 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/b88aed14-45de-493c-9cb1-e463c398aa02-host-slash\") pod \"ovnkube-node-wk26z\" (UID: \"b88aed14-45de-493c-9cb1-e463c398aa02\") " pod="openshift-ovn-kubernetes/ovnkube-node-wk26z" Mar 11 01:08:33 crc kubenswrapper[4744]: I0311 01:08:33.804395 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zm8p6\" (UniqueName: \"kubernetes.io/projected/b88aed14-45de-493c-9cb1-e463c398aa02-kube-api-access-zm8p6\") pod \"ovnkube-node-wk26z\" (UID: \"b88aed14-45de-493c-9cb1-e463c398aa02\") " pod="openshift-ovn-kubernetes/ovnkube-node-wk26z" Mar 11 01:08:33 crc kubenswrapper[4744]: I0311 01:08:33.804414 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/b88aed14-45de-493c-9cb1-e463c398aa02-node-log\") pod \"ovnkube-node-wk26z\" (UID: \"b88aed14-45de-493c-9cb1-e463c398aa02\") " pod="openshift-ovn-kubernetes/ovnkube-node-wk26z" Mar 11 01:08:33 crc kubenswrapper[4744]: I0311 01:08:33.804439 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/b88aed14-45de-493c-9cb1-e463c398aa02-log-socket\") pod \"ovnkube-node-wk26z\" (UID: \"b88aed14-45de-493c-9cb1-e463c398aa02\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-wk26z" Mar 11 01:08:33 crc kubenswrapper[4744]: I0311 01:08:33.804473 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/b88aed14-45de-493c-9cb1-e463c398aa02-run-systemd\") pod \"ovnkube-node-wk26z\" (UID: \"b88aed14-45de-493c-9cb1-e463c398aa02\") " pod="openshift-ovn-kubernetes/ovnkube-node-wk26z" Mar 11 01:08:33 crc kubenswrapper[4744]: I0311 01:08:33.804500 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/b88aed14-45de-493c-9cb1-e463c398aa02-host-cni-bin\") pod \"ovnkube-node-wk26z\" (UID: \"b88aed14-45de-493c-9cb1-e463c398aa02\") " pod="openshift-ovn-kubernetes/ovnkube-node-wk26z" Mar 11 01:08:33 crc kubenswrapper[4744]: I0311 01:08:33.804590 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/b88aed14-45de-493c-9cb1-e463c398aa02-etc-openvswitch\") pod \"ovnkube-node-wk26z\" (UID: \"b88aed14-45de-493c-9cb1-e463c398aa02\") " pod="openshift-ovn-kubernetes/ovnkube-node-wk26z" Mar 11 01:08:33 crc kubenswrapper[4744]: I0311 01:08:33.804648 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/b88aed14-45de-493c-9cb1-e463c398aa02-systemd-units\") pod \"ovnkube-node-wk26z\" (UID: \"b88aed14-45de-493c-9cb1-e463c398aa02\") " pod="openshift-ovn-kubernetes/ovnkube-node-wk26z" Mar 11 01:08:33 crc kubenswrapper[4744]: I0311 01:08:33.804691 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/b88aed14-45de-493c-9cb1-e463c398aa02-host-kubelet\") pod \"ovnkube-node-wk26z\" (UID: \"b88aed14-45de-493c-9cb1-e463c398aa02\") " pod="openshift-ovn-kubernetes/ovnkube-node-wk26z" Mar 11 01:08:33 crc kubenswrapper[4744]: 
I0311 01:08:33.804778 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/b88aed14-45de-493c-9cb1-e463c398aa02-host-kubelet\") pod \"ovnkube-node-wk26z\" (UID: \"b88aed14-45de-493c-9cb1-e463c398aa02\") " pod="openshift-ovn-kubernetes/ovnkube-node-wk26z" Mar 11 01:08:33 crc kubenswrapper[4744]: I0311 01:08:33.804825 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/b88aed14-45de-493c-9cb1-e463c398aa02-host-run-netns\") pod \"ovnkube-node-wk26z\" (UID: \"b88aed14-45de-493c-9cb1-e463c398aa02\") " pod="openshift-ovn-kubernetes/ovnkube-node-wk26z" Mar 11 01:08:33 crc kubenswrapper[4744]: I0311 01:08:33.804867 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/b88aed14-45de-493c-9cb1-e463c398aa02-host-run-ovn-kubernetes\") pod \"ovnkube-node-wk26z\" (UID: \"b88aed14-45de-493c-9cb1-e463c398aa02\") " pod="openshift-ovn-kubernetes/ovnkube-node-wk26z" Mar 11 01:08:33 crc kubenswrapper[4744]: I0311 01:08:33.804908 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/b88aed14-45de-493c-9cb1-e463c398aa02-host-slash\") pod \"ovnkube-node-wk26z\" (UID: \"b88aed14-45de-493c-9cb1-e463c398aa02\") " pod="openshift-ovn-kubernetes/ovnkube-node-wk26z" Mar 11 01:08:33 crc kubenswrapper[4744]: I0311 01:08:33.805244 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/b88aed14-45de-493c-9cb1-e463c398aa02-var-lib-openvswitch\") pod \"ovnkube-node-wk26z\" (UID: \"b88aed14-45de-493c-9cb1-e463c398aa02\") " pod="openshift-ovn-kubernetes/ovnkube-node-wk26z" Mar 11 01:08:33 crc kubenswrapper[4744]: I0311 01:08:33.805344 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/b88aed14-45de-493c-9cb1-e463c398aa02-host-cni-bin\") pod \"ovnkube-node-wk26z\" (UID: \"b88aed14-45de-493c-9cb1-e463c398aa02\") " pod="openshift-ovn-kubernetes/ovnkube-node-wk26z" Mar 11 01:08:33 crc kubenswrapper[4744]: I0311 01:08:33.805385 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/b88aed14-45de-493c-9cb1-e463c398aa02-etc-openvswitch\") pod \"ovnkube-node-wk26z\" (UID: \"b88aed14-45de-493c-9cb1-e463c398aa02\") " pod="openshift-ovn-kubernetes/ovnkube-node-wk26z" Mar 11 01:08:33 crc kubenswrapper[4744]: I0311 01:08:33.805406 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/b88aed14-45de-493c-9cb1-e463c398aa02-systemd-units\") pod \"ovnkube-node-wk26z\" (UID: \"b88aed14-45de-493c-9cb1-e463c398aa02\") " pod="openshift-ovn-kubernetes/ovnkube-node-wk26z" Mar 11 01:08:33 crc kubenswrapper[4744]: I0311 01:08:33.805430 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/b88aed14-45de-493c-9cb1-e463c398aa02-log-socket\") pod \"ovnkube-node-wk26z\" (UID: \"b88aed14-45de-493c-9cb1-e463c398aa02\") " pod="openshift-ovn-kubernetes/ovnkube-node-wk26z" Mar 11 01:08:33 crc kubenswrapper[4744]: I0311 01:08:33.805413 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/b88aed14-45de-493c-9cb1-e463c398aa02-env-overrides\") pod \"ovnkube-node-wk26z\" (UID: \"b88aed14-45de-493c-9cb1-e463c398aa02\") " pod="openshift-ovn-kubernetes/ovnkube-node-wk26z" Mar 11 01:08:33 crc kubenswrapper[4744]: I0311 01:08:33.805465 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/b88aed14-45de-493c-9cb1-e463c398aa02-run-openvswitch\") pod 
\"ovnkube-node-wk26z\" (UID: \"b88aed14-45de-493c-9cb1-e463c398aa02\") " pod="openshift-ovn-kubernetes/ovnkube-node-wk26z" Mar 11 01:08:33 crc kubenswrapper[4744]: I0311 01:08:33.805482 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/b88aed14-45de-493c-9cb1-e463c398aa02-run-systemd\") pod \"ovnkube-node-wk26z\" (UID: \"b88aed14-45de-493c-9cb1-e463c398aa02\") " pod="openshift-ovn-kubernetes/ovnkube-node-wk26z" Mar 11 01:08:33 crc kubenswrapper[4744]: I0311 01:08:33.805541 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/b88aed14-45de-493c-9cb1-e463c398aa02-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-wk26z\" (UID: \"b88aed14-45de-493c-9cb1-e463c398aa02\") " pod="openshift-ovn-kubernetes/ovnkube-node-wk26z" Mar 11 01:08:33 crc kubenswrapper[4744]: I0311 01:08:33.805612 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/b88aed14-45de-493c-9cb1-e463c398aa02-ovnkube-config\") pod \"ovnkube-node-wk26z\" (UID: \"b88aed14-45de-493c-9cb1-e463c398aa02\") " pod="openshift-ovn-kubernetes/ovnkube-node-wk26z" Mar 11 01:08:33 crc kubenswrapper[4744]: I0311 01:08:33.806072 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/b88aed14-45de-493c-9cb1-e463c398aa02-ovnkube-script-lib\") pod \"ovnkube-node-wk26z\" (UID: \"b88aed14-45de-493c-9cb1-e463c398aa02\") " pod="openshift-ovn-kubernetes/ovnkube-node-wk26z" Mar 11 01:08:33 crc kubenswrapper[4744]: I0311 01:08:33.814104 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/b88aed14-45de-493c-9cb1-e463c398aa02-ovn-node-metrics-cert\") pod \"ovnkube-node-wk26z\" (UID: 
\"b88aed14-45de-493c-9cb1-e463c398aa02\") " pod="openshift-ovn-kubernetes/ovnkube-node-wk26z" Mar 11 01:08:33 crc kubenswrapper[4744]: I0311 01:08:33.826663 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zm8p6\" (UniqueName: \"kubernetes.io/projected/b88aed14-45de-493c-9cb1-e463c398aa02-kube-api-access-zm8p6\") pod \"ovnkube-node-wk26z\" (UID: \"b88aed14-45de-493c-9cb1-e463c398aa02\") " pod="openshift-ovn-kubernetes/ovnkube-node-wk26z" Mar 11 01:08:33 crc kubenswrapper[4744]: I0311 01:08:33.881016 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-wk26z" Mar 11 01:08:33 crc kubenswrapper[4744]: W0311 01:08:33.922738 4744 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb88aed14_45de_493c_9cb1_e463c398aa02.slice/crio-03ae5d8898256acce75553e3719a3fa7f6391d99e165e98e7c2bbcf870b85da4 WatchSource:0}: Error finding container 03ae5d8898256acce75553e3719a3fa7f6391d99e165e98e7c2bbcf870b85da4: Status 404 returned error can't find the container with id 03ae5d8898256acce75553e3719a3fa7f6391d99e165e98e7c2bbcf870b85da4 Mar 11 01:08:34 crc kubenswrapper[4744]: I0311 01:08:34.225699 4744 generic.go:334] "Generic (PLEG): container finished" podID="b88aed14-45de-493c-9cb1-e463c398aa02" containerID="db505d689f3aef78c49e4bac171af93778a97b6d98d194870b1a8334333b3dc9" exitCode=0 Mar 11 01:08:34 crc kubenswrapper[4744]: I0311 01:08:34.225837 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-wk26z" event={"ID":"b88aed14-45de-493c-9cb1-e463c398aa02","Type":"ContainerDied","Data":"db505d689f3aef78c49e4bac171af93778a97b6d98d194870b1a8334333b3dc9"} Mar 11 01:08:34 crc kubenswrapper[4744]: I0311 01:08:34.225881 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-wk26z" 
event={"ID":"b88aed14-45de-493c-9cb1-e463c398aa02","Type":"ContainerStarted","Data":"03ae5d8898256acce75553e3719a3fa7f6391d99e165e98e7c2bbcf870b85da4"} Mar 11 01:08:34 crc kubenswrapper[4744]: I0311 01:08:34.232401 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-xlclh_e16bf0f3-533b-4114-89c6-195a85273e98/kube-multus/2.log" Mar 11 01:08:34 crc kubenswrapper[4744]: I0311 01:08:34.232748 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-xlclh_e16bf0f3-533b-4114-89c6-195a85273e98/kube-multus/1.log" Mar 11 01:08:34 crc kubenswrapper[4744]: I0311 01:08:34.232781 4744 generic.go:334] "Generic (PLEG): container finished" podID="e16bf0f3-533b-4114-89c6-195a85273e98" containerID="7af721d68eedfb76de378529ad9a2fb23d33e7a1d6d37b9abb8763fe0d9087f1" exitCode=2 Mar 11 01:08:34 crc kubenswrapper[4744]: I0311 01:08:34.232826 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-xlclh" event={"ID":"e16bf0f3-533b-4114-89c6-195a85273e98","Type":"ContainerDied","Data":"7af721d68eedfb76de378529ad9a2fb23d33e7a1d6d37b9abb8763fe0d9087f1"} Mar 11 01:08:34 crc kubenswrapper[4744]: I0311 01:08:34.232857 4744 scope.go:117] "RemoveContainer" containerID="0289bdbe344d516cb0576b0b739e5752ec2f754bd78918b1d81f20d880ebd1f5" Mar 11 01:08:34 crc kubenswrapper[4744]: I0311 01:08:34.233251 4744 scope.go:117] "RemoveContainer" containerID="7af721d68eedfb76de378529ad9a2fb23d33e7a1d6d37b9abb8763fe0d9087f1" Mar 11 01:08:34 crc kubenswrapper[4744]: I0311 01:08:34.236217 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-78fcc_6ff04e11-e747-44c5-b049-371a5d422157/ovnkube-controller/3.log" Mar 11 01:08:34 crc kubenswrapper[4744]: I0311 01:08:34.238407 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-78fcc_6ff04e11-e747-44c5-b049-371a5d422157/ovn-acl-logging/0.log" Mar 11 01:08:34 crc 
kubenswrapper[4744]: I0311 01:08:34.239069 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-78fcc_6ff04e11-e747-44c5-b049-371a5d422157/ovn-controller/0.log" Mar 11 01:08:34 crc kubenswrapper[4744]: I0311 01:08:34.239434 4744 generic.go:334] "Generic (PLEG): container finished" podID="6ff04e11-e747-44c5-b049-371a5d422157" containerID="b54b226b6e5d010f5666be32c5e89383a99ed40a1a63d82cfb4e37b9337437ce" exitCode=0 Mar 11 01:08:34 crc kubenswrapper[4744]: I0311 01:08:34.239466 4744 generic.go:334] "Generic (PLEG): container finished" podID="6ff04e11-e747-44c5-b049-371a5d422157" containerID="742e3b6ac0accae72e9285636ab0920db1d3e0625d6093adf1606f57b484b139" exitCode=0 Mar 11 01:08:34 crc kubenswrapper[4744]: I0311 01:08:34.239477 4744 generic.go:334] "Generic (PLEG): container finished" podID="6ff04e11-e747-44c5-b049-371a5d422157" containerID="98ba07651ae84a42f92411fee85a15562e325c61ff2d8fcba4962d21c51fce73" exitCode=0 Mar 11 01:08:34 crc kubenswrapper[4744]: I0311 01:08:34.239487 4744 generic.go:334] "Generic (PLEG): container finished" podID="6ff04e11-e747-44c5-b049-371a5d422157" containerID="d7cd308cadb08ea56bbc709c6401ab1d3eec24f0414a7ee1c57188c5e60adffe" exitCode=0 Mar 11 01:08:34 crc kubenswrapper[4744]: I0311 01:08:34.239497 4744 generic.go:334] "Generic (PLEG): container finished" podID="6ff04e11-e747-44c5-b049-371a5d422157" containerID="e37e770ef43eabf7e0bb162d7e700eaa2d61c7ceca9737de5872c1caeec06774" exitCode=0 Mar 11 01:08:34 crc kubenswrapper[4744]: I0311 01:08:34.239506 4744 generic.go:334] "Generic (PLEG): container finished" podID="6ff04e11-e747-44c5-b049-371a5d422157" containerID="8132bd89cd281c66b2735b3fa85c3fc773e7140d6aaeaefe0c0a2c3e3f743080" exitCode=0 Mar 11 01:08:34 crc kubenswrapper[4744]: I0311 01:08:34.239530 4744 generic.go:334] "Generic (PLEG): container finished" podID="6ff04e11-e747-44c5-b049-371a5d422157" containerID="84d4ada431e7a91cc86b5cb9bba4316c233c5f666f21bce1d8b6df0019b6ad76" 
exitCode=143 Mar 11 01:08:34 crc kubenswrapper[4744]: I0311 01:08:34.239539 4744 generic.go:334] "Generic (PLEG): container finished" podID="6ff04e11-e747-44c5-b049-371a5d422157" containerID="27bc44383b34aea31ffcebae0f1f464c31918edc0c37f01d6bd66166e34fd4a8" exitCode=143 Mar 11 01:08:34 crc kubenswrapper[4744]: I0311 01:08:34.239505 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-78fcc" event={"ID":"6ff04e11-e747-44c5-b049-371a5d422157","Type":"ContainerDied","Data":"b54b226b6e5d010f5666be32c5e89383a99ed40a1a63d82cfb4e37b9337437ce"} Mar 11 01:08:34 crc kubenswrapper[4744]: I0311 01:08:34.239566 4744 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-78fcc" Mar 11 01:08:34 crc kubenswrapper[4744]: I0311 01:08:34.239577 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-78fcc" event={"ID":"6ff04e11-e747-44c5-b049-371a5d422157","Type":"ContainerDied","Data":"742e3b6ac0accae72e9285636ab0920db1d3e0625d6093adf1606f57b484b139"} Mar 11 01:08:34 crc kubenswrapper[4744]: I0311 01:08:34.239591 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-78fcc" event={"ID":"6ff04e11-e747-44c5-b049-371a5d422157","Type":"ContainerDied","Data":"98ba07651ae84a42f92411fee85a15562e325c61ff2d8fcba4962d21c51fce73"} Mar 11 01:08:34 crc kubenswrapper[4744]: I0311 01:08:34.239601 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-78fcc" event={"ID":"6ff04e11-e747-44c5-b049-371a5d422157","Type":"ContainerDied","Data":"d7cd308cadb08ea56bbc709c6401ab1d3eec24f0414a7ee1c57188c5e60adffe"} Mar 11 01:08:34 crc kubenswrapper[4744]: I0311 01:08:34.239611 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-78fcc" 
event={"ID":"6ff04e11-e747-44c5-b049-371a5d422157","Type":"ContainerDied","Data":"e37e770ef43eabf7e0bb162d7e700eaa2d61c7ceca9737de5872c1caeec06774"} Mar 11 01:08:34 crc kubenswrapper[4744]: I0311 01:08:34.239623 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-78fcc" event={"ID":"6ff04e11-e747-44c5-b049-371a5d422157","Type":"ContainerDied","Data":"8132bd89cd281c66b2735b3fa85c3fc773e7140d6aaeaefe0c0a2c3e3f743080"} Mar 11 01:08:34 crc kubenswrapper[4744]: I0311 01:08:34.239633 4744 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"b54b226b6e5d010f5666be32c5e89383a99ed40a1a63d82cfb4e37b9337437ce"} Mar 11 01:08:34 crc kubenswrapper[4744]: I0311 01:08:34.239645 4744 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"4df93d4dc2ebaa7aed01d39e44292956a190da5fd454d61f8e46a33114a1ccd0"} Mar 11 01:08:34 crc kubenswrapper[4744]: I0311 01:08:34.239651 4744 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"742e3b6ac0accae72e9285636ab0920db1d3e0625d6093adf1606f57b484b139"} Mar 11 01:08:34 crc kubenswrapper[4744]: I0311 01:08:34.239657 4744 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"98ba07651ae84a42f92411fee85a15562e325c61ff2d8fcba4962d21c51fce73"} Mar 11 01:08:34 crc kubenswrapper[4744]: I0311 01:08:34.239662 4744 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"d7cd308cadb08ea56bbc709c6401ab1d3eec24f0414a7ee1c57188c5e60adffe"} Mar 11 01:08:34 crc kubenswrapper[4744]: I0311 01:08:34.239667 4744 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"e37e770ef43eabf7e0bb162d7e700eaa2d61c7ceca9737de5872c1caeec06774"} Mar 11 01:08:34 crc 
kubenswrapper[4744]: I0311 01:08:34.239672 4744 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"8132bd89cd281c66b2735b3fa85c3fc773e7140d6aaeaefe0c0a2c3e3f743080"} Mar 11 01:08:34 crc kubenswrapper[4744]: I0311 01:08:34.239677 4744 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"84d4ada431e7a91cc86b5cb9bba4316c233c5f666f21bce1d8b6df0019b6ad76"} Mar 11 01:08:34 crc kubenswrapper[4744]: I0311 01:08:34.239682 4744 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"27bc44383b34aea31ffcebae0f1f464c31918edc0c37f01d6bd66166e34fd4a8"} Mar 11 01:08:34 crc kubenswrapper[4744]: I0311 01:08:34.239686 4744 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"524c79ee3bb53b187166cbc6bf19cbb3311ab561608a1d3c819288c1fb684481"} Mar 11 01:08:34 crc kubenswrapper[4744]: I0311 01:08:34.239693 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-78fcc" event={"ID":"6ff04e11-e747-44c5-b049-371a5d422157","Type":"ContainerDied","Data":"84d4ada431e7a91cc86b5cb9bba4316c233c5f666f21bce1d8b6df0019b6ad76"} Mar 11 01:08:34 crc kubenswrapper[4744]: I0311 01:08:34.239700 4744 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"b54b226b6e5d010f5666be32c5e89383a99ed40a1a63d82cfb4e37b9337437ce"} Mar 11 01:08:34 crc kubenswrapper[4744]: I0311 01:08:34.239705 4744 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"4df93d4dc2ebaa7aed01d39e44292956a190da5fd454d61f8e46a33114a1ccd0"} Mar 11 01:08:34 crc kubenswrapper[4744]: I0311 01:08:34.239711 4744 pod_container_deletor.go:114] "Failed to issue the request to remove container" 
containerID={"Type":"cri-o","ID":"742e3b6ac0accae72e9285636ab0920db1d3e0625d6093adf1606f57b484b139"} Mar 11 01:08:34 crc kubenswrapper[4744]: I0311 01:08:34.239716 4744 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"98ba07651ae84a42f92411fee85a15562e325c61ff2d8fcba4962d21c51fce73"} Mar 11 01:08:34 crc kubenswrapper[4744]: I0311 01:08:34.239721 4744 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"d7cd308cadb08ea56bbc709c6401ab1d3eec24f0414a7ee1c57188c5e60adffe"} Mar 11 01:08:34 crc kubenswrapper[4744]: I0311 01:08:34.239725 4744 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"e37e770ef43eabf7e0bb162d7e700eaa2d61c7ceca9737de5872c1caeec06774"} Mar 11 01:08:34 crc kubenswrapper[4744]: I0311 01:08:34.239730 4744 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"8132bd89cd281c66b2735b3fa85c3fc773e7140d6aaeaefe0c0a2c3e3f743080"} Mar 11 01:08:34 crc kubenswrapper[4744]: I0311 01:08:34.239735 4744 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"84d4ada431e7a91cc86b5cb9bba4316c233c5f666f21bce1d8b6df0019b6ad76"} Mar 11 01:08:34 crc kubenswrapper[4744]: I0311 01:08:34.239739 4744 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"27bc44383b34aea31ffcebae0f1f464c31918edc0c37f01d6bd66166e34fd4a8"} Mar 11 01:08:34 crc kubenswrapper[4744]: I0311 01:08:34.239744 4744 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"524c79ee3bb53b187166cbc6bf19cbb3311ab561608a1d3c819288c1fb684481"} Mar 11 01:08:34 crc kubenswrapper[4744]: I0311 01:08:34.239752 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-ovn-kubernetes/ovnkube-node-78fcc" event={"ID":"6ff04e11-e747-44c5-b049-371a5d422157","Type":"ContainerDied","Data":"27bc44383b34aea31ffcebae0f1f464c31918edc0c37f01d6bd66166e34fd4a8"} Mar 11 01:08:34 crc kubenswrapper[4744]: I0311 01:08:34.239760 4744 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"b54b226b6e5d010f5666be32c5e89383a99ed40a1a63d82cfb4e37b9337437ce"} Mar 11 01:08:34 crc kubenswrapper[4744]: I0311 01:08:34.239767 4744 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"4df93d4dc2ebaa7aed01d39e44292956a190da5fd454d61f8e46a33114a1ccd0"} Mar 11 01:08:34 crc kubenswrapper[4744]: I0311 01:08:34.239772 4744 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"742e3b6ac0accae72e9285636ab0920db1d3e0625d6093adf1606f57b484b139"} Mar 11 01:08:34 crc kubenswrapper[4744]: I0311 01:08:34.239777 4744 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"98ba07651ae84a42f92411fee85a15562e325c61ff2d8fcba4962d21c51fce73"} Mar 11 01:08:34 crc kubenswrapper[4744]: I0311 01:08:34.239782 4744 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"d7cd308cadb08ea56bbc709c6401ab1d3eec24f0414a7ee1c57188c5e60adffe"} Mar 11 01:08:34 crc kubenswrapper[4744]: I0311 01:08:34.239787 4744 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"e37e770ef43eabf7e0bb162d7e700eaa2d61c7ceca9737de5872c1caeec06774"} Mar 11 01:08:34 crc kubenswrapper[4744]: I0311 01:08:34.239792 4744 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"8132bd89cd281c66b2735b3fa85c3fc773e7140d6aaeaefe0c0a2c3e3f743080"} Mar 11 01:08:34 crc kubenswrapper[4744]: I0311 
01:08:34.239797 4744 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"84d4ada431e7a91cc86b5cb9bba4316c233c5f666f21bce1d8b6df0019b6ad76"} Mar 11 01:08:34 crc kubenswrapper[4744]: I0311 01:08:34.239802 4744 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"27bc44383b34aea31ffcebae0f1f464c31918edc0c37f01d6bd66166e34fd4a8"} Mar 11 01:08:34 crc kubenswrapper[4744]: I0311 01:08:34.239807 4744 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"524c79ee3bb53b187166cbc6bf19cbb3311ab561608a1d3c819288c1fb684481"} Mar 11 01:08:34 crc kubenswrapper[4744]: I0311 01:08:34.239813 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-78fcc" event={"ID":"6ff04e11-e747-44c5-b049-371a5d422157","Type":"ContainerDied","Data":"149bb7f8bc16884920b6e48e26185cd3ea4656a535d32aea578edd5e1ba9b507"} Mar 11 01:08:34 crc kubenswrapper[4744]: I0311 01:08:34.239821 4744 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"b54b226b6e5d010f5666be32c5e89383a99ed40a1a63d82cfb4e37b9337437ce"} Mar 11 01:08:34 crc kubenswrapper[4744]: I0311 01:08:34.239826 4744 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"4df93d4dc2ebaa7aed01d39e44292956a190da5fd454d61f8e46a33114a1ccd0"} Mar 11 01:08:34 crc kubenswrapper[4744]: I0311 01:08:34.239832 4744 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"742e3b6ac0accae72e9285636ab0920db1d3e0625d6093adf1606f57b484b139"} Mar 11 01:08:34 crc kubenswrapper[4744]: I0311 01:08:34.239837 4744 pod_container_deletor.go:114] "Failed to issue the request to remove container" 
containerID={"Type":"cri-o","ID":"98ba07651ae84a42f92411fee85a15562e325c61ff2d8fcba4962d21c51fce73"} Mar 11 01:08:34 crc kubenswrapper[4744]: I0311 01:08:34.239842 4744 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"d7cd308cadb08ea56bbc709c6401ab1d3eec24f0414a7ee1c57188c5e60adffe"} Mar 11 01:08:34 crc kubenswrapper[4744]: I0311 01:08:34.239848 4744 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"e37e770ef43eabf7e0bb162d7e700eaa2d61c7ceca9737de5872c1caeec06774"} Mar 11 01:08:34 crc kubenswrapper[4744]: I0311 01:08:34.239854 4744 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"8132bd89cd281c66b2735b3fa85c3fc773e7140d6aaeaefe0c0a2c3e3f743080"} Mar 11 01:08:34 crc kubenswrapper[4744]: I0311 01:08:34.239859 4744 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"84d4ada431e7a91cc86b5cb9bba4316c233c5f666f21bce1d8b6df0019b6ad76"} Mar 11 01:08:34 crc kubenswrapper[4744]: I0311 01:08:34.239864 4744 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"27bc44383b34aea31ffcebae0f1f464c31918edc0c37f01d6bd66166e34fd4a8"} Mar 11 01:08:34 crc kubenswrapper[4744]: I0311 01:08:34.239869 4744 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"524c79ee3bb53b187166cbc6bf19cbb3311ab561608a1d3c819288c1fb684481"} Mar 11 01:08:34 crc kubenswrapper[4744]: I0311 01:08:34.281112 4744 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-78fcc"] Mar 11 01:08:34 crc kubenswrapper[4744]: I0311 01:08:34.298077 4744 scope.go:117] "RemoveContainer" containerID="b54b226b6e5d010f5666be32c5e89383a99ed40a1a63d82cfb4e37b9337437ce" Mar 11 01:08:34 crc kubenswrapper[4744]: I0311 
01:08:34.305412 4744 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-78fcc"] Mar 11 01:08:34 crc kubenswrapper[4744]: I0311 01:08:34.328624 4744 scope.go:117] "RemoveContainer" containerID="4df93d4dc2ebaa7aed01d39e44292956a190da5fd454d61f8e46a33114a1ccd0" Mar 11 01:08:34 crc kubenswrapper[4744]: I0311 01:08:34.384652 4744 scope.go:117] "RemoveContainer" containerID="742e3b6ac0accae72e9285636ab0920db1d3e0625d6093adf1606f57b484b139" Mar 11 01:08:34 crc kubenswrapper[4744]: I0311 01:08:34.428229 4744 scope.go:117] "RemoveContainer" containerID="98ba07651ae84a42f92411fee85a15562e325c61ff2d8fcba4962d21c51fce73" Mar 11 01:08:34 crc kubenswrapper[4744]: I0311 01:08:34.443644 4744 scope.go:117] "RemoveContainer" containerID="d7cd308cadb08ea56bbc709c6401ab1d3eec24f0414a7ee1c57188c5e60adffe" Mar 11 01:08:34 crc kubenswrapper[4744]: I0311 01:08:34.454135 4744 scope.go:117] "RemoveContainer" containerID="e37e770ef43eabf7e0bb162d7e700eaa2d61c7ceca9737de5872c1caeec06774" Mar 11 01:08:34 crc kubenswrapper[4744]: I0311 01:08:34.476679 4744 scope.go:117] "RemoveContainer" containerID="8132bd89cd281c66b2735b3fa85c3fc773e7140d6aaeaefe0c0a2c3e3f743080" Mar 11 01:08:34 crc kubenswrapper[4744]: I0311 01:08:34.492745 4744 scope.go:117] "RemoveContainer" containerID="84d4ada431e7a91cc86b5cb9bba4316c233c5f666f21bce1d8b6df0019b6ad76" Mar 11 01:08:34 crc kubenswrapper[4744]: I0311 01:08:34.509326 4744 scope.go:117] "RemoveContainer" containerID="27bc44383b34aea31ffcebae0f1f464c31918edc0c37f01d6bd66166e34fd4a8" Mar 11 01:08:34 crc kubenswrapper[4744]: I0311 01:08:34.531456 4744 scope.go:117] "RemoveContainer" containerID="524c79ee3bb53b187166cbc6bf19cbb3311ab561608a1d3c819288c1fb684481" Mar 11 01:08:34 crc kubenswrapper[4744]: I0311 01:08:34.552190 4744 scope.go:117] "RemoveContainer" containerID="b54b226b6e5d010f5666be32c5e89383a99ed40a1a63d82cfb4e37b9337437ce" Mar 11 01:08:34 crc kubenswrapper[4744]: E0311 01:08:34.552498 4744 log.go:32] 
"ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b54b226b6e5d010f5666be32c5e89383a99ed40a1a63d82cfb4e37b9337437ce\": container with ID starting with b54b226b6e5d010f5666be32c5e89383a99ed40a1a63d82cfb4e37b9337437ce not found: ID does not exist" containerID="b54b226b6e5d010f5666be32c5e89383a99ed40a1a63d82cfb4e37b9337437ce" Mar 11 01:08:34 crc kubenswrapper[4744]: I0311 01:08:34.552535 4744 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b54b226b6e5d010f5666be32c5e89383a99ed40a1a63d82cfb4e37b9337437ce"} err="failed to get container status \"b54b226b6e5d010f5666be32c5e89383a99ed40a1a63d82cfb4e37b9337437ce\": rpc error: code = NotFound desc = could not find container \"b54b226b6e5d010f5666be32c5e89383a99ed40a1a63d82cfb4e37b9337437ce\": container with ID starting with b54b226b6e5d010f5666be32c5e89383a99ed40a1a63d82cfb4e37b9337437ce not found: ID does not exist" Mar 11 01:08:34 crc kubenswrapper[4744]: I0311 01:08:34.552556 4744 scope.go:117] "RemoveContainer" containerID="4df93d4dc2ebaa7aed01d39e44292956a190da5fd454d61f8e46a33114a1ccd0" Mar 11 01:08:34 crc kubenswrapper[4744]: E0311 01:08:34.552762 4744 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4df93d4dc2ebaa7aed01d39e44292956a190da5fd454d61f8e46a33114a1ccd0\": container with ID starting with 4df93d4dc2ebaa7aed01d39e44292956a190da5fd454d61f8e46a33114a1ccd0 not found: ID does not exist" containerID="4df93d4dc2ebaa7aed01d39e44292956a190da5fd454d61f8e46a33114a1ccd0" Mar 11 01:08:34 crc kubenswrapper[4744]: I0311 01:08:34.552785 4744 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4df93d4dc2ebaa7aed01d39e44292956a190da5fd454d61f8e46a33114a1ccd0"} err="failed to get container status \"4df93d4dc2ebaa7aed01d39e44292956a190da5fd454d61f8e46a33114a1ccd0\": rpc error: code = NotFound desc = could 
not find container \"4df93d4dc2ebaa7aed01d39e44292956a190da5fd454d61f8e46a33114a1ccd0\": container with ID starting with 4df93d4dc2ebaa7aed01d39e44292956a190da5fd454d61f8e46a33114a1ccd0 not found: ID does not exist" Mar 11 01:08:34 crc kubenswrapper[4744]: I0311 01:08:34.552798 4744 scope.go:117] "RemoveContainer" containerID="742e3b6ac0accae72e9285636ab0920db1d3e0625d6093adf1606f57b484b139" Mar 11 01:08:34 crc kubenswrapper[4744]: E0311 01:08:34.552996 4744 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"742e3b6ac0accae72e9285636ab0920db1d3e0625d6093adf1606f57b484b139\": container with ID starting with 742e3b6ac0accae72e9285636ab0920db1d3e0625d6093adf1606f57b484b139 not found: ID does not exist" containerID="742e3b6ac0accae72e9285636ab0920db1d3e0625d6093adf1606f57b484b139" Mar 11 01:08:34 crc kubenswrapper[4744]: I0311 01:08:34.553023 4744 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"742e3b6ac0accae72e9285636ab0920db1d3e0625d6093adf1606f57b484b139"} err="failed to get container status \"742e3b6ac0accae72e9285636ab0920db1d3e0625d6093adf1606f57b484b139\": rpc error: code = NotFound desc = could not find container \"742e3b6ac0accae72e9285636ab0920db1d3e0625d6093adf1606f57b484b139\": container with ID starting with 742e3b6ac0accae72e9285636ab0920db1d3e0625d6093adf1606f57b484b139 not found: ID does not exist" Mar 11 01:08:34 crc kubenswrapper[4744]: I0311 01:08:34.553036 4744 scope.go:117] "RemoveContainer" containerID="98ba07651ae84a42f92411fee85a15562e325c61ff2d8fcba4962d21c51fce73" Mar 11 01:08:34 crc kubenswrapper[4744]: E0311 01:08:34.553496 4744 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"98ba07651ae84a42f92411fee85a15562e325c61ff2d8fcba4962d21c51fce73\": container with ID starting with 98ba07651ae84a42f92411fee85a15562e325c61ff2d8fcba4962d21c51fce73 not found: 
ID does not exist" containerID="98ba07651ae84a42f92411fee85a15562e325c61ff2d8fcba4962d21c51fce73" Mar 11 01:08:34 crc kubenswrapper[4744]: I0311 01:08:34.553552 4744 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"98ba07651ae84a42f92411fee85a15562e325c61ff2d8fcba4962d21c51fce73"} err="failed to get container status \"98ba07651ae84a42f92411fee85a15562e325c61ff2d8fcba4962d21c51fce73\": rpc error: code = NotFound desc = could not find container \"98ba07651ae84a42f92411fee85a15562e325c61ff2d8fcba4962d21c51fce73\": container with ID starting with 98ba07651ae84a42f92411fee85a15562e325c61ff2d8fcba4962d21c51fce73 not found: ID does not exist" Mar 11 01:08:34 crc kubenswrapper[4744]: I0311 01:08:34.553566 4744 scope.go:117] "RemoveContainer" containerID="d7cd308cadb08ea56bbc709c6401ab1d3eec24f0414a7ee1c57188c5e60adffe" Mar 11 01:08:34 crc kubenswrapper[4744]: E0311 01:08:34.553862 4744 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d7cd308cadb08ea56bbc709c6401ab1d3eec24f0414a7ee1c57188c5e60adffe\": container with ID starting with d7cd308cadb08ea56bbc709c6401ab1d3eec24f0414a7ee1c57188c5e60adffe not found: ID does not exist" containerID="d7cd308cadb08ea56bbc709c6401ab1d3eec24f0414a7ee1c57188c5e60adffe" Mar 11 01:08:34 crc kubenswrapper[4744]: I0311 01:08:34.553882 4744 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d7cd308cadb08ea56bbc709c6401ab1d3eec24f0414a7ee1c57188c5e60adffe"} err="failed to get container status \"d7cd308cadb08ea56bbc709c6401ab1d3eec24f0414a7ee1c57188c5e60adffe\": rpc error: code = NotFound desc = could not find container \"d7cd308cadb08ea56bbc709c6401ab1d3eec24f0414a7ee1c57188c5e60adffe\": container with ID starting with d7cd308cadb08ea56bbc709c6401ab1d3eec24f0414a7ee1c57188c5e60adffe not found: ID does not exist" Mar 11 01:08:34 crc kubenswrapper[4744]: I0311 01:08:34.553894 4744 
scope.go:117] "RemoveContainer" containerID="e37e770ef43eabf7e0bb162d7e700eaa2d61c7ceca9737de5872c1caeec06774" Mar 11 01:08:34 crc kubenswrapper[4744]: E0311 01:08:34.554075 4744 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e37e770ef43eabf7e0bb162d7e700eaa2d61c7ceca9737de5872c1caeec06774\": container with ID starting with e37e770ef43eabf7e0bb162d7e700eaa2d61c7ceca9737de5872c1caeec06774 not found: ID does not exist" containerID="e37e770ef43eabf7e0bb162d7e700eaa2d61c7ceca9737de5872c1caeec06774" Mar 11 01:08:34 crc kubenswrapper[4744]: I0311 01:08:34.554094 4744 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e37e770ef43eabf7e0bb162d7e700eaa2d61c7ceca9737de5872c1caeec06774"} err="failed to get container status \"e37e770ef43eabf7e0bb162d7e700eaa2d61c7ceca9737de5872c1caeec06774\": rpc error: code = NotFound desc = could not find container \"e37e770ef43eabf7e0bb162d7e700eaa2d61c7ceca9737de5872c1caeec06774\": container with ID starting with e37e770ef43eabf7e0bb162d7e700eaa2d61c7ceca9737de5872c1caeec06774 not found: ID does not exist" Mar 11 01:08:34 crc kubenswrapper[4744]: I0311 01:08:34.554106 4744 scope.go:117] "RemoveContainer" containerID="8132bd89cd281c66b2735b3fa85c3fc773e7140d6aaeaefe0c0a2c3e3f743080" Mar 11 01:08:34 crc kubenswrapper[4744]: E0311 01:08:34.554334 4744 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8132bd89cd281c66b2735b3fa85c3fc773e7140d6aaeaefe0c0a2c3e3f743080\": container with ID starting with 8132bd89cd281c66b2735b3fa85c3fc773e7140d6aaeaefe0c0a2c3e3f743080 not found: ID does not exist" containerID="8132bd89cd281c66b2735b3fa85c3fc773e7140d6aaeaefe0c0a2c3e3f743080" Mar 11 01:08:34 crc kubenswrapper[4744]: I0311 01:08:34.554354 4744 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"8132bd89cd281c66b2735b3fa85c3fc773e7140d6aaeaefe0c0a2c3e3f743080"} err="failed to get container status \"8132bd89cd281c66b2735b3fa85c3fc773e7140d6aaeaefe0c0a2c3e3f743080\": rpc error: code = NotFound desc = could not find container \"8132bd89cd281c66b2735b3fa85c3fc773e7140d6aaeaefe0c0a2c3e3f743080\": container with ID starting with 8132bd89cd281c66b2735b3fa85c3fc773e7140d6aaeaefe0c0a2c3e3f743080 not found: ID does not exist" Mar 11 01:08:34 crc kubenswrapper[4744]: I0311 01:08:34.554366 4744 scope.go:117] "RemoveContainer" containerID="84d4ada431e7a91cc86b5cb9bba4316c233c5f666f21bce1d8b6df0019b6ad76" Mar 11 01:08:34 crc kubenswrapper[4744]: E0311 01:08:34.554581 4744 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"84d4ada431e7a91cc86b5cb9bba4316c233c5f666f21bce1d8b6df0019b6ad76\": container with ID starting with 84d4ada431e7a91cc86b5cb9bba4316c233c5f666f21bce1d8b6df0019b6ad76 not found: ID does not exist" containerID="84d4ada431e7a91cc86b5cb9bba4316c233c5f666f21bce1d8b6df0019b6ad76" Mar 11 01:08:34 crc kubenswrapper[4744]: I0311 01:08:34.554600 4744 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"84d4ada431e7a91cc86b5cb9bba4316c233c5f666f21bce1d8b6df0019b6ad76"} err="failed to get container status \"84d4ada431e7a91cc86b5cb9bba4316c233c5f666f21bce1d8b6df0019b6ad76\": rpc error: code = NotFound desc = could not find container \"84d4ada431e7a91cc86b5cb9bba4316c233c5f666f21bce1d8b6df0019b6ad76\": container with ID starting with 84d4ada431e7a91cc86b5cb9bba4316c233c5f666f21bce1d8b6df0019b6ad76 not found: ID does not exist" Mar 11 01:08:34 crc kubenswrapper[4744]: I0311 01:08:34.554612 4744 scope.go:117] "RemoveContainer" containerID="27bc44383b34aea31ffcebae0f1f464c31918edc0c37f01d6bd66166e34fd4a8" Mar 11 01:08:34 crc kubenswrapper[4744]: E0311 01:08:34.554928 4744 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"27bc44383b34aea31ffcebae0f1f464c31918edc0c37f01d6bd66166e34fd4a8\": container with ID starting with 27bc44383b34aea31ffcebae0f1f464c31918edc0c37f01d6bd66166e34fd4a8 not found: ID does not exist" containerID="27bc44383b34aea31ffcebae0f1f464c31918edc0c37f01d6bd66166e34fd4a8" Mar 11 01:08:34 crc kubenswrapper[4744]: I0311 01:08:34.554947 4744 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"27bc44383b34aea31ffcebae0f1f464c31918edc0c37f01d6bd66166e34fd4a8"} err="failed to get container status \"27bc44383b34aea31ffcebae0f1f464c31918edc0c37f01d6bd66166e34fd4a8\": rpc error: code = NotFound desc = could not find container \"27bc44383b34aea31ffcebae0f1f464c31918edc0c37f01d6bd66166e34fd4a8\": container with ID starting with 27bc44383b34aea31ffcebae0f1f464c31918edc0c37f01d6bd66166e34fd4a8 not found: ID does not exist" Mar 11 01:08:34 crc kubenswrapper[4744]: I0311 01:08:34.554961 4744 scope.go:117] "RemoveContainer" containerID="524c79ee3bb53b187166cbc6bf19cbb3311ab561608a1d3c819288c1fb684481" Mar 11 01:08:34 crc kubenswrapper[4744]: E0311 01:08:34.555133 4744 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"524c79ee3bb53b187166cbc6bf19cbb3311ab561608a1d3c819288c1fb684481\": container with ID starting with 524c79ee3bb53b187166cbc6bf19cbb3311ab561608a1d3c819288c1fb684481 not found: ID does not exist" containerID="524c79ee3bb53b187166cbc6bf19cbb3311ab561608a1d3c819288c1fb684481" Mar 11 01:08:34 crc kubenswrapper[4744]: I0311 01:08:34.555151 4744 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"524c79ee3bb53b187166cbc6bf19cbb3311ab561608a1d3c819288c1fb684481"} err="failed to get container status \"524c79ee3bb53b187166cbc6bf19cbb3311ab561608a1d3c819288c1fb684481\": rpc error: code = NotFound desc = could not find container 
\"524c79ee3bb53b187166cbc6bf19cbb3311ab561608a1d3c819288c1fb684481\": container with ID starting with 524c79ee3bb53b187166cbc6bf19cbb3311ab561608a1d3c819288c1fb684481 not found: ID does not exist" Mar 11 01:08:34 crc kubenswrapper[4744]: I0311 01:08:34.555170 4744 scope.go:117] "RemoveContainer" containerID="b54b226b6e5d010f5666be32c5e89383a99ed40a1a63d82cfb4e37b9337437ce" Mar 11 01:08:34 crc kubenswrapper[4744]: I0311 01:08:34.555334 4744 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b54b226b6e5d010f5666be32c5e89383a99ed40a1a63d82cfb4e37b9337437ce"} err="failed to get container status \"b54b226b6e5d010f5666be32c5e89383a99ed40a1a63d82cfb4e37b9337437ce\": rpc error: code = NotFound desc = could not find container \"b54b226b6e5d010f5666be32c5e89383a99ed40a1a63d82cfb4e37b9337437ce\": container with ID starting with b54b226b6e5d010f5666be32c5e89383a99ed40a1a63d82cfb4e37b9337437ce not found: ID does not exist" Mar 11 01:08:34 crc kubenswrapper[4744]: I0311 01:08:34.555351 4744 scope.go:117] "RemoveContainer" containerID="4df93d4dc2ebaa7aed01d39e44292956a190da5fd454d61f8e46a33114a1ccd0" Mar 11 01:08:34 crc kubenswrapper[4744]: I0311 01:08:34.555753 4744 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4df93d4dc2ebaa7aed01d39e44292956a190da5fd454d61f8e46a33114a1ccd0"} err="failed to get container status \"4df93d4dc2ebaa7aed01d39e44292956a190da5fd454d61f8e46a33114a1ccd0\": rpc error: code = NotFound desc = could not find container \"4df93d4dc2ebaa7aed01d39e44292956a190da5fd454d61f8e46a33114a1ccd0\": container with ID starting with 4df93d4dc2ebaa7aed01d39e44292956a190da5fd454d61f8e46a33114a1ccd0 not found: ID does not exist" Mar 11 01:08:34 crc kubenswrapper[4744]: I0311 01:08:34.555772 4744 scope.go:117] "RemoveContainer" containerID="742e3b6ac0accae72e9285636ab0920db1d3e0625d6093adf1606f57b484b139" Mar 11 01:08:34 crc kubenswrapper[4744]: I0311 01:08:34.556946 4744 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"742e3b6ac0accae72e9285636ab0920db1d3e0625d6093adf1606f57b484b139"} err="failed to get container status \"742e3b6ac0accae72e9285636ab0920db1d3e0625d6093adf1606f57b484b139\": rpc error: code = NotFound desc = could not find container \"742e3b6ac0accae72e9285636ab0920db1d3e0625d6093adf1606f57b484b139\": container with ID starting with 742e3b6ac0accae72e9285636ab0920db1d3e0625d6093adf1606f57b484b139 not found: ID does not exist" Mar 11 01:08:34 crc kubenswrapper[4744]: I0311 01:08:34.556993 4744 scope.go:117] "RemoveContainer" containerID="98ba07651ae84a42f92411fee85a15562e325c61ff2d8fcba4962d21c51fce73" Mar 11 01:08:34 crc kubenswrapper[4744]: I0311 01:08:34.557626 4744 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"98ba07651ae84a42f92411fee85a15562e325c61ff2d8fcba4962d21c51fce73"} err="failed to get container status \"98ba07651ae84a42f92411fee85a15562e325c61ff2d8fcba4962d21c51fce73\": rpc error: code = NotFound desc = could not find container \"98ba07651ae84a42f92411fee85a15562e325c61ff2d8fcba4962d21c51fce73\": container with ID starting with 98ba07651ae84a42f92411fee85a15562e325c61ff2d8fcba4962d21c51fce73 not found: ID does not exist" Mar 11 01:08:34 crc kubenswrapper[4744]: I0311 01:08:34.557675 4744 scope.go:117] "RemoveContainer" containerID="d7cd308cadb08ea56bbc709c6401ab1d3eec24f0414a7ee1c57188c5e60adffe" Mar 11 01:08:34 crc kubenswrapper[4744]: I0311 01:08:34.557987 4744 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d7cd308cadb08ea56bbc709c6401ab1d3eec24f0414a7ee1c57188c5e60adffe"} err="failed to get container status \"d7cd308cadb08ea56bbc709c6401ab1d3eec24f0414a7ee1c57188c5e60adffe\": rpc error: code = NotFound desc = could not find container \"d7cd308cadb08ea56bbc709c6401ab1d3eec24f0414a7ee1c57188c5e60adffe\": container with ID starting with 
d7cd308cadb08ea56bbc709c6401ab1d3eec24f0414a7ee1c57188c5e60adffe not found: ID does not exist" Mar 11 01:08:34 crc kubenswrapper[4744]: I0311 01:08:34.558021 4744 scope.go:117] "RemoveContainer" containerID="e37e770ef43eabf7e0bb162d7e700eaa2d61c7ceca9737de5872c1caeec06774" Mar 11 01:08:34 crc kubenswrapper[4744]: I0311 01:08:34.558546 4744 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e37e770ef43eabf7e0bb162d7e700eaa2d61c7ceca9737de5872c1caeec06774"} err="failed to get container status \"e37e770ef43eabf7e0bb162d7e700eaa2d61c7ceca9737de5872c1caeec06774\": rpc error: code = NotFound desc = could not find container \"e37e770ef43eabf7e0bb162d7e700eaa2d61c7ceca9737de5872c1caeec06774\": container with ID starting with e37e770ef43eabf7e0bb162d7e700eaa2d61c7ceca9737de5872c1caeec06774 not found: ID does not exist" Mar 11 01:08:34 crc kubenswrapper[4744]: I0311 01:08:34.558564 4744 scope.go:117] "RemoveContainer" containerID="8132bd89cd281c66b2735b3fa85c3fc773e7140d6aaeaefe0c0a2c3e3f743080" Mar 11 01:08:34 crc kubenswrapper[4744]: I0311 01:08:34.558834 4744 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8132bd89cd281c66b2735b3fa85c3fc773e7140d6aaeaefe0c0a2c3e3f743080"} err="failed to get container status \"8132bd89cd281c66b2735b3fa85c3fc773e7140d6aaeaefe0c0a2c3e3f743080\": rpc error: code = NotFound desc = could not find container \"8132bd89cd281c66b2735b3fa85c3fc773e7140d6aaeaefe0c0a2c3e3f743080\": container with ID starting with 8132bd89cd281c66b2735b3fa85c3fc773e7140d6aaeaefe0c0a2c3e3f743080 not found: ID does not exist" Mar 11 01:08:34 crc kubenswrapper[4744]: I0311 01:08:34.558850 4744 scope.go:117] "RemoveContainer" containerID="84d4ada431e7a91cc86b5cb9bba4316c233c5f666f21bce1d8b6df0019b6ad76" Mar 11 01:08:34 crc kubenswrapper[4744]: I0311 01:08:34.559626 4744 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"84d4ada431e7a91cc86b5cb9bba4316c233c5f666f21bce1d8b6df0019b6ad76"} err="failed to get container status \"84d4ada431e7a91cc86b5cb9bba4316c233c5f666f21bce1d8b6df0019b6ad76\": rpc error: code = NotFound desc = could not find container \"84d4ada431e7a91cc86b5cb9bba4316c233c5f666f21bce1d8b6df0019b6ad76\": container with ID starting with 84d4ada431e7a91cc86b5cb9bba4316c233c5f666f21bce1d8b6df0019b6ad76 not found: ID does not exist" Mar 11 01:08:34 crc kubenswrapper[4744]: I0311 01:08:34.559643 4744 scope.go:117] "RemoveContainer" containerID="27bc44383b34aea31ffcebae0f1f464c31918edc0c37f01d6bd66166e34fd4a8" Mar 11 01:08:34 crc kubenswrapper[4744]: I0311 01:08:34.559826 4744 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"27bc44383b34aea31ffcebae0f1f464c31918edc0c37f01d6bd66166e34fd4a8"} err="failed to get container status \"27bc44383b34aea31ffcebae0f1f464c31918edc0c37f01d6bd66166e34fd4a8\": rpc error: code = NotFound desc = could not find container \"27bc44383b34aea31ffcebae0f1f464c31918edc0c37f01d6bd66166e34fd4a8\": container with ID starting with 27bc44383b34aea31ffcebae0f1f464c31918edc0c37f01d6bd66166e34fd4a8 not found: ID does not exist" Mar 11 01:08:34 crc kubenswrapper[4744]: I0311 01:08:34.559842 4744 scope.go:117] "RemoveContainer" containerID="524c79ee3bb53b187166cbc6bf19cbb3311ab561608a1d3c819288c1fb684481" Mar 11 01:08:34 crc kubenswrapper[4744]: I0311 01:08:34.560002 4744 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"524c79ee3bb53b187166cbc6bf19cbb3311ab561608a1d3c819288c1fb684481"} err="failed to get container status \"524c79ee3bb53b187166cbc6bf19cbb3311ab561608a1d3c819288c1fb684481\": rpc error: code = NotFound desc = could not find container \"524c79ee3bb53b187166cbc6bf19cbb3311ab561608a1d3c819288c1fb684481\": container with ID starting with 524c79ee3bb53b187166cbc6bf19cbb3311ab561608a1d3c819288c1fb684481 not found: ID does not 
exist" Mar 11 01:08:34 crc kubenswrapper[4744]: I0311 01:08:34.560019 4744 scope.go:117] "RemoveContainer" containerID="b54b226b6e5d010f5666be32c5e89383a99ed40a1a63d82cfb4e37b9337437ce" Mar 11 01:08:34 crc kubenswrapper[4744]: I0311 01:08:34.560200 4744 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b54b226b6e5d010f5666be32c5e89383a99ed40a1a63d82cfb4e37b9337437ce"} err="failed to get container status \"b54b226b6e5d010f5666be32c5e89383a99ed40a1a63d82cfb4e37b9337437ce\": rpc error: code = NotFound desc = could not find container \"b54b226b6e5d010f5666be32c5e89383a99ed40a1a63d82cfb4e37b9337437ce\": container with ID starting with b54b226b6e5d010f5666be32c5e89383a99ed40a1a63d82cfb4e37b9337437ce not found: ID does not exist" Mar 11 01:08:34 crc kubenswrapper[4744]: I0311 01:08:34.560233 4744 scope.go:117] "RemoveContainer" containerID="4df93d4dc2ebaa7aed01d39e44292956a190da5fd454d61f8e46a33114a1ccd0" Mar 11 01:08:34 crc kubenswrapper[4744]: I0311 01:08:34.560410 4744 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4df93d4dc2ebaa7aed01d39e44292956a190da5fd454d61f8e46a33114a1ccd0"} err="failed to get container status \"4df93d4dc2ebaa7aed01d39e44292956a190da5fd454d61f8e46a33114a1ccd0\": rpc error: code = NotFound desc = could not find container \"4df93d4dc2ebaa7aed01d39e44292956a190da5fd454d61f8e46a33114a1ccd0\": container with ID starting with 4df93d4dc2ebaa7aed01d39e44292956a190da5fd454d61f8e46a33114a1ccd0 not found: ID does not exist" Mar 11 01:08:34 crc kubenswrapper[4744]: I0311 01:08:34.560426 4744 scope.go:117] "RemoveContainer" containerID="742e3b6ac0accae72e9285636ab0920db1d3e0625d6093adf1606f57b484b139" Mar 11 01:08:34 crc kubenswrapper[4744]: I0311 01:08:34.560644 4744 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"742e3b6ac0accae72e9285636ab0920db1d3e0625d6093adf1606f57b484b139"} err="failed to get container status 
\"742e3b6ac0accae72e9285636ab0920db1d3e0625d6093adf1606f57b484b139\": rpc error: code = NotFound desc = could not find container \"742e3b6ac0accae72e9285636ab0920db1d3e0625d6093adf1606f57b484b139\": container with ID starting with 742e3b6ac0accae72e9285636ab0920db1d3e0625d6093adf1606f57b484b139 not found: ID does not exist" Mar 11 01:08:34 crc kubenswrapper[4744]: I0311 01:08:34.560659 4744 scope.go:117] "RemoveContainer" containerID="98ba07651ae84a42f92411fee85a15562e325c61ff2d8fcba4962d21c51fce73" Mar 11 01:08:34 crc kubenswrapper[4744]: I0311 01:08:34.560874 4744 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"98ba07651ae84a42f92411fee85a15562e325c61ff2d8fcba4962d21c51fce73"} err="failed to get container status \"98ba07651ae84a42f92411fee85a15562e325c61ff2d8fcba4962d21c51fce73\": rpc error: code = NotFound desc = could not find container \"98ba07651ae84a42f92411fee85a15562e325c61ff2d8fcba4962d21c51fce73\": container with ID starting with 98ba07651ae84a42f92411fee85a15562e325c61ff2d8fcba4962d21c51fce73 not found: ID does not exist" Mar 11 01:08:34 crc kubenswrapper[4744]: I0311 01:08:34.560889 4744 scope.go:117] "RemoveContainer" containerID="d7cd308cadb08ea56bbc709c6401ab1d3eec24f0414a7ee1c57188c5e60adffe" Mar 11 01:08:34 crc kubenswrapper[4744]: I0311 01:08:34.561217 4744 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d7cd308cadb08ea56bbc709c6401ab1d3eec24f0414a7ee1c57188c5e60adffe"} err="failed to get container status \"d7cd308cadb08ea56bbc709c6401ab1d3eec24f0414a7ee1c57188c5e60adffe\": rpc error: code = NotFound desc = could not find container \"d7cd308cadb08ea56bbc709c6401ab1d3eec24f0414a7ee1c57188c5e60adffe\": container with ID starting with d7cd308cadb08ea56bbc709c6401ab1d3eec24f0414a7ee1c57188c5e60adffe not found: ID does not exist" Mar 11 01:08:34 crc kubenswrapper[4744]: I0311 01:08:34.561253 4744 scope.go:117] "RemoveContainer" 
containerID="e37e770ef43eabf7e0bb162d7e700eaa2d61c7ceca9737de5872c1caeec06774" Mar 11 01:08:34 crc kubenswrapper[4744]: I0311 01:08:34.561441 4744 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e37e770ef43eabf7e0bb162d7e700eaa2d61c7ceca9737de5872c1caeec06774"} err="failed to get container status \"e37e770ef43eabf7e0bb162d7e700eaa2d61c7ceca9737de5872c1caeec06774\": rpc error: code = NotFound desc = could not find container \"e37e770ef43eabf7e0bb162d7e700eaa2d61c7ceca9737de5872c1caeec06774\": container with ID starting with e37e770ef43eabf7e0bb162d7e700eaa2d61c7ceca9737de5872c1caeec06774 not found: ID does not exist" Mar 11 01:08:34 crc kubenswrapper[4744]: I0311 01:08:34.561456 4744 scope.go:117] "RemoveContainer" containerID="8132bd89cd281c66b2735b3fa85c3fc773e7140d6aaeaefe0c0a2c3e3f743080" Mar 11 01:08:34 crc kubenswrapper[4744]: I0311 01:08:34.561819 4744 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8132bd89cd281c66b2735b3fa85c3fc773e7140d6aaeaefe0c0a2c3e3f743080"} err="failed to get container status \"8132bd89cd281c66b2735b3fa85c3fc773e7140d6aaeaefe0c0a2c3e3f743080\": rpc error: code = NotFound desc = could not find container \"8132bd89cd281c66b2735b3fa85c3fc773e7140d6aaeaefe0c0a2c3e3f743080\": container with ID starting with 8132bd89cd281c66b2735b3fa85c3fc773e7140d6aaeaefe0c0a2c3e3f743080 not found: ID does not exist" Mar 11 01:08:34 crc kubenswrapper[4744]: I0311 01:08:34.562068 4744 scope.go:117] "RemoveContainer" containerID="84d4ada431e7a91cc86b5cb9bba4316c233c5f666f21bce1d8b6df0019b6ad76" Mar 11 01:08:34 crc kubenswrapper[4744]: I0311 01:08:34.562502 4744 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"84d4ada431e7a91cc86b5cb9bba4316c233c5f666f21bce1d8b6df0019b6ad76"} err="failed to get container status \"84d4ada431e7a91cc86b5cb9bba4316c233c5f666f21bce1d8b6df0019b6ad76\": rpc error: code = NotFound desc = could 
not find container \"84d4ada431e7a91cc86b5cb9bba4316c233c5f666f21bce1d8b6df0019b6ad76\": container with ID starting with 84d4ada431e7a91cc86b5cb9bba4316c233c5f666f21bce1d8b6df0019b6ad76 not found: ID does not exist" Mar 11 01:08:34 crc kubenswrapper[4744]: I0311 01:08:34.562579 4744 scope.go:117] "RemoveContainer" containerID="27bc44383b34aea31ffcebae0f1f464c31918edc0c37f01d6bd66166e34fd4a8" Mar 11 01:08:34 crc kubenswrapper[4744]: I0311 01:08:34.562856 4744 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"27bc44383b34aea31ffcebae0f1f464c31918edc0c37f01d6bd66166e34fd4a8"} err="failed to get container status \"27bc44383b34aea31ffcebae0f1f464c31918edc0c37f01d6bd66166e34fd4a8\": rpc error: code = NotFound desc = could not find container \"27bc44383b34aea31ffcebae0f1f464c31918edc0c37f01d6bd66166e34fd4a8\": container with ID starting with 27bc44383b34aea31ffcebae0f1f464c31918edc0c37f01d6bd66166e34fd4a8 not found: ID does not exist" Mar 11 01:08:34 crc kubenswrapper[4744]: I0311 01:08:34.562879 4744 scope.go:117] "RemoveContainer" containerID="524c79ee3bb53b187166cbc6bf19cbb3311ab561608a1d3c819288c1fb684481" Mar 11 01:08:34 crc kubenswrapper[4744]: I0311 01:08:34.563140 4744 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"524c79ee3bb53b187166cbc6bf19cbb3311ab561608a1d3c819288c1fb684481"} err="failed to get container status \"524c79ee3bb53b187166cbc6bf19cbb3311ab561608a1d3c819288c1fb684481\": rpc error: code = NotFound desc = could not find container \"524c79ee3bb53b187166cbc6bf19cbb3311ab561608a1d3c819288c1fb684481\": container with ID starting with 524c79ee3bb53b187166cbc6bf19cbb3311ab561608a1d3c819288c1fb684481 not found: ID does not exist" Mar 11 01:08:34 crc kubenswrapper[4744]: I0311 01:08:34.563161 4744 scope.go:117] "RemoveContainer" containerID="b54b226b6e5d010f5666be32c5e89383a99ed40a1a63d82cfb4e37b9337437ce" Mar 11 01:08:34 crc kubenswrapper[4744]: I0311 
01:08:34.563419 4744 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b54b226b6e5d010f5666be32c5e89383a99ed40a1a63d82cfb4e37b9337437ce"} err="failed to get container status \"b54b226b6e5d010f5666be32c5e89383a99ed40a1a63d82cfb4e37b9337437ce\": rpc error: code = NotFound desc = could not find container \"b54b226b6e5d010f5666be32c5e89383a99ed40a1a63d82cfb4e37b9337437ce\": container with ID starting with b54b226b6e5d010f5666be32c5e89383a99ed40a1a63d82cfb4e37b9337437ce not found: ID does not exist" Mar 11 01:08:34 crc kubenswrapper[4744]: I0311 01:08:34.563456 4744 scope.go:117] "RemoveContainer" containerID="4df93d4dc2ebaa7aed01d39e44292956a190da5fd454d61f8e46a33114a1ccd0" Mar 11 01:08:34 crc kubenswrapper[4744]: I0311 01:08:34.563697 4744 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4df93d4dc2ebaa7aed01d39e44292956a190da5fd454d61f8e46a33114a1ccd0"} err="failed to get container status \"4df93d4dc2ebaa7aed01d39e44292956a190da5fd454d61f8e46a33114a1ccd0\": rpc error: code = NotFound desc = could not find container \"4df93d4dc2ebaa7aed01d39e44292956a190da5fd454d61f8e46a33114a1ccd0\": container with ID starting with 4df93d4dc2ebaa7aed01d39e44292956a190da5fd454d61f8e46a33114a1ccd0 not found: ID does not exist" Mar 11 01:08:34 crc kubenswrapper[4744]: I0311 01:08:34.563716 4744 scope.go:117] "RemoveContainer" containerID="742e3b6ac0accae72e9285636ab0920db1d3e0625d6093adf1606f57b484b139" Mar 11 01:08:34 crc kubenswrapper[4744]: I0311 01:08:34.564050 4744 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"742e3b6ac0accae72e9285636ab0920db1d3e0625d6093adf1606f57b484b139"} err="failed to get container status \"742e3b6ac0accae72e9285636ab0920db1d3e0625d6093adf1606f57b484b139\": rpc error: code = NotFound desc = could not find container \"742e3b6ac0accae72e9285636ab0920db1d3e0625d6093adf1606f57b484b139\": container with ID starting with 
742e3b6ac0accae72e9285636ab0920db1d3e0625d6093adf1606f57b484b139 not found: ID does not exist" Mar 11 01:08:34 crc kubenswrapper[4744]: I0311 01:08:34.564089 4744 scope.go:117] "RemoveContainer" containerID="98ba07651ae84a42f92411fee85a15562e325c61ff2d8fcba4962d21c51fce73" Mar 11 01:08:34 crc kubenswrapper[4744]: I0311 01:08:34.564336 4744 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"98ba07651ae84a42f92411fee85a15562e325c61ff2d8fcba4962d21c51fce73"} err="failed to get container status \"98ba07651ae84a42f92411fee85a15562e325c61ff2d8fcba4962d21c51fce73\": rpc error: code = NotFound desc = could not find container \"98ba07651ae84a42f92411fee85a15562e325c61ff2d8fcba4962d21c51fce73\": container with ID starting with 98ba07651ae84a42f92411fee85a15562e325c61ff2d8fcba4962d21c51fce73 not found: ID does not exist" Mar 11 01:08:34 crc kubenswrapper[4744]: I0311 01:08:34.564358 4744 scope.go:117] "RemoveContainer" containerID="d7cd308cadb08ea56bbc709c6401ab1d3eec24f0414a7ee1c57188c5e60adffe" Mar 11 01:08:34 crc kubenswrapper[4744]: I0311 01:08:34.564577 4744 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d7cd308cadb08ea56bbc709c6401ab1d3eec24f0414a7ee1c57188c5e60adffe"} err="failed to get container status \"d7cd308cadb08ea56bbc709c6401ab1d3eec24f0414a7ee1c57188c5e60adffe\": rpc error: code = NotFound desc = could not find container \"d7cd308cadb08ea56bbc709c6401ab1d3eec24f0414a7ee1c57188c5e60adffe\": container with ID starting with d7cd308cadb08ea56bbc709c6401ab1d3eec24f0414a7ee1c57188c5e60adffe not found: ID does not exist" Mar 11 01:08:34 crc kubenswrapper[4744]: I0311 01:08:34.564595 4744 scope.go:117] "RemoveContainer" containerID="e37e770ef43eabf7e0bb162d7e700eaa2d61c7ceca9737de5872c1caeec06774" Mar 11 01:08:34 crc kubenswrapper[4744]: I0311 01:08:34.564812 4744 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"e37e770ef43eabf7e0bb162d7e700eaa2d61c7ceca9737de5872c1caeec06774"} err="failed to get container status \"e37e770ef43eabf7e0bb162d7e700eaa2d61c7ceca9737de5872c1caeec06774\": rpc error: code = NotFound desc = could not find container \"e37e770ef43eabf7e0bb162d7e700eaa2d61c7ceca9737de5872c1caeec06774\": container with ID starting with e37e770ef43eabf7e0bb162d7e700eaa2d61c7ceca9737de5872c1caeec06774 not found: ID does not exist" Mar 11 01:08:34 crc kubenswrapper[4744]: I0311 01:08:34.564828 4744 scope.go:117] "RemoveContainer" containerID="8132bd89cd281c66b2735b3fa85c3fc773e7140d6aaeaefe0c0a2c3e3f743080" Mar 11 01:08:34 crc kubenswrapper[4744]: I0311 01:08:34.565064 4744 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8132bd89cd281c66b2735b3fa85c3fc773e7140d6aaeaefe0c0a2c3e3f743080"} err="failed to get container status \"8132bd89cd281c66b2735b3fa85c3fc773e7140d6aaeaefe0c0a2c3e3f743080\": rpc error: code = NotFound desc = could not find container \"8132bd89cd281c66b2735b3fa85c3fc773e7140d6aaeaefe0c0a2c3e3f743080\": container with ID starting with 8132bd89cd281c66b2735b3fa85c3fc773e7140d6aaeaefe0c0a2c3e3f743080 not found: ID does not exist" Mar 11 01:08:34 crc kubenswrapper[4744]: I0311 01:08:34.565091 4744 scope.go:117] "RemoveContainer" containerID="84d4ada431e7a91cc86b5cb9bba4316c233c5f666f21bce1d8b6df0019b6ad76" Mar 11 01:08:34 crc kubenswrapper[4744]: I0311 01:08:34.565346 4744 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"84d4ada431e7a91cc86b5cb9bba4316c233c5f666f21bce1d8b6df0019b6ad76"} err="failed to get container status \"84d4ada431e7a91cc86b5cb9bba4316c233c5f666f21bce1d8b6df0019b6ad76\": rpc error: code = NotFound desc = could not find container \"84d4ada431e7a91cc86b5cb9bba4316c233c5f666f21bce1d8b6df0019b6ad76\": container with ID starting with 84d4ada431e7a91cc86b5cb9bba4316c233c5f666f21bce1d8b6df0019b6ad76 not found: ID does not 
exist" Mar 11 01:08:34 crc kubenswrapper[4744]: I0311 01:08:34.565374 4744 scope.go:117] "RemoveContainer" containerID="27bc44383b34aea31ffcebae0f1f464c31918edc0c37f01d6bd66166e34fd4a8" Mar 11 01:08:34 crc kubenswrapper[4744]: I0311 01:08:34.565606 4744 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"27bc44383b34aea31ffcebae0f1f464c31918edc0c37f01d6bd66166e34fd4a8"} err="failed to get container status \"27bc44383b34aea31ffcebae0f1f464c31918edc0c37f01d6bd66166e34fd4a8\": rpc error: code = NotFound desc = could not find container \"27bc44383b34aea31ffcebae0f1f464c31918edc0c37f01d6bd66166e34fd4a8\": container with ID starting with 27bc44383b34aea31ffcebae0f1f464c31918edc0c37f01d6bd66166e34fd4a8 not found: ID does not exist" Mar 11 01:08:34 crc kubenswrapper[4744]: I0311 01:08:34.565624 4744 scope.go:117] "RemoveContainer" containerID="524c79ee3bb53b187166cbc6bf19cbb3311ab561608a1d3c819288c1fb684481" Mar 11 01:08:34 crc kubenswrapper[4744]: I0311 01:08:34.565852 4744 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"524c79ee3bb53b187166cbc6bf19cbb3311ab561608a1d3c819288c1fb684481"} err="failed to get container status \"524c79ee3bb53b187166cbc6bf19cbb3311ab561608a1d3c819288c1fb684481\": rpc error: code = NotFound desc = could not find container \"524c79ee3bb53b187166cbc6bf19cbb3311ab561608a1d3c819288c1fb684481\": container with ID starting with 524c79ee3bb53b187166cbc6bf19cbb3311ab561608a1d3c819288c1fb684481 not found: ID does not exist" Mar 11 01:08:35 crc kubenswrapper[4744]: I0311 01:08:35.250676 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-wk26z" event={"ID":"b88aed14-45de-493c-9cb1-e463c398aa02","Type":"ContainerStarted","Data":"44a4ba8c4044a897b65ddaacfe2b0c38e2ef692355ddfb0f458b86ed0efee434"} Mar 11 01:08:35 crc kubenswrapper[4744]: I0311 01:08:35.251002 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-ovn-kubernetes/ovnkube-node-wk26z" event={"ID":"b88aed14-45de-493c-9cb1-e463c398aa02","Type":"ContainerStarted","Data":"3f7bf9d7cb91bcf40a6eebe62dbcdbaf5e1d2dd1370e5fc18bb85a70c62ce929"} Mar 11 01:08:35 crc kubenswrapper[4744]: I0311 01:08:35.251017 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-wk26z" event={"ID":"b88aed14-45de-493c-9cb1-e463c398aa02","Type":"ContainerStarted","Data":"a8098aebdb8e25f19c19864c3b502d4efc49b1740a0dcae29fc61d4d80fe29d8"} Mar 11 01:08:35 crc kubenswrapper[4744]: I0311 01:08:35.251030 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-wk26z" event={"ID":"b88aed14-45de-493c-9cb1-e463c398aa02","Type":"ContainerStarted","Data":"a52e680a6af6accacc712eb2c45cb67fb30c73d8cb538337f355fcd0e9706a61"} Mar 11 01:08:35 crc kubenswrapper[4744]: I0311 01:08:35.251042 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-wk26z" event={"ID":"b88aed14-45de-493c-9cb1-e463c398aa02","Type":"ContainerStarted","Data":"f4d9af067789581ec770978d0be87ddd9e4beda2041243ab397f1aceef680995"} Mar 11 01:08:35 crc kubenswrapper[4744]: I0311 01:08:35.251053 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-wk26z" event={"ID":"b88aed14-45de-493c-9cb1-e463c398aa02","Type":"ContainerStarted","Data":"47173354a6e9513540e3e75bf257238f7f41fe49e90e218df6b02d59106522d5"} Mar 11 01:08:35 crc kubenswrapper[4744]: I0311 01:08:35.253028 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-xlclh_e16bf0f3-533b-4114-89c6-195a85273e98/kube-multus/2.log" Mar 11 01:08:35 crc kubenswrapper[4744]: I0311 01:08:35.253092 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-xlclh" 
event={"ID":"e16bf0f3-533b-4114-89c6-195a85273e98","Type":"ContainerStarted","Data":"6d89427b5e096766a7b8c0e07010f8f279896267301594bd140f3cd31f5d8ac2"} Mar 11 01:08:35 crc kubenswrapper[4744]: I0311 01:08:35.988156 4744 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6ff04e11-e747-44c5-b049-371a5d422157" path="/var/lib/kubelet/pods/6ff04e11-e747-44c5-b049-371a5d422157/volumes" Mar 11 01:08:37 crc kubenswrapper[4744]: I0311 01:08:37.276377 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-wk26z" event={"ID":"b88aed14-45de-493c-9cb1-e463c398aa02","Type":"ContainerStarted","Data":"a20e89aecd3a8215408971b268074b22a4d88cd8101d807adb12a0e785c3751b"} Mar 11 01:08:38 crc kubenswrapper[4744]: I0311 01:08:38.700862 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["crc-storage/crc-storage-crc-lkdth"] Mar 11 01:08:38 crc kubenswrapper[4744]: I0311 01:08:38.702310 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="crc-storage/crc-storage-crc-lkdth" Mar 11 01:08:38 crc kubenswrapper[4744]: I0311 01:08:38.704811 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"crc-storage"/"crc-storage" Mar 11 01:08:38 crc kubenswrapper[4744]: I0311 01:08:38.705075 4744 reflector.go:368] Caches populated for *v1.Secret from object-"crc-storage"/"crc-storage-dockercfg-9g4m8" Mar 11 01:08:38 crc kubenswrapper[4744]: I0311 01:08:38.705555 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"crc-storage"/"kube-root-ca.crt" Mar 11 01:08:38 crc kubenswrapper[4744]: I0311 01:08:38.706945 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"crc-storage"/"openshift-service-ca.crt" Mar 11 01:08:38 crc kubenswrapper[4744]: I0311 01:08:38.876078 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"crc-storage\" (UniqueName: 
\"kubernetes.io/configmap/9829f27b-c482-450d-8e09-231b8b9943bc-crc-storage\") pod \"crc-storage-crc-lkdth\" (UID: \"9829f27b-c482-450d-8e09-231b8b9943bc\") " pod="crc-storage/crc-storage-crc-lkdth" Mar 11 01:08:38 crc kubenswrapper[4744]: I0311 01:08:38.876509 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cc28n\" (UniqueName: \"kubernetes.io/projected/9829f27b-c482-450d-8e09-231b8b9943bc-kube-api-access-cc28n\") pod \"crc-storage-crc-lkdth\" (UID: \"9829f27b-c482-450d-8e09-231b8b9943bc\") " pod="crc-storage/crc-storage-crc-lkdth" Mar 11 01:08:38 crc kubenswrapper[4744]: I0311 01:08:38.876735 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/9829f27b-c482-450d-8e09-231b8b9943bc-node-mnt\") pod \"crc-storage-crc-lkdth\" (UID: \"9829f27b-c482-450d-8e09-231b8b9943bc\") " pod="crc-storage/crc-storage-crc-lkdth" Mar 11 01:08:38 crc kubenswrapper[4744]: I0311 01:08:38.978063 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/9829f27b-c482-450d-8e09-231b8b9943bc-crc-storage\") pod \"crc-storage-crc-lkdth\" (UID: \"9829f27b-c482-450d-8e09-231b8b9943bc\") " pod="crc-storage/crc-storage-crc-lkdth" Mar 11 01:08:38 crc kubenswrapper[4744]: I0311 01:08:38.978161 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cc28n\" (UniqueName: \"kubernetes.io/projected/9829f27b-c482-450d-8e09-231b8b9943bc-kube-api-access-cc28n\") pod \"crc-storage-crc-lkdth\" (UID: \"9829f27b-c482-450d-8e09-231b8b9943bc\") " pod="crc-storage/crc-storage-crc-lkdth" Mar 11 01:08:38 crc kubenswrapper[4744]: I0311 01:08:38.978211 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/9829f27b-c482-450d-8e09-231b8b9943bc-node-mnt\") 
pod \"crc-storage-crc-lkdth\" (UID: \"9829f27b-c482-450d-8e09-231b8b9943bc\") " pod="crc-storage/crc-storage-crc-lkdth" Mar 11 01:08:38 crc kubenswrapper[4744]: I0311 01:08:38.978679 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/9829f27b-c482-450d-8e09-231b8b9943bc-node-mnt\") pod \"crc-storage-crc-lkdth\" (UID: \"9829f27b-c482-450d-8e09-231b8b9943bc\") " pod="crc-storage/crc-storage-crc-lkdth" Mar 11 01:08:38 crc kubenswrapper[4744]: I0311 01:08:38.979472 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/9829f27b-c482-450d-8e09-231b8b9943bc-crc-storage\") pod \"crc-storage-crc-lkdth\" (UID: \"9829f27b-c482-450d-8e09-231b8b9943bc\") " pod="crc-storage/crc-storage-crc-lkdth" Mar 11 01:08:39 crc kubenswrapper[4744]: I0311 01:08:39.012024 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cc28n\" (UniqueName: \"kubernetes.io/projected/9829f27b-c482-450d-8e09-231b8b9943bc-kube-api-access-cc28n\") pod \"crc-storage-crc-lkdth\" (UID: \"9829f27b-c482-450d-8e09-231b8b9943bc\") " pod="crc-storage/crc-storage-crc-lkdth" Mar 11 01:08:39 crc kubenswrapper[4744]: I0311 01:08:39.024734 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="crc-storage/crc-storage-crc-lkdth" Mar 11 01:08:39 crc kubenswrapper[4744]: E0311 01:08:39.072716 4744 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_crc-storage-crc-lkdth_crc-storage_9829f27b-c482-450d-8e09-231b8b9943bc_0(f6adbffa19d081286f080caed070693c765b3f1cd5445646186cbc01504a8663): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
Mar 11 01:08:39 crc kubenswrapper[4744]: E0311 01:08:39.072801 4744 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_crc-storage-crc-lkdth_crc-storage_9829f27b-c482-450d-8e09-231b8b9943bc_0(f6adbffa19d081286f080caed070693c765b3f1cd5445646186cbc01504a8663): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="crc-storage/crc-storage-crc-lkdth" Mar 11 01:08:39 crc kubenswrapper[4744]: E0311 01:08:39.072836 4744 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_crc-storage-crc-lkdth_crc-storage_9829f27b-c482-450d-8e09-231b8b9943bc_0(f6adbffa19d081286f080caed070693c765b3f1cd5445646186cbc01504a8663): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="crc-storage/crc-storage-crc-lkdth" Mar 11 01:08:39 crc kubenswrapper[4744]: E0311 01:08:39.072902 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"crc-storage-crc-lkdth_crc-storage(9829f27b-c482-450d-8e09-231b8b9943bc)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"crc-storage-crc-lkdth_crc-storage(9829f27b-c482-450d-8e09-231b8b9943bc)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_crc-storage-crc-lkdth_crc-storage_9829f27b-c482-450d-8e09-231b8b9943bc_0(f6adbffa19d081286f080caed070693c765b3f1cd5445646186cbc01504a8663): no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\"" pod="crc-storage/crc-storage-crc-lkdth" podUID="9829f27b-c482-450d-8e09-231b8b9943bc" Mar 11 01:08:39 crc kubenswrapper[4744]: I0311 01:08:39.210253 4744 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-qg8qn" Mar 11 01:08:39 crc kubenswrapper[4744]: I0311 01:08:39.210349 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-qg8qn" Mar 11 01:08:39 crc kubenswrapper[4744]: I0311 01:08:39.269244 4744 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-qg8qn" Mar 11 01:08:39 crc kubenswrapper[4744]: I0311 01:08:39.342453 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-qg8qn" Mar 11 01:08:39 crc kubenswrapper[4744]: I0311 01:08:39.516549 4744 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-qg8qn"] Mar 11 01:08:40 crc kubenswrapper[4744]: I0311 01:08:40.309491 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-wk26z" event={"ID":"b88aed14-45de-493c-9cb1-e463c398aa02","Type":"ContainerStarted","Data":"002c23562921dc84cf101c5662192abe6ed0ad559e9d671c799604e888826964"} Mar 11 01:08:41 crc kubenswrapper[4744]: I0311 01:08:41.317349 4744 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-qg8qn" podUID="1879ad6e-72b6-4c58-a771-264acbcdcb34" containerName="registry-server" containerID="cri-o://6c9de4789455f63b4c132f30da91a8d080a54af94f42d758b116c6cef7c892e5" gracePeriod=2 Mar 11 01:08:41 crc kubenswrapper[4744]: I0311 01:08:41.317755 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-wk26z" Mar 11 01:08:41 crc kubenswrapper[4744]: I0311 01:08:41.318009 4744 
kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-wk26z" Mar 11 01:08:41 crc kubenswrapper[4744]: I0311 01:08:41.318023 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-wk26z" Mar 11 01:08:41 crc kubenswrapper[4744]: I0311 01:08:41.346069 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-wk26z" Mar 11 01:08:41 crc kubenswrapper[4744]: I0311 01:08:41.362289 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-wk26z" Mar 11 01:08:41 crc kubenswrapper[4744]: I0311 01:08:41.373401 4744 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-wk26z" podStartSLOduration=8.37338551 podStartE2EDuration="8.37338551s" podCreationTimestamp="2026-03-11 01:08:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-11 01:08:40.346174624 +0000 UTC m=+877.150392279" watchObservedRunningTime="2026-03-11 01:08:41.37338551 +0000 UTC m=+878.177603115" Mar 11 01:08:41 crc kubenswrapper[4744]: I0311 01:08:41.489770 4744 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-qg8qn" Mar 11 01:08:41 crc kubenswrapper[4744]: I0311 01:08:41.617332 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zkbmp\" (UniqueName: \"kubernetes.io/projected/1879ad6e-72b6-4c58-a771-264acbcdcb34-kube-api-access-zkbmp\") pod \"1879ad6e-72b6-4c58-a771-264acbcdcb34\" (UID: \"1879ad6e-72b6-4c58-a771-264acbcdcb34\") " Mar 11 01:08:41 crc kubenswrapper[4744]: I0311 01:08:41.617576 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1879ad6e-72b6-4c58-a771-264acbcdcb34-utilities\") pod \"1879ad6e-72b6-4c58-a771-264acbcdcb34\" (UID: \"1879ad6e-72b6-4c58-a771-264acbcdcb34\") " Mar 11 01:08:41 crc kubenswrapper[4744]: I0311 01:08:41.617635 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1879ad6e-72b6-4c58-a771-264acbcdcb34-catalog-content\") pod \"1879ad6e-72b6-4c58-a771-264acbcdcb34\" (UID: \"1879ad6e-72b6-4c58-a771-264acbcdcb34\") " Mar 11 01:08:41 crc kubenswrapper[4744]: I0311 01:08:41.618625 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1879ad6e-72b6-4c58-a771-264acbcdcb34-utilities" (OuterVolumeSpecName: "utilities") pod "1879ad6e-72b6-4c58-a771-264acbcdcb34" (UID: "1879ad6e-72b6-4c58-a771-264acbcdcb34"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 11 01:08:41 crc kubenswrapper[4744]: I0311 01:08:41.638827 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1879ad6e-72b6-4c58-a771-264acbcdcb34-kube-api-access-zkbmp" (OuterVolumeSpecName: "kube-api-access-zkbmp") pod "1879ad6e-72b6-4c58-a771-264acbcdcb34" (UID: "1879ad6e-72b6-4c58-a771-264acbcdcb34"). InnerVolumeSpecName "kube-api-access-zkbmp". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 01:08:41 crc kubenswrapper[4744]: I0311 01:08:41.643715 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["crc-storage/crc-storage-crc-lkdth"] Mar 11 01:08:41 crc kubenswrapper[4744]: I0311 01:08:41.643843 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="crc-storage/crc-storage-crc-lkdth" Mar 11 01:08:41 crc kubenswrapper[4744]: I0311 01:08:41.644356 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="crc-storage/crc-storage-crc-lkdth" Mar 11 01:08:41 crc kubenswrapper[4744]: E0311 01:08:41.673182 4744 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_crc-storage-crc-lkdth_crc-storage_9829f27b-c482-450d-8e09-231b8b9943bc_0(731753e1dfaed9220c8eac32f870bb12976980a5e752b943c4f8d17dbb0e732d): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 11 01:08:41 crc kubenswrapper[4744]: E0311 01:08:41.673263 4744 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_crc-storage-crc-lkdth_crc-storage_9829f27b-c482-450d-8e09-231b8b9943bc_0(731753e1dfaed9220c8eac32f870bb12976980a5e752b943c4f8d17dbb0e732d): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="crc-storage/crc-storage-crc-lkdth" Mar 11 01:08:41 crc kubenswrapper[4744]: E0311 01:08:41.673291 4744 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_crc-storage-crc-lkdth_crc-storage_9829f27b-c482-450d-8e09-231b8b9943bc_0(731753e1dfaed9220c8eac32f870bb12976980a5e752b943c4f8d17dbb0e732d): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="crc-storage/crc-storage-crc-lkdth" Mar 11 01:08:41 crc kubenswrapper[4744]: E0311 01:08:41.673348 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"crc-storage-crc-lkdth_crc-storage(9829f27b-c482-450d-8e09-231b8b9943bc)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"crc-storage-crc-lkdth_crc-storage(9829f27b-c482-450d-8e09-231b8b9943bc)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_crc-storage-crc-lkdth_crc-storage_9829f27b-c482-450d-8e09-231b8b9943bc_0(731753e1dfaed9220c8eac32f870bb12976980a5e752b943c4f8d17dbb0e732d): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\"" pod="crc-storage/crc-storage-crc-lkdth" podUID="9829f27b-c482-450d-8e09-231b8b9943bc" Mar 11 01:08:41 crc kubenswrapper[4744]: I0311 01:08:41.718825 4744 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zkbmp\" (UniqueName: \"kubernetes.io/projected/1879ad6e-72b6-4c58-a771-264acbcdcb34-kube-api-access-zkbmp\") on node \"crc\" DevicePath \"\"" Mar 11 01:08:41 crc kubenswrapper[4744]: I0311 01:08:41.718855 4744 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1879ad6e-72b6-4c58-a771-264acbcdcb34-utilities\") on node \"crc\" DevicePath \"\"" Mar 11 01:08:42 crc kubenswrapper[4744]: I0311 01:08:42.113008 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1879ad6e-72b6-4c58-a771-264acbcdcb34-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "1879ad6e-72b6-4c58-a771-264acbcdcb34" (UID: "1879ad6e-72b6-4c58-a771-264acbcdcb34"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 11 01:08:42 crc kubenswrapper[4744]: I0311 01:08:42.124213 4744 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1879ad6e-72b6-4c58-a771-264acbcdcb34-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 11 01:08:42 crc kubenswrapper[4744]: I0311 01:08:42.327016 4744 generic.go:334] "Generic (PLEG): container finished" podID="1879ad6e-72b6-4c58-a771-264acbcdcb34" containerID="6c9de4789455f63b4c132f30da91a8d080a54af94f42d758b116c6cef7c892e5" exitCode=0 Mar 11 01:08:42 crc kubenswrapper[4744]: I0311 01:08:42.327120 4744 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-qg8qn" Mar 11 01:08:42 crc kubenswrapper[4744]: I0311 01:08:42.327114 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-qg8qn" event={"ID":"1879ad6e-72b6-4c58-a771-264acbcdcb34","Type":"ContainerDied","Data":"6c9de4789455f63b4c132f30da91a8d080a54af94f42d758b116c6cef7c892e5"} Mar 11 01:08:42 crc kubenswrapper[4744]: I0311 01:08:42.327201 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-qg8qn" event={"ID":"1879ad6e-72b6-4c58-a771-264acbcdcb34","Type":"ContainerDied","Data":"b79a5da1e46821fdcb78b60f85473ceb7ad483808c90964505ba3fe7a36b12f7"} Mar 11 01:08:42 crc kubenswrapper[4744]: I0311 01:08:42.327239 4744 scope.go:117] "RemoveContainer" containerID="6c9de4789455f63b4c132f30da91a8d080a54af94f42d758b116c6cef7c892e5" Mar 11 01:08:42 crc kubenswrapper[4744]: I0311 01:08:42.357479 4744 scope.go:117] "RemoveContainer" containerID="3937dd75a53c40c97263bb3c8c77aa21d73207359270c4b42291d80991844a9d" Mar 11 01:08:42 crc kubenswrapper[4744]: I0311 01:08:42.365001 4744 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-qg8qn"] Mar 11 01:08:42 crc kubenswrapper[4744]: 
I0311 01:08:42.374476 4744 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-qg8qn"] Mar 11 01:08:42 crc kubenswrapper[4744]: I0311 01:08:42.384023 4744 scope.go:117] "RemoveContainer" containerID="6eae3b9bcc9ba2ca44f8da3ac194f62412ae1585d667791e5089db95896956cd" Mar 11 01:08:42 crc kubenswrapper[4744]: I0311 01:08:42.404183 4744 scope.go:117] "RemoveContainer" containerID="6c9de4789455f63b4c132f30da91a8d080a54af94f42d758b116c6cef7c892e5" Mar 11 01:08:42 crc kubenswrapper[4744]: E0311 01:08:42.404719 4744 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6c9de4789455f63b4c132f30da91a8d080a54af94f42d758b116c6cef7c892e5\": container with ID starting with 6c9de4789455f63b4c132f30da91a8d080a54af94f42d758b116c6cef7c892e5 not found: ID does not exist" containerID="6c9de4789455f63b4c132f30da91a8d080a54af94f42d758b116c6cef7c892e5" Mar 11 01:08:42 crc kubenswrapper[4744]: I0311 01:08:42.404747 4744 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6c9de4789455f63b4c132f30da91a8d080a54af94f42d758b116c6cef7c892e5"} err="failed to get container status \"6c9de4789455f63b4c132f30da91a8d080a54af94f42d758b116c6cef7c892e5\": rpc error: code = NotFound desc = could not find container \"6c9de4789455f63b4c132f30da91a8d080a54af94f42d758b116c6cef7c892e5\": container with ID starting with 6c9de4789455f63b4c132f30da91a8d080a54af94f42d758b116c6cef7c892e5 not found: ID does not exist" Mar 11 01:08:42 crc kubenswrapper[4744]: I0311 01:08:42.404769 4744 scope.go:117] "RemoveContainer" containerID="3937dd75a53c40c97263bb3c8c77aa21d73207359270c4b42291d80991844a9d" Mar 11 01:08:42 crc kubenswrapper[4744]: E0311 01:08:42.405335 4744 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3937dd75a53c40c97263bb3c8c77aa21d73207359270c4b42291d80991844a9d\": container 
with ID starting with 3937dd75a53c40c97263bb3c8c77aa21d73207359270c4b42291d80991844a9d not found: ID does not exist" containerID="3937dd75a53c40c97263bb3c8c77aa21d73207359270c4b42291d80991844a9d" Mar 11 01:08:42 crc kubenswrapper[4744]: I0311 01:08:42.405355 4744 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3937dd75a53c40c97263bb3c8c77aa21d73207359270c4b42291d80991844a9d"} err="failed to get container status \"3937dd75a53c40c97263bb3c8c77aa21d73207359270c4b42291d80991844a9d\": rpc error: code = NotFound desc = could not find container \"3937dd75a53c40c97263bb3c8c77aa21d73207359270c4b42291d80991844a9d\": container with ID starting with 3937dd75a53c40c97263bb3c8c77aa21d73207359270c4b42291d80991844a9d not found: ID does not exist" Mar 11 01:08:42 crc kubenswrapper[4744]: I0311 01:08:42.405368 4744 scope.go:117] "RemoveContainer" containerID="6eae3b9bcc9ba2ca44f8da3ac194f62412ae1585d667791e5089db95896956cd" Mar 11 01:08:42 crc kubenswrapper[4744]: E0311 01:08:42.405774 4744 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6eae3b9bcc9ba2ca44f8da3ac194f62412ae1585d667791e5089db95896956cd\": container with ID starting with 6eae3b9bcc9ba2ca44f8da3ac194f62412ae1585d667791e5089db95896956cd not found: ID does not exist" containerID="6eae3b9bcc9ba2ca44f8da3ac194f62412ae1585d667791e5089db95896956cd" Mar 11 01:08:42 crc kubenswrapper[4744]: I0311 01:08:42.405794 4744 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6eae3b9bcc9ba2ca44f8da3ac194f62412ae1585d667791e5089db95896956cd"} err="failed to get container status \"6eae3b9bcc9ba2ca44f8da3ac194f62412ae1585d667791e5089db95896956cd\": rpc error: code = NotFound desc = could not find container \"6eae3b9bcc9ba2ca44f8da3ac194f62412ae1585d667791e5089db95896956cd\": container with ID starting with 6eae3b9bcc9ba2ca44f8da3ac194f62412ae1585d667791e5089db95896956cd not 
found: ID does not exist" Mar 11 01:08:43 crc kubenswrapper[4744]: I0311 01:08:43.986012 4744 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1879ad6e-72b6-4c58-a771-264acbcdcb34" path="/var/lib/kubelet/pods/1879ad6e-72b6-4c58-a771-264acbcdcb34/volumes" Mar 11 01:08:53 crc kubenswrapper[4744]: I0311 01:08:53.979769 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="crc-storage/crc-storage-crc-lkdth" Mar 11 01:08:53 crc kubenswrapper[4744]: I0311 01:08:53.981226 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="crc-storage/crc-storage-crc-lkdth" Mar 11 01:08:54 crc kubenswrapper[4744]: I0311 01:08:54.301543 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["crc-storage/crc-storage-crc-lkdth"] Mar 11 01:08:54 crc kubenswrapper[4744]: W0311 01:08:54.310050 4744 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9829f27b_c482_450d_8e09_231b8b9943bc.slice/crio-4c9a6e3543836e37a4f3e61431ba19ee79a5ca84e353049e35a2893d1fa25ed2 WatchSource:0}: Error finding container 4c9a6e3543836e37a4f3e61431ba19ee79a5ca84e353049e35a2893d1fa25ed2: Status 404 returned error can't find the container with id 4c9a6e3543836e37a4f3e61431ba19ee79a5ca84e353049e35a2893d1fa25ed2 Mar 11 01:08:54 crc kubenswrapper[4744]: I0311 01:08:54.413088 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="crc-storage/crc-storage-crc-lkdth" event={"ID":"9829f27b-c482-450d-8e09-231b8b9943bc","Type":"ContainerStarted","Data":"4c9a6e3543836e37a4f3e61431ba19ee79a5ca84e353049e35a2893d1fa25ed2"} Mar 11 01:08:56 crc kubenswrapper[4744]: I0311 01:08:56.430621 4744 generic.go:334] "Generic (PLEG): container finished" podID="9829f27b-c482-450d-8e09-231b8b9943bc" containerID="e8e7de9821f1f2395a016ec5d88815ec81f2044234a92bc5f54ebe8a619a2f2e" exitCode=0 Mar 11 01:08:56 crc kubenswrapper[4744]: I0311 01:08:56.430778 4744 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="crc-storage/crc-storage-crc-lkdth" event={"ID":"9829f27b-c482-450d-8e09-231b8b9943bc","Type":"ContainerDied","Data":"e8e7de9821f1f2395a016ec5d88815ec81f2044234a92bc5f54ebe8a619a2f2e"} Mar 11 01:08:57 crc kubenswrapper[4744]: I0311 01:08:57.816308 4744 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="crc-storage/crc-storage-crc-lkdth" Mar 11 01:08:57 crc kubenswrapper[4744]: I0311 01:08:57.951329 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/9829f27b-c482-450d-8e09-231b8b9943bc-crc-storage\") pod \"9829f27b-c482-450d-8e09-231b8b9943bc\" (UID: \"9829f27b-c482-450d-8e09-231b8b9943bc\") " Mar 11 01:08:57 crc kubenswrapper[4744]: I0311 01:08:57.951396 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cc28n\" (UniqueName: \"kubernetes.io/projected/9829f27b-c482-450d-8e09-231b8b9943bc-kube-api-access-cc28n\") pod \"9829f27b-c482-450d-8e09-231b8b9943bc\" (UID: \"9829f27b-c482-450d-8e09-231b8b9943bc\") " Mar 11 01:08:57 crc kubenswrapper[4744]: I0311 01:08:57.951484 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/9829f27b-c482-450d-8e09-231b8b9943bc-node-mnt\") pod \"9829f27b-c482-450d-8e09-231b8b9943bc\" (UID: \"9829f27b-c482-450d-8e09-231b8b9943bc\") " Mar 11 01:08:57 crc kubenswrapper[4744]: I0311 01:08:57.951885 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/9829f27b-c482-450d-8e09-231b8b9943bc-node-mnt" (OuterVolumeSpecName: "node-mnt") pod "9829f27b-c482-450d-8e09-231b8b9943bc" (UID: "9829f27b-c482-450d-8e09-231b8b9943bc"). InnerVolumeSpecName "node-mnt". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 11 01:08:57 crc kubenswrapper[4744]: I0311 01:08:57.953311 4744 reconciler_common.go:293] "Volume detached for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/9829f27b-c482-450d-8e09-231b8b9943bc-node-mnt\") on node \"crc\" DevicePath \"\"" Mar 11 01:08:57 crc kubenswrapper[4744]: I0311 01:08:57.957388 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9829f27b-c482-450d-8e09-231b8b9943bc-kube-api-access-cc28n" (OuterVolumeSpecName: "kube-api-access-cc28n") pod "9829f27b-c482-450d-8e09-231b8b9943bc" (UID: "9829f27b-c482-450d-8e09-231b8b9943bc"). InnerVolumeSpecName "kube-api-access-cc28n". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 01:08:57 crc kubenswrapper[4744]: I0311 01:08:57.978583 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9829f27b-c482-450d-8e09-231b8b9943bc-crc-storage" (OuterVolumeSpecName: "crc-storage") pod "9829f27b-c482-450d-8e09-231b8b9943bc" (UID: "9829f27b-c482-450d-8e09-231b8b9943bc"). InnerVolumeSpecName "crc-storage". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 11 01:08:58 crc kubenswrapper[4744]: I0311 01:08:58.055187 4744 reconciler_common.go:293] "Volume detached for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/9829f27b-c482-450d-8e09-231b8b9943bc-crc-storage\") on node \"crc\" DevicePath \"\"" Mar 11 01:08:58 crc kubenswrapper[4744]: I0311 01:08:58.055253 4744 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cc28n\" (UniqueName: \"kubernetes.io/projected/9829f27b-c482-450d-8e09-231b8b9943bc-kube-api-access-cc28n\") on node \"crc\" DevicePath \"\"" Mar 11 01:08:58 crc kubenswrapper[4744]: I0311 01:08:58.459091 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="crc-storage/crc-storage-crc-lkdth" event={"ID":"9829f27b-c482-450d-8e09-231b8b9943bc","Type":"ContainerDied","Data":"4c9a6e3543836e37a4f3e61431ba19ee79a5ca84e353049e35a2893d1fa25ed2"} Mar 11 01:08:58 crc kubenswrapper[4744]: I0311 01:08:58.459154 4744 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="crc-storage/crc-storage-crc-lkdth" Mar 11 01:08:58 crc kubenswrapper[4744]: I0311 01:08:58.459182 4744 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4c9a6e3543836e37a4f3e61431ba19ee79a5ca84e353049e35a2893d1fa25ed2" Mar 11 01:09:02 crc kubenswrapper[4744]: I0311 01:09:02.347545 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-6vzqs"] Mar 11 01:09:02 crc kubenswrapper[4744]: E0311 01:09:02.349248 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1879ad6e-72b6-4c58-a771-264acbcdcb34" containerName="extract-utilities" Mar 11 01:09:02 crc kubenswrapper[4744]: I0311 01:09:02.349331 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="1879ad6e-72b6-4c58-a771-264acbcdcb34" containerName="extract-utilities" Mar 11 01:09:02 crc kubenswrapper[4744]: E0311 01:09:02.349418 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9829f27b-c482-450d-8e09-231b8b9943bc" containerName="storage" Mar 11 01:09:02 crc kubenswrapper[4744]: I0311 01:09:02.349587 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="9829f27b-c482-450d-8e09-231b8b9943bc" containerName="storage" Mar 11 01:09:02 crc kubenswrapper[4744]: E0311 01:09:02.349717 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1879ad6e-72b6-4c58-a771-264acbcdcb34" containerName="registry-server" Mar 11 01:09:02 crc kubenswrapper[4744]: I0311 01:09:02.349818 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="1879ad6e-72b6-4c58-a771-264acbcdcb34" containerName="registry-server" Mar 11 01:09:02 crc kubenswrapper[4744]: E0311 01:09:02.349957 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1879ad6e-72b6-4c58-a771-264acbcdcb34" containerName="extract-content" Mar 11 01:09:02 crc kubenswrapper[4744]: I0311 01:09:02.350061 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="1879ad6e-72b6-4c58-a771-264acbcdcb34" 
containerName="extract-content" Mar 11 01:09:02 crc kubenswrapper[4744]: I0311 01:09:02.350326 4744 memory_manager.go:354] "RemoveStaleState removing state" podUID="9829f27b-c482-450d-8e09-231b8b9943bc" containerName="storage" Mar 11 01:09:02 crc kubenswrapper[4744]: I0311 01:09:02.350442 4744 memory_manager.go:354] "RemoveStaleState removing state" podUID="1879ad6e-72b6-4c58-a771-264acbcdcb34" containerName="registry-server" Mar 11 01:09:02 crc kubenswrapper[4744]: I0311 01:09:02.351901 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-6vzqs" Mar 11 01:09:02 crc kubenswrapper[4744]: I0311 01:09:02.366633 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-6vzqs"] Mar 11 01:09:02 crc kubenswrapper[4744]: I0311 01:09:02.428588 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-crj8p\" (UniqueName: \"kubernetes.io/projected/7c39771d-8bc9-4df1-9b7c-e1585ec3b076-kube-api-access-crj8p\") pod \"redhat-marketplace-6vzqs\" (UID: \"7c39771d-8bc9-4df1-9b7c-e1585ec3b076\") " pod="openshift-marketplace/redhat-marketplace-6vzqs" Mar 11 01:09:02 crc kubenswrapper[4744]: I0311 01:09:02.428831 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7c39771d-8bc9-4df1-9b7c-e1585ec3b076-catalog-content\") pod \"redhat-marketplace-6vzqs\" (UID: \"7c39771d-8bc9-4df1-9b7c-e1585ec3b076\") " pod="openshift-marketplace/redhat-marketplace-6vzqs" Mar 11 01:09:02 crc kubenswrapper[4744]: I0311 01:09:02.429072 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7c39771d-8bc9-4df1-9b7c-e1585ec3b076-utilities\") pod \"redhat-marketplace-6vzqs\" (UID: \"7c39771d-8bc9-4df1-9b7c-e1585ec3b076\") " 
pod="openshift-marketplace/redhat-marketplace-6vzqs" Mar 11 01:09:02 crc kubenswrapper[4744]: I0311 01:09:02.530449 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-crj8p\" (UniqueName: \"kubernetes.io/projected/7c39771d-8bc9-4df1-9b7c-e1585ec3b076-kube-api-access-crj8p\") pod \"redhat-marketplace-6vzqs\" (UID: \"7c39771d-8bc9-4df1-9b7c-e1585ec3b076\") " pod="openshift-marketplace/redhat-marketplace-6vzqs" Mar 11 01:09:02 crc kubenswrapper[4744]: I0311 01:09:02.530569 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7c39771d-8bc9-4df1-9b7c-e1585ec3b076-catalog-content\") pod \"redhat-marketplace-6vzqs\" (UID: \"7c39771d-8bc9-4df1-9b7c-e1585ec3b076\") " pod="openshift-marketplace/redhat-marketplace-6vzqs" Mar 11 01:09:02 crc kubenswrapper[4744]: I0311 01:09:02.530626 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7c39771d-8bc9-4df1-9b7c-e1585ec3b076-utilities\") pod \"redhat-marketplace-6vzqs\" (UID: \"7c39771d-8bc9-4df1-9b7c-e1585ec3b076\") " pod="openshift-marketplace/redhat-marketplace-6vzqs" Mar 11 01:09:02 crc kubenswrapper[4744]: I0311 01:09:02.531106 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7c39771d-8bc9-4df1-9b7c-e1585ec3b076-utilities\") pod \"redhat-marketplace-6vzqs\" (UID: \"7c39771d-8bc9-4df1-9b7c-e1585ec3b076\") " pod="openshift-marketplace/redhat-marketplace-6vzqs" Mar 11 01:09:02 crc kubenswrapper[4744]: I0311 01:09:02.531348 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7c39771d-8bc9-4df1-9b7c-e1585ec3b076-catalog-content\") pod \"redhat-marketplace-6vzqs\" (UID: \"7c39771d-8bc9-4df1-9b7c-e1585ec3b076\") " pod="openshift-marketplace/redhat-marketplace-6vzqs" 
Mar 11 01:09:02 crc kubenswrapper[4744]: I0311 01:09:02.553477 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-crj8p\" (UniqueName: \"kubernetes.io/projected/7c39771d-8bc9-4df1-9b7c-e1585ec3b076-kube-api-access-crj8p\") pod \"redhat-marketplace-6vzqs\" (UID: \"7c39771d-8bc9-4df1-9b7c-e1585ec3b076\") " pod="openshift-marketplace/redhat-marketplace-6vzqs" Mar 11 01:09:02 crc kubenswrapper[4744]: I0311 01:09:02.676723 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-6vzqs" Mar 11 01:09:02 crc kubenswrapper[4744]: I0311 01:09:02.975267 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-6vzqs"] Mar 11 01:09:03 crc kubenswrapper[4744]: I0311 01:09:03.494604 4744 generic.go:334] "Generic (PLEG): container finished" podID="7c39771d-8bc9-4df1-9b7c-e1585ec3b076" containerID="dd6817017c05b55c1ae1890111d6cc9e68a2a35c6aac8e1dfdf2334ce052f83e" exitCode=0 Mar 11 01:09:03 crc kubenswrapper[4744]: I0311 01:09:03.494679 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-6vzqs" event={"ID":"7c39771d-8bc9-4df1-9b7c-e1585ec3b076","Type":"ContainerDied","Data":"dd6817017c05b55c1ae1890111d6cc9e68a2a35c6aac8e1dfdf2334ce052f83e"} Mar 11 01:09:03 crc kubenswrapper[4744]: I0311 01:09:03.494723 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-6vzqs" event={"ID":"7c39771d-8bc9-4df1-9b7c-e1585ec3b076","Type":"ContainerStarted","Data":"4de0e8f3a98b2d0a7338be0c2e44f4c2225c1780a076b4fadff3dd212e890d78"} Mar 11 01:09:03 crc kubenswrapper[4744]: I0311 01:09:03.497609 4744 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 11 01:09:03 crc kubenswrapper[4744]: I0311 01:09:03.913619 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openshift-ovn-kubernetes/ovnkube-node-wk26z" Mar 11 01:09:04 crc kubenswrapper[4744]: I0311 01:09:04.500922 4744 generic.go:334] "Generic (PLEG): container finished" podID="7c39771d-8bc9-4df1-9b7c-e1585ec3b076" containerID="c42e0282d18853709e9a77031666b8a1f4c4548db29f83119c18b36e11177d2d" exitCode=0 Mar 11 01:09:04 crc kubenswrapper[4744]: I0311 01:09:04.500964 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-6vzqs" event={"ID":"7c39771d-8bc9-4df1-9b7c-e1585ec3b076","Type":"ContainerDied","Data":"c42e0282d18853709e9a77031666b8a1f4c4548db29f83119c18b36e11177d2d"} Mar 11 01:09:04 crc kubenswrapper[4744]: I0311 01:09:04.541002 4744 scope.go:117] "RemoveContainer" containerID="68572562ea8e88035d1531ad81c9551532026ffa6088df3126f7f356cf4f8adb" Mar 11 01:09:05 crc kubenswrapper[4744]: I0311 01:09:05.506941 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-6vzqs" event={"ID":"7c39771d-8bc9-4df1-9b7c-e1585ec3b076","Type":"ContainerStarted","Data":"283210a8ac9f9c215e065d1d4b93bd4b6cad11bd9eb8f6ad18a579c8b331e090"} Mar 11 01:09:05 crc kubenswrapper[4744]: I0311 01:09:05.526016 4744 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-6vzqs" podStartSLOduration=2.091529031 podStartE2EDuration="3.526000632s" podCreationTimestamp="2026-03-11 01:09:02 +0000 UTC" firstStartedPulling="2026-03-11 01:09:03.496730427 +0000 UTC m=+900.300948062" lastFinishedPulling="2026-03-11 01:09:04.931202058 +0000 UTC m=+901.735419663" observedRunningTime="2026-03-11 01:09:05.520928129 +0000 UTC m=+902.325145744" watchObservedRunningTime="2026-03-11 01:09:05.526000632 +0000 UTC m=+902.330218237" Mar 11 01:09:05 crc kubenswrapper[4744]: I0311 01:09:05.999550 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82lpr5r"] Mar 11 01:09:06 crc 
kubenswrapper[4744]: I0311 01:09:06.001297 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82lpr5r" Mar 11 01:09:06 crc kubenswrapper[4744]: I0311 01:09:06.005552 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc" Mar 11 01:09:06 crc kubenswrapper[4744]: I0311 01:09:06.009823 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82lpr5r"] Mar 11 01:09:06 crc kubenswrapper[4744]: I0311 01:09:06.073805 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b9ftd\" (UniqueName: \"kubernetes.io/projected/e5525fbf-f26b-400d-bcb1-1489bcfc7476-kube-api-access-b9ftd\") pod \"0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82lpr5r\" (UID: \"e5525fbf-f26b-400d-bcb1-1489bcfc7476\") " pod="openshift-marketplace/0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82lpr5r" Mar 11 01:09:06 crc kubenswrapper[4744]: I0311 01:09:06.073974 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/e5525fbf-f26b-400d-bcb1-1489bcfc7476-util\") pod \"0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82lpr5r\" (UID: \"e5525fbf-f26b-400d-bcb1-1489bcfc7476\") " pod="openshift-marketplace/0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82lpr5r" Mar 11 01:09:06 crc kubenswrapper[4744]: I0311 01:09:06.074017 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/e5525fbf-f26b-400d-bcb1-1489bcfc7476-bundle\") pod \"0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82lpr5r\" (UID: \"e5525fbf-f26b-400d-bcb1-1489bcfc7476\") " 
pod="openshift-marketplace/0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82lpr5r" Mar 11 01:09:06 crc kubenswrapper[4744]: I0311 01:09:06.174703 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/e5525fbf-f26b-400d-bcb1-1489bcfc7476-util\") pod \"0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82lpr5r\" (UID: \"e5525fbf-f26b-400d-bcb1-1489bcfc7476\") " pod="openshift-marketplace/0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82lpr5r" Mar 11 01:09:06 crc kubenswrapper[4744]: I0311 01:09:06.174798 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/e5525fbf-f26b-400d-bcb1-1489bcfc7476-bundle\") pod \"0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82lpr5r\" (UID: \"e5525fbf-f26b-400d-bcb1-1489bcfc7476\") " pod="openshift-marketplace/0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82lpr5r" Mar 11 01:09:06 crc kubenswrapper[4744]: I0311 01:09:06.174855 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b9ftd\" (UniqueName: \"kubernetes.io/projected/e5525fbf-f26b-400d-bcb1-1489bcfc7476-kube-api-access-b9ftd\") pod \"0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82lpr5r\" (UID: \"e5525fbf-f26b-400d-bcb1-1489bcfc7476\") " pod="openshift-marketplace/0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82lpr5r" Mar 11 01:09:06 crc kubenswrapper[4744]: I0311 01:09:06.175226 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/e5525fbf-f26b-400d-bcb1-1489bcfc7476-util\") pod \"0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82lpr5r\" (UID: \"e5525fbf-f26b-400d-bcb1-1489bcfc7476\") " pod="openshift-marketplace/0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82lpr5r" Mar 11 01:09:06 crc kubenswrapper[4744]: I0311 01:09:06.175978 4744 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/e5525fbf-f26b-400d-bcb1-1489bcfc7476-bundle\") pod \"0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82lpr5r\" (UID: \"e5525fbf-f26b-400d-bcb1-1489bcfc7476\") " pod="openshift-marketplace/0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82lpr5r" Mar 11 01:09:06 crc kubenswrapper[4744]: I0311 01:09:06.200641 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b9ftd\" (UniqueName: \"kubernetes.io/projected/e5525fbf-f26b-400d-bcb1-1489bcfc7476-kube-api-access-b9ftd\") pod \"0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82lpr5r\" (UID: \"e5525fbf-f26b-400d-bcb1-1489bcfc7476\") " pod="openshift-marketplace/0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82lpr5r" Mar 11 01:09:06 crc kubenswrapper[4744]: I0311 01:09:06.316594 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82lpr5r" Mar 11 01:09:06 crc kubenswrapper[4744]: I0311 01:09:06.613358 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82lpr5r"] Mar 11 01:09:06 crc kubenswrapper[4744]: W0311 01:09:06.625823 4744 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode5525fbf_f26b_400d_bcb1_1489bcfc7476.slice/crio-f04e81045546a918131aeec882e84397f82d4346eb5cf42b94679b53a1896ff9 WatchSource:0}: Error finding container f04e81045546a918131aeec882e84397f82d4346eb5cf42b94679b53a1896ff9: Status 404 returned error can't find the container with id f04e81045546a918131aeec882e84397f82d4346eb5cf42b94679b53a1896ff9 Mar 11 01:09:07 crc kubenswrapper[4744]: I0311 01:09:07.524982 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82lpr5r" event={"ID":"e5525fbf-f26b-400d-bcb1-1489bcfc7476","Type":"ContainerStarted","Data":"c9b08735e4dfccda2b6c7e50e6462e2876b2560f3c12bb204cb09825f7cbce0a"} Mar 11 01:09:07 crc kubenswrapper[4744]: I0311 01:09:07.525045 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82lpr5r" event={"ID":"e5525fbf-f26b-400d-bcb1-1489bcfc7476","Type":"ContainerStarted","Data":"f04e81045546a918131aeec882e84397f82d4346eb5cf42b94679b53a1896ff9"} Mar 11 01:09:08 crc kubenswrapper[4744]: I0311 01:09:08.531075 4744 generic.go:334] "Generic (PLEG): container finished" podID="e5525fbf-f26b-400d-bcb1-1489bcfc7476" containerID="c9b08735e4dfccda2b6c7e50e6462e2876b2560f3c12bb204cb09825f7cbce0a" exitCode=0 Mar 11 01:09:08 crc kubenswrapper[4744]: I0311 01:09:08.531120 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82lpr5r" event={"ID":"e5525fbf-f26b-400d-bcb1-1489bcfc7476","Type":"ContainerDied","Data":"c9b08735e4dfccda2b6c7e50e6462e2876b2560f3c12bb204cb09825f7cbce0a"} Mar 11 01:09:09 crc kubenswrapper[4744]: I0311 01:09:09.134973 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-dj4q5"] Mar 11 01:09:09 crc kubenswrapper[4744]: I0311 01:09:09.140131 4744 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-dj4q5" Mar 11 01:09:09 crc kubenswrapper[4744]: I0311 01:09:09.148725 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-dj4q5"] Mar 11 01:09:09 crc kubenswrapper[4744]: I0311 01:09:09.156752 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qnrjs\" (UniqueName: \"kubernetes.io/projected/e8d0e1ba-30c3-42d9-aefa-4acde61a0a55-kube-api-access-qnrjs\") pod \"redhat-operators-dj4q5\" (UID: \"e8d0e1ba-30c3-42d9-aefa-4acde61a0a55\") " pod="openshift-marketplace/redhat-operators-dj4q5" Mar 11 01:09:09 crc kubenswrapper[4744]: I0311 01:09:09.156899 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e8d0e1ba-30c3-42d9-aefa-4acde61a0a55-utilities\") pod \"redhat-operators-dj4q5\" (UID: \"e8d0e1ba-30c3-42d9-aefa-4acde61a0a55\") " pod="openshift-marketplace/redhat-operators-dj4q5" Mar 11 01:09:09 crc kubenswrapper[4744]: I0311 01:09:09.157095 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e8d0e1ba-30c3-42d9-aefa-4acde61a0a55-catalog-content\") pod \"redhat-operators-dj4q5\" (UID: \"e8d0e1ba-30c3-42d9-aefa-4acde61a0a55\") " pod="openshift-marketplace/redhat-operators-dj4q5" Mar 11 01:09:09 crc kubenswrapper[4744]: I0311 01:09:09.259922 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qnrjs\" (UniqueName: \"kubernetes.io/projected/e8d0e1ba-30c3-42d9-aefa-4acde61a0a55-kube-api-access-qnrjs\") pod \"redhat-operators-dj4q5\" (UID: \"e8d0e1ba-30c3-42d9-aefa-4acde61a0a55\") " pod="openshift-marketplace/redhat-operators-dj4q5" Mar 11 01:09:09 crc kubenswrapper[4744]: I0311 01:09:09.259989 4744 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e8d0e1ba-30c3-42d9-aefa-4acde61a0a55-utilities\") pod \"redhat-operators-dj4q5\" (UID: \"e8d0e1ba-30c3-42d9-aefa-4acde61a0a55\") " pod="openshift-marketplace/redhat-operators-dj4q5" Mar 11 01:09:09 crc kubenswrapper[4744]: I0311 01:09:09.260054 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e8d0e1ba-30c3-42d9-aefa-4acde61a0a55-catalog-content\") pod \"redhat-operators-dj4q5\" (UID: \"e8d0e1ba-30c3-42d9-aefa-4acde61a0a55\") " pod="openshift-marketplace/redhat-operators-dj4q5" Mar 11 01:09:09 crc kubenswrapper[4744]: I0311 01:09:09.260620 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e8d0e1ba-30c3-42d9-aefa-4acde61a0a55-utilities\") pod \"redhat-operators-dj4q5\" (UID: \"e8d0e1ba-30c3-42d9-aefa-4acde61a0a55\") " pod="openshift-marketplace/redhat-operators-dj4q5" Mar 11 01:09:09 crc kubenswrapper[4744]: I0311 01:09:09.260653 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e8d0e1ba-30c3-42d9-aefa-4acde61a0a55-catalog-content\") pod \"redhat-operators-dj4q5\" (UID: \"e8d0e1ba-30c3-42d9-aefa-4acde61a0a55\") " pod="openshift-marketplace/redhat-operators-dj4q5" Mar 11 01:09:09 crc kubenswrapper[4744]: I0311 01:09:09.289571 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qnrjs\" (UniqueName: \"kubernetes.io/projected/e8d0e1ba-30c3-42d9-aefa-4acde61a0a55-kube-api-access-qnrjs\") pod \"redhat-operators-dj4q5\" (UID: \"e8d0e1ba-30c3-42d9-aefa-4acde61a0a55\") " pod="openshift-marketplace/redhat-operators-dj4q5" Mar 11 01:09:09 crc kubenswrapper[4744]: I0311 01:09:09.481953 4744 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-dj4q5" Mar 11 01:09:09 crc kubenswrapper[4744]: I0311 01:09:09.685684 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-dj4q5"] Mar 11 01:09:10 crc kubenswrapper[4744]: I0311 01:09:10.542312 4744 generic.go:334] "Generic (PLEG): container finished" podID="e5525fbf-f26b-400d-bcb1-1489bcfc7476" containerID="735ee968d7c7b770c2f217df539137495652295d8c1d5303a47bfc8a9ced4dd0" exitCode=0 Mar 11 01:09:10 crc kubenswrapper[4744]: I0311 01:09:10.542383 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82lpr5r" event={"ID":"e5525fbf-f26b-400d-bcb1-1489bcfc7476","Type":"ContainerDied","Data":"735ee968d7c7b770c2f217df539137495652295d8c1d5303a47bfc8a9ced4dd0"} Mar 11 01:09:10 crc kubenswrapper[4744]: I0311 01:09:10.545180 4744 generic.go:334] "Generic (PLEG): container finished" podID="e8d0e1ba-30c3-42d9-aefa-4acde61a0a55" containerID="7f07130361942a6f8dbe5824130fdb66c428a9ae0098013f3316256683ba4cc3" exitCode=0 Mar 11 01:09:10 crc kubenswrapper[4744]: I0311 01:09:10.545233 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-dj4q5" event={"ID":"e8d0e1ba-30c3-42d9-aefa-4acde61a0a55","Type":"ContainerDied","Data":"7f07130361942a6f8dbe5824130fdb66c428a9ae0098013f3316256683ba4cc3"} Mar 11 01:09:10 crc kubenswrapper[4744]: I0311 01:09:10.545271 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-dj4q5" event={"ID":"e8d0e1ba-30c3-42d9-aefa-4acde61a0a55","Type":"ContainerStarted","Data":"7d146f0a32c82b869de11142176da423cd5be4c8886fbc2277f99cf93b1fa5f3"} Mar 11 01:09:11 crc kubenswrapper[4744]: I0311 01:09:11.561010 4744 generic.go:334] "Generic (PLEG): container finished" podID="e5525fbf-f26b-400d-bcb1-1489bcfc7476" 
containerID="e1b0f4b734d5fc246d53a26af4d542d4c7d84fd21d86578f425400a59a4fb308" exitCode=0 Mar 11 01:09:11 crc kubenswrapper[4744]: I0311 01:09:11.561096 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82lpr5r" event={"ID":"e5525fbf-f26b-400d-bcb1-1489bcfc7476","Type":"ContainerDied","Data":"e1b0f4b734d5fc246d53a26af4d542d4c7d84fd21d86578f425400a59a4fb308"} Mar 11 01:09:11 crc kubenswrapper[4744]: I0311 01:09:11.565965 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-dj4q5" event={"ID":"e8d0e1ba-30c3-42d9-aefa-4acde61a0a55","Type":"ContainerStarted","Data":"8910bbf135af2129e06f5e476ef7417fce97fbef081057928bc8cde493101ec1"} Mar 11 01:09:12 crc kubenswrapper[4744]: I0311 01:09:12.577018 4744 generic.go:334] "Generic (PLEG): container finished" podID="e8d0e1ba-30c3-42d9-aefa-4acde61a0a55" containerID="8910bbf135af2129e06f5e476ef7417fce97fbef081057928bc8cde493101ec1" exitCode=0 Mar 11 01:09:12 crc kubenswrapper[4744]: I0311 01:09:12.577152 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-dj4q5" event={"ID":"e8d0e1ba-30c3-42d9-aefa-4acde61a0a55","Type":"ContainerDied","Data":"8910bbf135af2129e06f5e476ef7417fce97fbef081057928bc8cde493101ec1"} Mar 11 01:09:12 crc kubenswrapper[4744]: I0311 01:09:12.677634 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-6vzqs" Mar 11 01:09:12 crc kubenswrapper[4744]: I0311 01:09:12.677719 4744 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-6vzqs" Mar 11 01:09:12 crc kubenswrapper[4744]: I0311 01:09:12.734629 4744 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-6vzqs" Mar 11 01:09:12 crc kubenswrapper[4744]: I0311 01:09:12.858863 4744 util.go:48] "No 
ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82lpr5r" Mar 11 01:09:13 crc kubenswrapper[4744]: I0311 01:09:13.006560 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/e5525fbf-f26b-400d-bcb1-1489bcfc7476-util\") pod \"e5525fbf-f26b-400d-bcb1-1489bcfc7476\" (UID: \"e5525fbf-f26b-400d-bcb1-1489bcfc7476\") " Mar 11 01:09:13 crc kubenswrapper[4744]: I0311 01:09:13.006610 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-b9ftd\" (UniqueName: \"kubernetes.io/projected/e5525fbf-f26b-400d-bcb1-1489bcfc7476-kube-api-access-b9ftd\") pod \"e5525fbf-f26b-400d-bcb1-1489bcfc7476\" (UID: \"e5525fbf-f26b-400d-bcb1-1489bcfc7476\") " Mar 11 01:09:13 crc kubenswrapper[4744]: I0311 01:09:13.006651 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/e5525fbf-f26b-400d-bcb1-1489bcfc7476-bundle\") pod \"e5525fbf-f26b-400d-bcb1-1489bcfc7476\" (UID: \"e5525fbf-f26b-400d-bcb1-1489bcfc7476\") " Mar 11 01:09:13 crc kubenswrapper[4744]: I0311 01:09:13.007582 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e5525fbf-f26b-400d-bcb1-1489bcfc7476-bundle" (OuterVolumeSpecName: "bundle") pod "e5525fbf-f26b-400d-bcb1-1489bcfc7476" (UID: "e5525fbf-f26b-400d-bcb1-1489bcfc7476"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 11 01:09:13 crc kubenswrapper[4744]: I0311 01:09:13.013713 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e5525fbf-f26b-400d-bcb1-1489bcfc7476-kube-api-access-b9ftd" (OuterVolumeSpecName: "kube-api-access-b9ftd") pod "e5525fbf-f26b-400d-bcb1-1489bcfc7476" (UID: "e5525fbf-f26b-400d-bcb1-1489bcfc7476"). 
InnerVolumeSpecName "kube-api-access-b9ftd". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 01:09:13 crc kubenswrapper[4744]: I0311 01:09:13.051305 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e5525fbf-f26b-400d-bcb1-1489bcfc7476-util" (OuterVolumeSpecName: "util") pod "e5525fbf-f26b-400d-bcb1-1489bcfc7476" (UID: "e5525fbf-f26b-400d-bcb1-1489bcfc7476"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 11 01:09:13 crc kubenswrapper[4744]: I0311 01:09:13.108169 4744 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/e5525fbf-f26b-400d-bcb1-1489bcfc7476-util\") on node \"crc\" DevicePath \"\"" Mar 11 01:09:13 crc kubenswrapper[4744]: I0311 01:09:13.108452 4744 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-b9ftd\" (UniqueName: \"kubernetes.io/projected/e5525fbf-f26b-400d-bcb1-1489bcfc7476-kube-api-access-b9ftd\") on node \"crc\" DevicePath \"\"" Mar 11 01:09:13 crc kubenswrapper[4744]: I0311 01:09:13.108560 4744 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/e5525fbf-f26b-400d-bcb1-1489bcfc7476-bundle\") on node \"crc\" DevicePath \"\"" Mar 11 01:09:13 crc kubenswrapper[4744]: I0311 01:09:13.586376 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-dj4q5" event={"ID":"e8d0e1ba-30c3-42d9-aefa-4acde61a0a55","Type":"ContainerStarted","Data":"929b251d089cb8e810d9ec449c99ff2f4f419ad14fc29af5467211d230a4b7e5"} Mar 11 01:09:13 crc kubenswrapper[4744]: I0311 01:09:13.589722 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82lpr5r" event={"ID":"e5525fbf-f26b-400d-bcb1-1489bcfc7476","Type":"ContainerDied","Data":"f04e81045546a918131aeec882e84397f82d4346eb5cf42b94679b53a1896ff9"} Mar 11 01:09:13 crc 
kubenswrapper[4744]: I0311 01:09:13.589775 4744 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82lpr5r" Mar 11 01:09:13 crc kubenswrapper[4744]: I0311 01:09:13.589794 4744 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f04e81045546a918131aeec882e84397f82d4346eb5cf42b94679b53a1896ff9" Mar 11 01:09:13 crc kubenswrapper[4744]: I0311 01:09:13.634655 4744 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-dj4q5" podStartSLOduration=2.184186602 podStartE2EDuration="4.634636034s" podCreationTimestamp="2026-03-11 01:09:09 +0000 UTC" firstStartedPulling="2026-03-11 01:09:10.546904697 +0000 UTC m=+907.351122332" lastFinishedPulling="2026-03-11 01:09:12.997354119 +0000 UTC m=+909.801571764" observedRunningTime="2026-03-11 01:09:13.631236457 +0000 UTC m=+910.435454122" watchObservedRunningTime="2026-03-11 01:09:13.634636034 +0000 UTC m=+910.438853649" Mar 11 01:09:13 crc kubenswrapper[4744]: I0311 01:09:13.658568 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-6vzqs" Mar 11 01:09:16 crc kubenswrapper[4744]: I0311 01:09:16.323418 4744 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-6vzqs"] Mar 11 01:09:16 crc kubenswrapper[4744]: I0311 01:09:16.324056 4744 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-6vzqs" podUID="7c39771d-8bc9-4df1-9b7c-e1585ec3b076" containerName="registry-server" containerID="cri-o://283210a8ac9f9c215e065d1d4b93bd4b6cad11bd9eb8f6ad18a579c8b331e090" gracePeriod=2 Mar 11 01:09:16 crc kubenswrapper[4744]: I0311 01:09:16.563663 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-operator-75c5dccd6c-lc2cz"] Mar 11 01:09:16 crc 
kubenswrapper[4744]: E0311 01:09:16.563851 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e5525fbf-f26b-400d-bcb1-1489bcfc7476" containerName="pull" Mar 11 01:09:16 crc kubenswrapper[4744]: I0311 01:09:16.563862 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="e5525fbf-f26b-400d-bcb1-1489bcfc7476" containerName="pull" Mar 11 01:09:16 crc kubenswrapper[4744]: E0311 01:09:16.563874 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e5525fbf-f26b-400d-bcb1-1489bcfc7476" containerName="extract" Mar 11 01:09:16 crc kubenswrapper[4744]: I0311 01:09:16.563879 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="e5525fbf-f26b-400d-bcb1-1489bcfc7476" containerName="extract" Mar 11 01:09:16 crc kubenswrapper[4744]: E0311 01:09:16.563890 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e5525fbf-f26b-400d-bcb1-1489bcfc7476" containerName="util" Mar 11 01:09:16 crc kubenswrapper[4744]: I0311 01:09:16.563896 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="e5525fbf-f26b-400d-bcb1-1489bcfc7476" containerName="util" Mar 11 01:09:16 crc kubenswrapper[4744]: I0311 01:09:16.563982 4744 memory_manager.go:354] "RemoveStaleState removing state" podUID="e5525fbf-f26b-400d-bcb1-1489bcfc7476" containerName="extract" Mar 11 01:09:16 crc kubenswrapper[4744]: I0311 01:09:16.564335 4744 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-operator-75c5dccd6c-lc2cz" Mar 11 01:09:16 crc kubenswrapper[4744]: I0311 01:09:16.567340 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"openshift-service-ca.crt" Mar 11 01:09:16 crc kubenswrapper[4744]: I0311 01:09:16.567639 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"kube-root-ca.crt" Mar 11 01:09:16 crc kubenswrapper[4744]: I0311 01:09:16.567766 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"nmstate-operator-dockercfg-gjf88" Mar 11 01:09:16 crc kubenswrapper[4744]: I0311 01:09:16.581089 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-operator-75c5dccd6c-lc2cz"] Mar 11 01:09:16 crc kubenswrapper[4744]: I0311 01:09:16.613685 4744 generic.go:334] "Generic (PLEG): container finished" podID="7c39771d-8bc9-4df1-9b7c-e1585ec3b076" containerID="283210a8ac9f9c215e065d1d4b93bd4b6cad11bd9eb8f6ad18a579c8b331e090" exitCode=0 Mar 11 01:09:16 crc kubenswrapper[4744]: I0311 01:09:16.613723 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-6vzqs" event={"ID":"7c39771d-8bc9-4df1-9b7c-e1585ec3b076","Type":"ContainerDied","Data":"283210a8ac9f9c215e065d1d4b93bd4b6cad11bd9eb8f6ad18a579c8b331e090"} Mar 11 01:09:16 crc kubenswrapper[4744]: I0311 01:09:16.690630 4744 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-6vzqs" Mar 11 01:09:16 crc kubenswrapper[4744]: I0311 01:09:16.758224 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7c39771d-8bc9-4df1-9b7c-e1585ec3b076-catalog-content\") pod \"7c39771d-8bc9-4df1-9b7c-e1585ec3b076\" (UID: \"7c39771d-8bc9-4df1-9b7c-e1585ec3b076\") " Mar 11 01:09:16 crc kubenswrapper[4744]: I0311 01:09:16.758432 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4mtxr\" (UniqueName: \"kubernetes.io/projected/385a32da-b61b-4128-b192-6ad240a2a6e8-kube-api-access-4mtxr\") pod \"nmstate-operator-75c5dccd6c-lc2cz\" (UID: \"385a32da-b61b-4128-b192-6ad240a2a6e8\") " pod="openshift-nmstate/nmstate-operator-75c5dccd6c-lc2cz" Mar 11 01:09:16 crc kubenswrapper[4744]: I0311 01:09:16.806671 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7c39771d-8bc9-4df1-9b7c-e1585ec3b076-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "7c39771d-8bc9-4df1-9b7c-e1585ec3b076" (UID: "7c39771d-8bc9-4df1-9b7c-e1585ec3b076"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 11 01:09:16 crc kubenswrapper[4744]: I0311 01:09:16.858930 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-crj8p\" (UniqueName: \"kubernetes.io/projected/7c39771d-8bc9-4df1-9b7c-e1585ec3b076-kube-api-access-crj8p\") pod \"7c39771d-8bc9-4df1-9b7c-e1585ec3b076\" (UID: \"7c39771d-8bc9-4df1-9b7c-e1585ec3b076\") " Mar 11 01:09:16 crc kubenswrapper[4744]: I0311 01:09:16.858972 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7c39771d-8bc9-4df1-9b7c-e1585ec3b076-utilities\") pod \"7c39771d-8bc9-4df1-9b7c-e1585ec3b076\" (UID: \"7c39771d-8bc9-4df1-9b7c-e1585ec3b076\") " Mar 11 01:09:16 crc kubenswrapper[4744]: I0311 01:09:16.859172 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4mtxr\" (UniqueName: \"kubernetes.io/projected/385a32da-b61b-4128-b192-6ad240a2a6e8-kube-api-access-4mtxr\") pod \"nmstate-operator-75c5dccd6c-lc2cz\" (UID: \"385a32da-b61b-4128-b192-6ad240a2a6e8\") " pod="openshift-nmstate/nmstate-operator-75c5dccd6c-lc2cz" Mar 11 01:09:16 crc kubenswrapper[4744]: I0311 01:09:16.859251 4744 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7c39771d-8bc9-4df1-9b7c-e1585ec3b076-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 11 01:09:16 crc kubenswrapper[4744]: I0311 01:09:16.860345 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7c39771d-8bc9-4df1-9b7c-e1585ec3b076-utilities" (OuterVolumeSpecName: "utilities") pod "7c39771d-8bc9-4df1-9b7c-e1585ec3b076" (UID: "7c39771d-8bc9-4df1-9b7c-e1585ec3b076"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 11 01:09:16 crc kubenswrapper[4744]: I0311 01:09:16.883808 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7c39771d-8bc9-4df1-9b7c-e1585ec3b076-kube-api-access-crj8p" (OuterVolumeSpecName: "kube-api-access-crj8p") pod "7c39771d-8bc9-4df1-9b7c-e1585ec3b076" (UID: "7c39771d-8bc9-4df1-9b7c-e1585ec3b076"). InnerVolumeSpecName "kube-api-access-crj8p". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 01:09:16 crc kubenswrapper[4744]: I0311 01:09:16.889788 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4mtxr\" (UniqueName: \"kubernetes.io/projected/385a32da-b61b-4128-b192-6ad240a2a6e8-kube-api-access-4mtxr\") pod \"nmstate-operator-75c5dccd6c-lc2cz\" (UID: \"385a32da-b61b-4128-b192-6ad240a2a6e8\") " pod="openshift-nmstate/nmstate-operator-75c5dccd6c-lc2cz" Mar 11 01:09:16 crc kubenswrapper[4744]: I0311 01:09:16.960119 4744 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-crj8p\" (UniqueName: \"kubernetes.io/projected/7c39771d-8bc9-4df1-9b7c-e1585ec3b076-kube-api-access-crj8p\") on node \"crc\" DevicePath \"\"" Mar 11 01:09:16 crc kubenswrapper[4744]: I0311 01:09:16.960167 4744 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7c39771d-8bc9-4df1-9b7c-e1585ec3b076-utilities\") on node \"crc\" DevicePath \"\"" Mar 11 01:09:17 crc kubenswrapper[4744]: I0311 01:09:17.176594 4744 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-operator-75c5dccd6c-lc2cz" Mar 11 01:09:17 crc kubenswrapper[4744]: I0311 01:09:17.621914 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-6vzqs" event={"ID":"7c39771d-8bc9-4df1-9b7c-e1585ec3b076","Type":"ContainerDied","Data":"4de0e8f3a98b2d0a7338be0c2e44f4c2225c1780a076b4fadff3dd212e890d78"} Mar 11 01:09:17 crc kubenswrapper[4744]: I0311 01:09:17.622248 4744 scope.go:117] "RemoveContainer" containerID="283210a8ac9f9c215e065d1d4b93bd4b6cad11bd9eb8f6ad18a579c8b331e090" Mar 11 01:09:17 crc kubenswrapper[4744]: I0311 01:09:17.621997 4744 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-6vzqs" Mar 11 01:09:17 crc kubenswrapper[4744]: I0311 01:09:17.642370 4744 scope.go:117] "RemoveContainer" containerID="c42e0282d18853709e9a77031666b8a1f4c4548db29f83119c18b36e11177d2d" Mar 11 01:09:17 crc kubenswrapper[4744]: I0311 01:09:17.656731 4744 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-6vzqs"] Mar 11 01:09:17 crc kubenswrapper[4744]: I0311 01:09:17.660818 4744 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-6vzqs"] Mar 11 01:09:17 crc kubenswrapper[4744]: I0311 01:09:17.672082 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-operator-75c5dccd6c-lc2cz"] Mar 11 01:09:17 crc kubenswrapper[4744]: I0311 01:09:17.678608 4744 scope.go:117] "RemoveContainer" containerID="dd6817017c05b55c1ae1890111d6cc9e68a2a35c6aac8e1dfdf2334ce052f83e" Mar 11 01:09:17 crc kubenswrapper[4744]: W0311 01:09:17.690095 4744 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod385a32da_b61b_4128_b192_6ad240a2a6e8.slice/crio-110b92479d1d79b0e1c8c0866683607e7fb55a1f7e1fa8c4e0f73caaea356534 WatchSource:0}: Error finding 
container 110b92479d1d79b0e1c8c0866683607e7fb55a1f7e1fa8c4e0f73caaea356534: Status 404 returned error can't find the container with id 110b92479d1d79b0e1c8c0866683607e7fb55a1f7e1fa8c4e0f73caaea356534 Mar 11 01:09:17 crc kubenswrapper[4744]: I0311 01:09:17.982744 4744 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7c39771d-8bc9-4df1-9b7c-e1585ec3b076" path="/var/lib/kubelet/pods/7c39771d-8bc9-4df1-9b7c-e1585ec3b076/volumes" Mar 11 01:09:18 crc kubenswrapper[4744]: I0311 01:09:18.631950 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-operator-75c5dccd6c-lc2cz" event={"ID":"385a32da-b61b-4128-b192-6ad240a2a6e8","Type":"ContainerStarted","Data":"110b92479d1d79b0e1c8c0866683607e7fb55a1f7e1fa8c4e0f73caaea356534"} Mar 11 01:09:19 crc kubenswrapper[4744]: I0311 01:09:19.482424 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-dj4q5" Mar 11 01:09:19 crc kubenswrapper[4744]: I0311 01:09:19.482560 4744 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-dj4q5" Mar 11 01:09:20 crc kubenswrapper[4744]: I0311 01:09:20.528835 4744 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-dj4q5" podUID="e8d0e1ba-30c3-42d9-aefa-4acde61a0a55" containerName="registry-server" probeResult="failure" output=< Mar 11 01:09:20 crc kubenswrapper[4744]: timeout: failed to connect service ":50051" within 1s Mar 11 01:09:20 crc kubenswrapper[4744]: > Mar 11 01:09:20 crc kubenswrapper[4744]: I0311 01:09:20.645758 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-operator-75c5dccd6c-lc2cz" event={"ID":"385a32da-b61b-4128-b192-6ad240a2a6e8","Type":"ContainerStarted","Data":"16dcabf9de289d793aaff9321a59a42abae5e1dab6b67afcebf36d185c0b6595"} Mar 11 01:09:20 crc kubenswrapper[4744]: I0311 01:09:20.665495 4744 pod_startup_latency_tracker.go:104] 
"Observed pod startup duration" pod="openshift-nmstate/nmstate-operator-75c5dccd6c-lc2cz" podStartSLOduration=2.027389181 podStartE2EDuration="4.665470766s" podCreationTimestamp="2026-03-11 01:09:16 +0000 UTC" firstStartedPulling="2026-03-11 01:09:17.694008621 +0000 UTC m=+914.498226226" lastFinishedPulling="2026-03-11 01:09:20.332090206 +0000 UTC m=+917.136307811" observedRunningTime="2026-03-11 01:09:20.659449347 +0000 UTC m=+917.463667022" watchObservedRunningTime="2026-03-11 01:09:20.665470766 +0000 UTC m=+917.469688411" Mar 11 01:09:26 crc kubenswrapper[4744]: I0311 01:09:26.174258 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-metrics-69594cc75-k2dqq"] Mar 11 01:09:26 crc kubenswrapper[4744]: E0311 01:09:26.176042 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7c39771d-8bc9-4df1-9b7c-e1585ec3b076" containerName="extract-content" Mar 11 01:09:26 crc kubenswrapper[4744]: I0311 01:09:26.176131 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="7c39771d-8bc9-4df1-9b7c-e1585ec3b076" containerName="extract-content" Mar 11 01:09:26 crc kubenswrapper[4744]: E0311 01:09:26.176203 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7c39771d-8bc9-4df1-9b7c-e1585ec3b076" containerName="registry-server" Mar 11 01:09:26 crc kubenswrapper[4744]: I0311 01:09:26.176269 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="7c39771d-8bc9-4df1-9b7c-e1585ec3b076" containerName="registry-server" Mar 11 01:09:26 crc kubenswrapper[4744]: E0311 01:09:26.176353 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7c39771d-8bc9-4df1-9b7c-e1585ec3b076" containerName="extract-utilities" Mar 11 01:09:26 crc kubenswrapper[4744]: I0311 01:09:26.176419 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="7c39771d-8bc9-4df1-9b7c-e1585ec3b076" containerName="extract-utilities" Mar 11 01:09:26 crc kubenswrapper[4744]: I0311 01:09:26.176653 4744 memory_manager.go:354] 
"RemoveStaleState removing state" podUID="7c39771d-8bc9-4df1-9b7c-e1585ec3b076" containerName="registry-server" Mar 11 01:09:26 crc kubenswrapper[4744]: I0311 01:09:26.177309 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-webhook-786f45cff4-bm8mj"] Mar 11 01:09:26 crc kubenswrapper[4744]: I0311 01:09:26.177454 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-metrics-69594cc75-k2dqq" Mar 11 01:09:26 crc kubenswrapper[4744]: I0311 01:09:26.178158 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-webhook-786f45cff4-bm8mj" Mar 11 01:09:26 crc kubenswrapper[4744]: I0311 01:09:26.179994 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-handler-4gtjr"] Mar 11 01:09:26 crc kubenswrapper[4744]: I0311 01:09:26.180661 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-handler-4gtjr" Mar 11 01:09:26 crc kubenswrapper[4744]: I0311 01:09:26.184110 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"openshift-nmstate-webhook" Mar 11 01:09:26 crc kubenswrapper[4744]: I0311 01:09:26.184289 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"nmstate-handler-dockercfg-p6tx2" Mar 11 01:09:26 crc kubenswrapper[4744]: I0311 01:09:26.187614 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-metrics-69594cc75-k2dqq"] Mar 11 01:09:26 crc kubenswrapper[4744]: I0311 01:09:26.224021 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-webhook-786f45cff4-bm8mj"] Mar 11 01:09:26 crc kubenswrapper[4744]: I0311 01:09:26.284611 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-key-pair\" (UniqueName: 
\"kubernetes.io/secret/8cc11049-1fae-4c88-acad-91cb1622c0bc-tls-key-pair\") pod \"nmstate-webhook-786f45cff4-bm8mj\" (UID: \"8cc11049-1fae-4c88-acad-91cb1622c0bc\") " pod="openshift-nmstate/nmstate-webhook-786f45cff4-bm8mj" Mar 11 01:09:26 crc kubenswrapper[4744]: I0311 01:09:26.284672 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b784d\" (UniqueName: \"kubernetes.io/projected/5fd864c8-f5b3-4d6d-a9b0-85bb8661f5dc-kube-api-access-b784d\") pod \"nmstate-metrics-69594cc75-k2dqq\" (UID: \"5fd864c8-f5b3-4d6d-a9b0-85bb8661f5dc\") " pod="openshift-nmstate/nmstate-metrics-69594cc75-k2dqq" Mar 11 01:09:26 crc kubenswrapper[4744]: I0311 01:09:26.284711 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/61689687-6d5f-4f04-bd77-cf749c0a77ee-dbus-socket\") pod \"nmstate-handler-4gtjr\" (UID: \"61689687-6d5f-4f04-bd77-cf749c0a77ee\") " pod="openshift-nmstate/nmstate-handler-4gtjr" Mar 11 01:09:26 crc kubenswrapper[4744]: I0311 01:09:26.284771 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/61689687-6d5f-4f04-bd77-cf749c0a77ee-ovs-socket\") pod \"nmstate-handler-4gtjr\" (UID: \"61689687-6d5f-4f04-bd77-cf749c0a77ee\") " pod="openshift-nmstate/nmstate-handler-4gtjr" Mar 11 01:09:26 crc kubenswrapper[4744]: I0311 01:09:26.284794 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tl66m\" (UniqueName: \"kubernetes.io/projected/61689687-6d5f-4f04-bd77-cf749c0a77ee-kube-api-access-tl66m\") pod \"nmstate-handler-4gtjr\" (UID: \"61689687-6d5f-4f04-bd77-cf749c0a77ee\") " pod="openshift-nmstate/nmstate-handler-4gtjr" Mar 11 01:09:26 crc kubenswrapper[4744]: I0311 01:09:26.284912 4744 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/61689687-6d5f-4f04-bd77-cf749c0a77ee-nmstate-lock\") pod \"nmstate-handler-4gtjr\" (UID: \"61689687-6d5f-4f04-bd77-cf749c0a77ee\") " pod="openshift-nmstate/nmstate-handler-4gtjr" Mar 11 01:09:26 crc kubenswrapper[4744]: I0311 01:09:26.284959 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-76jgk\" (UniqueName: \"kubernetes.io/projected/8cc11049-1fae-4c88-acad-91cb1622c0bc-kube-api-access-76jgk\") pod \"nmstate-webhook-786f45cff4-bm8mj\" (UID: \"8cc11049-1fae-4c88-acad-91cb1622c0bc\") " pod="openshift-nmstate/nmstate-webhook-786f45cff4-bm8mj" Mar 11 01:09:26 crc kubenswrapper[4744]: I0311 01:09:26.307995 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-console-plugin-5dcbbd79cf-6jfrf"] Mar 11 01:09:26 crc kubenswrapper[4744]: I0311 01:09:26.308570 4744 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-console-plugin-5dcbbd79cf-6jfrf" Mar 11 01:09:26 crc kubenswrapper[4744]: I0311 01:09:26.312197 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"nginx-conf" Mar 11 01:09:26 crc kubenswrapper[4744]: I0311 01:09:26.312916 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"plugin-serving-cert" Mar 11 01:09:26 crc kubenswrapper[4744]: I0311 01:09:26.313875 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"default-dockercfg-9xtkn" Mar 11 01:09:26 crc kubenswrapper[4744]: I0311 01:09:26.319656 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-console-plugin-5dcbbd79cf-6jfrf"] Mar 11 01:09:26 crc kubenswrapper[4744]: I0311 01:09:26.385749 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/8cc11049-1fae-4c88-acad-91cb1622c0bc-tls-key-pair\") pod \"nmstate-webhook-786f45cff4-bm8mj\" (UID: \"8cc11049-1fae-4c88-acad-91cb1622c0bc\") " pod="openshift-nmstate/nmstate-webhook-786f45cff4-bm8mj" Mar 11 01:09:26 crc kubenswrapper[4744]: I0311 01:09:26.385820 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b784d\" (UniqueName: \"kubernetes.io/projected/5fd864c8-f5b3-4d6d-a9b0-85bb8661f5dc-kube-api-access-b784d\") pod \"nmstate-metrics-69594cc75-k2dqq\" (UID: \"5fd864c8-f5b3-4d6d-a9b0-85bb8661f5dc\") " pod="openshift-nmstate/nmstate-metrics-69594cc75-k2dqq" Mar 11 01:09:26 crc kubenswrapper[4744]: I0311 01:09:26.385864 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/61689687-6d5f-4f04-bd77-cf749c0a77ee-dbus-socket\") pod \"nmstate-handler-4gtjr\" (UID: \"61689687-6d5f-4f04-bd77-cf749c0a77ee\") " pod="openshift-nmstate/nmstate-handler-4gtjr" Mar 11 01:09:26 crc 
kubenswrapper[4744]: I0311 01:09:26.385905 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/61689687-6d5f-4f04-bd77-cf749c0a77ee-ovs-socket\") pod \"nmstate-handler-4gtjr\" (UID: \"61689687-6d5f-4f04-bd77-cf749c0a77ee\") " pod="openshift-nmstate/nmstate-handler-4gtjr" Mar 11 01:09:26 crc kubenswrapper[4744]: I0311 01:09:26.385929 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tl66m\" (UniqueName: \"kubernetes.io/projected/61689687-6d5f-4f04-bd77-cf749c0a77ee-kube-api-access-tl66m\") pod \"nmstate-handler-4gtjr\" (UID: \"61689687-6d5f-4f04-bd77-cf749c0a77ee\") " pod="openshift-nmstate/nmstate-handler-4gtjr" Mar 11 01:09:26 crc kubenswrapper[4744]: I0311 01:09:26.385970 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/61689687-6d5f-4f04-bd77-cf749c0a77ee-nmstate-lock\") pod \"nmstate-handler-4gtjr\" (UID: \"61689687-6d5f-4f04-bd77-cf749c0a77ee\") " pod="openshift-nmstate/nmstate-handler-4gtjr" Mar 11 01:09:26 crc kubenswrapper[4744]: I0311 01:09:26.385997 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-76jgk\" (UniqueName: \"kubernetes.io/projected/8cc11049-1fae-4c88-acad-91cb1622c0bc-kube-api-access-76jgk\") pod \"nmstate-webhook-786f45cff4-bm8mj\" (UID: \"8cc11049-1fae-4c88-acad-91cb1622c0bc\") " pod="openshift-nmstate/nmstate-webhook-786f45cff4-bm8mj" Mar 11 01:09:26 crc kubenswrapper[4744]: I0311 01:09:26.387314 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/61689687-6d5f-4f04-bd77-cf749c0a77ee-ovs-socket\") pod \"nmstate-handler-4gtjr\" (UID: \"61689687-6d5f-4f04-bd77-cf749c0a77ee\") " pod="openshift-nmstate/nmstate-handler-4gtjr" Mar 11 01:09:26 crc kubenswrapper[4744]: I0311 01:09:26.387318 4744 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/61689687-6d5f-4f04-bd77-cf749c0a77ee-dbus-socket\") pod \"nmstate-handler-4gtjr\" (UID: \"61689687-6d5f-4f04-bd77-cf749c0a77ee\") " pod="openshift-nmstate/nmstate-handler-4gtjr" Mar 11 01:09:26 crc kubenswrapper[4744]: I0311 01:09:26.387856 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/61689687-6d5f-4f04-bd77-cf749c0a77ee-nmstate-lock\") pod \"nmstate-handler-4gtjr\" (UID: \"61689687-6d5f-4f04-bd77-cf749c0a77ee\") " pod="openshift-nmstate/nmstate-handler-4gtjr" Mar 11 01:09:26 crc kubenswrapper[4744]: I0311 01:09:26.403936 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/8cc11049-1fae-4c88-acad-91cb1622c0bc-tls-key-pair\") pod \"nmstate-webhook-786f45cff4-bm8mj\" (UID: \"8cc11049-1fae-4c88-acad-91cb1622c0bc\") " pod="openshift-nmstate/nmstate-webhook-786f45cff4-bm8mj" Mar 11 01:09:26 crc kubenswrapper[4744]: I0311 01:09:26.422299 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-76jgk\" (UniqueName: \"kubernetes.io/projected/8cc11049-1fae-4c88-acad-91cb1622c0bc-kube-api-access-76jgk\") pod \"nmstate-webhook-786f45cff4-bm8mj\" (UID: \"8cc11049-1fae-4c88-acad-91cb1622c0bc\") " pod="openshift-nmstate/nmstate-webhook-786f45cff4-bm8mj" Mar 11 01:09:26 crc kubenswrapper[4744]: I0311 01:09:26.422349 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tl66m\" (UniqueName: \"kubernetes.io/projected/61689687-6d5f-4f04-bd77-cf749c0a77ee-kube-api-access-tl66m\") pod \"nmstate-handler-4gtjr\" (UID: \"61689687-6d5f-4f04-bd77-cf749c0a77ee\") " pod="openshift-nmstate/nmstate-handler-4gtjr" Mar 11 01:09:26 crc kubenswrapper[4744]: I0311 01:09:26.424849 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-b784d\" (UniqueName: \"kubernetes.io/projected/5fd864c8-f5b3-4d6d-a9b0-85bb8661f5dc-kube-api-access-b784d\") pod \"nmstate-metrics-69594cc75-k2dqq\" (UID: \"5fd864c8-f5b3-4d6d-a9b0-85bb8661f5dc\") " pod="openshift-nmstate/nmstate-metrics-69594cc75-k2dqq" Mar 11 01:09:26 crc kubenswrapper[4744]: I0311 01:09:26.486969 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/3af6f71e-7739-4a00-8f26-42d043c0d179-plugin-serving-cert\") pod \"nmstate-console-plugin-5dcbbd79cf-6jfrf\" (UID: \"3af6f71e-7739-4a00-8f26-42d043c0d179\") " pod="openshift-nmstate/nmstate-console-plugin-5dcbbd79cf-6jfrf" Mar 11 01:09:26 crc kubenswrapper[4744]: I0311 01:09:26.487022 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/3af6f71e-7739-4a00-8f26-42d043c0d179-nginx-conf\") pod \"nmstate-console-plugin-5dcbbd79cf-6jfrf\" (UID: \"3af6f71e-7739-4a00-8f26-42d043c0d179\") " pod="openshift-nmstate/nmstate-console-plugin-5dcbbd79cf-6jfrf" Mar 11 01:09:26 crc kubenswrapper[4744]: I0311 01:09:26.487059 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c6b6b\" (UniqueName: \"kubernetes.io/projected/3af6f71e-7739-4a00-8f26-42d043c0d179-kube-api-access-c6b6b\") pod \"nmstate-console-plugin-5dcbbd79cf-6jfrf\" (UID: \"3af6f71e-7739-4a00-8f26-42d043c0d179\") " pod="openshift-nmstate/nmstate-console-plugin-5dcbbd79cf-6jfrf" Mar 11 01:09:26 crc kubenswrapper[4744]: I0311 01:09:26.494598 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-metrics-69594cc75-k2dqq" Mar 11 01:09:26 crc kubenswrapper[4744]: I0311 01:09:26.503682 4744 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-webhook-786f45cff4-bm8mj" Mar 11 01:09:26 crc kubenswrapper[4744]: I0311 01:09:26.510868 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-handler-4gtjr" Mar 11 01:09:26 crc kubenswrapper[4744]: I0311 01:09:26.515428 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/console-84f9d695fc-xlj7v"] Mar 11 01:09:26 crc kubenswrapper[4744]: I0311 01:09:26.516447 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-84f9d695fc-xlj7v" Mar 11 01:09:26 crc kubenswrapper[4744]: I0311 01:09:26.524591 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-84f9d695fc-xlj7v"] Mar 11 01:09:26 crc kubenswrapper[4744]: I0311 01:09:26.589093 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/3af6f71e-7739-4a00-8f26-42d043c0d179-plugin-serving-cert\") pod \"nmstate-console-plugin-5dcbbd79cf-6jfrf\" (UID: \"3af6f71e-7739-4a00-8f26-42d043c0d179\") " pod="openshift-nmstate/nmstate-console-plugin-5dcbbd79cf-6jfrf" Mar 11 01:09:26 crc kubenswrapper[4744]: I0311 01:09:26.589138 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/3af6f71e-7739-4a00-8f26-42d043c0d179-nginx-conf\") pod \"nmstate-console-plugin-5dcbbd79cf-6jfrf\" (UID: \"3af6f71e-7739-4a00-8f26-42d043c0d179\") " pod="openshift-nmstate/nmstate-console-plugin-5dcbbd79cf-6jfrf" Mar 11 01:09:26 crc kubenswrapper[4744]: I0311 01:09:26.589176 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c6b6b\" (UniqueName: \"kubernetes.io/projected/3af6f71e-7739-4a00-8f26-42d043c0d179-kube-api-access-c6b6b\") pod \"nmstate-console-plugin-5dcbbd79cf-6jfrf\" (UID: \"3af6f71e-7739-4a00-8f26-42d043c0d179\") " 
pod="openshift-nmstate/nmstate-console-plugin-5dcbbd79cf-6jfrf" Mar 11 01:09:26 crc kubenswrapper[4744]: E0311 01:09:26.590093 4744 secret.go:188] Couldn't get secret openshift-nmstate/plugin-serving-cert: secret "plugin-serving-cert" not found Mar 11 01:09:26 crc kubenswrapper[4744]: E0311 01:09:26.590151 4744 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3af6f71e-7739-4a00-8f26-42d043c0d179-plugin-serving-cert podName:3af6f71e-7739-4a00-8f26-42d043c0d179 nodeName:}" failed. No retries permitted until 2026-03-11 01:09:27.090135196 +0000 UTC m=+923.894352821 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "plugin-serving-cert" (UniqueName: "kubernetes.io/secret/3af6f71e-7739-4a00-8f26-42d043c0d179-plugin-serving-cert") pod "nmstate-console-plugin-5dcbbd79cf-6jfrf" (UID: "3af6f71e-7739-4a00-8f26-42d043c0d179") : secret "plugin-serving-cert" not found Mar 11 01:09:26 crc kubenswrapper[4744]: I0311 01:09:26.590733 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/3af6f71e-7739-4a00-8f26-42d043c0d179-nginx-conf\") pod \"nmstate-console-plugin-5dcbbd79cf-6jfrf\" (UID: \"3af6f71e-7739-4a00-8f26-42d043c0d179\") " pod="openshift-nmstate/nmstate-console-plugin-5dcbbd79cf-6jfrf" Mar 11 01:09:26 crc kubenswrapper[4744]: I0311 01:09:26.611204 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c6b6b\" (UniqueName: \"kubernetes.io/projected/3af6f71e-7739-4a00-8f26-42d043c0d179-kube-api-access-c6b6b\") pod \"nmstate-console-plugin-5dcbbd79cf-6jfrf\" (UID: \"3af6f71e-7739-4a00-8f26-42d043c0d179\") " pod="openshift-nmstate/nmstate-console-plugin-5dcbbd79cf-6jfrf" Mar 11 01:09:26 crc kubenswrapper[4744]: I0311 01:09:26.689700 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-handler-4gtjr" 
event={"ID":"61689687-6d5f-4f04-bd77-cf749c0a77ee","Type":"ContainerStarted","Data":"67bb8f84909b41af638dc0dc3ccd7f20430c1fa29620d018ff0e13cf4c4eeb49"} Mar 11 01:09:26 crc kubenswrapper[4744]: I0311 01:09:26.691029 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/62188ad5-0dce-407b-8635-b65f1af39dec-console-oauth-config\") pod \"console-84f9d695fc-xlj7v\" (UID: \"62188ad5-0dce-407b-8635-b65f1af39dec\") " pod="openshift-console/console-84f9d695fc-xlj7v" Mar 11 01:09:26 crc kubenswrapper[4744]: I0311 01:09:26.691067 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/62188ad5-0dce-407b-8635-b65f1af39dec-oauth-serving-cert\") pod \"console-84f9d695fc-xlj7v\" (UID: \"62188ad5-0dce-407b-8635-b65f1af39dec\") " pod="openshift-console/console-84f9d695fc-xlj7v" Mar 11 01:09:26 crc kubenswrapper[4744]: I0311 01:09:26.691095 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/62188ad5-0dce-407b-8635-b65f1af39dec-console-config\") pod \"console-84f9d695fc-xlj7v\" (UID: \"62188ad5-0dce-407b-8635-b65f1af39dec\") " pod="openshift-console/console-84f9d695fc-xlj7v" Mar 11 01:09:26 crc kubenswrapper[4744]: I0311 01:09:26.691140 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/62188ad5-0dce-407b-8635-b65f1af39dec-trusted-ca-bundle\") pod \"console-84f9d695fc-xlj7v\" (UID: \"62188ad5-0dce-407b-8635-b65f1af39dec\") " pod="openshift-console/console-84f9d695fc-xlj7v" Mar 11 01:09:26 crc kubenswrapper[4744]: I0311 01:09:26.691158 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" 
(UniqueName: \"kubernetes.io/secret/62188ad5-0dce-407b-8635-b65f1af39dec-console-serving-cert\") pod \"console-84f9d695fc-xlj7v\" (UID: \"62188ad5-0dce-407b-8635-b65f1af39dec\") " pod="openshift-console/console-84f9d695fc-xlj7v" Mar 11 01:09:26 crc kubenswrapper[4744]: I0311 01:09:26.691192 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/62188ad5-0dce-407b-8635-b65f1af39dec-service-ca\") pod \"console-84f9d695fc-xlj7v\" (UID: \"62188ad5-0dce-407b-8635-b65f1af39dec\") " pod="openshift-console/console-84f9d695fc-xlj7v" Mar 11 01:09:26 crc kubenswrapper[4744]: I0311 01:09:26.691208 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hrbfz\" (UniqueName: \"kubernetes.io/projected/62188ad5-0dce-407b-8635-b65f1af39dec-kube-api-access-hrbfz\") pod \"console-84f9d695fc-xlj7v\" (UID: \"62188ad5-0dce-407b-8635-b65f1af39dec\") " pod="openshift-console/console-84f9d695fc-xlj7v" Mar 11 01:09:26 crc kubenswrapper[4744]: I0311 01:09:26.712990 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-metrics-69594cc75-k2dqq"] Mar 11 01:09:26 crc kubenswrapper[4744]: W0311 01:09:26.720339 4744 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5fd864c8_f5b3_4d6d_a9b0_85bb8661f5dc.slice/crio-47f553b8c980929985fad092149246f6470780869a5476a9e3e05ca61bd86855 WatchSource:0}: Error finding container 47f553b8c980929985fad092149246f6470780869a5476a9e3e05ca61bd86855: Status 404 returned error can't find the container with id 47f553b8c980929985fad092149246f6470780869a5476a9e3e05ca61bd86855 Mar 11 01:09:26 crc kubenswrapper[4744]: I0311 01:09:26.749655 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-webhook-786f45cff4-bm8mj"] Mar 11 01:09:26 crc kubenswrapper[4744]: W0311 
01:09:26.755722 4744 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8cc11049_1fae_4c88_acad_91cb1622c0bc.slice/crio-86d532bdb947bf122b092bbdd0d0d1cfb865f1ea90cbcbe60215f90494a2bff4 WatchSource:0}: Error finding container 86d532bdb947bf122b092bbdd0d0d1cfb865f1ea90cbcbe60215f90494a2bff4: Status 404 returned error can't find the container with id 86d532bdb947bf122b092bbdd0d0d1cfb865f1ea90cbcbe60215f90494a2bff4 Mar 11 01:09:26 crc kubenswrapper[4744]: I0311 01:09:26.791914 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/62188ad5-0dce-407b-8635-b65f1af39dec-console-config\") pod \"console-84f9d695fc-xlj7v\" (UID: \"62188ad5-0dce-407b-8635-b65f1af39dec\") " pod="openshift-console/console-84f9d695fc-xlj7v" Mar 11 01:09:26 crc kubenswrapper[4744]: I0311 01:09:26.791994 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/62188ad5-0dce-407b-8635-b65f1af39dec-trusted-ca-bundle\") pod \"console-84f9d695fc-xlj7v\" (UID: \"62188ad5-0dce-407b-8635-b65f1af39dec\") " pod="openshift-console/console-84f9d695fc-xlj7v" Mar 11 01:09:26 crc kubenswrapper[4744]: I0311 01:09:26.793185 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/62188ad5-0dce-407b-8635-b65f1af39dec-console-serving-cert\") pod \"console-84f9d695fc-xlj7v\" (UID: \"62188ad5-0dce-407b-8635-b65f1af39dec\") " pod="openshift-console/console-84f9d695fc-xlj7v" Mar 11 01:09:26 crc kubenswrapper[4744]: I0311 01:09:26.793338 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/62188ad5-0dce-407b-8635-b65f1af39dec-service-ca\") pod \"console-84f9d695fc-xlj7v\" (UID: 
\"62188ad5-0dce-407b-8635-b65f1af39dec\") " pod="openshift-console/console-84f9d695fc-xlj7v" Mar 11 01:09:26 crc kubenswrapper[4744]: I0311 01:09:26.793369 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hrbfz\" (UniqueName: \"kubernetes.io/projected/62188ad5-0dce-407b-8635-b65f1af39dec-kube-api-access-hrbfz\") pod \"console-84f9d695fc-xlj7v\" (UID: \"62188ad5-0dce-407b-8635-b65f1af39dec\") " pod="openshift-console/console-84f9d695fc-xlj7v" Mar 11 01:09:26 crc kubenswrapper[4744]: I0311 01:09:26.793429 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/62188ad5-0dce-407b-8635-b65f1af39dec-trusted-ca-bundle\") pod \"console-84f9d695fc-xlj7v\" (UID: \"62188ad5-0dce-407b-8635-b65f1af39dec\") " pod="openshift-console/console-84f9d695fc-xlj7v" Mar 11 01:09:26 crc kubenswrapper[4744]: I0311 01:09:26.793443 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/62188ad5-0dce-407b-8635-b65f1af39dec-console-oauth-config\") pod \"console-84f9d695fc-xlj7v\" (UID: \"62188ad5-0dce-407b-8635-b65f1af39dec\") " pod="openshift-console/console-84f9d695fc-xlj7v" Mar 11 01:09:26 crc kubenswrapper[4744]: I0311 01:09:26.793500 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/62188ad5-0dce-407b-8635-b65f1af39dec-oauth-serving-cert\") pod \"console-84f9d695fc-xlj7v\" (UID: \"62188ad5-0dce-407b-8635-b65f1af39dec\") " pod="openshift-console/console-84f9d695fc-xlj7v" Mar 11 01:09:26 crc kubenswrapper[4744]: I0311 01:09:26.794129 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/62188ad5-0dce-407b-8635-b65f1af39dec-service-ca\") pod \"console-84f9d695fc-xlj7v\" (UID: \"62188ad5-0dce-407b-8635-b65f1af39dec\") 
" pod="openshift-console/console-84f9d695fc-xlj7v" Mar 11 01:09:26 crc kubenswrapper[4744]: I0311 01:09:26.794287 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/62188ad5-0dce-407b-8635-b65f1af39dec-oauth-serving-cert\") pod \"console-84f9d695fc-xlj7v\" (UID: \"62188ad5-0dce-407b-8635-b65f1af39dec\") " pod="openshift-console/console-84f9d695fc-xlj7v" Mar 11 01:09:26 crc kubenswrapper[4744]: I0311 01:09:26.795068 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/62188ad5-0dce-407b-8635-b65f1af39dec-console-config\") pod \"console-84f9d695fc-xlj7v\" (UID: \"62188ad5-0dce-407b-8635-b65f1af39dec\") " pod="openshift-console/console-84f9d695fc-xlj7v" Mar 11 01:09:26 crc kubenswrapper[4744]: I0311 01:09:26.799132 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/62188ad5-0dce-407b-8635-b65f1af39dec-console-oauth-config\") pod \"console-84f9d695fc-xlj7v\" (UID: \"62188ad5-0dce-407b-8635-b65f1af39dec\") " pod="openshift-console/console-84f9d695fc-xlj7v" Mar 11 01:09:26 crc kubenswrapper[4744]: I0311 01:09:26.799179 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/62188ad5-0dce-407b-8635-b65f1af39dec-console-serving-cert\") pod \"console-84f9d695fc-xlj7v\" (UID: \"62188ad5-0dce-407b-8635-b65f1af39dec\") " pod="openshift-console/console-84f9d695fc-xlj7v" Mar 11 01:09:26 crc kubenswrapper[4744]: I0311 01:09:26.807772 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hrbfz\" (UniqueName: \"kubernetes.io/projected/62188ad5-0dce-407b-8635-b65f1af39dec-kube-api-access-hrbfz\") pod \"console-84f9d695fc-xlj7v\" (UID: \"62188ad5-0dce-407b-8635-b65f1af39dec\") " pod="openshift-console/console-84f9d695fc-xlj7v" Mar 
11 01:09:26 crc kubenswrapper[4744]: I0311 01:09:26.869057 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-84f9d695fc-xlj7v" Mar 11 01:09:27 crc kubenswrapper[4744]: I0311 01:09:27.104967 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-84f9d695fc-xlj7v"] Mar 11 01:09:27 crc kubenswrapper[4744]: I0311 01:09:27.106196 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/3af6f71e-7739-4a00-8f26-42d043c0d179-plugin-serving-cert\") pod \"nmstate-console-plugin-5dcbbd79cf-6jfrf\" (UID: \"3af6f71e-7739-4a00-8f26-42d043c0d179\") " pod="openshift-nmstate/nmstate-console-plugin-5dcbbd79cf-6jfrf" Mar 11 01:09:27 crc kubenswrapper[4744]: I0311 01:09:27.123147 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/3af6f71e-7739-4a00-8f26-42d043c0d179-plugin-serving-cert\") pod \"nmstate-console-plugin-5dcbbd79cf-6jfrf\" (UID: \"3af6f71e-7739-4a00-8f26-42d043c0d179\") " pod="openshift-nmstate/nmstate-console-plugin-5dcbbd79cf-6jfrf" Mar 11 01:09:27 crc kubenswrapper[4744]: I0311 01:09:27.221213 4744 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-console-plugin-5dcbbd79cf-6jfrf" Mar 11 01:09:27 crc kubenswrapper[4744]: I0311 01:09:27.524729 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-console-plugin-5dcbbd79cf-6jfrf"] Mar 11 01:09:27 crc kubenswrapper[4744]: I0311 01:09:27.698632 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-84f9d695fc-xlj7v" event={"ID":"62188ad5-0dce-407b-8635-b65f1af39dec","Type":"ContainerStarted","Data":"8121478d6ede961e3cb49da17510d01d8a2757fe33e1bd9b9635b3820649fe65"} Mar 11 01:09:27 crc kubenswrapper[4744]: I0311 01:09:27.698699 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-84f9d695fc-xlj7v" event={"ID":"62188ad5-0dce-407b-8635-b65f1af39dec","Type":"ContainerStarted","Data":"0ed011fe337abbb3d9c02d8bcc05e74ab9b185e4255a169e5c7f05616a2f05dc"} Mar 11 01:09:27 crc kubenswrapper[4744]: I0311 01:09:27.702003 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-69594cc75-k2dqq" event={"ID":"5fd864c8-f5b3-4d6d-a9b0-85bb8661f5dc","Type":"ContainerStarted","Data":"47f553b8c980929985fad092149246f6470780869a5476a9e3e05ca61bd86855"} Mar 11 01:09:27 crc kubenswrapper[4744]: I0311 01:09:27.703666 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-console-plugin-5dcbbd79cf-6jfrf" event={"ID":"3af6f71e-7739-4a00-8f26-42d043c0d179","Type":"ContainerStarted","Data":"6f7b19246258e6505b21d73a7b5bd02d5ec9e4bdb9d9294158080b32eeb457af"} Mar 11 01:09:27 crc kubenswrapper[4744]: I0311 01:09:27.704974 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-webhook-786f45cff4-bm8mj" event={"ID":"8cc11049-1fae-4c88-acad-91cb1622c0bc","Type":"ContainerStarted","Data":"86d532bdb947bf122b092bbdd0d0d1cfb865f1ea90cbcbe60215f90494a2bff4"} Mar 11 01:09:27 crc kubenswrapper[4744]: I0311 01:09:27.728734 4744 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-84f9d695fc-xlj7v" podStartSLOduration=1.728708547 podStartE2EDuration="1.728708547s" podCreationTimestamp="2026-03-11 01:09:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-11 01:09:27.726488617 +0000 UTC m=+924.530706252" watchObservedRunningTime="2026-03-11 01:09:27.728708547 +0000 UTC m=+924.532926192" Mar 11 01:09:29 crc kubenswrapper[4744]: I0311 01:09:29.562691 4744 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-dj4q5" Mar 11 01:09:29 crc kubenswrapper[4744]: I0311 01:09:29.623461 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-dj4q5" Mar 11 01:09:29 crc kubenswrapper[4744]: I0311 01:09:29.716188 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-69594cc75-k2dqq" event={"ID":"5fd864c8-f5b3-4d6d-a9b0-85bb8661f5dc","Type":"ContainerStarted","Data":"82a9bb2ebff4be475f844ce8a8f419a6a53599ea8cdaf5ffff3332021856f8eb"} Mar 11 01:09:29 crc kubenswrapper[4744]: I0311 01:09:29.720279 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-webhook-786f45cff4-bm8mj" event={"ID":"8cc11049-1fae-4c88-acad-91cb1622c0bc","Type":"ContainerStarted","Data":"8a580fd98ae18cc13fc9ba72e55a253621aac510df32ca23998d89262896350d"} Mar 11 01:09:29 crc kubenswrapper[4744]: I0311 01:09:29.720492 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-nmstate/nmstate-webhook-786f45cff4-bm8mj" Mar 11 01:09:29 crc kubenswrapper[4744]: I0311 01:09:29.735254 4744 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-webhook-786f45cff4-bm8mj" podStartSLOduration=1.151441835 podStartE2EDuration="3.735238618s" 
podCreationTimestamp="2026-03-11 01:09:26 +0000 UTC" firstStartedPulling="2026-03-11 01:09:26.758109026 +0000 UTC m=+923.562326641" lastFinishedPulling="2026-03-11 01:09:29.341905779 +0000 UTC m=+926.146123424" observedRunningTime="2026-03-11 01:09:29.734647281 +0000 UTC m=+926.538864886" watchObservedRunningTime="2026-03-11 01:09:29.735238618 +0000 UTC m=+926.539456213" Mar 11 01:09:30 crc kubenswrapper[4744]: I0311 01:09:30.728100 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-handler-4gtjr" event={"ID":"61689687-6d5f-4f04-bd77-cf749c0a77ee","Type":"ContainerStarted","Data":"2e6d5ff9efb320ef6b0be6ed113c0e019be791d86adf9792b856454a17629791"} Mar 11 01:09:30 crc kubenswrapper[4744]: I0311 01:09:30.728503 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-nmstate/nmstate-handler-4gtjr" Mar 11 01:09:30 crc kubenswrapper[4744]: I0311 01:09:30.730592 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-console-plugin-5dcbbd79cf-6jfrf" event={"ID":"3af6f71e-7739-4a00-8f26-42d043c0d179","Type":"ContainerStarted","Data":"500cdfe6deae564084cb5abcceda5fc8e3c515a71d26c6af6d6d344e3204788a"} Mar 11 01:09:30 crc kubenswrapper[4744]: I0311 01:09:30.745820 4744 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-handler-4gtjr" podStartSLOduration=1.950942108 podStartE2EDuration="4.745804704s" podCreationTimestamp="2026-03-11 01:09:26 +0000 UTC" firstStartedPulling="2026-03-11 01:09:26.575257068 +0000 UTC m=+923.379474673" lastFinishedPulling="2026-03-11 01:09:29.370119624 +0000 UTC m=+926.174337269" observedRunningTime="2026-03-11 01:09:30.745057211 +0000 UTC m=+927.549274846" watchObservedRunningTime="2026-03-11 01:09:30.745804704 +0000 UTC m=+927.550022309" Mar 11 01:09:30 crc kubenswrapper[4744]: I0311 01:09:30.775638 4744 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-nmstate/nmstate-console-plugin-5dcbbd79cf-6jfrf" podStartSLOduration=1.785473667 podStartE2EDuration="4.775614889s" podCreationTimestamp="2026-03-11 01:09:26 +0000 UTC" firstStartedPulling="2026-03-11 01:09:27.532527191 +0000 UTC m=+924.336744796" lastFinishedPulling="2026-03-11 01:09:30.522668373 +0000 UTC m=+927.326886018" observedRunningTime="2026-03-11 01:09:30.768910819 +0000 UTC m=+927.573128464" watchObservedRunningTime="2026-03-11 01:09:30.775614889 +0000 UTC m=+927.579832504" Mar 11 01:09:31 crc kubenswrapper[4744]: I0311 01:09:31.919448 4744 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-dj4q5"] Mar 11 01:09:31 crc kubenswrapper[4744]: I0311 01:09:31.919862 4744 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-dj4q5" podUID="e8d0e1ba-30c3-42d9-aefa-4acde61a0a55" containerName="registry-server" containerID="cri-o://929b251d089cb8e810d9ec449c99ff2f4f419ad14fc29af5467211d230a4b7e5" gracePeriod=2 Mar 11 01:09:32 crc kubenswrapper[4744]: I0311 01:09:32.376289 4744 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-dj4q5" Mar 11 01:09:32 crc kubenswrapper[4744]: I0311 01:09:32.422842 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e8d0e1ba-30c3-42d9-aefa-4acde61a0a55-utilities\") pod \"e8d0e1ba-30c3-42d9-aefa-4acde61a0a55\" (UID: \"e8d0e1ba-30c3-42d9-aefa-4acde61a0a55\") " Mar 11 01:09:32 crc kubenswrapper[4744]: I0311 01:09:32.422902 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qnrjs\" (UniqueName: \"kubernetes.io/projected/e8d0e1ba-30c3-42d9-aefa-4acde61a0a55-kube-api-access-qnrjs\") pod \"e8d0e1ba-30c3-42d9-aefa-4acde61a0a55\" (UID: \"e8d0e1ba-30c3-42d9-aefa-4acde61a0a55\") " Mar 11 01:09:32 crc kubenswrapper[4744]: I0311 01:09:32.422940 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e8d0e1ba-30c3-42d9-aefa-4acde61a0a55-catalog-content\") pod \"e8d0e1ba-30c3-42d9-aefa-4acde61a0a55\" (UID: \"e8d0e1ba-30c3-42d9-aefa-4acde61a0a55\") " Mar 11 01:09:32 crc kubenswrapper[4744]: I0311 01:09:32.425119 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e8d0e1ba-30c3-42d9-aefa-4acde61a0a55-utilities" (OuterVolumeSpecName: "utilities") pod "e8d0e1ba-30c3-42d9-aefa-4acde61a0a55" (UID: "e8d0e1ba-30c3-42d9-aefa-4acde61a0a55"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 11 01:09:32 crc kubenswrapper[4744]: I0311 01:09:32.439896 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e8d0e1ba-30c3-42d9-aefa-4acde61a0a55-kube-api-access-qnrjs" (OuterVolumeSpecName: "kube-api-access-qnrjs") pod "e8d0e1ba-30c3-42d9-aefa-4acde61a0a55" (UID: "e8d0e1ba-30c3-42d9-aefa-4acde61a0a55"). InnerVolumeSpecName "kube-api-access-qnrjs". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 01:09:32 crc kubenswrapper[4744]: I0311 01:09:32.524338 4744 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e8d0e1ba-30c3-42d9-aefa-4acde61a0a55-utilities\") on node \"crc\" DevicePath \"\"" Mar 11 01:09:32 crc kubenswrapper[4744]: I0311 01:09:32.524390 4744 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qnrjs\" (UniqueName: \"kubernetes.io/projected/e8d0e1ba-30c3-42d9-aefa-4acde61a0a55-kube-api-access-qnrjs\") on node \"crc\" DevicePath \"\"" Mar 11 01:09:32 crc kubenswrapper[4744]: I0311 01:09:32.615144 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e8d0e1ba-30c3-42d9-aefa-4acde61a0a55-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "e8d0e1ba-30c3-42d9-aefa-4acde61a0a55" (UID: "e8d0e1ba-30c3-42d9-aefa-4acde61a0a55"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 11 01:09:32 crc kubenswrapper[4744]: I0311 01:09:32.625677 4744 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e8d0e1ba-30c3-42d9-aefa-4acde61a0a55-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 11 01:09:32 crc kubenswrapper[4744]: I0311 01:09:32.749856 4744 generic.go:334] "Generic (PLEG): container finished" podID="e8d0e1ba-30c3-42d9-aefa-4acde61a0a55" containerID="929b251d089cb8e810d9ec449c99ff2f4f419ad14fc29af5467211d230a4b7e5" exitCode=0 Mar 11 01:09:32 crc kubenswrapper[4744]: I0311 01:09:32.749951 4744 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-dj4q5" Mar 11 01:09:32 crc kubenswrapper[4744]: I0311 01:09:32.749964 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-dj4q5" event={"ID":"e8d0e1ba-30c3-42d9-aefa-4acde61a0a55","Type":"ContainerDied","Data":"929b251d089cb8e810d9ec449c99ff2f4f419ad14fc29af5467211d230a4b7e5"} Mar 11 01:09:32 crc kubenswrapper[4744]: I0311 01:09:32.750005 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-dj4q5" event={"ID":"e8d0e1ba-30c3-42d9-aefa-4acde61a0a55","Type":"ContainerDied","Data":"7d146f0a32c82b869de11142176da423cd5be4c8886fbc2277f99cf93b1fa5f3"} Mar 11 01:09:32 crc kubenswrapper[4744]: I0311 01:09:32.750034 4744 scope.go:117] "RemoveContainer" containerID="929b251d089cb8e810d9ec449c99ff2f4f419ad14fc29af5467211d230a4b7e5" Mar 11 01:09:32 crc kubenswrapper[4744]: I0311 01:09:32.754556 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-69594cc75-k2dqq" event={"ID":"5fd864c8-f5b3-4d6d-a9b0-85bb8661f5dc","Type":"ContainerStarted","Data":"b3a3b17063de572db6e2bf7f447308a00b71049a5b845af0f0489c758d6a8414"} Mar 11 01:09:32 crc kubenswrapper[4744]: I0311 01:09:32.779255 4744 scope.go:117] "RemoveContainer" containerID="8910bbf135af2129e06f5e476ef7417fce97fbef081057928bc8cde493101ec1" Mar 11 01:09:32 crc kubenswrapper[4744]: I0311 01:09:32.800086 4744 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-metrics-69594cc75-k2dqq" podStartSLOduration=1.775188776 podStartE2EDuration="6.800062004s" podCreationTimestamp="2026-03-11 01:09:26 +0000 UTC" firstStartedPulling="2026-03-11 01:09:26.722156528 +0000 UTC m=+923.526374133" lastFinishedPulling="2026-03-11 01:09:31.747029756 +0000 UTC m=+928.551247361" observedRunningTime="2026-03-11 01:09:32.783913856 +0000 UTC m=+929.588131491" watchObservedRunningTime="2026-03-11 
01:09:32.800062004 +0000 UTC m=+929.604279649" Mar 11 01:09:32 crc kubenswrapper[4744]: I0311 01:09:32.809465 4744 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-dj4q5"] Mar 11 01:09:32 crc kubenswrapper[4744]: I0311 01:09:32.816220 4744 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-dj4q5"] Mar 11 01:09:32 crc kubenswrapper[4744]: I0311 01:09:32.844360 4744 scope.go:117] "RemoveContainer" containerID="7f07130361942a6f8dbe5824130fdb66c428a9ae0098013f3316256683ba4cc3" Mar 11 01:09:32 crc kubenswrapper[4744]: I0311 01:09:32.875772 4744 scope.go:117] "RemoveContainer" containerID="929b251d089cb8e810d9ec449c99ff2f4f419ad14fc29af5467211d230a4b7e5" Mar 11 01:09:32 crc kubenswrapper[4744]: E0311 01:09:32.876641 4744 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"929b251d089cb8e810d9ec449c99ff2f4f419ad14fc29af5467211d230a4b7e5\": container with ID starting with 929b251d089cb8e810d9ec449c99ff2f4f419ad14fc29af5467211d230a4b7e5 not found: ID does not exist" containerID="929b251d089cb8e810d9ec449c99ff2f4f419ad14fc29af5467211d230a4b7e5" Mar 11 01:09:32 crc kubenswrapper[4744]: I0311 01:09:32.876681 4744 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"929b251d089cb8e810d9ec449c99ff2f4f419ad14fc29af5467211d230a4b7e5"} err="failed to get container status \"929b251d089cb8e810d9ec449c99ff2f4f419ad14fc29af5467211d230a4b7e5\": rpc error: code = NotFound desc = could not find container \"929b251d089cb8e810d9ec449c99ff2f4f419ad14fc29af5467211d230a4b7e5\": container with ID starting with 929b251d089cb8e810d9ec449c99ff2f4f419ad14fc29af5467211d230a4b7e5 not found: ID does not exist" Mar 11 01:09:32 crc kubenswrapper[4744]: I0311 01:09:32.876707 4744 scope.go:117] "RemoveContainer" containerID="8910bbf135af2129e06f5e476ef7417fce97fbef081057928bc8cde493101ec1" Mar 11 01:09:32 
crc kubenswrapper[4744]: E0311 01:09:32.877168 4744 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8910bbf135af2129e06f5e476ef7417fce97fbef081057928bc8cde493101ec1\": container with ID starting with 8910bbf135af2129e06f5e476ef7417fce97fbef081057928bc8cde493101ec1 not found: ID does not exist" containerID="8910bbf135af2129e06f5e476ef7417fce97fbef081057928bc8cde493101ec1" Mar 11 01:09:32 crc kubenswrapper[4744]: I0311 01:09:32.877205 4744 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8910bbf135af2129e06f5e476ef7417fce97fbef081057928bc8cde493101ec1"} err="failed to get container status \"8910bbf135af2129e06f5e476ef7417fce97fbef081057928bc8cde493101ec1\": rpc error: code = NotFound desc = could not find container \"8910bbf135af2129e06f5e476ef7417fce97fbef081057928bc8cde493101ec1\": container with ID starting with 8910bbf135af2129e06f5e476ef7417fce97fbef081057928bc8cde493101ec1 not found: ID does not exist" Mar 11 01:09:32 crc kubenswrapper[4744]: I0311 01:09:32.877231 4744 scope.go:117] "RemoveContainer" containerID="7f07130361942a6f8dbe5824130fdb66c428a9ae0098013f3316256683ba4cc3" Mar 11 01:09:32 crc kubenswrapper[4744]: E0311 01:09:32.877831 4744 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7f07130361942a6f8dbe5824130fdb66c428a9ae0098013f3316256683ba4cc3\": container with ID starting with 7f07130361942a6f8dbe5824130fdb66c428a9ae0098013f3316256683ba4cc3 not found: ID does not exist" containerID="7f07130361942a6f8dbe5824130fdb66c428a9ae0098013f3316256683ba4cc3" Mar 11 01:09:32 crc kubenswrapper[4744]: I0311 01:09:32.877857 4744 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7f07130361942a6f8dbe5824130fdb66c428a9ae0098013f3316256683ba4cc3"} err="failed to get container status 
\"7f07130361942a6f8dbe5824130fdb66c428a9ae0098013f3316256683ba4cc3\": rpc error: code = NotFound desc = could not find container \"7f07130361942a6f8dbe5824130fdb66c428a9ae0098013f3316256683ba4cc3\": container with ID starting with 7f07130361942a6f8dbe5824130fdb66c428a9ae0098013f3316256683ba4cc3 not found: ID does not exist" Mar 11 01:09:33 crc kubenswrapper[4744]: I0311 01:09:33.987350 4744 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e8d0e1ba-30c3-42d9-aefa-4acde61a0a55" path="/var/lib/kubelet/pods/e8d0e1ba-30c3-42d9-aefa-4acde61a0a55/volumes" Mar 11 01:09:36 crc kubenswrapper[4744]: I0311 01:09:36.545720 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-nmstate/nmstate-handler-4gtjr" Mar 11 01:09:36 crc kubenswrapper[4744]: I0311 01:09:36.870184 4744 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-84f9d695fc-xlj7v" Mar 11 01:09:36 crc kubenswrapper[4744]: I0311 01:09:36.870252 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/console-84f9d695fc-xlj7v" Mar 11 01:09:36 crc kubenswrapper[4744]: I0311 01:09:36.878487 4744 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-84f9d695fc-xlj7v" Mar 11 01:09:37 crc kubenswrapper[4744]: I0311 01:09:37.800627 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-84f9d695fc-xlj7v" Mar 11 01:09:37 crc kubenswrapper[4744]: I0311 01:09:37.883068 4744 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-f9d7485db-msd9d"] Mar 11 01:09:46 crc kubenswrapper[4744]: I0311 01:09:46.513672 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-nmstate/nmstate-webhook-786f45cff4-bm8mj" Mar 11 01:10:00 crc kubenswrapper[4744]: I0311 01:10:00.136269 4744 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-infra/auto-csr-approver-29553190-9nkn2"] Mar 11 01:10:00 crc kubenswrapper[4744]: E0311 01:10:00.137999 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e8d0e1ba-30c3-42d9-aefa-4acde61a0a55" containerName="registry-server" Mar 11 01:10:00 crc kubenswrapper[4744]: I0311 01:10:00.138023 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="e8d0e1ba-30c3-42d9-aefa-4acde61a0a55" containerName="registry-server" Mar 11 01:10:00 crc kubenswrapper[4744]: E0311 01:10:00.138043 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e8d0e1ba-30c3-42d9-aefa-4acde61a0a55" containerName="extract-content" Mar 11 01:10:00 crc kubenswrapper[4744]: I0311 01:10:00.138055 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="e8d0e1ba-30c3-42d9-aefa-4acde61a0a55" containerName="extract-content" Mar 11 01:10:00 crc kubenswrapper[4744]: E0311 01:10:00.138076 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e8d0e1ba-30c3-42d9-aefa-4acde61a0a55" containerName="extract-utilities" Mar 11 01:10:00 crc kubenswrapper[4744]: I0311 01:10:00.138089 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="e8d0e1ba-30c3-42d9-aefa-4acde61a0a55" containerName="extract-utilities" Mar 11 01:10:00 crc kubenswrapper[4744]: I0311 01:10:00.138278 4744 memory_manager.go:354] "RemoveStaleState removing state" podUID="e8d0e1ba-30c3-42d9-aefa-4acde61a0a55" containerName="registry-server" Mar 11 01:10:00 crc kubenswrapper[4744]: I0311 01:10:00.138844 4744 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29553190-9nkn2" Mar 11 01:10:00 crc kubenswrapper[4744]: I0311 01:10:00.141630 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 11 01:10:00 crc kubenswrapper[4744]: I0311 01:10:00.141719 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-rs77s" Mar 11 01:10:00 crc kubenswrapper[4744]: I0311 01:10:00.142697 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 11 01:10:00 crc kubenswrapper[4744]: I0311 01:10:00.143832 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29553190-9nkn2"] Mar 11 01:10:00 crc kubenswrapper[4744]: I0311 01:10:00.178456 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l2qgf\" (UniqueName: \"kubernetes.io/projected/60e9671f-6963-4873-8357-2580e9b768f0-kube-api-access-l2qgf\") pod \"auto-csr-approver-29553190-9nkn2\" (UID: \"60e9671f-6963-4873-8357-2580e9b768f0\") " pod="openshift-infra/auto-csr-approver-29553190-9nkn2" Mar 11 01:10:00 crc kubenswrapper[4744]: I0311 01:10:00.279947 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l2qgf\" (UniqueName: \"kubernetes.io/projected/60e9671f-6963-4873-8357-2580e9b768f0-kube-api-access-l2qgf\") pod \"auto-csr-approver-29553190-9nkn2\" (UID: \"60e9671f-6963-4873-8357-2580e9b768f0\") " pod="openshift-infra/auto-csr-approver-29553190-9nkn2" Mar 11 01:10:00 crc kubenswrapper[4744]: I0311 01:10:00.323565 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l2qgf\" (UniqueName: \"kubernetes.io/projected/60e9671f-6963-4873-8357-2580e9b768f0-kube-api-access-l2qgf\") pod \"auto-csr-approver-29553190-9nkn2\" (UID: \"60e9671f-6963-4873-8357-2580e9b768f0\") " 
pod="openshift-infra/auto-csr-approver-29553190-9nkn2" Mar 11 01:10:00 crc kubenswrapper[4744]: I0311 01:10:00.477603 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29553190-9nkn2" Mar 11 01:10:00 crc kubenswrapper[4744]: I0311 01:10:00.688651 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4mmpw4"] Mar 11 01:10:00 crc kubenswrapper[4744]: I0311 01:10:00.690258 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4mmpw4" Mar 11 01:10:00 crc kubenswrapper[4744]: I0311 01:10:00.695001 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc" Mar 11 01:10:00 crc kubenswrapper[4744]: I0311 01:10:00.704701 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4mmpw4"] Mar 11 01:10:00 crc kubenswrapper[4744]: I0311 01:10:00.787655 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/38f80f5b-94aa-4852-a041-427b37320e97-bundle\") pod \"d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4mmpw4\" (UID: \"38f80f5b-94aa-4852-a041-427b37320e97\") " pod="openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4mmpw4" Mar 11 01:10:00 crc kubenswrapper[4744]: I0311 01:10:00.787728 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7qdm8\" (UniqueName: \"kubernetes.io/projected/38f80f5b-94aa-4852-a041-427b37320e97-kube-api-access-7qdm8\") pod \"d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4mmpw4\" (UID: \"38f80f5b-94aa-4852-a041-427b37320e97\") " 
pod="openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4mmpw4" Mar 11 01:10:00 crc kubenswrapper[4744]: I0311 01:10:00.787759 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/38f80f5b-94aa-4852-a041-427b37320e97-util\") pod \"d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4mmpw4\" (UID: \"38f80f5b-94aa-4852-a041-427b37320e97\") " pod="openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4mmpw4" Mar 11 01:10:00 crc kubenswrapper[4744]: I0311 01:10:00.888640 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/38f80f5b-94aa-4852-a041-427b37320e97-bundle\") pod \"d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4mmpw4\" (UID: \"38f80f5b-94aa-4852-a041-427b37320e97\") " pod="openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4mmpw4" Mar 11 01:10:00 crc kubenswrapper[4744]: I0311 01:10:00.888703 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7qdm8\" (UniqueName: \"kubernetes.io/projected/38f80f5b-94aa-4852-a041-427b37320e97-kube-api-access-7qdm8\") pod \"d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4mmpw4\" (UID: \"38f80f5b-94aa-4852-a041-427b37320e97\") " pod="openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4mmpw4" Mar 11 01:10:00 crc kubenswrapper[4744]: I0311 01:10:00.888723 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/38f80f5b-94aa-4852-a041-427b37320e97-util\") pod \"d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4mmpw4\" (UID: \"38f80f5b-94aa-4852-a041-427b37320e97\") " pod="openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4mmpw4" Mar 11 01:10:00 crc kubenswrapper[4744]: I0311 
01:10:00.889250 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/38f80f5b-94aa-4852-a041-427b37320e97-bundle\") pod \"d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4mmpw4\" (UID: \"38f80f5b-94aa-4852-a041-427b37320e97\") " pod="openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4mmpw4" Mar 11 01:10:00 crc kubenswrapper[4744]: I0311 01:10:00.889291 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/38f80f5b-94aa-4852-a041-427b37320e97-util\") pod \"d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4mmpw4\" (UID: \"38f80f5b-94aa-4852-a041-427b37320e97\") " pod="openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4mmpw4" Mar 11 01:10:00 crc kubenswrapper[4744]: I0311 01:10:00.913976 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7qdm8\" (UniqueName: \"kubernetes.io/projected/38f80f5b-94aa-4852-a041-427b37320e97-kube-api-access-7qdm8\") pod \"d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4mmpw4\" (UID: \"38f80f5b-94aa-4852-a041-427b37320e97\") " pod="openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4mmpw4" Mar 11 01:10:00 crc kubenswrapper[4744]: I0311 01:10:00.971077 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29553190-9nkn2"] Mar 11 01:10:00 crc kubenswrapper[4744]: W0311 01:10:00.982944 4744 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod60e9671f_6963_4873_8357_2580e9b768f0.slice/crio-1a8755b214319723bf637142a56e075344f287d75c104f13dc9a83545591bc2c WatchSource:0}: Error finding container 1a8755b214319723bf637142a56e075344f287d75c104f13dc9a83545591bc2c: Status 404 returned error can't find the container with id 
1a8755b214319723bf637142a56e075344f287d75c104f13dc9a83545591bc2c Mar 11 01:10:01 crc kubenswrapper[4744]: I0311 01:10:01.024234 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4mmpw4" Mar 11 01:10:01 crc kubenswrapper[4744]: I0311 01:10:01.335993 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4mmpw4"] Mar 11 01:10:01 crc kubenswrapper[4744]: W0311 01:10:01.345729 4744 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod38f80f5b_94aa_4852_a041_427b37320e97.slice/crio-5dead97af3b13b20e7394477ca6b6078cfe5a8eeb65b0f598b95e45ab692fbe0 WatchSource:0}: Error finding container 5dead97af3b13b20e7394477ca6b6078cfe5a8eeb65b0f598b95e45ab692fbe0: Status 404 returned error can't find the container with id 5dead97af3b13b20e7394477ca6b6078cfe5a8eeb65b0f598b95e45ab692fbe0 Mar 11 01:10:01 crc kubenswrapper[4744]: I0311 01:10:01.958419 4744 generic.go:334] "Generic (PLEG): container finished" podID="38f80f5b-94aa-4852-a041-427b37320e97" containerID="0ec56fc0878cabd6d4e2484f8ccb092159de71e1e77b49bcaacb8232339a5edf" exitCode=0 Mar 11 01:10:01 crc kubenswrapper[4744]: I0311 01:10:01.958587 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4mmpw4" event={"ID":"38f80f5b-94aa-4852-a041-427b37320e97","Type":"ContainerDied","Data":"0ec56fc0878cabd6d4e2484f8ccb092159de71e1e77b49bcaacb8232339a5edf"} Mar 11 01:10:01 crc kubenswrapper[4744]: I0311 01:10:01.958629 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4mmpw4" 
event={"ID":"38f80f5b-94aa-4852-a041-427b37320e97","Type":"ContainerStarted","Data":"5dead97af3b13b20e7394477ca6b6078cfe5a8eeb65b0f598b95e45ab692fbe0"} Mar 11 01:10:01 crc kubenswrapper[4744]: I0311 01:10:01.961099 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29553190-9nkn2" event={"ID":"60e9671f-6963-4873-8357-2580e9b768f0","Type":"ContainerStarted","Data":"1a8755b214319723bf637142a56e075344f287d75c104f13dc9a83545591bc2c"} Mar 11 01:10:02 crc kubenswrapper[4744]: I0311 01:10:02.952431 4744 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-console/console-f9d7485db-msd9d" podUID="0d55c2f3-2eef-42eb-8627-ccd40e21f4d0" containerName="console" containerID="cri-o://aea920f6b334048eb7438150345cab9fae13e3724c414aabe885b3c926bed773" gracePeriod=15 Mar 11 01:10:03 crc kubenswrapper[4744]: I0311 01:10:03.423956 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-f9d7485db-msd9d_0d55c2f3-2eef-42eb-8627-ccd40e21f4d0/console/0.log" Mar 11 01:10:03 crc kubenswrapper[4744]: I0311 01:10:03.424324 4744 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-f9d7485db-msd9d" Mar 11 01:10:03 crc kubenswrapper[4744]: I0311 01:10:03.529382 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/0d55c2f3-2eef-42eb-8627-ccd40e21f4d0-console-serving-cert\") pod \"0d55c2f3-2eef-42eb-8627-ccd40e21f4d0\" (UID: \"0d55c2f3-2eef-42eb-8627-ccd40e21f4d0\") " Mar 11 01:10:03 crc kubenswrapper[4744]: I0311 01:10:03.529430 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/0d55c2f3-2eef-42eb-8627-ccd40e21f4d0-console-oauth-config\") pod \"0d55c2f3-2eef-42eb-8627-ccd40e21f4d0\" (UID: \"0d55c2f3-2eef-42eb-8627-ccd40e21f4d0\") " Mar 11 01:10:03 crc kubenswrapper[4744]: I0311 01:10:03.529471 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qmsqr\" (UniqueName: \"kubernetes.io/projected/0d55c2f3-2eef-42eb-8627-ccd40e21f4d0-kube-api-access-qmsqr\") pod \"0d55c2f3-2eef-42eb-8627-ccd40e21f4d0\" (UID: \"0d55c2f3-2eef-42eb-8627-ccd40e21f4d0\") " Mar 11 01:10:03 crc kubenswrapper[4744]: I0311 01:10:03.529507 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/0d55c2f3-2eef-42eb-8627-ccd40e21f4d0-oauth-serving-cert\") pod \"0d55c2f3-2eef-42eb-8627-ccd40e21f4d0\" (UID: \"0d55c2f3-2eef-42eb-8627-ccd40e21f4d0\") " Mar 11 01:10:03 crc kubenswrapper[4744]: I0311 01:10:03.529561 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0d55c2f3-2eef-42eb-8627-ccd40e21f4d0-service-ca\") pod \"0d55c2f3-2eef-42eb-8627-ccd40e21f4d0\" (UID: \"0d55c2f3-2eef-42eb-8627-ccd40e21f4d0\") " Mar 11 01:10:03 crc kubenswrapper[4744]: I0311 01:10:03.529602 4744 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0d55c2f3-2eef-42eb-8627-ccd40e21f4d0-trusted-ca-bundle\") pod \"0d55c2f3-2eef-42eb-8627-ccd40e21f4d0\" (UID: \"0d55c2f3-2eef-42eb-8627-ccd40e21f4d0\") " Mar 11 01:10:03 crc kubenswrapper[4744]: I0311 01:10:03.529627 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/0d55c2f3-2eef-42eb-8627-ccd40e21f4d0-console-config\") pod \"0d55c2f3-2eef-42eb-8627-ccd40e21f4d0\" (UID: \"0d55c2f3-2eef-42eb-8627-ccd40e21f4d0\") " Mar 11 01:10:03 crc kubenswrapper[4744]: I0311 01:10:03.531049 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0d55c2f3-2eef-42eb-8627-ccd40e21f4d0-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "0d55c2f3-2eef-42eb-8627-ccd40e21f4d0" (UID: "0d55c2f3-2eef-42eb-8627-ccd40e21f4d0"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 11 01:10:03 crc kubenswrapper[4744]: I0311 01:10:03.531156 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0d55c2f3-2eef-42eb-8627-ccd40e21f4d0-console-config" (OuterVolumeSpecName: "console-config") pod "0d55c2f3-2eef-42eb-8627-ccd40e21f4d0" (UID: "0d55c2f3-2eef-42eb-8627-ccd40e21f4d0"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 11 01:10:03 crc kubenswrapper[4744]: I0311 01:10:03.531168 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0d55c2f3-2eef-42eb-8627-ccd40e21f4d0-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "0d55c2f3-2eef-42eb-8627-ccd40e21f4d0" (UID: "0d55c2f3-2eef-42eb-8627-ccd40e21f4d0"). InnerVolumeSpecName "trusted-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 11 01:10:03 crc kubenswrapper[4744]: I0311 01:10:03.531879 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0d55c2f3-2eef-42eb-8627-ccd40e21f4d0-service-ca" (OuterVolumeSpecName: "service-ca") pod "0d55c2f3-2eef-42eb-8627-ccd40e21f4d0" (UID: "0d55c2f3-2eef-42eb-8627-ccd40e21f4d0"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 11 01:10:03 crc kubenswrapper[4744]: I0311 01:10:03.536320 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0d55c2f3-2eef-42eb-8627-ccd40e21f4d0-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "0d55c2f3-2eef-42eb-8627-ccd40e21f4d0" (UID: "0d55c2f3-2eef-42eb-8627-ccd40e21f4d0"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 01:10:03 crc kubenswrapper[4744]: I0311 01:10:03.536759 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0d55c2f3-2eef-42eb-8627-ccd40e21f4d0-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "0d55c2f3-2eef-42eb-8627-ccd40e21f4d0" (UID: "0d55c2f3-2eef-42eb-8627-ccd40e21f4d0"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 01:10:03 crc kubenswrapper[4744]: I0311 01:10:03.537386 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0d55c2f3-2eef-42eb-8627-ccd40e21f4d0-kube-api-access-qmsqr" (OuterVolumeSpecName: "kube-api-access-qmsqr") pod "0d55c2f3-2eef-42eb-8627-ccd40e21f4d0" (UID: "0d55c2f3-2eef-42eb-8627-ccd40e21f4d0"). InnerVolumeSpecName "kube-api-access-qmsqr". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 01:10:03 crc kubenswrapper[4744]: I0311 01:10:03.631766 4744 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0d55c2f3-2eef-42eb-8627-ccd40e21f4d0-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 11 01:10:03 crc kubenswrapper[4744]: I0311 01:10:03.632080 4744 reconciler_common.go:293] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/0d55c2f3-2eef-42eb-8627-ccd40e21f4d0-console-config\") on node \"crc\" DevicePath \"\"" Mar 11 01:10:03 crc kubenswrapper[4744]: I0311 01:10:03.632378 4744 reconciler_common.go:293] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/0d55c2f3-2eef-42eb-8627-ccd40e21f4d0-console-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 11 01:10:03 crc kubenswrapper[4744]: I0311 01:10:03.632601 4744 reconciler_common.go:293] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/0d55c2f3-2eef-42eb-8627-ccd40e21f4d0-console-oauth-config\") on node \"crc\" DevicePath \"\"" Mar 11 01:10:03 crc kubenswrapper[4744]: I0311 01:10:03.632747 4744 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qmsqr\" (UniqueName: \"kubernetes.io/projected/0d55c2f3-2eef-42eb-8627-ccd40e21f4d0-kube-api-access-qmsqr\") on node \"crc\" DevicePath \"\"" Mar 11 01:10:03 crc kubenswrapper[4744]: I0311 01:10:03.632865 4744 reconciler_common.go:293] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/0d55c2f3-2eef-42eb-8627-ccd40e21f4d0-oauth-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 11 01:10:03 crc kubenswrapper[4744]: I0311 01:10:03.632989 4744 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0d55c2f3-2eef-42eb-8627-ccd40e21f4d0-service-ca\") on node \"crc\" DevicePath \"\"" Mar 11 01:10:03 crc 
kubenswrapper[4744]: I0311 01:10:03.982015 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-f9d7485db-msd9d_0d55c2f3-2eef-42eb-8627-ccd40e21f4d0/console/0.log" Mar 11 01:10:03 crc kubenswrapper[4744]: I0311 01:10:03.983760 4744 generic.go:334] "Generic (PLEG): container finished" podID="0d55c2f3-2eef-42eb-8627-ccd40e21f4d0" containerID="aea920f6b334048eb7438150345cab9fae13e3724c414aabe885b3c926bed773" exitCode=2 Mar 11 01:10:03 crc kubenswrapper[4744]: I0311 01:10:03.983912 4744 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-msd9d" Mar 11 01:10:03 crc kubenswrapper[4744]: I0311 01:10:03.988025 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-msd9d" event={"ID":"0d55c2f3-2eef-42eb-8627-ccd40e21f4d0","Type":"ContainerDied","Data":"aea920f6b334048eb7438150345cab9fae13e3724c414aabe885b3c926bed773"} Mar 11 01:10:03 crc kubenswrapper[4744]: I0311 01:10:03.988083 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-msd9d" event={"ID":"0d55c2f3-2eef-42eb-8627-ccd40e21f4d0","Type":"ContainerDied","Data":"5b1437991fe6641c1aee417825e707901e69a59a9c2e19ee5e372209c97a5570"} Mar 11 01:10:03 crc kubenswrapper[4744]: I0311 01:10:03.988115 4744 scope.go:117] "RemoveContainer" containerID="aea920f6b334048eb7438150345cab9fae13e3724c414aabe885b3c926bed773" Mar 11 01:10:04 crc kubenswrapper[4744]: I0311 01:10:04.042946 4744 scope.go:117] "RemoveContainer" containerID="aea920f6b334048eb7438150345cab9fae13e3724c414aabe885b3c926bed773" Mar 11 01:10:04 crc kubenswrapper[4744]: E0311 01:10:04.043706 4744 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"aea920f6b334048eb7438150345cab9fae13e3724c414aabe885b3c926bed773\": container with ID starting with aea920f6b334048eb7438150345cab9fae13e3724c414aabe885b3c926bed773 not 
found: ID does not exist" containerID="aea920f6b334048eb7438150345cab9fae13e3724c414aabe885b3c926bed773" Mar 11 01:10:04 crc kubenswrapper[4744]: I0311 01:10:04.043757 4744 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"aea920f6b334048eb7438150345cab9fae13e3724c414aabe885b3c926bed773"} err="failed to get container status \"aea920f6b334048eb7438150345cab9fae13e3724c414aabe885b3c926bed773\": rpc error: code = NotFound desc = could not find container \"aea920f6b334048eb7438150345cab9fae13e3724c414aabe885b3c926bed773\": container with ID starting with aea920f6b334048eb7438150345cab9fae13e3724c414aabe885b3c926bed773 not found: ID does not exist" Mar 11 01:10:04 crc kubenswrapper[4744]: I0311 01:10:04.047591 4744 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-f9d7485db-msd9d"] Mar 11 01:10:04 crc kubenswrapper[4744]: I0311 01:10:04.053861 4744 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-f9d7485db-msd9d"] Mar 11 01:10:04 crc kubenswrapper[4744]: I0311 01:10:04.994727 4744 generic.go:334] "Generic (PLEG): container finished" podID="60e9671f-6963-4873-8357-2580e9b768f0" containerID="eb0d337444e47aa6e31605c97ea9af0ca01d4b30374bc3a640200fbe0eae7572" exitCode=0 Mar 11 01:10:04 crc kubenswrapper[4744]: I0311 01:10:04.994773 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29553190-9nkn2" event={"ID":"60e9671f-6963-4873-8357-2580e9b768f0","Type":"ContainerDied","Data":"eb0d337444e47aa6e31605c97ea9af0ca01d4b30374bc3a640200fbe0eae7572"} Mar 11 01:10:05 crc kubenswrapper[4744]: I0311 01:10:05.987299 4744 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0d55c2f3-2eef-42eb-8627-ccd40e21f4d0" path="/var/lib/kubelet/pods/0d55c2f3-2eef-42eb-8627-ccd40e21f4d0/volumes" Mar 11 01:10:06 crc kubenswrapper[4744]: I0311 01:10:06.004870 4744 generic.go:334] "Generic (PLEG): container finished" 
podID="38f80f5b-94aa-4852-a041-427b37320e97" containerID="455b1e6fe10c313aa30c763a12b74636f5b6dc867db8c318468807392b70187f" exitCode=0 Mar 11 01:10:06 crc kubenswrapper[4744]: I0311 01:10:06.004957 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4mmpw4" event={"ID":"38f80f5b-94aa-4852-a041-427b37320e97","Type":"ContainerDied","Data":"455b1e6fe10c313aa30c763a12b74636f5b6dc867db8c318468807392b70187f"} Mar 11 01:10:06 crc kubenswrapper[4744]: I0311 01:10:06.354600 4744 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29553190-9nkn2" Mar 11 01:10:06 crc kubenswrapper[4744]: I0311 01:10:06.480772 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-l2qgf\" (UniqueName: \"kubernetes.io/projected/60e9671f-6963-4873-8357-2580e9b768f0-kube-api-access-l2qgf\") pod \"60e9671f-6963-4873-8357-2580e9b768f0\" (UID: \"60e9671f-6963-4873-8357-2580e9b768f0\") " Mar 11 01:10:06 crc kubenswrapper[4744]: I0311 01:10:06.485099 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/60e9671f-6963-4873-8357-2580e9b768f0-kube-api-access-l2qgf" (OuterVolumeSpecName: "kube-api-access-l2qgf") pod "60e9671f-6963-4873-8357-2580e9b768f0" (UID: "60e9671f-6963-4873-8357-2580e9b768f0"). InnerVolumeSpecName "kube-api-access-l2qgf". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 01:10:06 crc kubenswrapper[4744]: I0311 01:10:06.581775 4744 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-l2qgf\" (UniqueName: \"kubernetes.io/projected/60e9671f-6963-4873-8357-2580e9b768f0-kube-api-access-l2qgf\") on node \"crc\" DevicePath \"\"" Mar 11 01:10:07 crc kubenswrapper[4744]: I0311 01:10:07.016200 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29553190-9nkn2" event={"ID":"60e9671f-6963-4873-8357-2580e9b768f0","Type":"ContainerDied","Data":"1a8755b214319723bf637142a56e075344f287d75c104f13dc9a83545591bc2c"} Mar 11 01:10:07 crc kubenswrapper[4744]: I0311 01:10:07.016260 4744 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1a8755b214319723bf637142a56e075344f287d75c104f13dc9a83545591bc2c" Mar 11 01:10:07 crc kubenswrapper[4744]: I0311 01:10:07.016334 4744 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29553190-9nkn2" Mar 11 01:10:07 crc kubenswrapper[4744]: I0311 01:10:07.023155 4744 generic.go:334] "Generic (PLEG): container finished" podID="38f80f5b-94aa-4852-a041-427b37320e97" containerID="aec8449ca916f4dae5d06d93238395b2b93c87938484376713581c062504fdc5" exitCode=0 Mar 11 01:10:07 crc kubenswrapper[4744]: I0311 01:10:07.023242 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4mmpw4" event={"ID":"38f80f5b-94aa-4852-a041-427b37320e97","Type":"ContainerDied","Data":"aec8449ca916f4dae5d06d93238395b2b93c87938484376713581c062504fdc5"} Mar 11 01:10:07 crc kubenswrapper[4744]: I0311 01:10:07.415137 4744 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29553184-gvb9b"] Mar 11 01:10:07 crc kubenswrapper[4744]: I0311 01:10:07.420863 4744 kubelet.go:2431] "SyncLoop REMOVE" source="api" 
pods=["openshift-infra/auto-csr-approver-29553184-gvb9b"] Mar 11 01:10:07 crc kubenswrapper[4744]: I0311 01:10:07.987236 4744 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a9acbc58-725d-49da-8110-54a476725dbc" path="/var/lib/kubelet/pods/a9acbc58-725d-49da-8110-54a476725dbc/volumes" Mar 11 01:10:08 crc kubenswrapper[4744]: I0311 01:10:08.351635 4744 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4mmpw4" Mar 11 01:10:08 crc kubenswrapper[4744]: I0311 01:10:08.408642 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/38f80f5b-94aa-4852-a041-427b37320e97-bundle\") pod \"38f80f5b-94aa-4852-a041-427b37320e97\" (UID: \"38f80f5b-94aa-4852-a041-427b37320e97\") " Mar 11 01:10:08 crc kubenswrapper[4744]: I0311 01:10:08.408770 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/38f80f5b-94aa-4852-a041-427b37320e97-util\") pod \"38f80f5b-94aa-4852-a041-427b37320e97\" (UID: \"38f80f5b-94aa-4852-a041-427b37320e97\") " Mar 11 01:10:08 crc kubenswrapper[4744]: I0311 01:10:08.408912 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7qdm8\" (UniqueName: \"kubernetes.io/projected/38f80f5b-94aa-4852-a041-427b37320e97-kube-api-access-7qdm8\") pod \"38f80f5b-94aa-4852-a041-427b37320e97\" (UID: \"38f80f5b-94aa-4852-a041-427b37320e97\") " Mar 11 01:10:08 crc kubenswrapper[4744]: I0311 01:10:08.409967 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/38f80f5b-94aa-4852-a041-427b37320e97-bundle" (OuterVolumeSpecName: "bundle") pod "38f80f5b-94aa-4852-a041-427b37320e97" (UID: "38f80f5b-94aa-4852-a041-427b37320e97"). InnerVolumeSpecName "bundle". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 11 01:10:08 crc kubenswrapper[4744]: I0311 01:10:08.416671 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/38f80f5b-94aa-4852-a041-427b37320e97-kube-api-access-7qdm8" (OuterVolumeSpecName: "kube-api-access-7qdm8") pod "38f80f5b-94aa-4852-a041-427b37320e97" (UID: "38f80f5b-94aa-4852-a041-427b37320e97"). InnerVolumeSpecName "kube-api-access-7qdm8". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 01:10:08 crc kubenswrapper[4744]: I0311 01:10:08.429712 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/38f80f5b-94aa-4852-a041-427b37320e97-util" (OuterVolumeSpecName: "util") pod "38f80f5b-94aa-4852-a041-427b37320e97" (UID: "38f80f5b-94aa-4852-a041-427b37320e97"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 11 01:10:08 crc kubenswrapper[4744]: I0311 01:10:08.510679 4744 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/38f80f5b-94aa-4852-a041-427b37320e97-bundle\") on node \"crc\" DevicePath \"\"" Mar 11 01:10:08 crc kubenswrapper[4744]: I0311 01:10:08.510714 4744 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/38f80f5b-94aa-4852-a041-427b37320e97-util\") on node \"crc\" DevicePath \"\"" Mar 11 01:10:08 crc kubenswrapper[4744]: I0311 01:10:08.510729 4744 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7qdm8\" (UniqueName: \"kubernetes.io/projected/38f80f5b-94aa-4852-a041-427b37320e97-kube-api-access-7qdm8\") on node \"crc\" DevicePath \"\"" Mar 11 01:10:09 crc kubenswrapper[4744]: I0311 01:10:09.045872 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4mmpw4" 
event={"ID":"38f80f5b-94aa-4852-a041-427b37320e97","Type":"ContainerDied","Data":"5dead97af3b13b20e7394477ca6b6078cfe5a8eeb65b0f598b95e45ab692fbe0"} Mar 11 01:10:09 crc kubenswrapper[4744]: I0311 01:10:09.045958 4744 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5dead97af3b13b20e7394477ca6b6078cfe5a8eeb65b0f598b95e45ab692fbe0" Mar 11 01:10:09 crc kubenswrapper[4744]: I0311 01:10:09.046092 4744 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4mmpw4" Mar 11 01:10:12 crc kubenswrapper[4744]: I0311 01:10:12.408839 4744 patch_prober.go:28] interesting pod/machine-config-daemon-678nx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 11 01:10:12 crc kubenswrapper[4744]: I0311 01:10:12.409221 4744 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-678nx" podUID="a15dc7ac-7c34-4135-b6eb-a85122800ce9" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 11 01:10:18 crc kubenswrapper[4744]: I0311 01:10:18.790269 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/metallb-operator-controller-manager-7c4458d67b-6zbc2"] Mar 11 01:10:18 crc kubenswrapper[4744]: E0311 01:10:18.790999 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="38f80f5b-94aa-4852-a041-427b37320e97" containerName="util" Mar 11 01:10:18 crc kubenswrapper[4744]: I0311 01:10:18.791015 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="38f80f5b-94aa-4852-a041-427b37320e97" containerName="util" Mar 11 01:10:18 crc kubenswrapper[4744]: E0311 01:10:18.791024 4744 
cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0d55c2f3-2eef-42eb-8627-ccd40e21f4d0" containerName="console" Mar 11 01:10:18 crc kubenswrapper[4744]: I0311 01:10:18.791031 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="0d55c2f3-2eef-42eb-8627-ccd40e21f4d0" containerName="console" Mar 11 01:10:18 crc kubenswrapper[4744]: E0311 01:10:18.791042 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="60e9671f-6963-4873-8357-2580e9b768f0" containerName="oc" Mar 11 01:10:18 crc kubenswrapper[4744]: I0311 01:10:18.791049 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="60e9671f-6963-4873-8357-2580e9b768f0" containerName="oc" Mar 11 01:10:18 crc kubenswrapper[4744]: E0311 01:10:18.791065 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="38f80f5b-94aa-4852-a041-427b37320e97" containerName="extract" Mar 11 01:10:18 crc kubenswrapper[4744]: I0311 01:10:18.791071 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="38f80f5b-94aa-4852-a041-427b37320e97" containerName="extract" Mar 11 01:10:18 crc kubenswrapper[4744]: E0311 01:10:18.791080 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="38f80f5b-94aa-4852-a041-427b37320e97" containerName="pull" Mar 11 01:10:18 crc kubenswrapper[4744]: I0311 01:10:18.791086 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="38f80f5b-94aa-4852-a041-427b37320e97" containerName="pull" Mar 11 01:10:18 crc kubenswrapper[4744]: I0311 01:10:18.791188 4744 memory_manager.go:354] "RemoveStaleState removing state" podUID="38f80f5b-94aa-4852-a041-427b37320e97" containerName="extract" Mar 11 01:10:18 crc kubenswrapper[4744]: I0311 01:10:18.791197 4744 memory_manager.go:354] "RemoveStaleState removing state" podUID="60e9671f-6963-4873-8357-2580e9b768f0" containerName="oc" Mar 11 01:10:18 crc kubenswrapper[4744]: I0311 01:10:18.791205 4744 memory_manager.go:354] "RemoveStaleState removing state" podUID="0d55c2f3-2eef-42eb-8627-ccd40e21f4d0" 
containerName="console" Mar 11 01:10:18 crc kubenswrapper[4744]: I0311 01:10:18.791583 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/metallb-operator-controller-manager-7c4458d67b-6zbc2" Mar 11 01:10:18 crc kubenswrapper[4744]: I0311 01:10:18.793946 4744 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"manager-account-dockercfg-w7t5v" Mar 11 01:10:18 crc kubenswrapper[4744]: I0311 01:10:18.794240 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"openshift-service-ca.crt" Mar 11 01:10:18 crc kubenswrapper[4744]: I0311 01:10:18.794458 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"kube-root-ca.crt" Mar 11 01:10:18 crc kubenswrapper[4744]: I0311 01:10:18.794501 4744 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-webhook-server-cert" Mar 11 01:10:18 crc kubenswrapper[4744]: I0311 01:10:18.794647 4744 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-controller-manager-service-cert" Mar 11 01:10:18 crc kubenswrapper[4744]: I0311 01:10:18.807954 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-controller-manager-7c4458d67b-6zbc2"] Mar 11 01:10:18 crc kubenswrapper[4744]: I0311 01:10:18.868075 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8m2ck\" (UniqueName: \"kubernetes.io/projected/e20b89e2-b171-4bef-877a-b8670fb99ce4-kube-api-access-8m2ck\") pod \"metallb-operator-controller-manager-7c4458d67b-6zbc2\" (UID: \"e20b89e2-b171-4bef-877a-b8670fb99ce4\") " pod="metallb-system/metallb-operator-controller-manager-7c4458d67b-6zbc2" Mar 11 01:10:18 crc kubenswrapper[4744]: I0311 01:10:18.868144 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" 
(UniqueName: \"kubernetes.io/secret/e20b89e2-b171-4bef-877a-b8670fb99ce4-webhook-cert\") pod \"metallb-operator-controller-manager-7c4458d67b-6zbc2\" (UID: \"e20b89e2-b171-4bef-877a-b8670fb99ce4\") " pod="metallb-system/metallb-operator-controller-manager-7c4458d67b-6zbc2" Mar 11 01:10:18 crc kubenswrapper[4744]: I0311 01:10:18.868178 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/e20b89e2-b171-4bef-877a-b8670fb99ce4-apiservice-cert\") pod \"metallb-operator-controller-manager-7c4458d67b-6zbc2\" (UID: \"e20b89e2-b171-4bef-877a-b8670fb99ce4\") " pod="metallb-system/metallb-operator-controller-manager-7c4458d67b-6zbc2" Mar 11 01:10:18 crc kubenswrapper[4744]: I0311 01:10:18.969494 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8m2ck\" (UniqueName: \"kubernetes.io/projected/e20b89e2-b171-4bef-877a-b8670fb99ce4-kube-api-access-8m2ck\") pod \"metallb-operator-controller-manager-7c4458d67b-6zbc2\" (UID: \"e20b89e2-b171-4bef-877a-b8670fb99ce4\") " pod="metallb-system/metallb-operator-controller-manager-7c4458d67b-6zbc2" Mar 11 01:10:18 crc kubenswrapper[4744]: I0311 01:10:18.969864 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/e20b89e2-b171-4bef-877a-b8670fb99ce4-webhook-cert\") pod \"metallb-operator-controller-manager-7c4458d67b-6zbc2\" (UID: \"e20b89e2-b171-4bef-877a-b8670fb99ce4\") " pod="metallb-system/metallb-operator-controller-manager-7c4458d67b-6zbc2" Mar 11 01:10:18 crc kubenswrapper[4744]: I0311 01:10:18.969915 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/e20b89e2-b171-4bef-877a-b8670fb99ce4-apiservice-cert\") pod \"metallb-operator-controller-manager-7c4458d67b-6zbc2\" (UID: \"e20b89e2-b171-4bef-877a-b8670fb99ce4\") " 
pod="metallb-system/metallb-operator-controller-manager-7c4458d67b-6zbc2" Mar 11 01:10:18 crc kubenswrapper[4744]: I0311 01:10:18.976262 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/e20b89e2-b171-4bef-877a-b8670fb99ce4-apiservice-cert\") pod \"metallb-operator-controller-manager-7c4458d67b-6zbc2\" (UID: \"e20b89e2-b171-4bef-877a-b8670fb99ce4\") " pod="metallb-system/metallb-operator-controller-manager-7c4458d67b-6zbc2" Mar 11 01:10:18 crc kubenswrapper[4744]: I0311 01:10:18.988145 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/e20b89e2-b171-4bef-877a-b8670fb99ce4-webhook-cert\") pod \"metallb-operator-controller-manager-7c4458d67b-6zbc2\" (UID: \"e20b89e2-b171-4bef-877a-b8670fb99ce4\") " pod="metallb-system/metallb-operator-controller-manager-7c4458d67b-6zbc2" Mar 11 01:10:19 crc kubenswrapper[4744]: I0311 01:10:19.000446 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8m2ck\" (UniqueName: \"kubernetes.io/projected/e20b89e2-b171-4bef-877a-b8670fb99ce4-kube-api-access-8m2ck\") pod \"metallb-operator-controller-manager-7c4458d67b-6zbc2\" (UID: \"e20b89e2-b171-4bef-877a-b8670fb99ce4\") " pod="metallb-system/metallb-operator-controller-manager-7c4458d67b-6zbc2" Mar 11 01:10:19 crc kubenswrapper[4744]: I0311 01:10:19.101608 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/metallb-operator-webhook-server-76dbb5d8c-qwk8s"] Mar 11 01:10:19 crc kubenswrapper[4744]: I0311 01:10:19.102275 4744 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/metallb-operator-webhook-server-76dbb5d8c-qwk8s" Mar 11 01:10:19 crc kubenswrapper[4744]: I0311 01:10:19.104670 4744 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-webhook-cert" Mar 11 01:10:19 crc kubenswrapper[4744]: I0311 01:10:19.104823 4744 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"controller-dockercfg-9zlcl" Mar 11 01:10:19 crc kubenswrapper[4744]: I0311 01:10:19.104770 4744 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-webhook-server-service-cert" Mar 11 01:10:19 crc kubenswrapper[4744]: I0311 01:10:19.105882 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/metallb-operator-controller-manager-7c4458d67b-6zbc2" Mar 11 01:10:19 crc kubenswrapper[4744]: I0311 01:10:19.115299 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-webhook-server-76dbb5d8c-qwk8s"] Mar 11 01:10:19 crc kubenswrapper[4744]: I0311 01:10:19.172589 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/743b9bba-abd5-45d5-b1ce-b59c1e7182a6-webhook-cert\") pod \"metallb-operator-webhook-server-76dbb5d8c-qwk8s\" (UID: \"743b9bba-abd5-45d5-b1ce-b59c1e7182a6\") " pod="metallb-system/metallb-operator-webhook-server-76dbb5d8c-qwk8s" Mar 11 01:10:19 crc kubenswrapper[4744]: I0311 01:10:19.172639 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9p8s5\" (UniqueName: \"kubernetes.io/projected/743b9bba-abd5-45d5-b1ce-b59c1e7182a6-kube-api-access-9p8s5\") pod \"metallb-operator-webhook-server-76dbb5d8c-qwk8s\" (UID: \"743b9bba-abd5-45d5-b1ce-b59c1e7182a6\") " pod="metallb-system/metallb-operator-webhook-server-76dbb5d8c-qwk8s" Mar 11 01:10:19 crc kubenswrapper[4744]: I0311 
01:10:19.172706 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/743b9bba-abd5-45d5-b1ce-b59c1e7182a6-apiservice-cert\") pod \"metallb-operator-webhook-server-76dbb5d8c-qwk8s\" (UID: \"743b9bba-abd5-45d5-b1ce-b59c1e7182a6\") " pod="metallb-system/metallb-operator-webhook-server-76dbb5d8c-qwk8s" Mar 11 01:10:19 crc kubenswrapper[4744]: I0311 01:10:19.276552 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/743b9bba-abd5-45d5-b1ce-b59c1e7182a6-webhook-cert\") pod \"metallb-operator-webhook-server-76dbb5d8c-qwk8s\" (UID: \"743b9bba-abd5-45d5-b1ce-b59c1e7182a6\") " pod="metallb-system/metallb-operator-webhook-server-76dbb5d8c-qwk8s" Mar 11 01:10:19 crc kubenswrapper[4744]: I0311 01:10:19.276591 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9p8s5\" (UniqueName: \"kubernetes.io/projected/743b9bba-abd5-45d5-b1ce-b59c1e7182a6-kube-api-access-9p8s5\") pod \"metallb-operator-webhook-server-76dbb5d8c-qwk8s\" (UID: \"743b9bba-abd5-45d5-b1ce-b59c1e7182a6\") " pod="metallb-system/metallb-operator-webhook-server-76dbb5d8c-qwk8s" Mar 11 01:10:19 crc kubenswrapper[4744]: I0311 01:10:19.276623 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/743b9bba-abd5-45d5-b1ce-b59c1e7182a6-apiservice-cert\") pod \"metallb-operator-webhook-server-76dbb5d8c-qwk8s\" (UID: \"743b9bba-abd5-45d5-b1ce-b59c1e7182a6\") " pod="metallb-system/metallb-operator-webhook-server-76dbb5d8c-qwk8s" Mar 11 01:10:19 crc kubenswrapper[4744]: I0311 01:10:19.288111 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/743b9bba-abd5-45d5-b1ce-b59c1e7182a6-webhook-cert\") pod 
\"metallb-operator-webhook-server-76dbb5d8c-qwk8s\" (UID: \"743b9bba-abd5-45d5-b1ce-b59c1e7182a6\") " pod="metallb-system/metallb-operator-webhook-server-76dbb5d8c-qwk8s" Mar 11 01:10:19 crc kubenswrapper[4744]: I0311 01:10:19.288117 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/743b9bba-abd5-45d5-b1ce-b59c1e7182a6-apiservice-cert\") pod \"metallb-operator-webhook-server-76dbb5d8c-qwk8s\" (UID: \"743b9bba-abd5-45d5-b1ce-b59c1e7182a6\") " pod="metallb-system/metallb-operator-webhook-server-76dbb5d8c-qwk8s" Mar 11 01:10:19 crc kubenswrapper[4744]: I0311 01:10:19.291582 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9p8s5\" (UniqueName: \"kubernetes.io/projected/743b9bba-abd5-45d5-b1ce-b59c1e7182a6-kube-api-access-9p8s5\") pod \"metallb-operator-webhook-server-76dbb5d8c-qwk8s\" (UID: \"743b9bba-abd5-45d5-b1ce-b59c1e7182a6\") " pod="metallb-system/metallb-operator-webhook-server-76dbb5d8c-qwk8s" Mar 11 01:10:19 crc kubenswrapper[4744]: I0311 01:10:19.418722 4744 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/metallb-operator-webhook-server-76dbb5d8c-qwk8s" Mar 11 01:10:19 crc kubenswrapper[4744]: I0311 01:10:19.564444 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-controller-manager-7c4458d67b-6zbc2"] Mar 11 01:10:19 crc kubenswrapper[4744]: W0311 01:10:19.573145 4744 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode20b89e2_b171_4bef_877a_b8670fb99ce4.slice/crio-0a46117c66250db51fb853365a88caab990a8ba5adcd3b18d363e372b3614e5b WatchSource:0}: Error finding container 0a46117c66250db51fb853365a88caab990a8ba5adcd3b18d363e372b3614e5b: Status 404 returned error can't find the container with id 0a46117c66250db51fb853365a88caab990a8ba5adcd3b18d363e372b3614e5b Mar 11 01:10:19 crc kubenswrapper[4744]: I0311 01:10:19.664116 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-webhook-server-76dbb5d8c-qwk8s"] Mar 11 01:10:19 crc kubenswrapper[4744]: W0311 01:10:19.670783 4744 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod743b9bba_abd5_45d5_b1ce_b59c1e7182a6.slice/crio-eafe5cf6d453d38b922e7733dc4999fa5e176de7b2b30e7444c34c2fb6b3cb86 WatchSource:0}: Error finding container eafe5cf6d453d38b922e7733dc4999fa5e176de7b2b30e7444c34c2fb6b3cb86: Status 404 returned error can't find the container with id eafe5cf6d453d38b922e7733dc4999fa5e176de7b2b30e7444c34c2fb6b3cb86 Mar 11 01:10:20 crc kubenswrapper[4744]: I0311 01:10:20.115873 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-webhook-server-76dbb5d8c-qwk8s" event={"ID":"743b9bba-abd5-45d5-b1ce-b59c1e7182a6","Type":"ContainerStarted","Data":"eafe5cf6d453d38b922e7733dc4999fa5e176de7b2b30e7444c34c2fb6b3cb86"} Mar 11 01:10:20 crc kubenswrapper[4744]: I0311 01:10:20.117313 4744 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="metallb-system/metallb-operator-controller-manager-7c4458d67b-6zbc2" event={"ID":"e20b89e2-b171-4bef-877a-b8670fb99ce4","Type":"ContainerStarted","Data":"0a46117c66250db51fb853365a88caab990a8ba5adcd3b18d363e372b3614e5b"} Mar 11 01:10:29 crc kubenswrapper[4744]: I0311 01:10:29.211080 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-webhook-server-76dbb5d8c-qwk8s" event={"ID":"743b9bba-abd5-45d5-b1ce-b59c1e7182a6","Type":"ContainerStarted","Data":"897d5fdf7b31f792412af24d58c2e5b141cab535ccee19f06829d5de4561671b"} Mar 11 01:10:29 crc kubenswrapper[4744]: I0311 01:10:29.211537 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/metallb-operator-webhook-server-76dbb5d8c-qwk8s" Mar 11 01:10:29 crc kubenswrapper[4744]: I0311 01:10:29.229576 4744 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/metallb-operator-webhook-server-76dbb5d8c-qwk8s" podStartSLOduration=1.523303826 podStartE2EDuration="10.229558753s" podCreationTimestamp="2026-03-11 01:10:19 +0000 UTC" firstStartedPulling="2026-03-11 01:10:19.68099629 +0000 UTC m=+976.485213905" lastFinishedPulling="2026-03-11 01:10:28.387251227 +0000 UTC m=+985.191468832" observedRunningTime="2026-03-11 01:10:29.228867431 +0000 UTC m=+986.033085036" watchObservedRunningTime="2026-03-11 01:10:29.229558753 +0000 UTC m=+986.033776368" Mar 11 01:10:32 crc kubenswrapper[4744]: I0311 01:10:32.233936 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-controller-manager-7c4458d67b-6zbc2" event={"ID":"e20b89e2-b171-4bef-877a-b8670fb99ce4","Type":"ContainerStarted","Data":"f8d3a709a8fe3b98252a01564580ad2eeb58350f09768ec94ea50a87f86dec75"} Mar 11 01:10:32 crc kubenswrapper[4744]: I0311 01:10:32.234480 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/metallb-operator-controller-manager-7c4458d67b-6zbc2" Mar 11 01:10:32 crc 
kubenswrapper[4744]: I0311 01:10:32.280044 4744 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/metallb-operator-controller-manager-7c4458d67b-6zbc2" podStartSLOduration=2.4115088780000002 podStartE2EDuration="14.280000977s" podCreationTimestamp="2026-03-11 01:10:18 +0000 UTC" firstStartedPulling="2026-03-11 01:10:19.57645982 +0000 UTC m=+976.380677435" lastFinishedPulling="2026-03-11 01:10:31.444951899 +0000 UTC m=+988.249169534" observedRunningTime="2026-03-11 01:10:32.274406312 +0000 UTC m=+989.078623927" watchObservedRunningTime="2026-03-11 01:10:32.280000977 +0000 UTC m=+989.084218582" Mar 11 01:10:39 crc kubenswrapper[4744]: I0311 01:10:39.423844 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/metallb-operator-webhook-server-76dbb5d8c-qwk8s" Mar 11 01:10:42 crc kubenswrapper[4744]: I0311 01:10:42.410056 4744 patch_prober.go:28] interesting pod/machine-config-daemon-678nx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 11 01:10:42 crc kubenswrapper[4744]: I0311 01:10:42.410161 4744 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-678nx" podUID="a15dc7ac-7c34-4135-b6eb-a85122800ce9" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 11 01:11:04 crc kubenswrapper[4744]: I0311 01:11:04.666029 4744 scope.go:117] "RemoveContainer" containerID="679ec66311498cc17d79de9a1a18a3ff6aefef33eb1f688889b4239fd05ebe7f" Mar 11 01:11:09 crc kubenswrapper[4744]: I0311 01:11:09.109228 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/metallb-operator-controller-manager-7c4458d67b-6zbc2" Mar 11 01:11:10 
crc kubenswrapper[4744]: I0311 01:11:10.412162 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/frr-k8s-webhook-server-7f989f654f-pm8pv"] Mar 11 01:11:10 crc kubenswrapper[4744]: I0311 01:11:10.413280 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-webhook-server-7f989f654f-pm8pv" Mar 11 01:11:10 crc kubenswrapper[4744]: I0311 01:11:10.415814 4744 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-daemon-dockercfg-6s9zh" Mar 11 01:11:10 crc kubenswrapper[4744]: I0311 01:11:10.415852 4744 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-webhook-server-cert" Mar 11 01:11:10 crc kubenswrapper[4744]: I0311 01:11:10.418960 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/frr-k8s-65fd2"] Mar 11 01:11:10 crc kubenswrapper[4744]: I0311 01:11:10.422808 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-65fd2" Mar 11 01:11:10 crc kubenswrapper[4744]: I0311 01:11:10.425035 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"frr-startup" Mar 11 01:11:10 crc kubenswrapper[4744]: I0311 01:11:10.427876 4744 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-certs-secret" Mar 11 01:11:10 crc kubenswrapper[4744]: I0311 01:11:10.437872 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/frr-k8s-webhook-server-7f989f654f-pm8pv"] Mar 11 01:11:10 crc kubenswrapper[4744]: I0311 01:11:10.440127 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/21573ca2-d902-4d30-b94a-7b5ae891e084-reloader\") pod \"frr-k8s-65fd2\" (UID: \"21573ca2-d902-4d30-b94a-7b5ae891e084\") " pod="metallb-system/frr-k8s-65fd2" Mar 11 01:11:10 crc kubenswrapper[4744]: I0311 01:11:10.440158 4744 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r7mjb\" (UniqueName: \"kubernetes.io/projected/0cf3eb75-8045-4beb-b5bf-68879b344482-kube-api-access-r7mjb\") pod \"frr-k8s-webhook-server-7f989f654f-pm8pv\" (UID: \"0cf3eb75-8045-4beb-b5bf-68879b344482\") " pod="metallb-system/frr-k8s-webhook-server-7f989f654f-pm8pv" Mar 11 01:11:10 crc kubenswrapper[4744]: I0311 01:11:10.440193 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/21573ca2-d902-4d30-b94a-7b5ae891e084-frr-sockets\") pod \"frr-k8s-65fd2\" (UID: \"21573ca2-d902-4d30-b94a-7b5ae891e084\") " pod="metallb-system/frr-k8s-65fd2" Mar 11 01:11:10 crc kubenswrapper[4744]: I0311 01:11:10.440210 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/21573ca2-d902-4d30-b94a-7b5ae891e084-frr-conf\") pod \"frr-k8s-65fd2\" (UID: \"21573ca2-d902-4d30-b94a-7b5ae891e084\") " pod="metallb-system/frr-k8s-65fd2" Mar 11 01:11:10 crc kubenswrapper[4744]: I0311 01:11:10.440241 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/21573ca2-d902-4d30-b94a-7b5ae891e084-frr-startup\") pod \"frr-k8s-65fd2\" (UID: \"21573ca2-d902-4d30-b94a-7b5ae891e084\") " pod="metallb-system/frr-k8s-65fd2" Mar 11 01:11:10 crc kubenswrapper[4744]: I0311 01:11:10.440262 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8mf9m\" (UniqueName: \"kubernetes.io/projected/21573ca2-d902-4d30-b94a-7b5ae891e084-kube-api-access-8mf9m\") pod \"frr-k8s-65fd2\" (UID: \"21573ca2-d902-4d30-b94a-7b5ae891e084\") " pod="metallb-system/frr-k8s-65fd2" Mar 11 01:11:10 crc kubenswrapper[4744]: I0311 01:11:10.440287 4744 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/0cf3eb75-8045-4beb-b5bf-68879b344482-cert\") pod \"frr-k8s-webhook-server-7f989f654f-pm8pv\" (UID: \"0cf3eb75-8045-4beb-b5bf-68879b344482\") " pod="metallb-system/frr-k8s-webhook-server-7f989f654f-pm8pv" Mar 11 01:11:10 crc kubenswrapper[4744]: I0311 01:11:10.440316 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/21573ca2-d902-4d30-b94a-7b5ae891e084-metrics-certs\") pod \"frr-k8s-65fd2\" (UID: \"21573ca2-d902-4d30-b94a-7b5ae891e084\") " pod="metallb-system/frr-k8s-65fd2" Mar 11 01:11:10 crc kubenswrapper[4744]: I0311 01:11:10.440335 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/21573ca2-d902-4d30-b94a-7b5ae891e084-metrics\") pod \"frr-k8s-65fd2\" (UID: \"21573ca2-d902-4d30-b94a-7b5ae891e084\") " pod="metallb-system/frr-k8s-65fd2" Mar 11 01:11:10 crc kubenswrapper[4744]: I0311 01:11:10.505217 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/speaker-czr59"] Mar 11 01:11:10 crc kubenswrapper[4744]: I0311 01:11:10.506024 4744 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/speaker-czr59" Mar 11 01:11:10 crc kubenswrapper[4744]: I0311 01:11:10.507711 4744 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-memberlist" Mar 11 01:11:10 crc kubenswrapper[4744]: I0311 01:11:10.508094 4744 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"speaker-certs-secret" Mar 11 01:11:10 crc kubenswrapper[4744]: I0311 01:11:10.508473 4744 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"speaker-dockercfg-hsbng" Mar 11 01:11:10 crc kubenswrapper[4744]: I0311 01:11:10.508549 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"metallb-excludel2" Mar 11 01:11:10 crc kubenswrapper[4744]: I0311 01:11:10.527156 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/controller-86ddb6bd46-dncxf"] Mar 11 01:11:10 crc kubenswrapper[4744]: I0311 01:11:10.527997 4744 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/controller-86ddb6bd46-dncxf" Mar 11 01:11:10 crc kubenswrapper[4744]: I0311 01:11:10.531753 4744 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"controller-certs-secret" Mar 11 01:11:10 crc kubenswrapper[4744]: I0311 01:11:10.541225 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/21573ca2-d902-4d30-b94a-7b5ae891e084-metrics-certs\") pod \"frr-k8s-65fd2\" (UID: \"21573ca2-d902-4d30-b94a-7b5ae891e084\") " pod="metallb-system/frr-k8s-65fd2" Mar 11 01:11:10 crc kubenswrapper[4744]: I0311 01:11:10.541280 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/bf3cb003-2eb3-4bd2-9ed2-4e2db93f0c49-memberlist\") pod \"speaker-czr59\" (UID: \"bf3cb003-2eb3-4bd2-9ed2-4e2db93f0c49\") " pod="metallb-system/speaker-czr59" Mar 11 01:11:10 crc kubenswrapper[4744]: I0311 01:11:10.541312 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/21573ca2-d902-4d30-b94a-7b5ae891e084-metrics\") pod \"frr-k8s-65fd2\" (UID: \"21573ca2-d902-4d30-b94a-7b5ae891e084\") " pod="metallb-system/frr-k8s-65fd2" Mar 11 01:11:10 crc kubenswrapper[4744]: I0311 01:11:10.542398 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/controller-86ddb6bd46-dncxf"] Mar 11 01:11:10 crc kubenswrapper[4744]: I0311 01:11:10.544032 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/21573ca2-d902-4d30-b94a-7b5ae891e084-metrics\") pod \"frr-k8s-65fd2\" (UID: \"21573ca2-d902-4d30-b94a-7b5ae891e084\") " pod="metallb-system/frr-k8s-65fd2" Mar 11 01:11:10 crc kubenswrapper[4744]: I0311 01:11:10.544759 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"reloader\" 
(UniqueName: \"kubernetes.io/empty-dir/21573ca2-d902-4d30-b94a-7b5ae891e084-reloader\") pod \"frr-k8s-65fd2\" (UID: \"21573ca2-d902-4d30-b94a-7b5ae891e084\") " pod="metallb-system/frr-k8s-65fd2" Mar 11 01:11:10 crc kubenswrapper[4744]: I0311 01:11:10.544805 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r7mjb\" (UniqueName: \"kubernetes.io/projected/0cf3eb75-8045-4beb-b5bf-68879b344482-kube-api-access-r7mjb\") pod \"frr-k8s-webhook-server-7f989f654f-pm8pv\" (UID: \"0cf3eb75-8045-4beb-b5bf-68879b344482\") " pod="metallb-system/frr-k8s-webhook-server-7f989f654f-pm8pv" Mar 11 01:11:10 crc kubenswrapper[4744]: I0311 01:11:10.544828 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/21573ca2-d902-4d30-b94a-7b5ae891e084-frr-sockets\") pod \"frr-k8s-65fd2\" (UID: \"21573ca2-d902-4d30-b94a-7b5ae891e084\") " pod="metallb-system/frr-k8s-65fd2" Mar 11 01:11:10 crc kubenswrapper[4744]: I0311 01:11:10.544858 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/21573ca2-d902-4d30-b94a-7b5ae891e084-frr-conf\") pod \"frr-k8s-65fd2\" (UID: \"21573ca2-d902-4d30-b94a-7b5ae891e084\") " pod="metallb-system/frr-k8s-65fd2" Mar 11 01:11:10 crc kubenswrapper[4744]: I0311 01:11:10.544884 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-99pgc\" (UniqueName: \"kubernetes.io/projected/bf3cb003-2eb3-4bd2-9ed2-4e2db93f0c49-kube-api-access-99pgc\") pod \"speaker-czr59\" (UID: \"bf3cb003-2eb3-4bd2-9ed2-4e2db93f0c49\") " pod="metallb-system/speaker-czr59" Mar 11 01:11:10 crc kubenswrapper[4744]: I0311 01:11:10.544907 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: 
\"kubernetes.io/secret/bf3cb003-2eb3-4bd2-9ed2-4e2db93f0c49-metrics-certs\") pod \"speaker-czr59\" (UID: \"bf3cb003-2eb3-4bd2-9ed2-4e2db93f0c49\") " pod="metallb-system/speaker-czr59" Mar 11 01:11:10 crc kubenswrapper[4744]: I0311 01:11:10.544972 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/21573ca2-d902-4d30-b94a-7b5ae891e084-frr-startup\") pod \"frr-k8s-65fd2\" (UID: \"21573ca2-d902-4d30-b94a-7b5ae891e084\") " pod="metallb-system/frr-k8s-65fd2" Mar 11 01:11:10 crc kubenswrapper[4744]: I0311 01:11:10.544996 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8mf9m\" (UniqueName: \"kubernetes.io/projected/21573ca2-d902-4d30-b94a-7b5ae891e084-kube-api-access-8mf9m\") pod \"frr-k8s-65fd2\" (UID: \"21573ca2-d902-4d30-b94a-7b5ae891e084\") " pod="metallb-system/frr-k8s-65fd2" Mar 11 01:11:10 crc kubenswrapper[4744]: I0311 01:11:10.545047 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/0cf3eb75-8045-4beb-b5bf-68879b344482-cert\") pod \"frr-k8s-webhook-server-7f989f654f-pm8pv\" (UID: \"0cf3eb75-8045-4beb-b5bf-68879b344482\") " pod="metallb-system/frr-k8s-webhook-server-7f989f654f-pm8pv" Mar 11 01:11:10 crc kubenswrapper[4744]: I0311 01:11:10.545062 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/bf3cb003-2eb3-4bd2-9ed2-4e2db93f0c49-metallb-excludel2\") pod \"speaker-czr59\" (UID: \"bf3cb003-2eb3-4bd2-9ed2-4e2db93f0c49\") " pod="metallb-system/speaker-czr59" Mar 11 01:11:10 crc kubenswrapper[4744]: I0311 01:11:10.545323 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/21573ca2-d902-4d30-b94a-7b5ae891e084-frr-sockets\") pod \"frr-k8s-65fd2\" (UID: 
\"21573ca2-d902-4d30-b94a-7b5ae891e084\") " pod="metallb-system/frr-k8s-65fd2" Mar 11 01:11:10 crc kubenswrapper[4744]: I0311 01:11:10.545403 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/21573ca2-d902-4d30-b94a-7b5ae891e084-reloader\") pod \"frr-k8s-65fd2\" (UID: \"21573ca2-d902-4d30-b94a-7b5ae891e084\") " pod="metallb-system/frr-k8s-65fd2" Mar 11 01:11:10 crc kubenswrapper[4744]: I0311 01:11:10.545567 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/21573ca2-d902-4d30-b94a-7b5ae891e084-frr-conf\") pod \"frr-k8s-65fd2\" (UID: \"21573ca2-d902-4d30-b94a-7b5ae891e084\") " pod="metallb-system/frr-k8s-65fd2" Mar 11 01:11:10 crc kubenswrapper[4744]: E0311 01:11:10.545832 4744 secret.go:188] Couldn't get secret metallb-system/frr-k8s-webhook-server-cert: secret "frr-k8s-webhook-server-cert" not found Mar 11 01:11:10 crc kubenswrapper[4744]: E0311 01:11:10.545886 4744 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0cf3eb75-8045-4beb-b5bf-68879b344482-cert podName:0cf3eb75-8045-4beb-b5bf-68879b344482 nodeName:}" failed. No retries permitted until 2026-03-11 01:11:11.045868227 +0000 UTC m=+1027.850085932 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/0cf3eb75-8045-4beb-b5bf-68879b344482-cert") pod "frr-k8s-webhook-server-7f989f654f-pm8pv" (UID: "0cf3eb75-8045-4beb-b5bf-68879b344482") : secret "frr-k8s-webhook-server-cert" not found Mar 11 01:11:10 crc kubenswrapper[4744]: I0311 01:11:10.546701 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/21573ca2-d902-4d30-b94a-7b5ae891e084-frr-startup\") pod \"frr-k8s-65fd2\" (UID: \"21573ca2-d902-4d30-b94a-7b5ae891e084\") " pod="metallb-system/frr-k8s-65fd2" Mar 11 01:11:10 crc kubenswrapper[4744]: I0311 01:11:10.549452 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/21573ca2-d902-4d30-b94a-7b5ae891e084-metrics-certs\") pod \"frr-k8s-65fd2\" (UID: \"21573ca2-d902-4d30-b94a-7b5ae891e084\") " pod="metallb-system/frr-k8s-65fd2" Mar 11 01:11:10 crc kubenswrapper[4744]: I0311 01:11:10.562332 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r7mjb\" (UniqueName: \"kubernetes.io/projected/0cf3eb75-8045-4beb-b5bf-68879b344482-kube-api-access-r7mjb\") pod \"frr-k8s-webhook-server-7f989f654f-pm8pv\" (UID: \"0cf3eb75-8045-4beb-b5bf-68879b344482\") " pod="metallb-system/frr-k8s-webhook-server-7f989f654f-pm8pv" Mar 11 01:11:10 crc kubenswrapper[4744]: I0311 01:11:10.567303 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8mf9m\" (UniqueName: \"kubernetes.io/projected/21573ca2-d902-4d30-b94a-7b5ae891e084-kube-api-access-8mf9m\") pod \"frr-k8s-65fd2\" (UID: \"21573ca2-d902-4d30-b94a-7b5ae891e084\") " pod="metallb-system/frr-k8s-65fd2" Mar 11 01:11:10 crc kubenswrapper[4744]: I0311 01:11:10.646543 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: 
\"kubernetes.io/secret/6f9ffaa2-fe65-4509-b1a2-4577548128ae-metrics-certs\") pod \"controller-86ddb6bd46-dncxf\" (UID: \"6f9ffaa2-fe65-4509-b1a2-4577548128ae\") " pod="metallb-system/controller-86ddb6bd46-dncxf" Mar 11 01:11:10 crc kubenswrapper[4744]: I0311 01:11:10.646828 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/bf3cb003-2eb3-4bd2-9ed2-4e2db93f0c49-metallb-excludel2\") pod \"speaker-czr59\" (UID: \"bf3cb003-2eb3-4bd2-9ed2-4e2db93f0c49\") " pod="metallb-system/speaker-czr59" Mar 11 01:11:10 crc kubenswrapper[4744]: I0311 01:11:10.646892 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/bf3cb003-2eb3-4bd2-9ed2-4e2db93f0c49-memberlist\") pod \"speaker-czr59\" (UID: \"bf3cb003-2eb3-4bd2-9ed2-4e2db93f0c49\") " pod="metallb-system/speaker-czr59" Mar 11 01:11:10 crc kubenswrapper[4744]: I0311 01:11:10.646921 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lld9b\" (UniqueName: \"kubernetes.io/projected/6f9ffaa2-fe65-4509-b1a2-4577548128ae-kube-api-access-lld9b\") pod \"controller-86ddb6bd46-dncxf\" (UID: \"6f9ffaa2-fe65-4509-b1a2-4577548128ae\") " pod="metallb-system/controller-86ddb6bd46-dncxf" Mar 11 01:11:10 crc kubenswrapper[4744]: I0311 01:11:10.646949 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-99pgc\" (UniqueName: \"kubernetes.io/projected/bf3cb003-2eb3-4bd2-9ed2-4e2db93f0c49-kube-api-access-99pgc\") pod \"speaker-czr59\" (UID: \"bf3cb003-2eb3-4bd2-9ed2-4e2db93f0c49\") " pod="metallb-system/speaker-czr59" Mar 11 01:11:10 crc kubenswrapper[4744]: I0311 01:11:10.646965 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/bf3cb003-2eb3-4bd2-9ed2-4e2db93f0c49-metrics-certs\") pod 
\"speaker-czr59\" (UID: \"bf3cb003-2eb3-4bd2-9ed2-4e2db93f0c49\") " pod="metallb-system/speaker-czr59" Mar 11 01:11:10 crc kubenswrapper[4744]: I0311 01:11:10.646980 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/6f9ffaa2-fe65-4509-b1a2-4577548128ae-cert\") pod \"controller-86ddb6bd46-dncxf\" (UID: \"6f9ffaa2-fe65-4509-b1a2-4577548128ae\") " pod="metallb-system/controller-86ddb6bd46-dncxf" Mar 11 01:11:10 crc kubenswrapper[4744]: I0311 01:11:10.647606 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/bf3cb003-2eb3-4bd2-9ed2-4e2db93f0c49-metallb-excludel2\") pod \"speaker-czr59\" (UID: \"bf3cb003-2eb3-4bd2-9ed2-4e2db93f0c49\") " pod="metallb-system/speaker-czr59" Mar 11 01:11:10 crc kubenswrapper[4744]: E0311 01:11:10.647682 4744 secret.go:188] Couldn't get secret metallb-system/metallb-memberlist: secret "metallb-memberlist" not found Mar 11 01:11:10 crc kubenswrapper[4744]: E0311 01:11:10.647717 4744 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/bf3cb003-2eb3-4bd2-9ed2-4e2db93f0c49-memberlist podName:bf3cb003-2eb3-4bd2-9ed2-4e2db93f0c49 nodeName:}" failed. No retries permitted until 2026-03-11 01:11:11.147706163 +0000 UTC m=+1027.951923768 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "memberlist" (UniqueName: "kubernetes.io/secret/bf3cb003-2eb3-4bd2-9ed2-4e2db93f0c49-memberlist") pod "speaker-czr59" (UID: "bf3cb003-2eb3-4bd2-9ed2-4e2db93f0c49") : secret "metallb-memberlist" not found Mar 11 01:11:10 crc kubenswrapper[4744]: I0311 01:11:10.652161 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/bf3cb003-2eb3-4bd2-9ed2-4e2db93f0c49-metrics-certs\") pod \"speaker-czr59\" (UID: \"bf3cb003-2eb3-4bd2-9ed2-4e2db93f0c49\") " pod="metallb-system/speaker-czr59" Mar 11 01:11:10 crc kubenswrapper[4744]: I0311 01:11:10.666282 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-99pgc\" (UniqueName: \"kubernetes.io/projected/bf3cb003-2eb3-4bd2-9ed2-4e2db93f0c49-kube-api-access-99pgc\") pod \"speaker-czr59\" (UID: \"bf3cb003-2eb3-4bd2-9ed2-4e2db93f0c49\") " pod="metallb-system/speaker-czr59" Mar 11 01:11:10 crc kubenswrapper[4744]: I0311 01:11:10.735776 4744 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/frr-k8s-65fd2" Mar 11 01:11:10 crc kubenswrapper[4744]: I0311 01:11:10.748442 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lld9b\" (UniqueName: \"kubernetes.io/projected/6f9ffaa2-fe65-4509-b1a2-4577548128ae-kube-api-access-lld9b\") pod \"controller-86ddb6bd46-dncxf\" (UID: \"6f9ffaa2-fe65-4509-b1a2-4577548128ae\") " pod="metallb-system/controller-86ddb6bd46-dncxf" Mar 11 01:11:10 crc kubenswrapper[4744]: I0311 01:11:10.748531 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/6f9ffaa2-fe65-4509-b1a2-4577548128ae-cert\") pod \"controller-86ddb6bd46-dncxf\" (UID: \"6f9ffaa2-fe65-4509-b1a2-4577548128ae\") " pod="metallb-system/controller-86ddb6bd46-dncxf" Mar 11 01:11:10 crc kubenswrapper[4744]: I0311 01:11:10.748556 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/6f9ffaa2-fe65-4509-b1a2-4577548128ae-metrics-certs\") pod \"controller-86ddb6bd46-dncxf\" (UID: \"6f9ffaa2-fe65-4509-b1a2-4577548128ae\") " pod="metallb-system/controller-86ddb6bd46-dncxf" Mar 11 01:11:10 crc kubenswrapper[4744]: E0311 01:11:10.748680 4744 secret.go:188] Couldn't get secret metallb-system/controller-certs-secret: secret "controller-certs-secret" not found Mar 11 01:11:10 crc kubenswrapper[4744]: E0311 01:11:10.748732 4744 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6f9ffaa2-fe65-4509-b1a2-4577548128ae-metrics-certs podName:6f9ffaa2-fe65-4509-b1a2-4577548128ae nodeName:}" failed. No retries permitted until 2026-03-11 01:11:11.248716082 +0000 UTC m=+1028.052933687 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/6f9ffaa2-fe65-4509-b1a2-4577548128ae-metrics-certs") pod "controller-86ddb6bd46-dncxf" (UID: "6f9ffaa2-fe65-4509-b1a2-4577548128ae") : secret "controller-certs-secret" not found Mar 11 01:11:10 crc kubenswrapper[4744]: I0311 01:11:10.750386 4744 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-webhook-cert" Mar 11 01:11:10 crc kubenswrapper[4744]: I0311 01:11:10.763032 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/6f9ffaa2-fe65-4509-b1a2-4577548128ae-cert\") pod \"controller-86ddb6bd46-dncxf\" (UID: \"6f9ffaa2-fe65-4509-b1a2-4577548128ae\") " pod="metallb-system/controller-86ddb6bd46-dncxf" Mar 11 01:11:10 crc kubenswrapper[4744]: I0311 01:11:10.771275 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lld9b\" (UniqueName: \"kubernetes.io/projected/6f9ffaa2-fe65-4509-b1a2-4577548128ae-kube-api-access-lld9b\") pod \"controller-86ddb6bd46-dncxf\" (UID: \"6f9ffaa2-fe65-4509-b1a2-4577548128ae\") " pod="metallb-system/controller-86ddb6bd46-dncxf" Mar 11 01:11:11 crc kubenswrapper[4744]: I0311 01:11:11.051725 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/0cf3eb75-8045-4beb-b5bf-68879b344482-cert\") pod \"frr-k8s-webhook-server-7f989f654f-pm8pv\" (UID: \"0cf3eb75-8045-4beb-b5bf-68879b344482\") " pod="metallb-system/frr-k8s-webhook-server-7f989f654f-pm8pv" Mar 11 01:11:11 crc kubenswrapper[4744]: I0311 01:11:11.055964 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/0cf3eb75-8045-4beb-b5bf-68879b344482-cert\") pod \"frr-k8s-webhook-server-7f989f654f-pm8pv\" (UID: \"0cf3eb75-8045-4beb-b5bf-68879b344482\") " pod="metallb-system/frr-k8s-webhook-server-7f989f654f-pm8pv" Mar 11 01:11:11 crc 
kubenswrapper[4744]: I0311 01:11:11.153336 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/bf3cb003-2eb3-4bd2-9ed2-4e2db93f0c49-memberlist\") pod \"speaker-czr59\" (UID: \"bf3cb003-2eb3-4bd2-9ed2-4e2db93f0c49\") " pod="metallb-system/speaker-czr59" Mar 11 01:11:11 crc kubenswrapper[4744]: E0311 01:11:11.153460 4744 secret.go:188] Couldn't get secret metallb-system/metallb-memberlist: secret "metallb-memberlist" not found Mar 11 01:11:11 crc kubenswrapper[4744]: E0311 01:11:11.153529 4744 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/bf3cb003-2eb3-4bd2-9ed2-4e2db93f0c49-memberlist podName:bf3cb003-2eb3-4bd2-9ed2-4e2db93f0c49 nodeName:}" failed. No retries permitted until 2026-03-11 01:11:12.153497231 +0000 UTC m=+1028.957714836 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "memberlist" (UniqueName: "kubernetes.io/secret/bf3cb003-2eb3-4bd2-9ed2-4e2db93f0c49-memberlist") pod "speaker-czr59" (UID: "bf3cb003-2eb3-4bd2-9ed2-4e2db93f0c49") : secret "metallb-memberlist" not found Mar 11 01:11:11 crc kubenswrapper[4744]: I0311 01:11:11.255843 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/6f9ffaa2-fe65-4509-b1a2-4577548128ae-metrics-certs\") pod \"controller-86ddb6bd46-dncxf\" (UID: \"6f9ffaa2-fe65-4509-b1a2-4577548128ae\") " pod="metallb-system/controller-86ddb6bd46-dncxf" Mar 11 01:11:11 crc kubenswrapper[4744]: I0311 01:11:11.260121 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/6f9ffaa2-fe65-4509-b1a2-4577548128ae-metrics-certs\") pod \"controller-86ddb6bd46-dncxf\" (UID: \"6f9ffaa2-fe65-4509-b1a2-4577548128ae\") " pod="metallb-system/controller-86ddb6bd46-dncxf" Mar 11 01:11:11 crc kubenswrapper[4744]: I0311 01:11:11.329454 4744 util.go:30] "No sandbox for pod can 
be found. Need to start a new one" pod="metallb-system/frr-k8s-webhook-server-7f989f654f-pm8pv" Mar 11 01:11:11 crc kubenswrapper[4744]: I0311 01:11:11.439821 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/controller-86ddb6bd46-dncxf" Mar 11 01:11:11 crc kubenswrapper[4744]: I0311 01:11:11.538612 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-65fd2" event={"ID":"21573ca2-d902-4d30-b94a-7b5ae891e084","Type":"ContainerStarted","Data":"6fd206dcbeac432d1c3a58c58ca74e9e11ea5edfcdddd8b161278fd88a2f3895"} Mar 11 01:11:11 crc kubenswrapper[4744]: I0311 01:11:11.602070 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/frr-k8s-webhook-server-7f989f654f-pm8pv"] Mar 11 01:11:11 crc kubenswrapper[4744]: I0311 01:11:11.687122 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/controller-86ddb6bd46-dncxf"] Mar 11 01:11:12 crc kubenswrapper[4744]: I0311 01:11:12.170779 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/bf3cb003-2eb3-4bd2-9ed2-4e2db93f0c49-memberlist\") pod \"speaker-czr59\" (UID: \"bf3cb003-2eb3-4bd2-9ed2-4e2db93f0c49\") " pod="metallb-system/speaker-czr59" Mar 11 01:11:12 crc kubenswrapper[4744]: I0311 01:11:12.179697 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/bf3cb003-2eb3-4bd2-9ed2-4e2db93f0c49-memberlist\") pod \"speaker-czr59\" (UID: \"bf3cb003-2eb3-4bd2-9ed2-4e2db93f0c49\") " pod="metallb-system/speaker-czr59" Mar 11 01:11:12 crc kubenswrapper[4744]: I0311 01:11:12.319769 4744 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/speaker-czr59" Mar 11 01:11:12 crc kubenswrapper[4744]: I0311 01:11:12.409106 4744 patch_prober.go:28] interesting pod/machine-config-daemon-678nx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 11 01:11:12 crc kubenswrapper[4744]: I0311 01:11:12.409158 4744 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-678nx" podUID="a15dc7ac-7c34-4135-b6eb-a85122800ce9" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 11 01:11:12 crc kubenswrapper[4744]: I0311 01:11:12.409197 4744 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-678nx" Mar 11 01:11:12 crc kubenswrapper[4744]: I0311 01:11:12.409734 4744 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"88beedd13bd5f368264b1a447a212f87b19111c8ac2dcc24499088c4608c67da"} pod="openshift-machine-config-operator/machine-config-daemon-678nx" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 11 01:11:12 crc kubenswrapper[4744]: I0311 01:11:12.409786 4744 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-678nx" podUID="a15dc7ac-7c34-4135-b6eb-a85122800ce9" containerName="machine-config-daemon" containerID="cri-o://88beedd13bd5f368264b1a447a212f87b19111c8ac2dcc24499088c4608c67da" gracePeriod=600 Mar 11 01:11:12 crc kubenswrapper[4744]: I0311 01:11:12.555216 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="metallb-system/controller-86ddb6bd46-dncxf" event={"ID":"6f9ffaa2-fe65-4509-b1a2-4577548128ae","Type":"ContainerStarted","Data":"470a6e75027b057da794ee71b5a1e0577a650f889cfeab88e9623de58a96a814"} Mar 11 01:11:12 crc kubenswrapper[4744]: I0311 01:11:12.558658 4744 generic.go:334] "Generic (PLEG): container finished" podID="a15dc7ac-7c34-4135-b6eb-a85122800ce9" containerID="88beedd13bd5f368264b1a447a212f87b19111c8ac2dcc24499088c4608c67da" exitCode=0 Mar 11 01:11:12 crc kubenswrapper[4744]: I0311 01:11:12.561870 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-86ddb6bd46-dncxf" event={"ID":"6f9ffaa2-fe65-4509-b1a2-4577548128ae","Type":"ContainerStarted","Data":"2e9053dbd629cf2276d101d8a0bfd4789dbed81faff5b4bbb1a8f6bd21c87544"} Mar 11 01:11:12 crc kubenswrapper[4744]: I0311 01:11:12.561904 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/controller-86ddb6bd46-dncxf" Mar 11 01:11:12 crc kubenswrapper[4744]: I0311 01:11:12.561918 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-86ddb6bd46-dncxf" event={"ID":"6f9ffaa2-fe65-4509-b1a2-4577548128ae","Type":"ContainerStarted","Data":"7f842165620e6eec45840ddbde9bd72aa343036b7fa6141af7c1c4c34db8ab98"} Mar 11 01:11:12 crc kubenswrapper[4744]: I0311 01:11:12.561928 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-678nx" event={"ID":"a15dc7ac-7c34-4135-b6eb-a85122800ce9","Type":"ContainerDied","Data":"88beedd13bd5f368264b1a447a212f87b19111c8ac2dcc24499088c4608c67da"} Mar 11 01:11:12 crc kubenswrapper[4744]: I0311 01:11:12.561942 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-czr59" event={"ID":"bf3cb003-2eb3-4bd2-9ed2-4e2db93f0c49","Type":"ContainerStarted","Data":"f4d0f48a43e9cc8729d1b24ac3b4aeee4910c52a73c5df993c7c12e3bb40bd57"} Mar 11 01:11:12 crc kubenswrapper[4744]: I0311 01:11:12.561953 4744 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-webhook-server-7f989f654f-pm8pv" event={"ID":"0cf3eb75-8045-4beb-b5bf-68879b344482","Type":"ContainerStarted","Data":"4d256d8facf2e588f80a9d76b068bba25af49d3d40ca5c403e32844df1c8d5f6"} Mar 11 01:11:12 crc kubenswrapper[4744]: I0311 01:11:12.561975 4744 scope.go:117] "RemoveContainer" containerID="fe891bfaabad039e5e00538c290ba658f2b03ec87ceb617b0877366c7d611971" Mar 11 01:11:12 crc kubenswrapper[4744]: I0311 01:11:12.580467 4744 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/controller-86ddb6bd46-dncxf" podStartSLOduration=2.580450559 podStartE2EDuration="2.580450559s" podCreationTimestamp="2026-03-11 01:11:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-11 01:11:12.575120712 +0000 UTC m=+1029.379338317" watchObservedRunningTime="2026-03-11 01:11:12.580450559 +0000 UTC m=+1029.384668164" Mar 11 01:11:13 crc kubenswrapper[4744]: I0311 01:11:13.583634 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-678nx" event={"ID":"a15dc7ac-7c34-4135-b6eb-a85122800ce9","Type":"ContainerStarted","Data":"9f82ae9e65d034b974cecba295d1e92bb34ed10ce5e057ec718abcef76965433"} Mar 11 01:11:13 crc kubenswrapper[4744]: I0311 01:11:13.596585 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-czr59" event={"ID":"bf3cb003-2eb3-4bd2-9ed2-4e2db93f0c49","Type":"ContainerStarted","Data":"770cb6cf727e2cd8a0daf22c8cca37a29824c999d16055f60eb764828bef58d8"} Mar 11 01:11:13 crc kubenswrapper[4744]: I0311 01:11:13.596633 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-czr59" event={"ID":"bf3cb003-2eb3-4bd2-9ed2-4e2db93f0c49","Type":"ContainerStarted","Data":"65ab1818fd0e0b60187d873c655a2f15d9dc2a63713740d5da79ebe551ea4589"} Mar 11 01:11:13 crc 
kubenswrapper[4744]: I0311 01:11:13.596751 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/speaker-czr59" Mar 11 01:11:14 crc kubenswrapper[4744]: I0311 01:11:14.000966 4744 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/speaker-czr59" podStartSLOduration=4.000934176 podStartE2EDuration="4.000934176s" podCreationTimestamp="2026-03-11 01:11:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-11 01:11:13.635294185 +0000 UTC m=+1030.439511790" watchObservedRunningTime="2026-03-11 01:11:14.000934176 +0000 UTC m=+1030.805151781" Mar 11 01:11:19 crc kubenswrapper[4744]: I0311 01:11:19.640097 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-webhook-server-7f989f654f-pm8pv" event={"ID":"0cf3eb75-8045-4beb-b5bf-68879b344482","Type":"ContainerStarted","Data":"d541a9b551df96b4cd8d45e890b24e7e359d4a66f8ff3310b1a5fb09cbb1ddc2"} Mar 11 01:11:19 crc kubenswrapper[4744]: I0311 01:11:19.640773 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/frr-k8s-webhook-server-7f989f654f-pm8pv" Mar 11 01:11:19 crc kubenswrapper[4744]: I0311 01:11:19.642043 4744 generic.go:334] "Generic (PLEG): container finished" podID="21573ca2-d902-4d30-b94a-7b5ae891e084" containerID="da95a4aff59679c7ed433f7caeffb693fc6a629656bf5d96e6d801efd0f1416b" exitCode=0 Mar 11 01:11:19 crc kubenswrapper[4744]: I0311 01:11:19.642318 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-65fd2" event={"ID":"21573ca2-d902-4d30-b94a-7b5ae891e084","Type":"ContainerDied","Data":"da95a4aff59679c7ed433f7caeffb693fc6a629656bf5d96e6d801efd0f1416b"} Mar 11 01:11:19 crc kubenswrapper[4744]: I0311 01:11:19.661878 4744 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/frr-k8s-webhook-server-7f989f654f-pm8pv" 
podStartSLOduration=2.575594557 podStartE2EDuration="9.66186364s" podCreationTimestamp="2026-03-11 01:11:10 +0000 UTC" firstStartedPulling="2026-03-11 01:11:11.607579547 +0000 UTC m=+1028.411797172" lastFinishedPulling="2026-03-11 01:11:18.69384862 +0000 UTC m=+1035.498066255" observedRunningTime="2026-03-11 01:11:19.659417583 +0000 UTC m=+1036.463635188" watchObservedRunningTime="2026-03-11 01:11:19.66186364 +0000 UTC m=+1036.466081245" Mar 11 01:11:20 crc kubenswrapper[4744]: I0311 01:11:20.652145 4744 generic.go:334] "Generic (PLEG): container finished" podID="21573ca2-d902-4d30-b94a-7b5ae891e084" containerID="bdb015b004c283bd3ab0445dfda2e837599ba2c8c8a146060939e39962f308e3" exitCode=0 Mar 11 01:11:20 crc kubenswrapper[4744]: I0311 01:11:20.652207 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-65fd2" event={"ID":"21573ca2-d902-4d30-b94a-7b5ae891e084","Type":"ContainerDied","Data":"bdb015b004c283bd3ab0445dfda2e837599ba2c8c8a146060939e39962f308e3"} Mar 11 01:11:21 crc kubenswrapper[4744]: I0311 01:11:21.447287 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/controller-86ddb6bd46-dncxf" Mar 11 01:11:21 crc kubenswrapper[4744]: I0311 01:11:21.662614 4744 generic.go:334] "Generic (PLEG): container finished" podID="21573ca2-d902-4d30-b94a-7b5ae891e084" containerID="3ed461419d24a5be2955c18805620ac32fd448f4e872e77def2ba86ae9ae54fd" exitCode=0 Mar 11 01:11:21 crc kubenswrapper[4744]: I0311 01:11:21.662673 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-65fd2" event={"ID":"21573ca2-d902-4d30-b94a-7b5ae891e084","Type":"ContainerDied","Data":"3ed461419d24a5be2955c18805620ac32fd448f4e872e77def2ba86ae9ae54fd"} Mar 11 01:11:22 crc kubenswrapper[4744]: I0311 01:11:22.323797 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/speaker-czr59" Mar 11 01:11:22 crc kubenswrapper[4744]: I0311 01:11:22.672018 4744 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-65fd2" event={"ID":"21573ca2-d902-4d30-b94a-7b5ae891e084","Type":"ContainerStarted","Data":"b8bd365dd409c7ea2b434c20b194dab2e4d6b5d31ccd286d98ef5f0e90e6b3ad"} Mar 11 01:11:22 crc kubenswrapper[4744]: I0311 01:11:22.672270 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-65fd2" event={"ID":"21573ca2-d902-4d30-b94a-7b5ae891e084","Type":"ContainerStarted","Data":"ae6de8db2942415ce68a1819465fadb54dc65f5a27278a96e90b53838779afe6"} Mar 11 01:11:22 crc kubenswrapper[4744]: I0311 01:11:22.672281 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-65fd2" event={"ID":"21573ca2-d902-4d30-b94a-7b5ae891e084","Type":"ContainerStarted","Data":"fd2df36398a9f63824f79a25843d17c291fdaad50ac00ff750e924ccfc2a5025"} Mar 11 01:11:22 crc kubenswrapper[4744]: I0311 01:11:22.672290 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-65fd2" event={"ID":"21573ca2-d902-4d30-b94a-7b5ae891e084","Type":"ContainerStarted","Data":"62798d3eb0717a99f7acaf39a79ea98b93c9b77dcdd9784d6e61409e3707b701"} Mar 11 01:11:23 crc kubenswrapper[4744]: I0311 01:11:23.684277 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-65fd2" event={"ID":"21573ca2-d902-4d30-b94a-7b5ae891e084","Type":"ContainerStarted","Data":"d7b8ba17e8de45d09499f5925dcea4f8435252cb6a22f32db8199c62fbf8e29d"} Mar 11 01:11:23 crc kubenswrapper[4744]: I0311 01:11:23.684329 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-65fd2" event={"ID":"21573ca2-d902-4d30-b94a-7b5ae891e084","Type":"ContainerStarted","Data":"4de0170894a02b943145bfb203b311987a91ee09e044fb3a7876419f882782c9"} Mar 11 01:11:23 crc kubenswrapper[4744]: I0311 01:11:23.686494 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e59qt5s"] Mar 11 01:11:23 crc 
kubenswrapper[4744]: I0311 01:11:23.696398 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e59qt5s" Mar 11 01:11:23 crc kubenswrapper[4744]: I0311 01:11:23.700437 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/807021c7-f538-4f3a-abb8-b5eecaa837b0-bundle\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e59qt5s\" (UID: \"807021c7-f538-4f3a-abb8-b5eecaa837b0\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e59qt5s" Mar 11 01:11:23 crc kubenswrapper[4744]: I0311 01:11:23.700689 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/807021c7-f538-4f3a-abb8-b5eecaa837b0-util\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e59qt5s\" (UID: \"807021c7-f538-4f3a-abb8-b5eecaa837b0\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e59qt5s" Mar 11 01:11:23 crc kubenswrapper[4744]: I0311 01:11:23.700822 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lvjnx\" (UniqueName: \"kubernetes.io/projected/807021c7-f538-4f3a-abb8-b5eecaa837b0-kube-api-access-lvjnx\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e59qt5s\" (UID: \"807021c7-f538-4f3a-abb8-b5eecaa837b0\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e59qt5s" Mar 11 01:11:23 crc kubenswrapper[4744]: I0311 01:11:23.700499 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc" Mar 11 01:11:23 crc kubenswrapper[4744]: I0311 01:11:23.723365 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e59qt5s"] Mar 11 01:11:23 crc kubenswrapper[4744]: I0311 01:11:23.746221 4744 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/frr-k8s-65fd2" podStartSLOduration=5.950512787 podStartE2EDuration="13.746199396s" podCreationTimestamp="2026-03-11 01:11:10 +0000 UTC" firstStartedPulling="2026-03-11 01:11:10.875709746 +0000 UTC m=+1027.679927341" lastFinishedPulling="2026-03-11 01:11:18.671396315 +0000 UTC m=+1035.475613950" observedRunningTime="2026-03-11 01:11:23.740433237 +0000 UTC m=+1040.544650862" watchObservedRunningTime="2026-03-11 01:11:23.746199396 +0000 UTC m=+1040.550417011" Mar 11 01:11:23 crc kubenswrapper[4744]: I0311 01:11:23.801645 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/807021c7-f538-4f3a-abb8-b5eecaa837b0-bundle\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e59qt5s\" (UID: \"807021c7-f538-4f3a-abb8-b5eecaa837b0\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e59qt5s" Mar 11 01:11:23 crc kubenswrapper[4744]: I0311 01:11:23.802216 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/807021c7-f538-4f3a-abb8-b5eecaa837b0-util\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e59qt5s\" (UID: \"807021c7-f538-4f3a-abb8-b5eecaa837b0\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e59qt5s" Mar 11 01:11:23 crc kubenswrapper[4744]: I0311 01:11:23.802260 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lvjnx\" (UniqueName: \"kubernetes.io/projected/807021c7-f538-4f3a-abb8-b5eecaa837b0-kube-api-access-lvjnx\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e59qt5s\" (UID: 
\"807021c7-f538-4f3a-abb8-b5eecaa837b0\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e59qt5s" Mar 11 01:11:23 crc kubenswrapper[4744]: I0311 01:11:23.802173 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/807021c7-f538-4f3a-abb8-b5eecaa837b0-bundle\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e59qt5s\" (UID: \"807021c7-f538-4f3a-abb8-b5eecaa837b0\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e59qt5s" Mar 11 01:11:23 crc kubenswrapper[4744]: I0311 01:11:23.802897 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/807021c7-f538-4f3a-abb8-b5eecaa837b0-util\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e59qt5s\" (UID: \"807021c7-f538-4f3a-abb8-b5eecaa837b0\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e59qt5s" Mar 11 01:11:23 crc kubenswrapper[4744]: I0311 01:11:23.827034 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lvjnx\" (UniqueName: \"kubernetes.io/projected/807021c7-f538-4f3a-abb8-b5eecaa837b0-kube-api-access-lvjnx\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e59qt5s\" (UID: \"807021c7-f538-4f3a-abb8-b5eecaa837b0\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e59qt5s" Mar 11 01:11:24 crc kubenswrapper[4744]: I0311 01:11:24.052484 4744 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e59qt5s" Mar 11 01:11:24 crc kubenswrapper[4744]: I0311 01:11:24.358219 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e59qt5s"] Mar 11 01:11:24 crc kubenswrapper[4744]: I0311 01:11:24.694355 4744 generic.go:334] "Generic (PLEG): container finished" podID="807021c7-f538-4f3a-abb8-b5eecaa837b0" containerID="5003ac2cbf8cc894379d0c67fe65d96574db3761adcfd779be6452c9ff275b93" exitCode=0 Mar 11 01:11:24 crc kubenswrapper[4744]: I0311 01:11:24.696420 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e59qt5s" event={"ID":"807021c7-f538-4f3a-abb8-b5eecaa837b0","Type":"ContainerDied","Data":"5003ac2cbf8cc894379d0c67fe65d96574db3761adcfd779be6452c9ff275b93"} Mar 11 01:11:24 crc kubenswrapper[4744]: I0311 01:11:24.696472 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e59qt5s" event={"ID":"807021c7-f538-4f3a-abb8-b5eecaa837b0","Type":"ContainerStarted","Data":"649a90e410c77e52427d62e63b0ac67b506530c74b992ccadfb2c030f7b3042c"} Mar 11 01:11:24 crc kubenswrapper[4744]: I0311 01:11:24.696501 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/frr-k8s-65fd2" Mar 11 01:11:25 crc kubenswrapper[4744]: I0311 01:11:25.736647 4744 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="metallb-system/frr-k8s-65fd2" Mar 11 01:11:25 crc kubenswrapper[4744]: I0311 01:11:25.802210 4744 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="metallb-system/frr-k8s-65fd2" Mar 11 01:11:28 crc kubenswrapper[4744]: I0311 01:11:28.722456 4744 generic.go:334] "Generic (PLEG): container finished" podID="807021c7-f538-4f3a-abb8-b5eecaa837b0" 
containerID="1f9d4530d9fe8ce49b266d5b6088d31f5a1fcda5ed16369508b48743575e9697" exitCode=0 Mar 11 01:11:28 crc kubenswrapper[4744]: I0311 01:11:28.722560 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e59qt5s" event={"ID":"807021c7-f538-4f3a-abb8-b5eecaa837b0","Type":"ContainerDied","Data":"1f9d4530d9fe8ce49b266d5b6088d31f5a1fcda5ed16369508b48743575e9697"} Mar 11 01:11:29 crc kubenswrapper[4744]: I0311 01:11:29.730661 4744 generic.go:334] "Generic (PLEG): container finished" podID="807021c7-f538-4f3a-abb8-b5eecaa837b0" containerID="ccd778c0d44993becc616808c01a6624132f4b4a57009ce73a68011453947159" exitCode=0 Mar 11 01:11:29 crc kubenswrapper[4744]: I0311 01:11:29.730703 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e59qt5s" event={"ID":"807021c7-f538-4f3a-abb8-b5eecaa837b0","Type":"ContainerDied","Data":"ccd778c0d44993becc616808c01a6624132f4b4a57009ce73a68011453947159"} Mar 11 01:11:31 crc kubenswrapper[4744]: I0311 01:11:31.078617 4744 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e59qt5s" Mar 11 01:11:31 crc kubenswrapper[4744]: I0311 01:11:31.148860 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lvjnx\" (UniqueName: \"kubernetes.io/projected/807021c7-f538-4f3a-abb8-b5eecaa837b0-kube-api-access-lvjnx\") pod \"807021c7-f538-4f3a-abb8-b5eecaa837b0\" (UID: \"807021c7-f538-4f3a-abb8-b5eecaa837b0\") " Mar 11 01:11:31 crc kubenswrapper[4744]: I0311 01:11:31.148940 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/807021c7-f538-4f3a-abb8-b5eecaa837b0-util\") pod \"807021c7-f538-4f3a-abb8-b5eecaa837b0\" (UID: \"807021c7-f538-4f3a-abb8-b5eecaa837b0\") " Mar 11 01:11:31 crc kubenswrapper[4744]: I0311 01:11:31.148976 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/807021c7-f538-4f3a-abb8-b5eecaa837b0-bundle\") pod \"807021c7-f538-4f3a-abb8-b5eecaa837b0\" (UID: \"807021c7-f538-4f3a-abb8-b5eecaa837b0\") " Mar 11 01:11:31 crc kubenswrapper[4744]: I0311 01:11:31.150333 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/807021c7-f538-4f3a-abb8-b5eecaa837b0-bundle" (OuterVolumeSpecName: "bundle") pod "807021c7-f538-4f3a-abb8-b5eecaa837b0" (UID: "807021c7-f538-4f3a-abb8-b5eecaa837b0"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 11 01:11:31 crc kubenswrapper[4744]: I0311 01:11:31.161295 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/807021c7-f538-4f3a-abb8-b5eecaa837b0-kube-api-access-lvjnx" (OuterVolumeSpecName: "kube-api-access-lvjnx") pod "807021c7-f538-4f3a-abb8-b5eecaa837b0" (UID: "807021c7-f538-4f3a-abb8-b5eecaa837b0"). InnerVolumeSpecName "kube-api-access-lvjnx". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 01:11:31 crc kubenswrapper[4744]: I0311 01:11:31.164332 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/807021c7-f538-4f3a-abb8-b5eecaa837b0-util" (OuterVolumeSpecName: "util") pod "807021c7-f538-4f3a-abb8-b5eecaa837b0" (UID: "807021c7-f538-4f3a-abb8-b5eecaa837b0"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 11 01:11:31 crc kubenswrapper[4744]: I0311 01:11:31.250699 4744 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/807021c7-f538-4f3a-abb8-b5eecaa837b0-bundle\") on node \"crc\" DevicePath \"\"" Mar 11 01:11:31 crc kubenswrapper[4744]: I0311 01:11:31.251168 4744 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lvjnx\" (UniqueName: \"kubernetes.io/projected/807021c7-f538-4f3a-abb8-b5eecaa837b0-kube-api-access-lvjnx\") on node \"crc\" DevicePath \"\"" Mar 11 01:11:31 crc kubenswrapper[4744]: I0311 01:11:31.251254 4744 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/807021c7-f538-4f3a-abb8-b5eecaa837b0-util\") on node \"crc\" DevicePath \"\"" Mar 11 01:11:31 crc kubenswrapper[4744]: I0311 01:11:31.341462 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/frr-k8s-webhook-server-7f989f654f-pm8pv" Mar 11 01:11:31 crc kubenswrapper[4744]: I0311 01:11:31.763136 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e59qt5s" event={"ID":"807021c7-f538-4f3a-abb8-b5eecaa837b0","Type":"ContainerDied","Data":"649a90e410c77e52427d62e63b0ac67b506530c74b992ccadfb2c030f7b3042c"} Mar 11 01:11:31 crc kubenswrapper[4744]: I0311 01:11:31.763186 4744 pod_container_deletor.go:80] "Container not found in pod's containers" 
containerID="649a90e410c77e52427d62e63b0ac67b506530c74b992ccadfb2c030f7b3042c" Mar 11 01:11:31 crc kubenswrapper[4744]: I0311 01:11:31.763254 4744 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e59qt5s" Mar 11 01:11:36 crc kubenswrapper[4744]: I0311 01:11:36.368244 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager-operator/cert-manager-operator-controller-manager-66c8bdd694-jgqg4"] Mar 11 01:11:36 crc kubenswrapper[4744]: E0311 01:11:36.368761 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="807021c7-f538-4f3a-abb8-b5eecaa837b0" containerName="extract" Mar 11 01:11:36 crc kubenswrapper[4744]: I0311 01:11:36.368775 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="807021c7-f538-4f3a-abb8-b5eecaa837b0" containerName="extract" Mar 11 01:11:36 crc kubenswrapper[4744]: E0311 01:11:36.368793 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="807021c7-f538-4f3a-abb8-b5eecaa837b0" containerName="util" Mar 11 01:11:36 crc kubenswrapper[4744]: I0311 01:11:36.368799 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="807021c7-f538-4f3a-abb8-b5eecaa837b0" containerName="util" Mar 11 01:11:36 crc kubenswrapper[4744]: E0311 01:11:36.368811 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="807021c7-f538-4f3a-abb8-b5eecaa837b0" containerName="pull" Mar 11 01:11:36 crc kubenswrapper[4744]: I0311 01:11:36.368817 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="807021c7-f538-4f3a-abb8-b5eecaa837b0" containerName="pull" Mar 11 01:11:36 crc kubenswrapper[4744]: I0311 01:11:36.368907 4744 memory_manager.go:354] "RemoveStaleState removing state" podUID="807021c7-f538-4f3a-abb8-b5eecaa837b0" containerName="extract" Mar 11 01:11:36 crc kubenswrapper[4744]: I0311 01:11:36.369273 4744 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager-operator/cert-manager-operator-controller-manager-66c8bdd694-jgqg4" Mar 11 01:11:36 crc kubenswrapper[4744]: I0311 01:11:36.372495 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager-operator"/"openshift-service-ca.crt" Mar 11 01:11:36 crc kubenswrapper[4744]: I0311 01:11:36.372743 4744 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager-operator"/"cert-manager-operator-controller-manager-dockercfg-8448l" Mar 11 01:11:36 crc kubenswrapper[4744]: I0311 01:11:36.373608 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager-operator"/"kube-root-ca.crt" Mar 11 01:11:36 crc kubenswrapper[4744]: I0311 01:11:36.434422 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager-operator/cert-manager-operator-controller-manager-66c8bdd694-jgqg4"] Mar 11 01:11:36 crc kubenswrapper[4744]: I0311 01:11:36.519137 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/981d522c-23f8-4e93-9332-f9c2ec45d6ba-tmp\") pod \"cert-manager-operator-controller-manager-66c8bdd694-jgqg4\" (UID: \"981d522c-23f8-4e93-9332-f9c2ec45d6ba\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-66c8bdd694-jgqg4" Mar 11 01:11:36 crc kubenswrapper[4744]: I0311 01:11:36.519240 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qvbgw\" (UniqueName: \"kubernetes.io/projected/981d522c-23f8-4e93-9332-f9c2ec45d6ba-kube-api-access-qvbgw\") pod \"cert-manager-operator-controller-manager-66c8bdd694-jgqg4\" (UID: \"981d522c-23f8-4e93-9332-f9c2ec45d6ba\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-66c8bdd694-jgqg4" Mar 11 01:11:36 crc kubenswrapper[4744]: I0311 01:11:36.620427 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-qvbgw\" (UniqueName: \"kubernetes.io/projected/981d522c-23f8-4e93-9332-f9c2ec45d6ba-kube-api-access-qvbgw\") pod \"cert-manager-operator-controller-manager-66c8bdd694-jgqg4\" (UID: \"981d522c-23f8-4e93-9332-f9c2ec45d6ba\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-66c8bdd694-jgqg4" Mar 11 01:11:36 crc kubenswrapper[4744]: I0311 01:11:36.620536 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/981d522c-23f8-4e93-9332-f9c2ec45d6ba-tmp\") pod \"cert-manager-operator-controller-manager-66c8bdd694-jgqg4\" (UID: \"981d522c-23f8-4e93-9332-f9c2ec45d6ba\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-66c8bdd694-jgqg4" Mar 11 01:11:36 crc kubenswrapper[4744]: I0311 01:11:36.621091 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/981d522c-23f8-4e93-9332-f9c2ec45d6ba-tmp\") pod \"cert-manager-operator-controller-manager-66c8bdd694-jgqg4\" (UID: \"981d522c-23f8-4e93-9332-f9c2ec45d6ba\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-66c8bdd694-jgqg4" Mar 11 01:11:36 crc kubenswrapper[4744]: I0311 01:11:36.646858 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qvbgw\" (UniqueName: \"kubernetes.io/projected/981d522c-23f8-4e93-9332-f9c2ec45d6ba-kube-api-access-qvbgw\") pod \"cert-manager-operator-controller-manager-66c8bdd694-jgqg4\" (UID: \"981d522c-23f8-4e93-9332-f9c2ec45d6ba\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-66c8bdd694-jgqg4" Mar 11 01:11:36 crc kubenswrapper[4744]: I0311 01:11:36.683396 4744 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager-operator/cert-manager-operator-controller-manager-66c8bdd694-jgqg4" Mar 11 01:11:37 crc kubenswrapper[4744]: I0311 01:11:37.098321 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager-operator/cert-manager-operator-controller-manager-66c8bdd694-jgqg4"] Mar 11 01:11:37 crc kubenswrapper[4744]: W0311 01:11:37.101493 4744 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod981d522c_23f8_4e93_9332_f9c2ec45d6ba.slice/crio-7b0d1acb3f8e5e5221886e2cb6b4490d48d3af89cfaa0d0d42f51e78d26fb67b WatchSource:0}: Error finding container 7b0d1acb3f8e5e5221886e2cb6b4490d48d3af89cfaa0d0d42f51e78d26fb67b: Status 404 returned error can't find the container with id 7b0d1acb3f8e5e5221886e2cb6b4490d48d3af89cfaa0d0d42f51e78d26fb67b Mar 11 01:11:37 crc kubenswrapper[4744]: I0311 01:11:37.809259 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager-operator/cert-manager-operator-controller-manager-66c8bdd694-jgqg4" event={"ID":"981d522c-23f8-4e93-9332-f9c2ec45d6ba","Type":"ContainerStarted","Data":"7b0d1acb3f8e5e5221886e2cb6b4490d48d3af89cfaa0d0d42f51e78d26fb67b"} Mar 11 01:11:40 crc kubenswrapper[4744]: I0311 01:11:40.738925 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/frr-k8s-65fd2" Mar 11 01:11:40 crc kubenswrapper[4744]: I0311 01:11:40.826782 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager-operator/cert-manager-operator-controller-manager-66c8bdd694-jgqg4" event={"ID":"981d522c-23f8-4e93-9332-f9c2ec45d6ba","Type":"ContainerStarted","Data":"9335e58fae6d2156b9657c99c636789bd9ef0c0209f34d2fda7ecd50f9981641"} Mar 11 01:11:40 crc kubenswrapper[4744]: I0311 01:11:40.854622 4744 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager-operator/cert-manager-operator-controller-manager-66c8bdd694-jgqg4" podStartSLOduration=2.081873972 
podStartE2EDuration="4.854606804s" podCreationTimestamp="2026-03-11 01:11:36 +0000 UTC" firstStartedPulling="2026-03-11 01:11:37.10474743 +0000 UTC m=+1053.908965055" lastFinishedPulling="2026-03-11 01:11:39.877480282 +0000 UTC m=+1056.681697887" observedRunningTime="2026-03-11 01:11:40.853449168 +0000 UTC m=+1057.657666813" watchObservedRunningTime="2026-03-11 01:11:40.854606804 +0000 UTC m=+1057.658824409" Mar 11 01:11:43 crc kubenswrapper[4744]: I0311 01:11:43.875357 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-webhook-6888856db4-wk9vl"] Mar 11 01:11:43 crc kubenswrapper[4744]: I0311 01:11:43.876566 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-webhook-6888856db4-wk9vl" Mar 11 01:11:43 crc kubenswrapper[4744]: I0311 01:11:43.878244 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager"/"openshift-service-ca.crt" Mar 11 01:11:43 crc kubenswrapper[4744]: I0311 01:11:43.878684 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager"/"kube-root-ca.crt" Mar 11 01:11:43 crc kubenswrapper[4744]: I0311 01:11:43.879705 4744 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-webhook-dockercfg-zv666" Mar 11 01:11:43 crc kubenswrapper[4744]: I0311 01:11:43.887177 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-6888856db4-wk9vl"] Mar 11 01:11:44 crc kubenswrapper[4744]: I0311 01:11:44.047341 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/68995fde-1ad0-4641-8cb3-2af8f1117cfd-bound-sa-token\") pod \"cert-manager-webhook-6888856db4-wk9vl\" (UID: \"68995fde-1ad0-4641-8cb3-2af8f1117cfd\") " pod="cert-manager/cert-manager-webhook-6888856db4-wk9vl" Mar 11 01:11:44 crc kubenswrapper[4744]: I0311 01:11:44.047398 4744 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mh8sf\" (UniqueName: \"kubernetes.io/projected/68995fde-1ad0-4641-8cb3-2af8f1117cfd-kube-api-access-mh8sf\") pod \"cert-manager-webhook-6888856db4-wk9vl\" (UID: \"68995fde-1ad0-4641-8cb3-2af8f1117cfd\") " pod="cert-manager/cert-manager-webhook-6888856db4-wk9vl" Mar 11 01:11:44 crc kubenswrapper[4744]: I0311 01:11:44.149098 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/68995fde-1ad0-4641-8cb3-2af8f1117cfd-bound-sa-token\") pod \"cert-manager-webhook-6888856db4-wk9vl\" (UID: \"68995fde-1ad0-4641-8cb3-2af8f1117cfd\") " pod="cert-manager/cert-manager-webhook-6888856db4-wk9vl" Mar 11 01:11:44 crc kubenswrapper[4744]: I0311 01:11:44.149157 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mh8sf\" (UniqueName: \"kubernetes.io/projected/68995fde-1ad0-4641-8cb3-2af8f1117cfd-kube-api-access-mh8sf\") pod \"cert-manager-webhook-6888856db4-wk9vl\" (UID: \"68995fde-1ad0-4641-8cb3-2af8f1117cfd\") " pod="cert-manager/cert-manager-webhook-6888856db4-wk9vl" Mar 11 01:11:44 crc kubenswrapper[4744]: I0311 01:11:44.171379 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/68995fde-1ad0-4641-8cb3-2af8f1117cfd-bound-sa-token\") pod \"cert-manager-webhook-6888856db4-wk9vl\" (UID: \"68995fde-1ad0-4641-8cb3-2af8f1117cfd\") " pod="cert-manager/cert-manager-webhook-6888856db4-wk9vl" Mar 11 01:11:44 crc kubenswrapper[4744]: I0311 01:11:44.179823 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mh8sf\" (UniqueName: \"kubernetes.io/projected/68995fde-1ad0-4641-8cb3-2af8f1117cfd-kube-api-access-mh8sf\") pod \"cert-manager-webhook-6888856db4-wk9vl\" (UID: \"68995fde-1ad0-4641-8cb3-2af8f1117cfd\") " 
pod="cert-manager/cert-manager-webhook-6888856db4-wk9vl" Mar 11 01:11:44 crc kubenswrapper[4744]: I0311 01:11:44.198351 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-webhook-6888856db4-wk9vl" Mar 11 01:11:44 crc kubenswrapper[4744]: I0311 01:11:44.688841 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-6888856db4-wk9vl"] Mar 11 01:11:44 crc kubenswrapper[4744]: W0311 01:11:44.734272 4744 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod68995fde_1ad0_4641_8cb3_2af8f1117cfd.slice/crio-f196a64ebc55b340af13e936269102bba37e1bf8e7b3e0413dab3901285c5cb4 WatchSource:0}: Error finding container f196a64ebc55b340af13e936269102bba37e1bf8e7b3e0413dab3901285c5cb4: Status 404 returned error can't find the container with id f196a64ebc55b340af13e936269102bba37e1bf8e7b3e0413dab3901285c5cb4 Mar 11 01:11:44 crc kubenswrapper[4744]: I0311 01:11:44.856277 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-6888856db4-wk9vl" event={"ID":"68995fde-1ad0-4641-8cb3-2af8f1117cfd","Type":"ContainerStarted","Data":"f196a64ebc55b340af13e936269102bba37e1bf8e7b3e0413dab3901285c5cb4"} Mar 11 01:11:45 crc kubenswrapper[4744]: I0311 01:11:45.530500 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-cainjector-5545bd876-gdbpx"] Mar 11 01:11:45 crc kubenswrapper[4744]: I0311 01:11:45.532707 4744 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-cainjector-5545bd876-gdbpx" Mar 11 01:11:45 crc kubenswrapper[4744]: I0311 01:11:45.544769 4744 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-cainjector-dockercfg-2zxkw" Mar 11 01:11:45 crc kubenswrapper[4744]: I0311 01:11:45.549084 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-5545bd876-gdbpx"] Mar 11 01:11:45 crc kubenswrapper[4744]: I0311 01:11:45.668701 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5fzzc\" (UniqueName: \"kubernetes.io/projected/9496814f-7ec3-4763-a421-1a050c4b1ff5-kube-api-access-5fzzc\") pod \"cert-manager-cainjector-5545bd876-gdbpx\" (UID: \"9496814f-7ec3-4763-a421-1a050c4b1ff5\") " pod="cert-manager/cert-manager-cainjector-5545bd876-gdbpx" Mar 11 01:11:45 crc kubenswrapper[4744]: I0311 01:11:45.668836 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/9496814f-7ec3-4763-a421-1a050c4b1ff5-bound-sa-token\") pod \"cert-manager-cainjector-5545bd876-gdbpx\" (UID: \"9496814f-7ec3-4763-a421-1a050c4b1ff5\") " pod="cert-manager/cert-manager-cainjector-5545bd876-gdbpx" Mar 11 01:11:45 crc kubenswrapper[4744]: I0311 01:11:45.769991 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5fzzc\" (UniqueName: \"kubernetes.io/projected/9496814f-7ec3-4763-a421-1a050c4b1ff5-kube-api-access-5fzzc\") pod \"cert-manager-cainjector-5545bd876-gdbpx\" (UID: \"9496814f-7ec3-4763-a421-1a050c4b1ff5\") " pod="cert-manager/cert-manager-cainjector-5545bd876-gdbpx" Mar 11 01:11:45 crc kubenswrapper[4744]: I0311 01:11:45.770142 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: 
\"kubernetes.io/projected/9496814f-7ec3-4763-a421-1a050c4b1ff5-bound-sa-token\") pod \"cert-manager-cainjector-5545bd876-gdbpx\" (UID: \"9496814f-7ec3-4763-a421-1a050c4b1ff5\") " pod="cert-manager/cert-manager-cainjector-5545bd876-gdbpx" Mar 11 01:11:45 crc kubenswrapper[4744]: I0311 01:11:45.798368 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5fzzc\" (UniqueName: \"kubernetes.io/projected/9496814f-7ec3-4763-a421-1a050c4b1ff5-kube-api-access-5fzzc\") pod \"cert-manager-cainjector-5545bd876-gdbpx\" (UID: \"9496814f-7ec3-4763-a421-1a050c4b1ff5\") " pod="cert-manager/cert-manager-cainjector-5545bd876-gdbpx" Mar 11 01:11:45 crc kubenswrapper[4744]: I0311 01:11:45.798880 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/9496814f-7ec3-4763-a421-1a050c4b1ff5-bound-sa-token\") pod \"cert-manager-cainjector-5545bd876-gdbpx\" (UID: \"9496814f-7ec3-4763-a421-1a050c4b1ff5\") " pod="cert-manager/cert-manager-cainjector-5545bd876-gdbpx" Mar 11 01:11:45 crc kubenswrapper[4744]: I0311 01:11:45.853269 4744 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-cainjector-5545bd876-gdbpx" Mar 11 01:11:46 crc kubenswrapper[4744]: I0311 01:11:46.406208 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-5545bd876-gdbpx"] Mar 11 01:11:46 crc kubenswrapper[4744]: W0311 01:11:46.413463 4744 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9496814f_7ec3_4763_a421_1a050c4b1ff5.slice/crio-901df26d7a00ea982151a4051a31a1de4039c63fa5ba99238fff2b312e5815a6 WatchSource:0}: Error finding container 901df26d7a00ea982151a4051a31a1de4039c63fa5ba99238fff2b312e5815a6: Status 404 returned error can't find the container with id 901df26d7a00ea982151a4051a31a1de4039c63fa5ba99238fff2b312e5815a6 Mar 11 01:11:46 crc kubenswrapper[4744]: I0311 01:11:46.881231 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-5545bd876-gdbpx" event={"ID":"9496814f-7ec3-4763-a421-1a050c4b1ff5","Type":"ContainerStarted","Data":"901df26d7a00ea982151a4051a31a1de4039c63fa5ba99238fff2b312e5815a6"} Mar 11 01:11:49 crc kubenswrapper[4744]: I0311 01:11:49.900215 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-6888856db4-wk9vl" event={"ID":"68995fde-1ad0-4641-8cb3-2af8f1117cfd","Type":"ContainerStarted","Data":"acc5d16e6334fdb6b97d44427bc01d0631418b03c38af8306fac82fa6d8103c6"} Mar 11 01:11:49 crc kubenswrapper[4744]: I0311 01:11:49.900808 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="cert-manager/cert-manager-webhook-6888856db4-wk9vl" Mar 11 01:11:49 crc kubenswrapper[4744]: I0311 01:11:49.925541 4744 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-webhook-6888856db4-wk9vl" podStartSLOduration=2.573015363 podStartE2EDuration="6.925496716s" podCreationTimestamp="2026-03-11 01:11:43 +0000 UTC" firstStartedPulling="2026-03-11 
01:11:44.749142077 +0000 UTC m=+1061.553359702" lastFinishedPulling="2026-03-11 01:11:49.10162345 +0000 UTC m=+1065.905841055" observedRunningTime="2026-03-11 01:11:49.923289598 +0000 UTC m=+1066.727507243" watchObservedRunningTime="2026-03-11 01:11:49.925496716 +0000 UTC m=+1066.729714361" Mar 11 01:11:52 crc kubenswrapper[4744]: I0311 01:11:52.934328 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-5545bd876-gdbpx" event={"ID":"9496814f-7ec3-4763-a421-1a050c4b1ff5","Type":"ContainerStarted","Data":"fbf49f2512ed3f459587a3ceaa204b36d2dca8fa0cb9f865e4c0985befa3664f"} Mar 11 01:11:52 crc kubenswrapper[4744]: I0311 01:11:52.966685 4744 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-cainjector-5545bd876-gdbpx" podStartSLOduration=2.455517468 podStartE2EDuration="7.966655377s" podCreationTimestamp="2026-03-11 01:11:45 +0000 UTC" firstStartedPulling="2026-03-11 01:11:46.415701162 +0000 UTC m=+1063.219918757" lastFinishedPulling="2026-03-11 01:11:51.926839021 +0000 UTC m=+1068.731056666" observedRunningTime="2026-03-11 01:11:52.955155051 +0000 UTC m=+1069.759372716" watchObservedRunningTime="2026-03-11 01:11:52.966655377 +0000 UTC m=+1069.770873012" Mar 11 01:11:54 crc kubenswrapper[4744]: I0311 01:11:54.203805 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="cert-manager/cert-manager-webhook-6888856db4-wk9vl" Mar 11 01:12:00 crc kubenswrapper[4744]: I0311 01:12:00.141345 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29553192-ggv6v"] Mar 11 01:12:00 crc kubenswrapper[4744]: I0311 01:12:00.142752 4744 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29553192-ggv6v" Mar 11 01:12:00 crc kubenswrapper[4744]: I0311 01:12:00.147645 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-rs77s" Mar 11 01:12:00 crc kubenswrapper[4744]: I0311 01:12:00.149038 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 11 01:12:00 crc kubenswrapper[4744]: I0311 01:12:00.150415 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 11 01:12:00 crc kubenswrapper[4744]: I0311 01:12:00.154446 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29553192-ggv6v"] Mar 11 01:12:00 crc kubenswrapper[4744]: I0311 01:12:00.295437 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bwz86\" (UniqueName: \"kubernetes.io/projected/adfdc5e7-2770-4cdb-ad17-194d0ca0fa59-kube-api-access-bwz86\") pod \"auto-csr-approver-29553192-ggv6v\" (UID: \"adfdc5e7-2770-4cdb-ad17-194d0ca0fa59\") " pod="openshift-infra/auto-csr-approver-29553192-ggv6v" Mar 11 01:12:00 crc kubenswrapper[4744]: I0311 01:12:00.397337 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bwz86\" (UniqueName: \"kubernetes.io/projected/adfdc5e7-2770-4cdb-ad17-194d0ca0fa59-kube-api-access-bwz86\") pod \"auto-csr-approver-29553192-ggv6v\" (UID: \"adfdc5e7-2770-4cdb-ad17-194d0ca0fa59\") " pod="openshift-infra/auto-csr-approver-29553192-ggv6v" Mar 11 01:12:00 crc kubenswrapper[4744]: I0311 01:12:00.427772 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bwz86\" (UniqueName: \"kubernetes.io/projected/adfdc5e7-2770-4cdb-ad17-194d0ca0fa59-kube-api-access-bwz86\") pod \"auto-csr-approver-29553192-ggv6v\" (UID: \"adfdc5e7-2770-4cdb-ad17-194d0ca0fa59\") " 
pod="openshift-infra/auto-csr-approver-29553192-ggv6v" Mar 11 01:12:00 crc kubenswrapper[4744]: I0311 01:12:00.499642 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29553192-ggv6v" Mar 11 01:12:00 crc kubenswrapper[4744]: I0311 01:12:00.754186 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29553192-ggv6v"] Mar 11 01:12:00 crc kubenswrapper[4744]: I0311 01:12:00.990111 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29553192-ggv6v" event={"ID":"adfdc5e7-2770-4cdb-ad17-194d0ca0fa59","Type":"ContainerStarted","Data":"16cfeb0ce6f06d07c5bcf48736c9cf575a21ba221d651fd17f08fc2f60e007a6"} Mar 11 01:12:01 crc kubenswrapper[4744]: I0311 01:12:01.836936 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-545d4d4674-9dkcx"] Mar 11 01:12:01 crc kubenswrapper[4744]: I0311 01:12:01.838280 4744 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-545d4d4674-9dkcx" Mar 11 01:12:01 crc kubenswrapper[4744]: I0311 01:12:01.841355 4744 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-dockercfg-9qp5c" Mar 11 01:12:01 crc kubenswrapper[4744]: I0311 01:12:01.847246 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-545d4d4674-9dkcx"] Mar 11 01:12:01 crc kubenswrapper[4744]: I0311 01:12:01.922064 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/351f588e-9ef6-498a-a322-b2f00dad1d35-bound-sa-token\") pod \"cert-manager-545d4d4674-9dkcx\" (UID: \"351f588e-9ef6-498a-a322-b2f00dad1d35\") " pod="cert-manager/cert-manager-545d4d4674-9dkcx" Mar 11 01:12:01 crc kubenswrapper[4744]: I0311 01:12:01.922120 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zr9f2\" (UniqueName: \"kubernetes.io/projected/351f588e-9ef6-498a-a322-b2f00dad1d35-kube-api-access-zr9f2\") pod \"cert-manager-545d4d4674-9dkcx\" (UID: \"351f588e-9ef6-498a-a322-b2f00dad1d35\") " pod="cert-manager/cert-manager-545d4d4674-9dkcx" Mar 11 01:12:02 crc kubenswrapper[4744]: I0311 01:12:02.002271 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29553192-ggv6v" event={"ID":"adfdc5e7-2770-4cdb-ad17-194d0ca0fa59","Type":"ContainerStarted","Data":"9c6dc98a464ea9f49c43579ba760e067869199edfe8854675ae6108fd84e037d"} Mar 11 01:12:02 crc kubenswrapper[4744]: I0311 01:12:02.023719 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/351f588e-9ef6-498a-a322-b2f00dad1d35-bound-sa-token\") pod \"cert-manager-545d4d4674-9dkcx\" (UID: \"351f588e-9ef6-498a-a322-b2f00dad1d35\") " pod="cert-manager/cert-manager-545d4d4674-9dkcx" Mar 11 01:12:02 crc 
kubenswrapper[4744]: I0311 01:12:02.024168 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zr9f2\" (UniqueName: \"kubernetes.io/projected/351f588e-9ef6-498a-a322-b2f00dad1d35-kube-api-access-zr9f2\") pod \"cert-manager-545d4d4674-9dkcx\" (UID: \"351f588e-9ef6-498a-a322-b2f00dad1d35\") " pod="cert-manager/cert-manager-545d4d4674-9dkcx" Mar 11 01:12:02 crc kubenswrapper[4744]: I0311 01:12:02.032559 4744 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29553192-ggv6v" podStartSLOduration=1.206020186 podStartE2EDuration="2.032538775s" podCreationTimestamp="2026-03-11 01:12:00 +0000 UTC" firstStartedPulling="2026-03-11 01:12:00.768149019 +0000 UTC m=+1077.572366634" lastFinishedPulling="2026-03-11 01:12:01.594667578 +0000 UTC m=+1078.398885223" observedRunningTime="2026-03-11 01:12:02.02430516 +0000 UTC m=+1078.828522795" watchObservedRunningTime="2026-03-11 01:12:02.032538775 +0000 UTC m=+1078.836756410" Mar 11 01:12:02 crc kubenswrapper[4744]: I0311 01:12:02.057829 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zr9f2\" (UniqueName: \"kubernetes.io/projected/351f588e-9ef6-498a-a322-b2f00dad1d35-kube-api-access-zr9f2\") pod \"cert-manager-545d4d4674-9dkcx\" (UID: \"351f588e-9ef6-498a-a322-b2f00dad1d35\") " pod="cert-manager/cert-manager-545d4d4674-9dkcx" Mar 11 01:12:02 crc kubenswrapper[4744]: I0311 01:12:02.059345 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/351f588e-9ef6-498a-a322-b2f00dad1d35-bound-sa-token\") pod \"cert-manager-545d4d4674-9dkcx\" (UID: \"351f588e-9ef6-498a-a322-b2f00dad1d35\") " pod="cert-manager/cert-manager-545d4d4674-9dkcx" Mar 11 01:12:02 crc kubenswrapper[4744]: I0311 01:12:02.184813 4744 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-545d4d4674-9dkcx" Mar 11 01:12:02 crc kubenswrapper[4744]: I0311 01:12:02.658545 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-545d4d4674-9dkcx"] Mar 11 01:12:02 crc kubenswrapper[4744]: W0311 01:12:02.665693 4744 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod351f588e_9ef6_498a_a322_b2f00dad1d35.slice/crio-e1087bfe554b2bec51fdd6b36e94ef9d1ab0ca25c296e573b6178ec177193789 WatchSource:0}: Error finding container e1087bfe554b2bec51fdd6b36e94ef9d1ab0ca25c296e573b6178ec177193789: Status 404 returned error can't find the container with id e1087bfe554b2bec51fdd6b36e94ef9d1ab0ca25c296e573b6178ec177193789 Mar 11 01:12:03 crc kubenswrapper[4744]: I0311 01:12:03.013351 4744 generic.go:334] "Generic (PLEG): container finished" podID="adfdc5e7-2770-4cdb-ad17-194d0ca0fa59" containerID="9c6dc98a464ea9f49c43579ba760e067869199edfe8854675ae6108fd84e037d" exitCode=0 Mar 11 01:12:03 crc kubenswrapper[4744]: I0311 01:12:03.013441 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29553192-ggv6v" event={"ID":"adfdc5e7-2770-4cdb-ad17-194d0ca0fa59","Type":"ContainerDied","Data":"9c6dc98a464ea9f49c43579ba760e067869199edfe8854675ae6108fd84e037d"} Mar 11 01:12:03 crc kubenswrapper[4744]: I0311 01:12:03.015989 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-545d4d4674-9dkcx" event={"ID":"351f588e-9ef6-498a-a322-b2f00dad1d35","Type":"ContainerStarted","Data":"08e2a6e60b1056c141420be3294f2821edd8f2960658f6b460f0a3952aa9398a"} Mar 11 01:12:03 crc kubenswrapper[4744]: I0311 01:12:03.016042 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-545d4d4674-9dkcx" event={"ID":"351f588e-9ef6-498a-a322-b2f00dad1d35","Type":"ContainerStarted","Data":"e1087bfe554b2bec51fdd6b36e94ef9d1ab0ca25c296e573b6178ec177193789"} 
Mar 11 01:12:03 crc kubenswrapper[4744]: I0311 01:12:03.056348 4744 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-545d4d4674-9dkcx" podStartSLOduration=2.056328765 podStartE2EDuration="2.056328765s" podCreationTimestamp="2026-03-11 01:12:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-11 01:12:03.050275778 +0000 UTC m=+1079.854493393" watchObservedRunningTime="2026-03-11 01:12:03.056328765 +0000 UTC m=+1079.860546380" Mar 11 01:12:04 crc kubenswrapper[4744]: I0311 01:12:04.346839 4744 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29553192-ggv6v" Mar 11 01:12:04 crc kubenswrapper[4744]: I0311 01:12:04.464942 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bwz86\" (UniqueName: \"kubernetes.io/projected/adfdc5e7-2770-4cdb-ad17-194d0ca0fa59-kube-api-access-bwz86\") pod \"adfdc5e7-2770-4cdb-ad17-194d0ca0fa59\" (UID: \"adfdc5e7-2770-4cdb-ad17-194d0ca0fa59\") " Mar 11 01:12:04 crc kubenswrapper[4744]: I0311 01:12:04.472695 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/adfdc5e7-2770-4cdb-ad17-194d0ca0fa59-kube-api-access-bwz86" (OuterVolumeSpecName: "kube-api-access-bwz86") pod "adfdc5e7-2770-4cdb-ad17-194d0ca0fa59" (UID: "adfdc5e7-2770-4cdb-ad17-194d0ca0fa59"). InnerVolumeSpecName "kube-api-access-bwz86". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 01:12:04 crc kubenswrapper[4744]: I0311 01:12:04.567218 4744 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bwz86\" (UniqueName: \"kubernetes.io/projected/adfdc5e7-2770-4cdb-ad17-194d0ca0fa59-kube-api-access-bwz86\") on node \"crc\" DevicePath \"\"" Mar 11 01:12:05 crc kubenswrapper[4744]: I0311 01:12:05.043479 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29553192-ggv6v" event={"ID":"adfdc5e7-2770-4cdb-ad17-194d0ca0fa59","Type":"ContainerDied","Data":"16cfeb0ce6f06d07c5bcf48736c9cf575a21ba221d651fd17f08fc2f60e007a6"} Mar 11 01:12:05 crc kubenswrapper[4744]: I0311 01:12:05.043599 4744 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="16cfeb0ce6f06d07c5bcf48736c9cf575a21ba221d651fd17f08fc2f60e007a6" Mar 11 01:12:05 crc kubenswrapper[4744]: I0311 01:12:05.043599 4744 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29553192-ggv6v" Mar 11 01:12:05 crc kubenswrapper[4744]: I0311 01:12:05.089255 4744 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29553186-q5zpz"] Mar 11 01:12:05 crc kubenswrapper[4744]: I0311 01:12:05.096755 4744 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29553186-q5zpz"] Mar 11 01:12:05 crc kubenswrapper[4744]: I0311 01:12:05.985003 4744 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f0d8650d-fb33-4ea1-8433-cd108c110664" path="/var/lib/kubelet/pods/f0d8650d-fb33-4ea1-8433-cd108c110664/volumes" Mar 11 01:12:08 crc kubenswrapper[4744]: I0311 01:12:08.770733 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-index-jg9ct"] Mar 11 01:12:08 crc kubenswrapper[4744]: E0311 01:12:08.777285 4744 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="adfdc5e7-2770-4cdb-ad17-194d0ca0fa59" containerName="oc" Mar 11 01:12:08 crc kubenswrapper[4744]: I0311 01:12:08.777325 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="adfdc5e7-2770-4cdb-ad17-194d0ca0fa59" containerName="oc" Mar 11 01:12:08 crc kubenswrapper[4744]: I0311 01:12:08.777582 4744 memory_manager.go:354] "RemoveStaleState removing state" podUID="adfdc5e7-2770-4cdb-ad17-194d0ca0fa59" containerName="oc" Mar 11 01:12:08 crc kubenswrapper[4744]: I0311 01:12:08.778085 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-jg9ct" Mar 11 01:12:08 crc kubenswrapper[4744]: I0311 01:12:08.779898 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-operators"/"kube-root-ca.crt" Mar 11 01:12:08 crc kubenswrapper[4744]: I0311 01:12:08.780466 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-index-dockercfg-bfngk" Mar 11 01:12:08 crc kubenswrapper[4744]: I0311 01:12:08.782401 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-operators"/"openshift-service-ca.crt" Mar 11 01:12:08 crc kubenswrapper[4744]: I0311 01:12:08.783527 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-jg9ct"] Mar 11 01:12:08 crc kubenswrapper[4744]: I0311 01:12:08.928917 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8l9c5\" (UniqueName: \"kubernetes.io/projected/d9b7139a-f2c6-4e66-b9c3-df734f3f71fd-kube-api-access-8l9c5\") pod \"openstack-operator-index-jg9ct\" (UID: \"d9b7139a-f2c6-4e66-b9c3-df734f3f71fd\") " pod="openstack-operators/openstack-operator-index-jg9ct" Mar 11 01:12:09 crc kubenswrapper[4744]: I0311 01:12:09.030677 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8l9c5\" (UniqueName: 
\"kubernetes.io/projected/d9b7139a-f2c6-4e66-b9c3-df734f3f71fd-kube-api-access-8l9c5\") pod \"openstack-operator-index-jg9ct\" (UID: \"d9b7139a-f2c6-4e66-b9c3-df734f3f71fd\") " pod="openstack-operators/openstack-operator-index-jg9ct" Mar 11 01:12:09 crc kubenswrapper[4744]: I0311 01:12:09.062631 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8l9c5\" (UniqueName: \"kubernetes.io/projected/d9b7139a-f2c6-4e66-b9c3-df734f3f71fd-kube-api-access-8l9c5\") pod \"openstack-operator-index-jg9ct\" (UID: \"d9b7139a-f2c6-4e66-b9c3-df734f3f71fd\") " pod="openstack-operators/openstack-operator-index-jg9ct" Mar 11 01:12:09 crc kubenswrapper[4744]: I0311 01:12:09.146353 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-jg9ct" Mar 11 01:12:09 crc kubenswrapper[4744]: I0311 01:12:09.405434 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-jg9ct"] Mar 11 01:12:09 crc kubenswrapper[4744]: W0311 01:12:09.415872 4744 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd9b7139a_f2c6_4e66_b9c3_df734f3f71fd.slice/crio-3b7067f4de056ce08d67e37259a797ba535046caed143783ed0315f319ae1089 WatchSource:0}: Error finding container 3b7067f4de056ce08d67e37259a797ba535046caed143783ed0315f319ae1089: Status 404 returned error can't find the container with id 3b7067f4de056ce08d67e37259a797ba535046caed143783ed0315f319ae1089 Mar 11 01:12:10 crc kubenswrapper[4744]: I0311 01:12:10.084689 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-jg9ct" event={"ID":"d9b7139a-f2c6-4e66-b9c3-df734f3f71fd","Type":"ContainerStarted","Data":"3b7067f4de056ce08d67e37259a797ba535046caed143783ed0315f319ae1089"} Mar 11 01:12:11 crc kubenswrapper[4744]: I0311 01:12:11.095902 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack-operators/openstack-operator-index-jg9ct" event={"ID":"d9b7139a-f2c6-4e66-b9c3-df734f3f71fd","Type":"ContainerStarted","Data":"9e94df3e7d1dd1c3522f20797e3b09f8d4fe9c20a895fd1cee93392a348a14e5"} Mar 11 01:12:11 crc kubenswrapper[4744]: I0311 01:12:11.129276 4744 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-index-jg9ct" podStartSLOduration=2.278277244 podStartE2EDuration="3.1292449s" podCreationTimestamp="2026-03-11 01:12:08 +0000 UTC" firstStartedPulling="2026-03-11 01:12:09.419604538 +0000 UTC m=+1086.223822153" lastFinishedPulling="2026-03-11 01:12:10.270572164 +0000 UTC m=+1087.074789809" observedRunningTime="2026-03-11 01:12:11.116649368 +0000 UTC m=+1087.920867093" watchObservedRunningTime="2026-03-11 01:12:11.1292449 +0000 UTC m=+1087.933462535" Mar 11 01:12:12 crc kubenswrapper[4744]: I0311 01:12:12.139738 4744 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/openstack-operator-index-jg9ct"] Mar 11 01:12:12 crc kubenswrapper[4744]: I0311 01:12:12.741860 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-index-p7t5b"] Mar 11 01:12:12 crc kubenswrapper[4744]: I0311 01:12:12.744355 4744 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-index-p7t5b" Mar 11 01:12:12 crc kubenswrapper[4744]: I0311 01:12:12.769261 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-p7t5b"] Mar 11 01:12:12 crc kubenswrapper[4744]: I0311 01:12:12.796136 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zj76p\" (UniqueName: \"kubernetes.io/projected/04455556-98d1-4461-945a-9a5b74b6508f-kube-api-access-zj76p\") pod \"openstack-operator-index-p7t5b\" (UID: \"04455556-98d1-4461-945a-9a5b74b6508f\") " pod="openstack-operators/openstack-operator-index-p7t5b" Mar 11 01:12:12 crc kubenswrapper[4744]: I0311 01:12:12.898022 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zj76p\" (UniqueName: \"kubernetes.io/projected/04455556-98d1-4461-945a-9a5b74b6508f-kube-api-access-zj76p\") pod \"openstack-operator-index-p7t5b\" (UID: \"04455556-98d1-4461-945a-9a5b74b6508f\") " pod="openstack-operators/openstack-operator-index-p7t5b" Mar 11 01:12:12 crc kubenswrapper[4744]: I0311 01:12:12.925292 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zj76p\" (UniqueName: \"kubernetes.io/projected/04455556-98d1-4461-945a-9a5b74b6508f-kube-api-access-zj76p\") pod \"openstack-operator-index-p7t5b\" (UID: \"04455556-98d1-4461-945a-9a5b74b6508f\") " pod="openstack-operators/openstack-operator-index-p7t5b" Mar 11 01:12:13 crc kubenswrapper[4744]: I0311 01:12:13.080507 4744 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-index-p7t5b" Mar 11 01:12:13 crc kubenswrapper[4744]: I0311 01:12:13.117072 4744 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-operators/openstack-operator-index-jg9ct" podUID="d9b7139a-f2c6-4e66-b9c3-df734f3f71fd" containerName="registry-server" containerID="cri-o://9e94df3e7d1dd1c3522f20797e3b09f8d4fe9c20a895fd1cee93392a348a14e5" gracePeriod=2 Mar 11 01:12:13 crc kubenswrapper[4744]: I0311 01:12:13.557071 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-p7t5b"] Mar 11 01:12:13 crc kubenswrapper[4744]: W0311 01:12:13.563697 4744 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod04455556_98d1_4461_945a_9a5b74b6508f.slice/crio-92654550573c2de5648639f1f2a8eb407d3bce7627ac67d627439a0e0803c511 WatchSource:0}: Error finding container 92654550573c2de5648639f1f2a8eb407d3bce7627ac67d627439a0e0803c511: Status 404 returned error can't find the container with id 92654550573c2de5648639f1f2a8eb407d3bce7627ac67d627439a0e0803c511 Mar 11 01:12:13 crc kubenswrapper[4744]: I0311 01:12:13.601349 4744 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-index-jg9ct" Mar 11 01:12:13 crc kubenswrapper[4744]: I0311 01:12:13.752075 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8l9c5\" (UniqueName: \"kubernetes.io/projected/d9b7139a-f2c6-4e66-b9c3-df734f3f71fd-kube-api-access-8l9c5\") pod \"d9b7139a-f2c6-4e66-b9c3-df734f3f71fd\" (UID: \"d9b7139a-f2c6-4e66-b9c3-df734f3f71fd\") " Mar 11 01:12:13 crc kubenswrapper[4744]: I0311 01:12:13.759850 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d9b7139a-f2c6-4e66-b9c3-df734f3f71fd-kube-api-access-8l9c5" (OuterVolumeSpecName: "kube-api-access-8l9c5") pod "d9b7139a-f2c6-4e66-b9c3-df734f3f71fd" (UID: "d9b7139a-f2c6-4e66-b9c3-df734f3f71fd"). InnerVolumeSpecName "kube-api-access-8l9c5". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 01:12:13 crc kubenswrapper[4744]: I0311 01:12:13.854129 4744 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8l9c5\" (UniqueName: \"kubernetes.io/projected/d9b7139a-f2c6-4e66-b9c3-df734f3f71fd-kube-api-access-8l9c5\") on node \"crc\" DevicePath \"\"" Mar 11 01:12:14 crc kubenswrapper[4744]: I0311 01:12:14.125163 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-p7t5b" event={"ID":"04455556-98d1-4461-945a-9a5b74b6508f","Type":"ContainerStarted","Data":"92654550573c2de5648639f1f2a8eb407d3bce7627ac67d627439a0e0803c511"} Mar 11 01:12:14 crc kubenswrapper[4744]: I0311 01:12:14.127024 4744 generic.go:334] "Generic (PLEG): container finished" podID="d9b7139a-f2c6-4e66-b9c3-df734f3f71fd" containerID="9e94df3e7d1dd1c3522f20797e3b09f8d4fe9c20a895fd1cee93392a348a14e5" exitCode=0 Mar 11 01:12:14 crc kubenswrapper[4744]: I0311 01:12:14.127075 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-jg9ct" 
event={"ID":"d9b7139a-f2c6-4e66-b9c3-df734f3f71fd","Type":"ContainerDied","Data":"9e94df3e7d1dd1c3522f20797e3b09f8d4fe9c20a895fd1cee93392a348a14e5"} Mar 11 01:12:14 crc kubenswrapper[4744]: I0311 01:12:14.127087 4744 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-jg9ct" Mar 11 01:12:14 crc kubenswrapper[4744]: I0311 01:12:14.127106 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-jg9ct" event={"ID":"d9b7139a-f2c6-4e66-b9c3-df734f3f71fd","Type":"ContainerDied","Data":"3b7067f4de056ce08d67e37259a797ba535046caed143783ed0315f319ae1089"} Mar 11 01:12:14 crc kubenswrapper[4744]: I0311 01:12:14.127127 4744 scope.go:117] "RemoveContainer" containerID="9e94df3e7d1dd1c3522f20797e3b09f8d4fe9c20a895fd1cee93392a348a14e5" Mar 11 01:12:14 crc kubenswrapper[4744]: I0311 01:12:14.150326 4744 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/openstack-operator-index-jg9ct"] Mar 11 01:12:14 crc kubenswrapper[4744]: I0311 01:12:14.151067 4744 scope.go:117] "RemoveContainer" containerID="9e94df3e7d1dd1c3522f20797e3b09f8d4fe9c20a895fd1cee93392a348a14e5" Mar 11 01:12:14 crc kubenswrapper[4744]: E0311 01:12:14.151546 4744 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9e94df3e7d1dd1c3522f20797e3b09f8d4fe9c20a895fd1cee93392a348a14e5\": container with ID starting with 9e94df3e7d1dd1c3522f20797e3b09f8d4fe9c20a895fd1cee93392a348a14e5 not found: ID does not exist" containerID="9e94df3e7d1dd1c3522f20797e3b09f8d4fe9c20a895fd1cee93392a348a14e5" Mar 11 01:12:14 crc kubenswrapper[4744]: I0311 01:12:14.151590 4744 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9e94df3e7d1dd1c3522f20797e3b09f8d4fe9c20a895fd1cee93392a348a14e5"} err="failed to get container status 
\"9e94df3e7d1dd1c3522f20797e3b09f8d4fe9c20a895fd1cee93392a348a14e5\": rpc error: code = NotFound desc = could not find container \"9e94df3e7d1dd1c3522f20797e3b09f8d4fe9c20a895fd1cee93392a348a14e5\": container with ID starting with 9e94df3e7d1dd1c3522f20797e3b09f8d4fe9c20a895fd1cee93392a348a14e5 not found: ID does not exist" Mar 11 01:12:14 crc kubenswrapper[4744]: I0311 01:12:14.156198 4744 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-operators/openstack-operator-index-jg9ct"] Mar 11 01:12:15 crc kubenswrapper[4744]: I0311 01:12:15.136466 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-p7t5b" event={"ID":"04455556-98d1-4461-945a-9a5b74b6508f","Type":"ContainerStarted","Data":"9ce7b40f61dd042155918a2cc5d0d4b89b3e1f4b64d78d8bee9a120b488cef6f"} Mar 11 01:12:15 crc kubenswrapper[4744]: I0311 01:12:15.165535 4744 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-index-p7t5b" podStartSLOduration=2.638412834 podStartE2EDuration="3.165505111s" podCreationTimestamp="2026-03-11 01:12:12 +0000 UTC" firstStartedPulling="2026-03-11 01:12:13.567195103 +0000 UTC m=+1090.371412708" lastFinishedPulling="2026-03-11 01:12:14.09428738 +0000 UTC m=+1090.898504985" observedRunningTime="2026-03-11 01:12:15.163352294 +0000 UTC m=+1091.967569929" watchObservedRunningTime="2026-03-11 01:12:15.165505111 +0000 UTC m=+1091.969722706" Mar 11 01:12:15 crc kubenswrapper[4744]: I0311 01:12:15.986832 4744 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d9b7139a-f2c6-4e66-b9c3-df734f3f71fd" path="/var/lib/kubelet/pods/d9b7139a-f2c6-4e66-b9c3-df734f3f71fd/volumes" Mar 11 01:12:23 crc kubenswrapper[4744]: I0311 01:12:23.081438 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-index-p7t5b" Mar 11 01:12:23 crc kubenswrapper[4744]: I0311 01:12:23.082212 4744 kubelet.go:2542] "SyncLoop 
(probe)" probe="startup" status="unhealthy" pod="openstack-operators/openstack-operator-index-p7t5b" Mar 11 01:12:23 crc kubenswrapper[4744]: I0311 01:12:23.128824 4744 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-operators/openstack-operator-index-p7t5b" Mar 11 01:12:23 crc kubenswrapper[4744]: I0311 01:12:23.237972 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-index-p7t5b" Mar 11 01:12:25 crc kubenswrapper[4744]: I0311 01:12:25.181764 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/951b5e6f67e727348308cf03f2b463d3e2bd27386b453f6699e03a49bac8qt7"] Mar 11 01:12:25 crc kubenswrapper[4744]: E0311 01:12:25.183002 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d9b7139a-f2c6-4e66-b9c3-df734f3f71fd" containerName="registry-server" Mar 11 01:12:25 crc kubenswrapper[4744]: I0311 01:12:25.183031 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="d9b7139a-f2c6-4e66-b9c3-df734f3f71fd" containerName="registry-server" Mar 11 01:12:25 crc kubenswrapper[4744]: I0311 01:12:25.183140 4744 memory_manager.go:354] "RemoveStaleState removing state" podUID="d9b7139a-f2c6-4e66-b9c3-df734f3f71fd" containerName="registry-server" Mar 11 01:12:25 crc kubenswrapper[4744]: I0311 01:12:25.183956 4744 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/951b5e6f67e727348308cf03f2b463d3e2bd27386b453f6699e03a49bac8qt7" Mar 11 01:12:25 crc kubenswrapper[4744]: I0311 01:12:25.191897 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"default-dockercfg-5ck8f" Mar 11 01:12:25 crc kubenswrapper[4744]: I0311 01:12:25.195965 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/951b5e6f67e727348308cf03f2b463d3e2bd27386b453f6699e03a49bac8qt7"] Mar 11 01:12:25 crc kubenswrapper[4744]: I0311 01:12:25.329106 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qxmkt\" (UniqueName: \"kubernetes.io/projected/1e9ee3f9-00c2-4838-b6e8-acc23ff0ffe9-kube-api-access-qxmkt\") pod \"951b5e6f67e727348308cf03f2b463d3e2bd27386b453f6699e03a49bac8qt7\" (UID: \"1e9ee3f9-00c2-4838-b6e8-acc23ff0ffe9\") " pod="openstack-operators/951b5e6f67e727348308cf03f2b463d3e2bd27386b453f6699e03a49bac8qt7" Mar 11 01:12:25 crc kubenswrapper[4744]: I0311 01:12:25.329196 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/1e9ee3f9-00c2-4838-b6e8-acc23ff0ffe9-bundle\") pod \"951b5e6f67e727348308cf03f2b463d3e2bd27386b453f6699e03a49bac8qt7\" (UID: \"1e9ee3f9-00c2-4838-b6e8-acc23ff0ffe9\") " pod="openstack-operators/951b5e6f67e727348308cf03f2b463d3e2bd27386b453f6699e03a49bac8qt7" Mar 11 01:12:25 crc kubenswrapper[4744]: I0311 01:12:25.329418 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/1e9ee3f9-00c2-4838-b6e8-acc23ff0ffe9-util\") pod \"951b5e6f67e727348308cf03f2b463d3e2bd27386b453f6699e03a49bac8qt7\" (UID: \"1e9ee3f9-00c2-4838-b6e8-acc23ff0ffe9\") " pod="openstack-operators/951b5e6f67e727348308cf03f2b463d3e2bd27386b453f6699e03a49bac8qt7" Mar 11 01:12:25 crc kubenswrapper[4744]: I0311 
01:12:25.431810 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qxmkt\" (UniqueName: \"kubernetes.io/projected/1e9ee3f9-00c2-4838-b6e8-acc23ff0ffe9-kube-api-access-qxmkt\") pod \"951b5e6f67e727348308cf03f2b463d3e2bd27386b453f6699e03a49bac8qt7\" (UID: \"1e9ee3f9-00c2-4838-b6e8-acc23ff0ffe9\") " pod="openstack-operators/951b5e6f67e727348308cf03f2b463d3e2bd27386b453f6699e03a49bac8qt7" Mar 11 01:12:25 crc kubenswrapper[4744]: I0311 01:12:25.431910 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/1e9ee3f9-00c2-4838-b6e8-acc23ff0ffe9-bundle\") pod \"951b5e6f67e727348308cf03f2b463d3e2bd27386b453f6699e03a49bac8qt7\" (UID: \"1e9ee3f9-00c2-4838-b6e8-acc23ff0ffe9\") " pod="openstack-operators/951b5e6f67e727348308cf03f2b463d3e2bd27386b453f6699e03a49bac8qt7" Mar 11 01:12:25 crc kubenswrapper[4744]: I0311 01:12:25.431993 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/1e9ee3f9-00c2-4838-b6e8-acc23ff0ffe9-util\") pod \"951b5e6f67e727348308cf03f2b463d3e2bd27386b453f6699e03a49bac8qt7\" (UID: \"1e9ee3f9-00c2-4838-b6e8-acc23ff0ffe9\") " pod="openstack-operators/951b5e6f67e727348308cf03f2b463d3e2bd27386b453f6699e03a49bac8qt7" Mar 11 01:12:25 crc kubenswrapper[4744]: I0311 01:12:25.432665 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/1e9ee3f9-00c2-4838-b6e8-acc23ff0ffe9-bundle\") pod \"951b5e6f67e727348308cf03f2b463d3e2bd27386b453f6699e03a49bac8qt7\" (UID: \"1e9ee3f9-00c2-4838-b6e8-acc23ff0ffe9\") " pod="openstack-operators/951b5e6f67e727348308cf03f2b463d3e2bd27386b453f6699e03a49bac8qt7" Mar 11 01:12:25 crc kubenswrapper[4744]: I0311 01:12:25.432759 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: 
\"kubernetes.io/empty-dir/1e9ee3f9-00c2-4838-b6e8-acc23ff0ffe9-util\") pod \"951b5e6f67e727348308cf03f2b463d3e2bd27386b453f6699e03a49bac8qt7\" (UID: \"1e9ee3f9-00c2-4838-b6e8-acc23ff0ffe9\") " pod="openstack-operators/951b5e6f67e727348308cf03f2b463d3e2bd27386b453f6699e03a49bac8qt7" Mar 11 01:12:25 crc kubenswrapper[4744]: I0311 01:12:25.481776 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qxmkt\" (UniqueName: \"kubernetes.io/projected/1e9ee3f9-00c2-4838-b6e8-acc23ff0ffe9-kube-api-access-qxmkt\") pod \"951b5e6f67e727348308cf03f2b463d3e2bd27386b453f6699e03a49bac8qt7\" (UID: \"1e9ee3f9-00c2-4838-b6e8-acc23ff0ffe9\") " pod="openstack-operators/951b5e6f67e727348308cf03f2b463d3e2bd27386b453f6699e03a49bac8qt7" Mar 11 01:12:25 crc kubenswrapper[4744]: I0311 01:12:25.503980 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/951b5e6f67e727348308cf03f2b463d3e2bd27386b453f6699e03a49bac8qt7" Mar 11 01:12:25 crc kubenswrapper[4744]: I0311 01:12:25.786036 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/951b5e6f67e727348308cf03f2b463d3e2bd27386b453f6699e03a49bac8qt7"] Mar 11 01:12:25 crc kubenswrapper[4744]: W0311 01:12:25.798897 4744 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1e9ee3f9_00c2_4838_b6e8_acc23ff0ffe9.slice/crio-d782425d44ca24431db61f277faec5a8a89872324d45d103db1cf9fc06dde7a6 WatchSource:0}: Error finding container d782425d44ca24431db61f277faec5a8a89872324d45d103db1cf9fc06dde7a6: Status 404 returned error can't find the container with id d782425d44ca24431db61f277faec5a8a89872324d45d103db1cf9fc06dde7a6 Mar 11 01:12:26 crc kubenswrapper[4744]: I0311 01:12:26.249843 4744 generic.go:334] "Generic (PLEG): container finished" podID="1e9ee3f9-00c2-4838-b6e8-acc23ff0ffe9" containerID="49ce01fdb94d3f3d0730d570251aa0d97a7e666077b08da218ae78899da2e766" exitCode=0 Mar 11 
01:12:26 crc kubenswrapper[4744]: I0311 01:12:26.249932 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/951b5e6f67e727348308cf03f2b463d3e2bd27386b453f6699e03a49bac8qt7" event={"ID":"1e9ee3f9-00c2-4838-b6e8-acc23ff0ffe9","Type":"ContainerDied","Data":"49ce01fdb94d3f3d0730d570251aa0d97a7e666077b08da218ae78899da2e766"} Mar 11 01:12:26 crc kubenswrapper[4744]: I0311 01:12:26.250358 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/951b5e6f67e727348308cf03f2b463d3e2bd27386b453f6699e03a49bac8qt7" event={"ID":"1e9ee3f9-00c2-4838-b6e8-acc23ff0ffe9","Type":"ContainerStarted","Data":"d782425d44ca24431db61f277faec5a8a89872324d45d103db1cf9fc06dde7a6"} Mar 11 01:12:27 crc kubenswrapper[4744]: I0311 01:12:27.262193 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/951b5e6f67e727348308cf03f2b463d3e2bd27386b453f6699e03a49bac8qt7" event={"ID":"1e9ee3f9-00c2-4838-b6e8-acc23ff0ffe9","Type":"ContainerStarted","Data":"ef0326ebd5e3a235be066e694823c14afb6290889027b3b72ffb0bc0fe41ac6e"} Mar 11 01:12:28 crc kubenswrapper[4744]: I0311 01:12:28.272963 4744 generic.go:334] "Generic (PLEG): container finished" podID="1e9ee3f9-00c2-4838-b6e8-acc23ff0ffe9" containerID="ef0326ebd5e3a235be066e694823c14afb6290889027b3b72ffb0bc0fe41ac6e" exitCode=0 Mar 11 01:12:28 crc kubenswrapper[4744]: I0311 01:12:28.273383 4744 generic.go:334] "Generic (PLEG): container finished" podID="1e9ee3f9-00c2-4838-b6e8-acc23ff0ffe9" containerID="263e6981e3ec008eefcf48c2a92a4bade29a7979417c42fdee66a9195f979271" exitCode=0 Mar 11 01:12:28 crc kubenswrapper[4744]: I0311 01:12:28.273065 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/951b5e6f67e727348308cf03f2b463d3e2bd27386b453f6699e03a49bac8qt7" event={"ID":"1e9ee3f9-00c2-4838-b6e8-acc23ff0ffe9","Type":"ContainerDied","Data":"ef0326ebd5e3a235be066e694823c14afb6290889027b3b72ffb0bc0fe41ac6e"} Mar 11 01:12:28 crc kubenswrapper[4744]: I0311 
01:12:28.273441 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/951b5e6f67e727348308cf03f2b463d3e2bd27386b453f6699e03a49bac8qt7" event={"ID":"1e9ee3f9-00c2-4838-b6e8-acc23ff0ffe9","Type":"ContainerDied","Data":"263e6981e3ec008eefcf48c2a92a4bade29a7979417c42fdee66a9195f979271"} Mar 11 01:12:29 crc kubenswrapper[4744]: I0311 01:12:29.592715 4744 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/951b5e6f67e727348308cf03f2b463d3e2bd27386b453f6699e03a49bac8qt7" Mar 11 01:12:29 crc kubenswrapper[4744]: I0311 01:12:29.702591 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/1e9ee3f9-00c2-4838-b6e8-acc23ff0ffe9-util\") pod \"1e9ee3f9-00c2-4838-b6e8-acc23ff0ffe9\" (UID: \"1e9ee3f9-00c2-4838-b6e8-acc23ff0ffe9\") " Mar 11 01:12:29 crc kubenswrapper[4744]: I0311 01:12:29.702793 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qxmkt\" (UniqueName: \"kubernetes.io/projected/1e9ee3f9-00c2-4838-b6e8-acc23ff0ffe9-kube-api-access-qxmkt\") pod \"1e9ee3f9-00c2-4838-b6e8-acc23ff0ffe9\" (UID: \"1e9ee3f9-00c2-4838-b6e8-acc23ff0ffe9\") " Mar 11 01:12:29 crc kubenswrapper[4744]: I0311 01:12:29.702980 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/1e9ee3f9-00c2-4838-b6e8-acc23ff0ffe9-bundle\") pod \"1e9ee3f9-00c2-4838-b6e8-acc23ff0ffe9\" (UID: \"1e9ee3f9-00c2-4838-b6e8-acc23ff0ffe9\") " Mar 11 01:12:29 crc kubenswrapper[4744]: I0311 01:12:29.704433 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1e9ee3f9-00c2-4838-b6e8-acc23ff0ffe9-bundle" (OuterVolumeSpecName: "bundle") pod "1e9ee3f9-00c2-4838-b6e8-acc23ff0ffe9" (UID: "1e9ee3f9-00c2-4838-b6e8-acc23ff0ffe9"). InnerVolumeSpecName "bundle". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 11 01:12:29 crc kubenswrapper[4744]: I0311 01:12:29.713130 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1e9ee3f9-00c2-4838-b6e8-acc23ff0ffe9-kube-api-access-qxmkt" (OuterVolumeSpecName: "kube-api-access-qxmkt") pod "1e9ee3f9-00c2-4838-b6e8-acc23ff0ffe9" (UID: "1e9ee3f9-00c2-4838-b6e8-acc23ff0ffe9"). InnerVolumeSpecName "kube-api-access-qxmkt". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 01:12:29 crc kubenswrapper[4744]: I0311 01:12:29.736061 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1e9ee3f9-00c2-4838-b6e8-acc23ff0ffe9-util" (OuterVolumeSpecName: "util") pod "1e9ee3f9-00c2-4838-b6e8-acc23ff0ffe9" (UID: "1e9ee3f9-00c2-4838-b6e8-acc23ff0ffe9"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 11 01:12:29 crc kubenswrapper[4744]: I0311 01:12:29.805493 4744 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qxmkt\" (UniqueName: \"kubernetes.io/projected/1e9ee3f9-00c2-4838-b6e8-acc23ff0ffe9-kube-api-access-qxmkt\") on node \"crc\" DevicePath \"\"" Mar 11 01:12:29 crc kubenswrapper[4744]: I0311 01:12:29.805569 4744 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/1e9ee3f9-00c2-4838-b6e8-acc23ff0ffe9-bundle\") on node \"crc\" DevicePath \"\"" Mar 11 01:12:29 crc kubenswrapper[4744]: I0311 01:12:29.805618 4744 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/1e9ee3f9-00c2-4838-b6e8-acc23ff0ffe9-util\") on node \"crc\" DevicePath \"\"" Mar 11 01:12:30 crc kubenswrapper[4744]: I0311 01:12:30.150383 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-fnhns"] Mar 11 01:12:30 crc kubenswrapper[4744]: E0311 01:12:30.150878 4744 cpu_manager.go:410] "RemoveStaleState: removing 
container" podUID="1e9ee3f9-00c2-4838-b6e8-acc23ff0ffe9" containerName="pull" Mar 11 01:12:30 crc kubenswrapper[4744]: I0311 01:12:30.150916 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="1e9ee3f9-00c2-4838-b6e8-acc23ff0ffe9" containerName="pull" Mar 11 01:12:30 crc kubenswrapper[4744]: E0311 01:12:30.150948 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1e9ee3f9-00c2-4838-b6e8-acc23ff0ffe9" containerName="extract" Mar 11 01:12:30 crc kubenswrapper[4744]: I0311 01:12:30.150962 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="1e9ee3f9-00c2-4838-b6e8-acc23ff0ffe9" containerName="extract" Mar 11 01:12:30 crc kubenswrapper[4744]: E0311 01:12:30.151004 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1e9ee3f9-00c2-4838-b6e8-acc23ff0ffe9" containerName="util" Mar 11 01:12:30 crc kubenswrapper[4744]: I0311 01:12:30.151016 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="1e9ee3f9-00c2-4838-b6e8-acc23ff0ffe9" containerName="util" Mar 11 01:12:30 crc kubenswrapper[4744]: I0311 01:12:30.151204 4744 memory_manager.go:354] "RemoveStaleState removing state" podUID="1e9ee3f9-00c2-4838-b6e8-acc23ff0ffe9" containerName="extract" Mar 11 01:12:30 crc kubenswrapper[4744]: I0311 01:12:30.153109 4744 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-fnhns" Mar 11 01:12:30 crc kubenswrapper[4744]: I0311 01:12:30.166880 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-fnhns"] Mar 11 01:12:30 crc kubenswrapper[4744]: I0311 01:12:30.290274 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/951b5e6f67e727348308cf03f2b463d3e2bd27386b453f6699e03a49bac8qt7" event={"ID":"1e9ee3f9-00c2-4838-b6e8-acc23ff0ffe9","Type":"ContainerDied","Data":"d782425d44ca24431db61f277faec5a8a89872324d45d103db1cf9fc06dde7a6"} Mar 11 01:12:30 crc kubenswrapper[4744]: I0311 01:12:30.290331 4744 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d782425d44ca24431db61f277faec5a8a89872324d45d103db1cf9fc06dde7a6" Mar 11 01:12:30 crc kubenswrapper[4744]: I0311 01:12:30.290403 4744 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/951b5e6f67e727348308cf03f2b463d3e2bd27386b453f6699e03a49bac8qt7" Mar 11 01:12:30 crc kubenswrapper[4744]: I0311 01:12:30.314214 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b254p\" (UniqueName: \"kubernetes.io/projected/d5a080e6-628b-4332-8f97-8141a43874e5-kube-api-access-b254p\") pod \"community-operators-fnhns\" (UID: \"d5a080e6-628b-4332-8f97-8141a43874e5\") " pod="openshift-marketplace/community-operators-fnhns" Mar 11 01:12:30 crc kubenswrapper[4744]: I0311 01:12:30.314309 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d5a080e6-628b-4332-8f97-8141a43874e5-catalog-content\") pod \"community-operators-fnhns\" (UID: \"d5a080e6-628b-4332-8f97-8141a43874e5\") " pod="openshift-marketplace/community-operators-fnhns" Mar 11 01:12:30 crc kubenswrapper[4744]: I0311 01:12:30.314547 4744 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d5a080e6-628b-4332-8f97-8141a43874e5-utilities\") pod \"community-operators-fnhns\" (UID: \"d5a080e6-628b-4332-8f97-8141a43874e5\") " pod="openshift-marketplace/community-operators-fnhns" Mar 11 01:12:30 crc kubenswrapper[4744]: I0311 01:12:30.415919 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d5a080e6-628b-4332-8f97-8141a43874e5-utilities\") pod \"community-operators-fnhns\" (UID: \"d5a080e6-628b-4332-8f97-8141a43874e5\") " pod="openshift-marketplace/community-operators-fnhns" Mar 11 01:12:30 crc kubenswrapper[4744]: I0311 01:12:30.416028 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b254p\" (UniqueName: \"kubernetes.io/projected/d5a080e6-628b-4332-8f97-8141a43874e5-kube-api-access-b254p\") pod \"community-operators-fnhns\" (UID: \"d5a080e6-628b-4332-8f97-8141a43874e5\") " pod="openshift-marketplace/community-operators-fnhns" Mar 11 01:12:30 crc kubenswrapper[4744]: I0311 01:12:30.416069 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d5a080e6-628b-4332-8f97-8141a43874e5-catalog-content\") pod \"community-operators-fnhns\" (UID: \"d5a080e6-628b-4332-8f97-8141a43874e5\") " pod="openshift-marketplace/community-operators-fnhns" Mar 11 01:12:30 crc kubenswrapper[4744]: I0311 01:12:30.416540 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d5a080e6-628b-4332-8f97-8141a43874e5-utilities\") pod \"community-operators-fnhns\" (UID: \"d5a080e6-628b-4332-8f97-8141a43874e5\") " pod="openshift-marketplace/community-operators-fnhns" Mar 11 01:12:30 crc kubenswrapper[4744]: I0311 01:12:30.416563 4744 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d5a080e6-628b-4332-8f97-8141a43874e5-catalog-content\") pod \"community-operators-fnhns\" (UID: \"d5a080e6-628b-4332-8f97-8141a43874e5\") " pod="openshift-marketplace/community-operators-fnhns" Mar 11 01:12:30 crc kubenswrapper[4744]: I0311 01:12:30.433485 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b254p\" (UniqueName: \"kubernetes.io/projected/d5a080e6-628b-4332-8f97-8141a43874e5-kube-api-access-b254p\") pod \"community-operators-fnhns\" (UID: \"d5a080e6-628b-4332-8f97-8141a43874e5\") " pod="openshift-marketplace/community-operators-fnhns" Mar 11 01:12:30 crc kubenswrapper[4744]: I0311 01:12:30.481327 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-fnhns" Mar 11 01:12:30 crc kubenswrapper[4744]: I0311 01:12:30.957527 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-fnhns"] Mar 11 01:12:31 crc kubenswrapper[4744]: I0311 01:12:31.300765 4744 generic.go:334] "Generic (PLEG): container finished" podID="d5a080e6-628b-4332-8f97-8141a43874e5" containerID="ff57ed45d6ba79cff0568192e33e2ee0b0ed9918f48f29787630c895cf776d19" exitCode=0 Mar 11 01:12:31 crc kubenswrapper[4744]: I0311 01:12:31.300908 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-fnhns" event={"ID":"d5a080e6-628b-4332-8f97-8141a43874e5","Type":"ContainerDied","Data":"ff57ed45d6ba79cff0568192e33e2ee0b0ed9918f48f29787630c895cf776d19"} Mar 11 01:12:31 crc kubenswrapper[4744]: I0311 01:12:31.301131 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-fnhns" event={"ID":"d5a080e6-628b-4332-8f97-8141a43874e5","Type":"ContainerStarted","Data":"9019f499c9d90ae9e2e29fe113d42509ceae4fd101411394b86e97bdb4620853"} Mar 11 01:12:32 crc kubenswrapper[4744]: I0311 
01:12:32.314162 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-fnhns" event={"ID":"d5a080e6-628b-4332-8f97-8141a43874e5","Type":"ContainerStarted","Data":"9327a15fc427501ae0e6a01a3e3bf71aca531781324d02bf3e1bce85f57b516a"} Mar 11 01:12:33 crc kubenswrapper[4744]: I0311 01:12:33.322931 4744 generic.go:334] "Generic (PLEG): container finished" podID="d5a080e6-628b-4332-8f97-8141a43874e5" containerID="9327a15fc427501ae0e6a01a3e3bf71aca531781324d02bf3e1bce85f57b516a" exitCode=0 Mar 11 01:12:33 crc kubenswrapper[4744]: I0311 01:12:33.323056 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-fnhns" event={"ID":"d5a080e6-628b-4332-8f97-8141a43874e5","Type":"ContainerDied","Data":"9327a15fc427501ae0e6a01a3e3bf71aca531781324d02bf3e1bce85f57b516a"} Mar 11 01:12:33 crc kubenswrapper[4744]: I0311 01:12:33.702009 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-controller-init-6cf8df7788-8965x"] Mar 11 01:12:33 crc kubenswrapper[4744]: I0311 01:12:33.703008 4744 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-controller-init-6cf8df7788-8965x" Mar 11 01:12:33 crc kubenswrapper[4744]: I0311 01:12:33.711440 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-controller-init-dockercfg-njm8n" Mar 11 01:12:33 crc kubenswrapper[4744]: I0311 01:12:33.773394 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-init-6cf8df7788-8965x"] Mar 11 01:12:33 crc kubenswrapper[4744]: I0311 01:12:33.869913 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bmt2b\" (UniqueName: \"kubernetes.io/projected/d69a58f4-054d-456c-9ea1-c68b893dadb4-kube-api-access-bmt2b\") pod \"openstack-operator-controller-init-6cf8df7788-8965x\" (UID: \"d69a58f4-054d-456c-9ea1-c68b893dadb4\") " pod="openstack-operators/openstack-operator-controller-init-6cf8df7788-8965x" Mar 11 01:12:33 crc kubenswrapper[4744]: I0311 01:12:33.971066 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bmt2b\" (UniqueName: \"kubernetes.io/projected/d69a58f4-054d-456c-9ea1-c68b893dadb4-kube-api-access-bmt2b\") pod \"openstack-operator-controller-init-6cf8df7788-8965x\" (UID: \"d69a58f4-054d-456c-9ea1-c68b893dadb4\") " pod="openstack-operators/openstack-operator-controller-init-6cf8df7788-8965x" Mar 11 01:12:33 crc kubenswrapper[4744]: I0311 01:12:33.993606 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bmt2b\" (UniqueName: \"kubernetes.io/projected/d69a58f4-054d-456c-9ea1-c68b893dadb4-kube-api-access-bmt2b\") pod \"openstack-operator-controller-init-6cf8df7788-8965x\" (UID: \"d69a58f4-054d-456c-9ea1-c68b893dadb4\") " pod="openstack-operators/openstack-operator-controller-init-6cf8df7788-8965x" Mar 11 01:12:34 crc kubenswrapper[4744]: I0311 01:12:34.025593 4744 util.go:30] "No sandbox for pod can be 
found. Need to start a new one" pod="openstack-operators/openstack-operator-controller-init-6cf8df7788-8965x" Mar 11 01:12:34 crc kubenswrapper[4744]: I0311 01:12:34.349964 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-fnhns" event={"ID":"d5a080e6-628b-4332-8f97-8141a43874e5","Type":"ContainerStarted","Data":"b522b37266f9466710c20a6594725bc1a2c9819dccf4b281c9c6c9796a21ff0b"} Mar 11 01:12:34 crc kubenswrapper[4744]: I0311 01:12:34.365826 4744 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-fnhns" podStartSLOduration=1.944798397 podStartE2EDuration="4.365810024s" podCreationTimestamp="2026-03-11 01:12:30 +0000 UTC" firstStartedPulling="2026-03-11 01:12:31.302851795 +0000 UTC m=+1108.107069440" lastFinishedPulling="2026-03-11 01:12:33.723863462 +0000 UTC m=+1110.528081067" observedRunningTime="2026-03-11 01:12:34.365191524 +0000 UTC m=+1111.169409149" watchObservedRunningTime="2026-03-11 01:12:34.365810024 +0000 UTC m=+1111.170027629" Mar 11 01:12:34 crc kubenswrapper[4744]: I0311 01:12:34.513464 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-init-6cf8df7788-8965x"] Mar 11 01:12:34 crc kubenswrapper[4744]: W0311 01:12:34.519978 4744 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd69a58f4_054d_456c_9ea1_c68b893dadb4.slice/crio-af873435b571b7b7d187ff384be58ef1bb9556610a937bfe72f27afbd68b3293 WatchSource:0}: Error finding container af873435b571b7b7d187ff384be58ef1bb9556610a937bfe72f27afbd68b3293: Status 404 returned error can't find the container with id af873435b571b7b7d187ff384be58ef1bb9556610a937bfe72f27afbd68b3293 Mar 11 01:12:35 crc kubenswrapper[4744]: I0311 01:12:35.357066 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack-operators/openstack-operator-controller-init-6cf8df7788-8965x" event={"ID":"d69a58f4-054d-456c-9ea1-c68b893dadb4","Type":"ContainerStarted","Data":"af873435b571b7b7d187ff384be58ef1bb9556610a937bfe72f27afbd68b3293"} Mar 11 01:12:39 crc kubenswrapper[4744]: I0311 01:12:39.417072 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-init-6cf8df7788-8965x" event={"ID":"d69a58f4-054d-456c-9ea1-c68b893dadb4","Type":"ContainerStarted","Data":"d40fda6ba5471a0515757047ffeeb002e7a6dec9d2667ae6de0d8323d4ba6b9c"} Mar 11 01:12:39 crc kubenswrapper[4744]: I0311 01:12:39.418850 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-controller-init-6cf8df7788-8965x" Mar 11 01:12:39 crc kubenswrapper[4744]: I0311 01:12:39.466209 4744 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-controller-init-6cf8df7788-8965x" podStartSLOduration=2.110818183 podStartE2EDuration="6.466185426s" podCreationTimestamp="2026-03-11 01:12:33 +0000 UTC" firstStartedPulling="2026-03-11 01:12:34.522072943 +0000 UTC m=+1111.326290548" lastFinishedPulling="2026-03-11 01:12:38.877440186 +0000 UTC m=+1115.681657791" observedRunningTime="2026-03-11 01:12:39.461751609 +0000 UTC m=+1116.265969224" watchObservedRunningTime="2026-03-11 01:12:39.466185426 +0000 UTC m=+1116.270403051" Mar 11 01:12:40 crc kubenswrapper[4744]: I0311 01:12:40.482474 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-fnhns" Mar 11 01:12:40 crc kubenswrapper[4744]: I0311 01:12:40.482885 4744 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-fnhns" Mar 11 01:12:40 crc kubenswrapper[4744]: I0311 01:12:40.550111 4744 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" 
pod="openshift-marketplace/community-operators-fnhns" Mar 11 01:12:41 crc kubenswrapper[4744]: I0311 01:12:41.494981 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-fnhns" Mar 11 01:12:42 crc kubenswrapper[4744]: I0311 01:12:42.733713 4744 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-fnhns"] Mar 11 01:12:43 crc kubenswrapper[4744]: I0311 01:12:43.452053 4744 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-fnhns" podUID="d5a080e6-628b-4332-8f97-8141a43874e5" containerName="registry-server" containerID="cri-o://b522b37266f9466710c20a6594725bc1a2c9819dccf4b281c9c6c9796a21ff0b" gracePeriod=2 Mar 11 01:12:44 crc kubenswrapper[4744]: I0311 01:12:44.029829 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-controller-init-6cf8df7788-8965x" Mar 11 01:12:44 crc kubenswrapper[4744]: I0311 01:12:44.462117 4744 generic.go:334] "Generic (PLEG): container finished" podID="d5a080e6-628b-4332-8f97-8141a43874e5" containerID="b522b37266f9466710c20a6594725bc1a2c9819dccf4b281c9c6c9796a21ff0b" exitCode=0 Mar 11 01:12:44 crc kubenswrapper[4744]: I0311 01:12:44.462184 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-fnhns" event={"ID":"d5a080e6-628b-4332-8f97-8141a43874e5","Type":"ContainerDied","Data":"b522b37266f9466710c20a6594725bc1a2c9819dccf4b281c9c6c9796a21ff0b"} Mar 11 01:12:44 crc kubenswrapper[4744]: I0311 01:12:44.551880 4744 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-fnhns" Mar 11 01:12:44 crc kubenswrapper[4744]: I0311 01:12:44.630963 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-b254p\" (UniqueName: \"kubernetes.io/projected/d5a080e6-628b-4332-8f97-8141a43874e5-kube-api-access-b254p\") pod \"d5a080e6-628b-4332-8f97-8141a43874e5\" (UID: \"d5a080e6-628b-4332-8f97-8141a43874e5\") " Mar 11 01:12:44 crc kubenswrapper[4744]: I0311 01:12:44.631015 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d5a080e6-628b-4332-8f97-8141a43874e5-catalog-content\") pod \"d5a080e6-628b-4332-8f97-8141a43874e5\" (UID: \"d5a080e6-628b-4332-8f97-8141a43874e5\") " Mar 11 01:12:44 crc kubenswrapper[4744]: I0311 01:12:44.631066 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d5a080e6-628b-4332-8f97-8141a43874e5-utilities\") pod \"d5a080e6-628b-4332-8f97-8141a43874e5\" (UID: \"d5a080e6-628b-4332-8f97-8141a43874e5\") " Mar 11 01:12:44 crc kubenswrapper[4744]: I0311 01:12:44.631976 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d5a080e6-628b-4332-8f97-8141a43874e5-utilities" (OuterVolumeSpecName: "utilities") pod "d5a080e6-628b-4332-8f97-8141a43874e5" (UID: "d5a080e6-628b-4332-8f97-8141a43874e5"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 11 01:12:44 crc kubenswrapper[4744]: I0311 01:12:44.635662 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d5a080e6-628b-4332-8f97-8141a43874e5-kube-api-access-b254p" (OuterVolumeSpecName: "kube-api-access-b254p") pod "d5a080e6-628b-4332-8f97-8141a43874e5" (UID: "d5a080e6-628b-4332-8f97-8141a43874e5"). InnerVolumeSpecName "kube-api-access-b254p". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 01:12:44 crc kubenswrapper[4744]: I0311 01:12:44.691913 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d5a080e6-628b-4332-8f97-8141a43874e5-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "d5a080e6-628b-4332-8f97-8141a43874e5" (UID: "d5a080e6-628b-4332-8f97-8141a43874e5"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 11 01:12:44 crc kubenswrapper[4744]: I0311 01:12:44.731905 4744 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d5a080e6-628b-4332-8f97-8141a43874e5-utilities\") on node \"crc\" DevicePath \"\"" Mar 11 01:12:44 crc kubenswrapper[4744]: I0311 01:12:44.731931 4744 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-b254p\" (UniqueName: \"kubernetes.io/projected/d5a080e6-628b-4332-8f97-8141a43874e5-kube-api-access-b254p\") on node \"crc\" DevicePath \"\"" Mar 11 01:12:44 crc kubenswrapper[4744]: I0311 01:12:44.731940 4744 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d5a080e6-628b-4332-8f97-8141a43874e5-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 11 01:12:45 crc kubenswrapper[4744]: I0311 01:12:45.471016 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-fnhns" event={"ID":"d5a080e6-628b-4332-8f97-8141a43874e5","Type":"ContainerDied","Data":"9019f499c9d90ae9e2e29fe113d42509ceae4fd101411394b86e97bdb4620853"} Mar 11 01:12:45 crc kubenswrapper[4744]: I0311 01:12:45.471070 4744 scope.go:117] "RemoveContainer" containerID="b522b37266f9466710c20a6594725bc1a2c9819dccf4b281c9c6c9796a21ff0b" Mar 11 01:12:45 crc kubenswrapper[4744]: I0311 01:12:45.471136 4744 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-fnhns" Mar 11 01:12:45 crc kubenswrapper[4744]: I0311 01:12:45.501249 4744 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-fnhns"] Mar 11 01:12:45 crc kubenswrapper[4744]: I0311 01:12:45.505710 4744 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-fnhns"] Mar 11 01:12:45 crc kubenswrapper[4744]: I0311 01:12:45.509259 4744 scope.go:117] "RemoveContainer" containerID="9327a15fc427501ae0e6a01a3e3bf71aca531781324d02bf3e1bce85f57b516a" Mar 11 01:12:45 crc kubenswrapper[4744]: I0311 01:12:45.534174 4744 scope.go:117] "RemoveContainer" containerID="ff57ed45d6ba79cff0568192e33e2ee0b0ed9918f48f29787630c895cf776d19" Mar 11 01:12:45 crc kubenswrapper[4744]: I0311 01:12:45.994189 4744 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d5a080e6-628b-4332-8f97-8141a43874e5" path="/var/lib/kubelet/pods/d5a080e6-628b-4332-8f97-8141a43874e5/volumes" Mar 11 01:13:03 crc kubenswrapper[4744]: I0311 01:13:03.266863 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/barbican-operator-controller-manager-677bd678f7-xh4cc"] Mar 11 01:13:03 crc kubenswrapper[4744]: E0311 01:13:03.267617 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d5a080e6-628b-4332-8f97-8141a43874e5" containerName="extract-utilities" Mar 11 01:13:03 crc kubenswrapper[4744]: I0311 01:13:03.267630 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="d5a080e6-628b-4332-8f97-8141a43874e5" containerName="extract-utilities" Mar 11 01:13:03 crc kubenswrapper[4744]: E0311 01:13:03.267649 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d5a080e6-628b-4332-8f97-8141a43874e5" containerName="registry-server" Mar 11 01:13:03 crc kubenswrapper[4744]: I0311 01:13:03.267655 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="d5a080e6-628b-4332-8f97-8141a43874e5" 
containerName="registry-server" Mar 11 01:13:03 crc kubenswrapper[4744]: E0311 01:13:03.267676 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d5a080e6-628b-4332-8f97-8141a43874e5" containerName="extract-content" Mar 11 01:13:03 crc kubenswrapper[4744]: I0311 01:13:03.267683 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="d5a080e6-628b-4332-8f97-8141a43874e5" containerName="extract-content" Mar 11 01:13:03 crc kubenswrapper[4744]: I0311 01:13:03.267776 4744 memory_manager.go:354] "RemoveStaleState removing state" podUID="d5a080e6-628b-4332-8f97-8141a43874e5" containerName="registry-server" Mar 11 01:13:03 crc kubenswrapper[4744]: I0311 01:13:03.268152 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/barbican-operator-controller-manager-677bd678f7-xh4cc" Mar 11 01:13:03 crc kubenswrapper[4744]: I0311 01:13:03.285829 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"barbican-operator-controller-manager-dockercfg-b8vw8" Mar 11 01:13:03 crc kubenswrapper[4744]: I0311 01:13:03.288307 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/cinder-operator-controller-manager-984cd4dcf-7lh6p"] Mar 11 01:13:03 crc kubenswrapper[4744]: I0311 01:13:03.289267 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/cinder-operator-controller-manager-984cd4dcf-7lh6p" Mar 11 01:13:03 crc kubenswrapper[4744]: I0311 01:13:03.292674 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/designate-operator-controller-manager-66d56f6ff4-x5rps"] Mar 11 01:13:03 crc kubenswrapper[4744]: I0311 01:13:03.293490 4744 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/designate-operator-controller-manager-66d56f6ff4-x5rps" Mar 11 01:13:03 crc kubenswrapper[4744]: I0311 01:13:03.296478 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"cinder-operator-controller-manager-dockercfg-s4sbs" Mar 11 01:13:03 crc kubenswrapper[4744]: I0311 01:13:03.296850 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"designate-operator-controller-manager-dockercfg-rkhmd" Mar 11 01:13:03 crc kubenswrapper[4744]: I0311 01:13:03.301125 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/barbican-operator-controller-manager-677bd678f7-xh4cc"] Mar 11 01:13:03 crc kubenswrapper[4744]: I0311 01:13:03.315140 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/designate-operator-controller-manager-66d56f6ff4-x5rps"] Mar 11 01:13:03 crc kubenswrapper[4744]: I0311 01:13:03.325009 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/glance-operator-controller-manager-5964f64c48-wgnqt"] Mar 11 01:13:03 crc kubenswrapper[4744]: I0311 01:13:03.325790 4744 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/glance-operator-controller-manager-5964f64c48-wgnqt" Mar 11 01:13:03 crc kubenswrapper[4744]: I0311 01:13:03.327562 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"glance-operator-controller-manager-dockercfg-8bkst" Mar 11 01:13:03 crc kubenswrapper[4744]: I0311 01:13:03.346644 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/cinder-operator-controller-manager-984cd4dcf-7lh6p"] Mar 11 01:13:03 crc kubenswrapper[4744]: I0311 01:13:03.353462 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/glance-operator-controller-manager-5964f64c48-wgnqt"] Mar 11 01:13:03 crc kubenswrapper[4744]: I0311 01:13:03.367472 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/heat-operator-controller-manager-77b6666d85-gd9tg"] Mar 11 01:13:03 crc kubenswrapper[4744]: I0311 01:13:03.368286 4744 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/heat-operator-controller-manager-77b6666d85-gd9tg" Mar 11 01:13:03 crc kubenswrapper[4744]: I0311 01:13:03.375354 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mdqj2\" (UniqueName: \"kubernetes.io/projected/1a7f92f2-6b8c-4ea2-ab12-c9ebcb043600-kube-api-access-mdqj2\") pod \"cinder-operator-controller-manager-984cd4dcf-7lh6p\" (UID: \"1a7f92f2-6b8c-4ea2-ab12-c9ebcb043600\") " pod="openstack-operators/cinder-operator-controller-manager-984cd4dcf-7lh6p" Mar 11 01:13:03 crc kubenswrapper[4744]: I0311 01:13:03.375391 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2qkd2\" (UniqueName: \"kubernetes.io/projected/ab0713f2-46ce-4987-852a-18371473f327-kube-api-access-2qkd2\") pod \"barbican-operator-controller-manager-677bd678f7-xh4cc\" (UID: \"ab0713f2-46ce-4987-852a-18371473f327\") " pod="openstack-operators/barbican-operator-controller-manager-677bd678f7-xh4cc" Mar 11 01:13:03 crc kubenswrapper[4744]: I0311 01:13:03.375427 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wlgzj\" (UniqueName: \"kubernetes.io/projected/115ee58b-0cd9-4993-b15e-226885cef1d8-kube-api-access-wlgzj\") pod \"glance-operator-controller-manager-5964f64c48-wgnqt\" (UID: \"115ee58b-0cd9-4993-b15e-226885cef1d8\") " pod="openstack-operators/glance-operator-controller-manager-5964f64c48-wgnqt" Mar 11 01:13:03 crc kubenswrapper[4744]: I0311 01:13:03.375477 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kt4n4\" (UniqueName: \"kubernetes.io/projected/9dd5fbf6-4c71-4ed4-b31b-4e1d43c34c73-kube-api-access-kt4n4\") pod \"designate-operator-controller-manager-66d56f6ff4-x5rps\" (UID: \"9dd5fbf6-4c71-4ed4-b31b-4e1d43c34c73\") " 
pod="openstack-operators/designate-operator-controller-manager-66d56f6ff4-x5rps" Mar 11 01:13:03 crc kubenswrapper[4744]: I0311 01:13:03.375587 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"heat-operator-controller-manager-dockercfg-q2nr4" Mar 11 01:13:03 crc kubenswrapper[4744]: I0311 01:13:03.376475 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/heat-operator-controller-manager-77b6666d85-gd9tg"] Mar 11 01:13:03 crc kubenswrapper[4744]: I0311 01:13:03.383793 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/horizon-operator-controller-manager-6d9d6b584d-pr4mx"] Mar 11 01:13:03 crc kubenswrapper[4744]: I0311 01:13:03.384731 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/horizon-operator-controller-manager-6d9d6b584d-pr4mx" Mar 11 01:13:03 crc kubenswrapper[4744]: I0311 01:13:03.388113 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"horizon-operator-controller-manager-dockercfg-vnghk" Mar 11 01:13:03 crc kubenswrapper[4744]: I0311 01:13:03.401830 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/infra-operator-controller-manager-5995f4446f-hxxgl"] Mar 11 01:13:03 crc kubenswrapper[4744]: I0311 01:13:03.402785 4744 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/infra-operator-controller-manager-5995f4446f-hxxgl" Mar 11 01:13:03 crc kubenswrapper[4744]: I0311 01:13:03.408718 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/ironic-operator-controller-manager-6bbb499bbc-m6668"] Mar 11 01:13:03 crc kubenswrapper[4744]: I0311 01:13:03.417949 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"infra-operator-webhook-server-cert" Mar 11 01:13:03 crc kubenswrapper[4744]: I0311 01:13:03.418191 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"infra-operator-controller-manager-dockercfg-pt8kb" Mar 11 01:13:03 crc kubenswrapper[4744]: I0311 01:13:03.423315 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ironic-operator-controller-manager-6bbb499bbc-m6668" Mar 11 01:13:03 crc kubenswrapper[4744]: I0311 01:13:03.434056 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/horizon-operator-controller-manager-6d9d6b584d-pr4mx"] Mar 11 01:13:03 crc kubenswrapper[4744]: I0311 01:13:03.456045 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"ironic-operator-controller-manager-dockercfg-9gjnm" Mar 11 01:13:03 crc kubenswrapper[4744]: I0311 01:13:03.460615 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/infra-operator-controller-manager-5995f4446f-hxxgl"] Mar 11 01:13:03 crc kubenswrapper[4744]: I0311 01:13:03.474414 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ironic-operator-controller-manager-6bbb499bbc-m6668"] Mar 11 01:13:03 crc kubenswrapper[4744]: I0311 01:13:03.482138 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mdqj2\" (UniqueName: \"kubernetes.io/projected/1a7f92f2-6b8c-4ea2-ab12-c9ebcb043600-kube-api-access-mdqj2\") pod 
\"cinder-operator-controller-manager-984cd4dcf-7lh6p\" (UID: \"1a7f92f2-6b8c-4ea2-ab12-c9ebcb043600\") " pod="openstack-operators/cinder-operator-controller-manager-984cd4dcf-7lh6p" Mar 11 01:13:03 crc kubenswrapper[4744]: I0311 01:13:03.482183 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9qg5q\" (UniqueName: \"kubernetes.io/projected/4b8d6717-6a3c-4421-83c8-c86ff18d1e3b-kube-api-access-9qg5q\") pod \"horizon-operator-controller-manager-6d9d6b584d-pr4mx\" (UID: \"4b8d6717-6a3c-4421-83c8-c86ff18d1e3b\") " pod="openstack-operators/horizon-operator-controller-manager-6d9d6b584d-pr4mx" Mar 11 01:13:03 crc kubenswrapper[4744]: I0311 01:13:03.482205 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2qkd2\" (UniqueName: \"kubernetes.io/projected/ab0713f2-46ce-4987-852a-18371473f327-kube-api-access-2qkd2\") pod \"barbican-operator-controller-manager-677bd678f7-xh4cc\" (UID: \"ab0713f2-46ce-4987-852a-18371473f327\") " pod="openstack-operators/barbican-operator-controller-manager-677bd678f7-xh4cc" Mar 11 01:13:03 crc kubenswrapper[4744]: I0311 01:13:03.482221 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/daef3605-2bdc-4e16-b55f-61d2d3cfc2fd-cert\") pod \"infra-operator-controller-manager-5995f4446f-hxxgl\" (UID: \"daef3605-2bdc-4e16-b55f-61d2d3cfc2fd\") " pod="openstack-operators/infra-operator-controller-manager-5995f4446f-hxxgl" Mar 11 01:13:03 crc kubenswrapper[4744]: I0311 01:13:03.482250 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ktttl\" (UniqueName: \"kubernetes.io/projected/daef3605-2bdc-4e16-b55f-61d2d3cfc2fd-kube-api-access-ktttl\") pod \"infra-operator-controller-manager-5995f4446f-hxxgl\" (UID: \"daef3605-2bdc-4e16-b55f-61d2d3cfc2fd\") " 
pod="openstack-operators/infra-operator-controller-manager-5995f4446f-hxxgl" Mar 11 01:13:03 crc kubenswrapper[4744]: I0311 01:13:03.482277 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wlgzj\" (UniqueName: \"kubernetes.io/projected/115ee58b-0cd9-4993-b15e-226885cef1d8-kube-api-access-wlgzj\") pod \"glance-operator-controller-manager-5964f64c48-wgnqt\" (UID: \"115ee58b-0cd9-4993-b15e-226885cef1d8\") " pod="openstack-operators/glance-operator-controller-manager-5964f64c48-wgnqt" Mar 11 01:13:03 crc kubenswrapper[4744]: I0311 01:13:03.482311 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6mnf6\" (UniqueName: \"kubernetes.io/projected/c1595560-d9f2-48bf-8b30-f8a36f13e1f4-kube-api-access-6mnf6\") pod \"heat-operator-controller-manager-77b6666d85-gd9tg\" (UID: \"c1595560-d9f2-48bf-8b30-f8a36f13e1f4\") " pod="openstack-operators/heat-operator-controller-manager-77b6666d85-gd9tg" Mar 11 01:13:03 crc kubenswrapper[4744]: I0311 01:13:03.482330 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hp7kr\" (UniqueName: \"kubernetes.io/projected/cf4745d2-2e5f-4150-9c60-34e91e5f1e80-kube-api-access-hp7kr\") pod \"ironic-operator-controller-manager-6bbb499bbc-m6668\" (UID: \"cf4745d2-2e5f-4150-9c60-34e91e5f1e80\") " pod="openstack-operators/ironic-operator-controller-manager-6bbb499bbc-m6668" Mar 11 01:13:03 crc kubenswrapper[4744]: I0311 01:13:03.482362 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kt4n4\" (UniqueName: \"kubernetes.io/projected/9dd5fbf6-4c71-4ed4-b31b-4e1d43c34c73-kube-api-access-kt4n4\") pod \"designate-operator-controller-manager-66d56f6ff4-x5rps\" (UID: \"9dd5fbf6-4c71-4ed4-b31b-4e1d43c34c73\") " pod="openstack-operators/designate-operator-controller-manager-66d56f6ff4-x5rps" Mar 11 01:13:03 crc kubenswrapper[4744]: I0311 
01:13:03.484723 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/keystone-operator-controller-manager-684f77d66d-c946t"] Mar 11 01:13:03 crc kubenswrapper[4744]: I0311 01:13:03.486031 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/keystone-operator-controller-manager-684f77d66d-c946t" Mar 11 01:13:03 crc kubenswrapper[4744]: I0311 01:13:03.494198 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/keystone-operator-controller-manager-684f77d66d-c946t"] Mar 11 01:13:03 crc kubenswrapper[4744]: I0311 01:13:03.499179 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"keystone-operator-controller-manager-dockercfg-d7fv7" Mar 11 01:13:03 crc kubenswrapper[4744]: I0311 01:13:03.502554 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/manila-operator-controller-manager-68f45f9d9f-6rtj8"] Mar 11 01:13:03 crc kubenswrapper[4744]: I0311 01:13:03.503418 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/manila-operator-controller-manager-68f45f9d9f-6rtj8" Mar 11 01:13:03 crc kubenswrapper[4744]: I0311 01:13:03.505683 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"manila-operator-controller-manager-dockercfg-sjjlx" Mar 11 01:13:03 crc kubenswrapper[4744]: I0311 01:13:03.515456 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-658d4cdd5-gc9cm"] Mar 11 01:13:03 crc kubenswrapper[4744]: I0311 01:13:03.516393 4744 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/mariadb-operator-controller-manager-658d4cdd5-gc9cm" Mar 11 01:13:03 crc kubenswrapper[4744]: I0311 01:13:03.524315 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/manila-operator-controller-manager-68f45f9d9f-6rtj8"] Mar 11 01:13:03 crc kubenswrapper[4744]: I0311 01:13:03.524358 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-658d4cdd5-gc9cm"] Mar 11 01:13:03 crc kubenswrapper[4744]: I0311 01:13:03.525870 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"mariadb-operator-controller-manager-dockercfg-shp8n" Mar 11 01:13:03 crc kubenswrapper[4744]: I0311 01:13:03.530721 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wlgzj\" (UniqueName: \"kubernetes.io/projected/115ee58b-0cd9-4993-b15e-226885cef1d8-kube-api-access-wlgzj\") pod \"glance-operator-controller-manager-5964f64c48-wgnqt\" (UID: \"115ee58b-0cd9-4993-b15e-226885cef1d8\") " pod="openstack-operators/glance-operator-controller-manager-5964f64c48-wgnqt" Mar 11 01:13:03 crc kubenswrapper[4744]: I0311 01:13:03.548870 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kt4n4\" (UniqueName: \"kubernetes.io/projected/9dd5fbf6-4c71-4ed4-b31b-4e1d43c34c73-kube-api-access-kt4n4\") pod \"designate-operator-controller-manager-66d56f6ff4-x5rps\" (UID: \"9dd5fbf6-4c71-4ed4-b31b-4e1d43c34c73\") " pod="openstack-operators/designate-operator-controller-manager-66d56f6ff4-x5rps" Mar 11 01:13:03 crc kubenswrapper[4744]: I0311 01:13:03.551155 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2qkd2\" (UniqueName: \"kubernetes.io/projected/ab0713f2-46ce-4987-852a-18371473f327-kube-api-access-2qkd2\") pod \"barbican-operator-controller-manager-677bd678f7-xh4cc\" (UID: \"ab0713f2-46ce-4987-852a-18371473f327\") " 
pod="openstack-operators/barbican-operator-controller-manager-677bd678f7-xh4cc" Mar 11 01:13:03 crc kubenswrapper[4744]: I0311 01:13:03.564053 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mdqj2\" (UniqueName: \"kubernetes.io/projected/1a7f92f2-6b8c-4ea2-ab12-c9ebcb043600-kube-api-access-mdqj2\") pod \"cinder-operator-controller-manager-984cd4dcf-7lh6p\" (UID: \"1a7f92f2-6b8c-4ea2-ab12-c9ebcb043600\") " pod="openstack-operators/cinder-operator-controller-manager-984cd4dcf-7lh6p" Mar 11 01:13:03 crc kubenswrapper[4744]: I0311 01:13:03.589576 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/nova-operator-controller-manager-569cc54c5-cp8jm"] Mar 11 01:13:03 crc kubenswrapper[4744]: I0311 01:13:03.590206 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/barbican-operator-controller-manager-677bd678f7-xh4cc" Mar 11 01:13:03 crc kubenswrapper[4744]: I0311 01:13:03.591097 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9qg5q\" (UniqueName: \"kubernetes.io/projected/4b8d6717-6a3c-4421-83c8-c86ff18d1e3b-kube-api-access-9qg5q\") pod \"horizon-operator-controller-manager-6d9d6b584d-pr4mx\" (UID: \"4b8d6717-6a3c-4421-83c8-c86ff18d1e3b\") " pod="openstack-operators/horizon-operator-controller-manager-6d9d6b584d-pr4mx" Mar 11 01:13:03 crc kubenswrapper[4744]: I0311 01:13:03.591124 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/daef3605-2bdc-4e16-b55f-61d2d3cfc2fd-cert\") pod \"infra-operator-controller-manager-5995f4446f-hxxgl\" (UID: \"daef3605-2bdc-4e16-b55f-61d2d3cfc2fd\") " pod="openstack-operators/infra-operator-controller-manager-5995f4446f-hxxgl" Mar 11 01:13:03 crc kubenswrapper[4744]: I0311 01:13:03.591150 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ktttl\" 
(UniqueName: \"kubernetes.io/projected/daef3605-2bdc-4e16-b55f-61d2d3cfc2fd-kube-api-access-ktttl\") pod \"infra-operator-controller-manager-5995f4446f-hxxgl\" (UID: \"daef3605-2bdc-4e16-b55f-61d2d3cfc2fd\") " pod="openstack-operators/infra-operator-controller-manager-5995f4446f-hxxgl" Mar 11 01:13:03 crc kubenswrapper[4744]: I0311 01:13:03.591177 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5rkm5\" (UniqueName: \"kubernetes.io/projected/f5ec1f31-33e9-47d8-91bc-450e319479a3-kube-api-access-5rkm5\") pod \"keystone-operator-controller-manager-684f77d66d-c946t\" (UID: \"f5ec1f31-33e9-47d8-91bc-450e319479a3\") " pod="openstack-operators/keystone-operator-controller-manager-684f77d66d-c946t" Mar 11 01:13:03 crc kubenswrapper[4744]: I0311 01:13:03.591197 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gkccf\" (UniqueName: \"kubernetes.io/projected/6b30e096-2cc6-41f3-aaca-c2d7a3d8b138-kube-api-access-gkccf\") pod \"manila-operator-controller-manager-68f45f9d9f-6rtj8\" (UID: \"6b30e096-2cc6-41f3-aaca-c2d7a3d8b138\") " pod="openstack-operators/manila-operator-controller-manager-68f45f9d9f-6rtj8" Mar 11 01:13:03 crc kubenswrapper[4744]: I0311 01:13:03.591216 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rhk2v\" (UniqueName: \"kubernetes.io/projected/3a8c82be-b391-42a8-a16c-a99850c14b19-kube-api-access-rhk2v\") pod \"mariadb-operator-controller-manager-658d4cdd5-gc9cm\" (UID: \"3a8c82be-b391-42a8-a16c-a99850c14b19\") " pod="openstack-operators/mariadb-operator-controller-manager-658d4cdd5-gc9cm" Mar 11 01:13:03 crc kubenswrapper[4744]: I0311 01:13:03.591248 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6mnf6\" (UniqueName: \"kubernetes.io/projected/c1595560-d9f2-48bf-8b30-f8a36f13e1f4-kube-api-access-6mnf6\") pod 
\"heat-operator-controller-manager-77b6666d85-gd9tg\" (UID: \"c1595560-d9f2-48bf-8b30-f8a36f13e1f4\") " pod="openstack-operators/heat-operator-controller-manager-77b6666d85-gd9tg" Mar 11 01:13:03 crc kubenswrapper[4744]: I0311 01:13:03.591269 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hp7kr\" (UniqueName: \"kubernetes.io/projected/cf4745d2-2e5f-4150-9c60-34e91e5f1e80-kube-api-access-hp7kr\") pod \"ironic-operator-controller-manager-6bbb499bbc-m6668\" (UID: \"cf4745d2-2e5f-4150-9c60-34e91e5f1e80\") " pod="openstack-operators/ironic-operator-controller-manager-6bbb499bbc-m6668" Mar 11 01:13:03 crc kubenswrapper[4744]: E0311 01:13:03.591713 4744 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Mar 11 01:13:03 crc kubenswrapper[4744]: E0311 01:13:03.591755 4744 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/daef3605-2bdc-4e16-b55f-61d2d3cfc2fd-cert podName:daef3605-2bdc-4e16-b55f-61d2d3cfc2fd nodeName:}" failed. No retries permitted until 2026-03-11 01:13:04.091739635 +0000 UTC m=+1140.895957240 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/daef3605-2bdc-4e16-b55f-61d2d3cfc2fd-cert") pod "infra-operator-controller-manager-5995f4446f-hxxgl" (UID: "daef3605-2bdc-4e16-b55f-61d2d3cfc2fd") : secret "infra-operator-webhook-server-cert" not found Mar 11 01:13:03 crc kubenswrapper[4744]: I0311 01:13:03.592366 4744 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/nova-operator-controller-manager-569cc54c5-cp8jm" Mar 11 01:13:03 crc kubenswrapper[4744]: I0311 01:13:03.601362 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/octavia-operator-controller-manager-5f4f55cb5c-fmvh8"] Mar 11 01:13:03 crc kubenswrapper[4744]: I0311 01:13:03.602149 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/octavia-operator-controller-manager-5f4f55cb5c-fmvh8" Mar 11 01:13:03 crc kubenswrapper[4744]: I0311 01:13:03.607300 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/neutron-operator-controller-manager-776c5696bf-wt7zj"] Mar 11 01:13:03 crc kubenswrapper[4744]: I0311 01:13:03.608154 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/neutron-operator-controller-manager-776c5696bf-wt7zj" Mar 11 01:13:03 crc kubenswrapper[4744]: I0311 01:13:03.615270 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/cinder-operator-controller-manager-984cd4dcf-7lh6p" Mar 11 01:13:03 crc kubenswrapper[4744]: I0311 01:13:03.624141 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/octavia-operator-controller-manager-5f4f55cb5c-fmvh8"] Mar 11 01:13:03 crc kubenswrapper[4744]: I0311 01:13:03.629094 4744 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/designate-operator-controller-manager-66d56f6ff4-x5rps" Mar 11 01:13:03 crc kubenswrapper[4744]: I0311 01:13:03.639158 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hp7kr\" (UniqueName: \"kubernetes.io/projected/cf4745d2-2e5f-4150-9c60-34e91e5f1e80-kube-api-access-hp7kr\") pod \"ironic-operator-controller-manager-6bbb499bbc-m6668\" (UID: \"cf4745d2-2e5f-4150-9c60-34e91e5f1e80\") " pod="openstack-operators/ironic-operator-controller-manager-6bbb499bbc-m6668" Mar 11 01:13:03 crc kubenswrapper[4744]: I0311 01:13:03.639827 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"octavia-operator-controller-manager-dockercfg-gzsds" Mar 11 01:13:03 crc kubenswrapper[4744]: I0311 01:13:03.639976 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/neutron-operator-controller-manager-776c5696bf-wt7zj"] Mar 11 01:13:03 crc kubenswrapper[4744]: I0311 01:13:03.640032 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"nova-operator-controller-manager-dockercfg-tdrg8" Mar 11 01:13:03 crc kubenswrapper[4744]: I0311 01:13:03.640147 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"neutron-operator-controller-manager-dockercfg-vzq47" Mar 11 01:13:03 crc kubenswrapper[4744]: I0311 01:13:03.656501 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9qg5q\" (UniqueName: \"kubernetes.io/projected/4b8d6717-6a3c-4421-83c8-c86ff18d1e3b-kube-api-access-9qg5q\") pod \"horizon-operator-controller-manager-6d9d6b584d-pr4mx\" (UID: \"4b8d6717-6a3c-4421-83c8-c86ff18d1e3b\") " pod="openstack-operators/horizon-operator-controller-manager-6d9d6b584d-pr4mx" Mar 11 01:13:03 crc kubenswrapper[4744]: I0311 01:13:03.656596 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openstack-operators/nova-operator-controller-manager-569cc54c5-cp8jm"] Mar 11 01:13:03 crc kubenswrapper[4744]: I0311 01:13:03.656842 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/glance-operator-controller-manager-5964f64c48-wgnqt" Mar 11 01:13:03 crc kubenswrapper[4744]: I0311 01:13:03.658206 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ktttl\" (UniqueName: \"kubernetes.io/projected/daef3605-2bdc-4e16-b55f-61d2d3cfc2fd-kube-api-access-ktttl\") pod \"infra-operator-controller-manager-5995f4446f-hxxgl\" (UID: \"daef3605-2bdc-4e16-b55f-61d2d3cfc2fd\") " pod="openstack-operators/infra-operator-controller-manager-5995f4446f-hxxgl" Mar 11 01:13:03 crc kubenswrapper[4744]: I0311 01:13:03.659864 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6mnf6\" (UniqueName: \"kubernetes.io/projected/c1595560-d9f2-48bf-8b30-f8a36f13e1f4-kube-api-access-6mnf6\") pod \"heat-operator-controller-manager-77b6666d85-gd9tg\" (UID: \"c1595560-d9f2-48bf-8b30-f8a36f13e1f4\") " pod="openstack-operators/heat-operator-controller-manager-77b6666d85-gd9tg" Mar 11 01:13:03 crc kubenswrapper[4744]: I0311 01:13:03.694154 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-6647d7885f8pxr2"] Mar 11 01:13:03 crc kubenswrapper[4744]: I0311 01:13:03.694918 4744 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-baremetal-operator-controller-manager-6647d7885f8pxr2" Mar 11 01:13:03 crc kubenswrapper[4744]: I0311 01:13:03.695136 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5rkm5\" (UniqueName: \"kubernetes.io/projected/f5ec1f31-33e9-47d8-91bc-450e319479a3-kube-api-access-5rkm5\") pod \"keystone-operator-controller-manager-684f77d66d-c946t\" (UID: \"f5ec1f31-33e9-47d8-91bc-450e319479a3\") " pod="openstack-operators/keystone-operator-controller-manager-684f77d66d-c946t" Mar 11 01:13:03 crc kubenswrapper[4744]: I0311 01:13:03.695175 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gkccf\" (UniqueName: \"kubernetes.io/projected/6b30e096-2cc6-41f3-aaca-c2d7a3d8b138-kube-api-access-gkccf\") pod \"manila-operator-controller-manager-68f45f9d9f-6rtj8\" (UID: \"6b30e096-2cc6-41f3-aaca-c2d7a3d8b138\") " pod="openstack-operators/manila-operator-controller-manager-68f45f9d9f-6rtj8" Mar 11 01:13:03 crc kubenswrapper[4744]: I0311 01:13:03.695201 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rhk2v\" (UniqueName: \"kubernetes.io/projected/3a8c82be-b391-42a8-a16c-a99850c14b19-kube-api-access-rhk2v\") pod \"mariadb-operator-controller-manager-658d4cdd5-gc9cm\" (UID: \"3a8c82be-b391-42a8-a16c-a99850c14b19\") " pod="openstack-operators/mariadb-operator-controller-manager-658d4cdd5-gc9cm" Mar 11 01:13:03 crc kubenswrapper[4744]: I0311 01:13:03.695224 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l6dtz\" (UniqueName: \"kubernetes.io/projected/431b2a18-7bf7-4dda-a49e-15bfa629e1f9-kube-api-access-l6dtz\") pod \"neutron-operator-controller-manager-776c5696bf-wt7zj\" (UID: \"431b2a18-7bf7-4dda-a49e-15bfa629e1f9\") " pod="openstack-operators/neutron-operator-controller-manager-776c5696bf-wt7zj" Mar 11 01:13:03 crc 
kubenswrapper[4744]: I0311 01:13:03.695314 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8f2wp\" (UniqueName: \"kubernetes.io/projected/6e760654-b96d-4979-a98f-dc162fc1b41e-kube-api-access-8f2wp\") pod \"nova-operator-controller-manager-569cc54c5-cp8jm\" (UID: \"6e760654-b96d-4979-a98f-dc162fc1b41e\") " pod="openstack-operators/nova-operator-controller-manager-569cc54c5-cp8jm" Mar 11 01:13:03 crc kubenswrapper[4744]: I0311 01:13:03.695334 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bdl7b\" (UniqueName: \"kubernetes.io/projected/63676a12-7f22-401f-98e2-eb2495777d96-kube-api-access-bdl7b\") pod \"octavia-operator-controller-manager-5f4f55cb5c-fmvh8\" (UID: \"63676a12-7f22-401f-98e2-eb2495777d96\") " pod="openstack-operators/octavia-operator-controller-manager-5f4f55cb5c-fmvh8" Mar 11 01:13:03 crc kubenswrapper[4744]: I0311 01:13:03.699106 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-baremetal-operator-controller-manager-dockercfg-rd267" Mar 11 01:13:03 crc kubenswrapper[4744]: I0311 01:13:03.699230 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-baremetal-operator-webhook-server-cert" Mar 11 01:13:03 crc kubenswrapper[4744]: I0311 01:13:03.711116 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/heat-operator-controller-manager-77b6666d85-gd9tg" Mar 11 01:13:03 crc kubenswrapper[4744]: I0311 01:13:03.715791 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/ovn-operator-controller-manager-bbc5b68f9-mgfnd"] Mar 11 01:13:03 crc kubenswrapper[4744]: I0311 01:13:03.716593 4744 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/ovn-operator-controller-manager-bbc5b68f9-mgfnd" Mar 11 01:13:03 crc kubenswrapper[4744]: I0311 01:13:03.718194 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gkccf\" (UniqueName: \"kubernetes.io/projected/6b30e096-2cc6-41f3-aaca-c2d7a3d8b138-kube-api-access-gkccf\") pod \"manila-operator-controller-manager-68f45f9d9f-6rtj8\" (UID: \"6b30e096-2cc6-41f3-aaca-c2d7a3d8b138\") " pod="openstack-operators/manila-operator-controller-manager-68f45f9d9f-6rtj8" Mar 11 01:13:03 crc kubenswrapper[4744]: I0311 01:13:03.719851 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/horizon-operator-controller-manager-6d9d6b584d-pr4mx" Mar 11 01:13:03 crc kubenswrapper[4744]: I0311 01:13:03.724147 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"ovn-operator-controller-manager-dockercfg-q8wgq" Mar 11 01:13:03 crc kubenswrapper[4744]: I0311 01:13:03.751277 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5rkm5\" (UniqueName: \"kubernetes.io/projected/f5ec1f31-33e9-47d8-91bc-450e319479a3-kube-api-access-5rkm5\") pod \"keystone-operator-controller-manager-684f77d66d-c946t\" (UID: \"f5ec1f31-33e9-47d8-91bc-450e319479a3\") " pod="openstack-operators/keystone-operator-controller-manager-684f77d66d-c946t" Mar 11 01:13:03 crc kubenswrapper[4744]: I0311 01:13:03.760864 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ironic-operator-controller-manager-6bbb499bbc-m6668" Mar 11 01:13:03 crc kubenswrapper[4744]: I0311 01:13:03.763183 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/placement-operator-controller-manager-574d45c66c-wk9q5"] Mar 11 01:13:03 crc kubenswrapper[4744]: I0311 01:13:03.763922 4744 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/placement-operator-controller-manager-574d45c66c-wk9q5" Mar 11 01:13:03 crc kubenswrapper[4744]: I0311 01:13:03.775229 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-6647d7885f8pxr2"] Mar 11 01:13:03 crc kubenswrapper[4744]: I0311 01:13:03.775287 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ovn-operator-controller-manager-bbc5b68f9-mgfnd"] Mar 11 01:13:03 crc kubenswrapper[4744]: I0311 01:13:03.786166 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/placement-operator-controller-manager-574d45c66c-wk9q5"] Mar 11 01:13:03 crc kubenswrapper[4744]: I0311 01:13:03.786204 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/swift-operator-controller-manager-677c674df7-h9rf2"] Mar 11 01:13:03 crc kubenswrapper[4744]: I0311 01:13:03.786678 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rhk2v\" (UniqueName: \"kubernetes.io/projected/3a8c82be-b391-42a8-a16c-a99850c14b19-kube-api-access-rhk2v\") pod \"mariadb-operator-controller-manager-658d4cdd5-gc9cm\" (UID: \"3a8c82be-b391-42a8-a16c-a99850c14b19\") " pod="openstack-operators/mariadb-operator-controller-manager-658d4cdd5-gc9cm" Mar 11 01:13:03 crc kubenswrapper[4744]: I0311 01:13:03.786954 4744 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/swift-operator-controller-manager-677c674df7-h9rf2" Mar 11 01:13:03 crc kubenswrapper[4744]: I0311 01:13:03.787894 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"placement-operator-controller-manager-dockercfg-d2grw" Mar 11 01:13:03 crc kubenswrapper[4744]: I0311 01:13:03.790944 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"swift-operator-controller-manager-dockercfg-msghq" Mar 11 01:13:03 crc kubenswrapper[4744]: I0311 01:13:03.798465 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f925r\" (UniqueName: \"kubernetes.io/projected/ffd0a9c9-75f4-4721-a045-b4f9dc285388-kube-api-access-f925r\") pod \"openstack-baremetal-operator-controller-manager-6647d7885f8pxr2\" (UID: \"ffd0a9c9-75f4-4721-a045-b4f9dc285388\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-6647d7885f8pxr2" Mar 11 01:13:03 crc kubenswrapper[4744]: I0311 01:13:03.798503 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9vhww\" (UniqueName: \"kubernetes.io/projected/78b3e53a-dda4-4cc9-bd65-e2bcaedb3d2b-kube-api-access-9vhww\") pod \"placement-operator-controller-manager-574d45c66c-wk9q5\" (UID: \"78b3e53a-dda4-4cc9-bd65-e2bcaedb3d2b\") " pod="openstack-operators/placement-operator-controller-manager-574d45c66c-wk9q5" Mar 11 01:13:03 crc kubenswrapper[4744]: I0311 01:13:03.798642 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8f2wp\" (UniqueName: \"kubernetes.io/projected/6e760654-b96d-4979-a98f-dc162fc1b41e-kube-api-access-8f2wp\") pod \"nova-operator-controller-manager-569cc54c5-cp8jm\" (UID: \"6e760654-b96d-4979-a98f-dc162fc1b41e\") " pod="openstack-operators/nova-operator-controller-manager-569cc54c5-cp8jm" Mar 11 01:13:03 crc kubenswrapper[4744]: I0311 
01:13:03.798663 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bdl7b\" (UniqueName: \"kubernetes.io/projected/63676a12-7f22-401f-98e2-eb2495777d96-kube-api-access-bdl7b\") pod \"octavia-operator-controller-manager-5f4f55cb5c-fmvh8\" (UID: \"63676a12-7f22-401f-98e2-eb2495777d96\") " pod="openstack-operators/octavia-operator-controller-manager-5f4f55cb5c-fmvh8" Mar 11 01:13:03 crc kubenswrapper[4744]: I0311 01:13:03.798687 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x5598\" (UniqueName: \"kubernetes.io/projected/0c8c19f0-aa8c-4ab0-9b6a-7600684d5bc8-kube-api-access-x5598\") pod \"swift-operator-controller-manager-677c674df7-h9rf2\" (UID: \"0c8c19f0-aa8c-4ab0-9b6a-7600684d5bc8\") " pod="openstack-operators/swift-operator-controller-manager-677c674df7-h9rf2" Mar 11 01:13:03 crc kubenswrapper[4744]: I0311 01:13:03.798717 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l6dtz\" (UniqueName: \"kubernetes.io/projected/431b2a18-7bf7-4dda-a49e-15bfa629e1f9-kube-api-access-l6dtz\") pod \"neutron-operator-controller-manager-776c5696bf-wt7zj\" (UID: \"431b2a18-7bf7-4dda-a49e-15bfa629e1f9\") " pod="openstack-operators/neutron-operator-controller-manager-776c5696bf-wt7zj" Mar 11 01:13:03 crc kubenswrapper[4744]: I0311 01:13:03.798738 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/ffd0a9c9-75f4-4721-a045-b4f9dc285388-cert\") pod \"openstack-baremetal-operator-controller-manager-6647d7885f8pxr2\" (UID: \"ffd0a9c9-75f4-4721-a045-b4f9dc285388\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-6647d7885f8pxr2" Mar 11 01:13:03 crc kubenswrapper[4744]: I0311 01:13:03.798759 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-m58sj\" (UniqueName: \"kubernetes.io/projected/c2d319ab-e093-4ff0-8a47-d73b7ffc8a68-kube-api-access-m58sj\") pod \"ovn-operator-controller-manager-bbc5b68f9-mgfnd\" (UID: \"c2d319ab-e093-4ff0-8a47-d73b7ffc8a68\") " pod="openstack-operators/ovn-operator-controller-manager-bbc5b68f9-mgfnd" Mar 11 01:13:03 crc kubenswrapper[4744]: I0311 01:13:03.810926 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-6cd66dbd4b-qrwcx"] Mar 11 01:13:03 crc kubenswrapper[4744]: I0311 01:13:03.811761 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/telemetry-operator-controller-manager-6cd66dbd4b-qrwcx" Mar 11 01:13:03 crc kubenswrapper[4744]: I0311 01:13:03.816464 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"telemetry-operator-controller-manager-dockercfg-shtxf" Mar 11 01:13:03 crc kubenswrapper[4744]: I0311 01:13:03.816960 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l6dtz\" (UniqueName: \"kubernetes.io/projected/431b2a18-7bf7-4dda-a49e-15bfa629e1f9-kube-api-access-l6dtz\") pod \"neutron-operator-controller-manager-776c5696bf-wt7zj\" (UID: \"431b2a18-7bf7-4dda-a49e-15bfa629e1f9\") " pod="openstack-operators/neutron-operator-controller-manager-776c5696bf-wt7zj" Mar 11 01:13:03 crc kubenswrapper[4744]: I0311 01:13:03.823075 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8f2wp\" (UniqueName: \"kubernetes.io/projected/6e760654-b96d-4979-a98f-dc162fc1b41e-kube-api-access-8f2wp\") pod \"nova-operator-controller-manager-569cc54c5-cp8jm\" (UID: \"6e760654-b96d-4979-a98f-dc162fc1b41e\") " pod="openstack-operators/nova-operator-controller-manager-569cc54c5-cp8jm" Mar 11 01:13:03 crc kubenswrapper[4744]: I0311 01:13:03.827906 4744 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/keystone-operator-controller-manager-684f77d66d-c946t" Mar 11 01:13:03 crc kubenswrapper[4744]: I0311 01:13:03.829310 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/swift-operator-controller-manager-677c674df7-h9rf2"] Mar 11 01:13:03 crc kubenswrapper[4744]: I0311 01:13:03.837081 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bdl7b\" (UniqueName: \"kubernetes.io/projected/63676a12-7f22-401f-98e2-eb2495777d96-kube-api-access-bdl7b\") pod \"octavia-operator-controller-manager-5f4f55cb5c-fmvh8\" (UID: \"63676a12-7f22-401f-98e2-eb2495777d96\") " pod="openstack-operators/octavia-operator-controller-manager-5f4f55cb5c-fmvh8" Mar 11 01:13:03 crc kubenswrapper[4744]: I0311 01:13:03.843096 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-6cd66dbd4b-qrwcx"] Mar 11 01:13:03 crc kubenswrapper[4744]: I0311 01:13:03.882864 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/test-operator-controller-manager-5c5cb9c4d7-rrbl2"] Mar 11 01:13:03 crc kubenswrapper[4744]: I0311 01:13:03.893263 4744 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-rrbl2" Mar 11 01:13:03 crc kubenswrapper[4744]: I0311 01:13:03.896270 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"test-operator-controller-manager-dockercfg-qk5mj" Mar 11 01:13:03 crc kubenswrapper[4744]: I0311 01:13:03.904650 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x5598\" (UniqueName: \"kubernetes.io/projected/0c8c19f0-aa8c-4ab0-9b6a-7600684d5bc8-kube-api-access-x5598\") pod \"swift-operator-controller-manager-677c674df7-h9rf2\" (UID: \"0c8c19f0-aa8c-4ab0-9b6a-7600684d5bc8\") " pod="openstack-operators/swift-operator-controller-manager-677c674df7-h9rf2" Mar 11 01:13:03 crc kubenswrapper[4744]: I0311 01:13:03.904729 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6kvf9\" (UniqueName: \"kubernetes.io/projected/cf1ef450-174d-43d9-b921-ce85078476b4-kube-api-access-6kvf9\") pod \"test-operator-controller-manager-5c5cb9c4d7-rrbl2\" (UID: \"cf1ef450-174d-43d9-b921-ce85078476b4\") " pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-rrbl2" Mar 11 01:13:03 crc kubenswrapper[4744]: I0311 01:13:03.904787 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/ffd0a9c9-75f4-4721-a045-b4f9dc285388-cert\") pod \"openstack-baremetal-operator-controller-manager-6647d7885f8pxr2\" (UID: \"ffd0a9c9-75f4-4721-a045-b4f9dc285388\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-6647d7885f8pxr2" Mar 11 01:13:03 crc kubenswrapper[4744]: I0311 01:13:03.904827 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f8f8s\" (UniqueName: \"kubernetes.io/projected/1b8a814c-cd04-49bf-b774-441c96a1faa4-kube-api-access-f8f8s\") pod 
\"telemetry-operator-controller-manager-6cd66dbd4b-qrwcx\" (UID: \"1b8a814c-cd04-49bf-b774-441c96a1faa4\") " pod="openstack-operators/telemetry-operator-controller-manager-6cd66dbd4b-qrwcx" Mar 11 01:13:03 crc kubenswrapper[4744]: I0311 01:13:03.904868 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m58sj\" (UniqueName: \"kubernetes.io/projected/c2d319ab-e093-4ff0-8a47-d73b7ffc8a68-kube-api-access-m58sj\") pod \"ovn-operator-controller-manager-bbc5b68f9-mgfnd\" (UID: \"c2d319ab-e093-4ff0-8a47-d73b7ffc8a68\") " pod="openstack-operators/ovn-operator-controller-manager-bbc5b68f9-mgfnd" Mar 11 01:13:03 crc kubenswrapper[4744]: I0311 01:13:03.904920 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f925r\" (UniqueName: \"kubernetes.io/projected/ffd0a9c9-75f4-4721-a045-b4f9dc285388-kube-api-access-f925r\") pod \"openstack-baremetal-operator-controller-manager-6647d7885f8pxr2\" (UID: \"ffd0a9c9-75f4-4721-a045-b4f9dc285388\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-6647d7885f8pxr2" Mar 11 01:13:03 crc kubenswrapper[4744]: I0311 01:13:03.904956 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9vhww\" (UniqueName: \"kubernetes.io/projected/78b3e53a-dda4-4cc9-bd65-e2bcaedb3d2b-kube-api-access-9vhww\") pod \"placement-operator-controller-manager-574d45c66c-wk9q5\" (UID: \"78b3e53a-dda4-4cc9-bd65-e2bcaedb3d2b\") " pod="openstack-operators/placement-operator-controller-manager-574d45c66c-wk9q5" Mar 11 01:13:03 crc kubenswrapper[4744]: I0311 01:13:03.915875 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-baremetal-operator-webhook-server-cert" Mar 11 01:13:03 crc kubenswrapper[4744]: I0311 01:13:03.922627 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/test-operator-controller-manager-5c5cb9c4d7-rrbl2"] Mar 11 
01:13:03 crc kubenswrapper[4744]: I0311 01:13:03.925708 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/manila-operator-controller-manager-68f45f9d9f-6rtj8" Mar 11 01:13:03 crc kubenswrapper[4744]: E0311 01:13:03.926088 4744 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 11 01:13:03 crc kubenswrapper[4744]: E0311 01:13:03.926866 4744 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ffd0a9c9-75f4-4721-a045-b4f9dc285388-cert podName:ffd0a9c9-75f4-4721-a045-b4f9dc285388 nodeName:}" failed. No retries permitted until 2026-03-11 01:13:04.426842355 +0000 UTC m=+1141.231059960 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/ffd0a9c9-75f4-4721-a045-b4f9dc285388-cert") pod "openstack-baremetal-operator-controller-manager-6647d7885f8pxr2" (UID: "ffd0a9c9-75f4-4721-a045-b4f9dc285388") : secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 11 01:13:03 crc kubenswrapper[4744]: I0311 01:13:03.928553 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"mariadb-operator-controller-manager-dockercfg-shp8n" Mar 11 01:13:03 crc kubenswrapper[4744]: I0311 01:13:03.940652 4744 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/mariadb-operator-controller-manager-658d4cdd5-gc9cm" Mar 11 01:13:03 crc kubenswrapper[4744]: I0311 01:13:03.919283 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"manila-operator-controller-manager-dockercfg-sjjlx" Mar 11 01:13:03 crc kubenswrapper[4744]: I0311 01:13:03.958594 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f925r\" (UniqueName: \"kubernetes.io/projected/ffd0a9c9-75f4-4721-a045-b4f9dc285388-kube-api-access-f925r\") pod \"openstack-baremetal-operator-controller-manager-6647d7885f8pxr2\" (UID: \"ffd0a9c9-75f4-4721-a045-b4f9dc285388\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-6647d7885f8pxr2" Mar 11 01:13:03 crc kubenswrapper[4744]: I0311 01:13:03.968632 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"nova-operator-controller-manager-dockercfg-tdrg8" Mar 11 01:13:03 crc kubenswrapper[4744]: I0311 01:13:03.971649 4744 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/nova-operator-controller-manager-569cc54c5-cp8jm" Mar 11 01:13:03 crc kubenswrapper[4744]: I0311 01:13:03.981168 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m58sj\" (UniqueName: \"kubernetes.io/projected/c2d319ab-e093-4ff0-8a47-d73b7ffc8a68-kube-api-access-m58sj\") pod \"ovn-operator-controller-manager-bbc5b68f9-mgfnd\" (UID: \"c2d319ab-e093-4ff0-8a47-d73b7ffc8a68\") " pod="openstack-operators/ovn-operator-controller-manager-bbc5b68f9-mgfnd" Mar 11 01:13:04 crc kubenswrapper[4744]: I0311 01:13:04.006918 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6kvf9\" (UniqueName: \"kubernetes.io/projected/cf1ef450-174d-43d9-b921-ce85078476b4-kube-api-access-6kvf9\") pod \"test-operator-controller-manager-5c5cb9c4d7-rrbl2\" (UID: \"cf1ef450-174d-43d9-b921-ce85078476b4\") " pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-rrbl2" Mar 11 01:13:04 crc kubenswrapper[4744]: I0311 01:13:04.007001 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f8f8s\" (UniqueName: \"kubernetes.io/projected/1b8a814c-cd04-49bf-b774-441c96a1faa4-kube-api-access-f8f8s\") pod \"telemetry-operator-controller-manager-6cd66dbd4b-qrwcx\" (UID: \"1b8a814c-cd04-49bf-b774-441c96a1faa4\") " pod="openstack-operators/telemetry-operator-controller-manager-6cd66dbd4b-qrwcx" Mar 11 01:13:04 crc kubenswrapper[4744]: I0311 01:13:04.013418 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x5598\" (UniqueName: \"kubernetes.io/projected/0c8c19f0-aa8c-4ab0-9b6a-7600684d5bc8-kube-api-access-x5598\") pod \"swift-operator-controller-manager-677c674df7-h9rf2\" (UID: \"0c8c19f0-aa8c-4ab0-9b6a-7600684d5bc8\") " pod="openstack-operators/swift-operator-controller-manager-677c674df7-h9rf2" Mar 11 01:13:04 crc kubenswrapper[4744]: I0311 01:13:04.015611 4744 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9vhww\" (UniqueName: \"kubernetes.io/projected/78b3e53a-dda4-4cc9-bd65-e2bcaedb3d2b-kube-api-access-9vhww\") pod \"placement-operator-controller-manager-574d45c66c-wk9q5\" (UID: \"78b3e53a-dda4-4cc9-bd65-e2bcaedb3d2b\") " pod="openstack-operators/placement-operator-controller-manager-574d45c66c-wk9q5" Mar 11 01:13:04 crc kubenswrapper[4744]: I0311 01:13:04.039294 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/watcher-operator-controller-manager-6dd88c6f67-qtvcf"] Mar 11 01:13:04 crc kubenswrapper[4744]: I0311 01:13:04.040551 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/watcher-operator-controller-manager-6dd88c6f67-qtvcf"] Mar 11 01:13:04 crc kubenswrapper[4744]: I0311 01:13:04.040653 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/watcher-operator-controller-manager-6dd88c6f67-qtvcf" Mar 11 01:13:04 crc kubenswrapper[4744]: I0311 01:13:04.043373 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-controller-manager-6679ddfdc7-8trgb"] Mar 11 01:13:04 crc kubenswrapper[4744]: I0311 01:13:04.043769 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"watcher-operator-controller-manager-dockercfg-nvn64" Mar 11 01:13:04 crc kubenswrapper[4744]: I0311 01:13:04.044800 4744 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-controller-manager-6679ddfdc7-8trgb" Mar 11 01:13:04 crc kubenswrapper[4744]: I0311 01:13:04.052305 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6kvf9\" (UniqueName: \"kubernetes.io/projected/cf1ef450-174d-43d9-b921-ce85078476b4-kube-api-access-6kvf9\") pod \"test-operator-controller-manager-5c5cb9c4d7-rrbl2\" (UID: \"cf1ef450-174d-43d9-b921-ce85078476b4\") " pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-rrbl2" Mar 11 01:13:04 crc kubenswrapper[4744]: I0311 01:13:04.058284 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f8f8s\" (UniqueName: \"kubernetes.io/projected/1b8a814c-cd04-49bf-b774-441c96a1faa4-kube-api-access-f8f8s\") pod \"telemetry-operator-controller-manager-6cd66dbd4b-qrwcx\" (UID: \"1b8a814c-cd04-49bf-b774-441c96a1faa4\") " pod="openstack-operators/telemetry-operator-controller-manager-6cd66dbd4b-qrwcx" Mar 11 01:13:04 crc kubenswrapper[4744]: I0311 01:13:04.058901 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"octavia-operator-controller-manager-dockercfg-gzsds" Mar 11 01:13:04 crc kubenswrapper[4744]: I0311 01:13:04.061331 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-controller-manager-dockercfg-jll7t" Mar 11 01:13:04 crc kubenswrapper[4744]: I0311 01:13:04.064861 4744 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/octavia-operator-controller-manager-5f4f55cb5c-fmvh8" Mar 11 01:13:04 crc kubenswrapper[4744]: I0311 01:13:04.065497 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"webhook-server-cert" Mar 11 01:13:04 crc kubenswrapper[4744]: I0311 01:13:04.065983 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"metrics-server-cert" Mar 11 01:13:04 crc kubenswrapper[4744]: I0311 01:13:04.078625 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-manager-6679ddfdc7-8trgb"] Mar 11 01:13:04 crc kubenswrapper[4744]: I0311 01:13:04.095031 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"neutron-operator-controller-manager-dockercfg-vzq47" Mar 11 01:13:04 crc kubenswrapper[4744]: I0311 01:13:04.104180 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/neutron-operator-controller-manager-776c5696bf-wt7zj" Mar 11 01:13:04 crc kubenswrapper[4744]: I0311 01:13:04.108368 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/f341be86-aa09-4703-aa02-29ef571ad003-metrics-certs\") pod \"openstack-operator-controller-manager-6679ddfdc7-8trgb\" (UID: \"f341be86-aa09-4703-aa02-29ef571ad003\") " pod="openstack-operators/openstack-operator-controller-manager-6679ddfdc7-8trgb" Mar 11 01:13:04 crc kubenswrapper[4744]: I0311 01:13:04.108402 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vlbhj\" (UniqueName: \"kubernetes.io/projected/f341be86-aa09-4703-aa02-29ef571ad003-kube-api-access-vlbhj\") pod \"openstack-operator-controller-manager-6679ddfdc7-8trgb\" (UID: \"f341be86-aa09-4703-aa02-29ef571ad003\") " 
pod="openstack-operators/openstack-operator-controller-manager-6679ddfdc7-8trgb" Mar 11 01:13:04 crc kubenswrapper[4744]: I0311 01:13:04.108479 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/f341be86-aa09-4703-aa02-29ef571ad003-webhook-certs\") pod \"openstack-operator-controller-manager-6679ddfdc7-8trgb\" (UID: \"f341be86-aa09-4703-aa02-29ef571ad003\") " pod="openstack-operators/openstack-operator-controller-manager-6679ddfdc7-8trgb" Mar 11 01:13:04 crc kubenswrapper[4744]: I0311 01:13:04.108495 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zqzjq\" (UniqueName: \"kubernetes.io/projected/27f5807c-3d27-4ffc-bcc7-c43a07d04fa1-kube-api-access-zqzjq\") pod \"watcher-operator-controller-manager-6dd88c6f67-qtvcf\" (UID: \"27f5807c-3d27-4ffc-bcc7-c43a07d04fa1\") " pod="openstack-operators/watcher-operator-controller-manager-6dd88c6f67-qtvcf" Mar 11 01:13:04 crc kubenswrapper[4744]: I0311 01:13:04.108555 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/daef3605-2bdc-4e16-b55f-61d2d3cfc2fd-cert\") pod \"infra-operator-controller-manager-5995f4446f-hxxgl\" (UID: \"daef3605-2bdc-4e16-b55f-61d2d3cfc2fd\") " pod="openstack-operators/infra-operator-controller-manager-5995f4446f-hxxgl" Mar 11 01:13:04 crc kubenswrapper[4744]: E0311 01:13:04.108655 4744 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Mar 11 01:13:04 crc kubenswrapper[4744]: E0311 01:13:04.108699 4744 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/daef3605-2bdc-4e16-b55f-61d2d3cfc2fd-cert podName:daef3605-2bdc-4e16-b55f-61d2d3cfc2fd nodeName:}" failed. 
No retries permitted until 2026-03-11 01:13:05.108686267 +0000 UTC m=+1141.912903872 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/daef3605-2bdc-4e16-b55f-61d2d3cfc2fd-cert") pod "infra-operator-controller-manager-5995f4446f-hxxgl" (UID: "daef3605-2bdc-4e16-b55f-61d2d3cfc2fd") : secret "infra-operator-webhook-server-cert" not found Mar 11 01:13:04 crc kubenswrapper[4744]: I0311 01:13:04.126859 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"swift-operator-controller-manager-dockercfg-msghq" Mar 11 01:13:04 crc kubenswrapper[4744]: I0311 01:13:04.127040 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/swift-operator-controller-manager-677c674df7-h9rf2" Mar 11 01:13:04 crc kubenswrapper[4744]: I0311 01:13:04.141089 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"ovn-operator-controller-manager-dockercfg-q8wgq" Mar 11 01:13:04 crc kubenswrapper[4744]: I0311 01:13:04.149345 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ovn-operator-controller-manager-bbc5b68f9-mgfnd" Mar 11 01:13:04 crc kubenswrapper[4744]: I0311 01:13:04.155602 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-924zq"] Mar 11 01:13:04 crc kubenswrapper[4744]: I0311 01:13:04.156440 4744 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-924zq" Mar 11 01:13:04 crc kubenswrapper[4744]: I0311 01:13:04.163388 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"rabbitmq-cluster-operator-controller-manager-dockercfg-kw254" Mar 11 01:13:04 crc kubenswrapper[4744]: I0311 01:13:04.163713 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-924zq"] Mar 11 01:13:04 crc kubenswrapper[4744]: I0311 01:13:04.168322 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"placement-operator-controller-manager-dockercfg-d2grw" Mar 11 01:13:04 crc kubenswrapper[4744]: I0311 01:13:04.168500 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"telemetry-operator-controller-manager-dockercfg-shtxf" Mar 11 01:13:04 crc kubenswrapper[4744]: I0311 01:13:04.177791 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/telemetry-operator-controller-manager-6cd66dbd4b-qrwcx" Mar 11 01:13:04 crc kubenswrapper[4744]: I0311 01:13:04.177848 4744 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/placement-operator-controller-manager-574d45c66c-wk9q5" Mar 11 01:13:04 crc kubenswrapper[4744]: I0311 01:13:04.211319 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/f341be86-aa09-4703-aa02-29ef571ad003-webhook-certs\") pod \"openstack-operator-controller-manager-6679ddfdc7-8trgb\" (UID: \"f341be86-aa09-4703-aa02-29ef571ad003\") " pod="openstack-operators/openstack-operator-controller-manager-6679ddfdc7-8trgb" Mar 11 01:13:04 crc kubenswrapper[4744]: I0311 01:13:04.211358 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zqzjq\" (UniqueName: \"kubernetes.io/projected/27f5807c-3d27-4ffc-bcc7-c43a07d04fa1-kube-api-access-zqzjq\") pod \"watcher-operator-controller-manager-6dd88c6f67-qtvcf\" (UID: \"27f5807c-3d27-4ffc-bcc7-c43a07d04fa1\") " pod="openstack-operators/watcher-operator-controller-manager-6dd88c6f67-qtvcf" Mar 11 01:13:04 crc kubenswrapper[4744]: I0311 01:13:04.211416 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/f341be86-aa09-4703-aa02-29ef571ad003-metrics-certs\") pod \"openstack-operator-controller-manager-6679ddfdc7-8trgb\" (UID: \"f341be86-aa09-4703-aa02-29ef571ad003\") " pod="openstack-operators/openstack-operator-controller-manager-6679ddfdc7-8trgb" Mar 11 01:13:04 crc kubenswrapper[4744]: I0311 01:13:04.211434 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vlbhj\" (UniqueName: \"kubernetes.io/projected/f341be86-aa09-4703-aa02-29ef571ad003-kube-api-access-vlbhj\") pod \"openstack-operator-controller-manager-6679ddfdc7-8trgb\" (UID: \"f341be86-aa09-4703-aa02-29ef571ad003\") " pod="openstack-operators/openstack-operator-controller-manager-6679ddfdc7-8trgb" Mar 11 01:13:04 crc kubenswrapper[4744]: E0311 01:13:04.211924 4744 
secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Mar 11 01:13:04 crc kubenswrapper[4744]: E0311 01:13:04.211975 4744 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f341be86-aa09-4703-aa02-29ef571ad003-metrics-certs podName:f341be86-aa09-4703-aa02-29ef571ad003 nodeName:}" failed. No retries permitted until 2026-03-11 01:13:04.711957882 +0000 UTC m=+1141.516175487 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/f341be86-aa09-4703-aa02-29ef571ad003-metrics-certs") pod "openstack-operator-controller-manager-6679ddfdc7-8trgb" (UID: "f341be86-aa09-4703-aa02-29ef571ad003") : secret "metrics-server-cert" not found Mar 11 01:13:04 crc kubenswrapper[4744]: E0311 01:13:04.212033 4744 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Mar 11 01:13:04 crc kubenswrapper[4744]: E0311 01:13:04.212114 4744 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f341be86-aa09-4703-aa02-29ef571ad003-webhook-certs podName:f341be86-aa09-4703-aa02-29ef571ad003 nodeName:}" failed. No retries permitted until 2026-03-11 01:13:04.712096586 +0000 UTC m=+1141.516314191 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/f341be86-aa09-4703-aa02-29ef571ad003-webhook-certs") pod "openstack-operator-controller-manager-6679ddfdc7-8trgb" (UID: "f341be86-aa09-4703-aa02-29ef571ad003") : secret "webhook-server-cert" not found Mar 11 01:13:04 crc kubenswrapper[4744]: I0311 01:13:04.233298 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vlbhj\" (UniqueName: \"kubernetes.io/projected/f341be86-aa09-4703-aa02-29ef571ad003-kube-api-access-vlbhj\") pod \"openstack-operator-controller-manager-6679ddfdc7-8trgb\" (UID: \"f341be86-aa09-4703-aa02-29ef571ad003\") " pod="openstack-operators/openstack-operator-controller-manager-6679ddfdc7-8trgb" Mar 11 01:13:04 crc kubenswrapper[4744]: I0311 01:13:04.238961 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zqzjq\" (UniqueName: \"kubernetes.io/projected/27f5807c-3d27-4ffc-bcc7-c43a07d04fa1-kube-api-access-zqzjq\") pod \"watcher-operator-controller-manager-6dd88c6f67-qtvcf\" (UID: \"27f5807c-3d27-4ffc-bcc7-c43a07d04fa1\") " pod="openstack-operators/watcher-operator-controller-manager-6dd88c6f67-qtvcf" Mar 11 01:13:04 crc kubenswrapper[4744]: I0311 01:13:04.246777 4744 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-rrbl2" Mar 11 01:13:04 crc kubenswrapper[4744]: I0311 01:13:04.312558 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nlxnk\" (UniqueName: \"kubernetes.io/projected/5f30ecde-348a-43ce-980f-b27ebd7971bb-kube-api-access-nlxnk\") pod \"rabbitmq-cluster-operator-manager-668c99d594-924zq\" (UID: \"5f30ecde-348a-43ce-980f-b27ebd7971bb\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-924zq" Mar 11 01:13:04 crc kubenswrapper[4744]: I0311 01:13:04.312614 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/cinder-operator-controller-manager-984cd4dcf-7lh6p"] Mar 11 01:13:04 crc kubenswrapper[4744]: I0311 01:13:04.328302 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/barbican-operator-controller-manager-677bd678f7-xh4cc"] Mar 11 01:13:04 crc kubenswrapper[4744]: W0311 01:13:04.360224 4744 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podab0713f2_46ce_4987_852a_18371473f327.slice/crio-d6760c41dd51b429b03a4b01da17f45405f753113e51c3126a2413c0960612b0 WatchSource:0}: Error finding container d6760c41dd51b429b03a4b01da17f45405f753113e51c3126a2413c0960612b0: Status 404 returned error can't find the container with id d6760c41dd51b429b03a4b01da17f45405f753113e51c3126a2413c0960612b0 Mar 11 01:13:04 crc kubenswrapper[4744]: I0311 01:13:04.391325 4744 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/watcher-operator-controller-manager-6dd88c6f67-qtvcf" Mar 11 01:13:04 crc kubenswrapper[4744]: I0311 01:13:04.415271 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nlxnk\" (UniqueName: \"kubernetes.io/projected/5f30ecde-348a-43ce-980f-b27ebd7971bb-kube-api-access-nlxnk\") pod \"rabbitmq-cluster-operator-manager-668c99d594-924zq\" (UID: \"5f30ecde-348a-43ce-980f-b27ebd7971bb\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-924zq" Mar 11 01:13:04 crc kubenswrapper[4744]: I0311 01:13:04.438259 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nlxnk\" (UniqueName: \"kubernetes.io/projected/5f30ecde-348a-43ce-980f-b27ebd7971bb-kube-api-access-nlxnk\") pod \"rabbitmq-cluster-operator-manager-668c99d594-924zq\" (UID: \"5f30ecde-348a-43ce-980f-b27ebd7971bb\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-924zq" Mar 11 01:13:04 crc kubenswrapper[4744]: I0311 01:13:04.501320 4744 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-924zq" Mar 11 01:13:04 crc kubenswrapper[4744]: I0311 01:13:04.515970 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/ffd0a9c9-75f4-4721-a045-b4f9dc285388-cert\") pod \"openstack-baremetal-operator-controller-manager-6647d7885f8pxr2\" (UID: \"ffd0a9c9-75f4-4721-a045-b4f9dc285388\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-6647d7885f8pxr2" Mar 11 01:13:04 crc kubenswrapper[4744]: E0311 01:13:04.516132 4744 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 11 01:13:04 crc kubenswrapper[4744]: E0311 01:13:04.516182 4744 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ffd0a9c9-75f4-4721-a045-b4f9dc285388-cert podName:ffd0a9c9-75f4-4721-a045-b4f9dc285388 nodeName:}" failed. No retries permitted until 2026-03-11 01:13:05.516167512 +0000 UTC m=+1142.320385117 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/ffd0a9c9-75f4-4721-a045-b4f9dc285388-cert") pod "openstack-baremetal-operator-controller-manager-6647d7885f8pxr2" (UID: "ffd0a9c9-75f4-4721-a045-b4f9dc285388") : secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 11 01:13:04 crc kubenswrapper[4744]: I0311 01:13:04.613132 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-controller-manager-677bd678f7-xh4cc" event={"ID":"ab0713f2-46ce-4987-852a-18371473f327","Type":"ContainerStarted","Data":"d6760c41dd51b429b03a4b01da17f45405f753113e51c3126a2413c0960612b0"} Mar 11 01:13:04 crc kubenswrapper[4744]: I0311 01:13:04.614636 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/cinder-operator-controller-manager-984cd4dcf-7lh6p" event={"ID":"1a7f92f2-6b8c-4ea2-ab12-c9ebcb043600","Type":"ContainerStarted","Data":"b5634a4e74ad9cd507047f1e4269469f399f2bfacc2c2a87d52e2c34926e5d65"} Mar 11 01:13:04 crc kubenswrapper[4744]: I0311 01:13:04.701082 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/designate-operator-controller-manager-66d56f6ff4-x5rps"] Mar 11 01:13:04 crc kubenswrapper[4744]: I0311 01:13:04.711924 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/keystone-operator-controller-manager-684f77d66d-c946t"] Mar 11 01:13:04 crc kubenswrapper[4744]: I0311 01:13:04.718013 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ironic-operator-controller-manager-6bbb499bbc-m6668"] Mar 11 01:13:04 crc kubenswrapper[4744]: I0311 01:13:04.719330 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/f341be86-aa09-4703-aa02-29ef571ad003-webhook-certs\") pod \"openstack-operator-controller-manager-6679ddfdc7-8trgb\" (UID: \"f341be86-aa09-4703-aa02-29ef571ad003\") " 
pod="openstack-operators/openstack-operator-controller-manager-6679ddfdc7-8trgb" Mar 11 01:13:04 crc kubenswrapper[4744]: I0311 01:13:04.719410 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/f341be86-aa09-4703-aa02-29ef571ad003-metrics-certs\") pod \"openstack-operator-controller-manager-6679ddfdc7-8trgb\" (UID: \"f341be86-aa09-4703-aa02-29ef571ad003\") " pod="openstack-operators/openstack-operator-controller-manager-6679ddfdc7-8trgb" Mar 11 01:13:04 crc kubenswrapper[4744]: E0311 01:13:04.719538 4744 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Mar 11 01:13:04 crc kubenswrapper[4744]: E0311 01:13:04.719591 4744 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f341be86-aa09-4703-aa02-29ef571ad003-metrics-certs podName:f341be86-aa09-4703-aa02-29ef571ad003 nodeName:}" failed. No retries permitted until 2026-03-11 01:13:05.719577024 +0000 UTC m=+1142.523794619 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/f341be86-aa09-4703-aa02-29ef571ad003-metrics-certs") pod "openstack-operator-controller-manager-6679ddfdc7-8trgb" (UID: "f341be86-aa09-4703-aa02-29ef571ad003") : secret "metrics-server-cert" not found Mar 11 01:13:04 crc kubenswrapper[4744]: E0311 01:13:04.719591 4744 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Mar 11 01:13:04 crc kubenswrapper[4744]: E0311 01:13:04.719658 4744 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f341be86-aa09-4703-aa02-29ef571ad003-webhook-certs podName:f341be86-aa09-4703-aa02-29ef571ad003 nodeName:}" failed. No retries permitted until 2026-03-11 01:13:05.719639765 +0000 UTC m=+1142.523857370 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/f341be86-aa09-4703-aa02-29ef571ad003-webhook-certs") pod "openstack-operator-controller-manager-6679ddfdc7-8trgb" (UID: "f341be86-aa09-4703-aa02-29ef571ad003") : secret "webhook-server-cert" not found Mar 11 01:13:04 crc kubenswrapper[4744]: I0311 01:13:04.730276 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/manila-operator-controller-manager-68f45f9d9f-6rtj8"] Mar 11 01:13:04 crc kubenswrapper[4744]: I0311 01:13:04.740730 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/horizon-operator-controller-manager-6d9d6b584d-pr4mx"] Mar 11 01:13:04 crc kubenswrapper[4744]: I0311 01:13:04.743041 4744 scope.go:117] "RemoveContainer" containerID="b1d97507e646888ad42be397ac17323b3164949551d9beb6c43aab54c70713f0" Mar 11 01:13:04 crc kubenswrapper[4744]: W0311 01:13:04.743760 4744 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6b30e096_2cc6_41f3_aaca_c2d7a3d8b138.slice/crio-ad24a7f291a52609d9f6c3a7cc71e9a24f177c59f76af52bea456f0e470e675d WatchSource:0}: Error finding container ad24a7f291a52609d9f6c3a7cc71e9a24f177c59f76af52bea456f0e470e675d: Status 404 returned error can't find the container with id ad24a7f291a52609d9f6c3a7cc71e9a24f177c59f76af52bea456f0e470e675d Mar 11 01:13:04 crc kubenswrapper[4744]: I0311 01:13:04.750889 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/glance-operator-controller-manager-5964f64c48-wgnqt"] Mar 11 01:13:04 crc kubenswrapper[4744]: I0311 01:13:04.925055 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/nova-operator-controller-manager-569cc54c5-cp8jm"] Mar 11 01:13:04 crc kubenswrapper[4744]: I0311 01:13:04.931373 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-6cd66dbd4b-qrwcx"] 
Mar 11 01:13:04 crc kubenswrapper[4744]: I0311 01:13:04.936897 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/heat-operator-controller-manager-77b6666d85-gd9tg"] Mar 11 01:13:04 crc kubenswrapper[4744]: I0311 01:13:04.946666 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/octavia-operator-controller-manager-5f4f55cb5c-fmvh8"] Mar 11 01:13:04 crc kubenswrapper[4744]: W0311 01:13:04.947745 4744 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc1595560_d9f2_48bf_8b30_f8a36f13e1f4.slice/crio-bf68bc1b1e257ce09ccd56965562f93b6d5cb694e3d431a1e4f54a6c55f8695d WatchSource:0}: Error finding container bf68bc1b1e257ce09ccd56965562f93b6d5cb694e3d431a1e4f54a6c55f8695d: Status 404 returned error can't find the container with id bf68bc1b1e257ce09ccd56965562f93b6d5cb694e3d431a1e4f54a6c55f8695d Mar 11 01:13:05 crc kubenswrapper[4744]: I0311 01:13:05.057609 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/placement-operator-controller-manager-574d45c66c-wk9q5"] Mar 11 01:13:05 crc kubenswrapper[4744]: I0311 01:13:05.070793 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/test-operator-controller-manager-5c5cb9c4d7-rrbl2"] Mar 11 01:13:05 crc kubenswrapper[4744]: I0311 01:13:05.077652 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-658d4cdd5-gc9cm"] Mar 11 01:13:05 crc kubenswrapper[4744]: I0311 01:13:05.083569 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/neutron-operator-controller-manager-776c5696bf-wt7zj"] Mar 11 01:13:05 crc kubenswrapper[4744]: I0311 01:13:05.090003 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/swift-operator-controller-manager-677c674df7-h9rf2"] Mar 11 01:13:05 crc kubenswrapper[4744]: W0311 01:13:05.094586 4744 
manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcf1ef450_174d_43d9_b921_ce85078476b4.slice/crio-e5d4f794103e4f9ca088ba185528e1d69a87f82dc59bb623bb4038453817de1d WatchSource:0}: Error finding container e5d4f794103e4f9ca088ba185528e1d69a87f82dc59bb623bb4038453817de1d: Status 404 returned error can't find the container with id e5d4f794103e4f9ca088ba185528e1d69a87f82dc59bb623bb4038453817de1d Mar 11 01:13:05 crc kubenswrapper[4744]: I0311 01:13:05.094818 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/watcher-operator-controller-manager-6dd88c6f67-qtvcf"] Mar 11 01:13:05 crc kubenswrapper[4744]: I0311 01:13:05.099497 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ovn-operator-controller-manager-bbc5b68f9-mgfnd"] Mar 11 01:13:05 crc kubenswrapper[4744]: E0311 01:13:05.105618 4744 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/test-operator@sha256:43bd420bc05b4789243740bc75f61e10c7aac7883fc2f82b2d4d50085bc96c42,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-6kvf9,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod test-operator-controller-manager-5c5cb9c4d7-rrbl2_openstack-operators(cf1ef450-174d-43d9-b921-ce85078476b4): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Mar 11 01:13:05 crc kubenswrapper[4744]: E0311 01:13:05.106849 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-rrbl2" podUID="cf1ef450-174d-43d9-b921-ce85078476b4" Mar 11 01:13:05 crc 
kubenswrapper[4744]: W0311 01:13:05.107541 4744 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3a8c82be_b391_42a8_a16c_a99850c14b19.slice/crio-7c63a5eac74252947f67afaac9de8e7c1477411265a16b358b9dd719bf414142 WatchSource:0}: Error finding container 7c63a5eac74252947f67afaac9de8e7c1477411265a16b358b9dd719bf414142: Status 404 returned error can't find the container with id 7c63a5eac74252947f67afaac9de8e7c1477411265a16b358b9dd719bf414142 Mar 11 01:13:05 crc kubenswrapper[4744]: I0311 01:13:05.108609 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-924zq"] Mar 11 01:13:05 crc kubenswrapper[4744]: E0311 01:13:05.111101 4744 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/mariadb-operator@sha256:b99cd5e08bd85c6aaf717519187ba7bfeea359e1537d43b73a7364b7c38116e2,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-rhk2v,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod mariadb-operator-controller-manager-658d4cdd5-gc9cm_openstack-operators(3a8c82be-b391-42a8-a16c-a99850c14b19): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Mar 11 01:13:05 crc kubenswrapper[4744]: E0311 01:13:05.112206 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/mariadb-operator-controller-manager-658d4cdd5-gc9cm" podUID="3a8c82be-b391-42a8-a16c-a99850c14b19" Mar 11 01:13:05 crc kubenswrapper[4744]: W0311 01:13:05.112742 4744 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod27f5807c_3d27_4ffc_bcc7_c43a07d04fa1.slice/crio-f3386468e746ebe60f19a359a8d28602abf18615f9edb2463d52124d99eafada WatchSource:0}: Error finding container 
f3386468e746ebe60f19a359a8d28602abf18615f9edb2463d52124d99eafada: Status 404 returned error can't find the container with id f3386468e746ebe60f19a359a8d28602abf18615f9edb2463d52124d99eafada Mar 11 01:13:05 crc kubenswrapper[4744]: W0311 01:13:05.114359 4744 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc2d319ab_e093_4ff0_8a47_d73b7ffc8a68.slice/crio-ee14a3a4e1e801d2a691aa3dc31c54f8cb85b1c0815380a6f41e5ee18e849c03 WatchSource:0}: Error finding container ee14a3a4e1e801d2a691aa3dc31c54f8cb85b1c0815380a6f41e5ee18e849c03: Status 404 returned error can't find the container with id ee14a3a4e1e801d2a691aa3dc31c54f8cb85b1c0815380a6f41e5ee18e849c03 Mar 11 01:13:05 crc kubenswrapper[4744]: W0311 01:13:05.115345 4744 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0c8c19f0_aa8c_4ab0_9b6a_7600684d5bc8.slice/crio-ea24ee95f434e9704d9f74bc986459e3805a1028722882032f16b3a2d4e78ecf WatchSource:0}: Error finding container ea24ee95f434e9704d9f74bc986459e3805a1028722882032f16b3a2d4e78ecf: Status 404 returned error can't find the container with id ea24ee95f434e9704d9f74bc986459e3805a1028722882032f16b3a2d4e78ecf Mar 11 01:13:05 crc kubenswrapper[4744]: W0311 01:13:05.115909 4744 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5f30ecde_348a_43ce_980f_b27ebd7971bb.slice/crio-5fe11973c443ed0630eb2fc5c165df381f6bb8b0e587a06d77eedb5ec81f42d4 WatchSource:0}: Error finding container 5fe11973c443ed0630eb2fc5c165df381f6bb8b0e587a06d77eedb5ec81f42d4: Status 404 returned error can't find the container with id 5fe11973c443ed0630eb2fc5c165df381f6bb8b0e587a06d77eedb5ec81f42d4 Mar 11 01:13:05 crc kubenswrapper[4744]: E0311 01:13:05.117467 4744 kuberuntime_manager.go:1274] "Unhandled Error" err="container 
&Container{Name:manager,Image:quay.io/openstack-k8s-operators/swift-operator@sha256:c223309f51714785bd878ad04080f7428567edad793be4f992d492abd77af44c,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-x5598,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000660000,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod swift-operator-controller-manager-677c674df7-h9rf2_openstack-operators(0c8c19f0-aa8c-4ab0-9b6a-7600684d5bc8): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Mar 11 01:13:05 crc kubenswrapper[4744]: E0311 01:13:05.117622 4744 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/ovn-operator@sha256:2f63ddf5c95c6c82f6e04bc9f7f20d56dc003614647726ab00276239eec40b7f,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-m58sj,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod ovn-operator-controller-manager-bbc5b68f9-mgfnd_openstack-operators(c2d319ab-e093-4ff0-8a47-d73b7ffc8a68): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Mar 11 01:13:05 crc kubenswrapper[4744]: W0311 01:13:05.118189 4744 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod78b3e53a_dda4_4cc9_bd65_e2bcaedb3d2b.slice/crio-21725ecac41deace29fbf20fb1bacf8940cb49ffc1fd8e272158505841d922a1 WatchSource:0}: Error finding container 
21725ecac41deace29fbf20fb1bacf8940cb49ffc1fd8e272158505841d922a1: Status 404 returned error can't find the container with id 21725ecac41deace29fbf20fb1bacf8940cb49ffc1fd8e272158505841d922a1 Mar 11 01:13:05 crc kubenswrapper[4744]: E0311 01:13:05.118573 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/swift-operator-controller-manager-677c674df7-h9rf2" podUID="0c8c19f0-aa8c-4ab0-9b6a-7600684d5bc8" Mar 11 01:13:05 crc kubenswrapper[4744]: E0311 01:13:05.118681 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/ovn-operator-controller-manager-bbc5b68f9-mgfnd" podUID="c2d319ab-e093-4ff0-8a47-d73b7ffc8a68" Mar 11 01:13:05 crc kubenswrapper[4744]: E0311 01:13:05.119061 4744 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/watcher-operator@sha256:4af709a2a6a1a1abb9659dbdd6fb3818122bdec7e66009fcced0bf0949f91554,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-zqzjq,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod watcher-operator-controller-manager-6dd88c6f67-qtvcf_openstack-operators(27f5807c-3d27-4ffc-bcc7-c43a07d04fa1): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Mar 11 01:13:05 crc kubenswrapper[4744]: E0311 01:13:05.119186 4744 kuberuntime_manager.go:1274] "Unhandled Error" err="container 
&Container{Name:operator,Image:quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2,Command:[/manager],Args:[],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:metrics,HostPort:0,ContainerPort:9782,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:OPERATOR_NAMESPACE,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.namespace,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{200 -3} {} 200m DecimalSI},memory: {{524288000 0} {} 500Mi BinarySI},},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{67108864 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-nlxnk,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000660000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod rabbitmq-cluster-operator-manager-668c99d594-924zq_openstack-operators(5f30ecde-348a-43ce-980f-b27ebd7971bb): 
ErrImagePull: pull QPS exceeded" logger="UnhandledError" Mar 11 01:13:05 crc kubenswrapper[4744]: E0311 01:13:05.120253 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-924zq" podUID="5f30ecde-348a-43ce-980f-b27ebd7971bb" Mar 11 01:13:05 crc kubenswrapper[4744]: E0311 01:13:05.120279 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/watcher-operator-controller-manager-6dd88c6f67-qtvcf" podUID="27f5807c-3d27-4ffc-bcc7-c43a07d04fa1" Mar 11 01:13:05 crc kubenswrapper[4744]: E0311 01:13:05.120850 4744 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/placement-operator@sha256:e7e865363955c670e41b6c042c4f87abceff78f5495ba5c5c82988baad45c978,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-9vhww,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod placement-operator-controller-manager-574d45c66c-wk9q5_openstack-operators(78b3e53a-dda4-4cc9-bd65-e2bcaedb3d2b): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Mar 11 01:13:05 crc kubenswrapper[4744]: E0311 01:13:05.122782 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/placement-operator-controller-manager-574d45c66c-wk9q5" podUID="78b3e53a-dda4-4cc9-bd65-e2bcaedb3d2b" Mar 11 01:13:05 crc kubenswrapper[4744]: I0311 01:13:05.126022 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/daef3605-2bdc-4e16-b55f-61d2d3cfc2fd-cert\") pod \"infra-operator-controller-manager-5995f4446f-hxxgl\" (UID: \"daef3605-2bdc-4e16-b55f-61d2d3cfc2fd\") " 
pod="openstack-operators/infra-operator-controller-manager-5995f4446f-hxxgl" Mar 11 01:13:05 crc kubenswrapper[4744]: E0311 01:13:05.127233 4744 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Mar 11 01:13:05 crc kubenswrapper[4744]: E0311 01:13:05.127266 4744 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/daef3605-2bdc-4e16-b55f-61d2d3cfc2fd-cert podName:daef3605-2bdc-4e16-b55f-61d2d3cfc2fd nodeName:}" failed. No retries permitted until 2026-03-11 01:13:07.127253605 +0000 UTC m=+1143.931471200 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/daef3605-2bdc-4e16-b55f-61d2d3cfc2fd-cert") pod "infra-operator-controller-manager-5995f4446f-hxxgl" (UID: "daef3605-2bdc-4e16-b55f-61d2d3cfc2fd") : secret "infra-operator-webhook-server-cert" not found Mar 11 01:13:05 crc kubenswrapper[4744]: I0311 01:13:05.530468 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/ffd0a9c9-75f4-4721-a045-b4f9dc285388-cert\") pod \"openstack-baremetal-operator-controller-manager-6647d7885f8pxr2\" (UID: \"ffd0a9c9-75f4-4721-a045-b4f9dc285388\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-6647d7885f8pxr2" Mar 11 01:13:05 crc kubenswrapper[4744]: E0311 01:13:05.530658 4744 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 11 01:13:05 crc kubenswrapper[4744]: E0311 01:13:05.531254 4744 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ffd0a9c9-75f4-4721-a045-b4f9dc285388-cert podName:ffd0a9c9-75f4-4721-a045-b4f9dc285388 nodeName:}" failed. No retries permitted until 2026-03-11 01:13:07.530978313 +0000 UTC m=+1144.335195918 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/ffd0a9c9-75f4-4721-a045-b4f9dc285388-cert") pod "openstack-baremetal-operator-controller-manager-6647d7885f8pxr2" (UID: "ffd0a9c9-75f4-4721-a045-b4f9dc285388") : secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 11 01:13:05 crc kubenswrapper[4744]: I0311 01:13:05.631003 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-controller-manager-6dd88c6f67-qtvcf" event={"ID":"27f5807c-3d27-4ffc-bcc7-c43a07d04fa1","Type":"ContainerStarted","Data":"f3386468e746ebe60f19a359a8d28602abf18615f9edb2463d52124d99eafada"} Mar 11 01:13:05 crc kubenswrapper[4744]: I0311 01:13:05.632877 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/octavia-operator-controller-manager-5f4f55cb5c-fmvh8" event={"ID":"63676a12-7f22-401f-98e2-eb2495777d96","Type":"ContainerStarted","Data":"b5e0623f49398df14bcf7bbe496180437133db9f30d43ca8052399f0ff08ea50"} Mar 11 01:13:05 crc kubenswrapper[4744]: I0311 01:13:05.635372 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-924zq" event={"ID":"5f30ecde-348a-43ce-980f-b27ebd7971bb","Type":"ContainerStarted","Data":"5fe11973c443ed0630eb2fc5c165df381f6bb8b0e587a06d77eedb5ec81f42d4"} Mar 11 01:13:05 crc kubenswrapper[4744]: E0311 01:13:05.636922 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/watcher-operator@sha256:4af709a2a6a1a1abb9659dbdd6fb3818122bdec7e66009fcced0bf0949f91554\\\"\"" pod="openstack-operators/watcher-operator-controller-manager-6dd88c6f67-qtvcf" podUID="27f5807c-3d27-4ffc-bcc7-c43a07d04fa1" Mar 11 01:13:05 crc kubenswrapper[4744]: E0311 01:13:05.642838 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for 
\"operator\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2\\\"\"" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-924zq" podUID="5f30ecde-348a-43ce-980f-b27ebd7971bb" Mar 11 01:13:05 crc kubenswrapper[4744]: I0311 01:13:05.645244 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-684f77d66d-c946t" event={"ID":"f5ec1f31-33e9-47d8-91bc-450e319479a3","Type":"ContainerStarted","Data":"8f87df851dd31db86cf248a0b98f953d80ab488f0abc6ba8adf38fbbff1f5a00"} Mar 11 01:13:05 crc kubenswrapper[4744]: I0311 01:13:05.649704 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/designate-operator-controller-manager-66d56f6ff4-x5rps" event={"ID":"9dd5fbf6-4c71-4ed4-b31b-4e1d43c34c73","Type":"ContainerStarted","Data":"c92e549dfdd2af9df48ec99a4895970aef3396b8efb87916a6621486642b9090"} Mar 11 01:13:05 crc kubenswrapper[4744]: I0311 01:13:05.651878 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-658d4cdd5-gc9cm" event={"ID":"3a8c82be-b391-42a8-a16c-a99850c14b19","Type":"ContainerStarted","Data":"7c63a5eac74252947f67afaac9de8e7c1477411265a16b358b9dd719bf414142"} Mar 11 01:13:05 crc kubenswrapper[4744]: E0311 01:13:05.653898 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/mariadb-operator@sha256:b99cd5e08bd85c6aaf717519187ba7bfeea359e1537d43b73a7364b7c38116e2\\\"\"" pod="openstack-operators/mariadb-operator-controller-manager-658d4cdd5-gc9cm" podUID="3a8c82be-b391-42a8-a16c-a99850c14b19" Mar 11 01:13:05 crc kubenswrapper[4744]: I0311 01:13:05.654906 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack-operators/placement-operator-controller-manager-574d45c66c-wk9q5" event={"ID":"78b3e53a-dda4-4cc9-bd65-e2bcaedb3d2b","Type":"ContainerStarted","Data":"21725ecac41deace29fbf20fb1bacf8940cb49ffc1fd8e272158505841d922a1"} Mar 11 01:13:05 crc kubenswrapper[4744]: E0311 01:13:05.656107 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/placement-operator@sha256:e7e865363955c670e41b6c042c4f87abceff78f5495ba5c5c82988baad45c978\\\"\"" pod="openstack-operators/placement-operator-controller-manager-574d45c66c-wk9q5" podUID="78b3e53a-dda4-4cc9-bd65-e2bcaedb3d2b" Mar 11 01:13:05 crc kubenswrapper[4744]: I0311 01:13:05.660527 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/manila-operator-controller-manager-68f45f9d9f-6rtj8" event={"ID":"6b30e096-2cc6-41f3-aaca-c2d7a3d8b138","Type":"ContainerStarted","Data":"ad24a7f291a52609d9f6c3a7cc71e9a24f177c59f76af52bea456f0e470e675d"} Mar 11 01:13:05 crc kubenswrapper[4744]: I0311 01:13:05.663408 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ironic-operator-controller-manager-6bbb499bbc-m6668" event={"ID":"cf4745d2-2e5f-4150-9c60-34e91e5f1e80","Type":"ContainerStarted","Data":"c584e66e55503fa4bd46a15579572389b3beac02db8ea6b4ef57ee5c1fbf8eab"} Mar 11 01:13:05 crc kubenswrapper[4744]: I0311 01:13:05.663939 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/heat-operator-controller-manager-77b6666d85-gd9tg" event={"ID":"c1595560-d9f2-48bf-8b30-f8a36f13e1f4","Type":"ContainerStarted","Data":"bf68bc1b1e257ce09ccd56965562f93b6d5cb694e3d431a1e4f54a6c55f8695d"} Mar 11 01:13:05 crc kubenswrapper[4744]: I0311 01:13:05.664906 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-controller-manager-677c674df7-h9rf2" 
event={"ID":"0c8c19f0-aa8c-4ab0-9b6a-7600684d5bc8","Type":"ContainerStarted","Data":"ea24ee95f434e9704d9f74bc986459e3805a1028722882032f16b3a2d4e78ecf"} Mar 11 01:13:05 crc kubenswrapper[4744]: I0311 01:13:05.667324 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/nova-operator-controller-manager-569cc54c5-cp8jm" event={"ID":"6e760654-b96d-4979-a98f-dc162fc1b41e","Type":"ContainerStarted","Data":"b7bce7a9cf9e046651b2b950db60730a78b67fb53e66c62931d8eca23e2237e0"} Mar 11 01:13:05 crc kubenswrapper[4744]: E0311 01:13:05.670281 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/swift-operator@sha256:c223309f51714785bd878ad04080f7428567edad793be4f992d492abd77af44c\\\"\"" pod="openstack-operators/swift-operator-controller-manager-677c674df7-h9rf2" podUID="0c8c19f0-aa8c-4ab0-9b6a-7600684d5bc8" Mar 11 01:13:05 crc kubenswrapper[4744]: I0311 01:13:05.672672 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/telemetry-operator-controller-manager-6cd66dbd4b-qrwcx" event={"ID":"1b8a814c-cd04-49bf-b774-441c96a1faa4","Type":"ContainerStarted","Data":"ee0634939f7197688ae155ece7844ec2af26766a0ca0b348c581d606a5741a92"} Mar 11 01:13:05 crc kubenswrapper[4744]: I0311 01:13:05.674728 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/neutron-operator-controller-manager-776c5696bf-wt7zj" event={"ID":"431b2a18-7bf7-4dda-a49e-15bfa629e1f9","Type":"ContainerStarted","Data":"d206a3371e7c8acbd03f0fac8d4e87be799d2c06028370f83a42fd4f360bf2bd"} Mar 11 01:13:05 crc kubenswrapper[4744]: I0311 01:13:05.677478 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-rrbl2" 
event={"ID":"cf1ef450-174d-43d9-b921-ce85078476b4","Type":"ContainerStarted","Data":"e5d4f794103e4f9ca088ba185528e1d69a87f82dc59bb623bb4038453817de1d"} Mar 11 01:13:05 crc kubenswrapper[4744]: E0311 01:13:05.679761 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/test-operator@sha256:43bd420bc05b4789243740bc75f61e10c7aac7883fc2f82b2d4d50085bc96c42\\\"\"" pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-rrbl2" podUID="cf1ef450-174d-43d9-b921-ce85078476b4" Mar 11 01:13:05 crc kubenswrapper[4744]: I0311 01:13:05.692737 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-controller-manager-6d9d6b584d-pr4mx" event={"ID":"4b8d6717-6a3c-4421-83c8-c86ff18d1e3b","Type":"ContainerStarted","Data":"dbf5d988acb5be08a34486c0e2b273e2e8467a57de5069ba76f0b9ba58e0c6e9"} Mar 11 01:13:05 crc kubenswrapper[4744]: I0311 01:13:05.695370 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ovn-operator-controller-manager-bbc5b68f9-mgfnd" event={"ID":"c2d319ab-e093-4ff0-8a47-d73b7ffc8a68","Type":"ContainerStarted","Data":"ee14a3a4e1e801d2a691aa3dc31c54f8cb85b1c0815380a6f41e5ee18e849c03"} Mar 11 01:13:05 crc kubenswrapper[4744]: E0311 01:13:05.696773 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/ovn-operator@sha256:2f63ddf5c95c6c82f6e04bc9f7f20d56dc003614647726ab00276239eec40b7f\\\"\"" pod="openstack-operators/ovn-operator-controller-manager-bbc5b68f9-mgfnd" podUID="c2d319ab-e093-4ff0-8a47-d73b7ffc8a68" Mar 11 01:13:05 crc kubenswrapper[4744]: I0311 01:13:05.697061 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-controller-manager-5964f64c48-wgnqt" 
event={"ID":"115ee58b-0cd9-4993-b15e-226885cef1d8","Type":"ContainerStarted","Data":"1762e10dc49b39e85022635ff5ac05b2ef31cc95fbccca6e40be4665197556c5"} Mar 11 01:13:05 crc kubenswrapper[4744]: I0311 01:13:05.732391 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/f341be86-aa09-4703-aa02-29ef571ad003-webhook-certs\") pod \"openstack-operator-controller-manager-6679ddfdc7-8trgb\" (UID: \"f341be86-aa09-4703-aa02-29ef571ad003\") " pod="openstack-operators/openstack-operator-controller-manager-6679ddfdc7-8trgb" Mar 11 01:13:05 crc kubenswrapper[4744]: I0311 01:13:05.732569 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/f341be86-aa09-4703-aa02-29ef571ad003-metrics-certs\") pod \"openstack-operator-controller-manager-6679ddfdc7-8trgb\" (UID: \"f341be86-aa09-4703-aa02-29ef571ad003\") " pod="openstack-operators/openstack-operator-controller-manager-6679ddfdc7-8trgb" Mar 11 01:13:05 crc kubenswrapper[4744]: E0311 01:13:05.732720 4744 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Mar 11 01:13:05 crc kubenswrapper[4744]: E0311 01:13:05.732778 4744 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f341be86-aa09-4703-aa02-29ef571ad003-metrics-certs podName:f341be86-aa09-4703-aa02-29ef571ad003 nodeName:}" failed. No retries permitted until 2026-03-11 01:13:07.732759525 +0000 UTC m=+1144.536977130 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/f341be86-aa09-4703-aa02-29ef571ad003-metrics-certs") pod "openstack-operator-controller-manager-6679ddfdc7-8trgb" (UID: "f341be86-aa09-4703-aa02-29ef571ad003") : secret "metrics-server-cert" not found Mar 11 01:13:05 crc kubenswrapper[4744]: E0311 01:13:05.733668 4744 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Mar 11 01:13:05 crc kubenswrapper[4744]: E0311 01:13:05.733745 4744 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f341be86-aa09-4703-aa02-29ef571ad003-webhook-certs podName:f341be86-aa09-4703-aa02-29ef571ad003 nodeName:}" failed. No retries permitted until 2026-03-11 01:13:07.733729074 +0000 UTC m=+1144.537946679 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/f341be86-aa09-4703-aa02-29ef571ad003-webhook-certs") pod "openstack-operator-controller-manager-6679ddfdc7-8trgb" (UID: "f341be86-aa09-4703-aa02-29ef571ad003") : secret "webhook-server-cert" not found Mar 11 01:13:06 crc kubenswrapper[4744]: E0311 01:13:06.712146 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/swift-operator@sha256:c223309f51714785bd878ad04080f7428567edad793be4f992d492abd77af44c\\\"\"" pod="openstack-operators/swift-operator-controller-manager-677c674df7-h9rf2" podUID="0c8c19f0-aa8c-4ab0-9b6a-7600684d5bc8" Mar 11 01:13:06 crc kubenswrapper[4744]: E0311 01:13:06.712156 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/test-operator@sha256:43bd420bc05b4789243740bc75f61e10c7aac7883fc2f82b2d4d50085bc96c42\\\"\"" 
pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-rrbl2" podUID="cf1ef450-174d-43d9-b921-ce85078476b4" Mar 11 01:13:06 crc kubenswrapper[4744]: E0311 01:13:06.712197 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/watcher-operator@sha256:4af709a2a6a1a1abb9659dbdd6fb3818122bdec7e66009fcced0bf0949f91554\\\"\"" pod="openstack-operators/watcher-operator-controller-manager-6dd88c6f67-qtvcf" podUID="27f5807c-3d27-4ffc-bcc7-c43a07d04fa1" Mar 11 01:13:06 crc kubenswrapper[4744]: E0311 01:13:06.712237 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/mariadb-operator@sha256:b99cd5e08bd85c6aaf717519187ba7bfeea359e1537d43b73a7364b7c38116e2\\\"\"" pod="openstack-operators/mariadb-operator-controller-manager-658d4cdd5-gc9cm" podUID="3a8c82be-b391-42a8-a16c-a99850c14b19" Mar 11 01:13:06 crc kubenswrapper[4744]: E0311 01:13:06.712267 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2\\\"\"" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-924zq" podUID="5f30ecde-348a-43ce-980f-b27ebd7971bb" Mar 11 01:13:06 crc kubenswrapper[4744]: E0311 01:13:06.712305 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/placement-operator@sha256:e7e865363955c670e41b6c042c4f87abceff78f5495ba5c5c82988baad45c978\\\"\"" pod="openstack-operators/placement-operator-controller-manager-574d45c66c-wk9q5" 
podUID="78b3e53a-dda4-4cc9-bd65-e2bcaedb3d2b" Mar 11 01:13:06 crc kubenswrapper[4744]: E0311 01:13:06.712326 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/ovn-operator@sha256:2f63ddf5c95c6c82f6e04bc9f7f20d56dc003614647726ab00276239eec40b7f\\\"\"" pod="openstack-operators/ovn-operator-controller-manager-bbc5b68f9-mgfnd" podUID="c2d319ab-e093-4ff0-8a47-d73b7ffc8a68" Mar 11 01:13:07 crc kubenswrapper[4744]: I0311 01:13:07.157033 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/daef3605-2bdc-4e16-b55f-61d2d3cfc2fd-cert\") pod \"infra-operator-controller-manager-5995f4446f-hxxgl\" (UID: \"daef3605-2bdc-4e16-b55f-61d2d3cfc2fd\") " pod="openstack-operators/infra-operator-controller-manager-5995f4446f-hxxgl" Mar 11 01:13:07 crc kubenswrapper[4744]: E0311 01:13:07.157200 4744 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Mar 11 01:13:07 crc kubenswrapper[4744]: E0311 01:13:07.157554 4744 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/daef3605-2bdc-4e16-b55f-61d2d3cfc2fd-cert podName:daef3605-2bdc-4e16-b55f-61d2d3cfc2fd nodeName:}" failed. No retries permitted until 2026-03-11 01:13:11.157531598 +0000 UTC m=+1147.961749223 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/daef3605-2bdc-4e16-b55f-61d2d3cfc2fd-cert") pod "infra-operator-controller-manager-5995f4446f-hxxgl" (UID: "daef3605-2bdc-4e16-b55f-61d2d3cfc2fd") : secret "infra-operator-webhook-server-cert" not found Mar 11 01:13:07 crc kubenswrapper[4744]: I0311 01:13:07.563237 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/ffd0a9c9-75f4-4721-a045-b4f9dc285388-cert\") pod \"openstack-baremetal-operator-controller-manager-6647d7885f8pxr2\" (UID: \"ffd0a9c9-75f4-4721-a045-b4f9dc285388\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-6647d7885f8pxr2" Mar 11 01:13:07 crc kubenswrapper[4744]: E0311 01:13:07.563450 4744 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 11 01:13:07 crc kubenswrapper[4744]: E0311 01:13:07.563503 4744 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ffd0a9c9-75f4-4721-a045-b4f9dc285388-cert podName:ffd0a9c9-75f4-4721-a045-b4f9dc285388 nodeName:}" failed. No retries permitted until 2026-03-11 01:13:11.563486325 +0000 UTC m=+1148.367703930 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/ffd0a9c9-75f4-4721-a045-b4f9dc285388-cert") pod "openstack-baremetal-operator-controller-manager-6647d7885f8pxr2" (UID: "ffd0a9c9-75f4-4721-a045-b4f9dc285388") : secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 11 01:13:07 crc kubenswrapper[4744]: I0311 01:13:07.765941 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/f341be86-aa09-4703-aa02-29ef571ad003-webhook-certs\") pod \"openstack-operator-controller-manager-6679ddfdc7-8trgb\" (UID: \"f341be86-aa09-4703-aa02-29ef571ad003\") " pod="openstack-operators/openstack-operator-controller-manager-6679ddfdc7-8trgb" Mar 11 01:13:07 crc kubenswrapper[4744]: I0311 01:13:07.766045 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/f341be86-aa09-4703-aa02-29ef571ad003-metrics-certs\") pod \"openstack-operator-controller-manager-6679ddfdc7-8trgb\" (UID: \"f341be86-aa09-4703-aa02-29ef571ad003\") " pod="openstack-operators/openstack-operator-controller-manager-6679ddfdc7-8trgb" Mar 11 01:13:07 crc kubenswrapper[4744]: E0311 01:13:07.766188 4744 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Mar 11 01:13:07 crc kubenswrapper[4744]: E0311 01:13:07.766200 4744 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Mar 11 01:13:07 crc kubenswrapper[4744]: E0311 01:13:07.766242 4744 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f341be86-aa09-4703-aa02-29ef571ad003-metrics-certs podName:f341be86-aa09-4703-aa02-29ef571ad003 nodeName:}" failed. No retries permitted until 2026-03-11 01:13:11.766228576 +0000 UTC m=+1148.570446181 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/f341be86-aa09-4703-aa02-29ef571ad003-metrics-certs") pod "openstack-operator-controller-manager-6679ddfdc7-8trgb" (UID: "f341be86-aa09-4703-aa02-29ef571ad003") : secret "metrics-server-cert" not found Mar 11 01:13:07 crc kubenswrapper[4744]: E0311 01:13:07.766309 4744 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f341be86-aa09-4703-aa02-29ef571ad003-webhook-certs podName:f341be86-aa09-4703-aa02-29ef571ad003 nodeName:}" failed. No retries permitted until 2026-03-11 01:13:11.766277498 +0000 UTC m=+1148.570495143 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/f341be86-aa09-4703-aa02-29ef571ad003-webhook-certs") pod "openstack-operator-controller-manager-6679ddfdc7-8trgb" (UID: "f341be86-aa09-4703-aa02-29ef571ad003") : secret "webhook-server-cert" not found Mar 11 01:13:11 crc kubenswrapper[4744]: I0311 01:13:11.223947 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/daef3605-2bdc-4e16-b55f-61d2d3cfc2fd-cert\") pod \"infra-operator-controller-manager-5995f4446f-hxxgl\" (UID: \"daef3605-2bdc-4e16-b55f-61d2d3cfc2fd\") " pod="openstack-operators/infra-operator-controller-manager-5995f4446f-hxxgl" Mar 11 01:13:11 crc kubenswrapper[4744]: E0311 01:13:11.224106 4744 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Mar 11 01:13:11 crc kubenswrapper[4744]: E0311 01:13:11.224314 4744 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/daef3605-2bdc-4e16-b55f-61d2d3cfc2fd-cert podName:daef3605-2bdc-4e16-b55f-61d2d3cfc2fd nodeName:}" failed. No retries permitted until 2026-03-11 01:13:19.224297625 +0000 UTC m=+1156.028515230 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/daef3605-2bdc-4e16-b55f-61d2d3cfc2fd-cert") pod "infra-operator-controller-manager-5995f4446f-hxxgl" (UID: "daef3605-2bdc-4e16-b55f-61d2d3cfc2fd") : secret "infra-operator-webhook-server-cert" not found Mar 11 01:13:11 crc kubenswrapper[4744]: I0311 01:13:11.629306 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/ffd0a9c9-75f4-4721-a045-b4f9dc285388-cert\") pod \"openstack-baremetal-operator-controller-manager-6647d7885f8pxr2\" (UID: \"ffd0a9c9-75f4-4721-a045-b4f9dc285388\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-6647d7885f8pxr2" Mar 11 01:13:11 crc kubenswrapper[4744]: E0311 01:13:11.629421 4744 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 11 01:13:11 crc kubenswrapper[4744]: E0311 01:13:11.629464 4744 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ffd0a9c9-75f4-4721-a045-b4f9dc285388-cert podName:ffd0a9c9-75f4-4721-a045-b4f9dc285388 nodeName:}" failed. No retries permitted until 2026-03-11 01:13:19.629451537 +0000 UTC m=+1156.433669142 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/ffd0a9c9-75f4-4721-a045-b4f9dc285388-cert") pod "openstack-baremetal-operator-controller-manager-6647d7885f8pxr2" (UID: "ffd0a9c9-75f4-4721-a045-b4f9dc285388") : secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 11 01:13:11 crc kubenswrapper[4744]: I0311 01:13:11.831462 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/f341be86-aa09-4703-aa02-29ef571ad003-metrics-certs\") pod \"openstack-operator-controller-manager-6679ddfdc7-8trgb\" (UID: \"f341be86-aa09-4703-aa02-29ef571ad003\") " pod="openstack-operators/openstack-operator-controller-manager-6679ddfdc7-8trgb" Mar 11 01:13:11 crc kubenswrapper[4744]: E0311 01:13:11.831595 4744 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Mar 11 01:13:11 crc kubenswrapper[4744]: I0311 01:13:11.831891 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/f341be86-aa09-4703-aa02-29ef571ad003-webhook-certs\") pod \"openstack-operator-controller-manager-6679ddfdc7-8trgb\" (UID: \"f341be86-aa09-4703-aa02-29ef571ad003\") " pod="openstack-operators/openstack-operator-controller-manager-6679ddfdc7-8trgb" Mar 11 01:13:11 crc kubenswrapper[4744]: E0311 01:13:11.831908 4744 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f341be86-aa09-4703-aa02-29ef571ad003-metrics-certs podName:f341be86-aa09-4703-aa02-29ef571ad003 nodeName:}" failed. No retries permitted until 2026-03-11 01:13:19.831891239 +0000 UTC m=+1156.636108844 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/f341be86-aa09-4703-aa02-29ef571ad003-metrics-certs") pod "openstack-operator-controller-manager-6679ddfdc7-8trgb" (UID: "f341be86-aa09-4703-aa02-29ef571ad003") : secret "metrics-server-cert" not found Mar 11 01:13:11 crc kubenswrapper[4744]: E0311 01:13:11.832014 4744 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Mar 11 01:13:11 crc kubenswrapper[4744]: E0311 01:13:11.832059 4744 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f341be86-aa09-4703-aa02-29ef571ad003-webhook-certs podName:f341be86-aa09-4703-aa02-29ef571ad003 nodeName:}" failed. No retries permitted until 2026-03-11 01:13:19.832046274 +0000 UTC m=+1156.636263969 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/f341be86-aa09-4703-aa02-29ef571ad003-webhook-certs") pod "openstack-operator-controller-manager-6679ddfdc7-8trgb" (UID: "f341be86-aa09-4703-aa02-29ef571ad003") : secret "webhook-server-cert" not found Mar 11 01:13:12 crc kubenswrapper[4744]: I0311 01:13:12.409576 4744 patch_prober.go:28] interesting pod/machine-config-daemon-678nx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 11 01:13:12 crc kubenswrapper[4744]: I0311 01:13:12.409749 4744 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-678nx" podUID="a15dc7ac-7c34-4135-b6eb-a85122800ce9" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 11 01:13:15 crc kubenswrapper[4744]: I0311 01:13:15.778872 4744 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openstack-operators/barbican-operator-controller-manager-677bd678f7-xh4cc" event={"ID":"ab0713f2-46ce-4987-852a-18371473f327","Type":"ContainerStarted","Data":"841f836d482288b05720a2d968debed05287b1c3725f7ba1c30916f1e9d3ecda"} Mar 11 01:13:15 crc kubenswrapper[4744]: I0311 01:13:15.779395 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/barbican-operator-controller-manager-677bd678f7-xh4cc" Mar 11 01:13:15 crc kubenswrapper[4744]: I0311 01:13:15.780791 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/cinder-operator-controller-manager-984cd4dcf-7lh6p" event={"ID":"1a7f92f2-6b8c-4ea2-ab12-c9ebcb043600","Type":"ContainerStarted","Data":"c36fa2176992e6d4859cf0603c3677a5d41e988d4c84695e89eed4fd4c31c2da"} Mar 11 01:13:15 crc kubenswrapper[4744]: I0311 01:13:15.780933 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/cinder-operator-controller-manager-984cd4dcf-7lh6p" Mar 11 01:13:15 crc kubenswrapper[4744]: I0311 01:13:15.783021 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/designate-operator-controller-manager-66d56f6ff4-x5rps" Mar 11 01:13:15 crc kubenswrapper[4744]: I0311 01:13:15.802240 4744 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/barbican-operator-controller-manager-677bd678f7-xh4cc" podStartSLOduration=1.943579756 podStartE2EDuration="12.802228025s" podCreationTimestamp="2026-03-11 01:13:03 +0000 UTC" firstStartedPulling="2026-03-11 01:13:04.367934622 +0000 UTC m=+1141.172152227" lastFinishedPulling="2026-03-11 01:13:15.226582891 +0000 UTC m=+1152.030800496" observedRunningTime="2026-03-11 01:13:15.800832251 +0000 UTC m=+1152.605049856" watchObservedRunningTime="2026-03-11 01:13:15.802228025 +0000 UTC m=+1152.606445630" Mar 11 01:13:15 crc kubenswrapper[4744]: I0311 01:13:15.826090 4744 pod_startup_latency_tracker.go:104] 
"Observed pod startup duration" pod="openstack-operators/cinder-operator-controller-manager-984cd4dcf-7lh6p" podStartSLOduration=2.501189578 podStartE2EDuration="12.826074584s" podCreationTimestamp="2026-03-11 01:13:03 +0000 UTC" firstStartedPulling="2026-03-11 01:13:04.354586768 +0000 UTC m=+1141.158804373" lastFinishedPulling="2026-03-11 01:13:14.679471764 +0000 UTC m=+1151.483689379" observedRunningTime="2026-03-11 01:13:15.823260177 +0000 UTC m=+1152.627477782" watchObservedRunningTime="2026-03-11 01:13:15.826074584 +0000 UTC m=+1152.630292179" Mar 11 01:13:15 crc kubenswrapper[4744]: I0311 01:13:15.849814 4744 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/designate-operator-controller-manager-66d56f6ff4-x5rps" podStartSLOduration=2.395914802 podStartE2EDuration="12.849798641s" podCreationTimestamp="2026-03-11 01:13:03 +0000 UTC" firstStartedPulling="2026-03-11 01:13:04.772721873 +0000 UTC m=+1141.576939488" lastFinishedPulling="2026-03-11 01:13:15.226605722 +0000 UTC m=+1152.030823327" observedRunningTime="2026-03-11 01:13:15.844190487 +0000 UTC m=+1152.648408092" watchObservedRunningTime="2026-03-11 01:13:15.849798641 +0000 UTC m=+1152.654016236" Mar 11 01:13:16 crc kubenswrapper[4744]: I0311 01:13:16.790598 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/nova-operator-controller-manager-569cc54c5-cp8jm" event={"ID":"6e760654-b96d-4979-a98f-dc162fc1b41e","Type":"ContainerStarted","Data":"17a846f512f4a1fc401faeda66cfd7e466a84d9199aac78804230729d3fa9c02"} Mar 11 01:13:16 crc kubenswrapper[4744]: I0311 01:13:16.792077 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/nova-operator-controller-manager-569cc54c5-cp8jm" Mar 11 01:13:16 crc kubenswrapper[4744]: I0311 01:13:16.793772 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/manila-operator-controller-manager-68f45f9d9f-6rtj8" 
event={"ID":"6b30e096-2cc6-41f3-aaca-c2d7a3d8b138","Type":"ContainerStarted","Data":"439292f4fd92ff7b05ace8d98e2cf08c0e4248678e1113cbfcc37e8b579321b4"} Mar 11 01:13:16 crc kubenswrapper[4744]: I0311 01:13:16.793873 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/manila-operator-controller-manager-68f45f9d9f-6rtj8" Mar 11 01:13:16 crc kubenswrapper[4744]: I0311 01:13:16.795435 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/telemetry-operator-controller-manager-6cd66dbd4b-qrwcx" event={"ID":"1b8a814c-cd04-49bf-b774-441c96a1faa4","Type":"ContainerStarted","Data":"9bdd53998a534212842cc95bd8a510e0be9f38ad33bc0ff5d5b043e5bc2d174b"} Mar 11 01:13:16 crc kubenswrapper[4744]: I0311 01:13:16.795503 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/telemetry-operator-controller-manager-6cd66dbd4b-qrwcx" Mar 11 01:13:16 crc kubenswrapper[4744]: I0311 01:13:16.797008 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-controller-manager-5964f64c48-wgnqt" event={"ID":"115ee58b-0cd9-4993-b15e-226885cef1d8","Type":"ContainerStarted","Data":"d45bdb1c503ab978b6fc59a44305e598b88f32b95feeefa83aaae05c94a3f3f5"} Mar 11 01:13:16 crc kubenswrapper[4744]: I0311 01:13:16.797069 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/glance-operator-controller-manager-5964f64c48-wgnqt" Mar 11 01:13:16 crc kubenswrapper[4744]: I0311 01:13:16.798620 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/neutron-operator-controller-manager-776c5696bf-wt7zj" event={"ID":"431b2a18-7bf7-4dda-a49e-15bfa629e1f9","Type":"ContainerStarted","Data":"c7ad320396a6a52099b132563892e9f91ea54d8868d4045c4beb2a9f1ac082ae"} Mar 11 01:13:16 crc kubenswrapper[4744]: I0311 01:13:16.800329 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack-operators/keystone-operator-controller-manager-684f77d66d-c946t" event={"ID":"f5ec1f31-33e9-47d8-91bc-450e319479a3","Type":"ContainerStarted","Data":"d20bedf9c13745dcafeab893197d7350015014faac1fb5709ad30f73cd0da770"} Mar 11 01:13:16 crc kubenswrapper[4744]: I0311 01:13:16.800415 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/keystone-operator-controller-manager-684f77d66d-c946t" Mar 11 01:13:16 crc kubenswrapper[4744]: I0311 01:13:16.801796 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-controller-manager-6d9d6b584d-pr4mx" event={"ID":"4b8d6717-6a3c-4421-83c8-c86ff18d1e3b","Type":"ContainerStarted","Data":"359d7eb385280efc5b1183464efef6f75b991d1bd21160ff158bf2168ca950ac"} Mar 11 01:13:16 crc kubenswrapper[4744]: I0311 01:13:16.801949 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/horizon-operator-controller-manager-6d9d6b584d-pr4mx" Mar 11 01:13:16 crc kubenswrapper[4744]: I0311 01:13:16.803326 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ironic-operator-controller-manager-6bbb499bbc-m6668" event={"ID":"cf4745d2-2e5f-4150-9c60-34e91e5f1e80","Type":"ContainerStarted","Data":"3eadf2841164359bde124d7ca818573a472d99f1c93fdd2b9fac0be7c0bd80a6"} Mar 11 01:13:16 crc kubenswrapper[4744]: I0311 01:13:16.803437 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/ironic-operator-controller-manager-6bbb499bbc-m6668" Mar 11 01:13:16 crc kubenswrapper[4744]: I0311 01:13:16.804621 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/heat-operator-controller-manager-77b6666d85-gd9tg" event={"ID":"c1595560-d9f2-48bf-8b30-f8a36f13e1f4","Type":"ContainerStarted","Data":"1f91dd577b7f03d43c0e2bc1539aa162a31f66a06f7aa9a17d2fdb037f5c08e0"} Mar 11 01:13:16 crc kubenswrapper[4744]: I0311 01:13:16.804881 4744 kubelet.go:2542] 
"SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/heat-operator-controller-manager-77b6666d85-gd9tg" Mar 11 01:13:16 crc kubenswrapper[4744]: I0311 01:13:16.806258 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/octavia-operator-controller-manager-5f4f55cb5c-fmvh8" event={"ID":"63676a12-7f22-401f-98e2-eb2495777d96","Type":"ContainerStarted","Data":"5284e03e6c78d73a42903529427b00024d7d30ddc888cf82ce57c63b49b5bee7"} Mar 11 01:13:16 crc kubenswrapper[4744]: I0311 01:13:16.806383 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/octavia-operator-controller-manager-5f4f55cb5c-fmvh8" Mar 11 01:13:16 crc kubenswrapper[4744]: I0311 01:13:16.808069 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/designate-operator-controller-manager-66d56f6ff4-x5rps" event={"ID":"9dd5fbf6-4c71-4ed4-b31b-4e1d43c34c73","Type":"ContainerStarted","Data":"cf648127dad3091a2e34fcc51bb359ff8cf3364a34ef449b8b17337a81aa4291"} Mar 11 01:13:16 crc kubenswrapper[4744]: I0311 01:13:16.822266 4744 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/nova-operator-controller-manager-569cc54c5-cp8jm" podStartSLOduration=3.470220178 podStartE2EDuration="13.822250817s" podCreationTimestamp="2026-03-11 01:13:03 +0000 UTC" firstStartedPulling="2026-03-11 01:13:04.94374368 +0000 UTC m=+1141.747961285" lastFinishedPulling="2026-03-11 01:13:15.295774319 +0000 UTC m=+1152.099991924" observedRunningTime="2026-03-11 01:13:16.820643227 +0000 UTC m=+1153.624860832" watchObservedRunningTime="2026-03-11 01:13:16.822250817 +0000 UTC m=+1153.626468422" Mar 11 01:13:16 crc kubenswrapper[4744]: I0311 01:13:16.870820 4744 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/neutron-operator-controller-manager-776c5696bf-wt7zj" podStartSLOduration=3.648940255 podStartE2EDuration="13.870803404s" 
podCreationTimestamp="2026-03-11 01:13:03 +0000 UTC" firstStartedPulling="2026-03-11 01:13:05.093489357 +0000 UTC m=+1141.897706962" lastFinishedPulling="2026-03-11 01:13:15.315352506 +0000 UTC m=+1152.119570111" observedRunningTime="2026-03-11 01:13:16.864423546 +0000 UTC m=+1153.668641151" watchObservedRunningTime="2026-03-11 01:13:16.870803404 +0000 UTC m=+1153.675021009" Mar 11 01:13:16 crc kubenswrapper[4744]: I0311 01:13:16.908555 4744 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/octavia-operator-controller-manager-5f4f55cb5c-fmvh8" podStartSLOduration=4.180805059 podStartE2EDuration="13.908541845s" podCreationTimestamp="2026-03-11 01:13:03 +0000 UTC" firstStartedPulling="2026-03-11 01:13:04.951731768 +0000 UTC m=+1141.755949373" lastFinishedPulling="2026-03-11 01:13:14.679468554 +0000 UTC m=+1151.483686159" observedRunningTime="2026-03-11 01:13:16.906832172 +0000 UTC m=+1153.711049787" watchObservedRunningTime="2026-03-11 01:13:16.908541845 +0000 UTC m=+1153.712759450" Mar 11 01:13:16 crc kubenswrapper[4744]: I0311 01:13:16.977293 4744 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/horizon-operator-controller-manager-6d9d6b584d-pr4mx" podStartSLOduration=3.52504053 podStartE2EDuration="13.977280448s" podCreationTimestamp="2026-03-11 01:13:03 +0000 UTC" firstStartedPulling="2026-03-11 01:13:04.772982681 +0000 UTC m=+1141.577200316" lastFinishedPulling="2026-03-11 01:13:15.225222629 +0000 UTC m=+1152.029440234" observedRunningTime="2026-03-11 01:13:16.947636878 +0000 UTC m=+1153.751854483" watchObservedRunningTime="2026-03-11 01:13:16.977280448 +0000 UTC m=+1153.781498053" Mar 11 01:13:16 crc kubenswrapper[4744]: I0311 01:13:16.979351 4744 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/manila-operator-controller-manager-68f45f9d9f-6rtj8" podStartSLOduration=3.517351481 podStartE2EDuration="13.979346402s" 
podCreationTimestamp="2026-03-11 01:13:03 +0000 UTC" firstStartedPulling="2026-03-11 01:13:04.772680532 +0000 UTC m=+1141.576898147" lastFinishedPulling="2026-03-11 01:13:15.234675473 +0000 UTC m=+1152.038893068" observedRunningTime="2026-03-11 01:13:16.973354676 +0000 UTC m=+1153.777572281" watchObservedRunningTime="2026-03-11 01:13:16.979346402 +0000 UTC m=+1153.783564007" Mar 11 01:13:17 crc kubenswrapper[4744]: I0311 01:13:17.019414 4744 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/ironic-operator-controller-manager-6bbb499bbc-m6668" podStartSLOduration=4.112025955 podStartE2EDuration="14.019391925s" podCreationTimestamp="2026-03-11 01:13:03 +0000 UTC" firstStartedPulling="2026-03-11 01:13:04.772672682 +0000 UTC m=+1141.576890287" lastFinishedPulling="2026-03-11 01:13:14.680038652 +0000 UTC m=+1151.484256257" observedRunningTime="2026-03-11 01:13:17.000883341 +0000 UTC m=+1153.805100956" watchObservedRunningTime="2026-03-11 01:13:17.019391925 +0000 UTC m=+1153.823609530" Mar 11 01:13:17 crc kubenswrapper[4744]: I0311 01:13:17.022498 4744 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/glance-operator-controller-manager-5964f64c48-wgnqt" podStartSLOduration=3.545311179 podStartE2EDuration="14.022489351s" podCreationTimestamp="2026-03-11 01:13:03 +0000 UTC" firstStartedPulling="2026-03-11 01:13:04.772772755 +0000 UTC m=+1141.576990360" lastFinishedPulling="2026-03-11 01:13:15.249950927 +0000 UTC m=+1152.054168532" observedRunningTime="2026-03-11 01:13:17.016865906 +0000 UTC m=+1153.821083511" watchObservedRunningTime="2026-03-11 01:13:17.022489351 +0000 UTC m=+1153.826706956" Mar 11 01:13:17 crc kubenswrapper[4744]: I0311 01:13:17.046465 4744 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/keystone-operator-controller-manager-684f77d66d-c946t" podStartSLOduration=3.460090314 podStartE2EDuration="14.046449294s" 
podCreationTimestamp="2026-03-11 01:13:03 +0000 UTC" firstStartedPulling="2026-03-11 01:13:04.772785665 +0000 UTC m=+1141.577003260" lastFinishedPulling="2026-03-11 01:13:15.359144635 +0000 UTC m=+1152.163362240" observedRunningTime="2026-03-11 01:13:17.039105387 +0000 UTC m=+1153.843322992" watchObservedRunningTime="2026-03-11 01:13:17.046449294 +0000 UTC m=+1153.850666899" Mar 11 01:13:17 crc kubenswrapper[4744]: I0311 01:13:17.063907 4744 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/telemetry-operator-controller-manager-6cd66dbd4b-qrwcx" podStartSLOduration=3.756373289 podStartE2EDuration="14.063891836s" podCreationTimestamp="2026-03-11 01:13:03 +0000 UTC" firstStartedPulling="2026-03-11 01:13:04.942925975 +0000 UTC m=+1141.747143590" lastFinishedPulling="2026-03-11 01:13:15.250444532 +0000 UTC m=+1152.054662137" observedRunningTime="2026-03-11 01:13:17.061804541 +0000 UTC m=+1153.866022146" watchObservedRunningTime="2026-03-11 01:13:17.063891836 +0000 UTC m=+1153.868109441" Mar 11 01:13:17 crc kubenswrapper[4744]: I0311 01:13:17.082818 4744 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/heat-operator-controller-manager-77b6666d85-gd9tg" podStartSLOduration=3.810313491 podStartE2EDuration="14.082804552s" podCreationTimestamp="2026-03-11 01:13:03 +0000 UTC" firstStartedPulling="2026-03-11 01:13:04.952714968 +0000 UTC m=+1141.756932573" lastFinishedPulling="2026-03-11 01:13:15.225206029 +0000 UTC m=+1152.029423634" observedRunningTime="2026-03-11 01:13:17.078390986 +0000 UTC m=+1153.882608591" watchObservedRunningTime="2026-03-11 01:13:17.082804552 +0000 UTC m=+1153.887022157" Mar 11 01:13:17 crc kubenswrapper[4744]: I0311 01:13:17.819570 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/neutron-operator-controller-manager-776c5696bf-wt7zj" Mar 11 01:13:19 crc kubenswrapper[4744]: I0311 01:13:19.241183 4744 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/daef3605-2bdc-4e16-b55f-61d2d3cfc2fd-cert\") pod \"infra-operator-controller-manager-5995f4446f-hxxgl\" (UID: \"daef3605-2bdc-4e16-b55f-61d2d3cfc2fd\") " pod="openstack-operators/infra-operator-controller-manager-5995f4446f-hxxgl" Mar 11 01:13:19 crc kubenswrapper[4744]: I0311 01:13:19.261107 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/daef3605-2bdc-4e16-b55f-61d2d3cfc2fd-cert\") pod \"infra-operator-controller-manager-5995f4446f-hxxgl\" (UID: \"daef3605-2bdc-4e16-b55f-61d2d3cfc2fd\") " pod="openstack-operators/infra-operator-controller-manager-5995f4446f-hxxgl" Mar 11 01:13:19 crc kubenswrapper[4744]: I0311 01:13:19.339599 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"infra-operator-controller-manager-dockercfg-pt8kb" Mar 11 01:13:19 crc kubenswrapper[4744]: I0311 01:13:19.347811 4744 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/infra-operator-controller-manager-5995f4446f-hxxgl" Mar 11 01:13:19 crc kubenswrapper[4744]: I0311 01:13:19.645417 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/infra-operator-controller-manager-5995f4446f-hxxgl"] Mar 11 01:13:19 crc kubenswrapper[4744]: I0311 01:13:19.647846 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/ffd0a9c9-75f4-4721-a045-b4f9dc285388-cert\") pod \"openstack-baremetal-operator-controller-manager-6647d7885f8pxr2\" (UID: \"ffd0a9c9-75f4-4721-a045-b4f9dc285388\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-6647d7885f8pxr2" Mar 11 01:13:19 crc kubenswrapper[4744]: I0311 01:13:19.654999 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/ffd0a9c9-75f4-4721-a045-b4f9dc285388-cert\") pod \"openstack-baremetal-operator-controller-manager-6647d7885f8pxr2\" (UID: \"ffd0a9c9-75f4-4721-a045-b4f9dc285388\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-6647d7885f8pxr2" Mar 11 01:13:19 crc kubenswrapper[4744]: I0311 01:13:19.703777 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-baremetal-operator-controller-manager-dockercfg-rd267" Mar 11 01:13:19 crc kubenswrapper[4744]: I0311 01:13:19.711765 4744 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-baremetal-operator-controller-manager-6647d7885f8pxr2" Mar 11 01:13:19 crc kubenswrapper[4744]: I0311 01:13:19.832705 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-5995f4446f-hxxgl" event={"ID":"daef3605-2bdc-4e16-b55f-61d2d3cfc2fd","Type":"ContainerStarted","Data":"843775a3be7253b98da589f82985fcd020b9de6bedfa83173cbbe619da644b77"} Mar 11 01:13:19 crc kubenswrapper[4744]: I0311 01:13:19.834298 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-658d4cdd5-gc9cm" event={"ID":"3a8c82be-b391-42a8-a16c-a99850c14b19","Type":"ContainerStarted","Data":"8bf234ced0202c3bd81e0b0e9f214f867ebef7b4b9921877e2a0e9872bddad48"} Mar 11 01:13:19 crc kubenswrapper[4744]: I0311 01:13:19.834985 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/mariadb-operator-controller-manager-658d4cdd5-gc9cm" Mar 11 01:13:19 crc kubenswrapper[4744]: I0311 01:13:19.850629 4744 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/mariadb-operator-controller-manager-658d4cdd5-gc9cm" podStartSLOduration=2.97137227 podStartE2EDuration="16.850612672s" podCreationTimestamp="2026-03-11 01:13:03 +0000 UTC" firstStartedPulling="2026-03-11 01:13:05.1109697 +0000 UTC m=+1141.915187305" lastFinishedPulling="2026-03-11 01:13:18.990210102 +0000 UTC m=+1155.794427707" observedRunningTime="2026-03-11 01:13:19.849958851 +0000 UTC m=+1156.654176456" watchObservedRunningTime="2026-03-11 01:13:19.850612672 +0000 UTC m=+1156.654830277" Mar 11 01:13:19 crc kubenswrapper[4744]: E0311 01:13:19.852061 4744 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Mar 11 01:13:19 crc kubenswrapper[4744]: E0311 01:13:19.852159 4744 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/secret/f341be86-aa09-4703-aa02-29ef571ad003-metrics-certs podName:f341be86-aa09-4703-aa02-29ef571ad003 nodeName:}" failed. No retries permitted until 2026-03-11 01:13:35.852135908 +0000 UTC m=+1172.656353513 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/f341be86-aa09-4703-aa02-29ef571ad003-metrics-certs") pod "openstack-operator-controller-manager-6679ddfdc7-8trgb" (UID: "f341be86-aa09-4703-aa02-29ef571ad003") : secret "metrics-server-cert" not found Mar 11 01:13:19 crc kubenswrapper[4744]: I0311 01:13:19.852255 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/f341be86-aa09-4703-aa02-29ef571ad003-metrics-certs\") pod \"openstack-operator-controller-manager-6679ddfdc7-8trgb\" (UID: \"f341be86-aa09-4703-aa02-29ef571ad003\") " pod="openstack-operators/openstack-operator-controller-manager-6679ddfdc7-8trgb" Mar 11 01:13:19 crc kubenswrapper[4744]: I0311 01:13:19.852429 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/f341be86-aa09-4703-aa02-29ef571ad003-webhook-certs\") pod \"openstack-operator-controller-manager-6679ddfdc7-8trgb\" (UID: \"f341be86-aa09-4703-aa02-29ef571ad003\") " pod="openstack-operators/openstack-operator-controller-manager-6679ddfdc7-8trgb" Mar 11 01:13:19 crc kubenswrapper[4744]: E0311 01:13:19.852898 4744 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Mar 11 01:13:19 crc kubenswrapper[4744]: E0311 01:13:19.852945 4744 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f341be86-aa09-4703-aa02-29ef571ad003-webhook-certs podName:f341be86-aa09-4703-aa02-29ef571ad003 nodeName:}" failed. 
No retries permitted until 2026-03-11 01:13:35.852929463 +0000 UTC m=+1172.657147068 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/f341be86-aa09-4703-aa02-29ef571ad003-webhook-certs") pod "openstack-operator-controller-manager-6679ddfdc7-8trgb" (UID: "f341be86-aa09-4703-aa02-29ef571ad003") : secret "webhook-server-cert" not found Mar 11 01:13:20 crc kubenswrapper[4744]: I0311 01:13:20.143958 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-6647d7885f8pxr2"] Mar 11 01:13:20 crc kubenswrapper[4744]: I0311 01:13:20.841502 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-baremetal-operator-controller-manager-6647d7885f8pxr2" event={"ID":"ffd0a9c9-75f4-4721-a045-b4f9dc285388","Type":"ContainerStarted","Data":"e06b8606026420210454e2047c297ad0a9038258eaa6c41f0b911f8fecaa690d"} Mar 11 01:13:21 crc kubenswrapper[4744]: I0311 01:13:21.857343 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/placement-operator-controller-manager-574d45c66c-wk9q5" event={"ID":"78b3e53a-dda4-4cc9-bd65-e2bcaedb3d2b","Type":"ContainerStarted","Data":"bec42c816724816f242b3cb53dca41444046f7524216449febe8301508a291c0"} Mar 11 01:13:21 crc kubenswrapper[4744]: I0311 01:13:21.858732 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/placement-operator-controller-manager-574d45c66c-wk9q5" Mar 11 01:13:21 crc kubenswrapper[4744]: I0311 01:13:21.867629 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-rrbl2" event={"ID":"cf1ef450-174d-43d9-b921-ce85078476b4","Type":"ContainerStarted","Data":"4b452a483e3dcdf0c737a84448616228ca2def56ec9e363556f84e10c6f80d2b"} Mar 11 01:13:21 crc kubenswrapper[4744]: I0311 01:13:21.868333 4744 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="" pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-rrbl2" Mar 11 01:13:21 crc kubenswrapper[4744]: I0311 01:13:21.885966 4744 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/placement-operator-controller-manager-574d45c66c-wk9q5" podStartSLOduration=2.611596585 podStartE2EDuration="18.885950971s" podCreationTimestamp="2026-03-11 01:13:03 +0000 UTC" firstStartedPulling="2026-03-11 01:13:05.120784904 +0000 UTC m=+1141.925002499" lastFinishedPulling="2026-03-11 01:13:21.39513928 +0000 UTC m=+1158.199356885" observedRunningTime="2026-03-11 01:13:21.881640597 +0000 UTC m=+1158.685858202" watchObservedRunningTime="2026-03-11 01:13:21.885950971 +0000 UTC m=+1158.690168576" Mar 11 01:13:23 crc kubenswrapper[4744]: I0311 01:13:23.597017 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/barbican-operator-controller-manager-677bd678f7-xh4cc" Mar 11 01:13:23 crc kubenswrapper[4744]: I0311 01:13:23.625078 4744 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-rrbl2" podStartSLOduration=4.32936321 podStartE2EDuration="20.625042158s" podCreationTimestamp="2026-03-11 01:13:03 +0000 UTC" firstStartedPulling="2026-03-11 01:13:05.105429418 +0000 UTC m=+1141.909647023" lastFinishedPulling="2026-03-11 01:13:21.401108366 +0000 UTC m=+1158.205325971" observedRunningTime="2026-03-11 01:13:21.904377412 +0000 UTC m=+1158.708595047" watchObservedRunningTime="2026-03-11 01:13:23.625042158 +0000 UTC m=+1160.429259803" Mar 11 01:13:23 crc kubenswrapper[4744]: I0311 01:13:23.626743 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/cinder-operator-controller-manager-984cd4dcf-7lh6p" Mar 11 01:13:23 crc kubenswrapper[4744]: I0311 01:13:23.634239 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openstack-operators/designate-operator-controller-manager-66d56f6ff4-x5rps" Mar 11 01:13:23 crc kubenswrapper[4744]: I0311 01:13:23.663588 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/glance-operator-controller-manager-5964f64c48-wgnqt" Mar 11 01:13:23 crc kubenswrapper[4744]: I0311 01:13:23.713674 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/heat-operator-controller-manager-77b6666d85-gd9tg" Mar 11 01:13:23 crc kubenswrapper[4744]: I0311 01:13:23.728064 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/horizon-operator-controller-manager-6d9d6b584d-pr4mx" Mar 11 01:13:23 crc kubenswrapper[4744]: I0311 01:13:23.763713 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/ironic-operator-controller-manager-6bbb499bbc-m6668" Mar 11 01:13:23 crc kubenswrapper[4744]: I0311 01:13:23.830195 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/keystone-operator-controller-manager-684f77d66d-c946t" Mar 11 01:13:23 crc kubenswrapper[4744]: I0311 01:13:23.887779 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-5995f4446f-hxxgl" event={"ID":"daef3605-2bdc-4e16-b55f-61d2d3cfc2fd","Type":"ContainerStarted","Data":"e12e17dfe072feb403f32e8981fa5d2dca53e0a5c4205ec567768d9db8a6ee1b"} Mar 11 01:13:23 crc kubenswrapper[4744]: I0311 01:13:23.888635 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/infra-operator-controller-manager-5995f4446f-hxxgl" Mar 11 01:13:23 crc kubenswrapper[4744]: I0311 01:13:23.907652 4744 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/infra-operator-controller-manager-5995f4446f-hxxgl" podStartSLOduration=17.759533996000002 
podStartE2EDuration="20.907633817s" podCreationTimestamp="2026-03-11 01:13:03 +0000 UTC" firstStartedPulling="2026-03-11 01:13:19.686177158 +0000 UTC m=+1156.490394763" lastFinishedPulling="2026-03-11 01:13:22.834276979 +0000 UTC m=+1159.638494584" observedRunningTime="2026-03-11 01:13:23.90388053 +0000 UTC m=+1160.708098135" watchObservedRunningTime="2026-03-11 01:13:23.907633817 +0000 UTC m=+1160.711851422" Mar 11 01:13:23 crc kubenswrapper[4744]: I0311 01:13:23.928652 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/manila-operator-controller-manager-68f45f9d9f-6rtj8" Mar 11 01:13:23 crc kubenswrapper[4744]: I0311 01:13:23.976167 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/nova-operator-controller-manager-569cc54c5-cp8jm" Mar 11 01:13:24 crc kubenswrapper[4744]: I0311 01:13:24.068996 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/octavia-operator-controller-manager-5f4f55cb5c-fmvh8" Mar 11 01:13:24 crc kubenswrapper[4744]: I0311 01:13:24.108831 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/neutron-operator-controller-manager-776c5696bf-wt7zj" Mar 11 01:13:24 crc kubenswrapper[4744]: I0311 01:13:24.181090 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/telemetry-operator-controller-manager-6cd66dbd4b-qrwcx" Mar 11 01:13:29 crc kubenswrapper[4744]: I0311 01:13:29.359395 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/infra-operator-controller-manager-5995f4446f-hxxgl" Mar 11 01:13:31 crc kubenswrapper[4744]: I0311 01:13:31.953820 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-baremetal-operator-controller-manager-6647d7885f8pxr2" 
event={"ID":"ffd0a9c9-75f4-4721-a045-b4f9dc285388","Type":"ContainerStarted","Data":"3448a9b53091979777e4c08da1ce1db3f98ac121dbcc57cee2f816b94946ecfb"} Mar 11 01:13:31 crc kubenswrapper[4744]: I0311 01:13:31.954422 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-baremetal-operator-controller-manager-6647d7885f8pxr2" Mar 11 01:13:31 crc kubenswrapper[4744]: I0311 01:13:31.955301 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-controller-manager-677c674df7-h9rf2" event={"ID":"0c8c19f0-aa8c-4ab0-9b6a-7600684d5bc8","Type":"ContainerStarted","Data":"f4af36b2d235562f1df26c8e9da72a7fd402b23a6a5e7dec94ed82f214d495d6"} Mar 11 01:13:31 crc kubenswrapper[4744]: I0311 01:13:31.955485 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/swift-operator-controller-manager-677c674df7-h9rf2" Mar 11 01:13:31 crc kubenswrapper[4744]: I0311 01:13:31.956564 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-924zq" event={"ID":"5f30ecde-348a-43ce-980f-b27ebd7971bb","Type":"ContainerStarted","Data":"06e9e2b682e7cefaad71af874e2d520fce5f98b21e1a5b188abc014705fa9450"} Mar 11 01:13:31 crc kubenswrapper[4744]: I0311 01:13:31.958237 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ovn-operator-controller-manager-bbc5b68f9-mgfnd" event={"ID":"c2d319ab-e093-4ff0-8a47-d73b7ffc8a68","Type":"ContainerStarted","Data":"ca4604a3438b1ad321510e81c4678ae70fb9efa08b7468e3e87465d200a03601"} Mar 11 01:13:31 crc kubenswrapper[4744]: I0311 01:13:31.958424 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/ovn-operator-controller-manager-bbc5b68f9-mgfnd" Mar 11 01:13:31 crc kubenswrapper[4744]: I0311 01:13:31.959823 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack-operators/watcher-operator-controller-manager-6dd88c6f67-qtvcf" event={"ID":"27f5807c-3d27-4ffc-bcc7-c43a07d04fa1","Type":"ContainerStarted","Data":"de64efaffc6922cf51e53654c0ae0b0c963f8cae748269478c81327d68f8bb03"} Mar 11 01:13:31 crc kubenswrapper[4744]: I0311 01:13:31.959975 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/watcher-operator-controller-manager-6dd88c6f67-qtvcf" Mar 11 01:13:31 crc kubenswrapper[4744]: I0311 01:13:31.993889 4744 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-baremetal-operator-controller-manager-6647d7885f8pxr2" podStartSLOduration=19.049792359 podStartE2EDuration="28.993869968s" podCreationTimestamp="2026-03-11 01:13:03 +0000 UTC" firstStartedPulling="2026-03-11 01:13:20.712446985 +0000 UTC m=+1157.516664590" lastFinishedPulling="2026-03-11 01:13:30.656524594 +0000 UTC m=+1167.460742199" observedRunningTime="2026-03-11 01:13:31.989044149 +0000 UTC m=+1168.793261774" watchObservedRunningTime="2026-03-11 01:13:31.993869968 +0000 UTC m=+1168.798087583" Mar 11 01:13:32 crc kubenswrapper[4744]: I0311 01:13:32.009148 4744 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/watcher-operator-controller-manager-6dd88c6f67-qtvcf" podStartSLOduration=3.472015819 podStartE2EDuration="29.009127831s" podCreationTimestamp="2026-03-11 01:13:03 +0000 UTC" firstStartedPulling="2026-03-11 01:13:05.118987428 +0000 UTC m=+1141.923205033" lastFinishedPulling="2026-03-11 01:13:30.65609943 +0000 UTC m=+1167.460317045" observedRunningTime="2026-03-11 01:13:32.007035576 +0000 UTC m=+1168.811253181" watchObservedRunningTime="2026-03-11 01:13:32.009127831 +0000 UTC m=+1168.813345446" Mar 11 01:13:32 crc kubenswrapper[4744]: I0311 01:13:32.027675 4744 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-924zq" 
podStartSLOduration=2.490222183 podStartE2EDuration="28.027648685s" podCreationTimestamp="2026-03-11 01:13:04 +0000 UTC" firstStartedPulling="2026-03-11 01:13:05.119130653 +0000 UTC m=+1141.923348258" lastFinishedPulling="2026-03-11 01:13:30.656557155 +0000 UTC m=+1167.460774760" observedRunningTime="2026-03-11 01:13:32.021882086 +0000 UTC m=+1168.826099701" watchObservedRunningTime="2026-03-11 01:13:32.027648685 +0000 UTC m=+1168.831866300" Mar 11 01:13:32 crc kubenswrapper[4744]: I0311 01:13:32.039065 4744 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/ovn-operator-controller-manager-bbc5b68f9-mgfnd" podStartSLOduration=3.45048378 podStartE2EDuration="29.039050849s" podCreationTimestamp="2026-03-11 01:13:03 +0000 UTC" firstStartedPulling="2026-03-11 01:13:05.117563264 +0000 UTC m=+1141.921780869" lastFinishedPulling="2026-03-11 01:13:30.706130323 +0000 UTC m=+1167.510347938" observedRunningTime="2026-03-11 01:13:32.038729298 +0000 UTC m=+1168.842946903" watchObservedRunningTime="2026-03-11 01:13:32.039050849 +0000 UTC m=+1168.843268454" Mar 11 01:13:32 crc kubenswrapper[4744]: I0311 01:13:32.058742 4744 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/swift-operator-controller-manager-677c674df7-h9rf2" podStartSLOduration=3.48246076 podStartE2EDuration="29.058723688s" podCreationTimestamp="2026-03-11 01:13:03 +0000 UTC" firstStartedPulling="2026-03-11 01:13:05.117392538 +0000 UTC m=+1141.921610143" lastFinishedPulling="2026-03-11 01:13:30.693655466 +0000 UTC m=+1167.497873071" observedRunningTime="2026-03-11 01:13:32.056449798 +0000 UTC m=+1168.860667443" watchObservedRunningTime="2026-03-11 01:13:32.058723688 +0000 UTC m=+1168.862941293" Mar 11 01:13:33 crc kubenswrapper[4744]: I0311 01:13:33.944463 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/mariadb-operator-controller-manager-658d4cdd5-gc9cm" Mar 11 01:13:34 crc 
kubenswrapper[4744]: I0311 01:13:34.181479 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/placement-operator-controller-manager-574d45c66c-wk9q5" Mar 11 01:13:34 crc kubenswrapper[4744]: I0311 01:13:34.250427 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-rrbl2" Mar 11 01:13:35 crc kubenswrapper[4744]: I0311 01:13:35.943042 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/f341be86-aa09-4703-aa02-29ef571ad003-metrics-certs\") pod \"openstack-operator-controller-manager-6679ddfdc7-8trgb\" (UID: \"f341be86-aa09-4703-aa02-29ef571ad003\") " pod="openstack-operators/openstack-operator-controller-manager-6679ddfdc7-8trgb" Mar 11 01:13:35 crc kubenswrapper[4744]: I0311 01:13:35.943685 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/f341be86-aa09-4703-aa02-29ef571ad003-webhook-certs\") pod \"openstack-operator-controller-manager-6679ddfdc7-8trgb\" (UID: \"f341be86-aa09-4703-aa02-29ef571ad003\") " pod="openstack-operators/openstack-operator-controller-manager-6679ddfdc7-8trgb" Mar 11 01:13:35 crc kubenswrapper[4744]: I0311 01:13:35.952361 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/f341be86-aa09-4703-aa02-29ef571ad003-webhook-certs\") pod \"openstack-operator-controller-manager-6679ddfdc7-8trgb\" (UID: \"f341be86-aa09-4703-aa02-29ef571ad003\") " pod="openstack-operators/openstack-operator-controller-manager-6679ddfdc7-8trgb" Mar 11 01:13:35 crc kubenswrapper[4744]: I0311 01:13:35.953114 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/f341be86-aa09-4703-aa02-29ef571ad003-metrics-certs\") pod 
\"openstack-operator-controller-manager-6679ddfdc7-8trgb\" (UID: \"f341be86-aa09-4703-aa02-29ef571ad003\") " pod="openstack-operators/openstack-operator-controller-manager-6679ddfdc7-8trgb" Mar 11 01:13:36 crc kubenswrapper[4744]: I0311 01:13:36.222481 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-controller-manager-6679ddfdc7-8trgb" Mar 11 01:13:36 crc kubenswrapper[4744]: I0311 01:13:36.552270 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-manager-6679ddfdc7-8trgb"] Mar 11 01:13:37 crc kubenswrapper[4744]: I0311 01:13:37.015434 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-manager-6679ddfdc7-8trgb" event={"ID":"f341be86-aa09-4703-aa02-29ef571ad003","Type":"ContainerStarted","Data":"6789d902fdb2c1c6bb33261fe779eb05b469c65da8542a9c8937116bd3edc053"} Mar 11 01:13:39 crc kubenswrapper[4744]: I0311 01:13:39.717565 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-baremetal-operator-controller-manager-6647d7885f8pxr2" Mar 11 01:13:42 crc kubenswrapper[4744]: I0311 01:13:42.412363 4744 patch_prober.go:28] interesting pod/machine-config-daemon-678nx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 11 01:13:42 crc kubenswrapper[4744]: I0311 01:13:42.413108 4744 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-678nx" podUID="a15dc7ac-7c34-4135-b6eb-a85122800ce9" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 11 01:13:43 crc kubenswrapper[4744]: I0311 01:13:43.078484 
4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-manager-6679ddfdc7-8trgb" event={"ID":"f341be86-aa09-4703-aa02-29ef571ad003","Type":"ContainerStarted","Data":"657cc056b83ebc1a8ba1ed18e33a1773bf99115f7f62cbbb73f2db149986c86b"} Mar 11 01:13:43 crc kubenswrapper[4744]: I0311 01:13:43.080834 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-controller-manager-6679ddfdc7-8trgb" Mar 11 01:13:43 crc kubenswrapper[4744]: I0311 01:13:43.143481 4744 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-controller-manager-6679ddfdc7-8trgb" podStartSLOduration=40.143451608 podStartE2EDuration="40.143451608s" podCreationTimestamp="2026-03-11 01:13:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-11 01:13:43.125351877 +0000 UTC m=+1179.929569522" watchObservedRunningTime="2026-03-11 01:13:43.143451608 +0000 UTC m=+1179.947669253" Mar 11 01:13:44 crc kubenswrapper[4744]: I0311 01:13:44.134056 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/swift-operator-controller-manager-677c674df7-h9rf2" Mar 11 01:13:44 crc kubenswrapper[4744]: I0311 01:13:44.153580 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/ovn-operator-controller-manager-bbc5b68f9-mgfnd" Mar 11 01:13:44 crc kubenswrapper[4744]: I0311 01:13:44.395101 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/watcher-operator-controller-manager-6dd88c6f67-qtvcf" Mar 11 01:13:56 crc kubenswrapper[4744]: I0311 01:13:56.233853 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-controller-manager-6679ddfdc7-8trgb" Mar 11 01:14:00 crc 
kubenswrapper[4744]: I0311 01:14:00.153395 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29553194-5pbl5"] Mar 11 01:14:00 crc kubenswrapper[4744]: I0311 01:14:00.155999 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29553194-5pbl5" Mar 11 01:14:00 crc kubenswrapper[4744]: I0311 01:14:00.157401 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-rs77s" Mar 11 01:14:00 crc kubenswrapper[4744]: I0311 01:14:00.158290 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 11 01:14:00 crc kubenswrapper[4744]: I0311 01:14:00.159489 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 11 01:14:00 crc kubenswrapper[4744]: I0311 01:14:00.160497 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29553194-5pbl5"] Mar 11 01:14:00 crc kubenswrapper[4744]: I0311 01:14:00.252150 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qpd2k\" (UniqueName: \"kubernetes.io/projected/754f2936-6e2b-47b4-85c8-3c0e9db82b51-kube-api-access-qpd2k\") pod \"auto-csr-approver-29553194-5pbl5\" (UID: \"754f2936-6e2b-47b4-85c8-3c0e9db82b51\") " pod="openshift-infra/auto-csr-approver-29553194-5pbl5" Mar 11 01:14:00 crc kubenswrapper[4744]: I0311 01:14:00.354139 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qpd2k\" (UniqueName: \"kubernetes.io/projected/754f2936-6e2b-47b4-85c8-3c0e9db82b51-kube-api-access-qpd2k\") pod \"auto-csr-approver-29553194-5pbl5\" (UID: \"754f2936-6e2b-47b4-85c8-3c0e9db82b51\") " pod="openshift-infra/auto-csr-approver-29553194-5pbl5" Mar 11 01:14:00 crc kubenswrapper[4744]: I0311 01:14:00.387546 4744 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-qpd2k\" (UniqueName: \"kubernetes.io/projected/754f2936-6e2b-47b4-85c8-3c0e9db82b51-kube-api-access-qpd2k\") pod \"auto-csr-approver-29553194-5pbl5\" (UID: \"754f2936-6e2b-47b4-85c8-3c0e9db82b51\") " pod="openshift-infra/auto-csr-approver-29553194-5pbl5" Mar 11 01:14:00 crc kubenswrapper[4744]: I0311 01:14:00.487099 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29553194-5pbl5" Mar 11 01:14:00 crc kubenswrapper[4744]: I0311 01:14:00.988421 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29553194-5pbl5"] Mar 11 01:14:01 crc kubenswrapper[4744]: I0311 01:14:01.300228 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29553194-5pbl5" event={"ID":"754f2936-6e2b-47b4-85c8-3c0e9db82b51","Type":"ContainerStarted","Data":"10ffb688c7c0c4ecee650ba755e6d7a4b37e0de881697a2c3d495bb3917cb5e6"} Mar 11 01:14:03 crc kubenswrapper[4744]: I0311 01:14:03.323655 4744 generic.go:334] "Generic (PLEG): container finished" podID="754f2936-6e2b-47b4-85c8-3c0e9db82b51" containerID="5411455bceeecea05cde5ae8ff02a93d3a0c41fb95c9dd82603494c2dfeb96a3" exitCode=0 Mar 11 01:14:03 crc kubenswrapper[4744]: I0311 01:14:03.323738 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29553194-5pbl5" event={"ID":"754f2936-6e2b-47b4-85c8-3c0e9db82b51","Type":"ContainerDied","Data":"5411455bceeecea05cde5ae8ff02a93d3a0c41fb95c9dd82603494c2dfeb96a3"} Mar 11 01:14:04 crc kubenswrapper[4744]: I0311 01:14:04.704020 4744 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29553194-5pbl5" Mar 11 01:14:04 crc kubenswrapper[4744]: I0311 01:14:04.826766 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qpd2k\" (UniqueName: \"kubernetes.io/projected/754f2936-6e2b-47b4-85c8-3c0e9db82b51-kube-api-access-qpd2k\") pod \"754f2936-6e2b-47b4-85c8-3c0e9db82b51\" (UID: \"754f2936-6e2b-47b4-85c8-3c0e9db82b51\") " Mar 11 01:14:04 crc kubenswrapper[4744]: I0311 01:14:04.832182 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/754f2936-6e2b-47b4-85c8-3c0e9db82b51-kube-api-access-qpd2k" (OuterVolumeSpecName: "kube-api-access-qpd2k") pod "754f2936-6e2b-47b4-85c8-3c0e9db82b51" (UID: "754f2936-6e2b-47b4-85c8-3c0e9db82b51"). InnerVolumeSpecName "kube-api-access-qpd2k". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 01:14:04 crc kubenswrapper[4744]: I0311 01:14:04.928665 4744 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qpd2k\" (UniqueName: \"kubernetes.io/projected/754f2936-6e2b-47b4-85c8-3c0e9db82b51-kube-api-access-qpd2k\") on node \"crc\" DevicePath \"\"" Mar 11 01:14:05 crc kubenswrapper[4744]: I0311 01:14:05.341989 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29553194-5pbl5" event={"ID":"754f2936-6e2b-47b4-85c8-3c0e9db82b51","Type":"ContainerDied","Data":"10ffb688c7c0c4ecee650ba755e6d7a4b37e0de881697a2c3d495bb3917cb5e6"} Mar 11 01:14:05 crc kubenswrapper[4744]: I0311 01:14:05.342040 4744 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="10ffb688c7c0c4ecee650ba755e6d7a4b37e0de881697a2c3d495bb3917cb5e6" Mar 11 01:14:05 crc kubenswrapper[4744]: I0311 01:14:05.342614 4744 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29553194-5pbl5" Mar 11 01:14:05 crc kubenswrapper[4744]: I0311 01:14:05.787625 4744 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29553188-qb5tx"] Mar 11 01:14:05 crc kubenswrapper[4744]: I0311 01:14:05.796134 4744 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29553188-qb5tx"] Mar 11 01:14:05 crc kubenswrapper[4744]: I0311 01:14:05.989889 4744 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="682147c3-ee15-4c94-801a-d40279ffbb5b" path="/var/lib/kubelet/pods/682147c3-ee15-4c94-801a-d40279ffbb5b/volumes" Mar 11 01:14:12 crc kubenswrapper[4744]: I0311 01:14:12.409697 4744 patch_prober.go:28] interesting pod/machine-config-daemon-678nx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 11 01:14:12 crc kubenswrapper[4744]: I0311 01:14:12.410207 4744 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-678nx" podUID="a15dc7ac-7c34-4135-b6eb-a85122800ce9" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 11 01:14:12 crc kubenswrapper[4744]: I0311 01:14:12.410245 4744 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-678nx" Mar 11 01:14:12 crc kubenswrapper[4744]: I0311 01:14:12.410811 4744 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"9f82ae9e65d034b974cecba295d1e92bb34ed10ce5e057ec718abcef76965433"} pod="openshift-machine-config-operator/machine-config-daemon-678nx" 
containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 11 01:14:12 crc kubenswrapper[4744]: I0311 01:14:12.410860 4744 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-678nx" podUID="a15dc7ac-7c34-4135-b6eb-a85122800ce9" containerName="machine-config-daemon" containerID="cri-o://9f82ae9e65d034b974cecba295d1e92bb34ed10ce5e057ec718abcef76965433" gracePeriod=600 Mar 11 01:14:13 crc kubenswrapper[4744]: I0311 01:14:13.450177 4744 generic.go:334] "Generic (PLEG): container finished" podID="a15dc7ac-7c34-4135-b6eb-a85122800ce9" containerID="9f82ae9e65d034b974cecba295d1e92bb34ed10ce5e057ec718abcef76965433" exitCode=0 Mar 11 01:14:13 crc kubenswrapper[4744]: I0311 01:14:13.450607 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-678nx" event={"ID":"a15dc7ac-7c34-4135-b6eb-a85122800ce9","Type":"ContainerDied","Data":"9f82ae9e65d034b974cecba295d1e92bb34ed10ce5e057ec718abcef76965433"} Mar 11 01:14:13 crc kubenswrapper[4744]: I0311 01:14:13.450634 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-678nx" event={"ID":"a15dc7ac-7c34-4135-b6eb-a85122800ce9","Type":"ContainerStarted","Data":"cf996016ced3f16e6107f678cce67e4c982c8fa30c807453262d53b1c072f436"} Mar 11 01:14:13 crc kubenswrapper[4744]: I0311 01:14:13.450651 4744 scope.go:117] "RemoveContainer" containerID="88beedd13bd5f368264b1a447a212f87b19111c8ac2dcc24499088c4608c67da" Mar 11 01:14:13 crc kubenswrapper[4744]: I0311 01:14:13.511538 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-589db6c89c-m765l"] Mar 11 01:14:13 crc kubenswrapper[4744]: E0311 01:14:13.512184 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="754f2936-6e2b-47b4-85c8-3c0e9db82b51" containerName="oc" Mar 11 01:14:13 crc kubenswrapper[4744]: I0311 
01:14:13.512207 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="754f2936-6e2b-47b4-85c8-3c0e9db82b51" containerName="oc" Mar 11 01:14:13 crc kubenswrapper[4744]: I0311 01:14:13.512345 4744 memory_manager.go:354] "RemoveStaleState removing state" podUID="754f2936-6e2b-47b4-85c8-3c0e9db82b51" containerName="oc" Mar 11 01:14:13 crc kubenswrapper[4744]: I0311 01:14:13.513085 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-589db6c89c-m765l" Mar 11 01:14:13 crc kubenswrapper[4744]: I0311 01:14:13.517012 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns" Mar 11 01:14:13 crc kubenswrapper[4744]: I0311 01:14:13.517279 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dnsmasq-dns-dockercfg-2bqcm" Mar 11 01:14:13 crc kubenswrapper[4744]: I0311 01:14:13.517465 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"kube-root-ca.crt" Mar 11 01:14:13 crc kubenswrapper[4744]: I0311 01:14:13.517627 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openshift-service-ca.crt" Mar 11 01:14:13 crc kubenswrapper[4744]: I0311 01:14:13.520019 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-589db6c89c-m765l"] Mar 11 01:14:13 crc kubenswrapper[4744]: I0311 01:14:13.556609 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-86bbd886cf-kxs9t"] Mar 11 01:14:13 crc kubenswrapper[4744]: I0311 01:14:13.557987 4744 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-86bbd886cf-kxs9t" Mar 11 01:14:13 crc kubenswrapper[4744]: I0311 01:14:13.559914 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns-svc" Mar 11 01:14:13 crc kubenswrapper[4744]: I0311 01:14:13.569746 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-86bbd886cf-kxs9t"] Mar 11 01:14:13 crc kubenswrapper[4744]: I0311 01:14:13.665505 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mx42n\" (UniqueName: \"kubernetes.io/projected/af0e402e-6c61-44db-bd73-c6ebf202a68b-kube-api-access-mx42n\") pod \"dnsmasq-dns-589db6c89c-m765l\" (UID: \"af0e402e-6c61-44db-bd73-c6ebf202a68b\") " pod="openstack/dnsmasq-dns-589db6c89c-m765l" Mar 11 01:14:13 crc kubenswrapper[4744]: I0311 01:14:13.665644 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ad35e961-10ed-45d2-bb0f-e12c4529d731-config\") pod \"dnsmasq-dns-86bbd886cf-kxs9t\" (UID: \"ad35e961-10ed-45d2-bb0f-e12c4529d731\") " pod="openstack/dnsmasq-dns-86bbd886cf-kxs9t" Mar 11 01:14:13 crc kubenswrapper[4744]: I0311 01:14:13.665687 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ad35e961-10ed-45d2-bb0f-e12c4529d731-dns-svc\") pod \"dnsmasq-dns-86bbd886cf-kxs9t\" (UID: \"ad35e961-10ed-45d2-bb0f-e12c4529d731\") " pod="openstack/dnsmasq-dns-86bbd886cf-kxs9t" Mar 11 01:14:13 crc kubenswrapper[4744]: I0311 01:14:13.665773 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/af0e402e-6c61-44db-bd73-c6ebf202a68b-config\") pod \"dnsmasq-dns-589db6c89c-m765l\" (UID: \"af0e402e-6c61-44db-bd73-c6ebf202a68b\") " pod="openstack/dnsmasq-dns-589db6c89c-m765l" Mar 11 
01:14:13 crc kubenswrapper[4744]: I0311 01:14:13.665835 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xzdf5\" (UniqueName: \"kubernetes.io/projected/ad35e961-10ed-45d2-bb0f-e12c4529d731-kube-api-access-xzdf5\") pod \"dnsmasq-dns-86bbd886cf-kxs9t\" (UID: \"ad35e961-10ed-45d2-bb0f-e12c4529d731\") " pod="openstack/dnsmasq-dns-86bbd886cf-kxs9t" Mar 11 01:14:13 crc kubenswrapper[4744]: I0311 01:14:13.767616 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/af0e402e-6c61-44db-bd73-c6ebf202a68b-config\") pod \"dnsmasq-dns-589db6c89c-m765l\" (UID: \"af0e402e-6c61-44db-bd73-c6ebf202a68b\") " pod="openstack/dnsmasq-dns-589db6c89c-m765l" Mar 11 01:14:13 crc kubenswrapper[4744]: I0311 01:14:13.768343 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/af0e402e-6c61-44db-bd73-c6ebf202a68b-config\") pod \"dnsmasq-dns-589db6c89c-m765l\" (UID: \"af0e402e-6c61-44db-bd73-c6ebf202a68b\") " pod="openstack/dnsmasq-dns-589db6c89c-m765l" Mar 11 01:14:13 crc kubenswrapper[4744]: I0311 01:14:13.768409 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xzdf5\" (UniqueName: \"kubernetes.io/projected/ad35e961-10ed-45d2-bb0f-e12c4529d731-kube-api-access-xzdf5\") pod \"dnsmasq-dns-86bbd886cf-kxs9t\" (UID: \"ad35e961-10ed-45d2-bb0f-e12c4529d731\") " pod="openstack/dnsmasq-dns-86bbd886cf-kxs9t" Mar 11 01:14:13 crc kubenswrapper[4744]: I0311 01:14:13.768765 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mx42n\" (UniqueName: \"kubernetes.io/projected/af0e402e-6c61-44db-bd73-c6ebf202a68b-kube-api-access-mx42n\") pod \"dnsmasq-dns-589db6c89c-m765l\" (UID: \"af0e402e-6c61-44db-bd73-c6ebf202a68b\") " pod="openstack/dnsmasq-dns-589db6c89c-m765l" Mar 11 01:14:13 crc kubenswrapper[4744]: 
I0311 01:14:13.768802 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ad35e961-10ed-45d2-bb0f-e12c4529d731-config\") pod \"dnsmasq-dns-86bbd886cf-kxs9t\" (UID: \"ad35e961-10ed-45d2-bb0f-e12c4529d731\") " pod="openstack/dnsmasq-dns-86bbd886cf-kxs9t" Mar 11 01:14:13 crc kubenswrapper[4744]: I0311 01:14:13.768988 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ad35e961-10ed-45d2-bb0f-e12c4529d731-dns-svc\") pod \"dnsmasq-dns-86bbd886cf-kxs9t\" (UID: \"ad35e961-10ed-45d2-bb0f-e12c4529d731\") " pod="openstack/dnsmasq-dns-86bbd886cf-kxs9t" Mar 11 01:14:13 crc kubenswrapper[4744]: I0311 01:14:13.769728 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ad35e961-10ed-45d2-bb0f-e12c4529d731-dns-svc\") pod \"dnsmasq-dns-86bbd886cf-kxs9t\" (UID: \"ad35e961-10ed-45d2-bb0f-e12c4529d731\") " pod="openstack/dnsmasq-dns-86bbd886cf-kxs9t" Mar 11 01:14:13 crc kubenswrapper[4744]: I0311 01:14:13.769784 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ad35e961-10ed-45d2-bb0f-e12c4529d731-config\") pod \"dnsmasq-dns-86bbd886cf-kxs9t\" (UID: \"ad35e961-10ed-45d2-bb0f-e12c4529d731\") " pod="openstack/dnsmasq-dns-86bbd886cf-kxs9t" Mar 11 01:14:13 crc kubenswrapper[4744]: I0311 01:14:13.785979 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mx42n\" (UniqueName: \"kubernetes.io/projected/af0e402e-6c61-44db-bd73-c6ebf202a68b-kube-api-access-mx42n\") pod \"dnsmasq-dns-589db6c89c-m765l\" (UID: \"af0e402e-6c61-44db-bd73-c6ebf202a68b\") " pod="openstack/dnsmasq-dns-589db6c89c-m765l" Mar 11 01:14:13 crc kubenswrapper[4744]: I0311 01:14:13.786168 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xzdf5\" 
(UniqueName: \"kubernetes.io/projected/ad35e961-10ed-45d2-bb0f-e12c4529d731-kube-api-access-xzdf5\") pod \"dnsmasq-dns-86bbd886cf-kxs9t\" (UID: \"ad35e961-10ed-45d2-bb0f-e12c4529d731\") " pod="openstack/dnsmasq-dns-86bbd886cf-kxs9t" Mar 11 01:14:13 crc kubenswrapper[4744]: I0311 01:14:13.836050 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-589db6c89c-m765l" Mar 11 01:14:13 crc kubenswrapper[4744]: I0311 01:14:13.871385 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-86bbd886cf-kxs9t" Mar 11 01:14:14 crc kubenswrapper[4744]: I0311 01:14:14.157004 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-86bbd886cf-kxs9t"] Mar 11 01:14:14 crc kubenswrapper[4744]: W0311 01:14:14.162832 4744 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podad35e961_10ed_45d2_bb0f_e12c4529d731.slice/crio-f6fc8980a4381de53a642cba9e715eb73266dac8d9b4c5c431d582e580fd32a8 WatchSource:0}: Error finding container f6fc8980a4381de53a642cba9e715eb73266dac8d9b4c5c431d582e580fd32a8: Status 404 returned error can't find the container with id f6fc8980a4381de53a642cba9e715eb73266dac8d9b4c5c431d582e580fd32a8 Mar 11 01:14:14 crc kubenswrapper[4744]: I0311 01:14:14.165018 4744 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 11 01:14:14 crc kubenswrapper[4744]: I0311 01:14:14.315065 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-589db6c89c-m765l"] Mar 11 01:14:14 crc kubenswrapper[4744]: W0311 01:14:14.318613 4744 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podaf0e402e_6c61_44db_bd73_c6ebf202a68b.slice/crio-d5f5c2b9b5d4f2e9e257856a4afec9a44d314b2567bf10e9815f0a3e3880c6da WatchSource:0}: Error finding container 
d5f5c2b9b5d4f2e9e257856a4afec9a44d314b2567bf10e9815f0a3e3880c6da: Status 404 returned error can't find the container with id d5f5c2b9b5d4f2e9e257856a4afec9a44d314b2567bf10e9815f0a3e3880c6da Mar 11 01:14:14 crc kubenswrapper[4744]: I0311 01:14:14.456571 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-589db6c89c-m765l" event={"ID":"af0e402e-6c61-44db-bd73-c6ebf202a68b","Type":"ContainerStarted","Data":"d5f5c2b9b5d4f2e9e257856a4afec9a44d314b2567bf10e9815f0a3e3880c6da"} Mar 11 01:14:14 crc kubenswrapper[4744]: I0311 01:14:14.457758 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-86bbd886cf-kxs9t" event={"ID":"ad35e961-10ed-45d2-bb0f-e12c4529d731","Type":"ContainerStarted","Data":"f6fc8980a4381de53a642cba9e715eb73266dac8d9b4c5c431d582e580fd32a8"} Mar 11 01:14:16 crc kubenswrapper[4744]: I0311 01:14:16.302920 4744 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-589db6c89c-m765l"] Mar 11 01:14:16 crc kubenswrapper[4744]: I0311 01:14:16.321754 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-78cb4465c9-65zms"] Mar 11 01:14:16 crc kubenswrapper[4744]: I0311 01:14:16.322895 4744 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-78cb4465c9-65zms" Mar 11 01:14:16 crc kubenswrapper[4744]: I0311 01:14:16.334900 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-78cb4465c9-65zms"] Mar 11 01:14:16 crc kubenswrapper[4744]: I0311 01:14:16.404336 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4a64957b-bca3-45be-bc38-637feaf98714-dns-svc\") pod \"dnsmasq-dns-78cb4465c9-65zms\" (UID: \"4a64957b-bca3-45be-bc38-637feaf98714\") " pod="openstack/dnsmasq-dns-78cb4465c9-65zms" Mar 11 01:14:16 crc kubenswrapper[4744]: I0311 01:14:16.404381 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4a64957b-bca3-45be-bc38-637feaf98714-config\") pod \"dnsmasq-dns-78cb4465c9-65zms\" (UID: \"4a64957b-bca3-45be-bc38-637feaf98714\") " pod="openstack/dnsmasq-dns-78cb4465c9-65zms" Mar 11 01:14:16 crc kubenswrapper[4744]: I0311 01:14:16.404457 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t85kl\" (UniqueName: \"kubernetes.io/projected/4a64957b-bca3-45be-bc38-637feaf98714-kube-api-access-t85kl\") pod \"dnsmasq-dns-78cb4465c9-65zms\" (UID: \"4a64957b-bca3-45be-bc38-637feaf98714\") " pod="openstack/dnsmasq-dns-78cb4465c9-65zms" Mar 11 01:14:16 crc kubenswrapper[4744]: I0311 01:14:16.510695 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4a64957b-bca3-45be-bc38-637feaf98714-dns-svc\") pod \"dnsmasq-dns-78cb4465c9-65zms\" (UID: \"4a64957b-bca3-45be-bc38-637feaf98714\") " pod="openstack/dnsmasq-dns-78cb4465c9-65zms" Mar 11 01:14:16 crc kubenswrapper[4744]: I0311 01:14:16.510749 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/4a64957b-bca3-45be-bc38-637feaf98714-config\") pod \"dnsmasq-dns-78cb4465c9-65zms\" (UID: \"4a64957b-bca3-45be-bc38-637feaf98714\") " pod="openstack/dnsmasq-dns-78cb4465c9-65zms" Mar 11 01:14:16 crc kubenswrapper[4744]: I0311 01:14:16.510854 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t85kl\" (UniqueName: \"kubernetes.io/projected/4a64957b-bca3-45be-bc38-637feaf98714-kube-api-access-t85kl\") pod \"dnsmasq-dns-78cb4465c9-65zms\" (UID: \"4a64957b-bca3-45be-bc38-637feaf98714\") " pod="openstack/dnsmasq-dns-78cb4465c9-65zms" Mar 11 01:14:16 crc kubenswrapper[4744]: I0311 01:14:16.512162 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4a64957b-bca3-45be-bc38-637feaf98714-dns-svc\") pod \"dnsmasq-dns-78cb4465c9-65zms\" (UID: \"4a64957b-bca3-45be-bc38-637feaf98714\") " pod="openstack/dnsmasq-dns-78cb4465c9-65zms" Mar 11 01:14:16 crc kubenswrapper[4744]: I0311 01:14:16.514748 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4a64957b-bca3-45be-bc38-637feaf98714-config\") pod \"dnsmasq-dns-78cb4465c9-65zms\" (UID: \"4a64957b-bca3-45be-bc38-637feaf98714\") " pod="openstack/dnsmasq-dns-78cb4465c9-65zms" Mar 11 01:14:16 crc kubenswrapper[4744]: I0311 01:14:16.553091 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t85kl\" (UniqueName: \"kubernetes.io/projected/4a64957b-bca3-45be-bc38-637feaf98714-kube-api-access-t85kl\") pod \"dnsmasq-dns-78cb4465c9-65zms\" (UID: \"4a64957b-bca3-45be-bc38-637feaf98714\") " pod="openstack/dnsmasq-dns-78cb4465c9-65zms" Mar 11 01:14:16 crc kubenswrapper[4744]: I0311 01:14:16.580302 4744 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-86bbd886cf-kxs9t"] Mar 11 01:14:16 crc kubenswrapper[4744]: I0311 01:14:16.595326 4744 kubelet.go:2421] "SyncLoop ADD" 
source="api" pods=["openstack/dnsmasq-dns-7c47bcb9f9-pm2vk"] Mar 11 01:14:16 crc kubenswrapper[4744]: I0311 01:14:16.597202 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7c47bcb9f9-pm2vk" Mar 11 01:14:16 crc kubenswrapper[4744]: I0311 01:14:16.608626 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7c47bcb9f9-pm2vk"] Mar 11 01:14:16 crc kubenswrapper[4744]: I0311 01:14:16.652840 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-78cb4465c9-65zms" Mar 11 01:14:16 crc kubenswrapper[4744]: I0311 01:14:16.713625 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/1a8fbf5b-fd33-4cff-a0bd-a40562fe3967-dns-svc\") pod \"dnsmasq-dns-7c47bcb9f9-pm2vk\" (UID: \"1a8fbf5b-fd33-4cff-a0bd-a40562fe3967\") " pod="openstack/dnsmasq-dns-7c47bcb9f9-pm2vk" Mar 11 01:14:16 crc kubenswrapper[4744]: I0311 01:14:16.713672 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1a8fbf5b-fd33-4cff-a0bd-a40562fe3967-config\") pod \"dnsmasq-dns-7c47bcb9f9-pm2vk\" (UID: \"1a8fbf5b-fd33-4cff-a0bd-a40562fe3967\") " pod="openstack/dnsmasq-dns-7c47bcb9f9-pm2vk" Mar 11 01:14:16 crc kubenswrapper[4744]: I0311 01:14:16.713741 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pchgr\" (UniqueName: \"kubernetes.io/projected/1a8fbf5b-fd33-4cff-a0bd-a40562fe3967-kube-api-access-pchgr\") pod \"dnsmasq-dns-7c47bcb9f9-pm2vk\" (UID: \"1a8fbf5b-fd33-4cff-a0bd-a40562fe3967\") " pod="openstack/dnsmasq-dns-7c47bcb9f9-pm2vk" Mar 11 01:14:16 crc kubenswrapper[4744]: I0311 01:14:16.814689 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pchgr\" (UniqueName: 
\"kubernetes.io/projected/1a8fbf5b-fd33-4cff-a0bd-a40562fe3967-kube-api-access-pchgr\") pod \"dnsmasq-dns-7c47bcb9f9-pm2vk\" (UID: \"1a8fbf5b-fd33-4cff-a0bd-a40562fe3967\") " pod="openstack/dnsmasq-dns-7c47bcb9f9-pm2vk" Mar 11 01:14:16 crc kubenswrapper[4744]: I0311 01:14:16.814979 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/1a8fbf5b-fd33-4cff-a0bd-a40562fe3967-dns-svc\") pod \"dnsmasq-dns-7c47bcb9f9-pm2vk\" (UID: \"1a8fbf5b-fd33-4cff-a0bd-a40562fe3967\") " pod="openstack/dnsmasq-dns-7c47bcb9f9-pm2vk" Mar 11 01:14:16 crc kubenswrapper[4744]: I0311 01:14:16.815011 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1a8fbf5b-fd33-4cff-a0bd-a40562fe3967-config\") pod \"dnsmasq-dns-7c47bcb9f9-pm2vk\" (UID: \"1a8fbf5b-fd33-4cff-a0bd-a40562fe3967\") " pod="openstack/dnsmasq-dns-7c47bcb9f9-pm2vk" Mar 11 01:14:16 crc kubenswrapper[4744]: I0311 01:14:16.819578 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1a8fbf5b-fd33-4cff-a0bd-a40562fe3967-config\") pod \"dnsmasq-dns-7c47bcb9f9-pm2vk\" (UID: \"1a8fbf5b-fd33-4cff-a0bd-a40562fe3967\") " pod="openstack/dnsmasq-dns-7c47bcb9f9-pm2vk" Mar 11 01:14:16 crc kubenswrapper[4744]: I0311 01:14:16.819949 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/1a8fbf5b-fd33-4cff-a0bd-a40562fe3967-dns-svc\") pod \"dnsmasq-dns-7c47bcb9f9-pm2vk\" (UID: \"1a8fbf5b-fd33-4cff-a0bd-a40562fe3967\") " pod="openstack/dnsmasq-dns-7c47bcb9f9-pm2vk" Mar 11 01:14:16 crc kubenswrapper[4744]: I0311 01:14:16.835076 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pchgr\" (UniqueName: \"kubernetes.io/projected/1a8fbf5b-fd33-4cff-a0bd-a40562fe3967-kube-api-access-pchgr\") pod \"dnsmasq-dns-7c47bcb9f9-pm2vk\" 
(UID: \"1a8fbf5b-fd33-4cff-a0bd-a40562fe3967\") " pod="openstack/dnsmasq-dns-7c47bcb9f9-pm2vk" Mar 11 01:14:16 crc kubenswrapper[4744]: I0311 01:14:16.938579 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7c47bcb9f9-pm2vk" Mar 11 01:14:17 crc kubenswrapper[4744]: I0311 01:14:17.109502 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-78cb4465c9-65zms"] Mar 11 01:14:17 crc kubenswrapper[4744]: I0311 01:14:17.171989 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7c47bcb9f9-pm2vk"] Mar 11 01:14:17 crc kubenswrapper[4744]: I0311 01:14:17.458845 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Mar 11 01:14:17 crc kubenswrapper[4744]: I0311 01:14:17.460068 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Mar 11 01:14:17 crc kubenswrapper[4744]: I0311 01:14:17.464372 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-erlang-cookie" Mar 11 01:14:17 crc kubenswrapper[4744]: I0311 01:14:17.464577 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-config-data" Mar 11 01:14:17 crc kubenswrapper[4744]: I0311 01:14:17.464766 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-cell1-svc" Mar 11 01:14:17 crc kubenswrapper[4744]: I0311 01:14:17.464958 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-default-user" Mar 11 01:14:17 crc kubenswrapper[4744]: I0311 01:14:17.465041 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-plugins-conf" Mar 11 01:14:17 crc kubenswrapper[4744]: I0311 01:14:17.465596 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-server-dockercfg-h2n7r" Mar 11 
01:14:17 crc kubenswrapper[4744]: I0311 01:14:17.465784 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-server-conf" Mar 11 01:14:17 crc kubenswrapper[4744]: I0311 01:14:17.472633 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Mar 11 01:14:17 crc kubenswrapper[4744]: I0311 01:14:17.512328 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-78cb4465c9-65zms" event={"ID":"4a64957b-bca3-45be-bc38-637feaf98714","Type":"ContainerStarted","Data":"c8dc255555836df0c482f96e0bd6b4d9ec8b641354a236257899bbbccd957d4c"} Mar 11 01:14:17 crc kubenswrapper[4744]: I0311 01:14:17.513750 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7c47bcb9f9-pm2vk" event={"ID":"1a8fbf5b-fd33-4cff-a0bd-a40562fe3967","Type":"ContainerStarted","Data":"da4b42b3c5d0bfc6e9d1cc86c46ae5346d3a3458edbe62941e4d55e8b601755c"} Mar 11 01:14:17 crc kubenswrapper[4744]: I0311 01:14:17.524626 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/714c91e5-04c5-4f95-97e3-a3c08664944d-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"714c91e5-04c5-4f95-97e3-a3c08664944d\") " pod="openstack/rabbitmq-cell1-server-0" Mar 11 01:14:17 crc kubenswrapper[4744]: I0311 01:14:17.524663 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zkczr\" (UniqueName: \"kubernetes.io/projected/714c91e5-04c5-4f95-97e3-a3c08664944d-kube-api-access-zkczr\") pod \"rabbitmq-cell1-server-0\" (UID: \"714c91e5-04c5-4f95-97e3-a3c08664944d\") " pod="openstack/rabbitmq-cell1-server-0" Mar 11 01:14:17 crc kubenswrapper[4744]: I0311 01:14:17.524685 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: 
\"kubernetes.io/empty-dir/714c91e5-04c5-4f95-97e3-a3c08664944d-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"714c91e5-04c5-4f95-97e3-a3c08664944d\") " pod="openstack/rabbitmq-cell1-server-0" Mar 11 01:14:17 crc kubenswrapper[4744]: I0311 01:14:17.524725 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/714c91e5-04c5-4f95-97e3-a3c08664944d-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"714c91e5-04c5-4f95-97e3-a3c08664944d\") " pod="openstack/rabbitmq-cell1-server-0" Mar 11 01:14:17 crc kubenswrapper[4744]: I0311 01:14:17.524783 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/714c91e5-04c5-4f95-97e3-a3c08664944d-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"714c91e5-04c5-4f95-97e3-a3c08664944d\") " pod="openstack/rabbitmq-cell1-server-0" Mar 11 01:14:17 crc kubenswrapper[4744]: I0311 01:14:17.524800 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/714c91e5-04c5-4f95-97e3-a3c08664944d-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"714c91e5-04c5-4f95-97e3-a3c08664944d\") " pod="openstack/rabbitmq-cell1-server-0" Mar 11 01:14:17 crc kubenswrapper[4744]: I0311 01:14:17.524822 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/714c91e5-04c5-4f95-97e3-a3c08664944d-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"714c91e5-04c5-4f95-97e3-a3c08664944d\") " pod="openstack/rabbitmq-cell1-server-0" Mar 11 01:14:17 crc kubenswrapper[4744]: I0311 01:14:17.524979 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: 
\"kubernetes.io/configmap/714c91e5-04c5-4f95-97e3-a3c08664944d-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"714c91e5-04c5-4f95-97e3-a3c08664944d\") " pod="openstack/rabbitmq-cell1-server-0" Mar 11 01:14:17 crc kubenswrapper[4744]: I0311 01:14:17.525070 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/714c91e5-04c5-4f95-97e3-a3c08664944d-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"714c91e5-04c5-4f95-97e3-a3c08664944d\") " pod="openstack/rabbitmq-cell1-server-0" Mar 11 01:14:17 crc kubenswrapper[4744]: I0311 01:14:17.525105 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/714c91e5-04c5-4f95-97e3-a3c08664944d-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"714c91e5-04c5-4f95-97e3-a3c08664944d\") " pod="openstack/rabbitmq-cell1-server-0" Mar 11 01:14:17 crc kubenswrapper[4744]: I0311 01:14:17.525190 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"714c91e5-04c5-4f95-97e3-a3c08664944d\") " pod="openstack/rabbitmq-cell1-server-0" Mar 11 01:14:17 crc kubenswrapper[4744]: I0311 01:14:17.626220 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/714c91e5-04c5-4f95-97e3-a3c08664944d-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"714c91e5-04c5-4f95-97e3-a3c08664944d\") " pod="openstack/rabbitmq-cell1-server-0" Mar 11 01:14:17 crc kubenswrapper[4744]: I0311 01:14:17.626257 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/configmap/714c91e5-04c5-4f95-97e3-a3c08664944d-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"714c91e5-04c5-4f95-97e3-a3c08664944d\") " pod="openstack/rabbitmq-cell1-server-0" Mar 11 01:14:17 crc kubenswrapper[4744]: I0311 01:14:17.626282 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/714c91e5-04c5-4f95-97e3-a3c08664944d-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"714c91e5-04c5-4f95-97e3-a3c08664944d\") " pod="openstack/rabbitmq-cell1-server-0" Mar 11 01:14:17 crc kubenswrapper[4744]: I0311 01:14:17.626305 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/714c91e5-04c5-4f95-97e3-a3c08664944d-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"714c91e5-04c5-4f95-97e3-a3c08664944d\") " pod="openstack/rabbitmq-cell1-server-0" Mar 11 01:14:17 crc kubenswrapper[4744]: I0311 01:14:17.626327 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/714c91e5-04c5-4f95-97e3-a3c08664944d-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"714c91e5-04c5-4f95-97e3-a3c08664944d\") " pod="openstack/rabbitmq-cell1-server-0" Mar 11 01:14:17 crc kubenswrapper[4744]: I0311 01:14:17.626345 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/714c91e5-04c5-4f95-97e3-a3c08664944d-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"714c91e5-04c5-4f95-97e3-a3c08664944d\") " pod="openstack/rabbitmq-cell1-server-0" Mar 11 01:14:17 crc kubenswrapper[4744]: I0311 01:14:17.626376 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"rabbitmq-cell1-server-0\" (UID: 
\"714c91e5-04c5-4f95-97e3-a3c08664944d\") " pod="openstack/rabbitmq-cell1-server-0" Mar 11 01:14:17 crc kubenswrapper[4744]: I0311 01:14:17.626406 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/714c91e5-04c5-4f95-97e3-a3c08664944d-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"714c91e5-04c5-4f95-97e3-a3c08664944d\") " pod="openstack/rabbitmq-cell1-server-0" Mar 11 01:14:17 crc kubenswrapper[4744]: I0311 01:14:17.626423 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zkczr\" (UniqueName: \"kubernetes.io/projected/714c91e5-04c5-4f95-97e3-a3c08664944d-kube-api-access-zkczr\") pod \"rabbitmq-cell1-server-0\" (UID: \"714c91e5-04c5-4f95-97e3-a3c08664944d\") " pod="openstack/rabbitmq-cell1-server-0" Mar 11 01:14:17 crc kubenswrapper[4744]: I0311 01:14:17.626442 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/714c91e5-04c5-4f95-97e3-a3c08664944d-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"714c91e5-04c5-4f95-97e3-a3c08664944d\") " pod="openstack/rabbitmq-cell1-server-0" Mar 11 01:14:17 crc kubenswrapper[4744]: I0311 01:14:17.626471 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/714c91e5-04c5-4f95-97e3-a3c08664944d-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"714c91e5-04c5-4f95-97e3-a3c08664944d\") " pod="openstack/rabbitmq-cell1-server-0" Mar 11 01:14:17 crc kubenswrapper[4744]: I0311 01:14:17.628089 4744 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"714c91e5-04c5-4f95-97e3-a3c08664944d\") device mount path \"/mnt/openstack/pv05\"" 
pod="openstack/rabbitmq-cell1-server-0" Mar 11 01:14:17 crc kubenswrapper[4744]: I0311 01:14:17.628374 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/714c91e5-04c5-4f95-97e3-a3c08664944d-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"714c91e5-04c5-4f95-97e3-a3c08664944d\") " pod="openstack/rabbitmq-cell1-server-0" Mar 11 01:14:17 crc kubenswrapper[4744]: I0311 01:14:17.629857 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/714c91e5-04c5-4f95-97e3-a3c08664944d-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"714c91e5-04c5-4f95-97e3-a3c08664944d\") " pod="openstack/rabbitmq-cell1-server-0" Mar 11 01:14:17 crc kubenswrapper[4744]: I0311 01:14:17.630851 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/714c91e5-04c5-4f95-97e3-a3c08664944d-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"714c91e5-04c5-4f95-97e3-a3c08664944d\") " pod="openstack/rabbitmq-cell1-server-0" Mar 11 01:14:17 crc kubenswrapper[4744]: I0311 01:14:17.631511 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/714c91e5-04c5-4f95-97e3-a3c08664944d-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"714c91e5-04c5-4f95-97e3-a3c08664944d\") " pod="openstack/rabbitmq-cell1-server-0" Mar 11 01:14:17 crc kubenswrapper[4744]: I0311 01:14:17.632215 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/714c91e5-04c5-4f95-97e3-a3c08664944d-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"714c91e5-04c5-4f95-97e3-a3c08664944d\") " pod="openstack/rabbitmq-cell1-server-0" Mar 11 01:14:17 crc kubenswrapper[4744]: I0311 01:14:17.632708 4744 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/714c91e5-04c5-4f95-97e3-a3c08664944d-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"714c91e5-04c5-4f95-97e3-a3c08664944d\") " pod="openstack/rabbitmq-cell1-server-0" Mar 11 01:14:17 crc kubenswrapper[4744]: I0311 01:14:17.634699 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/714c91e5-04c5-4f95-97e3-a3c08664944d-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"714c91e5-04c5-4f95-97e3-a3c08664944d\") " pod="openstack/rabbitmq-cell1-server-0" Mar 11 01:14:17 crc kubenswrapper[4744]: I0311 01:14:17.640184 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/714c91e5-04c5-4f95-97e3-a3c08664944d-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"714c91e5-04c5-4f95-97e3-a3c08664944d\") " pod="openstack/rabbitmq-cell1-server-0" Mar 11 01:14:17 crc kubenswrapper[4744]: I0311 01:14:17.646596 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/714c91e5-04c5-4f95-97e3-a3c08664944d-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"714c91e5-04c5-4f95-97e3-a3c08664944d\") " pod="openstack/rabbitmq-cell1-server-0" Mar 11 01:14:17 crc kubenswrapper[4744]: I0311 01:14:17.648460 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zkczr\" (UniqueName: \"kubernetes.io/projected/714c91e5-04c5-4f95-97e3-a3c08664944d-kube-api-access-zkczr\") pod \"rabbitmq-cell1-server-0\" (UID: \"714c91e5-04c5-4f95-97e3-a3c08664944d\") " pod="openstack/rabbitmq-cell1-server-0" Mar 11 01:14:17 crc kubenswrapper[4744]: I0311 01:14:17.649340 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod 
\"rabbitmq-cell1-server-0\" (UID: \"714c91e5-04c5-4f95-97e3-a3c08664944d\") " pod="openstack/rabbitmq-cell1-server-0" Mar 11 01:14:17 crc kubenswrapper[4744]: I0311 01:14:17.716701 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-server-0"] Mar 11 01:14:17 crc kubenswrapper[4744]: I0311 01:14:17.717991 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Mar 11 01:14:17 crc kubenswrapper[4744]: I0311 01:14:17.720155 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-default-user" Mar 11 01:14:17 crc kubenswrapper[4744]: I0311 01:14:17.720961 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-plugins-conf" Mar 11 01:14:17 crc kubenswrapper[4744]: I0311 01:14:17.721133 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-server-conf" Mar 11 01:14:17 crc kubenswrapper[4744]: I0311 01:14:17.721670 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-erlang-cookie" Mar 11 01:14:17 crc kubenswrapper[4744]: I0311 01:14:17.721743 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-svc" Mar 11 01:14:17 crc kubenswrapper[4744]: I0311 01:14:17.721858 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-server-dockercfg-qsbfp" Mar 11 01:14:17 crc kubenswrapper[4744]: I0311 01:14:17.724001 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-config-data" Mar 11 01:14:17 crc kubenswrapper[4744]: I0311 01:14:17.748373 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Mar 11 01:14:17 crc kubenswrapper[4744]: I0311 01:14:17.792252 4744 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Mar 11 01:14:17 crc kubenswrapper[4744]: I0311 01:14:17.829845 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/fb920bf7-eae5-4f7a-9af7-bde85bfb4ee9-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"fb920bf7-eae5-4f7a-9af7-bde85bfb4ee9\") " pod="openstack/rabbitmq-server-0" Mar 11 01:14:17 crc kubenswrapper[4744]: I0311 01:14:17.829907 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/fb920bf7-eae5-4f7a-9af7-bde85bfb4ee9-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"fb920bf7-eae5-4f7a-9af7-bde85bfb4ee9\") " pod="openstack/rabbitmq-server-0" Mar 11 01:14:17 crc kubenswrapper[4744]: I0311 01:14:17.829950 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/fb920bf7-eae5-4f7a-9af7-bde85bfb4ee9-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"fb920bf7-eae5-4f7a-9af7-bde85bfb4ee9\") " pod="openstack/rabbitmq-server-0" Mar 11 01:14:17 crc kubenswrapper[4744]: I0311 01:14:17.829969 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/fb920bf7-eae5-4f7a-9af7-bde85bfb4ee9-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"fb920bf7-eae5-4f7a-9af7-bde85bfb4ee9\") " pod="openstack/rabbitmq-server-0" Mar 11 01:14:17 crc kubenswrapper[4744]: I0311 01:14:17.829990 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nl7w2\" (UniqueName: \"kubernetes.io/projected/fb920bf7-eae5-4f7a-9af7-bde85bfb4ee9-kube-api-access-nl7w2\") pod \"rabbitmq-server-0\" (UID: \"fb920bf7-eae5-4f7a-9af7-bde85bfb4ee9\") " 
pod="openstack/rabbitmq-server-0" Mar 11 01:14:17 crc kubenswrapper[4744]: I0311 01:14:17.830058 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/fb920bf7-eae5-4f7a-9af7-bde85bfb4ee9-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"fb920bf7-eae5-4f7a-9af7-bde85bfb4ee9\") " pod="openstack/rabbitmq-server-0" Mar 11 01:14:17 crc kubenswrapper[4744]: I0311 01:14:17.830346 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"rabbitmq-server-0\" (UID: \"fb920bf7-eae5-4f7a-9af7-bde85bfb4ee9\") " pod="openstack/rabbitmq-server-0" Mar 11 01:14:17 crc kubenswrapper[4744]: I0311 01:14:17.830642 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/fb920bf7-eae5-4f7a-9af7-bde85bfb4ee9-config-data\") pod \"rabbitmq-server-0\" (UID: \"fb920bf7-eae5-4f7a-9af7-bde85bfb4ee9\") " pod="openstack/rabbitmq-server-0" Mar 11 01:14:17 crc kubenswrapper[4744]: I0311 01:14:17.830767 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/fb920bf7-eae5-4f7a-9af7-bde85bfb4ee9-pod-info\") pod \"rabbitmq-server-0\" (UID: \"fb920bf7-eae5-4f7a-9af7-bde85bfb4ee9\") " pod="openstack/rabbitmq-server-0" Mar 11 01:14:17 crc kubenswrapper[4744]: I0311 01:14:17.830806 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/fb920bf7-eae5-4f7a-9af7-bde85bfb4ee9-server-conf\") pod \"rabbitmq-server-0\" (UID: \"fb920bf7-eae5-4f7a-9af7-bde85bfb4ee9\") " pod="openstack/rabbitmq-server-0" Mar 11 01:14:17 crc kubenswrapper[4744]: I0311 01:14:17.830834 4744 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/fb920bf7-eae5-4f7a-9af7-bde85bfb4ee9-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"fb920bf7-eae5-4f7a-9af7-bde85bfb4ee9\") " pod="openstack/rabbitmq-server-0" Mar 11 01:14:17 crc kubenswrapper[4744]: I0311 01:14:17.935270 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/fb920bf7-eae5-4f7a-9af7-bde85bfb4ee9-pod-info\") pod \"rabbitmq-server-0\" (UID: \"fb920bf7-eae5-4f7a-9af7-bde85bfb4ee9\") " pod="openstack/rabbitmq-server-0" Mar 11 01:14:17 crc kubenswrapper[4744]: I0311 01:14:17.935322 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/fb920bf7-eae5-4f7a-9af7-bde85bfb4ee9-server-conf\") pod \"rabbitmq-server-0\" (UID: \"fb920bf7-eae5-4f7a-9af7-bde85bfb4ee9\") " pod="openstack/rabbitmq-server-0" Mar 11 01:14:17 crc kubenswrapper[4744]: I0311 01:14:17.935352 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/fb920bf7-eae5-4f7a-9af7-bde85bfb4ee9-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"fb920bf7-eae5-4f7a-9af7-bde85bfb4ee9\") " pod="openstack/rabbitmq-server-0" Mar 11 01:14:17 crc kubenswrapper[4744]: I0311 01:14:17.935379 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/fb920bf7-eae5-4f7a-9af7-bde85bfb4ee9-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"fb920bf7-eae5-4f7a-9af7-bde85bfb4ee9\") " pod="openstack/rabbitmq-server-0" Mar 11 01:14:17 crc kubenswrapper[4744]: I0311 01:14:17.935415 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: 
\"kubernetes.io/secret/fb920bf7-eae5-4f7a-9af7-bde85bfb4ee9-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"fb920bf7-eae5-4f7a-9af7-bde85bfb4ee9\") " pod="openstack/rabbitmq-server-0" Mar 11 01:14:17 crc kubenswrapper[4744]: I0311 01:14:17.935434 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/fb920bf7-eae5-4f7a-9af7-bde85bfb4ee9-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"fb920bf7-eae5-4f7a-9af7-bde85bfb4ee9\") " pod="openstack/rabbitmq-server-0" Mar 11 01:14:17 crc kubenswrapper[4744]: I0311 01:14:17.935454 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/fb920bf7-eae5-4f7a-9af7-bde85bfb4ee9-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"fb920bf7-eae5-4f7a-9af7-bde85bfb4ee9\") " pod="openstack/rabbitmq-server-0" Mar 11 01:14:17 crc kubenswrapper[4744]: I0311 01:14:17.935472 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nl7w2\" (UniqueName: \"kubernetes.io/projected/fb920bf7-eae5-4f7a-9af7-bde85bfb4ee9-kube-api-access-nl7w2\") pod \"rabbitmq-server-0\" (UID: \"fb920bf7-eae5-4f7a-9af7-bde85bfb4ee9\") " pod="openstack/rabbitmq-server-0" Mar 11 01:14:17 crc kubenswrapper[4744]: I0311 01:14:17.935502 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/fb920bf7-eae5-4f7a-9af7-bde85bfb4ee9-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"fb920bf7-eae5-4f7a-9af7-bde85bfb4ee9\") " pod="openstack/rabbitmq-server-0" Mar 11 01:14:17 crc kubenswrapper[4744]: I0311 01:14:17.935564 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"rabbitmq-server-0\" (UID: \"fb920bf7-eae5-4f7a-9af7-bde85bfb4ee9\") " 
pod="openstack/rabbitmq-server-0" Mar 11 01:14:17 crc kubenswrapper[4744]: I0311 01:14:17.935582 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/fb920bf7-eae5-4f7a-9af7-bde85bfb4ee9-config-data\") pod \"rabbitmq-server-0\" (UID: \"fb920bf7-eae5-4f7a-9af7-bde85bfb4ee9\") " pod="openstack/rabbitmq-server-0" Mar 11 01:14:17 crc kubenswrapper[4744]: I0311 01:14:17.936437 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/fb920bf7-eae5-4f7a-9af7-bde85bfb4ee9-config-data\") pod \"rabbitmq-server-0\" (UID: \"fb920bf7-eae5-4f7a-9af7-bde85bfb4ee9\") " pod="openstack/rabbitmq-server-0" Mar 11 01:14:17 crc kubenswrapper[4744]: I0311 01:14:17.937641 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/fb920bf7-eae5-4f7a-9af7-bde85bfb4ee9-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"fb920bf7-eae5-4f7a-9af7-bde85bfb4ee9\") " pod="openstack/rabbitmq-server-0" Mar 11 01:14:17 crc kubenswrapper[4744]: I0311 01:14:17.938507 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/fb920bf7-eae5-4f7a-9af7-bde85bfb4ee9-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"fb920bf7-eae5-4f7a-9af7-bde85bfb4ee9\") " pod="openstack/rabbitmq-server-0" Mar 11 01:14:17 crc kubenswrapper[4744]: I0311 01:14:17.938671 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/fb920bf7-eae5-4f7a-9af7-bde85bfb4ee9-server-conf\") pod \"rabbitmq-server-0\" (UID: \"fb920bf7-eae5-4f7a-9af7-bde85bfb4ee9\") " pod="openstack/rabbitmq-server-0" Mar 11 01:14:17 crc kubenswrapper[4744]: I0311 01:14:17.938786 4744 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume 
\"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"rabbitmq-server-0\" (UID: \"fb920bf7-eae5-4f7a-9af7-bde85bfb4ee9\") device mount path \"/mnt/openstack/pv03\"" pod="openstack/rabbitmq-server-0" Mar 11 01:14:17 crc kubenswrapper[4744]: I0311 01:14:17.939107 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/fb920bf7-eae5-4f7a-9af7-bde85bfb4ee9-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"fb920bf7-eae5-4f7a-9af7-bde85bfb4ee9\") " pod="openstack/rabbitmq-server-0" Mar 11 01:14:17 crc kubenswrapper[4744]: I0311 01:14:17.945375 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/fb920bf7-eae5-4f7a-9af7-bde85bfb4ee9-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"fb920bf7-eae5-4f7a-9af7-bde85bfb4ee9\") " pod="openstack/rabbitmq-server-0" Mar 11 01:14:17 crc kubenswrapper[4744]: I0311 01:14:17.945552 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/fb920bf7-eae5-4f7a-9af7-bde85bfb4ee9-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"fb920bf7-eae5-4f7a-9af7-bde85bfb4ee9\") " pod="openstack/rabbitmq-server-0" Mar 11 01:14:17 crc kubenswrapper[4744]: I0311 01:14:17.946184 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/fb920bf7-eae5-4f7a-9af7-bde85bfb4ee9-pod-info\") pod \"rabbitmq-server-0\" (UID: \"fb920bf7-eae5-4f7a-9af7-bde85bfb4ee9\") " pod="openstack/rabbitmq-server-0" Mar 11 01:14:17 crc kubenswrapper[4744]: I0311 01:14:17.947270 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/fb920bf7-eae5-4f7a-9af7-bde85bfb4ee9-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"fb920bf7-eae5-4f7a-9af7-bde85bfb4ee9\") " 
pod="openstack/rabbitmq-server-0" Mar 11 01:14:17 crc kubenswrapper[4744]: I0311 01:14:17.958558 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nl7w2\" (UniqueName: \"kubernetes.io/projected/fb920bf7-eae5-4f7a-9af7-bde85bfb4ee9-kube-api-access-nl7w2\") pod \"rabbitmq-server-0\" (UID: \"fb920bf7-eae5-4f7a-9af7-bde85bfb4ee9\") " pod="openstack/rabbitmq-server-0" Mar 11 01:14:17 crc kubenswrapper[4744]: I0311 01:14:17.986983 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"rabbitmq-server-0\" (UID: \"fb920bf7-eae5-4f7a-9af7-bde85bfb4ee9\") " pod="openstack/rabbitmq-server-0" Mar 11 01:14:18 crc kubenswrapper[4744]: I0311 01:14:18.052863 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Mar 11 01:14:18 crc kubenswrapper[4744]: I0311 01:14:18.343825 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Mar 11 01:14:18 crc kubenswrapper[4744]: W0311 01:14:18.367174 4744 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod714c91e5_04c5_4f95_97e3_a3c08664944d.slice/crio-b08670cddf5cc4a5ba911f41e6483fd3df91a1dd8c03560914da4bff08006f78 WatchSource:0}: Error finding container b08670cddf5cc4a5ba911f41e6483fd3df91a1dd8c03560914da4bff08006f78: Status 404 returned error can't find the container with id b08670cddf5cc4a5ba911f41e6483fd3df91a1dd8c03560914da4bff08006f78 Mar 11 01:14:18 crc kubenswrapper[4744]: I0311 01:14:18.502046 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Mar 11 01:14:18 crc kubenswrapper[4744]: W0311 01:14:18.517277 4744 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfb920bf7_eae5_4f7a_9af7_bde85bfb4ee9.slice/crio-fd34d5b21cf096881733f60c0d418472d3d5afcd53181014727879c731dbaa67 WatchSource:0}: Error finding container fd34d5b21cf096881733f60c0d418472d3d5afcd53181014727879c731dbaa67: Status 404 returned error can't find the container with id fd34d5b21cf096881733f60c0d418472d3d5afcd53181014727879c731dbaa67 Mar 11 01:14:18 crc kubenswrapper[4744]: I0311 01:14:18.522535 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"714c91e5-04c5-4f95-97e3-a3c08664944d","Type":"ContainerStarted","Data":"b08670cddf5cc4a5ba911f41e6483fd3df91a1dd8c03560914da4bff08006f78"} Mar 11 01:14:18 crc kubenswrapper[4744]: I0311 01:14:18.902719 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstack-galera-0"] Mar 11 01:14:18 crc kubenswrapper[4744]: I0311 01:14:18.904370 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-galera-0" Mar 11 01:14:18 crc kubenswrapper[4744]: I0311 01:14:18.936015 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-galera-0"] Mar 11 01:14:18 crc kubenswrapper[4744]: I0311 01:14:18.971380 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-config-data" Mar 11 01:14:18 crc kubenswrapper[4744]: I0311 01:14:18.971600 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-scripts" Mar 11 01:14:18 crc kubenswrapper[4744]: I0311 01:14:18.972003 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"galera-openstack-dockercfg-znjfv" Mar 11 01:14:18 crc kubenswrapper[4744]: I0311 01:14:18.972711 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-galera-openstack-svc" Mar 11 01:14:18 crc kubenswrapper[4744]: I0311 01:14:18.978449 4744 reflector.go:368] Caches populated for *v1.Secret 
from object-"openstack"/"combined-ca-bundle" Mar 11 01:14:19 crc kubenswrapper[4744]: I0311 01:14:19.054289 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/6b383e05-0440-49ee-8add-708ea04e9ce7-config-data-default\") pod \"openstack-galera-0\" (UID: \"6b383e05-0440-49ee-8add-708ea04e9ce7\") " pod="openstack/openstack-galera-0" Mar 11 01:14:19 crc kubenswrapper[4744]: I0311 01:14:19.054354 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6b383e05-0440-49ee-8add-708ea04e9ce7-operator-scripts\") pod \"openstack-galera-0\" (UID: \"6b383e05-0440-49ee-8add-708ea04e9ce7\") " pod="openstack/openstack-galera-0" Mar 11 01:14:19 crc kubenswrapper[4744]: I0311 01:14:19.054452 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rxgt6\" (UniqueName: \"kubernetes.io/projected/6b383e05-0440-49ee-8add-708ea04e9ce7-kube-api-access-rxgt6\") pod \"openstack-galera-0\" (UID: \"6b383e05-0440-49ee-8add-708ea04e9ce7\") " pod="openstack/openstack-galera-0" Mar 11 01:14:19 crc kubenswrapper[4744]: I0311 01:14:19.054480 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/6b383e05-0440-49ee-8add-708ea04e9ce7-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"6b383e05-0440-49ee-8add-708ea04e9ce7\") " pod="openstack/openstack-galera-0" Mar 11 01:14:19 crc kubenswrapper[4744]: I0311 01:14:19.054506 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6b383e05-0440-49ee-8add-708ea04e9ce7-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"6b383e05-0440-49ee-8add-708ea04e9ce7\") " 
pod="openstack/openstack-galera-0" Mar 11 01:14:19 crc kubenswrapper[4744]: I0311 01:14:19.054554 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"openstack-galera-0\" (UID: \"6b383e05-0440-49ee-8add-708ea04e9ce7\") " pod="openstack/openstack-galera-0" Mar 11 01:14:19 crc kubenswrapper[4744]: I0311 01:14:19.054583 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/6b383e05-0440-49ee-8add-708ea04e9ce7-config-data-generated\") pod \"openstack-galera-0\" (UID: \"6b383e05-0440-49ee-8add-708ea04e9ce7\") " pod="openstack/openstack-galera-0" Mar 11 01:14:19 crc kubenswrapper[4744]: I0311 01:14:19.054609 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/6b383e05-0440-49ee-8add-708ea04e9ce7-kolla-config\") pod \"openstack-galera-0\" (UID: \"6b383e05-0440-49ee-8add-708ea04e9ce7\") " pod="openstack/openstack-galera-0" Mar 11 01:14:19 crc kubenswrapper[4744]: I0311 01:14:19.155468 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rxgt6\" (UniqueName: \"kubernetes.io/projected/6b383e05-0440-49ee-8add-708ea04e9ce7-kube-api-access-rxgt6\") pod \"openstack-galera-0\" (UID: \"6b383e05-0440-49ee-8add-708ea04e9ce7\") " pod="openstack/openstack-galera-0" Mar 11 01:14:19 crc kubenswrapper[4744]: I0311 01:14:19.155600 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/6b383e05-0440-49ee-8add-708ea04e9ce7-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"6b383e05-0440-49ee-8add-708ea04e9ce7\") " pod="openstack/openstack-galera-0" Mar 11 01:14:19 crc kubenswrapper[4744]: I0311 
01:14:19.155633 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6b383e05-0440-49ee-8add-708ea04e9ce7-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"6b383e05-0440-49ee-8add-708ea04e9ce7\") " pod="openstack/openstack-galera-0" Mar 11 01:14:19 crc kubenswrapper[4744]: I0311 01:14:19.155673 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"openstack-galera-0\" (UID: \"6b383e05-0440-49ee-8add-708ea04e9ce7\") " pod="openstack/openstack-galera-0" Mar 11 01:14:19 crc kubenswrapper[4744]: I0311 01:14:19.155700 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/6b383e05-0440-49ee-8add-708ea04e9ce7-config-data-generated\") pod \"openstack-galera-0\" (UID: \"6b383e05-0440-49ee-8add-708ea04e9ce7\") " pod="openstack/openstack-galera-0" Mar 11 01:14:19 crc kubenswrapper[4744]: I0311 01:14:19.155732 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/6b383e05-0440-49ee-8add-708ea04e9ce7-kolla-config\") pod \"openstack-galera-0\" (UID: \"6b383e05-0440-49ee-8add-708ea04e9ce7\") " pod="openstack/openstack-galera-0" Mar 11 01:14:19 crc kubenswrapper[4744]: I0311 01:14:19.155801 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/6b383e05-0440-49ee-8add-708ea04e9ce7-config-data-default\") pod \"openstack-galera-0\" (UID: \"6b383e05-0440-49ee-8add-708ea04e9ce7\") " pod="openstack/openstack-galera-0" Mar 11 01:14:19 crc kubenswrapper[4744]: I0311 01:14:19.155826 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: 
\"kubernetes.io/configmap/6b383e05-0440-49ee-8add-708ea04e9ce7-operator-scripts\") pod \"openstack-galera-0\" (UID: \"6b383e05-0440-49ee-8add-708ea04e9ce7\") " pod="openstack/openstack-galera-0" Mar 11 01:14:19 crc kubenswrapper[4744]: I0311 01:14:19.157092 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/6b383e05-0440-49ee-8add-708ea04e9ce7-config-data-generated\") pod \"openstack-galera-0\" (UID: \"6b383e05-0440-49ee-8add-708ea04e9ce7\") " pod="openstack/openstack-galera-0" Mar 11 01:14:19 crc kubenswrapper[4744]: I0311 01:14:19.157325 4744 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"openstack-galera-0\" (UID: \"6b383e05-0440-49ee-8add-708ea04e9ce7\") device mount path \"/mnt/openstack/pv06\"" pod="openstack/openstack-galera-0" Mar 11 01:14:19 crc kubenswrapper[4744]: I0311 01:14:19.159811 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/6b383e05-0440-49ee-8add-708ea04e9ce7-config-data-default\") pod \"openstack-galera-0\" (UID: \"6b383e05-0440-49ee-8add-708ea04e9ce7\") " pod="openstack/openstack-galera-0" Mar 11 01:14:19 crc kubenswrapper[4744]: I0311 01:14:19.160248 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/6b383e05-0440-49ee-8add-708ea04e9ce7-kolla-config\") pod \"openstack-galera-0\" (UID: \"6b383e05-0440-49ee-8add-708ea04e9ce7\") " pod="openstack/openstack-galera-0" Mar 11 01:14:19 crc kubenswrapper[4744]: I0311 01:14:19.170108 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6b383e05-0440-49ee-8add-708ea04e9ce7-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: 
\"6b383e05-0440-49ee-8add-708ea04e9ce7\") " pod="openstack/openstack-galera-0" Mar 11 01:14:19 crc kubenswrapper[4744]: I0311 01:14:19.174111 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/6b383e05-0440-49ee-8add-708ea04e9ce7-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"6b383e05-0440-49ee-8add-708ea04e9ce7\") " pod="openstack/openstack-galera-0" Mar 11 01:14:19 crc kubenswrapper[4744]: I0311 01:14:19.175255 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6b383e05-0440-49ee-8add-708ea04e9ce7-operator-scripts\") pod \"openstack-galera-0\" (UID: \"6b383e05-0440-49ee-8add-708ea04e9ce7\") " pod="openstack/openstack-galera-0" Mar 11 01:14:19 crc kubenswrapper[4744]: I0311 01:14:19.179409 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rxgt6\" (UniqueName: \"kubernetes.io/projected/6b383e05-0440-49ee-8add-708ea04e9ce7-kube-api-access-rxgt6\") pod \"openstack-galera-0\" (UID: \"6b383e05-0440-49ee-8add-708ea04e9ce7\") " pod="openstack/openstack-galera-0" Mar 11 01:14:19 crc kubenswrapper[4744]: I0311 01:14:19.198540 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"openstack-galera-0\" (UID: \"6b383e05-0440-49ee-8add-708ea04e9ce7\") " pod="openstack/openstack-galera-0" Mar 11 01:14:19 crc kubenswrapper[4744]: I0311 01:14:19.265599 4744 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstack-galera-0" Mar 11 01:14:19 crc kubenswrapper[4744]: I0311 01:14:19.532717 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"fb920bf7-eae5-4f7a-9af7-bde85bfb4ee9","Type":"ContainerStarted","Data":"fd34d5b21cf096881733f60c0d418472d3d5afcd53181014727879c731dbaa67"} Mar 11 01:14:20 crc kubenswrapper[4744]: I0311 01:14:20.380023 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstack-cell1-galera-0"] Mar 11 01:14:20 crc kubenswrapper[4744]: I0311 01:14:20.381300 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-cell1-galera-0" Mar 11 01:14:20 crc kubenswrapper[4744]: I0311 01:14:20.386063 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-galera-openstack-cell1-svc" Mar 11 01:14:20 crc kubenswrapper[4744]: I0311 01:14:20.386776 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"galera-openstack-cell1-dockercfg-5kxxs" Mar 11 01:14:20 crc kubenswrapper[4744]: I0311 01:14:20.386905 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-cell1-scripts" Mar 11 01:14:20 crc kubenswrapper[4744]: I0311 01:14:20.387009 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-cell1-config-data" Mar 11 01:14:20 crc kubenswrapper[4744]: I0311 01:14:20.387284 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-cell1-galera-0"] Mar 11 01:14:20 crc kubenswrapper[4744]: I0311 01:14:20.478882 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/382c7504-68d5-4132-adc7-fc2c804e5d3e-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"382c7504-68d5-4132-adc7-fc2c804e5d3e\") " pod="openstack/openstack-cell1-galera-0" Mar 11 01:14:20 crc 
kubenswrapper[4744]: I0311 01:14:20.478949 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/382c7504-68d5-4132-adc7-fc2c804e5d3e-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"382c7504-68d5-4132-adc7-fc2c804e5d3e\") " pod="openstack/openstack-cell1-galera-0" Mar 11 01:14:20 crc kubenswrapper[4744]: I0311 01:14:20.479009 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/382c7504-68d5-4132-adc7-fc2c804e5d3e-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"382c7504-68d5-4132-adc7-fc2c804e5d3e\") " pod="openstack/openstack-cell1-galera-0" Mar 11 01:14:20 crc kubenswrapper[4744]: I0311 01:14:20.479123 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/382c7504-68d5-4132-adc7-fc2c804e5d3e-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"382c7504-68d5-4132-adc7-fc2c804e5d3e\") " pod="openstack/openstack-cell1-galera-0" Mar 11 01:14:20 crc kubenswrapper[4744]: I0311 01:14:20.479228 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/382c7504-68d5-4132-adc7-fc2c804e5d3e-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"382c7504-68d5-4132-adc7-fc2c804e5d3e\") " pod="openstack/openstack-cell1-galera-0" Mar 11 01:14:20 crc kubenswrapper[4744]: I0311 01:14:20.479259 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5lqpw\" (UniqueName: \"kubernetes.io/projected/382c7504-68d5-4132-adc7-fc2c804e5d3e-kube-api-access-5lqpw\") pod \"openstack-cell1-galera-0\" (UID: \"382c7504-68d5-4132-adc7-fc2c804e5d3e\") " 
pod="openstack/openstack-cell1-galera-0" Mar 11 01:14:20 crc kubenswrapper[4744]: I0311 01:14:20.479359 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"openstack-cell1-galera-0\" (UID: \"382c7504-68d5-4132-adc7-fc2c804e5d3e\") " pod="openstack/openstack-cell1-galera-0" Mar 11 01:14:20 crc kubenswrapper[4744]: I0311 01:14:20.479441 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/382c7504-68d5-4132-adc7-fc2c804e5d3e-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"382c7504-68d5-4132-adc7-fc2c804e5d3e\") " pod="openstack/openstack-cell1-galera-0" Mar 11 01:14:20 crc kubenswrapper[4744]: I0311 01:14:20.569625 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/memcached-0"] Mar 11 01:14:20 crc kubenswrapper[4744]: I0311 01:14:20.570868 4744 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/memcached-0" Mar 11 01:14:20 crc kubenswrapper[4744]: I0311 01:14:20.573901 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-memcached-svc" Mar 11 01:14:20 crc kubenswrapper[4744]: I0311 01:14:20.574166 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"memcached-memcached-dockercfg-vxq6b" Mar 11 01:14:20 crc kubenswrapper[4744]: I0311 01:14:20.574394 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"memcached-config-data" Mar 11 01:14:20 crc kubenswrapper[4744]: I0311 01:14:20.580705 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/382c7504-68d5-4132-adc7-fc2c804e5d3e-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"382c7504-68d5-4132-adc7-fc2c804e5d3e\") " pod="openstack/openstack-cell1-galera-0" Mar 11 01:14:20 crc kubenswrapper[4744]: I0311 01:14:20.580752 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/382c7504-68d5-4132-adc7-fc2c804e5d3e-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"382c7504-68d5-4132-adc7-fc2c804e5d3e\") " pod="openstack/openstack-cell1-galera-0" Mar 11 01:14:20 crc kubenswrapper[4744]: I0311 01:14:20.580788 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/382c7504-68d5-4132-adc7-fc2c804e5d3e-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"382c7504-68d5-4132-adc7-fc2c804e5d3e\") " pod="openstack/openstack-cell1-galera-0" Mar 11 01:14:20 crc kubenswrapper[4744]: I0311 01:14:20.580807 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5lqpw\" (UniqueName: 
\"kubernetes.io/projected/382c7504-68d5-4132-adc7-fc2c804e5d3e-kube-api-access-5lqpw\") pod \"openstack-cell1-galera-0\" (UID: \"382c7504-68d5-4132-adc7-fc2c804e5d3e\") " pod="openstack/openstack-cell1-galera-0" Mar 11 01:14:20 crc kubenswrapper[4744]: I0311 01:14:20.580834 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"openstack-cell1-galera-0\" (UID: \"382c7504-68d5-4132-adc7-fc2c804e5d3e\") " pod="openstack/openstack-cell1-galera-0" Mar 11 01:14:20 crc kubenswrapper[4744]: I0311 01:14:20.580869 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/382c7504-68d5-4132-adc7-fc2c804e5d3e-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"382c7504-68d5-4132-adc7-fc2c804e5d3e\") " pod="openstack/openstack-cell1-galera-0" Mar 11 01:14:20 crc kubenswrapper[4744]: I0311 01:14:20.580894 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/382c7504-68d5-4132-adc7-fc2c804e5d3e-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"382c7504-68d5-4132-adc7-fc2c804e5d3e\") " pod="openstack/openstack-cell1-galera-0" Mar 11 01:14:20 crc kubenswrapper[4744]: I0311 01:14:20.580937 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/382c7504-68d5-4132-adc7-fc2c804e5d3e-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"382c7504-68d5-4132-adc7-fc2c804e5d3e\") " pod="openstack/openstack-cell1-galera-0" Mar 11 01:14:20 crc kubenswrapper[4744]: I0311 01:14:20.582936 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/382c7504-68d5-4132-adc7-fc2c804e5d3e-config-data-generated\") pod 
\"openstack-cell1-galera-0\" (UID: \"382c7504-68d5-4132-adc7-fc2c804e5d3e\") " pod="openstack/openstack-cell1-galera-0" Mar 11 01:14:20 crc kubenswrapper[4744]: I0311 01:14:20.583166 4744 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"openstack-cell1-galera-0\" (UID: \"382c7504-68d5-4132-adc7-fc2c804e5d3e\") device mount path \"/mnt/openstack/pv11\"" pod="openstack/openstack-cell1-galera-0" Mar 11 01:14:20 crc kubenswrapper[4744]: I0311 01:14:20.583550 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/382c7504-68d5-4132-adc7-fc2c804e5d3e-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"382c7504-68d5-4132-adc7-fc2c804e5d3e\") " pod="openstack/openstack-cell1-galera-0" Mar 11 01:14:20 crc kubenswrapper[4744]: I0311 01:14:20.583811 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/382c7504-68d5-4132-adc7-fc2c804e5d3e-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"382c7504-68d5-4132-adc7-fc2c804e5d3e\") " pod="openstack/openstack-cell1-galera-0" Mar 11 01:14:20 crc kubenswrapper[4744]: I0311 01:14:20.585865 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/382c7504-68d5-4132-adc7-fc2c804e5d3e-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"382c7504-68d5-4132-adc7-fc2c804e5d3e\") " pod="openstack/openstack-cell1-galera-0" Mar 11 01:14:20 crc kubenswrapper[4744]: I0311 01:14:20.588715 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/memcached-0"] Mar 11 01:14:20 crc kubenswrapper[4744]: I0311 01:14:20.596417 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/382c7504-68d5-4132-adc7-fc2c804e5d3e-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"382c7504-68d5-4132-adc7-fc2c804e5d3e\") " pod="openstack/openstack-cell1-galera-0" Mar 11 01:14:20 crc kubenswrapper[4744]: I0311 01:14:20.597375 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/382c7504-68d5-4132-adc7-fc2c804e5d3e-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"382c7504-68d5-4132-adc7-fc2c804e5d3e\") " pod="openstack/openstack-cell1-galera-0" Mar 11 01:14:20 crc kubenswrapper[4744]: I0311 01:14:20.615361 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5lqpw\" (UniqueName: \"kubernetes.io/projected/382c7504-68d5-4132-adc7-fc2c804e5d3e-kube-api-access-5lqpw\") pod \"openstack-cell1-galera-0\" (UID: \"382c7504-68d5-4132-adc7-fc2c804e5d3e\") " pod="openstack/openstack-cell1-galera-0" Mar 11 01:14:20 crc kubenswrapper[4744]: I0311 01:14:20.615488 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"openstack-cell1-galera-0\" (UID: \"382c7504-68d5-4132-adc7-fc2c804e5d3e\") " pod="openstack/openstack-cell1-galera-0" Mar 11 01:14:20 crc kubenswrapper[4744]: I0311 01:14:20.682125 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c8djm\" (UniqueName: \"kubernetes.io/projected/730c901d-c3c5-46c5-b618-00cdcc17bef2-kube-api-access-c8djm\") pod \"memcached-0\" (UID: \"730c901d-c3c5-46c5-b618-00cdcc17bef2\") " pod="openstack/memcached-0" Mar 11 01:14:20 crc kubenswrapper[4744]: I0311 01:14:20.682458 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/730c901d-c3c5-46c5-b618-00cdcc17bef2-kolla-config\") pod \"memcached-0\" 
(UID: \"730c901d-c3c5-46c5-b618-00cdcc17bef2\") " pod="openstack/memcached-0" Mar 11 01:14:20 crc kubenswrapper[4744]: I0311 01:14:20.682530 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/730c901d-c3c5-46c5-b618-00cdcc17bef2-config-data\") pod \"memcached-0\" (UID: \"730c901d-c3c5-46c5-b618-00cdcc17bef2\") " pod="openstack/memcached-0" Mar 11 01:14:20 crc kubenswrapper[4744]: I0311 01:14:20.682552 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/730c901d-c3c5-46c5-b618-00cdcc17bef2-memcached-tls-certs\") pod \"memcached-0\" (UID: \"730c901d-c3c5-46c5-b618-00cdcc17bef2\") " pod="openstack/memcached-0" Mar 11 01:14:20 crc kubenswrapper[4744]: I0311 01:14:20.682608 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/730c901d-c3c5-46c5-b618-00cdcc17bef2-combined-ca-bundle\") pod \"memcached-0\" (UID: \"730c901d-c3c5-46c5-b618-00cdcc17bef2\") " pod="openstack/memcached-0" Mar 11 01:14:20 crc kubenswrapper[4744]: I0311 01:14:20.720898 4744 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstack-cell1-galera-0" Mar 11 01:14:20 crc kubenswrapper[4744]: I0311 01:14:20.788275 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/730c901d-c3c5-46c5-b618-00cdcc17bef2-config-data\") pod \"memcached-0\" (UID: \"730c901d-c3c5-46c5-b618-00cdcc17bef2\") " pod="openstack/memcached-0" Mar 11 01:14:20 crc kubenswrapper[4744]: I0311 01:14:20.788323 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/730c901d-c3c5-46c5-b618-00cdcc17bef2-memcached-tls-certs\") pod \"memcached-0\" (UID: \"730c901d-c3c5-46c5-b618-00cdcc17bef2\") " pod="openstack/memcached-0" Mar 11 01:14:20 crc kubenswrapper[4744]: I0311 01:14:20.788383 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/730c901d-c3c5-46c5-b618-00cdcc17bef2-combined-ca-bundle\") pod \"memcached-0\" (UID: \"730c901d-c3c5-46c5-b618-00cdcc17bef2\") " pod="openstack/memcached-0" Mar 11 01:14:20 crc kubenswrapper[4744]: I0311 01:14:20.788422 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c8djm\" (UniqueName: \"kubernetes.io/projected/730c901d-c3c5-46c5-b618-00cdcc17bef2-kube-api-access-c8djm\") pod \"memcached-0\" (UID: \"730c901d-c3c5-46c5-b618-00cdcc17bef2\") " pod="openstack/memcached-0" Mar 11 01:14:20 crc kubenswrapper[4744]: I0311 01:14:20.788439 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/730c901d-c3c5-46c5-b618-00cdcc17bef2-kolla-config\") pod \"memcached-0\" (UID: \"730c901d-c3c5-46c5-b618-00cdcc17bef2\") " pod="openstack/memcached-0" Mar 11 01:14:20 crc kubenswrapper[4744]: I0311 01:14:20.789261 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kolla-config\" (UniqueName: \"kubernetes.io/configmap/730c901d-c3c5-46c5-b618-00cdcc17bef2-kolla-config\") pod \"memcached-0\" (UID: \"730c901d-c3c5-46c5-b618-00cdcc17bef2\") " pod="openstack/memcached-0" Mar 11 01:14:20 crc kubenswrapper[4744]: I0311 01:14:20.790234 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/730c901d-c3c5-46c5-b618-00cdcc17bef2-config-data\") pod \"memcached-0\" (UID: \"730c901d-c3c5-46c5-b618-00cdcc17bef2\") " pod="openstack/memcached-0" Mar 11 01:14:20 crc kubenswrapper[4744]: I0311 01:14:20.793448 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/730c901d-c3c5-46c5-b618-00cdcc17bef2-memcached-tls-certs\") pod \"memcached-0\" (UID: \"730c901d-c3c5-46c5-b618-00cdcc17bef2\") " pod="openstack/memcached-0" Mar 11 01:14:20 crc kubenswrapper[4744]: I0311 01:14:20.795062 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/730c901d-c3c5-46c5-b618-00cdcc17bef2-combined-ca-bundle\") pod \"memcached-0\" (UID: \"730c901d-c3c5-46c5-b618-00cdcc17bef2\") " pod="openstack/memcached-0" Mar 11 01:14:20 crc kubenswrapper[4744]: I0311 01:14:20.805143 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c8djm\" (UniqueName: \"kubernetes.io/projected/730c901d-c3c5-46c5-b618-00cdcc17bef2-kube-api-access-c8djm\") pod \"memcached-0\" (UID: \"730c901d-c3c5-46c5-b618-00cdcc17bef2\") " pod="openstack/memcached-0" Mar 11 01:14:20 crc kubenswrapper[4744]: I0311 01:14:20.897365 4744 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/memcached-0" Mar 11 01:14:22 crc kubenswrapper[4744]: I0311 01:14:22.589359 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/kube-state-metrics-0"] Mar 11 01:14:22 crc kubenswrapper[4744]: I0311 01:14:22.590181 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Mar 11 01:14:22 crc kubenswrapper[4744]: I0311 01:14:22.594275 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"telemetry-ceilometer-dockercfg-cqf6g" Mar 11 01:14:22 crc kubenswrapper[4744]: I0311 01:14:22.603224 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Mar 11 01:14:22 crc kubenswrapper[4744]: I0311 01:14:22.738874 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7pfp7\" (UniqueName: \"kubernetes.io/projected/dbc15fc8-0b29-47ad-9ce1-38097df24920-kube-api-access-7pfp7\") pod \"kube-state-metrics-0\" (UID: \"dbc15fc8-0b29-47ad-9ce1-38097df24920\") " pod="openstack/kube-state-metrics-0" Mar 11 01:14:22 crc kubenswrapper[4744]: I0311 01:14:22.840815 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7pfp7\" (UniqueName: \"kubernetes.io/projected/dbc15fc8-0b29-47ad-9ce1-38097df24920-kube-api-access-7pfp7\") pod \"kube-state-metrics-0\" (UID: \"dbc15fc8-0b29-47ad-9ce1-38097df24920\") " pod="openstack/kube-state-metrics-0" Mar 11 01:14:22 crc kubenswrapper[4744]: I0311 01:14:22.862810 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7pfp7\" (UniqueName: \"kubernetes.io/projected/dbc15fc8-0b29-47ad-9ce1-38097df24920-kube-api-access-7pfp7\") pod \"kube-state-metrics-0\" (UID: \"dbc15fc8-0b29-47ad-9ce1-38097df24920\") " pod="openstack/kube-state-metrics-0" Mar 11 01:14:22 crc kubenswrapper[4744]: I0311 01:14:22.942745 4744 util.go:30] "No sandbox for pod can be 
found. Need to start a new one" pod="openstack/kube-state-metrics-0" Mar 11 01:14:26 crc kubenswrapper[4744]: I0311 01:14:26.545850 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-nb-0"] Mar 11 01:14:26 crc kubenswrapper[4744]: I0311 01:14:26.547168 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-nb-0" Mar 11 01:14:26 crc kubenswrapper[4744]: I0311 01:14:26.549449 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovndbcluster-nb-ovndbs" Mar 11 01:14:26 crc kubenswrapper[4744]: I0311 01:14:26.552835 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovn-metrics" Mar 11 01:14:26 crc kubenswrapper[4744]: I0311 01:14:26.553179 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-nb-scripts" Mar 11 01:14:26 crc kubenswrapper[4744]: I0311 01:14:26.554038 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-nb-config" Mar 11 01:14:26 crc kubenswrapper[4744]: I0311 01:14:26.554329 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncluster-ovndbcluster-nb-dockercfg-dw68n" Mar 11 01:14:26 crc kubenswrapper[4744]: I0311 01:14:26.559401 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-0"] Mar 11 01:14:26 crc kubenswrapper[4744]: I0311 01:14:26.638215 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-2mjl7"] Mar 11 01:14:26 crc kubenswrapper[4744]: I0311 01:14:26.639431 4744 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-2mjl7" Mar 11 01:14:26 crc kubenswrapper[4744]: I0311 01:14:26.643660 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-ovs-88ffp"] Mar 11 01:14:26 crc kubenswrapper[4744]: I0311 01:14:26.645799 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-ovs-88ffp" Mar 11 01:14:26 crc kubenswrapper[4744]: I0311 01:14:26.645965 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncontroller-ovncontroller-dockercfg-wps86" Mar 11 01:14:26 crc kubenswrapper[4744]: I0311 01:14:26.646129 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovncontroller-ovndbs" Mar 11 01:14:26 crc kubenswrapper[4744]: I0311 01:14:26.646145 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-scripts" Mar 11 01:14:26 crc kubenswrapper[4744]: I0311 01:14:26.669306 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-2mjl7"] Mar 11 01:14:26 crc kubenswrapper[4744]: I0311 01:14:26.677963 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-ovs-88ffp"] Mar 11 01:14:26 crc kubenswrapper[4744]: I0311 01:14:26.726155 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/fe2603a1-fdea-44d4-8188-f5f93324575c-ovn-controller-tls-certs\") pod \"ovn-controller-2mjl7\" (UID: \"fe2603a1-fdea-44d4-8188-f5f93324575c\") " pod="openstack/ovn-controller-2mjl7" Mar 11 01:14:26 crc kubenswrapper[4744]: I0311 01:14:26.726201 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"ovsdbserver-nb-0\" (UID: \"97c4f09f-eb97-40a0-b06c-80a5a922c986\") " 
pod="openstack/ovsdbserver-nb-0" Mar 11 01:14:26 crc kubenswrapper[4744]: I0311 01:14:26.726234 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/97c4f09f-eb97-40a0-b06c-80a5a922c986-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"97c4f09f-eb97-40a0-b06c-80a5a922c986\") " pod="openstack/ovsdbserver-nb-0" Mar 11 01:14:26 crc kubenswrapper[4744]: I0311 01:14:26.726266 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/97c4f09f-eb97-40a0-b06c-80a5a922c986-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"97c4f09f-eb97-40a0-b06c-80a5a922c986\") " pod="openstack/ovsdbserver-nb-0" Mar 11 01:14:26 crc kubenswrapper[4744]: I0311 01:14:26.726297 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fe2603a1-fdea-44d4-8188-f5f93324575c-combined-ca-bundle\") pod \"ovn-controller-2mjl7\" (UID: \"fe2603a1-fdea-44d4-8188-f5f93324575c\") " pod="openstack/ovn-controller-2mjl7" Mar 11 01:14:26 crc kubenswrapper[4744]: I0311 01:14:26.726321 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/fe2603a1-fdea-44d4-8188-f5f93324575c-var-run\") pod \"ovn-controller-2mjl7\" (UID: \"fe2603a1-fdea-44d4-8188-f5f93324575c\") " pod="openstack/ovn-controller-2mjl7" Mar 11 01:14:26 crc kubenswrapper[4744]: I0311 01:14:26.726340 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kgd8d\" (UniqueName: \"kubernetes.io/projected/97c4f09f-eb97-40a0-b06c-80a5a922c986-kube-api-access-kgd8d\") pod \"ovsdbserver-nb-0\" (UID: \"97c4f09f-eb97-40a0-b06c-80a5a922c986\") " pod="openstack/ovsdbserver-nb-0" Mar 
11 01:14:26 crc kubenswrapper[4744]: I0311 01:14:26.726360 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mkcdj\" (UniqueName: \"kubernetes.io/projected/fe2603a1-fdea-44d4-8188-f5f93324575c-kube-api-access-mkcdj\") pod \"ovn-controller-2mjl7\" (UID: \"fe2603a1-fdea-44d4-8188-f5f93324575c\") " pod="openstack/ovn-controller-2mjl7" Mar 11 01:14:26 crc kubenswrapper[4744]: I0311 01:14:26.726388 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/97c4f09f-eb97-40a0-b06c-80a5a922c986-config\") pod \"ovsdbserver-nb-0\" (UID: \"97c4f09f-eb97-40a0-b06c-80a5a922c986\") " pod="openstack/ovsdbserver-nb-0" Mar 11 01:14:26 crc kubenswrapper[4744]: I0311 01:14:26.726406 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/97c4f09f-eb97-40a0-b06c-80a5a922c986-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"97c4f09f-eb97-40a0-b06c-80a5a922c986\") " pod="openstack/ovsdbserver-nb-0" Mar 11 01:14:26 crc kubenswrapper[4744]: I0311 01:14:26.726431 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/97c4f09f-eb97-40a0-b06c-80a5a922c986-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"97c4f09f-eb97-40a0-b06c-80a5a922c986\") " pod="openstack/ovsdbserver-nb-0" Mar 11 01:14:26 crc kubenswrapper[4744]: I0311 01:14:26.726452 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/fe2603a1-fdea-44d4-8188-f5f93324575c-var-log-ovn\") pod \"ovn-controller-2mjl7\" (UID: \"fe2603a1-fdea-44d4-8188-f5f93324575c\") " pod="openstack/ovn-controller-2mjl7" Mar 11 01:14:26 crc kubenswrapper[4744]: I0311 01:14:26.726551 4744 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/fe2603a1-fdea-44d4-8188-f5f93324575c-var-run-ovn\") pod \"ovn-controller-2mjl7\" (UID: \"fe2603a1-fdea-44d4-8188-f5f93324575c\") " pod="openstack/ovn-controller-2mjl7" Mar 11 01:14:26 crc kubenswrapper[4744]: I0311 01:14:26.726565 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/fe2603a1-fdea-44d4-8188-f5f93324575c-scripts\") pod \"ovn-controller-2mjl7\" (UID: \"fe2603a1-fdea-44d4-8188-f5f93324575c\") " pod="openstack/ovn-controller-2mjl7" Mar 11 01:14:26 crc kubenswrapper[4744]: I0311 01:14:26.726593 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/97c4f09f-eb97-40a0-b06c-80a5a922c986-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"97c4f09f-eb97-40a0-b06c-80a5a922c986\") " pod="openstack/ovsdbserver-nb-0" Mar 11 01:14:26 crc kubenswrapper[4744]: I0311 01:14:26.827621 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/fe2603a1-fdea-44d4-8188-f5f93324575c-scripts\") pod \"ovn-controller-2mjl7\" (UID: \"fe2603a1-fdea-44d4-8188-f5f93324575c\") " pod="openstack/ovn-controller-2mjl7" Mar 11 01:14:26 crc kubenswrapper[4744]: I0311 01:14:26.827667 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/fe2603a1-fdea-44d4-8188-f5f93324575c-var-run-ovn\") pod \"ovn-controller-2mjl7\" (UID: \"fe2603a1-fdea-44d4-8188-f5f93324575c\") " pod="openstack/ovn-controller-2mjl7" Mar 11 01:14:26 crc kubenswrapper[4744]: I0311 01:14:26.827725 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kgd8d\" (UniqueName: 
\"kubernetes.io/projected/97c4f09f-eb97-40a0-b06c-80a5a922c986-kube-api-access-kgd8d\") pod \"ovsdbserver-nb-0\" (UID: \"97c4f09f-eb97-40a0-b06c-80a5a922c986\") " pod="openstack/ovsdbserver-nb-0" Mar 11 01:14:26 crc kubenswrapper[4744]: I0311 01:14:26.827748 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/1ee48ea6-67ea-4da3-af92-82b9d0e5b67d-var-run\") pod \"ovn-controller-ovs-88ffp\" (UID: \"1ee48ea6-67ea-4da3-af92-82b9d0e5b67d\") " pod="openstack/ovn-controller-ovs-88ffp" Mar 11 01:14:26 crc kubenswrapper[4744]: I0311 01:14:26.827768 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/97c4f09f-eb97-40a0-b06c-80a5a922c986-config\") pod \"ovsdbserver-nb-0\" (UID: \"97c4f09f-eb97-40a0-b06c-80a5a922c986\") " pod="openstack/ovsdbserver-nb-0" Mar 11 01:14:26 crc kubenswrapper[4744]: I0311 01:14:26.827786 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/97c4f09f-eb97-40a0-b06c-80a5a922c986-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"97c4f09f-eb97-40a0-b06c-80a5a922c986\") " pod="openstack/ovsdbserver-nb-0" Mar 11 01:14:26 crc kubenswrapper[4744]: I0311 01:14:26.827806 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/97c4f09f-eb97-40a0-b06c-80a5a922c986-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"97c4f09f-eb97-40a0-b06c-80a5a922c986\") " pod="openstack/ovsdbserver-nb-0" Mar 11 01:14:26 crc kubenswrapper[4744]: I0311 01:14:26.827827 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/1ee48ea6-67ea-4da3-af92-82b9d0e5b67d-var-log\") pod \"ovn-controller-ovs-88ffp\" (UID: \"1ee48ea6-67ea-4da3-af92-82b9d0e5b67d\") " 
pod="openstack/ovn-controller-ovs-88ffp" Mar 11 01:14:26 crc kubenswrapper[4744]: I0311 01:14:26.827845 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/97c4f09f-eb97-40a0-b06c-80a5a922c986-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"97c4f09f-eb97-40a0-b06c-80a5a922c986\") " pod="openstack/ovsdbserver-nb-0" Mar 11 01:14:26 crc kubenswrapper[4744]: I0311 01:14:26.827874 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/fe2603a1-fdea-44d4-8188-f5f93324575c-ovn-controller-tls-certs\") pod \"ovn-controller-2mjl7\" (UID: \"fe2603a1-fdea-44d4-8188-f5f93324575c\") " pod="openstack/ovn-controller-2mjl7" Mar 11 01:14:26 crc kubenswrapper[4744]: I0311 01:14:26.827890 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"ovsdbserver-nb-0\" (UID: \"97c4f09f-eb97-40a0-b06c-80a5a922c986\") " pod="openstack/ovsdbserver-nb-0" Mar 11 01:14:26 crc kubenswrapper[4744]: I0311 01:14:26.827908 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/97c4f09f-eb97-40a0-b06c-80a5a922c986-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"97c4f09f-eb97-40a0-b06c-80a5a922c986\") " pod="openstack/ovsdbserver-nb-0" Mar 11 01:14:26 crc kubenswrapper[4744]: I0311 01:14:26.827927 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/1ee48ea6-67ea-4da3-af92-82b9d0e5b67d-var-lib\") pod \"ovn-controller-ovs-88ffp\" (UID: \"1ee48ea6-67ea-4da3-af92-82b9d0e5b67d\") " pod="openstack/ovn-controller-ovs-88ffp" Mar 11 01:14:26 crc kubenswrapper[4744]: I0311 01:14:26.827944 4744 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/97c4f09f-eb97-40a0-b06c-80a5a922c986-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"97c4f09f-eb97-40a0-b06c-80a5a922c986\") " pod="openstack/ovsdbserver-nb-0" Mar 11 01:14:26 crc kubenswrapper[4744]: I0311 01:14:26.827963 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/1ee48ea6-67ea-4da3-af92-82b9d0e5b67d-scripts\") pod \"ovn-controller-ovs-88ffp\" (UID: \"1ee48ea6-67ea-4da3-af92-82b9d0e5b67d\") " pod="openstack/ovn-controller-ovs-88ffp" Mar 11 01:14:26 crc kubenswrapper[4744]: I0311 01:14:26.827982 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fe2603a1-fdea-44d4-8188-f5f93324575c-combined-ca-bundle\") pod \"ovn-controller-2mjl7\" (UID: \"fe2603a1-fdea-44d4-8188-f5f93324575c\") " pod="openstack/ovn-controller-2mjl7" Mar 11 01:14:26 crc kubenswrapper[4744]: I0311 01:14:26.828001 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/fe2603a1-fdea-44d4-8188-f5f93324575c-var-run\") pod \"ovn-controller-2mjl7\" (UID: \"fe2603a1-fdea-44d4-8188-f5f93324575c\") " pod="openstack/ovn-controller-2mjl7" Mar 11 01:14:26 crc kubenswrapper[4744]: I0311 01:14:26.828019 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/1ee48ea6-67ea-4da3-af92-82b9d0e5b67d-etc-ovs\") pod \"ovn-controller-ovs-88ffp\" (UID: \"1ee48ea6-67ea-4da3-af92-82b9d0e5b67d\") " pod="openstack/ovn-controller-ovs-88ffp" Mar 11 01:14:26 crc kubenswrapper[4744]: I0311 01:14:26.828035 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mkcdj\" (UniqueName: 
\"kubernetes.io/projected/fe2603a1-fdea-44d4-8188-f5f93324575c-kube-api-access-mkcdj\") pod \"ovn-controller-2mjl7\" (UID: \"fe2603a1-fdea-44d4-8188-f5f93324575c\") " pod="openstack/ovn-controller-2mjl7" Mar 11 01:14:26 crc kubenswrapper[4744]: I0311 01:14:26.828055 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wtmgl\" (UniqueName: \"kubernetes.io/projected/1ee48ea6-67ea-4da3-af92-82b9d0e5b67d-kube-api-access-wtmgl\") pod \"ovn-controller-ovs-88ffp\" (UID: \"1ee48ea6-67ea-4da3-af92-82b9d0e5b67d\") " pod="openstack/ovn-controller-ovs-88ffp" Mar 11 01:14:26 crc kubenswrapper[4744]: I0311 01:14:26.828078 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/fe2603a1-fdea-44d4-8188-f5f93324575c-var-log-ovn\") pod \"ovn-controller-2mjl7\" (UID: \"fe2603a1-fdea-44d4-8188-f5f93324575c\") " pod="openstack/ovn-controller-2mjl7" Mar 11 01:14:26 crc kubenswrapper[4744]: I0311 01:14:26.828290 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/fe2603a1-fdea-44d4-8188-f5f93324575c-var-run-ovn\") pod \"ovn-controller-2mjl7\" (UID: \"fe2603a1-fdea-44d4-8188-f5f93324575c\") " pod="openstack/ovn-controller-2mjl7" Mar 11 01:14:26 crc kubenswrapper[4744]: I0311 01:14:26.828338 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/fe2603a1-fdea-44d4-8188-f5f93324575c-var-log-ovn\") pod \"ovn-controller-2mjl7\" (UID: \"fe2603a1-fdea-44d4-8188-f5f93324575c\") " pod="openstack/ovn-controller-2mjl7" Mar 11 01:14:26 crc kubenswrapper[4744]: I0311 01:14:26.828671 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/fe2603a1-fdea-44d4-8188-f5f93324575c-var-run\") pod \"ovn-controller-2mjl7\" (UID: 
\"fe2603a1-fdea-44d4-8188-f5f93324575c\") " pod="openstack/ovn-controller-2mjl7" Mar 11 01:14:26 crc kubenswrapper[4744]: I0311 01:14:26.828745 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/97c4f09f-eb97-40a0-b06c-80a5a922c986-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"97c4f09f-eb97-40a0-b06c-80a5a922c986\") " pod="openstack/ovsdbserver-nb-0" Mar 11 01:14:26 crc kubenswrapper[4744]: I0311 01:14:26.828917 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/97c4f09f-eb97-40a0-b06c-80a5a922c986-config\") pod \"ovsdbserver-nb-0\" (UID: \"97c4f09f-eb97-40a0-b06c-80a5a922c986\") " pod="openstack/ovsdbserver-nb-0" Mar 11 01:14:26 crc kubenswrapper[4744]: I0311 01:14:26.828960 4744 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"ovsdbserver-nb-0\" (UID: \"97c4f09f-eb97-40a0-b06c-80a5a922c986\") device mount path \"/mnt/openstack/pv02\"" pod="openstack/ovsdbserver-nb-0" Mar 11 01:14:26 crc kubenswrapper[4744]: I0311 01:14:26.829374 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/97c4f09f-eb97-40a0-b06c-80a5a922c986-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"97c4f09f-eb97-40a0-b06c-80a5a922c986\") " pod="openstack/ovsdbserver-nb-0" Mar 11 01:14:26 crc kubenswrapper[4744]: I0311 01:14:26.829984 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/fe2603a1-fdea-44d4-8188-f5f93324575c-scripts\") pod \"ovn-controller-2mjl7\" (UID: \"fe2603a1-fdea-44d4-8188-f5f93324575c\") " pod="openstack/ovn-controller-2mjl7" Mar 11 01:14:26 crc kubenswrapper[4744]: I0311 01:14:26.833889 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/fe2603a1-fdea-44d4-8188-f5f93324575c-ovn-controller-tls-certs\") pod \"ovn-controller-2mjl7\" (UID: \"fe2603a1-fdea-44d4-8188-f5f93324575c\") " pod="openstack/ovn-controller-2mjl7" Mar 11 01:14:26 crc kubenswrapper[4744]: I0311 01:14:26.833986 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/97c4f09f-eb97-40a0-b06c-80a5a922c986-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"97c4f09f-eb97-40a0-b06c-80a5a922c986\") " pod="openstack/ovsdbserver-nb-0" Mar 11 01:14:26 crc kubenswrapper[4744]: I0311 01:14:26.834085 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fe2603a1-fdea-44d4-8188-f5f93324575c-combined-ca-bundle\") pod \"ovn-controller-2mjl7\" (UID: \"fe2603a1-fdea-44d4-8188-f5f93324575c\") " pod="openstack/ovn-controller-2mjl7" Mar 11 01:14:26 crc kubenswrapper[4744]: I0311 01:14:26.834681 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/97c4f09f-eb97-40a0-b06c-80a5a922c986-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"97c4f09f-eb97-40a0-b06c-80a5a922c986\") " pod="openstack/ovsdbserver-nb-0" Mar 11 01:14:26 crc kubenswrapper[4744]: I0311 01:14:26.837171 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/97c4f09f-eb97-40a0-b06c-80a5a922c986-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"97c4f09f-eb97-40a0-b06c-80a5a922c986\") " pod="openstack/ovsdbserver-nb-0" Mar 11 01:14:26 crc kubenswrapper[4744]: I0311 01:14:26.844798 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mkcdj\" (UniqueName: \"kubernetes.io/projected/fe2603a1-fdea-44d4-8188-f5f93324575c-kube-api-access-mkcdj\") pod 
\"ovn-controller-2mjl7\" (UID: \"fe2603a1-fdea-44d4-8188-f5f93324575c\") " pod="openstack/ovn-controller-2mjl7" Mar 11 01:14:26 crc kubenswrapper[4744]: I0311 01:14:26.846990 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"ovsdbserver-nb-0\" (UID: \"97c4f09f-eb97-40a0-b06c-80a5a922c986\") " pod="openstack/ovsdbserver-nb-0" Mar 11 01:14:26 crc kubenswrapper[4744]: I0311 01:14:26.865550 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kgd8d\" (UniqueName: \"kubernetes.io/projected/97c4f09f-eb97-40a0-b06c-80a5a922c986-kube-api-access-kgd8d\") pod \"ovsdbserver-nb-0\" (UID: \"97c4f09f-eb97-40a0-b06c-80a5a922c986\") " pod="openstack/ovsdbserver-nb-0" Mar 11 01:14:26 crc kubenswrapper[4744]: I0311 01:14:26.868237 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-nb-0" Mar 11 01:14:26 crc kubenswrapper[4744]: I0311 01:14:26.931503 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wtmgl\" (UniqueName: \"kubernetes.io/projected/1ee48ea6-67ea-4da3-af92-82b9d0e5b67d-kube-api-access-wtmgl\") pod \"ovn-controller-ovs-88ffp\" (UID: \"1ee48ea6-67ea-4da3-af92-82b9d0e5b67d\") " pod="openstack/ovn-controller-ovs-88ffp" Mar 11 01:14:26 crc kubenswrapper[4744]: I0311 01:14:26.931819 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/1ee48ea6-67ea-4da3-af92-82b9d0e5b67d-var-run\") pod \"ovn-controller-ovs-88ffp\" (UID: \"1ee48ea6-67ea-4da3-af92-82b9d0e5b67d\") " pod="openstack/ovn-controller-ovs-88ffp" Mar 11 01:14:26 crc kubenswrapper[4744]: I0311 01:14:26.931891 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/1ee48ea6-67ea-4da3-af92-82b9d0e5b67d-var-log\") pod 
\"ovn-controller-ovs-88ffp\" (UID: \"1ee48ea6-67ea-4da3-af92-82b9d0e5b67d\") " pod="openstack/ovn-controller-ovs-88ffp" Mar 11 01:14:26 crc kubenswrapper[4744]: I0311 01:14:26.932034 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/1ee48ea6-67ea-4da3-af92-82b9d0e5b67d-var-run\") pod \"ovn-controller-ovs-88ffp\" (UID: \"1ee48ea6-67ea-4da3-af92-82b9d0e5b67d\") " pod="openstack/ovn-controller-ovs-88ffp" Mar 11 01:14:26 crc kubenswrapper[4744]: I0311 01:14:26.932088 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/1ee48ea6-67ea-4da3-af92-82b9d0e5b67d-var-lib\") pod \"ovn-controller-ovs-88ffp\" (UID: \"1ee48ea6-67ea-4da3-af92-82b9d0e5b67d\") " pod="openstack/ovn-controller-ovs-88ffp" Mar 11 01:14:26 crc kubenswrapper[4744]: I0311 01:14:26.932136 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/1ee48ea6-67ea-4da3-af92-82b9d0e5b67d-scripts\") pod \"ovn-controller-ovs-88ffp\" (UID: \"1ee48ea6-67ea-4da3-af92-82b9d0e5b67d\") " pod="openstack/ovn-controller-ovs-88ffp" Mar 11 01:14:26 crc kubenswrapper[4744]: I0311 01:14:26.932192 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/1ee48ea6-67ea-4da3-af92-82b9d0e5b67d-etc-ovs\") pod \"ovn-controller-ovs-88ffp\" (UID: \"1ee48ea6-67ea-4da3-af92-82b9d0e5b67d\") " pod="openstack/ovn-controller-ovs-88ffp" Mar 11 01:14:26 crc kubenswrapper[4744]: I0311 01:14:26.932243 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/1ee48ea6-67ea-4da3-af92-82b9d0e5b67d-var-log\") pod \"ovn-controller-ovs-88ffp\" (UID: \"1ee48ea6-67ea-4da3-af92-82b9d0e5b67d\") " pod="openstack/ovn-controller-ovs-88ffp" Mar 11 01:14:26 crc kubenswrapper[4744]: I0311 
01:14:26.932337 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/1ee48ea6-67ea-4da3-af92-82b9d0e5b67d-var-lib\") pod \"ovn-controller-ovs-88ffp\" (UID: \"1ee48ea6-67ea-4da3-af92-82b9d0e5b67d\") " pod="openstack/ovn-controller-ovs-88ffp" Mar 11 01:14:26 crc kubenswrapper[4744]: I0311 01:14:26.932608 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/1ee48ea6-67ea-4da3-af92-82b9d0e5b67d-etc-ovs\") pod \"ovn-controller-ovs-88ffp\" (UID: \"1ee48ea6-67ea-4da3-af92-82b9d0e5b67d\") " pod="openstack/ovn-controller-ovs-88ffp" Mar 11 01:14:26 crc kubenswrapper[4744]: I0311 01:14:26.933886 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/1ee48ea6-67ea-4da3-af92-82b9d0e5b67d-scripts\") pod \"ovn-controller-ovs-88ffp\" (UID: \"1ee48ea6-67ea-4da3-af92-82b9d0e5b67d\") " pod="openstack/ovn-controller-ovs-88ffp" Mar 11 01:14:26 crc kubenswrapper[4744]: I0311 01:14:26.946474 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wtmgl\" (UniqueName: \"kubernetes.io/projected/1ee48ea6-67ea-4da3-af92-82b9d0e5b67d-kube-api-access-wtmgl\") pod \"ovn-controller-ovs-88ffp\" (UID: \"1ee48ea6-67ea-4da3-af92-82b9d0e5b67d\") " pod="openstack/ovn-controller-ovs-88ffp" Mar 11 01:14:26 crc kubenswrapper[4744]: I0311 01:14:26.964853 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-2mjl7" Mar 11 01:14:26 crc kubenswrapper[4744]: I0311 01:14:26.970330 4744 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-ovs-88ffp" Mar 11 01:14:30 crc kubenswrapper[4744]: I0311 01:14:30.004257 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-sb-0"] Mar 11 01:14:30 crc kubenswrapper[4744]: I0311 01:14:30.005835 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-sb-0" Mar 11 01:14:30 crc kubenswrapper[4744]: I0311 01:14:30.009107 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-sb-config" Mar 11 01:14:30 crc kubenswrapper[4744]: I0311 01:14:30.010352 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovndbcluster-sb-ovndbs" Mar 11 01:14:30 crc kubenswrapper[4744]: I0311 01:14:30.010411 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncluster-ovndbcluster-sb-dockercfg-btsl8" Mar 11 01:14:30 crc kubenswrapper[4744]: I0311 01:14:30.010521 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-sb-scripts" Mar 11 01:14:30 crc kubenswrapper[4744]: I0311 01:14:30.028864 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-0"] Mar 11 01:14:30 crc kubenswrapper[4744]: I0311 01:14:30.084167 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"ovsdbserver-sb-0\" (UID: \"1ca73c89-992f-4b36-9a70-5d67bace9cd2\") " pod="openstack/ovsdbserver-sb-0" Mar 11 01:14:30 crc kubenswrapper[4744]: I0311 01:14:30.084279 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/1ca73c89-992f-4b36-9a70-5d67bace9cd2-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"1ca73c89-992f-4b36-9a70-5d67bace9cd2\") " pod="openstack/ovsdbserver-sb-0" Mar 11 01:14:30 crc 
kubenswrapper[4744]: I0311 01:14:30.084351 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1ca73c89-992f-4b36-9a70-5d67bace9cd2-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"1ca73c89-992f-4b36-9a70-5d67bace9cd2\") " pod="openstack/ovsdbserver-sb-0" Mar 11 01:14:30 crc kubenswrapper[4744]: I0311 01:14:30.084392 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/1ca73c89-992f-4b36-9a70-5d67bace9cd2-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"1ca73c89-992f-4b36-9a70-5d67bace9cd2\") " pod="openstack/ovsdbserver-sb-0" Mar 11 01:14:30 crc kubenswrapper[4744]: I0311 01:14:30.084457 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/1ca73c89-992f-4b36-9a70-5d67bace9cd2-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"1ca73c89-992f-4b36-9a70-5d67bace9cd2\") " pod="openstack/ovsdbserver-sb-0" Mar 11 01:14:30 crc kubenswrapper[4744]: I0311 01:14:30.084725 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6gj26\" (UniqueName: \"kubernetes.io/projected/1ca73c89-992f-4b36-9a70-5d67bace9cd2-kube-api-access-6gj26\") pod \"ovsdbserver-sb-0\" (UID: \"1ca73c89-992f-4b36-9a70-5d67bace9cd2\") " pod="openstack/ovsdbserver-sb-0" Mar 11 01:14:30 crc kubenswrapper[4744]: I0311 01:14:30.084919 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1ca73c89-992f-4b36-9a70-5d67bace9cd2-config\") pod \"ovsdbserver-sb-0\" (UID: \"1ca73c89-992f-4b36-9a70-5d67bace9cd2\") " pod="openstack/ovsdbserver-sb-0" Mar 11 01:14:30 crc kubenswrapper[4744]: I0311 01:14:30.085115 4744 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/1ca73c89-992f-4b36-9a70-5d67bace9cd2-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"1ca73c89-992f-4b36-9a70-5d67bace9cd2\") " pod="openstack/ovsdbserver-sb-0" Mar 11 01:14:30 crc kubenswrapper[4744]: I0311 01:14:30.187127 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/1ca73c89-992f-4b36-9a70-5d67bace9cd2-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"1ca73c89-992f-4b36-9a70-5d67bace9cd2\") " pod="openstack/ovsdbserver-sb-0" Mar 11 01:14:30 crc kubenswrapper[4744]: I0311 01:14:30.187221 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1ca73c89-992f-4b36-9a70-5d67bace9cd2-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"1ca73c89-992f-4b36-9a70-5d67bace9cd2\") " pod="openstack/ovsdbserver-sb-0" Mar 11 01:14:30 crc kubenswrapper[4744]: I0311 01:14:30.187261 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/1ca73c89-992f-4b36-9a70-5d67bace9cd2-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"1ca73c89-992f-4b36-9a70-5d67bace9cd2\") " pod="openstack/ovsdbserver-sb-0" Mar 11 01:14:30 crc kubenswrapper[4744]: I0311 01:14:30.187321 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/1ca73c89-992f-4b36-9a70-5d67bace9cd2-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"1ca73c89-992f-4b36-9a70-5d67bace9cd2\") " pod="openstack/ovsdbserver-sb-0" Mar 11 01:14:30 crc kubenswrapper[4744]: I0311 01:14:30.187363 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6gj26\" (UniqueName: 
\"kubernetes.io/projected/1ca73c89-992f-4b36-9a70-5d67bace9cd2-kube-api-access-6gj26\") pod \"ovsdbserver-sb-0\" (UID: \"1ca73c89-992f-4b36-9a70-5d67bace9cd2\") " pod="openstack/ovsdbserver-sb-0" Mar 11 01:14:30 crc kubenswrapper[4744]: I0311 01:14:30.187395 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1ca73c89-992f-4b36-9a70-5d67bace9cd2-config\") pod \"ovsdbserver-sb-0\" (UID: \"1ca73c89-992f-4b36-9a70-5d67bace9cd2\") " pod="openstack/ovsdbserver-sb-0" Mar 11 01:14:30 crc kubenswrapper[4744]: I0311 01:14:30.187468 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/1ca73c89-992f-4b36-9a70-5d67bace9cd2-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"1ca73c89-992f-4b36-9a70-5d67bace9cd2\") " pod="openstack/ovsdbserver-sb-0" Mar 11 01:14:30 crc kubenswrapper[4744]: I0311 01:14:30.187570 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"ovsdbserver-sb-0\" (UID: \"1ca73c89-992f-4b36-9a70-5d67bace9cd2\") " pod="openstack/ovsdbserver-sb-0" Mar 11 01:14:30 crc kubenswrapper[4744]: I0311 01:14:30.187985 4744 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"ovsdbserver-sb-0\" (UID: \"1ca73c89-992f-4b36-9a70-5d67bace9cd2\") device mount path \"/mnt/openstack/pv07\"" pod="openstack/ovsdbserver-sb-0" Mar 11 01:14:30 crc kubenswrapper[4744]: I0311 01:14:30.188184 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/1ca73c89-992f-4b36-9a70-5d67bace9cd2-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"1ca73c89-992f-4b36-9a70-5d67bace9cd2\") " pod="openstack/ovsdbserver-sb-0" 
Mar 11 01:14:30 crc kubenswrapper[4744]: I0311 01:14:30.188599 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/1ca73c89-992f-4b36-9a70-5d67bace9cd2-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"1ca73c89-992f-4b36-9a70-5d67bace9cd2\") " pod="openstack/ovsdbserver-sb-0" Mar 11 01:14:30 crc kubenswrapper[4744]: I0311 01:14:30.189001 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1ca73c89-992f-4b36-9a70-5d67bace9cd2-config\") pod \"ovsdbserver-sb-0\" (UID: \"1ca73c89-992f-4b36-9a70-5d67bace9cd2\") " pod="openstack/ovsdbserver-sb-0" Mar 11 01:14:30 crc kubenswrapper[4744]: I0311 01:14:30.193747 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1ca73c89-992f-4b36-9a70-5d67bace9cd2-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"1ca73c89-992f-4b36-9a70-5d67bace9cd2\") " pod="openstack/ovsdbserver-sb-0" Mar 11 01:14:30 crc kubenswrapper[4744]: I0311 01:14:30.208242 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/1ca73c89-992f-4b36-9a70-5d67bace9cd2-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"1ca73c89-992f-4b36-9a70-5d67bace9cd2\") " pod="openstack/ovsdbserver-sb-0" Mar 11 01:14:30 crc kubenswrapper[4744]: I0311 01:14:30.208291 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/1ca73c89-992f-4b36-9a70-5d67bace9cd2-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"1ca73c89-992f-4b36-9a70-5d67bace9cd2\") " pod="openstack/ovsdbserver-sb-0" Mar 11 01:14:30 crc kubenswrapper[4744]: I0311 01:14:30.212118 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6gj26\" (UniqueName: 
\"kubernetes.io/projected/1ca73c89-992f-4b36-9a70-5d67bace9cd2-kube-api-access-6gj26\") pod \"ovsdbserver-sb-0\" (UID: \"1ca73c89-992f-4b36-9a70-5d67bace9cd2\") " pod="openstack/ovsdbserver-sb-0" Mar 11 01:14:30 crc kubenswrapper[4744]: I0311 01:14:30.216185 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"ovsdbserver-sb-0\" (UID: \"1ca73c89-992f-4b36-9a70-5d67bace9cd2\") " pod="openstack/ovsdbserver-sb-0" Mar 11 01:14:30 crc kubenswrapper[4744]: I0311 01:14:30.331371 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-sb-0" Mar 11 01:14:39 crc kubenswrapper[4744]: E0311 01:14:39.118543 4744 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-neutron-server@sha256:944833b50342d462c10637342bc85197a8cf099a3650df12e23854dde99af514" Mar 11 01:14:39 crc kubenswrapper[4744]: E0311 01:14:39.118865 4744 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:quay.io/podified-antelope-centos9/openstack-neutron-server@sha256:944833b50342d462c10637342bc85197a8cf099a3650df12e23854dde99af514,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries 
--test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n659h4h664hbh658h587h67ch89h587h8fh679hc6hf9h55fh644h5d5h698h68dh5cdh5ffh669h54ch9h689hb8hd4h5bfhd8h5d7h5fh665h574q,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:dns-svc,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/dns-svc,SubPath:dns-svc,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-pchgr,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-7c47bcb9f9-pm2vk_openstack(1a8fbf5b-fd33-4cff-a0bd-a40562fe3967): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 11 01:14:39 crc kubenswrapper[4744]: E0311 01:14:39.120034 4744 pod_workers.go:1301] "Error syncing 
pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/dnsmasq-dns-7c47bcb9f9-pm2vk" podUID="1a8fbf5b-fd33-4cff-a0bd-a40562fe3967" Mar 11 01:14:39 crc kubenswrapper[4744]: E0311 01:14:39.176569 4744 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-neutron-server@sha256:944833b50342d462c10637342bc85197a8cf099a3650df12e23854dde99af514" Mar 11 01:14:39 crc kubenswrapper[4744]: E0311 01:14:39.176708 4744 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:quay.io/podified-antelope-centos9/openstack-neutron-server@sha256:944833b50342d462c10637342bc85197a8cf099a3650df12e23854dde99af514,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries 
--test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:nffh5bdhf4h5f8h79h55h77h58fh56dh7bh6fh578hbch55dh68h56bhd9h65dh57ch658hc9h566h666h688h58h65dh684h5d7h6ch575h5d6h88q,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-mx42n,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-589db6c89c-m765l_openstack(af0e402e-6c61-44db-bd73-c6ebf202a68b): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 11 01:14:39 crc kubenswrapper[4744]: E0311 01:14:39.177863 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" 
pod="openstack/dnsmasq-dns-589db6c89c-m765l" podUID="af0e402e-6c61-44db-bd73-c6ebf202a68b" Mar 11 01:14:39 crc kubenswrapper[4744]: E0311 01:14:39.180130 4744 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-neutron-server@sha256:944833b50342d462c10637342bc85197a8cf099a3650df12e23854dde99af514" Mar 11 01:14:39 crc kubenswrapper[4744]: E0311 01:14:39.180323 4744 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:quay.io/podified-antelope-centos9/openstack-neutron-server@sha256:944833b50342d462c10637342bc85197a8cf099a3650df12e23854dde99af514,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries --test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:nfdh5dfhb6h64h676hc4h78h97h669h54chfbh696hb5h54bh5d4h6bh64h644h677h584h5cbh698h9dh5bbh5f8h5b8hcdh644h5c7h694hbfh589q,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:dns-svc,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/dns-svc,SubPath:dns-svc,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-t85kl,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,}
,},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-78cb4465c9-65zms_openstack(4a64957b-bca3-45be-bc38-637feaf98714): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 11 01:14:39 crc kubenswrapper[4744]: E0311 01:14:39.181462 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/dnsmasq-dns-78cb4465c9-65zms" podUID="4a64957b-bca3-45be-bc38-637feaf98714" Mar 11 01:14:39 crc kubenswrapper[4744]: E0311 01:14:39.190726 4744 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-neutron-server@sha256:944833b50342d462c10637342bc85197a8cf099a3650df12e23854dde99af514" Mar 11 01:14:39 crc kubenswrapper[4744]: E0311 01:14:39.190996 4744 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:quay.io/podified-antelope-centos9/openstack-neutron-server@sha256:944833b50342d462c10637342bc85197a8cf099a3650df12e23854dde99af514,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground 
--log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries --test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:ndfhb5h667h568h584h5f9h58dh565h664h587h597h577h64bh5c4h66fh647hbdh68ch5c5h68dh686h5f7h64hd7hc6h55fh57bh98h57fh87h5fh57fq,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:dns-svc,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/dns-svc,SubPath:dns-svc,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-xzdf5,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-86bbd886cf-kxs9t_openstack(ad35e961-10ed-45d2-bb0f-e12c4529d731): ErrImagePull: rpc error: code = Canceled desc = 
copying config: context canceled" logger="UnhandledError" Mar 11 01:14:39 crc kubenswrapper[4744]: E0311 01:14:39.192645 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/dnsmasq-dns-86bbd886cf-kxs9t" podUID="ad35e961-10ed-45d2-bb0f-e12c4529d731" Mar 11 01:14:39 crc kubenswrapper[4744]: W0311 01:14:39.682863 4744 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod730c901d_c3c5_46c5_b618_00cdcc17bef2.slice/crio-1b26f542b992a0c808fd6085d2325d9b6aa8290cf4c2b4d0581787fb7abed8f0 WatchSource:0}: Error finding container 1b26f542b992a0c808fd6085d2325d9b6aa8290cf4c2b4d0581787fb7abed8f0: Status 404 returned error can't find the container with id 1b26f542b992a0c808fd6085d2325d9b6aa8290cf4c2b4d0581787fb7abed8f0 Mar 11 01:14:39 crc kubenswrapper[4744]: I0311 01:14:39.688160 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/memcached-0"] Mar 11 01:14:39 crc kubenswrapper[4744]: I0311 01:14:39.702460 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-cell1-galera-0"] Mar 11 01:14:39 crc kubenswrapper[4744]: I0311 01:14:39.713039 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Mar 11 01:14:39 crc kubenswrapper[4744]: I0311 01:14:39.727076 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"382c7504-68d5-4132-adc7-fc2c804e5d3e","Type":"ContainerStarted","Data":"ac3725eaefc0d00cf9a08b71208875c6bd6da5cb355883a516a0d2d44c9a7b9d"} Mar 11 01:14:39 crc kubenswrapper[4744]: I0311 01:14:39.728960 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" 
event={"ID":"dbc15fc8-0b29-47ad-9ce1-38097df24920","Type":"ContainerStarted","Data":"bfe213e4876d0e50b49fe6b5cad33d848045c2de73079970d00b4c5a897f47d0"} Mar 11 01:14:39 crc kubenswrapper[4744]: I0311 01:14:39.730276 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" event={"ID":"730c901d-c3c5-46c5-b618-00cdcc17bef2","Type":"ContainerStarted","Data":"1b26f542b992a0c808fd6085d2325d9b6aa8290cf4c2b4d0581787fb7abed8f0"} Mar 11 01:14:39 crc kubenswrapper[4744]: E0311 01:14:39.732342 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-neutron-server@sha256:944833b50342d462c10637342bc85197a8cf099a3650df12e23854dde99af514\\\"\"" pod="openstack/dnsmasq-dns-78cb4465c9-65zms" podUID="4a64957b-bca3-45be-bc38-637feaf98714" Mar 11 01:14:39 crc kubenswrapper[4744]: E0311 01:14:39.732553 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-neutron-server@sha256:944833b50342d462c10637342bc85197a8cf099a3650df12e23854dde99af514\\\"\"" pod="openstack/dnsmasq-dns-7c47bcb9f9-pm2vk" podUID="1a8fbf5b-fd33-4cff-a0bd-a40562fe3967" Mar 11 01:14:39 crc kubenswrapper[4744]: I0311 01:14:39.790370 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-0"] Mar 11 01:14:39 crc kubenswrapper[4744]: W0311 01:14:39.798956 4744 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1ca73c89_992f_4b36_9a70_5d67bace9cd2.slice/crio-fbf657f064b1024b88a33df69bdbf53ecf6487764aaf86d97c7474510786a863 WatchSource:0}: Error finding container fbf657f064b1024b88a33df69bdbf53ecf6487764aaf86d97c7474510786a863: Status 404 returned error can't find the container with id 
fbf657f064b1024b88a33df69bdbf53ecf6487764aaf86d97c7474510786a863 Mar 11 01:14:39 crc kubenswrapper[4744]: I0311 01:14:39.893656 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-galera-0"] Mar 11 01:14:39 crc kubenswrapper[4744]: I0311 01:14:39.898921 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-2mjl7"] Mar 11 01:14:40 crc kubenswrapper[4744]: I0311 01:14:40.106941 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-0"] Mar 11 01:14:40 crc kubenswrapper[4744]: W0311 01:14:40.109435 4744 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod97c4f09f_eb97_40a0_b06c_80a5a922c986.slice/crio-c315daedfc27461657005479c8ea62043ffb53a3d4ad9aea00f2979b4a09e719 WatchSource:0}: Error finding container c315daedfc27461657005479c8ea62043ffb53a3d4ad9aea00f2979b4a09e719: Status 404 returned error can't find the container with id c315daedfc27461657005479c8ea62043ffb53a3d4ad9aea00f2979b4a09e719 Mar 11 01:14:40 crc kubenswrapper[4744]: I0311 01:14:40.121343 4744 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-86bbd886cf-kxs9t" Mar 11 01:14:40 crc kubenswrapper[4744]: W0311 01:14:40.184183 4744 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1ee48ea6_67ea_4da3_af92_82b9d0e5b67d.slice/crio-b4de8d22e0b094669f0815f7f5bada4bfca5e329c693edd03a9580de0e2c958e WatchSource:0}: Error finding container b4de8d22e0b094669f0815f7f5bada4bfca5e329c693edd03a9580de0e2c958e: Status 404 returned error can't find the container with id b4de8d22e0b094669f0815f7f5bada4bfca5e329c693edd03a9580de0e2c958e Mar 11 01:14:40 crc kubenswrapper[4744]: I0311 01:14:40.185270 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-ovs-88ffp"] Mar 11 01:14:40 crc kubenswrapper[4744]: I0311 01:14:40.201759 4744 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-589db6c89c-m765l" Mar 11 01:14:40 crc kubenswrapper[4744]: I0311 01:14:40.267889 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ad35e961-10ed-45d2-bb0f-e12c4529d731-dns-svc\") pod \"ad35e961-10ed-45d2-bb0f-e12c4529d731\" (UID: \"ad35e961-10ed-45d2-bb0f-e12c4529d731\") " Mar 11 01:14:40 crc kubenswrapper[4744]: I0311 01:14:40.267979 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xzdf5\" (UniqueName: \"kubernetes.io/projected/ad35e961-10ed-45d2-bb0f-e12c4529d731-kube-api-access-xzdf5\") pod \"ad35e961-10ed-45d2-bb0f-e12c4529d731\" (UID: \"ad35e961-10ed-45d2-bb0f-e12c4529d731\") " Mar 11 01:14:40 crc kubenswrapper[4744]: I0311 01:14:40.268057 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mx42n\" (UniqueName: \"kubernetes.io/projected/af0e402e-6c61-44db-bd73-c6ebf202a68b-kube-api-access-mx42n\") pod \"af0e402e-6c61-44db-bd73-c6ebf202a68b\" (UID: 
\"af0e402e-6c61-44db-bd73-c6ebf202a68b\") " Mar 11 01:14:40 crc kubenswrapper[4744]: I0311 01:14:40.268099 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ad35e961-10ed-45d2-bb0f-e12c4529d731-config\") pod \"ad35e961-10ed-45d2-bb0f-e12c4529d731\" (UID: \"ad35e961-10ed-45d2-bb0f-e12c4529d731\") " Mar 11 01:14:40 crc kubenswrapper[4744]: I0311 01:14:40.268317 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/af0e402e-6c61-44db-bd73-c6ebf202a68b-config\") pod \"af0e402e-6c61-44db-bd73-c6ebf202a68b\" (UID: \"af0e402e-6c61-44db-bd73-c6ebf202a68b\") " Mar 11 01:14:40 crc kubenswrapper[4744]: I0311 01:14:40.268689 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ad35e961-10ed-45d2-bb0f-e12c4529d731-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "ad35e961-10ed-45d2-bb0f-e12c4529d731" (UID: "ad35e961-10ed-45d2-bb0f-e12c4529d731"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 11 01:14:40 crc kubenswrapper[4744]: I0311 01:14:40.268984 4744 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ad35e961-10ed-45d2-bb0f-e12c4529d731-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 11 01:14:40 crc kubenswrapper[4744]: I0311 01:14:40.269066 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ad35e961-10ed-45d2-bb0f-e12c4529d731-config" (OuterVolumeSpecName: "config") pod "ad35e961-10ed-45d2-bb0f-e12c4529d731" (UID: "ad35e961-10ed-45d2-bb0f-e12c4529d731"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 11 01:14:40 crc kubenswrapper[4744]: I0311 01:14:40.269694 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/af0e402e-6c61-44db-bd73-c6ebf202a68b-config" (OuterVolumeSpecName: "config") pod "af0e402e-6c61-44db-bd73-c6ebf202a68b" (UID: "af0e402e-6c61-44db-bd73-c6ebf202a68b"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 11 01:14:40 crc kubenswrapper[4744]: I0311 01:14:40.273598 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ad35e961-10ed-45d2-bb0f-e12c4529d731-kube-api-access-xzdf5" (OuterVolumeSpecName: "kube-api-access-xzdf5") pod "ad35e961-10ed-45d2-bb0f-e12c4529d731" (UID: "ad35e961-10ed-45d2-bb0f-e12c4529d731"). InnerVolumeSpecName "kube-api-access-xzdf5". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 01:14:40 crc kubenswrapper[4744]: I0311 01:14:40.273707 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/af0e402e-6c61-44db-bd73-c6ebf202a68b-kube-api-access-mx42n" (OuterVolumeSpecName: "kube-api-access-mx42n") pod "af0e402e-6c61-44db-bd73-c6ebf202a68b" (UID: "af0e402e-6c61-44db-bd73-c6ebf202a68b"). InnerVolumeSpecName "kube-api-access-mx42n". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 01:14:40 crc kubenswrapper[4744]: I0311 01:14:40.370352 4744 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mx42n\" (UniqueName: \"kubernetes.io/projected/af0e402e-6c61-44db-bd73-c6ebf202a68b-kube-api-access-mx42n\") on node \"crc\" DevicePath \"\"" Mar 11 01:14:40 crc kubenswrapper[4744]: I0311 01:14:40.370402 4744 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ad35e961-10ed-45d2-bb0f-e12c4529d731-config\") on node \"crc\" DevicePath \"\"" Mar 11 01:14:40 crc kubenswrapper[4744]: I0311 01:14:40.370448 4744 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/af0e402e-6c61-44db-bd73-c6ebf202a68b-config\") on node \"crc\" DevicePath \"\"" Mar 11 01:14:40 crc kubenswrapper[4744]: I0311 01:14:40.370466 4744 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xzdf5\" (UniqueName: \"kubernetes.io/projected/ad35e961-10ed-45d2-bb0f-e12c4529d731-kube-api-access-xzdf5\") on node \"crc\" DevicePath \"\"" Mar 11 01:14:40 crc kubenswrapper[4744]: I0311 01:14:40.738319 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"6b383e05-0440-49ee-8add-708ea04e9ce7","Type":"ContainerStarted","Data":"d87191ac09c3db8a6f76e7da327659cedbea082e3b6af5f2072d6a56758695f2"} Mar 11 01:14:40 crc kubenswrapper[4744]: I0311 01:14:40.739904 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"714c91e5-04c5-4f95-97e3-a3c08664944d","Type":"ContainerStarted","Data":"e724fad610e3cb354b224dbc23638db68990df9c737ed272890fd1779688fc45"} Mar 11 01:14:40 crc kubenswrapper[4744]: I0311 01:14:40.742160 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" 
event={"ID":"1ca73c89-992f-4b36-9a70-5d67bace9cd2","Type":"ContainerStarted","Data":"fbf657f064b1024b88a33df69bdbf53ecf6487764aaf86d97c7474510786a863"} Mar 11 01:14:40 crc kubenswrapper[4744]: I0311 01:14:40.744953 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"97c4f09f-eb97-40a0-b06c-80a5a922c986","Type":"ContainerStarted","Data":"c315daedfc27461657005479c8ea62043ffb53a3d4ad9aea00f2979b4a09e719"} Mar 11 01:14:40 crc kubenswrapper[4744]: I0311 01:14:40.746365 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-589db6c89c-m765l" event={"ID":"af0e402e-6c61-44db-bd73-c6ebf202a68b","Type":"ContainerDied","Data":"d5f5c2b9b5d4f2e9e257856a4afec9a44d314b2567bf10e9815f0a3e3880c6da"} Mar 11 01:14:40 crc kubenswrapper[4744]: I0311 01:14:40.746447 4744 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-589db6c89c-m765l" Mar 11 01:14:40 crc kubenswrapper[4744]: I0311 01:14:40.751928 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-2mjl7" event={"ID":"fe2603a1-fdea-44d4-8188-f5f93324575c","Type":"ContainerStarted","Data":"a39b7ab94f8b38ae1a9f49a5c072232b53543de80f123e4ecc83052d77c4e49c"} Mar 11 01:14:40 crc kubenswrapper[4744]: I0311 01:14:40.752729 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-86bbd886cf-kxs9t" event={"ID":"ad35e961-10ed-45d2-bb0f-e12c4529d731","Type":"ContainerDied","Data":"f6fc8980a4381de53a642cba9e715eb73266dac8d9b4c5c431d582e580fd32a8"} Mar 11 01:14:40 crc kubenswrapper[4744]: I0311 01:14:40.752794 4744 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-86bbd886cf-kxs9t" Mar 11 01:14:40 crc kubenswrapper[4744]: I0311 01:14:40.758789 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-88ffp" event={"ID":"1ee48ea6-67ea-4da3-af92-82b9d0e5b67d","Type":"ContainerStarted","Data":"b4de8d22e0b094669f0815f7f5bada4bfca5e329c693edd03a9580de0e2c958e"} Mar 11 01:14:40 crc kubenswrapper[4744]: I0311 01:14:40.795246 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"fb920bf7-eae5-4f7a-9af7-bde85bfb4ee9","Type":"ContainerStarted","Data":"dd9f74256d9d36d7d93b0c687c50126a3012a98632ea369b61ca6eb2ada71f31"} Mar 11 01:14:40 crc kubenswrapper[4744]: I0311 01:14:40.819569 4744 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-589db6c89c-m765l"] Mar 11 01:14:40 crc kubenswrapper[4744]: I0311 01:14:40.836869 4744 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-589db6c89c-m765l"] Mar 11 01:14:40 crc kubenswrapper[4744]: I0311 01:14:40.868431 4744 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-86bbd886cf-kxs9t"] Mar 11 01:14:40 crc kubenswrapper[4744]: I0311 01:14:40.873657 4744 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-86bbd886cf-kxs9t"] Mar 11 01:14:40 crc kubenswrapper[4744]: E0311 01:14:40.943156 4744 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podad35e961_10ed_45d2_bb0f_e12c4529d731.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podaf0e402e_6c61_44db_bd73_c6ebf202a68b.slice\": RecentStats: unable to find data in memory cache]" Mar 11 01:14:41 crc kubenswrapper[4744]: I0311 01:14:41.983710 4744 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="ad35e961-10ed-45d2-bb0f-e12c4529d731" path="/var/lib/kubelet/pods/ad35e961-10ed-45d2-bb0f-e12c4529d731/volumes" Mar 11 01:14:41 crc kubenswrapper[4744]: I0311 01:14:41.984106 4744 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="af0e402e-6c61-44db-bd73-c6ebf202a68b" path="/var/lib/kubelet/pods/af0e402e-6c61-44db-bd73-c6ebf202a68b/volumes" Mar 11 01:14:47 crc kubenswrapper[4744]: I0311 01:14:47.875005 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" event={"ID":"730c901d-c3c5-46c5-b618-00cdcc17bef2","Type":"ContainerStarted","Data":"041f61378d1b6288e78cda19110d0e5a7b5f6590d353d62737e351291c23ef4b"} Mar 11 01:14:47 crc kubenswrapper[4744]: I0311 01:14:47.875421 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/memcached-0" Mar 11 01:14:47 crc kubenswrapper[4744]: I0311 01:14:47.878858 4744 generic.go:334] "Generic (PLEG): container finished" podID="1ee48ea6-67ea-4da3-af92-82b9d0e5b67d" containerID="aaa497f2031029fd775e54985005baafae8116b7da7e04e50aa695b21c4f58a6" exitCode=0 Mar 11 01:14:47 crc kubenswrapper[4744]: I0311 01:14:47.879323 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-88ffp" event={"ID":"1ee48ea6-67ea-4da3-af92-82b9d0e5b67d","Type":"ContainerDied","Data":"aaa497f2031029fd775e54985005baafae8116b7da7e04e50aa695b21c4f58a6"} Mar 11 01:14:47 crc kubenswrapper[4744]: I0311 01:14:47.893828 4744 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/memcached-0" podStartSLOduration=20.724992022 podStartE2EDuration="27.893805751s" podCreationTimestamp="2026-03-11 01:14:20 +0000 UTC" firstStartedPulling="2026-03-11 01:14:39.687574475 +0000 UTC m=+1236.491792080" lastFinishedPulling="2026-03-11 01:14:46.856388204 +0000 UTC m=+1243.660605809" observedRunningTime="2026-03-11 01:14:47.890451427 +0000 UTC m=+1244.694669042" watchObservedRunningTime="2026-03-11 01:14:47.893805751 +0000 UTC 
m=+1244.698023376" Mar 11 01:14:47 crc kubenswrapper[4744]: I0311 01:14:47.908847 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"97c4f09f-eb97-40a0-b06c-80a5a922c986","Type":"ContainerStarted","Data":"9d84530f46578e0e43683833f9c6f2305fe1b6a881fac74e31a0651183712b14"} Mar 11 01:14:47 crc kubenswrapper[4744]: I0311 01:14:47.912254 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"1ca73c89-992f-4b36-9a70-5d67bace9cd2","Type":"ContainerStarted","Data":"21669e2f41b8409f07be3ecda90e7e2a179ecbcdbcef99ca2a140c11a5fb6bc4"} Mar 11 01:14:47 crc kubenswrapper[4744]: I0311 01:14:47.915461 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"382c7504-68d5-4132-adc7-fc2c804e5d3e","Type":"ContainerStarted","Data":"c27361c775bdd047e5a87d9b776a1160e98c796ab972f2c1582f331a91ebe454"} Mar 11 01:14:47 crc kubenswrapper[4744]: I0311 01:14:47.922635 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-2mjl7" event={"ID":"fe2603a1-fdea-44d4-8188-f5f93324575c","Type":"ContainerStarted","Data":"50901ae2cb674520b24a492f9dec7c16fa12e3ec15657bcafeef0653de88602d"} Mar 11 01:14:47 crc kubenswrapper[4744]: I0311 01:14:47.923275 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-2mjl7" Mar 11 01:14:47 crc kubenswrapper[4744]: I0311 01:14:47.925910 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"6b383e05-0440-49ee-8add-708ea04e9ce7","Type":"ContainerStarted","Data":"f17d6132fb9aad59781a44370c2084f529634cf0ee10583babf1eb3b469f6924"} Mar 11 01:14:47 crc kubenswrapper[4744]: I0311 01:14:47.928486 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" 
event={"ID":"dbc15fc8-0b29-47ad-9ce1-38097df24920","Type":"ContainerStarted","Data":"d75b57ad203a91708de204fd623e2cbc3a859d6cce5a5c47588de99db8116945"} Mar 11 01:14:47 crc kubenswrapper[4744]: I0311 01:14:47.929006 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/kube-state-metrics-0" Mar 11 01:14:47 crc kubenswrapper[4744]: I0311 01:14:47.972731 4744 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/kube-state-metrics-0" podStartSLOduration=18.879632556 podStartE2EDuration="25.972708617s" podCreationTimestamp="2026-03-11 01:14:22 +0000 UTC" firstStartedPulling="2026-03-11 01:14:39.711774086 +0000 UTC m=+1236.515991691" lastFinishedPulling="2026-03-11 01:14:46.804850147 +0000 UTC m=+1243.609067752" observedRunningTime="2026-03-11 01:14:47.965221744 +0000 UTC m=+1244.769439359" watchObservedRunningTime="2026-03-11 01:14:47.972708617 +0000 UTC m=+1244.776926242" Mar 11 01:14:48 crc kubenswrapper[4744]: I0311 01:14:48.012792 4744 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-2mjl7" podStartSLOduration=15.030307505 podStartE2EDuration="22.012775408s" podCreationTimestamp="2026-03-11 01:14:26 +0000 UTC" firstStartedPulling="2026-03-11 01:14:39.877702769 +0000 UTC m=+1236.681920374" lastFinishedPulling="2026-03-11 01:14:46.860170672 +0000 UTC m=+1243.664388277" observedRunningTime="2026-03-11 01:14:48.009182938 +0000 UTC m=+1244.813400543" watchObservedRunningTime="2026-03-11 01:14:48.012775408 +0000 UTC m=+1244.816993013" Mar 11 01:14:48 crc kubenswrapper[4744]: I0311 01:14:48.942823 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-88ffp" event={"ID":"1ee48ea6-67ea-4da3-af92-82b9d0e5b67d","Type":"ContainerStarted","Data":"acfad524609d9f297101d39eb6bbc89088706371a3d119c57c6f3e38150b4144"} Mar 11 01:14:48 crc kubenswrapper[4744]: I0311 01:14:48.943321 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/ovn-controller-ovs-88ffp" event={"ID":"1ee48ea6-67ea-4da3-af92-82b9d0e5b67d","Type":"ContainerStarted","Data":"376ac00d3a5176903cb096caf4aa6fb34bc54b52bfdfcffda0f5db681e50ec05"} Mar 11 01:14:49 crc kubenswrapper[4744]: I0311 01:14:49.925367 4744 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-ovs-88ffp" podStartSLOduration=17.312681862 podStartE2EDuration="23.925340482s" podCreationTimestamp="2026-03-11 01:14:26 +0000 UTC" firstStartedPulling="2026-03-11 01:14:40.186194151 +0000 UTC m=+1236.990411756" lastFinishedPulling="2026-03-11 01:14:46.798852771 +0000 UTC m=+1243.603070376" observedRunningTime="2026-03-11 01:14:48.975213931 +0000 UTC m=+1245.779431586" watchObservedRunningTime="2026-03-11 01:14:49.925340482 +0000 UTC m=+1246.729558097" Mar 11 01:14:49 crc kubenswrapper[4744]: I0311 01:14:49.935816 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-metrics-nrcjs"] Mar 11 01:14:49 crc kubenswrapper[4744]: I0311 01:14:49.937029 4744 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-metrics-nrcjs" Mar 11 01:14:49 crc kubenswrapper[4744]: I0311 01:14:49.945125 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-metrics-config" Mar 11 01:14:49 crc kubenswrapper[4744]: I0311 01:14:49.961570 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-ovs-88ffp" Mar 11 01:14:49 crc kubenswrapper[4744]: I0311 01:14:49.961973 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-ovs-88ffp" Mar 11 01:14:50 crc kubenswrapper[4744]: I0311 01:14:50.002091 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-metrics-nrcjs"] Mar 11 01:14:50 crc kubenswrapper[4744]: I0311 01:14:50.043274 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/ebd1c76c-75f8-411f-9350-a0e31f1721cd-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-nrcjs\" (UID: \"ebd1c76c-75f8-411f-9350-a0e31f1721cd\") " pod="openstack/ovn-controller-metrics-nrcjs" Mar 11 01:14:50 crc kubenswrapper[4744]: I0311 01:14:50.043343 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zh7rw\" (UniqueName: \"kubernetes.io/projected/ebd1c76c-75f8-411f-9350-a0e31f1721cd-kube-api-access-zh7rw\") pod \"ovn-controller-metrics-nrcjs\" (UID: \"ebd1c76c-75f8-411f-9350-a0e31f1721cd\") " pod="openstack/ovn-controller-metrics-nrcjs" Mar 11 01:14:50 crc kubenswrapper[4744]: I0311 01:14:50.043380 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ebd1c76c-75f8-411f-9350-a0e31f1721cd-combined-ca-bundle\") pod \"ovn-controller-metrics-nrcjs\" (UID: \"ebd1c76c-75f8-411f-9350-a0e31f1721cd\") " 
pod="openstack/ovn-controller-metrics-nrcjs" Mar 11 01:14:50 crc kubenswrapper[4744]: I0311 01:14:50.043404 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/ebd1c76c-75f8-411f-9350-a0e31f1721cd-ovn-rundir\") pod \"ovn-controller-metrics-nrcjs\" (UID: \"ebd1c76c-75f8-411f-9350-a0e31f1721cd\") " pod="openstack/ovn-controller-metrics-nrcjs" Mar 11 01:14:50 crc kubenswrapper[4744]: I0311 01:14:50.043421 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/ebd1c76c-75f8-411f-9350-a0e31f1721cd-ovs-rundir\") pod \"ovn-controller-metrics-nrcjs\" (UID: \"ebd1c76c-75f8-411f-9350-a0e31f1721cd\") " pod="openstack/ovn-controller-metrics-nrcjs" Mar 11 01:14:50 crc kubenswrapper[4744]: I0311 01:14:50.043452 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ebd1c76c-75f8-411f-9350-a0e31f1721cd-config\") pod \"ovn-controller-metrics-nrcjs\" (UID: \"ebd1c76c-75f8-411f-9350-a0e31f1721cd\") " pod="openstack/ovn-controller-metrics-nrcjs" Mar 11 01:14:50 crc kubenswrapper[4744]: I0311 01:14:50.144456 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/ebd1c76c-75f8-411f-9350-a0e31f1721cd-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-nrcjs\" (UID: \"ebd1c76c-75f8-411f-9350-a0e31f1721cd\") " pod="openstack/ovn-controller-metrics-nrcjs" Mar 11 01:14:50 crc kubenswrapper[4744]: I0311 01:14:50.145262 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zh7rw\" (UniqueName: \"kubernetes.io/projected/ebd1c76c-75f8-411f-9350-a0e31f1721cd-kube-api-access-zh7rw\") pod \"ovn-controller-metrics-nrcjs\" (UID: \"ebd1c76c-75f8-411f-9350-a0e31f1721cd\") " 
pod="openstack/ovn-controller-metrics-nrcjs" Mar 11 01:14:50 crc kubenswrapper[4744]: I0311 01:14:50.145425 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ebd1c76c-75f8-411f-9350-a0e31f1721cd-combined-ca-bundle\") pod \"ovn-controller-metrics-nrcjs\" (UID: \"ebd1c76c-75f8-411f-9350-a0e31f1721cd\") " pod="openstack/ovn-controller-metrics-nrcjs" Mar 11 01:14:50 crc kubenswrapper[4744]: I0311 01:14:50.145493 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/ebd1c76c-75f8-411f-9350-a0e31f1721cd-ovn-rundir\") pod \"ovn-controller-metrics-nrcjs\" (UID: \"ebd1c76c-75f8-411f-9350-a0e31f1721cd\") " pod="openstack/ovn-controller-metrics-nrcjs" Mar 11 01:14:50 crc kubenswrapper[4744]: I0311 01:14:50.145536 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/ebd1c76c-75f8-411f-9350-a0e31f1721cd-ovs-rundir\") pod \"ovn-controller-metrics-nrcjs\" (UID: \"ebd1c76c-75f8-411f-9350-a0e31f1721cd\") " pod="openstack/ovn-controller-metrics-nrcjs" Mar 11 01:14:50 crc kubenswrapper[4744]: I0311 01:14:50.145710 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ebd1c76c-75f8-411f-9350-a0e31f1721cd-config\") pod \"ovn-controller-metrics-nrcjs\" (UID: \"ebd1c76c-75f8-411f-9350-a0e31f1721cd\") " pod="openstack/ovn-controller-metrics-nrcjs" Mar 11 01:14:50 crc kubenswrapper[4744]: I0311 01:14:50.145952 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/ebd1c76c-75f8-411f-9350-a0e31f1721cd-ovs-rundir\") pod \"ovn-controller-metrics-nrcjs\" (UID: \"ebd1c76c-75f8-411f-9350-a0e31f1721cd\") " pod="openstack/ovn-controller-metrics-nrcjs" Mar 11 01:14:50 crc kubenswrapper[4744]: I0311 
01:14:50.146205 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/ebd1c76c-75f8-411f-9350-a0e31f1721cd-ovn-rundir\") pod \"ovn-controller-metrics-nrcjs\" (UID: \"ebd1c76c-75f8-411f-9350-a0e31f1721cd\") " pod="openstack/ovn-controller-metrics-nrcjs" Mar 11 01:14:50 crc kubenswrapper[4744]: I0311 01:14:50.146429 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ebd1c76c-75f8-411f-9350-a0e31f1721cd-config\") pod \"ovn-controller-metrics-nrcjs\" (UID: \"ebd1c76c-75f8-411f-9350-a0e31f1721cd\") " pod="openstack/ovn-controller-metrics-nrcjs" Mar 11 01:14:50 crc kubenswrapper[4744]: I0311 01:14:50.151185 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/ebd1c76c-75f8-411f-9350-a0e31f1721cd-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-nrcjs\" (UID: \"ebd1c76c-75f8-411f-9350-a0e31f1721cd\") " pod="openstack/ovn-controller-metrics-nrcjs" Mar 11 01:14:50 crc kubenswrapper[4744]: I0311 01:14:50.154114 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ebd1c76c-75f8-411f-9350-a0e31f1721cd-combined-ca-bundle\") pod \"ovn-controller-metrics-nrcjs\" (UID: \"ebd1c76c-75f8-411f-9350-a0e31f1721cd\") " pod="openstack/ovn-controller-metrics-nrcjs" Mar 11 01:14:50 crc kubenswrapper[4744]: I0311 01:14:50.174541 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zh7rw\" (UniqueName: \"kubernetes.io/projected/ebd1c76c-75f8-411f-9350-a0e31f1721cd-kube-api-access-zh7rw\") pod \"ovn-controller-metrics-nrcjs\" (UID: \"ebd1c76c-75f8-411f-9350-a0e31f1721cd\") " pod="openstack/ovn-controller-metrics-nrcjs" Mar 11 01:14:50 crc kubenswrapper[4744]: I0311 01:14:50.235562 4744 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openstack/dnsmasq-dns-78cb4465c9-65zms"] Mar 11 01:14:50 crc kubenswrapper[4744]: I0311 01:14:50.261076 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-b47ddbdf5-vpw7m"] Mar 11 01:14:50 crc kubenswrapper[4744]: I0311 01:14:50.262229 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-b47ddbdf5-vpw7m" Mar 11 01:14:50 crc kubenswrapper[4744]: I0311 01:14:50.267366 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovsdbserver-nb" Mar 11 01:14:50 crc kubenswrapper[4744]: I0311 01:14:50.281867 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-metrics-nrcjs" Mar 11 01:14:50 crc kubenswrapper[4744]: I0311 01:14:50.287616 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-b47ddbdf5-vpw7m"] Mar 11 01:14:50 crc kubenswrapper[4744]: I0311 01:14:50.443415 4744 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7c47bcb9f9-pm2vk"] Mar 11 01:14:50 crc kubenswrapper[4744]: I0311 01:14:50.453497 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/c9410349-9a4f-42f6-81e4-dd4dd66abdcc-ovsdbserver-nb\") pod \"dnsmasq-dns-b47ddbdf5-vpw7m\" (UID: \"c9410349-9a4f-42f6-81e4-dd4dd66abdcc\") " pod="openstack/dnsmasq-dns-b47ddbdf5-vpw7m" Mar 11 01:14:50 crc kubenswrapper[4744]: I0311 01:14:50.453559 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z8qmr\" (UniqueName: \"kubernetes.io/projected/c9410349-9a4f-42f6-81e4-dd4dd66abdcc-kube-api-access-z8qmr\") pod \"dnsmasq-dns-b47ddbdf5-vpw7m\" (UID: \"c9410349-9a4f-42f6-81e4-dd4dd66abdcc\") " pod="openstack/dnsmasq-dns-b47ddbdf5-vpw7m" Mar 11 01:14:50 crc kubenswrapper[4744]: I0311 01:14:50.453579 4744 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c9410349-9a4f-42f6-81e4-dd4dd66abdcc-dns-svc\") pod \"dnsmasq-dns-b47ddbdf5-vpw7m\" (UID: \"c9410349-9a4f-42f6-81e4-dd4dd66abdcc\") " pod="openstack/dnsmasq-dns-b47ddbdf5-vpw7m" Mar 11 01:14:50 crc kubenswrapper[4744]: I0311 01:14:50.453619 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c9410349-9a4f-42f6-81e4-dd4dd66abdcc-config\") pod \"dnsmasq-dns-b47ddbdf5-vpw7m\" (UID: \"c9410349-9a4f-42f6-81e4-dd4dd66abdcc\") " pod="openstack/dnsmasq-dns-b47ddbdf5-vpw7m" Mar 11 01:14:50 crc kubenswrapper[4744]: I0311 01:14:50.504553 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-659ddb758c-fzrlb"] Mar 11 01:14:50 crc kubenswrapper[4744]: I0311 01:14:50.505856 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-659ddb758c-fzrlb" Mar 11 01:14:50 crc kubenswrapper[4744]: I0311 01:14:50.507835 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovsdbserver-sb" Mar 11 01:14:50 crc kubenswrapper[4744]: I0311 01:14:50.513941 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-659ddb758c-fzrlb"] Mar 11 01:14:50 crc kubenswrapper[4744]: I0311 01:14:50.557994 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c9410349-9a4f-42f6-81e4-dd4dd66abdcc-config\") pod \"dnsmasq-dns-b47ddbdf5-vpw7m\" (UID: \"c9410349-9a4f-42f6-81e4-dd4dd66abdcc\") " pod="openstack/dnsmasq-dns-b47ddbdf5-vpw7m" Mar 11 01:14:50 crc kubenswrapper[4744]: I0311 01:14:50.558398 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/c9410349-9a4f-42f6-81e4-dd4dd66abdcc-ovsdbserver-nb\") pod 
\"dnsmasq-dns-b47ddbdf5-vpw7m\" (UID: \"c9410349-9a4f-42f6-81e4-dd4dd66abdcc\") " pod="openstack/dnsmasq-dns-b47ddbdf5-vpw7m" Mar 11 01:14:50 crc kubenswrapper[4744]: I0311 01:14:50.558461 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z8qmr\" (UniqueName: \"kubernetes.io/projected/c9410349-9a4f-42f6-81e4-dd4dd66abdcc-kube-api-access-z8qmr\") pod \"dnsmasq-dns-b47ddbdf5-vpw7m\" (UID: \"c9410349-9a4f-42f6-81e4-dd4dd66abdcc\") " pod="openstack/dnsmasq-dns-b47ddbdf5-vpw7m" Mar 11 01:14:50 crc kubenswrapper[4744]: I0311 01:14:50.558497 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c9410349-9a4f-42f6-81e4-dd4dd66abdcc-dns-svc\") pod \"dnsmasq-dns-b47ddbdf5-vpw7m\" (UID: \"c9410349-9a4f-42f6-81e4-dd4dd66abdcc\") " pod="openstack/dnsmasq-dns-b47ddbdf5-vpw7m" Mar 11 01:14:50 crc kubenswrapper[4744]: I0311 01:14:50.560231 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/c9410349-9a4f-42f6-81e4-dd4dd66abdcc-ovsdbserver-nb\") pod \"dnsmasq-dns-b47ddbdf5-vpw7m\" (UID: \"c9410349-9a4f-42f6-81e4-dd4dd66abdcc\") " pod="openstack/dnsmasq-dns-b47ddbdf5-vpw7m" Mar 11 01:14:50 crc kubenswrapper[4744]: I0311 01:14:50.560933 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c9410349-9a4f-42f6-81e4-dd4dd66abdcc-config\") pod \"dnsmasq-dns-b47ddbdf5-vpw7m\" (UID: \"c9410349-9a4f-42f6-81e4-dd4dd66abdcc\") " pod="openstack/dnsmasq-dns-b47ddbdf5-vpw7m" Mar 11 01:14:50 crc kubenswrapper[4744]: I0311 01:14:50.562653 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c9410349-9a4f-42f6-81e4-dd4dd66abdcc-dns-svc\") pod \"dnsmasq-dns-b47ddbdf5-vpw7m\" (UID: \"c9410349-9a4f-42f6-81e4-dd4dd66abdcc\") " 
pod="openstack/dnsmasq-dns-b47ddbdf5-vpw7m" Mar 11 01:14:50 crc kubenswrapper[4744]: I0311 01:14:50.582459 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z8qmr\" (UniqueName: \"kubernetes.io/projected/c9410349-9a4f-42f6-81e4-dd4dd66abdcc-kube-api-access-z8qmr\") pod \"dnsmasq-dns-b47ddbdf5-vpw7m\" (UID: \"c9410349-9a4f-42f6-81e4-dd4dd66abdcc\") " pod="openstack/dnsmasq-dns-b47ddbdf5-vpw7m" Mar 11 01:14:50 crc kubenswrapper[4744]: I0311 01:14:50.587291 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-b47ddbdf5-vpw7m" Mar 11 01:14:50 crc kubenswrapper[4744]: I0311 01:14:50.661908 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/102a3691-340d-4bed-b87f-7bebbdb1f819-ovsdbserver-nb\") pod \"dnsmasq-dns-659ddb758c-fzrlb\" (UID: \"102a3691-340d-4bed-b87f-7bebbdb1f819\") " pod="openstack/dnsmasq-dns-659ddb758c-fzrlb" Mar 11 01:14:50 crc kubenswrapper[4744]: I0311 01:14:50.662276 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/102a3691-340d-4bed-b87f-7bebbdb1f819-dns-svc\") pod \"dnsmasq-dns-659ddb758c-fzrlb\" (UID: \"102a3691-340d-4bed-b87f-7bebbdb1f819\") " pod="openstack/dnsmasq-dns-659ddb758c-fzrlb" Mar 11 01:14:50 crc kubenswrapper[4744]: I0311 01:14:50.662396 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dmw94\" (UniqueName: \"kubernetes.io/projected/102a3691-340d-4bed-b87f-7bebbdb1f819-kube-api-access-dmw94\") pod \"dnsmasq-dns-659ddb758c-fzrlb\" (UID: \"102a3691-340d-4bed-b87f-7bebbdb1f819\") " pod="openstack/dnsmasq-dns-659ddb758c-fzrlb" Mar 11 01:14:50 crc kubenswrapper[4744]: I0311 01:14:50.662421 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"config\" (UniqueName: \"kubernetes.io/configmap/102a3691-340d-4bed-b87f-7bebbdb1f819-config\") pod \"dnsmasq-dns-659ddb758c-fzrlb\" (UID: \"102a3691-340d-4bed-b87f-7bebbdb1f819\") " pod="openstack/dnsmasq-dns-659ddb758c-fzrlb" Mar 11 01:14:50 crc kubenswrapper[4744]: I0311 01:14:50.662472 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/102a3691-340d-4bed-b87f-7bebbdb1f819-ovsdbserver-sb\") pod \"dnsmasq-dns-659ddb758c-fzrlb\" (UID: \"102a3691-340d-4bed-b87f-7bebbdb1f819\") " pod="openstack/dnsmasq-dns-659ddb758c-fzrlb" Mar 11 01:14:50 crc kubenswrapper[4744]: I0311 01:14:50.764503 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dmw94\" (UniqueName: \"kubernetes.io/projected/102a3691-340d-4bed-b87f-7bebbdb1f819-kube-api-access-dmw94\") pod \"dnsmasq-dns-659ddb758c-fzrlb\" (UID: \"102a3691-340d-4bed-b87f-7bebbdb1f819\") " pod="openstack/dnsmasq-dns-659ddb758c-fzrlb" Mar 11 01:14:50 crc kubenswrapper[4744]: I0311 01:14:50.764564 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/102a3691-340d-4bed-b87f-7bebbdb1f819-config\") pod \"dnsmasq-dns-659ddb758c-fzrlb\" (UID: \"102a3691-340d-4bed-b87f-7bebbdb1f819\") " pod="openstack/dnsmasq-dns-659ddb758c-fzrlb" Mar 11 01:14:50 crc kubenswrapper[4744]: I0311 01:14:50.764610 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/102a3691-340d-4bed-b87f-7bebbdb1f819-ovsdbserver-sb\") pod \"dnsmasq-dns-659ddb758c-fzrlb\" (UID: \"102a3691-340d-4bed-b87f-7bebbdb1f819\") " pod="openstack/dnsmasq-dns-659ddb758c-fzrlb" Mar 11 01:14:50 crc kubenswrapper[4744]: I0311 01:14:50.764690 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: 
\"kubernetes.io/configmap/102a3691-340d-4bed-b87f-7bebbdb1f819-ovsdbserver-nb\") pod \"dnsmasq-dns-659ddb758c-fzrlb\" (UID: \"102a3691-340d-4bed-b87f-7bebbdb1f819\") " pod="openstack/dnsmasq-dns-659ddb758c-fzrlb" Mar 11 01:14:50 crc kubenswrapper[4744]: I0311 01:14:50.765723 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/102a3691-340d-4bed-b87f-7bebbdb1f819-ovsdbserver-nb\") pod \"dnsmasq-dns-659ddb758c-fzrlb\" (UID: \"102a3691-340d-4bed-b87f-7bebbdb1f819\") " pod="openstack/dnsmasq-dns-659ddb758c-fzrlb" Mar 11 01:14:50 crc kubenswrapper[4744]: I0311 01:14:50.765725 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/102a3691-340d-4bed-b87f-7bebbdb1f819-ovsdbserver-sb\") pod \"dnsmasq-dns-659ddb758c-fzrlb\" (UID: \"102a3691-340d-4bed-b87f-7bebbdb1f819\") " pod="openstack/dnsmasq-dns-659ddb758c-fzrlb" Mar 11 01:14:50 crc kubenswrapper[4744]: I0311 01:14:50.766210 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/102a3691-340d-4bed-b87f-7bebbdb1f819-config\") pod \"dnsmasq-dns-659ddb758c-fzrlb\" (UID: \"102a3691-340d-4bed-b87f-7bebbdb1f819\") " pod="openstack/dnsmasq-dns-659ddb758c-fzrlb" Mar 11 01:14:50 crc kubenswrapper[4744]: I0311 01:14:50.765080 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/102a3691-340d-4bed-b87f-7bebbdb1f819-dns-svc\") pod \"dnsmasq-dns-659ddb758c-fzrlb\" (UID: \"102a3691-340d-4bed-b87f-7bebbdb1f819\") " pod="openstack/dnsmasq-dns-659ddb758c-fzrlb" Mar 11 01:14:50 crc kubenswrapper[4744]: I0311 01:14:50.766353 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/102a3691-340d-4bed-b87f-7bebbdb1f819-dns-svc\") pod \"dnsmasq-dns-659ddb758c-fzrlb\" (UID: 
\"102a3691-340d-4bed-b87f-7bebbdb1f819\") " pod="openstack/dnsmasq-dns-659ddb758c-fzrlb" Mar 11 01:14:50 crc kubenswrapper[4744]: I0311 01:14:50.784123 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dmw94\" (UniqueName: \"kubernetes.io/projected/102a3691-340d-4bed-b87f-7bebbdb1f819-kube-api-access-dmw94\") pod \"dnsmasq-dns-659ddb758c-fzrlb\" (UID: \"102a3691-340d-4bed-b87f-7bebbdb1f819\") " pod="openstack/dnsmasq-dns-659ddb758c-fzrlb" Mar 11 01:14:50 crc kubenswrapper[4744]: I0311 01:14:50.842227 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-659ddb758c-fzrlb" Mar 11 01:14:50 crc kubenswrapper[4744]: I0311 01:14:50.973979 4744 generic.go:334] "Generic (PLEG): container finished" podID="382c7504-68d5-4132-adc7-fc2c804e5d3e" containerID="c27361c775bdd047e5a87d9b776a1160e98c796ab972f2c1582f331a91ebe454" exitCode=0 Mar 11 01:14:50 crc kubenswrapper[4744]: I0311 01:14:50.974085 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"382c7504-68d5-4132-adc7-fc2c804e5d3e","Type":"ContainerDied","Data":"c27361c775bdd047e5a87d9b776a1160e98c796ab972f2c1582f331a91ebe454"} Mar 11 01:14:50 crc kubenswrapper[4744]: I0311 01:14:50.978374 4744 generic.go:334] "Generic (PLEG): container finished" podID="6b383e05-0440-49ee-8add-708ea04e9ce7" containerID="f17d6132fb9aad59781a44370c2084f529634cf0ee10583babf1eb3b469f6924" exitCode=0 Mar 11 01:14:50 crc kubenswrapper[4744]: I0311 01:14:50.978455 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"6b383e05-0440-49ee-8add-708ea04e9ce7","Type":"ContainerDied","Data":"f17d6132fb9aad59781a44370c2084f529634cf0ee10583babf1eb3b469f6924"} Mar 11 01:14:51 crc kubenswrapper[4744]: I0311 01:14:51.438215 4744 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7c47bcb9f9-pm2vk" Mar 11 01:14:51 crc kubenswrapper[4744]: I0311 01:14:51.446065 4744 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-78cb4465c9-65zms" Mar 11 01:14:51 crc kubenswrapper[4744]: I0311 01:14:51.582614 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/1a8fbf5b-fd33-4cff-a0bd-a40562fe3967-dns-svc\") pod \"1a8fbf5b-fd33-4cff-a0bd-a40562fe3967\" (UID: \"1a8fbf5b-fd33-4cff-a0bd-a40562fe3967\") " Mar 11 01:14:51 crc kubenswrapper[4744]: I0311 01:14:51.582944 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4a64957b-bca3-45be-bc38-637feaf98714-dns-svc\") pod \"4a64957b-bca3-45be-bc38-637feaf98714\" (UID: \"4a64957b-bca3-45be-bc38-637feaf98714\") " Mar 11 01:14:51 crc kubenswrapper[4744]: I0311 01:14:51.582973 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1a8fbf5b-fd33-4cff-a0bd-a40562fe3967-config\") pod \"1a8fbf5b-fd33-4cff-a0bd-a40562fe3967\" (UID: \"1a8fbf5b-fd33-4cff-a0bd-a40562fe3967\") " Mar 11 01:14:51 crc kubenswrapper[4744]: I0311 01:14:51.583002 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pchgr\" (UniqueName: \"kubernetes.io/projected/1a8fbf5b-fd33-4cff-a0bd-a40562fe3967-kube-api-access-pchgr\") pod \"1a8fbf5b-fd33-4cff-a0bd-a40562fe3967\" (UID: \"1a8fbf5b-fd33-4cff-a0bd-a40562fe3967\") " Mar 11 01:14:51 crc kubenswrapper[4744]: I0311 01:14:51.583023 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-t85kl\" (UniqueName: \"kubernetes.io/projected/4a64957b-bca3-45be-bc38-637feaf98714-kube-api-access-t85kl\") pod \"4a64957b-bca3-45be-bc38-637feaf98714\" (UID: 
\"4a64957b-bca3-45be-bc38-637feaf98714\") " Mar 11 01:14:51 crc kubenswrapper[4744]: I0311 01:14:51.583329 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1a8fbf5b-fd33-4cff-a0bd-a40562fe3967-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "1a8fbf5b-fd33-4cff-a0bd-a40562fe3967" (UID: "1a8fbf5b-fd33-4cff-a0bd-a40562fe3967"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 11 01:14:51 crc kubenswrapper[4744]: I0311 01:14:51.583377 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4a64957b-bca3-45be-bc38-637feaf98714-config\") pod \"4a64957b-bca3-45be-bc38-637feaf98714\" (UID: \"4a64957b-bca3-45be-bc38-637feaf98714\") " Mar 11 01:14:51 crc kubenswrapper[4744]: I0311 01:14:51.583747 4744 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/1a8fbf5b-fd33-4cff-a0bd-a40562fe3967-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 11 01:14:51 crc kubenswrapper[4744]: I0311 01:14:51.583871 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1a8fbf5b-fd33-4cff-a0bd-a40562fe3967-config" (OuterVolumeSpecName: "config") pod "1a8fbf5b-fd33-4cff-a0bd-a40562fe3967" (UID: "1a8fbf5b-fd33-4cff-a0bd-a40562fe3967"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 11 01:14:51 crc kubenswrapper[4744]: I0311 01:14:51.583948 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4a64957b-bca3-45be-bc38-637feaf98714-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "4a64957b-bca3-45be-bc38-637feaf98714" (UID: "4a64957b-bca3-45be-bc38-637feaf98714"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 11 01:14:51 crc kubenswrapper[4744]: I0311 01:14:51.584140 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4a64957b-bca3-45be-bc38-637feaf98714-config" (OuterVolumeSpecName: "config") pod "4a64957b-bca3-45be-bc38-637feaf98714" (UID: "4a64957b-bca3-45be-bc38-637feaf98714"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 11 01:14:51 crc kubenswrapper[4744]: I0311 01:14:51.589046 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4a64957b-bca3-45be-bc38-637feaf98714-kube-api-access-t85kl" (OuterVolumeSpecName: "kube-api-access-t85kl") pod "4a64957b-bca3-45be-bc38-637feaf98714" (UID: "4a64957b-bca3-45be-bc38-637feaf98714"). InnerVolumeSpecName "kube-api-access-t85kl". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 01:14:51 crc kubenswrapper[4744]: I0311 01:14:51.589896 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1a8fbf5b-fd33-4cff-a0bd-a40562fe3967-kube-api-access-pchgr" (OuterVolumeSpecName: "kube-api-access-pchgr") pod "1a8fbf5b-fd33-4cff-a0bd-a40562fe3967" (UID: "1a8fbf5b-fd33-4cff-a0bd-a40562fe3967"). InnerVolumeSpecName "kube-api-access-pchgr". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 01:14:51 crc kubenswrapper[4744]: I0311 01:14:51.686401 4744 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4a64957b-bca3-45be-bc38-637feaf98714-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 11 01:14:51 crc kubenswrapper[4744]: I0311 01:14:51.686432 4744 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1a8fbf5b-fd33-4cff-a0bd-a40562fe3967-config\") on node \"crc\" DevicePath \"\"" Mar 11 01:14:51 crc kubenswrapper[4744]: I0311 01:14:51.686442 4744 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-t85kl\" (UniqueName: \"kubernetes.io/projected/4a64957b-bca3-45be-bc38-637feaf98714-kube-api-access-t85kl\") on node \"crc\" DevicePath \"\"" Mar 11 01:14:51 crc kubenswrapper[4744]: I0311 01:14:51.686456 4744 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pchgr\" (UniqueName: \"kubernetes.io/projected/1a8fbf5b-fd33-4cff-a0bd-a40562fe3967-kube-api-access-pchgr\") on node \"crc\" DevicePath \"\"" Mar 11 01:14:51 crc kubenswrapper[4744]: I0311 01:14:51.686467 4744 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4a64957b-bca3-45be-bc38-637feaf98714-config\") on node \"crc\" DevicePath \"\"" Mar 11 01:14:51 crc kubenswrapper[4744]: I0311 01:14:51.845121 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-659ddb758c-fzrlb"] Mar 11 01:14:51 crc kubenswrapper[4744]: I0311 01:14:51.892987 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-metrics-nrcjs"] Mar 11 01:14:51 crc kubenswrapper[4744]: I0311 01:14:51.894591 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-b47ddbdf5-vpw7m"] Mar 11 01:14:51 crc kubenswrapper[4744]: W0311 01:14:51.898993 4744 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podebd1c76c_75f8_411f_9350_a0e31f1721cd.slice/crio-fe98758294a82d15794b05005d068b1924d78b40d0c6f694c099b78f635c4589 WatchSource:0}: Error finding container fe98758294a82d15794b05005d068b1924d78b40d0c6f694c099b78f635c4589: Status 404 returned error can't find the container with id fe98758294a82d15794b05005d068b1924d78b40d0c6f694c099b78f635c4589 Mar 11 01:14:51 crc kubenswrapper[4744]: W0311 01:14:51.900860 4744 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc9410349_9a4f_42f6_81e4_dd4dd66abdcc.slice/crio-1fc6e944d84dc837a2f5776dc7a3074d088d374ca33393f450196c47b4775d31 WatchSource:0}: Error finding container 1fc6e944d84dc837a2f5776dc7a3074d088d374ca33393f450196c47b4775d31: Status 404 returned error can't find the container with id 1fc6e944d84dc837a2f5776dc7a3074d088d374ca33393f450196c47b4775d31 Mar 11 01:14:51 crc kubenswrapper[4744]: I0311 01:14:51.987549 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"1ca73c89-992f-4b36-9a70-5d67bace9cd2","Type":"ContainerStarted","Data":"223e84951f31263619072ddbe2e0f719244b2be18159a6ed4fd2baa8d2a4de50"} Mar 11 01:14:51 crc kubenswrapper[4744]: I0311 01:14:51.989211 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-78cb4465c9-65zms" event={"ID":"4a64957b-bca3-45be-bc38-637feaf98714","Type":"ContainerDied","Data":"c8dc255555836df0c482f96e0bd6b4d9ec8b641354a236257899bbbccd957d4c"} Mar 11 01:14:51 crc kubenswrapper[4744]: I0311 01:14:51.989228 4744 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-78cb4465c9-65zms" Mar 11 01:14:51 crc kubenswrapper[4744]: I0311 01:14:51.992233 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7c47bcb9f9-pm2vk" event={"ID":"1a8fbf5b-fd33-4cff-a0bd-a40562fe3967","Type":"ContainerDied","Data":"da4b42b3c5d0bfc6e9d1cc86c46ae5346d3a3458edbe62941e4d55e8b601755c"} Mar 11 01:14:51 crc kubenswrapper[4744]: I0311 01:14:51.992243 4744 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7c47bcb9f9-pm2vk" Mar 11 01:14:51 crc kubenswrapper[4744]: I0311 01:14:51.994654 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-b47ddbdf5-vpw7m" event={"ID":"c9410349-9a4f-42f6-81e4-dd4dd66abdcc","Type":"ContainerStarted","Data":"1fc6e944d84dc837a2f5776dc7a3074d088d374ca33393f450196c47b4775d31"} Mar 11 01:14:51 crc kubenswrapper[4744]: I0311 01:14:51.996537 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"382c7504-68d5-4132-adc7-fc2c804e5d3e","Type":"ContainerStarted","Data":"58fdd5d28d27f99ee68a950bea9a0ce4833daad80356b7d09eee6123d0427f28"} Mar 11 01:14:51 crc kubenswrapper[4744]: I0311 01:14:51.998851 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"6b383e05-0440-49ee-8add-708ea04e9ce7","Type":"ContainerStarted","Data":"364c3fa121dddb8ffc883e3753ab0a397a68bab1a3f4f81bef538a6bf6da07b9"} Mar 11 01:14:52 crc kubenswrapper[4744]: I0311 01:14:52.002160 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"97c4f09f-eb97-40a0-b06c-80a5a922c986","Type":"ContainerStarted","Data":"7fd05cd78a9ef3fd32c5308b2e3464c3c04f1b11a481396b59397efe66dc1c90"} Mar 11 01:14:52 crc kubenswrapper[4744]: I0311 01:14:52.005981 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-659ddb758c-fzrlb" 
event={"ID":"102a3691-340d-4bed-b87f-7bebbdb1f819","Type":"ContainerStarted","Data":"827ab222e296817a2f655d1a990685bbc0af31ce8098f996b8aeca7087e9f29a"} Mar 11 01:14:52 crc kubenswrapper[4744]: I0311 01:14:52.006904 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-metrics-nrcjs" event={"ID":"ebd1c76c-75f8-411f-9350-a0e31f1721cd","Type":"ContainerStarted","Data":"fe98758294a82d15794b05005d068b1924d78b40d0c6f694c099b78f635c4589"} Mar 11 01:14:52 crc kubenswrapper[4744]: I0311 01:14:52.016663 4744 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-sb-0" podStartSLOduration=12.388466922 podStartE2EDuration="24.016638045s" podCreationTimestamp="2026-03-11 01:14:28 +0000 UTC" firstStartedPulling="2026-03-11 01:14:39.800744594 +0000 UTC m=+1236.604962199" lastFinishedPulling="2026-03-11 01:14:51.428915707 +0000 UTC m=+1248.233133322" observedRunningTime="2026-03-11 01:14:52.009973909 +0000 UTC m=+1248.814191514" watchObservedRunningTime="2026-03-11 01:14:52.016638045 +0000 UTC m=+1248.820855650" Mar 11 01:14:52 crc kubenswrapper[4744]: I0311 01:14:52.041684 4744 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstack-galera-0" podStartSLOduration=28.044702009 podStartE2EDuration="35.04165323s" podCreationTimestamp="2026-03-11 01:14:17 +0000 UTC" firstStartedPulling="2026-03-11 01:14:39.859486654 +0000 UTC m=+1236.663704260" lastFinishedPulling="2026-03-11 01:14:46.856437876 +0000 UTC m=+1243.660655481" observedRunningTime="2026-03-11 01:14:52.040741162 +0000 UTC m=+1248.844958767" watchObservedRunningTime="2026-03-11 01:14:52.04165323 +0000 UTC m=+1248.845870835" Mar 11 01:14:52 crc kubenswrapper[4744]: I0311 01:14:52.061278 4744 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-nb-0" podStartSLOduration=15.773459094 podStartE2EDuration="27.061264708s" podCreationTimestamp="2026-03-11 01:14:25 +0000 UTC" 
firstStartedPulling="2026-03-11 01:14:40.11194575 +0000 UTC m=+1236.916163355" lastFinishedPulling="2026-03-11 01:14:51.399751364 +0000 UTC m=+1248.203968969" observedRunningTime="2026-03-11 01:14:52.057135411 +0000 UTC m=+1248.861353016" watchObservedRunningTime="2026-03-11 01:14:52.061264708 +0000 UTC m=+1248.865482313" Mar 11 01:14:52 crc kubenswrapper[4744]: I0311 01:14:52.085644 4744 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstack-cell1-galera-0" podStartSLOduration=25.993788510999998 podStartE2EDuration="33.085626494s" podCreationTimestamp="2026-03-11 01:14:19 +0000 UTC" firstStartedPulling="2026-03-11 01:14:39.717502193 +0000 UTC m=+1236.521719798" lastFinishedPulling="2026-03-11 01:14:46.809340166 +0000 UTC m=+1243.613557781" observedRunningTime="2026-03-11 01:14:52.082494667 +0000 UTC m=+1248.886712272" watchObservedRunningTime="2026-03-11 01:14:52.085626494 +0000 UTC m=+1248.889844099" Mar 11 01:14:52 crc kubenswrapper[4744]: I0311 01:14:52.119100 4744 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-78cb4465c9-65zms"] Mar 11 01:14:52 crc kubenswrapper[4744]: I0311 01:14:52.127893 4744 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-78cb4465c9-65zms"] Mar 11 01:14:52 crc kubenswrapper[4744]: I0311 01:14:52.179456 4744 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7c47bcb9f9-pm2vk"] Mar 11 01:14:52 crc kubenswrapper[4744]: I0311 01:14:52.181553 4744 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-7c47bcb9f9-pm2vk"] Mar 11 01:14:52 crc kubenswrapper[4744]: I0311 01:14:52.946896 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/kube-state-metrics-0" Mar 11 01:14:53 crc kubenswrapper[4744]: I0311 01:14:53.017240 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-metrics-nrcjs" 
event={"ID":"ebd1c76c-75f8-411f-9350-a0e31f1721cd","Type":"ContainerStarted","Data":"a76d8dbc969a328faa9315afa2eb3f3d73314211ba9392a01e5df03bb7391b1e"} Mar 11 01:14:53 crc kubenswrapper[4744]: I0311 01:14:53.018993 4744 generic.go:334] "Generic (PLEG): container finished" podID="c9410349-9a4f-42f6-81e4-dd4dd66abdcc" containerID="3af06364499ebb35c2bd4fdb456a24fb081e41985deb636fdcb20b80b655adea" exitCode=0 Mar 11 01:14:53 crc kubenswrapper[4744]: I0311 01:14:53.019082 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-b47ddbdf5-vpw7m" event={"ID":"c9410349-9a4f-42f6-81e4-dd4dd66abdcc","Type":"ContainerDied","Data":"3af06364499ebb35c2bd4fdb456a24fb081e41985deb636fdcb20b80b655adea"} Mar 11 01:14:53 crc kubenswrapper[4744]: I0311 01:14:53.020555 4744 generic.go:334] "Generic (PLEG): container finished" podID="102a3691-340d-4bed-b87f-7bebbdb1f819" containerID="805c5ee669d3a91fb094d5e3d7be68d7ea7159bdedf46d5db894bc3725e4ac2c" exitCode=0 Mar 11 01:14:53 crc kubenswrapper[4744]: I0311 01:14:53.020845 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-659ddb758c-fzrlb" event={"ID":"102a3691-340d-4bed-b87f-7bebbdb1f819","Type":"ContainerDied","Data":"805c5ee669d3a91fb094d5e3d7be68d7ea7159bdedf46d5db894bc3725e4ac2c"} Mar 11 01:14:53 crc kubenswrapper[4744]: I0311 01:14:53.068713 4744 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-metrics-nrcjs" podStartSLOduration=4.068694785 podStartE2EDuration="4.068694785s" podCreationTimestamp="2026-03-11 01:14:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-11 01:14:53.035079403 +0000 UTC m=+1249.839297018" watchObservedRunningTime="2026-03-11 01:14:53.068694785 +0000 UTC m=+1249.872912390" Mar 11 01:14:53 crc kubenswrapper[4744]: I0311 01:14:53.871805 4744 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" 
pod="openstack/ovsdbserver-nb-0" Mar 11 01:14:53 crc kubenswrapper[4744]: I0311 01:14:53.922953 4744 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-nb-0" Mar 11 01:14:53 crc kubenswrapper[4744]: I0311 01:14:53.986495 4744 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1a8fbf5b-fd33-4cff-a0bd-a40562fe3967" path="/var/lib/kubelet/pods/1a8fbf5b-fd33-4cff-a0bd-a40562fe3967/volumes" Mar 11 01:14:53 crc kubenswrapper[4744]: I0311 01:14:53.986880 4744 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4a64957b-bca3-45be-bc38-637feaf98714" path="/var/lib/kubelet/pods/4a64957b-bca3-45be-bc38-637feaf98714/volumes" Mar 11 01:14:54 crc kubenswrapper[4744]: I0311 01:14:54.028275 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-659ddb758c-fzrlb" event={"ID":"102a3691-340d-4bed-b87f-7bebbdb1f819","Type":"ContainerStarted","Data":"4a09c8e658dc3c74f1c97e89a6efb5ad2c66f6f6b9ad50d395537e599d8042db"} Mar 11 01:14:54 crc kubenswrapper[4744]: I0311 01:14:54.028683 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-659ddb758c-fzrlb" Mar 11 01:14:54 crc kubenswrapper[4744]: I0311 01:14:54.031706 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-b47ddbdf5-vpw7m" event={"ID":"c9410349-9a4f-42f6-81e4-dd4dd66abdcc","Type":"ContainerStarted","Data":"2b406a55e25e1a06bba16de39922948d60d13a247841bfd1c50d5a47c7f6510a"} Mar 11 01:14:54 crc kubenswrapper[4744]: I0311 01:14:54.031758 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-nb-0" Mar 11 01:14:54 crc kubenswrapper[4744]: I0311 01:14:54.031774 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-b47ddbdf5-vpw7m" Mar 11 01:14:54 crc kubenswrapper[4744]: I0311 01:14:54.057586 4744 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack/dnsmasq-dns-659ddb758c-fzrlb" podStartSLOduration=3.615396429 podStartE2EDuration="4.057493234s" podCreationTimestamp="2026-03-11 01:14:50 +0000 UTC" firstStartedPulling="2026-03-11 01:14:51.862188557 +0000 UTC m=+1248.666406162" lastFinishedPulling="2026-03-11 01:14:52.304285352 +0000 UTC m=+1249.108502967" observedRunningTime="2026-03-11 01:14:54.054535003 +0000 UTC m=+1250.858752618" watchObservedRunningTime="2026-03-11 01:14:54.057493234 +0000 UTC m=+1250.861710839" Mar 11 01:14:54 crc kubenswrapper[4744]: I0311 01:14:54.075161 4744 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-b47ddbdf5-vpw7m" podStartSLOduration=3.627326801 podStartE2EDuration="4.075144762s" podCreationTimestamp="2026-03-11 01:14:50 +0000 UTC" firstStartedPulling="2026-03-11 01:14:51.903572991 +0000 UTC m=+1248.707790596" lastFinishedPulling="2026-03-11 01:14:52.351390952 +0000 UTC m=+1249.155608557" observedRunningTime="2026-03-11 01:14:54.06832608 +0000 UTC m=+1250.872543695" watchObservedRunningTime="2026-03-11 01:14:54.075144762 +0000 UTC m=+1250.879362367" Mar 11 01:14:54 crc kubenswrapper[4744]: I0311 01:14:54.077649 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-nb-0" Mar 11 01:14:54 crc kubenswrapper[4744]: I0311 01:14:54.332524 4744 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-sb-0" Mar 11 01:14:54 crc kubenswrapper[4744]: I0311 01:14:54.381869 4744 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-sb-0" Mar 11 01:14:54 crc kubenswrapper[4744]: E0311 01:14:54.727650 4744 upgradeaware.go:427] Error proxying data from client to backend: readfrom tcp 38.102.83.58:56618->38.102.83.58:46419: write tcp 38.102.83.58:56618->38.102.83.58:46419: write: broken pipe Mar 11 01:14:55 crc kubenswrapper[4744]: I0311 01:14:55.036837 4744 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="" pod="openstack/ovsdbserver-sb-0" Mar 11 01:14:55 crc kubenswrapper[4744]: I0311 01:14:55.077459 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-sb-0" Mar 11 01:14:55 crc kubenswrapper[4744]: I0311 01:14:55.251120 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-northd-0"] Mar 11 01:14:55 crc kubenswrapper[4744]: I0311 01:14:55.252310 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-northd-0" Mar 11 01:14:55 crc kubenswrapper[4744]: I0311 01:14:55.254072 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovnnorthd-config" Mar 11 01:14:55 crc kubenswrapper[4744]: I0311 01:14:55.254132 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovnnorthd-ovndbs" Mar 11 01:14:55 crc kubenswrapper[4744]: I0311 01:14:55.254144 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovnnorthd-scripts" Mar 11 01:14:55 crc kubenswrapper[4744]: I0311 01:14:55.260369 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovnnorthd-ovnnorthd-dockercfg-r6kgc" Mar 11 01:14:55 crc kubenswrapper[4744]: I0311 01:14:55.265632 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-northd-0"] Mar 11 01:14:55 crc kubenswrapper[4744]: I0311 01:14:55.341576 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/4f48c5b8-9cda-4c2c-9244-9bb71b9dc05f-scripts\") pod \"ovn-northd-0\" (UID: \"4f48c5b8-9cda-4c2c-9244-9bb71b9dc05f\") " pod="openstack/ovn-northd-0" Mar 11 01:14:55 crc kubenswrapper[4744]: I0311 01:14:55.341882 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-northd-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/4f48c5b8-9cda-4c2c-9244-9bb71b9dc05f-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"4f48c5b8-9cda-4c2c-9244-9bb71b9dc05f\") " pod="openstack/ovn-northd-0" Mar 11 01:14:55 crc kubenswrapper[4744]: I0311 01:14:55.341909 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/4f48c5b8-9cda-4c2c-9244-9bb71b9dc05f-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"4f48c5b8-9cda-4c2c-9244-9bb71b9dc05f\") " pod="openstack/ovn-northd-0" Mar 11 01:14:55 crc kubenswrapper[4744]: I0311 01:14:55.341931 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zbmgd\" (UniqueName: \"kubernetes.io/projected/4f48c5b8-9cda-4c2c-9244-9bb71b9dc05f-kube-api-access-zbmgd\") pod \"ovn-northd-0\" (UID: \"4f48c5b8-9cda-4c2c-9244-9bb71b9dc05f\") " pod="openstack/ovn-northd-0" Mar 11 01:14:55 crc kubenswrapper[4744]: I0311 01:14:55.341958 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4f48c5b8-9cda-4c2c-9244-9bb71b9dc05f-config\") pod \"ovn-northd-0\" (UID: \"4f48c5b8-9cda-4c2c-9244-9bb71b9dc05f\") " pod="openstack/ovn-northd-0" Mar 11 01:14:55 crc kubenswrapper[4744]: I0311 01:14:55.341994 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4f48c5b8-9cda-4c2c-9244-9bb71b9dc05f-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"4f48c5b8-9cda-4c2c-9244-9bb71b9dc05f\") " pod="openstack/ovn-northd-0" Mar 11 01:14:55 crc kubenswrapper[4744]: I0311 01:14:55.342013 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/4f48c5b8-9cda-4c2c-9244-9bb71b9dc05f-metrics-certs-tls-certs\") pod \"ovn-northd-0\" 
(UID: \"4f48c5b8-9cda-4c2c-9244-9bb71b9dc05f\") " pod="openstack/ovn-northd-0" Mar 11 01:14:55 crc kubenswrapper[4744]: I0311 01:14:55.443836 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/4f48c5b8-9cda-4c2c-9244-9bb71b9dc05f-scripts\") pod \"ovn-northd-0\" (UID: \"4f48c5b8-9cda-4c2c-9244-9bb71b9dc05f\") " pod="openstack/ovn-northd-0" Mar 11 01:14:55 crc kubenswrapper[4744]: I0311 01:14:55.443958 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/4f48c5b8-9cda-4c2c-9244-9bb71b9dc05f-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"4f48c5b8-9cda-4c2c-9244-9bb71b9dc05f\") " pod="openstack/ovn-northd-0" Mar 11 01:14:55 crc kubenswrapper[4744]: I0311 01:14:55.444000 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/4f48c5b8-9cda-4c2c-9244-9bb71b9dc05f-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"4f48c5b8-9cda-4c2c-9244-9bb71b9dc05f\") " pod="openstack/ovn-northd-0" Mar 11 01:14:55 crc kubenswrapper[4744]: I0311 01:14:55.444029 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zbmgd\" (UniqueName: \"kubernetes.io/projected/4f48c5b8-9cda-4c2c-9244-9bb71b9dc05f-kube-api-access-zbmgd\") pod \"ovn-northd-0\" (UID: \"4f48c5b8-9cda-4c2c-9244-9bb71b9dc05f\") " pod="openstack/ovn-northd-0" Mar 11 01:14:55 crc kubenswrapper[4744]: I0311 01:14:55.444068 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4f48c5b8-9cda-4c2c-9244-9bb71b9dc05f-config\") pod \"ovn-northd-0\" (UID: \"4f48c5b8-9cda-4c2c-9244-9bb71b9dc05f\") " pod="openstack/ovn-northd-0" Mar 11 01:14:55 crc kubenswrapper[4744]: I0311 01:14:55.444145 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4f48c5b8-9cda-4c2c-9244-9bb71b9dc05f-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"4f48c5b8-9cda-4c2c-9244-9bb71b9dc05f\") " pod="openstack/ovn-northd-0" Mar 11 01:14:55 crc kubenswrapper[4744]: I0311 01:14:55.444187 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/4f48c5b8-9cda-4c2c-9244-9bb71b9dc05f-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"4f48c5b8-9cda-4c2c-9244-9bb71b9dc05f\") " pod="openstack/ovn-northd-0" Mar 11 01:14:55 crc kubenswrapper[4744]: I0311 01:14:55.444607 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/4f48c5b8-9cda-4c2c-9244-9bb71b9dc05f-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"4f48c5b8-9cda-4c2c-9244-9bb71b9dc05f\") " pod="openstack/ovn-northd-0" Mar 11 01:14:55 crc kubenswrapper[4744]: I0311 01:14:55.444735 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/4f48c5b8-9cda-4c2c-9244-9bb71b9dc05f-scripts\") pod \"ovn-northd-0\" (UID: \"4f48c5b8-9cda-4c2c-9244-9bb71b9dc05f\") " pod="openstack/ovn-northd-0" Mar 11 01:14:55 crc kubenswrapper[4744]: I0311 01:14:55.444830 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4f48c5b8-9cda-4c2c-9244-9bb71b9dc05f-config\") pod \"ovn-northd-0\" (UID: \"4f48c5b8-9cda-4c2c-9244-9bb71b9dc05f\") " pod="openstack/ovn-northd-0" Mar 11 01:14:55 crc kubenswrapper[4744]: I0311 01:14:55.449173 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4f48c5b8-9cda-4c2c-9244-9bb71b9dc05f-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"4f48c5b8-9cda-4c2c-9244-9bb71b9dc05f\") " pod="openstack/ovn-northd-0" Mar 11 01:14:55 crc kubenswrapper[4744]: I0311 
01:14:55.449774 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/4f48c5b8-9cda-4c2c-9244-9bb71b9dc05f-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"4f48c5b8-9cda-4c2c-9244-9bb71b9dc05f\") " pod="openstack/ovn-northd-0" Mar 11 01:14:55 crc kubenswrapper[4744]: I0311 01:14:55.450683 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/4f48c5b8-9cda-4c2c-9244-9bb71b9dc05f-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"4f48c5b8-9cda-4c2c-9244-9bb71b9dc05f\") " pod="openstack/ovn-northd-0" Mar 11 01:14:55 crc kubenswrapper[4744]: I0311 01:14:55.459617 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zbmgd\" (UniqueName: \"kubernetes.io/projected/4f48c5b8-9cda-4c2c-9244-9bb71b9dc05f-kube-api-access-zbmgd\") pod \"ovn-northd-0\" (UID: \"4f48c5b8-9cda-4c2c-9244-9bb71b9dc05f\") " pod="openstack/ovn-northd-0" Mar 11 01:14:55 crc kubenswrapper[4744]: I0311 01:14:55.600623 4744 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-northd-0" Mar 11 01:14:55 crc kubenswrapper[4744]: I0311 01:14:55.899539 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/memcached-0" Mar 11 01:14:56 crc kubenswrapper[4744]: W0311 01:14:56.044985 4744 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4f48c5b8_9cda_4c2c_9244_9bb71b9dc05f.slice/crio-bbcdff835555aeb6b4dbb296e3fd0876b11930670cbee761696cab7a332be634 WatchSource:0}: Error finding container bbcdff835555aeb6b4dbb296e3fd0876b11930670cbee761696cab7a332be634: Status 404 returned error can't find the container with id bbcdff835555aeb6b4dbb296e3fd0876b11930670cbee761696cab7a332be634 Mar 11 01:14:56 crc kubenswrapper[4744]: I0311 01:14:56.045971 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-northd-0"] Mar 11 01:14:57 crc kubenswrapper[4744]: I0311 01:14:57.052248 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"4f48c5b8-9cda-4c2c-9244-9bb71b9dc05f","Type":"ContainerStarted","Data":"bbcdff835555aeb6b4dbb296e3fd0876b11930670cbee761696cab7a332be634"} Mar 11 01:14:58 crc kubenswrapper[4744]: I0311 01:14:58.066504 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"4f48c5b8-9cda-4c2c-9244-9bb71b9dc05f","Type":"ContainerStarted","Data":"e79f185b18de9324d312d6637694c9a5d88669a575ff2e95d23425a83f97b67b"} Mar 11 01:14:58 crc kubenswrapper[4744]: I0311 01:14:58.067144 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"4f48c5b8-9cda-4c2c-9244-9bb71b9dc05f","Type":"ContainerStarted","Data":"4313c32f70e334310d9c7d0fb323d92eab706c09609f6e86a47d81e663224202"} Mar 11 01:14:58 crc kubenswrapper[4744]: I0311 01:14:58.067175 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-northd-0" Mar 11 01:14:58 crc 
kubenswrapper[4744]: I0311 01:14:58.094705 4744 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-northd-0" podStartSLOduration=1.815469303 podStartE2EDuration="3.094688874s" podCreationTimestamp="2026-03-11 01:14:55 +0000 UTC" firstStartedPulling="2026-03-11 01:14:56.046829008 +0000 UTC m=+1252.851046623" lastFinishedPulling="2026-03-11 01:14:57.326048599 +0000 UTC m=+1254.130266194" observedRunningTime="2026-03-11 01:14:58.089575846 +0000 UTC m=+1254.893793451" watchObservedRunningTime="2026-03-11 01:14:58.094688874 +0000 UTC m=+1254.898906479" Mar 11 01:14:59 crc kubenswrapper[4744]: I0311 01:14:59.266140 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/openstack-galera-0" Mar 11 01:14:59 crc kubenswrapper[4744]: I0311 01:14:59.266187 4744 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/openstack-galera-0" Mar 11 01:14:59 crc kubenswrapper[4744]: I0311 01:14:59.379182 4744 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/openstack-galera-0" Mar 11 01:15:00 crc kubenswrapper[4744]: I0311 01:15:00.149392 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29553195-rs6nk"] Mar 11 01:15:00 crc kubenswrapper[4744]: I0311 01:15:00.151101 4744 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29553195-rs6nk" Mar 11 01:15:00 crc kubenswrapper[4744]: I0311 01:15:00.154962 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Mar 11 01:15:00 crc kubenswrapper[4744]: I0311 01:15:00.155271 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Mar 11 01:15:00 crc kubenswrapper[4744]: I0311 01:15:00.176709 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29553195-rs6nk"] Mar 11 01:15:00 crc kubenswrapper[4744]: I0311 01:15:00.234618 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/openstack-galera-0" Mar 11 01:15:00 crc kubenswrapper[4744]: I0311 01:15:00.235298 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/ab006bc8-78da-42a8-9322-f52588f20622-secret-volume\") pod \"collect-profiles-29553195-rs6nk\" (UID: \"ab006bc8-78da-42a8-9322-f52588f20622\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29553195-rs6nk" Mar 11 01:15:00 crc kubenswrapper[4744]: I0311 01:15:00.235357 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/ab006bc8-78da-42a8-9322-f52588f20622-config-volume\") pod \"collect-profiles-29553195-rs6nk\" (UID: \"ab006bc8-78da-42a8-9322-f52588f20622\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29553195-rs6nk" Mar 11 01:15:00 crc kubenswrapper[4744]: I0311 01:15:00.235443 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wfbth\" (UniqueName: 
\"kubernetes.io/projected/ab006bc8-78da-42a8-9322-f52588f20622-kube-api-access-wfbth\") pod \"collect-profiles-29553195-rs6nk\" (UID: \"ab006bc8-78da-42a8-9322-f52588f20622\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29553195-rs6nk" Mar 11 01:15:00 crc kubenswrapper[4744]: I0311 01:15:00.336962 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/ab006bc8-78da-42a8-9322-f52588f20622-secret-volume\") pod \"collect-profiles-29553195-rs6nk\" (UID: \"ab006bc8-78da-42a8-9322-f52588f20622\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29553195-rs6nk" Mar 11 01:15:00 crc kubenswrapper[4744]: I0311 01:15:00.337025 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/ab006bc8-78da-42a8-9322-f52588f20622-config-volume\") pod \"collect-profiles-29553195-rs6nk\" (UID: \"ab006bc8-78da-42a8-9322-f52588f20622\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29553195-rs6nk" Mar 11 01:15:00 crc kubenswrapper[4744]: I0311 01:15:00.337100 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wfbth\" (UniqueName: \"kubernetes.io/projected/ab006bc8-78da-42a8-9322-f52588f20622-kube-api-access-wfbth\") pod \"collect-profiles-29553195-rs6nk\" (UID: \"ab006bc8-78da-42a8-9322-f52588f20622\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29553195-rs6nk" Mar 11 01:15:00 crc kubenswrapper[4744]: I0311 01:15:00.338161 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/ab006bc8-78da-42a8-9322-f52588f20622-config-volume\") pod \"collect-profiles-29553195-rs6nk\" (UID: \"ab006bc8-78da-42a8-9322-f52588f20622\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29553195-rs6nk" Mar 11 01:15:00 crc kubenswrapper[4744]: I0311 
01:15:00.358379 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/ab006bc8-78da-42a8-9322-f52588f20622-secret-volume\") pod \"collect-profiles-29553195-rs6nk\" (UID: \"ab006bc8-78da-42a8-9322-f52588f20622\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29553195-rs6nk" Mar 11 01:15:00 crc kubenswrapper[4744]: I0311 01:15:00.374717 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wfbth\" (UniqueName: \"kubernetes.io/projected/ab006bc8-78da-42a8-9322-f52588f20622-kube-api-access-wfbth\") pod \"collect-profiles-29553195-rs6nk\" (UID: \"ab006bc8-78da-42a8-9322-f52588f20622\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29553195-rs6nk" Mar 11 01:15:00 crc kubenswrapper[4744]: I0311 01:15:00.477172 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29553195-rs6nk" Mar 11 01:15:00 crc kubenswrapper[4744]: I0311 01:15:00.589311 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-b47ddbdf5-vpw7m" Mar 11 01:15:00 crc kubenswrapper[4744]: I0311 01:15:00.721175 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/openstack-cell1-galera-0" Mar 11 01:15:00 crc kubenswrapper[4744]: I0311 01:15:00.721738 4744 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/openstack-cell1-galera-0" Mar 11 01:15:00 crc kubenswrapper[4744]: I0311 01:15:00.824196 4744 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/openstack-cell1-galera-0" Mar 11 01:15:00 crc kubenswrapper[4744]: I0311 01:15:00.856665 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-659ddb758c-fzrlb" Mar 11 01:15:00 crc kubenswrapper[4744]: I0311 01:15:00.944166 4744 kubelet.go:2437] "SyncLoop 
DELETE" source="api" pods=["openstack/dnsmasq-dns-b47ddbdf5-vpw7m"] Mar 11 01:15:01 crc kubenswrapper[4744]: I0311 01:15:00.995397 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29553195-rs6nk"] Mar 11 01:15:01 crc kubenswrapper[4744]: W0311 01:15:01.000383 4744 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podab006bc8_78da_42a8_9322_f52588f20622.slice/crio-0c0bee3c2e7e3b492e764aeb4947baa01332bf4396c8f040929930c215cf3b97 WatchSource:0}: Error finding container 0c0bee3c2e7e3b492e764aeb4947baa01332bf4396c8f040929930c215cf3b97: Status 404 returned error can't find the container with id 0c0bee3c2e7e3b492e764aeb4947baa01332bf4396c8f040929930c215cf3b97 Mar 11 01:15:01 crc kubenswrapper[4744]: I0311 01:15:01.099323 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29553195-rs6nk" event={"ID":"ab006bc8-78da-42a8-9322-f52588f20622","Type":"ContainerStarted","Data":"0c0bee3c2e7e3b492e764aeb4947baa01332bf4396c8f040929930c215cf3b97"} Mar 11 01:15:01 crc kubenswrapper[4744]: I0311 01:15:01.099788 4744 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-b47ddbdf5-vpw7m" podUID="c9410349-9a4f-42f6-81e4-dd4dd66abdcc" containerName="dnsmasq-dns" containerID="cri-o://2b406a55e25e1a06bba16de39922948d60d13a247841bfd1c50d5a47c7f6510a" gracePeriod=10 Mar 11 01:15:01 crc kubenswrapper[4744]: I0311 01:15:01.198278 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/openstack-cell1-galera-0" Mar 11 01:15:02 crc kubenswrapper[4744]: I0311 01:15:02.063369 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-db-create-4wphb"] Mar 11 01:15:02 crc kubenswrapper[4744]: I0311 01:15:02.064891 4744 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-create-4wphb" Mar 11 01:15:02 crc kubenswrapper[4744]: I0311 01:15:02.097597 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fq8cg\" (UniqueName: \"kubernetes.io/projected/3d70370e-fce0-48d8-9856-06e04916e905-kube-api-access-fq8cg\") pod \"keystone-db-create-4wphb\" (UID: \"3d70370e-fce0-48d8-9856-06e04916e905\") " pod="openstack/keystone-db-create-4wphb" Mar 11 01:15:02 crc kubenswrapper[4744]: I0311 01:15:02.097731 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3d70370e-fce0-48d8-9856-06e04916e905-operator-scripts\") pod \"keystone-db-create-4wphb\" (UID: \"3d70370e-fce0-48d8-9856-06e04916e905\") " pod="openstack/keystone-db-create-4wphb" Mar 11 01:15:02 crc kubenswrapper[4744]: I0311 01:15:02.098151 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-create-4wphb"] Mar 11 01:15:02 crc kubenswrapper[4744]: I0311 01:15:02.108419 4744 generic.go:334] "Generic (PLEG): container finished" podID="c9410349-9a4f-42f6-81e4-dd4dd66abdcc" containerID="2b406a55e25e1a06bba16de39922948d60d13a247841bfd1c50d5a47c7f6510a" exitCode=0 Mar 11 01:15:02 crc kubenswrapper[4744]: I0311 01:15:02.108570 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-b47ddbdf5-vpw7m" event={"ID":"c9410349-9a4f-42f6-81e4-dd4dd66abdcc","Type":"ContainerDied","Data":"2b406a55e25e1a06bba16de39922948d60d13a247841bfd1c50d5a47c7f6510a"} Mar 11 01:15:02 crc kubenswrapper[4744]: I0311 01:15:02.111169 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-ab0f-account-create-update-fj8mx"] Mar 11 01:15:02 crc kubenswrapper[4744]: I0311 01:15:02.112332 4744 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-ab0f-account-create-update-fj8mx" Mar 11 01:15:02 crc kubenswrapper[4744]: I0311 01:15:02.114410 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-db-secret" Mar 11 01:15:02 crc kubenswrapper[4744]: I0311 01:15:02.128259 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-ab0f-account-create-update-fj8mx"] Mar 11 01:15:02 crc kubenswrapper[4744]: I0311 01:15:02.199498 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/da71c584-ebbd-42c0-96c9-716bbd47efce-operator-scripts\") pod \"keystone-ab0f-account-create-update-fj8mx\" (UID: \"da71c584-ebbd-42c0-96c9-716bbd47efce\") " pod="openstack/keystone-ab0f-account-create-update-fj8mx" Mar 11 01:15:02 crc kubenswrapper[4744]: I0311 01:15:02.199596 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fq8cg\" (UniqueName: \"kubernetes.io/projected/3d70370e-fce0-48d8-9856-06e04916e905-kube-api-access-fq8cg\") pod \"keystone-db-create-4wphb\" (UID: \"3d70370e-fce0-48d8-9856-06e04916e905\") " pod="openstack/keystone-db-create-4wphb" Mar 11 01:15:02 crc kubenswrapper[4744]: I0311 01:15:02.199643 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3d70370e-fce0-48d8-9856-06e04916e905-operator-scripts\") pod \"keystone-db-create-4wphb\" (UID: \"3d70370e-fce0-48d8-9856-06e04916e905\") " pod="openstack/keystone-db-create-4wphb" Mar 11 01:15:02 crc kubenswrapper[4744]: I0311 01:15:02.199917 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h4hsz\" (UniqueName: \"kubernetes.io/projected/da71c584-ebbd-42c0-96c9-716bbd47efce-kube-api-access-h4hsz\") pod \"keystone-ab0f-account-create-update-fj8mx\" (UID: 
\"da71c584-ebbd-42c0-96c9-716bbd47efce\") " pod="openstack/keystone-ab0f-account-create-update-fj8mx" Mar 11 01:15:02 crc kubenswrapper[4744]: I0311 01:15:02.200789 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3d70370e-fce0-48d8-9856-06e04916e905-operator-scripts\") pod \"keystone-db-create-4wphb\" (UID: \"3d70370e-fce0-48d8-9856-06e04916e905\") " pod="openstack/keystone-db-create-4wphb" Mar 11 01:15:02 crc kubenswrapper[4744]: I0311 01:15:02.233629 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fq8cg\" (UniqueName: \"kubernetes.io/projected/3d70370e-fce0-48d8-9856-06e04916e905-kube-api-access-fq8cg\") pod \"keystone-db-create-4wphb\" (UID: \"3d70370e-fce0-48d8-9856-06e04916e905\") " pod="openstack/keystone-db-create-4wphb" Mar 11 01:15:02 crc kubenswrapper[4744]: I0311 01:15:02.262309 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-db-create-6q4fk"] Mar 11 01:15:02 crc kubenswrapper[4744]: I0311 01:15:02.263897 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-6q4fk" Mar 11 01:15:02 crc kubenswrapper[4744]: I0311 01:15:02.285488 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-bd14-account-create-update-bhb66"] Mar 11 01:15:02 crc kubenswrapper[4744]: I0311 01:15:02.286791 4744 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-bd14-account-create-update-bhb66" Mar 11 01:15:02 crc kubenswrapper[4744]: I0311 01:15:02.289978 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-db-secret" Mar 11 01:15:02 crc kubenswrapper[4744]: I0311 01:15:02.293793 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-create-6q4fk"] Mar 11 01:15:02 crc kubenswrapper[4744]: I0311 01:15:02.300885 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e122b7f6-6664-4484-afc0-c5629ad3a7e3-operator-scripts\") pod \"placement-db-create-6q4fk\" (UID: \"e122b7f6-6664-4484-afc0-c5629ad3a7e3\") " pod="openstack/placement-db-create-6q4fk" Mar 11 01:15:02 crc kubenswrapper[4744]: I0311 01:15:02.300961 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bjxlv\" (UniqueName: \"kubernetes.io/projected/e122b7f6-6664-4484-afc0-c5629ad3a7e3-kube-api-access-bjxlv\") pod \"placement-db-create-6q4fk\" (UID: \"e122b7f6-6664-4484-afc0-c5629ad3a7e3\") " pod="openstack/placement-db-create-6q4fk" Mar 11 01:15:02 crc kubenswrapper[4744]: I0311 01:15:02.300989 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9wl7l\" (UniqueName: \"kubernetes.io/projected/ec0bae40-f9bc-4bc7-81e3-684d3f8a6512-kube-api-access-9wl7l\") pod \"placement-bd14-account-create-update-bhb66\" (UID: \"ec0bae40-f9bc-4bc7-81e3-684d3f8a6512\") " pod="openstack/placement-bd14-account-create-update-bhb66" Mar 11 01:15:02 crc kubenswrapper[4744]: I0311 01:15:02.301030 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h4hsz\" (UniqueName: \"kubernetes.io/projected/da71c584-ebbd-42c0-96c9-716bbd47efce-kube-api-access-h4hsz\") pod 
\"keystone-ab0f-account-create-update-fj8mx\" (UID: \"da71c584-ebbd-42c0-96c9-716bbd47efce\") " pod="openstack/keystone-ab0f-account-create-update-fj8mx" Mar 11 01:15:02 crc kubenswrapper[4744]: I0311 01:15:02.301058 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ec0bae40-f9bc-4bc7-81e3-684d3f8a6512-operator-scripts\") pod \"placement-bd14-account-create-update-bhb66\" (UID: \"ec0bae40-f9bc-4bc7-81e3-684d3f8a6512\") " pod="openstack/placement-bd14-account-create-update-bhb66" Mar 11 01:15:02 crc kubenswrapper[4744]: I0311 01:15:02.301086 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/da71c584-ebbd-42c0-96c9-716bbd47efce-operator-scripts\") pod \"keystone-ab0f-account-create-update-fj8mx\" (UID: \"da71c584-ebbd-42c0-96c9-716bbd47efce\") " pod="openstack/keystone-ab0f-account-create-update-fj8mx" Mar 11 01:15:02 crc kubenswrapper[4744]: I0311 01:15:02.301704 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/da71c584-ebbd-42c0-96c9-716bbd47efce-operator-scripts\") pod \"keystone-ab0f-account-create-update-fj8mx\" (UID: \"da71c584-ebbd-42c0-96c9-716bbd47efce\") " pod="openstack/keystone-ab0f-account-create-update-fj8mx" Mar 11 01:15:02 crc kubenswrapper[4744]: I0311 01:15:02.308708 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-bd14-account-create-update-bhb66"] Mar 11 01:15:02 crc kubenswrapper[4744]: I0311 01:15:02.324149 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h4hsz\" (UniqueName: \"kubernetes.io/projected/da71c584-ebbd-42c0-96c9-716bbd47efce-kube-api-access-h4hsz\") pod \"keystone-ab0f-account-create-update-fj8mx\" (UID: \"da71c584-ebbd-42c0-96c9-716bbd47efce\") " 
pod="openstack/keystone-ab0f-account-create-update-fj8mx" Mar 11 01:15:02 crc kubenswrapper[4744]: I0311 01:15:02.379915 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-4wphb" Mar 11 01:15:02 crc kubenswrapper[4744]: I0311 01:15:02.404552 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ec0bae40-f9bc-4bc7-81e3-684d3f8a6512-operator-scripts\") pod \"placement-bd14-account-create-update-bhb66\" (UID: \"ec0bae40-f9bc-4bc7-81e3-684d3f8a6512\") " pod="openstack/placement-bd14-account-create-update-bhb66" Mar 11 01:15:02 crc kubenswrapper[4744]: I0311 01:15:02.405065 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e122b7f6-6664-4484-afc0-c5629ad3a7e3-operator-scripts\") pod \"placement-db-create-6q4fk\" (UID: \"e122b7f6-6664-4484-afc0-c5629ad3a7e3\") " pod="openstack/placement-db-create-6q4fk" Mar 11 01:15:02 crc kubenswrapper[4744]: I0311 01:15:02.405152 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bjxlv\" (UniqueName: \"kubernetes.io/projected/e122b7f6-6664-4484-afc0-c5629ad3a7e3-kube-api-access-bjxlv\") pod \"placement-db-create-6q4fk\" (UID: \"e122b7f6-6664-4484-afc0-c5629ad3a7e3\") " pod="openstack/placement-db-create-6q4fk" Mar 11 01:15:02 crc kubenswrapper[4744]: I0311 01:15:02.405186 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9wl7l\" (UniqueName: \"kubernetes.io/projected/ec0bae40-f9bc-4bc7-81e3-684d3f8a6512-kube-api-access-9wl7l\") pod \"placement-bd14-account-create-update-bhb66\" (UID: \"ec0bae40-f9bc-4bc7-81e3-684d3f8a6512\") " pod="openstack/placement-bd14-account-create-update-bhb66" Mar 11 01:15:02 crc kubenswrapper[4744]: I0311 01:15:02.405368 4744 operation_generator.go:637] "MountVolume.SetUp succeeded 
for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ec0bae40-f9bc-4bc7-81e3-684d3f8a6512-operator-scripts\") pod \"placement-bd14-account-create-update-bhb66\" (UID: \"ec0bae40-f9bc-4bc7-81e3-684d3f8a6512\") " pod="openstack/placement-bd14-account-create-update-bhb66" Mar 11 01:15:02 crc kubenswrapper[4744]: I0311 01:15:02.406156 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e122b7f6-6664-4484-afc0-c5629ad3a7e3-operator-scripts\") pod \"placement-db-create-6q4fk\" (UID: \"e122b7f6-6664-4484-afc0-c5629ad3a7e3\") " pod="openstack/placement-db-create-6q4fk" Mar 11 01:15:02 crc kubenswrapper[4744]: I0311 01:15:02.436557 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-ab0f-account-create-update-fj8mx" Mar 11 01:15:02 crc kubenswrapper[4744]: I0311 01:15:02.436852 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bjxlv\" (UniqueName: \"kubernetes.io/projected/e122b7f6-6664-4484-afc0-c5629ad3a7e3-kube-api-access-bjxlv\") pod \"placement-db-create-6q4fk\" (UID: \"e122b7f6-6664-4484-afc0-c5629ad3a7e3\") " pod="openstack/placement-db-create-6q4fk" Mar 11 01:15:02 crc kubenswrapper[4744]: I0311 01:15:02.437499 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9wl7l\" (UniqueName: \"kubernetes.io/projected/ec0bae40-f9bc-4bc7-81e3-684d3f8a6512-kube-api-access-9wl7l\") pod \"placement-bd14-account-create-update-bhb66\" (UID: \"ec0bae40-f9bc-4bc7-81e3-684d3f8a6512\") " pod="openstack/placement-bd14-account-create-update-bhb66" Mar 11 01:15:02 crc kubenswrapper[4744]: I0311 01:15:02.615584 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-6q4fk" Mar 11 01:15:02 crc kubenswrapper[4744]: I0311 01:15:02.623033 4744 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-bd14-account-create-update-bhb66" Mar 11 01:15:02 crc kubenswrapper[4744]: I0311 01:15:02.818708 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-create-4wphb"] Mar 11 01:15:02 crc kubenswrapper[4744]: W0311 01:15:02.845920 4744 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3d70370e_fce0_48d8_9856_06e04916e905.slice/crio-3934be1d90a0207905146705af92f15b5c5714ad31c516779ec394c076ce1021 WatchSource:0}: Error finding container 3934be1d90a0207905146705af92f15b5c5714ad31c516779ec394c076ce1021: Status 404 returned error can't find the container with id 3934be1d90a0207905146705af92f15b5c5714ad31c516779ec394c076ce1021 Mar 11 01:15:02 crc kubenswrapper[4744]: I0311 01:15:02.993588 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-58df884995-rf6pm"] Mar 11 01:15:02 crc kubenswrapper[4744]: I0311 01:15:02.995596 4744 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-58df884995-rf6pm" Mar 11 01:15:03 crc kubenswrapper[4744]: I0311 01:15:03.021633 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-58df884995-rf6pm"] Mar 11 01:15:03 crc kubenswrapper[4744]: I0311 01:15:03.042253 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-ab0f-account-create-update-fj8mx"] Mar 11 01:15:03 crc kubenswrapper[4744]: I0311 01:15:03.121011 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/8d727dc8-84d3-45b0-90e8-22a1f3f043e1-dns-svc\") pod \"dnsmasq-dns-58df884995-rf6pm\" (UID: \"8d727dc8-84d3-45b0-90e8-22a1f3f043e1\") " pod="openstack/dnsmasq-dns-58df884995-rf6pm" Mar 11 01:15:03 crc kubenswrapper[4744]: I0311 01:15:03.121159 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/8d727dc8-84d3-45b0-90e8-22a1f3f043e1-ovsdbserver-nb\") pod \"dnsmasq-dns-58df884995-rf6pm\" (UID: \"8d727dc8-84d3-45b0-90e8-22a1f3f043e1\") " pod="openstack/dnsmasq-dns-58df884995-rf6pm" Mar 11 01:15:03 crc kubenswrapper[4744]: I0311 01:15:03.121243 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/8d727dc8-84d3-45b0-90e8-22a1f3f043e1-ovsdbserver-sb\") pod \"dnsmasq-dns-58df884995-rf6pm\" (UID: \"8d727dc8-84d3-45b0-90e8-22a1f3f043e1\") " pod="openstack/dnsmasq-dns-58df884995-rf6pm" Mar 11 01:15:03 crc kubenswrapper[4744]: I0311 01:15:03.121302 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8d727dc8-84d3-45b0-90e8-22a1f3f043e1-config\") pod \"dnsmasq-dns-58df884995-rf6pm\" (UID: \"8d727dc8-84d3-45b0-90e8-22a1f3f043e1\") " 
pod="openstack/dnsmasq-dns-58df884995-rf6pm" Mar 11 01:15:03 crc kubenswrapper[4744]: I0311 01:15:03.121400 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mcgb9\" (UniqueName: \"kubernetes.io/projected/8d727dc8-84d3-45b0-90e8-22a1f3f043e1-kube-api-access-mcgb9\") pod \"dnsmasq-dns-58df884995-rf6pm\" (UID: \"8d727dc8-84d3-45b0-90e8-22a1f3f043e1\") " pod="openstack/dnsmasq-dns-58df884995-rf6pm" Mar 11 01:15:03 crc kubenswrapper[4744]: I0311 01:15:03.142839 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-ab0f-account-create-update-fj8mx" event={"ID":"da71c584-ebbd-42c0-96c9-716bbd47efce","Type":"ContainerStarted","Data":"20eefcd470fd6e84965347ea5a26def1af611fe21be2777e69b3d62cd6c6e064"} Mar 11 01:15:03 crc kubenswrapper[4744]: I0311 01:15:03.149904 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-create-6q4fk"] Mar 11 01:15:03 crc kubenswrapper[4744]: I0311 01:15:03.151423 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-4wphb" event={"ID":"3d70370e-fce0-48d8-9856-06e04916e905","Type":"ContainerStarted","Data":"3934be1d90a0207905146705af92f15b5c5714ad31c516779ec394c076ce1021"} Mar 11 01:15:03 crc kubenswrapper[4744]: I0311 01:15:03.153684 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-bd14-account-create-update-bhb66"] Mar 11 01:15:03 crc kubenswrapper[4744]: I0311 01:15:03.224780 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/8d727dc8-84d3-45b0-90e8-22a1f3f043e1-ovsdbserver-nb\") pod \"dnsmasq-dns-58df884995-rf6pm\" (UID: \"8d727dc8-84d3-45b0-90e8-22a1f3f043e1\") " pod="openstack/dnsmasq-dns-58df884995-rf6pm" Mar 11 01:15:03 crc kubenswrapper[4744]: I0311 01:15:03.225046 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" 
(UniqueName: \"kubernetes.io/configmap/8d727dc8-84d3-45b0-90e8-22a1f3f043e1-ovsdbserver-sb\") pod \"dnsmasq-dns-58df884995-rf6pm\" (UID: \"8d727dc8-84d3-45b0-90e8-22a1f3f043e1\") " pod="openstack/dnsmasq-dns-58df884995-rf6pm" Mar 11 01:15:03 crc kubenswrapper[4744]: I0311 01:15:03.225072 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8d727dc8-84d3-45b0-90e8-22a1f3f043e1-config\") pod \"dnsmasq-dns-58df884995-rf6pm\" (UID: \"8d727dc8-84d3-45b0-90e8-22a1f3f043e1\") " pod="openstack/dnsmasq-dns-58df884995-rf6pm" Mar 11 01:15:03 crc kubenswrapper[4744]: I0311 01:15:03.225121 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mcgb9\" (UniqueName: \"kubernetes.io/projected/8d727dc8-84d3-45b0-90e8-22a1f3f043e1-kube-api-access-mcgb9\") pod \"dnsmasq-dns-58df884995-rf6pm\" (UID: \"8d727dc8-84d3-45b0-90e8-22a1f3f043e1\") " pod="openstack/dnsmasq-dns-58df884995-rf6pm" Mar 11 01:15:03 crc kubenswrapper[4744]: I0311 01:15:03.225156 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/8d727dc8-84d3-45b0-90e8-22a1f3f043e1-dns-svc\") pod \"dnsmasq-dns-58df884995-rf6pm\" (UID: \"8d727dc8-84d3-45b0-90e8-22a1f3f043e1\") " pod="openstack/dnsmasq-dns-58df884995-rf6pm" Mar 11 01:15:03 crc kubenswrapper[4744]: I0311 01:15:03.225533 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/8d727dc8-84d3-45b0-90e8-22a1f3f043e1-ovsdbserver-nb\") pod \"dnsmasq-dns-58df884995-rf6pm\" (UID: \"8d727dc8-84d3-45b0-90e8-22a1f3f043e1\") " pod="openstack/dnsmasq-dns-58df884995-rf6pm" Mar 11 01:15:03 crc kubenswrapper[4744]: I0311 01:15:03.225841 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/8d727dc8-84d3-45b0-90e8-22a1f3f043e1-dns-svc\") pod 
\"dnsmasq-dns-58df884995-rf6pm\" (UID: \"8d727dc8-84d3-45b0-90e8-22a1f3f043e1\") " pod="openstack/dnsmasq-dns-58df884995-rf6pm" Mar 11 01:15:03 crc kubenswrapper[4744]: I0311 01:15:03.226093 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8d727dc8-84d3-45b0-90e8-22a1f3f043e1-config\") pod \"dnsmasq-dns-58df884995-rf6pm\" (UID: \"8d727dc8-84d3-45b0-90e8-22a1f3f043e1\") " pod="openstack/dnsmasq-dns-58df884995-rf6pm" Mar 11 01:15:03 crc kubenswrapper[4744]: I0311 01:15:03.226641 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/8d727dc8-84d3-45b0-90e8-22a1f3f043e1-ovsdbserver-sb\") pod \"dnsmasq-dns-58df884995-rf6pm\" (UID: \"8d727dc8-84d3-45b0-90e8-22a1f3f043e1\") " pod="openstack/dnsmasq-dns-58df884995-rf6pm" Mar 11 01:15:03 crc kubenswrapper[4744]: I0311 01:15:03.255976 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mcgb9\" (UniqueName: \"kubernetes.io/projected/8d727dc8-84d3-45b0-90e8-22a1f3f043e1-kube-api-access-mcgb9\") pod \"dnsmasq-dns-58df884995-rf6pm\" (UID: \"8d727dc8-84d3-45b0-90e8-22a1f3f043e1\") " pod="openstack/dnsmasq-dns-58df884995-rf6pm" Mar 11 01:15:03 crc kubenswrapper[4744]: I0311 01:15:03.350928 4744 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-58df884995-rf6pm" Mar 11 01:15:03 crc kubenswrapper[4744]: W0311 01:15:03.819388 4744 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8d727dc8_84d3_45b0_90e8_22a1f3f043e1.slice/crio-59f86369282f5653ab25777de78d61824880c7dc557bbd24cfafc87878c95c23 WatchSource:0}: Error finding container 59f86369282f5653ab25777de78d61824880c7dc557bbd24cfafc87878c95c23: Status 404 returned error can't find the container with id 59f86369282f5653ab25777de78d61824880c7dc557bbd24cfafc87878c95c23 Mar 11 01:15:03 crc kubenswrapper[4744]: I0311 01:15:03.822666 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-58df884995-rf6pm"] Mar 11 01:15:04 crc kubenswrapper[4744]: I0311 01:15:04.140910 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-storage-0"] Mar 11 01:15:04 crc kubenswrapper[4744]: I0311 01:15:04.150951 4744 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-storage-0" Mar 11 01:15:04 crc kubenswrapper[4744]: I0311 01:15:04.152016 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-storage-0"] Mar 11 01:15:04 crc kubenswrapper[4744]: I0311 01:15:04.154333 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-files" Mar 11 01:15:04 crc kubenswrapper[4744]: I0311 01:15:04.154355 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-conf" Mar 11 01:15:04 crc kubenswrapper[4744]: I0311 01:15:04.154365 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-swift-dockercfg-cd6dj" Mar 11 01:15:04 crc kubenswrapper[4744]: I0311 01:15:04.154684 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-storage-config-data" Mar 11 01:15:04 crc kubenswrapper[4744]: I0311 01:15:04.173593 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-6q4fk" event={"ID":"e122b7f6-6664-4484-afc0-c5629ad3a7e3","Type":"ContainerStarted","Data":"ba1ca927cadae172ec238ab9545e61e8bb95a6fd668ed1689715b74b6c709b3d"} Mar 11 01:15:04 crc kubenswrapper[4744]: I0311 01:15:04.175666 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-58df884995-rf6pm" event={"ID":"8d727dc8-84d3-45b0-90e8-22a1f3f043e1","Type":"ContainerStarted","Data":"59f86369282f5653ab25777de78d61824880c7dc557bbd24cfafc87878c95c23"} Mar 11 01:15:04 crc kubenswrapper[4744]: I0311 01:15:04.178648 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-bd14-account-create-update-bhb66" event={"ID":"ec0bae40-f9bc-4bc7-81e3-684d3f8a6512","Type":"ContainerStarted","Data":"71bc0e3ddf9435d8cd0e3a439839dd457b30f5536cc470117a683a2983bf0e34"} Mar 11 01:15:04 crc kubenswrapper[4744]: I0311 01:15:04.243043 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"cache\" (UniqueName: \"kubernetes.io/empty-dir/524fac10-b874-465e-b4aa-221b6c689959-cache\") pod \"swift-storage-0\" (UID: \"524fac10-b874-465e-b4aa-221b6c689959\") " pod="openstack/swift-storage-0" Mar 11 01:15:04 crc kubenswrapper[4744]: I0311 01:15:04.243091 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/524fac10-b874-465e-b4aa-221b6c689959-combined-ca-bundle\") pod \"swift-storage-0\" (UID: \"524fac10-b874-465e-b4aa-221b6c689959\") " pod="openstack/swift-storage-0" Mar 11 01:15:04 crc kubenswrapper[4744]: I0311 01:15:04.243116 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/524fac10-b874-465e-b4aa-221b6c689959-etc-swift\") pod \"swift-storage-0\" (UID: \"524fac10-b874-465e-b4aa-221b6c689959\") " pod="openstack/swift-storage-0" Mar 11 01:15:04 crc kubenswrapper[4744]: I0311 01:15:04.243362 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/524fac10-b874-465e-b4aa-221b6c689959-lock\") pod \"swift-storage-0\" (UID: \"524fac10-b874-465e-b4aa-221b6c689959\") " pod="openstack/swift-storage-0" Mar 11 01:15:04 crc kubenswrapper[4744]: I0311 01:15:04.243440 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-64l2g\" (UniqueName: \"kubernetes.io/projected/524fac10-b874-465e-b4aa-221b6c689959-kube-api-access-64l2g\") pod \"swift-storage-0\" (UID: \"524fac10-b874-465e-b4aa-221b6c689959\") " pod="openstack/swift-storage-0" Mar 11 01:15:04 crc kubenswrapper[4744]: I0311 01:15:04.243591 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"swift-storage-0\" (UID: 
\"524fac10-b874-465e-b4aa-221b6c689959\") " pod="openstack/swift-storage-0" Mar 11 01:15:04 crc kubenswrapper[4744]: I0311 01:15:04.345796 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/524fac10-b874-465e-b4aa-221b6c689959-lock\") pod \"swift-storage-0\" (UID: \"524fac10-b874-465e-b4aa-221b6c689959\") " pod="openstack/swift-storage-0" Mar 11 01:15:04 crc kubenswrapper[4744]: I0311 01:15:04.345881 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-64l2g\" (UniqueName: \"kubernetes.io/projected/524fac10-b874-465e-b4aa-221b6c689959-kube-api-access-64l2g\") pod \"swift-storage-0\" (UID: \"524fac10-b874-465e-b4aa-221b6c689959\") " pod="openstack/swift-storage-0" Mar 11 01:15:04 crc kubenswrapper[4744]: I0311 01:15:04.345912 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"swift-storage-0\" (UID: \"524fac10-b874-465e-b4aa-221b6c689959\") " pod="openstack/swift-storage-0" Mar 11 01:15:04 crc kubenswrapper[4744]: I0311 01:15:04.345960 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/524fac10-b874-465e-b4aa-221b6c689959-cache\") pod \"swift-storage-0\" (UID: \"524fac10-b874-465e-b4aa-221b6c689959\") " pod="openstack/swift-storage-0" Mar 11 01:15:04 crc kubenswrapper[4744]: I0311 01:15:04.345983 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/524fac10-b874-465e-b4aa-221b6c689959-combined-ca-bundle\") pod \"swift-storage-0\" (UID: \"524fac10-b874-465e-b4aa-221b6c689959\") " pod="openstack/swift-storage-0" Mar 11 01:15:04 crc kubenswrapper[4744]: I0311 01:15:04.346005 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"etc-swift\" (UniqueName: \"kubernetes.io/projected/524fac10-b874-465e-b4aa-221b6c689959-etc-swift\") pod \"swift-storage-0\" (UID: \"524fac10-b874-465e-b4aa-221b6c689959\") " pod="openstack/swift-storage-0" Mar 11 01:15:04 crc kubenswrapper[4744]: E0311 01:15:04.346134 4744 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Mar 11 01:15:04 crc kubenswrapper[4744]: E0311 01:15:04.346148 4744 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Mar 11 01:15:04 crc kubenswrapper[4744]: E0311 01:15:04.346191 4744 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/524fac10-b874-465e-b4aa-221b6c689959-etc-swift podName:524fac10-b874-465e-b4aa-221b6c689959 nodeName:}" failed. No retries permitted until 2026-03-11 01:15:04.846175849 +0000 UTC m=+1261.650393454 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/524fac10-b874-465e-b4aa-221b6c689959-etc-swift") pod "swift-storage-0" (UID: "524fac10-b874-465e-b4aa-221b6c689959") : configmap "swift-ring-files" not found Mar 11 01:15:04 crc kubenswrapper[4744]: I0311 01:15:04.346356 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/524fac10-b874-465e-b4aa-221b6c689959-lock\") pod \"swift-storage-0\" (UID: \"524fac10-b874-465e-b4aa-221b6c689959\") " pod="openstack/swift-storage-0" Mar 11 01:15:04 crc kubenswrapper[4744]: I0311 01:15:04.346717 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/524fac10-b874-465e-b4aa-221b6c689959-cache\") pod \"swift-storage-0\" (UID: \"524fac10-b874-465e-b4aa-221b6c689959\") " pod="openstack/swift-storage-0" Mar 11 01:15:04 crc kubenswrapper[4744]: I0311 01:15:04.346925 4744 operation_generator.go:580] 
"MountVolume.MountDevice succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"swift-storage-0\" (UID: \"524fac10-b874-465e-b4aa-221b6c689959\") device mount path \"/mnt/openstack/pv09\"" pod="openstack/swift-storage-0" Mar 11 01:15:04 crc kubenswrapper[4744]: I0311 01:15:04.362037 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/524fac10-b874-465e-b4aa-221b6c689959-combined-ca-bundle\") pod \"swift-storage-0\" (UID: \"524fac10-b874-465e-b4aa-221b6c689959\") " pod="openstack/swift-storage-0" Mar 11 01:15:04 crc kubenswrapper[4744]: I0311 01:15:04.375975 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-ring-rebalance-vgn45"] Mar 11 01:15:04 crc kubenswrapper[4744]: I0311 01:15:04.379603 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-vgn45" Mar 11 01:15:04 crc kubenswrapper[4744]: I0311 01:15:04.381916 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-64l2g\" (UniqueName: \"kubernetes.io/projected/524fac10-b874-465e-b4aa-221b6c689959-kube-api-access-64l2g\") pod \"swift-storage-0\" (UID: \"524fac10-b874-465e-b4aa-221b6c689959\") " pod="openstack/swift-storage-0" Mar 11 01:15:04 crc kubenswrapper[4744]: I0311 01:15:04.386720 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"swift-storage-0\" (UID: \"524fac10-b874-465e-b4aa-221b6c689959\") " pod="openstack/swift-storage-0" Mar 11 01:15:04 crc kubenswrapper[4744]: I0311 01:15:04.386965 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-config-data" Mar 11 01:15:04 crc kubenswrapper[4744]: I0311 01:15:04.387025 4744 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openstack"/"swift-ring-scripts" Mar 11 01:15:04 crc kubenswrapper[4744]: I0311 01:15:04.387123 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-proxy-config-data" Mar 11 01:15:04 crc kubenswrapper[4744]: I0311 01:15:04.390650 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-ring-rebalance-vgn45"] Mar 11 01:15:04 crc kubenswrapper[4744]: I0311 01:15:04.549283 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/20d64d92-7649-4a5e-af99-a92a08b47ecf-combined-ca-bundle\") pod \"swift-ring-rebalance-vgn45\" (UID: \"20d64d92-7649-4a5e-af99-a92a08b47ecf\") " pod="openstack/swift-ring-rebalance-vgn45" Mar 11 01:15:04 crc kubenswrapper[4744]: I0311 01:15:04.549335 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/20d64d92-7649-4a5e-af99-a92a08b47ecf-scripts\") pod \"swift-ring-rebalance-vgn45\" (UID: \"20d64d92-7649-4a5e-af99-a92a08b47ecf\") " pod="openstack/swift-ring-rebalance-vgn45" Mar 11 01:15:04 crc kubenswrapper[4744]: I0311 01:15:04.549369 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/20d64d92-7649-4a5e-af99-a92a08b47ecf-dispersionconf\") pod \"swift-ring-rebalance-vgn45\" (UID: \"20d64d92-7649-4a5e-af99-a92a08b47ecf\") " pod="openstack/swift-ring-rebalance-vgn45" Mar 11 01:15:04 crc kubenswrapper[4744]: I0311 01:15:04.549460 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rnxw4\" (UniqueName: \"kubernetes.io/projected/20d64d92-7649-4a5e-af99-a92a08b47ecf-kube-api-access-rnxw4\") pod \"swift-ring-rebalance-vgn45\" (UID: \"20d64d92-7649-4a5e-af99-a92a08b47ecf\") " pod="openstack/swift-ring-rebalance-vgn45" Mar 11 
01:15:04 crc kubenswrapper[4744]: I0311 01:15:04.549482 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/20d64d92-7649-4a5e-af99-a92a08b47ecf-etc-swift\") pod \"swift-ring-rebalance-vgn45\" (UID: \"20d64d92-7649-4a5e-af99-a92a08b47ecf\") " pod="openstack/swift-ring-rebalance-vgn45" Mar 11 01:15:04 crc kubenswrapper[4744]: I0311 01:15:04.549506 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/20d64d92-7649-4a5e-af99-a92a08b47ecf-swiftconf\") pod \"swift-ring-rebalance-vgn45\" (UID: \"20d64d92-7649-4a5e-af99-a92a08b47ecf\") " pod="openstack/swift-ring-rebalance-vgn45" Mar 11 01:15:04 crc kubenswrapper[4744]: I0311 01:15:04.549541 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/20d64d92-7649-4a5e-af99-a92a08b47ecf-ring-data-devices\") pod \"swift-ring-rebalance-vgn45\" (UID: \"20d64d92-7649-4a5e-af99-a92a08b47ecf\") " pod="openstack/swift-ring-rebalance-vgn45" Mar 11 01:15:04 crc kubenswrapper[4744]: I0311 01:15:04.651330 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rnxw4\" (UniqueName: \"kubernetes.io/projected/20d64d92-7649-4a5e-af99-a92a08b47ecf-kube-api-access-rnxw4\") pod \"swift-ring-rebalance-vgn45\" (UID: \"20d64d92-7649-4a5e-af99-a92a08b47ecf\") " pod="openstack/swift-ring-rebalance-vgn45" Mar 11 01:15:04 crc kubenswrapper[4744]: I0311 01:15:04.651380 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/20d64d92-7649-4a5e-af99-a92a08b47ecf-etc-swift\") pod \"swift-ring-rebalance-vgn45\" (UID: \"20d64d92-7649-4a5e-af99-a92a08b47ecf\") " pod="openstack/swift-ring-rebalance-vgn45" Mar 11 01:15:04 crc 
kubenswrapper[4744]: I0311 01:15:04.651406 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/20d64d92-7649-4a5e-af99-a92a08b47ecf-swiftconf\") pod \"swift-ring-rebalance-vgn45\" (UID: \"20d64d92-7649-4a5e-af99-a92a08b47ecf\") " pod="openstack/swift-ring-rebalance-vgn45" Mar 11 01:15:04 crc kubenswrapper[4744]: I0311 01:15:04.651422 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/20d64d92-7649-4a5e-af99-a92a08b47ecf-ring-data-devices\") pod \"swift-ring-rebalance-vgn45\" (UID: \"20d64d92-7649-4a5e-af99-a92a08b47ecf\") " pod="openstack/swift-ring-rebalance-vgn45" Mar 11 01:15:04 crc kubenswrapper[4744]: I0311 01:15:04.651476 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/20d64d92-7649-4a5e-af99-a92a08b47ecf-combined-ca-bundle\") pod \"swift-ring-rebalance-vgn45\" (UID: \"20d64d92-7649-4a5e-af99-a92a08b47ecf\") " pod="openstack/swift-ring-rebalance-vgn45" Mar 11 01:15:04 crc kubenswrapper[4744]: I0311 01:15:04.651499 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/20d64d92-7649-4a5e-af99-a92a08b47ecf-scripts\") pod \"swift-ring-rebalance-vgn45\" (UID: \"20d64d92-7649-4a5e-af99-a92a08b47ecf\") " pod="openstack/swift-ring-rebalance-vgn45" Mar 11 01:15:04 crc kubenswrapper[4744]: I0311 01:15:04.651536 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/20d64d92-7649-4a5e-af99-a92a08b47ecf-dispersionconf\") pod \"swift-ring-rebalance-vgn45\" (UID: \"20d64d92-7649-4a5e-af99-a92a08b47ecf\") " pod="openstack/swift-ring-rebalance-vgn45" Mar 11 01:15:04 crc kubenswrapper[4744]: I0311 01:15:04.652984 4744 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/20d64d92-7649-4a5e-af99-a92a08b47ecf-ring-data-devices\") pod \"swift-ring-rebalance-vgn45\" (UID: \"20d64d92-7649-4a5e-af99-a92a08b47ecf\") " pod="openstack/swift-ring-rebalance-vgn45" Mar 11 01:15:04 crc kubenswrapper[4744]: I0311 01:15:04.662038 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/20d64d92-7649-4a5e-af99-a92a08b47ecf-scripts\") pod \"swift-ring-rebalance-vgn45\" (UID: \"20d64d92-7649-4a5e-af99-a92a08b47ecf\") " pod="openstack/swift-ring-rebalance-vgn45" Mar 11 01:15:04 crc kubenswrapper[4744]: I0311 01:15:04.662139 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/20d64d92-7649-4a5e-af99-a92a08b47ecf-dispersionconf\") pod \"swift-ring-rebalance-vgn45\" (UID: \"20d64d92-7649-4a5e-af99-a92a08b47ecf\") " pod="openstack/swift-ring-rebalance-vgn45" Mar 11 01:15:04 crc kubenswrapper[4744]: I0311 01:15:04.662292 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/20d64d92-7649-4a5e-af99-a92a08b47ecf-combined-ca-bundle\") pod \"swift-ring-rebalance-vgn45\" (UID: \"20d64d92-7649-4a5e-af99-a92a08b47ecf\") " pod="openstack/swift-ring-rebalance-vgn45" Mar 11 01:15:04 crc kubenswrapper[4744]: I0311 01:15:04.662624 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/20d64d92-7649-4a5e-af99-a92a08b47ecf-etc-swift\") pod \"swift-ring-rebalance-vgn45\" (UID: \"20d64d92-7649-4a5e-af99-a92a08b47ecf\") " pod="openstack/swift-ring-rebalance-vgn45" Mar 11 01:15:04 crc kubenswrapper[4744]: I0311 01:15:04.662758 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"swiftconf\" (UniqueName: 
\"kubernetes.io/secret/20d64d92-7649-4a5e-af99-a92a08b47ecf-swiftconf\") pod \"swift-ring-rebalance-vgn45\" (UID: \"20d64d92-7649-4a5e-af99-a92a08b47ecf\") " pod="openstack/swift-ring-rebalance-vgn45" Mar 11 01:15:04 crc kubenswrapper[4744]: I0311 01:15:04.686748 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rnxw4\" (UniqueName: \"kubernetes.io/projected/20d64d92-7649-4a5e-af99-a92a08b47ecf-kube-api-access-rnxw4\") pod \"swift-ring-rebalance-vgn45\" (UID: \"20d64d92-7649-4a5e-af99-a92a08b47ecf\") " pod="openstack/swift-ring-rebalance-vgn45" Mar 11 01:15:04 crc kubenswrapper[4744]: I0311 01:15:04.707072 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-vgn45" Mar 11 01:15:04 crc kubenswrapper[4744]: I0311 01:15:04.854977 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/524fac10-b874-465e-b4aa-221b6c689959-etc-swift\") pod \"swift-storage-0\" (UID: \"524fac10-b874-465e-b4aa-221b6c689959\") " pod="openstack/swift-storage-0" Mar 11 01:15:04 crc kubenswrapper[4744]: E0311 01:15:04.855224 4744 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Mar 11 01:15:04 crc kubenswrapper[4744]: E0311 01:15:04.855259 4744 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Mar 11 01:15:04 crc kubenswrapper[4744]: E0311 01:15:04.855335 4744 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/524fac10-b874-465e-b4aa-221b6c689959-etc-swift podName:524fac10-b874-465e-b4aa-221b6c689959 nodeName:}" failed. No retries permitted until 2026-03-11 01:15:05.855311471 +0000 UTC m=+1262.659529076 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/524fac10-b874-465e-b4aa-221b6c689959-etc-swift") pod "swift-storage-0" (UID: "524fac10-b874-465e-b4aa-221b6c689959") : configmap "swift-ring-files" not found Mar 11 01:15:04 crc kubenswrapper[4744]: I0311 01:15:04.911842 4744 scope.go:117] "RemoveContainer" containerID="70710e4f74cdd2d5659fed13d5c78e26a14bc6f4243e778718bbd292280e80a2" Mar 11 01:15:05 crc kubenswrapper[4744]: I0311 01:15:05.191063 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-58df884995-rf6pm" event={"ID":"8d727dc8-84d3-45b0-90e8-22a1f3f043e1","Type":"ContainerStarted","Data":"063a7c95da290d544b7923701cac3b4e437141eacc1fff4c7a6f62b0e75c7ba4"} Mar 11 01:15:05 crc kubenswrapper[4744]: I0311 01:15:05.213165 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-ring-rebalance-vgn45"] Mar 11 01:15:05 crc kubenswrapper[4744]: W0311 01:15:05.216168 4744 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod20d64d92_7649_4a5e_af99_a92a08b47ecf.slice/crio-31c7c3063d5b2d93b4570ef046e6412085ca717897c334d78290472c58d69206 WatchSource:0}: Error finding container 31c7c3063d5b2d93b4570ef046e6412085ca717897c334d78290472c58d69206: Status 404 returned error can't find the container with id 31c7c3063d5b2d93b4570ef046e6412085ca717897c334d78290472c58d69206 Mar 11 01:15:05 crc kubenswrapper[4744]: I0311 01:15:05.588996 4744 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-b47ddbdf5-vpw7m" podUID="c9410349-9a4f-42f6-81e4-dd4dd66abdcc" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.114:5353: connect: connection refused" Mar 11 01:15:05 crc kubenswrapper[4744]: I0311 01:15:05.872871 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: 
\"kubernetes.io/projected/524fac10-b874-465e-b4aa-221b6c689959-etc-swift\") pod \"swift-storage-0\" (UID: \"524fac10-b874-465e-b4aa-221b6c689959\") " pod="openstack/swift-storage-0" Mar 11 01:15:05 crc kubenswrapper[4744]: E0311 01:15:05.873081 4744 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Mar 11 01:15:05 crc kubenswrapper[4744]: E0311 01:15:05.873105 4744 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Mar 11 01:15:05 crc kubenswrapper[4744]: E0311 01:15:05.873157 4744 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/524fac10-b874-465e-b4aa-221b6c689959-etc-swift podName:524fac10-b874-465e-b4aa-221b6c689959 nodeName:}" failed. No retries permitted until 2026-03-11 01:15:07.8731397 +0000 UTC m=+1264.677357305 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/524fac10-b874-465e-b4aa-221b6c689959-etc-swift") pod "swift-storage-0" (UID: "524fac10-b874-465e-b4aa-221b6c689959") : configmap "swift-ring-files" not found Mar 11 01:15:06 crc kubenswrapper[4744]: I0311 01:15:06.052683 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-db-create-kh4gc"] Mar 11 01:15:06 crc kubenswrapper[4744]: I0311 01:15:06.054840 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-kh4gc" Mar 11 01:15:06 crc kubenswrapper[4744]: I0311 01:15:06.063783 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-create-kh4gc"] Mar 11 01:15:06 crc kubenswrapper[4744]: I0311 01:15:06.160269 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-61e2-account-create-update-6wjlk"] Mar 11 01:15:06 crc kubenswrapper[4744]: I0311 01:15:06.161961 4744 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-61e2-account-create-update-6wjlk" Mar 11 01:15:06 crc kubenswrapper[4744]: I0311 01:15:06.163999 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-db-secret" Mar 11 01:15:06 crc kubenswrapper[4744]: I0311 01:15:06.168578 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-61e2-account-create-update-6wjlk"] Mar 11 01:15:06 crc kubenswrapper[4744]: I0311 01:15:06.181891 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d138bcef-d88d-4af3-8f41-f4804e583670-operator-scripts\") pod \"glance-db-create-kh4gc\" (UID: \"d138bcef-d88d-4af3-8f41-f4804e583670\") " pod="openstack/glance-db-create-kh4gc" Mar 11 01:15:06 crc kubenswrapper[4744]: I0311 01:15:06.182014 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qw492\" (UniqueName: \"kubernetes.io/projected/d138bcef-d88d-4af3-8f41-f4804e583670-kube-api-access-qw492\") pod \"glance-db-create-kh4gc\" (UID: \"d138bcef-d88d-4af3-8f41-f4804e583670\") " pod="openstack/glance-db-create-kh4gc" Mar 11 01:15:06 crc kubenswrapper[4744]: I0311 01:15:06.202856 4744 generic.go:334] "Generic (PLEG): container finished" podID="8d727dc8-84d3-45b0-90e8-22a1f3f043e1" containerID="063a7c95da290d544b7923701cac3b4e437141eacc1fff4c7a6f62b0e75c7ba4" exitCode=0 Mar 11 01:15:06 crc kubenswrapper[4744]: I0311 01:15:06.202922 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-58df884995-rf6pm" event={"ID":"8d727dc8-84d3-45b0-90e8-22a1f3f043e1","Type":"ContainerDied","Data":"063a7c95da290d544b7923701cac3b4e437141eacc1fff4c7a6f62b0e75c7ba4"} Mar 11 01:15:06 crc kubenswrapper[4744]: I0311 01:15:06.203991 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-vgn45" 
event={"ID":"20d64d92-7649-4a5e-af99-a92a08b47ecf","Type":"ContainerStarted","Data":"31c7c3063d5b2d93b4570ef046e6412085ca717897c334d78290472c58d69206"} Mar 11 01:15:06 crc kubenswrapper[4744]: I0311 01:15:06.283784 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d138bcef-d88d-4af3-8f41-f4804e583670-operator-scripts\") pod \"glance-db-create-kh4gc\" (UID: \"d138bcef-d88d-4af3-8f41-f4804e583670\") " pod="openstack/glance-db-create-kh4gc" Mar 11 01:15:06 crc kubenswrapper[4744]: I0311 01:15:06.283884 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4f94s\" (UniqueName: \"kubernetes.io/projected/3946cf42-7442-4fad-b561-2050f9d26d8f-kube-api-access-4f94s\") pod \"glance-61e2-account-create-update-6wjlk\" (UID: \"3946cf42-7442-4fad-b561-2050f9d26d8f\") " pod="openstack/glance-61e2-account-create-update-6wjlk" Mar 11 01:15:06 crc kubenswrapper[4744]: I0311 01:15:06.283958 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3946cf42-7442-4fad-b561-2050f9d26d8f-operator-scripts\") pod \"glance-61e2-account-create-update-6wjlk\" (UID: \"3946cf42-7442-4fad-b561-2050f9d26d8f\") " pod="openstack/glance-61e2-account-create-update-6wjlk" Mar 11 01:15:06 crc kubenswrapper[4744]: I0311 01:15:06.283982 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qw492\" (UniqueName: \"kubernetes.io/projected/d138bcef-d88d-4af3-8f41-f4804e583670-kube-api-access-qw492\") pod \"glance-db-create-kh4gc\" (UID: \"d138bcef-d88d-4af3-8f41-f4804e583670\") " pod="openstack/glance-db-create-kh4gc" Mar 11 01:15:06 crc kubenswrapper[4744]: I0311 01:15:06.284560 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: 
\"kubernetes.io/configmap/d138bcef-d88d-4af3-8f41-f4804e583670-operator-scripts\") pod \"glance-db-create-kh4gc\" (UID: \"d138bcef-d88d-4af3-8f41-f4804e583670\") " pod="openstack/glance-db-create-kh4gc" Mar 11 01:15:06 crc kubenswrapper[4744]: I0311 01:15:06.307131 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qw492\" (UniqueName: \"kubernetes.io/projected/d138bcef-d88d-4af3-8f41-f4804e583670-kube-api-access-qw492\") pod \"glance-db-create-kh4gc\" (UID: \"d138bcef-d88d-4af3-8f41-f4804e583670\") " pod="openstack/glance-db-create-kh4gc" Mar 11 01:15:06 crc kubenswrapper[4744]: I0311 01:15:06.371445 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-kh4gc" Mar 11 01:15:06 crc kubenswrapper[4744]: I0311 01:15:06.385920 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4f94s\" (UniqueName: \"kubernetes.io/projected/3946cf42-7442-4fad-b561-2050f9d26d8f-kube-api-access-4f94s\") pod \"glance-61e2-account-create-update-6wjlk\" (UID: \"3946cf42-7442-4fad-b561-2050f9d26d8f\") " pod="openstack/glance-61e2-account-create-update-6wjlk" Mar 11 01:15:06 crc kubenswrapper[4744]: I0311 01:15:06.386003 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3946cf42-7442-4fad-b561-2050f9d26d8f-operator-scripts\") pod \"glance-61e2-account-create-update-6wjlk\" (UID: \"3946cf42-7442-4fad-b561-2050f9d26d8f\") " pod="openstack/glance-61e2-account-create-update-6wjlk" Mar 11 01:15:06 crc kubenswrapper[4744]: I0311 01:15:06.386850 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3946cf42-7442-4fad-b561-2050f9d26d8f-operator-scripts\") pod \"glance-61e2-account-create-update-6wjlk\" (UID: \"3946cf42-7442-4fad-b561-2050f9d26d8f\") " 
pod="openstack/glance-61e2-account-create-update-6wjlk" Mar 11 01:15:06 crc kubenswrapper[4744]: I0311 01:15:06.402848 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4f94s\" (UniqueName: \"kubernetes.io/projected/3946cf42-7442-4fad-b561-2050f9d26d8f-kube-api-access-4f94s\") pod \"glance-61e2-account-create-update-6wjlk\" (UID: \"3946cf42-7442-4fad-b561-2050f9d26d8f\") " pod="openstack/glance-61e2-account-create-update-6wjlk" Mar 11 01:15:06 crc kubenswrapper[4744]: I0311 01:15:06.476487 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-61e2-account-create-update-6wjlk" Mar 11 01:15:06 crc kubenswrapper[4744]: I0311 01:15:06.848457 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-create-kh4gc"] Mar 11 01:15:06 crc kubenswrapper[4744]: W0311 01:15:06.856274 4744 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd138bcef_d88d_4af3_8f41_f4804e583670.slice/crio-66a52b823b3e65b205cc595463d950c61bc97337ac253b6b4d6ea9babd1a0b50 WatchSource:0}: Error finding container 66a52b823b3e65b205cc595463d950c61bc97337ac253b6b4d6ea9babd1a0b50: Status 404 returned error can't find the container with id 66a52b823b3e65b205cc595463d950c61bc97337ac253b6b4d6ea9babd1a0b50 Mar 11 01:15:07 crc kubenswrapper[4744]: I0311 01:15:07.003268 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-61e2-account-create-update-6wjlk"] Mar 11 01:15:07 crc kubenswrapper[4744]: W0311 01:15:07.040499 4744 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3946cf42_7442_4fad_b561_2050f9d26d8f.slice/crio-9095743c316d8cd397ae1bebb16e22092c0b5010b5e3a5e78f039d6acf146307 WatchSource:0}: Error finding container 9095743c316d8cd397ae1bebb16e22092c0b5010b5e3a5e78f039d6acf146307: Status 404 returned error can't find the 
container with id 9095743c316d8cd397ae1bebb16e22092c0b5010b5e3a5e78f039d6acf146307 Mar 11 01:15:07 crc kubenswrapper[4744]: I0311 01:15:07.182478 4744 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-b47ddbdf5-vpw7m" Mar 11 01:15:07 crc kubenswrapper[4744]: I0311 01:15:07.217449 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-4wphb" event={"ID":"3d70370e-fce0-48d8-9856-06e04916e905","Type":"ContainerStarted","Data":"3b99519301672447b89814ce6fee64db32f0b7eb950348bbc647f1820b678ac0"} Mar 11 01:15:07 crc kubenswrapper[4744]: I0311 01:15:07.225186 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-58df884995-rf6pm" event={"ID":"8d727dc8-84d3-45b0-90e8-22a1f3f043e1","Type":"ContainerStarted","Data":"6277045b67820d0a1857d93e6e8e3ca7197495ee0c3cd0064b91a9e4d115b55b"} Mar 11 01:15:07 crc kubenswrapper[4744]: I0311 01:15:07.225853 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-58df884995-rf6pm" Mar 11 01:15:07 crc kubenswrapper[4744]: I0311 01:15:07.228646 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-61e2-account-create-update-6wjlk" event={"ID":"3946cf42-7442-4fad-b561-2050f9d26d8f","Type":"ContainerStarted","Data":"9095743c316d8cd397ae1bebb16e22092c0b5010b5e3a5e78f039d6acf146307"} Mar 11 01:15:07 crc kubenswrapper[4744]: I0311 01:15:07.237362 4744 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-db-create-4wphb" podStartSLOduration=5.237344495 podStartE2EDuration="5.237344495s" podCreationTimestamp="2026-03-11 01:15:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-11 01:15:07.229558324 +0000 UTC m=+1264.033775919" watchObservedRunningTime="2026-03-11 01:15:07.237344495 +0000 UTC m=+1264.041562100" Mar 11 01:15:07 crc 
kubenswrapper[4744]: I0311 01:15:07.239593 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-bd14-account-create-update-bhb66" event={"ID":"ec0bae40-f9bc-4bc7-81e3-684d3f8a6512","Type":"ContainerStarted","Data":"1aa521d4cbd225ff73bcece76cd081aee159ae63e3b4b1ec498f897e70556651"} Mar 11 01:15:07 crc kubenswrapper[4744]: I0311 01:15:07.243934 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-b47ddbdf5-vpw7m" event={"ID":"c9410349-9a4f-42f6-81e4-dd4dd66abdcc","Type":"ContainerDied","Data":"1fc6e944d84dc837a2f5776dc7a3074d088d374ca33393f450196c47b4775d31"} Mar 11 01:15:07 crc kubenswrapper[4744]: I0311 01:15:07.243966 4744 scope.go:117] "RemoveContainer" containerID="2b406a55e25e1a06bba16de39922948d60d13a247841bfd1c50d5a47c7f6510a" Mar 11 01:15:07 crc kubenswrapper[4744]: I0311 01:15:07.243939 4744 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-b47ddbdf5-vpw7m" Mar 11 01:15:07 crc kubenswrapper[4744]: I0311 01:15:07.246155 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-6q4fk" event={"ID":"e122b7f6-6664-4484-afc0-c5629ad3a7e3","Type":"ContainerStarted","Data":"d90c7a6c9305d356e7403d291e2f081e429e4966cb7e9f20d9b471cf46eddf16"} Mar 11 01:15:07 crc kubenswrapper[4744]: I0311 01:15:07.251858 4744 generic.go:334] "Generic (PLEG): container finished" podID="ab006bc8-78da-42a8-9322-f52588f20622" containerID="cdbd07cc878eb88c67ce05d3b1694154a3da2389f594ee97e43fe4ee869192d7" exitCode=0 Mar 11 01:15:07 crc kubenswrapper[4744]: I0311 01:15:07.251959 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29553195-rs6nk" event={"ID":"ab006bc8-78da-42a8-9322-f52588f20622","Type":"ContainerDied","Data":"cdbd07cc878eb88c67ce05d3b1694154a3da2389f594ee97e43fe4ee869192d7"} Mar 11 01:15:07 crc kubenswrapper[4744]: I0311 01:15:07.253605 4744 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-58df884995-rf6pm" podStartSLOduration=5.253587609 podStartE2EDuration="5.253587609s" podCreationTimestamp="2026-03-11 01:15:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-11 01:15:07.249307326 +0000 UTC m=+1264.053524951" watchObservedRunningTime="2026-03-11 01:15:07.253587609 +0000 UTC m=+1264.057805214" Mar 11 01:15:07 crc kubenswrapper[4744]: I0311 01:15:07.255463 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-kh4gc" event={"ID":"d138bcef-d88d-4af3-8f41-f4804e583670","Type":"ContainerStarted","Data":"d4e5360567689983645dc7f299ef53ab8dbb200c7bcb992f0ff7fcc3a8bffd18"} Mar 11 01:15:07 crc kubenswrapper[4744]: I0311 01:15:07.255492 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-kh4gc" event={"ID":"d138bcef-d88d-4af3-8f41-f4804e583670","Type":"ContainerStarted","Data":"66a52b823b3e65b205cc595463d950c61bc97337ac253b6b4d6ea9babd1a0b50"} Mar 11 01:15:07 crc kubenswrapper[4744]: I0311 01:15:07.276605 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-ab0f-account-create-update-fj8mx" event={"ID":"da71c584-ebbd-42c0-96c9-716bbd47efce","Type":"ContainerStarted","Data":"eec6090a40721da764b37de1f739da92e8819f033338f4cf2e5731f03131daae"} Mar 11 01:15:07 crc kubenswrapper[4744]: I0311 01:15:07.303525 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c9410349-9a4f-42f6-81e4-dd4dd66abdcc-config\") pod \"c9410349-9a4f-42f6-81e4-dd4dd66abdcc\" (UID: \"c9410349-9a4f-42f6-81e4-dd4dd66abdcc\") " Mar 11 01:15:07 crc kubenswrapper[4744]: I0311 01:15:07.303705 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-z8qmr\" (UniqueName: 
\"kubernetes.io/projected/c9410349-9a4f-42f6-81e4-dd4dd66abdcc-kube-api-access-z8qmr\") pod \"c9410349-9a4f-42f6-81e4-dd4dd66abdcc\" (UID: \"c9410349-9a4f-42f6-81e4-dd4dd66abdcc\") " Mar 11 01:15:07 crc kubenswrapper[4744]: I0311 01:15:07.303807 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/c9410349-9a4f-42f6-81e4-dd4dd66abdcc-ovsdbserver-nb\") pod \"c9410349-9a4f-42f6-81e4-dd4dd66abdcc\" (UID: \"c9410349-9a4f-42f6-81e4-dd4dd66abdcc\") " Mar 11 01:15:07 crc kubenswrapper[4744]: I0311 01:15:07.303845 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c9410349-9a4f-42f6-81e4-dd4dd66abdcc-dns-svc\") pod \"c9410349-9a4f-42f6-81e4-dd4dd66abdcc\" (UID: \"c9410349-9a4f-42f6-81e4-dd4dd66abdcc\") " Mar 11 01:15:07 crc kubenswrapper[4744]: I0311 01:15:07.306951 4744 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-bd14-account-create-update-bhb66" podStartSLOduration=5.306927803 podStartE2EDuration="5.306927803s" podCreationTimestamp="2026-03-11 01:15:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-11 01:15:07.26846177 +0000 UTC m=+1264.072679375" watchObservedRunningTime="2026-03-11 01:15:07.306927803 +0000 UTC m=+1264.111145408" Mar 11 01:15:07 crc kubenswrapper[4744]: I0311 01:15:07.312256 4744 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-db-create-6q4fk" podStartSLOduration=5.312220707 podStartE2EDuration="5.312220707s" podCreationTimestamp="2026-03-11 01:15:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-11 01:15:07.3029865 +0000 UTC m=+1264.107204105" watchObservedRunningTime="2026-03-11 01:15:07.312220707 
+0000 UTC m=+1264.116438312" Mar 11 01:15:07 crc kubenswrapper[4744]: I0311 01:15:07.315721 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c9410349-9a4f-42f6-81e4-dd4dd66abdcc-kube-api-access-z8qmr" (OuterVolumeSpecName: "kube-api-access-z8qmr") pod "c9410349-9a4f-42f6-81e4-dd4dd66abdcc" (UID: "c9410349-9a4f-42f6-81e4-dd4dd66abdcc"). InnerVolumeSpecName "kube-api-access-z8qmr". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 01:15:07 crc kubenswrapper[4744]: I0311 01:15:07.318092 4744 scope.go:117] "RemoveContainer" containerID="3af06364499ebb35c2bd4fdb456a24fb081e41985deb636fdcb20b80b655adea" Mar 11 01:15:07 crc kubenswrapper[4744]: I0311 01:15:07.335579 4744 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-ab0f-account-create-update-fj8mx" podStartSLOduration=5.3355618400000004 podStartE2EDuration="5.33556184s" podCreationTimestamp="2026-03-11 01:15:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-11 01:15:07.315461357 +0000 UTC m=+1264.119678962" watchObservedRunningTime="2026-03-11 01:15:07.33556184 +0000 UTC m=+1264.139779445" Mar 11 01:15:07 crc kubenswrapper[4744]: I0311 01:15:07.348430 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c9410349-9a4f-42f6-81e4-dd4dd66abdcc-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "c9410349-9a4f-42f6-81e4-dd4dd66abdcc" (UID: "c9410349-9a4f-42f6-81e4-dd4dd66abdcc"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 11 01:15:07 crc kubenswrapper[4744]: I0311 01:15:07.353860 4744 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-db-create-kh4gc" podStartSLOduration=1.353841967 podStartE2EDuration="1.353841967s" podCreationTimestamp="2026-03-11 01:15:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-11 01:15:07.340006518 +0000 UTC m=+1264.144224123" watchObservedRunningTime="2026-03-11 01:15:07.353841967 +0000 UTC m=+1264.158059582" Mar 11 01:15:07 crc kubenswrapper[4744]: I0311 01:15:07.359186 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c9410349-9a4f-42f6-81e4-dd4dd66abdcc-config" (OuterVolumeSpecName: "config") pod "c9410349-9a4f-42f6-81e4-dd4dd66abdcc" (UID: "c9410349-9a4f-42f6-81e4-dd4dd66abdcc"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 11 01:15:07 crc kubenswrapper[4744]: I0311 01:15:07.360292 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c9410349-9a4f-42f6-81e4-dd4dd66abdcc-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "c9410349-9a4f-42f6-81e4-dd4dd66abdcc" (UID: "c9410349-9a4f-42f6-81e4-dd4dd66abdcc"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 11 01:15:07 crc kubenswrapper[4744]: I0311 01:15:07.408977 4744 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-z8qmr\" (UniqueName: \"kubernetes.io/projected/c9410349-9a4f-42f6-81e4-dd4dd66abdcc-kube-api-access-z8qmr\") on node \"crc\" DevicePath \"\"" Mar 11 01:15:07 crc kubenswrapper[4744]: I0311 01:15:07.409010 4744 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/c9410349-9a4f-42f6-81e4-dd4dd66abdcc-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 11 01:15:07 crc kubenswrapper[4744]: I0311 01:15:07.409018 4744 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c9410349-9a4f-42f6-81e4-dd4dd66abdcc-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 11 01:15:07 crc kubenswrapper[4744]: I0311 01:15:07.409027 4744 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c9410349-9a4f-42f6-81e4-dd4dd66abdcc-config\") on node \"crc\" DevicePath \"\"" Mar 11 01:15:07 crc kubenswrapper[4744]: I0311 01:15:07.602368 4744 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-b47ddbdf5-vpw7m"] Mar 11 01:15:07 crc kubenswrapper[4744]: I0311 01:15:07.611995 4744 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-b47ddbdf5-vpw7m"] Mar 11 01:15:07 crc kubenswrapper[4744]: I0311 01:15:07.867088 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/root-account-create-update-rz8p7"] Mar 11 01:15:07 crc kubenswrapper[4744]: E0311 01:15:07.867821 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c9410349-9a4f-42f6-81e4-dd4dd66abdcc" containerName="init" Mar 11 01:15:07 crc kubenswrapper[4744]: I0311 01:15:07.867901 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="c9410349-9a4f-42f6-81e4-dd4dd66abdcc" containerName="init" Mar 11 01:15:07 crc 
kubenswrapper[4744]: E0311 01:15:07.867962 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c9410349-9a4f-42f6-81e4-dd4dd66abdcc" containerName="dnsmasq-dns" Mar 11 01:15:07 crc kubenswrapper[4744]: I0311 01:15:07.867972 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="c9410349-9a4f-42f6-81e4-dd4dd66abdcc" containerName="dnsmasq-dns" Mar 11 01:15:07 crc kubenswrapper[4744]: I0311 01:15:07.868276 4744 memory_manager.go:354] "RemoveStaleState removing state" podUID="c9410349-9a4f-42f6-81e4-dd4dd66abdcc" containerName="dnsmasq-dns" Mar 11 01:15:07 crc kubenswrapper[4744]: I0311 01:15:07.869109 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-rz8p7" Mar 11 01:15:07 crc kubenswrapper[4744]: I0311 01:15:07.873856 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-mariadb-root-db-secret" Mar 11 01:15:07 crc kubenswrapper[4744]: I0311 01:15:07.879875 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-rz8p7"] Mar 11 01:15:07 crc kubenswrapper[4744]: I0311 01:15:07.919244 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/524fac10-b874-465e-b4aa-221b6c689959-etc-swift\") pod \"swift-storage-0\" (UID: \"524fac10-b874-465e-b4aa-221b6c689959\") " pod="openstack/swift-storage-0" Mar 11 01:15:07 crc kubenswrapper[4744]: E0311 01:15:07.919551 4744 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Mar 11 01:15:07 crc kubenswrapper[4744]: E0311 01:15:07.919574 4744 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Mar 11 01:15:07 crc kubenswrapper[4744]: E0311 01:15:07.919617 4744 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/projected/524fac10-b874-465e-b4aa-221b6c689959-etc-swift podName:524fac10-b874-465e-b4aa-221b6c689959 nodeName:}" failed. No retries permitted until 2026-03-11 01:15:11.919602143 +0000 UTC m=+1268.723819748 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/524fac10-b874-465e-b4aa-221b6c689959-etc-swift") pod "swift-storage-0" (UID: "524fac10-b874-465e-b4aa-221b6c689959") : configmap "swift-ring-files" not found Mar 11 01:15:07 crc kubenswrapper[4744]: I0311 01:15:07.990786 4744 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c9410349-9a4f-42f6-81e4-dd4dd66abdcc" path="/var/lib/kubelet/pods/c9410349-9a4f-42f6-81e4-dd4dd66abdcc/volumes" Mar 11 01:15:08 crc kubenswrapper[4744]: I0311 01:15:08.021238 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zvwn8\" (UniqueName: \"kubernetes.io/projected/9dcf4eeb-bf9a-4e68-b2f6-a8de4386d24b-kube-api-access-zvwn8\") pod \"root-account-create-update-rz8p7\" (UID: \"9dcf4eeb-bf9a-4e68-b2f6-a8de4386d24b\") " pod="openstack/root-account-create-update-rz8p7" Mar 11 01:15:08 crc kubenswrapper[4744]: I0311 01:15:08.022780 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9dcf4eeb-bf9a-4e68-b2f6-a8de4386d24b-operator-scripts\") pod \"root-account-create-update-rz8p7\" (UID: \"9dcf4eeb-bf9a-4e68-b2f6-a8de4386d24b\") " pod="openstack/root-account-create-update-rz8p7" Mar 11 01:15:08 crc kubenswrapper[4744]: I0311 01:15:08.124045 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zvwn8\" (UniqueName: \"kubernetes.io/projected/9dcf4eeb-bf9a-4e68-b2f6-a8de4386d24b-kube-api-access-zvwn8\") pod \"root-account-create-update-rz8p7\" (UID: \"9dcf4eeb-bf9a-4e68-b2f6-a8de4386d24b\") " 
pod="openstack/root-account-create-update-rz8p7" Mar 11 01:15:08 crc kubenswrapper[4744]: I0311 01:15:08.124156 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9dcf4eeb-bf9a-4e68-b2f6-a8de4386d24b-operator-scripts\") pod \"root-account-create-update-rz8p7\" (UID: \"9dcf4eeb-bf9a-4e68-b2f6-a8de4386d24b\") " pod="openstack/root-account-create-update-rz8p7" Mar 11 01:15:08 crc kubenswrapper[4744]: I0311 01:15:08.125154 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9dcf4eeb-bf9a-4e68-b2f6-a8de4386d24b-operator-scripts\") pod \"root-account-create-update-rz8p7\" (UID: \"9dcf4eeb-bf9a-4e68-b2f6-a8de4386d24b\") " pod="openstack/root-account-create-update-rz8p7" Mar 11 01:15:08 crc kubenswrapper[4744]: I0311 01:15:08.155177 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zvwn8\" (UniqueName: \"kubernetes.io/projected/9dcf4eeb-bf9a-4e68-b2f6-a8de4386d24b-kube-api-access-zvwn8\") pod \"root-account-create-update-rz8p7\" (UID: \"9dcf4eeb-bf9a-4e68-b2f6-a8de4386d24b\") " pod="openstack/root-account-create-update-rz8p7" Mar 11 01:15:08 crc kubenswrapper[4744]: I0311 01:15:08.238407 4744 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-rz8p7" Mar 11 01:15:08 crc kubenswrapper[4744]: I0311 01:15:08.287803 4744 generic.go:334] "Generic (PLEG): container finished" podID="e122b7f6-6664-4484-afc0-c5629ad3a7e3" containerID="d90c7a6c9305d356e7403d291e2f081e429e4966cb7e9f20d9b471cf46eddf16" exitCode=0 Mar 11 01:15:08 crc kubenswrapper[4744]: I0311 01:15:08.287848 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-6q4fk" event={"ID":"e122b7f6-6664-4484-afc0-c5629ad3a7e3","Type":"ContainerDied","Data":"d90c7a6c9305d356e7403d291e2f081e429e4966cb7e9f20d9b471cf46eddf16"} Mar 11 01:15:08 crc kubenswrapper[4744]: I0311 01:15:08.289624 4744 generic.go:334] "Generic (PLEG): container finished" podID="d138bcef-d88d-4af3-8f41-f4804e583670" containerID="d4e5360567689983645dc7f299ef53ab8dbb200c7bcb992f0ff7fcc3a8bffd18" exitCode=0 Mar 11 01:15:08 crc kubenswrapper[4744]: I0311 01:15:08.289765 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-kh4gc" event={"ID":"d138bcef-d88d-4af3-8f41-f4804e583670","Type":"ContainerDied","Data":"d4e5360567689983645dc7f299ef53ab8dbb200c7bcb992f0ff7fcc3a8bffd18"} Mar 11 01:15:08 crc kubenswrapper[4744]: I0311 01:15:08.291331 4744 generic.go:334] "Generic (PLEG): container finished" podID="da71c584-ebbd-42c0-96c9-716bbd47efce" containerID="eec6090a40721da764b37de1f739da92e8819f033338f4cf2e5731f03131daae" exitCode=0 Mar 11 01:15:08 crc kubenswrapper[4744]: I0311 01:15:08.291393 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-ab0f-account-create-update-fj8mx" event={"ID":"da71c584-ebbd-42c0-96c9-716bbd47efce","Type":"ContainerDied","Data":"eec6090a40721da764b37de1f739da92e8819f033338f4cf2e5731f03131daae"} Mar 11 01:15:08 crc kubenswrapper[4744]: I0311 01:15:08.292468 4744 generic.go:334] "Generic (PLEG): container finished" podID="3946cf42-7442-4fad-b561-2050f9d26d8f" 
containerID="06a3af48518031c156f51eb5607cfcb96d799cb1d1c4bc03f941155347f77f71" exitCode=0 Mar 11 01:15:08 crc kubenswrapper[4744]: I0311 01:15:08.292524 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-61e2-account-create-update-6wjlk" event={"ID":"3946cf42-7442-4fad-b561-2050f9d26d8f","Type":"ContainerDied","Data":"06a3af48518031c156f51eb5607cfcb96d799cb1d1c4bc03f941155347f77f71"} Mar 11 01:15:08 crc kubenswrapper[4744]: I0311 01:15:08.293982 4744 generic.go:334] "Generic (PLEG): container finished" podID="ec0bae40-f9bc-4bc7-81e3-684d3f8a6512" containerID="1aa521d4cbd225ff73bcece76cd081aee159ae63e3b4b1ec498f897e70556651" exitCode=0 Mar 11 01:15:08 crc kubenswrapper[4744]: I0311 01:15:08.294050 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-bd14-account-create-update-bhb66" event={"ID":"ec0bae40-f9bc-4bc7-81e3-684d3f8a6512","Type":"ContainerDied","Data":"1aa521d4cbd225ff73bcece76cd081aee159ae63e3b4b1ec498f897e70556651"} Mar 11 01:15:08 crc kubenswrapper[4744]: I0311 01:15:08.297599 4744 generic.go:334] "Generic (PLEG): container finished" podID="3d70370e-fce0-48d8-9856-06e04916e905" containerID="3b99519301672447b89814ce6fee64db32f0b7eb950348bbc647f1820b678ac0" exitCode=0 Mar 11 01:15:08 crc kubenswrapper[4744]: I0311 01:15:08.298238 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-4wphb" event={"ID":"3d70370e-fce0-48d8-9856-06e04916e905","Type":"ContainerDied","Data":"3b99519301672447b89814ce6fee64db32f0b7eb950348bbc647f1820b678ac0"} Mar 11 01:15:08 crc kubenswrapper[4744]: I0311 01:15:08.667807 4744 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29553195-rs6nk" Mar 11 01:15:08 crc kubenswrapper[4744]: I0311 01:15:08.834213 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/ab006bc8-78da-42a8-9322-f52588f20622-config-volume\") pod \"ab006bc8-78da-42a8-9322-f52588f20622\" (UID: \"ab006bc8-78da-42a8-9322-f52588f20622\") " Mar 11 01:15:08 crc kubenswrapper[4744]: I0311 01:15:08.834294 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wfbth\" (UniqueName: \"kubernetes.io/projected/ab006bc8-78da-42a8-9322-f52588f20622-kube-api-access-wfbth\") pod \"ab006bc8-78da-42a8-9322-f52588f20622\" (UID: \"ab006bc8-78da-42a8-9322-f52588f20622\") " Mar 11 01:15:08 crc kubenswrapper[4744]: I0311 01:15:08.834425 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/ab006bc8-78da-42a8-9322-f52588f20622-secret-volume\") pod \"ab006bc8-78da-42a8-9322-f52588f20622\" (UID: \"ab006bc8-78da-42a8-9322-f52588f20622\") " Mar 11 01:15:08 crc kubenswrapper[4744]: I0311 01:15:08.835328 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ab006bc8-78da-42a8-9322-f52588f20622-config-volume" (OuterVolumeSpecName: "config-volume") pod "ab006bc8-78da-42a8-9322-f52588f20622" (UID: "ab006bc8-78da-42a8-9322-f52588f20622"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 11 01:15:08 crc kubenswrapper[4744]: I0311 01:15:08.842131 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ab006bc8-78da-42a8-9322-f52588f20622-kube-api-access-wfbth" (OuterVolumeSpecName: "kube-api-access-wfbth") pod "ab006bc8-78da-42a8-9322-f52588f20622" (UID: "ab006bc8-78da-42a8-9322-f52588f20622"). 
InnerVolumeSpecName "kube-api-access-wfbth". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 01:15:08 crc kubenswrapper[4744]: I0311 01:15:08.849448 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ab006bc8-78da-42a8-9322-f52588f20622-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "ab006bc8-78da-42a8-9322-f52588f20622" (UID: "ab006bc8-78da-42a8-9322-f52588f20622"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 01:15:08 crc kubenswrapper[4744]: I0311 01:15:08.935935 4744 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/ab006bc8-78da-42a8-9322-f52588f20622-secret-volume\") on node \"crc\" DevicePath \"\"" Mar 11 01:15:08 crc kubenswrapper[4744]: I0311 01:15:08.935965 4744 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/ab006bc8-78da-42a8-9322-f52588f20622-config-volume\") on node \"crc\" DevicePath \"\"" Mar 11 01:15:08 crc kubenswrapper[4744]: I0311 01:15:08.935981 4744 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wfbth\" (UniqueName: \"kubernetes.io/projected/ab006bc8-78da-42a8-9322-f52588f20622-kube-api-access-wfbth\") on node \"crc\" DevicePath \"\"" Mar 11 01:15:09 crc kubenswrapper[4744]: I0311 01:15:09.329073 4744 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29553195-rs6nk" Mar 11 01:15:09 crc kubenswrapper[4744]: I0311 01:15:09.329115 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29553195-rs6nk" event={"ID":"ab006bc8-78da-42a8-9322-f52588f20622","Type":"ContainerDied","Data":"0c0bee3c2e7e3b492e764aeb4947baa01332bf4396c8f040929930c215cf3b97"} Mar 11 01:15:09 crc kubenswrapper[4744]: I0311 01:15:09.329193 4744 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0c0bee3c2e7e3b492e764aeb4947baa01332bf4396c8f040929930c215cf3b97" Mar 11 01:15:10 crc kubenswrapper[4744]: I0311 01:15:10.321827 4744 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-bd14-account-create-update-bhb66" Mar 11 01:15:10 crc kubenswrapper[4744]: I0311 01:15:10.342028 4744 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-61e2-account-create-update-6wjlk" Mar 11 01:15:10 crc kubenswrapper[4744]: I0311 01:15:10.349726 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-61e2-account-create-update-6wjlk" event={"ID":"3946cf42-7442-4fad-b561-2050f9d26d8f","Type":"ContainerDied","Data":"9095743c316d8cd397ae1bebb16e22092c0b5010b5e3a5e78f039d6acf146307"} Mar 11 01:15:10 crc kubenswrapper[4744]: I0311 01:15:10.349768 4744 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9095743c316d8cd397ae1bebb16e22092c0b5010b5e3a5e78f039d6acf146307" Mar 11 01:15:10 crc kubenswrapper[4744]: I0311 01:15:10.351742 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-bd14-account-create-update-bhb66" event={"ID":"ec0bae40-f9bc-4bc7-81e3-684d3f8a6512","Type":"ContainerDied","Data":"71bc0e3ddf9435d8cd0e3a439839dd457b30f5536cc470117a683a2983bf0e34"} Mar 11 01:15:10 crc kubenswrapper[4744]: I0311 01:15:10.351792 4744 
pod_container_deletor.go:80] "Container not found in pod's containers" containerID="71bc0e3ddf9435d8cd0e3a439839dd457b30f5536cc470117a683a2983bf0e34" Mar 11 01:15:10 crc kubenswrapper[4744]: I0311 01:15:10.351752 4744 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-61e2-account-create-update-6wjlk" Mar 11 01:15:10 crc kubenswrapper[4744]: I0311 01:15:10.351858 4744 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-bd14-account-create-update-bhb66" Mar 11 01:15:10 crc kubenswrapper[4744]: I0311 01:15:10.353822 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-4wphb" event={"ID":"3d70370e-fce0-48d8-9856-06e04916e905","Type":"ContainerDied","Data":"3934be1d90a0207905146705af92f15b5c5714ad31c516779ec394c076ce1021"} Mar 11 01:15:10 crc kubenswrapper[4744]: I0311 01:15:10.353847 4744 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3934be1d90a0207905146705af92f15b5c5714ad31c516779ec394c076ce1021" Mar 11 01:15:10 crc kubenswrapper[4744]: I0311 01:15:10.356182 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-6q4fk" event={"ID":"e122b7f6-6664-4484-afc0-c5629ad3a7e3","Type":"ContainerDied","Data":"ba1ca927cadae172ec238ab9545e61e8bb95a6fd668ed1689715b74b6c709b3d"} Mar 11 01:15:10 crc kubenswrapper[4744]: I0311 01:15:10.356219 4744 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ba1ca927cadae172ec238ab9545e61e8bb95a6fd668ed1689715b74b6c709b3d" Mar 11 01:15:10 crc kubenswrapper[4744]: I0311 01:15:10.362733 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-kh4gc" event={"ID":"d138bcef-d88d-4af3-8f41-f4804e583670","Type":"ContainerDied","Data":"66a52b823b3e65b205cc595463d950c61bc97337ac253b6b4d6ea9babd1a0b50"} Mar 11 01:15:10 crc kubenswrapper[4744]: I0311 01:15:10.362787 4744 
pod_container_deletor.go:80] "Container not found in pod's containers" containerID="66a52b823b3e65b205cc595463d950c61bc97337ac253b6b4d6ea9babd1a0b50" Mar 11 01:15:10 crc kubenswrapper[4744]: I0311 01:15:10.364882 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-ab0f-account-create-update-fj8mx" event={"ID":"da71c584-ebbd-42c0-96c9-716bbd47efce","Type":"ContainerDied","Data":"20eefcd470fd6e84965347ea5a26def1af611fe21be2777e69b3d62cd6c6e064"} Mar 11 01:15:10 crc kubenswrapper[4744]: I0311 01:15:10.364924 4744 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="20eefcd470fd6e84965347ea5a26def1af611fe21be2777e69b3d62cd6c6e064" Mar 11 01:15:10 crc kubenswrapper[4744]: I0311 01:15:10.424573 4744 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-ab0f-account-create-update-fj8mx" Mar 11 01:15:10 crc kubenswrapper[4744]: I0311 01:15:10.442255 4744 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-6q4fk" Mar 11 01:15:10 crc kubenswrapper[4744]: I0311 01:15:10.446228 4744 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-create-kh4gc" Mar 11 01:15:10 crc kubenswrapper[4744]: I0311 01:15:10.470114 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3946cf42-7442-4fad-b561-2050f9d26d8f-operator-scripts\") pod \"3946cf42-7442-4fad-b561-2050f9d26d8f\" (UID: \"3946cf42-7442-4fad-b561-2050f9d26d8f\") " Mar 11 01:15:10 crc kubenswrapper[4744]: I0311 01:15:10.470182 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4f94s\" (UniqueName: \"kubernetes.io/projected/3946cf42-7442-4fad-b561-2050f9d26d8f-kube-api-access-4f94s\") pod \"3946cf42-7442-4fad-b561-2050f9d26d8f\" (UID: \"3946cf42-7442-4fad-b561-2050f9d26d8f\") " Mar 11 01:15:10 crc kubenswrapper[4744]: I0311 01:15:10.470263 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ec0bae40-f9bc-4bc7-81e3-684d3f8a6512-operator-scripts\") pod \"ec0bae40-f9bc-4bc7-81e3-684d3f8a6512\" (UID: \"ec0bae40-f9bc-4bc7-81e3-684d3f8a6512\") " Mar 11 01:15:10 crc kubenswrapper[4744]: I0311 01:15:10.470314 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9wl7l\" (UniqueName: \"kubernetes.io/projected/ec0bae40-f9bc-4bc7-81e3-684d3f8a6512-kube-api-access-9wl7l\") pod \"ec0bae40-f9bc-4bc7-81e3-684d3f8a6512\" (UID: \"ec0bae40-f9bc-4bc7-81e3-684d3f8a6512\") " Mar 11 01:15:10 crc kubenswrapper[4744]: I0311 01:15:10.470905 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3946cf42-7442-4fad-b561-2050f9d26d8f-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "3946cf42-7442-4fad-b561-2050f9d26d8f" (UID: "3946cf42-7442-4fad-b561-2050f9d26d8f"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 11 01:15:10 crc kubenswrapper[4744]: I0311 01:15:10.471171 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ec0bae40-f9bc-4bc7-81e3-684d3f8a6512-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "ec0bae40-f9bc-4bc7-81e3-684d3f8a6512" (UID: "ec0bae40-f9bc-4bc7-81e3-684d3f8a6512"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 11 01:15:10 crc kubenswrapper[4744]: I0311 01:15:10.474597 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3946cf42-7442-4fad-b561-2050f9d26d8f-kube-api-access-4f94s" (OuterVolumeSpecName: "kube-api-access-4f94s") pod "3946cf42-7442-4fad-b561-2050f9d26d8f" (UID: "3946cf42-7442-4fad-b561-2050f9d26d8f"). InnerVolumeSpecName "kube-api-access-4f94s". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 01:15:10 crc kubenswrapper[4744]: I0311 01:15:10.476774 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ec0bae40-f9bc-4bc7-81e3-684d3f8a6512-kube-api-access-9wl7l" (OuterVolumeSpecName: "kube-api-access-9wl7l") pod "ec0bae40-f9bc-4bc7-81e3-684d3f8a6512" (UID: "ec0bae40-f9bc-4bc7-81e3-684d3f8a6512"). InnerVolumeSpecName "kube-api-access-9wl7l". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 01:15:10 crc kubenswrapper[4744]: I0311 01:15:10.484176 4744 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-create-4wphb" Mar 11 01:15:10 crc kubenswrapper[4744]: I0311 01:15:10.571986 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bjxlv\" (UniqueName: \"kubernetes.io/projected/e122b7f6-6664-4484-afc0-c5629ad3a7e3-kube-api-access-bjxlv\") pod \"e122b7f6-6664-4484-afc0-c5629ad3a7e3\" (UID: \"e122b7f6-6664-4484-afc0-c5629ad3a7e3\") " Mar 11 01:15:10 crc kubenswrapper[4744]: I0311 01:15:10.572029 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-h4hsz\" (UniqueName: \"kubernetes.io/projected/da71c584-ebbd-42c0-96c9-716bbd47efce-kube-api-access-h4hsz\") pod \"da71c584-ebbd-42c0-96c9-716bbd47efce\" (UID: \"da71c584-ebbd-42c0-96c9-716bbd47efce\") " Mar 11 01:15:10 crc kubenswrapper[4744]: I0311 01:15:10.572089 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fq8cg\" (UniqueName: \"kubernetes.io/projected/3d70370e-fce0-48d8-9856-06e04916e905-kube-api-access-fq8cg\") pod \"3d70370e-fce0-48d8-9856-06e04916e905\" (UID: \"3d70370e-fce0-48d8-9856-06e04916e905\") " Mar 11 01:15:10 crc kubenswrapper[4744]: I0311 01:15:10.572111 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3d70370e-fce0-48d8-9856-06e04916e905-operator-scripts\") pod \"3d70370e-fce0-48d8-9856-06e04916e905\" (UID: \"3d70370e-fce0-48d8-9856-06e04916e905\") " Mar 11 01:15:10 crc kubenswrapper[4744]: I0311 01:15:10.572141 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qw492\" (UniqueName: \"kubernetes.io/projected/d138bcef-d88d-4af3-8f41-f4804e583670-kube-api-access-qw492\") pod \"d138bcef-d88d-4af3-8f41-f4804e583670\" (UID: \"d138bcef-d88d-4af3-8f41-f4804e583670\") " Mar 11 01:15:10 crc kubenswrapper[4744]: I0311 01:15:10.572178 4744 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e122b7f6-6664-4484-afc0-c5629ad3a7e3-operator-scripts\") pod \"e122b7f6-6664-4484-afc0-c5629ad3a7e3\" (UID: \"e122b7f6-6664-4484-afc0-c5629ad3a7e3\") " Mar 11 01:15:10 crc kubenswrapper[4744]: I0311 01:15:10.572216 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/da71c584-ebbd-42c0-96c9-716bbd47efce-operator-scripts\") pod \"da71c584-ebbd-42c0-96c9-716bbd47efce\" (UID: \"da71c584-ebbd-42c0-96c9-716bbd47efce\") " Mar 11 01:15:10 crc kubenswrapper[4744]: I0311 01:15:10.572296 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d138bcef-d88d-4af3-8f41-f4804e583670-operator-scripts\") pod \"d138bcef-d88d-4af3-8f41-f4804e583670\" (UID: \"d138bcef-d88d-4af3-8f41-f4804e583670\") " Mar 11 01:15:10 crc kubenswrapper[4744]: I0311 01:15:10.572635 4744 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ec0bae40-f9bc-4bc7-81e3-684d3f8a6512-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 11 01:15:10 crc kubenswrapper[4744]: I0311 01:15:10.572654 4744 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9wl7l\" (UniqueName: \"kubernetes.io/projected/ec0bae40-f9bc-4bc7-81e3-684d3f8a6512-kube-api-access-9wl7l\") on node \"crc\" DevicePath \"\"" Mar 11 01:15:10 crc kubenswrapper[4744]: I0311 01:15:10.572665 4744 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3946cf42-7442-4fad-b561-2050f9d26d8f-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 11 01:15:10 crc kubenswrapper[4744]: I0311 01:15:10.572675 4744 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4f94s\" (UniqueName: 
\"kubernetes.io/projected/3946cf42-7442-4fad-b561-2050f9d26d8f-kube-api-access-4f94s\") on node \"crc\" DevicePath \"\"" Mar 11 01:15:10 crc kubenswrapper[4744]: I0311 01:15:10.573096 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/da71c584-ebbd-42c0-96c9-716bbd47efce-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "da71c584-ebbd-42c0-96c9-716bbd47efce" (UID: "da71c584-ebbd-42c0-96c9-716bbd47efce"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 11 01:15:10 crc kubenswrapper[4744]: I0311 01:15:10.573093 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e122b7f6-6664-4484-afc0-c5629ad3a7e3-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "e122b7f6-6664-4484-afc0-c5629ad3a7e3" (UID: "e122b7f6-6664-4484-afc0-c5629ad3a7e3"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 11 01:15:10 crc kubenswrapper[4744]: I0311 01:15:10.573229 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3d70370e-fce0-48d8-9856-06e04916e905-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "3d70370e-fce0-48d8-9856-06e04916e905" (UID: "3d70370e-fce0-48d8-9856-06e04916e905"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 11 01:15:10 crc kubenswrapper[4744]: I0311 01:15:10.573424 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d138bcef-d88d-4af3-8f41-f4804e583670-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "d138bcef-d88d-4af3-8f41-f4804e583670" (UID: "d138bcef-d88d-4af3-8f41-f4804e583670"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 11 01:15:10 crc kubenswrapper[4744]: I0311 01:15:10.575762 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3d70370e-fce0-48d8-9856-06e04916e905-kube-api-access-fq8cg" (OuterVolumeSpecName: "kube-api-access-fq8cg") pod "3d70370e-fce0-48d8-9856-06e04916e905" (UID: "3d70370e-fce0-48d8-9856-06e04916e905"). InnerVolumeSpecName "kube-api-access-fq8cg". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 01:15:10 crc kubenswrapper[4744]: I0311 01:15:10.577635 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d138bcef-d88d-4af3-8f41-f4804e583670-kube-api-access-qw492" (OuterVolumeSpecName: "kube-api-access-qw492") pod "d138bcef-d88d-4af3-8f41-f4804e583670" (UID: "d138bcef-d88d-4af3-8f41-f4804e583670"). InnerVolumeSpecName "kube-api-access-qw492". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 01:15:10 crc kubenswrapper[4744]: I0311 01:15:10.578135 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/da71c584-ebbd-42c0-96c9-716bbd47efce-kube-api-access-h4hsz" (OuterVolumeSpecName: "kube-api-access-h4hsz") pod "da71c584-ebbd-42c0-96c9-716bbd47efce" (UID: "da71c584-ebbd-42c0-96c9-716bbd47efce"). InnerVolumeSpecName "kube-api-access-h4hsz". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 01:15:10 crc kubenswrapper[4744]: I0311 01:15:10.578166 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e122b7f6-6664-4484-afc0-c5629ad3a7e3-kube-api-access-bjxlv" (OuterVolumeSpecName: "kube-api-access-bjxlv") pod "e122b7f6-6664-4484-afc0-c5629ad3a7e3" (UID: "e122b7f6-6664-4484-afc0-c5629ad3a7e3"). InnerVolumeSpecName "kube-api-access-bjxlv". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 01:15:10 crc kubenswrapper[4744]: I0311 01:15:10.667954 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-rz8p7"] Mar 11 01:15:10 crc kubenswrapper[4744]: I0311 01:15:10.675732 4744 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d138bcef-d88d-4af3-8f41-f4804e583670-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 11 01:15:10 crc kubenswrapper[4744]: I0311 01:15:10.675780 4744 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bjxlv\" (UniqueName: \"kubernetes.io/projected/e122b7f6-6664-4484-afc0-c5629ad3a7e3-kube-api-access-bjxlv\") on node \"crc\" DevicePath \"\"" Mar 11 01:15:10 crc kubenswrapper[4744]: I0311 01:15:10.675799 4744 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-h4hsz\" (UniqueName: \"kubernetes.io/projected/da71c584-ebbd-42c0-96c9-716bbd47efce-kube-api-access-h4hsz\") on node \"crc\" DevicePath \"\"" Mar 11 01:15:10 crc kubenswrapper[4744]: I0311 01:15:10.675815 4744 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fq8cg\" (UniqueName: \"kubernetes.io/projected/3d70370e-fce0-48d8-9856-06e04916e905-kube-api-access-fq8cg\") on node \"crc\" DevicePath \"\"" Mar 11 01:15:10 crc kubenswrapper[4744]: I0311 01:15:10.675830 4744 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3d70370e-fce0-48d8-9856-06e04916e905-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 11 01:15:10 crc kubenswrapper[4744]: I0311 01:15:10.675847 4744 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qw492\" (UniqueName: \"kubernetes.io/projected/d138bcef-d88d-4af3-8f41-f4804e583670-kube-api-access-qw492\") on node \"crc\" DevicePath \"\"" Mar 11 01:15:10 crc kubenswrapper[4744]: I0311 01:15:10.675862 4744 reconciler_common.go:293] "Volume 
detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e122b7f6-6664-4484-afc0-c5629ad3a7e3-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 11 01:15:10 crc kubenswrapper[4744]: I0311 01:15:10.675876 4744 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/da71c584-ebbd-42c0-96c9-716bbd47efce-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 11 01:15:11 crc kubenswrapper[4744]: I0311 01:15:11.390020 4744 generic.go:334] "Generic (PLEG): container finished" podID="9dcf4eeb-bf9a-4e68-b2f6-a8de4386d24b" containerID="4fcc7b63cdb756d5d1faef2219f8ea9445e24b55a6b5b6e5dfb86462daed1a62" exitCode=0 Mar 11 01:15:11 crc kubenswrapper[4744]: I0311 01:15:11.390148 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-rz8p7" event={"ID":"9dcf4eeb-bf9a-4e68-b2f6-a8de4386d24b","Type":"ContainerDied","Data":"4fcc7b63cdb756d5d1faef2219f8ea9445e24b55a6b5b6e5dfb86462daed1a62"} Mar 11 01:15:11 crc kubenswrapper[4744]: I0311 01:15:11.390490 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-rz8p7" event={"ID":"9dcf4eeb-bf9a-4e68-b2f6-a8de4386d24b","Type":"ContainerStarted","Data":"148b9510988da3284808d15ad7f801ff77a367ba287d4b75be9d980a0f593995"} Mar 11 01:15:11 crc kubenswrapper[4744]: I0311 01:15:11.394827 4744 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-4wphb" Mar 11 01:15:11 crc kubenswrapper[4744]: I0311 01:15:11.394851 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-vgn45" event={"ID":"20d64d92-7649-4a5e-af99-a92a08b47ecf","Type":"ContainerStarted","Data":"0db3f9268dd586840a750ce3aab8eee17063d00b89aeb58328fea2ea809f3bd6"} Mar 11 01:15:11 crc kubenswrapper[4744]: I0311 01:15:11.394913 4744 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-ab0f-account-create-update-fj8mx" Mar 11 01:15:11 crc kubenswrapper[4744]: I0311 01:15:11.394956 4744 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-6q4fk" Mar 11 01:15:11 crc kubenswrapper[4744]: I0311 01:15:11.394971 4744 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-kh4gc" Mar 11 01:15:11 crc kubenswrapper[4744]: I0311 01:15:11.471214 4744 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-ring-rebalance-vgn45" podStartSLOduration=2.44484686 podStartE2EDuration="7.47119197s" podCreationTimestamp="2026-03-11 01:15:04 +0000 UTC" firstStartedPulling="2026-03-11 01:15:05.220286144 +0000 UTC m=+1262.024503769" lastFinishedPulling="2026-03-11 01:15:10.246631274 +0000 UTC m=+1267.050848879" observedRunningTime="2026-03-11 01:15:11.455979379 +0000 UTC m=+1268.260196984" watchObservedRunningTime="2026-03-11 01:15:11.47119197 +0000 UTC m=+1268.275409575" Mar 11 01:15:11 crc kubenswrapper[4744]: E0311 01:15:11.583773 4744 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3d70370e_fce0_48d8_9856_06e04916e905.slice/crio-3934be1d90a0207905146705af92f15b5c5714ad31c516779ec394c076ce1021\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podda71c584_ebbd_42c0_96c9_716bbd47efce.slice/crio-20eefcd470fd6e84965347ea5a26def1af611fe21be2777e69b3d62cd6c6e064\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd138bcef_d88d_4af3_8f41_f4804e583670.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode122b7f6_6664_4484_afc0_c5629ad3a7e3.slice\": 
RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3d70370e_fce0_48d8_9856_06e04916e905.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode122b7f6_6664_4484_afc0_c5629ad3a7e3.slice/crio-ba1ca927cadae172ec238ab9545e61e8bb95a6fd668ed1689715b74b6c709b3d\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podda71c584_ebbd_42c0_96c9_716bbd47efce.slice\": RecentStats: unable to find data in memory cache]" Mar 11 01:15:11 crc kubenswrapper[4744]: I0311 01:15:11.998928 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/524fac10-b874-465e-b4aa-221b6c689959-etc-swift\") pod \"swift-storage-0\" (UID: \"524fac10-b874-465e-b4aa-221b6c689959\") " pod="openstack/swift-storage-0" Mar 11 01:15:11 crc kubenswrapper[4744]: E0311 01:15:11.999158 4744 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Mar 11 01:15:11 crc kubenswrapper[4744]: E0311 01:15:11.999185 4744 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Mar 11 01:15:11 crc kubenswrapper[4744]: E0311 01:15:11.999257 4744 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/524fac10-b874-465e-b4aa-221b6c689959-etc-swift podName:524fac10-b874-465e-b4aa-221b6c689959 nodeName:}" failed. No retries permitted until 2026-03-11 01:15:19.999235049 +0000 UTC m=+1276.803452694 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/524fac10-b874-465e-b4aa-221b6c689959-etc-swift") pod "swift-storage-0" (UID: "524fac10-b874-465e-b4aa-221b6c689959") : configmap "swift-ring-files" not found Mar 11 01:15:12 crc kubenswrapper[4744]: I0311 01:15:12.404821 4744 generic.go:334] "Generic (PLEG): container finished" podID="714c91e5-04c5-4f95-97e3-a3c08664944d" containerID="e724fad610e3cb354b224dbc23638db68990df9c737ed272890fd1779688fc45" exitCode=0 Mar 11 01:15:12 crc kubenswrapper[4744]: I0311 01:15:12.404894 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"714c91e5-04c5-4f95-97e3-a3c08664944d","Type":"ContainerDied","Data":"e724fad610e3cb354b224dbc23638db68990df9c737ed272890fd1779688fc45"} Mar 11 01:15:12 crc kubenswrapper[4744]: I0311 01:15:12.407144 4744 generic.go:334] "Generic (PLEG): container finished" podID="fb920bf7-eae5-4f7a-9af7-bde85bfb4ee9" containerID="dd9f74256d9d36d7d93b0c687c50126a3012a98632ea369b61ca6eb2ada71f31" exitCode=0 Mar 11 01:15:12 crc kubenswrapper[4744]: I0311 01:15:12.407197 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"fb920bf7-eae5-4f7a-9af7-bde85bfb4ee9","Type":"ContainerDied","Data":"dd9f74256d9d36d7d93b0c687c50126a3012a98632ea369b61ca6eb2ada71f31"} Mar 11 01:15:12 crc kubenswrapper[4744]: I0311 01:15:12.765780 4744 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-rz8p7" Mar 11 01:15:12 crc kubenswrapper[4744]: I0311 01:15:12.912448 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9dcf4eeb-bf9a-4e68-b2f6-a8de4386d24b-operator-scripts\") pod \"9dcf4eeb-bf9a-4e68-b2f6-a8de4386d24b\" (UID: \"9dcf4eeb-bf9a-4e68-b2f6-a8de4386d24b\") " Mar 11 01:15:12 crc kubenswrapper[4744]: I0311 01:15:12.912681 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zvwn8\" (UniqueName: \"kubernetes.io/projected/9dcf4eeb-bf9a-4e68-b2f6-a8de4386d24b-kube-api-access-zvwn8\") pod \"9dcf4eeb-bf9a-4e68-b2f6-a8de4386d24b\" (UID: \"9dcf4eeb-bf9a-4e68-b2f6-a8de4386d24b\") " Mar 11 01:15:12 crc kubenswrapper[4744]: I0311 01:15:12.913166 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9dcf4eeb-bf9a-4e68-b2f6-a8de4386d24b-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "9dcf4eeb-bf9a-4e68-b2f6-a8de4386d24b" (UID: "9dcf4eeb-bf9a-4e68-b2f6-a8de4386d24b"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 11 01:15:12 crc kubenswrapper[4744]: I0311 01:15:12.920988 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9dcf4eeb-bf9a-4e68-b2f6-a8de4386d24b-kube-api-access-zvwn8" (OuterVolumeSpecName: "kube-api-access-zvwn8") pod "9dcf4eeb-bf9a-4e68-b2f6-a8de4386d24b" (UID: "9dcf4eeb-bf9a-4e68-b2f6-a8de4386d24b"). InnerVolumeSpecName "kube-api-access-zvwn8". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 01:15:13 crc kubenswrapper[4744]: I0311 01:15:13.014936 4744 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9dcf4eeb-bf9a-4e68-b2f6-a8de4386d24b-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 11 01:15:13 crc kubenswrapper[4744]: I0311 01:15:13.014966 4744 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zvwn8\" (UniqueName: \"kubernetes.io/projected/9dcf4eeb-bf9a-4e68-b2f6-a8de4386d24b-kube-api-access-zvwn8\") on node \"crc\" DevicePath \"\"" Mar 11 01:15:13 crc kubenswrapper[4744]: I0311 01:15:13.353320 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-58df884995-rf6pm" Mar 11 01:15:13 crc kubenswrapper[4744]: I0311 01:15:13.417819 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"714c91e5-04c5-4f95-97e3-a3c08664944d","Type":"ContainerStarted","Data":"2d7e9342156b6a7e0b5782247ec6e299cdb60a6da7997fe5146c00f779c615e6"} Mar 11 01:15:13 crc kubenswrapper[4744]: I0311 01:15:13.418022 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-cell1-server-0" Mar 11 01:15:13 crc kubenswrapper[4744]: I0311 01:15:13.420504 4744 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-659ddb758c-fzrlb"] Mar 11 01:15:13 crc kubenswrapper[4744]: I0311 01:15:13.420725 4744 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-659ddb758c-fzrlb" podUID="102a3691-340d-4bed-b87f-7bebbdb1f819" containerName="dnsmasq-dns" containerID="cri-o://4a09c8e658dc3c74f1c97e89a6efb5ad2c66f6f6b9ad50d395537e599d8042db" gracePeriod=10 Mar 11 01:15:13 crc kubenswrapper[4744]: I0311 01:15:13.421017 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" 
event={"ID":"fb920bf7-eae5-4f7a-9af7-bde85bfb4ee9","Type":"ContainerStarted","Data":"f517c72839363553e8b786dec7b9824c28bc5f5e37822956cd45b454cd30e224"} Mar 11 01:15:13 crc kubenswrapper[4744]: I0311 01:15:13.421169 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-server-0" Mar 11 01:15:13 crc kubenswrapper[4744]: I0311 01:15:13.422657 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-rz8p7" event={"ID":"9dcf4eeb-bf9a-4e68-b2f6-a8de4386d24b","Type":"ContainerDied","Data":"148b9510988da3284808d15ad7f801ff77a367ba287d4b75be9d980a0f593995"} Mar 11 01:15:13 crc kubenswrapper[4744]: I0311 01:15:13.422694 4744 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="148b9510988da3284808d15ad7f801ff77a367ba287d4b75be9d980a0f593995" Mar 11 01:15:13 crc kubenswrapper[4744]: I0311 01:15:13.422740 4744 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-rz8p7" Mar 11 01:15:13 crc kubenswrapper[4744]: I0311 01:15:13.466958 4744 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-cell1-server-0" podStartSLOduration=36.62977018 podStartE2EDuration="57.466941762s" podCreationTimestamp="2026-03-11 01:14:16 +0000 UTC" firstStartedPulling="2026-03-11 01:14:18.372205311 +0000 UTC m=+1215.176422916" lastFinishedPulling="2026-03-11 01:14:39.209376893 +0000 UTC m=+1236.013594498" observedRunningTime="2026-03-11 01:15:13.462431172 +0000 UTC m=+1270.266648767" watchObservedRunningTime="2026-03-11 01:15:13.466941762 +0000 UTC m=+1270.271159367" Mar 11 01:15:13 crc kubenswrapper[4744]: I0311 01:15:13.513737 4744 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-server-0" podStartSLOduration=36.837455827 podStartE2EDuration="57.513721733s" podCreationTimestamp="2026-03-11 01:14:16 +0000 UTC" firstStartedPulling="2026-03-11 
01:14:18.526046019 +0000 UTC m=+1215.330263624" lastFinishedPulling="2026-03-11 01:14:39.202311925 +0000 UTC m=+1236.006529530" observedRunningTime="2026-03-11 01:15:13.510765691 +0000 UTC m=+1270.314983296" watchObservedRunningTime="2026-03-11 01:15:13.513721733 +0000 UTC m=+1270.317939338" Mar 11 01:15:13 crc kubenswrapper[4744]: I0311 01:15:13.879148 4744 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-659ddb758c-fzrlb" Mar 11 01:15:13 crc kubenswrapper[4744]: I0311 01:15:13.927847 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dmw94\" (UniqueName: \"kubernetes.io/projected/102a3691-340d-4bed-b87f-7bebbdb1f819-kube-api-access-dmw94\") pod \"102a3691-340d-4bed-b87f-7bebbdb1f819\" (UID: \"102a3691-340d-4bed-b87f-7bebbdb1f819\") " Mar 11 01:15:13 crc kubenswrapper[4744]: I0311 01:15:13.928332 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/102a3691-340d-4bed-b87f-7bebbdb1f819-dns-svc\") pod \"102a3691-340d-4bed-b87f-7bebbdb1f819\" (UID: \"102a3691-340d-4bed-b87f-7bebbdb1f819\") " Mar 11 01:15:13 crc kubenswrapper[4744]: I0311 01:15:13.928417 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/102a3691-340d-4bed-b87f-7bebbdb1f819-ovsdbserver-sb\") pod \"102a3691-340d-4bed-b87f-7bebbdb1f819\" (UID: \"102a3691-340d-4bed-b87f-7bebbdb1f819\") " Mar 11 01:15:13 crc kubenswrapper[4744]: I0311 01:15:13.928859 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/102a3691-340d-4bed-b87f-7bebbdb1f819-ovsdbserver-nb\") pod \"102a3691-340d-4bed-b87f-7bebbdb1f819\" (UID: \"102a3691-340d-4bed-b87f-7bebbdb1f819\") " Mar 11 01:15:13 crc kubenswrapper[4744]: I0311 01:15:13.928905 4744 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/102a3691-340d-4bed-b87f-7bebbdb1f819-config\") pod \"102a3691-340d-4bed-b87f-7bebbdb1f819\" (UID: \"102a3691-340d-4bed-b87f-7bebbdb1f819\") " Mar 11 01:15:13 crc kubenswrapper[4744]: I0311 01:15:13.934144 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/102a3691-340d-4bed-b87f-7bebbdb1f819-kube-api-access-dmw94" (OuterVolumeSpecName: "kube-api-access-dmw94") pod "102a3691-340d-4bed-b87f-7bebbdb1f819" (UID: "102a3691-340d-4bed-b87f-7bebbdb1f819"). InnerVolumeSpecName "kube-api-access-dmw94". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 01:15:13 crc kubenswrapper[4744]: I0311 01:15:13.973125 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/102a3691-340d-4bed-b87f-7bebbdb1f819-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "102a3691-340d-4bed-b87f-7bebbdb1f819" (UID: "102a3691-340d-4bed-b87f-7bebbdb1f819"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 11 01:15:13 crc kubenswrapper[4744]: I0311 01:15:13.981752 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/102a3691-340d-4bed-b87f-7bebbdb1f819-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "102a3691-340d-4bed-b87f-7bebbdb1f819" (UID: "102a3691-340d-4bed-b87f-7bebbdb1f819"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 11 01:15:13 crc kubenswrapper[4744]: I0311 01:15:13.986184 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/102a3691-340d-4bed-b87f-7bebbdb1f819-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "102a3691-340d-4bed-b87f-7bebbdb1f819" (UID: "102a3691-340d-4bed-b87f-7bebbdb1f819"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 11 01:15:13 crc kubenswrapper[4744]: I0311 01:15:13.991119 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/102a3691-340d-4bed-b87f-7bebbdb1f819-config" (OuterVolumeSpecName: "config") pod "102a3691-340d-4bed-b87f-7bebbdb1f819" (UID: "102a3691-340d-4bed-b87f-7bebbdb1f819"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 11 01:15:14 crc kubenswrapper[4744]: I0311 01:15:14.030429 4744 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/102a3691-340d-4bed-b87f-7bebbdb1f819-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 11 01:15:14 crc kubenswrapper[4744]: I0311 01:15:14.031902 4744 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/102a3691-340d-4bed-b87f-7bebbdb1f819-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 11 01:15:14 crc kubenswrapper[4744]: I0311 01:15:14.031927 4744 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/102a3691-340d-4bed-b87f-7bebbdb1f819-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 11 01:15:14 crc kubenswrapper[4744]: I0311 01:15:14.031941 4744 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/102a3691-340d-4bed-b87f-7bebbdb1f819-config\") on node \"crc\" DevicePath \"\"" Mar 11 01:15:14 crc kubenswrapper[4744]: I0311 01:15:14.031955 4744 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dmw94\" (UniqueName: \"kubernetes.io/projected/102a3691-340d-4bed-b87f-7bebbdb1f819-kube-api-access-dmw94\") on node \"crc\" DevicePath \"\"" Mar 11 01:15:14 crc kubenswrapper[4744]: I0311 01:15:14.436233 4744 generic.go:334] "Generic (PLEG): container finished" podID="102a3691-340d-4bed-b87f-7bebbdb1f819" 
containerID="4a09c8e658dc3c74f1c97e89a6efb5ad2c66f6f6b9ad50d395537e599d8042db" exitCode=0 Mar 11 01:15:14 crc kubenswrapper[4744]: I0311 01:15:14.436334 4744 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-659ddb758c-fzrlb" Mar 11 01:15:14 crc kubenswrapper[4744]: I0311 01:15:14.436386 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-659ddb758c-fzrlb" event={"ID":"102a3691-340d-4bed-b87f-7bebbdb1f819","Type":"ContainerDied","Data":"4a09c8e658dc3c74f1c97e89a6efb5ad2c66f6f6b9ad50d395537e599d8042db"} Mar 11 01:15:14 crc kubenswrapper[4744]: I0311 01:15:14.436429 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-659ddb758c-fzrlb" event={"ID":"102a3691-340d-4bed-b87f-7bebbdb1f819","Type":"ContainerDied","Data":"827ab222e296817a2f655d1a990685bbc0af31ce8098f996b8aeca7087e9f29a"} Mar 11 01:15:14 crc kubenswrapper[4744]: I0311 01:15:14.436454 4744 scope.go:117] "RemoveContainer" containerID="4a09c8e658dc3c74f1c97e89a6efb5ad2c66f6f6b9ad50d395537e599d8042db" Mar 11 01:15:14 crc kubenswrapper[4744]: I0311 01:15:14.465912 4744 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/root-account-create-update-rz8p7"] Mar 11 01:15:14 crc kubenswrapper[4744]: I0311 01:15:14.473031 4744 scope.go:117] "RemoveContainer" containerID="805c5ee669d3a91fb094d5e3d7be68d7ea7159bdedf46d5db894bc3725e4ac2c" Mar 11 01:15:14 crc kubenswrapper[4744]: I0311 01:15:14.476162 4744 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/root-account-create-update-rz8p7"] Mar 11 01:15:14 crc kubenswrapper[4744]: I0311 01:15:14.487991 4744 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-659ddb758c-fzrlb"] Mar 11 01:15:14 crc kubenswrapper[4744]: I0311 01:15:14.500763 4744 scope.go:117] "RemoveContainer" containerID="4a09c8e658dc3c74f1c97e89a6efb5ad2c66f6f6b9ad50d395537e599d8042db" Mar 11 01:15:14 crc kubenswrapper[4744]: E0311 
01:15:14.518900 4744 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4a09c8e658dc3c74f1c97e89a6efb5ad2c66f6f6b9ad50d395537e599d8042db\": container with ID starting with 4a09c8e658dc3c74f1c97e89a6efb5ad2c66f6f6b9ad50d395537e599d8042db not found: ID does not exist" containerID="4a09c8e658dc3c74f1c97e89a6efb5ad2c66f6f6b9ad50d395537e599d8042db" Mar 11 01:15:14 crc kubenswrapper[4744]: I0311 01:15:14.518945 4744 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4a09c8e658dc3c74f1c97e89a6efb5ad2c66f6f6b9ad50d395537e599d8042db"} err="failed to get container status \"4a09c8e658dc3c74f1c97e89a6efb5ad2c66f6f6b9ad50d395537e599d8042db\": rpc error: code = NotFound desc = could not find container \"4a09c8e658dc3c74f1c97e89a6efb5ad2c66f6f6b9ad50d395537e599d8042db\": container with ID starting with 4a09c8e658dc3c74f1c97e89a6efb5ad2c66f6f6b9ad50d395537e599d8042db not found: ID does not exist" Mar 11 01:15:14 crc kubenswrapper[4744]: I0311 01:15:14.518974 4744 scope.go:117] "RemoveContainer" containerID="805c5ee669d3a91fb094d5e3d7be68d7ea7159bdedf46d5db894bc3725e4ac2c" Mar 11 01:15:14 crc kubenswrapper[4744]: I0311 01:15:14.519129 4744 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-659ddb758c-fzrlb"] Mar 11 01:15:14 crc kubenswrapper[4744]: E0311 01:15:14.519170 4744 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"805c5ee669d3a91fb094d5e3d7be68d7ea7159bdedf46d5db894bc3725e4ac2c\": container with ID starting with 805c5ee669d3a91fb094d5e3d7be68d7ea7159bdedf46d5db894bc3725e4ac2c not found: ID does not exist" containerID="805c5ee669d3a91fb094d5e3d7be68d7ea7159bdedf46d5db894bc3725e4ac2c" Mar 11 01:15:14 crc kubenswrapper[4744]: I0311 01:15:14.519187 4744 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"805c5ee669d3a91fb094d5e3d7be68d7ea7159bdedf46d5db894bc3725e4ac2c"} err="failed to get container status \"805c5ee669d3a91fb094d5e3d7be68d7ea7159bdedf46d5db894bc3725e4ac2c\": rpc error: code = NotFound desc = could not find container \"805c5ee669d3a91fb094d5e3d7be68d7ea7159bdedf46d5db894bc3725e4ac2c\": container with ID starting with 805c5ee669d3a91fb094d5e3d7be68d7ea7159bdedf46d5db894bc3725e4ac2c not found: ID does not exist" Mar 11 01:15:15 crc kubenswrapper[4744]: I0311 01:15:15.687074 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-northd-0" Mar 11 01:15:15 crc kubenswrapper[4744]: I0311 01:15:15.985720 4744 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="102a3691-340d-4bed-b87f-7bebbdb1f819" path="/var/lib/kubelet/pods/102a3691-340d-4bed-b87f-7bebbdb1f819/volumes" Mar 11 01:15:15 crc kubenswrapper[4744]: I0311 01:15:15.986862 4744 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9dcf4eeb-bf9a-4e68-b2f6-a8de4386d24b" path="/var/lib/kubelet/pods/9dcf4eeb-bf9a-4e68-b2f6-a8de4386d24b/volumes" Mar 11 01:15:16 crc kubenswrapper[4744]: I0311 01:15:16.310695 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-db-sync-hhwh7"] Mar 11 01:15:16 crc kubenswrapper[4744]: E0311 01:15:16.310997 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e122b7f6-6664-4484-afc0-c5629ad3a7e3" containerName="mariadb-database-create" Mar 11 01:15:16 crc kubenswrapper[4744]: I0311 01:15:16.311017 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="e122b7f6-6664-4484-afc0-c5629ad3a7e3" containerName="mariadb-database-create" Mar 11 01:15:16 crc kubenswrapper[4744]: E0311 01:15:16.311033 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="102a3691-340d-4bed-b87f-7bebbdb1f819" containerName="init" Mar 11 01:15:16 crc kubenswrapper[4744]: I0311 01:15:16.311040 4744 state_mem.go:107] "Deleted CPUSet 
assignment" podUID="102a3691-340d-4bed-b87f-7bebbdb1f819" containerName="init" Mar 11 01:15:16 crc kubenswrapper[4744]: E0311 01:15:16.311049 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ec0bae40-f9bc-4bc7-81e3-684d3f8a6512" containerName="mariadb-account-create-update" Mar 11 01:15:16 crc kubenswrapper[4744]: I0311 01:15:16.311058 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="ec0bae40-f9bc-4bc7-81e3-684d3f8a6512" containerName="mariadb-account-create-update" Mar 11 01:15:16 crc kubenswrapper[4744]: E0311 01:15:16.311070 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="da71c584-ebbd-42c0-96c9-716bbd47efce" containerName="mariadb-account-create-update" Mar 11 01:15:16 crc kubenswrapper[4744]: I0311 01:15:16.311077 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="da71c584-ebbd-42c0-96c9-716bbd47efce" containerName="mariadb-account-create-update" Mar 11 01:15:16 crc kubenswrapper[4744]: E0311 01:15:16.311092 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d138bcef-d88d-4af3-8f41-f4804e583670" containerName="mariadb-database-create" Mar 11 01:15:16 crc kubenswrapper[4744]: I0311 01:15:16.311098 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="d138bcef-d88d-4af3-8f41-f4804e583670" containerName="mariadb-database-create" Mar 11 01:15:16 crc kubenswrapper[4744]: E0311 01:15:16.311108 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9dcf4eeb-bf9a-4e68-b2f6-a8de4386d24b" containerName="mariadb-account-create-update" Mar 11 01:15:16 crc kubenswrapper[4744]: I0311 01:15:16.311117 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="9dcf4eeb-bf9a-4e68-b2f6-a8de4386d24b" containerName="mariadb-account-create-update" Mar 11 01:15:16 crc kubenswrapper[4744]: E0311 01:15:16.311133 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="102a3691-340d-4bed-b87f-7bebbdb1f819" containerName="dnsmasq-dns" Mar 11 01:15:16 crc 
kubenswrapper[4744]: I0311 01:15:16.311139 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="102a3691-340d-4bed-b87f-7bebbdb1f819" containerName="dnsmasq-dns" Mar 11 01:15:16 crc kubenswrapper[4744]: E0311 01:15:16.311146 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3d70370e-fce0-48d8-9856-06e04916e905" containerName="mariadb-database-create" Mar 11 01:15:16 crc kubenswrapper[4744]: I0311 01:15:16.311153 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="3d70370e-fce0-48d8-9856-06e04916e905" containerName="mariadb-database-create" Mar 11 01:15:16 crc kubenswrapper[4744]: E0311 01:15:16.311162 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ab006bc8-78da-42a8-9322-f52588f20622" containerName="collect-profiles" Mar 11 01:15:16 crc kubenswrapper[4744]: I0311 01:15:16.311167 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="ab006bc8-78da-42a8-9322-f52588f20622" containerName="collect-profiles" Mar 11 01:15:16 crc kubenswrapper[4744]: E0311 01:15:16.311178 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3946cf42-7442-4fad-b561-2050f9d26d8f" containerName="mariadb-account-create-update" Mar 11 01:15:16 crc kubenswrapper[4744]: I0311 01:15:16.311183 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="3946cf42-7442-4fad-b561-2050f9d26d8f" containerName="mariadb-account-create-update" Mar 11 01:15:16 crc kubenswrapper[4744]: I0311 01:15:16.311334 4744 memory_manager.go:354] "RemoveStaleState removing state" podUID="e122b7f6-6664-4484-afc0-c5629ad3a7e3" containerName="mariadb-database-create" Mar 11 01:15:16 crc kubenswrapper[4744]: I0311 01:15:16.311347 4744 memory_manager.go:354] "RemoveStaleState removing state" podUID="3d70370e-fce0-48d8-9856-06e04916e905" containerName="mariadb-database-create" Mar 11 01:15:16 crc kubenswrapper[4744]: I0311 01:15:16.311354 4744 memory_manager.go:354] "RemoveStaleState removing state" podUID="ec0bae40-f9bc-4bc7-81e3-684d3f8a6512" 
containerName="mariadb-account-create-update" Mar 11 01:15:16 crc kubenswrapper[4744]: I0311 01:15:16.311362 4744 memory_manager.go:354] "RemoveStaleState removing state" podUID="d138bcef-d88d-4af3-8f41-f4804e583670" containerName="mariadb-database-create" Mar 11 01:15:16 crc kubenswrapper[4744]: I0311 01:15:16.311371 4744 memory_manager.go:354] "RemoveStaleState removing state" podUID="da71c584-ebbd-42c0-96c9-716bbd47efce" containerName="mariadb-account-create-update" Mar 11 01:15:16 crc kubenswrapper[4744]: I0311 01:15:16.311378 4744 memory_manager.go:354] "RemoveStaleState removing state" podUID="ab006bc8-78da-42a8-9322-f52588f20622" containerName="collect-profiles" Mar 11 01:15:16 crc kubenswrapper[4744]: I0311 01:15:16.311389 4744 memory_manager.go:354] "RemoveStaleState removing state" podUID="9dcf4eeb-bf9a-4e68-b2f6-a8de4386d24b" containerName="mariadb-account-create-update" Mar 11 01:15:16 crc kubenswrapper[4744]: I0311 01:15:16.311398 4744 memory_manager.go:354] "RemoveStaleState removing state" podUID="3946cf42-7442-4fad-b561-2050f9d26d8f" containerName="mariadb-account-create-update" Mar 11 01:15:16 crc kubenswrapper[4744]: I0311 01:15:16.311409 4744 memory_manager.go:354] "RemoveStaleState removing state" podUID="102a3691-340d-4bed-b87f-7bebbdb1f819" containerName="dnsmasq-dns" Mar 11 01:15:16 crc kubenswrapper[4744]: I0311 01:15:16.311905 4744 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-sync-hhwh7" Mar 11 01:15:16 crc kubenswrapper[4744]: I0311 01:15:16.314407 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-glance-dockercfg-mfg2r" Mar 11 01:15:16 crc kubenswrapper[4744]: I0311 01:15:16.316072 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-config-data" Mar 11 01:15:16 crc kubenswrapper[4744]: I0311 01:15:16.330234 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-sync-hhwh7"] Mar 11 01:15:16 crc kubenswrapper[4744]: I0311 01:15:16.371579 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tzl7v\" (UniqueName: \"kubernetes.io/projected/0fbc16eb-59fb-4814-b3b7-944573b75d23-kube-api-access-tzl7v\") pod \"glance-db-sync-hhwh7\" (UID: \"0fbc16eb-59fb-4814-b3b7-944573b75d23\") " pod="openstack/glance-db-sync-hhwh7" Mar 11 01:15:16 crc kubenswrapper[4744]: I0311 01:15:16.371645 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0fbc16eb-59fb-4814-b3b7-944573b75d23-combined-ca-bundle\") pod \"glance-db-sync-hhwh7\" (UID: \"0fbc16eb-59fb-4814-b3b7-944573b75d23\") " pod="openstack/glance-db-sync-hhwh7" Mar 11 01:15:16 crc kubenswrapper[4744]: I0311 01:15:16.371687 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0fbc16eb-59fb-4814-b3b7-944573b75d23-config-data\") pod \"glance-db-sync-hhwh7\" (UID: \"0fbc16eb-59fb-4814-b3b7-944573b75d23\") " pod="openstack/glance-db-sync-hhwh7" Mar 11 01:15:16 crc kubenswrapper[4744]: I0311 01:15:16.371745 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: 
\"kubernetes.io/secret/0fbc16eb-59fb-4814-b3b7-944573b75d23-db-sync-config-data\") pod \"glance-db-sync-hhwh7\" (UID: \"0fbc16eb-59fb-4814-b3b7-944573b75d23\") " pod="openstack/glance-db-sync-hhwh7" Mar 11 01:15:16 crc kubenswrapper[4744]: I0311 01:15:16.473164 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0fbc16eb-59fb-4814-b3b7-944573b75d23-config-data\") pod \"glance-db-sync-hhwh7\" (UID: \"0fbc16eb-59fb-4814-b3b7-944573b75d23\") " pod="openstack/glance-db-sync-hhwh7" Mar 11 01:15:16 crc kubenswrapper[4744]: I0311 01:15:16.473231 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/0fbc16eb-59fb-4814-b3b7-944573b75d23-db-sync-config-data\") pod \"glance-db-sync-hhwh7\" (UID: \"0fbc16eb-59fb-4814-b3b7-944573b75d23\") " pod="openstack/glance-db-sync-hhwh7" Mar 11 01:15:16 crc kubenswrapper[4744]: I0311 01:15:16.473490 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tzl7v\" (UniqueName: \"kubernetes.io/projected/0fbc16eb-59fb-4814-b3b7-944573b75d23-kube-api-access-tzl7v\") pod \"glance-db-sync-hhwh7\" (UID: \"0fbc16eb-59fb-4814-b3b7-944573b75d23\") " pod="openstack/glance-db-sync-hhwh7" Mar 11 01:15:16 crc kubenswrapper[4744]: I0311 01:15:16.473556 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0fbc16eb-59fb-4814-b3b7-944573b75d23-combined-ca-bundle\") pod \"glance-db-sync-hhwh7\" (UID: \"0fbc16eb-59fb-4814-b3b7-944573b75d23\") " pod="openstack/glance-db-sync-hhwh7" Mar 11 01:15:16 crc kubenswrapper[4744]: I0311 01:15:16.479722 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0fbc16eb-59fb-4814-b3b7-944573b75d23-config-data\") pod \"glance-db-sync-hhwh7\" (UID: 
\"0fbc16eb-59fb-4814-b3b7-944573b75d23\") " pod="openstack/glance-db-sync-hhwh7" Mar 11 01:15:16 crc kubenswrapper[4744]: I0311 01:15:16.480787 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/0fbc16eb-59fb-4814-b3b7-944573b75d23-db-sync-config-data\") pod \"glance-db-sync-hhwh7\" (UID: \"0fbc16eb-59fb-4814-b3b7-944573b75d23\") " pod="openstack/glance-db-sync-hhwh7" Mar 11 01:15:16 crc kubenswrapper[4744]: I0311 01:15:16.505063 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0fbc16eb-59fb-4814-b3b7-944573b75d23-combined-ca-bundle\") pod \"glance-db-sync-hhwh7\" (UID: \"0fbc16eb-59fb-4814-b3b7-944573b75d23\") " pod="openstack/glance-db-sync-hhwh7" Mar 11 01:15:16 crc kubenswrapper[4744]: I0311 01:15:16.505129 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tzl7v\" (UniqueName: \"kubernetes.io/projected/0fbc16eb-59fb-4814-b3b7-944573b75d23-kube-api-access-tzl7v\") pod \"glance-db-sync-hhwh7\" (UID: \"0fbc16eb-59fb-4814-b3b7-944573b75d23\") " pod="openstack/glance-db-sync-hhwh7" Mar 11 01:15:16 crc kubenswrapper[4744]: I0311 01:15:16.627270 4744 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-sync-hhwh7" Mar 11 01:15:17 crc kubenswrapper[4744]: I0311 01:15:17.203308 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-sync-hhwh7"] Mar 11 01:15:17 crc kubenswrapper[4744]: I0311 01:15:17.462540 4744 generic.go:334] "Generic (PLEG): container finished" podID="20d64d92-7649-4a5e-af99-a92a08b47ecf" containerID="0db3f9268dd586840a750ce3aab8eee17063d00b89aeb58328fea2ea809f3bd6" exitCode=0 Mar 11 01:15:17 crc kubenswrapper[4744]: I0311 01:15:17.462621 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-vgn45" event={"ID":"20d64d92-7649-4a5e-af99-a92a08b47ecf","Type":"ContainerDied","Data":"0db3f9268dd586840a750ce3aab8eee17063d00b89aeb58328fea2ea809f3bd6"} Mar 11 01:15:17 crc kubenswrapper[4744]: I0311 01:15:17.463844 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-hhwh7" event={"ID":"0fbc16eb-59fb-4814-b3b7-944573b75d23","Type":"ContainerStarted","Data":"4b8fdeb82a144189474c9344c568e70bd5ae1f8f180eba0b100e431de8389884"} Mar 11 01:15:17 crc kubenswrapper[4744]: I0311 01:15:17.898801 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/root-account-create-update-nv5qw"] Mar 11 01:15:17 crc kubenswrapper[4744]: I0311 01:15:17.900301 4744 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-nv5qw" Mar 11 01:15:17 crc kubenswrapper[4744]: I0311 01:15:17.908715 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-nv5qw"] Mar 11 01:15:17 crc kubenswrapper[4744]: I0311 01:15:17.943149 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-mariadb-root-db-secret" Mar 11 01:15:17 crc kubenswrapper[4744]: I0311 01:15:17.999787 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/584fceb8-c08c-4f5e-b2e4-fce4d07ea030-operator-scripts\") pod \"root-account-create-update-nv5qw\" (UID: \"584fceb8-c08c-4f5e-b2e4-fce4d07ea030\") " pod="openstack/root-account-create-update-nv5qw" Mar 11 01:15:17 crc kubenswrapper[4744]: I0311 01:15:17.999824 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gjmst\" (UniqueName: \"kubernetes.io/projected/584fceb8-c08c-4f5e-b2e4-fce4d07ea030-kube-api-access-gjmst\") pod \"root-account-create-update-nv5qw\" (UID: \"584fceb8-c08c-4f5e-b2e4-fce4d07ea030\") " pod="openstack/root-account-create-update-nv5qw" Mar 11 01:15:18 crc kubenswrapper[4744]: I0311 01:15:18.102330 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/584fceb8-c08c-4f5e-b2e4-fce4d07ea030-operator-scripts\") pod \"root-account-create-update-nv5qw\" (UID: \"584fceb8-c08c-4f5e-b2e4-fce4d07ea030\") " pod="openstack/root-account-create-update-nv5qw" Mar 11 01:15:18 crc kubenswrapper[4744]: I0311 01:15:18.102401 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gjmst\" (UniqueName: \"kubernetes.io/projected/584fceb8-c08c-4f5e-b2e4-fce4d07ea030-kube-api-access-gjmst\") pod \"root-account-create-update-nv5qw\" (UID: 
\"584fceb8-c08c-4f5e-b2e4-fce4d07ea030\") " pod="openstack/root-account-create-update-nv5qw" Mar 11 01:15:18 crc kubenswrapper[4744]: I0311 01:15:18.103268 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/584fceb8-c08c-4f5e-b2e4-fce4d07ea030-operator-scripts\") pod \"root-account-create-update-nv5qw\" (UID: \"584fceb8-c08c-4f5e-b2e4-fce4d07ea030\") " pod="openstack/root-account-create-update-nv5qw" Mar 11 01:15:18 crc kubenswrapper[4744]: I0311 01:15:18.123533 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gjmst\" (UniqueName: \"kubernetes.io/projected/584fceb8-c08c-4f5e-b2e4-fce4d07ea030-kube-api-access-gjmst\") pod \"root-account-create-update-nv5qw\" (UID: \"584fceb8-c08c-4f5e-b2e4-fce4d07ea030\") " pod="openstack/root-account-create-update-nv5qw" Mar 11 01:15:18 crc kubenswrapper[4744]: I0311 01:15:18.260671 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-nv5qw" Mar 11 01:15:19 crc kubenswrapper[4744]: I0311 01:15:18.698436 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-nv5qw"] Mar 11 01:15:19 crc kubenswrapper[4744]: W0311 01:15:18.716602 4744 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod584fceb8_c08c_4f5e_b2e4_fce4d07ea030.slice/crio-297aa2cb2563825c350018cc819bc3ed060d6a546827622eae5c81d61a6c39de WatchSource:0}: Error finding container 297aa2cb2563825c350018cc819bc3ed060d6a546827622eae5c81d61a6c39de: Status 404 returned error can't find the container with id 297aa2cb2563825c350018cc819bc3ed060d6a546827622eae5c81d61a6c39de Mar 11 01:15:19 crc kubenswrapper[4744]: I0311 01:15:18.784540 4744 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-ring-rebalance-vgn45" Mar 11 01:15:19 crc kubenswrapper[4744]: I0311 01:15:18.818097 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/20d64d92-7649-4a5e-af99-a92a08b47ecf-combined-ca-bundle\") pod \"20d64d92-7649-4a5e-af99-a92a08b47ecf\" (UID: \"20d64d92-7649-4a5e-af99-a92a08b47ecf\") " Mar 11 01:15:19 crc kubenswrapper[4744]: I0311 01:15:18.818137 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/20d64d92-7649-4a5e-af99-a92a08b47ecf-scripts\") pod \"20d64d92-7649-4a5e-af99-a92a08b47ecf\" (UID: \"20d64d92-7649-4a5e-af99-a92a08b47ecf\") " Mar 11 01:15:19 crc kubenswrapper[4744]: I0311 01:15:18.818209 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/20d64d92-7649-4a5e-af99-a92a08b47ecf-etc-swift\") pod \"20d64d92-7649-4a5e-af99-a92a08b47ecf\" (UID: \"20d64d92-7649-4a5e-af99-a92a08b47ecf\") " Mar 11 01:15:19 crc kubenswrapper[4744]: I0311 01:15:18.818232 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/20d64d92-7649-4a5e-af99-a92a08b47ecf-dispersionconf\") pod \"20d64d92-7649-4a5e-af99-a92a08b47ecf\" (UID: \"20d64d92-7649-4a5e-af99-a92a08b47ecf\") " Mar 11 01:15:19 crc kubenswrapper[4744]: I0311 01:15:18.818301 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/20d64d92-7649-4a5e-af99-a92a08b47ecf-ring-data-devices\") pod \"20d64d92-7649-4a5e-af99-a92a08b47ecf\" (UID: \"20d64d92-7649-4a5e-af99-a92a08b47ecf\") " Mar 11 01:15:19 crc kubenswrapper[4744]: I0311 01:15:18.818325 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swiftconf\" (UniqueName: 
\"kubernetes.io/secret/20d64d92-7649-4a5e-af99-a92a08b47ecf-swiftconf\") pod \"20d64d92-7649-4a5e-af99-a92a08b47ecf\" (UID: \"20d64d92-7649-4a5e-af99-a92a08b47ecf\") " Mar 11 01:15:19 crc kubenswrapper[4744]: I0311 01:15:18.818815 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rnxw4\" (UniqueName: \"kubernetes.io/projected/20d64d92-7649-4a5e-af99-a92a08b47ecf-kube-api-access-rnxw4\") pod \"20d64d92-7649-4a5e-af99-a92a08b47ecf\" (UID: \"20d64d92-7649-4a5e-af99-a92a08b47ecf\") " Mar 11 01:15:19 crc kubenswrapper[4744]: I0311 01:15:18.819083 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/20d64d92-7649-4a5e-af99-a92a08b47ecf-ring-data-devices" (OuterVolumeSpecName: "ring-data-devices") pod "20d64d92-7649-4a5e-af99-a92a08b47ecf" (UID: "20d64d92-7649-4a5e-af99-a92a08b47ecf"). InnerVolumeSpecName "ring-data-devices". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 11 01:15:19 crc kubenswrapper[4744]: I0311 01:15:18.819538 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/20d64d92-7649-4a5e-af99-a92a08b47ecf-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "20d64d92-7649-4a5e-af99-a92a08b47ecf" (UID: "20d64d92-7649-4a5e-af99-a92a08b47ecf"). InnerVolumeSpecName "etc-swift". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 11 01:15:19 crc kubenswrapper[4744]: I0311 01:15:18.824354 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/20d64d92-7649-4a5e-af99-a92a08b47ecf-kube-api-access-rnxw4" (OuterVolumeSpecName: "kube-api-access-rnxw4") pod "20d64d92-7649-4a5e-af99-a92a08b47ecf" (UID: "20d64d92-7649-4a5e-af99-a92a08b47ecf"). InnerVolumeSpecName "kube-api-access-rnxw4". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 01:15:19 crc kubenswrapper[4744]: I0311 01:15:18.826477 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/20d64d92-7649-4a5e-af99-a92a08b47ecf-dispersionconf" (OuterVolumeSpecName: "dispersionconf") pod "20d64d92-7649-4a5e-af99-a92a08b47ecf" (UID: "20d64d92-7649-4a5e-af99-a92a08b47ecf"). InnerVolumeSpecName "dispersionconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 01:15:19 crc kubenswrapper[4744]: I0311 01:15:18.858933 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/20d64d92-7649-4a5e-af99-a92a08b47ecf-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "20d64d92-7649-4a5e-af99-a92a08b47ecf" (UID: "20d64d92-7649-4a5e-af99-a92a08b47ecf"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 01:15:19 crc kubenswrapper[4744]: I0311 01:15:18.860404 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/20d64d92-7649-4a5e-af99-a92a08b47ecf-scripts" (OuterVolumeSpecName: "scripts") pod "20d64d92-7649-4a5e-af99-a92a08b47ecf" (UID: "20d64d92-7649-4a5e-af99-a92a08b47ecf"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 11 01:15:19 crc kubenswrapper[4744]: I0311 01:15:18.864158 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/20d64d92-7649-4a5e-af99-a92a08b47ecf-swiftconf" (OuterVolumeSpecName: "swiftconf") pod "20d64d92-7649-4a5e-af99-a92a08b47ecf" (UID: "20d64d92-7649-4a5e-af99-a92a08b47ecf"). InnerVolumeSpecName "swiftconf". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 01:15:19 crc kubenswrapper[4744]: I0311 01:15:18.920702 4744 reconciler_common.go:293] "Volume detached for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/20d64d92-7649-4a5e-af99-a92a08b47ecf-dispersionconf\") on node \"crc\" DevicePath \"\"" Mar 11 01:15:19 crc kubenswrapper[4744]: I0311 01:15:18.921016 4744 reconciler_common.go:293] "Volume detached for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/20d64d92-7649-4a5e-af99-a92a08b47ecf-ring-data-devices\") on node \"crc\" DevicePath \"\"" Mar 11 01:15:19 crc kubenswrapper[4744]: I0311 01:15:18.921028 4744 reconciler_common.go:293] "Volume detached for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/20d64d92-7649-4a5e-af99-a92a08b47ecf-swiftconf\") on node \"crc\" DevicePath \"\"" Mar 11 01:15:19 crc kubenswrapper[4744]: I0311 01:15:18.921038 4744 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rnxw4\" (UniqueName: \"kubernetes.io/projected/20d64d92-7649-4a5e-af99-a92a08b47ecf-kube-api-access-rnxw4\") on node \"crc\" DevicePath \"\"" Mar 11 01:15:19 crc kubenswrapper[4744]: I0311 01:15:18.921051 4744 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/20d64d92-7649-4a5e-af99-a92a08b47ecf-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 11 01:15:19 crc kubenswrapper[4744]: I0311 01:15:18.921059 4744 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/20d64d92-7649-4a5e-af99-a92a08b47ecf-scripts\") on node \"crc\" DevicePath \"\"" Mar 11 01:15:19 crc kubenswrapper[4744]: I0311 01:15:18.921069 4744 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/20d64d92-7649-4a5e-af99-a92a08b47ecf-etc-swift\") on node \"crc\" DevicePath \"\"" Mar 11 01:15:19 crc kubenswrapper[4744]: I0311 01:15:19.482963 4744 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-vgn45" event={"ID":"20d64d92-7649-4a5e-af99-a92a08b47ecf","Type":"ContainerDied","Data":"31c7c3063d5b2d93b4570ef046e6412085ca717897c334d78290472c58d69206"} Mar 11 01:15:19 crc kubenswrapper[4744]: I0311 01:15:19.483005 4744 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="31c7c3063d5b2d93b4570ef046e6412085ca717897c334d78290472c58d69206" Mar 11 01:15:19 crc kubenswrapper[4744]: I0311 01:15:19.483024 4744 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-vgn45" Mar 11 01:15:19 crc kubenswrapper[4744]: I0311 01:15:19.484209 4744 generic.go:334] "Generic (PLEG): container finished" podID="584fceb8-c08c-4f5e-b2e4-fce4d07ea030" containerID="a44ee64c6826fd9da076bf430593b7dd5a4963b74804af985f86a5ed00395e74" exitCode=0 Mar 11 01:15:19 crc kubenswrapper[4744]: I0311 01:15:19.484241 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-nv5qw" event={"ID":"584fceb8-c08c-4f5e-b2e4-fce4d07ea030","Type":"ContainerDied","Data":"a44ee64c6826fd9da076bf430593b7dd5a4963b74804af985f86a5ed00395e74"} Mar 11 01:15:19 crc kubenswrapper[4744]: I0311 01:15:19.484254 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-nv5qw" event={"ID":"584fceb8-c08c-4f5e-b2e4-fce4d07ea030","Type":"ContainerStarted","Data":"297aa2cb2563825c350018cc819bc3ed060d6a546827622eae5c81d61a6c39de"} Mar 11 01:15:20 crc kubenswrapper[4744]: I0311 01:15:20.041998 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/524fac10-b874-465e-b4aa-221b6c689959-etc-swift\") pod \"swift-storage-0\" (UID: \"524fac10-b874-465e-b4aa-221b6c689959\") " pod="openstack/swift-storage-0" Mar 11 01:15:20 crc kubenswrapper[4744]: I0311 01:15:20.061773 4744 operation_generator.go:637] "MountVolume.SetUp succeeded 
for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/524fac10-b874-465e-b4aa-221b6c689959-etc-swift\") pod \"swift-storage-0\" (UID: \"524fac10-b874-465e-b4aa-221b6c689959\") " pod="openstack/swift-storage-0" Mar 11 01:15:20 crc kubenswrapper[4744]: I0311 01:15:20.072233 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-storage-0" Mar 11 01:15:20 crc kubenswrapper[4744]: I0311 01:15:20.624600 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-storage-0"] Mar 11 01:15:20 crc kubenswrapper[4744]: W0311 01:15:20.680815 4744 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod524fac10_b874_465e_b4aa_221b6c689959.slice/crio-9837290bf3ac08505bb72377e073fa947a1426eeec53c677c31fef53c43ad429 WatchSource:0}: Error finding container 9837290bf3ac08505bb72377e073fa947a1426eeec53c677c31fef53c43ad429: Status 404 returned error can't find the container with id 9837290bf3ac08505bb72377e073fa947a1426eeec53c677c31fef53c43ad429 Mar 11 01:15:20 crc kubenswrapper[4744]: I0311 01:15:20.899108 4744 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-nv5qw" Mar 11 01:15:20 crc kubenswrapper[4744]: I0311 01:15:20.958680 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/584fceb8-c08c-4f5e-b2e4-fce4d07ea030-operator-scripts\") pod \"584fceb8-c08c-4f5e-b2e4-fce4d07ea030\" (UID: \"584fceb8-c08c-4f5e-b2e4-fce4d07ea030\") " Mar 11 01:15:20 crc kubenswrapper[4744]: I0311 01:15:20.958863 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gjmst\" (UniqueName: \"kubernetes.io/projected/584fceb8-c08c-4f5e-b2e4-fce4d07ea030-kube-api-access-gjmst\") pod \"584fceb8-c08c-4f5e-b2e4-fce4d07ea030\" (UID: \"584fceb8-c08c-4f5e-b2e4-fce4d07ea030\") " Mar 11 01:15:20 crc kubenswrapper[4744]: I0311 01:15:20.959640 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/584fceb8-c08c-4f5e-b2e4-fce4d07ea030-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "584fceb8-c08c-4f5e-b2e4-fce4d07ea030" (UID: "584fceb8-c08c-4f5e-b2e4-fce4d07ea030"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 11 01:15:20 crc kubenswrapper[4744]: I0311 01:15:20.964309 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/584fceb8-c08c-4f5e-b2e4-fce4d07ea030-kube-api-access-gjmst" (OuterVolumeSpecName: "kube-api-access-gjmst") pod "584fceb8-c08c-4f5e-b2e4-fce4d07ea030" (UID: "584fceb8-c08c-4f5e-b2e4-fce4d07ea030"). InnerVolumeSpecName "kube-api-access-gjmst". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 01:15:21 crc kubenswrapper[4744]: I0311 01:15:21.062130 4744 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/584fceb8-c08c-4f5e-b2e4-fce4d07ea030-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 11 01:15:21 crc kubenswrapper[4744]: I0311 01:15:21.062158 4744 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gjmst\" (UniqueName: \"kubernetes.io/projected/584fceb8-c08c-4f5e-b2e4-fce4d07ea030-kube-api-access-gjmst\") on node \"crc\" DevicePath \"\"" Mar 11 01:15:21 crc kubenswrapper[4744]: I0311 01:15:21.503935 4744 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-nv5qw" Mar 11 01:15:21 crc kubenswrapper[4744]: I0311 01:15:21.503937 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-nv5qw" event={"ID":"584fceb8-c08c-4f5e-b2e4-fce4d07ea030","Type":"ContainerDied","Data":"297aa2cb2563825c350018cc819bc3ed060d6a546827622eae5c81d61a6c39de"} Mar 11 01:15:21 crc kubenswrapper[4744]: I0311 01:15:21.503982 4744 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="297aa2cb2563825c350018cc819bc3ed060d6a546827622eae5c81d61a6c39de" Mar 11 01:15:21 crc kubenswrapper[4744]: I0311 01:15:21.506972 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"524fac10-b874-465e-b4aa-221b6c689959","Type":"ContainerStarted","Data":"9837290bf3ac08505bb72377e073fa947a1426eeec53c677c31fef53c43ad429"} Mar 11 01:15:22 crc kubenswrapper[4744]: I0311 01:15:22.001040 4744 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ovn-controller-2mjl7" podUID="fe2603a1-fdea-44d4-8188-f5f93324575c" containerName="ovn-controller" probeResult="failure" output=< Mar 11 01:15:22 crc kubenswrapper[4744]: ERROR - ovn-controller connection status is 'not 
connected', expecting 'connected' status Mar 11 01:15:22 crc kubenswrapper[4744]: > Mar 11 01:15:22 crc kubenswrapper[4744]: I0311 01:15:22.019782 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-ovs-88ffp" Mar 11 01:15:22 crc kubenswrapper[4744]: I0311 01:15:22.022413 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-ovs-88ffp" Mar 11 01:15:22 crc kubenswrapper[4744]: I0311 01:15:22.288268 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-2mjl7-config-4rg2z"] Mar 11 01:15:22 crc kubenswrapper[4744]: E0311 01:15:22.288724 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="584fceb8-c08c-4f5e-b2e4-fce4d07ea030" containerName="mariadb-account-create-update" Mar 11 01:15:22 crc kubenswrapper[4744]: I0311 01:15:22.288744 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="584fceb8-c08c-4f5e-b2e4-fce4d07ea030" containerName="mariadb-account-create-update" Mar 11 01:15:22 crc kubenswrapper[4744]: E0311 01:15:22.288766 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="20d64d92-7649-4a5e-af99-a92a08b47ecf" containerName="swift-ring-rebalance" Mar 11 01:15:22 crc kubenswrapper[4744]: I0311 01:15:22.288777 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="20d64d92-7649-4a5e-af99-a92a08b47ecf" containerName="swift-ring-rebalance" Mar 11 01:15:22 crc kubenswrapper[4744]: I0311 01:15:22.288965 4744 memory_manager.go:354] "RemoveStaleState removing state" podUID="584fceb8-c08c-4f5e-b2e4-fce4d07ea030" containerName="mariadb-account-create-update" Mar 11 01:15:22 crc kubenswrapper[4744]: I0311 01:15:22.288998 4744 memory_manager.go:354] "RemoveStaleState removing state" podUID="20d64d92-7649-4a5e-af99-a92a08b47ecf" containerName="swift-ring-rebalance" Mar 11 01:15:22 crc kubenswrapper[4744]: I0311 01:15:22.289701 4744 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-2mjl7-config-4rg2z" Mar 11 01:15:22 crc kubenswrapper[4744]: I0311 01:15:22.292383 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-extra-scripts" Mar 11 01:15:22 crc kubenswrapper[4744]: I0311 01:15:22.300050 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-2mjl7-config-4rg2z"] Mar 11 01:15:22 crc kubenswrapper[4744]: I0311 01:15:22.390936 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/9758c278-1bf2-4b51-ac3d-c150415702b9-var-run\") pod \"ovn-controller-2mjl7-config-4rg2z\" (UID: \"9758c278-1bf2-4b51-ac3d-c150415702b9\") " pod="openstack/ovn-controller-2mjl7-config-4rg2z" Mar 11 01:15:22 crc kubenswrapper[4744]: I0311 01:15:22.390984 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/9758c278-1bf2-4b51-ac3d-c150415702b9-var-run-ovn\") pod \"ovn-controller-2mjl7-config-4rg2z\" (UID: \"9758c278-1bf2-4b51-ac3d-c150415702b9\") " pod="openstack/ovn-controller-2mjl7-config-4rg2z" Mar 11 01:15:22 crc kubenswrapper[4744]: I0311 01:15:22.391091 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jt4lf\" (UniqueName: \"kubernetes.io/projected/9758c278-1bf2-4b51-ac3d-c150415702b9-kube-api-access-jt4lf\") pod \"ovn-controller-2mjl7-config-4rg2z\" (UID: \"9758c278-1bf2-4b51-ac3d-c150415702b9\") " pod="openstack/ovn-controller-2mjl7-config-4rg2z" Mar 11 01:15:22 crc kubenswrapper[4744]: I0311 01:15:22.391122 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/9758c278-1bf2-4b51-ac3d-c150415702b9-additional-scripts\") pod \"ovn-controller-2mjl7-config-4rg2z\" (UID: 
\"9758c278-1bf2-4b51-ac3d-c150415702b9\") " pod="openstack/ovn-controller-2mjl7-config-4rg2z" Mar 11 01:15:22 crc kubenswrapper[4744]: I0311 01:15:22.391228 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/9758c278-1bf2-4b51-ac3d-c150415702b9-var-log-ovn\") pod \"ovn-controller-2mjl7-config-4rg2z\" (UID: \"9758c278-1bf2-4b51-ac3d-c150415702b9\") " pod="openstack/ovn-controller-2mjl7-config-4rg2z" Mar 11 01:15:22 crc kubenswrapper[4744]: I0311 01:15:22.391279 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/9758c278-1bf2-4b51-ac3d-c150415702b9-scripts\") pod \"ovn-controller-2mjl7-config-4rg2z\" (UID: \"9758c278-1bf2-4b51-ac3d-c150415702b9\") " pod="openstack/ovn-controller-2mjl7-config-4rg2z" Mar 11 01:15:22 crc kubenswrapper[4744]: I0311 01:15:22.492504 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/9758c278-1bf2-4b51-ac3d-c150415702b9-var-log-ovn\") pod \"ovn-controller-2mjl7-config-4rg2z\" (UID: \"9758c278-1bf2-4b51-ac3d-c150415702b9\") " pod="openstack/ovn-controller-2mjl7-config-4rg2z" Mar 11 01:15:22 crc kubenswrapper[4744]: I0311 01:15:22.492608 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/9758c278-1bf2-4b51-ac3d-c150415702b9-scripts\") pod \"ovn-controller-2mjl7-config-4rg2z\" (UID: \"9758c278-1bf2-4b51-ac3d-c150415702b9\") " pod="openstack/ovn-controller-2mjl7-config-4rg2z" Mar 11 01:15:22 crc kubenswrapper[4744]: I0311 01:15:22.492648 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/9758c278-1bf2-4b51-ac3d-c150415702b9-var-run-ovn\") pod \"ovn-controller-2mjl7-config-4rg2z\" (UID: 
\"9758c278-1bf2-4b51-ac3d-c150415702b9\") " pod="openstack/ovn-controller-2mjl7-config-4rg2z" Mar 11 01:15:22 crc kubenswrapper[4744]: I0311 01:15:22.492668 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/9758c278-1bf2-4b51-ac3d-c150415702b9-var-run\") pod \"ovn-controller-2mjl7-config-4rg2z\" (UID: \"9758c278-1bf2-4b51-ac3d-c150415702b9\") " pod="openstack/ovn-controller-2mjl7-config-4rg2z" Mar 11 01:15:22 crc kubenswrapper[4744]: I0311 01:15:22.492722 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jt4lf\" (UniqueName: \"kubernetes.io/projected/9758c278-1bf2-4b51-ac3d-c150415702b9-kube-api-access-jt4lf\") pod \"ovn-controller-2mjl7-config-4rg2z\" (UID: \"9758c278-1bf2-4b51-ac3d-c150415702b9\") " pod="openstack/ovn-controller-2mjl7-config-4rg2z" Mar 11 01:15:22 crc kubenswrapper[4744]: I0311 01:15:22.492753 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/9758c278-1bf2-4b51-ac3d-c150415702b9-additional-scripts\") pod \"ovn-controller-2mjl7-config-4rg2z\" (UID: \"9758c278-1bf2-4b51-ac3d-c150415702b9\") " pod="openstack/ovn-controller-2mjl7-config-4rg2z" Mar 11 01:15:22 crc kubenswrapper[4744]: I0311 01:15:22.492897 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/9758c278-1bf2-4b51-ac3d-c150415702b9-var-log-ovn\") pod \"ovn-controller-2mjl7-config-4rg2z\" (UID: \"9758c278-1bf2-4b51-ac3d-c150415702b9\") " pod="openstack/ovn-controller-2mjl7-config-4rg2z" Mar 11 01:15:22 crc kubenswrapper[4744]: I0311 01:15:22.492984 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/9758c278-1bf2-4b51-ac3d-c150415702b9-var-run-ovn\") pod \"ovn-controller-2mjl7-config-4rg2z\" (UID: 
\"9758c278-1bf2-4b51-ac3d-c150415702b9\") " pod="openstack/ovn-controller-2mjl7-config-4rg2z" Mar 11 01:15:22 crc kubenswrapper[4744]: I0311 01:15:22.493066 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/9758c278-1bf2-4b51-ac3d-c150415702b9-var-run\") pod \"ovn-controller-2mjl7-config-4rg2z\" (UID: \"9758c278-1bf2-4b51-ac3d-c150415702b9\") " pod="openstack/ovn-controller-2mjl7-config-4rg2z" Mar 11 01:15:22 crc kubenswrapper[4744]: I0311 01:15:22.493628 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/9758c278-1bf2-4b51-ac3d-c150415702b9-additional-scripts\") pod \"ovn-controller-2mjl7-config-4rg2z\" (UID: \"9758c278-1bf2-4b51-ac3d-c150415702b9\") " pod="openstack/ovn-controller-2mjl7-config-4rg2z" Mar 11 01:15:22 crc kubenswrapper[4744]: I0311 01:15:22.495339 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/9758c278-1bf2-4b51-ac3d-c150415702b9-scripts\") pod \"ovn-controller-2mjl7-config-4rg2z\" (UID: \"9758c278-1bf2-4b51-ac3d-c150415702b9\") " pod="openstack/ovn-controller-2mjl7-config-4rg2z" Mar 11 01:15:22 crc kubenswrapper[4744]: I0311 01:15:22.511810 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jt4lf\" (UniqueName: \"kubernetes.io/projected/9758c278-1bf2-4b51-ac3d-c150415702b9-kube-api-access-jt4lf\") pod \"ovn-controller-2mjl7-config-4rg2z\" (UID: \"9758c278-1bf2-4b51-ac3d-c150415702b9\") " pod="openstack/ovn-controller-2mjl7-config-4rg2z" Mar 11 01:15:22 crc kubenswrapper[4744]: I0311 01:15:22.612024 4744 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-2mjl7-config-4rg2z" Mar 11 01:15:24 crc kubenswrapper[4744]: I0311 01:15:24.488544 4744 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/root-account-create-update-nv5qw"] Mar 11 01:15:24 crc kubenswrapper[4744]: I0311 01:15:24.501292 4744 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/root-account-create-update-nv5qw"] Mar 11 01:15:25 crc kubenswrapper[4744]: I0311 01:15:25.997338 4744 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="584fceb8-c08c-4f5e-b2e4-fce4d07ea030" path="/var/lib/kubelet/pods/584fceb8-c08c-4f5e-b2e4-fce4d07ea030/volumes" Mar 11 01:15:27 crc kubenswrapper[4744]: I0311 01:15:27.022048 4744 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ovn-controller-2mjl7" podUID="fe2603a1-fdea-44d4-8188-f5f93324575c" containerName="ovn-controller" probeResult="failure" output=< Mar 11 01:15:27 crc kubenswrapper[4744]: ERROR - ovn-controller connection status is 'not connected', expecting 'connected' status Mar 11 01:15:27 crc kubenswrapper[4744]: > Mar 11 01:15:27 crc kubenswrapper[4744]: I0311 01:15:27.797549 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-cell1-server-0" Mar 11 01:15:28 crc kubenswrapper[4744]: I0311 01:15:28.055875 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-server-0" Mar 11 01:15:29 crc kubenswrapper[4744]: E0311 01:15:29.270046 4744 upgradeaware.go:427] Error proxying data from client to backend: readfrom tcp 38.102.83.58:35486->38.102.83.58:46419: write tcp 38.102.83.58:35486->38.102.83.58:46419: write: broken pipe Mar 11 01:15:29 crc kubenswrapper[4744]: I0311 01:15:29.550635 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-db-create-hcnrn"] Mar 11 01:15:29 crc kubenswrapper[4744]: I0311 01:15:29.551655 4744 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-create-hcnrn" Mar 11 01:15:29 crc kubenswrapper[4744]: I0311 01:15:29.566666 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-create-hcnrn"] Mar 11 01:15:29 crc kubenswrapper[4744]: I0311 01:15:29.572050 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/root-account-create-update-vw5rd"] Mar 11 01:15:29 crc kubenswrapper[4744]: I0311 01:15:29.573163 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-vw5rd" Mar 11 01:15:29 crc kubenswrapper[4744]: I0311 01:15:29.577330 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-mariadb-root-db-secret" Mar 11 01:15:29 crc kubenswrapper[4744]: I0311 01:15:29.615933 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-vw5rd"] Mar 11 01:15:29 crc kubenswrapper[4744]: I0311 01:15:29.640034 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4e05323b-e6d6-49f9-8cac-1fa036a98097-operator-scripts\") pod \"root-account-create-update-vw5rd\" (UID: \"4e05323b-e6d6-49f9-8cac-1fa036a98097\") " pod="openstack/root-account-create-update-vw5rd" Mar 11 01:15:29 crc kubenswrapper[4744]: I0311 01:15:29.640102 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-scn7d\" (UniqueName: \"kubernetes.io/projected/4e05323b-e6d6-49f9-8cac-1fa036a98097-kube-api-access-scn7d\") pod \"root-account-create-update-vw5rd\" (UID: \"4e05323b-e6d6-49f9-8cac-1fa036a98097\") " pod="openstack/root-account-create-update-vw5rd" Mar 11 01:15:29 crc kubenswrapper[4744]: I0311 01:15:29.663010 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-b858-account-create-update-hl4l5"] Mar 11 01:15:29 crc kubenswrapper[4744]: I0311 
01:15:29.664087 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-b858-account-create-update-hl4l5" Mar 11 01:15:29 crc kubenswrapper[4744]: I0311 01:15:29.666092 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-db-secret" Mar 11 01:15:29 crc kubenswrapper[4744]: I0311 01:15:29.678466 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-db-create-qnfhv"] Mar 11 01:15:29 crc kubenswrapper[4744]: I0311 01:15:29.679500 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-qnfhv" Mar 11 01:15:29 crc kubenswrapper[4744]: I0311 01:15:29.689257 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-b858-account-create-update-hl4l5"] Mar 11 01:15:29 crc kubenswrapper[4744]: I0311 01:15:29.697528 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-create-qnfhv"] Mar 11 01:15:29 crc kubenswrapper[4744]: I0311 01:15:29.742443 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4e05323b-e6d6-49f9-8cac-1fa036a98097-operator-scripts\") pod \"root-account-create-update-vw5rd\" (UID: \"4e05323b-e6d6-49f9-8cac-1fa036a98097\") " pod="openstack/root-account-create-update-vw5rd" Mar 11 01:15:29 crc kubenswrapper[4744]: I0311 01:15:29.742490 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9aafed50-ec97-470d-b2b3-ed2984c5bc7e-operator-scripts\") pod \"cinder-db-create-hcnrn\" (UID: \"9aafed50-ec97-470d-b2b3-ed2984c5bc7e\") " pod="openstack/cinder-db-create-hcnrn" Mar 11 01:15:29 crc kubenswrapper[4744]: I0311 01:15:29.742539 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-45n5w\" (UniqueName: 
\"kubernetes.io/projected/9aafed50-ec97-470d-b2b3-ed2984c5bc7e-kube-api-access-45n5w\") pod \"cinder-db-create-hcnrn\" (UID: \"9aafed50-ec97-470d-b2b3-ed2984c5bc7e\") " pod="openstack/cinder-db-create-hcnrn" Mar 11 01:15:29 crc kubenswrapper[4744]: I0311 01:15:29.742589 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-scn7d\" (UniqueName: \"kubernetes.io/projected/4e05323b-e6d6-49f9-8cac-1fa036a98097-kube-api-access-scn7d\") pod \"root-account-create-update-vw5rd\" (UID: \"4e05323b-e6d6-49f9-8cac-1fa036a98097\") " pod="openstack/root-account-create-update-vw5rd" Mar 11 01:15:29 crc kubenswrapper[4744]: I0311 01:15:29.743884 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4e05323b-e6d6-49f9-8cac-1fa036a98097-operator-scripts\") pod \"root-account-create-update-vw5rd\" (UID: \"4e05323b-e6d6-49f9-8cac-1fa036a98097\") " pod="openstack/root-account-create-update-vw5rd" Mar 11 01:15:29 crc kubenswrapper[4744]: I0311 01:15:29.759851 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-scn7d\" (UniqueName: \"kubernetes.io/projected/4e05323b-e6d6-49f9-8cac-1fa036a98097-kube-api-access-scn7d\") pod \"root-account-create-update-vw5rd\" (UID: \"4e05323b-e6d6-49f9-8cac-1fa036a98097\") " pod="openstack/root-account-create-update-vw5rd" Mar 11 01:15:29 crc kubenswrapper[4744]: I0311 01:15:29.844157 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fkdtg\" (UniqueName: \"kubernetes.io/projected/f1b26cd4-ca83-4f3a-ac9d-2dbfd1e6de1b-kube-api-access-fkdtg\") pod \"barbican-db-create-qnfhv\" (UID: \"f1b26cd4-ca83-4f3a-ac9d-2dbfd1e6de1b\") " pod="openstack/barbican-db-create-qnfhv" Mar 11 01:15:29 crc kubenswrapper[4744]: I0311 01:15:29.844203 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" 
(UniqueName: \"kubernetes.io/configmap/9aafed50-ec97-470d-b2b3-ed2984c5bc7e-operator-scripts\") pod \"cinder-db-create-hcnrn\" (UID: \"9aafed50-ec97-470d-b2b3-ed2984c5bc7e\") " pod="openstack/cinder-db-create-hcnrn" Mar 11 01:15:29 crc kubenswrapper[4744]: I0311 01:15:29.844234 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-45n5w\" (UniqueName: \"kubernetes.io/projected/9aafed50-ec97-470d-b2b3-ed2984c5bc7e-kube-api-access-45n5w\") pod \"cinder-db-create-hcnrn\" (UID: \"9aafed50-ec97-470d-b2b3-ed2984c5bc7e\") " pod="openstack/cinder-db-create-hcnrn" Mar 11 01:15:29 crc kubenswrapper[4744]: I0311 01:15:29.844958 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-72tpk\" (UniqueName: \"kubernetes.io/projected/74059d02-5e86-4b55-835e-b9dec89b45d3-kube-api-access-72tpk\") pod \"cinder-b858-account-create-update-hl4l5\" (UID: \"74059d02-5e86-4b55-835e-b9dec89b45d3\") " pod="openstack/cinder-b858-account-create-update-hl4l5" Mar 11 01:15:29 crc kubenswrapper[4744]: I0311 01:15:29.844901 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9aafed50-ec97-470d-b2b3-ed2984c5bc7e-operator-scripts\") pod \"cinder-db-create-hcnrn\" (UID: \"9aafed50-ec97-470d-b2b3-ed2984c5bc7e\") " pod="openstack/cinder-db-create-hcnrn" Mar 11 01:15:29 crc kubenswrapper[4744]: I0311 01:15:29.845041 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/74059d02-5e86-4b55-835e-b9dec89b45d3-operator-scripts\") pod \"cinder-b858-account-create-update-hl4l5\" (UID: \"74059d02-5e86-4b55-835e-b9dec89b45d3\") " pod="openstack/cinder-b858-account-create-update-hl4l5" Mar 11 01:15:29 crc kubenswrapper[4744]: I0311 01:15:29.845093 4744 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f1b26cd4-ca83-4f3a-ac9d-2dbfd1e6de1b-operator-scripts\") pod \"barbican-db-create-qnfhv\" (UID: \"f1b26cd4-ca83-4f3a-ac9d-2dbfd1e6de1b\") " pod="openstack/barbican-db-create-qnfhv" Mar 11 01:15:29 crc kubenswrapper[4744]: I0311 01:15:29.850654 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-db-create-tld22"] Mar 11 01:15:29 crc kubenswrapper[4744]: I0311 01:15:29.851599 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-tld22" Mar 11 01:15:29 crc kubenswrapper[4744]: I0311 01:15:29.866494 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-5453-account-create-update-69w9d"] Mar 11 01:15:29 crc kubenswrapper[4744]: I0311 01:15:29.867722 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-5453-account-create-update-69w9d" Mar 11 01:15:29 crc kubenswrapper[4744]: I0311 01:15:29.872189 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-db-secret" Mar 11 01:15:29 crc kubenswrapper[4744]: I0311 01:15:29.876392 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-create-tld22"] Mar 11 01:15:29 crc kubenswrapper[4744]: I0311 01:15:29.879195 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-45n5w\" (UniqueName: \"kubernetes.io/projected/9aafed50-ec97-470d-b2b3-ed2984c5bc7e-kube-api-access-45n5w\") pod \"cinder-db-create-hcnrn\" (UID: \"9aafed50-ec97-470d-b2b3-ed2984c5bc7e\") " pod="openstack/cinder-db-create-hcnrn" Mar 11 01:15:29 crc kubenswrapper[4744]: I0311 01:15:29.882159 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-5453-account-create-update-69w9d"] Mar 11 01:15:29 crc kubenswrapper[4744]: I0311 01:15:29.889931 4744 util.go:30] "No sandbox for pod 
can be found. Need to start a new one" pod="openstack/root-account-create-update-vw5rd" Mar 11 01:15:29 crc kubenswrapper[4744]: I0311 01:15:29.932012 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-db-sync-r8hsb"] Mar 11 01:15:29 crc kubenswrapper[4744]: I0311 01:15:29.932921 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-r8hsb" Mar 11 01:15:29 crc kubenswrapper[4744]: I0311 01:15:29.938269 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Mar 11 01:15:29 crc kubenswrapper[4744]: I0311 01:15:29.938375 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Mar 11 01:15:29 crc kubenswrapper[4744]: I0311 01:15:29.938494 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Mar 11 01:15:29 crc kubenswrapper[4744]: I0311 01:15:29.938728 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-r54bc" Mar 11 01:15:29 crc kubenswrapper[4744]: I0311 01:15:29.946357 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/74059d02-5e86-4b55-835e-b9dec89b45d3-operator-scripts\") pod \"cinder-b858-account-create-update-hl4l5\" (UID: \"74059d02-5e86-4b55-835e-b9dec89b45d3\") " pod="openstack/cinder-b858-account-create-update-hl4l5" Mar 11 01:15:29 crc kubenswrapper[4744]: I0311 01:15:29.946401 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/810da0cb-5013-4997-84ba-4437bce2a20d-combined-ca-bundle\") pod \"keystone-db-sync-r8hsb\" (UID: \"810da0cb-5013-4997-84ba-4437bce2a20d\") " pod="openstack/keystone-db-sync-r8hsb" Mar 11 01:15:29 crc kubenswrapper[4744]: I0311 01:15:29.946425 4744 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/810da0cb-5013-4997-84ba-4437bce2a20d-config-data\") pod \"keystone-db-sync-r8hsb\" (UID: \"810da0cb-5013-4997-84ba-4437bce2a20d\") " pod="openstack/keystone-db-sync-r8hsb" Mar 11 01:15:29 crc kubenswrapper[4744]: I0311 01:15:29.946454 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f1b26cd4-ca83-4f3a-ac9d-2dbfd1e6de1b-operator-scripts\") pod \"barbican-db-create-qnfhv\" (UID: \"f1b26cd4-ca83-4f3a-ac9d-2dbfd1e6de1b\") " pod="openstack/barbican-db-create-qnfhv" Mar 11 01:15:29 crc kubenswrapper[4744]: I0311 01:15:29.946487 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c55d174a-b4e5-4c03-a180-b93ba3d49f1a-operator-scripts\") pod \"barbican-5453-account-create-update-69w9d\" (UID: \"c55d174a-b4e5-4c03-a180-b93ba3d49f1a\") " pod="openstack/barbican-5453-account-create-update-69w9d" Mar 11 01:15:29 crc kubenswrapper[4744]: I0311 01:15:29.946531 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m84zj\" (UniqueName: \"kubernetes.io/projected/810da0cb-5013-4997-84ba-4437bce2a20d-kube-api-access-m84zj\") pod \"keystone-db-sync-r8hsb\" (UID: \"810da0cb-5013-4997-84ba-4437bce2a20d\") " pod="openstack/keystone-db-sync-r8hsb" Mar 11 01:15:29 crc kubenswrapper[4744]: I0311 01:15:29.946586 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4391b558-59d1-4f5c-8e1e-cbf9667d6544-operator-scripts\") pod \"neutron-db-create-tld22\" (UID: \"4391b558-59d1-4f5c-8e1e-cbf9667d6544\") " pod="openstack/neutron-db-create-tld22" Mar 11 01:15:29 crc kubenswrapper[4744]: I0311 01:15:29.946608 4744 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nb9g8\" (UniqueName: \"kubernetes.io/projected/c55d174a-b4e5-4c03-a180-b93ba3d49f1a-kube-api-access-nb9g8\") pod \"barbican-5453-account-create-update-69w9d\" (UID: \"c55d174a-b4e5-4c03-a180-b93ba3d49f1a\") " pod="openstack/barbican-5453-account-create-update-69w9d" Mar 11 01:15:29 crc kubenswrapper[4744]: I0311 01:15:29.946631 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fkdtg\" (UniqueName: \"kubernetes.io/projected/f1b26cd4-ca83-4f3a-ac9d-2dbfd1e6de1b-kube-api-access-fkdtg\") pod \"barbican-db-create-qnfhv\" (UID: \"f1b26cd4-ca83-4f3a-ac9d-2dbfd1e6de1b\") " pod="openstack/barbican-db-create-qnfhv" Mar 11 01:15:29 crc kubenswrapper[4744]: I0311 01:15:29.946647 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vrptn\" (UniqueName: \"kubernetes.io/projected/4391b558-59d1-4f5c-8e1e-cbf9667d6544-kube-api-access-vrptn\") pod \"neutron-db-create-tld22\" (UID: \"4391b558-59d1-4f5c-8e1e-cbf9667d6544\") " pod="openstack/neutron-db-create-tld22" Mar 11 01:15:29 crc kubenswrapper[4744]: I0311 01:15:29.946677 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-72tpk\" (UniqueName: \"kubernetes.io/projected/74059d02-5e86-4b55-835e-b9dec89b45d3-kube-api-access-72tpk\") pod \"cinder-b858-account-create-update-hl4l5\" (UID: \"74059d02-5e86-4b55-835e-b9dec89b45d3\") " pod="openstack/cinder-b858-account-create-update-hl4l5" Mar 11 01:15:29 crc kubenswrapper[4744]: I0311 01:15:29.946961 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-sync-r8hsb"] Mar 11 01:15:29 crc kubenswrapper[4744]: I0311 01:15:29.947065 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: 
\"kubernetes.io/configmap/74059d02-5e86-4b55-835e-b9dec89b45d3-operator-scripts\") pod \"cinder-b858-account-create-update-hl4l5\" (UID: \"74059d02-5e86-4b55-835e-b9dec89b45d3\") " pod="openstack/cinder-b858-account-create-update-hl4l5" Mar 11 01:15:29 crc kubenswrapper[4744]: I0311 01:15:29.947488 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f1b26cd4-ca83-4f3a-ac9d-2dbfd1e6de1b-operator-scripts\") pod \"barbican-db-create-qnfhv\" (UID: \"f1b26cd4-ca83-4f3a-ac9d-2dbfd1e6de1b\") " pod="openstack/barbican-db-create-qnfhv" Mar 11 01:15:29 crc kubenswrapper[4744]: I0311 01:15:29.963958 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fkdtg\" (UniqueName: \"kubernetes.io/projected/f1b26cd4-ca83-4f3a-ac9d-2dbfd1e6de1b-kube-api-access-fkdtg\") pod \"barbican-db-create-qnfhv\" (UID: \"f1b26cd4-ca83-4f3a-ac9d-2dbfd1e6de1b\") " pod="openstack/barbican-db-create-qnfhv" Mar 11 01:15:29 crc kubenswrapper[4744]: I0311 01:15:29.964338 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-72tpk\" (UniqueName: \"kubernetes.io/projected/74059d02-5e86-4b55-835e-b9dec89b45d3-kube-api-access-72tpk\") pod \"cinder-b858-account-create-update-hl4l5\" (UID: \"74059d02-5e86-4b55-835e-b9dec89b45d3\") " pod="openstack/cinder-b858-account-create-update-hl4l5" Mar 11 01:15:29 crc kubenswrapper[4744]: I0311 01:15:29.988572 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-b858-account-create-update-hl4l5" Mar 11 01:15:29 crc kubenswrapper[4744]: I0311 01:15:29.995670 4744 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-create-qnfhv" Mar 11 01:15:30 crc kubenswrapper[4744]: I0311 01:15:30.047619 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c55d174a-b4e5-4c03-a180-b93ba3d49f1a-operator-scripts\") pod \"barbican-5453-account-create-update-69w9d\" (UID: \"c55d174a-b4e5-4c03-a180-b93ba3d49f1a\") " pod="openstack/barbican-5453-account-create-update-69w9d" Mar 11 01:15:30 crc kubenswrapper[4744]: I0311 01:15:30.047671 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m84zj\" (UniqueName: \"kubernetes.io/projected/810da0cb-5013-4997-84ba-4437bce2a20d-kube-api-access-m84zj\") pod \"keystone-db-sync-r8hsb\" (UID: \"810da0cb-5013-4997-84ba-4437bce2a20d\") " pod="openstack/keystone-db-sync-r8hsb" Mar 11 01:15:30 crc kubenswrapper[4744]: I0311 01:15:30.047733 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4391b558-59d1-4f5c-8e1e-cbf9667d6544-operator-scripts\") pod \"neutron-db-create-tld22\" (UID: \"4391b558-59d1-4f5c-8e1e-cbf9667d6544\") " pod="openstack/neutron-db-create-tld22" Mar 11 01:15:30 crc kubenswrapper[4744]: I0311 01:15:30.047754 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nb9g8\" (UniqueName: \"kubernetes.io/projected/c55d174a-b4e5-4c03-a180-b93ba3d49f1a-kube-api-access-nb9g8\") pod \"barbican-5453-account-create-update-69w9d\" (UID: \"c55d174a-b4e5-4c03-a180-b93ba3d49f1a\") " pod="openstack/barbican-5453-account-create-update-69w9d" Mar 11 01:15:30 crc kubenswrapper[4744]: I0311 01:15:30.047779 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vrptn\" (UniqueName: \"kubernetes.io/projected/4391b558-59d1-4f5c-8e1e-cbf9667d6544-kube-api-access-vrptn\") pod \"neutron-db-create-tld22\" (UID: 
\"4391b558-59d1-4f5c-8e1e-cbf9667d6544\") " pod="openstack/neutron-db-create-tld22" Mar 11 01:15:30 crc kubenswrapper[4744]: I0311 01:15:30.047825 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/810da0cb-5013-4997-84ba-4437bce2a20d-combined-ca-bundle\") pod \"keystone-db-sync-r8hsb\" (UID: \"810da0cb-5013-4997-84ba-4437bce2a20d\") " pod="openstack/keystone-db-sync-r8hsb" Mar 11 01:15:30 crc kubenswrapper[4744]: I0311 01:15:30.047840 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/810da0cb-5013-4997-84ba-4437bce2a20d-config-data\") pod \"keystone-db-sync-r8hsb\" (UID: \"810da0cb-5013-4997-84ba-4437bce2a20d\") " pod="openstack/keystone-db-sync-r8hsb" Mar 11 01:15:30 crc kubenswrapper[4744]: I0311 01:15:30.048972 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4391b558-59d1-4f5c-8e1e-cbf9667d6544-operator-scripts\") pod \"neutron-db-create-tld22\" (UID: \"4391b558-59d1-4f5c-8e1e-cbf9667d6544\") " pod="openstack/neutron-db-create-tld22" Mar 11 01:15:30 crc kubenswrapper[4744]: I0311 01:15:30.049367 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c55d174a-b4e5-4c03-a180-b93ba3d49f1a-operator-scripts\") pod \"barbican-5453-account-create-update-69w9d\" (UID: \"c55d174a-b4e5-4c03-a180-b93ba3d49f1a\") " pod="openstack/barbican-5453-account-create-update-69w9d" Mar 11 01:15:30 crc kubenswrapper[4744]: I0311 01:15:30.053029 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/810da0cb-5013-4997-84ba-4437bce2a20d-combined-ca-bundle\") pod \"keystone-db-sync-r8hsb\" (UID: \"810da0cb-5013-4997-84ba-4437bce2a20d\") " pod="openstack/keystone-db-sync-r8hsb" Mar 
11 01:15:30 crc kubenswrapper[4744]: I0311 01:15:30.063744 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nb9g8\" (UniqueName: \"kubernetes.io/projected/c55d174a-b4e5-4c03-a180-b93ba3d49f1a-kube-api-access-nb9g8\") pod \"barbican-5453-account-create-update-69w9d\" (UID: \"c55d174a-b4e5-4c03-a180-b93ba3d49f1a\") " pod="openstack/barbican-5453-account-create-update-69w9d" Mar 11 01:15:30 crc kubenswrapper[4744]: I0311 01:15:30.064806 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-b08a-account-create-update-bqm5v"] Mar 11 01:15:30 crc kubenswrapper[4744]: I0311 01:15:30.065787 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-b08a-account-create-update-bqm5v" Mar 11 01:15:30 crc kubenswrapper[4744]: I0311 01:15:30.067769 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vrptn\" (UniqueName: \"kubernetes.io/projected/4391b558-59d1-4f5c-8e1e-cbf9667d6544-kube-api-access-vrptn\") pod \"neutron-db-create-tld22\" (UID: \"4391b558-59d1-4f5c-8e1e-cbf9667d6544\") " pod="openstack/neutron-db-create-tld22" Mar 11 01:15:30 crc kubenswrapper[4744]: I0311 01:15:30.068382 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-db-secret" Mar 11 01:15:30 crc kubenswrapper[4744]: I0311 01:15:30.068876 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/810da0cb-5013-4997-84ba-4437bce2a20d-config-data\") pod \"keystone-db-sync-r8hsb\" (UID: \"810da0cb-5013-4997-84ba-4437bce2a20d\") " pod="openstack/keystone-db-sync-r8hsb" Mar 11 01:15:30 crc kubenswrapper[4744]: I0311 01:15:30.073820 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m84zj\" (UniqueName: \"kubernetes.io/projected/810da0cb-5013-4997-84ba-4437bce2a20d-kube-api-access-m84zj\") pod \"keystone-db-sync-r8hsb\" (UID: 
\"810da0cb-5013-4997-84ba-4437bce2a20d\") " pod="openstack/keystone-db-sync-r8hsb" Mar 11 01:15:30 crc kubenswrapper[4744]: I0311 01:15:30.075495 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-b08a-account-create-update-bqm5v"] Mar 11 01:15:30 crc kubenswrapper[4744]: I0311 01:15:30.149566 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fdqv4\" (UniqueName: \"kubernetes.io/projected/4891eca2-6c40-4d64-a625-23217932094a-kube-api-access-fdqv4\") pod \"neutron-b08a-account-create-update-bqm5v\" (UID: \"4891eca2-6c40-4d64-a625-23217932094a\") " pod="openstack/neutron-b08a-account-create-update-bqm5v" Mar 11 01:15:30 crc kubenswrapper[4744]: I0311 01:15:30.149721 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4891eca2-6c40-4d64-a625-23217932094a-operator-scripts\") pod \"neutron-b08a-account-create-update-bqm5v\" (UID: \"4891eca2-6c40-4d64-a625-23217932094a\") " pod="openstack/neutron-b08a-account-create-update-bqm5v" Mar 11 01:15:30 crc kubenswrapper[4744]: I0311 01:15:30.170569 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-tld22" Mar 11 01:15:30 crc kubenswrapper[4744]: I0311 01:15:30.173031 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-hcnrn" Mar 11 01:15:30 crc kubenswrapper[4744]: I0311 01:15:30.223375 4744 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-5453-account-create-update-69w9d" Mar 11 01:15:30 crc kubenswrapper[4744]: I0311 01:15:30.251881 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fdqv4\" (UniqueName: \"kubernetes.io/projected/4891eca2-6c40-4d64-a625-23217932094a-kube-api-access-fdqv4\") pod \"neutron-b08a-account-create-update-bqm5v\" (UID: \"4891eca2-6c40-4d64-a625-23217932094a\") " pod="openstack/neutron-b08a-account-create-update-bqm5v" Mar 11 01:15:30 crc kubenswrapper[4744]: I0311 01:15:30.251964 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4891eca2-6c40-4d64-a625-23217932094a-operator-scripts\") pod \"neutron-b08a-account-create-update-bqm5v\" (UID: \"4891eca2-6c40-4d64-a625-23217932094a\") " pod="openstack/neutron-b08a-account-create-update-bqm5v" Mar 11 01:15:30 crc kubenswrapper[4744]: I0311 01:15:30.252665 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4891eca2-6c40-4d64-a625-23217932094a-operator-scripts\") pod \"neutron-b08a-account-create-update-bqm5v\" (UID: \"4891eca2-6c40-4d64-a625-23217932094a\") " pod="openstack/neutron-b08a-account-create-update-bqm5v" Mar 11 01:15:30 crc kubenswrapper[4744]: I0311 01:15:30.254082 4744 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-sync-r8hsb" Mar 11 01:15:30 crc kubenswrapper[4744]: I0311 01:15:30.274369 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fdqv4\" (UniqueName: \"kubernetes.io/projected/4891eca2-6c40-4d64-a625-23217932094a-kube-api-access-fdqv4\") pod \"neutron-b08a-account-create-update-bqm5v\" (UID: \"4891eca2-6c40-4d64-a625-23217932094a\") " pod="openstack/neutron-b08a-account-create-update-bqm5v" Mar 11 01:15:30 crc kubenswrapper[4744]: I0311 01:15:30.392720 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-b08a-account-create-update-bqm5v" Mar 11 01:15:31 crc kubenswrapper[4744]: I0311 01:15:31.618620 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"524fac10-b874-465e-b4aa-221b6c689959","Type":"ContainerStarted","Data":"0520adffd7bb3fd0a12c9f2003a6119d39241234f65942bbd93b143144e91dff"} Mar 11 01:15:31 crc kubenswrapper[4744]: I0311 01:15:31.619093 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"524fac10-b874-465e-b4aa-221b6c689959","Type":"ContainerStarted","Data":"661223f28b0201765a8971851fdcdfc8ce86ba64df01908f6086c416839db484"} Mar 11 01:15:31 crc kubenswrapper[4744]: I0311 01:15:31.619104 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"524fac10-b874-465e-b4aa-221b6c689959","Type":"ContainerStarted","Data":"b78b35fad65a6104f3b70ff15a556ecab18834b6ed00582485d71f455ffb4854"} Mar 11 01:15:31 crc kubenswrapper[4744]: I0311 01:15:31.620469 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-hhwh7" event={"ID":"0fbc16eb-59fb-4814-b3b7-944573b75d23","Type":"ContainerStarted","Data":"82fa8eb24907701acba9eb6c5271ab16d4109170acbe1d9ecb398258c67986fb"} Mar 11 01:15:31 crc kubenswrapper[4744]: I0311 01:15:31.642530 4744 pod_startup_latency_tracker.go:104] 
"Observed pod startup duration" pod="openstack/glance-db-sync-hhwh7" podStartSLOduration=1.866933897 podStartE2EDuration="15.642505083s" podCreationTimestamp="2026-03-11 01:15:16 +0000 UTC" firstStartedPulling="2026-03-11 01:15:17.213606636 +0000 UTC m=+1274.017824251" lastFinishedPulling="2026-03-11 01:15:30.989177832 +0000 UTC m=+1287.793395437" observedRunningTime="2026-03-11 01:15:31.638398565 +0000 UTC m=+1288.442616170" watchObservedRunningTime="2026-03-11 01:15:31.642505083 +0000 UTC m=+1288.446722678" Mar 11 01:15:31 crc kubenswrapper[4744]: I0311 01:15:31.837084 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-b08a-account-create-update-bqm5v"] Mar 11 01:15:31 crc kubenswrapper[4744]: I0311 01:15:31.846123 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-vw5rd"] Mar 11 01:15:31 crc kubenswrapper[4744]: I0311 01:15:31.855069 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-b858-account-create-update-hl4l5"] Mar 11 01:15:31 crc kubenswrapper[4744]: I0311 01:15:31.865135 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-create-qnfhv"] Mar 11 01:15:31 crc kubenswrapper[4744]: I0311 01:15:31.873083 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-5453-account-create-update-69w9d"] Mar 11 01:15:31 crc kubenswrapper[4744]: I0311 01:15:31.878097 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-create-tld22"] Mar 11 01:15:31 crc kubenswrapper[4744]: I0311 01:15:31.890827 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-2mjl7-config-4rg2z"] Mar 11 01:15:31 crc kubenswrapper[4744]: I0311 01:15:31.902960 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-create-hcnrn"] Mar 11 01:15:31 crc kubenswrapper[4744]: W0311 01:15:31.947112 4744 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc55d174a_b4e5_4c03_a180_b93ba3d49f1a.slice/crio-9116038599cfa6c256c0653b860708801dc05a2c0d8027d7f2357036987d6d07 WatchSource:0}: Error finding container 9116038599cfa6c256c0653b860708801dc05a2c0d8027d7f2357036987d6d07: Status 404 returned error can't find the container with id 9116038599cfa6c256c0653b860708801dc05a2c0d8027d7f2357036987d6d07 Mar 11 01:15:31 crc kubenswrapper[4744]: W0311 01:15:31.953722 4744 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9758c278_1bf2_4b51_ac3d_c150415702b9.slice/crio-35a2d75442a7219247fcf81959691adf12040b8b3df1e52001c80675e7998d50 WatchSource:0}: Error finding container 35a2d75442a7219247fcf81959691adf12040b8b3df1e52001c80675e7998d50: Status 404 returned error can't find the container with id 35a2d75442a7219247fcf81959691adf12040b8b3df1e52001c80675e7998d50 Mar 11 01:15:31 crc kubenswrapper[4744]: W0311 01:15:31.956645 4744 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9aafed50_ec97_470d_b2b3_ed2984c5bc7e.slice/crio-19b107ac3bd0bf342bea80b906daa95fc79058168a9b7ebeb8546ac99dc83550 WatchSource:0}: Error finding container 19b107ac3bd0bf342bea80b906daa95fc79058168a9b7ebeb8546ac99dc83550: Status 404 returned error can't find the container with id 19b107ac3bd0bf342bea80b906daa95fc79058168a9b7ebeb8546ac99dc83550 Mar 11 01:15:32 crc kubenswrapper[4744]: I0311 01:15:32.014482 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-sync-r8hsb"] Mar 11 01:15:32 crc kubenswrapper[4744]: I0311 01:15:32.021677 4744 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ovn-controller-2mjl7" podUID="fe2603a1-fdea-44d4-8188-f5f93324575c" containerName="ovn-controller" probeResult="failure" output=< Mar 11 01:15:32 crc kubenswrapper[4744]: ERROR - ovn-controller connection status 
is 'not connected', expecting 'connected' status Mar 11 01:15:32 crc kubenswrapper[4744]: > Mar 11 01:15:32 crc kubenswrapper[4744]: W0311 01:15:32.069250 4744 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod810da0cb_5013_4997_84ba_4437bce2a20d.slice/crio-dc4501a53033afaf57b8e26ff67864de843ce8b07e7de7f7514c1ece7ffbe644 WatchSource:0}: Error finding container dc4501a53033afaf57b8e26ff67864de843ce8b07e7de7f7514c1ece7ffbe644: Status 404 returned error can't find the container with id dc4501a53033afaf57b8e26ff67864de843ce8b07e7de7f7514c1ece7ffbe644 Mar 11 01:15:32 crc kubenswrapper[4744]: I0311 01:15:32.636727 4744 generic.go:334] "Generic (PLEG): container finished" podID="f1b26cd4-ca83-4f3a-ac9d-2dbfd1e6de1b" containerID="29c5e48552c08092a3b4ed7d23f0734524ae1b5ed4a7bab0043222c4c3afdeda" exitCode=0 Mar 11 01:15:32 crc kubenswrapper[4744]: I0311 01:15:32.637016 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-qnfhv" event={"ID":"f1b26cd4-ca83-4f3a-ac9d-2dbfd1e6de1b","Type":"ContainerDied","Data":"29c5e48552c08092a3b4ed7d23f0734524ae1b5ed4a7bab0043222c4c3afdeda"} Mar 11 01:15:32 crc kubenswrapper[4744]: I0311 01:15:32.637043 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-qnfhv" event={"ID":"f1b26cd4-ca83-4f3a-ac9d-2dbfd1e6de1b","Type":"ContainerStarted","Data":"71b7c9a8718c820cbc1cba851343a00c68f3fc621fb8a045d819651f1cf731e6"} Mar 11 01:15:32 crc kubenswrapper[4744]: I0311 01:15:32.638876 4744 generic.go:334] "Generic (PLEG): container finished" podID="c55d174a-b4e5-4c03-a180-b93ba3d49f1a" containerID="a21a680118862f3c8728a36beb870d693dfd1b445663c3f01d10b1a3ddde25ad" exitCode=0 Mar 11 01:15:32 crc kubenswrapper[4744]: I0311 01:15:32.638980 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-5453-account-create-update-69w9d" 
event={"ID":"c55d174a-b4e5-4c03-a180-b93ba3d49f1a","Type":"ContainerDied","Data":"a21a680118862f3c8728a36beb870d693dfd1b445663c3f01d10b1a3ddde25ad"} Mar 11 01:15:32 crc kubenswrapper[4744]: I0311 01:15:32.639008 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-5453-account-create-update-69w9d" event={"ID":"c55d174a-b4e5-4c03-a180-b93ba3d49f1a","Type":"ContainerStarted","Data":"9116038599cfa6c256c0653b860708801dc05a2c0d8027d7f2357036987d6d07"} Mar 11 01:15:32 crc kubenswrapper[4744]: I0311 01:15:32.647267 4744 generic.go:334] "Generic (PLEG): container finished" podID="4391b558-59d1-4f5c-8e1e-cbf9667d6544" containerID="ea45ed7a7d4861a79b5a672dec79bdc1b140837ddb6e2ba942b9ea2973d75746" exitCode=0 Mar 11 01:15:32 crc kubenswrapper[4744]: I0311 01:15:32.647375 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-tld22" event={"ID":"4391b558-59d1-4f5c-8e1e-cbf9667d6544","Type":"ContainerDied","Data":"ea45ed7a7d4861a79b5a672dec79bdc1b140837ddb6e2ba942b9ea2973d75746"} Mar 11 01:15:32 crc kubenswrapper[4744]: I0311 01:15:32.647404 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-tld22" event={"ID":"4391b558-59d1-4f5c-8e1e-cbf9667d6544","Type":"ContainerStarted","Data":"01f533ad4f4d3e5f017341d6ad580adffcc415d6379c5c73c67c6868f66e7d36"} Mar 11 01:15:32 crc kubenswrapper[4744]: I0311 01:15:32.650589 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-r8hsb" event={"ID":"810da0cb-5013-4997-84ba-4437bce2a20d","Type":"ContainerStarted","Data":"dc4501a53033afaf57b8e26ff67864de843ce8b07e7de7f7514c1ece7ffbe644"} Mar 11 01:15:32 crc kubenswrapper[4744]: I0311 01:15:32.652688 4744 generic.go:334] "Generic (PLEG): container finished" podID="9aafed50-ec97-470d-b2b3-ed2984c5bc7e" containerID="0c402c6747514bb3e0ca66a9c9737243dde53fd4f6cf51f17fc335fca19b4742" exitCode=0 Mar 11 01:15:32 crc kubenswrapper[4744]: I0311 01:15:32.652820 4744 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-hcnrn" event={"ID":"9aafed50-ec97-470d-b2b3-ed2984c5bc7e","Type":"ContainerDied","Data":"0c402c6747514bb3e0ca66a9c9737243dde53fd4f6cf51f17fc335fca19b4742"} Mar 11 01:15:32 crc kubenswrapper[4744]: I0311 01:15:32.652850 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-hcnrn" event={"ID":"9aafed50-ec97-470d-b2b3-ed2984c5bc7e","Type":"ContainerStarted","Data":"19b107ac3bd0bf342bea80b906daa95fc79058168a9b7ebeb8546ac99dc83550"} Mar 11 01:15:32 crc kubenswrapper[4744]: I0311 01:15:32.656341 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"524fac10-b874-465e-b4aa-221b6c689959","Type":"ContainerStarted","Data":"657db772f7e222c18d1d14b7b5c9643c0ec7e79c4adc1d26309d817f501de327"} Mar 11 01:15:32 crc kubenswrapper[4744]: I0311 01:15:32.657924 4744 generic.go:334] "Generic (PLEG): container finished" podID="4e05323b-e6d6-49f9-8cac-1fa036a98097" containerID="8264dcc7b47b99d9e0f3e4cd1ce2dce69f6b1a59b9f43863fc33c7b8c9be71ff" exitCode=0 Mar 11 01:15:32 crc kubenswrapper[4744]: I0311 01:15:32.658083 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-vw5rd" event={"ID":"4e05323b-e6d6-49f9-8cac-1fa036a98097","Type":"ContainerDied","Data":"8264dcc7b47b99d9e0f3e4cd1ce2dce69f6b1a59b9f43863fc33c7b8c9be71ff"} Mar 11 01:15:32 crc kubenswrapper[4744]: I0311 01:15:32.658668 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-vw5rd" event={"ID":"4e05323b-e6d6-49f9-8cac-1fa036a98097","Type":"ContainerStarted","Data":"b6a66d7654a517de65dc14806d0b7f8f7bcd4bb0043ca8e1a028f2cad2d99bd7"} Mar 11 01:15:32 crc kubenswrapper[4744]: I0311 01:15:32.664204 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-2mjl7-config-4rg2z" 
event={"ID":"9758c278-1bf2-4b51-ac3d-c150415702b9","Type":"ContainerStarted","Data":"6599bceff68aad073272a774bd2d861c55ba5068a8f95a4f9574b99febed168a"} Mar 11 01:15:32 crc kubenswrapper[4744]: I0311 01:15:32.664245 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-2mjl7-config-4rg2z" event={"ID":"9758c278-1bf2-4b51-ac3d-c150415702b9","Type":"ContainerStarted","Data":"35a2d75442a7219247fcf81959691adf12040b8b3df1e52001c80675e7998d50"} Mar 11 01:15:32 crc kubenswrapper[4744]: I0311 01:15:32.670562 4744 generic.go:334] "Generic (PLEG): container finished" podID="74059d02-5e86-4b55-835e-b9dec89b45d3" containerID="e2ed15dda88956f35330c2f6ffc04931080e6522b2c86eb53ef084a7af41baec" exitCode=0 Mar 11 01:15:32 crc kubenswrapper[4744]: I0311 01:15:32.670784 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-b858-account-create-update-hl4l5" event={"ID":"74059d02-5e86-4b55-835e-b9dec89b45d3","Type":"ContainerDied","Data":"e2ed15dda88956f35330c2f6ffc04931080e6522b2c86eb53ef084a7af41baec"} Mar 11 01:15:32 crc kubenswrapper[4744]: I0311 01:15:32.670826 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-b858-account-create-update-hl4l5" event={"ID":"74059d02-5e86-4b55-835e-b9dec89b45d3","Type":"ContainerStarted","Data":"fbefc9a0321a715d3f2e20d78ce27d588671d98aa91568d326828763bd5b4e39"} Mar 11 01:15:32 crc kubenswrapper[4744]: I0311 01:15:32.676779 4744 generic.go:334] "Generic (PLEG): container finished" podID="4891eca2-6c40-4d64-a625-23217932094a" containerID="e3b8eae5e6b10a6863aaecbd738bf9070b2142a94b79196c12273db23b6c7636" exitCode=0 Mar 11 01:15:32 crc kubenswrapper[4744]: I0311 01:15:32.677042 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-b08a-account-create-update-bqm5v" event={"ID":"4891eca2-6c40-4d64-a625-23217932094a","Type":"ContainerDied","Data":"e3b8eae5e6b10a6863aaecbd738bf9070b2142a94b79196c12273db23b6c7636"} Mar 11 01:15:32 crc kubenswrapper[4744]: 
I0311 01:15:32.677095 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-b08a-account-create-update-bqm5v" event={"ID":"4891eca2-6c40-4d64-a625-23217932094a","Type":"ContainerStarted","Data":"ad58f256b82dd580a65e112d37b80864defa65d2cef8adcc2449ccd172187360"} Mar 11 01:15:32 crc kubenswrapper[4744]: I0311 01:15:32.736881 4744 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-2mjl7-config-4rg2z" podStartSLOduration=10.736860864 podStartE2EDuration="10.736860864s" podCreationTimestamp="2026-03-11 01:15:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-11 01:15:32.727566116 +0000 UTC m=+1289.531783721" watchObservedRunningTime="2026-03-11 01:15:32.736860864 +0000 UTC m=+1289.541078479" Mar 11 01:15:33 crc kubenswrapper[4744]: I0311 01:15:33.686111 4744 generic.go:334] "Generic (PLEG): container finished" podID="9758c278-1bf2-4b51-ac3d-c150415702b9" containerID="6599bceff68aad073272a774bd2d861c55ba5068a8f95a4f9574b99febed168a" exitCode=0 Mar 11 01:15:33 crc kubenswrapper[4744]: I0311 01:15:33.686163 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-2mjl7-config-4rg2z" event={"ID":"9758c278-1bf2-4b51-ac3d-c150415702b9","Type":"ContainerDied","Data":"6599bceff68aad073272a774bd2d861c55ba5068a8f95a4f9574b99febed168a"} Mar 11 01:15:36 crc kubenswrapper[4744]: I0311 01:15:36.724816 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-hcnrn" event={"ID":"9aafed50-ec97-470d-b2b3-ed2984c5bc7e","Type":"ContainerDied","Data":"19b107ac3bd0bf342bea80b906daa95fc79058168a9b7ebeb8546ac99dc83550"} Mar 11 01:15:36 crc kubenswrapper[4744]: I0311 01:15:36.725456 4744 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="19b107ac3bd0bf342bea80b906daa95fc79058168a9b7ebeb8546ac99dc83550" Mar 11 01:15:36 crc 
kubenswrapper[4744]: I0311 01:15:36.728438 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-b08a-account-create-update-bqm5v" event={"ID":"4891eca2-6c40-4d64-a625-23217932094a","Type":"ContainerDied","Data":"ad58f256b82dd580a65e112d37b80864defa65d2cef8adcc2449ccd172187360"} Mar 11 01:15:36 crc kubenswrapper[4744]: I0311 01:15:36.728469 4744 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ad58f256b82dd580a65e112d37b80864defa65d2cef8adcc2449ccd172187360" Mar 11 01:15:36 crc kubenswrapper[4744]: I0311 01:15:36.730183 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-qnfhv" event={"ID":"f1b26cd4-ca83-4f3a-ac9d-2dbfd1e6de1b","Type":"ContainerDied","Data":"71b7c9a8718c820cbc1cba851343a00c68f3fc621fb8a045d819651f1cf731e6"} Mar 11 01:15:36 crc kubenswrapper[4744]: I0311 01:15:36.730217 4744 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="71b7c9a8718c820cbc1cba851343a00c68f3fc621fb8a045d819651f1cf731e6" Mar 11 01:15:36 crc kubenswrapper[4744]: I0311 01:15:36.733568 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-vw5rd" event={"ID":"4e05323b-e6d6-49f9-8cac-1fa036a98097","Type":"ContainerDied","Data":"b6a66d7654a517de65dc14806d0b7f8f7bcd4bb0043ca8e1a028f2cad2d99bd7"} Mar 11 01:15:36 crc kubenswrapper[4744]: I0311 01:15:36.733611 4744 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b6a66d7654a517de65dc14806d0b7f8f7bcd4bb0043ca8e1a028f2cad2d99bd7" Mar 11 01:15:36 crc kubenswrapper[4744]: I0311 01:15:36.736391 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-5453-account-create-update-69w9d" event={"ID":"c55d174a-b4e5-4c03-a180-b93ba3d49f1a","Type":"ContainerDied","Data":"9116038599cfa6c256c0653b860708801dc05a2c0d8027d7f2357036987d6d07"} Mar 11 01:15:36 crc kubenswrapper[4744]: I0311 01:15:36.736443 4744 
pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9116038599cfa6c256c0653b860708801dc05a2c0d8027d7f2357036987d6d07" Mar 11 01:15:36 crc kubenswrapper[4744]: I0311 01:15:36.738464 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-2mjl7-config-4rg2z" event={"ID":"9758c278-1bf2-4b51-ac3d-c150415702b9","Type":"ContainerDied","Data":"35a2d75442a7219247fcf81959691adf12040b8b3df1e52001c80675e7998d50"} Mar 11 01:15:36 crc kubenswrapper[4744]: I0311 01:15:36.738493 4744 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="35a2d75442a7219247fcf81959691adf12040b8b3df1e52001c80675e7998d50" Mar 11 01:15:36 crc kubenswrapper[4744]: I0311 01:15:36.741280 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-tld22" event={"ID":"4391b558-59d1-4f5c-8e1e-cbf9667d6544","Type":"ContainerDied","Data":"01f533ad4f4d3e5f017341d6ad580adffcc415d6379c5c73c67c6868f66e7d36"} Mar 11 01:15:36 crc kubenswrapper[4744]: I0311 01:15:36.741345 4744 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="01f533ad4f4d3e5f017341d6ad580adffcc415d6379c5c73c67c6868f66e7d36" Mar 11 01:15:36 crc kubenswrapper[4744]: I0311 01:15:36.743695 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-b858-account-create-update-hl4l5" event={"ID":"74059d02-5e86-4b55-835e-b9dec89b45d3","Type":"ContainerDied","Data":"fbefc9a0321a715d3f2e20d78ce27d588671d98aa91568d326828763bd5b4e39"} Mar 11 01:15:36 crc kubenswrapper[4744]: I0311 01:15:36.743736 4744 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="fbefc9a0321a715d3f2e20d78ce27d588671d98aa91568d326828763bd5b4e39" Mar 11 01:15:36 crc kubenswrapper[4744]: I0311 01:15:36.808676 4744 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-2mjl7-config-4rg2z" Mar 11 01:15:36 crc kubenswrapper[4744]: I0311 01:15:36.835126 4744 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-b858-account-create-update-hl4l5" Mar 11 01:15:36 crc kubenswrapper[4744]: I0311 01:15:36.858750 4744 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-tld22" Mar 11 01:15:36 crc kubenswrapper[4744]: I0311 01:15:36.860247 4744 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-b08a-account-create-update-bqm5v" Mar 11 01:15:36 crc kubenswrapper[4744]: I0311 01:15:36.896754 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-72tpk\" (UniqueName: \"kubernetes.io/projected/74059d02-5e86-4b55-835e-b9dec89b45d3-kube-api-access-72tpk\") pod \"74059d02-5e86-4b55-835e-b9dec89b45d3\" (UID: \"74059d02-5e86-4b55-835e-b9dec89b45d3\") " Mar 11 01:15:36 crc kubenswrapper[4744]: I0311 01:15:36.896803 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/9758c278-1bf2-4b51-ac3d-c150415702b9-var-run-ovn\") pod \"9758c278-1bf2-4b51-ac3d-c150415702b9\" (UID: \"9758c278-1bf2-4b51-ac3d-c150415702b9\") " Mar 11 01:15:36 crc kubenswrapper[4744]: I0311 01:15:36.896835 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fdqv4\" (UniqueName: \"kubernetes.io/projected/4891eca2-6c40-4d64-a625-23217932094a-kube-api-access-fdqv4\") pod \"4891eca2-6c40-4d64-a625-23217932094a\" (UID: \"4891eca2-6c40-4d64-a625-23217932094a\") " Mar 11 01:15:36 crc kubenswrapper[4744]: I0311 01:15:36.896867 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: 
\"kubernetes.io/configmap/4391b558-59d1-4f5c-8e1e-cbf9667d6544-operator-scripts\") pod \"4391b558-59d1-4f5c-8e1e-cbf9667d6544\" (UID: \"4391b558-59d1-4f5c-8e1e-cbf9667d6544\") " Mar 11 01:15:36 crc kubenswrapper[4744]: I0311 01:15:36.897060 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/9758c278-1bf2-4b51-ac3d-c150415702b9-additional-scripts\") pod \"9758c278-1bf2-4b51-ac3d-c150415702b9\" (UID: \"9758c278-1bf2-4b51-ac3d-c150415702b9\") " Mar 11 01:15:36 crc kubenswrapper[4744]: I0311 01:15:36.897172 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/9758c278-1bf2-4b51-ac3d-c150415702b9-var-run\") pod \"9758c278-1bf2-4b51-ac3d-c150415702b9\" (UID: \"9758c278-1bf2-4b51-ac3d-c150415702b9\") " Mar 11 01:15:36 crc kubenswrapper[4744]: I0311 01:15:36.897225 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/9758c278-1bf2-4b51-ac3d-c150415702b9-var-run" (OuterVolumeSpecName: "var-run") pod "9758c278-1bf2-4b51-ac3d-c150415702b9" (UID: "9758c278-1bf2-4b51-ac3d-c150415702b9"). InnerVolumeSpecName "var-run". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 11 01:15:36 crc kubenswrapper[4744]: I0311 01:15:36.897264 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/9758c278-1bf2-4b51-ac3d-c150415702b9-var-log-ovn\") pod \"9758c278-1bf2-4b51-ac3d-c150415702b9\" (UID: \"9758c278-1bf2-4b51-ac3d-c150415702b9\") " Mar 11 01:15:36 crc kubenswrapper[4744]: I0311 01:15:36.897328 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4891eca2-6c40-4d64-a625-23217932094a-operator-scripts\") pod \"4891eca2-6c40-4d64-a625-23217932094a\" (UID: \"4891eca2-6c40-4d64-a625-23217932094a\") " Mar 11 01:15:36 crc kubenswrapper[4744]: I0311 01:15:36.897345 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/9758c278-1bf2-4b51-ac3d-c150415702b9-var-log-ovn" (OuterVolumeSpecName: "var-log-ovn") pod "9758c278-1bf2-4b51-ac3d-c150415702b9" (UID: "9758c278-1bf2-4b51-ac3d-c150415702b9"). InnerVolumeSpecName "var-log-ovn". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 11 01:15:36 crc kubenswrapper[4744]: I0311 01:15:36.897370 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/74059d02-5e86-4b55-835e-b9dec89b45d3-operator-scripts\") pod \"74059d02-5e86-4b55-835e-b9dec89b45d3\" (UID: \"74059d02-5e86-4b55-835e-b9dec89b45d3\") " Mar 11 01:15:36 crc kubenswrapper[4744]: I0311 01:15:36.897414 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/9758c278-1bf2-4b51-ac3d-c150415702b9-scripts\") pod \"9758c278-1bf2-4b51-ac3d-c150415702b9\" (UID: \"9758c278-1bf2-4b51-ac3d-c150415702b9\") " Mar 11 01:15:36 crc kubenswrapper[4744]: I0311 01:15:36.897456 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/9758c278-1bf2-4b51-ac3d-c150415702b9-var-run-ovn" (OuterVolumeSpecName: "var-run-ovn") pod "9758c278-1bf2-4b51-ac3d-c150415702b9" (UID: "9758c278-1bf2-4b51-ac3d-c150415702b9"). InnerVolumeSpecName "var-run-ovn". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 11 01:15:36 crc kubenswrapper[4744]: I0311 01:15:36.897471 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jt4lf\" (UniqueName: \"kubernetes.io/projected/9758c278-1bf2-4b51-ac3d-c150415702b9-kube-api-access-jt4lf\") pod \"9758c278-1bf2-4b51-ac3d-c150415702b9\" (UID: \"9758c278-1bf2-4b51-ac3d-c150415702b9\") " Mar 11 01:15:36 crc kubenswrapper[4744]: I0311 01:15:36.897552 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vrptn\" (UniqueName: \"kubernetes.io/projected/4391b558-59d1-4f5c-8e1e-cbf9667d6544-kube-api-access-vrptn\") pod \"4391b558-59d1-4f5c-8e1e-cbf9667d6544\" (UID: \"4391b558-59d1-4f5c-8e1e-cbf9667d6544\") " Mar 11 01:15:36 crc kubenswrapper[4744]: I0311 01:15:36.898071 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9758c278-1bf2-4b51-ac3d-c150415702b9-additional-scripts" (OuterVolumeSpecName: "additional-scripts") pod "9758c278-1bf2-4b51-ac3d-c150415702b9" (UID: "9758c278-1bf2-4b51-ac3d-c150415702b9"). InnerVolumeSpecName "additional-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 11 01:15:36 crc kubenswrapper[4744]: I0311 01:15:36.898148 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4891eca2-6c40-4d64-a625-23217932094a-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "4891eca2-6c40-4d64-a625-23217932094a" (UID: "4891eca2-6c40-4d64-a625-23217932094a"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 11 01:15:36 crc kubenswrapper[4744]: I0311 01:15:36.898331 4744 reconciler_common.go:293] "Volume detached for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/9758c278-1bf2-4b51-ac3d-c150415702b9-additional-scripts\") on node \"crc\" DevicePath \"\"" Mar 11 01:15:36 crc kubenswrapper[4744]: I0311 01:15:36.898384 4744 reconciler_common.go:293] "Volume detached for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/9758c278-1bf2-4b51-ac3d-c150415702b9-var-run\") on node \"crc\" DevicePath \"\"" Mar 11 01:15:36 crc kubenswrapper[4744]: I0311 01:15:36.898394 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4391b558-59d1-4f5c-8e1e-cbf9667d6544-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "4391b558-59d1-4f5c-8e1e-cbf9667d6544" (UID: "4391b558-59d1-4f5c-8e1e-cbf9667d6544"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 11 01:15:36 crc kubenswrapper[4744]: I0311 01:15:36.898404 4744 reconciler_common.go:293] "Volume detached for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/9758c278-1bf2-4b51-ac3d-c150415702b9-var-log-ovn\") on node \"crc\" DevicePath \"\"" Mar 11 01:15:36 crc kubenswrapper[4744]: I0311 01:15:36.898419 4744 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4891eca2-6c40-4d64-a625-23217932094a-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 11 01:15:36 crc kubenswrapper[4744]: I0311 01:15:36.898434 4744 reconciler_common.go:293] "Volume detached for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/9758c278-1bf2-4b51-ac3d-c150415702b9-var-run-ovn\") on node \"crc\" DevicePath \"\"" Mar 11 01:15:36 crc kubenswrapper[4744]: I0311 01:15:36.898429 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/configmap/74059d02-5e86-4b55-835e-b9dec89b45d3-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "74059d02-5e86-4b55-835e-b9dec89b45d3" (UID: "74059d02-5e86-4b55-835e-b9dec89b45d3"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 11 01:15:36 crc kubenswrapper[4744]: I0311 01:15:36.898867 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9758c278-1bf2-4b51-ac3d-c150415702b9-scripts" (OuterVolumeSpecName: "scripts") pod "9758c278-1bf2-4b51-ac3d-c150415702b9" (UID: "9758c278-1bf2-4b51-ac3d-c150415702b9"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 11 01:15:36 crc kubenswrapper[4744]: I0311 01:15:36.899882 4744 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-5453-account-create-update-69w9d" Mar 11 01:15:36 crc kubenswrapper[4744]: I0311 01:15:36.903731 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4391b558-59d1-4f5c-8e1e-cbf9667d6544-kube-api-access-vrptn" (OuterVolumeSpecName: "kube-api-access-vrptn") pod "4391b558-59d1-4f5c-8e1e-cbf9667d6544" (UID: "4391b558-59d1-4f5c-8e1e-cbf9667d6544"). InnerVolumeSpecName "kube-api-access-vrptn". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 01:15:36 crc kubenswrapper[4744]: I0311 01:15:36.904153 4744 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-vw5rd" Mar 11 01:15:36 crc kubenswrapper[4744]: I0311 01:15:36.909791 4744 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-create-qnfhv" Mar 11 01:15:36 crc kubenswrapper[4744]: I0311 01:15:36.909845 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/74059d02-5e86-4b55-835e-b9dec89b45d3-kube-api-access-72tpk" (OuterVolumeSpecName: "kube-api-access-72tpk") pod "74059d02-5e86-4b55-835e-b9dec89b45d3" (UID: "74059d02-5e86-4b55-835e-b9dec89b45d3"). InnerVolumeSpecName "kube-api-access-72tpk". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 01:15:36 crc kubenswrapper[4744]: I0311 01:15:36.911286 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9758c278-1bf2-4b51-ac3d-c150415702b9-kube-api-access-jt4lf" (OuterVolumeSpecName: "kube-api-access-jt4lf") pod "9758c278-1bf2-4b51-ac3d-c150415702b9" (UID: "9758c278-1bf2-4b51-ac3d-c150415702b9"). InnerVolumeSpecName "kube-api-access-jt4lf". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 01:15:36 crc kubenswrapper[4744]: I0311 01:15:36.911952 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4891eca2-6c40-4d64-a625-23217932094a-kube-api-access-fdqv4" (OuterVolumeSpecName: "kube-api-access-fdqv4") pod "4891eca2-6c40-4d64-a625-23217932094a" (UID: "4891eca2-6c40-4d64-a625-23217932094a"). InnerVolumeSpecName "kube-api-access-fdqv4". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 01:15:36 crc kubenswrapper[4744]: I0311 01:15:36.919718 4744 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-create-hcnrn" Mar 11 01:15:36 crc kubenswrapper[4744]: I0311 01:15:36.999235 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nb9g8\" (UniqueName: \"kubernetes.io/projected/c55d174a-b4e5-4c03-a180-b93ba3d49f1a-kube-api-access-nb9g8\") pod \"c55d174a-b4e5-4c03-a180-b93ba3d49f1a\" (UID: \"c55d174a-b4e5-4c03-a180-b93ba3d49f1a\") " Mar 11 01:15:36 crc kubenswrapper[4744]: I0311 01:15:36.999298 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c55d174a-b4e5-4c03-a180-b93ba3d49f1a-operator-scripts\") pod \"c55d174a-b4e5-4c03-a180-b93ba3d49f1a\" (UID: \"c55d174a-b4e5-4c03-a180-b93ba3d49f1a\") " Mar 11 01:15:36 crc kubenswrapper[4744]: I0311 01:15:36.999318 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f1b26cd4-ca83-4f3a-ac9d-2dbfd1e6de1b-operator-scripts\") pod \"f1b26cd4-ca83-4f3a-ac9d-2dbfd1e6de1b\" (UID: \"f1b26cd4-ca83-4f3a-ac9d-2dbfd1e6de1b\") " Mar 11 01:15:36 crc kubenswrapper[4744]: I0311 01:15:36.999337 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-45n5w\" (UniqueName: \"kubernetes.io/projected/9aafed50-ec97-470d-b2b3-ed2984c5bc7e-kube-api-access-45n5w\") pod \"9aafed50-ec97-470d-b2b3-ed2984c5bc7e\" (UID: \"9aafed50-ec97-470d-b2b3-ed2984c5bc7e\") " Mar 11 01:15:36 crc kubenswrapper[4744]: I0311 01:15:36.999368 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4e05323b-e6d6-49f9-8cac-1fa036a98097-operator-scripts\") pod \"4e05323b-e6d6-49f9-8cac-1fa036a98097\" (UID: \"4e05323b-e6d6-49f9-8cac-1fa036a98097\") " Mar 11 01:15:36 crc kubenswrapper[4744]: I0311 01:15:36.999415 4744 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"kube-api-access-fkdtg\" (UniqueName: \"kubernetes.io/projected/f1b26cd4-ca83-4f3a-ac9d-2dbfd1e6de1b-kube-api-access-fkdtg\") pod \"f1b26cd4-ca83-4f3a-ac9d-2dbfd1e6de1b\" (UID: \"f1b26cd4-ca83-4f3a-ac9d-2dbfd1e6de1b\") " Mar 11 01:15:36 crc kubenswrapper[4744]: I0311 01:15:36.999429 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9aafed50-ec97-470d-b2b3-ed2984c5bc7e-operator-scripts\") pod \"9aafed50-ec97-470d-b2b3-ed2984c5bc7e\" (UID: \"9aafed50-ec97-470d-b2b3-ed2984c5bc7e\") " Mar 11 01:15:36 crc kubenswrapper[4744]: I0311 01:15:36.999458 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-scn7d\" (UniqueName: \"kubernetes.io/projected/4e05323b-e6d6-49f9-8cac-1fa036a98097-kube-api-access-scn7d\") pod \"4e05323b-e6d6-49f9-8cac-1fa036a98097\" (UID: \"4e05323b-e6d6-49f9-8cac-1fa036a98097\") " Mar 11 01:15:36 crc kubenswrapper[4744]: I0311 01:15:36.999728 4744 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/74059d02-5e86-4b55-835e-b9dec89b45d3-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 11 01:15:36 crc kubenswrapper[4744]: I0311 01:15:36.999741 4744 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/9758c278-1bf2-4b51-ac3d-c150415702b9-scripts\") on node \"crc\" DevicePath \"\"" Mar 11 01:15:36 crc kubenswrapper[4744]: I0311 01:15:36.999750 4744 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jt4lf\" (UniqueName: \"kubernetes.io/projected/9758c278-1bf2-4b51-ac3d-c150415702b9-kube-api-access-jt4lf\") on node \"crc\" DevicePath \"\"" Mar 11 01:15:36 crc kubenswrapper[4744]: I0311 01:15:36.999760 4744 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vrptn\" (UniqueName: 
\"kubernetes.io/projected/4391b558-59d1-4f5c-8e1e-cbf9667d6544-kube-api-access-vrptn\") on node \"crc\" DevicePath \"\"" Mar 11 01:15:36 crc kubenswrapper[4744]: I0311 01:15:36.999769 4744 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-72tpk\" (UniqueName: \"kubernetes.io/projected/74059d02-5e86-4b55-835e-b9dec89b45d3-kube-api-access-72tpk\") on node \"crc\" DevicePath \"\"" Mar 11 01:15:36 crc kubenswrapper[4744]: I0311 01:15:36.999778 4744 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fdqv4\" (UniqueName: \"kubernetes.io/projected/4891eca2-6c40-4d64-a625-23217932094a-kube-api-access-fdqv4\") on node \"crc\" DevicePath \"\"" Mar 11 01:15:36 crc kubenswrapper[4744]: I0311 01:15:36.999787 4744 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4391b558-59d1-4f5c-8e1e-cbf9667d6544-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 11 01:15:37 crc kubenswrapper[4744]: I0311 01:15:37.002121 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9aafed50-ec97-470d-b2b3-ed2984c5bc7e-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "9aafed50-ec97-470d-b2b3-ed2984c5bc7e" (UID: "9aafed50-ec97-470d-b2b3-ed2984c5bc7e"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 11 01:15:37 crc kubenswrapper[4744]: I0311 01:15:37.002414 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4e05323b-e6d6-49f9-8cac-1fa036a98097-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "4e05323b-e6d6-49f9-8cac-1fa036a98097" (UID: "4e05323b-e6d6-49f9-8cac-1fa036a98097"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 11 01:15:37 crc kubenswrapper[4744]: I0311 01:15:37.002454 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c55d174a-b4e5-4c03-a180-b93ba3d49f1a-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "c55d174a-b4e5-4c03-a180-b93ba3d49f1a" (UID: "c55d174a-b4e5-4c03-a180-b93ba3d49f1a"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 11 01:15:37 crc kubenswrapper[4744]: I0311 01:15:37.002773 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f1b26cd4-ca83-4f3a-ac9d-2dbfd1e6de1b-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "f1b26cd4-ca83-4f3a-ac9d-2dbfd1e6de1b" (UID: "f1b26cd4-ca83-4f3a-ac9d-2dbfd1e6de1b"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 11 01:15:37 crc kubenswrapper[4744]: I0311 01:15:37.005624 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4e05323b-e6d6-49f9-8cac-1fa036a98097-kube-api-access-scn7d" (OuterVolumeSpecName: "kube-api-access-scn7d") pod "4e05323b-e6d6-49f9-8cac-1fa036a98097" (UID: "4e05323b-e6d6-49f9-8cac-1fa036a98097"). InnerVolumeSpecName "kube-api-access-scn7d". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 01:15:37 crc kubenswrapper[4744]: I0311 01:15:37.008113 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9aafed50-ec97-470d-b2b3-ed2984c5bc7e-kube-api-access-45n5w" (OuterVolumeSpecName: "kube-api-access-45n5w") pod "9aafed50-ec97-470d-b2b3-ed2984c5bc7e" (UID: "9aafed50-ec97-470d-b2b3-ed2984c5bc7e"). InnerVolumeSpecName "kube-api-access-45n5w". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 01:15:37 crc kubenswrapper[4744]: I0311 01:15:37.010678 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-2mjl7" Mar 11 01:15:37 crc kubenswrapper[4744]: I0311 01:15:37.011900 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f1b26cd4-ca83-4f3a-ac9d-2dbfd1e6de1b-kube-api-access-fkdtg" (OuterVolumeSpecName: "kube-api-access-fkdtg") pod "f1b26cd4-ca83-4f3a-ac9d-2dbfd1e6de1b" (UID: "f1b26cd4-ca83-4f3a-ac9d-2dbfd1e6de1b"). InnerVolumeSpecName "kube-api-access-fkdtg". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 01:15:37 crc kubenswrapper[4744]: I0311 01:15:37.013432 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c55d174a-b4e5-4c03-a180-b93ba3d49f1a-kube-api-access-nb9g8" (OuterVolumeSpecName: "kube-api-access-nb9g8") pod "c55d174a-b4e5-4c03-a180-b93ba3d49f1a" (UID: "c55d174a-b4e5-4c03-a180-b93ba3d49f1a"). InnerVolumeSpecName "kube-api-access-nb9g8". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 01:15:37 crc kubenswrapper[4744]: I0311 01:15:37.102984 4744 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-scn7d\" (UniqueName: \"kubernetes.io/projected/4e05323b-e6d6-49f9-8cac-1fa036a98097-kube-api-access-scn7d\") on node \"crc\" DevicePath \"\"" Mar 11 01:15:37 crc kubenswrapper[4744]: I0311 01:15:37.103233 4744 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nb9g8\" (UniqueName: \"kubernetes.io/projected/c55d174a-b4e5-4c03-a180-b93ba3d49f1a-kube-api-access-nb9g8\") on node \"crc\" DevicePath \"\"" Mar 11 01:15:37 crc kubenswrapper[4744]: I0311 01:15:37.103244 4744 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c55d174a-b4e5-4c03-a180-b93ba3d49f1a-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 11 01:15:37 crc kubenswrapper[4744]: I0311 01:15:37.103254 4744 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f1b26cd4-ca83-4f3a-ac9d-2dbfd1e6de1b-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 11 01:15:37 crc kubenswrapper[4744]: I0311 01:15:37.103264 4744 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-45n5w\" (UniqueName: \"kubernetes.io/projected/9aafed50-ec97-470d-b2b3-ed2984c5bc7e-kube-api-access-45n5w\") on node \"crc\" DevicePath \"\"" Mar 11 01:15:37 crc kubenswrapper[4744]: I0311 01:15:37.103274 4744 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4e05323b-e6d6-49f9-8cac-1fa036a98097-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 11 01:15:37 crc kubenswrapper[4744]: I0311 01:15:37.103283 4744 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fkdtg\" (UniqueName: \"kubernetes.io/projected/f1b26cd4-ca83-4f3a-ac9d-2dbfd1e6de1b-kube-api-access-fkdtg\") on node \"crc\" DevicePath \"\"" 
Mar 11 01:15:37 crc kubenswrapper[4744]: I0311 01:15:37.103291 4744 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9aafed50-ec97-470d-b2b3-ed2984c5bc7e-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 11 01:15:37 crc kubenswrapper[4744]: I0311 01:15:37.759344 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"524fac10-b874-465e-b4aa-221b6c689959","Type":"ContainerStarted","Data":"7d4d68ba9b886d9742d463d33fa1cc87d5cbb6630ca1df77c63ccccdfc56e184"} Mar 11 01:15:37 crc kubenswrapper[4744]: I0311 01:15:37.759390 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"524fac10-b874-465e-b4aa-221b6c689959","Type":"ContainerStarted","Data":"2a2081a399521c0c43979716a406e0e99df32e512b18971fd342f84cf6c0c784"} Mar 11 01:15:37 crc kubenswrapper[4744]: I0311 01:15:37.759401 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"524fac10-b874-465e-b4aa-221b6c689959","Type":"ContainerStarted","Data":"2b4eef62494e9560a6468f7258be6cbee2afabc30627c4cd424f6256bf882bd9"} Mar 11 01:15:37 crc kubenswrapper[4744]: I0311 01:15:37.759410 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"524fac10-b874-465e-b4aa-221b6c689959","Type":"ContainerStarted","Data":"d0f959b12a9512cb3f5a0776eb000de08da42f09e11825b7a95d1b85cdeb9533"} Mar 11 01:15:37 crc kubenswrapper[4744]: I0311 01:15:37.761706 4744 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-vw5rd" Mar 11 01:15:37 crc kubenswrapper[4744]: I0311 01:15:37.761723 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-r8hsb" event={"ID":"810da0cb-5013-4997-84ba-4437bce2a20d","Type":"ContainerStarted","Data":"53dabf5222da9591bfd90f74338a980cda9124c6f3b99ab148f16d2ce8f3d265"} Mar 11 01:15:37 crc kubenswrapper[4744]: I0311 01:15:37.761753 4744 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-hcnrn" Mar 11 01:15:37 crc kubenswrapper[4744]: I0311 01:15:37.762010 4744 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-2mjl7-config-4rg2z" Mar 11 01:15:37 crc kubenswrapper[4744]: I0311 01:15:37.762084 4744 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-tld22" Mar 11 01:15:37 crc kubenswrapper[4744]: I0311 01:15:37.762104 4744 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-b08a-account-create-update-bqm5v" Mar 11 01:15:37 crc kubenswrapper[4744]: I0311 01:15:37.762119 4744 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-b858-account-create-update-hl4l5" Mar 11 01:15:37 crc kubenswrapper[4744]: I0311 01:15:37.762423 4744 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-5453-account-create-update-69w9d" Mar 11 01:15:37 crc kubenswrapper[4744]: I0311 01:15:37.762699 4744 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-create-qnfhv"
Mar 11 01:15:37 crc kubenswrapper[4744]: I0311 01:15:37.814505 4744 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-db-sync-r8hsb" podStartSLOduration=4.212070314 podStartE2EDuration="8.814331889s" podCreationTimestamp="2026-03-11 01:15:29 +0000 UTC" firstStartedPulling="2026-03-11 01:15:32.074411281 +0000 UTC m=+1288.878628886" lastFinishedPulling="2026-03-11 01:15:36.676672816 +0000 UTC m=+1293.480890461" observedRunningTime="2026-03-11 01:15:37.805357901 +0000 UTC m=+1294.609575526" watchObservedRunningTime="2026-03-11 01:15:37.814331889 +0000 UTC m=+1294.618549514"
Mar 11 01:15:37 crc kubenswrapper[4744]: I0311 01:15:37.925195 4744 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-2mjl7-config-4rg2z"]
Mar 11 01:15:37 crc kubenswrapper[4744]: I0311 01:15:37.939503 4744 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovn-controller-2mjl7-config-4rg2z"]
Mar 11 01:15:38 crc kubenswrapper[4744]: I0311 01:15:38.005694 4744 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9758c278-1bf2-4b51-ac3d-c150415702b9" path="/var/lib/kubelet/pods/9758c278-1bf2-4b51-ac3d-c150415702b9/volumes"
Mar 11 01:15:38 crc kubenswrapper[4744]: I0311 01:15:38.164135 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-2mjl7-config-ln6qr"]
Mar 11 01:15:38 crc kubenswrapper[4744]: E0311 01:15:38.164574 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c55d174a-b4e5-4c03-a180-b93ba3d49f1a" containerName="mariadb-account-create-update"
Mar 11 01:15:38 crc kubenswrapper[4744]: I0311 01:15:38.164596 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="c55d174a-b4e5-4c03-a180-b93ba3d49f1a" containerName="mariadb-account-create-update"
Mar 11 01:15:38 crc kubenswrapper[4744]: E0311 01:15:38.164615 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4391b558-59d1-4f5c-8e1e-cbf9667d6544" containerName="mariadb-database-create"
Mar 11 01:15:38 crc kubenswrapper[4744]: I0311 01:15:38.164624 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="4391b558-59d1-4f5c-8e1e-cbf9667d6544" containerName="mariadb-database-create"
Mar 11 01:15:38 crc kubenswrapper[4744]: E0311 01:15:38.164642 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9aafed50-ec97-470d-b2b3-ed2984c5bc7e" containerName="mariadb-database-create"
Mar 11 01:15:38 crc kubenswrapper[4744]: I0311 01:15:38.164651 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="9aafed50-ec97-470d-b2b3-ed2984c5bc7e" containerName="mariadb-database-create"
Mar 11 01:15:38 crc kubenswrapper[4744]: E0311 01:15:38.164666 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="74059d02-5e86-4b55-835e-b9dec89b45d3" containerName="mariadb-account-create-update"
Mar 11 01:15:38 crc kubenswrapper[4744]: I0311 01:15:38.164674 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="74059d02-5e86-4b55-835e-b9dec89b45d3" containerName="mariadb-account-create-update"
Mar 11 01:15:38 crc kubenswrapper[4744]: E0311 01:15:38.164685 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4e05323b-e6d6-49f9-8cac-1fa036a98097" containerName="mariadb-account-create-update"
Mar 11 01:15:38 crc kubenswrapper[4744]: I0311 01:15:38.164692 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="4e05323b-e6d6-49f9-8cac-1fa036a98097" containerName="mariadb-account-create-update"
Mar 11 01:15:38 crc kubenswrapper[4744]: E0311 01:15:38.164713 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4891eca2-6c40-4d64-a625-23217932094a" containerName="mariadb-account-create-update"
Mar 11 01:15:38 crc kubenswrapper[4744]: I0311 01:15:38.164721 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="4891eca2-6c40-4d64-a625-23217932094a" containerName="mariadb-account-create-update"
Mar 11 01:15:38 crc kubenswrapper[4744]: E0311 01:15:38.164734 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f1b26cd4-ca83-4f3a-ac9d-2dbfd1e6de1b" containerName="mariadb-database-create"
Mar 11 01:15:38 crc kubenswrapper[4744]: I0311 01:15:38.164742 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="f1b26cd4-ca83-4f3a-ac9d-2dbfd1e6de1b" containerName="mariadb-database-create"
Mar 11 01:15:38 crc kubenswrapper[4744]: E0311 01:15:38.164760 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9758c278-1bf2-4b51-ac3d-c150415702b9" containerName="ovn-config"
Mar 11 01:15:38 crc kubenswrapper[4744]: I0311 01:15:38.164768 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="9758c278-1bf2-4b51-ac3d-c150415702b9" containerName="ovn-config"
Mar 11 01:15:38 crc kubenswrapper[4744]: I0311 01:15:38.164970 4744 memory_manager.go:354] "RemoveStaleState removing state" podUID="74059d02-5e86-4b55-835e-b9dec89b45d3" containerName="mariadb-account-create-update"
Mar 11 01:15:38 crc kubenswrapper[4744]: I0311 01:15:38.164982 4744 memory_manager.go:354] "RemoveStaleState removing state" podUID="4891eca2-6c40-4d64-a625-23217932094a" containerName="mariadb-account-create-update"
Mar 11 01:15:38 crc kubenswrapper[4744]: I0311 01:15:38.164993 4744 memory_manager.go:354] "RemoveStaleState removing state" podUID="c55d174a-b4e5-4c03-a180-b93ba3d49f1a" containerName="mariadb-account-create-update"
Mar 11 01:15:38 crc kubenswrapper[4744]: I0311 01:15:38.165009 4744 memory_manager.go:354] "RemoveStaleState removing state" podUID="4e05323b-e6d6-49f9-8cac-1fa036a98097" containerName="mariadb-account-create-update"
Mar 11 01:15:38 crc kubenswrapper[4744]: I0311 01:15:38.165020 4744 memory_manager.go:354] "RemoveStaleState removing state" podUID="f1b26cd4-ca83-4f3a-ac9d-2dbfd1e6de1b" containerName="mariadb-database-create"
Mar 11 01:15:38 crc kubenswrapper[4744]: I0311 01:15:38.165031 4744 memory_manager.go:354] "RemoveStaleState removing state" podUID="9758c278-1bf2-4b51-ac3d-c150415702b9" containerName="ovn-config"
Mar 11 01:15:38 crc kubenswrapper[4744]: I0311 01:15:38.165045 4744 memory_manager.go:354] "RemoveStaleState removing state" podUID="4391b558-59d1-4f5c-8e1e-cbf9667d6544" containerName="mariadb-database-create"
Mar 11 01:15:38 crc kubenswrapper[4744]: I0311 01:15:38.165058 4744 memory_manager.go:354] "RemoveStaleState removing state" podUID="9aafed50-ec97-470d-b2b3-ed2984c5bc7e" containerName="mariadb-database-create"
Mar 11 01:15:38 crc kubenswrapper[4744]: I0311 01:15:38.165669 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-2mjl7-config-ln6qr"
Mar 11 01:15:38 crc kubenswrapper[4744]: I0311 01:15:38.171050 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-extra-scripts"
Mar 11 01:15:38 crc kubenswrapper[4744]: I0311 01:15:38.187117 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-2mjl7-config-ln6qr"]
Mar 11 01:15:38 crc kubenswrapper[4744]: I0311 01:15:38.223626 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/a6bfb5c8-67f1-4cc5-ad35-e62b9e46d8a0-var-run\") pod \"ovn-controller-2mjl7-config-ln6qr\" (UID: \"a6bfb5c8-67f1-4cc5-ad35-e62b9e46d8a0\") " pod="openstack/ovn-controller-2mjl7-config-ln6qr"
Mar 11 01:15:38 crc kubenswrapper[4744]: I0311 01:15:38.223778 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/a6bfb5c8-67f1-4cc5-ad35-e62b9e46d8a0-scripts\") pod \"ovn-controller-2mjl7-config-ln6qr\" (UID: \"a6bfb5c8-67f1-4cc5-ad35-e62b9e46d8a0\") " pod="openstack/ovn-controller-2mjl7-config-ln6qr"
Mar 11 01:15:38 crc kubenswrapper[4744]: I0311 01:15:38.223930 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/a6bfb5c8-67f1-4cc5-ad35-e62b9e46d8a0-var-log-ovn\") pod \"ovn-controller-2mjl7-config-ln6qr\" (UID: \"a6bfb5c8-67f1-4cc5-ad35-e62b9e46d8a0\") " pod="openstack/ovn-controller-2mjl7-config-ln6qr"
Mar 11 01:15:38 crc kubenswrapper[4744]: I0311 01:15:38.223983 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bpk7h\" (UniqueName: \"kubernetes.io/projected/a6bfb5c8-67f1-4cc5-ad35-e62b9e46d8a0-kube-api-access-bpk7h\") pod \"ovn-controller-2mjl7-config-ln6qr\" (UID: \"a6bfb5c8-67f1-4cc5-ad35-e62b9e46d8a0\") " pod="openstack/ovn-controller-2mjl7-config-ln6qr"
Mar 11 01:15:38 crc kubenswrapper[4744]: I0311 01:15:38.224048 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/a6bfb5c8-67f1-4cc5-ad35-e62b9e46d8a0-additional-scripts\") pod \"ovn-controller-2mjl7-config-ln6qr\" (UID: \"a6bfb5c8-67f1-4cc5-ad35-e62b9e46d8a0\") " pod="openstack/ovn-controller-2mjl7-config-ln6qr"
Mar 11 01:15:38 crc kubenswrapper[4744]: I0311 01:15:38.224094 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/a6bfb5c8-67f1-4cc5-ad35-e62b9e46d8a0-var-run-ovn\") pod \"ovn-controller-2mjl7-config-ln6qr\" (UID: \"a6bfb5c8-67f1-4cc5-ad35-e62b9e46d8a0\") " pod="openstack/ovn-controller-2mjl7-config-ln6qr"
Mar 11 01:15:38 crc kubenswrapper[4744]: I0311 01:15:38.324867 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/a6bfb5c8-67f1-4cc5-ad35-e62b9e46d8a0-scripts\") pod \"ovn-controller-2mjl7-config-ln6qr\" (UID: \"a6bfb5c8-67f1-4cc5-ad35-e62b9e46d8a0\") " pod="openstack/ovn-controller-2mjl7-config-ln6qr"
Mar 11 01:15:38 crc kubenswrapper[4744]: I0311 01:15:38.325117 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/a6bfb5c8-67f1-4cc5-ad35-e62b9e46d8a0-var-log-ovn\") pod \"ovn-controller-2mjl7-config-ln6qr\" (UID: \"a6bfb5c8-67f1-4cc5-ad35-e62b9e46d8a0\") " pod="openstack/ovn-controller-2mjl7-config-ln6qr"
Mar 11 01:15:38 crc kubenswrapper[4744]: I0311 01:15:38.325221 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bpk7h\" (UniqueName: \"kubernetes.io/projected/a6bfb5c8-67f1-4cc5-ad35-e62b9e46d8a0-kube-api-access-bpk7h\") pod \"ovn-controller-2mjl7-config-ln6qr\" (UID: \"a6bfb5c8-67f1-4cc5-ad35-e62b9e46d8a0\") " pod="openstack/ovn-controller-2mjl7-config-ln6qr"
Mar 11 01:15:38 crc kubenswrapper[4744]: I0311 01:15:38.325300 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/a6bfb5c8-67f1-4cc5-ad35-e62b9e46d8a0-additional-scripts\") pod \"ovn-controller-2mjl7-config-ln6qr\" (UID: \"a6bfb5c8-67f1-4cc5-ad35-e62b9e46d8a0\") " pod="openstack/ovn-controller-2mjl7-config-ln6qr"
Mar 11 01:15:38 crc kubenswrapper[4744]: I0311 01:15:38.325377 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/a6bfb5c8-67f1-4cc5-ad35-e62b9e46d8a0-var-run-ovn\") pod \"ovn-controller-2mjl7-config-ln6qr\" (UID: \"a6bfb5c8-67f1-4cc5-ad35-e62b9e46d8a0\") " pod="openstack/ovn-controller-2mjl7-config-ln6qr"
Mar 11 01:15:38 crc kubenswrapper[4744]: I0311 01:15:38.325476 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/a6bfb5c8-67f1-4cc5-ad35-e62b9e46d8a0-var-run\") pod \"ovn-controller-2mjl7-config-ln6qr\" (UID: \"a6bfb5c8-67f1-4cc5-ad35-e62b9e46d8a0\") " pod="openstack/ovn-controller-2mjl7-config-ln6qr"
Mar 11 01:15:38 crc kubenswrapper[4744]: I0311 01:15:38.325664 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/a6bfb5c8-67f1-4cc5-ad35-e62b9e46d8a0-var-log-ovn\") pod \"ovn-controller-2mjl7-config-ln6qr\" (UID: \"a6bfb5c8-67f1-4cc5-ad35-e62b9e46d8a0\") " pod="openstack/ovn-controller-2mjl7-config-ln6qr"
Mar 11 01:15:38 crc kubenswrapper[4744]: I0311 01:15:38.325678 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/a6bfb5c8-67f1-4cc5-ad35-e62b9e46d8a0-var-run-ovn\") pod \"ovn-controller-2mjl7-config-ln6qr\" (UID: \"a6bfb5c8-67f1-4cc5-ad35-e62b9e46d8a0\") " pod="openstack/ovn-controller-2mjl7-config-ln6qr"
Mar 11 01:15:38 crc kubenswrapper[4744]: I0311 01:15:38.325682 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/a6bfb5c8-67f1-4cc5-ad35-e62b9e46d8a0-var-run\") pod \"ovn-controller-2mjl7-config-ln6qr\" (UID: \"a6bfb5c8-67f1-4cc5-ad35-e62b9e46d8a0\") " pod="openstack/ovn-controller-2mjl7-config-ln6qr"
Mar 11 01:15:38 crc kubenswrapper[4744]: I0311 01:15:38.325930 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/a6bfb5c8-67f1-4cc5-ad35-e62b9e46d8a0-additional-scripts\") pod \"ovn-controller-2mjl7-config-ln6qr\" (UID: \"a6bfb5c8-67f1-4cc5-ad35-e62b9e46d8a0\") " pod="openstack/ovn-controller-2mjl7-config-ln6qr"
Mar 11 01:15:38 crc kubenswrapper[4744]: I0311 01:15:38.327665 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/a6bfb5c8-67f1-4cc5-ad35-e62b9e46d8a0-scripts\") pod \"ovn-controller-2mjl7-config-ln6qr\" (UID: \"a6bfb5c8-67f1-4cc5-ad35-e62b9e46d8a0\") " pod="openstack/ovn-controller-2mjl7-config-ln6qr"
Mar 11 01:15:38 crc kubenswrapper[4744]: I0311 01:15:38.344468 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bpk7h\" (UniqueName: \"kubernetes.io/projected/a6bfb5c8-67f1-4cc5-ad35-e62b9e46d8a0-kube-api-access-bpk7h\") pod \"ovn-controller-2mjl7-config-ln6qr\" (UID: \"a6bfb5c8-67f1-4cc5-ad35-e62b9e46d8a0\") " pod="openstack/ovn-controller-2mjl7-config-ln6qr"
Mar 11 01:15:38 crc kubenswrapper[4744]: I0311 01:15:38.483333 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-2mjl7-config-ln6qr"
Mar 11 01:15:39 crc kubenswrapper[4744]: I0311 01:15:39.157122 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-2mjl7-config-ln6qr"]
Mar 11 01:15:39 crc kubenswrapper[4744]: I0311 01:15:39.800386 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-2mjl7-config-ln6qr" event={"ID":"a6bfb5c8-67f1-4cc5-ad35-e62b9e46d8a0","Type":"ContainerStarted","Data":"b58c17c7fbca0afe944a3b0c5abbdbab6799350618ef2c39e295ce53f856dead"}
Mar 11 01:15:40 crc kubenswrapper[4744]: I0311 01:15:40.819585 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"524fac10-b874-465e-b4aa-221b6c689959","Type":"ContainerStarted","Data":"57f15e76688f17356669b85c46315dd0dffc814d43f7f2c52af90d0784301949"}
Mar 11 01:15:40 crc kubenswrapper[4744]: I0311 01:15:40.820267 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"524fac10-b874-465e-b4aa-221b6c689959","Type":"ContainerStarted","Data":"683071c11cad26f904a5f7a36fcb33b138ae8c1681f0a7ff0d1c59caa91adfa5"}
Mar 11 01:15:40 crc kubenswrapper[4744]: I0311 01:15:40.820278 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"524fac10-b874-465e-b4aa-221b6c689959","Type":"ContainerStarted","Data":"7325adf5421afd7c3a21ffc84f42f4176496bf0df0f8bbb48940ea537474b52d"}
Mar 11 01:15:40 crc kubenswrapper[4744]: I0311 01:15:40.820288 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"524fac10-b874-465e-b4aa-221b6c689959","Type":"ContainerStarted","Data":"1dc3103a150112ffa802284194bdc5ad25c73127fe6b5ddb013e5409a1028b69"}
Mar 11 01:15:40 crc kubenswrapper[4744]: I0311 01:15:40.820297 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"524fac10-b874-465e-b4aa-221b6c689959","Type":"ContainerStarted","Data":"98bb403c28de1a3a422f752fd836eeaf91ab8123e3a1415917dfa6a935d3dad7"}
Mar 11 01:15:40 crc kubenswrapper[4744]: I0311 01:15:40.831496 4744 generic.go:334] "Generic (PLEG): container finished" podID="a6bfb5c8-67f1-4cc5-ad35-e62b9e46d8a0" containerID="48d37240544df639a45e04fbd68879b9b5bfc51393609cb1b20a53fa838c61f3" exitCode=0
Mar 11 01:15:40 crc kubenswrapper[4744]: I0311 01:15:40.831558 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-2mjl7-config-ln6qr" event={"ID":"a6bfb5c8-67f1-4cc5-ad35-e62b9e46d8a0","Type":"ContainerDied","Data":"48d37240544df639a45e04fbd68879b9b5bfc51393609cb1b20a53fa838c61f3"}
Mar 11 01:15:41 crc kubenswrapper[4744]: I0311 01:15:41.845430 4744 generic.go:334] "Generic (PLEG): container finished" podID="810da0cb-5013-4997-84ba-4437bce2a20d" containerID="53dabf5222da9591bfd90f74338a980cda9124c6f3b99ab148f16d2ce8f3d265" exitCode=0
Mar 11 01:15:41 crc kubenswrapper[4744]: I0311 01:15:41.845555 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-r8hsb" event={"ID":"810da0cb-5013-4997-84ba-4437bce2a20d","Type":"ContainerDied","Data":"53dabf5222da9591bfd90f74338a980cda9124c6f3b99ab148f16d2ce8f3d265"}
Mar 11 01:15:41 crc kubenswrapper[4744]: I0311 01:15:41.857249 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"524fac10-b874-465e-b4aa-221b6c689959","Type":"ContainerStarted","Data":"25e3543d3b3d14862a73ba6ffdaeec7e7e8cb26ca742becd4587bae22c3b8432"}
Mar 11 01:15:41 crc kubenswrapper[4744]: I0311 01:15:41.857319 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"524fac10-b874-465e-b4aa-221b6c689959","Type":"ContainerStarted","Data":"5bbb97d3f04c59bd0734d58c134b140d749bfb617a14047f4d2be2a03bcf9bd5"}
Mar 11 01:15:41 crc kubenswrapper[4744]: I0311 01:15:41.949798 4744 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-storage-0" podStartSLOduration=19.914841907 podStartE2EDuration="38.949773165s" podCreationTimestamp="2026-03-11 01:15:03 +0000 UTC" firstStartedPulling="2026-03-11 01:15:20.68861833 +0000 UTC m=+1277.492835945" lastFinishedPulling="2026-03-11 01:15:39.723549608 +0000 UTC m=+1296.527767203" observedRunningTime="2026-03-11 01:15:41.932202443 +0000 UTC m=+1298.736420048" watchObservedRunningTime="2026-03-11 01:15:41.949773165 +0000 UTC m=+1298.753990770"
Mar 11 01:15:42 crc kubenswrapper[4744]: I0311 01:15:42.298732 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-58f8d8dcc-n7kcc"]
Mar 11 01:15:42 crc kubenswrapper[4744]: I0311 01:15:42.300009 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-58f8d8dcc-n7kcc"
Mar 11 01:15:42 crc kubenswrapper[4744]: I0311 01:15:42.301342 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns-swift-storage-0"
Mar 11 01:15:42 crc kubenswrapper[4744]: I0311 01:15:42.308118 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-58f8d8dcc-n7kcc"]
Mar 11 01:15:42 crc kubenswrapper[4744]: I0311 01:15:42.349810 4744 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-2mjl7-config-ln6qr"
Mar 11 01:15:42 crc kubenswrapper[4744]: I0311 01:15:42.391905 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/22caad75-2e3e-4fbc-8358-7bab36543b23-dns-swift-storage-0\") pod \"dnsmasq-dns-58f8d8dcc-n7kcc\" (UID: \"22caad75-2e3e-4fbc-8358-7bab36543b23\") " pod="openstack/dnsmasq-dns-58f8d8dcc-n7kcc"
Mar 11 01:15:42 crc kubenswrapper[4744]: I0311 01:15:42.392219 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22caad75-2e3e-4fbc-8358-7bab36543b23-config\") pod \"dnsmasq-dns-58f8d8dcc-n7kcc\" (UID: \"22caad75-2e3e-4fbc-8358-7bab36543b23\") " pod="openstack/dnsmasq-dns-58f8d8dcc-n7kcc"
Mar 11 01:15:42 crc kubenswrapper[4744]: I0311 01:15:42.392242 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/22caad75-2e3e-4fbc-8358-7bab36543b23-ovsdbserver-sb\") pod \"dnsmasq-dns-58f8d8dcc-n7kcc\" (UID: \"22caad75-2e3e-4fbc-8358-7bab36543b23\") " pod="openstack/dnsmasq-dns-58f8d8dcc-n7kcc"
Mar 11 01:15:42 crc kubenswrapper[4744]: I0311 01:15:42.392291 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-thxvc\" (UniqueName: \"kubernetes.io/projected/22caad75-2e3e-4fbc-8358-7bab36543b23-kube-api-access-thxvc\") pod \"dnsmasq-dns-58f8d8dcc-n7kcc\" (UID: \"22caad75-2e3e-4fbc-8358-7bab36543b23\") " pod="openstack/dnsmasq-dns-58f8d8dcc-n7kcc"
Mar 11 01:15:42 crc kubenswrapper[4744]: I0311 01:15:42.392313 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/22caad75-2e3e-4fbc-8358-7bab36543b23-ovsdbserver-nb\") pod \"dnsmasq-dns-58f8d8dcc-n7kcc\" (UID: \"22caad75-2e3e-4fbc-8358-7bab36543b23\") " pod="openstack/dnsmasq-dns-58f8d8dcc-n7kcc"
Mar 11 01:15:42 crc kubenswrapper[4744]: I0311 01:15:42.392379 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/22caad75-2e3e-4fbc-8358-7bab36543b23-dns-svc\") pod \"dnsmasq-dns-58f8d8dcc-n7kcc\" (UID: \"22caad75-2e3e-4fbc-8358-7bab36543b23\") " pod="openstack/dnsmasq-dns-58f8d8dcc-n7kcc"
Mar 11 01:15:42 crc kubenswrapper[4744]: E0311 01:15:42.394985 4744 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0fbc16eb_59fb_4814_b3b7_944573b75d23.slice/crio-conmon-82fa8eb24907701acba9eb6c5271ab16d4109170acbe1d9ecb398258c67986fb.scope\": RecentStats: unable to find data in memory cache]"
Mar 11 01:15:42 crc kubenswrapper[4744]: I0311 01:15:42.492982 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/a6bfb5c8-67f1-4cc5-ad35-e62b9e46d8a0-var-run\") pod \"a6bfb5c8-67f1-4cc5-ad35-e62b9e46d8a0\" (UID: \"a6bfb5c8-67f1-4cc5-ad35-e62b9e46d8a0\") "
Mar 11 01:15:42 crc kubenswrapper[4744]: I0311 01:15:42.493029 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bpk7h\" (UniqueName: \"kubernetes.io/projected/a6bfb5c8-67f1-4cc5-ad35-e62b9e46d8a0-kube-api-access-bpk7h\") pod \"a6bfb5c8-67f1-4cc5-ad35-e62b9e46d8a0\" (UID: \"a6bfb5c8-67f1-4cc5-ad35-e62b9e46d8a0\") "
Mar 11 01:15:42 crc kubenswrapper[4744]: I0311 01:15:42.493060 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/a6bfb5c8-67f1-4cc5-ad35-e62b9e46d8a0-var-run-ovn\") pod \"a6bfb5c8-67f1-4cc5-ad35-e62b9e46d8a0\" (UID: \"a6bfb5c8-67f1-4cc5-ad35-e62b9e46d8a0\") "
Mar 11 01:15:42 crc kubenswrapper[4744]: I0311 01:15:42.493064 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/a6bfb5c8-67f1-4cc5-ad35-e62b9e46d8a0-var-run" (OuterVolumeSpecName: "var-run") pod "a6bfb5c8-67f1-4cc5-ad35-e62b9e46d8a0" (UID: "a6bfb5c8-67f1-4cc5-ad35-e62b9e46d8a0"). InnerVolumeSpecName "var-run". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 11 01:15:42 crc kubenswrapper[4744]: I0311 01:15:42.493096 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/a6bfb5c8-67f1-4cc5-ad35-e62b9e46d8a0-var-log-ovn\") pod \"a6bfb5c8-67f1-4cc5-ad35-e62b9e46d8a0\" (UID: \"a6bfb5c8-67f1-4cc5-ad35-e62b9e46d8a0\") "
Mar 11 01:15:42 crc kubenswrapper[4744]: I0311 01:15:42.493114 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/a6bfb5c8-67f1-4cc5-ad35-e62b9e46d8a0-var-run-ovn" (OuterVolumeSpecName: "var-run-ovn") pod "a6bfb5c8-67f1-4cc5-ad35-e62b9e46d8a0" (UID: "a6bfb5c8-67f1-4cc5-ad35-e62b9e46d8a0"). InnerVolumeSpecName "var-run-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 11 01:15:42 crc kubenswrapper[4744]: I0311 01:15:42.493173 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/a6bfb5c8-67f1-4cc5-ad35-e62b9e46d8a0-additional-scripts\") pod \"a6bfb5c8-67f1-4cc5-ad35-e62b9e46d8a0\" (UID: \"a6bfb5c8-67f1-4cc5-ad35-e62b9e46d8a0\") "
Mar 11 01:15:42 crc kubenswrapper[4744]: I0311 01:15:42.493205 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/a6bfb5c8-67f1-4cc5-ad35-e62b9e46d8a0-var-log-ovn" (OuterVolumeSpecName: "var-log-ovn") pod "a6bfb5c8-67f1-4cc5-ad35-e62b9e46d8a0" (UID: "a6bfb5c8-67f1-4cc5-ad35-e62b9e46d8a0"). InnerVolumeSpecName "var-log-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 11 01:15:42 crc kubenswrapper[4744]: I0311 01:15:42.493222 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/a6bfb5c8-67f1-4cc5-ad35-e62b9e46d8a0-scripts\") pod \"a6bfb5c8-67f1-4cc5-ad35-e62b9e46d8a0\" (UID: \"a6bfb5c8-67f1-4cc5-ad35-e62b9e46d8a0\") "
Mar 11 01:15:42 crc kubenswrapper[4744]: I0311 01:15:42.493453 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-thxvc\" (UniqueName: \"kubernetes.io/projected/22caad75-2e3e-4fbc-8358-7bab36543b23-kube-api-access-thxvc\") pod \"dnsmasq-dns-58f8d8dcc-n7kcc\" (UID: \"22caad75-2e3e-4fbc-8358-7bab36543b23\") " pod="openstack/dnsmasq-dns-58f8d8dcc-n7kcc"
Mar 11 01:15:42 crc kubenswrapper[4744]: I0311 01:15:42.493485 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/22caad75-2e3e-4fbc-8358-7bab36543b23-ovsdbserver-nb\") pod \"dnsmasq-dns-58f8d8dcc-n7kcc\" (UID: \"22caad75-2e3e-4fbc-8358-7bab36543b23\") " pod="openstack/dnsmasq-dns-58f8d8dcc-n7kcc"
Mar 11 01:15:42 crc kubenswrapper[4744]: I0311 01:15:42.493548 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/22caad75-2e3e-4fbc-8358-7bab36543b23-dns-svc\") pod \"dnsmasq-dns-58f8d8dcc-n7kcc\" (UID: \"22caad75-2e3e-4fbc-8358-7bab36543b23\") " pod="openstack/dnsmasq-dns-58f8d8dcc-n7kcc"
Mar 11 01:15:42 crc kubenswrapper[4744]: I0311 01:15:42.493623 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/22caad75-2e3e-4fbc-8358-7bab36543b23-dns-swift-storage-0\") pod \"dnsmasq-dns-58f8d8dcc-n7kcc\" (UID: \"22caad75-2e3e-4fbc-8358-7bab36543b23\") " pod="openstack/dnsmasq-dns-58f8d8dcc-n7kcc"
Mar 11 01:15:42 crc kubenswrapper[4744]: I0311 01:15:42.493655 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22caad75-2e3e-4fbc-8358-7bab36543b23-config\") pod \"dnsmasq-dns-58f8d8dcc-n7kcc\" (UID: \"22caad75-2e3e-4fbc-8358-7bab36543b23\") " pod="openstack/dnsmasq-dns-58f8d8dcc-n7kcc"
Mar 11 01:15:42 crc kubenswrapper[4744]: I0311 01:15:42.493672 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/22caad75-2e3e-4fbc-8358-7bab36543b23-ovsdbserver-sb\") pod \"dnsmasq-dns-58f8d8dcc-n7kcc\" (UID: \"22caad75-2e3e-4fbc-8358-7bab36543b23\") " pod="openstack/dnsmasq-dns-58f8d8dcc-n7kcc"
Mar 11 01:15:42 crc kubenswrapper[4744]: I0311 01:15:42.493722 4744 reconciler_common.go:293] "Volume detached for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/a6bfb5c8-67f1-4cc5-ad35-e62b9e46d8a0-var-run\") on node \"crc\" DevicePath \"\""
Mar 11 01:15:42 crc kubenswrapper[4744]: I0311 01:15:42.493733 4744 reconciler_common.go:293] "Volume detached for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/a6bfb5c8-67f1-4cc5-ad35-e62b9e46d8a0-var-run-ovn\") on node \"crc\" DevicePath \"\""
Mar 11 01:15:42 crc kubenswrapper[4744]: I0311 01:15:42.493741 4744 reconciler_common.go:293] "Volume detached for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/a6bfb5c8-67f1-4cc5-ad35-e62b9e46d8a0-var-log-ovn\") on node \"crc\" DevicePath \"\""
Mar 11 01:15:42 crc kubenswrapper[4744]: I0311 01:15:42.493964 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a6bfb5c8-67f1-4cc5-ad35-e62b9e46d8a0-additional-scripts" (OuterVolumeSpecName: "additional-scripts") pod "a6bfb5c8-67f1-4cc5-ad35-e62b9e46d8a0" (UID: "a6bfb5c8-67f1-4cc5-ad35-e62b9e46d8a0"). InnerVolumeSpecName "additional-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 11 01:15:42 crc kubenswrapper[4744]: I0311 01:15:42.494217 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a6bfb5c8-67f1-4cc5-ad35-e62b9e46d8a0-scripts" (OuterVolumeSpecName: "scripts") pod "a6bfb5c8-67f1-4cc5-ad35-e62b9e46d8a0" (UID: "a6bfb5c8-67f1-4cc5-ad35-e62b9e46d8a0"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 11 01:15:42 crc kubenswrapper[4744]: I0311 01:15:42.494714 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/22caad75-2e3e-4fbc-8358-7bab36543b23-ovsdbserver-sb\") pod \"dnsmasq-dns-58f8d8dcc-n7kcc\" (UID: \"22caad75-2e3e-4fbc-8358-7bab36543b23\") " pod="openstack/dnsmasq-dns-58f8d8dcc-n7kcc"
Mar 11 01:15:42 crc kubenswrapper[4744]: I0311 01:15:42.494827 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/22caad75-2e3e-4fbc-8358-7bab36543b23-ovsdbserver-nb\") pod \"dnsmasq-dns-58f8d8dcc-n7kcc\" (UID: \"22caad75-2e3e-4fbc-8358-7bab36543b23\") " pod="openstack/dnsmasq-dns-58f8d8dcc-n7kcc"
Mar 11 01:15:42 crc kubenswrapper[4744]: I0311 01:15:42.494846 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/22caad75-2e3e-4fbc-8358-7bab36543b23-dns-svc\") pod \"dnsmasq-dns-58f8d8dcc-n7kcc\" (UID: \"22caad75-2e3e-4fbc-8358-7bab36543b23\") " pod="openstack/dnsmasq-dns-58f8d8dcc-n7kcc"
Mar 11 01:15:42 crc kubenswrapper[4744]: I0311 01:15:42.495021 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/22caad75-2e3e-4fbc-8358-7bab36543b23-dns-swift-storage-0\") pod \"dnsmasq-dns-58f8d8dcc-n7kcc\" (UID: \"22caad75-2e3e-4fbc-8358-7bab36543b23\") " pod="openstack/dnsmasq-dns-58f8d8dcc-n7kcc"
Mar 11 01:15:42 crc kubenswrapper[4744]: I0311 01:15:42.495278 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22caad75-2e3e-4fbc-8358-7bab36543b23-config\") pod \"dnsmasq-dns-58f8d8dcc-n7kcc\" (UID: \"22caad75-2e3e-4fbc-8358-7bab36543b23\") " pod="openstack/dnsmasq-dns-58f8d8dcc-n7kcc"
Mar 11 01:15:42 crc kubenswrapper[4744]: I0311 01:15:42.500895 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a6bfb5c8-67f1-4cc5-ad35-e62b9e46d8a0-kube-api-access-bpk7h" (OuterVolumeSpecName: "kube-api-access-bpk7h") pod "a6bfb5c8-67f1-4cc5-ad35-e62b9e46d8a0" (UID: "a6bfb5c8-67f1-4cc5-ad35-e62b9e46d8a0"). InnerVolumeSpecName "kube-api-access-bpk7h". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 11 01:15:42 crc kubenswrapper[4744]: I0311 01:15:42.509691 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-thxvc\" (UniqueName: \"kubernetes.io/projected/22caad75-2e3e-4fbc-8358-7bab36543b23-kube-api-access-thxvc\") pod \"dnsmasq-dns-58f8d8dcc-n7kcc\" (UID: \"22caad75-2e3e-4fbc-8358-7bab36543b23\") " pod="openstack/dnsmasq-dns-58f8d8dcc-n7kcc"
Mar 11 01:15:42 crc kubenswrapper[4744]: I0311 01:15:42.595891 4744 reconciler_common.go:293] "Volume detached for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/a6bfb5c8-67f1-4cc5-ad35-e62b9e46d8a0-additional-scripts\") on node \"crc\" DevicePath \"\""
Mar 11 01:15:42 crc kubenswrapper[4744]: I0311 01:15:42.595946 4744 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/a6bfb5c8-67f1-4cc5-ad35-e62b9e46d8a0-scripts\") on node \"crc\" DevicePath \"\""
Mar 11 01:15:42 crc kubenswrapper[4744]: I0311 01:15:42.595966 4744 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bpk7h\" (UniqueName: \"kubernetes.io/projected/a6bfb5c8-67f1-4cc5-ad35-e62b9e46d8a0-kube-api-access-bpk7h\") on node \"crc\" DevicePath \"\""
Mar 11 01:15:42 crc kubenswrapper[4744]: I0311 01:15:42.660482 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-58f8d8dcc-n7kcc"
Mar 11 01:15:42 crc kubenswrapper[4744]: I0311 01:15:42.872826 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-2mjl7-config-ln6qr" event={"ID":"a6bfb5c8-67f1-4cc5-ad35-e62b9e46d8a0","Type":"ContainerDied","Data":"b58c17c7fbca0afe944a3b0c5abbdbab6799350618ef2c39e295ce53f856dead"}
Mar 11 01:15:42 crc kubenswrapper[4744]: I0311 01:15:42.873124 4744 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-2mjl7-config-ln6qr"
Mar 11 01:15:42 crc kubenswrapper[4744]: I0311 01:15:42.873143 4744 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b58c17c7fbca0afe944a3b0c5abbdbab6799350618ef2c39e295ce53f856dead"
Mar 11 01:15:42 crc kubenswrapper[4744]: I0311 01:15:42.874694 4744 generic.go:334] "Generic (PLEG): container finished" podID="0fbc16eb-59fb-4814-b3b7-944573b75d23" containerID="82fa8eb24907701acba9eb6c5271ab16d4109170acbe1d9ecb398258c67986fb" exitCode=0
Mar 11 01:15:42 crc kubenswrapper[4744]: I0311 01:15:42.874890 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-hhwh7" event={"ID":"0fbc16eb-59fb-4814-b3b7-944573b75d23","Type":"ContainerDied","Data":"82fa8eb24907701acba9eb6c5271ab16d4109170acbe1d9ecb398258c67986fb"}
Mar 11 01:15:43 crc kubenswrapper[4744]: I0311 01:15:43.135554 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-58f8d8dcc-n7kcc"]
Mar 11 01:15:43 crc kubenswrapper[4744]: W0311 01:15:43.141232 4744 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod22caad75_2e3e_4fbc_8358_7bab36543b23.slice/crio-7f8e2b48572f30c647789cdaf2f76dee341d80b14df4f7a97df3aea9e690878d WatchSource:0}: Error finding container 7f8e2b48572f30c647789cdaf2f76dee341d80b14df4f7a97df3aea9e690878d: Status 404 returned error can't find the container with id 7f8e2b48572f30c647789cdaf2f76dee341d80b14df4f7a97df3aea9e690878d
Mar 11 01:15:43 crc kubenswrapper[4744]: I0311 01:15:43.230228 4744 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-r8hsb"
Mar 11 01:15:43 crc kubenswrapper[4744]: I0311 01:15:43.305377 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/810da0cb-5013-4997-84ba-4437bce2a20d-combined-ca-bundle\") pod \"810da0cb-5013-4997-84ba-4437bce2a20d\" (UID: \"810da0cb-5013-4997-84ba-4437bce2a20d\") "
Mar 11 01:15:43 crc kubenswrapper[4744]: I0311 01:15:43.305468 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-m84zj\" (UniqueName: \"kubernetes.io/projected/810da0cb-5013-4997-84ba-4437bce2a20d-kube-api-access-m84zj\") pod \"810da0cb-5013-4997-84ba-4437bce2a20d\" (UID: \"810da0cb-5013-4997-84ba-4437bce2a20d\") "
Mar 11 01:15:43 crc kubenswrapper[4744]: I0311 01:15:43.305504 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/810da0cb-5013-4997-84ba-4437bce2a20d-config-data\") pod \"810da0cb-5013-4997-84ba-4437bce2a20d\" (UID: \"810da0cb-5013-4997-84ba-4437bce2a20d\") "
Mar 11 01:15:43 crc kubenswrapper[4744]: I0311 01:15:43.313868 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/810da0cb-5013-4997-84ba-4437bce2a20d-kube-api-access-m84zj" (OuterVolumeSpecName: "kube-api-access-m84zj") pod "810da0cb-5013-4997-84ba-4437bce2a20d" (UID: "810da0cb-5013-4997-84ba-4437bce2a20d"). InnerVolumeSpecName "kube-api-access-m84zj". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 11 01:15:43 crc kubenswrapper[4744]: I0311 01:15:43.399280 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/810da0cb-5013-4997-84ba-4437bce2a20d-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "810da0cb-5013-4997-84ba-4437bce2a20d" (UID: "810da0cb-5013-4997-84ba-4437bce2a20d"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 11 01:15:43 crc kubenswrapper[4744]: I0311 01:15:43.399814 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/810da0cb-5013-4997-84ba-4437bce2a20d-config-data" (OuterVolumeSpecName: "config-data") pod "810da0cb-5013-4997-84ba-4437bce2a20d" (UID: "810da0cb-5013-4997-84ba-4437bce2a20d"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 11 01:15:43 crc kubenswrapper[4744]: I0311 01:15:43.407374 4744 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-m84zj\" (UniqueName: \"kubernetes.io/projected/810da0cb-5013-4997-84ba-4437bce2a20d-kube-api-access-m84zj\") on node \"crc\" DevicePath \"\""
Mar 11 01:15:43 crc kubenswrapper[4744]: I0311 01:15:43.407403 4744 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/810da0cb-5013-4997-84ba-4437bce2a20d-config-data\") on node \"crc\" DevicePath \"\""
Mar 11 01:15:43 crc kubenswrapper[4744]: I0311 01:15:43.407413 4744 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/810da0cb-5013-4997-84ba-4437bce2a20d-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 11 01:15:43 crc kubenswrapper[4744]: I0311 01:15:43.422801 4744 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-2mjl7-config-ln6qr"]
Mar 11 01:15:43 crc kubenswrapper[4744]: I0311 01:15:43.435246 4744 kubelet.go:2431]
"SyncLoop REMOVE" source="api" pods=["openstack/ovn-controller-2mjl7-config-ln6qr"] Mar 11 01:15:43 crc kubenswrapper[4744]: I0311 01:15:43.886753 4744 generic.go:334] "Generic (PLEG): container finished" podID="22caad75-2e3e-4fbc-8358-7bab36543b23" containerID="2a7db405ac9df3a4c23d4e8c8104569f0851b014eb0c768950c4f5c133c95c40" exitCode=0 Mar 11 01:15:43 crc kubenswrapper[4744]: I0311 01:15:43.886866 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-58f8d8dcc-n7kcc" event={"ID":"22caad75-2e3e-4fbc-8358-7bab36543b23","Type":"ContainerDied","Data":"2a7db405ac9df3a4c23d4e8c8104569f0851b014eb0c768950c4f5c133c95c40"} Mar 11 01:15:43 crc kubenswrapper[4744]: I0311 01:15:43.886899 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-58f8d8dcc-n7kcc" event={"ID":"22caad75-2e3e-4fbc-8358-7bab36543b23","Type":"ContainerStarted","Data":"7f8e2b48572f30c647789cdaf2f76dee341d80b14df4f7a97df3aea9e690878d"} Mar 11 01:15:43 crc kubenswrapper[4744]: I0311 01:15:43.893995 4744 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-sync-r8hsb" Mar 11 01:15:43 crc kubenswrapper[4744]: I0311 01:15:43.895563 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-r8hsb" event={"ID":"810da0cb-5013-4997-84ba-4437bce2a20d","Type":"ContainerDied","Data":"dc4501a53033afaf57b8e26ff67864de843ce8b07e7de7f7514c1ece7ffbe644"} Mar 11 01:15:43 crc kubenswrapper[4744]: I0311 01:15:43.895627 4744 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="dc4501a53033afaf57b8e26ff67864de843ce8b07e7de7f7514c1ece7ffbe644" Mar 11 01:15:43 crc kubenswrapper[4744]: I0311 01:15:43.989106 4744 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a6bfb5c8-67f1-4cc5-ad35-e62b9e46d8a0" path="/var/lib/kubelet/pods/a6bfb5c8-67f1-4cc5-ad35-e62b9e46d8a0/volumes" Mar 11 01:15:44 crc kubenswrapper[4744]: I0311 01:15:44.133720 4744 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-58f8d8dcc-n7kcc"] Mar 11 01:15:44 crc kubenswrapper[4744]: I0311 01:15:44.161909 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-679ccc59c7-vrk9j"] Mar 11 01:15:44 crc kubenswrapper[4744]: E0311 01:15:44.162344 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="810da0cb-5013-4997-84ba-4437bce2a20d" containerName="keystone-db-sync" Mar 11 01:15:44 crc kubenswrapper[4744]: I0311 01:15:44.162366 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="810da0cb-5013-4997-84ba-4437bce2a20d" containerName="keystone-db-sync" Mar 11 01:15:44 crc kubenswrapper[4744]: E0311 01:15:44.162383 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a6bfb5c8-67f1-4cc5-ad35-e62b9e46d8a0" containerName="ovn-config" Mar 11 01:15:44 crc kubenswrapper[4744]: I0311 01:15:44.162391 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="a6bfb5c8-67f1-4cc5-ad35-e62b9e46d8a0" containerName="ovn-config" Mar 11 01:15:44 crc kubenswrapper[4744]: I0311 
01:15:44.162585 4744 memory_manager.go:354] "RemoveStaleState removing state" podUID="a6bfb5c8-67f1-4cc5-ad35-e62b9e46d8a0" containerName="ovn-config" Mar 11 01:15:44 crc kubenswrapper[4744]: I0311 01:15:44.162616 4744 memory_manager.go:354] "RemoveStaleState removing state" podUID="810da0cb-5013-4997-84ba-4437bce2a20d" containerName="keystone-db-sync" Mar 11 01:15:44 crc kubenswrapper[4744]: I0311 01:15:44.163440 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-679ccc59c7-vrk9j" Mar 11 01:15:44 crc kubenswrapper[4744]: I0311 01:15:44.177003 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-679ccc59c7-vrk9j"] Mar 11 01:15:44 crc kubenswrapper[4744]: I0311 01:15:44.198600 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-bootstrap-9ddjj"] Mar 11 01:15:44 crc kubenswrapper[4744]: I0311 01:15:44.200665 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-9ddjj" Mar 11 01:15:44 crc kubenswrapper[4744]: I0311 01:15:44.206888 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Mar 11 01:15:44 crc kubenswrapper[4744]: I0311 01:15:44.207082 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Mar 11 01:15:44 crc kubenswrapper[4744]: I0311 01:15:44.207326 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"osp-secret" Mar 11 01:15:44 crc kubenswrapper[4744]: I0311 01:15:44.207451 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Mar 11 01:15:44 crc kubenswrapper[4744]: I0311 01:15:44.207616 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-r54bc" Mar 11 01:15:44 crc kubenswrapper[4744]: I0311 01:15:44.230349 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/59fa9917-a1d9-4554-afec-483df5e5f3b5-config\") pod \"dnsmasq-dns-679ccc59c7-vrk9j\" (UID: \"59fa9917-a1d9-4554-afec-483df5e5f3b5\") " pod="openstack/dnsmasq-dns-679ccc59c7-vrk9j" Mar 11 01:15:44 crc kubenswrapper[4744]: I0311 01:15:44.230394 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ttnpn\" (UniqueName: \"kubernetes.io/projected/59fa9917-a1d9-4554-afec-483df5e5f3b5-kube-api-access-ttnpn\") pod \"dnsmasq-dns-679ccc59c7-vrk9j\" (UID: \"59fa9917-a1d9-4554-afec-483df5e5f3b5\") " pod="openstack/dnsmasq-dns-679ccc59c7-vrk9j" Mar 11 01:15:44 crc kubenswrapper[4744]: I0311 01:15:44.230420 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/59fa9917-a1d9-4554-afec-483df5e5f3b5-dns-svc\") pod \"dnsmasq-dns-679ccc59c7-vrk9j\" (UID: \"59fa9917-a1d9-4554-afec-483df5e5f3b5\") " pod="openstack/dnsmasq-dns-679ccc59c7-vrk9j" Mar 11 01:15:44 crc kubenswrapper[4744]: I0311 01:15:44.230471 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/59fa9917-a1d9-4554-afec-483df5e5f3b5-ovsdbserver-nb\") pod \"dnsmasq-dns-679ccc59c7-vrk9j\" (UID: \"59fa9917-a1d9-4554-afec-483df5e5f3b5\") " pod="openstack/dnsmasq-dns-679ccc59c7-vrk9j" Mar 11 01:15:44 crc kubenswrapper[4744]: I0311 01:15:44.230496 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/59fa9917-a1d9-4554-afec-483df5e5f3b5-dns-swift-storage-0\") pod \"dnsmasq-dns-679ccc59c7-vrk9j\" (UID: \"59fa9917-a1d9-4554-afec-483df5e5f3b5\") " pod="openstack/dnsmasq-dns-679ccc59c7-vrk9j" Mar 11 01:15:44 crc kubenswrapper[4744]: I0311 01:15:44.230541 4744 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/59fa9917-a1d9-4554-afec-483df5e5f3b5-ovsdbserver-sb\") pod \"dnsmasq-dns-679ccc59c7-vrk9j\" (UID: \"59fa9917-a1d9-4554-afec-483df5e5f3b5\") " pod="openstack/dnsmasq-dns-679ccc59c7-vrk9j" Mar 11 01:15:44 crc kubenswrapper[4744]: I0311 01:15:44.237352 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-9ddjj"] Mar 11 01:15:44 crc kubenswrapper[4744]: I0311 01:15:44.331529 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/59fa9917-a1d9-4554-afec-483df5e5f3b5-ovsdbserver-nb\") pod \"dnsmasq-dns-679ccc59c7-vrk9j\" (UID: \"59fa9917-a1d9-4554-afec-483df5e5f3b5\") " pod="openstack/dnsmasq-dns-679ccc59c7-vrk9j" Mar 11 01:15:44 crc kubenswrapper[4744]: I0311 01:15:44.331781 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/59fa9917-a1d9-4554-afec-483df5e5f3b5-dns-swift-storage-0\") pod \"dnsmasq-dns-679ccc59c7-vrk9j\" (UID: \"59fa9917-a1d9-4554-afec-483df5e5f3b5\") " pod="openstack/dnsmasq-dns-679ccc59c7-vrk9j" Mar 11 01:15:44 crc kubenswrapper[4744]: I0311 01:15:44.331818 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/1f8dfaaa-9a7f-4693-8f45-a4a84123b714-credential-keys\") pod \"keystone-bootstrap-9ddjj\" (UID: \"1f8dfaaa-9a7f-4693-8f45-a4a84123b714\") " pod="openstack/keystone-bootstrap-9ddjj" Mar 11 01:15:44 crc kubenswrapper[4744]: I0311 01:15:44.331834 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/59fa9917-a1d9-4554-afec-483df5e5f3b5-ovsdbserver-sb\") pod \"dnsmasq-dns-679ccc59c7-vrk9j\" (UID: 
\"59fa9917-a1d9-4554-afec-483df5e5f3b5\") " pod="openstack/dnsmasq-dns-679ccc59c7-vrk9j" Mar 11 01:15:44 crc kubenswrapper[4744]: I0311 01:15:44.331856 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/1f8dfaaa-9a7f-4693-8f45-a4a84123b714-fernet-keys\") pod \"keystone-bootstrap-9ddjj\" (UID: \"1f8dfaaa-9a7f-4693-8f45-a4a84123b714\") " pod="openstack/keystone-bootstrap-9ddjj" Mar 11 01:15:44 crc kubenswrapper[4744]: I0311 01:15:44.331899 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/59fa9917-a1d9-4554-afec-483df5e5f3b5-config\") pod \"dnsmasq-dns-679ccc59c7-vrk9j\" (UID: \"59fa9917-a1d9-4554-afec-483df5e5f3b5\") " pod="openstack/dnsmasq-dns-679ccc59c7-vrk9j" Mar 11 01:15:44 crc kubenswrapper[4744]: I0311 01:15:44.331918 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1f8dfaaa-9a7f-4693-8f45-a4a84123b714-config-data\") pod \"keystone-bootstrap-9ddjj\" (UID: \"1f8dfaaa-9a7f-4693-8f45-a4a84123b714\") " pod="openstack/keystone-bootstrap-9ddjj" Mar 11 01:15:44 crc kubenswrapper[4744]: I0311 01:15:44.331934 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ttnpn\" (UniqueName: \"kubernetes.io/projected/59fa9917-a1d9-4554-afec-483df5e5f3b5-kube-api-access-ttnpn\") pod \"dnsmasq-dns-679ccc59c7-vrk9j\" (UID: \"59fa9917-a1d9-4554-afec-483df5e5f3b5\") " pod="openstack/dnsmasq-dns-679ccc59c7-vrk9j" Mar 11 01:15:44 crc kubenswrapper[4744]: I0311 01:15:44.331956 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1f8dfaaa-9a7f-4693-8f45-a4a84123b714-combined-ca-bundle\") pod \"keystone-bootstrap-9ddjj\" (UID: 
\"1f8dfaaa-9a7f-4693-8f45-a4a84123b714\") " pod="openstack/keystone-bootstrap-9ddjj" Mar 11 01:15:44 crc kubenswrapper[4744]: I0311 01:15:44.331974 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/59fa9917-a1d9-4554-afec-483df5e5f3b5-dns-svc\") pod \"dnsmasq-dns-679ccc59c7-vrk9j\" (UID: \"59fa9917-a1d9-4554-afec-483df5e5f3b5\") " pod="openstack/dnsmasq-dns-679ccc59c7-vrk9j" Mar 11 01:15:44 crc kubenswrapper[4744]: I0311 01:15:44.332003 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1f8dfaaa-9a7f-4693-8f45-a4a84123b714-scripts\") pod \"keystone-bootstrap-9ddjj\" (UID: \"1f8dfaaa-9a7f-4693-8f45-a4a84123b714\") " pod="openstack/keystone-bootstrap-9ddjj" Mar 11 01:15:44 crc kubenswrapper[4744]: I0311 01:15:44.332029 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cfxkw\" (UniqueName: \"kubernetes.io/projected/1f8dfaaa-9a7f-4693-8f45-a4a84123b714-kube-api-access-cfxkw\") pod \"keystone-bootstrap-9ddjj\" (UID: \"1f8dfaaa-9a7f-4693-8f45-a4a84123b714\") " pod="openstack/keystone-bootstrap-9ddjj" Mar 11 01:15:44 crc kubenswrapper[4744]: I0311 01:15:44.332373 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/59fa9917-a1d9-4554-afec-483df5e5f3b5-ovsdbserver-nb\") pod \"dnsmasq-dns-679ccc59c7-vrk9j\" (UID: \"59fa9917-a1d9-4554-afec-483df5e5f3b5\") " pod="openstack/dnsmasq-dns-679ccc59c7-vrk9j" Mar 11 01:15:44 crc kubenswrapper[4744]: I0311 01:15:44.332833 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/59fa9917-a1d9-4554-afec-483df5e5f3b5-config\") pod \"dnsmasq-dns-679ccc59c7-vrk9j\" (UID: \"59fa9917-a1d9-4554-afec-483df5e5f3b5\") " 
pod="openstack/dnsmasq-dns-679ccc59c7-vrk9j" Mar 11 01:15:44 crc kubenswrapper[4744]: I0311 01:15:44.332999 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/59fa9917-a1d9-4554-afec-483df5e5f3b5-dns-svc\") pod \"dnsmasq-dns-679ccc59c7-vrk9j\" (UID: \"59fa9917-a1d9-4554-afec-483df5e5f3b5\") " pod="openstack/dnsmasq-dns-679ccc59c7-vrk9j" Mar 11 01:15:44 crc kubenswrapper[4744]: I0311 01:15:44.333023 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/59fa9917-a1d9-4554-afec-483df5e5f3b5-dns-swift-storage-0\") pod \"dnsmasq-dns-679ccc59c7-vrk9j\" (UID: \"59fa9917-a1d9-4554-afec-483df5e5f3b5\") " pod="openstack/dnsmasq-dns-679ccc59c7-vrk9j" Mar 11 01:15:44 crc kubenswrapper[4744]: I0311 01:15:44.333393 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/59fa9917-a1d9-4554-afec-483df5e5f3b5-ovsdbserver-sb\") pod \"dnsmasq-dns-679ccc59c7-vrk9j\" (UID: \"59fa9917-a1d9-4554-afec-483df5e5f3b5\") " pod="openstack/dnsmasq-dns-679ccc59c7-vrk9j" Mar 11 01:15:44 crc kubenswrapper[4744]: I0311 01:15:44.353739 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ttnpn\" (UniqueName: \"kubernetes.io/projected/59fa9917-a1d9-4554-afec-483df5e5f3b5-kube-api-access-ttnpn\") pod \"dnsmasq-dns-679ccc59c7-vrk9j\" (UID: \"59fa9917-a1d9-4554-afec-483df5e5f3b5\") " pod="openstack/dnsmasq-dns-679ccc59c7-vrk9j" Mar 11 01:15:44 crc kubenswrapper[4744]: I0311 01:15:44.374187 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-db-sync-hqbzq"] Mar 11 01:15:44 crc kubenswrapper[4744]: I0311 01:15:44.375187 4744 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-sync-hqbzq" Mar 11 01:15:44 crc kubenswrapper[4744]: I0311 01:15:44.379106 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-cinder-dockercfg-nwhd4" Mar 11 01:15:44 crc kubenswrapper[4744]: I0311 01:15:44.379334 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scripts" Mar 11 01:15:44 crc kubenswrapper[4744]: I0311 01:15:44.383541 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-config-data" Mar 11 01:15:44 crc kubenswrapper[4744]: I0311 01:15:44.395039 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-sync-hqbzq"] Mar 11 01:15:44 crc kubenswrapper[4744]: I0311 01:15:44.421616 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-db-sync-g9cnb"] Mar 11 01:15:44 crc kubenswrapper[4744]: I0311 01:15:44.422674 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-sync-g9cnb"] Mar 11 01:15:44 crc kubenswrapper[4744]: I0311 01:15:44.422753 4744 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-sync-g9cnb" Mar 11 01:15:44 crc kubenswrapper[4744]: I0311 01:15:44.425136 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-neutron-dockercfg-mhb2h" Mar 11 01:15:44 crc kubenswrapper[4744]: I0311 01:15:44.425337 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-config" Mar 11 01:15:44 crc kubenswrapper[4744]: I0311 01:15:44.425490 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-httpd-config" Mar 11 01:15:44 crc kubenswrapper[4744]: I0311 01:15:44.433029 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1f8dfaaa-9a7f-4693-8f45-a4a84123b714-config-data\") pod \"keystone-bootstrap-9ddjj\" (UID: \"1f8dfaaa-9a7f-4693-8f45-a4a84123b714\") " pod="openstack/keystone-bootstrap-9ddjj" Mar 11 01:15:44 crc kubenswrapper[4744]: I0311 01:15:44.433067 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1f8dfaaa-9a7f-4693-8f45-a4a84123b714-combined-ca-bundle\") pod \"keystone-bootstrap-9ddjj\" (UID: \"1f8dfaaa-9a7f-4693-8f45-a4a84123b714\") " pod="openstack/keystone-bootstrap-9ddjj" Mar 11 01:15:44 crc kubenswrapper[4744]: I0311 01:15:44.433280 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1f8dfaaa-9a7f-4693-8f45-a4a84123b714-scripts\") pod \"keystone-bootstrap-9ddjj\" (UID: \"1f8dfaaa-9a7f-4693-8f45-a4a84123b714\") " pod="openstack/keystone-bootstrap-9ddjj" Mar 11 01:15:44 crc kubenswrapper[4744]: I0311 01:15:44.433306 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cfxkw\" (UniqueName: \"kubernetes.io/projected/1f8dfaaa-9a7f-4693-8f45-a4a84123b714-kube-api-access-cfxkw\") pod \"keystone-bootstrap-9ddjj\" (UID: 
\"1f8dfaaa-9a7f-4693-8f45-a4a84123b714\") " pod="openstack/keystone-bootstrap-9ddjj" Mar 11 01:15:44 crc kubenswrapper[4744]: I0311 01:15:44.433357 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/1f8dfaaa-9a7f-4693-8f45-a4a84123b714-credential-keys\") pod \"keystone-bootstrap-9ddjj\" (UID: \"1f8dfaaa-9a7f-4693-8f45-a4a84123b714\") " pod="openstack/keystone-bootstrap-9ddjj" Mar 11 01:15:44 crc kubenswrapper[4744]: I0311 01:15:44.433381 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/1f8dfaaa-9a7f-4693-8f45-a4a84123b714-fernet-keys\") pod \"keystone-bootstrap-9ddjj\" (UID: \"1f8dfaaa-9a7f-4693-8f45-a4a84123b714\") " pod="openstack/keystone-bootstrap-9ddjj" Mar 11 01:15:44 crc kubenswrapper[4744]: I0311 01:15:44.439433 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1f8dfaaa-9a7f-4693-8f45-a4a84123b714-config-data\") pod \"keystone-bootstrap-9ddjj\" (UID: \"1f8dfaaa-9a7f-4693-8f45-a4a84123b714\") " pod="openstack/keystone-bootstrap-9ddjj" Mar 11 01:15:44 crc kubenswrapper[4744]: I0311 01:15:44.442693 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1f8dfaaa-9a7f-4693-8f45-a4a84123b714-scripts\") pod \"keystone-bootstrap-9ddjj\" (UID: \"1f8dfaaa-9a7f-4693-8f45-a4a84123b714\") " pod="openstack/keystone-bootstrap-9ddjj" Mar 11 01:15:44 crc kubenswrapper[4744]: I0311 01:15:44.443087 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1f8dfaaa-9a7f-4693-8f45-a4a84123b714-combined-ca-bundle\") pod \"keystone-bootstrap-9ddjj\" (UID: \"1f8dfaaa-9a7f-4693-8f45-a4a84123b714\") " pod="openstack/keystone-bootstrap-9ddjj" Mar 11 01:15:44 crc kubenswrapper[4744]: I0311 01:15:44.443184 
4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/1f8dfaaa-9a7f-4693-8f45-a4a84123b714-fernet-keys\") pod \"keystone-bootstrap-9ddjj\" (UID: \"1f8dfaaa-9a7f-4693-8f45-a4a84123b714\") " pod="openstack/keystone-bootstrap-9ddjj" Mar 11 01:15:44 crc kubenswrapper[4744]: I0311 01:15:44.456543 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/1f8dfaaa-9a7f-4693-8f45-a4a84123b714-credential-keys\") pod \"keystone-bootstrap-9ddjj\" (UID: \"1f8dfaaa-9a7f-4693-8f45-a4a84123b714\") " pod="openstack/keystone-bootstrap-9ddjj" Mar 11 01:15:44 crc kubenswrapper[4744]: I0311 01:15:44.473464 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cfxkw\" (UniqueName: \"kubernetes.io/projected/1f8dfaaa-9a7f-4693-8f45-a4a84123b714-kube-api-access-cfxkw\") pod \"keystone-bootstrap-9ddjj\" (UID: \"1f8dfaaa-9a7f-4693-8f45-a4a84123b714\") " pod="openstack/keystone-bootstrap-9ddjj" Mar 11 01:15:44 crc kubenswrapper[4744]: I0311 01:15:44.476161 4744 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-sync-hhwh7" Mar 11 01:15:44 crc kubenswrapper[4744]: I0311 01:15:44.500250 4744 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-679ccc59c7-vrk9j" Mar 11 01:15:44 crc kubenswrapper[4744]: I0311 01:15:44.507802 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-db-sync-vkz2r"] Mar 11 01:15:44 crc kubenswrapper[4744]: E0311 01:15:44.508350 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0fbc16eb-59fb-4814-b3b7-944573b75d23" containerName="glance-db-sync" Mar 11 01:15:44 crc kubenswrapper[4744]: I0311 01:15:44.508366 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="0fbc16eb-59fb-4814-b3b7-944573b75d23" containerName="glance-db-sync" Mar 11 01:15:44 crc kubenswrapper[4744]: I0311 01:15:44.508532 4744 memory_manager.go:354] "RemoveStaleState removing state" podUID="0fbc16eb-59fb-4814-b3b7-944573b75d23" containerName="glance-db-sync" Mar 11 01:15:44 crc kubenswrapper[4744]: I0311 01:15:44.509050 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-sync-vkz2r" Mar 11 01:15:44 crc kubenswrapper[4744]: I0311 01:15:44.511716 4744 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-679ccc59c7-vrk9j"] Mar 11 01:15:44 crc kubenswrapper[4744]: I0311 01:15:44.528981 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-config-data" Mar 11 01:15:44 crc kubenswrapper[4744]: I0311 01:15:44.529157 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-barbican-dockercfg-5ssj5" Mar 11 01:15:44 crc kubenswrapper[4744]: I0311 01:15:44.536591 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5df37d98-3dbc-4977-add0-525bda3d679b-combined-ca-bundle\") pod \"cinder-db-sync-hqbzq\" (UID: \"5df37d98-3dbc-4977-add0-525bda3d679b\") " pod="openstack/cinder-db-sync-hqbzq" Mar 11 01:15:44 crc kubenswrapper[4744]: I0311 01:15:44.536644 4744 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/b09baca5-8198-404f-8b8d-8f58db34f975-config\") pod \"neutron-db-sync-g9cnb\" (UID: \"b09baca5-8198-404f-8b8d-8f58db34f975\") " pod="openstack/neutron-db-sync-g9cnb" Mar 11 01:15:44 crc kubenswrapper[4744]: I0311 01:15:44.536665 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b09baca5-8198-404f-8b8d-8f58db34f975-combined-ca-bundle\") pod \"neutron-db-sync-g9cnb\" (UID: \"b09baca5-8198-404f-8b8d-8f58db34f975\") " pod="openstack/neutron-db-sync-g9cnb" Mar 11 01:15:44 crc kubenswrapper[4744]: I0311 01:15:44.536694 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6gc2c\" (UniqueName: \"kubernetes.io/projected/5df37d98-3dbc-4977-add0-525bda3d679b-kube-api-access-6gc2c\") pod \"cinder-db-sync-hqbzq\" (UID: \"5df37d98-3dbc-4977-add0-525bda3d679b\") " pod="openstack/cinder-db-sync-hqbzq" Mar 11 01:15:44 crc kubenswrapper[4744]: I0311 01:15:44.536738 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/5df37d98-3dbc-4977-add0-525bda3d679b-etc-machine-id\") pod \"cinder-db-sync-hqbzq\" (UID: \"5df37d98-3dbc-4977-add0-525bda3d679b\") " pod="openstack/cinder-db-sync-hqbzq" Mar 11 01:15:44 crc kubenswrapper[4744]: I0311 01:15:44.536758 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5df37d98-3dbc-4977-add0-525bda3d679b-scripts\") pod \"cinder-db-sync-hqbzq\" (UID: \"5df37d98-3dbc-4977-add0-525bda3d679b\") " pod="openstack/cinder-db-sync-hqbzq" Mar 11 01:15:44 crc kubenswrapper[4744]: I0311 01:15:44.536776 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/5df37d98-3dbc-4977-add0-525bda3d679b-db-sync-config-data\") pod \"cinder-db-sync-hqbzq\" (UID: \"5df37d98-3dbc-4977-add0-525bda3d679b\") " pod="openstack/cinder-db-sync-hqbzq" Mar 11 01:15:44 crc kubenswrapper[4744]: I0311 01:15:44.536797 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lr9gk\" (UniqueName: \"kubernetes.io/projected/b09baca5-8198-404f-8b8d-8f58db34f975-kube-api-access-lr9gk\") pod \"neutron-db-sync-g9cnb\" (UID: \"b09baca5-8198-404f-8b8d-8f58db34f975\") " pod="openstack/neutron-db-sync-g9cnb" Mar 11 01:15:44 crc kubenswrapper[4744]: I0311 01:15:44.536815 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5df37d98-3dbc-4977-add0-525bda3d679b-config-data\") pod \"cinder-db-sync-hqbzq\" (UID: \"5df37d98-3dbc-4977-add0-525bda3d679b\") " pod="openstack/cinder-db-sync-hqbzq" Mar 11 01:15:44 crc kubenswrapper[4744]: I0311 01:15:44.542072 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-sync-vkz2r"] Mar 11 01:15:44 crc kubenswrapper[4744]: I0311 01:15:44.548873 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-9ddjj" Mar 11 01:15:44 crc kubenswrapper[4744]: I0311 01:15:44.568450 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-db-sync-qkr98"] Mar 11 01:15:44 crc kubenswrapper[4744]: I0311 01:15:44.569405 4744 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-sync-qkr98" Mar 11 01:15:44 crc kubenswrapper[4744]: I0311 01:15:44.594391 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-scripts" Mar 11 01:15:44 crc kubenswrapper[4744]: I0311 01:15:44.595525 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-placement-dockercfg-td6zl" Mar 11 01:15:44 crc kubenswrapper[4744]: I0311 01:15:44.595756 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-config-data" Mar 11 01:15:44 crc kubenswrapper[4744]: I0311 01:15:44.639596 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-86cdbd9bcc-vrldq"] Mar 11 01:15:44 crc kubenswrapper[4744]: I0311 01:15:44.640113 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0fbc16eb-59fb-4814-b3b7-944573b75d23-config-data\") pod \"0fbc16eb-59fb-4814-b3b7-944573b75d23\" (UID: \"0fbc16eb-59fb-4814-b3b7-944573b75d23\") " Mar 11 01:15:44 crc kubenswrapper[4744]: I0311 01:15:44.640237 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0fbc16eb-59fb-4814-b3b7-944573b75d23-combined-ca-bundle\") pod \"0fbc16eb-59fb-4814-b3b7-944573b75d23\" (UID: \"0fbc16eb-59fb-4814-b3b7-944573b75d23\") " Mar 11 01:15:44 crc kubenswrapper[4744]: I0311 01:15:44.640266 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tzl7v\" (UniqueName: \"kubernetes.io/projected/0fbc16eb-59fb-4814-b3b7-944573b75d23-kube-api-access-tzl7v\") pod \"0fbc16eb-59fb-4814-b3b7-944573b75d23\" (UID: \"0fbc16eb-59fb-4814-b3b7-944573b75d23\") " Mar 11 01:15:44 crc kubenswrapper[4744]: I0311 01:15:44.640296 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: 
\"kubernetes.io/secret/0fbc16eb-59fb-4814-b3b7-944573b75d23-db-sync-config-data\") pod \"0fbc16eb-59fb-4814-b3b7-944573b75d23\" (UID: \"0fbc16eb-59fb-4814-b3b7-944573b75d23\") " Mar 11 01:15:44 crc kubenswrapper[4744]: I0311 01:15:44.640562 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/64205e5d-2853-49f7-9928-8362fc9210ea-db-sync-config-data\") pod \"barbican-db-sync-vkz2r\" (UID: \"64205e5d-2853-49f7-9928-8362fc9210ea\") " pod="openstack/barbican-db-sync-vkz2r" Mar 11 01:15:44 crc kubenswrapper[4744]: I0311 01:15:44.640584 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/93f326d0-b9cf-4124-b5e1-ce8e1c50b9c8-config-data\") pod \"placement-db-sync-qkr98\" (UID: \"93f326d0-b9cf-4124-b5e1-ce8e1c50b9c8\") " pod="openstack/placement-db-sync-qkr98" Mar 11 01:15:44 crc kubenswrapper[4744]: I0311 01:15:44.640609 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5df37d98-3dbc-4977-add0-525bda3d679b-combined-ca-bundle\") pod \"cinder-db-sync-hqbzq\" (UID: \"5df37d98-3dbc-4977-add0-525bda3d679b\") " pod="openstack/cinder-db-sync-hqbzq" Mar 11 01:15:44 crc kubenswrapper[4744]: I0311 01:15:44.640648 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/b09baca5-8198-404f-8b8d-8f58db34f975-config\") pod \"neutron-db-sync-g9cnb\" (UID: \"b09baca5-8198-404f-8b8d-8f58db34f975\") " pod="openstack/neutron-db-sync-g9cnb" Mar 11 01:15:44 crc kubenswrapper[4744]: I0311 01:15:44.640670 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b09baca5-8198-404f-8b8d-8f58db34f975-combined-ca-bundle\") pod \"neutron-db-sync-g9cnb\" 
(UID: \"b09baca5-8198-404f-8b8d-8f58db34f975\") " pod="openstack/neutron-db-sync-g9cnb" Mar 11 01:15:44 crc kubenswrapper[4744]: I0311 01:15:44.640687 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6jdcj\" (UniqueName: \"kubernetes.io/projected/64205e5d-2853-49f7-9928-8362fc9210ea-kube-api-access-6jdcj\") pod \"barbican-db-sync-vkz2r\" (UID: \"64205e5d-2853-49f7-9928-8362fc9210ea\") " pod="openstack/barbican-db-sync-vkz2r" Mar 11 01:15:44 crc kubenswrapper[4744]: I0311 01:15:44.640718 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6gc2c\" (UniqueName: \"kubernetes.io/projected/5df37d98-3dbc-4977-add0-525bda3d679b-kube-api-access-6gc2c\") pod \"cinder-db-sync-hqbzq\" (UID: \"5df37d98-3dbc-4977-add0-525bda3d679b\") " pod="openstack/cinder-db-sync-hqbzq" Mar 11 01:15:44 crc kubenswrapper[4744]: I0311 01:15:44.640735 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/93f326d0-b9cf-4124-b5e1-ce8e1c50b9c8-scripts\") pod \"placement-db-sync-qkr98\" (UID: \"93f326d0-b9cf-4124-b5e1-ce8e1c50b9c8\") " pod="openstack/placement-db-sync-qkr98" Mar 11 01:15:44 crc kubenswrapper[4744]: I0311 01:15:44.640758 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/93f326d0-b9cf-4124-b5e1-ce8e1c50b9c8-combined-ca-bundle\") pod \"placement-db-sync-qkr98\" (UID: \"93f326d0-b9cf-4124-b5e1-ce8e1c50b9c8\") " pod="openstack/placement-db-sync-qkr98" Mar 11 01:15:44 crc kubenswrapper[4744]: I0311 01:15:44.640796 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/5df37d98-3dbc-4977-add0-525bda3d679b-etc-machine-id\") pod \"cinder-db-sync-hqbzq\" (UID: 
\"5df37d98-3dbc-4977-add0-525bda3d679b\") " pod="openstack/cinder-db-sync-hqbzq" Mar 11 01:15:44 crc kubenswrapper[4744]: I0311 01:15:44.640817 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5df37d98-3dbc-4977-add0-525bda3d679b-scripts\") pod \"cinder-db-sync-hqbzq\" (UID: \"5df37d98-3dbc-4977-add0-525bda3d679b\") " pod="openstack/cinder-db-sync-hqbzq" Mar 11 01:15:44 crc kubenswrapper[4744]: I0311 01:15:44.640834 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/93f326d0-b9cf-4124-b5e1-ce8e1c50b9c8-logs\") pod \"placement-db-sync-qkr98\" (UID: \"93f326d0-b9cf-4124-b5e1-ce8e1c50b9c8\") " pod="openstack/placement-db-sync-qkr98" Mar 11 01:15:44 crc kubenswrapper[4744]: I0311 01:15:44.640856 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/5df37d98-3dbc-4977-add0-525bda3d679b-db-sync-config-data\") pod \"cinder-db-sync-hqbzq\" (UID: \"5df37d98-3dbc-4977-add0-525bda3d679b\") " pod="openstack/cinder-db-sync-hqbzq" Mar 11 01:15:44 crc kubenswrapper[4744]: I0311 01:15:44.640875 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lr9gk\" (UniqueName: \"kubernetes.io/projected/b09baca5-8198-404f-8b8d-8f58db34f975-kube-api-access-lr9gk\") pod \"neutron-db-sync-g9cnb\" (UID: \"b09baca5-8198-404f-8b8d-8f58db34f975\") " pod="openstack/neutron-db-sync-g9cnb" Mar 11 01:15:44 crc kubenswrapper[4744]: I0311 01:15:44.640892 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5df37d98-3dbc-4977-add0-525bda3d679b-config-data\") pod \"cinder-db-sync-hqbzq\" (UID: \"5df37d98-3dbc-4977-add0-525bda3d679b\") " pod="openstack/cinder-db-sync-hqbzq" Mar 11 01:15:44 crc kubenswrapper[4744]: I0311 
01:15:44.640908 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zsk54\" (UniqueName: \"kubernetes.io/projected/93f326d0-b9cf-4124-b5e1-ce8e1c50b9c8-kube-api-access-zsk54\") pod \"placement-db-sync-qkr98\" (UID: \"93f326d0-b9cf-4124-b5e1-ce8e1c50b9c8\") " pod="openstack/placement-db-sync-qkr98" Mar 11 01:15:44 crc kubenswrapper[4744]: I0311 01:15:44.640934 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/64205e5d-2853-49f7-9928-8362fc9210ea-combined-ca-bundle\") pod \"barbican-db-sync-vkz2r\" (UID: \"64205e5d-2853-49f7-9928-8362fc9210ea\") " pod="openstack/barbican-db-sync-vkz2r" Mar 11 01:15:44 crc kubenswrapper[4744]: I0311 01:15:44.649789 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-86cdbd9bcc-vrldq" Mar 11 01:15:44 crc kubenswrapper[4744]: I0311 01:15:44.654138 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/5df37d98-3dbc-4977-add0-525bda3d679b-etc-machine-id\") pod \"cinder-db-sync-hqbzq\" (UID: \"5df37d98-3dbc-4977-add0-525bda3d679b\") " pod="openstack/cinder-db-sync-hqbzq" Mar 11 01:15:44 crc kubenswrapper[4744]: I0311 01:15:44.661908 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-sync-qkr98"] Mar 11 01:15:44 crc kubenswrapper[4744]: I0311 01:15:44.674313 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5df37d98-3dbc-4977-add0-525bda3d679b-config-data\") pod \"cinder-db-sync-hqbzq\" (UID: \"5df37d98-3dbc-4977-add0-525bda3d679b\") " pod="openstack/cinder-db-sync-hqbzq" Mar 11 01:15:44 crc kubenswrapper[4744]: I0311 01:15:44.678722 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: 
\"kubernetes.io/secret/b09baca5-8198-404f-8b8d-8f58db34f975-config\") pod \"neutron-db-sync-g9cnb\" (UID: \"b09baca5-8198-404f-8b8d-8f58db34f975\") " pod="openstack/neutron-db-sync-g9cnb" Mar 11 01:15:44 crc kubenswrapper[4744]: I0311 01:15:44.680901 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5df37d98-3dbc-4977-add0-525bda3d679b-scripts\") pod \"cinder-db-sync-hqbzq\" (UID: \"5df37d98-3dbc-4977-add0-525bda3d679b\") " pod="openstack/cinder-db-sync-hqbzq" Mar 11 01:15:44 crc kubenswrapper[4744]: I0311 01:15:44.684421 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5df37d98-3dbc-4977-add0-525bda3d679b-combined-ca-bundle\") pod \"cinder-db-sync-hqbzq\" (UID: \"5df37d98-3dbc-4977-add0-525bda3d679b\") " pod="openstack/cinder-db-sync-hqbzq" Mar 11 01:15:44 crc kubenswrapper[4744]: I0311 01:15:44.709478 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b09baca5-8198-404f-8b8d-8f58db34f975-combined-ca-bundle\") pod \"neutron-db-sync-g9cnb\" (UID: \"b09baca5-8198-404f-8b8d-8f58db34f975\") " pod="openstack/neutron-db-sync-g9cnb" Mar 11 01:15:44 crc kubenswrapper[4744]: I0311 01:15:44.711703 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/5df37d98-3dbc-4977-add0-525bda3d679b-db-sync-config-data\") pod \"cinder-db-sync-hqbzq\" (UID: \"5df37d98-3dbc-4977-add0-525bda3d679b\") " pod="openstack/cinder-db-sync-hqbzq" Mar 11 01:15:44 crc kubenswrapper[4744]: I0311 01:15:44.714712 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0fbc16eb-59fb-4814-b3b7-944573b75d23-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "0fbc16eb-59fb-4814-b3b7-944573b75d23" (UID: 
"0fbc16eb-59fb-4814-b3b7-944573b75d23"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 01:15:44 crc kubenswrapper[4744]: I0311 01:15:44.716362 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lr9gk\" (UniqueName: \"kubernetes.io/projected/b09baca5-8198-404f-8b8d-8f58db34f975-kube-api-access-lr9gk\") pod \"neutron-db-sync-g9cnb\" (UID: \"b09baca5-8198-404f-8b8d-8f58db34f975\") " pod="openstack/neutron-db-sync-g9cnb" Mar 11 01:15:44 crc kubenswrapper[4744]: I0311 01:15:44.719868 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0fbc16eb-59fb-4814-b3b7-944573b75d23-kube-api-access-tzl7v" (OuterVolumeSpecName: "kube-api-access-tzl7v") pod "0fbc16eb-59fb-4814-b3b7-944573b75d23" (UID: "0fbc16eb-59fb-4814-b3b7-944573b75d23"). InnerVolumeSpecName "kube-api-access-tzl7v". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 01:15:44 crc kubenswrapper[4744]: I0311 01:15:44.721663 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Mar 11 01:15:44 crc kubenswrapper[4744]: I0311 01:15:44.731846 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 11 01:15:44 crc kubenswrapper[4744]: I0311 01:15:44.737661 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0fbc16eb-59fb-4814-b3b7-944573b75d23-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "0fbc16eb-59fb-4814-b3b7-944573b75d23" (UID: "0fbc16eb-59fb-4814-b3b7-944573b75d23"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 01:15:44 crc kubenswrapper[4744]: I0311 01:15:44.738231 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6gc2c\" (UniqueName: \"kubernetes.io/projected/5df37d98-3dbc-4977-add0-525bda3d679b-kube-api-access-6gc2c\") pod \"cinder-db-sync-hqbzq\" (UID: \"5df37d98-3dbc-4977-add0-525bda3d679b\") " pod="openstack/cinder-db-sync-hqbzq" Mar 11 01:15:44 crc kubenswrapper[4744]: I0311 01:15:44.740433 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Mar 11 01:15:44 crc kubenswrapper[4744]: I0311 01:15:44.740526 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Mar 11 01:15:44 crc kubenswrapper[4744]: I0311 01:15:44.745109 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/3bd4d017-1f55-468a-8a09-472f77929440-dns-swift-storage-0\") pod \"dnsmasq-dns-86cdbd9bcc-vrldq\" (UID: \"3bd4d017-1f55-468a-8a09-472f77929440\") " pod="openstack/dnsmasq-dns-86cdbd9bcc-vrldq" Mar 11 01:15:44 crc kubenswrapper[4744]: I0311 01:15:44.748341 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v57c2\" (UniqueName: \"kubernetes.io/projected/3bd4d017-1f55-468a-8a09-472f77929440-kube-api-access-v57c2\") pod \"dnsmasq-dns-86cdbd9bcc-vrldq\" (UID: \"3bd4d017-1f55-468a-8a09-472f77929440\") " pod="openstack/dnsmasq-dns-86cdbd9bcc-vrldq" Mar 11 01:15:44 crc kubenswrapper[4744]: I0311 01:15:44.748460 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6jdcj\" (UniqueName: \"kubernetes.io/projected/64205e5d-2853-49f7-9928-8362fc9210ea-kube-api-access-6jdcj\") pod \"barbican-db-sync-vkz2r\" (UID: \"64205e5d-2853-49f7-9928-8362fc9210ea\") " pod="openstack/barbican-db-sync-vkz2r" Mar 11 
01:15:44 crc kubenswrapper[4744]: I0311 01:15:44.748575 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/3bd4d017-1f55-468a-8a09-472f77929440-ovsdbserver-sb\") pod \"dnsmasq-dns-86cdbd9bcc-vrldq\" (UID: \"3bd4d017-1f55-468a-8a09-472f77929440\") " pod="openstack/dnsmasq-dns-86cdbd9bcc-vrldq" Mar 11 01:15:44 crc kubenswrapper[4744]: I0311 01:15:44.748665 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/93f326d0-b9cf-4124-b5e1-ce8e1c50b9c8-scripts\") pod \"placement-db-sync-qkr98\" (UID: \"93f326d0-b9cf-4124-b5e1-ce8e1c50b9c8\") " pod="openstack/placement-db-sync-qkr98" Mar 11 01:15:44 crc kubenswrapper[4744]: I0311 01:15:44.748777 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/93f326d0-b9cf-4124-b5e1-ce8e1c50b9c8-combined-ca-bundle\") pod \"placement-db-sync-qkr98\" (UID: \"93f326d0-b9cf-4124-b5e1-ce8e1c50b9c8\") " pod="openstack/placement-db-sync-qkr98" Mar 11 01:15:44 crc kubenswrapper[4744]: I0311 01:15:44.748878 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3bd4d017-1f55-468a-8a09-472f77929440-config\") pod \"dnsmasq-dns-86cdbd9bcc-vrldq\" (UID: \"3bd4d017-1f55-468a-8a09-472f77929440\") " pod="openstack/dnsmasq-dns-86cdbd9bcc-vrldq" Mar 11 01:15:44 crc kubenswrapper[4744]: I0311 01:15:44.748964 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/3bd4d017-1f55-468a-8a09-472f77929440-dns-svc\") pod \"dnsmasq-dns-86cdbd9bcc-vrldq\" (UID: \"3bd4d017-1f55-468a-8a09-472f77929440\") " pod="openstack/dnsmasq-dns-86cdbd9bcc-vrldq" Mar 11 01:15:44 crc kubenswrapper[4744]: I0311 01:15:44.749086 
4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/93f326d0-b9cf-4124-b5e1-ce8e1c50b9c8-logs\") pod \"placement-db-sync-qkr98\" (UID: \"93f326d0-b9cf-4124-b5e1-ce8e1c50b9c8\") " pod="openstack/placement-db-sync-qkr98" Mar 11 01:15:44 crc kubenswrapper[4744]: I0311 01:15:44.749197 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/3bd4d017-1f55-468a-8a09-472f77929440-ovsdbserver-nb\") pod \"dnsmasq-dns-86cdbd9bcc-vrldq\" (UID: \"3bd4d017-1f55-468a-8a09-472f77929440\") " pod="openstack/dnsmasq-dns-86cdbd9bcc-vrldq" Mar 11 01:15:44 crc kubenswrapper[4744]: I0311 01:15:44.750990 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zsk54\" (UniqueName: \"kubernetes.io/projected/93f326d0-b9cf-4124-b5e1-ce8e1c50b9c8-kube-api-access-zsk54\") pod \"placement-db-sync-qkr98\" (UID: \"93f326d0-b9cf-4124-b5e1-ce8e1c50b9c8\") " pod="openstack/placement-db-sync-qkr98" Mar 11 01:15:44 crc kubenswrapper[4744]: I0311 01:15:44.751097 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/64205e5d-2853-49f7-9928-8362fc9210ea-combined-ca-bundle\") pod \"barbican-db-sync-vkz2r\" (UID: \"64205e5d-2853-49f7-9928-8362fc9210ea\") " pod="openstack/barbican-db-sync-vkz2r" Mar 11 01:15:44 crc kubenswrapper[4744]: I0311 01:15:44.751183 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/64205e5d-2853-49f7-9928-8362fc9210ea-db-sync-config-data\") pod \"barbican-db-sync-vkz2r\" (UID: \"64205e5d-2853-49f7-9928-8362fc9210ea\") " pod="openstack/barbican-db-sync-vkz2r" Mar 11 01:15:44 crc kubenswrapper[4744]: I0311 01:15:44.751245 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"config-data\" (UniqueName: \"kubernetes.io/secret/93f326d0-b9cf-4124-b5e1-ce8e1c50b9c8-config-data\") pod \"placement-db-sync-qkr98\" (UID: \"93f326d0-b9cf-4124-b5e1-ce8e1c50b9c8\") " pod="openstack/placement-db-sync-qkr98" Mar 11 01:15:44 crc kubenswrapper[4744]: I0311 01:15:44.751348 4744 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0fbc16eb-59fb-4814-b3b7-944573b75d23-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 11 01:15:44 crc kubenswrapper[4744]: I0311 01:15:44.751403 4744 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tzl7v\" (UniqueName: \"kubernetes.io/projected/0fbc16eb-59fb-4814-b3b7-944573b75d23-kube-api-access-tzl7v\") on node \"crc\" DevicePath \"\"" Mar 11 01:15:44 crc kubenswrapper[4744]: I0311 01:15:44.751471 4744 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/0fbc16eb-59fb-4814-b3b7-944573b75d23-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Mar 11 01:15:44 crc kubenswrapper[4744]: I0311 01:15:44.749878 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/93f326d0-b9cf-4124-b5e1-ce8e1c50b9c8-logs\") pod \"placement-db-sync-qkr98\" (UID: \"93f326d0-b9cf-4124-b5e1-ce8e1c50b9c8\") " pod="openstack/placement-db-sync-qkr98" Mar 11 01:15:44 crc kubenswrapper[4744]: I0311 01:15:44.762713 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/64205e5d-2853-49f7-9928-8362fc9210ea-db-sync-config-data\") pod \"barbican-db-sync-vkz2r\" (UID: \"64205e5d-2853-49f7-9928-8362fc9210ea\") " pod="openstack/barbican-db-sync-vkz2r" Mar 11 01:15:44 crc kubenswrapper[4744]: I0311 01:15:44.763071 4744 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-sync-g9cnb" Mar 11 01:15:44 crc kubenswrapper[4744]: I0311 01:15:44.782991 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/93f326d0-b9cf-4124-b5e1-ce8e1c50b9c8-combined-ca-bundle\") pod \"placement-db-sync-qkr98\" (UID: \"93f326d0-b9cf-4124-b5e1-ce8e1c50b9c8\") " pod="openstack/placement-db-sync-qkr98" Mar 11 01:15:44 crc kubenswrapper[4744]: I0311 01:15:44.789640 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-86cdbd9bcc-vrldq"] Mar 11 01:15:44 crc kubenswrapper[4744]: I0311 01:15:44.803207 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/64205e5d-2853-49f7-9928-8362fc9210ea-combined-ca-bundle\") pod \"barbican-db-sync-vkz2r\" (UID: \"64205e5d-2853-49f7-9928-8362fc9210ea\") " pod="openstack/barbican-db-sync-vkz2r" Mar 11 01:15:44 crc kubenswrapper[4744]: I0311 01:15:44.803342 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/93f326d0-b9cf-4124-b5e1-ce8e1c50b9c8-scripts\") pod \"placement-db-sync-qkr98\" (UID: \"93f326d0-b9cf-4124-b5e1-ce8e1c50b9c8\") " pod="openstack/placement-db-sync-qkr98" Mar 11 01:15:44 crc kubenswrapper[4744]: I0311 01:15:44.803587 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/93f326d0-b9cf-4124-b5e1-ce8e1c50b9c8-config-data\") pod \"placement-db-sync-qkr98\" (UID: \"93f326d0-b9cf-4124-b5e1-ce8e1c50b9c8\") " pod="openstack/placement-db-sync-qkr98" Mar 11 01:15:44 crc kubenswrapper[4744]: I0311 01:15:44.803660 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zsk54\" (UniqueName: \"kubernetes.io/projected/93f326d0-b9cf-4124-b5e1-ce8e1c50b9c8-kube-api-access-zsk54\") pod \"placement-db-sync-qkr98\" (UID: 
\"93f326d0-b9cf-4124-b5e1-ce8e1c50b9c8\") " pod="openstack/placement-db-sync-qkr98" Mar 11 01:15:44 crc kubenswrapper[4744]: I0311 01:15:44.808532 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6jdcj\" (UniqueName: \"kubernetes.io/projected/64205e5d-2853-49f7-9928-8362fc9210ea-kube-api-access-6jdcj\") pod \"barbican-db-sync-vkz2r\" (UID: \"64205e5d-2853-49f7-9928-8362fc9210ea\") " pod="openstack/barbican-db-sync-vkz2r" Mar 11 01:15:44 crc kubenswrapper[4744]: I0311 01:15:44.831916 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0fbc16eb-59fb-4814-b3b7-944573b75d23-config-data" (OuterVolumeSpecName: "config-data") pod "0fbc16eb-59fb-4814-b3b7-944573b75d23" (UID: "0fbc16eb-59fb-4814-b3b7-944573b75d23"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 01:15:44 crc kubenswrapper[4744]: I0311 01:15:44.858981 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ddeba13d-7885-44ef-8454-1a7b6ef48303-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"ddeba13d-7885-44ef-8454-1a7b6ef48303\") " pod="openstack/ceilometer-0" Mar 11 01:15:44 crc kubenswrapper[4744]: I0311 01:15:44.859027 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ddeba13d-7885-44ef-8454-1a7b6ef48303-config-data\") pod \"ceilometer-0\" (UID: \"ddeba13d-7885-44ef-8454-1a7b6ef48303\") " pod="openstack/ceilometer-0" Mar 11 01:15:44 crc kubenswrapper[4744]: I0311 01:15:44.859063 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/3bd4d017-1f55-468a-8a09-472f77929440-dns-swift-storage-0\") pod \"dnsmasq-dns-86cdbd9bcc-vrldq\" (UID: \"3bd4d017-1f55-468a-8a09-472f77929440\") " 
pod="openstack/dnsmasq-dns-86cdbd9bcc-vrldq" Mar 11 01:15:44 crc kubenswrapper[4744]: I0311 01:15:44.859094 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v57c2\" (UniqueName: \"kubernetes.io/projected/3bd4d017-1f55-468a-8a09-472f77929440-kube-api-access-v57c2\") pod \"dnsmasq-dns-86cdbd9bcc-vrldq\" (UID: \"3bd4d017-1f55-468a-8a09-472f77929440\") " pod="openstack/dnsmasq-dns-86cdbd9bcc-vrldq" Mar 11 01:15:44 crc kubenswrapper[4744]: I0311 01:15:44.859134 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/3bd4d017-1f55-468a-8a09-472f77929440-ovsdbserver-sb\") pod \"dnsmasq-dns-86cdbd9bcc-vrldq\" (UID: \"3bd4d017-1f55-468a-8a09-472f77929440\") " pod="openstack/dnsmasq-dns-86cdbd9bcc-vrldq" Mar 11 01:15:44 crc kubenswrapper[4744]: I0311 01:15:44.859172 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3bd4d017-1f55-468a-8a09-472f77929440-config\") pod \"dnsmasq-dns-86cdbd9bcc-vrldq\" (UID: \"3bd4d017-1f55-468a-8a09-472f77929440\") " pod="openstack/dnsmasq-dns-86cdbd9bcc-vrldq" Mar 11 01:15:44 crc kubenswrapper[4744]: I0311 01:15:44.859186 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/3bd4d017-1f55-468a-8a09-472f77929440-dns-svc\") pod \"dnsmasq-dns-86cdbd9bcc-vrldq\" (UID: \"3bd4d017-1f55-468a-8a09-472f77929440\") " pod="openstack/dnsmasq-dns-86cdbd9bcc-vrldq" Mar 11 01:15:44 crc kubenswrapper[4744]: I0311 01:15:44.859204 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/ddeba13d-7885-44ef-8454-1a7b6ef48303-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"ddeba13d-7885-44ef-8454-1a7b6ef48303\") " pod="openstack/ceilometer-0" Mar 11 01:15:44 crc 
kubenswrapper[4744]: I0311 01:15:44.859240 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ddeba13d-7885-44ef-8454-1a7b6ef48303-log-httpd\") pod \"ceilometer-0\" (UID: \"ddeba13d-7885-44ef-8454-1a7b6ef48303\") " pod="openstack/ceilometer-0" Mar 11 01:15:44 crc kubenswrapper[4744]: I0311 01:15:44.859263 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/3bd4d017-1f55-468a-8a09-472f77929440-ovsdbserver-nb\") pod \"dnsmasq-dns-86cdbd9bcc-vrldq\" (UID: \"3bd4d017-1f55-468a-8a09-472f77929440\") " pod="openstack/dnsmasq-dns-86cdbd9bcc-vrldq" Mar 11 01:15:44 crc kubenswrapper[4744]: I0311 01:15:44.859295 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lqmdx\" (UniqueName: \"kubernetes.io/projected/ddeba13d-7885-44ef-8454-1a7b6ef48303-kube-api-access-lqmdx\") pod \"ceilometer-0\" (UID: \"ddeba13d-7885-44ef-8454-1a7b6ef48303\") " pod="openstack/ceilometer-0" Mar 11 01:15:44 crc kubenswrapper[4744]: I0311 01:15:44.859312 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ddeba13d-7885-44ef-8454-1a7b6ef48303-scripts\") pod \"ceilometer-0\" (UID: \"ddeba13d-7885-44ef-8454-1a7b6ef48303\") " pod="openstack/ceilometer-0" Mar 11 01:15:44 crc kubenswrapper[4744]: I0311 01:15:44.859328 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ddeba13d-7885-44ef-8454-1a7b6ef48303-run-httpd\") pod \"ceilometer-0\" (UID: \"ddeba13d-7885-44ef-8454-1a7b6ef48303\") " pod="openstack/ceilometer-0" Mar 11 01:15:44 crc kubenswrapper[4744]: I0311 01:15:44.859369 4744 reconciler_common.go:293] "Volume detached for volume \"config-data\" 
(UniqueName: \"kubernetes.io/secret/0fbc16eb-59fb-4814-b3b7-944573b75d23-config-data\") on node \"crc\" DevicePath \"\"" Mar 11 01:15:44 crc kubenswrapper[4744]: I0311 01:15:44.860321 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/3bd4d017-1f55-468a-8a09-472f77929440-dns-swift-storage-0\") pod \"dnsmasq-dns-86cdbd9bcc-vrldq\" (UID: \"3bd4d017-1f55-468a-8a09-472f77929440\") " pod="openstack/dnsmasq-dns-86cdbd9bcc-vrldq" Mar 11 01:15:44 crc kubenswrapper[4744]: I0311 01:15:44.862244 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3bd4d017-1f55-468a-8a09-472f77929440-config\") pod \"dnsmasq-dns-86cdbd9bcc-vrldq\" (UID: \"3bd4d017-1f55-468a-8a09-472f77929440\") " pod="openstack/dnsmasq-dns-86cdbd9bcc-vrldq" Mar 11 01:15:44 crc kubenswrapper[4744]: I0311 01:15:44.862503 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/3bd4d017-1f55-468a-8a09-472f77929440-ovsdbserver-sb\") pod \"dnsmasq-dns-86cdbd9bcc-vrldq\" (UID: \"3bd4d017-1f55-468a-8a09-472f77929440\") " pod="openstack/dnsmasq-dns-86cdbd9bcc-vrldq" Mar 11 01:15:44 crc kubenswrapper[4744]: I0311 01:15:44.862801 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/3bd4d017-1f55-468a-8a09-472f77929440-dns-svc\") pod \"dnsmasq-dns-86cdbd9bcc-vrldq\" (UID: \"3bd4d017-1f55-468a-8a09-472f77929440\") " pod="openstack/dnsmasq-dns-86cdbd9bcc-vrldq" Mar 11 01:15:44 crc kubenswrapper[4744]: I0311 01:15:44.863162 4744 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-sync-vkz2r" Mar 11 01:15:44 crc kubenswrapper[4744]: I0311 01:15:44.865147 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/3bd4d017-1f55-468a-8a09-472f77929440-ovsdbserver-nb\") pod \"dnsmasq-dns-86cdbd9bcc-vrldq\" (UID: \"3bd4d017-1f55-468a-8a09-472f77929440\") " pod="openstack/dnsmasq-dns-86cdbd9bcc-vrldq" Mar 11 01:15:44 crc kubenswrapper[4744]: I0311 01:15:44.910059 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 11 01:15:44 crc kubenswrapper[4744]: I0311 01:15:44.917614 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v57c2\" (UniqueName: \"kubernetes.io/projected/3bd4d017-1f55-468a-8a09-472f77929440-kube-api-access-v57c2\") pod \"dnsmasq-dns-86cdbd9bcc-vrldq\" (UID: \"3bd4d017-1f55-468a-8a09-472f77929440\") " pod="openstack/dnsmasq-dns-86cdbd9bcc-vrldq" Mar 11 01:15:44 crc kubenswrapper[4744]: I0311 01:15:44.931286 4744 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-sync-qkr98" Mar 11 01:15:44 crc kubenswrapper[4744]: I0311 01:15:44.963613 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ddeba13d-7885-44ef-8454-1a7b6ef48303-config-data\") pod \"ceilometer-0\" (UID: \"ddeba13d-7885-44ef-8454-1a7b6ef48303\") " pod="openstack/ceilometer-0" Mar 11 01:15:44 crc kubenswrapper[4744]: I0311 01:15:44.976535 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/ddeba13d-7885-44ef-8454-1a7b6ef48303-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"ddeba13d-7885-44ef-8454-1a7b6ef48303\") " pod="openstack/ceilometer-0" Mar 11 01:15:44 crc kubenswrapper[4744]: I0311 01:15:44.976803 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ddeba13d-7885-44ef-8454-1a7b6ef48303-log-httpd\") pod \"ceilometer-0\" (UID: \"ddeba13d-7885-44ef-8454-1a7b6ef48303\") " pod="openstack/ceilometer-0" Mar 11 01:15:44 crc kubenswrapper[4744]: I0311 01:15:44.976946 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lqmdx\" (UniqueName: \"kubernetes.io/projected/ddeba13d-7885-44ef-8454-1a7b6ef48303-kube-api-access-lqmdx\") pod \"ceilometer-0\" (UID: \"ddeba13d-7885-44ef-8454-1a7b6ef48303\") " pod="openstack/ceilometer-0" Mar 11 01:15:44 crc kubenswrapper[4744]: I0311 01:15:44.977041 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ddeba13d-7885-44ef-8454-1a7b6ef48303-scripts\") pod \"ceilometer-0\" (UID: \"ddeba13d-7885-44ef-8454-1a7b6ef48303\") " pod="openstack/ceilometer-0" Mar 11 01:15:44 crc kubenswrapper[4744]: I0311 01:15:44.977110 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" 
(UniqueName: \"kubernetes.io/empty-dir/ddeba13d-7885-44ef-8454-1a7b6ef48303-run-httpd\") pod \"ceilometer-0\" (UID: \"ddeba13d-7885-44ef-8454-1a7b6ef48303\") " pod="openstack/ceilometer-0" Mar 11 01:15:44 crc kubenswrapper[4744]: I0311 01:15:44.977196 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ddeba13d-7885-44ef-8454-1a7b6ef48303-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"ddeba13d-7885-44ef-8454-1a7b6ef48303\") " pod="openstack/ceilometer-0" Mar 11 01:15:44 crc kubenswrapper[4744]: I0311 01:15:44.977475 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ddeba13d-7885-44ef-8454-1a7b6ef48303-log-httpd\") pod \"ceilometer-0\" (UID: \"ddeba13d-7885-44ef-8454-1a7b6ef48303\") " pod="openstack/ceilometer-0" Mar 11 01:15:44 crc kubenswrapper[4744]: I0311 01:15:44.980217 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ddeba13d-7885-44ef-8454-1a7b6ef48303-config-data\") pod \"ceilometer-0\" (UID: \"ddeba13d-7885-44ef-8454-1a7b6ef48303\") " pod="openstack/ceilometer-0" Mar 11 01:15:44 crc kubenswrapper[4744]: I0311 01:15:44.980445 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ddeba13d-7885-44ef-8454-1a7b6ef48303-run-httpd\") pod \"ceilometer-0\" (UID: \"ddeba13d-7885-44ef-8454-1a7b6ef48303\") " pod="openstack/ceilometer-0" Mar 11 01:15:44 crc kubenswrapper[4744]: I0311 01:15:44.980561 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/ddeba13d-7885-44ef-8454-1a7b6ef48303-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"ddeba13d-7885-44ef-8454-1a7b6ef48303\") " pod="openstack/ceilometer-0" Mar 11 01:15:44 crc kubenswrapper[4744]: I0311 01:15:44.982380 4744 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ddeba13d-7885-44ef-8454-1a7b6ef48303-scripts\") pod \"ceilometer-0\" (UID: \"ddeba13d-7885-44ef-8454-1a7b6ef48303\") " pod="openstack/ceilometer-0" Mar 11 01:15:44 crc kubenswrapper[4744]: I0311 01:15:44.989282 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-86cdbd9bcc-vrldq" Mar 11 01:15:45 crc kubenswrapper[4744]: I0311 01:15:45.005622 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ddeba13d-7885-44ef-8454-1a7b6ef48303-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"ddeba13d-7885-44ef-8454-1a7b6ef48303\") " pod="openstack/ceilometer-0" Mar 11 01:15:45 crc kubenswrapper[4744]: I0311 01:15:45.011263 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-sync-hqbzq" Mar 11 01:15:45 crc kubenswrapper[4744]: I0311 01:15:45.029128 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lqmdx\" (UniqueName: \"kubernetes.io/projected/ddeba13d-7885-44ef-8454-1a7b6ef48303-kube-api-access-lqmdx\") pod \"ceilometer-0\" (UID: \"ddeba13d-7885-44ef-8454-1a7b6ef48303\") " pod="openstack/ceilometer-0" Mar 11 01:15:45 crc kubenswrapper[4744]: I0311 01:15:45.063259 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-hhwh7" event={"ID":"0fbc16eb-59fb-4814-b3b7-944573b75d23","Type":"ContainerDied","Data":"4b8fdeb82a144189474c9344c568e70bd5ae1f8f180eba0b100e431de8389884"} Mar 11 01:15:45 crc kubenswrapper[4744]: I0311 01:15:45.063296 4744 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4b8fdeb82a144189474c9344c568e70bd5ae1f8f180eba0b100e431de8389884" Mar 11 01:15:45 crc kubenswrapper[4744]: I0311 01:15:45.063360 4744 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-sync-hhwh7" Mar 11 01:15:45 crc kubenswrapper[4744]: I0311 01:15:45.095886 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-58f8d8dcc-n7kcc" event={"ID":"22caad75-2e3e-4fbc-8358-7bab36543b23","Type":"ContainerStarted","Data":"58542569e5176dd767e5ba5362061d575294728c5b1d8bbe8c9789c2fa0f6772"} Mar 11 01:15:45 crc kubenswrapper[4744]: I0311 01:15:45.096305 4744 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-58f8d8dcc-n7kcc" podUID="22caad75-2e3e-4fbc-8358-7bab36543b23" containerName="dnsmasq-dns" containerID="cri-o://58542569e5176dd767e5ba5362061d575294728c5b1d8bbe8c9789c2fa0f6772" gracePeriod=10 Mar 11 01:15:45 crc kubenswrapper[4744]: I0311 01:15:45.096638 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-58f8d8dcc-n7kcc" Mar 11 01:15:45 crc kubenswrapper[4744]: I0311 01:15:45.154990 4744 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-58f8d8dcc-n7kcc" podStartSLOduration=3.154968657 podStartE2EDuration="3.154968657s" podCreationTimestamp="2026-03-11 01:15:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-11 01:15:45.148218129 +0000 UTC m=+1301.952435734" watchObservedRunningTime="2026-03-11 01:15:45.154968657 +0000 UTC m=+1301.959186262" Mar 11 01:15:45 crc kubenswrapper[4744]: I0311 01:15:45.259783 4744 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-679ccc59c7-vrk9j"] Mar 11 01:15:45 crc kubenswrapper[4744]: I0311 01:15:45.302028 4744 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 11 01:15:45 crc kubenswrapper[4744]: I0311 01:15:45.369406 4744 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-86cdbd9bcc-vrldq"] Mar 11 01:15:45 crc kubenswrapper[4744]: I0311 01:15:45.407614 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-6478444fbc-pgkhm"] Mar 11 01:15:45 crc kubenswrapper[4744]: I0311 01:15:45.409000 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6478444fbc-pgkhm" Mar 11 01:15:45 crc kubenswrapper[4744]: I0311 01:15:45.424163 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6478444fbc-pgkhm"] Mar 11 01:15:45 crc kubenswrapper[4744]: I0311 01:15:45.487449 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/5f47e6da-0f4e-4b87-b21d-5d0b7adef080-dns-swift-storage-0\") pod \"dnsmasq-dns-6478444fbc-pgkhm\" (UID: \"5f47e6da-0f4e-4b87-b21d-5d0b7adef080\") " pod="openstack/dnsmasq-dns-6478444fbc-pgkhm" Mar 11 01:15:45 crc kubenswrapper[4744]: I0311 01:15:45.487727 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/5f47e6da-0f4e-4b87-b21d-5d0b7adef080-ovsdbserver-nb\") pod \"dnsmasq-dns-6478444fbc-pgkhm\" (UID: \"5f47e6da-0f4e-4b87-b21d-5d0b7adef080\") " pod="openstack/dnsmasq-dns-6478444fbc-pgkhm" Mar 11 01:15:45 crc kubenswrapper[4744]: I0311 01:15:45.487759 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v7vwx\" (UniqueName: \"kubernetes.io/projected/5f47e6da-0f4e-4b87-b21d-5d0b7adef080-kube-api-access-v7vwx\") pod \"dnsmasq-dns-6478444fbc-pgkhm\" (UID: \"5f47e6da-0f4e-4b87-b21d-5d0b7adef080\") " pod="openstack/dnsmasq-dns-6478444fbc-pgkhm" Mar 11 01:15:45 crc 
kubenswrapper[4744]: I0311 01:15:45.487793 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/5f47e6da-0f4e-4b87-b21d-5d0b7adef080-dns-svc\") pod \"dnsmasq-dns-6478444fbc-pgkhm\" (UID: \"5f47e6da-0f4e-4b87-b21d-5d0b7adef080\") " pod="openstack/dnsmasq-dns-6478444fbc-pgkhm" Mar 11 01:15:45 crc kubenswrapper[4744]: I0311 01:15:45.487848 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/5f47e6da-0f4e-4b87-b21d-5d0b7adef080-ovsdbserver-sb\") pod \"dnsmasq-dns-6478444fbc-pgkhm\" (UID: \"5f47e6da-0f4e-4b87-b21d-5d0b7adef080\") " pod="openstack/dnsmasq-dns-6478444fbc-pgkhm" Mar 11 01:15:45 crc kubenswrapper[4744]: I0311 01:15:45.487879 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5f47e6da-0f4e-4b87-b21d-5d0b7adef080-config\") pod \"dnsmasq-dns-6478444fbc-pgkhm\" (UID: \"5f47e6da-0f4e-4b87-b21d-5d0b7adef080\") " pod="openstack/dnsmasq-dns-6478444fbc-pgkhm" Mar 11 01:15:45 crc kubenswrapper[4744]: W0311 01:15:45.490453 4744 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1f8dfaaa_9a7f_4693_8f45_a4a84123b714.slice/crio-073b7eae6e4d206640e0a3f9609d5d66b6845f198b6ba4925f8fa3b0f15835d8 WatchSource:0}: Error finding container 073b7eae6e4d206640e0a3f9609d5d66b6845f198b6ba4925f8fa3b0f15835d8: Status 404 returned error can't find the container with id 073b7eae6e4d206640e0a3f9609d5d66b6845f198b6ba4925f8fa3b0f15835d8 Mar 11 01:15:45 crc kubenswrapper[4744]: I0311 01:15:45.502699 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-9ddjj"] Mar 11 01:15:45 crc kubenswrapper[4744]: I0311 01:15:45.579165 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openstack/neutron-db-sync-g9cnb"] Mar 11 01:15:45 crc kubenswrapper[4744]: I0311 01:15:45.592347 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/5f47e6da-0f4e-4b87-b21d-5d0b7adef080-ovsdbserver-sb\") pod \"dnsmasq-dns-6478444fbc-pgkhm\" (UID: \"5f47e6da-0f4e-4b87-b21d-5d0b7adef080\") " pod="openstack/dnsmasq-dns-6478444fbc-pgkhm" Mar 11 01:15:45 crc kubenswrapper[4744]: I0311 01:15:45.592487 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5f47e6da-0f4e-4b87-b21d-5d0b7adef080-config\") pod \"dnsmasq-dns-6478444fbc-pgkhm\" (UID: \"5f47e6da-0f4e-4b87-b21d-5d0b7adef080\") " pod="openstack/dnsmasq-dns-6478444fbc-pgkhm" Mar 11 01:15:45 crc kubenswrapper[4744]: I0311 01:15:45.592631 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/5f47e6da-0f4e-4b87-b21d-5d0b7adef080-dns-swift-storage-0\") pod \"dnsmasq-dns-6478444fbc-pgkhm\" (UID: \"5f47e6da-0f4e-4b87-b21d-5d0b7adef080\") " pod="openstack/dnsmasq-dns-6478444fbc-pgkhm" Mar 11 01:15:45 crc kubenswrapper[4744]: I0311 01:15:45.592668 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/5f47e6da-0f4e-4b87-b21d-5d0b7adef080-ovsdbserver-nb\") pod \"dnsmasq-dns-6478444fbc-pgkhm\" (UID: \"5f47e6da-0f4e-4b87-b21d-5d0b7adef080\") " pod="openstack/dnsmasq-dns-6478444fbc-pgkhm" Mar 11 01:15:45 crc kubenswrapper[4744]: I0311 01:15:45.592719 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v7vwx\" (UniqueName: \"kubernetes.io/projected/5f47e6da-0f4e-4b87-b21d-5d0b7adef080-kube-api-access-v7vwx\") pod \"dnsmasq-dns-6478444fbc-pgkhm\" (UID: \"5f47e6da-0f4e-4b87-b21d-5d0b7adef080\") " pod="openstack/dnsmasq-dns-6478444fbc-pgkhm" Mar 
11 01:15:45 crc kubenswrapper[4744]: I0311 01:15:45.592768 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/5f47e6da-0f4e-4b87-b21d-5d0b7adef080-dns-svc\") pod \"dnsmasq-dns-6478444fbc-pgkhm\" (UID: \"5f47e6da-0f4e-4b87-b21d-5d0b7adef080\") " pod="openstack/dnsmasq-dns-6478444fbc-pgkhm" Mar 11 01:15:45 crc kubenswrapper[4744]: I0311 01:15:45.594054 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/5f47e6da-0f4e-4b87-b21d-5d0b7adef080-ovsdbserver-sb\") pod \"dnsmasq-dns-6478444fbc-pgkhm\" (UID: \"5f47e6da-0f4e-4b87-b21d-5d0b7adef080\") " pod="openstack/dnsmasq-dns-6478444fbc-pgkhm" Mar 11 01:15:45 crc kubenswrapper[4744]: I0311 01:15:45.594126 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/5f47e6da-0f4e-4b87-b21d-5d0b7adef080-ovsdbserver-nb\") pod \"dnsmasq-dns-6478444fbc-pgkhm\" (UID: \"5f47e6da-0f4e-4b87-b21d-5d0b7adef080\") " pod="openstack/dnsmasq-dns-6478444fbc-pgkhm" Mar 11 01:15:45 crc kubenswrapper[4744]: I0311 01:15:45.594457 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5f47e6da-0f4e-4b87-b21d-5d0b7adef080-config\") pod \"dnsmasq-dns-6478444fbc-pgkhm\" (UID: \"5f47e6da-0f4e-4b87-b21d-5d0b7adef080\") " pod="openstack/dnsmasq-dns-6478444fbc-pgkhm" Mar 11 01:15:45 crc kubenswrapper[4744]: I0311 01:15:45.595950 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/5f47e6da-0f4e-4b87-b21d-5d0b7adef080-dns-swift-storage-0\") pod \"dnsmasq-dns-6478444fbc-pgkhm\" (UID: \"5f47e6da-0f4e-4b87-b21d-5d0b7adef080\") " pod="openstack/dnsmasq-dns-6478444fbc-pgkhm" Mar 11 01:15:45 crc kubenswrapper[4744]: I0311 01:15:45.596357 4744 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/5f47e6da-0f4e-4b87-b21d-5d0b7adef080-dns-svc\") pod \"dnsmasq-dns-6478444fbc-pgkhm\" (UID: \"5f47e6da-0f4e-4b87-b21d-5d0b7adef080\") " pod="openstack/dnsmasq-dns-6478444fbc-pgkhm" Mar 11 01:15:45 crc kubenswrapper[4744]: I0311 01:15:45.624498 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v7vwx\" (UniqueName: \"kubernetes.io/projected/5f47e6da-0f4e-4b87-b21d-5d0b7adef080-kube-api-access-v7vwx\") pod \"dnsmasq-dns-6478444fbc-pgkhm\" (UID: \"5f47e6da-0f4e-4b87-b21d-5d0b7adef080\") " pod="openstack/dnsmasq-dns-6478444fbc-pgkhm" Mar 11 01:15:45 crc kubenswrapper[4744]: I0311 01:15:45.742602 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-sync-qkr98"] Mar 11 01:15:45 crc kubenswrapper[4744]: I0311 01:15:45.766107 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6478444fbc-pgkhm" Mar 11 01:15:45 crc kubenswrapper[4744]: I0311 01:15:45.793609 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-sync-hqbzq"] Mar 11 01:15:45 crc kubenswrapper[4744]: I0311 01:15:45.994912 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-sync-vkz2r"] Mar 11 01:15:46 crc kubenswrapper[4744]: I0311 01:15:46.009408 4744 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-58f8d8dcc-n7kcc" Mar 11 01:15:46 crc kubenswrapper[4744]: W0311 01:15:46.010716 4744 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3bd4d017_1f55_468a_8a09_472f77929440.slice/crio-4781bd965fb6ba0743daefc46730cf87a183e95beb94d62eab7681be8ba9eb11 WatchSource:0}: Error finding container 4781bd965fb6ba0743daefc46730cf87a183e95beb94d62eab7681be8ba9eb11: Status 404 returned error can't find the container with id 4781bd965fb6ba0743daefc46730cf87a183e95beb94d62eab7681be8ba9eb11 Mar 11 01:15:46 crc kubenswrapper[4744]: I0311 01:15:46.011300 4744 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-86cdbd9bcc-vrldq"] Mar 11 01:15:46 crc kubenswrapper[4744]: I0311 01:15:46.086915 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 11 01:15:46 crc kubenswrapper[4744]: W0311 01:15:46.088894 4744 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podddeba13d_7885_44ef_8454_1a7b6ef48303.slice/crio-2b9835e17acd5d4021880cc0cc42522b679209520e6b88d73111082e68e4d77f WatchSource:0}: Error finding container 2b9835e17acd5d4021880cc0cc42522b679209520e6b88d73111082e68e4d77f: Status 404 returned error can't find the container with id 2b9835e17acd5d4021880cc0cc42522b679209520e6b88d73111082e68e4d77f Mar 11 01:15:46 crc kubenswrapper[4744]: I0311 01:15:46.109349 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/22caad75-2e3e-4fbc-8358-7bab36543b23-dns-swift-storage-0\") pod \"22caad75-2e3e-4fbc-8358-7bab36543b23\" (UID: \"22caad75-2e3e-4fbc-8358-7bab36543b23\") " Mar 11 01:15:46 crc kubenswrapper[4744]: I0311 01:15:46.109441 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" 
(UniqueName: \"kubernetes.io/configmap/22caad75-2e3e-4fbc-8358-7bab36543b23-ovsdbserver-nb\") pod \"22caad75-2e3e-4fbc-8358-7bab36543b23\" (UID: \"22caad75-2e3e-4fbc-8358-7bab36543b23\") " Mar 11 01:15:46 crc kubenswrapper[4744]: I0311 01:15:46.109542 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/22caad75-2e3e-4fbc-8358-7bab36543b23-dns-svc\") pod \"22caad75-2e3e-4fbc-8358-7bab36543b23\" (UID: \"22caad75-2e3e-4fbc-8358-7bab36543b23\") " Mar 11 01:15:46 crc kubenswrapper[4744]: I0311 01:15:46.109573 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/22caad75-2e3e-4fbc-8358-7bab36543b23-ovsdbserver-sb\") pod \"22caad75-2e3e-4fbc-8358-7bab36543b23\" (UID: \"22caad75-2e3e-4fbc-8358-7bab36543b23\") " Mar 11 01:15:46 crc kubenswrapper[4744]: I0311 01:15:46.109777 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-thxvc\" (UniqueName: \"kubernetes.io/projected/22caad75-2e3e-4fbc-8358-7bab36543b23-kube-api-access-thxvc\") pod \"22caad75-2e3e-4fbc-8358-7bab36543b23\" (UID: \"22caad75-2e3e-4fbc-8358-7bab36543b23\") " Mar 11 01:15:46 crc kubenswrapper[4744]: I0311 01:15:46.109848 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22caad75-2e3e-4fbc-8358-7bab36543b23-config\") pod \"22caad75-2e3e-4fbc-8358-7bab36543b23\" (UID: \"22caad75-2e3e-4fbc-8358-7bab36543b23\") " Mar 11 01:15:46 crc kubenswrapper[4744]: I0311 01:15:46.116221 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/22caad75-2e3e-4fbc-8358-7bab36543b23-kube-api-access-thxvc" (OuterVolumeSpecName: "kube-api-access-thxvc") pod "22caad75-2e3e-4fbc-8358-7bab36543b23" (UID: "22caad75-2e3e-4fbc-8358-7bab36543b23"). InnerVolumeSpecName "kube-api-access-thxvc". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 01:15:46 crc kubenswrapper[4744]: I0311 01:15:46.153823 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22caad75-2e3e-4fbc-8358-7bab36543b23-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "22caad75-2e3e-4fbc-8358-7bab36543b23" (UID: "22caad75-2e3e-4fbc-8358-7bab36543b23"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 11 01:15:46 crc kubenswrapper[4744]: I0311 01:15:46.154283 4744 generic.go:334] "Generic (PLEG): container finished" podID="22caad75-2e3e-4fbc-8358-7bab36543b23" containerID="58542569e5176dd767e5ba5362061d575294728c5b1d8bbe8c9789c2fa0f6772" exitCode=0 Mar 11 01:15:46 crc kubenswrapper[4744]: I0311 01:15:46.154399 4744 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-58f8d8dcc-n7kcc" Mar 11 01:15:46 crc kubenswrapper[4744]: I0311 01:15:46.154919 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-58f8d8dcc-n7kcc" event={"ID":"22caad75-2e3e-4fbc-8358-7bab36543b23","Type":"ContainerDied","Data":"58542569e5176dd767e5ba5362061d575294728c5b1d8bbe8c9789c2fa0f6772"} Mar 11 01:15:46 crc kubenswrapper[4744]: I0311 01:15:46.154969 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-58f8d8dcc-n7kcc" event={"ID":"22caad75-2e3e-4fbc-8358-7bab36543b23","Type":"ContainerDied","Data":"7f8e2b48572f30c647789cdaf2f76dee341d80b14df4f7a97df3aea9e690878d"} Mar 11 01:15:46 crc kubenswrapper[4744]: I0311 01:15:46.154987 4744 scope.go:117] "RemoveContainer" containerID="58542569e5176dd767e5ba5362061d575294728c5b1d8bbe8c9789c2fa0f6772" Mar 11 01:15:46 crc kubenswrapper[4744]: I0311 01:15:46.158034 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-86cdbd9bcc-vrldq" 
event={"ID":"3bd4d017-1f55-468a-8a09-472f77929440","Type":"ContainerStarted","Data":"4781bd965fb6ba0743daefc46730cf87a183e95beb94d62eab7681be8ba9eb11"} Mar 11 01:15:46 crc kubenswrapper[4744]: I0311 01:15:46.166685 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-9ddjj" event={"ID":"1f8dfaaa-9a7f-4693-8f45-a4a84123b714","Type":"ContainerStarted","Data":"119366017deed13d5411b21ac7aa67584fa8eee5670f556b1938ccb084cb7a63"} Mar 11 01:15:46 crc kubenswrapper[4744]: I0311 01:15:46.166752 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-9ddjj" event={"ID":"1f8dfaaa-9a7f-4693-8f45-a4a84123b714","Type":"ContainerStarted","Data":"073b7eae6e4d206640e0a3f9609d5d66b6845f198b6ba4925f8fa3b0f15835d8"} Mar 11 01:15:46 crc kubenswrapper[4744]: I0311 01:15:46.177393 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22caad75-2e3e-4fbc-8358-7bab36543b23-config" (OuterVolumeSpecName: "config") pod "22caad75-2e3e-4fbc-8358-7bab36543b23" (UID: "22caad75-2e3e-4fbc-8358-7bab36543b23"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 11 01:15:46 crc kubenswrapper[4744]: I0311 01:15:46.190381 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-hqbzq" event={"ID":"5df37d98-3dbc-4977-add0-525bda3d679b","Type":"ContainerStarted","Data":"b3eebb7973fbdf8d377cc2adc2dacbd7c35395418b497fb08767391987764293"} Mar 11 01:15:46 crc kubenswrapper[4744]: I0311 01:15:46.190976 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22caad75-2e3e-4fbc-8358-7bab36543b23-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "22caad75-2e3e-4fbc-8358-7bab36543b23" (UID: "22caad75-2e3e-4fbc-8358-7bab36543b23"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 11 01:15:46 crc kubenswrapper[4744]: I0311 01:15:46.213452 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22caad75-2e3e-4fbc-8358-7bab36543b23-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "22caad75-2e3e-4fbc-8358-7bab36543b23" (UID: "22caad75-2e3e-4fbc-8358-7bab36543b23"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 11 01:15:46 crc kubenswrapper[4744]: I0311 01:15:46.213898 4744 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-bootstrap-9ddjj" podStartSLOduration=2.213883585 podStartE2EDuration="2.213883585s" podCreationTimestamp="2026-03-11 01:15:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-11 01:15:46.20691211 +0000 UTC m=+1303.011129715" watchObservedRunningTime="2026-03-11 01:15:46.213883585 +0000 UTC m=+1303.018101190" Mar 11 01:15:46 crc kubenswrapper[4744]: I0311 01:15:46.214614 4744 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-thxvc\" (UniqueName: \"kubernetes.io/projected/22caad75-2e3e-4fbc-8358-7bab36543b23-kube-api-access-thxvc\") on node \"crc\" DevicePath \"\"" Mar 11 01:15:46 crc kubenswrapper[4744]: I0311 01:15:46.214627 4744 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22caad75-2e3e-4fbc-8358-7bab36543b23-config\") on node \"crc\" DevicePath \"\"" Mar 11 01:15:46 crc kubenswrapper[4744]: I0311 01:15:46.214635 4744 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/22caad75-2e3e-4fbc-8358-7bab36543b23-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Mar 11 01:15:46 crc kubenswrapper[4744]: I0311 01:15:46.214644 4744 reconciler_common.go:293] "Volume 
detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/22caad75-2e3e-4fbc-8358-7bab36543b23-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 11 01:15:46 crc kubenswrapper[4744]: I0311 01:15:46.214652 4744 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/22caad75-2e3e-4fbc-8358-7bab36543b23-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 11 01:15:46 crc kubenswrapper[4744]: I0311 01:15:46.226161 4744 generic.go:334] "Generic (PLEG): container finished" podID="59fa9917-a1d9-4554-afec-483df5e5f3b5" containerID="8d47681840034b94af37e8bc246fa03461bdac0595eaf76fe0a850d6cd0b0f78" exitCode=0 Mar 11 01:15:46 crc kubenswrapper[4744]: I0311 01:15:46.226249 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-679ccc59c7-vrk9j" event={"ID":"59fa9917-a1d9-4554-afec-483df5e5f3b5","Type":"ContainerDied","Data":"8d47681840034b94af37e8bc246fa03461bdac0595eaf76fe0a850d6cd0b0f78"} Mar 11 01:15:46 crc kubenswrapper[4744]: I0311 01:15:46.226276 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-679ccc59c7-vrk9j" event={"ID":"59fa9917-a1d9-4554-afec-483df5e5f3b5","Type":"ContainerStarted","Data":"c3f81f7a7dbcc7a27db6fdcfc1aaa89489bd7961e99520066f795f686845ff29"} Mar 11 01:15:46 crc kubenswrapper[4744]: I0311 01:15:46.238986 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-g9cnb" event={"ID":"b09baca5-8198-404f-8b8d-8f58db34f975","Type":"ContainerStarted","Data":"fb286387721bec0a1f91359d2c30c79ca49f4b4898995988b7e9e90f9a29caf6"} Mar 11 01:15:46 crc kubenswrapper[4744]: I0311 01:15:46.239032 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-g9cnb" event={"ID":"b09baca5-8198-404f-8b8d-8f58db34f975","Type":"ContainerStarted","Data":"b2229f531a05351ec8ef5d94ef9c68a0310b77480e11401093d638e4a8d74c27"} Mar 11 01:15:46 crc kubenswrapper[4744]: I0311 01:15:46.239788 4744 
scope.go:117] "RemoveContainer" containerID="2a7db405ac9df3a4c23d4e8c8104569f0851b014eb0c768950c4f5c133c95c40" Mar 11 01:15:46 crc kubenswrapper[4744]: I0311 01:15:46.253387 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Mar 11 01:15:46 crc kubenswrapper[4744]: E0311 01:15:46.253771 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="22caad75-2e3e-4fbc-8358-7bab36543b23" containerName="dnsmasq-dns" Mar 11 01:15:46 crc kubenswrapper[4744]: I0311 01:15:46.253782 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="22caad75-2e3e-4fbc-8358-7bab36543b23" containerName="dnsmasq-dns" Mar 11 01:15:46 crc kubenswrapper[4744]: E0311 01:15:46.253801 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="22caad75-2e3e-4fbc-8358-7bab36543b23" containerName="init" Mar 11 01:15:46 crc kubenswrapper[4744]: I0311 01:15:46.253807 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="22caad75-2e3e-4fbc-8358-7bab36543b23" containerName="init" Mar 11 01:15:46 crc kubenswrapper[4744]: I0311 01:15:46.253956 4744 memory_manager.go:354] "RemoveStaleState removing state" podUID="22caad75-2e3e-4fbc-8358-7bab36543b23" containerName="dnsmasq-dns" Mar 11 01:15:46 crc kubenswrapper[4744]: I0311 01:15:46.254988 4744 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Mar 11 01:15:46 crc kubenswrapper[4744]: I0311 01:15:46.256688 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-glance-dockercfg-mfg2r" Mar 11 01:15:46 crc kubenswrapper[4744]: I0311 01:15:46.256322 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-vkz2r" event={"ID":"64205e5d-2853-49f7-9928-8362fc9210ea","Type":"ContainerStarted","Data":"66bcdf1c8eea36678580c89ede8345badf342926555db788ebe38be83b2b832b"} Mar 11 01:15:46 crc kubenswrapper[4744]: I0311 01:15:46.257023 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Mar 11 01:15:46 crc kubenswrapper[4744]: I0311 01:15:46.258018 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-scripts" Mar 11 01:15:46 crc kubenswrapper[4744]: I0311 01:15:46.270120 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 11 01:15:46 crc kubenswrapper[4744]: I0311 01:15:46.270253 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ddeba13d-7885-44ef-8454-1a7b6ef48303","Type":"ContainerStarted","Data":"2b9835e17acd5d4021880cc0cc42522b679209520e6b88d73111082e68e4d77f"} Mar 11 01:15:46 crc kubenswrapper[4744]: I0311 01:15:46.288429 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22caad75-2e3e-4fbc-8358-7bab36543b23-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "22caad75-2e3e-4fbc-8358-7bab36543b23" (UID: "22caad75-2e3e-4fbc-8358-7bab36543b23"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 11 01:15:46 crc kubenswrapper[4744]: I0311 01:15:46.289216 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-qkr98" event={"ID":"93f326d0-b9cf-4124-b5e1-ce8e1c50b9c8","Type":"ContainerStarted","Data":"1dc27ff56af04f555f2294d07d8c0bb0f029e94419e3554c55f375bd61a9c0eb"} Mar 11 01:15:46 crc kubenswrapper[4744]: I0311 01:15:46.300425 4744 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-db-sync-g9cnb" podStartSLOduration=2.30040783 podStartE2EDuration="2.30040783s" podCreationTimestamp="2026-03-11 01:15:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-11 01:15:46.280643779 +0000 UTC m=+1303.084861384" watchObservedRunningTime="2026-03-11 01:15:46.30040783 +0000 UTC m=+1303.104625435" Mar 11 01:15:46 crc kubenswrapper[4744]: I0311 01:15:46.309484 4744 scope.go:117] "RemoveContainer" containerID="58542569e5176dd767e5ba5362061d575294728c5b1d8bbe8c9789c2fa0f6772" Mar 11 01:15:46 crc kubenswrapper[4744]: E0311 01:15:46.310469 4744 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"58542569e5176dd767e5ba5362061d575294728c5b1d8bbe8c9789c2fa0f6772\": container with ID starting with 58542569e5176dd767e5ba5362061d575294728c5b1d8bbe8c9789c2fa0f6772 not found: ID does not exist" containerID="58542569e5176dd767e5ba5362061d575294728c5b1d8bbe8c9789c2fa0f6772" Mar 11 01:15:46 crc kubenswrapper[4744]: I0311 01:15:46.310502 4744 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"58542569e5176dd767e5ba5362061d575294728c5b1d8bbe8c9789c2fa0f6772"} err="failed to get container status \"58542569e5176dd767e5ba5362061d575294728c5b1d8bbe8c9789c2fa0f6772\": rpc error: code = NotFound desc = could not find container 
\"58542569e5176dd767e5ba5362061d575294728c5b1d8bbe8c9789c2fa0f6772\": container with ID starting with 58542569e5176dd767e5ba5362061d575294728c5b1d8bbe8c9789c2fa0f6772 not found: ID does not exist" Mar 11 01:15:46 crc kubenswrapper[4744]: I0311 01:15:46.310532 4744 scope.go:117] "RemoveContainer" containerID="2a7db405ac9df3a4c23d4e8c8104569f0851b014eb0c768950c4f5c133c95c40" Mar 11 01:15:46 crc kubenswrapper[4744]: E0311 01:15:46.311159 4744 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2a7db405ac9df3a4c23d4e8c8104569f0851b014eb0c768950c4f5c133c95c40\": container with ID starting with 2a7db405ac9df3a4c23d4e8c8104569f0851b014eb0c768950c4f5c133c95c40 not found: ID does not exist" containerID="2a7db405ac9df3a4c23d4e8c8104569f0851b014eb0c768950c4f5c133c95c40" Mar 11 01:15:46 crc kubenswrapper[4744]: I0311 01:15:46.311183 4744 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2a7db405ac9df3a4c23d4e8c8104569f0851b014eb0c768950c4f5c133c95c40"} err="failed to get container status \"2a7db405ac9df3a4c23d4e8c8104569f0851b014eb0c768950c4f5c133c95c40\": rpc error: code = NotFound desc = could not find container \"2a7db405ac9df3a4c23d4e8c8104569f0851b014eb0c768950c4f5c133c95c40\": container with ID starting with 2a7db405ac9df3a4c23d4e8c8104569f0851b014eb0c768950c4f5c133c95c40 not found: ID does not exist" Mar 11 01:15:46 crc kubenswrapper[4744]: I0311 01:15:46.316298 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/1ff4983e-6d27-42c0-994f-548501239701-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"1ff4983e-6d27-42c0-994f-548501239701\") " pod="openstack/glance-default-external-api-0" Mar 11 01:15:46 crc kubenswrapper[4744]: I0311 01:15:46.316393 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1ff4983e-6d27-42c0-994f-548501239701-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"1ff4983e-6d27-42c0-994f-548501239701\") " pod="openstack/glance-default-external-api-0" Mar 11 01:15:46 crc kubenswrapper[4744]: I0311 01:15:46.316445 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r56gx\" (UniqueName: \"kubernetes.io/projected/1ff4983e-6d27-42c0-994f-548501239701-kube-api-access-r56gx\") pod \"glance-default-external-api-0\" (UID: \"1ff4983e-6d27-42c0-994f-548501239701\") " pod="openstack/glance-default-external-api-0" Mar 11 01:15:46 crc kubenswrapper[4744]: I0311 01:15:46.316536 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1ff4983e-6d27-42c0-994f-548501239701-config-data\") pod \"glance-default-external-api-0\" (UID: \"1ff4983e-6d27-42c0-994f-548501239701\") " pod="openstack/glance-default-external-api-0" Mar 11 01:15:46 crc kubenswrapper[4744]: I0311 01:15:46.316688 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1ff4983e-6d27-42c0-994f-548501239701-scripts\") pod \"glance-default-external-api-0\" (UID: \"1ff4983e-6d27-42c0-994f-548501239701\") " pod="openstack/glance-default-external-api-0" Mar 11 01:15:46 crc kubenswrapper[4744]: I0311 01:15:46.316717 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1ff4983e-6d27-42c0-994f-548501239701-logs\") pod \"glance-default-external-api-0\" (UID: \"1ff4983e-6d27-42c0-994f-548501239701\") " pod="openstack/glance-default-external-api-0" Mar 11 01:15:46 crc kubenswrapper[4744]: I0311 01:15:46.316743 4744 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"glance-default-external-api-0\" (UID: \"1ff4983e-6d27-42c0-994f-548501239701\") " pod="openstack/glance-default-external-api-0" Mar 11 01:15:46 crc kubenswrapper[4744]: I0311 01:15:46.316794 4744 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/22caad75-2e3e-4fbc-8358-7bab36543b23-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 11 01:15:46 crc kubenswrapper[4744]: I0311 01:15:46.351452 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6478444fbc-pgkhm"] Mar 11 01:15:46 crc kubenswrapper[4744]: I0311 01:15:46.420720 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1ff4983e-6d27-42c0-994f-548501239701-scripts\") pod \"glance-default-external-api-0\" (UID: \"1ff4983e-6d27-42c0-994f-548501239701\") " pod="openstack/glance-default-external-api-0" Mar 11 01:15:46 crc kubenswrapper[4744]: I0311 01:15:46.421302 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1ff4983e-6d27-42c0-994f-548501239701-logs\") pod \"glance-default-external-api-0\" (UID: \"1ff4983e-6d27-42c0-994f-548501239701\") " pod="openstack/glance-default-external-api-0" Mar 11 01:15:46 crc kubenswrapper[4744]: I0311 01:15:46.421336 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"glance-default-external-api-0\" (UID: \"1ff4983e-6d27-42c0-994f-548501239701\") " pod="openstack/glance-default-external-api-0" Mar 11 01:15:46 crc kubenswrapper[4744]: I0311 01:15:46.421385 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" 
(UniqueName: \"kubernetes.io/empty-dir/1ff4983e-6d27-42c0-994f-548501239701-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"1ff4983e-6d27-42c0-994f-548501239701\") " pod="openstack/glance-default-external-api-0" Mar 11 01:15:46 crc kubenswrapper[4744]: I0311 01:15:46.421444 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1ff4983e-6d27-42c0-994f-548501239701-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"1ff4983e-6d27-42c0-994f-548501239701\") " pod="openstack/glance-default-external-api-0" Mar 11 01:15:46 crc kubenswrapper[4744]: I0311 01:15:46.421487 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r56gx\" (UniqueName: \"kubernetes.io/projected/1ff4983e-6d27-42c0-994f-548501239701-kube-api-access-r56gx\") pod \"glance-default-external-api-0\" (UID: \"1ff4983e-6d27-42c0-994f-548501239701\") " pod="openstack/glance-default-external-api-0" Mar 11 01:15:46 crc kubenswrapper[4744]: I0311 01:15:46.421539 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1ff4983e-6d27-42c0-994f-548501239701-config-data\") pod \"glance-default-external-api-0\" (UID: \"1ff4983e-6d27-42c0-994f-548501239701\") " pod="openstack/glance-default-external-api-0" Mar 11 01:15:46 crc kubenswrapper[4744]: I0311 01:15:46.424033 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1ff4983e-6d27-42c0-994f-548501239701-logs\") pod \"glance-default-external-api-0\" (UID: \"1ff4983e-6d27-42c0-994f-548501239701\") " pod="openstack/glance-default-external-api-0" Mar 11 01:15:46 crc kubenswrapper[4744]: I0311 01:15:46.424303 4744 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") 
pod \"glance-default-external-api-0\" (UID: \"1ff4983e-6d27-42c0-994f-548501239701\") device mount path \"/mnt/openstack/pv04\"" pod="openstack/glance-default-external-api-0" Mar 11 01:15:46 crc kubenswrapper[4744]: I0311 01:15:46.424986 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/1ff4983e-6d27-42c0-994f-548501239701-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"1ff4983e-6d27-42c0-994f-548501239701\") " pod="openstack/glance-default-external-api-0" Mar 11 01:15:46 crc kubenswrapper[4744]: I0311 01:15:46.426638 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1ff4983e-6d27-42c0-994f-548501239701-config-data\") pod \"glance-default-external-api-0\" (UID: \"1ff4983e-6d27-42c0-994f-548501239701\") " pod="openstack/glance-default-external-api-0" Mar 11 01:15:46 crc kubenswrapper[4744]: I0311 01:15:46.427866 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1ff4983e-6d27-42c0-994f-548501239701-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"1ff4983e-6d27-42c0-994f-548501239701\") " pod="openstack/glance-default-external-api-0" Mar 11 01:15:46 crc kubenswrapper[4744]: I0311 01:15:46.428126 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1ff4983e-6d27-42c0-994f-548501239701-scripts\") pod \"glance-default-external-api-0\" (UID: \"1ff4983e-6d27-42c0-994f-548501239701\") " pod="openstack/glance-default-external-api-0" Mar 11 01:15:46 crc kubenswrapper[4744]: I0311 01:15:46.451522 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r56gx\" (UniqueName: \"kubernetes.io/projected/1ff4983e-6d27-42c0-994f-548501239701-kube-api-access-r56gx\") pod \"glance-default-external-api-0\" (UID: 
\"1ff4983e-6d27-42c0-994f-548501239701\") " pod="openstack/glance-default-external-api-0" Mar 11 01:15:46 crc kubenswrapper[4744]: I0311 01:15:46.472011 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"glance-default-external-api-0\" (UID: \"1ff4983e-6d27-42c0-994f-548501239701\") " pod="openstack/glance-default-external-api-0" Mar 11 01:15:46 crc kubenswrapper[4744]: I0311 01:15:46.502156 4744 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-58f8d8dcc-n7kcc"] Mar 11 01:15:46 crc kubenswrapper[4744]: I0311 01:15:46.521616 4744 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-58f8d8dcc-n7kcc"] Mar 11 01:15:46 crc kubenswrapper[4744]: I0311 01:15:46.541640 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 11 01:15:46 crc kubenswrapper[4744]: I0311 01:15:46.543102 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Mar 11 01:15:46 crc kubenswrapper[4744]: I0311 01:15:46.543893 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 11 01:15:46 crc kubenswrapper[4744]: I0311 01:15:46.546533 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Mar 11 01:15:46 crc kubenswrapper[4744]: I0311 01:15:46.601072 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Mar 11 01:15:46 crc kubenswrapper[4744]: I0311 01:15:46.614904 4744 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-679ccc59c7-vrk9j" Mar 11 01:15:46 crc kubenswrapper[4744]: I0311 01:15:46.633122 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"glance-default-internal-api-0\" (UID: \"3f75d297-4f23-4c27-8754-1685d2c195b2\") " pod="openstack/glance-default-internal-api-0" Mar 11 01:15:46 crc kubenswrapper[4744]: I0311 01:15:46.633185 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3f75d297-4f23-4c27-8754-1685d2c195b2-scripts\") pod \"glance-default-internal-api-0\" (UID: \"3f75d297-4f23-4c27-8754-1685d2c195b2\") " pod="openstack/glance-default-internal-api-0" Mar 11 01:15:46 crc kubenswrapper[4744]: I0311 01:15:46.633208 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/3f75d297-4f23-4c27-8754-1685d2c195b2-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"3f75d297-4f23-4c27-8754-1685d2c195b2\") " pod="openstack/glance-default-internal-api-0" Mar 11 01:15:46 crc kubenswrapper[4744]: I0311 01:15:46.633251 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n8gvf\" (UniqueName: \"kubernetes.io/projected/3f75d297-4f23-4c27-8754-1685d2c195b2-kube-api-access-n8gvf\") pod \"glance-default-internal-api-0\" (UID: \"3f75d297-4f23-4c27-8754-1685d2c195b2\") " pod="openstack/glance-default-internal-api-0" Mar 11 01:15:46 crc kubenswrapper[4744]: I0311 01:15:46.633292 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3f75d297-4f23-4c27-8754-1685d2c195b2-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: 
\"3f75d297-4f23-4c27-8754-1685d2c195b2\") " pod="openstack/glance-default-internal-api-0" Mar 11 01:15:46 crc kubenswrapper[4744]: I0311 01:15:46.633312 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3f75d297-4f23-4c27-8754-1685d2c195b2-logs\") pod \"glance-default-internal-api-0\" (UID: \"3f75d297-4f23-4c27-8754-1685d2c195b2\") " pod="openstack/glance-default-internal-api-0" Mar 11 01:15:46 crc kubenswrapper[4744]: I0311 01:15:46.633354 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3f75d297-4f23-4c27-8754-1685d2c195b2-config-data\") pod \"glance-default-internal-api-0\" (UID: \"3f75d297-4f23-4c27-8754-1685d2c195b2\") " pod="openstack/glance-default-internal-api-0" Mar 11 01:15:46 crc kubenswrapper[4744]: I0311 01:15:46.734389 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/59fa9917-a1d9-4554-afec-483df5e5f3b5-dns-swift-storage-0\") pod \"59fa9917-a1d9-4554-afec-483df5e5f3b5\" (UID: \"59fa9917-a1d9-4554-afec-483df5e5f3b5\") " Mar 11 01:15:46 crc kubenswrapper[4744]: I0311 01:15:46.734760 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/59fa9917-a1d9-4554-afec-483df5e5f3b5-dns-svc\") pod \"59fa9917-a1d9-4554-afec-483df5e5f3b5\" (UID: \"59fa9917-a1d9-4554-afec-483df5e5f3b5\") " Mar 11 01:15:46 crc kubenswrapper[4744]: I0311 01:15:46.734791 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/59fa9917-a1d9-4554-afec-483df5e5f3b5-ovsdbserver-nb\") pod \"59fa9917-a1d9-4554-afec-483df5e5f3b5\" (UID: \"59fa9917-a1d9-4554-afec-483df5e5f3b5\") " Mar 11 01:15:46 crc kubenswrapper[4744]: I0311 
01:15:46.735420 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/59fa9917-a1d9-4554-afec-483df5e5f3b5-ovsdbserver-sb\") pod \"59fa9917-a1d9-4554-afec-483df5e5f3b5\" (UID: \"59fa9917-a1d9-4554-afec-483df5e5f3b5\") " Mar 11 01:15:46 crc kubenswrapper[4744]: I0311 01:15:46.735674 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/59fa9917-a1d9-4554-afec-483df5e5f3b5-config\") pod \"59fa9917-a1d9-4554-afec-483df5e5f3b5\" (UID: \"59fa9917-a1d9-4554-afec-483df5e5f3b5\") " Mar 11 01:15:46 crc kubenswrapper[4744]: I0311 01:15:46.735821 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ttnpn\" (UniqueName: \"kubernetes.io/projected/59fa9917-a1d9-4554-afec-483df5e5f3b5-kube-api-access-ttnpn\") pod \"59fa9917-a1d9-4554-afec-483df5e5f3b5\" (UID: \"59fa9917-a1d9-4554-afec-483df5e5f3b5\") " Mar 11 01:15:46 crc kubenswrapper[4744]: I0311 01:15:46.736146 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n8gvf\" (UniqueName: \"kubernetes.io/projected/3f75d297-4f23-4c27-8754-1685d2c195b2-kube-api-access-n8gvf\") pod \"glance-default-internal-api-0\" (UID: \"3f75d297-4f23-4c27-8754-1685d2c195b2\") " pod="openstack/glance-default-internal-api-0" Mar 11 01:15:46 crc kubenswrapper[4744]: I0311 01:15:46.736218 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3f75d297-4f23-4c27-8754-1685d2c195b2-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"3f75d297-4f23-4c27-8754-1685d2c195b2\") " pod="openstack/glance-default-internal-api-0" Mar 11 01:15:46 crc kubenswrapper[4744]: I0311 01:15:46.736243 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/3f75d297-4f23-4c27-8754-1685d2c195b2-logs\") pod \"glance-default-internal-api-0\" (UID: \"3f75d297-4f23-4c27-8754-1685d2c195b2\") " pod="openstack/glance-default-internal-api-0" Mar 11 01:15:46 crc kubenswrapper[4744]: I0311 01:15:46.736307 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3f75d297-4f23-4c27-8754-1685d2c195b2-config-data\") pod \"glance-default-internal-api-0\" (UID: \"3f75d297-4f23-4c27-8754-1685d2c195b2\") " pod="openstack/glance-default-internal-api-0" Mar 11 01:15:46 crc kubenswrapper[4744]: I0311 01:15:46.736357 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"glance-default-internal-api-0\" (UID: \"3f75d297-4f23-4c27-8754-1685d2c195b2\") " pod="openstack/glance-default-internal-api-0" Mar 11 01:15:46 crc kubenswrapper[4744]: I0311 01:15:46.736385 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3f75d297-4f23-4c27-8754-1685d2c195b2-scripts\") pod \"glance-default-internal-api-0\" (UID: \"3f75d297-4f23-4c27-8754-1685d2c195b2\") " pod="openstack/glance-default-internal-api-0" Mar 11 01:15:46 crc kubenswrapper[4744]: I0311 01:15:46.736404 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/3f75d297-4f23-4c27-8754-1685d2c195b2-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"3f75d297-4f23-4c27-8754-1685d2c195b2\") " pod="openstack/glance-default-internal-api-0" Mar 11 01:15:46 crc kubenswrapper[4744]: I0311 01:15:46.737679 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/3f75d297-4f23-4c27-8754-1685d2c195b2-httpd-run\") pod \"glance-default-internal-api-0\" (UID: 
\"3f75d297-4f23-4c27-8754-1685d2c195b2\") " pod="openstack/glance-default-internal-api-0" Mar 11 01:15:46 crc kubenswrapper[4744]: I0311 01:15:46.737821 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3f75d297-4f23-4c27-8754-1685d2c195b2-logs\") pod \"glance-default-internal-api-0\" (UID: \"3f75d297-4f23-4c27-8754-1685d2c195b2\") " pod="openstack/glance-default-internal-api-0" Mar 11 01:15:46 crc kubenswrapper[4744]: I0311 01:15:46.737977 4744 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"glance-default-internal-api-0\" (UID: \"3f75d297-4f23-4c27-8754-1685d2c195b2\") device mount path \"/mnt/openstack/pv01\"" pod="openstack/glance-default-internal-api-0" Mar 11 01:15:46 crc kubenswrapper[4744]: I0311 01:15:46.745499 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3f75d297-4f23-4c27-8754-1685d2c195b2-config-data\") pod \"glance-default-internal-api-0\" (UID: \"3f75d297-4f23-4c27-8754-1685d2c195b2\") " pod="openstack/glance-default-internal-api-0" Mar 11 01:15:46 crc kubenswrapper[4744]: I0311 01:15:46.752713 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/59fa9917-a1d9-4554-afec-483df5e5f3b5-kube-api-access-ttnpn" (OuterVolumeSpecName: "kube-api-access-ttnpn") pod "59fa9917-a1d9-4554-afec-483df5e5f3b5" (UID: "59fa9917-a1d9-4554-afec-483df5e5f3b5"). InnerVolumeSpecName "kube-api-access-ttnpn". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 01:15:46 crc kubenswrapper[4744]: I0311 01:15:46.753082 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3f75d297-4f23-4c27-8754-1685d2c195b2-scripts\") pod \"glance-default-internal-api-0\" (UID: \"3f75d297-4f23-4c27-8754-1685d2c195b2\") " pod="openstack/glance-default-internal-api-0" Mar 11 01:15:46 crc kubenswrapper[4744]: I0311 01:15:46.780476 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3f75d297-4f23-4c27-8754-1685d2c195b2-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"3f75d297-4f23-4c27-8754-1685d2c195b2\") " pod="openstack/glance-default-internal-api-0" Mar 11 01:15:46 crc kubenswrapper[4744]: I0311 01:15:46.795568 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n8gvf\" (UniqueName: \"kubernetes.io/projected/3f75d297-4f23-4c27-8754-1685d2c195b2-kube-api-access-n8gvf\") pod \"glance-default-internal-api-0\" (UID: \"3f75d297-4f23-4c27-8754-1685d2c195b2\") " pod="openstack/glance-default-internal-api-0" Mar 11 01:15:46 crc kubenswrapper[4744]: I0311 01:15:46.822502 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/59fa9917-a1d9-4554-afec-483df5e5f3b5-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "59fa9917-a1d9-4554-afec-483df5e5f3b5" (UID: "59fa9917-a1d9-4554-afec-483df5e5f3b5"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 11 01:15:46 crc kubenswrapper[4744]: I0311 01:15:46.838228 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/59fa9917-a1d9-4554-afec-483df5e5f3b5-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "59fa9917-a1d9-4554-afec-483df5e5f3b5" (UID: "59fa9917-a1d9-4554-afec-483df5e5f3b5"). 
InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 11 01:15:46 crc kubenswrapper[4744]: I0311 01:15:46.838855 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/59fa9917-a1d9-4554-afec-483df5e5f3b5-dns-svc\") pod \"59fa9917-a1d9-4554-afec-483df5e5f3b5\" (UID: \"59fa9917-a1d9-4554-afec-483df5e5f3b5\") " Mar 11 01:15:46 crc kubenswrapper[4744]: W0311 01:15:46.840469 4744 empty_dir.go:500] Warning: Unmount skipped because path does not exist: /var/lib/kubelet/pods/59fa9917-a1d9-4554-afec-483df5e5f3b5/volumes/kubernetes.io~configmap/dns-svc Mar 11 01:15:46 crc kubenswrapper[4744]: I0311 01:15:46.840500 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/59fa9917-a1d9-4554-afec-483df5e5f3b5-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "59fa9917-a1d9-4554-afec-483df5e5f3b5" (UID: "59fa9917-a1d9-4554-afec-483df5e5f3b5"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 11 01:15:46 crc kubenswrapper[4744]: I0311 01:15:46.842305 4744 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/59fa9917-a1d9-4554-afec-483df5e5f3b5-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 11 01:15:46 crc kubenswrapper[4744]: I0311 01:15:46.842330 4744 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ttnpn\" (UniqueName: \"kubernetes.io/projected/59fa9917-a1d9-4554-afec-483df5e5f3b5-kube-api-access-ttnpn\") on node \"crc\" DevicePath \"\"" Mar 11 01:15:46 crc kubenswrapper[4744]: I0311 01:15:46.842342 4744 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/59fa9917-a1d9-4554-afec-483df5e5f3b5-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Mar 11 01:15:46 crc kubenswrapper[4744]: I0311 01:15:46.845903 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/59fa9917-a1d9-4554-afec-483df5e5f3b5-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "59fa9917-a1d9-4554-afec-483df5e5f3b5" (UID: "59fa9917-a1d9-4554-afec-483df5e5f3b5"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 11 01:15:46 crc kubenswrapper[4744]: I0311 01:15:46.847957 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"glance-default-internal-api-0\" (UID: \"3f75d297-4f23-4c27-8754-1685d2c195b2\") " pod="openstack/glance-default-internal-api-0" Mar 11 01:15:46 crc kubenswrapper[4744]: I0311 01:15:46.852848 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/59fa9917-a1d9-4554-afec-483df5e5f3b5-config" (OuterVolumeSpecName: "config") pod "59fa9917-a1d9-4554-afec-483df5e5f3b5" (UID: "59fa9917-a1d9-4554-afec-483df5e5f3b5"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 11 01:15:46 crc kubenswrapper[4744]: I0311 01:15:46.875193 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Mar 11 01:15:46 crc kubenswrapper[4744]: I0311 01:15:46.877816 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/59fa9917-a1d9-4554-afec-483df5e5f3b5-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "59fa9917-a1d9-4554-afec-483df5e5f3b5" (UID: "59fa9917-a1d9-4554-afec-483df5e5f3b5"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 11 01:15:46 crc kubenswrapper[4744]: I0311 01:15:46.944424 4744 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/59fa9917-a1d9-4554-afec-483df5e5f3b5-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 11 01:15:46 crc kubenswrapper[4744]: I0311 01:15:46.944454 4744 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/59fa9917-a1d9-4554-afec-483df5e5f3b5-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 11 01:15:46 crc kubenswrapper[4744]: I0311 01:15:46.944464 4744 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/59fa9917-a1d9-4554-afec-483df5e5f3b5-config\") on node \"crc\" DevicePath \"\"" Mar 11 01:15:47 crc kubenswrapper[4744]: I0311 01:15:47.275446 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 11 01:15:47 crc kubenswrapper[4744]: I0311 01:15:47.317572 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-679ccc59c7-vrk9j" event={"ID":"59fa9917-a1d9-4554-afec-483df5e5f3b5","Type":"ContainerDied","Data":"c3f81f7a7dbcc7a27db6fdcfc1aaa89489bd7961e99520066f795f686845ff29"} Mar 11 01:15:47 crc kubenswrapper[4744]: I0311 01:15:47.317629 4744 scope.go:117] "RemoveContainer" containerID="8d47681840034b94af37e8bc246fa03461bdac0595eaf76fe0a850d6cd0b0f78" Mar 11 01:15:47 crc kubenswrapper[4744]: I0311 01:15:47.317809 4744 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-679ccc59c7-vrk9j" Mar 11 01:15:47 crc kubenswrapper[4744]: I0311 01:15:47.320421 4744 generic.go:334] "Generic (PLEG): container finished" podID="5f47e6da-0f4e-4b87-b21d-5d0b7adef080" containerID="50a1eb1cd2fc2a8d73a6f158d2e6b6c133112c74e68b57860572f9d362d8190e" exitCode=0 Mar 11 01:15:47 crc kubenswrapper[4744]: I0311 01:15:47.320488 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6478444fbc-pgkhm" event={"ID":"5f47e6da-0f4e-4b87-b21d-5d0b7adef080","Type":"ContainerDied","Data":"50a1eb1cd2fc2a8d73a6f158d2e6b6c133112c74e68b57860572f9d362d8190e"} Mar 11 01:15:47 crc kubenswrapper[4744]: I0311 01:15:47.320527 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6478444fbc-pgkhm" event={"ID":"5f47e6da-0f4e-4b87-b21d-5d0b7adef080","Type":"ContainerStarted","Data":"dee2546bd882ee57122855c388019099ea7163d10368e479de2812d9ff00ed6b"} Mar 11 01:15:47 crc kubenswrapper[4744]: I0311 01:15:47.346836 4744 generic.go:334] "Generic (PLEG): container finished" podID="3bd4d017-1f55-468a-8a09-472f77929440" containerID="6ff96f542e6deaab6c86c38fdd5ffdd2c4bf502dce96c3bc2a3aef59dde013cd" exitCode=0 Mar 11 01:15:47 crc kubenswrapper[4744]: I0311 01:15:47.348029 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-86cdbd9bcc-vrldq" event={"ID":"3bd4d017-1f55-468a-8a09-472f77929440","Type":"ContainerDied","Data":"6ff96f542e6deaab6c86c38fdd5ffdd2c4bf502dce96c3bc2a3aef59dde013cd"} Mar 11 01:15:47 crc kubenswrapper[4744]: I0311 01:15:47.449830 4744 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-679ccc59c7-vrk9j"] Mar 11 01:15:47 crc kubenswrapper[4744]: I0311 01:15:47.480583 4744 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-679ccc59c7-vrk9j"] Mar 11 01:15:47 crc kubenswrapper[4744]: I0311 01:15:47.705303 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openstack/glance-default-internal-api-0"] Mar 11 01:15:47 crc kubenswrapper[4744]: I0311 01:15:47.764525 4744 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-86cdbd9bcc-vrldq" Mar 11 01:15:47 crc kubenswrapper[4744]: I0311 01:15:47.874784 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/3bd4d017-1f55-468a-8a09-472f77929440-ovsdbserver-sb\") pod \"3bd4d017-1f55-468a-8a09-472f77929440\" (UID: \"3bd4d017-1f55-468a-8a09-472f77929440\") " Mar 11 01:15:47 crc kubenswrapper[4744]: I0311 01:15:47.874835 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v57c2\" (UniqueName: \"kubernetes.io/projected/3bd4d017-1f55-468a-8a09-472f77929440-kube-api-access-v57c2\") pod \"3bd4d017-1f55-468a-8a09-472f77929440\" (UID: \"3bd4d017-1f55-468a-8a09-472f77929440\") " Mar 11 01:15:47 crc kubenswrapper[4744]: I0311 01:15:47.874932 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/3bd4d017-1f55-468a-8a09-472f77929440-dns-svc\") pod \"3bd4d017-1f55-468a-8a09-472f77929440\" (UID: \"3bd4d017-1f55-468a-8a09-472f77929440\") " Mar 11 01:15:47 crc kubenswrapper[4744]: I0311 01:15:47.874965 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/3bd4d017-1f55-468a-8a09-472f77929440-ovsdbserver-nb\") pod \"3bd4d017-1f55-468a-8a09-472f77929440\" (UID: \"3bd4d017-1f55-468a-8a09-472f77929440\") " Mar 11 01:15:47 crc kubenswrapper[4744]: I0311 01:15:47.875019 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3bd4d017-1f55-468a-8a09-472f77929440-config\") pod \"3bd4d017-1f55-468a-8a09-472f77929440\" (UID: \"3bd4d017-1f55-468a-8a09-472f77929440\") " Mar 11 
01:15:47 crc kubenswrapper[4744]: I0311 01:15:47.875099 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/3bd4d017-1f55-468a-8a09-472f77929440-dns-swift-storage-0\") pod \"3bd4d017-1f55-468a-8a09-472f77929440\" (UID: \"3bd4d017-1f55-468a-8a09-472f77929440\") " Mar 11 01:15:47 crc kubenswrapper[4744]: I0311 01:15:47.939368 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3bd4d017-1f55-468a-8a09-472f77929440-kube-api-access-v57c2" (OuterVolumeSpecName: "kube-api-access-v57c2") pod "3bd4d017-1f55-468a-8a09-472f77929440" (UID: "3bd4d017-1f55-468a-8a09-472f77929440"). InnerVolumeSpecName "kube-api-access-v57c2". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 01:15:47 crc kubenswrapper[4744]: I0311 01:15:47.943241 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3bd4d017-1f55-468a-8a09-472f77929440-config" (OuterVolumeSpecName: "config") pod "3bd4d017-1f55-468a-8a09-472f77929440" (UID: "3bd4d017-1f55-468a-8a09-472f77929440"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 11 01:15:47 crc kubenswrapper[4744]: I0311 01:15:47.949036 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3bd4d017-1f55-468a-8a09-472f77929440-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "3bd4d017-1f55-468a-8a09-472f77929440" (UID: "3bd4d017-1f55-468a-8a09-472f77929440"). InnerVolumeSpecName "dns-swift-storage-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 11 01:15:47 crc kubenswrapper[4744]: I0311 01:15:47.951359 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3bd4d017-1f55-468a-8a09-472f77929440-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "3bd4d017-1f55-468a-8a09-472f77929440" (UID: "3bd4d017-1f55-468a-8a09-472f77929440"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 11 01:15:47 crc kubenswrapper[4744]: I0311 01:15:47.972957 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3bd4d017-1f55-468a-8a09-472f77929440-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "3bd4d017-1f55-468a-8a09-472f77929440" (UID: "3bd4d017-1f55-468a-8a09-472f77929440"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 11 01:15:47 crc kubenswrapper[4744]: I0311 01:15:47.978093 4744 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/3bd4d017-1f55-468a-8a09-472f77929440-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Mar 11 01:15:48 crc kubenswrapper[4744]: I0311 01:15:48.010799 4744 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v57c2\" (UniqueName: \"kubernetes.io/projected/3bd4d017-1f55-468a-8a09-472f77929440-kube-api-access-v57c2\") on node \"crc\" DevicePath \"\"" Mar 11 01:15:48 crc kubenswrapper[4744]: I0311 01:15:48.020093 4744 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/3bd4d017-1f55-468a-8a09-472f77929440-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 11 01:15:48 crc kubenswrapper[4744]: I0311 01:15:48.020323 4744 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/3bd4d017-1f55-468a-8a09-472f77929440-ovsdbserver-nb\") on node \"crc\" DevicePath 
\"\"" Mar 11 01:15:48 crc kubenswrapper[4744]: I0311 01:15:48.020433 4744 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3bd4d017-1f55-468a-8a09-472f77929440-config\") on node \"crc\" DevicePath \"\"" Mar 11 01:15:48 crc kubenswrapper[4744]: I0311 01:15:48.017827 4744 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="22caad75-2e3e-4fbc-8358-7bab36543b23" path="/var/lib/kubelet/pods/22caad75-2e3e-4fbc-8358-7bab36543b23/volumes" Mar 11 01:15:48 crc kubenswrapper[4744]: I0311 01:15:48.036187 4744 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="59fa9917-a1d9-4554-afec-483df5e5f3b5" path="/var/lib/kubelet/pods/59fa9917-a1d9-4554-afec-483df5e5f3b5/volumes" Mar 11 01:15:48 crc kubenswrapper[4744]: I0311 01:15:48.044632 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3bd4d017-1f55-468a-8a09-472f77929440-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "3bd4d017-1f55-468a-8a09-472f77929440" (UID: "3bd4d017-1f55-468a-8a09-472f77929440"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 11 01:15:48 crc kubenswrapper[4744]: I0311 01:15:48.109288 4744 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 11 01:15:48 crc kubenswrapper[4744]: I0311 01:15:48.127750 4744 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/3bd4d017-1f55-468a-8a09-472f77929440-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 11 01:15:48 crc kubenswrapper[4744]: I0311 01:15:48.127797 4744 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 11 01:15:48 crc kubenswrapper[4744]: I0311 01:15:48.138429 4744 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 11 01:15:48 crc kubenswrapper[4744]: I0311 01:15:48.369731 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-86cdbd9bcc-vrldq" event={"ID":"3bd4d017-1f55-468a-8a09-472f77929440","Type":"ContainerDied","Data":"4781bd965fb6ba0743daefc46730cf87a183e95beb94d62eab7681be8ba9eb11"} Mar 11 01:15:48 crc kubenswrapper[4744]: I0311 01:15:48.369783 4744 scope.go:117] "RemoveContainer" containerID="6ff96f542e6deaab6c86c38fdd5ffdd2c4bf502dce96c3bc2a3aef59dde013cd" Mar 11 01:15:48 crc kubenswrapper[4744]: I0311 01:15:48.369945 4744 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-86cdbd9bcc-vrldq" Mar 11 01:15:48 crc kubenswrapper[4744]: I0311 01:15:48.395819 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"3f75d297-4f23-4c27-8754-1685d2c195b2","Type":"ContainerStarted","Data":"f97e7a390f540dd49b7b43a21228e7030508f148892e459ccf111da272adce6e"} Mar 11 01:15:48 crc kubenswrapper[4744]: I0311 01:15:48.400072 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"1ff4983e-6d27-42c0-994f-548501239701","Type":"ContainerStarted","Data":"6c5973b8ad454dec36dc362bec9ba4cabac4856b233cf21d8aff3da6b5435885"} Mar 11 01:15:48 crc kubenswrapper[4744]: I0311 01:15:48.437146 4744 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-86cdbd9bcc-vrldq"] Mar 11 01:15:48 crc kubenswrapper[4744]: I0311 01:15:48.438039 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6478444fbc-pgkhm" event={"ID":"5f47e6da-0f4e-4b87-b21d-5d0b7adef080","Type":"ContainerStarted","Data":"323f56b86f07d13644f5651fced8bc7e6ed8f3296cd03d759582c4cf8afc9bcc"} Mar 11 01:15:48 crc kubenswrapper[4744]: I0311 01:15:48.438589 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-6478444fbc-pgkhm" Mar 11 01:15:48 crc kubenswrapper[4744]: I0311 01:15:48.461893 4744 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-86cdbd9bcc-vrldq"] Mar 11 01:15:48 crc kubenswrapper[4744]: I0311 01:15:48.480161 4744 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-6478444fbc-pgkhm" podStartSLOduration=3.480124709 podStartE2EDuration="3.480124709s" podCreationTimestamp="2026-03-11 01:15:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-11 01:15:48.456644293 +0000 UTC 
m=+1305.260861888" watchObservedRunningTime="2026-03-11 01:15:48.480124709 +0000 UTC m=+1305.284342314" Mar 11 01:15:49 crc kubenswrapper[4744]: I0311 01:15:49.471628 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"3f75d297-4f23-4c27-8754-1685d2c195b2","Type":"ContainerStarted","Data":"3fb42db21ab2aef7b4de5b76f1f1ed9c7a131ff66cd32935df7bef61d58e9d53"} Mar 11 01:15:49 crc kubenswrapper[4744]: I0311 01:15:49.474501 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"1ff4983e-6d27-42c0-994f-548501239701","Type":"ContainerStarted","Data":"b9a3760cab355f9540a39ca01c011f31d60e6acedf22fd404ab874897d439a74"} Mar 11 01:15:49 crc kubenswrapper[4744]: I0311 01:15:49.987855 4744 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3bd4d017-1f55-468a-8a09-472f77929440" path="/var/lib/kubelet/pods/3bd4d017-1f55-468a-8a09-472f77929440/volumes" Mar 11 01:15:50 crc kubenswrapper[4744]: I0311 01:15:50.486376 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"1ff4983e-6d27-42c0-994f-548501239701","Type":"ContainerStarted","Data":"285810b5e7c963db6a3ea289da93e517576304e5c084c78448e34b175d61ad7d"} Mar 11 01:15:50 crc kubenswrapper[4744]: I0311 01:15:50.486722 4744 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="1ff4983e-6d27-42c0-994f-548501239701" containerName="glance-log" containerID="cri-o://b9a3760cab355f9540a39ca01c011f31d60e6acedf22fd404ab874897d439a74" gracePeriod=30 Mar 11 01:15:50 crc kubenswrapper[4744]: I0311 01:15:50.486828 4744 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="1ff4983e-6d27-42c0-994f-548501239701" containerName="glance-httpd" 
containerID="cri-o://285810b5e7c963db6a3ea289da93e517576304e5c084c78448e34b175d61ad7d" gracePeriod=30 Mar 11 01:15:50 crc kubenswrapper[4744]: I0311 01:15:50.494250 4744 generic.go:334] "Generic (PLEG): container finished" podID="1f8dfaaa-9a7f-4693-8f45-a4a84123b714" containerID="119366017deed13d5411b21ac7aa67584fa8eee5670f556b1938ccb084cb7a63" exitCode=0 Mar 11 01:15:50 crc kubenswrapper[4744]: I0311 01:15:50.494297 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-9ddjj" event={"ID":"1f8dfaaa-9a7f-4693-8f45-a4a84123b714","Type":"ContainerDied","Data":"119366017deed13d5411b21ac7aa67584fa8eee5670f556b1938ccb084cb7a63"} Mar 11 01:15:50 crc kubenswrapper[4744]: I0311 01:15:50.502255 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"3f75d297-4f23-4c27-8754-1685d2c195b2","Type":"ContainerStarted","Data":"bb9ba941dd6448435d8c9cb00be6afd2570385ce5413da6d36374354b9ad5f81"} Mar 11 01:15:50 crc kubenswrapper[4744]: I0311 01:15:50.502410 4744 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="3f75d297-4f23-4c27-8754-1685d2c195b2" containerName="glance-log" containerID="cri-o://3fb42db21ab2aef7b4de5b76f1f1ed9c7a131ff66cd32935df7bef61d58e9d53" gracePeriod=30 Mar 11 01:15:50 crc kubenswrapper[4744]: I0311 01:15:50.502623 4744 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="3f75d297-4f23-4c27-8754-1685d2c195b2" containerName="glance-httpd" containerID="cri-o://bb9ba941dd6448435d8c9cb00be6afd2570385ce5413da6d36374354b9ad5f81" gracePeriod=30 Mar 11 01:15:50 crc kubenswrapper[4744]: I0311 01:15:50.522873 4744 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=5.522852852 podStartE2EDuration="5.522852852s" podCreationTimestamp="2026-03-11 01:15:45 +0000 
UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-11 01:15:50.512852063 +0000 UTC m=+1307.317069668" watchObservedRunningTime="2026-03-11 01:15:50.522852852 +0000 UTC m=+1307.327070457" Mar 11 01:15:50 crc kubenswrapper[4744]: I0311 01:15:50.535186 4744 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=5.535165083 podStartE2EDuration="5.535165083s" podCreationTimestamp="2026-03-11 01:15:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-11 01:15:50.530104426 +0000 UTC m=+1307.334322031" watchObservedRunningTime="2026-03-11 01:15:50.535165083 +0000 UTC m=+1307.339382688" Mar 11 01:15:51 crc kubenswrapper[4744]: I0311 01:15:51.516192 4744 generic.go:334] "Generic (PLEG): container finished" podID="3f75d297-4f23-4c27-8754-1685d2c195b2" containerID="bb9ba941dd6448435d8c9cb00be6afd2570385ce5413da6d36374354b9ad5f81" exitCode=0 Mar 11 01:15:51 crc kubenswrapper[4744]: I0311 01:15:51.516402 4744 generic.go:334] "Generic (PLEG): container finished" podID="3f75d297-4f23-4c27-8754-1685d2c195b2" containerID="3fb42db21ab2aef7b4de5b76f1f1ed9c7a131ff66cd32935df7bef61d58e9d53" exitCode=143 Mar 11 01:15:51 crc kubenswrapper[4744]: I0311 01:15:51.516231 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"3f75d297-4f23-4c27-8754-1685d2c195b2","Type":"ContainerDied","Data":"bb9ba941dd6448435d8c9cb00be6afd2570385ce5413da6d36374354b9ad5f81"} Mar 11 01:15:51 crc kubenswrapper[4744]: I0311 01:15:51.516478 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"3f75d297-4f23-4c27-8754-1685d2c195b2","Type":"ContainerDied","Data":"3fb42db21ab2aef7b4de5b76f1f1ed9c7a131ff66cd32935df7bef61d58e9d53"} Mar 11 
01:15:51 crc kubenswrapper[4744]: I0311 01:15:51.518815 4744 generic.go:334] "Generic (PLEG): container finished" podID="1ff4983e-6d27-42c0-994f-548501239701" containerID="285810b5e7c963db6a3ea289da93e517576304e5c084c78448e34b175d61ad7d" exitCode=0 Mar 11 01:15:51 crc kubenswrapper[4744]: I0311 01:15:51.518856 4744 generic.go:334] "Generic (PLEG): container finished" podID="1ff4983e-6d27-42c0-994f-548501239701" containerID="b9a3760cab355f9540a39ca01c011f31d60e6acedf22fd404ab874897d439a74" exitCode=143 Mar 11 01:15:51 crc kubenswrapper[4744]: I0311 01:15:51.518871 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"1ff4983e-6d27-42c0-994f-548501239701","Type":"ContainerDied","Data":"285810b5e7c963db6a3ea289da93e517576304e5c084c78448e34b175d61ad7d"} Mar 11 01:15:51 crc kubenswrapper[4744]: I0311 01:15:51.518899 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"1ff4983e-6d27-42c0-994f-548501239701","Type":"ContainerDied","Data":"b9a3760cab355f9540a39ca01c011f31d60e6acedf22fd404ab874897d439a74"} Mar 11 01:15:55 crc kubenswrapper[4744]: I0311 01:15:55.022641 4744 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-9ddjj" Mar 11 01:15:55 crc kubenswrapper[4744]: I0311 01:15:55.175233 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/1f8dfaaa-9a7f-4693-8f45-a4a84123b714-fernet-keys\") pod \"1f8dfaaa-9a7f-4693-8f45-a4a84123b714\" (UID: \"1f8dfaaa-9a7f-4693-8f45-a4a84123b714\") " Mar 11 01:15:55 crc kubenswrapper[4744]: I0311 01:15:55.175327 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1f8dfaaa-9a7f-4693-8f45-a4a84123b714-combined-ca-bundle\") pod \"1f8dfaaa-9a7f-4693-8f45-a4a84123b714\" (UID: \"1f8dfaaa-9a7f-4693-8f45-a4a84123b714\") " Mar 11 01:15:55 crc kubenswrapper[4744]: I0311 01:15:55.175371 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/1f8dfaaa-9a7f-4693-8f45-a4a84123b714-credential-keys\") pod \"1f8dfaaa-9a7f-4693-8f45-a4a84123b714\" (UID: \"1f8dfaaa-9a7f-4693-8f45-a4a84123b714\") " Mar 11 01:15:55 crc kubenswrapper[4744]: I0311 01:15:55.175427 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1f8dfaaa-9a7f-4693-8f45-a4a84123b714-scripts\") pod \"1f8dfaaa-9a7f-4693-8f45-a4a84123b714\" (UID: \"1f8dfaaa-9a7f-4693-8f45-a4a84123b714\") " Mar 11 01:15:55 crc kubenswrapper[4744]: I0311 01:15:55.175614 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cfxkw\" (UniqueName: \"kubernetes.io/projected/1f8dfaaa-9a7f-4693-8f45-a4a84123b714-kube-api-access-cfxkw\") pod \"1f8dfaaa-9a7f-4693-8f45-a4a84123b714\" (UID: \"1f8dfaaa-9a7f-4693-8f45-a4a84123b714\") " Mar 11 01:15:55 crc kubenswrapper[4744]: I0311 01:15:55.175671 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" 
(UniqueName: \"kubernetes.io/secret/1f8dfaaa-9a7f-4693-8f45-a4a84123b714-config-data\") pod \"1f8dfaaa-9a7f-4693-8f45-a4a84123b714\" (UID: \"1f8dfaaa-9a7f-4693-8f45-a4a84123b714\") " Mar 11 01:15:55 crc kubenswrapper[4744]: I0311 01:15:55.181974 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1f8dfaaa-9a7f-4693-8f45-a4a84123b714-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "1f8dfaaa-9a7f-4693-8f45-a4a84123b714" (UID: "1f8dfaaa-9a7f-4693-8f45-a4a84123b714"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 01:15:55 crc kubenswrapper[4744]: I0311 01:15:55.184124 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1f8dfaaa-9a7f-4693-8f45-a4a84123b714-scripts" (OuterVolumeSpecName: "scripts") pod "1f8dfaaa-9a7f-4693-8f45-a4a84123b714" (UID: "1f8dfaaa-9a7f-4693-8f45-a4a84123b714"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 01:15:55 crc kubenswrapper[4744]: I0311 01:15:55.190157 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1f8dfaaa-9a7f-4693-8f45-a4a84123b714-kube-api-access-cfxkw" (OuterVolumeSpecName: "kube-api-access-cfxkw") pod "1f8dfaaa-9a7f-4693-8f45-a4a84123b714" (UID: "1f8dfaaa-9a7f-4693-8f45-a4a84123b714"). InnerVolumeSpecName "kube-api-access-cfxkw". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 01:15:55 crc kubenswrapper[4744]: I0311 01:15:55.204649 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1f8dfaaa-9a7f-4693-8f45-a4a84123b714-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "1f8dfaaa-9a7f-4693-8f45-a4a84123b714" (UID: "1f8dfaaa-9a7f-4693-8f45-a4a84123b714"). InnerVolumeSpecName "credential-keys". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 01:15:55 crc kubenswrapper[4744]: I0311 01:15:55.206367 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1f8dfaaa-9a7f-4693-8f45-a4a84123b714-config-data" (OuterVolumeSpecName: "config-data") pod "1f8dfaaa-9a7f-4693-8f45-a4a84123b714" (UID: "1f8dfaaa-9a7f-4693-8f45-a4a84123b714"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 01:15:55 crc kubenswrapper[4744]: I0311 01:15:55.206765 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1f8dfaaa-9a7f-4693-8f45-a4a84123b714-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "1f8dfaaa-9a7f-4693-8f45-a4a84123b714" (UID: "1f8dfaaa-9a7f-4693-8f45-a4a84123b714"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 01:15:55 crc kubenswrapper[4744]: I0311 01:15:55.277604 4744 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cfxkw\" (UniqueName: \"kubernetes.io/projected/1f8dfaaa-9a7f-4693-8f45-a4a84123b714-kube-api-access-cfxkw\") on node \"crc\" DevicePath \"\"" Mar 11 01:15:55 crc kubenswrapper[4744]: I0311 01:15:55.277640 4744 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1f8dfaaa-9a7f-4693-8f45-a4a84123b714-config-data\") on node \"crc\" DevicePath \"\"" Mar 11 01:15:55 crc kubenswrapper[4744]: I0311 01:15:55.277650 4744 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/1f8dfaaa-9a7f-4693-8f45-a4a84123b714-fernet-keys\") on node \"crc\" DevicePath \"\"" Mar 11 01:15:55 crc kubenswrapper[4744]: I0311 01:15:55.277658 4744 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1f8dfaaa-9a7f-4693-8f45-a4a84123b714-combined-ca-bundle\") on node \"crc\" DevicePath 
\"\"" Mar 11 01:15:55 crc kubenswrapper[4744]: I0311 01:15:55.277667 4744 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/1f8dfaaa-9a7f-4693-8f45-a4a84123b714-credential-keys\") on node \"crc\" DevicePath \"\"" Mar 11 01:15:55 crc kubenswrapper[4744]: I0311 01:15:55.277674 4744 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1f8dfaaa-9a7f-4693-8f45-a4a84123b714-scripts\") on node \"crc\" DevicePath \"\"" Mar 11 01:15:55 crc kubenswrapper[4744]: I0311 01:15:55.555853 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-9ddjj" event={"ID":"1f8dfaaa-9a7f-4693-8f45-a4a84123b714","Type":"ContainerDied","Data":"073b7eae6e4d206640e0a3f9609d5d66b6845f198b6ba4925f8fa3b0f15835d8"} Mar 11 01:15:55 crc kubenswrapper[4744]: I0311 01:15:55.555902 4744 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="073b7eae6e4d206640e0a3f9609d5d66b6845f198b6ba4925f8fa3b0f15835d8" Mar 11 01:15:55 crc kubenswrapper[4744]: I0311 01:15:55.555950 4744 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-9ddjj" Mar 11 01:15:55 crc kubenswrapper[4744]: I0311 01:15:55.769146 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-6478444fbc-pgkhm" Mar 11 01:15:55 crc kubenswrapper[4744]: I0311 01:15:55.828558 4744 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-58df884995-rf6pm"] Mar 11 01:15:55 crc kubenswrapper[4744]: I0311 01:15:55.828959 4744 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-58df884995-rf6pm" podUID="8d727dc8-84d3-45b0-90e8-22a1f3f043e1" containerName="dnsmasq-dns" containerID="cri-o://6277045b67820d0a1857d93e6e8e3ca7197495ee0c3cd0064b91a9e4d115b55b" gracePeriod=10 Mar 11 01:15:56 crc kubenswrapper[4744]: I0311 01:15:56.109504 4744 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-bootstrap-9ddjj"] Mar 11 01:15:56 crc kubenswrapper[4744]: I0311 01:15:56.114881 4744 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-bootstrap-9ddjj"] Mar 11 01:15:56 crc kubenswrapper[4744]: I0311 01:15:56.216495 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-bootstrap-p8czs"] Mar 11 01:15:56 crc kubenswrapper[4744]: E0311 01:15:56.216973 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1f8dfaaa-9a7f-4693-8f45-a4a84123b714" containerName="keystone-bootstrap" Mar 11 01:15:56 crc kubenswrapper[4744]: I0311 01:15:56.216997 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="1f8dfaaa-9a7f-4693-8f45-a4a84123b714" containerName="keystone-bootstrap" Mar 11 01:15:56 crc kubenswrapper[4744]: E0311 01:15:56.217035 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3bd4d017-1f55-468a-8a09-472f77929440" containerName="init" Mar 11 01:15:56 crc kubenswrapper[4744]: I0311 01:15:56.217043 4744 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="3bd4d017-1f55-468a-8a09-472f77929440" containerName="init" Mar 11 01:15:56 crc kubenswrapper[4744]: E0311 01:15:56.217057 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="59fa9917-a1d9-4554-afec-483df5e5f3b5" containerName="init" Mar 11 01:15:56 crc kubenswrapper[4744]: I0311 01:15:56.217063 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="59fa9917-a1d9-4554-afec-483df5e5f3b5" containerName="init" Mar 11 01:15:56 crc kubenswrapper[4744]: I0311 01:15:56.217264 4744 memory_manager.go:354] "RemoveStaleState removing state" podUID="1f8dfaaa-9a7f-4693-8f45-a4a84123b714" containerName="keystone-bootstrap" Mar 11 01:15:56 crc kubenswrapper[4744]: I0311 01:15:56.217283 4744 memory_manager.go:354] "RemoveStaleState removing state" podUID="3bd4d017-1f55-468a-8a09-472f77929440" containerName="init" Mar 11 01:15:56 crc kubenswrapper[4744]: I0311 01:15:56.217300 4744 memory_manager.go:354] "RemoveStaleState removing state" podUID="59fa9917-a1d9-4554-afec-483df5e5f3b5" containerName="init" Mar 11 01:15:56 crc kubenswrapper[4744]: I0311 01:15:56.218007 4744 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-p8czs" Mar 11 01:15:56 crc kubenswrapper[4744]: I0311 01:15:56.224639 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"osp-secret" Mar 11 01:15:56 crc kubenswrapper[4744]: I0311 01:15:56.225141 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-r54bc" Mar 11 01:15:56 crc kubenswrapper[4744]: I0311 01:15:56.227629 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Mar 11 01:15:56 crc kubenswrapper[4744]: I0311 01:15:56.232191 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Mar 11 01:15:56 crc kubenswrapper[4744]: I0311 01:15:56.232666 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Mar 11 01:15:56 crc kubenswrapper[4744]: I0311 01:15:56.233580 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-p8czs"] Mar 11 01:15:56 crc kubenswrapper[4744]: I0311 01:15:56.297225 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jstlv\" (UniqueName: \"kubernetes.io/projected/30905e0e-95fa-4d7c-b586-f02ef591dc1d-kube-api-access-jstlv\") pod \"keystone-bootstrap-p8czs\" (UID: \"30905e0e-95fa-4d7c-b586-f02ef591dc1d\") " pod="openstack/keystone-bootstrap-p8czs" Mar 11 01:15:56 crc kubenswrapper[4744]: I0311 01:15:56.297310 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/30905e0e-95fa-4d7c-b586-f02ef591dc1d-fernet-keys\") pod \"keystone-bootstrap-p8czs\" (UID: \"30905e0e-95fa-4d7c-b586-f02ef591dc1d\") " pod="openstack/keystone-bootstrap-p8czs" Mar 11 01:15:56 crc kubenswrapper[4744]: I0311 01:15:56.297333 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"config-data\" (UniqueName: \"kubernetes.io/secret/30905e0e-95fa-4d7c-b586-f02ef591dc1d-config-data\") pod \"keystone-bootstrap-p8czs\" (UID: \"30905e0e-95fa-4d7c-b586-f02ef591dc1d\") " pod="openstack/keystone-bootstrap-p8czs" Mar 11 01:15:56 crc kubenswrapper[4744]: I0311 01:15:56.297383 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/30905e0e-95fa-4d7c-b586-f02ef591dc1d-scripts\") pod \"keystone-bootstrap-p8czs\" (UID: \"30905e0e-95fa-4d7c-b586-f02ef591dc1d\") " pod="openstack/keystone-bootstrap-p8czs" Mar 11 01:15:56 crc kubenswrapper[4744]: I0311 01:15:56.297403 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/30905e0e-95fa-4d7c-b586-f02ef591dc1d-combined-ca-bundle\") pod \"keystone-bootstrap-p8czs\" (UID: \"30905e0e-95fa-4d7c-b586-f02ef591dc1d\") " pod="openstack/keystone-bootstrap-p8czs" Mar 11 01:15:56 crc kubenswrapper[4744]: I0311 01:15:56.297425 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/30905e0e-95fa-4d7c-b586-f02ef591dc1d-credential-keys\") pod \"keystone-bootstrap-p8czs\" (UID: \"30905e0e-95fa-4d7c-b586-f02ef591dc1d\") " pod="openstack/keystone-bootstrap-p8czs" Mar 11 01:15:56 crc kubenswrapper[4744]: I0311 01:15:56.398630 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jstlv\" (UniqueName: \"kubernetes.io/projected/30905e0e-95fa-4d7c-b586-f02ef591dc1d-kube-api-access-jstlv\") pod \"keystone-bootstrap-p8czs\" (UID: \"30905e0e-95fa-4d7c-b586-f02ef591dc1d\") " pod="openstack/keystone-bootstrap-p8czs" Mar 11 01:15:56 crc kubenswrapper[4744]: I0311 01:15:56.398714 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: 
\"kubernetes.io/secret/30905e0e-95fa-4d7c-b586-f02ef591dc1d-fernet-keys\") pod \"keystone-bootstrap-p8czs\" (UID: \"30905e0e-95fa-4d7c-b586-f02ef591dc1d\") " pod="openstack/keystone-bootstrap-p8czs" Mar 11 01:15:56 crc kubenswrapper[4744]: I0311 01:15:56.399185 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/30905e0e-95fa-4d7c-b586-f02ef591dc1d-config-data\") pod \"keystone-bootstrap-p8czs\" (UID: \"30905e0e-95fa-4d7c-b586-f02ef591dc1d\") " pod="openstack/keystone-bootstrap-p8czs" Mar 11 01:15:56 crc kubenswrapper[4744]: I0311 01:15:56.399246 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/30905e0e-95fa-4d7c-b586-f02ef591dc1d-scripts\") pod \"keystone-bootstrap-p8czs\" (UID: \"30905e0e-95fa-4d7c-b586-f02ef591dc1d\") " pod="openstack/keystone-bootstrap-p8czs" Mar 11 01:15:56 crc kubenswrapper[4744]: I0311 01:15:56.399272 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/30905e0e-95fa-4d7c-b586-f02ef591dc1d-combined-ca-bundle\") pod \"keystone-bootstrap-p8czs\" (UID: \"30905e0e-95fa-4d7c-b586-f02ef591dc1d\") " pod="openstack/keystone-bootstrap-p8czs" Mar 11 01:15:56 crc kubenswrapper[4744]: I0311 01:15:56.399297 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/30905e0e-95fa-4d7c-b586-f02ef591dc1d-credential-keys\") pod \"keystone-bootstrap-p8czs\" (UID: \"30905e0e-95fa-4d7c-b586-f02ef591dc1d\") " pod="openstack/keystone-bootstrap-p8czs" Mar 11 01:15:56 crc kubenswrapper[4744]: I0311 01:15:56.404322 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/30905e0e-95fa-4d7c-b586-f02ef591dc1d-config-data\") pod \"keystone-bootstrap-p8czs\" (UID: 
\"30905e0e-95fa-4d7c-b586-f02ef591dc1d\") " pod="openstack/keystone-bootstrap-p8czs" Mar 11 01:15:56 crc kubenswrapper[4744]: I0311 01:15:56.406295 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/30905e0e-95fa-4d7c-b586-f02ef591dc1d-fernet-keys\") pod \"keystone-bootstrap-p8czs\" (UID: \"30905e0e-95fa-4d7c-b586-f02ef591dc1d\") " pod="openstack/keystone-bootstrap-p8czs" Mar 11 01:15:56 crc kubenswrapper[4744]: I0311 01:15:56.411683 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/30905e0e-95fa-4d7c-b586-f02ef591dc1d-credential-keys\") pod \"keystone-bootstrap-p8czs\" (UID: \"30905e0e-95fa-4d7c-b586-f02ef591dc1d\") " pod="openstack/keystone-bootstrap-p8czs" Mar 11 01:15:56 crc kubenswrapper[4744]: I0311 01:15:56.419992 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/30905e0e-95fa-4d7c-b586-f02ef591dc1d-scripts\") pod \"keystone-bootstrap-p8czs\" (UID: \"30905e0e-95fa-4d7c-b586-f02ef591dc1d\") " pod="openstack/keystone-bootstrap-p8czs" Mar 11 01:15:56 crc kubenswrapper[4744]: I0311 01:15:56.420181 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/30905e0e-95fa-4d7c-b586-f02ef591dc1d-combined-ca-bundle\") pod \"keystone-bootstrap-p8czs\" (UID: \"30905e0e-95fa-4d7c-b586-f02ef591dc1d\") " pod="openstack/keystone-bootstrap-p8czs" Mar 11 01:15:56 crc kubenswrapper[4744]: I0311 01:15:56.424913 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jstlv\" (UniqueName: \"kubernetes.io/projected/30905e0e-95fa-4d7c-b586-f02ef591dc1d-kube-api-access-jstlv\") pod \"keystone-bootstrap-p8czs\" (UID: \"30905e0e-95fa-4d7c-b586-f02ef591dc1d\") " pod="openstack/keystone-bootstrap-p8czs" Mar 11 01:15:56 crc kubenswrapper[4744]: I0311 
01:15:56.545105 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-p8czs" Mar 11 01:15:56 crc kubenswrapper[4744]: I0311 01:15:56.568973 4744 generic.go:334] "Generic (PLEG): container finished" podID="8d727dc8-84d3-45b0-90e8-22a1f3f043e1" containerID="6277045b67820d0a1857d93e6e8e3ca7197495ee0c3cd0064b91a9e4d115b55b" exitCode=0 Mar 11 01:15:56 crc kubenswrapper[4744]: I0311 01:15:56.569016 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-58df884995-rf6pm" event={"ID":"8d727dc8-84d3-45b0-90e8-22a1f3f043e1","Type":"ContainerDied","Data":"6277045b67820d0a1857d93e6e8e3ca7197495ee0c3cd0064b91a9e4d115b55b"} Mar 11 01:15:57 crc kubenswrapper[4744]: I0311 01:15:57.678168 4744 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Mar 11 01:15:57 crc kubenswrapper[4744]: I0311 01:15:57.822466 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3f75d297-4f23-4c27-8754-1685d2c195b2-logs\") pod \"3f75d297-4f23-4c27-8754-1685d2c195b2\" (UID: \"3f75d297-4f23-4c27-8754-1685d2c195b2\") " Mar 11 01:15:57 crc kubenswrapper[4744]: I0311 01:15:57.822617 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3f75d297-4f23-4c27-8754-1685d2c195b2-scripts\") pod \"3f75d297-4f23-4c27-8754-1685d2c195b2\" (UID: \"3f75d297-4f23-4c27-8754-1685d2c195b2\") " Mar 11 01:15:57 crc kubenswrapper[4744]: I0311 01:15:57.822646 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"3f75d297-4f23-4c27-8754-1685d2c195b2\" (UID: \"3f75d297-4f23-4c27-8754-1685d2c195b2\") " Mar 11 01:15:57 crc kubenswrapper[4744]: I0311 01:15:57.822694 4744 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"kube-api-access-n8gvf\" (UniqueName: \"kubernetes.io/projected/3f75d297-4f23-4c27-8754-1685d2c195b2-kube-api-access-n8gvf\") pod \"3f75d297-4f23-4c27-8754-1685d2c195b2\" (UID: \"3f75d297-4f23-4c27-8754-1685d2c195b2\") " Mar 11 01:15:57 crc kubenswrapper[4744]: I0311 01:15:57.822729 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3f75d297-4f23-4c27-8754-1685d2c195b2-config-data\") pod \"3f75d297-4f23-4c27-8754-1685d2c195b2\" (UID: \"3f75d297-4f23-4c27-8754-1685d2c195b2\") " Mar 11 01:15:57 crc kubenswrapper[4744]: I0311 01:15:57.822760 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3f75d297-4f23-4c27-8754-1685d2c195b2-combined-ca-bundle\") pod \"3f75d297-4f23-4c27-8754-1685d2c195b2\" (UID: \"3f75d297-4f23-4c27-8754-1685d2c195b2\") " Mar 11 01:15:57 crc kubenswrapper[4744]: I0311 01:15:57.822941 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/3f75d297-4f23-4c27-8754-1685d2c195b2-httpd-run\") pod \"3f75d297-4f23-4c27-8754-1685d2c195b2\" (UID: \"3f75d297-4f23-4c27-8754-1685d2c195b2\") " Mar 11 01:15:57 crc kubenswrapper[4744]: I0311 01:15:57.823110 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3f75d297-4f23-4c27-8754-1685d2c195b2-logs" (OuterVolumeSpecName: "logs") pod "3f75d297-4f23-4c27-8754-1685d2c195b2" (UID: "3f75d297-4f23-4c27-8754-1685d2c195b2"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 11 01:15:57 crc kubenswrapper[4744]: I0311 01:15:57.823541 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3f75d297-4f23-4c27-8754-1685d2c195b2-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "3f75d297-4f23-4c27-8754-1685d2c195b2" (UID: "3f75d297-4f23-4c27-8754-1685d2c195b2"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 11 01:15:57 crc kubenswrapper[4744]: I0311 01:15:57.823635 4744 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3f75d297-4f23-4c27-8754-1685d2c195b2-logs\") on node \"crc\" DevicePath \"\"" Mar 11 01:15:57 crc kubenswrapper[4744]: I0311 01:15:57.828924 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3f75d297-4f23-4c27-8754-1685d2c195b2-scripts" (OuterVolumeSpecName: "scripts") pod "3f75d297-4f23-4c27-8754-1685d2c195b2" (UID: "3f75d297-4f23-4c27-8754-1685d2c195b2"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 01:15:57 crc kubenswrapper[4744]: I0311 01:15:57.830086 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3f75d297-4f23-4c27-8754-1685d2c195b2-kube-api-access-n8gvf" (OuterVolumeSpecName: "kube-api-access-n8gvf") pod "3f75d297-4f23-4c27-8754-1685d2c195b2" (UID: "3f75d297-4f23-4c27-8754-1685d2c195b2"). InnerVolumeSpecName "kube-api-access-n8gvf". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 01:15:57 crc kubenswrapper[4744]: I0311 01:15:57.830195 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage01-crc" (OuterVolumeSpecName: "glance") pod "3f75d297-4f23-4c27-8754-1685d2c195b2" (UID: "3f75d297-4f23-4c27-8754-1685d2c195b2"). InnerVolumeSpecName "local-storage01-crc". 
PluginName "kubernetes.io/local-volume", VolumeGidValue "" Mar 11 01:15:57 crc kubenswrapper[4744]: I0311 01:15:57.860266 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3f75d297-4f23-4c27-8754-1685d2c195b2-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "3f75d297-4f23-4c27-8754-1685d2c195b2" (UID: "3f75d297-4f23-4c27-8754-1685d2c195b2"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 01:15:57 crc kubenswrapper[4744]: I0311 01:15:57.879489 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3f75d297-4f23-4c27-8754-1685d2c195b2-config-data" (OuterVolumeSpecName: "config-data") pod "3f75d297-4f23-4c27-8754-1685d2c195b2" (UID: "3f75d297-4f23-4c27-8754-1685d2c195b2"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 01:15:57 crc kubenswrapper[4744]: I0311 01:15:57.924816 4744 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/3f75d297-4f23-4c27-8754-1685d2c195b2-httpd-run\") on node \"crc\" DevicePath \"\"" Mar 11 01:15:57 crc kubenswrapper[4744]: I0311 01:15:57.924844 4744 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3f75d297-4f23-4c27-8754-1685d2c195b2-scripts\") on node \"crc\" DevicePath \"\"" Mar 11 01:15:57 crc kubenswrapper[4744]: I0311 01:15:57.924873 4744 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") on node \"crc\" " Mar 11 01:15:57 crc kubenswrapper[4744]: I0311 01:15:57.924883 4744 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-n8gvf\" (UniqueName: \"kubernetes.io/projected/3f75d297-4f23-4c27-8754-1685d2c195b2-kube-api-access-n8gvf\") on node \"crc\" DevicePath \"\"" Mar 11 01:15:57 crc 
kubenswrapper[4744]: I0311 01:15:57.924895 4744 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3f75d297-4f23-4c27-8754-1685d2c195b2-config-data\") on node \"crc\" DevicePath \"\"" Mar 11 01:15:57 crc kubenswrapper[4744]: I0311 01:15:57.924903 4744 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3f75d297-4f23-4c27-8754-1685d2c195b2-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 11 01:15:57 crc kubenswrapper[4744]: I0311 01:15:57.942868 4744 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage01-crc" (UniqueName: "kubernetes.io/local-volume/local-storage01-crc") on node "crc" Mar 11 01:15:57 crc kubenswrapper[4744]: I0311 01:15:57.985321 4744 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1f8dfaaa-9a7f-4693-8f45-a4a84123b714" path="/var/lib/kubelet/pods/1f8dfaaa-9a7f-4693-8f45-a4a84123b714/volumes" Mar 11 01:15:58 crc kubenswrapper[4744]: I0311 01:15:58.026645 4744 reconciler_common.go:293] "Volume detached for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") on node \"crc\" DevicePath \"\"" Mar 11 01:15:58 crc kubenswrapper[4744]: I0311 01:15:58.352937 4744 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-58df884995-rf6pm" podUID="8d727dc8-84d3-45b0-90e8-22a1f3f043e1" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.122:5353: connect: connection refused" Mar 11 01:15:58 crc kubenswrapper[4744]: I0311 01:15:58.587449 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"3f75d297-4f23-4c27-8754-1685d2c195b2","Type":"ContainerDied","Data":"f97e7a390f540dd49b7b43a21228e7030508f148892e459ccf111da272adce6e"} Mar 11 01:15:58 crc kubenswrapper[4744]: I0311 01:15:58.587508 4744 scope.go:117] "RemoveContainer" 
containerID="bb9ba941dd6448435d8c9cb00be6afd2570385ce5413da6d36374354b9ad5f81" Mar 11 01:15:58 crc kubenswrapper[4744]: I0311 01:15:58.587530 4744 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Mar 11 01:15:58 crc kubenswrapper[4744]: I0311 01:15:58.614072 4744 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 11 01:15:58 crc kubenswrapper[4744]: I0311 01:15:58.628628 4744 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 11 01:15:58 crc kubenswrapper[4744]: I0311 01:15:58.635788 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 11 01:15:58 crc kubenswrapper[4744]: E0311 01:15:58.636138 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3f75d297-4f23-4c27-8754-1685d2c195b2" containerName="glance-httpd" Mar 11 01:15:58 crc kubenswrapper[4744]: I0311 01:15:58.636149 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="3f75d297-4f23-4c27-8754-1685d2c195b2" containerName="glance-httpd" Mar 11 01:15:58 crc kubenswrapper[4744]: E0311 01:15:58.636171 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3f75d297-4f23-4c27-8754-1685d2c195b2" containerName="glance-log" Mar 11 01:15:58 crc kubenswrapper[4744]: I0311 01:15:58.636176 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="3f75d297-4f23-4c27-8754-1685d2c195b2" containerName="glance-log" Mar 11 01:15:58 crc kubenswrapper[4744]: I0311 01:15:58.636321 4744 memory_manager.go:354] "RemoveStaleState removing state" podUID="3f75d297-4f23-4c27-8754-1685d2c195b2" containerName="glance-log" Mar 11 01:15:58 crc kubenswrapper[4744]: I0311 01:15:58.636334 4744 memory_manager.go:354] "RemoveStaleState removing state" podUID="3f75d297-4f23-4c27-8754-1685d2c195b2" containerName="glance-httpd" Mar 11 01:15:58 crc kubenswrapper[4744]: I0311 
01:15:58.637162 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Mar 11 01:15:58 crc kubenswrapper[4744]: I0311 01:15:58.640849 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-internal-svc" Mar 11 01:15:58 crc kubenswrapper[4744]: I0311 01:15:58.641236 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Mar 11 01:15:58 crc kubenswrapper[4744]: I0311 01:15:58.647347 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 11 01:15:58 crc kubenswrapper[4744]: I0311 01:15:58.739303 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8b84adb2-ba84-4796-b8c6-3bf51e850b3f-scripts\") pod \"glance-default-internal-api-0\" (UID: \"8b84adb2-ba84-4796-b8c6-3bf51e850b3f\") " pod="openstack/glance-default-internal-api-0" Mar 11 01:15:58 crc kubenswrapper[4744]: I0311 01:15:58.739377 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"glance-default-internal-api-0\" (UID: \"8b84adb2-ba84-4796-b8c6-3bf51e850b3f\") " pod="openstack/glance-default-internal-api-0" Mar 11 01:15:58 crc kubenswrapper[4744]: I0311 01:15:58.739400 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/8b84adb2-ba84-4796-b8c6-3bf51e850b3f-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"8b84adb2-ba84-4796-b8c6-3bf51e850b3f\") " pod="openstack/glance-default-internal-api-0" Mar 11 01:15:58 crc kubenswrapper[4744]: I0311 01:15:58.739420 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"kube-api-access-p272k\" (UniqueName: \"kubernetes.io/projected/8b84adb2-ba84-4796-b8c6-3bf51e850b3f-kube-api-access-p272k\") pod \"glance-default-internal-api-0\" (UID: \"8b84adb2-ba84-4796-b8c6-3bf51e850b3f\") " pod="openstack/glance-default-internal-api-0" Mar 11 01:15:58 crc kubenswrapper[4744]: I0311 01:15:58.739570 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8b84adb2-ba84-4796-b8c6-3bf51e850b3f-config-data\") pod \"glance-default-internal-api-0\" (UID: \"8b84adb2-ba84-4796-b8c6-3bf51e850b3f\") " pod="openstack/glance-default-internal-api-0" Mar 11 01:15:58 crc kubenswrapper[4744]: I0311 01:15:58.739621 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8b84adb2-ba84-4796-b8c6-3bf51e850b3f-logs\") pod \"glance-default-internal-api-0\" (UID: \"8b84adb2-ba84-4796-b8c6-3bf51e850b3f\") " pod="openstack/glance-default-internal-api-0" Mar 11 01:15:58 crc kubenswrapper[4744]: I0311 01:15:58.739643 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/8b84adb2-ba84-4796-b8c6-3bf51e850b3f-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"8b84adb2-ba84-4796-b8c6-3bf51e850b3f\") " pod="openstack/glance-default-internal-api-0" Mar 11 01:15:58 crc kubenswrapper[4744]: I0311 01:15:58.739660 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8b84adb2-ba84-4796-b8c6-3bf51e850b3f-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"8b84adb2-ba84-4796-b8c6-3bf51e850b3f\") " pod="openstack/glance-default-internal-api-0" Mar 11 01:15:58 crc kubenswrapper[4744]: I0311 01:15:58.840422 4744 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8b84adb2-ba84-4796-b8c6-3bf51e850b3f-config-data\") pod \"glance-default-internal-api-0\" (UID: \"8b84adb2-ba84-4796-b8c6-3bf51e850b3f\") " pod="openstack/glance-default-internal-api-0" Mar 11 01:15:58 crc kubenswrapper[4744]: I0311 01:15:58.840479 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8b84adb2-ba84-4796-b8c6-3bf51e850b3f-logs\") pod \"glance-default-internal-api-0\" (UID: \"8b84adb2-ba84-4796-b8c6-3bf51e850b3f\") " pod="openstack/glance-default-internal-api-0" Mar 11 01:15:58 crc kubenswrapper[4744]: I0311 01:15:58.840507 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/8b84adb2-ba84-4796-b8c6-3bf51e850b3f-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"8b84adb2-ba84-4796-b8c6-3bf51e850b3f\") " pod="openstack/glance-default-internal-api-0" Mar 11 01:15:58 crc kubenswrapper[4744]: I0311 01:15:58.840545 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8b84adb2-ba84-4796-b8c6-3bf51e850b3f-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"8b84adb2-ba84-4796-b8c6-3bf51e850b3f\") " pod="openstack/glance-default-internal-api-0" Mar 11 01:15:58 crc kubenswrapper[4744]: I0311 01:15:58.840624 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8b84adb2-ba84-4796-b8c6-3bf51e850b3f-scripts\") pod \"glance-default-internal-api-0\" (UID: \"8b84adb2-ba84-4796-b8c6-3bf51e850b3f\") " pod="openstack/glance-default-internal-api-0" Mar 11 01:15:58 crc kubenswrapper[4744]: I0311 01:15:58.840680 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage01-crc\" (UniqueName: 
\"kubernetes.io/local-volume/local-storage01-crc\") pod \"glance-default-internal-api-0\" (UID: \"8b84adb2-ba84-4796-b8c6-3bf51e850b3f\") " pod="openstack/glance-default-internal-api-0" Mar 11 01:15:58 crc kubenswrapper[4744]: I0311 01:15:58.840703 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/8b84adb2-ba84-4796-b8c6-3bf51e850b3f-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"8b84adb2-ba84-4796-b8c6-3bf51e850b3f\") " pod="openstack/glance-default-internal-api-0" Mar 11 01:15:58 crc kubenswrapper[4744]: I0311 01:15:58.840725 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p272k\" (UniqueName: \"kubernetes.io/projected/8b84adb2-ba84-4796-b8c6-3bf51e850b3f-kube-api-access-p272k\") pod \"glance-default-internal-api-0\" (UID: \"8b84adb2-ba84-4796-b8c6-3bf51e850b3f\") " pod="openstack/glance-default-internal-api-0" Mar 11 01:15:58 crc kubenswrapper[4744]: I0311 01:15:58.840919 4744 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"glance-default-internal-api-0\" (UID: \"8b84adb2-ba84-4796-b8c6-3bf51e850b3f\") device mount path \"/mnt/openstack/pv01\"" pod="openstack/glance-default-internal-api-0" Mar 11 01:15:58 crc kubenswrapper[4744]: I0311 01:15:58.841984 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8b84adb2-ba84-4796-b8c6-3bf51e850b3f-logs\") pod \"glance-default-internal-api-0\" (UID: \"8b84adb2-ba84-4796-b8c6-3bf51e850b3f\") " pod="openstack/glance-default-internal-api-0" Mar 11 01:15:58 crc kubenswrapper[4744]: I0311 01:15:58.842032 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/8b84adb2-ba84-4796-b8c6-3bf51e850b3f-httpd-run\") pod 
\"glance-default-internal-api-0\" (UID: \"8b84adb2-ba84-4796-b8c6-3bf51e850b3f\") " pod="openstack/glance-default-internal-api-0" Mar 11 01:15:58 crc kubenswrapper[4744]: I0311 01:15:58.845303 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/8b84adb2-ba84-4796-b8c6-3bf51e850b3f-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"8b84adb2-ba84-4796-b8c6-3bf51e850b3f\") " pod="openstack/glance-default-internal-api-0" Mar 11 01:15:58 crc kubenswrapper[4744]: I0311 01:15:58.845788 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8b84adb2-ba84-4796-b8c6-3bf51e850b3f-config-data\") pod \"glance-default-internal-api-0\" (UID: \"8b84adb2-ba84-4796-b8c6-3bf51e850b3f\") " pod="openstack/glance-default-internal-api-0" Mar 11 01:15:58 crc kubenswrapper[4744]: I0311 01:15:58.847429 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8b84adb2-ba84-4796-b8c6-3bf51e850b3f-scripts\") pod \"glance-default-internal-api-0\" (UID: \"8b84adb2-ba84-4796-b8c6-3bf51e850b3f\") " pod="openstack/glance-default-internal-api-0" Mar 11 01:15:58 crc kubenswrapper[4744]: I0311 01:15:58.848163 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8b84adb2-ba84-4796-b8c6-3bf51e850b3f-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"8b84adb2-ba84-4796-b8c6-3bf51e850b3f\") " pod="openstack/glance-default-internal-api-0" Mar 11 01:15:58 crc kubenswrapper[4744]: I0311 01:15:58.861644 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p272k\" (UniqueName: \"kubernetes.io/projected/8b84adb2-ba84-4796-b8c6-3bf51e850b3f-kube-api-access-p272k\") pod \"glance-default-internal-api-0\" (UID: \"8b84adb2-ba84-4796-b8c6-3bf51e850b3f\") " 
pod="openstack/glance-default-internal-api-0" Mar 11 01:15:58 crc kubenswrapper[4744]: I0311 01:15:58.868542 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"glance-default-internal-api-0\" (UID: \"8b84adb2-ba84-4796-b8c6-3bf51e850b3f\") " pod="openstack/glance-default-internal-api-0" Mar 11 01:15:58 crc kubenswrapper[4744]: I0311 01:15:58.960121 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Mar 11 01:15:59 crc kubenswrapper[4744]: I0311 01:15:59.987452 4744 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3f75d297-4f23-4c27-8754-1685d2c195b2" path="/var/lib/kubelet/pods/3f75d297-4f23-4c27-8754-1685d2c195b2/volumes" Mar 11 01:16:00 crc kubenswrapper[4744]: I0311 01:16:00.147657 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29553196-9jct5"] Mar 11 01:16:00 crc kubenswrapper[4744]: I0311 01:16:00.148714 4744 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29553196-9jct5" Mar 11 01:16:00 crc kubenswrapper[4744]: I0311 01:16:00.150303 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 11 01:16:00 crc kubenswrapper[4744]: I0311 01:16:00.150687 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-rs77s" Mar 11 01:16:00 crc kubenswrapper[4744]: I0311 01:16:00.150886 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 11 01:16:00 crc kubenswrapper[4744]: I0311 01:16:00.157127 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29553196-9jct5"] Mar 11 01:16:00 crc kubenswrapper[4744]: I0311 01:16:00.271853 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-twgrt\" (UniqueName: \"kubernetes.io/projected/0ddd91ff-2bab-458e-b371-13bb59892f28-kube-api-access-twgrt\") pod \"auto-csr-approver-29553196-9jct5\" (UID: \"0ddd91ff-2bab-458e-b371-13bb59892f28\") " pod="openshift-infra/auto-csr-approver-29553196-9jct5" Mar 11 01:16:00 crc kubenswrapper[4744]: I0311 01:16:00.373892 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-twgrt\" (UniqueName: \"kubernetes.io/projected/0ddd91ff-2bab-458e-b371-13bb59892f28-kube-api-access-twgrt\") pod \"auto-csr-approver-29553196-9jct5\" (UID: \"0ddd91ff-2bab-458e-b371-13bb59892f28\") " pod="openshift-infra/auto-csr-approver-29553196-9jct5" Mar 11 01:16:00 crc kubenswrapper[4744]: I0311 01:16:00.393457 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-twgrt\" (UniqueName: \"kubernetes.io/projected/0ddd91ff-2bab-458e-b371-13bb59892f28-kube-api-access-twgrt\") pod \"auto-csr-approver-29553196-9jct5\" (UID: \"0ddd91ff-2bab-458e-b371-13bb59892f28\") " 
pod="openshift-infra/auto-csr-approver-29553196-9jct5" Mar 11 01:16:00 crc kubenswrapper[4744]: I0311 01:16:00.480761 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29553196-9jct5" Mar 11 01:16:03 crc kubenswrapper[4744]: I0311 01:16:03.352468 4744 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-58df884995-rf6pm" podUID="8d727dc8-84d3-45b0-90e8-22a1f3f043e1" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.122:5353: connect: connection refused" Mar 11 01:16:05 crc kubenswrapper[4744]: I0311 01:16:05.025865 4744 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Mar 11 01:16:05 crc kubenswrapper[4744]: I0311 01:16:05.175272 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1ff4983e-6d27-42c0-994f-548501239701-scripts\") pod \"1ff4983e-6d27-42c0-994f-548501239701\" (UID: \"1ff4983e-6d27-42c0-994f-548501239701\") " Mar 11 01:16:05 crc kubenswrapper[4744]: I0311 01:16:05.175341 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1ff4983e-6d27-42c0-994f-548501239701-config-data\") pod \"1ff4983e-6d27-42c0-994f-548501239701\" (UID: \"1ff4983e-6d27-42c0-994f-548501239701\") " Mar 11 01:16:05 crc kubenswrapper[4744]: I0311 01:16:05.175422 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-r56gx\" (UniqueName: \"kubernetes.io/projected/1ff4983e-6d27-42c0-994f-548501239701-kube-api-access-r56gx\") pod \"1ff4983e-6d27-42c0-994f-548501239701\" (UID: \"1ff4983e-6d27-42c0-994f-548501239701\") " Mar 11 01:16:05 crc kubenswrapper[4744]: I0311 01:16:05.175496 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/1ff4983e-6d27-42c0-994f-548501239701-logs\") pod \"1ff4983e-6d27-42c0-994f-548501239701\" (UID: \"1ff4983e-6d27-42c0-994f-548501239701\") " Mar 11 01:16:05 crc kubenswrapper[4744]: I0311 01:16:05.175657 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"1ff4983e-6d27-42c0-994f-548501239701\" (UID: \"1ff4983e-6d27-42c0-994f-548501239701\") " Mar 11 01:16:05 crc kubenswrapper[4744]: I0311 01:16:05.175711 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1ff4983e-6d27-42c0-994f-548501239701-combined-ca-bundle\") pod \"1ff4983e-6d27-42c0-994f-548501239701\" (UID: \"1ff4983e-6d27-42c0-994f-548501239701\") " Mar 11 01:16:05 crc kubenswrapper[4744]: I0311 01:16:05.175782 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/1ff4983e-6d27-42c0-994f-548501239701-httpd-run\") pod \"1ff4983e-6d27-42c0-994f-548501239701\" (UID: \"1ff4983e-6d27-42c0-994f-548501239701\") " Mar 11 01:16:05 crc kubenswrapper[4744]: I0311 01:16:05.176257 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1ff4983e-6d27-42c0-994f-548501239701-logs" (OuterVolumeSpecName: "logs") pod "1ff4983e-6d27-42c0-994f-548501239701" (UID: "1ff4983e-6d27-42c0-994f-548501239701"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 11 01:16:05 crc kubenswrapper[4744]: I0311 01:16:05.176662 4744 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1ff4983e-6d27-42c0-994f-548501239701-logs\") on node \"crc\" DevicePath \"\"" Mar 11 01:16:05 crc kubenswrapper[4744]: I0311 01:16:05.177319 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1ff4983e-6d27-42c0-994f-548501239701-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "1ff4983e-6d27-42c0-994f-548501239701" (UID: "1ff4983e-6d27-42c0-994f-548501239701"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 11 01:16:05 crc kubenswrapper[4744]: I0311 01:16:05.181621 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage04-crc" (OuterVolumeSpecName: "glance") pod "1ff4983e-6d27-42c0-994f-548501239701" (UID: "1ff4983e-6d27-42c0-994f-548501239701"). InnerVolumeSpecName "local-storage04-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Mar 11 01:16:05 crc kubenswrapper[4744]: I0311 01:16:05.182743 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1ff4983e-6d27-42c0-994f-548501239701-scripts" (OuterVolumeSpecName: "scripts") pod "1ff4983e-6d27-42c0-994f-548501239701" (UID: "1ff4983e-6d27-42c0-994f-548501239701"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 01:16:05 crc kubenswrapper[4744]: I0311 01:16:05.183911 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1ff4983e-6d27-42c0-994f-548501239701-kube-api-access-r56gx" (OuterVolumeSpecName: "kube-api-access-r56gx") pod "1ff4983e-6d27-42c0-994f-548501239701" (UID: "1ff4983e-6d27-42c0-994f-548501239701"). InnerVolumeSpecName "kube-api-access-r56gx". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 01:16:05 crc kubenswrapper[4744]: I0311 01:16:05.200891 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1ff4983e-6d27-42c0-994f-548501239701-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "1ff4983e-6d27-42c0-994f-548501239701" (UID: "1ff4983e-6d27-42c0-994f-548501239701"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 01:16:05 crc kubenswrapper[4744]: I0311 01:16:05.229125 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1ff4983e-6d27-42c0-994f-548501239701-config-data" (OuterVolumeSpecName: "config-data") pod "1ff4983e-6d27-42c0-994f-548501239701" (UID: "1ff4983e-6d27-42c0-994f-548501239701"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 01:16:05 crc kubenswrapper[4744]: I0311 01:16:05.278127 4744 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1ff4983e-6d27-42c0-994f-548501239701-scripts\") on node \"crc\" DevicePath \"\"" Mar 11 01:16:05 crc kubenswrapper[4744]: I0311 01:16:05.278157 4744 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1ff4983e-6d27-42c0-994f-548501239701-config-data\") on node \"crc\" DevicePath \"\"" Mar 11 01:16:05 crc kubenswrapper[4744]: I0311 01:16:05.278167 4744 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-r56gx\" (UniqueName: \"kubernetes.io/projected/1ff4983e-6d27-42c0-994f-548501239701-kube-api-access-r56gx\") on node \"crc\" DevicePath \"\"" Mar 11 01:16:05 crc kubenswrapper[4744]: I0311 01:16:05.278197 4744 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") on node \"crc\" " Mar 11 01:16:05 crc 
kubenswrapper[4744]: I0311 01:16:05.278207 4744 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1ff4983e-6d27-42c0-994f-548501239701-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 11 01:16:05 crc kubenswrapper[4744]: I0311 01:16:05.278216 4744 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/1ff4983e-6d27-42c0-994f-548501239701-httpd-run\") on node \"crc\" DevicePath \"\"" Mar 11 01:16:05 crc kubenswrapper[4744]: I0311 01:16:05.297321 4744 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage04-crc" (UniqueName: "kubernetes.io/local-volume/local-storage04-crc") on node "crc" Mar 11 01:16:05 crc kubenswrapper[4744]: I0311 01:16:05.379860 4744 reconciler_common.go:293] "Volume detached for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") on node \"crc\" DevicePath \"\"" Mar 11 01:16:05 crc kubenswrapper[4744]: E0311 01:16:05.486092 4744 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-barbican-api@sha256:2c52c0f4b4baa15796eb284522adff7fa9e5c85a2d77c2e47ef4afdf8e4a7c7d" Mar 11 01:16:05 crc kubenswrapper[4744]: E0311 01:16:05.486237 4744 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:barbican-db-sync,Image:quay.io/podified-antelope-centos9/openstack-barbican-api@sha256:2c52c0f4b4baa15796eb284522adff7fa9e5c85a2d77c2e47ef4afdf8e4a7c7d,Command:[/bin/bash],Args:[-c barbican-manage db 
upgrade],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:TRUE,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:db-sync-config-data,ReadOnly:true,MountPath:/etc/barbican/barbican.conf.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-6jdcj,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*42403,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:*42403,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod barbican-db-sync-vkz2r_openstack(64205e5d-2853-49f7-9928-8362fc9210ea): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 11 01:16:05 crc kubenswrapper[4744]: E0311 01:16:05.487919 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"barbican-db-sync\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/barbican-db-sync-vkz2r" 
podUID="64205e5d-2853-49f7-9928-8362fc9210ea" Mar 11 01:16:05 crc kubenswrapper[4744]: I0311 01:16:05.653484 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"1ff4983e-6d27-42c0-994f-548501239701","Type":"ContainerDied","Data":"6c5973b8ad454dec36dc362bec9ba4cabac4856b233cf21d8aff3da6b5435885"} Mar 11 01:16:05 crc kubenswrapper[4744]: I0311 01:16:05.653547 4744 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Mar 11 01:16:05 crc kubenswrapper[4744]: E0311 01:16:05.655204 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"barbican-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-barbican-api@sha256:2c52c0f4b4baa15796eb284522adff7fa9e5c85a2d77c2e47ef4afdf8e4a7c7d\\\"\"" pod="openstack/barbican-db-sync-vkz2r" podUID="64205e5d-2853-49f7-9928-8362fc9210ea" Mar 11 01:16:05 crc kubenswrapper[4744]: I0311 01:16:05.701430 4744 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 11 01:16:05 crc kubenswrapper[4744]: I0311 01:16:05.713406 4744 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 11 01:16:05 crc kubenswrapper[4744]: I0311 01:16:05.728350 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Mar 11 01:16:05 crc kubenswrapper[4744]: E0311 01:16:05.728722 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1ff4983e-6d27-42c0-994f-548501239701" containerName="glance-log" Mar 11 01:16:05 crc kubenswrapper[4744]: I0311 01:16:05.728738 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="1ff4983e-6d27-42c0-994f-548501239701" containerName="glance-log" Mar 11 01:16:05 crc kubenswrapper[4744]: E0311 01:16:05.728748 4744 cpu_manager.go:410] "RemoveStaleState: removing 
container" podUID="1ff4983e-6d27-42c0-994f-548501239701" containerName="glance-httpd" Mar 11 01:16:05 crc kubenswrapper[4744]: I0311 01:16:05.728755 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="1ff4983e-6d27-42c0-994f-548501239701" containerName="glance-httpd" Mar 11 01:16:05 crc kubenswrapper[4744]: I0311 01:16:05.728928 4744 memory_manager.go:354] "RemoveStaleState removing state" podUID="1ff4983e-6d27-42c0-994f-548501239701" containerName="glance-log" Mar 11 01:16:05 crc kubenswrapper[4744]: I0311 01:16:05.728936 4744 memory_manager.go:354] "RemoveStaleState removing state" podUID="1ff4983e-6d27-42c0-994f-548501239701" containerName="glance-httpd" Mar 11 01:16:05 crc kubenswrapper[4744]: I0311 01:16:05.729756 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Mar 11 01:16:05 crc kubenswrapper[4744]: I0311 01:16:05.732551 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-public-svc" Mar 11 01:16:05 crc kubenswrapper[4744]: I0311 01:16:05.733906 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Mar 11 01:16:05 crc kubenswrapper[4744]: I0311 01:16:05.753558 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 11 01:16:05 crc kubenswrapper[4744]: I0311 01:16:05.887436 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3a87446a-b7cc-4068-91e2-b5dbbc3cda71-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"3a87446a-b7cc-4068-91e2-b5dbbc3cda71\") " pod="openstack/glance-default-external-api-0" Mar 11 01:16:05 crc kubenswrapper[4744]: I0311 01:16:05.887501 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage04-crc\" (UniqueName: 
\"kubernetes.io/local-volume/local-storage04-crc\") pod \"glance-default-external-api-0\" (UID: \"3a87446a-b7cc-4068-91e2-b5dbbc3cda71\") " pod="openstack/glance-default-external-api-0" Mar 11 01:16:05 crc kubenswrapper[4744]: I0311 01:16:05.887540 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/3a87446a-b7cc-4068-91e2-b5dbbc3cda71-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"3a87446a-b7cc-4068-91e2-b5dbbc3cda71\") " pod="openstack/glance-default-external-api-0" Mar 11 01:16:05 crc kubenswrapper[4744]: I0311 01:16:05.887581 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3a87446a-b7cc-4068-91e2-b5dbbc3cda71-logs\") pod \"glance-default-external-api-0\" (UID: \"3a87446a-b7cc-4068-91e2-b5dbbc3cda71\") " pod="openstack/glance-default-external-api-0" Mar 11 01:16:05 crc kubenswrapper[4744]: I0311 01:16:05.887604 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3a87446a-b7cc-4068-91e2-b5dbbc3cda71-scripts\") pod \"glance-default-external-api-0\" (UID: \"3a87446a-b7cc-4068-91e2-b5dbbc3cda71\") " pod="openstack/glance-default-external-api-0" Mar 11 01:16:05 crc kubenswrapper[4744]: I0311 01:16:05.887622 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bbs9b\" (UniqueName: \"kubernetes.io/projected/3a87446a-b7cc-4068-91e2-b5dbbc3cda71-kube-api-access-bbs9b\") pod \"glance-default-external-api-0\" (UID: \"3a87446a-b7cc-4068-91e2-b5dbbc3cda71\") " pod="openstack/glance-default-external-api-0" Mar 11 01:16:05 crc kubenswrapper[4744]: I0311 01:16:05.887641 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/3a87446a-b7cc-4068-91e2-b5dbbc3cda71-config-data\") pod \"glance-default-external-api-0\" (UID: \"3a87446a-b7cc-4068-91e2-b5dbbc3cda71\") " pod="openstack/glance-default-external-api-0" Mar 11 01:16:05 crc kubenswrapper[4744]: I0311 01:16:05.887658 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/3a87446a-b7cc-4068-91e2-b5dbbc3cda71-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"3a87446a-b7cc-4068-91e2-b5dbbc3cda71\") " pod="openstack/glance-default-external-api-0" Mar 11 01:16:05 crc kubenswrapper[4744]: I0311 01:16:05.987415 4744 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1ff4983e-6d27-42c0-994f-548501239701" path="/var/lib/kubelet/pods/1ff4983e-6d27-42c0-994f-548501239701/volumes" Mar 11 01:16:05 crc kubenswrapper[4744]: I0311 01:16:05.988902 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3a87446a-b7cc-4068-91e2-b5dbbc3cda71-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"3a87446a-b7cc-4068-91e2-b5dbbc3cda71\") " pod="openstack/glance-default-external-api-0" Mar 11 01:16:05 crc kubenswrapper[4744]: I0311 01:16:05.988969 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"glance-default-external-api-0\" (UID: \"3a87446a-b7cc-4068-91e2-b5dbbc3cda71\") " pod="openstack/glance-default-external-api-0" Mar 11 01:16:05 crc kubenswrapper[4744]: I0311 01:16:05.988993 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/3a87446a-b7cc-4068-91e2-b5dbbc3cda71-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"3a87446a-b7cc-4068-91e2-b5dbbc3cda71\") " 
pod="openstack/glance-default-external-api-0" Mar 11 01:16:05 crc kubenswrapper[4744]: I0311 01:16:05.989042 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3a87446a-b7cc-4068-91e2-b5dbbc3cda71-logs\") pod \"glance-default-external-api-0\" (UID: \"3a87446a-b7cc-4068-91e2-b5dbbc3cda71\") " pod="openstack/glance-default-external-api-0" Mar 11 01:16:05 crc kubenswrapper[4744]: I0311 01:16:05.989073 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3a87446a-b7cc-4068-91e2-b5dbbc3cda71-scripts\") pod \"glance-default-external-api-0\" (UID: \"3a87446a-b7cc-4068-91e2-b5dbbc3cda71\") " pod="openstack/glance-default-external-api-0" Mar 11 01:16:05 crc kubenswrapper[4744]: I0311 01:16:05.989099 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bbs9b\" (UniqueName: \"kubernetes.io/projected/3a87446a-b7cc-4068-91e2-b5dbbc3cda71-kube-api-access-bbs9b\") pod \"glance-default-external-api-0\" (UID: \"3a87446a-b7cc-4068-91e2-b5dbbc3cda71\") " pod="openstack/glance-default-external-api-0" Mar 11 01:16:05 crc kubenswrapper[4744]: I0311 01:16:05.989125 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3a87446a-b7cc-4068-91e2-b5dbbc3cda71-config-data\") pod \"glance-default-external-api-0\" (UID: \"3a87446a-b7cc-4068-91e2-b5dbbc3cda71\") " pod="openstack/glance-default-external-api-0" Mar 11 01:16:05 crc kubenswrapper[4744]: I0311 01:16:05.989143 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/3a87446a-b7cc-4068-91e2-b5dbbc3cda71-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"3a87446a-b7cc-4068-91e2-b5dbbc3cda71\") " pod="openstack/glance-default-external-api-0" Mar 11 01:16:05 crc kubenswrapper[4744]: 
I0311 01:16:05.989501 4744 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"glance-default-external-api-0\" (UID: \"3a87446a-b7cc-4068-91e2-b5dbbc3cda71\") device mount path \"/mnt/openstack/pv04\"" pod="openstack/glance-default-external-api-0" Mar 11 01:16:05 crc kubenswrapper[4744]: I0311 01:16:05.989798 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/3a87446a-b7cc-4068-91e2-b5dbbc3cda71-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"3a87446a-b7cc-4068-91e2-b5dbbc3cda71\") " pod="openstack/glance-default-external-api-0" Mar 11 01:16:05 crc kubenswrapper[4744]: I0311 01:16:05.989846 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3a87446a-b7cc-4068-91e2-b5dbbc3cda71-logs\") pod \"glance-default-external-api-0\" (UID: \"3a87446a-b7cc-4068-91e2-b5dbbc3cda71\") " pod="openstack/glance-default-external-api-0" Mar 11 01:16:05 crc kubenswrapper[4744]: I0311 01:16:05.995429 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3a87446a-b7cc-4068-91e2-b5dbbc3cda71-config-data\") pod \"glance-default-external-api-0\" (UID: \"3a87446a-b7cc-4068-91e2-b5dbbc3cda71\") " pod="openstack/glance-default-external-api-0" Mar 11 01:16:05 crc kubenswrapper[4744]: I0311 01:16:05.997415 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3a87446a-b7cc-4068-91e2-b5dbbc3cda71-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"3a87446a-b7cc-4068-91e2-b5dbbc3cda71\") " pod="openstack/glance-default-external-api-0" Mar 11 01:16:06 crc kubenswrapper[4744]: I0311 01:16:06.008225 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/3a87446a-b7cc-4068-91e2-b5dbbc3cda71-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"3a87446a-b7cc-4068-91e2-b5dbbc3cda71\") " pod="openstack/glance-default-external-api-0" Mar 11 01:16:06 crc kubenswrapper[4744]: I0311 01:16:06.011990 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3a87446a-b7cc-4068-91e2-b5dbbc3cda71-scripts\") pod \"glance-default-external-api-0\" (UID: \"3a87446a-b7cc-4068-91e2-b5dbbc3cda71\") " pod="openstack/glance-default-external-api-0" Mar 11 01:16:06 crc kubenswrapper[4744]: I0311 01:16:06.013267 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bbs9b\" (UniqueName: \"kubernetes.io/projected/3a87446a-b7cc-4068-91e2-b5dbbc3cda71-kube-api-access-bbs9b\") pod \"glance-default-external-api-0\" (UID: \"3a87446a-b7cc-4068-91e2-b5dbbc3cda71\") " pod="openstack/glance-default-external-api-0" Mar 11 01:16:06 crc kubenswrapper[4744]: I0311 01:16:06.016086 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"glance-default-external-api-0\" (UID: \"3a87446a-b7cc-4068-91e2-b5dbbc3cda71\") " pod="openstack/glance-default-external-api-0" Mar 11 01:16:06 crc kubenswrapper[4744]: I0311 01:16:06.064337 4744 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Mar 11 01:16:06 crc kubenswrapper[4744]: E0311 01:16:06.650736 4744 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-cinder-api@sha256:44ed1ca84e17bd0f004cfbdc3c0827d767daba52abb8e83e076bfd0e6c02f838" Mar 11 01:16:06 crc kubenswrapper[4744]: E0311 01:16:06.651276 4744 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:cinder-db-sync,Image:quay.io/podified-antelope-centos9/openstack-cinder-api@sha256:44ed1ca84e17bd0f004cfbdc3c0827d767daba52abb8e83e076bfd0e6c02f838,Command:[/bin/bash],Args:[-c /usr/local/bin/kolla_set_configs && /usr/local/bin/kolla_start],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:TRUE,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:etc-machine-id,ReadOnly:true,MountPath:/etc/machine-id,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:scripts,ReadOnly:true,MountPath:/usr/local/bin/container-scripts,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/config-data/merged,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/etc/my.cnf,SubPath:my.cnf,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:db-sync-config-data,ReadOnly:true,MountPath:/etc/cinder/cinder.conf.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/kolla/config_files/config.json,SubPath:db-sync-config.json,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},Volum
eMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-6gc2c,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:nil,Privileged:nil,SELinuxOptions:nil,RunAsUser:*0,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod cinder-db-sync-hqbzq_openstack(5df37d98-3dbc-4977-add0-525bda3d679b): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 11 01:16:06 crc kubenswrapper[4744]: E0311 01:16:06.652610 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cinder-db-sync\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/cinder-db-sync-hqbzq" podUID="5df37d98-3dbc-4977-add0-525bda3d679b" Mar 11 01:16:06 crc kubenswrapper[4744]: I0311 01:16:06.669854 4744 scope.go:117] "RemoveContainer" containerID="3fb42db21ab2aef7b4de5b76f1f1ed9c7a131ff66cd32935df7bef61d58e9d53" Mar 11 01:16:06 crc kubenswrapper[4744]: I0311 01:16:06.675993 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-58df884995-rf6pm" event={"ID":"8d727dc8-84d3-45b0-90e8-22a1f3f043e1","Type":"ContainerDied","Data":"59f86369282f5653ab25777de78d61824880c7dc557bbd24cfafc87878c95c23"} Mar 11 
01:16:06 crc kubenswrapper[4744]: I0311 01:16:06.676053 4744 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="59f86369282f5653ab25777de78d61824880c7dc557bbd24cfafc87878c95c23" Mar 11 01:16:06 crc kubenswrapper[4744]: I0311 01:16:06.678313 4744 generic.go:334] "Generic (PLEG): container finished" podID="b09baca5-8198-404f-8b8d-8f58db34f975" containerID="fb286387721bec0a1f91359d2c30c79ca49f4b4898995988b7e9e90f9a29caf6" exitCode=0 Mar 11 01:16:06 crc kubenswrapper[4744]: I0311 01:16:06.678343 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-g9cnb" event={"ID":"b09baca5-8198-404f-8b8d-8f58db34f975","Type":"ContainerDied","Data":"fb286387721bec0a1f91359d2c30c79ca49f4b4898995988b7e9e90f9a29caf6"} Mar 11 01:16:06 crc kubenswrapper[4744]: E0311 01:16:06.690019 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cinder-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-cinder-api@sha256:44ed1ca84e17bd0f004cfbdc3c0827d767daba52abb8e83e076bfd0e6c02f838\\\"\"" pod="openstack/cinder-db-sync-hqbzq" podUID="5df37d98-3dbc-4977-add0-525bda3d679b" Mar 11 01:16:06 crc kubenswrapper[4744]: I0311 01:16:06.771966 4744 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-58df884995-rf6pm" Mar 11 01:16:06 crc kubenswrapper[4744]: I0311 01:16:06.839258 4744 scope.go:117] "RemoveContainer" containerID="285810b5e7c963db6a3ea289da93e517576304e5c084c78448e34b175d61ad7d" Mar 11 01:16:06 crc kubenswrapper[4744]: I0311 01:16:06.911568 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/8d727dc8-84d3-45b0-90e8-22a1f3f043e1-ovsdbserver-nb\") pod \"8d727dc8-84d3-45b0-90e8-22a1f3f043e1\" (UID: \"8d727dc8-84d3-45b0-90e8-22a1f3f043e1\") " Mar 11 01:16:06 crc kubenswrapper[4744]: I0311 01:16:06.911638 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8d727dc8-84d3-45b0-90e8-22a1f3f043e1-config\") pod \"8d727dc8-84d3-45b0-90e8-22a1f3f043e1\" (UID: \"8d727dc8-84d3-45b0-90e8-22a1f3f043e1\") " Mar 11 01:16:06 crc kubenswrapper[4744]: I0311 01:16:06.911699 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/8d727dc8-84d3-45b0-90e8-22a1f3f043e1-dns-svc\") pod \"8d727dc8-84d3-45b0-90e8-22a1f3f043e1\" (UID: \"8d727dc8-84d3-45b0-90e8-22a1f3f043e1\") " Mar 11 01:16:06 crc kubenswrapper[4744]: I0311 01:16:06.911795 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mcgb9\" (UniqueName: \"kubernetes.io/projected/8d727dc8-84d3-45b0-90e8-22a1f3f043e1-kube-api-access-mcgb9\") pod \"8d727dc8-84d3-45b0-90e8-22a1f3f043e1\" (UID: \"8d727dc8-84d3-45b0-90e8-22a1f3f043e1\") " Mar 11 01:16:06 crc kubenswrapper[4744]: I0311 01:16:06.911836 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/8d727dc8-84d3-45b0-90e8-22a1f3f043e1-ovsdbserver-sb\") pod \"8d727dc8-84d3-45b0-90e8-22a1f3f043e1\" (UID: 
\"8d727dc8-84d3-45b0-90e8-22a1f3f043e1\") " Mar 11 01:16:06 crc kubenswrapper[4744]: I0311 01:16:06.924679 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8d727dc8-84d3-45b0-90e8-22a1f3f043e1-kube-api-access-mcgb9" (OuterVolumeSpecName: "kube-api-access-mcgb9") pod "8d727dc8-84d3-45b0-90e8-22a1f3f043e1" (UID: "8d727dc8-84d3-45b0-90e8-22a1f3f043e1"). InnerVolumeSpecName "kube-api-access-mcgb9". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 01:16:06 crc kubenswrapper[4744]: I0311 01:16:06.929891 4744 scope.go:117] "RemoveContainer" containerID="b9a3760cab355f9540a39ca01c011f31d60e6acedf22fd404ab874897d439a74" Mar 11 01:16:06 crc kubenswrapper[4744]: I0311 01:16:06.977404 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8d727dc8-84d3-45b0-90e8-22a1f3f043e1-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "8d727dc8-84d3-45b0-90e8-22a1f3f043e1" (UID: "8d727dc8-84d3-45b0-90e8-22a1f3f043e1"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 11 01:16:06 crc kubenswrapper[4744]: I0311 01:16:06.984101 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8d727dc8-84d3-45b0-90e8-22a1f3f043e1-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "8d727dc8-84d3-45b0-90e8-22a1f3f043e1" (UID: "8d727dc8-84d3-45b0-90e8-22a1f3f043e1"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 11 01:16:06 crc kubenswrapper[4744]: I0311 01:16:06.990606 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8d727dc8-84d3-45b0-90e8-22a1f3f043e1-config" (OuterVolumeSpecName: "config") pod "8d727dc8-84d3-45b0-90e8-22a1f3f043e1" (UID: "8d727dc8-84d3-45b0-90e8-22a1f3f043e1"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 11 01:16:06 crc kubenswrapper[4744]: I0311 01:16:06.993589 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8d727dc8-84d3-45b0-90e8-22a1f3f043e1-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "8d727dc8-84d3-45b0-90e8-22a1f3f043e1" (UID: "8d727dc8-84d3-45b0-90e8-22a1f3f043e1"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 11 01:16:07 crc kubenswrapper[4744]: I0311 01:16:07.014765 4744 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mcgb9\" (UniqueName: \"kubernetes.io/projected/8d727dc8-84d3-45b0-90e8-22a1f3f043e1-kube-api-access-mcgb9\") on node \"crc\" DevicePath \"\"" Mar 11 01:16:07 crc kubenswrapper[4744]: I0311 01:16:07.014810 4744 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/8d727dc8-84d3-45b0-90e8-22a1f3f043e1-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 11 01:16:07 crc kubenswrapper[4744]: I0311 01:16:07.014826 4744 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/8d727dc8-84d3-45b0-90e8-22a1f3f043e1-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 11 01:16:07 crc kubenswrapper[4744]: I0311 01:16:07.014840 4744 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8d727dc8-84d3-45b0-90e8-22a1f3f043e1-config\") on node \"crc\" DevicePath \"\"" Mar 11 01:16:07 crc kubenswrapper[4744]: I0311 01:16:07.014852 4744 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/8d727dc8-84d3-45b0-90e8-22a1f3f043e1-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 11 01:16:07 crc kubenswrapper[4744]: W0311 01:16:07.204670 4744 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod30905e0e_95fa_4d7c_b586_f02ef591dc1d.slice/crio-1b477ce5549370728aa8eb65edc1a76908dab8ce1b4a0400480d246b472a9989 WatchSource:0}: Error finding container 1b477ce5549370728aa8eb65edc1a76908dab8ce1b4a0400480d246b472a9989: Status 404 returned error can't find the container with id 1b477ce5549370728aa8eb65edc1a76908dab8ce1b4a0400480d246b472a9989 Mar 11 01:16:07 crc kubenswrapper[4744]: I0311 01:16:07.210759 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"osp-secret" Mar 11 01:16:07 crc kubenswrapper[4744]: I0311 01:16:07.213986 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-p8czs"] Mar 11 01:16:07 crc kubenswrapper[4744]: I0311 01:16:07.286764 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29553196-9jct5"] Mar 11 01:16:07 crc kubenswrapper[4744]: I0311 01:16:07.381761 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 11 01:16:07 crc kubenswrapper[4744]: W0311 01:16:07.385263 4744 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8b84adb2_ba84_4796_b8c6_3bf51e850b3f.slice/crio-bf6c891570ec1c484bd853f4d92849d190a304a07d71269d825c24136203ac8a WatchSource:0}: Error finding container bf6c891570ec1c484bd853f4d92849d190a304a07d71269d825c24136203ac8a: Status 404 returned error can't find the container with id bf6c891570ec1c484bd853f4d92849d190a304a07d71269d825c24136203ac8a Mar 11 01:16:07 crc kubenswrapper[4744]: I0311 01:16:07.484346 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 11 01:16:07 crc kubenswrapper[4744]: W0311 01:16:07.484591 4744 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3a87446a_b7cc_4068_91e2_b5dbbc3cda71.slice/crio-e6983e3cb72c0c0a687dce9ef014fce5474ff36947b30a6ca0a0535243d6e751 WatchSource:0}: Error finding container e6983e3cb72c0c0a687dce9ef014fce5474ff36947b30a6ca0a0535243d6e751: Status 404 returned error can't find the container with id e6983e3cb72c0c0a687dce9ef014fce5474ff36947b30a6ca0a0535243d6e751 Mar 11 01:16:07 crc kubenswrapper[4744]: I0311 01:16:07.756183 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"8b84adb2-ba84-4796-b8c6-3bf51e850b3f","Type":"ContainerStarted","Data":"bf6c891570ec1c484bd853f4d92849d190a304a07d71269d825c24136203ac8a"} Mar 11 01:16:07 crc kubenswrapper[4744]: I0311 01:16:07.759000 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29553196-9jct5" event={"ID":"0ddd91ff-2bab-458e-b371-13bb59892f28","Type":"ContainerStarted","Data":"5e65293b6353d361f8246297fb20dc807f8d56e95edc02983019bbe79d4306b6"} Mar 11 01:16:07 crc kubenswrapper[4744]: I0311 01:16:07.761856 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-qkr98" event={"ID":"93f326d0-b9cf-4124-b5e1-ce8e1c50b9c8","Type":"ContainerStarted","Data":"d9a6597b777989fed50af9702b46118834e4a564467d8179969dc4edc482ae74"} Mar 11 01:16:07 crc kubenswrapper[4744]: I0311 01:16:07.769917 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"3a87446a-b7cc-4068-91e2-b5dbbc3cda71","Type":"ContainerStarted","Data":"e6983e3cb72c0c0a687dce9ef014fce5474ff36947b30a6ca0a0535243d6e751"} Mar 11 01:16:07 crc kubenswrapper[4744]: I0311 01:16:07.774226 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ddeba13d-7885-44ef-8454-1a7b6ef48303","Type":"ContainerStarted","Data":"8b14126910c00367d3da794d22855618e3bf024fd2f95595dca6f23a35ae6f53"} Mar 11 01:16:07 crc 
kubenswrapper[4744]: I0311 01:16:07.777147 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-p8czs" event={"ID":"30905e0e-95fa-4d7c-b586-f02ef591dc1d","Type":"ContainerStarted","Data":"fee28609ee0ea1a337fd4c5d5e10b1778640610d9c12fde1c9fb6d378bc3dee1"} Mar 11 01:16:07 crc kubenswrapper[4744]: I0311 01:16:07.777197 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-p8czs" event={"ID":"30905e0e-95fa-4d7c-b586-f02ef591dc1d","Type":"ContainerStarted","Data":"1b477ce5549370728aa8eb65edc1a76908dab8ce1b4a0400480d246b472a9989"} Mar 11 01:16:07 crc kubenswrapper[4744]: I0311 01:16:07.778660 4744 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-58df884995-rf6pm" Mar 11 01:16:07 crc kubenswrapper[4744]: I0311 01:16:07.782993 4744 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-db-sync-qkr98" podStartSLOduration=4.064094577 podStartE2EDuration="23.782969757s" podCreationTimestamp="2026-03-11 01:15:44 +0000 UTC" firstStartedPulling="2026-03-11 01:15:45.750710926 +0000 UTC m=+1302.554928531" lastFinishedPulling="2026-03-11 01:16:05.469586106 +0000 UTC m=+1322.273803711" observedRunningTime="2026-03-11 01:16:07.781645996 +0000 UTC m=+1324.585863611" watchObservedRunningTime="2026-03-11 01:16:07.782969757 +0000 UTC m=+1324.587187372" Mar 11 01:16:07 crc kubenswrapper[4744]: I0311 01:16:07.810195 4744 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-bootstrap-p8czs" podStartSLOduration=11.810171058 podStartE2EDuration="11.810171058s" podCreationTimestamp="2026-03-11 01:15:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-11 01:16:07.801069746 +0000 UTC m=+1324.605287371" watchObservedRunningTime="2026-03-11 01:16:07.810171058 +0000 UTC m=+1324.614388683" Mar 11 01:16:07 
crc kubenswrapper[4744]: I0311 01:16:07.837941 4744 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-58df884995-rf6pm"] Mar 11 01:16:07 crc kubenswrapper[4744]: I0311 01:16:07.845419 4744 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-58df884995-rf6pm"] Mar 11 01:16:07 crc kubenswrapper[4744]: I0311 01:16:07.994261 4744 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8d727dc8-84d3-45b0-90e8-22a1f3f043e1" path="/var/lib/kubelet/pods/8d727dc8-84d3-45b0-90e8-22a1f3f043e1/volumes" Mar 11 01:16:08 crc kubenswrapper[4744]: I0311 01:16:08.109963 4744 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-g9cnb" Mar 11 01:16:08 crc kubenswrapper[4744]: I0311 01:16:08.235968 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/b09baca5-8198-404f-8b8d-8f58db34f975-config\") pod \"b09baca5-8198-404f-8b8d-8f58db34f975\" (UID: \"b09baca5-8198-404f-8b8d-8f58db34f975\") " Mar 11 01:16:08 crc kubenswrapper[4744]: I0311 01:16:08.236071 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b09baca5-8198-404f-8b8d-8f58db34f975-combined-ca-bundle\") pod \"b09baca5-8198-404f-8b8d-8f58db34f975\" (UID: \"b09baca5-8198-404f-8b8d-8f58db34f975\") " Mar 11 01:16:08 crc kubenswrapper[4744]: I0311 01:16:08.236099 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lr9gk\" (UniqueName: \"kubernetes.io/projected/b09baca5-8198-404f-8b8d-8f58db34f975-kube-api-access-lr9gk\") pod \"b09baca5-8198-404f-8b8d-8f58db34f975\" (UID: \"b09baca5-8198-404f-8b8d-8f58db34f975\") " Mar 11 01:16:08 crc kubenswrapper[4744]: I0311 01:16:08.240254 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/projected/b09baca5-8198-404f-8b8d-8f58db34f975-kube-api-access-lr9gk" (OuterVolumeSpecName: "kube-api-access-lr9gk") pod "b09baca5-8198-404f-8b8d-8f58db34f975" (UID: "b09baca5-8198-404f-8b8d-8f58db34f975"). InnerVolumeSpecName "kube-api-access-lr9gk". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 01:16:08 crc kubenswrapper[4744]: I0311 01:16:08.264524 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b09baca5-8198-404f-8b8d-8f58db34f975-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "b09baca5-8198-404f-8b8d-8f58db34f975" (UID: "b09baca5-8198-404f-8b8d-8f58db34f975"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 01:16:08 crc kubenswrapper[4744]: I0311 01:16:08.266401 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b09baca5-8198-404f-8b8d-8f58db34f975-config" (OuterVolumeSpecName: "config") pod "b09baca5-8198-404f-8b8d-8f58db34f975" (UID: "b09baca5-8198-404f-8b8d-8f58db34f975"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 01:16:08 crc kubenswrapper[4744]: I0311 01:16:08.338248 4744 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/b09baca5-8198-404f-8b8d-8f58db34f975-config\") on node \"crc\" DevicePath \"\"" Mar 11 01:16:08 crc kubenswrapper[4744]: I0311 01:16:08.338554 4744 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b09baca5-8198-404f-8b8d-8f58db34f975-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 11 01:16:08 crc kubenswrapper[4744]: I0311 01:16:08.338581 4744 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lr9gk\" (UniqueName: \"kubernetes.io/projected/b09baca5-8198-404f-8b8d-8f58db34f975-kube-api-access-lr9gk\") on node \"crc\" DevicePath \"\"" Mar 11 01:16:08 crc kubenswrapper[4744]: I0311 01:16:08.792591 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-g9cnb" event={"ID":"b09baca5-8198-404f-8b8d-8f58db34f975","Type":"ContainerDied","Data":"b2229f531a05351ec8ef5d94ef9c68a0310b77480e11401093d638e4a8d74c27"} Mar 11 01:16:08 crc kubenswrapper[4744]: I0311 01:16:08.792624 4744 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b2229f531a05351ec8ef5d94ef9c68a0310b77480e11401093d638e4a8d74c27" Mar 11 01:16:08 crc kubenswrapper[4744]: I0311 01:16:08.792684 4744 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-sync-g9cnb" Mar 11 01:16:08 crc kubenswrapper[4744]: I0311 01:16:08.795506 4744 generic.go:334] "Generic (PLEG): container finished" podID="93f326d0-b9cf-4124-b5e1-ce8e1c50b9c8" containerID="d9a6597b777989fed50af9702b46118834e4a564467d8179969dc4edc482ae74" exitCode=0 Mar 11 01:16:08 crc kubenswrapper[4744]: I0311 01:16:08.795600 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-qkr98" event={"ID":"93f326d0-b9cf-4124-b5e1-ce8e1c50b9c8","Type":"ContainerDied","Data":"d9a6597b777989fed50af9702b46118834e4a564467d8179969dc4edc482ae74"} Mar 11 01:16:08 crc kubenswrapper[4744]: I0311 01:16:08.804774 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"3a87446a-b7cc-4068-91e2-b5dbbc3cda71","Type":"ContainerStarted","Data":"d81d67c0c0b038d838a1c58d63d77cb43337c0b25c6edd1ee93578a354f525fb"} Mar 11 01:16:08 crc kubenswrapper[4744]: I0311 01:16:08.815489 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"8b84adb2-ba84-4796-b8c6-3bf51e850b3f","Type":"ContainerStarted","Data":"60cba145747ef116205a3e302ad50bb5d4b00de1456716486451fbb9e79a9eb5"} Mar 11 01:16:08 crc kubenswrapper[4744]: I0311 01:16:08.815575 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"8b84adb2-ba84-4796-b8c6-3bf51e850b3f","Type":"ContainerStarted","Data":"8fbde8a2a9b6059d4fbff5c56a5a963290f4efef4a41a866b702d93ee81bb0ff"} Mar 11 01:16:08 crc kubenswrapper[4744]: I0311 01:16:08.849144 4744 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=10.849123878 podStartE2EDuration="10.849123878s" podCreationTimestamp="2026-03-11 01:15:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" 
observedRunningTime="2026-03-11 01:16:08.839378457 +0000 UTC m=+1325.643596082" watchObservedRunningTime="2026-03-11 01:16:08.849123878 +0000 UTC m=+1325.653341493" Mar 11 01:16:08 crc kubenswrapper[4744]: I0311 01:16:08.961142 4744 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Mar 11 01:16:08 crc kubenswrapper[4744]: I0311 01:16:08.961574 4744 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Mar 11 01:16:08 crc kubenswrapper[4744]: I0311 01:16:08.981891 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5864dc4585-mznwk"] Mar 11 01:16:08 crc kubenswrapper[4744]: E0311 01:16:08.982334 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8d727dc8-84d3-45b0-90e8-22a1f3f043e1" containerName="dnsmasq-dns" Mar 11 01:16:08 crc kubenswrapper[4744]: I0311 01:16:08.982347 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="8d727dc8-84d3-45b0-90e8-22a1f3f043e1" containerName="dnsmasq-dns" Mar 11 01:16:08 crc kubenswrapper[4744]: E0311 01:16:08.982367 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b09baca5-8198-404f-8b8d-8f58db34f975" containerName="neutron-db-sync" Mar 11 01:16:08 crc kubenswrapper[4744]: I0311 01:16:08.982373 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="b09baca5-8198-404f-8b8d-8f58db34f975" containerName="neutron-db-sync" Mar 11 01:16:08 crc kubenswrapper[4744]: E0311 01:16:08.982395 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8d727dc8-84d3-45b0-90e8-22a1f3f043e1" containerName="init" Mar 11 01:16:08 crc kubenswrapper[4744]: I0311 01:16:08.982402 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="8d727dc8-84d3-45b0-90e8-22a1f3f043e1" containerName="init" Mar 11 01:16:08 crc kubenswrapper[4744]: I0311 01:16:08.982562 4744 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="b09baca5-8198-404f-8b8d-8f58db34f975" containerName="neutron-db-sync" Mar 11 01:16:08 crc kubenswrapper[4744]: I0311 01:16:08.982593 4744 memory_manager.go:354] "RemoveStaleState removing state" podUID="8d727dc8-84d3-45b0-90e8-22a1f3f043e1" containerName="dnsmasq-dns" Mar 11 01:16:08 crc kubenswrapper[4744]: I0311 01:16:08.983687 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5864dc4585-mznwk" Mar 11 01:16:09 crc kubenswrapper[4744]: I0311 01:16:09.004915 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5864dc4585-mznwk"] Mar 11 01:16:09 crc kubenswrapper[4744]: I0311 01:16:09.055853 4744 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Mar 11 01:16:09 crc kubenswrapper[4744]: I0311 01:16:09.056422 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/909cbc8d-726a-427a-9233-c5c3ea5387f0-dns-svc\") pod \"dnsmasq-dns-5864dc4585-mznwk\" (UID: \"909cbc8d-726a-427a-9233-c5c3ea5387f0\") " pod="openstack/dnsmasq-dns-5864dc4585-mznwk" Mar 11 01:16:09 crc kubenswrapper[4744]: I0311 01:16:09.056532 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/909cbc8d-726a-427a-9233-c5c3ea5387f0-config\") pod \"dnsmasq-dns-5864dc4585-mznwk\" (UID: \"909cbc8d-726a-427a-9233-c5c3ea5387f0\") " pod="openstack/dnsmasq-dns-5864dc4585-mznwk" Mar 11 01:16:09 crc kubenswrapper[4744]: I0311 01:16:09.056554 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/909cbc8d-726a-427a-9233-c5c3ea5387f0-ovsdbserver-sb\") pod \"dnsmasq-dns-5864dc4585-mznwk\" (UID: \"909cbc8d-726a-427a-9233-c5c3ea5387f0\") " 
pod="openstack/dnsmasq-dns-5864dc4585-mznwk" Mar 11 01:16:09 crc kubenswrapper[4744]: I0311 01:16:09.056576 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/909cbc8d-726a-427a-9233-c5c3ea5387f0-dns-swift-storage-0\") pod \"dnsmasq-dns-5864dc4585-mznwk\" (UID: \"909cbc8d-726a-427a-9233-c5c3ea5387f0\") " pod="openstack/dnsmasq-dns-5864dc4585-mznwk" Mar 11 01:16:09 crc kubenswrapper[4744]: I0311 01:16:09.056638 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z59nv\" (UniqueName: \"kubernetes.io/projected/909cbc8d-726a-427a-9233-c5c3ea5387f0-kube-api-access-z59nv\") pod \"dnsmasq-dns-5864dc4585-mznwk\" (UID: \"909cbc8d-726a-427a-9233-c5c3ea5387f0\") " pod="openstack/dnsmasq-dns-5864dc4585-mznwk" Mar 11 01:16:09 crc kubenswrapper[4744]: I0311 01:16:09.056691 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/909cbc8d-726a-427a-9233-c5c3ea5387f0-ovsdbserver-nb\") pod \"dnsmasq-dns-5864dc4585-mznwk\" (UID: \"909cbc8d-726a-427a-9233-c5c3ea5387f0\") " pod="openstack/dnsmasq-dns-5864dc4585-mznwk" Mar 11 01:16:09 crc kubenswrapper[4744]: I0311 01:16:09.063736 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-5568fddbb8-2fn4w"] Mar 11 01:16:09 crc kubenswrapper[4744]: I0311 01:16:09.065722 4744 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-5568fddbb8-2fn4w" Mar 11 01:16:09 crc kubenswrapper[4744]: I0311 01:16:09.068588 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-config" Mar 11 01:16:09 crc kubenswrapper[4744]: I0311 01:16:09.068966 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-neutron-dockercfg-mhb2h" Mar 11 01:16:09 crc kubenswrapper[4744]: I0311 01:16:09.070080 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-ovndbs" Mar 11 01:16:09 crc kubenswrapper[4744]: I0311 01:16:09.070763 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-httpd-config" Mar 11 01:16:09 crc kubenswrapper[4744]: I0311 01:16:09.075307 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-5568fddbb8-2fn4w"] Mar 11 01:16:09 crc kubenswrapper[4744]: I0311 01:16:09.104180 4744 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Mar 11 01:16:09 crc kubenswrapper[4744]: I0311 01:16:09.158489 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-czhps\" (UniqueName: \"kubernetes.io/projected/7633b6fe-9d97-4a1e-b1fb-6cf2e02ad1c9-kube-api-access-czhps\") pod \"neutron-5568fddbb8-2fn4w\" (UID: \"7633b6fe-9d97-4a1e-b1fb-6cf2e02ad1c9\") " pod="openstack/neutron-5568fddbb8-2fn4w" Mar 11 01:16:09 crc kubenswrapper[4744]: I0311 01:16:09.158824 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/909cbc8d-726a-427a-9233-c5c3ea5387f0-config\") pod \"dnsmasq-dns-5864dc4585-mznwk\" (UID: \"909cbc8d-726a-427a-9233-c5c3ea5387f0\") " pod="openstack/dnsmasq-dns-5864dc4585-mznwk" Mar 11 01:16:09 crc kubenswrapper[4744]: I0311 01:16:09.158843 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/909cbc8d-726a-427a-9233-c5c3ea5387f0-ovsdbserver-sb\") pod \"dnsmasq-dns-5864dc4585-mznwk\" (UID: \"909cbc8d-726a-427a-9233-c5c3ea5387f0\") " pod="openstack/dnsmasq-dns-5864dc4585-mznwk" Mar 11 01:16:09 crc kubenswrapper[4744]: I0311 01:16:09.159693 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/909cbc8d-726a-427a-9233-c5c3ea5387f0-config\") pod \"dnsmasq-dns-5864dc4585-mznwk\" (UID: \"909cbc8d-726a-427a-9233-c5c3ea5387f0\") " pod="openstack/dnsmasq-dns-5864dc4585-mznwk" Mar 11 01:16:09 crc kubenswrapper[4744]: I0311 01:16:09.162804 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/909cbc8d-726a-427a-9233-c5c3ea5387f0-dns-swift-storage-0\") pod \"dnsmasq-dns-5864dc4585-mznwk\" (UID: \"909cbc8d-726a-427a-9233-c5c3ea5387f0\") " pod="openstack/dnsmasq-dns-5864dc4585-mznwk" Mar 11 01:16:09 crc kubenswrapper[4744]: I0311 01:16:09.162929 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/909cbc8d-726a-427a-9233-c5c3ea5387f0-ovsdbserver-sb\") pod \"dnsmasq-dns-5864dc4585-mznwk\" (UID: \"909cbc8d-726a-427a-9233-c5c3ea5387f0\") " pod="openstack/dnsmasq-dns-5864dc4585-mznwk" Mar 11 01:16:09 crc kubenswrapper[4744]: I0311 01:16:09.162968 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/909cbc8d-726a-427a-9233-c5c3ea5387f0-dns-swift-storage-0\") pod \"dnsmasq-dns-5864dc4585-mznwk\" (UID: \"909cbc8d-726a-427a-9233-c5c3ea5387f0\") " pod="openstack/dnsmasq-dns-5864dc4585-mznwk" Mar 11 01:16:09 crc kubenswrapper[4744]: I0311 01:16:09.163022 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovndb-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/7633b6fe-9d97-4a1e-b1fb-6cf2e02ad1c9-ovndb-tls-certs\") pod \"neutron-5568fddbb8-2fn4w\" (UID: \"7633b6fe-9d97-4a1e-b1fb-6cf2e02ad1c9\") " pod="openstack/neutron-5568fddbb8-2fn4w" Mar 11 01:16:09 crc kubenswrapper[4744]: I0311 01:16:09.163131 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z59nv\" (UniqueName: \"kubernetes.io/projected/909cbc8d-726a-427a-9233-c5c3ea5387f0-kube-api-access-z59nv\") pod \"dnsmasq-dns-5864dc4585-mznwk\" (UID: \"909cbc8d-726a-427a-9233-c5c3ea5387f0\") " pod="openstack/dnsmasq-dns-5864dc4585-mznwk" Mar 11 01:16:09 crc kubenswrapper[4744]: I0311 01:16:09.163161 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/7633b6fe-9d97-4a1e-b1fb-6cf2e02ad1c9-config\") pod \"neutron-5568fddbb8-2fn4w\" (UID: \"7633b6fe-9d97-4a1e-b1fb-6cf2e02ad1c9\") " pod="openstack/neutron-5568fddbb8-2fn4w" Mar 11 01:16:09 crc kubenswrapper[4744]: I0311 01:16:09.163213 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/909cbc8d-726a-427a-9233-c5c3ea5387f0-ovsdbserver-nb\") pod \"dnsmasq-dns-5864dc4585-mznwk\" (UID: \"909cbc8d-726a-427a-9233-c5c3ea5387f0\") " pod="openstack/dnsmasq-dns-5864dc4585-mznwk" Mar 11 01:16:09 crc kubenswrapper[4744]: I0311 01:16:09.163248 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/909cbc8d-726a-427a-9233-c5c3ea5387f0-dns-svc\") pod \"dnsmasq-dns-5864dc4585-mznwk\" (UID: \"909cbc8d-726a-427a-9233-c5c3ea5387f0\") " pod="openstack/dnsmasq-dns-5864dc4585-mznwk" Mar 11 01:16:09 crc kubenswrapper[4744]: I0311 01:16:09.163281 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: 
\"kubernetes.io/secret/7633b6fe-9d97-4a1e-b1fb-6cf2e02ad1c9-httpd-config\") pod \"neutron-5568fddbb8-2fn4w\" (UID: \"7633b6fe-9d97-4a1e-b1fb-6cf2e02ad1c9\") " pod="openstack/neutron-5568fddbb8-2fn4w" Mar 11 01:16:09 crc kubenswrapper[4744]: I0311 01:16:09.163305 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7633b6fe-9d97-4a1e-b1fb-6cf2e02ad1c9-combined-ca-bundle\") pod \"neutron-5568fddbb8-2fn4w\" (UID: \"7633b6fe-9d97-4a1e-b1fb-6cf2e02ad1c9\") " pod="openstack/neutron-5568fddbb8-2fn4w" Mar 11 01:16:09 crc kubenswrapper[4744]: I0311 01:16:09.164441 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/909cbc8d-726a-427a-9233-c5c3ea5387f0-ovsdbserver-nb\") pod \"dnsmasq-dns-5864dc4585-mznwk\" (UID: \"909cbc8d-726a-427a-9233-c5c3ea5387f0\") " pod="openstack/dnsmasq-dns-5864dc4585-mznwk" Mar 11 01:16:09 crc kubenswrapper[4744]: I0311 01:16:09.165017 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/909cbc8d-726a-427a-9233-c5c3ea5387f0-dns-svc\") pod \"dnsmasq-dns-5864dc4585-mznwk\" (UID: \"909cbc8d-726a-427a-9233-c5c3ea5387f0\") " pod="openstack/dnsmasq-dns-5864dc4585-mznwk" Mar 11 01:16:09 crc kubenswrapper[4744]: I0311 01:16:09.190759 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z59nv\" (UniqueName: \"kubernetes.io/projected/909cbc8d-726a-427a-9233-c5c3ea5387f0-kube-api-access-z59nv\") pod \"dnsmasq-dns-5864dc4585-mznwk\" (UID: \"909cbc8d-726a-427a-9233-c5c3ea5387f0\") " pod="openstack/dnsmasq-dns-5864dc4585-mznwk" Mar 11 01:16:09 crc kubenswrapper[4744]: I0311 01:16:09.264692 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/7633b6fe-9d97-4a1e-b1fb-6cf2e02ad1c9-config\") pod 
\"neutron-5568fddbb8-2fn4w\" (UID: \"7633b6fe-9d97-4a1e-b1fb-6cf2e02ad1c9\") " pod="openstack/neutron-5568fddbb8-2fn4w" Mar 11 01:16:09 crc kubenswrapper[4744]: I0311 01:16:09.264765 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/7633b6fe-9d97-4a1e-b1fb-6cf2e02ad1c9-httpd-config\") pod \"neutron-5568fddbb8-2fn4w\" (UID: \"7633b6fe-9d97-4a1e-b1fb-6cf2e02ad1c9\") " pod="openstack/neutron-5568fddbb8-2fn4w" Mar 11 01:16:09 crc kubenswrapper[4744]: I0311 01:16:09.264787 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7633b6fe-9d97-4a1e-b1fb-6cf2e02ad1c9-combined-ca-bundle\") pod \"neutron-5568fddbb8-2fn4w\" (UID: \"7633b6fe-9d97-4a1e-b1fb-6cf2e02ad1c9\") " pod="openstack/neutron-5568fddbb8-2fn4w" Mar 11 01:16:09 crc kubenswrapper[4744]: I0311 01:16:09.264842 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-czhps\" (UniqueName: \"kubernetes.io/projected/7633b6fe-9d97-4a1e-b1fb-6cf2e02ad1c9-kube-api-access-czhps\") pod \"neutron-5568fddbb8-2fn4w\" (UID: \"7633b6fe-9d97-4a1e-b1fb-6cf2e02ad1c9\") " pod="openstack/neutron-5568fddbb8-2fn4w" Mar 11 01:16:09 crc kubenswrapper[4744]: I0311 01:16:09.264878 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/7633b6fe-9d97-4a1e-b1fb-6cf2e02ad1c9-ovndb-tls-certs\") pod \"neutron-5568fddbb8-2fn4w\" (UID: \"7633b6fe-9d97-4a1e-b1fb-6cf2e02ad1c9\") " pod="openstack/neutron-5568fddbb8-2fn4w" Mar 11 01:16:09 crc kubenswrapper[4744]: I0311 01:16:09.269528 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/7633b6fe-9d97-4a1e-b1fb-6cf2e02ad1c9-httpd-config\") pod \"neutron-5568fddbb8-2fn4w\" (UID: \"7633b6fe-9d97-4a1e-b1fb-6cf2e02ad1c9\") " 
pod="openstack/neutron-5568fddbb8-2fn4w" Mar 11 01:16:09 crc kubenswrapper[4744]: I0311 01:16:09.284334 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7633b6fe-9d97-4a1e-b1fb-6cf2e02ad1c9-combined-ca-bundle\") pod \"neutron-5568fddbb8-2fn4w\" (UID: \"7633b6fe-9d97-4a1e-b1fb-6cf2e02ad1c9\") " pod="openstack/neutron-5568fddbb8-2fn4w" Mar 11 01:16:09 crc kubenswrapper[4744]: I0311 01:16:09.285565 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/7633b6fe-9d97-4a1e-b1fb-6cf2e02ad1c9-ovndb-tls-certs\") pod \"neutron-5568fddbb8-2fn4w\" (UID: \"7633b6fe-9d97-4a1e-b1fb-6cf2e02ad1c9\") " pod="openstack/neutron-5568fddbb8-2fn4w" Mar 11 01:16:09 crc kubenswrapper[4744]: I0311 01:16:09.286550 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/7633b6fe-9d97-4a1e-b1fb-6cf2e02ad1c9-config\") pod \"neutron-5568fddbb8-2fn4w\" (UID: \"7633b6fe-9d97-4a1e-b1fb-6cf2e02ad1c9\") " pod="openstack/neutron-5568fddbb8-2fn4w" Mar 11 01:16:09 crc kubenswrapper[4744]: I0311 01:16:09.288575 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-czhps\" (UniqueName: \"kubernetes.io/projected/7633b6fe-9d97-4a1e-b1fb-6cf2e02ad1c9-kube-api-access-czhps\") pod \"neutron-5568fddbb8-2fn4w\" (UID: \"7633b6fe-9d97-4a1e-b1fb-6cf2e02ad1c9\") " pod="openstack/neutron-5568fddbb8-2fn4w" Mar 11 01:16:09 crc kubenswrapper[4744]: I0311 01:16:09.322311 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5864dc4585-mznwk" Mar 11 01:16:09 crc kubenswrapper[4744]: I0311 01:16:09.389850 4744 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-5568fddbb8-2fn4w" Mar 11 01:16:09 crc kubenswrapper[4744]: I0311 01:16:09.832979 4744 generic.go:334] "Generic (PLEG): container finished" podID="0ddd91ff-2bab-458e-b371-13bb59892f28" containerID="d1726b9d6f03c03c356e9d6bb32bfd5fbcb952342b378d0b48932e5022673713" exitCode=0 Mar 11 01:16:09 crc kubenswrapper[4744]: I0311 01:16:09.833141 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29553196-9jct5" event={"ID":"0ddd91ff-2bab-458e-b371-13bb59892f28","Type":"ContainerDied","Data":"d1726b9d6f03c03c356e9d6bb32bfd5fbcb952342b378d0b48932e5022673713"} Mar 11 01:16:09 crc kubenswrapper[4744]: I0311 01:16:09.839164 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ddeba13d-7885-44ef-8454-1a7b6ef48303","Type":"ContainerStarted","Data":"e39a4e1e73d4df93b17475b242b8932a88bcad5ab937ba78200e62915ecf60bd"} Mar 11 01:16:09 crc kubenswrapper[4744]: I0311 01:16:09.839464 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Mar 11 01:16:09 crc kubenswrapper[4744]: I0311 01:16:09.839528 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Mar 11 01:16:09 crc kubenswrapper[4744]: I0311 01:16:09.916639 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5864dc4585-mznwk"] Mar 11 01:16:10 crc kubenswrapper[4744]: I0311 01:16:10.098967 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-5568fddbb8-2fn4w"] Mar 11 01:16:10 crc kubenswrapper[4744]: I0311 01:16:10.600580 4744 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-sync-qkr98" Mar 11 01:16:10 crc kubenswrapper[4744]: I0311 01:16:10.691637 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zsk54\" (UniqueName: \"kubernetes.io/projected/93f326d0-b9cf-4124-b5e1-ce8e1c50b9c8-kube-api-access-zsk54\") pod \"93f326d0-b9cf-4124-b5e1-ce8e1c50b9c8\" (UID: \"93f326d0-b9cf-4124-b5e1-ce8e1c50b9c8\") " Mar 11 01:16:10 crc kubenswrapper[4744]: I0311 01:16:10.692222 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/93f326d0-b9cf-4124-b5e1-ce8e1c50b9c8-combined-ca-bundle\") pod \"93f326d0-b9cf-4124-b5e1-ce8e1c50b9c8\" (UID: \"93f326d0-b9cf-4124-b5e1-ce8e1c50b9c8\") " Mar 11 01:16:10 crc kubenswrapper[4744]: I0311 01:16:10.692481 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/93f326d0-b9cf-4124-b5e1-ce8e1c50b9c8-scripts\") pod \"93f326d0-b9cf-4124-b5e1-ce8e1c50b9c8\" (UID: \"93f326d0-b9cf-4124-b5e1-ce8e1c50b9c8\") " Mar 11 01:16:10 crc kubenswrapper[4744]: I0311 01:16:10.692722 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/93f326d0-b9cf-4124-b5e1-ce8e1c50b9c8-logs\") pod \"93f326d0-b9cf-4124-b5e1-ce8e1c50b9c8\" (UID: \"93f326d0-b9cf-4124-b5e1-ce8e1c50b9c8\") " Mar 11 01:16:10 crc kubenswrapper[4744]: I0311 01:16:10.692783 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/93f326d0-b9cf-4124-b5e1-ce8e1c50b9c8-config-data\") pod \"93f326d0-b9cf-4124-b5e1-ce8e1c50b9c8\" (UID: \"93f326d0-b9cf-4124-b5e1-ce8e1c50b9c8\") " Mar 11 01:16:10 crc kubenswrapper[4744]: I0311 01:16:10.695234 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/93f326d0-b9cf-4124-b5e1-ce8e1c50b9c8-logs" (OuterVolumeSpecName: "logs") pod "93f326d0-b9cf-4124-b5e1-ce8e1c50b9c8" (UID: "93f326d0-b9cf-4124-b5e1-ce8e1c50b9c8"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 11 01:16:10 crc kubenswrapper[4744]: I0311 01:16:10.704304 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/93f326d0-b9cf-4124-b5e1-ce8e1c50b9c8-kube-api-access-zsk54" (OuterVolumeSpecName: "kube-api-access-zsk54") pod "93f326d0-b9cf-4124-b5e1-ce8e1c50b9c8" (UID: "93f326d0-b9cf-4124-b5e1-ce8e1c50b9c8"). InnerVolumeSpecName "kube-api-access-zsk54". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 01:16:10 crc kubenswrapper[4744]: I0311 01:16:10.705390 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/93f326d0-b9cf-4124-b5e1-ce8e1c50b9c8-scripts" (OuterVolumeSpecName: "scripts") pod "93f326d0-b9cf-4124-b5e1-ce8e1c50b9c8" (UID: "93f326d0-b9cf-4124-b5e1-ce8e1c50b9c8"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 01:16:10 crc kubenswrapper[4744]: I0311 01:16:10.778085 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/93f326d0-b9cf-4124-b5e1-ce8e1c50b9c8-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "93f326d0-b9cf-4124-b5e1-ce8e1c50b9c8" (UID: "93f326d0-b9cf-4124-b5e1-ce8e1c50b9c8"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 01:16:10 crc kubenswrapper[4744]: I0311 01:16:10.783595 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/93f326d0-b9cf-4124-b5e1-ce8e1c50b9c8-config-data" (OuterVolumeSpecName: "config-data") pod "93f326d0-b9cf-4124-b5e1-ce8e1c50b9c8" (UID: "93f326d0-b9cf-4124-b5e1-ce8e1c50b9c8"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 01:16:10 crc kubenswrapper[4744]: I0311 01:16:10.795098 4744 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/93f326d0-b9cf-4124-b5e1-ce8e1c50b9c8-scripts\") on node \"crc\" DevicePath \"\"" Mar 11 01:16:10 crc kubenswrapper[4744]: I0311 01:16:10.795133 4744 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/93f326d0-b9cf-4124-b5e1-ce8e1c50b9c8-logs\") on node \"crc\" DevicePath \"\"" Mar 11 01:16:10 crc kubenswrapper[4744]: I0311 01:16:10.795145 4744 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/93f326d0-b9cf-4124-b5e1-ce8e1c50b9c8-config-data\") on node \"crc\" DevicePath \"\"" Mar 11 01:16:10 crc kubenswrapper[4744]: I0311 01:16:10.795161 4744 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zsk54\" (UniqueName: \"kubernetes.io/projected/93f326d0-b9cf-4124-b5e1-ce8e1c50b9c8-kube-api-access-zsk54\") on node \"crc\" DevicePath \"\"" Mar 11 01:16:10 crc kubenswrapper[4744]: I0311 01:16:10.795174 4744 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/93f326d0-b9cf-4124-b5e1-ce8e1c50b9c8-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 11 01:16:10 crc kubenswrapper[4744]: I0311 01:16:10.855394 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"3a87446a-b7cc-4068-91e2-b5dbbc3cda71","Type":"ContainerStarted","Data":"c34020c1c08a3e59f5915d2c9c5efef80c6c178d17557caed339439d1ac2e9f9"} Mar 11 01:16:10 crc kubenswrapper[4744]: I0311 01:16:10.861850 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-qkr98" event={"ID":"93f326d0-b9cf-4124-b5e1-ce8e1c50b9c8","Type":"ContainerDied","Data":"1dc27ff56af04f555f2294d07d8c0bb0f029e94419e3554c55f375bd61a9c0eb"} Mar 11 
01:16:10 crc kubenswrapper[4744]: I0311 01:16:10.861892 4744 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1dc27ff56af04f555f2294d07d8c0bb0f029e94419e3554c55f375bd61a9c0eb" Mar 11 01:16:10 crc kubenswrapper[4744]: I0311 01:16:10.861948 4744 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-qkr98" Mar 11 01:16:10 crc kubenswrapper[4744]: I0311 01:16:10.864004 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-5568fddbb8-2fn4w" event={"ID":"7633b6fe-9d97-4a1e-b1fb-6cf2e02ad1c9","Type":"ContainerStarted","Data":"7b8d25281984fbd8762e523200998295e6a8e82c51a73cb0a34304444754f847"} Mar 11 01:16:10 crc kubenswrapper[4744]: I0311 01:16:10.864090 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-5568fddbb8-2fn4w" event={"ID":"7633b6fe-9d97-4a1e-b1fb-6cf2e02ad1c9","Type":"ContainerStarted","Data":"f724975883cb9d897dce411c8f17e1c7380b82aa102e103367e27d2bea058e3a"} Mar 11 01:16:10 crc kubenswrapper[4744]: I0311 01:16:10.867638 4744 generic.go:334] "Generic (PLEG): container finished" podID="909cbc8d-726a-427a-9233-c5c3ea5387f0" containerID="87ce7c98f7b870db615d895adf8ea71c02d6b72c27928c9990d36c42e05a242b" exitCode=0 Mar 11 01:16:10 crc kubenswrapper[4744]: I0311 01:16:10.872287 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5864dc4585-mznwk" event={"ID":"909cbc8d-726a-427a-9233-c5c3ea5387f0","Type":"ContainerDied","Data":"87ce7c98f7b870db615d895adf8ea71c02d6b72c27928c9990d36c42e05a242b"} Mar 11 01:16:10 crc kubenswrapper[4744]: I0311 01:16:10.872327 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5864dc4585-mznwk" event={"ID":"909cbc8d-726a-427a-9233-c5c3ea5387f0","Type":"ContainerStarted","Data":"155affd04de146537333c3fb8bbda31378a7034a7ab95b80e6ca56be28dcd98a"} Mar 11 01:16:10 crc kubenswrapper[4744]: I0311 01:16:10.899308 4744 kubelet.go:2421] "SyncLoop 
ADD" source="api" pods=["openstack/placement-55fb7645db-dh4kb"] Mar 11 01:16:10 crc kubenswrapper[4744]: E0311 01:16:10.900391 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="93f326d0-b9cf-4124-b5e1-ce8e1c50b9c8" containerName="placement-db-sync" Mar 11 01:16:10 crc kubenswrapper[4744]: I0311 01:16:10.900947 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="93f326d0-b9cf-4124-b5e1-ce8e1c50b9c8" containerName="placement-db-sync" Mar 11 01:16:10 crc kubenswrapper[4744]: I0311 01:16:10.901245 4744 memory_manager.go:354] "RemoveStaleState removing state" podUID="93f326d0-b9cf-4124-b5e1-ce8e1c50b9c8" containerName="placement-db-sync" Mar 11 01:16:10 crc kubenswrapper[4744]: I0311 01:16:10.907578 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-55fb7645db-dh4kb" Mar 11 01:16:10 crc kubenswrapper[4744]: I0311 01:16:10.910663 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-scripts" Mar 11 01:16:10 crc kubenswrapper[4744]: I0311 01:16:10.911490 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-config-data" Mar 11 01:16:10 crc kubenswrapper[4744]: I0311 01:16:10.911791 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-placement-dockercfg-td6zl" Mar 11 01:16:10 crc kubenswrapper[4744]: I0311 01:16:10.911979 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-placement-internal-svc" Mar 11 01:16:10 crc kubenswrapper[4744]: I0311 01:16:10.935941 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-placement-public-svc" Mar 11 01:16:10 crc kubenswrapper[4744]: I0311 01:16:10.957573 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-55fb7645db-dh4kb"] Mar 11 01:16:10 crc kubenswrapper[4744]: I0311 01:16:10.958359 4744 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack/glance-default-external-api-0" podStartSLOduration=5.958348437 podStartE2EDuration="5.958348437s" podCreationTimestamp="2026-03-11 01:16:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-11 01:16:10.943010613 +0000 UTC m=+1327.747228218" watchObservedRunningTime="2026-03-11 01:16:10.958348437 +0000 UTC m=+1327.762566042" Mar 11 01:16:11 crc kubenswrapper[4744]: I0311 01:16:11.101439 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/5d40e511-126b-428d-aad8-c7c6ca90ec9a-public-tls-certs\") pod \"placement-55fb7645db-dh4kb\" (UID: \"5d40e511-126b-428d-aad8-c7c6ca90ec9a\") " pod="openstack/placement-55fb7645db-dh4kb" Mar 11 01:16:11 crc kubenswrapper[4744]: I0311 01:16:11.101552 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5d40e511-126b-428d-aad8-c7c6ca90ec9a-config-data\") pod \"placement-55fb7645db-dh4kb\" (UID: \"5d40e511-126b-428d-aad8-c7c6ca90ec9a\") " pod="openstack/placement-55fb7645db-dh4kb" Mar 11 01:16:11 crc kubenswrapper[4744]: I0311 01:16:11.101581 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5d40e511-126b-428d-aad8-c7c6ca90ec9a-logs\") pod \"placement-55fb7645db-dh4kb\" (UID: \"5d40e511-126b-428d-aad8-c7c6ca90ec9a\") " pod="openstack/placement-55fb7645db-dh4kb" Mar 11 01:16:11 crc kubenswrapper[4744]: I0311 01:16:11.101623 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5d40e511-126b-428d-aad8-c7c6ca90ec9a-scripts\") pod \"placement-55fb7645db-dh4kb\" (UID: \"5d40e511-126b-428d-aad8-c7c6ca90ec9a\") " 
pod="openstack/placement-55fb7645db-dh4kb" Mar 11 01:16:11 crc kubenswrapper[4744]: I0311 01:16:11.101689 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/5d40e511-126b-428d-aad8-c7c6ca90ec9a-internal-tls-certs\") pod \"placement-55fb7645db-dh4kb\" (UID: \"5d40e511-126b-428d-aad8-c7c6ca90ec9a\") " pod="openstack/placement-55fb7645db-dh4kb" Mar 11 01:16:11 crc kubenswrapper[4744]: I0311 01:16:11.101743 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cpsvr\" (UniqueName: \"kubernetes.io/projected/5d40e511-126b-428d-aad8-c7c6ca90ec9a-kube-api-access-cpsvr\") pod \"placement-55fb7645db-dh4kb\" (UID: \"5d40e511-126b-428d-aad8-c7c6ca90ec9a\") " pod="openstack/placement-55fb7645db-dh4kb" Mar 11 01:16:11 crc kubenswrapper[4744]: I0311 01:16:11.101761 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5d40e511-126b-428d-aad8-c7c6ca90ec9a-combined-ca-bundle\") pod \"placement-55fb7645db-dh4kb\" (UID: \"5d40e511-126b-428d-aad8-c7c6ca90ec9a\") " pod="openstack/placement-55fb7645db-dh4kb" Mar 11 01:16:11 crc kubenswrapper[4744]: I0311 01:16:11.204406 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/5d40e511-126b-428d-aad8-c7c6ca90ec9a-internal-tls-certs\") pod \"placement-55fb7645db-dh4kb\" (UID: \"5d40e511-126b-428d-aad8-c7c6ca90ec9a\") " pod="openstack/placement-55fb7645db-dh4kb" Mar 11 01:16:11 crc kubenswrapper[4744]: I0311 01:16:11.204787 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cpsvr\" (UniqueName: \"kubernetes.io/projected/5d40e511-126b-428d-aad8-c7c6ca90ec9a-kube-api-access-cpsvr\") pod \"placement-55fb7645db-dh4kb\" (UID: 
\"5d40e511-126b-428d-aad8-c7c6ca90ec9a\") " pod="openstack/placement-55fb7645db-dh4kb" Mar 11 01:16:11 crc kubenswrapper[4744]: I0311 01:16:11.204832 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5d40e511-126b-428d-aad8-c7c6ca90ec9a-combined-ca-bundle\") pod \"placement-55fb7645db-dh4kb\" (UID: \"5d40e511-126b-428d-aad8-c7c6ca90ec9a\") " pod="openstack/placement-55fb7645db-dh4kb" Mar 11 01:16:11 crc kubenswrapper[4744]: I0311 01:16:11.204853 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/5d40e511-126b-428d-aad8-c7c6ca90ec9a-public-tls-certs\") pod \"placement-55fb7645db-dh4kb\" (UID: \"5d40e511-126b-428d-aad8-c7c6ca90ec9a\") " pod="openstack/placement-55fb7645db-dh4kb" Mar 11 01:16:11 crc kubenswrapper[4744]: I0311 01:16:11.204993 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5d40e511-126b-428d-aad8-c7c6ca90ec9a-config-data\") pod \"placement-55fb7645db-dh4kb\" (UID: \"5d40e511-126b-428d-aad8-c7c6ca90ec9a\") " pod="openstack/placement-55fb7645db-dh4kb" Mar 11 01:16:11 crc kubenswrapper[4744]: I0311 01:16:11.205018 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5d40e511-126b-428d-aad8-c7c6ca90ec9a-logs\") pod \"placement-55fb7645db-dh4kb\" (UID: \"5d40e511-126b-428d-aad8-c7c6ca90ec9a\") " pod="openstack/placement-55fb7645db-dh4kb" Mar 11 01:16:11 crc kubenswrapper[4744]: I0311 01:16:11.205094 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5d40e511-126b-428d-aad8-c7c6ca90ec9a-scripts\") pod \"placement-55fb7645db-dh4kb\" (UID: \"5d40e511-126b-428d-aad8-c7c6ca90ec9a\") " pod="openstack/placement-55fb7645db-dh4kb" Mar 11 01:16:11 crc 
kubenswrapper[4744]: I0311 01:16:11.207071 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5d40e511-126b-428d-aad8-c7c6ca90ec9a-logs\") pod \"placement-55fb7645db-dh4kb\" (UID: \"5d40e511-126b-428d-aad8-c7c6ca90ec9a\") " pod="openstack/placement-55fb7645db-dh4kb" Mar 11 01:16:11 crc kubenswrapper[4744]: I0311 01:16:11.218773 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/5d40e511-126b-428d-aad8-c7c6ca90ec9a-internal-tls-certs\") pod \"placement-55fb7645db-dh4kb\" (UID: \"5d40e511-126b-428d-aad8-c7c6ca90ec9a\") " pod="openstack/placement-55fb7645db-dh4kb" Mar 11 01:16:11 crc kubenswrapper[4744]: I0311 01:16:11.221051 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/5d40e511-126b-428d-aad8-c7c6ca90ec9a-public-tls-certs\") pod \"placement-55fb7645db-dh4kb\" (UID: \"5d40e511-126b-428d-aad8-c7c6ca90ec9a\") " pod="openstack/placement-55fb7645db-dh4kb" Mar 11 01:16:11 crc kubenswrapper[4744]: I0311 01:16:11.228545 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5d40e511-126b-428d-aad8-c7c6ca90ec9a-scripts\") pod \"placement-55fb7645db-dh4kb\" (UID: \"5d40e511-126b-428d-aad8-c7c6ca90ec9a\") " pod="openstack/placement-55fb7645db-dh4kb" Mar 11 01:16:11 crc kubenswrapper[4744]: I0311 01:16:11.233284 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5d40e511-126b-428d-aad8-c7c6ca90ec9a-config-data\") pod \"placement-55fb7645db-dh4kb\" (UID: \"5d40e511-126b-428d-aad8-c7c6ca90ec9a\") " pod="openstack/placement-55fb7645db-dh4kb" Mar 11 01:16:11 crc kubenswrapper[4744]: I0311 01:16:11.233924 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cpsvr\" (UniqueName: 
\"kubernetes.io/projected/5d40e511-126b-428d-aad8-c7c6ca90ec9a-kube-api-access-cpsvr\") pod \"placement-55fb7645db-dh4kb\" (UID: \"5d40e511-126b-428d-aad8-c7c6ca90ec9a\") " pod="openstack/placement-55fb7645db-dh4kb" Mar 11 01:16:11 crc kubenswrapper[4744]: I0311 01:16:11.235452 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5d40e511-126b-428d-aad8-c7c6ca90ec9a-combined-ca-bundle\") pod \"placement-55fb7645db-dh4kb\" (UID: \"5d40e511-126b-428d-aad8-c7c6ca90ec9a\") " pod="openstack/placement-55fb7645db-dh4kb" Mar 11 01:16:11 crc kubenswrapper[4744]: I0311 01:16:11.274293 4744 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29553196-9jct5" Mar 11 01:16:11 crc kubenswrapper[4744]: I0311 01:16:11.364350 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-55fb7645db-dh4kb" Mar 11 01:16:11 crc kubenswrapper[4744]: I0311 01:16:11.407343 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-twgrt\" (UniqueName: \"kubernetes.io/projected/0ddd91ff-2bab-458e-b371-13bb59892f28-kube-api-access-twgrt\") pod \"0ddd91ff-2bab-458e-b371-13bb59892f28\" (UID: \"0ddd91ff-2bab-458e-b371-13bb59892f28\") " Mar 11 01:16:11 crc kubenswrapper[4744]: I0311 01:16:11.413422 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0ddd91ff-2bab-458e-b371-13bb59892f28-kube-api-access-twgrt" (OuterVolumeSpecName: "kube-api-access-twgrt") pod "0ddd91ff-2bab-458e-b371-13bb59892f28" (UID: "0ddd91ff-2bab-458e-b371-13bb59892f28"). InnerVolumeSpecName "kube-api-access-twgrt". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 01:16:11 crc kubenswrapper[4744]: I0311 01:16:11.509173 4744 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-twgrt\" (UniqueName: \"kubernetes.io/projected/0ddd91ff-2bab-458e-b371-13bb59892f28-kube-api-access-twgrt\") on node \"crc\" DevicePath \"\"" Mar 11 01:16:11 crc kubenswrapper[4744]: I0311 01:16:11.859126 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-55fb7645db-dh4kb"] Mar 11 01:16:11 crc kubenswrapper[4744]: I0311 01:16:11.885740 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-5568fddbb8-2fn4w" event={"ID":"7633b6fe-9d97-4a1e-b1fb-6cf2e02ad1c9","Type":"ContainerStarted","Data":"1a1f227b90e4b8581f9aee4646f1053a56e9395e2bb663593d46ed7803fc986d"} Mar 11 01:16:11 crc kubenswrapper[4744]: I0311 01:16:11.886426 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/neutron-5568fddbb8-2fn4w" Mar 11 01:16:11 crc kubenswrapper[4744]: I0311 01:16:11.889584 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5864dc4585-mznwk" event={"ID":"909cbc8d-726a-427a-9233-c5c3ea5387f0","Type":"ContainerStarted","Data":"771097dfdfd9beb6cb610bfc57db5e2fe0f8386f0ff8e2f93f74df2d8bfdf38c"} Mar 11 01:16:11 crc kubenswrapper[4744]: I0311 01:16:11.889724 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-5864dc4585-mznwk" Mar 11 01:16:11 crc kubenswrapper[4744]: I0311 01:16:11.892333 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-55fb7645db-dh4kb" event={"ID":"5d40e511-126b-428d-aad8-c7c6ca90ec9a","Type":"ContainerStarted","Data":"52433837c70b6c098da56e50528d55dbb5db2a0c92a88f0a940dfe0f071bd418"} Mar 11 01:16:11 crc kubenswrapper[4744]: I0311 01:16:11.893254 4744 generic.go:334] "Generic (PLEG): container finished" podID="30905e0e-95fa-4d7c-b586-f02ef591dc1d" 
containerID="fee28609ee0ea1a337fd4c5d5e10b1778640610d9c12fde1c9fb6d378bc3dee1" exitCode=0 Mar 11 01:16:11 crc kubenswrapper[4744]: I0311 01:16:11.893292 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-p8czs" event={"ID":"30905e0e-95fa-4d7c-b586-f02ef591dc1d","Type":"ContainerDied","Data":"fee28609ee0ea1a337fd4c5d5e10b1778640610d9c12fde1c9fb6d378bc3dee1"} Mar 11 01:16:11 crc kubenswrapper[4744]: I0311 01:16:11.894427 4744 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29553196-9jct5" Mar 11 01:16:11 crc kubenswrapper[4744]: I0311 01:16:11.894713 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29553196-9jct5" event={"ID":"0ddd91ff-2bab-458e-b371-13bb59892f28","Type":"ContainerDied","Data":"5e65293b6353d361f8246297fb20dc807f8d56e95edc02983019bbe79d4306b6"} Mar 11 01:16:11 crc kubenswrapper[4744]: I0311 01:16:11.894735 4744 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5e65293b6353d361f8246297fb20dc807f8d56e95edc02983019bbe79d4306b6" Mar 11 01:16:11 crc kubenswrapper[4744]: I0311 01:16:11.910209 4744 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-5568fddbb8-2fn4w" podStartSLOduration=2.910193304 podStartE2EDuration="2.910193304s" podCreationTimestamp="2026-03-11 01:16:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-11 01:16:11.907021407 +0000 UTC m=+1328.711239012" watchObservedRunningTime="2026-03-11 01:16:11.910193304 +0000 UTC m=+1328.714410909" Mar 11 01:16:11 crc kubenswrapper[4744]: I0311 01:16:11.957824 4744 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-5864dc4585-mznwk" podStartSLOduration=3.957805526 podStartE2EDuration="3.957805526s" podCreationTimestamp="2026-03-11 01:16:08 +0000 
UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-11 01:16:11.953846364 +0000 UTC m=+1328.758063969" watchObservedRunningTime="2026-03-11 01:16:11.957805526 +0000 UTC m=+1328.762023131" Mar 11 01:16:12 crc kubenswrapper[4744]: I0311 01:16:12.378575 4744 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29553190-9nkn2"] Mar 11 01:16:12 crc kubenswrapper[4744]: I0311 01:16:12.384369 4744 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29553190-9nkn2"] Mar 11 01:16:12 crc kubenswrapper[4744]: I0311 01:16:12.394178 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-77cdc8f7b7-2kx6j"] Mar 11 01:16:12 crc kubenswrapper[4744]: E0311 01:16:12.394543 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0ddd91ff-2bab-458e-b371-13bb59892f28" containerName="oc" Mar 11 01:16:12 crc kubenswrapper[4744]: I0311 01:16:12.394563 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="0ddd91ff-2bab-458e-b371-13bb59892f28" containerName="oc" Mar 11 01:16:12 crc kubenswrapper[4744]: I0311 01:16:12.394758 4744 memory_manager.go:354] "RemoveStaleState removing state" podUID="0ddd91ff-2bab-458e-b371-13bb59892f28" containerName="oc" Mar 11 01:16:12 crc kubenswrapper[4744]: I0311 01:16:12.402711 4744 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-77cdc8f7b7-2kx6j" Mar 11 01:16:12 crc kubenswrapper[4744]: I0311 01:16:12.431677 4744 patch_prober.go:28] interesting pod/machine-config-daemon-678nx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 11 01:16:12 crc kubenswrapper[4744]: I0311 01:16:12.431755 4744 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-678nx" podUID="a15dc7ac-7c34-4135-b6eb-a85122800ce9" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 11 01:16:12 crc kubenswrapper[4744]: I0311 01:16:12.432104 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-77cdc8f7b7-2kx6j"] Mar 11 01:16:12 crc kubenswrapper[4744]: I0311 01:16:12.432363 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-public-svc" Mar 11 01:16:12 crc kubenswrapper[4744]: I0311 01:16:12.436774 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-internal-svc" Mar 11 01:16:12 crc kubenswrapper[4744]: I0311 01:16:12.450876 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xmtsh\" (UniqueName: \"kubernetes.io/projected/e5f6b131-1501-45aa-8ebf-76f7d454baad-kube-api-access-xmtsh\") pod \"neutron-77cdc8f7b7-2kx6j\" (UID: \"e5f6b131-1501-45aa-8ebf-76f7d454baad\") " pod="openstack/neutron-77cdc8f7b7-2kx6j" Mar 11 01:16:12 crc kubenswrapper[4744]: I0311 01:16:12.451000 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/e5f6b131-1501-45aa-8ebf-76f7d454baad-combined-ca-bundle\") pod \"neutron-77cdc8f7b7-2kx6j\" (UID: \"e5f6b131-1501-45aa-8ebf-76f7d454baad\") " pod="openstack/neutron-77cdc8f7b7-2kx6j" Mar 11 01:16:12 crc kubenswrapper[4744]: I0311 01:16:12.451031 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/e5f6b131-1501-45aa-8ebf-76f7d454baad-httpd-config\") pod \"neutron-77cdc8f7b7-2kx6j\" (UID: \"e5f6b131-1501-45aa-8ebf-76f7d454baad\") " pod="openstack/neutron-77cdc8f7b7-2kx6j" Mar 11 01:16:12 crc kubenswrapper[4744]: I0311 01:16:12.451097 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/e5f6b131-1501-45aa-8ebf-76f7d454baad-internal-tls-certs\") pod \"neutron-77cdc8f7b7-2kx6j\" (UID: \"e5f6b131-1501-45aa-8ebf-76f7d454baad\") " pod="openstack/neutron-77cdc8f7b7-2kx6j" Mar 11 01:16:12 crc kubenswrapper[4744]: I0311 01:16:12.451198 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/e5f6b131-1501-45aa-8ebf-76f7d454baad-config\") pod \"neutron-77cdc8f7b7-2kx6j\" (UID: \"e5f6b131-1501-45aa-8ebf-76f7d454baad\") " pod="openstack/neutron-77cdc8f7b7-2kx6j" Mar 11 01:16:12 crc kubenswrapper[4744]: I0311 01:16:12.451219 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/e5f6b131-1501-45aa-8ebf-76f7d454baad-ovndb-tls-certs\") pod \"neutron-77cdc8f7b7-2kx6j\" (UID: \"e5f6b131-1501-45aa-8ebf-76f7d454baad\") " pod="openstack/neutron-77cdc8f7b7-2kx6j" Mar 11 01:16:12 crc kubenswrapper[4744]: I0311 01:16:12.451240 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/e5f6b131-1501-45aa-8ebf-76f7d454baad-public-tls-certs\") pod \"neutron-77cdc8f7b7-2kx6j\" (UID: \"e5f6b131-1501-45aa-8ebf-76f7d454baad\") " pod="openstack/neutron-77cdc8f7b7-2kx6j" Mar 11 01:16:12 crc kubenswrapper[4744]: I0311 01:16:12.552403 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e5f6b131-1501-45aa-8ebf-76f7d454baad-combined-ca-bundle\") pod \"neutron-77cdc8f7b7-2kx6j\" (UID: \"e5f6b131-1501-45aa-8ebf-76f7d454baad\") " pod="openstack/neutron-77cdc8f7b7-2kx6j" Mar 11 01:16:12 crc kubenswrapper[4744]: I0311 01:16:12.552456 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/e5f6b131-1501-45aa-8ebf-76f7d454baad-httpd-config\") pod \"neutron-77cdc8f7b7-2kx6j\" (UID: \"e5f6b131-1501-45aa-8ebf-76f7d454baad\") " pod="openstack/neutron-77cdc8f7b7-2kx6j" Mar 11 01:16:12 crc kubenswrapper[4744]: I0311 01:16:12.552485 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/e5f6b131-1501-45aa-8ebf-76f7d454baad-internal-tls-certs\") pod \"neutron-77cdc8f7b7-2kx6j\" (UID: \"e5f6b131-1501-45aa-8ebf-76f7d454baad\") " pod="openstack/neutron-77cdc8f7b7-2kx6j" Mar 11 01:16:12 crc kubenswrapper[4744]: I0311 01:16:12.552545 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/e5f6b131-1501-45aa-8ebf-76f7d454baad-config\") pod \"neutron-77cdc8f7b7-2kx6j\" (UID: \"e5f6b131-1501-45aa-8ebf-76f7d454baad\") " pod="openstack/neutron-77cdc8f7b7-2kx6j" Mar 11 01:16:12 crc kubenswrapper[4744]: I0311 01:16:12.552566 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/e5f6b131-1501-45aa-8ebf-76f7d454baad-ovndb-tls-certs\") pod 
\"neutron-77cdc8f7b7-2kx6j\" (UID: \"e5f6b131-1501-45aa-8ebf-76f7d454baad\") " pod="openstack/neutron-77cdc8f7b7-2kx6j" Mar 11 01:16:12 crc kubenswrapper[4744]: I0311 01:16:12.552580 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/e5f6b131-1501-45aa-8ebf-76f7d454baad-public-tls-certs\") pod \"neutron-77cdc8f7b7-2kx6j\" (UID: \"e5f6b131-1501-45aa-8ebf-76f7d454baad\") " pod="openstack/neutron-77cdc8f7b7-2kx6j" Mar 11 01:16:12 crc kubenswrapper[4744]: I0311 01:16:12.552609 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xmtsh\" (UniqueName: \"kubernetes.io/projected/e5f6b131-1501-45aa-8ebf-76f7d454baad-kube-api-access-xmtsh\") pod \"neutron-77cdc8f7b7-2kx6j\" (UID: \"e5f6b131-1501-45aa-8ebf-76f7d454baad\") " pod="openstack/neutron-77cdc8f7b7-2kx6j" Mar 11 01:16:12 crc kubenswrapper[4744]: I0311 01:16:12.562371 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/e5f6b131-1501-45aa-8ebf-76f7d454baad-internal-tls-certs\") pod \"neutron-77cdc8f7b7-2kx6j\" (UID: \"e5f6b131-1501-45aa-8ebf-76f7d454baad\") " pod="openstack/neutron-77cdc8f7b7-2kx6j" Mar 11 01:16:12 crc kubenswrapper[4744]: I0311 01:16:12.566386 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e5f6b131-1501-45aa-8ebf-76f7d454baad-combined-ca-bundle\") pod \"neutron-77cdc8f7b7-2kx6j\" (UID: \"e5f6b131-1501-45aa-8ebf-76f7d454baad\") " pod="openstack/neutron-77cdc8f7b7-2kx6j" Mar 11 01:16:12 crc kubenswrapper[4744]: I0311 01:16:12.566366 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/e5f6b131-1501-45aa-8ebf-76f7d454baad-public-tls-certs\") pod \"neutron-77cdc8f7b7-2kx6j\" (UID: \"e5f6b131-1501-45aa-8ebf-76f7d454baad\") " 
pod="openstack/neutron-77cdc8f7b7-2kx6j" Mar 11 01:16:12 crc kubenswrapper[4744]: I0311 01:16:12.569223 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/e5f6b131-1501-45aa-8ebf-76f7d454baad-httpd-config\") pod \"neutron-77cdc8f7b7-2kx6j\" (UID: \"e5f6b131-1501-45aa-8ebf-76f7d454baad\") " pod="openstack/neutron-77cdc8f7b7-2kx6j" Mar 11 01:16:12 crc kubenswrapper[4744]: I0311 01:16:12.571222 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/e5f6b131-1501-45aa-8ebf-76f7d454baad-ovndb-tls-certs\") pod \"neutron-77cdc8f7b7-2kx6j\" (UID: \"e5f6b131-1501-45aa-8ebf-76f7d454baad\") " pod="openstack/neutron-77cdc8f7b7-2kx6j" Mar 11 01:16:12 crc kubenswrapper[4744]: I0311 01:16:12.578556 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/e5f6b131-1501-45aa-8ebf-76f7d454baad-config\") pod \"neutron-77cdc8f7b7-2kx6j\" (UID: \"e5f6b131-1501-45aa-8ebf-76f7d454baad\") " pod="openstack/neutron-77cdc8f7b7-2kx6j" Mar 11 01:16:12 crc kubenswrapper[4744]: I0311 01:16:12.586978 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xmtsh\" (UniqueName: \"kubernetes.io/projected/e5f6b131-1501-45aa-8ebf-76f7d454baad-kube-api-access-xmtsh\") pod \"neutron-77cdc8f7b7-2kx6j\" (UID: \"e5f6b131-1501-45aa-8ebf-76f7d454baad\") " pod="openstack/neutron-77cdc8f7b7-2kx6j" Mar 11 01:16:12 crc kubenswrapper[4744]: I0311 01:16:12.775693 4744 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-77cdc8f7b7-2kx6j" Mar 11 01:16:12 crc kubenswrapper[4744]: I0311 01:16:12.918097 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-55fb7645db-dh4kb" event={"ID":"5d40e511-126b-428d-aad8-c7c6ca90ec9a","Type":"ContainerStarted","Data":"0a4949e41ac540c50bbdd22a80ed256ec6d9d7caf5e686eda3841b02bb821d2c"} Mar 11 01:16:12 crc kubenswrapper[4744]: I0311 01:16:12.918451 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-55fb7645db-dh4kb" event={"ID":"5d40e511-126b-428d-aad8-c7c6ca90ec9a","Type":"ContainerStarted","Data":"8e2224b4b9b7987e2d126a73f2bc506ebe6f656270eba4312826b4f951a2cc8a"} Mar 11 01:16:12 crc kubenswrapper[4744]: I0311 01:16:12.941350 4744 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-55fb7645db-dh4kb" podStartSLOduration=2.941335233 podStartE2EDuration="2.941335233s" podCreationTimestamp="2026-03-11 01:16:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-11 01:16:12.937952548 +0000 UTC m=+1329.742170153" watchObservedRunningTime="2026-03-11 01:16:12.941335233 +0000 UTC m=+1329.745552838" Mar 11 01:16:13 crc kubenswrapper[4744]: I0311 01:16:13.929777 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-55fb7645db-dh4kb" Mar 11 01:16:13 crc kubenswrapper[4744]: I0311 01:16:13.930226 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-55fb7645db-dh4kb" Mar 11 01:16:13 crc kubenswrapper[4744]: I0311 01:16:13.985058 4744 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="60e9671f-6963-4873-8357-2580e9b768f0" path="/var/lib/kubelet/pods/60e9671f-6963-4873-8357-2580e9b768f0/volumes" Mar 11 01:16:14 crc kubenswrapper[4744]: I0311 01:16:14.052946 4744 pod_container_manager_linux.go:210] "Failed to delete cgroup paths" 
cgroupName=["kubepods","besteffort","pod810da0cb-5013-4997-84ba-4437bce2a20d"] err="unable to destroy cgroup paths for cgroup [kubepods besteffort pod810da0cb-5013-4997-84ba-4437bce2a20d] : Timed out while waiting for systemd to remove kubepods-besteffort-pod810da0cb_5013_4997_84ba_4437bce2a20d.slice" Mar 11 01:16:14 crc kubenswrapper[4744]: E0311 01:16:14.053002 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to delete cgroup paths for [kubepods besteffort pod810da0cb-5013-4997-84ba-4437bce2a20d] : unable to destroy cgroup paths for cgroup [kubepods besteffort pod810da0cb-5013-4997-84ba-4437bce2a20d] : Timed out while waiting for systemd to remove kubepods-besteffort-pod810da0cb_5013_4997_84ba_4437bce2a20d.slice" pod="openstack/keystone-db-sync-r8hsb" podUID="810da0cb-5013-4997-84ba-4437bce2a20d" Mar 11 01:16:14 crc kubenswrapper[4744]: I0311 01:16:14.947360 4744 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-r8hsb" Mar 11 01:16:14 crc kubenswrapper[4744]: I0311 01:16:14.947579 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-p8czs" event={"ID":"30905e0e-95fa-4d7c-b586-f02ef591dc1d","Type":"ContainerDied","Data":"1b477ce5549370728aa8eb65edc1a76908dab8ce1b4a0400480d246b472a9989"} Mar 11 01:16:14 crc kubenswrapper[4744]: I0311 01:16:14.947756 4744 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1b477ce5549370728aa8eb65edc1a76908dab8ce1b4a0400480d246b472a9989" Mar 11 01:16:15 crc kubenswrapper[4744]: I0311 01:16:15.066378 4744 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-p8czs" Mar 11 01:16:15 crc kubenswrapper[4744]: I0311 01:16:15.214424 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/30905e0e-95fa-4d7c-b586-f02ef591dc1d-credential-keys\") pod \"30905e0e-95fa-4d7c-b586-f02ef591dc1d\" (UID: \"30905e0e-95fa-4d7c-b586-f02ef591dc1d\") " Mar 11 01:16:15 crc kubenswrapper[4744]: I0311 01:16:15.214532 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/30905e0e-95fa-4d7c-b586-f02ef591dc1d-scripts\") pod \"30905e0e-95fa-4d7c-b586-f02ef591dc1d\" (UID: \"30905e0e-95fa-4d7c-b586-f02ef591dc1d\") " Mar 11 01:16:15 crc kubenswrapper[4744]: I0311 01:16:15.214694 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/30905e0e-95fa-4d7c-b586-f02ef591dc1d-config-data\") pod \"30905e0e-95fa-4d7c-b586-f02ef591dc1d\" (UID: \"30905e0e-95fa-4d7c-b586-f02ef591dc1d\") " Mar 11 01:16:15 crc kubenswrapper[4744]: I0311 01:16:15.214816 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jstlv\" (UniqueName: \"kubernetes.io/projected/30905e0e-95fa-4d7c-b586-f02ef591dc1d-kube-api-access-jstlv\") pod \"30905e0e-95fa-4d7c-b586-f02ef591dc1d\" (UID: \"30905e0e-95fa-4d7c-b586-f02ef591dc1d\") " Mar 11 01:16:15 crc kubenswrapper[4744]: I0311 01:16:15.214841 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/30905e0e-95fa-4d7c-b586-f02ef591dc1d-combined-ca-bundle\") pod \"30905e0e-95fa-4d7c-b586-f02ef591dc1d\" (UID: \"30905e0e-95fa-4d7c-b586-f02ef591dc1d\") " Mar 11 01:16:15 crc kubenswrapper[4744]: I0311 01:16:15.214866 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" 
(UniqueName: \"kubernetes.io/secret/30905e0e-95fa-4d7c-b586-f02ef591dc1d-fernet-keys\") pod \"30905e0e-95fa-4d7c-b586-f02ef591dc1d\" (UID: \"30905e0e-95fa-4d7c-b586-f02ef591dc1d\") " Mar 11 01:16:15 crc kubenswrapper[4744]: I0311 01:16:15.223291 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/30905e0e-95fa-4d7c-b586-f02ef591dc1d-scripts" (OuterVolumeSpecName: "scripts") pod "30905e0e-95fa-4d7c-b586-f02ef591dc1d" (UID: "30905e0e-95fa-4d7c-b586-f02ef591dc1d"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 01:16:15 crc kubenswrapper[4744]: I0311 01:16:15.223901 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/30905e0e-95fa-4d7c-b586-f02ef591dc1d-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "30905e0e-95fa-4d7c-b586-f02ef591dc1d" (UID: "30905e0e-95fa-4d7c-b586-f02ef591dc1d"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 01:16:15 crc kubenswrapper[4744]: I0311 01:16:15.236482 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/30905e0e-95fa-4d7c-b586-f02ef591dc1d-kube-api-access-jstlv" (OuterVolumeSpecName: "kube-api-access-jstlv") pod "30905e0e-95fa-4d7c-b586-f02ef591dc1d" (UID: "30905e0e-95fa-4d7c-b586-f02ef591dc1d"). InnerVolumeSpecName "kube-api-access-jstlv". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 01:16:15 crc kubenswrapper[4744]: I0311 01:16:15.237911 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/30905e0e-95fa-4d7c-b586-f02ef591dc1d-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "30905e0e-95fa-4d7c-b586-f02ef591dc1d" (UID: "30905e0e-95fa-4d7c-b586-f02ef591dc1d"). InnerVolumeSpecName "credential-keys". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 01:16:15 crc kubenswrapper[4744]: I0311 01:16:15.263745 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/30905e0e-95fa-4d7c-b586-f02ef591dc1d-config-data" (OuterVolumeSpecName: "config-data") pod "30905e0e-95fa-4d7c-b586-f02ef591dc1d" (UID: "30905e0e-95fa-4d7c-b586-f02ef591dc1d"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 01:16:15 crc kubenswrapper[4744]: I0311 01:16:15.266159 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/30905e0e-95fa-4d7c-b586-f02ef591dc1d-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "30905e0e-95fa-4d7c-b586-f02ef591dc1d" (UID: "30905e0e-95fa-4d7c-b586-f02ef591dc1d"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 01:16:15 crc kubenswrapper[4744]: I0311 01:16:15.284908 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-77cdc8f7b7-2kx6j"] Mar 11 01:16:15 crc kubenswrapper[4744]: I0311 01:16:15.317000 4744 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jstlv\" (UniqueName: \"kubernetes.io/projected/30905e0e-95fa-4d7c-b586-f02ef591dc1d-kube-api-access-jstlv\") on node \"crc\" DevicePath \"\"" Mar 11 01:16:15 crc kubenswrapper[4744]: I0311 01:16:15.317027 4744 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/30905e0e-95fa-4d7c-b586-f02ef591dc1d-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 11 01:16:15 crc kubenswrapper[4744]: I0311 01:16:15.317036 4744 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/30905e0e-95fa-4d7c-b586-f02ef591dc1d-fernet-keys\") on node \"crc\" DevicePath \"\"" Mar 11 01:16:15 crc kubenswrapper[4744]: I0311 01:16:15.317045 4744 reconciler_common.go:293] 
"Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/30905e0e-95fa-4d7c-b586-f02ef591dc1d-credential-keys\") on node \"crc\" DevicePath \"\"" Mar 11 01:16:15 crc kubenswrapper[4744]: I0311 01:16:15.317057 4744 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/30905e0e-95fa-4d7c-b586-f02ef591dc1d-scripts\") on node \"crc\" DevicePath \"\"" Mar 11 01:16:15 crc kubenswrapper[4744]: I0311 01:16:15.317068 4744 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/30905e0e-95fa-4d7c-b586-f02ef591dc1d-config-data\") on node \"crc\" DevicePath \"\"" Mar 11 01:16:15 crc kubenswrapper[4744]: I0311 01:16:15.957995 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ddeba13d-7885-44ef-8454-1a7b6ef48303","Type":"ContainerStarted","Data":"d9359028d08e29da98bd43eaa9ae47f57140e8a5ed0ffb677e05f84587705c2d"} Mar 11 01:16:15 crc kubenswrapper[4744]: I0311 01:16:15.961924 4744 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-p8czs" Mar 11 01:16:15 crc kubenswrapper[4744]: I0311 01:16:15.961964 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-77cdc8f7b7-2kx6j" event={"ID":"e5f6b131-1501-45aa-8ebf-76f7d454baad","Type":"ContainerStarted","Data":"5334318444a1f510d9eb0205e5cb310212f8727d01dd9f4d646fb2022768d411"} Mar 11 01:16:15 crc kubenswrapper[4744]: I0311 01:16:15.961992 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-77cdc8f7b7-2kx6j" event={"ID":"e5f6b131-1501-45aa-8ebf-76f7d454baad","Type":"ContainerStarted","Data":"a2626049f1cb7982d192118248ec35ce41865252920a41c7b1b441c5781153f9"} Mar 11 01:16:15 crc kubenswrapper[4744]: I0311 01:16:15.962001 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-77cdc8f7b7-2kx6j" event={"ID":"e5f6b131-1501-45aa-8ebf-76f7d454baad","Type":"ContainerStarted","Data":"630a474957df816c074be1456412a7125e8edd61e818cd9a41b2148ac79e6da3"} Mar 11 01:16:15 crc kubenswrapper[4744]: I0311 01:16:15.962045 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/neutron-77cdc8f7b7-2kx6j" Mar 11 01:16:15 crc kubenswrapper[4744]: I0311 01:16:15.988909 4744 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-77cdc8f7b7-2kx6j" podStartSLOduration=3.988896293 podStartE2EDuration="3.988896293s" podCreationTimestamp="2026-03-11 01:16:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-11 01:16:15.9859202 +0000 UTC m=+1332.790137805" watchObservedRunningTime="2026-03-11 01:16:15.988896293 +0000 UTC m=+1332.793113898" Mar 11 01:16:16 crc kubenswrapper[4744]: I0311 01:16:16.066259 4744 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Mar 11 01:16:16 crc kubenswrapper[4744]: I0311 01:16:16.066318 4744 
kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Mar 11 01:16:16 crc kubenswrapper[4744]: I0311 01:16:16.112073 4744 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Mar 11 01:16:16 crc kubenswrapper[4744]: I0311 01:16:16.120936 4744 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Mar 11 01:16:16 crc kubenswrapper[4744]: I0311 01:16:16.178178 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-7cc5cb746-kmb4g"] Mar 11 01:16:16 crc kubenswrapper[4744]: E0311 01:16:16.178685 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="30905e0e-95fa-4d7c-b586-f02ef591dc1d" containerName="keystone-bootstrap" Mar 11 01:16:16 crc kubenswrapper[4744]: I0311 01:16:16.178707 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="30905e0e-95fa-4d7c-b586-f02ef591dc1d" containerName="keystone-bootstrap" Mar 11 01:16:16 crc kubenswrapper[4744]: I0311 01:16:16.179002 4744 memory_manager.go:354] "RemoveStaleState removing state" podUID="30905e0e-95fa-4d7c-b586-f02ef591dc1d" containerName="keystone-bootstrap" Mar 11 01:16:16 crc kubenswrapper[4744]: I0311 01:16:16.179716 4744 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-7cc5cb746-kmb4g" Mar 11 01:16:16 crc kubenswrapper[4744]: I0311 01:16:16.182412 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-keystone-public-svc" Mar 11 01:16:16 crc kubenswrapper[4744]: I0311 01:16:16.182415 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Mar 11 01:16:16 crc kubenswrapper[4744]: I0311 01:16:16.184023 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Mar 11 01:16:16 crc kubenswrapper[4744]: I0311 01:16:16.184323 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-r54bc" Mar 11 01:16:16 crc kubenswrapper[4744]: I0311 01:16:16.184694 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-keystone-internal-svc" Mar 11 01:16:16 crc kubenswrapper[4744]: I0311 01:16:16.186301 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Mar 11 01:16:16 crc kubenswrapper[4744]: I0311 01:16:16.198199 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-7cc5cb746-kmb4g"] Mar 11 01:16:16 crc kubenswrapper[4744]: I0311 01:16:16.335730 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/2ee4fd0f-1e90-4771-bca2-2eb17df0b0b1-internal-tls-certs\") pod \"keystone-7cc5cb746-kmb4g\" (UID: \"2ee4fd0f-1e90-4771-bca2-2eb17df0b0b1\") " pod="openstack/keystone-7cc5cb746-kmb4g" Mar 11 01:16:16 crc kubenswrapper[4744]: I0311 01:16:16.335783 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2ee4fd0f-1e90-4771-bca2-2eb17df0b0b1-config-data\") pod \"keystone-7cc5cb746-kmb4g\" (UID: \"2ee4fd0f-1e90-4771-bca2-2eb17df0b0b1\") " 
pod="openstack/keystone-7cc5cb746-kmb4g" Mar 11 01:16:16 crc kubenswrapper[4744]: I0311 01:16:16.335805 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2ee4fd0f-1e90-4771-bca2-2eb17df0b0b1-scripts\") pod \"keystone-7cc5cb746-kmb4g\" (UID: \"2ee4fd0f-1e90-4771-bca2-2eb17df0b0b1\") " pod="openstack/keystone-7cc5cb746-kmb4g" Mar 11 01:16:16 crc kubenswrapper[4744]: I0311 01:16:16.335824 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/2ee4fd0f-1e90-4771-bca2-2eb17df0b0b1-credential-keys\") pod \"keystone-7cc5cb746-kmb4g\" (UID: \"2ee4fd0f-1e90-4771-bca2-2eb17df0b0b1\") " pod="openstack/keystone-7cc5cb746-kmb4g" Mar 11 01:16:16 crc kubenswrapper[4744]: I0311 01:16:16.335851 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-75lzm\" (UniqueName: \"kubernetes.io/projected/2ee4fd0f-1e90-4771-bca2-2eb17df0b0b1-kube-api-access-75lzm\") pod \"keystone-7cc5cb746-kmb4g\" (UID: \"2ee4fd0f-1e90-4771-bca2-2eb17df0b0b1\") " pod="openstack/keystone-7cc5cb746-kmb4g" Mar 11 01:16:16 crc kubenswrapper[4744]: I0311 01:16:16.335896 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/2ee4fd0f-1e90-4771-bca2-2eb17df0b0b1-public-tls-certs\") pod \"keystone-7cc5cb746-kmb4g\" (UID: \"2ee4fd0f-1e90-4771-bca2-2eb17df0b0b1\") " pod="openstack/keystone-7cc5cb746-kmb4g" Mar 11 01:16:16 crc kubenswrapper[4744]: I0311 01:16:16.335936 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2ee4fd0f-1e90-4771-bca2-2eb17df0b0b1-combined-ca-bundle\") pod \"keystone-7cc5cb746-kmb4g\" (UID: \"2ee4fd0f-1e90-4771-bca2-2eb17df0b0b1\") 
" pod="openstack/keystone-7cc5cb746-kmb4g" Mar 11 01:16:16 crc kubenswrapper[4744]: I0311 01:16:16.335967 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/2ee4fd0f-1e90-4771-bca2-2eb17df0b0b1-fernet-keys\") pod \"keystone-7cc5cb746-kmb4g\" (UID: \"2ee4fd0f-1e90-4771-bca2-2eb17df0b0b1\") " pod="openstack/keystone-7cc5cb746-kmb4g" Mar 11 01:16:16 crc kubenswrapper[4744]: I0311 01:16:16.437863 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/2ee4fd0f-1e90-4771-bca2-2eb17df0b0b1-public-tls-certs\") pod \"keystone-7cc5cb746-kmb4g\" (UID: \"2ee4fd0f-1e90-4771-bca2-2eb17df0b0b1\") " pod="openstack/keystone-7cc5cb746-kmb4g" Mar 11 01:16:16 crc kubenswrapper[4744]: I0311 01:16:16.438946 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2ee4fd0f-1e90-4771-bca2-2eb17df0b0b1-combined-ca-bundle\") pod \"keystone-7cc5cb746-kmb4g\" (UID: \"2ee4fd0f-1e90-4771-bca2-2eb17df0b0b1\") " pod="openstack/keystone-7cc5cb746-kmb4g" Mar 11 01:16:16 crc kubenswrapper[4744]: I0311 01:16:16.439095 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/2ee4fd0f-1e90-4771-bca2-2eb17df0b0b1-fernet-keys\") pod \"keystone-7cc5cb746-kmb4g\" (UID: \"2ee4fd0f-1e90-4771-bca2-2eb17df0b0b1\") " pod="openstack/keystone-7cc5cb746-kmb4g" Mar 11 01:16:16 crc kubenswrapper[4744]: I0311 01:16:16.439315 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/2ee4fd0f-1e90-4771-bca2-2eb17df0b0b1-internal-tls-certs\") pod \"keystone-7cc5cb746-kmb4g\" (UID: \"2ee4fd0f-1e90-4771-bca2-2eb17df0b0b1\") " pod="openstack/keystone-7cc5cb746-kmb4g" Mar 11 01:16:16 crc 
kubenswrapper[4744]: I0311 01:16:16.439446 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2ee4fd0f-1e90-4771-bca2-2eb17df0b0b1-config-data\") pod \"keystone-7cc5cb746-kmb4g\" (UID: \"2ee4fd0f-1e90-4771-bca2-2eb17df0b0b1\") " pod="openstack/keystone-7cc5cb746-kmb4g" Mar 11 01:16:16 crc kubenswrapper[4744]: I0311 01:16:16.439599 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2ee4fd0f-1e90-4771-bca2-2eb17df0b0b1-scripts\") pod \"keystone-7cc5cb746-kmb4g\" (UID: \"2ee4fd0f-1e90-4771-bca2-2eb17df0b0b1\") " pod="openstack/keystone-7cc5cb746-kmb4g" Mar 11 01:16:16 crc kubenswrapper[4744]: I0311 01:16:16.439710 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/2ee4fd0f-1e90-4771-bca2-2eb17df0b0b1-credential-keys\") pod \"keystone-7cc5cb746-kmb4g\" (UID: \"2ee4fd0f-1e90-4771-bca2-2eb17df0b0b1\") " pod="openstack/keystone-7cc5cb746-kmb4g" Mar 11 01:16:16 crc kubenswrapper[4744]: I0311 01:16:16.439836 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-75lzm\" (UniqueName: \"kubernetes.io/projected/2ee4fd0f-1e90-4771-bca2-2eb17df0b0b1-kube-api-access-75lzm\") pod \"keystone-7cc5cb746-kmb4g\" (UID: \"2ee4fd0f-1e90-4771-bca2-2eb17df0b0b1\") " pod="openstack/keystone-7cc5cb746-kmb4g" Mar 11 01:16:16 crc kubenswrapper[4744]: I0311 01:16:16.443434 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2ee4fd0f-1e90-4771-bca2-2eb17df0b0b1-config-data\") pod \"keystone-7cc5cb746-kmb4g\" (UID: \"2ee4fd0f-1e90-4771-bca2-2eb17df0b0b1\") " pod="openstack/keystone-7cc5cb746-kmb4g" Mar 11 01:16:16 crc kubenswrapper[4744]: I0311 01:16:16.444047 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"fernet-keys\" (UniqueName: \"kubernetes.io/secret/2ee4fd0f-1e90-4771-bca2-2eb17df0b0b1-fernet-keys\") pod \"keystone-7cc5cb746-kmb4g\" (UID: \"2ee4fd0f-1e90-4771-bca2-2eb17df0b0b1\") " pod="openstack/keystone-7cc5cb746-kmb4g" Mar 11 01:16:16 crc kubenswrapper[4744]: I0311 01:16:16.444560 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2ee4fd0f-1e90-4771-bca2-2eb17df0b0b1-scripts\") pod \"keystone-7cc5cb746-kmb4g\" (UID: \"2ee4fd0f-1e90-4771-bca2-2eb17df0b0b1\") " pod="openstack/keystone-7cc5cb746-kmb4g" Mar 11 01:16:16 crc kubenswrapper[4744]: I0311 01:16:16.444976 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/2ee4fd0f-1e90-4771-bca2-2eb17df0b0b1-credential-keys\") pod \"keystone-7cc5cb746-kmb4g\" (UID: \"2ee4fd0f-1e90-4771-bca2-2eb17df0b0b1\") " pod="openstack/keystone-7cc5cb746-kmb4g" Mar 11 01:16:16 crc kubenswrapper[4744]: I0311 01:16:16.448158 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/2ee4fd0f-1e90-4771-bca2-2eb17df0b0b1-public-tls-certs\") pod \"keystone-7cc5cb746-kmb4g\" (UID: \"2ee4fd0f-1e90-4771-bca2-2eb17df0b0b1\") " pod="openstack/keystone-7cc5cb746-kmb4g" Mar 11 01:16:16 crc kubenswrapper[4744]: I0311 01:16:16.452336 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2ee4fd0f-1e90-4771-bca2-2eb17df0b0b1-combined-ca-bundle\") pod \"keystone-7cc5cb746-kmb4g\" (UID: \"2ee4fd0f-1e90-4771-bca2-2eb17df0b0b1\") " pod="openstack/keystone-7cc5cb746-kmb4g" Mar 11 01:16:16 crc kubenswrapper[4744]: I0311 01:16:16.453857 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/2ee4fd0f-1e90-4771-bca2-2eb17df0b0b1-internal-tls-certs\") pod \"keystone-7cc5cb746-kmb4g\" 
(UID: \"2ee4fd0f-1e90-4771-bca2-2eb17df0b0b1\") " pod="openstack/keystone-7cc5cb746-kmb4g" Mar 11 01:16:16 crc kubenswrapper[4744]: I0311 01:16:16.454691 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-75lzm\" (UniqueName: \"kubernetes.io/projected/2ee4fd0f-1e90-4771-bca2-2eb17df0b0b1-kube-api-access-75lzm\") pod \"keystone-7cc5cb746-kmb4g\" (UID: \"2ee4fd0f-1e90-4771-bca2-2eb17df0b0b1\") " pod="openstack/keystone-7cc5cb746-kmb4g" Mar 11 01:16:16 crc kubenswrapper[4744]: I0311 01:16:16.496991 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-7cc5cb746-kmb4g" Mar 11 01:16:16 crc kubenswrapper[4744]: I0311 01:16:16.958103 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-7cc5cb746-kmb4g"] Mar 11 01:16:16 crc kubenswrapper[4744]: W0311 01:16:16.960612 4744 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2ee4fd0f_1e90_4771_bca2_2eb17df0b0b1.slice/crio-53604dfa6ef728013e5531a56d6c6b91cb1becd895524058d82c2d5522b089d0 WatchSource:0}: Error finding container 53604dfa6ef728013e5531a56d6c6b91cb1becd895524058d82c2d5522b089d0: Status 404 returned error can't find the container with id 53604dfa6ef728013e5531a56d6c6b91cb1becd895524058d82c2d5522b089d0 Mar 11 01:16:16 crc kubenswrapper[4744]: I0311 01:16:16.977588 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-7cc5cb746-kmb4g" event={"ID":"2ee4fd0f-1e90-4771-bca2-2eb17df0b0b1","Type":"ContainerStarted","Data":"53604dfa6ef728013e5531a56d6c6b91cb1becd895524058d82c2d5522b089d0"} Mar 11 01:16:16 crc kubenswrapper[4744]: I0311 01:16:16.978280 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Mar 11 01:16:16 crc kubenswrapper[4744]: I0311 01:16:16.978427 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openstack/glance-default-external-api-0" Mar 11 01:16:18 crc kubenswrapper[4744]: I0311 01:16:18.003979 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-7cc5cb746-kmb4g" event={"ID":"2ee4fd0f-1e90-4771-bca2-2eb17df0b0b1","Type":"ContainerStarted","Data":"a5f82d457fa59f44d576003aefd017ef2285b45e6c69abd14e7eb7f7df02fc09"} Mar 11 01:16:18 crc kubenswrapper[4744]: I0311 01:16:18.004332 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/keystone-7cc5cb746-kmb4g" Mar 11 01:16:18 crc kubenswrapper[4744]: I0311 01:16:18.030737 4744 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-7cc5cb746-kmb4g" podStartSLOduration=2.030721747 podStartE2EDuration="2.030721747s" podCreationTimestamp="2026-03-11 01:16:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-11 01:16:18.02727377 +0000 UTC m=+1334.831491375" watchObservedRunningTime="2026-03-11 01:16:18.030721747 +0000 UTC m=+1334.834939352" Mar 11 01:16:18 crc kubenswrapper[4744]: I0311 01:16:18.921906 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Mar 11 01:16:19 crc kubenswrapper[4744]: I0311 01:16:19.010562 4744 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Mar 11 01:16:19 crc kubenswrapper[4744]: I0311 01:16:19.112346 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Mar 11 01:16:19 crc kubenswrapper[4744]: I0311 01:16:19.324705 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-5864dc4585-mznwk" Mar 11 01:16:19 crc kubenswrapper[4744]: I0311 01:16:19.382353 4744 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6478444fbc-pgkhm"] Mar 11 01:16:19 crc kubenswrapper[4744]: I0311 
01:16:19.382706 4744 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-6478444fbc-pgkhm" podUID="5f47e6da-0f4e-4b87-b21d-5d0b7adef080" containerName="dnsmasq-dns" containerID="cri-o://323f56b86f07d13644f5651fced8bc7e6ed8f3296cd03d759582c4cf8afc9bcc" gracePeriod=10 Mar 11 01:16:19 crc kubenswrapper[4744]: I0311 01:16:19.887825 4744 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6478444fbc-pgkhm" Mar 11 01:16:20 crc kubenswrapper[4744]: I0311 01:16:20.011283 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/5f47e6da-0f4e-4b87-b21d-5d0b7adef080-dns-svc\") pod \"5f47e6da-0f4e-4b87-b21d-5d0b7adef080\" (UID: \"5f47e6da-0f4e-4b87-b21d-5d0b7adef080\") " Mar 11 01:16:20 crc kubenswrapper[4744]: I0311 01:16:20.011603 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/5f47e6da-0f4e-4b87-b21d-5d0b7adef080-ovsdbserver-sb\") pod \"5f47e6da-0f4e-4b87-b21d-5d0b7adef080\" (UID: \"5f47e6da-0f4e-4b87-b21d-5d0b7adef080\") " Mar 11 01:16:20 crc kubenswrapper[4744]: I0311 01:16:20.011641 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5f47e6da-0f4e-4b87-b21d-5d0b7adef080-config\") pod \"5f47e6da-0f4e-4b87-b21d-5d0b7adef080\" (UID: \"5f47e6da-0f4e-4b87-b21d-5d0b7adef080\") " Mar 11 01:16:20 crc kubenswrapper[4744]: I0311 01:16:20.011740 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/5f47e6da-0f4e-4b87-b21d-5d0b7adef080-ovsdbserver-nb\") pod \"5f47e6da-0f4e-4b87-b21d-5d0b7adef080\" (UID: \"5f47e6da-0f4e-4b87-b21d-5d0b7adef080\") " Mar 11 01:16:20 crc kubenswrapper[4744]: I0311 01:16:20.011777 4744 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/5f47e6da-0f4e-4b87-b21d-5d0b7adef080-dns-swift-storage-0\") pod \"5f47e6da-0f4e-4b87-b21d-5d0b7adef080\" (UID: \"5f47e6da-0f4e-4b87-b21d-5d0b7adef080\") " Mar 11 01:16:20 crc kubenswrapper[4744]: I0311 01:16:20.011803 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v7vwx\" (UniqueName: \"kubernetes.io/projected/5f47e6da-0f4e-4b87-b21d-5d0b7adef080-kube-api-access-v7vwx\") pod \"5f47e6da-0f4e-4b87-b21d-5d0b7adef080\" (UID: \"5f47e6da-0f4e-4b87-b21d-5d0b7adef080\") " Mar 11 01:16:20 crc kubenswrapper[4744]: I0311 01:16:20.017240 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5f47e6da-0f4e-4b87-b21d-5d0b7adef080-kube-api-access-v7vwx" (OuterVolumeSpecName: "kube-api-access-v7vwx") pod "5f47e6da-0f4e-4b87-b21d-5d0b7adef080" (UID: "5f47e6da-0f4e-4b87-b21d-5d0b7adef080"). InnerVolumeSpecName "kube-api-access-v7vwx". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 01:16:20 crc kubenswrapper[4744]: I0311 01:16:20.022834 4744 generic.go:334] "Generic (PLEG): container finished" podID="5f47e6da-0f4e-4b87-b21d-5d0b7adef080" containerID="323f56b86f07d13644f5651fced8bc7e6ed8f3296cd03d759582c4cf8afc9bcc" exitCode=0 Mar 11 01:16:20 crc kubenswrapper[4744]: I0311 01:16:20.022966 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6478444fbc-pgkhm" event={"ID":"5f47e6da-0f4e-4b87-b21d-5d0b7adef080","Type":"ContainerDied","Data":"323f56b86f07d13644f5651fced8bc7e6ed8f3296cd03d759582c4cf8afc9bcc"} Mar 11 01:16:20 crc kubenswrapper[4744]: I0311 01:16:20.023043 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6478444fbc-pgkhm" event={"ID":"5f47e6da-0f4e-4b87-b21d-5d0b7adef080","Type":"ContainerDied","Data":"dee2546bd882ee57122855c388019099ea7163d10368e479de2812d9ff00ed6b"} Mar 11 01:16:20 crc kubenswrapper[4744]: I0311 01:16:20.023106 4744 scope.go:117] "RemoveContainer" containerID="323f56b86f07d13644f5651fced8bc7e6ed8f3296cd03d759582c4cf8afc9bcc" Mar 11 01:16:20 crc kubenswrapper[4744]: I0311 01:16:20.023272 4744 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6478444fbc-pgkhm" Mar 11 01:16:20 crc kubenswrapper[4744]: I0311 01:16:20.029881 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-vkz2r" event={"ID":"64205e5d-2853-49f7-9928-8362fc9210ea","Type":"ContainerStarted","Data":"146ff6ba72638f407c5f03e5e7f7737ad544dcacd98a41c6f2d121f5c65294e6"} Mar 11 01:16:20 crc kubenswrapper[4744]: I0311 01:16:20.052036 4744 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-db-sync-vkz2r" podStartSLOduration=2.633016523 podStartE2EDuration="36.052020248s" podCreationTimestamp="2026-03-11 01:15:44 +0000 UTC" firstStartedPulling="2026-03-11 01:15:45.996599698 +0000 UTC m=+1302.800817293" lastFinishedPulling="2026-03-11 01:16:19.415603413 +0000 UTC m=+1336.219821018" observedRunningTime="2026-03-11 01:16:20.043855946 +0000 UTC m=+1336.848073541" watchObservedRunningTime="2026-03-11 01:16:20.052020248 +0000 UTC m=+1336.856237853" Mar 11 01:16:20 crc kubenswrapper[4744]: I0311 01:16:20.053602 4744 scope.go:117] "RemoveContainer" containerID="50a1eb1cd2fc2a8d73a6f158d2e6b6c133112c74e68b57860572f9d362d8190e" Mar 11 01:16:20 crc kubenswrapper[4744]: I0311 01:16:20.064781 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5f47e6da-0f4e-4b87-b21d-5d0b7adef080-config" (OuterVolumeSpecName: "config") pod "5f47e6da-0f4e-4b87-b21d-5d0b7adef080" (UID: "5f47e6da-0f4e-4b87-b21d-5d0b7adef080"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 11 01:16:20 crc kubenswrapper[4744]: I0311 01:16:20.073599 4744 scope.go:117] "RemoveContainer" containerID="323f56b86f07d13644f5651fced8bc7e6ed8f3296cd03d759582c4cf8afc9bcc" Mar 11 01:16:20 crc kubenswrapper[4744]: E0311 01:16:20.074264 4744 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"323f56b86f07d13644f5651fced8bc7e6ed8f3296cd03d759582c4cf8afc9bcc\": container with ID starting with 323f56b86f07d13644f5651fced8bc7e6ed8f3296cd03d759582c4cf8afc9bcc not found: ID does not exist" containerID="323f56b86f07d13644f5651fced8bc7e6ed8f3296cd03d759582c4cf8afc9bcc" Mar 11 01:16:20 crc kubenswrapper[4744]: I0311 01:16:20.074295 4744 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"323f56b86f07d13644f5651fced8bc7e6ed8f3296cd03d759582c4cf8afc9bcc"} err="failed to get container status \"323f56b86f07d13644f5651fced8bc7e6ed8f3296cd03d759582c4cf8afc9bcc\": rpc error: code = NotFound desc = could not find container \"323f56b86f07d13644f5651fced8bc7e6ed8f3296cd03d759582c4cf8afc9bcc\": container with ID starting with 323f56b86f07d13644f5651fced8bc7e6ed8f3296cd03d759582c4cf8afc9bcc not found: ID does not exist" Mar 11 01:16:20 crc kubenswrapper[4744]: I0311 01:16:20.074322 4744 scope.go:117] "RemoveContainer" containerID="50a1eb1cd2fc2a8d73a6f158d2e6b6c133112c74e68b57860572f9d362d8190e" Mar 11 01:16:20 crc kubenswrapper[4744]: E0311 01:16:20.074774 4744 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"50a1eb1cd2fc2a8d73a6f158d2e6b6c133112c74e68b57860572f9d362d8190e\": container with ID starting with 50a1eb1cd2fc2a8d73a6f158d2e6b6c133112c74e68b57860572f9d362d8190e not found: ID does not exist" containerID="50a1eb1cd2fc2a8d73a6f158d2e6b6c133112c74e68b57860572f9d362d8190e" Mar 11 01:16:20 crc kubenswrapper[4744]: I0311 01:16:20.074818 
4744 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"50a1eb1cd2fc2a8d73a6f158d2e6b6c133112c74e68b57860572f9d362d8190e"} err="failed to get container status \"50a1eb1cd2fc2a8d73a6f158d2e6b6c133112c74e68b57860572f9d362d8190e\": rpc error: code = NotFound desc = could not find container \"50a1eb1cd2fc2a8d73a6f158d2e6b6c133112c74e68b57860572f9d362d8190e\": container with ID starting with 50a1eb1cd2fc2a8d73a6f158d2e6b6c133112c74e68b57860572f9d362d8190e not found: ID does not exist" Mar 11 01:16:20 crc kubenswrapper[4744]: I0311 01:16:20.075078 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5f47e6da-0f4e-4b87-b21d-5d0b7adef080-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "5f47e6da-0f4e-4b87-b21d-5d0b7adef080" (UID: "5f47e6da-0f4e-4b87-b21d-5d0b7adef080"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 11 01:16:20 crc kubenswrapper[4744]: I0311 01:16:20.081249 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5f47e6da-0f4e-4b87-b21d-5d0b7adef080-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "5f47e6da-0f4e-4b87-b21d-5d0b7adef080" (UID: "5f47e6da-0f4e-4b87-b21d-5d0b7adef080"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 11 01:16:20 crc kubenswrapper[4744]: I0311 01:16:20.085256 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5f47e6da-0f4e-4b87-b21d-5d0b7adef080-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "5f47e6da-0f4e-4b87-b21d-5d0b7adef080" (UID: "5f47e6da-0f4e-4b87-b21d-5d0b7adef080"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 11 01:16:20 crc kubenswrapper[4744]: I0311 01:16:20.089604 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5f47e6da-0f4e-4b87-b21d-5d0b7adef080-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "5f47e6da-0f4e-4b87-b21d-5d0b7adef080" (UID: "5f47e6da-0f4e-4b87-b21d-5d0b7adef080"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 11 01:16:20 crc kubenswrapper[4744]: I0311 01:16:20.114643 4744 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/5f47e6da-0f4e-4b87-b21d-5d0b7adef080-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 11 01:16:20 crc kubenswrapper[4744]: I0311 01:16:20.114670 4744 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/5f47e6da-0f4e-4b87-b21d-5d0b7adef080-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 11 01:16:20 crc kubenswrapper[4744]: I0311 01:16:20.114680 4744 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5f47e6da-0f4e-4b87-b21d-5d0b7adef080-config\") on node \"crc\" DevicePath \"\"" Mar 11 01:16:20 crc kubenswrapper[4744]: I0311 01:16:20.114690 4744 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/5f47e6da-0f4e-4b87-b21d-5d0b7adef080-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 11 01:16:20 crc kubenswrapper[4744]: I0311 01:16:20.114698 4744 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/5f47e6da-0f4e-4b87-b21d-5d0b7adef080-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Mar 11 01:16:20 crc kubenswrapper[4744]: I0311 01:16:20.114708 4744 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v7vwx\" (UniqueName: 
\"kubernetes.io/projected/5f47e6da-0f4e-4b87-b21d-5d0b7adef080-kube-api-access-v7vwx\") on node \"crc\" DevicePath \"\"" Mar 11 01:16:20 crc kubenswrapper[4744]: I0311 01:16:20.363588 4744 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6478444fbc-pgkhm"] Mar 11 01:16:20 crc kubenswrapper[4744]: I0311 01:16:20.369767 4744 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-6478444fbc-pgkhm"] Mar 11 01:16:22 crc kubenswrapper[4744]: I0311 01:16:22.002465 4744 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5f47e6da-0f4e-4b87-b21d-5d0b7adef080" path="/var/lib/kubelet/pods/5f47e6da-0f4e-4b87-b21d-5d0b7adef080/volumes" Mar 11 01:16:23 crc kubenswrapper[4744]: I0311 01:16:23.057287 4744 generic.go:334] "Generic (PLEG): container finished" podID="64205e5d-2853-49f7-9928-8362fc9210ea" containerID="146ff6ba72638f407c5f03e5e7f7737ad544dcacd98a41c6f2d121f5c65294e6" exitCode=0 Mar 11 01:16:23 crc kubenswrapper[4744]: I0311 01:16:23.057344 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-vkz2r" event={"ID":"64205e5d-2853-49f7-9928-8362fc9210ea","Type":"ContainerDied","Data":"146ff6ba72638f407c5f03e5e7f7737ad544dcacd98a41c6f2d121f5c65294e6"} Mar 11 01:16:25 crc kubenswrapper[4744]: I0311 01:16:25.298681 4744 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-sync-vkz2r" Mar 11 01:16:25 crc kubenswrapper[4744]: I0311 01:16:25.422668 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/64205e5d-2853-49f7-9928-8362fc9210ea-db-sync-config-data\") pod \"64205e5d-2853-49f7-9928-8362fc9210ea\" (UID: \"64205e5d-2853-49f7-9928-8362fc9210ea\") " Mar 11 01:16:25 crc kubenswrapper[4744]: I0311 01:16:25.422729 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/64205e5d-2853-49f7-9928-8362fc9210ea-combined-ca-bundle\") pod \"64205e5d-2853-49f7-9928-8362fc9210ea\" (UID: \"64205e5d-2853-49f7-9928-8362fc9210ea\") " Mar 11 01:16:25 crc kubenswrapper[4744]: I0311 01:16:25.422937 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6jdcj\" (UniqueName: \"kubernetes.io/projected/64205e5d-2853-49f7-9928-8362fc9210ea-kube-api-access-6jdcj\") pod \"64205e5d-2853-49f7-9928-8362fc9210ea\" (UID: \"64205e5d-2853-49f7-9928-8362fc9210ea\") " Mar 11 01:16:25 crc kubenswrapper[4744]: I0311 01:16:25.426982 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/64205e5d-2853-49f7-9928-8362fc9210ea-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "64205e5d-2853-49f7-9928-8362fc9210ea" (UID: "64205e5d-2853-49f7-9928-8362fc9210ea"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 01:16:25 crc kubenswrapper[4744]: I0311 01:16:25.427012 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/64205e5d-2853-49f7-9928-8362fc9210ea-kube-api-access-6jdcj" (OuterVolumeSpecName: "kube-api-access-6jdcj") pod "64205e5d-2853-49f7-9928-8362fc9210ea" (UID: "64205e5d-2853-49f7-9928-8362fc9210ea"). 
InnerVolumeSpecName "kube-api-access-6jdcj". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 01:16:25 crc kubenswrapper[4744]: I0311 01:16:25.468896 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/64205e5d-2853-49f7-9928-8362fc9210ea-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "64205e5d-2853-49f7-9928-8362fc9210ea" (UID: "64205e5d-2853-49f7-9928-8362fc9210ea"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 01:16:25 crc kubenswrapper[4744]: I0311 01:16:25.525010 4744 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6jdcj\" (UniqueName: \"kubernetes.io/projected/64205e5d-2853-49f7-9928-8362fc9210ea-kube-api-access-6jdcj\") on node \"crc\" DevicePath \"\"" Mar 11 01:16:25 crc kubenswrapper[4744]: I0311 01:16:25.525039 4744 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/64205e5d-2853-49f7-9928-8362fc9210ea-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Mar 11 01:16:25 crc kubenswrapper[4744]: I0311 01:16:25.525049 4744 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/64205e5d-2853-49f7-9928-8362fc9210ea-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 11 01:16:26 crc kubenswrapper[4744]: I0311 01:16:26.085492 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-vkz2r" event={"ID":"64205e5d-2853-49f7-9928-8362fc9210ea","Type":"ContainerDied","Data":"66bcdf1c8eea36678580c89ede8345badf342926555db788ebe38be83b2b832b"} Mar 11 01:16:26 crc kubenswrapper[4744]: I0311 01:16:26.085839 4744 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="66bcdf1c8eea36678580c89ede8345badf342926555db788ebe38be83b2b832b" Mar 11 01:16:26 crc kubenswrapper[4744]: I0311 01:16:26.085575 4744 util.go:48] "No ready 
sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-sync-vkz2r" Mar 11 01:16:26 crc kubenswrapper[4744]: I0311 01:16:26.088657 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ddeba13d-7885-44ef-8454-1a7b6ef48303","Type":"ContainerStarted","Data":"0238cd388ba680a38261ff524effabf6eb3f5c23b4fa6ad76b42367e73b94275"} Mar 11 01:16:26 crc kubenswrapper[4744]: I0311 01:16:26.088760 4744 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="ddeba13d-7885-44ef-8454-1a7b6ef48303" containerName="ceilometer-central-agent" containerID="cri-o://8b14126910c00367d3da794d22855618e3bf024fd2f95595dca6f23a35ae6f53" gracePeriod=30 Mar 11 01:16:26 crc kubenswrapper[4744]: I0311 01:16:26.088820 4744 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="ddeba13d-7885-44ef-8454-1a7b6ef48303" containerName="ceilometer-notification-agent" containerID="cri-o://e39a4e1e73d4df93b17475b242b8932a88bcad5ab937ba78200e62915ecf60bd" gracePeriod=30 Mar 11 01:16:26 crc kubenswrapper[4744]: I0311 01:16:26.088836 4744 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="ddeba13d-7885-44ef-8454-1a7b6ef48303" containerName="sg-core" containerID="cri-o://d9359028d08e29da98bd43eaa9ae47f57140e8a5ed0ffb677e05f84587705c2d" gracePeriod=30 Mar 11 01:16:26 crc kubenswrapper[4744]: I0311 01:16:26.088881 4744 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="ddeba13d-7885-44ef-8454-1a7b6ef48303" containerName="proxy-httpd" containerID="cri-o://0238cd388ba680a38261ff524effabf6eb3f5c23b4fa6ad76b42367e73b94275" gracePeriod=30 Mar 11 01:16:26 crc kubenswrapper[4744]: I0311 01:16:26.088920 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Mar 11 01:16:26 crc kubenswrapper[4744]: I0311 
01:16:26.092245 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-hqbzq" event={"ID":"5df37d98-3dbc-4977-add0-525bda3d679b","Type":"ContainerStarted","Data":"0f9b7029580d55c5f783ab75363f97ad48322c0595bdb273cac92c6d226c63be"} Mar 11 01:16:26 crc kubenswrapper[4744]: I0311 01:16:26.132178 4744 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.9361960959999998 podStartE2EDuration="42.132153312s" podCreationTimestamp="2026-03-11 01:15:44 +0000 UTC" firstStartedPulling="2026-03-11 01:15:46.113838442 +0000 UTC m=+1302.918056047" lastFinishedPulling="2026-03-11 01:16:25.309795608 +0000 UTC m=+1342.114013263" observedRunningTime="2026-03-11 01:16:26.128358645 +0000 UTC m=+1342.932576260" watchObservedRunningTime="2026-03-11 01:16:26.132153312 +0000 UTC m=+1342.936370937" Mar 11 01:16:26 crc kubenswrapper[4744]: I0311 01:16:26.169923 4744 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-db-sync-hqbzq" podStartSLOduration=2.6665758520000002 podStartE2EDuration="42.16990735s" podCreationTimestamp="2026-03-11 01:15:44 +0000 UTC" firstStartedPulling="2026-03-11 01:15:45.797390189 +0000 UTC m=+1302.601607794" lastFinishedPulling="2026-03-11 01:16:25.300721687 +0000 UTC m=+1342.104939292" observedRunningTime="2026-03-11 01:16:26.158400464 +0000 UTC m=+1342.962618069" watchObservedRunningTime="2026-03-11 01:16:26.16990735 +0000 UTC m=+1342.974124955" Mar 11 01:16:26 crc kubenswrapper[4744]: I0311 01:16:26.592546 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-worker-bc7567ff7-gl658"] Mar 11 01:16:26 crc kubenswrapper[4744]: E0311 01:16:26.592856 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="64205e5d-2853-49f7-9928-8362fc9210ea" containerName="barbican-db-sync" Mar 11 01:16:26 crc kubenswrapper[4744]: I0311 01:16:26.592867 4744 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="64205e5d-2853-49f7-9928-8362fc9210ea" containerName="barbican-db-sync" Mar 11 01:16:26 crc kubenswrapper[4744]: E0311 01:16:26.592884 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5f47e6da-0f4e-4b87-b21d-5d0b7adef080" containerName="dnsmasq-dns" Mar 11 01:16:26 crc kubenswrapper[4744]: I0311 01:16:26.592890 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="5f47e6da-0f4e-4b87-b21d-5d0b7adef080" containerName="dnsmasq-dns" Mar 11 01:16:26 crc kubenswrapper[4744]: E0311 01:16:26.592903 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5f47e6da-0f4e-4b87-b21d-5d0b7adef080" containerName="init" Mar 11 01:16:26 crc kubenswrapper[4744]: I0311 01:16:26.592908 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="5f47e6da-0f4e-4b87-b21d-5d0b7adef080" containerName="init" Mar 11 01:16:26 crc kubenswrapper[4744]: I0311 01:16:26.593249 4744 memory_manager.go:354] "RemoveStaleState removing state" podUID="5f47e6da-0f4e-4b87-b21d-5d0b7adef080" containerName="dnsmasq-dns" Mar 11 01:16:26 crc kubenswrapper[4744]: I0311 01:16:26.593268 4744 memory_manager.go:354] "RemoveStaleState removing state" podUID="64205e5d-2853-49f7-9928-8362fc9210ea" containerName="barbican-db-sync" Mar 11 01:16:26 crc kubenswrapper[4744]: I0311 01:16:26.594128 4744 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-worker-bc7567ff7-gl658" Mar 11 01:16:26 crc kubenswrapper[4744]: I0311 01:16:26.596672 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-worker-config-data" Mar 11 01:16:26 crc kubenswrapper[4744]: I0311 01:16:26.606019 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-barbican-dockercfg-5ssj5" Mar 11 01:16:26 crc kubenswrapper[4744]: I0311 01:16:26.606796 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-config-data" Mar 11 01:16:26 crc kubenswrapper[4744]: I0311 01:16:26.613172 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-keystone-listener-7647d7b844-j6gcn"] Mar 11 01:16:26 crc kubenswrapper[4744]: I0311 01:16:26.619263 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-keystone-listener-7647d7b844-j6gcn" Mar 11 01:16:26 crc kubenswrapper[4744]: I0311 01:16:26.621017 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-keystone-listener-config-data" Mar 11 01:16:26 crc kubenswrapper[4744]: I0311 01:16:26.650583 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-keystone-listener-7647d7b844-j6gcn"] Mar 11 01:16:26 crc kubenswrapper[4744]: I0311 01:16:26.676274 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-worker-bc7567ff7-gl658"] Mar 11 01:16:26 crc kubenswrapper[4744]: I0311 01:16:26.692169 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5b78c5c5d5-dzk6k"] Mar 11 01:16:26 crc kubenswrapper[4744]: I0311 01:16:26.696792 4744 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5b78c5c5d5-dzk6k" Mar 11 01:16:26 crc kubenswrapper[4744]: I0311 01:16:26.731233 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5b78c5c5d5-dzk6k"] Mar 11 01:16:26 crc kubenswrapper[4744]: I0311 01:16:26.771496 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8df98dbd-473b-4630-81ab-edd6419feb0d-logs\") pod \"barbican-keystone-listener-7647d7b844-j6gcn\" (UID: \"8df98dbd-473b-4630-81ab-edd6419feb0d\") " pod="openstack/barbican-keystone-listener-7647d7b844-j6gcn" Mar 11 01:16:26 crc kubenswrapper[4744]: I0311 01:16:26.771569 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4fb8af9e-ef1e-45b0-b842-2647fe75510e-logs\") pod \"barbican-worker-bc7567ff7-gl658\" (UID: \"4fb8af9e-ef1e-45b0-b842-2647fe75510e\") " pod="openstack/barbican-worker-bc7567ff7-gl658" Mar 11 01:16:26 crc kubenswrapper[4744]: I0311 01:16:26.771601 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zp6cx\" (UniqueName: \"kubernetes.io/projected/8df98dbd-473b-4630-81ab-edd6419feb0d-kube-api-access-zp6cx\") pod \"barbican-keystone-listener-7647d7b844-j6gcn\" (UID: \"8df98dbd-473b-4630-81ab-edd6419feb0d\") " pod="openstack/barbican-keystone-listener-7647d7b844-j6gcn" Mar 11 01:16:26 crc kubenswrapper[4744]: I0311 01:16:26.771625 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8df98dbd-473b-4630-81ab-edd6419feb0d-combined-ca-bundle\") pod \"barbican-keystone-listener-7647d7b844-j6gcn\" (UID: \"8df98dbd-473b-4630-81ab-edd6419feb0d\") " pod="openstack/barbican-keystone-listener-7647d7b844-j6gcn" Mar 11 01:16:26 crc kubenswrapper[4744]: I0311 
01:16:26.771661 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4fb8af9e-ef1e-45b0-b842-2647fe75510e-config-data\") pod \"barbican-worker-bc7567ff7-gl658\" (UID: \"4fb8af9e-ef1e-45b0-b842-2647fe75510e\") " pod="openstack/barbican-worker-bc7567ff7-gl658" Mar 11 01:16:26 crc kubenswrapper[4744]: I0311 01:16:26.771688 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8df98dbd-473b-4630-81ab-edd6419feb0d-config-data\") pod \"barbican-keystone-listener-7647d7b844-j6gcn\" (UID: \"8df98dbd-473b-4630-81ab-edd6419feb0d\") " pod="openstack/barbican-keystone-listener-7647d7b844-j6gcn" Mar 11 01:16:26 crc kubenswrapper[4744]: I0311 01:16:26.771717 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4fb8af9e-ef1e-45b0-b842-2647fe75510e-combined-ca-bundle\") pod \"barbican-worker-bc7567ff7-gl658\" (UID: \"4fb8af9e-ef1e-45b0-b842-2647fe75510e\") " pod="openstack/barbican-worker-bc7567ff7-gl658" Mar 11 01:16:26 crc kubenswrapper[4744]: I0311 01:16:26.771743 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/8df98dbd-473b-4630-81ab-edd6419feb0d-config-data-custom\") pod \"barbican-keystone-listener-7647d7b844-j6gcn\" (UID: \"8df98dbd-473b-4630-81ab-edd6419feb0d\") " pod="openstack/barbican-keystone-listener-7647d7b844-j6gcn" Mar 11 01:16:26 crc kubenswrapper[4744]: I0311 01:16:26.771766 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8vvch\" (UniqueName: \"kubernetes.io/projected/4fb8af9e-ef1e-45b0-b842-2647fe75510e-kube-api-access-8vvch\") pod \"barbican-worker-bc7567ff7-gl658\" (UID: 
\"4fb8af9e-ef1e-45b0-b842-2647fe75510e\") " pod="openstack/barbican-worker-bc7567ff7-gl658" Mar 11 01:16:26 crc kubenswrapper[4744]: I0311 01:16:26.771783 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/4fb8af9e-ef1e-45b0-b842-2647fe75510e-config-data-custom\") pod \"barbican-worker-bc7567ff7-gl658\" (UID: \"4fb8af9e-ef1e-45b0-b842-2647fe75510e\") " pod="openstack/barbican-worker-bc7567ff7-gl658" Mar 11 01:16:26 crc kubenswrapper[4744]: I0311 01:16:26.782355 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-api-7bcd5444fb-98mrs"] Mar 11 01:16:26 crc kubenswrapper[4744]: I0311 01:16:26.783604 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-7bcd5444fb-98mrs" Mar 11 01:16:26 crc kubenswrapper[4744]: I0311 01:16:26.791828 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-api-config-data" Mar 11 01:16:26 crc kubenswrapper[4744]: I0311 01:16:26.827464 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-7bcd5444fb-98mrs"] Mar 11 01:16:26 crc kubenswrapper[4744]: I0311 01:16:26.875907 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b6e79975-841a-4d71-9906-e53607f1b3fb-config-data\") pod \"barbican-api-7bcd5444fb-98mrs\" (UID: \"b6e79975-841a-4d71-9906-e53607f1b3fb\") " pod="openstack/barbican-api-7bcd5444fb-98mrs" Mar 11 01:16:26 crc kubenswrapper[4744]: I0311 01:16:26.875983 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/08522fac-9099-4781-95f9-6f676a290be9-ovsdbserver-sb\") pod \"dnsmasq-dns-5b78c5c5d5-dzk6k\" (UID: \"08522fac-9099-4781-95f9-6f676a290be9\") " 
pod="openstack/dnsmasq-dns-5b78c5c5d5-dzk6k" Mar 11 01:16:26 crc kubenswrapper[4744]: I0311 01:16:26.876038 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4fb8af9e-ef1e-45b0-b842-2647fe75510e-logs\") pod \"barbican-worker-bc7567ff7-gl658\" (UID: \"4fb8af9e-ef1e-45b0-b842-2647fe75510e\") " pod="openstack/barbican-worker-bc7567ff7-gl658" Mar 11 01:16:26 crc kubenswrapper[4744]: I0311 01:16:26.877131 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b6e79975-841a-4d71-9906-e53607f1b3fb-combined-ca-bundle\") pod \"barbican-api-7bcd5444fb-98mrs\" (UID: \"b6e79975-841a-4d71-9906-e53607f1b3fb\") " pod="openstack/barbican-api-7bcd5444fb-98mrs" Mar 11 01:16:26 crc kubenswrapper[4744]: I0311 01:16:26.877236 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zp6cx\" (UniqueName: \"kubernetes.io/projected/8df98dbd-473b-4630-81ab-edd6419feb0d-kube-api-access-zp6cx\") pod \"barbican-keystone-listener-7647d7b844-j6gcn\" (UID: \"8df98dbd-473b-4630-81ab-edd6419feb0d\") " pod="openstack/barbican-keystone-listener-7647d7b844-j6gcn" Mar 11 01:16:26 crc kubenswrapper[4744]: I0311 01:16:26.877285 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8df98dbd-473b-4630-81ab-edd6419feb0d-combined-ca-bundle\") pod \"barbican-keystone-listener-7647d7b844-j6gcn\" (UID: \"8df98dbd-473b-4630-81ab-edd6419feb0d\") " pod="openstack/barbican-keystone-listener-7647d7b844-j6gcn" Mar 11 01:16:26 crc kubenswrapper[4744]: I0311 01:16:26.877330 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4fb8af9e-ef1e-45b0-b842-2647fe75510e-logs\") pod \"barbican-worker-bc7567ff7-gl658\" (UID: 
\"4fb8af9e-ef1e-45b0-b842-2647fe75510e\") " pod="openstack/barbican-worker-bc7567ff7-gl658" Mar 11 01:16:26 crc kubenswrapper[4744]: I0311 01:16:26.877346 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/08522fac-9099-4781-95f9-6f676a290be9-dns-svc\") pod \"dnsmasq-dns-5b78c5c5d5-dzk6k\" (UID: \"08522fac-9099-4781-95f9-6f676a290be9\") " pod="openstack/dnsmasq-dns-5b78c5c5d5-dzk6k" Mar 11 01:16:26 crc kubenswrapper[4744]: I0311 01:16:26.877415 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4fb8af9e-ef1e-45b0-b842-2647fe75510e-config-data\") pod \"barbican-worker-bc7567ff7-gl658\" (UID: \"4fb8af9e-ef1e-45b0-b842-2647fe75510e\") " pod="openstack/barbican-worker-bc7567ff7-gl658" Mar 11 01:16:26 crc kubenswrapper[4744]: I0311 01:16:26.877459 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/08522fac-9099-4781-95f9-6f676a290be9-dns-swift-storage-0\") pod \"dnsmasq-dns-5b78c5c5d5-dzk6k\" (UID: \"08522fac-9099-4781-95f9-6f676a290be9\") " pod="openstack/dnsmasq-dns-5b78c5c5d5-dzk6k" Mar 11 01:16:26 crc kubenswrapper[4744]: I0311 01:16:26.877487 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8df98dbd-473b-4630-81ab-edd6419feb0d-config-data\") pod \"barbican-keystone-listener-7647d7b844-j6gcn\" (UID: \"8df98dbd-473b-4630-81ab-edd6419feb0d\") " pod="openstack/barbican-keystone-listener-7647d7b844-j6gcn" Mar 11 01:16:26 crc kubenswrapper[4744]: I0311 01:16:26.877930 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/08522fac-9099-4781-95f9-6f676a290be9-config\") pod \"dnsmasq-dns-5b78c5c5d5-dzk6k\" 
(UID: \"08522fac-9099-4781-95f9-6f676a290be9\") " pod="openstack/dnsmasq-dns-5b78c5c5d5-dzk6k" Mar 11 01:16:26 crc kubenswrapper[4744]: I0311 01:16:26.877965 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/08522fac-9099-4781-95f9-6f676a290be9-ovsdbserver-nb\") pod \"dnsmasq-dns-5b78c5c5d5-dzk6k\" (UID: \"08522fac-9099-4781-95f9-6f676a290be9\") " pod="openstack/dnsmasq-dns-5b78c5c5d5-dzk6k" Mar 11 01:16:26 crc kubenswrapper[4744]: I0311 01:16:26.878020 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4fb8af9e-ef1e-45b0-b842-2647fe75510e-combined-ca-bundle\") pod \"barbican-worker-bc7567ff7-gl658\" (UID: \"4fb8af9e-ef1e-45b0-b842-2647fe75510e\") " pod="openstack/barbican-worker-bc7567ff7-gl658" Mar 11 01:16:26 crc kubenswrapper[4744]: I0311 01:16:26.878067 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/8df98dbd-473b-4630-81ab-edd6419feb0d-config-data-custom\") pod \"barbican-keystone-listener-7647d7b844-j6gcn\" (UID: \"8df98dbd-473b-4630-81ab-edd6419feb0d\") " pod="openstack/barbican-keystone-listener-7647d7b844-j6gcn" Mar 11 01:16:26 crc kubenswrapper[4744]: I0311 01:16:26.878090 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8vvch\" (UniqueName: \"kubernetes.io/projected/4fb8af9e-ef1e-45b0-b842-2647fe75510e-kube-api-access-8vvch\") pod \"barbican-worker-bc7567ff7-gl658\" (UID: \"4fb8af9e-ef1e-45b0-b842-2647fe75510e\") " pod="openstack/barbican-worker-bc7567ff7-gl658" Mar 11 01:16:26 crc kubenswrapper[4744]: I0311 01:16:26.878116 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/4fb8af9e-ef1e-45b0-b842-2647fe75510e-config-data-custom\") 
pod \"barbican-worker-bc7567ff7-gl658\" (UID: \"4fb8af9e-ef1e-45b0-b842-2647fe75510e\") " pod="openstack/barbican-worker-bc7567ff7-gl658" Mar 11 01:16:26 crc kubenswrapper[4744]: I0311 01:16:26.878144 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b6e79975-841a-4d71-9906-e53607f1b3fb-logs\") pod \"barbican-api-7bcd5444fb-98mrs\" (UID: \"b6e79975-841a-4d71-9906-e53607f1b3fb\") " pod="openstack/barbican-api-7bcd5444fb-98mrs" Mar 11 01:16:26 crc kubenswrapper[4744]: I0311 01:16:26.878182 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9pdjs\" (UniqueName: \"kubernetes.io/projected/08522fac-9099-4781-95f9-6f676a290be9-kube-api-access-9pdjs\") pod \"dnsmasq-dns-5b78c5c5d5-dzk6k\" (UID: \"08522fac-9099-4781-95f9-6f676a290be9\") " pod="openstack/dnsmasq-dns-5b78c5c5d5-dzk6k" Mar 11 01:16:26 crc kubenswrapper[4744]: I0311 01:16:26.878205 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/b6e79975-841a-4d71-9906-e53607f1b3fb-config-data-custom\") pod \"barbican-api-7bcd5444fb-98mrs\" (UID: \"b6e79975-841a-4d71-9906-e53607f1b3fb\") " pod="openstack/barbican-api-7bcd5444fb-98mrs" Mar 11 01:16:26 crc kubenswrapper[4744]: I0311 01:16:26.878224 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dfh5t\" (UniqueName: \"kubernetes.io/projected/b6e79975-841a-4d71-9906-e53607f1b3fb-kube-api-access-dfh5t\") pod \"barbican-api-7bcd5444fb-98mrs\" (UID: \"b6e79975-841a-4d71-9906-e53607f1b3fb\") " pod="openstack/barbican-api-7bcd5444fb-98mrs" Mar 11 01:16:26 crc kubenswrapper[4744]: I0311 01:16:26.878252 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/8df98dbd-473b-4630-81ab-edd6419feb0d-logs\") pod \"barbican-keystone-listener-7647d7b844-j6gcn\" (UID: \"8df98dbd-473b-4630-81ab-edd6419feb0d\") " pod="openstack/barbican-keystone-listener-7647d7b844-j6gcn" Mar 11 01:16:26 crc kubenswrapper[4744]: I0311 01:16:26.878659 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8df98dbd-473b-4630-81ab-edd6419feb0d-logs\") pod \"barbican-keystone-listener-7647d7b844-j6gcn\" (UID: \"8df98dbd-473b-4630-81ab-edd6419feb0d\") " pod="openstack/barbican-keystone-listener-7647d7b844-j6gcn" Mar 11 01:16:26 crc kubenswrapper[4744]: I0311 01:16:26.884645 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4fb8af9e-ef1e-45b0-b842-2647fe75510e-config-data\") pod \"barbican-worker-bc7567ff7-gl658\" (UID: \"4fb8af9e-ef1e-45b0-b842-2647fe75510e\") " pod="openstack/barbican-worker-bc7567ff7-gl658" Mar 11 01:16:26 crc kubenswrapper[4744]: I0311 01:16:26.889818 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4fb8af9e-ef1e-45b0-b842-2647fe75510e-combined-ca-bundle\") pod \"barbican-worker-bc7567ff7-gl658\" (UID: \"4fb8af9e-ef1e-45b0-b842-2647fe75510e\") " pod="openstack/barbican-worker-bc7567ff7-gl658" Mar 11 01:16:26 crc kubenswrapper[4744]: I0311 01:16:26.890293 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/8df98dbd-473b-4630-81ab-edd6419feb0d-config-data-custom\") pod \"barbican-keystone-listener-7647d7b844-j6gcn\" (UID: \"8df98dbd-473b-4630-81ab-edd6419feb0d\") " pod="openstack/barbican-keystone-listener-7647d7b844-j6gcn" Mar 11 01:16:26 crc kubenswrapper[4744]: I0311 01:16:26.890706 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: 
\"kubernetes.io/secret/4fb8af9e-ef1e-45b0-b842-2647fe75510e-config-data-custom\") pod \"barbican-worker-bc7567ff7-gl658\" (UID: \"4fb8af9e-ef1e-45b0-b842-2647fe75510e\") " pod="openstack/barbican-worker-bc7567ff7-gl658" Mar 11 01:16:26 crc kubenswrapper[4744]: I0311 01:16:26.893017 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8df98dbd-473b-4630-81ab-edd6419feb0d-config-data\") pod \"barbican-keystone-listener-7647d7b844-j6gcn\" (UID: \"8df98dbd-473b-4630-81ab-edd6419feb0d\") " pod="openstack/barbican-keystone-listener-7647d7b844-j6gcn" Mar 11 01:16:26 crc kubenswrapper[4744]: I0311 01:16:26.893559 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8df98dbd-473b-4630-81ab-edd6419feb0d-combined-ca-bundle\") pod \"barbican-keystone-listener-7647d7b844-j6gcn\" (UID: \"8df98dbd-473b-4630-81ab-edd6419feb0d\") " pod="openstack/barbican-keystone-listener-7647d7b844-j6gcn" Mar 11 01:16:26 crc kubenswrapper[4744]: I0311 01:16:26.908138 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8vvch\" (UniqueName: \"kubernetes.io/projected/4fb8af9e-ef1e-45b0-b842-2647fe75510e-kube-api-access-8vvch\") pod \"barbican-worker-bc7567ff7-gl658\" (UID: \"4fb8af9e-ef1e-45b0-b842-2647fe75510e\") " pod="openstack/barbican-worker-bc7567ff7-gl658" Mar 11 01:16:26 crc kubenswrapper[4744]: I0311 01:16:26.915841 4744 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-worker-bc7567ff7-gl658" Mar 11 01:16:26 crc kubenswrapper[4744]: I0311 01:16:26.916832 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zp6cx\" (UniqueName: \"kubernetes.io/projected/8df98dbd-473b-4630-81ab-edd6419feb0d-kube-api-access-zp6cx\") pod \"barbican-keystone-listener-7647d7b844-j6gcn\" (UID: \"8df98dbd-473b-4630-81ab-edd6419feb0d\") " pod="openstack/barbican-keystone-listener-7647d7b844-j6gcn" Mar 11 01:16:26 crc kubenswrapper[4744]: I0311 01:16:26.935962 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-keystone-listener-7647d7b844-j6gcn" Mar 11 01:16:26 crc kubenswrapper[4744]: I0311 01:16:26.979408 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b6e79975-841a-4d71-9906-e53607f1b3fb-config-data\") pod \"barbican-api-7bcd5444fb-98mrs\" (UID: \"b6e79975-841a-4d71-9906-e53607f1b3fb\") " pod="openstack/barbican-api-7bcd5444fb-98mrs" Mar 11 01:16:26 crc kubenswrapper[4744]: I0311 01:16:26.979441 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/08522fac-9099-4781-95f9-6f676a290be9-ovsdbserver-sb\") pod \"dnsmasq-dns-5b78c5c5d5-dzk6k\" (UID: \"08522fac-9099-4781-95f9-6f676a290be9\") " pod="openstack/dnsmasq-dns-5b78c5c5d5-dzk6k" Mar 11 01:16:26 crc kubenswrapper[4744]: I0311 01:16:26.979471 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b6e79975-841a-4d71-9906-e53607f1b3fb-combined-ca-bundle\") pod \"barbican-api-7bcd5444fb-98mrs\" (UID: \"b6e79975-841a-4d71-9906-e53607f1b3fb\") " pod="openstack/barbican-api-7bcd5444fb-98mrs" Mar 11 01:16:26 crc kubenswrapper[4744]: I0311 01:16:26.979539 4744 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/08522fac-9099-4781-95f9-6f676a290be9-dns-svc\") pod \"dnsmasq-dns-5b78c5c5d5-dzk6k\" (UID: \"08522fac-9099-4781-95f9-6f676a290be9\") " pod="openstack/dnsmasq-dns-5b78c5c5d5-dzk6k" Mar 11 01:16:26 crc kubenswrapper[4744]: I0311 01:16:26.979573 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/08522fac-9099-4781-95f9-6f676a290be9-dns-swift-storage-0\") pod \"dnsmasq-dns-5b78c5c5d5-dzk6k\" (UID: \"08522fac-9099-4781-95f9-6f676a290be9\") " pod="openstack/dnsmasq-dns-5b78c5c5d5-dzk6k" Mar 11 01:16:26 crc kubenswrapper[4744]: I0311 01:16:26.979594 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/08522fac-9099-4781-95f9-6f676a290be9-config\") pod \"dnsmasq-dns-5b78c5c5d5-dzk6k\" (UID: \"08522fac-9099-4781-95f9-6f676a290be9\") " pod="openstack/dnsmasq-dns-5b78c5c5d5-dzk6k" Mar 11 01:16:26 crc kubenswrapper[4744]: I0311 01:16:26.979609 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/08522fac-9099-4781-95f9-6f676a290be9-ovsdbserver-nb\") pod \"dnsmasq-dns-5b78c5c5d5-dzk6k\" (UID: \"08522fac-9099-4781-95f9-6f676a290be9\") " pod="openstack/dnsmasq-dns-5b78c5c5d5-dzk6k" Mar 11 01:16:26 crc kubenswrapper[4744]: I0311 01:16:26.979655 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b6e79975-841a-4d71-9906-e53607f1b3fb-logs\") pod \"barbican-api-7bcd5444fb-98mrs\" (UID: \"b6e79975-841a-4d71-9906-e53607f1b3fb\") " pod="openstack/barbican-api-7bcd5444fb-98mrs" Mar 11 01:16:26 crc kubenswrapper[4744]: I0311 01:16:26.979676 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9pdjs\" (UniqueName: 
\"kubernetes.io/projected/08522fac-9099-4781-95f9-6f676a290be9-kube-api-access-9pdjs\") pod \"dnsmasq-dns-5b78c5c5d5-dzk6k\" (UID: \"08522fac-9099-4781-95f9-6f676a290be9\") " pod="openstack/dnsmasq-dns-5b78c5c5d5-dzk6k" Mar 11 01:16:26 crc kubenswrapper[4744]: I0311 01:16:26.979696 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/b6e79975-841a-4d71-9906-e53607f1b3fb-config-data-custom\") pod \"barbican-api-7bcd5444fb-98mrs\" (UID: \"b6e79975-841a-4d71-9906-e53607f1b3fb\") " pod="openstack/barbican-api-7bcd5444fb-98mrs" Mar 11 01:16:26 crc kubenswrapper[4744]: I0311 01:16:26.979709 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dfh5t\" (UniqueName: \"kubernetes.io/projected/b6e79975-841a-4d71-9906-e53607f1b3fb-kube-api-access-dfh5t\") pod \"barbican-api-7bcd5444fb-98mrs\" (UID: \"b6e79975-841a-4d71-9906-e53607f1b3fb\") " pod="openstack/barbican-api-7bcd5444fb-98mrs" Mar 11 01:16:26 crc kubenswrapper[4744]: I0311 01:16:26.981282 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/08522fac-9099-4781-95f9-6f676a290be9-ovsdbserver-nb\") pod \"dnsmasq-dns-5b78c5c5d5-dzk6k\" (UID: \"08522fac-9099-4781-95f9-6f676a290be9\") " pod="openstack/dnsmasq-dns-5b78c5c5d5-dzk6k" Mar 11 01:16:26 crc kubenswrapper[4744]: I0311 01:16:26.981323 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/08522fac-9099-4781-95f9-6f676a290be9-config\") pod \"dnsmasq-dns-5b78c5c5d5-dzk6k\" (UID: \"08522fac-9099-4781-95f9-6f676a290be9\") " pod="openstack/dnsmasq-dns-5b78c5c5d5-dzk6k" Mar 11 01:16:26 crc kubenswrapper[4744]: I0311 01:16:26.981364 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: 
\"kubernetes.io/configmap/08522fac-9099-4781-95f9-6f676a290be9-dns-svc\") pod \"dnsmasq-dns-5b78c5c5d5-dzk6k\" (UID: \"08522fac-9099-4781-95f9-6f676a290be9\") " pod="openstack/dnsmasq-dns-5b78c5c5d5-dzk6k" Mar 11 01:16:26 crc kubenswrapper[4744]: I0311 01:16:26.985246 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/08522fac-9099-4781-95f9-6f676a290be9-dns-swift-storage-0\") pod \"dnsmasq-dns-5b78c5c5d5-dzk6k\" (UID: \"08522fac-9099-4781-95f9-6f676a290be9\") " pod="openstack/dnsmasq-dns-5b78c5c5d5-dzk6k" Mar 11 01:16:26 crc kubenswrapper[4744]: I0311 01:16:26.986038 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/08522fac-9099-4781-95f9-6f676a290be9-ovsdbserver-sb\") pod \"dnsmasq-dns-5b78c5c5d5-dzk6k\" (UID: \"08522fac-9099-4781-95f9-6f676a290be9\") " pod="openstack/dnsmasq-dns-5b78c5c5d5-dzk6k" Mar 11 01:16:26 crc kubenswrapper[4744]: I0311 01:16:26.986279 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b6e79975-841a-4d71-9906-e53607f1b3fb-logs\") pod \"barbican-api-7bcd5444fb-98mrs\" (UID: \"b6e79975-841a-4d71-9906-e53607f1b3fb\") " pod="openstack/barbican-api-7bcd5444fb-98mrs" Mar 11 01:16:27 crc kubenswrapper[4744]: I0311 01:16:27.005111 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/b6e79975-841a-4d71-9906-e53607f1b3fb-config-data-custom\") pod \"barbican-api-7bcd5444fb-98mrs\" (UID: \"b6e79975-841a-4d71-9906-e53607f1b3fb\") " pod="openstack/barbican-api-7bcd5444fb-98mrs" Mar 11 01:16:27 crc kubenswrapper[4744]: I0311 01:16:27.005155 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dfh5t\" (UniqueName: \"kubernetes.io/projected/b6e79975-841a-4d71-9906-e53607f1b3fb-kube-api-access-dfh5t\") pod 
\"barbican-api-7bcd5444fb-98mrs\" (UID: \"b6e79975-841a-4d71-9906-e53607f1b3fb\") " pod="openstack/barbican-api-7bcd5444fb-98mrs" Mar 11 01:16:27 crc kubenswrapper[4744]: I0311 01:16:27.007928 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b6e79975-841a-4d71-9906-e53607f1b3fb-config-data\") pod \"barbican-api-7bcd5444fb-98mrs\" (UID: \"b6e79975-841a-4d71-9906-e53607f1b3fb\") " pod="openstack/barbican-api-7bcd5444fb-98mrs" Mar 11 01:16:27 crc kubenswrapper[4744]: I0311 01:16:27.011536 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b6e79975-841a-4d71-9906-e53607f1b3fb-combined-ca-bundle\") pod \"barbican-api-7bcd5444fb-98mrs\" (UID: \"b6e79975-841a-4d71-9906-e53607f1b3fb\") " pod="openstack/barbican-api-7bcd5444fb-98mrs" Mar 11 01:16:27 crc kubenswrapper[4744]: I0311 01:16:27.013500 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9pdjs\" (UniqueName: \"kubernetes.io/projected/08522fac-9099-4781-95f9-6f676a290be9-kube-api-access-9pdjs\") pod \"dnsmasq-dns-5b78c5c5d5-dzk6k\" (UID: \"08522fac-9099-4781-95f9-6f676a290be9\") " pod="openstack/dnsmasq-dns-5b78c5c5d5-dzk6k" Mar 11 01:16:27 crc kubenswrapper[4744]: I0311 01:16:27.041804 4744 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5b78c5c5d5-dzk6k" Mar 11 01:16:27 crc kubenswrapper[4744]: I0311 01:16:27.124765 4744 generic.go:334] "Generic (PLEG): container finished" podID="ddeba13d-7885-44ef-8454-1a7b6ef48303" containerID="0238cd388ba680a38261ff524effabf6eb3f5c23b4fa6ad76b42367e73b94275" exitCode=0 Mar 11 01:16:27 crc kubenswrapper[4744]: I0311 01:16:27.125053 4744 generic.go:334] "Generic (PLEG): container finished" podID="ddeba13d-7885-44ef-8454-1a7b6ef48303" containerID="d9359028d08e29da98bd43eaa9ae47f57140e8a5ed0ffb677e05f84587705c2d" exitCode=2 Mar 11 01:16:27 crc kubenswrapper[4744]: I0311 01:16:27.124960 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ddeba13d-7885-44ef-8454-1a7b6ef48303","Type":"ContainerDied","Data":"0238cd388ba680a38261ff524effabf6eb3f5c23b4fa6ad76b42367e73b94275"} Mar 11 01:16:27 crc kubenswrapper[4744]: I0311 01:16:27.125122 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ddeba13d-7885-44ef-8454-1a7b6ef48303","Type":"ContainerDied","Data":"d9359028d08e29da98bd43eaa9ae47f57140e8a5ed0ffb677e05f84587705c2d"} Mar 11 01:16:27 crc kubenswrapper[4744]: I0311 01:16:27.125143 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ddeba13d-7885-44ef-8454-1a7b6ef48303","Type":"ContainerDied","Data":"e39a4e1e73d4df93b17475b242b8932a88bcad5ab937ba78200e62915ecf60bd"} Mar 11 01:16:27 crc kubenswrapper[4744]: I0311 01:16:27.125063 4744 generic.go:334] "Generic (PLEG): container finished" podID="ddeba13d-7885-44ef-8454-1a7b6ef48303" containerID="e39a4e1e73d4df93b17475b242b8932a88bcad5ab937ba78200e62915ecf60bd" exitCode=0 Mar 11 01:16:27 crc kubenswrapper[4744]: I0311 01:16:27.125171 4744 generic.go:334] "Generic (PLEG): container finished" podID="ddeba13d-7885-44ef-8454-1a7b6ef48303" containerID="8b14126910c00367d3da794d22855618e3bf024fd2f95595dca6f23a35ae6f53" exitCode=0 Mar 11 01:16:27 
crc kubenswrapper[4744]: I0311 01:16:27.125191 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ddeba13d-7885-44ef-8454-1a7b6ef48303","Type":"ContainerDied","Data":"8b14126910c00367d3da794d22855618e3bf024fd2f95595dca6f23a35ae6f53"} Mar 11 01:16:27 crc kubenswrapper[4744]: I0311 01:16:27.176411 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-7bcd5444fb-98mrs" Mar 11 01:16:27 crc kubenswrapper[4744]: I0311 01:16:27.176739 4744 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 11 01:16:27 crc kubenswrapper[4744]: I0311 01:16:27.285430 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lqmdx\" (UniqueName: \"kubernetes.io/projected/ddeba13d-7885-44ef-8454-1a7b6ef48303-kube-api-access-lqmdx\") pod \"ddeba13d-7885-44ef-8454-1a7b6ef48303\" (UID: \"ddeba13d-7885-44ef-8454-1a7b6ef48303\") " Mar 11 01:16:27 crc kubenswrapper[4744]: I0311 01:16:27.285796 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ddeba13d-7885-44ef-8454-1a7b6ef48303-log-httpd\") pod \"ddeba13d-7885-44ef-8454-1a7b6ef48303\" (UID: \"ddeba13d-7885-44ef-8454-1a7b6ef48303\") " Mar 11 01:16:27 crc kubenswrapper[4744]: I0311 01:16:27.285840 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ddeba13d-7885-44ef-8454-1a7b6ef48303-config-data\") pod \"ddeba13d-7885-44ef-8454-1a7b6ef48303\" (UID: \"ddeba13d-7885-44ef-8454-1a7b6ef48303\") " Mar 11 01:16:27 crc kubenswrapper[4744]: I0311 01:16:27.285859 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/ddeba13d-7885-44ef-8454-1a7b6ef48303-sg-core-conf-yaml\") pod 
\"ddeba13d-7885-44ef-8454-1a7b6ef48303\" (UID: \"ddeba13d-7885-44ef-8454-1a7b6ef48303\") " Mar 11 01:16:27 crc kubenswrapper[4744]: I0311 01:16:27.285965 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ddeba13d-7885-44ef-8454-1a7b6ef48303-run-httpd\") pod \"ddeba13d-7885-44ef-8454-1a7b6ef48303\" (UID: \"ddeba13d-7885-44ef-8454-1a7b6ef48303\") " Mar 11 01:16:27 crc kubenswrapper[4744]: I0311 01:16:27.286147 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ddeba13d-7885-44ef-8454-1a7b6ef48303-combined-ca-bundle\") pod \"ddeba13d-7885-44ef-8454-1a7b6ef48303\" (UID: \"ddeba13d-7885-44ef-8454-1a7b6ef48303\") " Mar 11 01:16:27 crc kubenswrapper[4744]: I0311 01:16:27.286213 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ddeba13d-7885-44ef-8454-1a7b6ef48303-scripts\") pod \"ddeba13d-7885-44ef-8454-1a7b6ef48303\" (UID: \"ddeba13d-7885-44ef-8454-1a7b6ef48303\") " Mar 11 01:16:27 crc kubenswrapper[4744]: I0311 01:16:27.286760 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ddeba13d-7885-44ef-8454-1a7b6ef48303-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "ddeba13d-7885-44ef-8454-1a7b6ef48303" (UID: "ddeba13d-7885-44ef-8454-1a7b6ef48303"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 11 01:16:27 crc kubenswrapper[4744]: I0311 01:16:27.286822 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ddeba13d-7885-44ef-8454-1a7b6ef48303-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "ddeba13d-7885-44ef-8454-1a7b6ef48303" (UID: "ddeba13d-7885-44ef-8454-1a7b6ef48303"). InnerVolumeSpecName "log-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 11 01:16:27 crc kubenswrapper[4744]: I0311 01:16:27.291569 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ddeba13d-7885-44ef-8454-1a7b6ef48303-kube-api-access-lqmdx" (OuterVolumeSpecName: "kube-api-access-lqmdx") pod "ddeba13d-7885-44ef-8454-1a7b6ef48303" (UID: "ddeba13d-7885-44ef-8454-1a7b6ef48303"). InnerVolumeSpecName "kube-api-access-lqmdx". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 01:16:27 crc kubenswrapper[4744]: I0311 01:16:27.293505 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ddeba13d-7885-44ef-8454-1a7b6ef48303-scripts" (OuterVolumeSpecName: "scripts") pod "ddeba13d-7885-44ef-8454-1a7b6ef48303" (UID: "ddeba13d-7885-44ef-8454-1a7b6ef48303"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 01:16:27 crc kubenswrapper[4744]: I0311 01:16:27.359464 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ddeba13d-7885-44ef-8454-1a7b6ef48303-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "ddeba13d-7885-44ef-8454-1a7b6ef48303" (UID: "ddeba13d-7885-44ef-8454-1a7b6ef48303"). InnerVolumeSpecName "sg-core-conf-yaml". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 01:16:27 crc kubenswrapper[4744]: I0311 01:16:27.389481 4744 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ddeba13d-7885-44ef-8454-1a7b6ef48303-scripts\") on node \"crc\" DevicePath \"\"" Mar 11 01:16:27 crc kubenswrapper[4744]: I0311 01:16:27.389524 4744 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lqmdx\" (UniqueName: \"kubernetes.io/projected/ddeba13d-7885-44ef-8454-1a7b6ef48303-kube-api-access-lqmdx\") on node \"crc\" DevicePath \"\"" Mar 11 01:16:27 crc kubenswrapper[4744]: I0311 01:16:27.389535 4744 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ddeba13d-7885-44ef-8454-1a7b6ef48303-log-httpd\") on node \"crc\" DevicePath \"\"" Mar 11 01:16:27 crc kubenswrapper[4744]: I0311 01:16:27.389543 4744 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/ddeba13d-7885-44ef-8454-1a7b6ef48303-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Mar 11 01:16:27 crc kubenswrapper[4744]: I0311 01:16:27.389551 4744 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ddeba13d-7885-44ef-8454-1a7b6ef48303-run-httpd\") on node \"crc\" DevicePath \"\"" Mar 11 01:16:27 crc kubenswrapper[4744]: I0311 01:16:27.401092 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ddeba13d-7885-44ef-8454-1a7b6ef48303-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "ddeba13d-7885-44ef-8454-1a7b6ef48303" (UID: "ddeba13d-7885-44ef-8454-1a7b6ef48303"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 01:16:27 crc kubenswrapper[4744]: I0311 01:16:27.443129 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ddeba13d-7885-44ef-8454-1a7b6ef48303-config-data" (OuterVolumeSpecName: "config-data") pod "ddeba13d-7885-44ef-8454-1a7b6ef48303" (UID: "ddeba13d-7885-44ef-8454-1a7b6ef48303"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 01:16:27 crc kubenswrapper[4744]: I0311 01:16:27.492487 4744 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ddeba13d-7885-44ef-8454-1a7b6ef48303-config-data\") on node \"crc\" DevicePath \"\"" Mar 11 01:16:27 crc kubenswrapper[4744]: I0311 01:16:27.492530 4744 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ddeba13d-7885-44ef-8454-1a7b6ef48303-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 11 01:16:27 crc kubenswrapper[4744]: I0311 01:16:27.495601 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-worker-bc7567ff7-gl658"] Mar 11 01:16:27 crc kubenswrapper[4744]: W0311 01:16:27.511449 4744 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4fb8af9e_ef1e_45b0_b842_2647fe75510e.slice/crio-00cd22cfbd7690b23506a55b55085f3deb61172a61c332f09aad3e95bb5e8a18 WatchSource:0}: Error finding container 00cd22cfbd7690b23506a55b55085f3deb61172a61c332f09aad3e95bb5e8a18: Status 404 returned error can't find the container with id 00cd22cfbd7690b23506a55b55085f3deb61172a61c332f09aad3e95bb5e8a18 Mar 11 01:16:27 crc kubenswrapper[4744]: I0311 01:16:27.557718 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-keystone-listener-7647d7b844-j6gcn"] Mar 11 01:16:27 crc kubenswrapper[4744]: W0311 01:16:27.563494 4744 manager.go:1169] Failed to 
process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8df98dbd_473b_4630_81ab_edd6419feb0d.slice/crio-0618174e67d0f1492cd70819d82f71b1f61c9a68db51461c8c56789026a191ea WatchSource:0}: Error finding container 0618174e67d0f1492cd70819d82f71b1f61c9a68db51461c8c56789026a191ea: Status 404 returned error can't find the container with id 0618174e67d0f1492cd70819d82f71b1f61c9a68db51461c8c56789026a191ea Mar 11 01:16:27 crc kubenswrapper[4744]: I0311 01:16:27.670456 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5b78c5c5d5-dzk6k"] Mar 11 01:16:27 crc kubenswrapper[4744]: W0311 01:16:27.676138 4744 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod08522fac_9099_4781_95f9_6f676a290be9.slice/crio-c02f08b8b3659d48c981b4b479107e9ab360e41029f7b0a91796f624ce4e4c60 WatchSource:0}: Error finding container c02f08b8b3659d48c981b4b479107e9ab360e41029f7b0a91796f624ce4e4c60: Status 404 returned error can't find the container with id c02f08b8b3659d48c981b4b479107e9ab360e41029f7b0a91796f624ce4e4c60 Mar 11 01:16:27 crc kubenswrapper[4744]: I0311 01:16:27.778448 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-7bcd5444fb-98mrs"] Mar 11 01:16:28 crc kubenswrapper[4744]: I0311 01:16:28.137899 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-7647d7b844-j6gcn" event={"ID":"8df98dbd-473b-4630-81ab-edd6419feb0d","Type":"ContainerStarted","Data":"0618174e67d0f1492cd70819d82f71b1f61c9a68db51461c8c56789026a191ea"} Mar 11 01:16:28 crc kubenswrapper[4744]: I0311 01:16:28.139875 4744 generic.go:334] "Generic (PLEG): container finished" podID="08522fac-9099-4781-95f9-6f676a290be9" containerID="0c55cc287fb3e5d57238f33c9294222297b91b2d03b3f2cd6d088c5778dd5875" exitCode=0 Mar 11 01:16:28 crc kubenswrapper[4744]: I0311 01:16:28.139946 4744 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openstack/dnsmasq-dns-5b78c5c5d5-dzk6k" event={"ID":"08522fac-9099-4781-95f9-6f676a290be9","Type":"ContainerDied","Data":"0c55cc287fb3e5d57238f33c9294222297b91b2d03b3f2cd6d088c5778dd5875"} Mar 11 01:16:28 crc kubenswrapper[4744]: I0311 01:16:28.139974 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5b78c5c5d5-dzk6k" event={"ID":"08522fac-9099-4781-95f9-6f676a290be9","Type":"ContainerStarted","Data":"c02f08b8b3659d48c981b4b479107e9ab360e41029f7b0a91796f624ce4e4c60"} Mar 11 01:16:28 crc kubenswrapper[4744]: I0311 01:16:28.145369 4744 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 11 01:16:28 crc kubenswrapper[4744]: I0311 01:16:28.145399 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ddeba13d-7885-44ef-8454-1a7b6ef48303","Type":"ContainerDied","Data":"2b9835e17acd5d4021880cc0cc42522b679209520e6b88d73111082e68e4d77f"} Mar 11 01:16:28 crc kubenswrapper[4744]: I0311 01:16:28.145453 4744 scope.go:117] "RemoveContainer" containerID="0238cd388ba680a38261ff524effabf6eb3f5c23b4fa6ad76b42367e73b94275" Mar 11 01:16:28 crc kubenswrapper[4744]: I0311 01:16:28.149726 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-7bcd5444fb-98mrs" event={"ID":"b6e79975-841a-4d71-9906-e53607f1b3fb","Type":"ContainerStarted","Data":"d61e4700d7dd46bf54cb41f484b010f1b1b279cc34d80725caa2e387219bbe8b"} Mar 11 01:16:28 crc kubenswrapper[4744]: I0311 01:16:28.149764 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-7bcd5444fb-98mrs" event={"ID":"b6e79975-841a-4d71-9906-e53607f1b3fb","Type":"ContainerStarted","Data":"ed6a9d44fc4838688c525502eec1af1918986c1b00e56218c1838d1684b78111"} Mar 11 01:16:28 crc kubenswrapper[4744]: I0311 01:16:28.151626 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-bc7567ff7-gl658" 
event={"ID":"4fb8af9e-ef1e-45b0-b842-2647fe75510e","Type":"ContainerStarted","Data":"00cd22cfbd7690b23506a55b55085f3deb61172a61c332f09aad3e95bb5e8a18"} Mar 11 01:16:28 crc kubenswrapper[4744]: I0311 01:16:28.164802 4744 scope.go:117] "RemoveContainer" containerID="d9359028d08e29da98bd43eaa9ae47f57140e8a5ed0ffb677e05f84587705c2d" Mar 11 01:16:28 crc kubenswrapper[4744]: I0311 01:16:28.194887 4744 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 11 01:16:28 crc kubenswrapper[4744]: I0311 01:16:28.196300 4744 scope.go:117] "RemoveContainer" containerID="e39a4e1e73d4df93b17475b242b8932a88bcad5ab937ba78200e62915ecf60bd" Mar 11 01:16:28 crc kubenswrapper[4744]: I0311 01:16:28.212857 4744 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Mar 11 01:16:28 crc kubenswrapper[4744]: I0311 01:16:28.223054 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Mar 11 01:16:28 crc kubenswrapper[4744]: E0311 01:16:28.223388 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ddeba13d-7885-44ef-8454-1a7b6ef48303" containerName="sg-core" Mar 11 01:16:28 crc kubenswrapper[4744]: I0311 01:16:28.223405 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="ddeba13d-7885-44ef-8454-1a7b6ef48303" containerName="sg-core" Mar 11 01:16:28 crc kubenswrapper[4744]: E0311 01:16:28.223412 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ddeba13d-7885-44ef-8454-1a7b6ef48303" containerName="ceilometer-central-agent" Mar 11 01:16:28 crc kubenswrapper[4744]: I0311 01:16:28.223419 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="ddeba13d-7885-44ef-8454-1a7b6ef48303" containerName="ceilometer-central-agent" Mar 11 01:16:28 crc kubenswrapper[4744]: E0311 01:16:28.223433 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ddeba13d-7885-44ef-8454-1a7b6ef48303" containerName="proxy-httpd" Mar 11 01:16:28 crc kubenswrapper[4744]: I0311 
01:16:28.223440 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="ddeba13d-7885-44ef-8454-1a7b6ef48303" containerName="proxy-httpd" Mar 11 01:16:28 crc kubenswrapper[4744]: E0311 01:16:28.223470 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ddeba13d-7885-44ef-8454-1a7b6ef48303" containerName="ceilometer-notification-agent" Mar 11 01:16:28 crc kubenswrapper[4744]: I0311 01:16:28.223475 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="ddeba13d-7885-44ef-8454-1a7b6ef48303" containerName="ceilometer-notification-agent" Mar 11 01:16:28 crc kubenswrapper[4744]: I0311 01:16:28.223668 4744 memory_manager.go:354] "RemoveStaleState removing state" podUID="ddeba13d-7885-44ef-8454-1a7b6ef48303" containerName="proxy-httpd" Mar 11 01:16:28 crc kubenswrapper[4744]: I0311 01:16:28.223690 4744 memory_manager.go:354] "RemoveStaleState removing state" podUID="ddeba13d-7885-44ef-8454-1a7b6ef48303" containerName="ceilometer-notification-agent" Mar 11 01:16:28 crc kubenswrapper[4744]: I0311 01:16:28.223698 4744 memory_manager.go:354] "RemoveStaleState removing state" podUID="ddeba13d-7885-44ef-8454-1a7b6ef48303" containerName="ceilometer-central-agent" Mar 11 01:16:28 crc kubenswrapper[4744]: I0311 01:16:28.223707 4744 memory_manager.go:354] "RemoveStaleState removing state" podUID="ddeba13d-7885-44ef-8454-1a7b6ef48303" containerName="sg-core" Mar 11 01:16:28 crc kubenswrapper[4744]: I0311 01:16:28.225723 4744 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 11 01:16:28 crc kubenswrapper[4744]: I0311 01:16:28.231175 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Mar 11 01:16:28 crc kubenswrapper[4744]: I0311 01:16:28.231250 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Mar 11 01:16:28 crc kubenswrapper[4744]: I0311 01:16:28.244149 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 11 01:16:28 crc kubenswrapper[4744]: I0311 01:16:28.263835 4744 scope.go:117] "RemoveContainer" containerID="8b14126910c00367d3da794d22855618e3bf024fd2f95595dca6f23a35ae6f53" Mar 11 01:16:28 crc kubenswrapper[4744]: I0311 01:16:28.306439 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/02d576b4-3837-45a7-ae82-8790466a57e5-log-httpd\") pod \"ceilometer-0\" (UID: \"02d576b4-3837-45a7-ae82-8790466a57e5\") " pod="openstack/ceilometer-0" Mar 11 01:16:28 crc kubenswrapper[4744]: I0311 01:16:28.306483 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/02d576b4-3837-45a7-ae82-8790466a57e5-config-data\") pod \"ceilometer-0\" (UID: \"02d576b4-3837-45a7-ae82-8790466a57e5\") " pod="openstack/ceilometer-0" Mar 11 01:16:28 crc kubenswrapper[4744]: I0311 01:16:28.306549 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/02d576b4-3837-45a7-ae82-8790466a57e5-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"02d576b4-3837-45a7-ae82-8790466a57e5\") " pod="openstack/ceilometer-0" Mar 11 01:16:28 crc kubenswrapper[4744]: I0311 01:16:28.306571 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" 
(UniqueName: \"kubernetes.io/empty-dir/02d576b4-3837-45a7-ae82-8790466a57e5-run-httpd\") pod \"ceilometer-0\" (UID: \"02d576b4-3837-45a7-ae82-8790466a57e5\") " pod="openstack/ceilometer-0" Mar 11 01:16:28 crc kubenswrapper[4744]: I0311 01:16:28.306604 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xswsv\" (UniqueName: \"kubernetes.io/projected/02d576b4-3837-45a7-ae82-8790466a57e5-kube-api-access-xswsv\") pod \"ceilometer-0\" (UID: \"02d576b4-3837-45a7-ae82-8790466a57e5\") " pod="openstack/ceilometer-0" Mar 11 01:16:28 crc kubenswrapper[4744]: I0311 01:16:28.306641 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/02d576b4-3837-45a7-ae82-8790466a57e5-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"02d576b4-3837-45a7-ae82-8790466a57e5\") " pod="openstack/ceilometer-0" Mar 11 01:16:28 crc kubenswrapper[4744]: I0311 01:16:28.306658 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/02d576b4-3837-45a7-ae82-8790466a57e5-scripts\") pod \"ceilometer-0\" (UID: \"02d576b4-3837-45a7-ae82-8790466a57e5\") " pod="openstack/ceilometer-0" Mar 11 01:16:28 crc kubenswrapper[4744]: I0311 01:16:28.408074 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/02d576b4-3837-45a7-ae82-8790466a57e5-log-httpd\") pod \"ceilometer-0\" (UID: \"02d576b4-3837-45a7-ae82-8790466a57e5\") " pod="openstack/ceilometer-0" Mar 11 01:16:28 crc kubenswrapper[4744]: I0311 01:16:28.408113 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/02d576b4-3837-45a7-ae82-8790466a57e5-config-data\") pod \"ceilometer-0\" (UID: \"02d576b4-3837-45a7-ae82-8790466a57e5\") " 
pod="openstack/ceilometer-0" Mar 11 01:16:28 crc kubenswrapper[4744]: I0311 01:16:28.408165 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/02d576b4-3837-45a7-ae82-8790466a57e5-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"02d576b4-3837-45a7-ae82-8790466a57e5\") " pod="openstack/ceilometer-0" Mar 11 01:16:28 crc kubenswrapper[4744]: I0311 01:16:28.408187 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/02d576b4-3837-45a7-ae82-8790466a57e5-run-httpd\") pod \"ceilometer-0\" (UID: \"02d576b4-3837-45a7-ae82-8790466a57e5\") " pod="openstack/ceilometer-0" Mar 11 01:16:28 crc kubenswrapper[4744]: I0311 01:16:28.408224 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xswsv\" (UniqueName: \"kubernetes.io/projected/02d576b4-3837-45a7-ae82-8790466a57e5-kube-api-access-xswsv\") pod \"ceilometer-0\" (UID: \"02d576b4-3837-45a7-ae82-8790466a57e5\") " pod="openstack/ceilometer-0" Mar 11 01:16:28 crc kubenswrapper[4744]: I0311 01:16:28.408253 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/02d576b4-3837-45a7-ae82-8790466a57e5-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"02d576b4-3837-45a7-ae82-8790466a57e5\") " pod="openstack/ceilometer-0" Mar 11 01:16:28 crc kubenswrapper[4744]: I0311 01:16:28.408302 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/02d576b4-3837-45a7-ae82-8790466a57e5-scripts\") pod \"ceilometer-0\" (UID: \"02d576b4-3837-45a7-ae82-8790466a57e5\") " pod="openstack/ceilometer-0" Mar 11 01:16:28 crc kubenswrapper[4744]: I0311 01:16:28.408705 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: 
\"kubernetes.io/empty-dir/02d576b4-3837-45a7-ae82-8790466a57e5-log-httpd\") pod \"ceilometer-0\" (UID: \"02d576b4-3837-45a7-ae82-8790466a57e5\") " pod="openstack/ceilometer-0" Mar 11 01:16:28 crc kubenswrapper[4744]: I0311 01:16:28.408782 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/02d576b4-3837-45a7-ae82-8790466a57e5-run-httpd\") pod \"ceilometer-0\" (UID: \"02d576b4-3837-45a7-ae82-8790466a57e5\") " pod="openstack/ceilometer-0" Mar 11 01:16:28 crc kubenswrapper[4744]: I0311 01:16:28.411959 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/02d576b4-3837-45a7-ae82-8790466a57e5-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"02d576b4-3837-45a7-ae82-8790466a57e5\") " pod="openstack/ceilometer-0" Mar 11 01:16:28 crc kubenswrapper[4744]: I0311 01:16:28.412530 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/02d576b4-3837-45a7-ae82-8790466a57e5-scripts\") pod \"ceilometer-0\" (UID: \"02d576b4-3837-45a7-ae82-8790466a57e5\") " pod="openstack/ceilometer-0" Mar 11 01:16:28 crc kubenswrapper[4744]: I0311 01:16:28.416719 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/02d576b4-3837-45a7-ae82-8790466a57e5-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"02d576b4-3837-45a7-ae82-8790466a57e5\") " pod="openstack/ceilometer-0" Mar 11 01:16:28 crc kubenswrapper[4744]: I0311 01:16:28.421266 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/02d576b4-3837-45a7-ae82-8790466a57e5-config-data\") pod \"ceilometer-0\" (UID: \"02d576b4-3837-45a7-ae82-8790466a57e5\") " pod="openstack/ceilometer-0" Mar 11 01:16:28 crc kubenswrapper[4744]: I0311 01:16:28.424967 4744 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-xswsv\" (UniqueName: \"kubernetes.io/projected/02d576b4-3837-45a7-ae82-8790466a57e5-kube-api-access-xswsv\") pod \"ceilometer-0\" (UID: \"02d576b4-3837-45a7-ae82-8790466a57e5\") " pod="openstack/ceilometer-0" Mar 11 01:16:28 crc kubenswrapper[4744]: I0311 01:16:28.661639 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 11 01:16:29 crc kubenswrapper[4744]: I0311 01:16:29.119946 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 11 01:16:29 crc kubenswrapper[4744]: W0311 01:16:29.141627 4744 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod02d576b4_3837_45a7_ae82_8790466a57e5.slice/crio-3061e37bb86008ff6e84bd72723957d4267cff2712139a6e3431df7cca6179b3 WatchSource:0}: Error finding container 3061e37bb86008ff6e84bd72723957d4267cff2712139a6e3431df7cca6179b3: Status 404 returned error can't find the container with id 3061e37bb86008ff6e84bd72723957d4267cff2712139a6e3431df7cca6179b3 Mar 11 01:16:29 crc kubenswrapper[4744]: I0311 01:16:29.171676 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"02d576b4-3837-45a7-ae82-8790466a57e5","Type":"ContainerStarted","Data":"3061e37bb86008ff6e84bd72723957d4267cff2712139a6e3431df7cca6179b3"} Mar 11 01:16:29 crc kubenswrapper[4744]: I0311 01:16:29.173382 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5b78c5c5d5-dzk6k" event={"ID":"08522fac-9099-4781-95f9-6f676a290be9","Type":"ContainerStarted","Data":"6e7e27e5b5fd33d5a3976f2ec72f0f701f3f6ebcb83f1f5aac4143095414dcea"} Mar 11 01:16:29 crc kubenswrapper[4744]: I0311 01:16:29.173594 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-5b78c5c5d5-dzk6k" Mar 11 01:16:29 crc kubenswrapper[4744]: I0311 01:16:29.175546 4744 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-7bcd5444fb-98mrs" event={"ID":"b6e79975-841a-4d71-9906-e53607f1b3fb","Type":"ContainerStarted","Data":"d1efc38f5e4775b9528fc73d86053b2b60c8851815b28fd27da85a849a8f9358"} Mar 11 01:16:29 crc kubenswrapper[4744]: I0311 01:16:29.176319 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-7bcd5444fb-98mrs" Mar 11 01:16:29 crc kubenswrapper[4744]: I0311 01:16:29.176362 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-7bcd5444fb-98mrs" Mar 11 01:16:29 crc kubenswrapper[4744]: I0311 01:16:29.193070 4744 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-5b78c5c5d5-dzk6k" podStartSLOduration=3.193043853 podStartE2EDuration="3.193043853s" podCreationTimestamp="2026-03-11 01:16:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-11 01:16:29.190198025 +0000 UTC m=+1345.994415630" watchObservedRunningTime="2026-03-11 01:16:29.193043853 +0000 UTC m=+1345.997261458" Mar 11 01:16:29 crc kubenswrapper[4744]: I0311 01:16:29.216177 4744 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-api-7bcd5444fb-98mrs" podStartSLOduration=3.216161998 podStartE2EDuration="3.216161998s" podCreationTimestamp="2026-03-11 01:16:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-11 01:16:29.20491685 +0000 UTC m=+1346.009134455" watchObservedRunningTime="2026-03-11 01:16:29.216161998 +0000 UTC m=+1346.020379593" Mar 11 01:16:29 crc kubenswrapper[4744]: I0311 01:16:29.746819 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-api-7c68976bb4-299gh"] Mar 11 01:16:29 crc kubenswrapper[4744]: I0311 01:16:29.748505 4744 util.go:30] "No sandbox for pod can be 
found. Need to start a new one" pod="openstack/barbican-api-7c68976bb4-299gh" Mar 11 01:16:29 crc kubenswrapper[4744]: I0311 01:16:29.751082 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-barbican-public-svc" Mar 11 01:16:29 crc kubenswrapper[4744]: I0311 01:16:29.751361 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-barbican-internal-svc" Mar 11 01:16:29 crc kubenswrapper[4744]: I0311 01:16:29.775992 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-7c68976bb4-299gh"] Mar 11 01:16:29 crc kubenswrapper[4744]: I0311 01:16:29.834621 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/44461324-fa82-4476-a621-c560a3c89e0f-public-tls-certs\") pod \"barbican-api-7c68976bb4-299gh\" (UID: \"44461324-fa82-4476-a621-c560a3c89e0f\") " pod="openstack/barbican-api-7c68976bb4-299gh" Mar 11 01:16:29 crc kubenswrapper[4744]: I0311 01:16:29.834679 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nqb9d\" (UniqueName: \"kubernetes.io/projected/44461324-fa82-4476-a621-c560a3c89e0f-kube-api-access-nqb9d\") pod \"barbican-api-7c68976bb4-299gh\" (UID: \"44461324-fa82-4476-a621-c560a3c89e0f\") " pod="openstack/barbican-api-7c68976bb4-299gh" Mar 11 01:16:29 crc kubenswrapper[4744]: I0311 01:16:29.834935 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/44461324-fa82-4476-a621-c560a3c89e0f-logs\") pod \"barbican-api-7c68976bb4-299gh\" (UID: \"44461324-fa82-4476-a621-c560a3c89e0f\") " pod="openstack/barbican-api-7c68976bb4-299gh" Mar 11 01:16:29 crc kubenswrapper[4744]: I0311 01:16:29.834986 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" 
(UniqueName: \"kubernetes.io/secret/44461324-fa82-4476-a621-c560a3c89e0f-combined-ca-bundle\") pod \"barbican-api-7c68976bb4-299gh\" (UID: \"44461324-fa82-4476-a621-c560a3c89e0f\") " pod="openstack/barbican-api-7c68976bb4-299gh" Mar 11 01:16:29 crc kubenswrapper[4744]: I0311 01:16:29.835098 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/44461324-fa82-4476-a621-c560a3c89e0f-config-data-custom\") pod \"barbican-api-7c68976bb4-299gh\" (UID: \"44461324-fa82-4476-a621-c560a3c89e0f\") " pod="openstack/barbican-api-7c68976bb4-299gh" Mar 11 01:16:29 crc kubenswrapper[4744]: I0311 01:16:29.835147 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/44461324-fa82-4476-a621-c560a3c89e0f-internal-tls-certs\") pod \"barbican-api-7c68976bb4-299gh\" (UID: \"44461324-fa82-4476-a621-c560a3c89e0f\") " pod="openstack/barbican-api-7c68976bb4-299gh" Mar 11 01:16:29 crc kubenswrapper[4744]: I0311 01:16:29.835166 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/44461324-fa82-4476-a621-c560a3c89e0f-config-data\") pod \"barbican-api-7c68976bb4-299gh\" (UID: \"44461324-fa82-4476-a621-c560a3c89e0f\") " pod="openstack/barbican-api-7c68976bb4-299gh" Mar 11 01:16:29 crc kubenswrapper[4744]: I0311 01:16:29.937201 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/44461324-fa82-4476-a621-c560a3c89e0f-config-data\") pod \"barbican-api-7c68976bb4-299gh\" (UID: \"44461324-fa82-4476-a621-c560a3c89e0f\") " pod="openstack/barbican-api-7c68976bb4-299gh" Mar 11 01:16:29 crc kubenswrapper[4744]: I0311 01:16:29.937272 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/44461324-fa82-4476-a621-c560a3c89e0f-public-tls-certs\") pod \"barbican-api-7c68976bb4-299gh\" (UID: \"44461324-fa82-4476-a621-c560a3c89e0f\") " pod="openstack/barbican-api-7c68976bb4-299gh" Mar 11 01:16:29 crc kubenswrapper[4744]: I0311 01:16:29.937303 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nqb9d\" (UniqueName: \"kubernetes.io/projected/44461324-fa82-4476-a621-c560a3c89e0f-kube-api-access-nqb9d\") pod \"barbican-api-7c68976bb4-299gh\" (UID: \"44461324-fa82-4476-a621-c560a3c89e0f\") " pod="openstack/barbican-api-7c68976bb4-299gh" Mar 11 01:16:29 crc kubenswrapper[4744]: I0311 01:16:29.937365 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/44461324-fa82-4476-a621-c560a3c89e0f-logs\") pod \"barbican-api-7c68976bb4-299gh\" (UID: \"44461324-fa82-4476-a621-c560a3c89e0f\") " pod="openstack/barbican-api-7c68976bb4-299gh" Mar 11 01:16:29 crc kubenswrapper[4744]: I0311 01:16:29.937386 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/44461324-fa82-4476-a621-c560a3c89e0f-combined-ca-bundle\") pod \"barbican-api-7c68976bb4-299gh\" (UID: \"44461324-fa82-4476-a621-c560a3c89e0f\") " pod="openstack/barbican-api-7c68976bb4-299gh" Mar 11 01:16:29 crc kubenswrapper[4744]: I0311 01:16:29.937434 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/44461324-fa82-4476-a621-c560a3c89e0f-config-data-custom\") pod \"barbican-api-7c68976bb4-299gh\" (UID: \"44461324-fa82-4476-a621-c560a3c89e0f\") " pod="openstack/barbican-api-7c68976bb4-299gh" Mar 11 01:16:29 crc kubenswrapper[4744]: I0311 01:16:29.937459 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/44461324-fa82-4476-a621-c560a3c89e0f-internal-tls-certs\") pod \"barbican-api-7c68976bb4-299gh\" (UID: \"44461324-fa82-4476-a621-c560a3c89e0f\") " pod="openstack/barbican-api-7c68976bb4-299gh" Mar 11 01:16:29 crc kubenswrapper[4744]: I0311 01:16:29.939058 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/44461324-fa82-4476-a621-c560a3c89e0f-logs\") pod \"barbican-api-7c68976bb4-299gh\" (UID: \"44461324-fa82-4476-a621-c560a3c89e0f\") " pod="openstack/barbican-api-7c68976bb4-299gh" Mar 11 01:16:29 crc kubenswrapper[4744]: I0311 01:16:29.946791 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/44461324-fa82-4476-a621-c560a3c89e0f-internal-tls-certs\") pod \"barbican-api-7c68976bb4-299gh\" (UID: \"44461324-fa82-4476-a621-c560a3c89e0f\") " pod="openstack/barbican-api-7c68976bb4-299gh" Mar 11 01:16:29 crc kubenswrapper[4744]: I0311 01:16:29.946958 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/44461324-fa82-4476-a621-c560a3c89e0f-public-tls-certs\") pod \"barbican-api-7c68976bb4-299gh\" (UID: \"44461324-fa82-4476-a621-c560a3c89e0f\") " pod="openstack/barbican-api-7c68976bb4-299gh" Mar 11 01:16:29 crc kubenswrapper[4744]: I0311 01:16:29.947704 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/44461324-fa82-4476-a621-c560a3c89e0f-combined-ca-bundle\") pod \"barbican-api-7c68976bb4-299gh\" (UID: \"44461324-fa82-4476-a621-c560a3c89e0f\") " pod="openstack/barbican-api-7c68976bb4-299gh" Mar 11 01:16:29 crc kubenswrapper[4744]: I0311 01:16:29.947960 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/44461324-fa82-4476-a621-c560a3c89e0f-config-data\") pod 
\"barbican-api-7c68976bb4-299gh\" (UID: \"44461324-fa82-4476-a621-c560a3c89e0f\") " pod="openstack/barbican-api-7c68976bb4-299gh" Mar 11 01:16:29 crc kubenswrapper[4744]: I0311 01:16:29.952092 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/44461324-fa82-4476-a621-c560a3c89e0f-config-data-custom\") pod \"barbican-api-7c68976bb4-299gh\" (UID: \"44461324-fa82-4476-a621-c560a3c89e0f\") " pod="openstack/barbican-api-7c68976bb4-299gh" Mar 11 01:16:29 crc kubenswrapper[4744]: I0311 01:16:29.963179 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nqb9d\" (UniqueName: \"kubernetes.io/projected/44461324-fa82-4476-a621-c560a3c89e0f-kube-api-access-nqb9d\") pod \"barbican-api-7c68976bb4-299gh\" (UID: \"44461324-fa82-4476-a621-c560a3c89e0f\") " pod="openstack/barbican-api-7c68976bb4-299gh" Mar 11 01:16:29 crc kubenswrapper[4744]: I0311 01:16:29.997584 4744 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ddeba13d-7885-44ef-8454-1a7b6ef48303" path="/var/lib/kubelet/pods/ddeba13d-7885-44ef-8454-1a7b6ef48303/volumes" Mar 11 01:16:30 crc kubenswrapper[4744]: I0311 01:16:30.065345 4744 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-7c68976bb4-299gh" Mar 11 01:16:30 crc kubenswrapper[4744]: I0311 01:16:30.667875 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-7c68976bb4-299gh"] Mar 11 01:16:31 crc kubenswrapper[4744]: I0311 01:16:31.113829 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Mar 11 01:16:31 crc kubenswrapper[4744]: I0311 01:16:31.127372 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Mar 11 01:16:31 crc kubenswrapper[4744]: I0311 01:16:31.213183 4744 generic.go:334] "Generic (PLEG): container finished" podID="5df37d98-3dbc-4977-add0-525bda3d679b" containerID="0f9b7029580d55c5f783ab75363f97ad48322c0595bdb273cac92c6d226c63be" exitCode=0 Mar 11 01:16:31 crc kubenswrapper[4744]: I0311 01:16:31.213249 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-hqbzq" event={"ID":"5df37d98-3dbc-4977-add0-525bda3d679b","Type":"ContainerDied","Data":"0f9b7029580d55c5f783ab75363f97ad48322c0595bdb273cac92c6d226c63be"} Mar 11 01:16:31 crc kubenswrapper[4744]: I0311 01:16:31.221544 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-bc7567ff7-gl658" event={"ID":"4fb8af9e-ef1e-45b0-b842-2647fe75510e","Type":"ContainerStarted","Data":"9a1f61db53cc92beec4b876c5f655cbd7b0389b0b62b4ff5bc3cc4d0c15ec01a"} Mar 11 01:16:31 crc kubenswrapper[4744]: I0311 01:16:31.221579 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-bc7567ff7-gl658" event={"ID":"4fb8af9e-ef1e-45b0-b842-2647fe75510e","Type":"ContainerStarted","Data":"0f15de8f909facf4d53d7ef48aa1e9dab867da2d0c8b16f8059f69e45a208436"} Mar 11 01:16:31 crc kubenswrapper[4744]: I0311 01:16:31.232479 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"02d576b4-3837-45a7-ae82-8790466a57e5","Type":"ContainerStarted","Data":"b26c41f5dfa25d3915bd43a8d3955b043a4c2b5a5ea7a98451ab36111298909b"} Mar 11 01:16:31 crc kubenswrapper[4744]: I0311 01:16:31.243732 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-7647d7b844-j6gcn" event={"ID":"8df98dbd-473b-4630-81ab-edd6419feb0d","Type":"ContainerStarted","Data":"0a9c241eccc912eedc93ca498ce49b3438428bba9d63396a61de516e8822d3fa"} Mar 11 01:16:31 crc kubenswrapper[4744]: I0311 01:16:31.243775 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-7647d7b844-j6gcn" event={"ID":"8df98dbd-473b-4630-81ab-edd6419feb0d","Type":"ContainerStarted","Data":"eb1793ab7701df1a2087e9693648ed7fc1c77b3aadb94a5a3e17509e9cd8767b"} Mar 11 01:16:31 crc kubenswrapper[4744]: I0311 01:16:31.270996 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-7c68976bb4-299gh" event={"ID":"44461324-fa82-4476-a621-c560a3c89e0f","Type":"ContainerStarted","Data":"347f55903065cc04c7bddba453ffd2605f2f89f0b3744727e7f308c9383fd8e9"} Mar 11 01:16:31 crc kubenswrapper[4744]: I0311 01:16:31.271036 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-7c68976bb4-299gh" event={"ID":"44461324-fa82-4476-a621-c560a3c89e0f","Type":"ContainerStarted","Data":"eb9d234dbc5c2b62e281936ae9110a93aa09d1211656f0987d1551eeec652ae8"} Mar 11 01:16:31 crc kubenswrapper[4744]: I0311 01:16:31.271046 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-7c68976bb4-299gh" event={"ID":"44461324-fa82-4476-a621-c560a3c89e0f","Type":"ContainerStarted","Data":"cd2685be8c2515724b30ea52e7364dd64a1e93a5cdd26d77f671fc74d691afb1"} Mar 11 01:16:31 crc kubenswrapper[4744]: I0311 01:16:31.271058 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-7c68976bb4-299gh" Mar 11 01:16:31 crc kubenswrapper[4744]: I0311 
01:16:31.271452 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-7c68976bb4-299gh" Mar 11 01:16:31 crc kubenswrapper[4744]: I0311 01:16:31.276506 4744 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-worker-bc7567ff7-gl658" podStartSLOduration=2.378627805 podStartE2EDuration="5.276489705s" podCreationTimestamp="2026-03-11 01:16:26 +0000 UTC" firstStartedPulling="2026-03-11 01:16:27.513098086 +0000 UTC m=+1344.317315691" lastFinishedPulling="2026-03-11 01:16:30.410959986 +0000 UTC m=+1347.215177591" observedRunningTime="2026-03-11 01:16:31.254268768 +0000 UTC m=+1348.058486373" watchObservedRunningTime="2026-03-11 01:16:31.276489705 +0000 UTC m=+1348.080707310" Mar 11 01:16:31 crc kubenswrapper[4744]: I0311 01:16:31.293275 4744 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-keystone-listener-7647d7b844-j6gcn" podStartSLOduration=2.450266169 podStartE2EDuration="5.293256514s" podCreationTimestamp="2026-03-11 01:16:26 +0000 UTC" firstStartedPulling="2026-03-11 01:16:27.565578578 +0000 UTC m=+1344.369796183" lastFinishedPulling="2026-03-11 01:16:30.408568923 +0000 UTC m=+1347.212786528" observedRunningTime="2026-03-11 01:16:31.282749379 +0000 UTC m=+1348.086966984" watchObservedRunningTime="2026-03-11 01:16:31.293256514 +0000 UTC m=+1348.097474119" Mar 11 01:16:31 crc kubenswrapper[4744]: I0311 01:16:31.361327 4744 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-api-7c68976bb4-299gh" podStartSLOduration=2.361309838 podStartE2EDuration="2.361309838s" podCreationTimestamp="2026-03-11 01:16:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-11 01:16:31.348159311 +0000 UTC m=+1348.152376916" watchObservedRunningTime="2026-03-11 01:16:31.361309838 +0000 UTC m=+1348.165527443" Mar 11 01:16:32 crc 
kubenswrapper[4744]: I0311 01:16:32.283093 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"02d576b4-3837-45a7-ae82-8790466a57e5","Type":"ContainerStarted","Data":"c721d31662de7199ff6a4551c3f0cf6e79e3b15aaf72c0cb6cc33a2952d19f70"} Mar 11 01:16:32 crc kubenswrapper[4744]: I0311 01:16:32.769982 4744 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-sync-hqbzq" Mar 11 01:16:32 crc kubenswrapper[4744]: I0311 01:16:32.907199 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5df37d98-3dbc-4977-add0-525bda3d679b-scripts\") pod \"5df37d98-3dbc-4977-add0-525bda3d679b\" (UID: \"5df37d98-3dbc-4977-add0-525bda3d679b\") " Mar 11 01:16:32 crc kubenswrapper[4744]: I0311 01:16:32.907249 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/5df37d98-3dbc-4977-add0-525bda3d679b-etc-machine-id\") pod \"5df37d98-3dbc-4977-add0-525bda3d679b\" (UID: \"5df37d98-3dbc-4977-add0-525bda3d679b\") " Mar 11 01:16:32 crc kubenswrapper[4744]: I0311 01:16:32.907343 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/5df37d98-3dbc-4977-add0-525bda3d679b-db-sync-config-data\") pod \"5df37d98-3dbc-4977-add0-525bda3d679b\" (UID: \"5df37d98-3dbc-4977-add0-525bda3d679b\") " Mar 11 01:16:32 crc kubenswrapper[4744]: I0311 01:16:32.907418 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6gc2c\" (UniqueName: \"kubernetes.io/projected/5df37d98-3dbc-4977-add0-525bda3d679b-kube-api-access-6gc2c\") pod \"5df37d98-3dbc-4977-add0-525bda3d679b\" (UID: \"5df37d98-3dbc-4977-add0-525bda3d679b\") " Mar 11 01:16:32 crc kubenswrapper[4744]: I0311 01:16:32.907426 4744 operation_generator.go:803] UnmountVolume.TearDown 
succeeded for volume "kubernetes.io/host-path/5df37d98-3dbc-4977-add0-525bda3d679b-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "5df37d98-3dbc-4977-add0-525bda3d679b" (UID: "5df37d98-3dbc-4977-add0-525bda3d679b"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 11 01:16:32 crc kubenswrapper[4744]: I0311 01:16:32.907468 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5df37d98-3dbc-4977-add0-525bda3d679b-config-data\") pod \"5df37d98-3dbc-4977-add0-525bda3d679b\" (UID: \"5df37d98-3dbc-4977-add0-525bda3d679b\") " Mar 11 01:16:32 crc kubenswrapper[4744]: I0311 01:16:32.907527 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5df37d98-3dbc-4977-add0-525bda3d679b-combined-ca-bundle\") pod \"5df37d98-3dbc-4977-add0-525bda3d679b\" (UID: \"5df37d98-3dbc-4977-add0-525bda3d679b\") " Mar 11 01:16:32 crc kubenswrapper[4744]: I0311 01:16:32.907962 4744 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/5df37d98-3dbc-4977-add0-525bda3d679b-etc-machine-id\") on node \"crc\" DevicePath \"\"" Mar 11 01:16:32 crc kubenswrapper[4744]: I0311 01:16:32.913683 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5df37d98-3dbc-4977-add0-525bda3d679b-kube-api-access-6gc2c" (OuterVolumeSpecName: "kube-api-access-6gc2c") pod "5df37d98-3dbc-4977-add0-525bda3d679b" (UID: "5df37d98-3dbc-4977-add0-525bda3d679b"). InnerVolumeSpecName "kube-api-access-6gc2c". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 01:16:32 crc kubenswrapper[4744]: I0311 01:16:32.917628 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5df37d98-3dbc-4977-add0-525bda3d679b-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "5df37d98-3dbc-4977-add0-525bda3d679b" (UID: "5df37d98-3dbc-4977-add0-525bda3d679b"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 01:16:32 crc kubenswrapper[4744]: I0311 01:16:32.941862 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5df37d98-3dbc-4977-add0-525bda3d679b-scripts" (OuterVolumeSpecName: "scripts") pod "5df37d98-3dbc-4977-add0-525bda3d679b" (UID: "5df37d98-3dbc-4977-add0-525bda3d679b"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 01:16:32 crc kubenswrapper[4744]: I0311 01:16:32.944618 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5df37d98-3dbc-4977-add0-525bda3d679b-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "5df37d98-3dbc-4977-add0-525bda3d679b" (UID: "5df37d98-3dbc-4977-add0-525bda3d679b"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 01:16:32 crc kubenswrapper[4744]: I0311 01:16:32.976234 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5df37d98-3dbc-4977-add0-525bda3d679b-config-data" (OuterVolumeSpecName: "config-data") pod "5df37d98-3dbc-4977-add0-525bda3d679b" (UID: "5df37d98-3dbc-4977-add0-525bda3d679b"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 01:16:33 crc kubenswrapper[4744]: I0311 01:16:33.010007 4744 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5df37d98-3dbc-4977-add0-525bda3d679b-scripts\") on node \"crc\" DevicePath \"\"" Mar 11 01:16:33 crc kubenswrapper[4744]: I0311 01:16:33.010043 4744 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/5df37d98-3dbc-4977-add0-525bda3d679b-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Mar 11 01:16:33 crc kubenswrapper[4744]: I0311 01:16:33.010055 4744 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6gc2c\" (UniqueName: \"kubernetes.io/projected/5df37d98-3dbc-4977-add0-525bda3d679b-kube-api-access-6gc2c\") on node \"crc\" DevicePath \"\"" Mar 11 01:16:33 crc kubenswrapper[4744]: I0311 01:16:33.010064 4744 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5df37d98-3dbc-4977-add0-525bda3d679b-config-data\") on node \"crc\" DevicePath \"\"" Mar 11 01:16:33 crc kubenswrapper[4744]: I0311 01:16:33.010074 4744 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5df37d98-3dbc-4977-add0-525bda3d679b-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 11 01:16:33 crc kubenswrapper[4744]: I0311 01:16:33.292599 4744 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-sync-hqbzq" Mar 11 01:16:33 crc kubenswrapper[4744]: I0311 01:16:33.292634 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-hqbzq" event={"ID":"5df37d98-3dbc-4977-add0-525bda3d679b","Type":"ContainerDied","Data":"b3eebb7973fbdf8d377cc2adc2dacbd7c35395418b497fb08767391987764293"} Mar 11 01:16:33 crc kubenswrapper[4744]: I0311 01:16:33.293586 4744 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b3eebb7973fbdf8d377cc2adc2dacbd7c35395418b497fb08767391987764293" Mar 11 01:16:33 crc kubenswrapper[4744]: I0311 01:16:33.299681 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"02d576b4-3837-45a7-ae82-8790466a57e5","Type":"ContainerStarted","Data":"a1d0d9e392c18f0e1f7a8fb2a2ca1229ec6bdde55933a18709ac9f5dfc14797d"} Mar 11 01:16:33 crc kubenswrapper[4744]: I0311 01:16:33.558472 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-scheduler-0"] Mar 11 01:16:33 crc kubenswrapper[4744]: E0311 01:16:33.558836 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5df37d98-3dbc-4977-add0-525bda3d679b" containerName="cinder-db-sync" Mar 11 01:16:33 crc kubenswrapper[4744]: I0311 01:16:33.558852 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="5df37d98-3dbc-4977-add0-525bda3d679b" containerName="cinder-db-sync" Mar 11 01:16:33 crc kubenswrapper[4744]: I0311 01:16:33.559044 4744 memory_manager.go:354] "RemoveStaleState removing state" podUID="5df37d98-3dbc-4977-add0-525bda3d679b" containerName="cinder-db-sync" Mar 11 01:16:33 crc kubenswrapper[4744]: I0311 01:16:33.559916 4744 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Mar 11 01:16:33 crc kubenswrapper[4744]: I0311 01:16:33.572378 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-config-data" Mar 11 01:16:33 crc kubenswrapper[4744]: I0311 01:16:33.572584 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-cinder-dockercfg-nwhd4" Mar 11 01:16:33 crc kubenswrapper[4744]: I0311 01:16:33.582379 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scheduler-config-data" Mar 11 01:16:33 crc kubenswrapper[4744]: I0311 01:16:33.582604 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scripts" Mar 11 01:16:33 crc kubenswrapper[4744]: I0311 01:16:33.590961 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Mar 11 01:16:33 crc kubenswrapper[4744]: I0311 01:16:33.687973 4744 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5b78c5c5d5-dzk6k"] Mar 11 01:16:33 crc kubenswrapper[4744]: I0311 01:16:33.688186 4744 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-5b78c5c5d5-dzk6k" podUID="08522fac-9099-4781-95f9-6f676a290be9" containerName="dnsmasq-dns" containerID="cri-o://6e7e27e5b5fd33d5a3976f2ec72f0f701f3f6ebcb83f1f5aac4143095414dcea" gracePeriod=10 Mar 11 01:16:33 crc kubenswrapper[4744]: I0311 01:16:33.691266 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-5b78c5c5d5-dzk6k" Mar 11 01:16:33 crc kubenswrapper[4744]: I0311 01:16:33.719573 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/55bfa6db-7b61-4f7b-a860-197d4f32ba8b-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"55bfa6db-7b61-4f7b-a860-197d4f32ba8b\") " pod="openstack/cinder-scheduler-0" Mar 11 01:16:33 crc 
kubenswrapper[4744]: I0311 01:16:33.719884 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/55bfa6db-7b61-4f7b-a860-197d4f32ba8b-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"55bfa6db-7b61-4f7b-a860-197d4f32ba8b\") " pod="openstack/cinder-scheduler-0" Mar 11 01:16:33 crc kubenswrapper[4744]: I0311 01:16:33.720035 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/55bfa6db-7b61-4f7b-a860-197d4f32ba8b-config-data\") pod \"cinder-scheduler-0\" (UID: \"55bfa6db-7b61-4f7b-a860-197d4f32ba8b\") " pod="openstack/cinder-scheduler-0" Mar 11 01:16:33 crc kubenswrapper[4744]: I0311 01:16:33.720141 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/55bfa6db-7b61-4f7b-a860-197d4f32ba8b-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"55bfa6db-7b61-4f7b-a860-197d4f32ba8b\") " pod="openstack/cinder-scheduler-0" Mar 11 01:16:33 crc kubenswrapper[4744]: I0311 01:16:33.720312 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/55bfa6db-7b61-4f7b-a860-197d4f32ba8b-scripts\") pod \"cinder-scheduler-0\" (UID: \"55bfa6db-7b61-4f7b-a860-197d4f32ba8b\") " pod="openstack/cinder-scheduler-0" Mar 11 01:16:33 crc kubenswrapper[4744]: I0311 01:16:33.720475 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h4728\" (UniqueName: \"kubernetes.io/projected/55bfa6db-7b61-4f7b-a860-197d4f32ba8b-kube-api-access-h4728\") pod \"cinder-scheduler-0\" (UID: \"55bfa6db-7b61-4f7b-a860-197d4f32ba8b\") " pod="openstack/cinder-scheduler-0" Mar 11 01:16:33 crc kubenswrapper[4744]: I0311 01:16:33.736121 4744 
kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-86dc97b969-rxvv5"] Mar 11 01:16:33 crc kubenswrapper[4744]: I0311 01:16:33.737450 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-86dc97b969-rxvv5" Mar 11 01:16:33 crc kubenswrapper[4744]: I0311 01:16:33.773128 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-86dc97b969-rxvv5"] Mar 11 01:16:33 crc kubenswrapper[4744]: I0311 01:16:33.825826 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/55bfa6db-7b61-4f7b-a860-197d4f32ba8b-config-data\") pod \"cinder-scheduler-0\" (UID: \"55bfa6db-7b61-4f7b-a860-197d4f32ba8b\") " pod="openstack/cinder-scheduler-0" Mar 11 01:16:33 crc kubenswrapper[4744]: I0311 01:16:33.826069 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/55bfa6db-7b61-4f7b-a860-197d4f32ba8b-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"55bfa6db-7b61-4f7b-a860-197d4f32ba8b\") " pod="openstack/cinder-scheduler-0" Mar 11 01:16:33 crc kubenswrapper[4744]: I0311 01:16:33.826160 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f0da3325-8792-44bf-8c25-ea1648998ce0-ovsdbserver-sb\") pod \"dnsmasq-dns-86dc97b969-rxvv5\" (UID: \"f0da3325-8792-44bf-8c25-ea1648998ce0\") " pod="openstack/dnsmasq-dns-86dc97b969-rxvv5" Mar 11 01:16:33 crc kubenswrapper[4744]: I0311 01:16:33.826267 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/55bfa6db-7b61-4f7b-a860-197d4f32ba8b-scripts\") pod \"cinder-scheduler-0\" (UID: \"55bfa6db-7b61-4f7b-a860-197d4f32ba8b\") " pod="openstack/cinder-scheduler-0" Mar 11 01:16:33 crc kubenswrapper[4744]: I0311 01:16:33.826339 
4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f0da3325-8792-44bf-8c25-ea1648998ce0-ovsdbserver-nb\") pod \"dnsmasq-dns-86dc97b969-rxvv5\" (UID: \"f0da3325-8792-44bf-8c25-ea1648998ce0\") " pod="openstack/dnsmasq-dns-86dc97b969-rxvv5" Mar 11 01:16:33 crc kubenswrapper[4744]: I0311 01:16:33.826418 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f0da3325-8792-44bf-8c25-ea1648998ce0-dns-svc\") pod \"dnsmasq-dns-86dc97b969-rxvv5\" (UID: \"f0da3325-8792-44bf-8c25-ea1648998ce0\") " pod="openstack/dnsmasq-dns-86dc97b969-rxvv5" Mar 11 01:16:33 crc kubenswrapper[4744]: I0311 01:16:33.826500 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qrndz\" (UniqueName: \"kubernetes.io/projected/f0da3325-8792-44bf-8c25-ea1648998ce0-kube-api-access-qrndz\") pod \"dnsmasq-dns-86dc97b969-rxvv5\" (UID: \"f0da3325-8792-44bf-8c25-ea1648998ce0\") " pod="openstack/dnsmasq-dns-86dc97b969-rxvv5" Mar 11 01:16:33 crc kubenswrapper[4744]: I0311 01:16:33.826633 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h4728\" (UniqueName: \"kubernetes.io/projected/55bfa6db-7b61-4f7b-a860-197d4f32ba8b-kube-api-access-h4728\") pod \"cinder-scheduler-0\" (UID: \"55bfa6db-7b61-4f7b-a860-197d4f32ba8b\") " pod="openstack/cinder-scheduler-0" Mar 11 01:16:33 crc kubenswrapper[4744]: I0311 01:16:33.826736 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f0da3325-8792-44bf-8c25-ea1648998ce0-config\") pod \"dnsmasq-dns-86dc97b969-rxvv5\" (UID: \"f0da3325-8792-44bf-8c25-ea1648998ce0\") " pod="openstack/dnsmasq-dns-86dc97b969-rxvv5" Mar 11 01:16:33 crc kubenswrapper[4744]: I0311 01:16:33.826856 
4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/f0da3325-8792-44bf-8c25-ea1648998ce0-dns-swift-storage-0\") pod \"dnsmasq-dns-86dc97b969-rxvv5\" (UID: \"f0da3325-8792-44bf-8c25-ea1648998ce0\") " pod="openstack/dnsmasq-dns-86dc97b969-rxvv5" Mar 11 01:16:33 crc kubenswrapper[4744]: I0311 01:16:33.826935 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/55bfa6db-7b61-4f7b-a860-197d4f32ba8b-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"55bfa6db-7b61-4f7b-a860-197d4f32ba8b\") " pod="openstack/cinder-scheduler-0" Mar 11 01:16:33 crc kubenswrapper[4744]: I0311 01:16:33.827043 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/55bfa6db-7b61-4f7b-a860-197d4f32ba8b-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"55bfa6db-7b61-4f7b-a860-197d4f32ba8b\") " pod="openstack/cinder-scheduler-0" Mar 11 01:16:33 crc kubenswrapper[4744]: I0311 01:16:33.831362 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/55bfa6db-7b61-4f7b-a860-197d4f32ba8b-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"55bfa6db-7b61-4f7b-a860-197d4f32ba8b\") " pod="openstack/cinder-scheduler-0" Mar 11 01:16:33 crc kubenswrapper[4744]: I0311 01:16:33.831597 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/55bfa6db-7b61-4f7b-a860-197d4f32ba8b-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"55bfa6db-7b61-4f7b-a860-197d4f32ba8b\") " pod="openstack/cinder-scheduler-0" Mar 11 01:16:33 crc kubenswrapper[4744]: I0311 01:16:33.832860 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/55bfa6db-7b61-4f7b-a860-197d4f32ba8b-config-data\") pod \"cinder-scheduler-0\" (UID: \"55bfa6db-7b61-4f7b-a860-197d4f32ba8b\") " pod="openstack/cinder-scheduler-0" Mar 11 01:16:33 crc kubenswrapper[4744]: I0311 01:16:33.834592 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/55bfa6db-7b61-4f7b-a860-197d4f32ba8b-scripts\") pod \"cinder-scheduler-0\" (UID: \"55bfa6db-7b61-4f7b-a860-197d4f32ba8b\") " pod="openstack/cinder-scheduler-0" Mar 11 01:16:33 crc kubenswrapper[4744]: I0311 01:16:33.835056 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/55bfa6db-7b61-4f7b-a860-197d4f32ba8b-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"55bfa6db-7b61-4f7b-a860-197d4f32ba8b\") " pod="openstack/cinder-scheduler-0" Mar 11 01:16:33 crc kubenswrapper[4744]: I0311 01:16:33.848557 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-api-0"] Mar 11 01:16:33 crc kubenswrapper[4744]: I0311 01:16:33.851048 4744 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Mar 11 01:16:33 crc kubenswrapper[4744]: I0311 01:16:33.854662 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-api-config-data" Mar 11 01:16:33 crc kubenswrapper[4744]: I0311 01:16:33.858313 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h4728\" (UniqueName: \"kubernetes.io/projected/55bfa6db-7b61-4f7b-a860-197d4f32ba8b-kube-api-access-h4728\") pod \"cinder-scheduler-0\" (UID: \"55bfa6db-7b61-4f7b-a860-197d4f32ba8b\") " pod="openstack/cinder-scheduler-0" Mar 11 01:16:33 crc kubenswrapper[4744]: I0311 01:16:33.872252 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Mar 11 01:16:33 crc kubenswrapper[4744]: I0311 01:16:33.885332 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Mar 11 01:16:33 crc kubenswrapper[4744]: I0311 01:16:33.928816 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f0da3325-8792-44bf-8c25-ea1648998ce0-dns-svc\") pod \"dnsmasq-dns-86dc97b969-rxvv5\" (UID: \"f0da3325-8792-44bf-8c25-ea1648998ce0\") " pod="openstack/dnsmasq-dns-86dc97b969-rxvv5" Mar 11 01:16:33 crc kubenswrapper[4744]: I0311 01:16:33.928868 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qrndz\" (UniqueName: \"kubernetes.io/projected/f0da3325-8792-44bf-8c25-ea1648998ce0-kube-api-access-qrndz\") pod \"dnsmasq-dns-86dc97b969-rxvv5\" (UID: \"f0da3325-8792-44bf-8c25-ea1648998ce0\") " pod="openstack/dnsmasq-dns-86dc97b969-rxvv5" Mar 11 01:16:33 crc kubenswrapper[4744]: I0311 01:16:33.928891 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fb474add-909e-4c17-814e-186f40b3faac-combined-ca-bundle\") pod \"cinder-api-0\" 
(UID: \"fb474add-909e-4c17-814e-186f40b3faac\") " pod="openstack/cinder-api-0" Mar 11 01:16:33 crc kubenswrapper[4744]: I0311 01:16:33.928926 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f0da3325-8792-44bf-8c25-ea1648998ce0-config\") pod \"dnsmasq-dns-86dc97b969-rxvv5\" (UID: \"f0da3325-8792-44bf-8c25-ea1648998ce0\") " pod="openstack/dnsmasq-dns-86dc97b969-rxvv5" Mar 11 01:16:33 crc kubenswrapper[4744]: I0311 01:16:33.928943 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/f0da3325-8792-44bf-8c25-ea1648998ce0-dns-swift-storage-0\") pod \"dnsmasq-dns-86dc97b969-rxvv5\" (UID: \"f0da3325-8792-44bf-8c25-ea1648998ce0\") " pod="openstack/dnsmasq-dns-86dc97b969-rxvv5" Mar 11 01:16:33 crc kubenswrapper[4744]: I0311 01:16:33.928958 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/fb474add-909e-4c17-814e-186f40b3faac-logs\") pod \"cinder-api-0\" (UID: \"fb474add-909e-4c17-814e-186f40b3faac\") " pod="openstack/cinder-api-0" Mar 11 01:16:33 crc kubenswrapper[4744]: I0311 01:16:33.928973 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fb474add-909e-4c17-814e-186f40b3faac-config-data\") pod \"cinder-api-0\" (UID: \"fb474add-909e-4c17-814e-186f40b3faac\") " pod="openstack/cinder-api-0" Mar 11 01:16:33 crc kubenswrapper[4744]: I0311 01:16:33.929001 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d5l54\" (UniqueName: \"kubernetes.io/projected/fb474add-909e-4c17-814e-186f40b3faac-kube-api-access-d5l54\") pod \"cinder-api-0\" (UID: \"fb474add-909e-4c17-814e-186f40b3faac\") " pod="openstack/cinder-api-0" Mar 11 01:16:33 crc 
kubenswrapper[4744]: I0311 01:16:33.929035 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fb474add-909e-4c17-814e-186f40b3faac-scripts\") pod \"cinder-api-0\" (UID: \"fb474add-909e-4c17-814e-186f40b3faac\") " pod="openstack/cinder-api-0" Mar 11 01:16:33 crc kubenswrapper[4744]: I0311 01:16:33.929067 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f0da3325-8792-44bf-8c25-ea1648998ce0-ovsdbserver-sb\") pod \"dnsmasq-dns-86dc97b969-rxvv5\" (UID: \"f0da3325-8792-44bf-8c25-ea1648998ce0\") " pod="openstack/dnsmasq-dns-86dc97b969-rxvv5" Mar 11 01:16:33 crc kubenswrapper[4744]: I0311 01:16:33.929124 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/fb474add-909e-4c17-814e-186f40b3faac-config-data-custom\") pod \"cinder-api-0\" (UID: \"fb474add-909e-4c17-814e-186f40b3faac\") " pod="openstack/cinder-api-0" Mar 11 01:16:33 crc kubenswrapper[4744]: I0311 01:16:33.929140 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/fb474add-909e-4c17-814e-186f40b3faac-etc-machine-id\") pod \"cinder-api-0\" (UID: \"fb474add-909e-4c17-814e-186f40b3faac\") " pod="openstack/cinder-api-0" Mar 11 01:16:33 crc kubenswrapper[4744]: I0311 01:16:33.929159 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f0da3325-8792-44bf-8c25-ea1648998ce0-ovsdbserver-nb\") pod \"dnsmasq-dns-86dc97b969-rxvv5\" (UID: \"f0da3325-8792-44bf-8c25-ea1648998ce0\") " pod="openstack/dnsmasq-dns-86dc97b969-rxvv5" Mar 11 01:16:33 crc kubenswrapper[4744]: I0311 01:16:33.929975 4744 operation_generator.go:637] "MountVolume.SetUp succeeded 
for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f0da3325-8792-44bf-8c25-ea1648998ce0-ovsdbserver-nb\") pod \"dnsmasq-dns-86dc97b969-rxvv5\" (UID: \"f0da3325-8792-44bf-8c25-ea1648998ce0\") " pod="openstack/dnsmasq-dns-86dc97b969-rxvv5" Mar 11 01:16:33 crc kubenswrapper[4744]: I0311 01:16:33.930751 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f0da3325-8792-44bf-8c25-ea1648998ce0-dns-svc\") pod \"dnsmasq-dns-86dc97b969-rxvv5\" (UID: \"f0da3325-8792-44bf-8c25-ea1648998ce0\") " pod="openstack/dnsmasq-dns-86dc97b969-rxvv5" Mar 11 01:16:33 crc kubenswrapper[4744]: I0311 01:16:33.931136 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f0da3325-8792-44bf-8c25-ea1648998ce0-config\") pod \"dnsmasq-dns-86dc97b969-rxvv5\" (UID: \"f0da3325-8792-44bf-8c25-ea1648998ce0\") " pod="openstack/dnsmasq-dns-86dc97b969-rxvv5" Mar 11 01:16:33 crc kubenswrapper[4744]: I0311 01:16:33.932981 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/f0da3325-8792-44bf-8c25-ea1648998ce0-dns-swift-storage-0\") pod \"dnsmasq-dns-86dc97b969-rxvv5\" (UID: \"f0da3325-8792-44bf-8c25-ea1648998ce0\") " pod="openstack/dnsmasq-dns-86dc97b969-rxvv5" Mar 11 01:16:33 crc kubenswrapper[4744]: I0311 01:16:33.933195 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f0da3325-8792-44bf-8c25-ea1648998ce0-ovsdbserver-sb\") pod \"dnsmasq-dns-86dc97b969-rxvv5\" (UID: \"f0da3325-8792-44bf-8c25-ea1648998ce0\") " pod="openstack/dnsmasq-dns-86dc97b969-rxvv5" Mar 11 01:16:33 crc kubenswrapper[4744]: I0311 01:16:33.948839 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qrndz\" (UniqueName: 
\"kubernetes.io/projected/f0da3325-8792-44bf-8c25-ea1648998ce0-kube-api-access-qrndz\") pod \"dnsmasq-dns-86dc97b969-rxvv5\" (UID: \"f0da3325-8792-44bf-8c25-ea1648998ce0\") " pod="openstack/dnsmasq-dns-86dc97b969-rxvv5" Mar 11 01:16:34 crc kubenswrapper[4744]: I0311 01:16:34.032228 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/fb474add-909e-4c17-814e-186f40b3faac-logs\") pod \"cinder-api-0\" (UID: \"fb474add-909e-4c17-814e-186f40b3faac\") " pod="openstack/cinder-api-0" Mar 11 01:16:34 crc kubenswrapper[4744]: I0311 01:16:34.032282 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fb474add-909e-4c17-814e-186f40b3faac-config-data\") pod \"cinder-api-0\" (UID: \"fb474add-909e-4c17-814e-186f40b3faac\") " pod="openstack/cinder-api-0" Mar 11 01:16:34 crc kubenswrapper[4744]: I0311 01:16:34.032350 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d5l54\" (UniqueName: \"kubernetes.io/projected/fb474add-909e-4c17-814e-186f40b3faac-kube-api-access-d5l54\") pod \"cinder-api-0\" (UID: \"fb474add-909e-4c17-814e-186f40b3faac\") " pod="openstack/cinder-api-0" Mar 11 01:16:34 crc kubenswrapper[4744]: I0311 01:16:34.032792 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/fb474add-909e-4c17-814e-186f40b3faac-logs\") pod \"cinder-api-0\" (UID: \"fb474add-909e-4c17-814e-186f40b3faac\") " pod="openstack/cinder-api-0" Mar 11 01:16:34 crc kubenswrapper[4744]: I0311 01:16:34.035272 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fb474add-909e-4c17-814e-186f40b3faac-scripts\") pod \"cinder-api-0\" (UID: \"fb474add-909e-4c17-814e-186f40b3faac\") " pod="openstack/cinder-api-0" Mar 11 01:16:34 crc kubenswrapper[4744]: I0311 
01:16:34.035657 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/fb474add-909e-4c17-814e-186f40b3faac-config-data-custom\") pod \"cinder-api-0\" (UID: \"fb474add-909e-4c17-814e-186f40b3faac\") " pod="openstack/cinder-api-0" Mar 11 01:16:34 crc kubenswrapper[4744]: I0311 01:16:34.035684 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/fb474add-909e-4c17-814e-186f40b3faac-etc-machine-id\") pod \"cinder-api-0\" (UID: \"fb474add-909e-4c17-814e-186f40b3faac\") " pod="openstack/cinder-api-0" Mar 11 01:16:34 crc kubenswrapper[4744]: I0311 01:16:34.035800 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fb474add-909e-4c17-814e-186f40b3faac-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"fb474add-909e-4c17-814e-186f40b3faac\") " pod="openstack/cinder-api-0" Mar 11 01:16:34 crc kubenswrapper[4744]: I0311 01:16:34.036080 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/fb474add-909e-4c17-814e-186f40b3faac-etc-machine-id\") pod \"cinder-api-0\" (UID: \"fb474add-909e-4c17-814e-186f40b3faac\") " pod="openstack/cinder-api-0" Mar 11 01:16:34 crc kubenswrapper[4744]: I0311 01:16:34.038954 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fb474add-909e-4c17-814e-186f40b3faac-config-data\") pod \"cinder-api-0\" (UID: \"fb474add-909e-4c17-814e-186f40b3faac\") " pod="openstack/cinder-api-0" Mar 11 01:16:34 crc kubenswrapper[4744]: I0311 01:16:34.040143 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/fb474add-909e-4c17-814e-186f40b3faac-config-data-custom\") pod \"cinder-api-0\" 
(UID: \"fb474add-909e-4c17-814e-186f40b3faac\") " pod="openstack/cinder-api-0" Mar 11 01:16:34 crc kubenswrapper[4744]: I0311 01:16:34.049785 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fb474add-909e-4c17-814e-186f40b3faac-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"fb474add-909e-4c17-814e-186f40b3faac\") " pod="openstack/cinder-api-0" Mar 11 01:16:34 crc kubenswrapper[4744]: I0311 01:16:34.050012 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fb474add-909e-4c17-814e-186f40b3faac-scripts\") pod \"cinder-api-0\" (UID: \"fb474add-909e-4c17-814e-186f40b3faac\") " pod="openstack/cinder-api-0" Mar 11 01:16:34 crc kubenswrapper[4744]: I0311 01:16:34.054068 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d5l54\" (UniqueName: \"kubernetes.io/projected/fb474add-909e-4c17-814e-186f40b3faac-kube-api-access-d5l54\") pod \"cinder-api-0\" (UID: \"fb474add-909e-4c17-814e-186f40b3faac\") " pod="openstack/cinder-api-0" Mar 11 01:16:34 crc kubenswrapper[4744]: I0311 01:16:34.059752 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-86dc97b969-rxvv5" Mar 11 01:16:34 crc kubenswrapper[4744]: I0311 01:16:34.231972 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-7bcd5444fb-98mrs" Mar 11 01:16:34 crc kubenswrapper[4744]: I0311 01:16:34.246435 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Mar 11 01:16:35 crc kubenswrapper[4744]: I0311 01:16:35.265849 4744 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5b78c5c5d5-dzk6k" Mar 11 01:16:35 crc kubenswrapper[4744]: I0311 01:16:35.312891 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Mar 11 01:16:35 crc kubenswrapper[4744]: I0311 01:16:35.318819 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/08522fac-9099-4781-95f9-6f676a290be9-dns-swift-storage-0\") pod \"08522fac-9099-4781-95f9-6f676a290be9\" (UID: \"08522fac-9099-4781-95f9-6f676a290be9\") " Mar 11 01:16:35 crc kubenswrapper[4744]: I0311 01:16:35.319476 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/08522fac-9099-4781-95f9-6f676a290be9-ovsdbserver-nb\") pod \"08522fac-9099-4781-95f9-6f676a290be9\" (UID: \"08522fac-9099-4781-95f9-6f676a290be9\") " Mar 11 01:16:35 crc kubenswrapper[4744]: I0311 01:16:35.319572 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/08522fac-9099-4781-95f9-6f676a290be9-dns-svc\") pod \"08522fac-9099-4781-95f9-6f676a290be9\" (UID: \"08522fac-9099-4781-95f9-6f676a290be9\") " Mar 11 01:16:35 crc kubenswrapper[4744]: I0311 01:16:35.319860 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/08522fac-9099-4781-95f9-6f676a290be9-ovsdbserver-sb\") pod \"08522fac-9099-4781-95f9-6f676a290be9\" (UID: \"08522fac-9099-4781-95f9-6f676a290be9\") " Mar 11 01:16:35 crc kubenswrapper[4744]: I0311 01:16:35.319948 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9pdjs\" (UniqueName: \"kubernetes.io/projected/08522fac-9099-4781-95f9-6f676a290be9-kube-api-access-9pdjs\") pod \"08522fac-9099-4781-95f9-6f676a290be9\" (UID: \"08522fac-9099-4781-95f9-6f676a290be9\") " 
Mar 11 01:16:35 crc kubenswrapper[4744]: I0311 01:16:35.320028 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/08522fac-9099-4781-95f9-6f676a290be9-config\") pod \"08522fac-9099-4781-95f9-6f676a290be9\" (UID: \"08522fac-9099-4781-95f9-6f676a290be9\") " Mar 11 01:16:35 crc kubenswrapper[4744]: I0311 01:16:35.351023 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"02d576b4-3837-45a7-ae82-8790466a57e5","Type":"ContainerStarted","Data":"35504fdcdfb50e70bdbe7605a68768479239c5ac31339d8da26ce06db0109478"} Mar 11 01:16:35 crc kubenswrapper[4744]: I0311 01:16:35.351975 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Mar 11 01:16:35 crc kubenswrapper[4744]: I0311 01:16:35.352084 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/08522fac-9099-4781-95f9-6f676a290be9-kube-api-access-9pdjs" (OuterVolumeSpecName: "kube-api-access-9pdjs") pod "08522fac-9099-4781-95f9-6f676a290be9" (UID: "08522fac-9099-4781-95f9-6f676a290be9"). InnerVolumeSpecName "kube-api-access-9pdjs". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 01:16:35 crc kubenswrapper[4744]: I0311 01:16:35.362888 4744 generic.go:334] "Generic (PLEG): container finished" podID="08522fac-9099-4781-95f9-6f676a290be9" containerID="6e7e27e5b5fd33d5a3976f2ec72f0f701f3f6ebcb83f1f5aac4143095414dcea" exitCode=0 Mar 11 01:16:35 crc kubenswrapper[4744]: I0311 01:16:35.362927 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5b78c5c5d5-dzk6k" event={"ID":"08522fac-9099-4781-95f9-6f676a290be9","Type":"ContainerDied","Data":"6e7e27e5b5fd33d5a3976f2ec72f0f701f3f6ebcb83f1f5aac4143095414dcea"} Mar 11 01:16:35 crc kubenswrapper[4744]: I0311 01:16:35.362951 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5b78c5c5d5-dzk6k" event={"ID":"08522fac-9099-4781-95f9-6f676a290be9","Type":"ContainerDied","Data":"c02f08b8b3659d48c981b4b479107e9ab360e41029f7b0a91796f624ce4e4c60"} Mar 11 01:16:35 crc kubenswrapper[4744]: I0311 01:16:35.362966 4744 scope.go:117] "RemoveContainer" containerID="6e7e27e5b5fd33d5a3976f2ec72f0f701f3f6ebcb83f1f5aac4143095414dcea" Mar 11 01:16:35 crc kubenswrapper[4744]: I0311 01:16:35.363073 4744 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5b78c5c5d5-dzk6k" Mar 11 01:16:35 crc kubenswrapper[4744]: I0311 01:16:35.369767 4744 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=1.608330262 podStartE2EDuration="7.369752263s" podCreationTimestamp="2026-03-11 01:16:28 +0000 UTC" firstStartedPulling="2026-03-11 01:16:29.146269767 +0000 UTC m=+1345.950487372" lastFinishedPulling="2026-03-11 01:16:34.907691768 +0000 UTC m=+1351.711909373" observedRunningTime="2026-03-11 01:16:35.368232535 +0000 UTC m=+1352.172450140" watchObservedRunningTime="2026-03-11 01:16:35.369752263 +0000 UTC m=+1352.173969868" Mar 11 01:16:35 crc kubenswrapper[4744]: I0311 01:16:35.390790 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/08522fac-9099-4781-95f9-6f676a290be9-config" (OuterVolumeSpecName: "config") pod "08522fac-9099-4781-95f9-6f676a290be9" (UID: "08522fac-9099-4781-95f9-6f676a290be9"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 11 01:16:35 crc kubenswrapper[4744]: I0311 01:16:35.401438 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/08522fac-9099-4781-95f9-6f676a290be9-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "08522fac-9099-4781-95f9-6f676a290be9" (UID: "08522fac-9099-4781-95f9-6f676a290be9"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 11 01:16:35 crc kubenswrapper[4744]: I0311 01:16:35.414934 4744 scope.go:117] "RemoveContainer" containerID="0c55cc287fb3e5d57238f33c9294222297b91b2d03b3f2cd6d088c5778dd5875" Mar 11 01:16:35 crc kubenswrapper[4744]: I0311 01:16:35.423089 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/08522fac-9099-4781-95f9-6f676a290be9-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "08522fac-9099-4781-95f9-6f676a290be9" (UID: "08522fac-9099-4781-95f9-6f676a290be9"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 11 01:16:35 crc kubenswrapper[4744]: I0311 01:16:35.424288 4744 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/08522fac-9099-4781-95f9-6f676a290be9-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Mar 11 01:16:35 crc kubenswrapper[4744]: I0311 01:16:35.424306 4744 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/08522fac-9099-4781-95f9-6f676a290be9-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 11 01:16:35 crc kubenswrapper[4744]: I0311 01:16:35.424318 4744 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9pdjs\" (UniqueName: \"kubernetes.io/projected/08522fac-9099-4781-95f9-6f676a290be9-kube-api-access-9pdjs\") on node \"crc\" DevicePath \"\"" Mar 11 01:16:35 crc kubenswrapper[4744]: I0311 01:16:35.424329 4744 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/08522fac-9099-4781-95f9-6f676a290be9-config\") on node \"crc\" DevicePath \"\"" Mar 11 01:16:35 crc kubenswrapper[4744]: I0311 01:16:35.435935 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/08522fac-9099-4781-95f9-6f676a290be9-ovsdbserver-sb" (OuterVolumeSpecName: 
"ovsdbserver-sb") pod "08522fac-9099-4781-95f9-6f676a290be9" (UID: "08522fac-9099-4781-95f9-6f676a290be9"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 11 01:16:35 crc kubenswrapper[4744]: I0311 01:16:35.453439 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/08522fac-9099-4781-95f9-6f676a290be9-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "08522fac-9099-4781-95f9-6f676a290be9" (UID: "08522fac-9099-4781-95f9-6f676a290be9"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 11 01:16:35 crc kubenswrapper[4744]: I0311 01:16:35.487441 4744 scope.go:117] "RemoveContainer" containerID="6e7e27e5b5fd33d5a3976f2ec72f0f701f3f6ebcb83f1f5aac4143095414dcea" Mar 11 01:16:35 crc kubenswrapper[4744]: I0311 01:16:35.489861 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-86dc97b969-rxvv5"] Mar 11 01:16:35 crc kubenswrapper[4744]: E0311 01:16:35.491693 4744 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6e7e27e5b5fd33d5a3976f2ec72f0f701f3f6ebcb83f1f5aac4143095414dcea\": container with ID starting with 6e7e27e5b5fd33d5a3976f2ec72f0f701f3f6ebcb83f1f5aac4143095414dcea not found: ID does not exist" containerID="6e7e27e5b5fd33d5a3976f2ec72f0f701f3f6ebcb83f1f5aac4143095414dcea" Mar 11 01:16:35 crc kubenswrapper[4744]: I0311 01:16:35.491726 4744 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6e7e27e5b5fd33d5a3976f2ec72f0f701f3f6ebcb83f1f5aac4143095414dcea"} err="failed to get container status \"6e7e27e5b5fd33d5a3976f2ec72f0f701f3f6ebcb83f1f5aac4143095414dcea\": rpc error: code = NotFound desc = could not find container \"6e7e27e5b5fd33d5a3976f2ec72f0f701f3f6ebcb83f1f5aac4143095414dcea\": container with ID starting with 
6e7e27e5b5fd33d5a3976f2ec72f0f701f3f6ebcb83f1f5aac4143095414dcea not found: ID does not exist" Mar 11 01:16:35 crc kubenswrapper[4744]: I0311 01:16:35.491755 4744 scope.go:117] "RemoveContainer" containerID="0c55cc287fb3e5d57238f33c9294222297b91b2d03b3f2cd6d088c5778dd5875" Mar 11 01:16:35 crc kubenswrapper[4744]: E0311 01:16:35.494479 4744 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0c55cc287fb3e5d57238f33c9294222297b91b2d03b3f2cd6d088c5778dd5875\": container with ID starting with 0c55cc287fb3e5d57238f33c9294222297b91b2d03b3f2cd6d088c5778dd5875 not found: ID does not exist" containerID="0c55cc287fb3e5d57238f33c9294222297b91b2d03b3f2cd6d088c5778dd5875" Mar 11 01:16:35 crc kubenswrapper[4744]: I0311 01:16:35.494537 4744 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0c55cc287fb3e5d57238f33c9294222297b91b2d03b3f2cd6d088c5778dd5875"} err="failed to get container status \"0c55cc287fb3e5d57238f33c9294222297b91b2d03b3f2cd6d088c5778dd5875\": rpc error: code = NotFound desc = could not find container \"0c55cc287fb3e5d57238f33c9294222297b91b2d03b3f2cd6d088c5778dd5875\": container with ID starting with 0c55cc287fb3e5d57238f33c9294222297b91b2d03b3f2cd6d088c5778dd5875 not found: ID does not exist" Mar 11 01:16:35 crc kubenswrapper[4744]: W0311 01:16:35.496352 4744 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf0da3325_8792_44bf_8c25_ea1648998ce0.slice/crio-4d1d6320f4a5a49cef0e6a90e4d4eb21c6a7f97d68920e4d072b31cf62d39bef WatchSource:0}: Error finding container 4d1d6320f4a5a49cef0e6a90e4d4eb21c6a7f97d68920e4d072b31cf62d39bef: Status 404 returned error can't find the container with id 4d1d6320f4a5a49cef0e6a90e4d4eb21c6a7f97d68920e4d072b31cf62d39bef Mar 11 01:16:35 crc kubenswrapper[4744]: I0311 01:16:35.526071 4744 reconciler_common.go:293] "Volume detached for volume 
\"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/08522fac-9099-4781-95f9-6f676a290be9-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 11 01:16:35 crc kubenswrapper[4744]: I0311 01:16:35.526338 4744 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/08522fac-9099-4781-95f9-6f676a290be9-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 11 01:16:35 crc kubenswrapper[4744]: I0311 01:16:35.571770 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Mar 11 01:16:35 crc kubenswrapper[4744]: W0311 01:16:35.596173 4744 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod55bfa6db_7b61_4f7b_a860_197d4f32ba8b.slice/crio-93a6cd7513e17aef8eba0907d0bb348bfd4a0b90d4636accddc148e65cf4e89a WatchSource:0}: Error finding container 93a6cd7513e17aef8eba0907d0bb348bfd4a0b90d4636accddc148e65cf4e89a: Status 404 returned error can't find the container with id 93a6cd7513e17aef8eba0907d0bb348bfd4a0b90d4636accddc148e65cf4e89a Mar 11 01:16:35 crc kubenswrapper[4744]: I0311 01:16:35.692110 4744 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5b78c5c5d5-dzk6k"] Mar 11 01:16:35 crc kubenswrapper[4744]: I0311 01:16:35.700377 4744 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-5b78c5c5d5-dzk6k"] Mar 11 01:16:35 crc kubenswrapper[4744]: I0311 01:16:35.995097 4744 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="08522fac-9099-4781-95f9-6f676a290be9" path="/var/lib/kubelet/pods/08522fac-9099-4781-95f9-6f676a290be9/volumes" Mar 11 01:16:36 crc kubenswrapper[4744]: I0311 01:16:36.243036 4744 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"] Mar 11 01:16:36 crc kubenswrapper[4744]: I0311 01:16:36.371630 4744 generic.go:334] "Generic (PLEG): container finished" 
podID="f0da3325-8792-44bf-8c25-ea1648998ce0" containerID="1bcc4d3e8cc07e39ed1ba82d03612482772b57681aefd2a6680379ab70250b00" exitCode=0 Mar 11 01:16:36 crc kubenswrapper[4744]: I0311 01:16:36.371685 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-86dc97b969-rxvv5" event={"ID":"f0da3325-8792-44bf-8c25-ea1648998ce0","Type":"ContainerDied","Data":"1bcc4d3e8cc07e39ed1ba82d03612482772b57681aefd2a6680379ab70250b00"} Mar 11 01:16:36 crc kubenswrapper[4744]: I0311 01:16:36.371709 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-86dc97b969-rxvv5" event={"ID":"f0da3325-8792-44bf-8c25-ea1648998ce0","Type":"ContainerStarted","Data":"4d1d6320f4a5a49cef0e6a90e4d4eb21c6a7f97d68920e4d072b31cf62d39bef"} Mar 11 01:16:36 crc kubenswrapper[4744]: I0311 01:16:36.374111 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"fb474add-909e-4c17-814e-186f40b3faac","Type":"ContainerStarted","Data":"7167b902f87f5a45d4f9d66ea2d3782bf051df68266ae47dca07c32bf1e8f842"} Mar 11 01:16:36 crc kubenswrapper[4744]: I0311 01:16:36.374135 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"fb474add-909e-4c17-814e-186f40b3faac","Type":"ContainerStarted","Data":"de6c053276b16d518ab4d706b0439534cb771255f3ce4e473709047dc0eaaaa7"} Mar 11 01:16:36 crc kubenswrapper[4744]: I0311 01:16:36.379397 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"55bfa6db-7b61-4f7b-a860-197d4f32ba8b","Type":"ContainerStarted","Data":"93a6cd7513e17aef8eba0907d0bb348bfd4a0b90d4636accddc148e65cf4e89a"} Mar 11 01:16:36 crc kubenswrapper[4744]: I0311 01:16:36.549904 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-7bcd5444fb-98mrs" Mar 11 01:16:37 crc kubenswrapper[4744]: I0311 01:16:37.077305 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openstack/barbican-api-7c68976bb4-299gh" Mar 11 01:16:37 crc kubenswrapper[4744]: I0311 01:16:37.412037 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"fb474add-909e-4c17-814e-186f40b3faac","Type":"ContainerStarted","Data":"fac0d6881c7284067322ae59fec727eca555c45774256548a94d8b85e068c68b"} Mar 11 01:16:37 crc kubenswrapper[4744]: I0311 01:16:37.415316 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cinder-api-0" Mar 11 01:16:37 crc kubenswrapper[4744]: I0311 01:16:37.412272 4744 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-api-0" podUID="fb474add-909e-4c17-814e-186f40b3faac" containerName="cinder-api" containerID="cri-o://fac0d6881c7284067322ae59fec727eca555c45774256548a94d8b85e068c68b" gracePeriod=30 Mar 11 01:16:37 crc kubenswrapper[4744]: I0311 01:16:37.412068 4744 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-api-0" podUID="fb474add-909e-4c17-814e-186f40b3faac" containerName="cinder-api-log" containerID="cri-o://7167b902f87f5a45d4f9d66ea2d3782bf051df68266ae47dca07c32bf1e8f842" gracePeriod=30 Mar 11 01:16:37 crc kubenswrapper[4744]: I0311 01:16:37.422779 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"55bfa6db-7b61-4f7b-a860-197d4f32ba8b","Type":"ContainerStarted","Data":"4d56a62786b00c64a34dc6d390736c59715ec52588297c8759a2ebb5e25ed9a2"} Mar 11 01:16:37 crc kubenswrapper[4744]: I0311 01:16:37.426657 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-86dc97b969-rxvv5" event={"ID":"f0da3325-8792-44bf-8c25-ea1648998ce0","Type":"ContainerStarted","Data":"61f19b7521fd2006eb5a4290593fd4b6486d40a70ee7549d87d0824f9c940f4b"} Mar 11 01:16:37 crc kubenswrapper[4744]: I0311 01:16:37.426702 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-86dc97b969-rxvv5" Mar 11 
01:16:37 crc kubenswrapper[4744]: I0311 01:16:37.457344 4744 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-api-0" podStartSLOduration=4.457329582 podStartE2EDuration="4.457329582s" podCreationTimestamp="2026-03-11 01:16:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-11 01:16:37.433579449 +0000 UTC m=+1354.237797064" watchObservedRunningTime="2026-03-11 01:16:37.457329582 +0000 UTC m=+1354.261547177" Mar 11 01:16:37 crc kubenswrapper[4744]: I0311 01:16:37.465663 4744 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-86dc97b969-rxvv5" podStartSLOduration=4.465649179 podStartE2EDuration="4.465649179s" podCreationTimestamp="2026-03-11 01:16:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-11 01:16:37.456384623 +0000 UTC m=+1354.260602228" watchObservedRunningTime="2026-03-11 01:16:37.465649179 +0000 UTC m=+1354.269866774" Mar 11 01:16:38 crc kubenswrapper[4744]: I0311 01:16:38.123044 4744 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Mar 11 01:16:38 crc kubenswrapper[4744]: I0311 01:16:38.294763 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fb474add-909e-4c17-814e-186f40b3faac-combined-ca-bundle\") pod \"fb474add-909e-4c17-814e-186f40b3faac\" (UID: \"fb474add-909e-4c17-814e-186f40b3faac\") " Mar 11 01:16:38 crc kubenswrapper[4744]: I0311 01:16:38.294801 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/fb474add-909e-4c17-814e-186f40b3faac-config-data-custom\") pod \"fb474add-909e-4c17-814e-186f40b3faac\" (UID: \"fb474add-909e-4c17-814e-186f40b3faac\") " Mar 11 01:16:38 crc kubenswrapper[4744]: I0311 01:16:38.294899 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fb474add-909e-4c17-814e-186f40b3faac-config-data\") pod \"fb474add-909e-4c17-814e-186f40b3faac\" (UID: \"fb474add-909e-4c17-814e-186f40b3faac\") " Mar 11 01:16:38 crc kubenswrapper[4744]: I0311 01:16:38.294968 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/fb474add-909e-4c17-814e-186f40b3faac-etc-machine-id\") pod \"fb474add-909e-4c17-814e-186f40b3faac\" (UID: \"fb474add-909e-4c17-814e-186f40b3faac\") " Mar 11 01:16:38 crc kubenswrapper[4744]: I0311 01:16:38.294992 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d5l54\" (UniqueName: \"kubernetes.io/projected/fb474add-909e-4c17-814e-186f40b3faac-kube-api-access-d5l54\") pod \"fb474add-909e-4c17-814e-186f40b3faac\" (UID: \"fb474add-909e-4c17-814e-186f40b3faac\") " Mar 11 01:16:38 crc kubenswrapper[4744]: I0311 01:16:38.295034 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" 
(UniqueName: \"kubernetes.io/secret/fb474add-909e-4c17-814e-186f40b3faac-scripts\") pod \"fb474add-909e-4c17-814e-186f40b3faac\" (UID: \"fb474add-909e-4c17-814e-186f40b3faac\") " Mar 11 01:16:38 crc kubenswrapper[4744]: I0311 01:16:38.295064 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/fb474add-909e-4c17-814e-186f40b3faac-logs\") pod \"fb474add-909e-4c17-814e-186f40b3faac\" (UID: \"fb474add-909e-4c17-814e-186f40b3faac\") " Mar 11 01:16:38 crc kubenswrapper[4744]: I0311 01:16:38.295101 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/fb474add-909e-4c17-814e-186f40b3faac-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "fb474add-909e-4c17-814e-186f40b3faac" (UID: "fb474add-909e-4c17-814e-186f40b3faac"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 11 01:16:38 crc kubenswrapper[4744]: I0311 01:16:38.295383 4744 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/fb474add-909e-4c17-814e-186f40b3faac-etc-machine-id\") on node \"crc\" DevicePath \"\"" Mar 11 01:16:38 crc kubenswrapper[4744]: I0311 01:16:38.295460 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fb474add-909e-4c17-814e-186f40b3faac-logs" (OuterVolumeSpecName: "logs") pod "fb474add-909e-4c17-814e-186f40b3faac" (UID: "fb474add-909e-4c17-814e-186f40b3faac"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 11 01:16:38 crc kubenswrapper[4744]: I0311 01:16:38.306656 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fb474add-909e-4c17-814e-186f40b3faac-kube-api-access-d5l54" (OuterVolumeSpecName: "kube-api-access-d5l54") pod "fb474add-909e-4c17-814e-186f40b3faac" (UID: "fb474add-909e-4c17-814e-186f40b3faac"). 
InnerVolumeSpecName "kube-api-access-d5l54". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 01:16:38 crc kubenswrapper[4744]: I0311 01:16:38.306656 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fb474add-909e-4c17-814e-186f40b3faac-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "fb474add-909e-4c17-814e-186f40b3faac" (UID: "fb474add-909e-4c17-814e-186f40b3faac"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 01:16:38 crc kubenswrapper[4744]: I0311 01:16:38.306969 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fb474add-909e-4c17-814e-186f40b3faac-scripts" (OuterVolumeSpecName: "scripts") pod "fb474add-909e-4c17-814e-186f40b3faac" (UID: "fb474add-909e-4c17-814e-186f40b3faac"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 01:16:38 crc kubenswrapper[4744]: I0311 01:16:38.345420 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fb474add-909e-4c17-814e-186f40b3faac-config-data" (OuterVolumeSpecName: "config-data") pod "fb474add-909e-4c17-814e-186f40b3faac" (UID: "fb474add-909e-4c17-814e-186f40b3faac"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 01:16:38 crc kubenswrapper[4744]: I0311 01:16:38.345531 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fb474add-909e-4c17-814e-186f40b3faac-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "fb474add-909e-4c17-814e-186f40b3faac" (UID: "fb474add-909e-4c17-814e-186f40b3faac"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 01:16:38 crc kubenswrapper[4744]: I0311 01:16:38.397048 4744 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d5l54\" (UniqueName: \"kubernetes.io/projected/fb474add-909e-4c17-814e-186f40b3faac-kube-api-access-d5l54\") on node \"crc\" DevicePath \"\"" Mar 11 01:16:38 crc kubenswrapper[4744]: I0311 01:16:38.397083 4744 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fb474add-909e-4c17-814e-186f40b3faac-scripts\") on node \"crc\" DevicePath \"\"" Mar 11 01:16:38 crc kubenswrapper[4744]: I0311 01:16:38.397099 4744 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/fb474add-909e-4c17-814e-186f40b3faac-logs\") on node \"crc\" DevicePath \"\"" Mar 11 01:16:38 crc kubenswrapper[4744]: I0311 01:16:38.397113 4744 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fb474add-909e-4c17-814e-186f40b3faac-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 11 01:16:38 crc kubenswrapper[4744]: I0311 01:16:38.397125 4744 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/fb474add-909e-4c17-814e-186f40b3faac-config-data-custom\") on node \"crc\" DevicePath \"\"" Mar 11 01:16:38 crc kubenswrapper[4744]: I0311 01:16:38.397136 4744 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fb474add-909e-4c17-814e-186f40b3faac-config-data\") on node \"crc\" DevicePath \"\"" Mar 11 01:16:38 crc kubenswrapper[4744]: I0311 01:16:38.435538 4744 generic.go:334] "Generic (PLEG): container finished" podID="fb474add-909e-4c17-814e-186f40b3faac" containerID="fac0d6881c7284067322ae59fec727eca555c45774256548a94d8b85e068c68b" exitCode=0 Mar 11 01:16:38 crc kubenswrapper[4744]: I0311 01:16:38.435564 4744 generic.go:334] "Generic (PLEG): 
container finished" podID="fb474add-909e-4c17-814e-186f40b3faac" containerID="7167b902f87f5a45d4f9d66ea2d3782bf051df68266ae47dca07c32bf1e8f842" exitCode=143 Mar 11 01:16:38 crc kubenswrapper[4744]: I0311 01:16:38.435586 4744 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Mar 11 01:16:38 crc kubenswrapper[4744]: I0311 01:16:38.435615 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"fb474add-909e-4c17-814e-186f40b3faac","Type":"ContainerDied","Data":"fac0d6881c7284067322ae59fec727eca555c45774256548a94d8b85e068c68b"} Mar 11 01:16:38 crc kubenswrapper[4744]: I0311 01:16:38.435650 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"fb474add-909e-4c17-814e-186f40b3faac","Type":"ContainerDied","Data":"7167b902f87f5a45d4f9d66ea2d3782bf051df68266ae47dca07c32bf1e8f842"} Mar 11 01:16:38 crc kubenswrapper[4744]: I0311 01:16:38.435660 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"fb474add-909e-4c17-814e-186f40b3faac","Type":"ContainerDied","Data":"de6c053276b16d518ab4d706b0439534cb771255f3ce4e473709047dc0eaaaa7"} Mar 11 01:16:38 crc kubenswrapper[4744]: I0311 01:16:38.435675 4744 scope.go:117] "RemoveContainer" containerID="fac0d6881c7284067322ae59fec727eca555c45774256548a94d8b85e068c68b" Mar 11 01:16:38 crc kubenswrapper[4744]: I0311 01:16:38.438194 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"55bfa6db-7b61-4f7b-a860-197d4f32ba8b","Type":"ContainerStarted","Data":"6f4a242e298f9b8510414796fdb1e3a9bb73ced97fb7699bc14f64649589cf10"} Mar 11 01:16:38 crc kubenswrapper[4744]: I0311 01:16:38.467825 4744 scope.go:117] "RemoveContainer" containerID="7167b902f87f5a45d4f9d66ea2d3782bf051df68266ae47dca07c32bf1e8f842" Mar 11 01:16:38 crc kubenswrapper[4744]: I0311 01:16:38.468552 4744 pod_startup_latency_tracker.go:104] "Observed pod 
startup duration" pod="openstack/cinder-scheduler-0" podStartSLOduration=4.606617398 podStartE2EDuration="5.468535525s" podCreationTimestamp="2026-03-11 01:16:33 +0000 UTC" firstStartedPulling="2026-03-11 01:16:35.59867607 +0000 UTC m=+1352.402893675" lastFinishedPulling="2026-03-11 01:16:36.460594197 +0000 UTC m=+1353.264811802" observedRunningTime="2026-03-11 01:16:38.466195273 +0000 UTC m=+1355.270412878" watchObservedRunningTime="2026-03-11 01:16:38.468535525 +0000 UTC m=+1355.272753130" Mar 11 01:16:38 crc kubenswrapper[4744]: I0311 01:16:38.510567 4744 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"] Mar 11 01:16:38 crc kubenswrapper[4744]: I0311 01:16:38.518151 4744 scope.go:117] "RemoveContainer" containerID="fac0d6881c7284067322ae59fec727eca555c45774256548a94d8b85e068c68b" Mar 11 01:16:38 crc kubenswrapper[4744]: E0311 01:16:38.519606 4744 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fac0d6881c7284067322ae59fec727eca555c45774256548a94d8b85e068c68b\": container with ID starting with fac0d6881c7284067322ae59fec727eca555c45774256548a94d8b85e068c68b not found: ID does not exist" containerID="fac0d6881c7284067322ae59fec727eca555c45774256548a94d8b85e068c68b" Mar 11 01:16:38 crc kubenswrapper[4744]: I0311 01:16:38.519727 4744 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fac0d6881c7284067322ae59fec727eca555c45774256548a94d8b85e068c68b"} err="failed to get container status \"fac0d6881c7284067322ae59fec727eca555c45774256548a94d8b85e068c68b\": rpc error: code = NotFound desc = could not find container \"fac0d6881c7284067322ae59fec727eca555c45774256548a94d8b85e068c68b\": container with ID starting with fac0d6881c7284067322ae59fec727eca555c45774256548a94d8b85e068c68b not found: ID does not exist" Mar 11 01:16:38 crc kubenswrapper[4744]: I0311 01:16:38.519810 4744 scope.go:117] "RemoveContainer" 
containerID="7167b902f87f5a45d4f9d66ea2d3782bf051df68266ae47dca07c32bf1e8f842" Mar 11 01:16:38 crc kubenswrapper[4744]: I0311 01:16:38.520025 4744 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-api-0"] Mar 11 01:16:38 crc kubenswrapper[4744]: E0311 01:16:38.522622 4744 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7167b902f87f5a45d4f9d66ea2d3782bf051df68266ae47dca07c32bf1e8f842\": container with ID starting with 7167b902f87f5a45d4f9d66ea2d3782bf051df68266ae47dca07c32bf1e8f842 not found: ID does not exist" containerID="7167b902f87f5a45d4f9d66ea2d3782bf051df68266ae47dca07c32bf1e8f842" Mar 11 01:16:38 crc kubenswrapper[4744]: I0311 01:16:38.522807 4744 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7167b902f87f5a45d4f9d66ea2d3782bf051df68266ae47dca07c32bf1e8f842"} err="failed to get container status \"7167b902f87f5a45d4f9d66ea2d3782bf051df68266ae47dca07c32bf1e8f842\": rpc error: code = NotFound desc = could not find container \"7167b902f87f5a45d4f9d66ea2d3782bf051df68266ae47dca07c32bf1e8f842\": container with ID starting with 7167b902f87f5a45d4f9d66ea2d3782bf051df68266ae47dca07c32bf1e8f842 not found: ID does not exist" Mar 11 01:16:38 crc kubenswrapper[4744]: I0311 01:16:38.522898 4744 scope.go:117] "RemoveContainer" containerID="fac0d6881c7284067322ae59fec727eca555c45774256548a94d8b85e068c68b" Mar 11 01:16:38 crc kubenswrapper[4744]: I0311 01:16:38.526607 4744 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fac0d6881c7284067322ae59fec727eca555c45774256548a94d8b85e068c68b"} err="failed to get container status \"fac0d6881c7284067322ae59fec727eca555c45774256548a94d8b85e068c68b\": rpc error: code = NotFound desc = could not find container \"fac0d6881c7284067322ae59fec727eca555c45774256548a94d8b85e068c68b\": container with ID starting with 
fac0d6881c7284067322ae59fec727eca555c45774256548a94d8b85e068c68b not found: ID does not exist" Mar 11 01:16:38 crc kubenswrapper[4744]: I0311 01:16:38.526786 4744 scope.go:117] "RemoveContainer" containerID="7167b902f87f5a45d4f9d66ea2d3782bf051df68266ae47dca07c32bf1e8f842" Mar 11 01:16:38 crc kubenswrapper[4744]: I0311 01:16:38.528757 4744 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7167b902f87f5a45d4f9d66ea2d3782bf051df68266ae47dca07c32bf1e8f842"} err="failed to get container status \"7167b902f87f5a45d4f9d66ea2d3782bf051df68266ae47dca07c32bf1e8f842\": rpc error: code = NotFound desc = could not find container \"7167b902f87f5a45d4f9d66ea2d3782bf051df68266ae47dca07c32bf1e8f842\": container with ID starting with 7167b902f87f5a45d4f9d66ea2d3782bf051df68266ae47dca07c32bf1e8f842 not found: ID does not exist" Mar 11 01:16:38 crc kubenswrapper[4744]: I0311 01:16:38.530874 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-api-0"] Mar 11 01:16:38 crc kubenswrapper[4744]: E0311 01:16:38.551718 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="08522fac-9099-4781-95f9-6f676a290be9" containerName="dnsmasq-dns" Mar 11 01:16:38 crc kubenswrapper[4744]: I0311 01:16:38.551750 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="08522fac-9099-4781-95f9-6f676a290be9" containerName="dnsmasq-dns" Mar 11 01:16:38 crc kubenswrapper[4744]: E0311 01:16:38.551787 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="08522fac-9099-4781-95f9-6f676a290be9" containerName="init" Mar 11 01:16:38 crc kubenswrapper[4744]: I0311 01:16:38.551796 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="08522fac-9099-4781-95f9-6f676a290be9" containerName="init" Mar 11 01:16:38 crc kubenswrapper[4744]: E0311 01:16:38.551806 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fb474add-909e-4c17-814e-186f40b3faac" containerName="cinder-api" Mar 11 01:16:38 crc 
kubenswrapper[4744]: I0311 01:16:38.551814 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="fb474add-909e-4c17-814e-186f40b3faac" containerName="cinder-api" Mar 11 01:16:38 crc kubenswrapper[4744]: E0311 01:16:38.551843 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fb474add-909e-4c17-814e-186f40b3faac" containerName="cinder-api-log" Mar 11 01:16:38 crc kubenswrapper[4744]: I0311 01:16:38.551850 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="fb474add-909e-4c17-814e-186f40b3faac" containerName="cinder-api-log" Mar 11 01:16:38 crc kubenswrapper[4744]: I0311 01:16:38.552098 4744 memory_manager.go:354] "RemoveStaleState removing state" podUID="08522fac-9099-4781-95f9-6f676a290be9" containerName="dnsmasq-dns" Mar 11 01:16:38 crc kubenswrapper[4744]: I0311 01:16:38.552117 4744 memory_manager.go:354] "RemoveStaleState removing state" podUID="fb474add-909e-4c17-814e-186f40b3faac" containerName="cinder-api" Mar 11 01:16:38 crc kubenswrapper[4744]: I0311 01:16:38.552131 4744 memory_manager.go:354] "RemoveStaleState removing state" podUID="fb474add-909e-4c17-814e-186f40b3faac" containerName="cinder-api-log" Mar 11 01:16:38 crc kubenswrapper[4744]: I0311 01:16:38.553121 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Mar 11 01:16:38 crc kubenswrapper[4744]: I0311 01:16:38.553205 4744 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Mar 11 01:16:38 crc kubenswrapper[4744]: I0311 01:16:38.557650 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-api-config-data" Mar 11 01:16:38 crc kubenswrapper[4744]: I0311 01:16:38.557914 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-cinder-public-svc" Mar 11 01:16:38 crc kubenswrapper[4744]: I0311 01:16:38.560208 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-cinder-internal-svc" Mar 11 01:16:38 crc kubenswrapper[4744]: I0311 01:16:38.702145 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f11c8953-d88f-4d37-8366-b0b61606fa8a-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"f11c8953-d88f-4d37-8366-b0b61606fa8a\") " pod="openstack/cinder-api-0" Mar 11 01:16:38 crc kubenswrapper[4744]: I0311 01:16:38.702393 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f11c8953-d88f-4d37-8366-b0b61606fa8a-scripts\") pod \"cinder-api-0\" (UID: \"f11c8953-d88f-4d37-8366-b0b61606fa8a\") " pod="openstack/cinder-api-0" Mar 11 01:16:38 crc kubenswrapper[4744]: I0311 01:16:38.702501 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/f11c8953-d88f-4d37-8366-b0b61606fa8a-public-tls-certs\") pod \"cinder-api-0\" (UID: \"f11c8953-d88f-4d37-8366-b0b61606fa8a\") " pod="openstack/cinder-api-0" Mar 11 01:16:38 crc kubenswrapper[4744]: I0311 01:16:38.702604 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/f11c8953-d88f-4d37-8366-b0b61606fa8a-config-data-custom\") pod \"cinder-api-0\" (UID: 
\"f11c8953-d88f-4d37-8366-b0b61606fa8a\") " pod="openstack/cinder-api-0" Mar 11 01:16:38 crc kubenswrapper[4744]: I0311 01:16:38.702678 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f11c8953-d88f-4d37-8366-b0b61606fa8a-logs\") pod \"cinder-api-0\" (UID: \"f11c8953-d88f-4d37-8366-b0b61606fa8a\") " pod="openstack/cinder-api-0" Mar 11 01:16:38 crc kubenswrapper[4744]: I0311 01:16:38.702805 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lwpz8\" (UniqueName: \"kubernetes.io/projected/f11c8953-d88f-4d37-8366-b0b61606fa8a-kube-api-access-lwpz8\") pod \"cinder-api-0\" (UID: \"f11c8953-d88f-4d37-8366-b0b61606fa8a\") " pod="openstack/cinder-api-0" Mar 11 01:16:38 crc kubenswrapper[4744]: I0311 01:16:38.702965 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/f11c8953-d88f-4d37-8366-b0b61606fa8a-etc-machine-id\") pod \"cinder-api-0\" (UID: \"f11c8953-d88f-4d37-8366-b0b61606fa8a\") " pod="openstack/cinder-api-0" Mar 11 01:16:38 crc kubenswrapper[4744]: I0311 01:16:38.703128 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/f11c8953-d88f-4d37-8366-b0b61606fa8a-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"f11c8953-d88f-4d37-8366-b0b61606fa8a\") " pod="openstack/cinder-api-0" Mar 11 01:16:38 crc kubenswrapper[4744]: I0311 01:16:38.703173 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f11c8953-d88f-4d37-8366-b0b61606fa8a-config-data\") pod \"cinder-api-0\" (UID: \"f11c8953-d88f-4d37-8366-b0b61606fa8a\") " pod="openstack/cinder-api-0" Mar 11 01:16:38 crc kubenswrapper[4744]: I0311 
01:16:38.804265 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f11c8953-d88f-4d37-8366-b0b61606fa8a-config-data\") pod \"cinder-api-0\" (UID: \"f11c8953-d88f-4d37-8366-b0b61606fa8a\") " pod="openstack/cinder-api-0" Mar 11 01:16:38 crc kubenswrapper[4744]: I0311 01:16:38.804360 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f11c8953-d88f-4d37-8366-b0b61606fa8a-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"f11c8953-d88f-4d37-8366-b0b61606fa8a\") " pod="openstack/cinder-api-0" Mar 11 01:16:38 crc kubenswrapper[4744]: I0311 01:16:38.804388 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f11c8953-d88f-4d37-8366-b0b61606fa8a-scripts\") pod \"cinder-api-0\" (UID: \"f11c8953-d88f-4d37-8366-b0b61606fa8a\") " pod="openstack/cinder-api-0" Mar 11 01:16:38 crc kubenswrapper[4744]: I0311 01:16:38.804429 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/f11c8953-d88f-4d37-8366-b0b61606fa8a-public-tls-certs\") pod \"cinder-api-0\" (UID: \"f11c8953-d88f-4d37-8366-b0b61606fa8a\") " pod="openstack/cinder-api-0" Mar 11 01:16:38 crc kubenswrapper[4744]: I0311 01:16:38.804452 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/f11c8953-d88f-4d37-8366-b0b61606fa8a-config-data-custom\") pod \"cinder-api-0\" (UID: \"f11c8953-d88f-4d37-8366-b0b61606fa8a\") " pod="openstack/cinder-api-0" Mar 11 01:16:38 crc kubenswrapper[4744]: I0311 01:16:38.804485 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f11c8953-d88f-4d37-8366-b0b61606fa8a-logs\") pod \"cinder-api-0\" (UID: 
\"f11c8953-d88f-4d37-8366-b0b61606fa8a\") " pod="openstack/cinder-api-0" Mar 11 01:16:38 crc kubenswrapper[4744]: I0311 01:16:38.804562 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lwpz8\" (UniqueName: \"kubernetes.io/projected/f11c8953-d88f-4d37-8366-b0b61606fa8a-kube-api-access-lwpz8\") pod \"cinder-api-0\" (UID: \"f11c8953-d88f-4d37-8366-b0b61606fa8a\") " pod="openstack/cinder-api-0" Mar 11 01:16:38 crc kubenswrapper[4744]: I0311 01:16:38.804602 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/f11c8953-d88f-4d37-8366-b0b61606fa8a-etc-machine-id\") pod \"cinder-api-0\" (UID: \"f11c8953-d88f-4d37-8366-b0b61606fa8a\") " pod="openstack/cinder-api-0" Mar 11 01:16:38 crc kubenswrapper[4744]: I0311 01:16:38.804660 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/f11c8953-d88f-4d37-8366-b0b61606fa8a-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"f11c8953-d88f-4d37-8366-b0b61606fa8a\") " pod="openstack/cinder-api-0" Mar 11 01:16:38 crc kubenswrapper[4744]: I0311 01:16:38.805904 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f11c8953-d88f-4d37-8366-b0b61606fa8a-logs\") pod \"cinder-api-0\" (UID: \"f11c8953-d88f-4d37-8366-b0b61606fa8a\") " pod="openstack/cinder-api-0" Mar 11 01:16:38 crc kubenswrapper[4744]: I0311 01:16:38.806474 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/f11c8953-d88f-4d37-8366-b0b61606fa8a-etc-machine-id\") pod \"cinder-api-0\" (UID: \"f11c8953-d88f-4d37-8366-b0b61606fa8a\") " pod="openstack/cinder-api-0" Mar 11 01:16:38 crc kubenswrapper[4744]: I0311 01:16:38.812228 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" 
(UniqueName: \"kubernetes.io/secret/f11c8953-d88f-4d37-8366-b0b61606fa8a-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"f11c8953-d88f-4d37-8366-b0b61606fa8a\") " pod="openstack/cinder-api-0" Mar 11 01:16:38 crc kubenswrapper[4744]: I0311 01:16:38.817498 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f11c8953-d88f-4d37-8366-b0b61606fa8a-scripts\") pod \"cinder-api-0\" (UID: \"f11c8953-d88f-4d37-8366-b0b61606fa8a\") " pod="openstack/cinder-api-0" Mar 11 01:16:38 crc kubenswrapper[4744]: I0311 01:16:38.818055 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/f11c8953-d88f-4d37-8366-b0b61606fa8a-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"f11c8953-d88f-4d37-8366-b0b61606fa8a\") " pod="openstack/cinder-api-0" Mar 11 01:16:38 crc kubenswrapper[4744]: I0311 01:16:38.819492 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f11c8953-d88f-4d37-8366-b0b61606fa8a-config-data\") pod \"cinder-api-0\" (UID: \"f11c8953-d88f-4d37-8366-b0b61606fa8a\") " pod="openstack/cinder-api-0" Mar 11 01:16:38 crc kubenswrapper[4744]: I0311 01:16:38.822262 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/f11c8953-d88f-4d37-8366-b0b61606fa8a-config-data-custom\") pod \"cinder-api-0\" (UID: \"f11c8953-d88f-4d37-8366-b0b61606fa8a\") " pod="openstack/cinder-api-0" Mar 11 01:16:38 crc kubenswrapper[4744]: I0311 01:16:38.826333 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/f11c8953-d88f-4d37-8366-b0b61606fa8a-public-tls-certs\") pod \"cinder-api-0\" (UID: \"f11c8953-d88f-4d37-8366-b0b61606fa8a\") " pod="openstack/cinder-api-0" Mar 11 01:16:38 crc kubenswrapper[4744]: I0311 01:16:38.827911 4744 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lwpz8\" (UniqueName: \"kubernetes.io/projected/f11c8953-d88f-4d37-8366-b0b61606fa8a-kube-api-access-lwpz8\") pod \"cinder-api-0\" (UID: \"f11c8953-d88f-4d37-8366-b0b61606fa8a\") " pod="openstack/cinder-api-0" Mar 11 01:16:38 crc kubenswrapper[4744]: I0311 01:16:38.879620 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Mar 11 01:16:38 crc kubenswrapper[4744]: I0311 01:16:38.886743 4744 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-scheduler-0" Mar 11 01:16:39 crc kubenswrapper[4744]: I0311 01:16:39.009392 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-7c68976bb4-299gh" Mar 11 01:16:39 crc kubenswrapper[4744]: I0311 01:16:39.084952 4744 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-api-7bcd5444fb-98mrs"] Mar 11 01:16:39 crc kubenswrapper[4744]: I0311 01:16:39.098006 4744 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-api-7bcd5444fb-98mrs" podUID="b6e79975-841a-4d71-9906-e53607f1b3fb" containerName="barbican-api" containerID="cri-o://d1efc38f5e4775b9528fc73d86053b2b60c8851815b28fd27da85a849a8f9358" gracePeriod=30 Mar 11 01:16:39 crc kubenswrapper[4744]: I0311 01:16:39.097845 4744 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-api-7bcd5444fb-98mrs" podUID="b6e79975-841a-4d71-9906-e53607f1b3fb" containerName="barbican-api-log" containerID="cri-o://d61e4700d7dd46bf54cb41f484b010f1b1b279cc34d80725caa2e387219bbe8b" gracePeriod=30 Mar 11 01:16:39 crc kubenswrapper[4744]: I0311 01:16:39.375630 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Mar 11 01:16:39 crc kubenswrapper[4744]: I0311 01:16:39.430801 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openstack/neutron-5568fddbb8-2fn4w" Mar 11 01:16:39 crc kubenswrapper[4744]: I0311 01:16:39.470906 4744 generic.go:334] "Generic (PLEG): container finished" podID="b6e79975-841a-4d71-9906-e53607f1b3fb" containerID="d61e4700d7dd46bf54cb41f484b010f1b1b279cc34d80725caa2e387219bbe8b" exitCode=143 Mar 11 01:16:39 crc kubenswrapper[4744]: I0311 01:16:39.470981 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-7bcd5444fb-98mrs" event={"ID":"b6e79975-841a-4d71-9906-e53607f1b3fb","Type":"ContainerDied","Data":"d61e4700d7dd46bf54cb41f484b010f1b1b279cc34d80725caa2e387219bbe8b"} Mar 11 01:16:39 crc kubenswrapper[4744]: I0311 01:16:39.474409 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"f11c8953-d88f-4d37-8366-b0b61606fa8a","Type":"ContainerStarted","Data":"9558ffd0f1554b64add098d174b47d8e3e27443e41618f4f6bfcf13fbb94142b"} Mar 11 01:16:39 crc kubenswrapper[4744]: I0311 01:16:39.694181 4744 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-77cdc8f7b7-2kx6j"] Mar 11 01:16:39 crc kubenswrapper[4744]: I0311 01:16:39.694422 4744 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-77cdc8f7b7-2kx6j" podUID="e5f6b131-1501-45aa-8ebf-76f7d454baad" containerName="neutron-api" containerID="cri-o://a2626049f1cb7982d192118248ec35ce41865252920a41c7b1b441c5781153f9" gracePeriod=30 Mar 11 01:16:39 crc kubenswrapper[4744]: I0311 01:16:39.696182 4744 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-77cdc8f7b7-2kx6j" podUID="e5f6b131-1501-45aa-8ebf-76f7d454baad" containerName="neutron-httpd" containerID="cri-o://5334318444a1f510d9eb0205e5cb310212f8727d01dd9f4d646fb2022768d411" gracePeriod=30 Mar 11 01:16:39 crc kubenswrapper[4744]: I0311 01:16:39.714538 4744 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/neutron-77cdc8f7b7-2kx6j" podUID="e5f6b131-1501-45aa-8ebf-76f7d454baad" 
containerName="neutron-httpd" probeResult="failure" output="Get \"https://10.217.0.159:9696/\": read tcp 10.217.0.2:39910->10.217.0.159:9696: read: connection reset by peer" Mar 11 01:16:39 crc kubenswrapper[4744]: I0311 01:16:39.745701 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-5959cf6645-bcjjf"] Mar 11 01:16:39 crc kubenswrapper[4744]: I0311 01:16:39.758495 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-5959cf6645-bcjjf" Mar 11 01:16:39 crc kubenswrapper[4744]: I0311 01:16:39.771323 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-5959cf6645-bcjjf"] Mar 11 01:16:39 crc kubenswrapper[4744]: I0311 01:16:39.935069 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/cb4eb051-94b3-42d1-87ff-669ad8251b4f-httpd-config\") pod \"neutron-5959cf6645-bcjjf\" (UID: \"cb4eb051-94b3-42d1-87ff-669ad8251b4f\") " pod="openstack/neutron-5959cf6645-bcjjf" Mar 11 01:16:39 crc kubenswrapper[4744]: I0311 01:16:39.935123 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/cb4eb051-94b3-42d1-87ff-669ad8251b4f-ovndb-tls-certs\") pod \"neutron-5959cf6645-bcjjf\" (UID: \"cb4eb051-94b3-42d1-87ff-669ad8251b4f\") " pod="openstack/neutron-5959cf6645-bcjjf" Mar 11 01:16:39 crc kubenswrapper[4744]: I0311 01:16:39.935195 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4tjbr\" (UniqueName: \"kubernetes.io/projected/cb4eb051-94b3-42d1-87ff-669ad8251b4f-kube-api-access-4tjbr\") pod \"neutron-5959cf6645-bcjjf\" (UID: \"cb4eb051-94b3-42d1-87ff-669ad8251b4f\") " pod="openstack/neutron-5959cf6645-bcjjf" Mar 11 01:16:39 crc kubenswrapper[4744]: I0311 01:16:39.935223 4744 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/cb4eb051-94b3-42d1-87ff-669ad8251b4f-public-tls-certs\") pod \"neutron-5959cf6645-bcjjf\" (UID: \"cb4eb051-94b3-42d1-87ff-669ad8251b4f\") " pod="openstack/neutron-5959cf6645-bcjjf" Mar 11 01:16:39 crc kubenswrapper[4744]: I0311 01:16:39.935255 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/cb4eb051-94b3-42d1-87ff-669ad8251b4f-internal-tls-certs\") pod \"neutron-5959cf6645-bcjjf\" (UID: \"cb4eb051-94b3-42d1-87ff-669ad8251b4f\") " pod="openstack/neutron-5959cf6645-bcjjf" Mar 11 01:16:39 crc kubenswrapper[4744]: I0311 01:16:39.935403 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/cb4eb051-94b3-42d1-87ff-669ad8251b4f-config\") pod \"neutron-5959cf6645-bcjjf\" (UID: \"cb4eb051-94b3-42d1-87ff-669ad8251b4f\") " pod="openstack/neutron-5959cf6645-bcjjf" Mar 11 01:16:39 crc kubenswrapper[4744]: I0311 01:16:39.935482 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cb4eb051-94b3-42d1-87ff-669ad8251b4f-combined-ca-bundle\") pod \"neutron-5959cf6645-bcjjf\" (UID: \"cb4eb051-94b3-42d1-87ff-669ad8251b4f\") " pod="openstack/neutron-5959cf6645-bcjjf" Mar 11 01:16:39 crc kubenswrapper[4744]: I0311 01:16:39.991649 4744 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fb474add-909e-4c17-814e-186f40b3faac" path="/var/lib/kubelet/pods/fb474add-909e-4c17-814e-186f40b3faac/volumes" Mar 11 01:16:40 crc kubenswrapper[4744]: I0311 01:16:40.037201 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4tjbr\" (UniqueName: 
\"kubernetes.io/projected/cb4eb051-94b3-42d1-87ff-669ad8251b4f-kube-api-access-4tjbr\") pod \"neutron-5959cf6645-bcjjf\" (UID: \"cb4eb051-94b3-42d1-87ff-669ad8251b4f\") " pod="openstack/neutron-5959cf6645-bcjjf" Mar 11 01:16:40 crc kubenswrapper[4744]: I0311 01:16:40.037500 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/cb4eb051-94b3-42d1-87ff-669ad8251b4f-public-tls-certs\") pod \"neutron-5959cf6645-bcjjf\" (UID: \"cb4eb051-94b3-42d1-87ff-669ad8251b4f\") " pod="openstack/neutron-5959cf6645-bcjjf" Mar 11 01:16:40 crc kubenswrapper[4744]: I0311 01:16:40.037723 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/cb4eb051-94b3-42d1-87ff-669ad8251b4f-internal-tls-certs\") pod \"neutron-5959cf6645-bcjjf\" (UID: \"cb4eb051-94b3-42d1-87ff-669ad8251b4f\") " pod="openstack/neutron-5959cf6645-bcjjf" Mar 11 01:16:40 crc kubenswrapper[4744]: I0311 01:16:40.037912 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/cb4eb051-94b3-42d1-87ff-669ad8251b4f-config\") pod \"neutron-5959cf6645-bcjjf\" (UID: \"cb4eb051-94b3-42d1-87ff-669ad8251b4f\") " pod="openstack/neutron-5959cf6645-bcjjf" Mar 11 01:16:40 crc kubenswrapper[4744]: I0311 01:16:40.038115 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cb4eb051-94b3-42d1-87ff-669ad8251b4f-combined-ca-bundle\") pod \"neutron-5959cf6645-bcjjf\" (UID: \"cb4eb051-94b3-42d1-87ff-669ad8251b4f\") " pod="openstack/neutron-5959cf6645-bcjjf" Mar 11 01:16:40 crc kubenswrapper[4744]: I0311 01:16:40.038175 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/cb4eb051-94b3-42d1-87ff-669ad8251b4f-httpd-config\") pod 
\"neutron-5959cf6645-bcjjf\" (UID: \"cb4eb051-94b3-42d1-87ff-669ad8251b4f\") " pod="openstack/neutron-5959cf6645-bcjjf" Mar 11 01:16:40 crc kubenswrapper[4744]: I0311 01:16:40.038201 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/cb4eb051-94b3-42d1-87ff-669ad8251b4f-ovndb-tls-certs\") pod \"neutron-5959cf6645-bcjjf\" (UID: \"cb4eb051-94b3-42d1-87ff-669ad8251b4f\") " pod="openstack/neutron-5959cf6645-bcjjf" Mar 11 01:16:40 crc kubenswrapper[4744]: I0311 01:16:40.044231 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/cb4eb051-94b3-42d1-87ff-669ad8251b4f-httpd-config\") pod \"neutron-5959cf6645-bcjjf\" (UID: \"cb4eb051-94b3-42d1-87ff-669ad8251b4f\") " pod="openstack/neutron-5959cf6645-bcjjf" Mar 11 01:16:40 crc kubenswrapper[4744]: I0311 01:16:40.044691 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/cb4eb051-94b3-42d1-87ff-669ad8251b4f-config\") pod \"neutron-5959cf6645-bcjjf\" (UID: \"cb4eb051-94b3-42d1-87ff-669ad8251b4f\") " pod="openstack/neutron-5959cf6645-bcjjf" Mar 11 01:16:40 crc kubenswrapper[4744]: I0311 01:16:40.045198 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/cb4eb051-94b3-42d1-87ff-669ad8251b4f-internal-tls-certs\") pod \"neutron-5959cf6645-bcjjf\" (UID: \"cb4eb051-94b3-42d1-87ff-669ad8251b4f\") " pod="openstack/neutron-5959cf6645-bcjjf" Mar 11 01:16:40 crc kubenswrapper[4744]: I0311 01:16:40.045370 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/cb4eb051-94b3-42d1-87ff-669ad8251b4f-ovndb-tls-certs\") pod \"neutron-5959cf6645-bcjjf\" (UID: \"cb4eb051-94b3-42d1-87ff-669ad8251b4f\") " pod="openstack/neutron-5959cf6645-bcjjf" Mar 11 01:16:40 crc 
kubenswrapper[4744]: I0311 01:16:40.045840 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/cb4eb051-94b3-42d1-87ff-669ad8251b4f-public-tls-certs\") pod \"neutron-5959cf6645-bcjjf\" (UID: \"cb4eb051-94b3-42d1-87ff-669ad8251b4f\") " pod="openstack/neutron-5959cf6645-bcjjf" Mar 11 01:16:40 crc kubenswrapper[4744]: I0311 01:16:40.046272 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cb4eb051-94b3-42d1-87ff-669ad8251b4f-combined-ca-bundle\") pod \"neutron-5959cf6645-bcjjf\" (UID: \"cb4eb051-94b3-42d1-87ff-669ad8251b4f\") " pod="openstack/neutron-5959cf6645-bcjjf" Mar 11 01:16:40 crc kubenswrapper[4744]: I0311 01:16:40.062633 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4tjbr\" (UniqueName: \"kubernetes.io/projected/cb4eb051-94b3-42d1-87ff-669ad8251b4f-kube-api-access-4tjbr\") pod \"neutron-5959cf6645-bcjjf\" (UID: \"cb4eb051-94b3-42d1-87ff-669ad8251b4f\") " pod="openstack/neutron-5959cf6645-bcjjf" Mar 11 01:16:40 crc kubenswrapper[4744]: I0311 01:16:40.110051 4744 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-5959cf6645-bcjjf" Mar 11 01:16:40 crc kubenswrapper[4744]: I0311 01:16:40.486062 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"f11c8953-d88f-4d37-8366-b0b61606fa8a","Type":"ContainerStarted","Data":"9aea2a3292e66ea11857925fc0390ee0f713bcd4e833fe65a989d82320b82cad"} Mar 11 01:16:40 crc kubenswrapper[4744]: I0311 01:16:40.487787 4744 generic.go:334] "Generic (PLEG): container finished" podID="e5f6b131-1501-45aa-8ebf-76f7d454baad" containerID="5334318444a1f510d9eb0205e5cb310212f8727d01dd9f4d646fb2022768d411" exitCode=0 Mar 11 01:16:40 crc kubenswrapper[4744]: I0311 01:16:40.488589 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-77cdc8f7b7-2kx6j" event={"ID":"e5f6b131-1501-45aa-8ebf-76f7d454baad","Type":"ContainerDied","Data":"5334318444a1f510d9eb0205e5cb310212f8727d01dd9f4d646fb2022768d411"} Mar 11 01:16:40 crc kubenswrapper[4744]: I0311 01:16:40.666744 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-5959cf6645-bcjjf"] Mar 11 01:16:40 crc kubenswrapper[4744]: W0311 01:16:40.678791 4744 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podcb4eb051_94b3_42d1_87ff_669ad8251b4f.slice/crio-f9918c070f6e96f38876d180b01dc9c8faada162d9c533b05275555037db32b5 WatchSource:0}: Error finding container f9918c070f6e96f38876d180b01dc9c8faada162d9c533b05275555037db32b5: Status 404 returned error can't find the container with id f9918c070f6e96f38876d180b01dc9c8faada162d9c533b05275555037db32b5 Mar 11 01:16:41 crc kubenswrapper[4744]: I0311 01:16:41.496490 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"f11c8953-d88f-4d37-8366-b0b61606fa8a","Type":"ContainerStarted","Data":"c2b142e64db7ed4cb71b1ebbbbe9200442217ded53b3157e1c635748af055ca9"} Mar 11 01:16:41 crc kubenswrapper[4744]: I0311 01:16:41.496993 4744 
kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cinder-api-0" Mar 11 01:16:41 crc kubenswrapper[4744]: I0311 01:16:41.498331 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-5959cf6645-bcjjf" event={"ID":"cb4eb051-94b3-42d1-87ff-669ad8251b4f","Type":"ContainerStarted","Data":"90d7864d9f44afb54c45c8f83816565799a968b37e50716fa7d940d17e944838"} Mar 11 01:16:41 crc kubenswrapper[4744]: I0311 01:16:41.498380 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-5959cf6645-bcjjf" event={"ID":"cb4eb051-94b3-42d1-87ff-669ad8251b4f","Type":"ContainerStarted","Data":"b1b1e7e9a3f9e195c5c8ffc0f9ba222b2dd152ff67cad9587e2f46e9f7c8f240"} Mar 11 01:16:41 crc kubenswrapper[4744]: I0311 01:16:41.498391 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-5959cf6645-bcjjf" event={"ID":"cb4eb051-94b3-42d1-87ff-669ad8251b4f","Type":"ContainerStarted","Data":"f9918c070f6e96f38876d180b01dc9c8faada162d9c533b05275555037db32b5"} Mar 11 01:16:41 crc kubenswrapper[4744]: I0311 01:16:41.498487 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/neutron-5959cf6645-bcjjf" Mar 11 01:16:41 crc kubenswrapper[4744]: I0311 01:16:41.521460 4744 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-api-0" podStartSLOduration=3.5214398879999997 podStartE2EDuration="3.521439888s" podCreationTimestamp="2026-03-11 01:16:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-11 01:16:41.514711061 +0000 UTC m=+1358.318928666" watchObservedRunningTime="2026-03-11 01:16:41.521439888 +0000 UTC m=+1358.325657493" Mar 11 01:16:42 crc kubenswrapper[4744]: I0311 01:16:42.287604 4744 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-7bcd5444fb-98mrs" podUID="b6e79975-841a-4d71-9906-e53607f1b3fb" 
containerName="barbican-api" probeResult="failure" output="Get \"http://10.217.0.164:9311/healthcheck\": read tcp 10.217.0.2:56328->10.217.0.164:9311: read: connection reset by peer" Mar 11 01:16:42 crc kubenswrapper[4744]: I0311 01:16:42.287916 4744 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-7bcd5444fb-98mrs" podUID="b6e79975-841a-4d71-9906-e53607f1b3fb" containerName="barbican-api-log" probeResult="failure" output="Get \"http://10.217.0.164:9311/healthcheck\": read tcp 10.217.0.2:56330->10.217.0.164:9311: read: connection reset by peer" Mar 11 01:16:42 crc kubenswrapper[4744]: I0311 01:16:42.408874 4744 patch_prober.go:28] interesting pod/machine-config-daemon-678nx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 11 01:16:42 crc kubenswrapper[4744]: I0311 01:16:42.409133 4744 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-678nx" podUID="a15dc7ac-7c34-4135-b6eb-a85122800ce9" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 11 01:16:42 crc kubenswrapper[4744]: I0311 01:16:42.435003 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-55fb7645db-dh4kb" Mar 11 01:16:42 crc kubenswrapper[4744]: I0311 01:16:42.435088 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-55fb7645db-dh4kb" Mar 11 01:16:42 crc kubenswrapper[4744]: I0311 01:16:42.466913 4744 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-5959cf6645-bcjjf" podStartSLOduration=3.466894769 podStartE2EDuration="3.466894769s" podCreationTimestamp="2026-03-11 01:16:39 +0000 UTC" 
firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-11 01:16:41.542908483 +0000 UTC m=+1358.347126088" watchObservedRunningTime="2026-03-11 01:16:42.466894769 +0000 UTC m=+1359.271112374" Mar 11 01:16:42 crc kubenswrapper[4744]: I0311 01:16:42.523149 4744 generic.go:334] "Generic (PLEG): container finished" podID="b6e79975-841a-4d71-9906-e53607f1b3fb" containerID="d1efc38f5e4775b9528fc73d86053b2b60c8851815b28fd27da85a849a8f9358" exitCode=0 Mar 11 01:16:42 crc kubenswrapper[4744]: I0311 01:16:42.524115 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-7bcd5444fb-98mrs" event={"ID":"b6e79975-841a-4d71-9906-e53607f1b3fb","Type":"ContainerDied","Data":"d1efc38f5e4775b9528fc73d86053b2b60c8851815b28fd27da85a849a8f9358"} Mar 11 01:16:42 crc kubenswrapper[4744]: I0311 01:16:42.693727 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-7f78d57d44-gt8df"] Mar 11 01:16:42 crc kubenswrapper[4744]: I0311 01:16:42.695458 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-7f78d57d44-gt8df" Mar 11 01:16:42 crc kubenswrapper[4744]: I0311 01:16:42.719843 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-7f78d57d44-gt8df"] Mar 11 01:16:42 crc kubenswrapper[4744]: I0311 01:16:42.769121 4744 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-7bcd5444fb-98mrs" Mar 11 01:16:42 crc kubenswrapper[4744]: I0311 01:16:42.779384 4744 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/neutron-77cdc8f7b7-2kx6j" podUID="e5f6b131-1501-45aa-8ebf-76f7d454baad" containerName="neutron-httpd" probeResult="failure" output="Get \"https://10.217.0.159:9696/\": dial tcp 10.217.0.159:9696: connect: connection refused" Mar 11 01:16:42 crc kubenswrapper[4744]: I0311 01:16:42.799485 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a6b56953-c881-474c-a21f-4a39102d89ab-config-data\") pod \"placement-7f78d57d44-gt8df\" (UID: \"a6b56953-c881-474c-a21f-4a39102d89ab\") " pod="openstack/placement-7f78d57d44-gt8df" Mar 11 01:16:42 crc kubenswrapper[4744]: I0311 01:16:42.799540 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-97rcj\" (UniqueName: \"kubernetes.io/projected/a6b56953-c881-474c-a21f-4a39102d89ab-kube-api-access-97rcj\") pod \"placement-7f78d57d44-gt8df\" (UID: \"a6b56953-c881-474c-a21f-4a39102d89ab\") " pod="openstack/placement-7f78d57d44-gt8df" Mar 11 01:16:42 crc kubenswrapper[4744]: I0311 01:16:42.799579 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a6b56953-c881-474c-a21f-4a39102d89ab-scripts\") pod \"placement-7f78d57d44-gt8df\" (UID: \"a6b56953-c881-474c-a21f-4a39102d89ab\") " pod="openstack/placement-7f78d57d44-gt8df" Mar 11 01:16:42 crc kubenswrapper[4744]: I0311 01:16:42.799612 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/a6b56953-c881-474c-a21f-4a39102d89ab-internal-tls-certs\") pod \"placement-7f78d57d44-gt8df\" (UID: \"a6b56953-c881-474c-a21f-4a39102d89ab\") " 
pod="openstack/placement-7f78d57d44-gt8df" Mar 11 01:16:42 crc kubenswrapper[4744]: I0311 01:16:42.799633 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a6b56953-c881-474c-a21f-4a39102d89ab-logs\") pod \"placement-7f78d57d44-gt8df\" (UID: \"a6b56953-c881-474c-a21f-4a39102d89ab\") " pod="openstack/placement-7f78d57d44-gt8df" Mar 11 01:16:42 crc kubenswrapper[4744]: I0311 01:16:42.799653 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/a6b56953-c881-474c-a21f-4a39102d89ab-public-tls-certs\") pod \"placement-7f78d57d44-gt8df\" (UID: \"a6b56953-c881-474c-a21f-4a39102d89ab\") " pod="openstack/placement-7f78d57d44-gt8df" Mar 11 01:16:42 crc kubenswrapper[4744]: I0311 01:16:42.800338 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a6b56953-c881-474c-a21f-4a39102d89ab-combined-ca-bundle\") pod \"placement-7f78d57d44-gt8df\" (UID: \"a6b56953-c881-474c-a21f-4a39102d89ab\") " pod="openstack/placement-7f78d57d44-gt8df" Mar 11 01:16:42 crc kubenswrapper[4744]: I0311 01:16:42.902172 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/b6e79975-841a-4d71-9906-e53607f1b3fb-config-data-custom\") pod \"b6e79975-841a-4d71-9906-e53607f1b3fb\" (UID: \"b6e79975-841a-4d71-9906-e53607f1b3fb\") " Mar 11 01:16:42 crc kubenswrapper[4744]: I0311 01:16:42.902292 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b6e79975-841a-4d71-9906-e53607f1b3fb-combined-ca-bundle\") pod \"b6e79975-841a-4d71-9906-e53607f1b3fb\" (UID: \"b6e79975-841a-4d71-9906-e53607f1b3fb\") " Mar 11 01:16:42 crc kubenswrapper[4744]: 
I0311 01:16:42.902368 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b6e79975-841a-4d71-9906-e53607f1b3fb-logs\") pod \"b6e79975-841a-4d71-9906-e53607f1b3fb\" (UID: \"b6e79975-841a-4d71-9906-e53607f1b3fb\") " Mar 11 01:16:42 crc kubenswrapper[4744]: I0311 01:16:42.902422 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b6e79975-841a-4d71-9906-e53607f1b3fb-config-data\") pod \"b6e79975-841a-4d71-9906-e53607f1b3fb\" (UID: \"b6e79975-841a-4d71-9906-e53607f1b3fb\") " Mar 11 01:16:42 crc kubenswrapper[4744]: I0311 01:16:42.902446 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dfh5t\" (UniqueName: \"kubernetes.io/projected/b6e79975-841a-4d71-9906-e53607f1b3fb-kube-api-access-dfh5t\") pod \"b6e79975-841a-4d71-9906-e53607f1b3fb\" (UID: \"b6e79975-841a-4d71-9906-e53607f1b3fb\") " Mar 11 01:16:42 crc kubenswrapper[4744]: I0311 01:16:42.902739 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a6b56953-c881-474c-a21f-4a39102d89ab-config-data\") pod \"placement-7f78d57d44-gt8df\" (UID: \"a6b56953-c881-474c-a21f-4a39102d89ab\") " pod="openstack/placement-7f78d57d44-gt8df" Mar 11 01:16:42 crc kubenswrapper[4744]: I0311 01:16:42.902761 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-97rcj\" (UniqueName: \"kubernetes.io/projected/a6b56953-c881-474c-a21f-4a39102d89ab-kube-api-access-97rcj\") pod \"placement-7f78d57d44-gt8df\" (UID: \"a6b56953-c881-474c-a21f-4a39102d89ab\") " pod="openstack/placement-7f78d57d44-gt8df" Mar 11 01:16:42 crc kubenswrapper[4744]: I0311 01:16:42.902798 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/a6b56953-c881-474c-a21f-4a39102d89ab-scripts\") pod \"placement-7f78d57d44-gt8df\" (UID: \"a6b56953-c881-474c-a21f-4a39102d89ab\") " pod="openstack/placement-7f78d57d44-gt8df" Mar 11 01:16:42 crc kubenswrapper[4744]: I0311 01:16:42.902832 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/a6b56953-c881-474c-a21f-4a39102d89ab-internal-tls-certs\") pod \"placement-7f78d57d44-gt8df\" (UID: \"a6b56953-c881-474c-a21f-4a39102d89ab\") " pod="openstack/placement-7f78d57d44-gt8df" Mar 11 01:16:42 crc kubenswrapper[4744]: I0311 01:16:42.902852 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a6b56953-c881-474c-a21f-4a39102d89ab-logs\") pod \"placement-7f78d57d44-gt8df\" (UID: \"a6b56953-c881-474c-a21f-4a39102d89ab\") " pod="openstack/placement-7f78d57d44-gt8df" Mar 11 01:16:42 crc kubenswrapper[4744]: I0311 01:16:42.902872 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/a6b56953-c881-474c-a21f-4a39102d89ab-public-tls-certs\") pod \"placement-7f78d57d44-gt8df\" (UID: \"a6b56953-c881-474c-a21f-4a39102d89ab\") " pod="openstack/placement-7f78d57d44-gt8df" Mar 11 01:16:42 crc kubenswrapper[4744]: I0311 01:16:42.902922 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a6b56953-c881-474c-a21f-4a39102d89ab-combined-ca-bundle\") pod \"placement-7f78d57d44-gt8df\" (UID: \"a6b56953-c881-474c-a21f-4a39102d89ab\") " pod="openstack/placement-7f78d57d44-gt8df" Mar 11 01:16:42 crc kubenswrapper[4744]: I0311 01:16:42.903456 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a6b56953-c881-474c-a21f-4a39102d89ab-logs\") pod \"placement-7f78d57d44-gt8df\" (UID: 
\"a6b56953-c881-474c-a21f-4a39102d89ab\") " pod="openstack/placement-7f78d57d44-gt8df" Mar 11 01:16:42 crc kubenswrapper[4744]: I0311 01:16:42.903894 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b6e79975-841a-4d71-9906-e53607f1b3fb-logs" (OuterVolumeSpecName: "logs") pod "b6e79975-841a-4d71-9906-e53607f1b3fb" (UID: "b6e79975-841a-4d71-9906-e53607f1b3fb"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 11 01:16:42 crc kubenswrapper[4744]: I0311 01:16:42.909081 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6e79975-841a-4d71-9906-e53607f1b3fb-kube-api-access-dfh5t" (OuterVolumeSpecName: "kube-api-access-dfh5t") pod "b6e79975-841a-4d71-9906-e53607f1b3fb" (UID: "b6e79975-841a-4d71-9906-e53607f1b3fb"). InnerVolumeSpecName "kube-api-access-dfh5t". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 01:16:42 crc kubenswrapper[4744]: I0311 01:16:42.909257 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6e79975-841a-4d71-9906-e53607f1b3fb-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "b6e79975-841a-4d71-9906-e53607f1b3fb" (UID: "b6e79975-841a-4d71-9906-e53607f1b3fb"). InnerVolumeSpecName "config-data-custom". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 01:16:42 crc kubenswrapper[4744]: I0311 01:16:42.909482 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/a6b56953-c881-474c-a21f-4a39102d89ab-internal-tls-certs\") pod \"placement-7f78d57d44-gt8df\" (UID: \"a6b56953-c881-474c-a21f-4a39102d89ab\") " pod="openstack/placement-7f78d57d44-gt8df" Mar 11 01:16:42 crc kubenswrapper[4744]: I0311 01:16:42.909623 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a6b56953-c881-474c-a21f-4a39102d89ab-config-data\") pod \"placement-7f78d57d44-gt8df\" (UID: \"a6b56953-c881-474c-a21f-4a39102d89ab\") " pod="openstack/placement-7f78d57d44-gt8df" Mar 11 01:16:42 crc kubenswrapper[4744]: I0311 01:16:42.909445 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a6b56953-c881-474c-a21f-4a39102d89ab-combined-ca-bundle\") pod \"placement-7f78d57d44-gt8df\" (UID: \"a6b56953-c881-474c-a21f-4a39102d89ab\") " pod="openstack/placement-7f78d57d44-gt8df" Mar 11 01:16:42 crc kubenswrapper[4744]: I0311 01:16:42.910689 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/a6b56953-c881-474c-a21f-4a39102d89ab-public-tls-certs\") pod \"placement-7f78d57d44-gt8df\" (UID: \"a6b56953-c881-474c-a21f-4a39102d89ab\") " pod="openstack/placement-7f78d57d44-gt8df" Mar 11 01:16:42 crc kubenswrapper[4744]: I0311 01:16:42.912213 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a6b56953-c881-474c-a21f-4a39102d89ab-scripts\") pod \"placement-7f78d57d44-gt8df\" (UID: \"a6b56953-c881-474c-a21f-4a39102d89ab\") " pod="openstack/placement-7f78d57d44-gt8df" Mar 11 01:16:42 crc kubenswrapper[4744]: I0311 01:16:42.929867 4744 
operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6e79975-841a-4d71-9906-e53607f1b3fb-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "b6e79975-841a-4d71-9906-e53607f1b3fb" (UID: "b6e79975-841a-4d71-9906-e53607f1b3fb"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 01:16:42 crc kubenswrapper[4744]: I0311 01:16:42.932115 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-97rcj\" (UniqueName: \"kubernetes.io/projected/a6b56953-c881-474c-a21f-4a39102d89ab-kube-api-access-97rcj\") pod \"placement-7f78d57d44-gt8df\" (UID: \"a6b56953-c881-474c-a21f-4a39102d89ab\") " pod="openstack/placement-7f78d57d44-gt8df" Mar 11 01:16:42 crc kubenswrapper[4744]: I0311 01:16:42.957766 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6e79975-841a-4d71-9906-e53607f1b3fb-config-data" (OuterVolumeSpecName: "config-data") pod "b6e79975-841a-4d71-9906-e53607f1b3fb" (UID: "b6e79975-841a-4d71-9906-e53607f1b3fb"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 01:16:43 crc kubenswrapper[4744]: I0311 01:16:43.004141 4744 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/b6e79975-841a-4d71-9906-e53607f1b3fb-config-data-custom\") on node \"crc\" DevicePath \"\"" Mar 11 01:16:43 crc kubenswrapper[4744]: I0311 01:16:43.004333 4744 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b6e79975-841a-4d71-9906-e53607f1b3fb-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 11 01:16:43 crc kubenswrapper[4744]: I0311 01:16:43.004399 4744 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b6e79975-841a-4d71-9906-e53607f1b3fb-logs\") on node \"crc\" DevicePath \"\"" Mar 11 01:16:43 crc kubenswrapper[4744]: I0311 01:16:43.004459 4744 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b6e79975-841a-4d71-9906-e53607f1b3fb-config-data\") on node \"crc\" DevicePath \"\"" Mar 11 01:16:43 crc kubenswrapper[4744]: I0311 01:16:43.004532 4744 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dfh5t\" (UniqueName: \"kubernetes.io/projected/b6e79975-841a-4d71-9906-e53607f1b3fb-kube-api-access-dfh5t\") on node \"crc\" DevicePath \"\"" Mar 11 01:16:43 crc kubenswrapper[4744]: I0311 01:16:43.081304 4744 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-7f78d57d44-gt8df" Mar 11 01:16:43 crc kubenswrapper[4744]: I0311 01:16:43.535357 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-7bcd5444fb-98mrs" event={"ID":"b6e79975-841a-4d71-9906-e53607f1b3fb","Type":"ContainerDied","Data":"ed6a9d44fc4838688c525502eec1af1918986c1b00e56218c1838d1684b78111"} Mar 11 01:16:43 crc kubenswrapper[4744]: I0311 01:16:43.535434 4744 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-7bcd5444fb-98mrs" Mar 11 01:16:43 crc kubenswrapper[4744]: I0311 01:16:43.535887 4744 scope.go:117] "RemoveContainer" containerID="d1efc38f5e4775b9528fc73d86053b2b60c8851815b28fd27da85a849a8f9358" Mar 11 01:16:43 crc kubenswrapper[4744]: I0311 01:16:43.590272 4744 scope.go:117] "RemoveContainer" containerID="d61e4700d7dd46bf54cb41f484b010f1b1b279cc34d80725caa2e387219bbe8b" Mar 11 01:16:43 crc kubenswrapper[4744]: I0311 01:16:43.592349 4744 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-api-7bcd5444fb-98mrs"] Mar 11 01:16:43 crc kubenswrapper[4744]: I0311 01:16:43.603362 4744 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-api-7bcd5444fb-98mrs"] Mar 11 01:16:43 crc kubenswrapper[4744]: I0311 01:16:43.626851 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-7f78d57d44-gt8df"] Mar 11 01:16:43 crc kubenswrapper[4744]: I0311 01:16:43.987611 4744 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6e79975-841a-4d71-9906-e53607f1b3fb" path="/var/lib/kubelet/pods/b6e79975-841a-4d71-9906-e53607f1b3fb/volumes" Mar 11 01:16:44 crc kubenswrapper[4744]: I0311 01:16:44.061717 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-86dc97b969-rxvv5" Mar 11 01:16:44 crc kubenswrapper[4744]: I0311 01:16:44.148207 4744 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openstack/dnsmasq-dns-5864dc4585-mznwk"] Mar 11 01:16:44 crc kubenswrapper[4744]: I0311 01:16:44.148423 4744 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-5864dc4585-mznwk" podUID="909cbc8d-726a-427a-9233-c5c3ea5387f0" containerName="dnsmasq-dns" containerID="cri-o://771097dfdfd9beb6cb610bfc57db5e2fe0f8386f0ff8e2f93f74df2d8bfdf38c" gracePeriod=10 Mar 11 01:16:44 crc kubenswrapper[4744]: I0311 01:16:44.323854 4744 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-5864dc4585-mznwk" podUID="909cbc8d-726a-427a-9233-c5c3ea5387f0" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.156:5353: connect: connection refused" Mar 11 01:16:44 crc kubenswrapper[4744]: I0311 01:16:44.364767 4744 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-scheduler-0" Mar 11 01:16:44 crc kubenswrapper[4744]: I0311 01:16:44.412560 4744 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-scheduler-0"] Mar 11 01:16:44 crc kubenswrapper[4744]: I0311 01:16:44.549073 4744 generic.go:334] "Generic (PLEG): container finished" podID="909cbc8d-726a-427a-9233-c5c3ea5387f0" containerID="771097dfdfd9beb6cb610bfc57db5e2fe0f8386f0ff8e2f93f74df2d8bfdf38c" exitCode=0 Mar 11 01:16:44 crc kubenswrapper[4744]: I0311 01:16:44.549139 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5864dc4585-mznwk" event={"ID":"909cbc8d-726a-427a-9233-c5c3ea5387f0","Type":"ContainerDied","Data":"771097dfdfd9beb6cb610bfc57db5e2fe0f8386f0ff8e2f93f74df2d8bfdf38c"} Mar 11 01:16:44 crc kubenswrapper[4744]: I0311 01:16:44.551552 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-7f78d57d44-gt8df" event={"ID":"a6b56953-c881-474c-a21f-4a39102d89ab","Type":"ContainerStarted","Data":"d932b2416d71a86f80ef3581b5216ce6ba10ab543bf647b40daeebe0c83edbaa"} Mar 11 01:16:44 crc kubenswrapper[4744]: I0311 
01:16:44.551605 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-7f78d57d44-gt8df" event={"ID":"a6b56953-c881-474c-a21f-4a39102d89ab","Type":"ContainerStarted","Data":"8485a4680d7e9bd7479321ad2c60fbdd63c7941a99f12f5159677293b38ca6db"} Mar 11 01:16:44 crc kubenswrapper[4744]: I0311 01:16:44.551620 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-7f78d57d44-gt8df" event={"ID":"a6b56953-c881-474c-a21f-4a39102d89ab","Type":"ContainerStarted","Data":"10ffbd2dc9aa299a1bcdfba5f1bc20555f968f1ff6c4e1e8f6e696a4f9bc8bfe"} Mar 11 01:16:44 crc kubenswrapper[4744]: I0311 01:16:44.551762 4744 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-scheduler-0" podUID="55bfa6db-7b61-4f7b-a860-197d4f32ba8b" containerName="cinder-scheduler" containerID="cri-o://4d56a62786b00c64a34dc6d390736c59715ec52588297c8759a2ebb5e25ed9a2" gracePeriod=30 Mar 11 01:16:44 crc kubenswrapper[4744]: I0311 01:16:44.551947 4744 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-scheduler-0" podUID="55bfa6db-7b61-4f7b-a860-197d4f32ba8b" containerName="probe" containerID="cri-o://6f4a242e298f9b8510414796fdb1e3a9bb73ced97fb7699bc14f64649589cf10" gracePeriod=30 Mar 11 01:16:44 crc kubenswrapper[4744]: I0311 01:16:44.583968 4744 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-7f78d57d44-gt8df" podStartSLOduration=2.583954149 podStartE2EDuration="2.583954149s" podCreationTimestamp="2026-03-11 01:16:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-11 01:16:44.58104279 +0000 UTC m=+1361.385260395" watchObservedRunningTime="2026-03-11 01:16:44.583954149 +0000 UTC m=+1361.388171754" Mar 11 01:16:45 crc kubenswrapper[4744]: I0311 01:16:45.094851 4744 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-77cdc8f7b7-2kx6j" Mar 11 01:16:45 crc kubenswrapper[4744]: I0311 01:16:45.209123 4744 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5864dc4585-mznwk" Mar 11 01:16:45 crc kubenswrapper[4744]: I0311 01:16:45.260280 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/e5f6b131-1501-45aa-8ebf-76f7d454baad-public-tls-certs\") pod \"e5f6b131-1501-45aa-8ebf-76f7d454baad\" (UID: \"e5f6b131-1501-45aa-8ebf-76f7d454baad\") " Mar 11 01:16:45 crc kubenswrapper[4744]: I0311 01:16:45.260363 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/e5f6b131-1501-45aa-8ebf-76f7d454baad-ovndb-tls-certs\") pod \"e5f6b131-1501-45aa-8ebf-76f7d454baad\" (UID: \"e5f6b131-1501-45aa-8ebf-76f7d454baad\") " Mar 11 01:16:45 crc kubenswrapper[4744]: I0311 01:16:45.260413 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/e5f6b131-1501-45aa-8ebf-76f7d454baad-internal-tls-certs\") pod \"e5f6b131-1501-45aa-8ebf-76f7d454baad\" (UID: \"e5f6b131-1501-45aa-8ebf-76f7d454baad\") " Mar 11 01:16:45 crc kubenswrapper[4744]: I0311 01:16:45.260596 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xmtsh\" (UniqueName: \"kubernetes.io/projected/e5f6b131-1501-45aa-8ebf-76f7d454baad-kube-api-access-xmtsh\") pod \"e5f6b131-1501-45aa-8ebf-76f7d454baad\" (UID: \"e5f6b131-1501-45aa-8ebf-76f7d454baad\") " Mar 11 01:16:45 crc kubenswrapper[4744]: I0311 01:16:45.260652 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/e5f6b131-1501-45aa-8ebf-76f7d454baad-config\") pod \"e5f6b131-1501-45aa-8ebf-76f7d454baad\" (UID: 
\"e5f6b131-1501-45aa-8ebf-76f7d454baad\") " Mar 11 01:16:45 crc kubenswrapper[4744]: I0311 01:16:45.260692 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e5f6b131-1501-45aa-8ebf-76f7d454baad-combined-ca-bundle\") pod \"e5f6b131-1501-45aa-8ebf-76f7d454baad\" (UID: \"e5f6b131-1501-45aa-8ebf-76f7d454baad\") " Mar 11 01:16:45 crc kubenswrapper[4744]: I0311 01:16:45.260725 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/e5f6b131-1501-45aa-8ebf-76f7d454baad-httpd-config\") pod \"e5f6b131-1501-45aa-8ebf-76f7d454baad\" (UID: \"e5f6b131-1501-45aa-8ebf-76f7d454baad\") " Mar 11 01:16:45 crc kubenswrapper[4744]: I0311 01:16:45.270191 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e5f6b131-1501-45aa-8ebf-76f7d454baad-httpd-config" (OuterVolumeSpecName: "httpd-config") pod "e5f6b131-1501-45aa-8ebf-76f7d454baad" (UID: "e5f6b131-1501-45aa-8ebf-76f7d454baad"). InnerVolumeSpecName "httpd-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 01:16:45 crc kubenswrapper[4744]: I0311 01:16:45.275050 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e5f6b131-1501-45aa-8ebf-76f7d454baad-kube-api-access-xmtsh" (OuterVolumeSpecName: "kube-api-access-xmtsh") pod "e5f6b131-1501-45aa-8ebf-76f7d454baad" (UID: "e5f6b131-1501-45aa-8ebf-76f7d454baad"). InnerVolumeSpecName "kube-api-access-xmtsh". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 01:16:45 crc kubenswrapper[4744]: I0311 01:16:45.347712 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e5f6b131-1501-45aa-8ebf-76f7d454baad-config" (OuterVolumeSpecName: "config") pod "e5f6b131-1501-45aa-8ebf-76f7d454baad" (UID: "e5f6b131-1501-45aa-8ebf-76f7d454baad"). 
InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 01:16:45 crc kubenswrapper[4744]: I0311 01:16:45.347877 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e5f6b131-1501-45aa-8ebf-76f7d454baad-ovndb-tls-certs" (OuterVolumeSpecName: "ovndb-tls-certs") pod "e5f6b131-1501-45aa-8ebf-76f7d454baad" (UID: "e5f6b131-1501-45aa-8ebf-76f7d454baad"). InnerVolumeSpecName "ovndb-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 01:16:45 crc kubenswrapper[4744]: I0311 01:16:45.349418 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e5f6b131-1501-45aa-8ebf-76f7d454baad-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "e5f6b131-1501-45aa-8ebf-76f7d454baad" (UID: "e5f6b131-1501-45aa-8ebf-76f7d454baad"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 01:16:45 crc kubenswrapper[4744]: I0311 01:16:45.350118 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e5f6b131-1501-45aa-8ebf-76f7d454baad-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "e5f6b131-1501-45aa-8ebf-76f7d454baad" (UID: "e5f6b131-1501-45aa-8ebf-76f7d454baad"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 01:16:45 crc kubenswrapper[4744]: I0311 01:16:45.355687 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e5f6b131-1501-45aa-8ebf-76f7d454baad-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "e5f6b131-1501-45aa-8ebf-76f7d454baad" (UID: "e5f6b131-1501-45aa-8ebf-76f7d454baad"). InnerVolumeSpecName "public-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 01:16:45 crc kubenswrapper[4744]: I0311 01:16:45.364183 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/909cbc8d-726a-427a-9233-c5c3ea5387f0-dns-swift-storage-0\") pod \"909cbc8d-726a-427a-9233-c5c3ea5387f0\" (UID: \"909cbc8d-726a-427a-9233-c5c3ea5387f0\") " Mar 11 01:16:45 crc kubenswrapper[4744]: I0311 01:16:45.364215 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-z59nv\" (UniqueName: \"kubernetes.io/projected/909cbc8d-726a-427a-9233-c5c3ea5387f0-kube-api-access-z59nv\") pod \"909cbc8d-726a-427a-9233-c5c3ea5387f0\" (UID: \"909cbc8d-726a-427a-9233-c5c3ea5387f0\") " Mar 11 01:16:45 crc kubenswrapper[4744]: I0311 01:16:45.364338 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/909cbc8d-726a-427a-9233-c5c3ea5387f0-ovsdbserver-nb\") pod \"909cbc8d-726a-427a-9233-c5c3ea5387f0\" (UID: \"909cbc8d-726a-427a-9233-c5c3ea5387f0\") " Mar 11 01:16:45 crc kubenswrapper[4744]: I0311 01:16:45.364367 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/909cbc8d-726a-427a-9233-c5c3ea5387f0-dns-svc\") pod \"909cbc8d-726a-427a-9233-c5c3ea5387f0\" (UID: \"909cbc8d-726a-427a-9233-c5c3ea5387f0\") " Mar 11 01:16:45 crc kubenswrapper[4744]: I0311 01:16:45.364457 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/909cbc8d-726a-427a-9233-c5c3ea5387f0-config\") pod \"909cbc8d-726a-427a-9233-c5c3ea5387f0\" (UID: \"909cbc8d-726a-427a-9233-c5c3ea5387f0\") " Mar 11 01:16:45 crc kubenswrapper[4744]: I0311 01:16:45.364501 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: 
\"kubernetes.io/configmap/909cbc8d-726a-427a-9233-c5c3ea5387f0-ovsdbserver-sb\") pod \"909cbc8d-726a-427a-9233-c5c3ea5387f0\" (UID: \"909cbc8d-726a-427a-9233-c5c3ea5387f0\") " Mar 11 01:16:45 crc kubenswrapper[4744]: I0311 01:16:45.364837 4744 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xmtsh\" (UniqueName: \"kubernetes.io/projected/e5f6b131-1501-45aa-8ebf-76f7d454baad-kube-api-access-xmtsh\") on node \"crc\" DevicePath \"\"" Mar 11 01:16:45 crc kubenswrapper[4744]: I0311 01:16:45.364851 4744 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/e5f6b131-1501-45aa-8ebf-76f7d454baad-config\") on node \"crc\" DevicePath \"\"" Mar 11 01:16:45 crc kubenswrapper[4744]: I0311 01:16:45.364860 4744 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e5f6b131-1501-45aa-8ebf-76f7d454baad-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 11 01:16:45 crc kubenswrapper[4744]: I0311 01:16:45.364869 4744 reconciler_common.go:293] "Volume detached for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/e5f6b131-1501-45aa-8ebf-76f7d454baad-httpd-config\") on node \"crc\" DevicePath \"\"" Mar 11 01:16:45 crc kubenswrapper[4744]: I0311 01:16:45.364878 4744 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/e5f6b131-1501-45aa-8ebf-76f7d454baad-public-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 11 01:16:45 crc kubenswrapper[4744]: I0311 01:16:45.364885 4744 reconciler_common.go:293] "Volume detached for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/e5f6b131-1501-45aa-8ebf-76f7d454baad-ovndb-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 11 01:16:45 crc kubenswrapper[4744]: I0311 01:16:45.364893 4744 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/e5f6b131-1501-45aa-8ebf-76f7d454baad-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 11 01:16:45 crc kubenswrapper[4744]: I0311 01:16:45.371472 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/909cbc8d-726a-427a-9233-c5c3ea5387f0-kube-api-access-z59nv" (OuterVolumeSpecName: "kube-api-access-z59nv") pod "909cbc8d-726a-427a-9233-c5c3ea5387f0" (UID: "909cbc8d-726a-427a-9233-c5c3ea5387f0"). InnerVolumeSpecName "kube-api-access-z59nv". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 01:16:45 crc kubenswrapper[4744]: I0311 01:16:45.407381 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/909cbc8d-726a-427a-9233-c5c3ea5387f0-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "909cbc8d-726a-427a-9233-c5c3ea5387f0" (UID: "909cbc8d-726a-427a-9233-c5c3ea5387f0"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 11 01:16:45 crc kubenswrapper[4744]: I0311 01:16:45.408062 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/909cbc8d-726a-427a-9233-c5c3ea5387f0-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "909cbc8d-726a-427a-9233-c5c3ea5387f0" (UID: "909cbc8d-726a-427a-9233-c5c3ea5387f0"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 11 01:16:45 crc kubenswrapper[4744]: I0311 01:16:45.413462 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/909cbc8d-726a-427a-9233-c5c3ea5387f0-config" (OuterVolumeSpecName: "config") pod "909cbc8d-726a-427a-9233-c5c3ea5387f0" (UID: "909cbc8d-726a-427a-9233-c5c3ea5387f0"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 11 01:16:45 crc kubenswrapper[4744]: I0311 01:16:45.421551 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/909cbc8d-726a-427a-9233-c5c3ea5387f0-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "909cbc8d-726a-427a-9233-c5c3ea5387f0" (UID: "909cbc8d-726a-427a-9233-c5c3ea5387f0"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 11 01:16:45 crc kubenswrapper[4744]: I0311 01:16:45.428153 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/909cbc8d-726a-427a-9233-c5c3ea5387f0-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "909cbc8d-726a-427a-9233-c5c3ea5387f0" (UID: "909cbc8d-726a-427a-9233-c5c3ea5387f0"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 11 01:16:45 crc kubenswrapper[4744]: I0311 01:16:45.466411 4744 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/909cbc8d-726a-427a-9233-c5c3ea5387f0-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 11 01:16:45 crc kubenswrapper[4744]: I0311 01:16:45.466440 4744 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/909cbc8d-726a-427a-9233-c5c3ea5387f0-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 11 01:16:45 crc kubenswrapper[4744]: I0311 01:16:45.466451 4744 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/909cbc8d-726a-427a-9233-c5c3ea5387f0-config\") on node \"crc\" DevicePath \"\"" Mar 11 01:16:45 crc kubenswrapper[4744]: I0311 01:16:45.467062 4744 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/909cbc8d-726a-427a-9233-c5c3ea5387f0-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 11 01:16:45 crc 
kubenswrapper[4744]: I0311 01:16:45.467109 4744 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/909cbc8d-726a-427a-9233-c5c3ea5387f0-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Mar 11 01:16:45 crc kubenswrapper[4744]: I0311 01:16:45.467130 4744 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-z59nv\" (UniqueName: \"kubernetes.io/projected/909cbc8d-726a-427a-9233-c5c3ea5387f0-kube-api-access-z59nv\") on node \"crc\" DevicePath \"\"" Mar 11 01:16:45 crc kubenswrapper[4744]: I0311 01:16:45.565784 4744 generic.go:334] "Generic (PLEG): container finished" podID="e5f6b131-1501-45aa-8ebf-76f7d454baad" containerID="a2626049f1cb7982d192118248ec35ce41865252920a41c7b1b441c5781153f9" exitCode=0 Mar 11 01:16:45 crc kubenswrapper[4744]: I0311 01:16:45.565949 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-77cdc8f7b7-2kx6j" event={"ID":"e5f6b131-1501-45aa-8ebf-76f7d454baad","Type":"ContainerDied","Data":"a2626049f1cb7982d192118248ec35ce41865252920a41c7b1b441c5781153f9"} Mar 11 01:16:45 crc kubenswrapper[4744]: I0311 01:16:45.566032 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-77cdc8f7b7-2kx6j" event={"ID":"e5f6b131-1501-45aa-8ebf-76f7d454baad","Type":"ContainerDied","Data":"630a474957df816c074be1456412a7125e8edd61e818cd9a41b2148ac79e6da3"} Mar 11 01:16:45 crc kubenswrapper[4744]: I0311 01:16:45.566173 4744 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-77cdc8f7b7-2kx6j" Mar 11 01:16:45 crc kubenswrapper[4744]: I0311 01:16:45.566543 4744 scope.go:117] "RemoveContainer" containerID="5334318444a1f510d9eb0205e5cb310212f8727d01dd9f4d646fb2022768d411" Mar 11 01:16:45 crc kubenswrapper[4744]: I0311 01:16:45.568686 4744 generic.go:334] "Generic (PLEG): container finished" podID="55bfa6db-7b61-4f7b-a860-197d4f32ba8b" containerID="6f4a242e298f9b8510414796fdb1e3a9bb73ced97fb7699bc14f64649589cf10" exitCode=0 Mar 11 01:16:45 crc kubenswrapper[4744]: I0311 01:16:45.568804 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"55bfa6db-7b61-4f7b-a860-197d4f32ba8b","Type":"ContainerDied","Data":"6f4a242e298f9b8510414796fdb1e3a9bb73ced97fb7699bc14f64649589cf10"} Mar 11 01:16:45 crc kubenswrapper[4744]: I0311 01:16:45.572618 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5864dc4585-mznwk" event={"ID":"909cbc8d-726a-427a-9233-c5c3ea5387f0","Type":"ContainerDied","Data":"155affd04de146537333c3fb8bbda31378a7034a7ab95b80e6ca56be28dcd98a"} Mar 11 01:16:45 crc kubenswrapper[4744]: I0311 01:16:45.572859 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-7f78d57d44-gt8df" Mar 11 01:16:45 crc kubenswrapper[4744]: I0311 01:16:45.572886 4744 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5864dc4585-mznwk" Mar 11 01:16:45 crc kubenswrapper[4744]: I0311 01:16:45.572894 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-7f78d57d44-gt8df" Mar 11 01:16:45 crc kubenswrapper[4744]: I0311 01:16:45.605702 4744 scope.go:117] "RemoveContainer" containerID="a2626049f1cb7982d192118248ec35ce41865252920a41c7b1b441c5781153f9" Mar 11 01:16:45 crc kubenswrapper[4744]: I0311 01:16:45.635245 4744 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-77cdc8f7b7-2kx6j"] Mar 11 01:16:45 crc kubenswrapper[4744]: I0311 01:16:45.646132 4744 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-77cdc8f7b7-2kx6j"] Mar 11 01:16:45 crc kubenswrapper[4744]: I0311 01:16:45.655356 4744 scope.go:117] "RemoveContainer" containerID="5334318444a1f510d9eb0205e5cb310212f8727d01dd9f4d646fb2022768d411" Mar 11 01:16:45 crc kubenswrapper[4744]: E0311 01:16:45.655741 4744 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5334318444a1f510d9eb0205e5cb310212f8727d01dd9f4d646fb2022768d411\": container with ID starting with 5334318444a1f510d9eb0205e5cb310212f8727d01dd9f4d646fb2022768d411 not found: ID does not exist" containerID="5334318444a1f510d9eb0205e5cb310212f8727d01dd9f4d646fb2022768d411" Mar 11 01:16:45 crc kubenswrapper[4744]: I0311 01:16:45.655797 4744 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5334318444a1f510d9eb0205e5cb310212f8727d01dd9f4d646fb2022768d411"} err="failed to get container status \"5334318444a1f510d9eb0205e5cb310212f8727d01dd9f4d646fb2022768d411\": rpc error: code = NotFound desc = could not find container \"5334318444a1f510d9eb0205e5cb310212f8727d01dd9f4d646fb2022768d411\": container with ID starting with 5334318444a1f510d9eb0205e5cb310212f8727d01dd9f4d646fb2022768d411 not found: ID does not exist" Mar 11 01:16:45 
crc kubenswrapper[4744]: I0311 01:16:45.655835 4744 scope.go:117] "RemoveContainer" containerID="a2626049f1cb7982d192118248ec35ce41865252920a41c7b1b441c5781153f9" Mar 11 01:16:45 crc kubenswrapper[4744]: E0311 01:16:45.656162 4744 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a2626049f1cb7982d192118248ec35ce41865252920a41c7b1b441c5781153f9\": container with ID starting with a2626049f1cb7982d192118248ec35ce41865252920a41c7b1b441c5781153f9 not found: ID does not exist" containerID="a2626049f1cb7982d192118248ec35ce41865252920a41c7b1b441c5781153f9" Mar 11 01:16:45 crc kubenswrapper[4744]: I0311 01:16:45.656194 4744 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a2626049f1cb7982d192118248ec35ce41865252920a41c7b1b441c5781153f9"} err="failed to get container status \"a2626049f1cb7982d192118248ec35ce41865252920a41c7b1b441c5781153f9\": rpc error: code = NotFound desc = could not find container \"a2626049f1cb7982d192118248ec35ce41865252920a41c7b1b441c5781153f9\": container with ID starting with a2626049f1cb7982d192118248ec35ce41865252920a41c7b1b441c5781153f9 not found: ID does not exist" Mar 11 01:16:45 crc kubenswrapper[4744]: I0311 01:16:45.656215 4744 scope.go:117] "RemoveContainer" containerID="771097dfdfd9beb6cb610bfc57db5e2fe0f8386f0ff8e2f93f74df2d8bfdf38c" Mar 11 01:16:45 crc kubenswrapper[4744]: I0311 01:16:45.666131 4744 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5864dc4585-mznwk"] Mar 11 01:16:45 crc kubenswrapper[4744]: I0311 01:16:45.673756 4744 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-5864dc4585-mznwk"] Mar 11 01:16:45 crc kubenswrapper[4744]: I0311 01:16:45.682922 4744 scope.go:117] "RemoveContainer" containerID="87ce7c98f7b870db615d895adf8ea71c02d6b72c27928c9990d36c42e05a242b" Mar 11 01:16:45 crc kubenswrapper[4744]: I0311 01:16:45.989411 4744 kubelet_volumes.go:163] 
"Cleaned up orphaned pod volumes dir" podUID="909cbc8d-726a-427a-9233-c5c3ea5387f0" path="/var/lib/kubelet/pods/909cbc8d-726a-427a-9233-c5c3ea5387f0/volumes" Mar 11 01:16:45 crc kubenswrapper[4744]: I0311 01:16:45.990946 4744 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e5f6b131-1501-45aa-8ebf-76f7d454baad" path="/var/lib/kubelet/pods/e5f6b131-1501-45aa-8ebf-76f7d454baad/volumes" Mar 11 01:16:48 crc kubenswrapper[4744]: I0311 01:16:48.043327 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/keystone-7cc5cb746-kmb4g" Mar 11 01:16:48 crc kubenswrapper[4744]: I0311 01:16:48.608406 4744 generic.go:334] "Generic (PLEG): container finished" podID="55bfa6db-7b61-4f7b-a860-197d4f32ba8b" containerID="4d56a62786b00c64a34dc6d390736c59715ec52588297c8759a2ebb5e25ed9a2" exitCode=0 Mar 11 01:16:48 crc kubenswrapper[4744]: I0311 01:16:48.608807 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"55bfa6db-7b61-4f7b-a860-197d4f32ba8b","Type":"ContainerDied","Data":"4d56a62786b00c64a34dc6d390736c59715ec52588297c8759a2ebb5e25ed9a2"} Mar 11 01:16:48 crc kubenswrapper[4744]: I0311 01:16:48.736568 4744 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Mar 11 01:16:48 crc kubenswrapper[4744]: I0311 01:16:48.846328 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/55bfa6db-7b61-4f7b-a860-197d4f32ba8b-scripts\") pod \"55bfa6db-7b61-4f7b-a860-197d4f32ba8b\" (UID: \"55bfa6db-7b61-4f7b-a860-197d4f32ba8b\") " Mar 11 01:16:48 crc kubenswrapper[4744]: I0311 01:16:48.846422 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/55bfa6db-7b61-4f7b-a860-197d4f32ba8b-config-data\") pod \"55bfa6db-7b61-4f7b-a860-197d4f32ba8b\" (UID: \"55bfa6db-7b61-4f7b-a860-197d4f32ba8b\") " Mar 11 01:16:48 crc kubenswrapper[4744]: I0311 01:16:48.846580 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/55bfa6db-7b61-4f7b-a860-197d4f32ba8b-config-data-custom\") pod \"55bfa6db-7b61-4f7b-a860-197d4f32ba8b\" (UID: \"55bfa6db-7b61-4f7b-a860-197d4f32ba8b\") " Mar 11 01:16:48 crc kubenswrapper[4744]: I0311 01:16:48.846622 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-h4728\" (UniqueName: \"kubernetes.io/projected/55bfa6db-7b61-4f7b-a860-197d4f32ba8b-kube-api-access-h4728\") pod \"55bfa6db-7b61-4f7b-a860-197d4f32ba8b\" (UID: \"55bfa6db-7b61-4f7b-a860-197d4f32ba8b\") " Mar 11 01:16:48 crc kubenswrapper[4744]: I0311 01:16:48.846714 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/55bfa6db-7b61-4f7b-a860-197d4f32ba8b-combined-ca-bundle\") pod \"55bfa6db-7b61-4f7b-a860-197d4f32ba8b\" (UID: \"55bfa6db-7b61-4f7b-a860-197d4f32ba8b\") " Mar 11 01:16:48 crc kubenswrapper[4744]: I0311 01:16:48.846768 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" 
(UniqueName: \"kubernetes.io/host-path/55bfa6db-7b61-4f7b-a860-197d4f32ba8b-etc-machine-id\") pod \"55bfa6db-7b61-4f7b-a860-197d4f32ba8b\" (UID: \"55bfa6db-7b61-4f7b-a860-197d4f32ba8b\") " Mar 11 01:16:48 crc kubenswrapper[4744]: I0311 01:16:48.847165 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/55bfa6db-7b61-4f7b-a860-197d4f32ba8b-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "55bfa6db-7b61-4f7b-a860-197d4f32ba8b" (UID: "55bfa6db-7b61-4f7b-a860-197d4f32ba8b"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 11 01:16:48 crc kubenswrapper[4744]: I0311 01:16:48.853665 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/55bfa6db-7b61-4f7b-a860-197d4f32ba8b-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "55bfa6db-7b61-4f7b-a860-197d4f32ba8b" (UID: "55bfa6db-7b61-4f7b-a860-197d4f32ba8b"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 01:16:48 crc kubenswrapper[4744]: I0311 01:16:48.853758 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/55bfa6db-7b61-4f7b-a860-197d4f32ba8b-kube-api-access-h4728" (OuterVolumeSpecName: "kube-api-access-h4728") pod "55bfa6db-7b61-4f7b-a860-197d4f32ba8b" (UID: "55bfa6db-7b61-4f7b-a860-197d4f32ba8b"). InnerVolumeSpecName "kube-api-access-h4728". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 01:16:48 crc kubenswrapper[4744]: I0311 01:16:48.864307 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/55bfa6db-7b61-4f7b-a860-197d4f32ba8b-scripts" (OuterVolumeSpecName: "scripts") pod "55bfa6db-7b61-4f7b-a860-197d4f32ba8b" (UID: "55bfa6db-7b61-4f7b-a860-197d4f32ba8b"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 01:16:48 crc kubenswrapper[4744]: I0311 01:16:48.904755 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/55bfa6db-7b61-4f7b-a860-197d4f32ba8b-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "55bfa6db-7b61-4f7b-a860-197d4f32ba8b" (UID: "55bfa6db-7b61-4f7b-a860-197d4f32ba8b"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 01:16:48 crc kubenswrapper[4744]: I0311 01:16:48.948611 4744 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/55bfa6db-7b61-4f7b-a860-197d4f32ba8b-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 11 01:16:48 crc kubenswrapper[4744]: I0311 01:16:48.948656 4744 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/55bfa6db-7b61-4f7b-a860-197d4f32ba8b-etc-machine-id\") on node \"crc\" DevicePath \"\"" Mar 11 01:16:48 crc kubenswrapper[4744]: I0311 01:16:48.948669 4744 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/55bfa6db-7b61-4f7b-a860-197d4f32ba8b-scripts\") on node \"crc\" DevicePath \"\"" Mar 11 01:16:48 crc kubenswrapper[4744]: I0311 01:16:48.948680 4744 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/55bfa6db-7b61-4f7b-a860-197d4f32ba8b-config-data-custom\") on node \"crc\" DevicePath \"\"" Mar 11 01:16:48 crc kubenswrapper[4744]: I0311 01:16:48.948693 4744 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-h4728\" (UniqueName: \"kubernetes.io/projected/55bfa6db-7b61-4f7b-a860-197d4f32ba8b-kube-api-access-h4728\") on node \"crc\" DevicePath \"\"" Mar 11 01:16:48 crc kubenswrapper[4744]: I0311 01:16:48.962668 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/secret/55bfa6db-7b61-4f7b-a860-197d4f32ba8b-config-data" (OuterVolumeSpecName: "config-data") pod "55bfa6db-7b61-4f7b-a860-197d4f32ba8b" (UID: "55bfa6db-7b61-4f7b-a860-197d4f32ba8b"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 01:16:49 crc kubenswrapper[4744]: I0311 01:16:49.050031 4744 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/55bfa6db-7b61-4f7b-a860-197d4f32ba8b-config-data\") on node \"crc\" DevicePath \"\"" Mar 11 01:16:49 crc kubenswrapper[4744]: I0311 01:16:49.617920 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"55bfa6db-7b61-4f7b-a860-197d4f32ba8b","Type":"ContainerDied","Data":"93a6cd7513e17aef8eba0907d0bb348bfd4a0b90d4636accddc148e65cf4e89a"} Mar 11 01:16:49 crc kubenswrapper[4744]: I0311 01:16:49.617983 4744 scope.go:117] "RemoveContainer" containerID="6f4a242e298f9b8510414796fdb1e3a9bb73ced97fb7699bc14f64649589cf10" Mar 11 01:16:49 crc kubenswrapper[4744]: I0311 01:16:49.618124 4744 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Mar 11 01:16:49 crc kubenswrapper[4744]: I0311 01:16:49.635375 4744 scope.go:117] "RemoveContainer" containerID="4d56a62786b00c64a34dc6d390736c59715ec52588297c8759a2ebb5e25ed9a2" Mar 11 01:16:49 crc kubenswrapper[4744]: I0311 01:16:49.658612 4744 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-scheduler-0"] Mar 11 01:16:49 crc kubenswrapper[4744]: I0311 01:16:49.679688 4744 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-scheduler-0"] Mar 11 01:16:49 crc kubenswrapper[4744]: I0311 01:16:49.688907 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-scheduler-0"] Mar 11 01:16:49 crc kubenswrapper[4744]: E0311 01:16:49.689242 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="909cbc8d-726a-427a-9233-c5c3ea5387f0" containerName="dnsmasq-dns" Mar 11 01:16:49 crc kubenswrapper[4744]: I0311 01:16:49.689257 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="909cbc8d-726a-427a-9233-c5c3ea5387f0" containerName="dnsmasq-dns" Mar 11 01:16:49 crc kubenswrapper[4744]: E0311 01:16:49.689274 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="55bfa6db-7b61-4f7b-a860-197d4f32ba8b" containerName="probe" Mar 11 01:16:49 crc kubenswrapper[4744]: I0311 01:16:49.689281 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="55bfa6db-7b61-4f7b-a860-197d4f32ba8b" containerName="probe" Mar 11 01:16:49 crc kubenswrapper[4744]: E0311 01:16:49.689290 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b6e79975-841a-4d71-9906-e53607f1b3fb" containerName="barbican-api" Mar 11 01:16:49 crc kubenswrapper[4744]: I0311 01:16:49.689296 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="b6e79975-841a-4d71-9906-e53607f1b3fb" containerName="barbican-api" Mar 11 01:16:49 crc kubenswrapper[4744]: E0311 01:16:49.689306 4744 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="e5f6b131-1501-45aa-8ebf-76f7d454baad" containerName="neutron-httpd" Mar 11 01:16:49 crc kubenswrapper[4744]: I0311 01:16:49.689312 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="e5f6b131-1501-45aa-8ebf-76f7d454baad" containerName="neutron-httpd" Mar 11 01:16:49 crc kubenswrapper[4744]: E0311 01:16:49.689323 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="909cbc8d-726a-427a-9233-c5c3ea5387f0" containerName="init" Mar 11 01:16:49 crc kubenswrapper[4744]: I0311 01:16:49.689329 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="909cbc8d-726a-427a-9233-c5c3ea5387f0" containerName="init" Mar 11 01:16:49 crc kubenswrapper[4744]: E0311 01:16:49.689348 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e5f6b131-1501-45aa-8ebf-76f7d454baad" containerName="neutron-api" Mar 11 01:16:49 crc kubenswrapper[4744]: I0311 01:16:49.689354 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="e5f6b131-1501-45aa-8ebf-76f7d454baad" containerName="neutron-api" Mar 11 01:16:49 crc kubenswrapper[4744]: E0311 01:16:49.689364 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b6e79975-841a-4d71-9906-e53607f1b3fb" containerName="barbican-api-log" Mar 11 01:16:49 crc kubenswrapper[4744]: I0311 01:16:49.689369 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="b6e79975-841a-4d71-9906-e53607f1b3fb" containerName="barbican-api-log" Mar 11 01:16:49 crc kubenswrapper[4744]: E0311 01:16:49.689378 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="55bfa6db-7b61-4f7b-a860-197d4f32ba8b" containerName="cinder-scheduler" Mar 11 01:16:49 crc kubenswrapper[4744]: I0311 01:16:49.689384 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="55bfa6db-7b61-4f7b-a860-197d4f32ba8b" containerName="cinder-scheduler" Mar 11 01:16:49 crc kubenswrapper[4744]: I0311 01:16:49.689547 4744 memory_manager.go:354] "RemoveStaleState removing state" podUID="e5f6b131-1501-45aa-8ebf-76f7d454baad" 
containerName="neutron-httpd" Mar 11 01:16:49 crc kubenswrapper[4744]: I0311 01:16:49.689562 4744 memory_manager.go:354] "RemoveStaleState removing state" podUID="b6e79975-841a-4d71-9906-e53607f1b3fb" containerName="barbican-api" Mar 11 01:16:49 crc kubenswrapper[4744]: I0311 01:16:49.689571 4744 memory_manager.go:354] "RemoveStaleState removing state" podUID="55bfa6db-7b61-4f7b-a860-197d4f32ba8b" containerName="probe" Mar 11 01:16:49 crc kubenswrapper[4744]: I0311 01:16:49.689579 4744 memory_manager.go:354] "RemoveStaleState removing state" podUID="b6e79975-841a-4d71-9906-e53607f1b3fb" containerName="barbican-api-log" Mar 11 01:16:49 crc kubenswrapper[4744]: I0311 01:16:49.689590 4744 memory_manager.go:354] "RemoveStaleState removing state" podUID="e5f6b131-1501-45aa-8ebf-76f7d454baad" containerName="neutron-api" Mar 11 01:16:49 crc kubenswrapper[4744]: I0311 01:16:49.689598 4744 memory_manager.go:354] "RemoveStaleState removing state" podUID="909cbc8d-726a-427a-9233-c5c3ea5387f0" containerName="dnsmasq-dns" Mar 11 01:16:49 crc kubenswrapper[4744]: I0311 01:16:49.689613 4744 memory_manager.go:354] "RemoveStaleState removing state" podUID="55bfa6db-7b61-4f7b-a860-197d4f32ba8b" containerName="cinder-scheduler" Mar 11 01:16:49 crc kubenswrapper[4744]: I0311 01:16:49.690424 4744 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Mar 11 01:16:49 crc kubenswrapper[4744]: I0311 01:16:49.694061 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scheduler-config-data" Mar 11 01:16:49 crc kubenswrapper[4744]: I0311 01:16:49.712679 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Mar 11 01:16:49 crc kubenswrapper[4744]: I0311 01:16:49.863532 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a336a32d-e322-4261-8a29-ce0f30435d83-config-data\") pod \"cinder-scheduler-0\" (UID: \"a336a32d-e322-4261-8a29-ce0f30435d83\") " pod="openstack/cinder-scheduler-0" Mar 11 01:16:49 crc kubenswrapper[4744]: I0311 01:16:49.863641 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5x82h\" (UniqueName: \"kubernetes.io/projected/a336a32d-e322-4261-8a29-ce0f30435d83-kube-api-access-5x82h\") pod \"cinder-scheduler-0\" (UID: \"a336a32d-e322-4261-8a29-ce0f30435d83\") " pod="openstack/cinder-scheduler-0" Mar 11 01:16:49 crc kubenswrapper[4744]: I0311 01:16:49.863816 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a336a32d-e322-4261-8a29-ce0f30435d83-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"a336a32d-e322-4261-8a29-ce0f30435d83\") " pod="openstack/cinder-scheduler-0" Mar 11 01:16:49 crc kubenswrapper[4744]: I0311 01:16:49.864459 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/a336a32d-e322-4261-8a29-ce0f30435d83-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"a336a32d-e322-4261-8a29-ce0f30435d83\") " pod="openstack/cinder-scheduler-0" Mar 11 01:16:49 crc kubenswrapper[4744]: I0311 
01:16:49.864536 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/a336a32d-e322-4261-8a29-ce0f30435d83-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"a336a32d-e322-4261-8a29-ce0f30435d83\") " pod="openstack/cinder-scheduler-0" Mar 11 01:16:49 crc kubenswrapper[4744]: I0311 01:16:49.864642 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a336a32d-e322-4261-8a29-ce0f30435d83-scripts\") pod \"cinder-scheduler-0\" (UID: \"a336a32d-e322-4261-8a29-ce0f30435d83\") " pod="openstack/cinder-scheduler-0" Mar 11 01:16:49 crc kubenswrapper[4744]: I0311 01:16:49.965977 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a336a32d-e322-4261-8a29-ce0f30435d83-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"a336a32d-e322-4261-8a29-ce0f30435d83\") " pod="openstack/cinder-scheduler-0" Mar 11 01:16:49 crc kubenswrapper[4744]: I0311 01:16:49.966078 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/a336a32d-e322-4261-8a29-ce0f30435d83-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"a336a32d-e322-4261-8a29-ce0f30435d83\") " pod="openstack/cinder-scheduler-0" Mar 11 01:16:49 crc kubenswrapper[4744]: I0311 01:16:49.966126 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/a336a32d-e322-4261-8a29-ce0f30435d83-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"a336a32d-e322-4261-8a29-ce0f30435d83\") " pod="openstack/cinder-scheduler-0" Mar 11 01:16:49 crc kubenswrapper[4744]: I0311 01:16:49.966189 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" 
(UniqueName: \"kubernetes.io/secret/a336a32d-e322-4261-8a29-ce0f30435d83-scripts\") pod \"cinder-scheduler-0\" (UID: \"a336a32d-e322-4261-8a29-ce0f30435d83\") " pod="openstack/cinder-scheduler-0" Mar 11 01:16:49 crc kubenswrapper[4744]: I0311 01:16:49.966251 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a336a32d-e322-4261-8a29-ce0f30435d83-config-data\") pod \"cinder-scheduler-0\" (UID: \"a336a32d-e322-4261-8a29-ce0f30435d83\") " pod="openstack/cinder-scheduler-0" Mar 11 01:16:49 crc kubenswrapper[4744]: I0311 01:16:49.966291 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5x82h\" (UniqueName: \"kubernetes.io/projected/a336a32d-e322-4261-8a29-ce0f30435d83-kube-api-access-5x82h\") pod \"cinder-scheduler-0\" (UID: \"a336a32d-e322-4261-8a29-ce0f30435d83\") " pod="openstack/cinder-scheduler-0" Mar 11 01:16:49 crc kubenswrapper[4744]: I0311 01:16:49.967464 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/a336a32d-e322-4261-8a29-ce0f30435d83-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"a336a32d-e322-4261-8a29-ce0f30435d83\") " pod="openstack/cinder-scheduler-0" Mar 11 01:16:49 crc kubenswrapper[4744]: I0311 01:16:49.970735 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a336a32d-e322-4261-8a29-ce0f30435d83-scripts\") pod \"cinder-scheduler-0\" (UID: \"a336a32d-e322-4261-8a29-ce0f30435d83\") " pod="openstack/cinder-scheduler-0" Mar 11 01:16:49 crc kubenswrapper[4744]: I0311 01:16:49.970766 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/a336a32d-e322-4261-8a29-ce0f30435d83-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"a336a32d-e322-4261-8a29-ce0f30435d83\") " 
pod="openstack/cinder-scheduler-0" Mar 11 01:16:49 crc kubenswrapper[4744]: I0311 01:16:49.972247 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a336a32d-e322-4261-8a29-ce0f30435d83-config-data\") pod \"cinder-scheduler-0\" (UID: \"a336a32d-e322-4261-8a29-ce0f30435d83\") " pod="openstack/cinder-scheduler-0" Mar 11 01:16:49 crc kubenswrapper[4744]: I0311 01:16:49.973931 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a336a32d-e322-4261-8a29-ce0f30435d83-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"a336a32d-e322-4261-8a29-ce0f30435d83\") " pod="openstack/cinder-scheduler-0" Mar 11 01:16:49 crc kubenswrapper[4744]: I0311 01:16:49.984874 4744 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="55bfa6db-7b61-4f7b-a860-197d4f32ba8b" path="/var/lib/kubelet/pods/55bfa6db-7b61-4f7b-a860-197d4f32ba8b/volumes" Mar 11 01:16:49 crc kubenswrapper[4744]: I0311 01:16:49.997627 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5x82h\" (UniqueName: \"kubernetes.io/projected/a336a32d-e322-4261-8a29-ce0f30435d83-kube-api-access-5x82h\") pod \"cinder-scheduler-0\" (UID: \"a336a32d-e322-4261-8a29-ce0f30435d83\") " pod="openstack/cinder-scheduler-0" Mar 11 01:16:50 crc kubenswrapper[4744]: I0311 01:16:50.004629 4744 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Mar 11 01:16:50 crc kubenswrapper[4744]: I0311 01:16:50.475060 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Mar 11 01:16:50 crc kubenswrapper[4744]: I0311 01:16:50.630240 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"a336a32d-e322-4261-8a29-ce0f30435d83","Type":"ContainerStarted","Data":"bcececd1efdbb237ee0d6c1f37f9d3dbe0f983db79fd2f2dcbd36e9ec1bc9cce"} Mar 11 01:16:50 crc kubenswrapper[4744]: I0311 01:16:50.695964 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/cinder-api-0" Mar 11 01:16:51 crc kubenswrapper[4744]: I0311 01:16:51.641095 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"a336a32d-e322-4261-8a29-ce0f30435d83","Type":"ContainerStarted","Data":"e407aadc6fa1cd45a9aa2b8ef3203e0c6ec01e80327e2c3206a5ebcd6d919055"} Mar 11 01:16:52 crc kubenswrapper[4744]: I0311 01:16:52.460527 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstackclient"] Mar 11 01:16:52 crc kubenswrapper[4744]: I0311 01:16:52.461757 4744 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstackclient" Mar 11 01:16:52 crc kubenswrapper[4744]: I0311 01:16:52.465275 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-config-secret" Mar 11 01:16:52 crc kubenswrapper[4744]: I0311 01:16:52.465831 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-config" Mar 11 01:16:52 crc kubenswrapper[4744]: I0311 01:16:52.465964 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstackclient-openstackclient-dockercfg-fzhzn" Mar 11 01:16:52 crc kubenswrapper[4744]: I0311 01:16:52.474075 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Mar 11 01:16:52 crc kubenswrapper[4744]: I0311 01:16:52.536712 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/54b5d61b-3a00-476b-ad7a-424b57919e9a-combined-ca-bundle\") pod \"openstackclient\" (UID: \"54b5d61b-3a00-476b-ad7a-424b57919e9a\") " pod="openstack/openstackclient" Mar 11 01:16:52 crc kubenswrapper[4744]: I0311 01:16:52.536805 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/54b5d61b-3a00-476b-ad7a-424b57919e9a-openstack-config\") pod \"openstackclient\" (UID: \"54b5d61b-3a00-476b-ad7a-424b57919e9a\") " pod="openstack/openstackclient" Mar 11 01:16:52 crc kubenswrapper[4744]: I0311 01:16:52.536861 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qn89v\" (UniqueName: \"kubernetes.io/projected/54b5d61b-3a00-476b-ad7a-424b57919e9a-kube-api-access-qn89v\") pod \"openstackclient\" (UID: \"54b5d61b-3a00-476b-ad7a-424b57919e9a\") " pod="openstack/openstackclient" Mar 11 01:16:52 crc kubenswrapper[4744]: I0311 01:16:52.536916 4744 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/54b5d61b-3a00-476b-ad7a-424b57919e9a-openstack-config-secret\") pod \"openstackclient\" (UID: \"54b5d61b-3a00-476b-ad7a-424b57919e9a\") " pod="openstack/openstackclient" Mar 11 01:16:52 crc kubenswrapper[4744]: I0311 01:16:52.638022 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/54b5d61b-3a00-476b-ad7a-424b57919e9a-combined-ca-bundle\") pod \"openstackclient\" (UID: \"54b5d61b-3a00-476b-ad7a-424b57919e9a\") " pod="openstack/openstackclient" Mar 11 01:16:52 crc kubenswrapper[4744]: I0311 01:16:52.638093 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/54b5d61b-3a00-476b-ad7a-424b57919e9a-openstack-config\") pod \"openstackclient\" (UID: \"54b5d61b-3a00-476b-ad7a-424b57919e9a\") " pod="openstack/openstackclient" Mar 11 01:16:52 crc kubenswrapper[4744]: I0311 01:16:52.638128 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qn89v\" (UniqueName: \"kubernetes.io/projected/54b5d61b-3a00-476b-ad7a-424b57919e9a-kube-api-access-qn89v\") pod \"openstackclient\" (UID: \"54b5d61b-3a00-476b-ad7a-424b57919e9a\") " pod="openstack/openstackclient" Mar 11 01:16:52 crc kubenswrapper[4744]: I0311 01:16:52.638161 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/54b5d61b-3a00-476b-ad7a-424b57919e9a-openstack-config-secret\") pod \"openstackclient\" (UID: \"54b5d61b-3a00-476b-ad7a-424b57919e9a\") " pod="openstack/openstackclient" Mar 11 01:16:52 crc kubenswrapper[4744]: I0311 01:16:52.639363 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config\" (UniqueName: 
\"kubernetes.io/configmap/54b5d61b-3a00-476b-ad7a-424b57919e9a-openstack-config\") pod \"openstackclient\" (UID: \"54b5d61b-3a00-476b-ad7a-424b57919e9a\") " pod="openstack/openstackclient" Mar 11 01:16:52 crc kubenswrapper[4744]: I0311 01:16:52.646985 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/54b5d61b-3a00-476b-ad7a-424b57919e9a-combined-ca-bundle\") pod \"openstackclient\" (UID: \"54b5d61b-3a00-476b-ad7a-424b57919e9a\") " pod="openstack/openstackclient" Mar 11 01:16:52 crc kubenswrapper[4744]: I0311 01:16:52.647974 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/54b5d61b-3a00-476b-ad7a-424b57919e9a-openstack-config-secret\") pod \"openstackclient\" (UID: \"54b5d61b-3a00-476b-ad7a-424b57919e9a\") " pod="openstack/openstackclient" Mar 11 01:16:52 crc kubenswrapper[4744]: I0311 01:16:52.659472 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qn89v\" (UniqueName: \"kubernetes.io/projected/54b5d61b-3a00-476b-ad7a-424b57919e9a-kube-api-access-qn89v\") pod \"openstackclient\" (UID: \"54b5d61b-3a00-476b-ad7a-424b57919e9a\") " pod="openstack/openstackclient" Mar 11 01:16:52 crc kubenswrapper[4744]: I0311 01:16:52.661798 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"a336a32d-e322-4261-8a29-ce0f30435d83","Type":"ContainerStarted","Data":"8a970e26c9b1c228eccbd5179f834fcbf2574e2fb70d018dcd7783d99b6ddc44"} Mar 11 01:16:52 crc kubenswrapper[4744]: I0311 01:16:52.791159 4744 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstackclient" Mar 11 01:16:52 crc kubenswrapper[4744]: I0311 01:16:52.803749 4744 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-scheduler-0" podStartSLOduration=3.803733053 podStartE2EDuration="3.803733053s" podCreationTimestamp="2026-03-11 01:16:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-11 01:16:52.683218407 +0000 UTC m=+1369.487436012" watchObservedRunningTime="2026-03-11 01:16:52.803733053 +0000 UTC m=+1369.607950658" Mar 11 01:16:52 crc kubenswrapper[4744]: I0311 01:16:52.811306 4744 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/openstackclient"] Mar 11 01:16:52 crc kubenswrapper[4744]: I0311 01:16:52.818093 4744 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/openstackclient"] Mar 11 01:16:52 crc kubenswrapper[4744]: I0311 01:16:52.862866 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstackclient"] Mar 11 01:16:52 crc kubenswrapper[4744]: I0311 01:16:52.863999 4744 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstackclient" Mar 11 01:16:52 crc kubenswrapper[4744]: I0311 01:16:52.876788 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Mar 11 01:16:52 crc kubenswrapper[4744]: I0311 01:16:52.942289 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/35ec702b-4aa6-4fa6-a770-ec3caf762d5f-openstack-config-secret\") pod \"openstackclient\" (UID: \"35ec702b-4aa6-4fa6-a770-ec3caf762d5f\") " pod="openstack/openstackclient" Mar 11 01:16:52 crc kubenswrapper[4744]: I0311 01:16:52.942353 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vth9r\" (UniqueName: \"kubernetes.io/projected/35ec702b-4aa6-4fa6-a770-ec3caf762d5f-kube-api-access-vth9r\") pod \"openstackclient\" (UID: \"35ec702b-4aa6-4fa6-a770-ec3caf762d5f\") " pod="openstack/openstackclient" Mar 11 01:16:52 crc kubenswrapper[4744]: I0311 01:16:52.942380 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/35ec702b-4aa6-4fa6-a770-ec3caf762d5f-openstack-config\") pod \"openstackclient\" (UID: \"35ec702b-4aa6-4fa6-a770-ec3caf762d5f\") " pod="openstack/openstackclient" Mar 11 01:16:52 crc kubenswrapper[4744]: I0311 01:16:52.942441 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/35ec702b-4aa6-4fa6-a770-ec3caf762d5f-combined-ca-bundle\") pod \"openstackclient\" (UID: \"35ec702b-4aa6-4fa6-a770-ec3caf762d5f\") " pod="openstack/openstackclient" Mar 11 01:16:52 crc kubenswrapper[4744]: E0311 01:16:52.957915 4744 log.go:32] "RunPodSandbox from runtime service failed" err=< Mar 11 01:16:52 crc kubenswrapper[4744]: rpc error: code = Unknown desc = failed to create pod 
network sandbox k8s_openstackclient_openstack_54b5d61b-3a00-476b-ad7a-424b57919e9a_0(581cf142e46b0ceef35494674b1d15efd228cdd757f9c741ce5328d4e2da81e9): error adding pod openstack_openstackclient to CNI network "multus-cni-network": plugin type="multus-shim" name="multus-cni-network" failed (add): CmdAdd (shim): CNI request failed with status 400: 'ContainerID:"581cf142e46b0ceef35494674b1d15efd228cdd757f9c741ce5328d4e2da81e9" Netns:"/var/run/netns/dd620e65-b1d1-4fd5-82f0-2c1dc4400cb7" IfName:"eth0" Args:"IgnoreUnknown=1;K8S_POD_NAMESPACE=openstack;K8S_POD_NAME=openstackclient;K8S_POD_INFRA_CONTAINER_ID=581cf142e46b0ceef35494674b1d15efd228cdd757f9c741ce5328d4e2da81e9;K8S_POD_UID=54b5d61b-3a00-476b-ad7a-424b57919e9a" Path:"" ERRORED: error configuring pod [openstack/openstackclient] networking: Multus: [openstack/openstackclient/54b5d61b-3a00-476b-ad7a-424b57919e9a]: expected pod UID "54b5d61b-3a00-476b-ad7a-424b57919e9a" but got "35ec702b-4aa6-4fa6-a770-ec3caf762d5f" from Kube API Mar 11 01:16:52 crc kubenswrapper[4744]: ': StdinData: {"binDir":"/var/lib/cni/bin","clusterNetwork":"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf","cniVersion":"0.3.1","daemonSocketDir":"/run/multus/socket","globalNamespaces":"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv","logLevel":"verbose","logToStderr":true,"name":"multus-cni-network","namespaceIsolation":true,"type":"multus-shim"} Mar 11 01:16:52 crc kubenswrapper[4744]: > Mar 11 01:16:52 crc kubenswrapper[4744]: E0311 01:16:52.957971 4744 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err=< Mar 11 01:16:52 crc kubenswrapper[4744]: rpc error: code = Unknown desc = failed to create pod network sandbox k8s_openstackclient_openstack_54b5d61b-3a00-476b-ad7a-424b57919e9a_0(581cf142e46b0ceef35494674b1d15efd228cdd757f9c741ce5328d4e2da81e9): error adding pod openstack_openstackclient to CNI network "multus-cni-network": plugin type="multus-shim" name="multus-cni-network" failed (add): CmdAdd 
(shim): CNI request failed with status 400: 'ContainerID:"581cf142e46b0ceef35494674b1d15efd228cdd757f9c741ce5328d4e2da81e9" Netns:"/var/run/netns/dd620e65-b1d1-4fd5-82f0-2c1dc4400cb7" IfName:"eth0" Args:"IgnoreUnknown=1;K8S_POD_NAMESPACE=openstack;K8S_POD_NAME=openstackclient;K8S_POD_INFRA_CONTAINER_ID=581cf142e46b0ceef35494674b1d15efd228cdd757f9c741ce5328d4e2da81e9;K8S_POD_UID=54b5d61b-3a00-476b-ad7a-424b57919e9a" Path:"" ERRORED: error configuring pod [openstack/openstackclient] networking: Multus: [openstack/openstackclient/54b5d61b-3a00-476b-ad7a-424b57919e9a]: expected pod UID "54b5d61b-3a00-476b-ad7a-424b57919e9a" but got "35ec702b-4aa6-4fa6-a770-ec3caf762d5f" from Kube API Mar 11 01:16:52 crc kubenswrapper[4744]: ': StdinData: {"binDir":"/var/lib/cni/bin","clusterNetwork":"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf","cniVersion":"0.3.1","daemonSocketDir":"/run/multus/socket","globalNamespaces":"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv","logLevel":"verbose","logToStderr":true,"name":"multus-cni-network","namespaceIsolation":true,"type":"multus-shim"} Mar 11 01:16:52 crc kubenswrapper[4744]: > pod="openstack/openstackclient" Mar 11 01:16:53 crc kubenswrapper[4744]: I0311 01:16:53.043704 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vth9r\" (UniqueName: \"kubernetes.io/projected/35ec702b-4aa6-4fa6-a770-ec3caf762d5f-kube-api-access-vth9r\") pod \"openstackclient\" (UID: \"35ec702b-4aa6-4fa6-a770-ec3caf762d5f\") " pod="openstack/openstackclient" Mar 11 01:16:53 crc kubenswrapper[4744]: I0311 01:16:53.043761 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/35ec702b-4aa6-4fa6-a770-ec3caf762d5f-openstack-config\") pod \"openstackclient\" (UID: \"35ec702b-4aa6-4fa6-a770-ec3caf762d5f\") " pod="openstack/openstackclient" Mar 11 01:16:53 crc kubenswrapper[4744]: I0311 01:16:53.043845 4744 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/35ec702b-4aa6-4fa6-a770-ec3caf762d5f-combined-ca-bundle\") pod \"openstackclient\" (UID: \"35ec702b-4aa6-4fa6-a770-ec3caf762d5f\") " pod="openstack/openstackclient" Mar 11 01:16:53 crc kubenswrapper[4744]: I0311 01:16:53.043930 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/35ec702b-4aa6-4fa6-a770-ec3caf762d5f-openstack-config-secret\") pod \"openstackclient\" (UID: \"35ec702b-4aa6-4fa6-a770-ec3caf762d5f\") " pod="openstack/openstackclient" Mar 11 01:16:53 crc kubenswrapper[4744]: I0311 01:16:53.044925 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/35ec702b-4aa6-4fa6-a770-ec3caf762d5f-openstack-config\") pod \"openstackclient\" (UID: \"35ec702b-4aa6-4fa6-a770-ec3caf762d5f\") " pod="openstack/openstackclient" Mar 11 01:16:53 crc kubenswrapper[4744]: I0311 01:16:53.048736 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/35ec702b-4aa6-4fa6-a770-ec3caf762d5f-combined-ca-bundle\") pod \"openstackclient\" (UID: \"35ec702b-4aa6-4fa6-a770-ec3caf762d5f\") " pod="openstack/openstackclient" Mar 11 01:16:53 crc kubenswrapper[4744]: I0311 01:16:53.048836 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/35ec702b-4aa6-4fa6-a770-ec3caf762d5f-openstack-config-secret\") pod \"openstackclient\" (UID: \"35ec702b-4aa6-4fa6-a770-ec3caf762d5f\") " pod="openstack/openstackclient" Mar 11 01:16:53 crc kubenswrapper[4744]: I0311 01:16:53.063720 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vth9r\" (UniqueName: 
\"kubernetes.io/projected/35ec702b-4aa6-4fa6-a770-ec3caf762d5f-kube-api-access-vth9r\") pod \"openstackclient\" (UID: \"35ec702b-4aa6-4fa6-a770-ec3caf762d5f\") " pod="openstack/openstackclient" Mar 11 01:16:53 crc kubenswrapper[4744]: I0311 01:16:53.206132 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient" Mar 11 01:16:53 crc kubenswrapper[4744]: I0311 01:16:53.677489 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient" Mar 11 01:16:53 crc kubenswrapper[4744]: I0311 01:16:53.683005 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Mar 11 01:16:53 crc kubenswrapper[4744]: I0311 01:16:53.691145 4744 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openstack/openstackclient" oldPodUID="54b5d61b-3a00-476b-ad7a-424b57919e9a" podUID="35ec702b-4aa6-4fa6-a770-ec3caf762d5f" Mar 11 01:16:53 crc kubenswrapper[4744]: I0311 01:16:53.753880 4744 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstackclient" Mar 11 01:16:53 crc kubenswrapper[4744]: I0311 01:16:53.855230 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/54b5d61b-3a00-476b-ad7a-424b57919e9a-openstack-config\") pod \"54b5d61b-3a00-476b-ad7a-424b57919e9a\" (UID: \"54b5d61b-3a00-476b-ad7a-424b57919e9a\") " Mar 11 01:16:53 crc kubenswrapper[4744]: I0311 01:16:53.855417 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qn89v\" (UniqueName: \"kubernetes.io/projected/54b5d61b-3a00-476b-ad7a-424b57919e9a-kube-api-access-qn89v\") pod \"54b5d61b-3a00-476b-ad7a-424b57919e9a\" (UID: \"54b5d61b-3a00-476b-ad7a-424b57919e9a\") " Mar 11 01:16:53 crc kubenswrapper[4744]: I0311 01:16:53.855498 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/54b5d61b-3a00-476b-ad7a-424b57919e9a-openstack-config-secret\") pod \"54b5d61b-3a00-476b-ad7a-424b57919e9a\" (UID: \"54b5d61b-3a00-476b-ad7a-424b57919e9a\") " Mar 11 01:16:53 crc kubenswrapper[4744]: I0311 01:16:53.855650 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/54b5d61b-3a00-476b-ad7a-424b57919e9a-combined-ca-bundle\") pod \"54b5d61b-3a00-476b-ad7a-424b57919e9a\" (UID: \"54b5d61b-3a00-476b-ad7a-424b57919e9a\") " Mar 11 01:16:53 crc kubenswrapper[4744]: I0311 01:16:53.855752 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/54b5d61b-3a00-476b-ad7a-424b57919e9a-openstack-config" (OuterVolumeSpecName: "openstack-config") pod "54b5d61b-3a00-476b-ad7a-424b57919e9a" (UID: "54b5d61b-3a00-476b-ad7a-424b57919e9a"). InnerVolumeSpecName "openstack-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 11 01:16:53 crc kubenswrapper[4744]: I0311 01:16:53.856102 4744 reconciler_common.go:293] "Volume detached for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/54b5d61b-3a00-476b-ad7a-424b57919e9a-openstack-config\") on node \"crc\" DevicePath \"\"" Mar 11 01:16:53 crc kubenswrapper[4744]: I0311 01:16:53.863727 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/54b5d61b-3a00-476b-ad7a-424b57919e9a-kube-api-access-qn89v" (OuterVolumeSpecName: "kube-api-access-qn89v") pod "54b5d61b-3a00-476b-ad7a-424b57919e9a" (UID: "54b5d61b-3a00-476b-ad7a-424b57919e9a"). InnerVolumeSpecName "kube-api-access-qn89v". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 01:16:53 crc kubenswrapper[4744]: I0311 01:16:53.864120 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/54b5d61b-3a00-476b-ad7a-424b57919e9a-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "54b5d61b-3a00-476b-ad7a-424b57919e9a" (UID: "54b5d61b-3a00-476b-ad7a-424b57919e9a"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 01:16:53 crc kubenswrapper[4744]: I0311 01:16:53.886639 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/54b5d61b-3a00-476b-ad7a-424b57919e9a-openstack-config-secret" (OuterVolumeSpecName: "openstack-config-secret") pod "54b5d61b-3a00-476b-ad7a-424b57919e9a" (UID: "54b5d61b-3a00-476b-ad7a-424b57919e9a"). InnerVolumeSpecName "openstack-config-secret". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 01:16:53 crc kubenswrapper[4744]: I0311 01:16:53.958105 4744 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qn89v\" (UniqueName: \"kubernetes.io/projected/54b5d61b-3a00-476b-ad7a-424b57919e9a-kube-api-access-qn89v\") on node \"crc\" DevicePath \"\"" Mar 11 01:16:53 crc kubenswrapper[4744]: I0311 01:16:53.958130 4744 reconciler_common.go:293] "Volume detached for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/54b5d61b-3a00-476b-ad7a-424b57919e9a-openstack-config-secret\") on node \"crc\" DevicePath \"\"" Mar 11 01:16:53 crc kubenswrapper[4744]: I0311 01:16:53.958139 4744 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/54b5d61b-3a00-476b-ad7a-424b57919e9a-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 11 01:16:53 crc kubenswrapper[4744]: I0311 01:16:53.988484 4744 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="54b5d61b-3a00-476b-ad7a-424b57919e9a" path="/var/lib/kubelet/pods/54b5d61b-3a00-476b-ad7a-424b57919e9a/volumes" Mar 11 01:16:54 crc kubenswrapper[4744]: E0311 01:16:54.104209 4744 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod54b5d61b_3a00_476b_ad7a_424b57919e9a.slice\": RecentStats: unable to find data in memory cache]" Mar 11 01:16:54 crc kubenswrapper[4744]: I0311 01:16:54.685684 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" event={"ID":"35ec702b-4aa6-4fa6-a770-ec3caf762d5f","Type":"ContainerStarted","Data":"254b5d84d26672423311ca593c8fbaea7b9fe875fc7a778ac56027267934c2ec"} Mar 11 01:16:54 crc kubenswrapper[4744]: I0311 01:16:54.685712 4744 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstackclient" Mar 11 01:16:54 crc kubenswrapper[4744]: I0311 01:16:54.691358 4744 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openstack/openstackclient" oldPodUID="54b5d61b-3a00-476b-ad7a-424b57919e9a" podUID="35ec702b-4aa6-4fa6-a770-ec3caf762d5f" Mar 11 01:16:55 crc kubenswrapper[4744]: I0311 01:16:55.005856 4744 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-scheduler-0" Mar 11 01:16:55 crc kubenswrapper[4744]: I0311 01:16:55.927734 4744 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 11 01:16:55 crc kubenswrapper[4744]: I0311 01:16:55.929128 4744 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="02d576b4-3837-45a7-ae82-8790466a57e5" containerName="ceilometer-central-agent" containerID="cri-o://b26c41f5dfa25d3915bd43a8d3955b043a4c2b5a5ea7a98451ab36111298909b" gracePeriod=30 Mar 11 01:16:55 crc kubenswrapper[4744]: I0311 01:16:55.929379 4744 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="02d576b4-3837-45a7-ae82-8790466a57e5" containerName="proxy-httpd" containerID="cri-o://35504fdcdfb50e70bdbe7605a68768479239c5ac31339d8da26ce06db0109478" gracePeriod=30 Mar 11 01:16:55 crc kubenswrapper[4744]: I0311 01:16:55.929543 4744 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="02d576b4-3837-45a7-ae82-8790466a57e5" containerName="sg-core" containerID="cri-o://a1d0d9e392c18f0e1f7a8fb2a2ca1229ec6bdde55933a18709ac9f5dfc14797d" gracePeriod=30 Mar 11 01:16:55 crc kubenswrapper[4744]: I0311 01:16:55.929683 4744 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="02d576b4-3837-45a7-ae82-8790466a57e5" containerName="ceilometer-notification-agent" 
containerID="cri-o://c721d31662de7199ff6a4551c3f0cf6e79e3b15aaf72c0cb6cc33a2952d19f70" gracePeriod=30 Mar 11 01:16:55 crc kubenswrapper[4744]: I0311 01:16:55.936488 4744 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ceilometer-0" podUID="02d576b4-3837-45a7-ae82-8790466a57e5" containerName="proxy-httpd" probeResult="failure" output="HTTP probe failed with statuscode: 502" Mar 11 01:16:56 crc kubenswrapper[4744]: I0311 01:16:56.087471 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-proxy-6bb49fdf95-9g9dz"] Mar 11 01:16:56 crc kubenswrapper[4744]: I0311 01:16:56.089684 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-proxy-6bb49fdf95-9g9dz" Mar 11 01:16:56 crc kubenswrapper[4744]: I0311 01:16:56.092337 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-proxy-config-data" Mar 11 01:16:56 crc kubenswrapper[4744]: I0311 01:16:56.094157 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-swift-public-svc" Mar 11 01:16:56 crc kubenswrapper[4744]: I0311 01:16:56.097383 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-swift-internal-svc" Mar 11 01:16:56 crc kubenswrapper[4744]: I0311 01:16:56.114904 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-proxy-6bb49fdf95-9g9dz"] Mar 11 01:16:56 crc kubenswrapper[4744]: I0311 01:16:56.217190 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/d60ef156-5767-43c1-bb0b-a8c681a8a6be-public-tls-certs\") pod \"swift-proxy-6bb49fdf95-9g9dz\" (UID: \"d60ef156-5767-43c1-bb0b-a8c681a8a6be\") " pod="openstack/swift-proxy-6bb49fdf95-9g9dz" Mar 11 01:16:56 crc kubenswrapper[4744]: I0311 01:16:56.217241 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" 
(UniqueName: \"kubernetes.io/empty-dir/d60ef156-5767-43c1-bb0b-a8c681a8a6be-log-httpd\") pod \"swift-proxy-6bb49fdf95-9g9dz\" (UID: \"d60ef156-5767-43c1-bb0b-a8c681a8a6be\") " pod="openstack/swift-proxy-6bb49fdf95-9g9dz" Mar 11 01:16:56 crc kubenswrapper[4744]: I0311 01:16:56.217276 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mgrcg\" (UniqueName: \"kubernetes.io/projected/d60ef156-5767-43c1-bb0b-a8c681a8a6be-kube-api-access-mgrcg\") pod \"swift-proxy-6bb49fdf95-9g9dz\" (UID: \"d60ef156-5767-43c1-bb0b-a8c681a8a6be\") " pod="openstack/swift-proxy-6bb49fdf95-9g9dz" Mar 11 01:16:56 crc kubenswrapper[4744]: I0311 01:16:56.217311 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d60ef156-5767-43c1-bb0b-a8c681a8a6be-run-httpd\") pod \"swift-proxy-6bb49fdf95-9g9dz\" (UID: \"d60ef156-5767-43c1-bb0b-a8c681a8a6be\") " pod="openstack/swift-proxy-6bb49fdf95-9g9dz" Mar 11 01:16:56 crc kubenswrapper[4744]: I0311 01:16:56.217362 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d60ef156-5767-43c1-bb0b-a8c681a8a6be-config-data\") pod \"swift-proxy-6bb49fdf95-9g9dz\" (UID: \"d60ef156-5767-43c1-bb0b-a8c681a8a6be\") " pod="openstack/swift-proxy-6bb49fdf95-9g9dz" Mar 11 01:16:56 crc kubenswrapper[4744]: I0311 01:16:56.217382 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/d60ef156-5767-43c1-bb0b-a8c681a8a6be-internal-tls-certs\") pod \"swift-proxy-6bb49fdf95-9g9dz\" (UID: \"d60ef156-5767-43c1-bb0b-a8c681a8a6be\") " pod="openstack/swift-proxy-6bb49fdf95-9g9dz" Mar 11 01:16:56 crc kubenswrapper[4744]: I0311 01:16:56.217402 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d60ef156-5767-43c1-bb0b-a8c681a8a6be-combined-ca-bundle\") pod \"swift-proxy-6bb49fdf95-9g9dz\" (UID: \"d60ef156-5767-43c1-bb0b-a8c681a8a6be\") " pod="openstack/swift-proxy-6bb49fdf95-9g9dz" Mar 11 01:16:56 crc kubenswrapper[4744]: I0311 01:16:56.217422 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/d60ef156-5767-43c1-bb0b-a8c681a8a6be-etc-swift\") pod \"swift-proxy-6bb49fdf95-9g9dz\" (UID: \"d60ef156-5767-43c1-bb0b-a8c681a8a6be\") " pod="openstack/swift-proxy-6bb49fdf95-9g9dz" Mar 11 01:16:56 crc kubenswrapper[4744]: I0311 01:16:56.319413 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/d60ef156-5767-43c1-bb0b-a8c681a8a6be-public-tls-certs\") pod \"swift-proxy-6bb49fdf95-9g9dz\" (UID: \"d60ef156-5767-43c1-bb0b-a8c681a8a6be\") " pod="openstack/swift-proxy-6bb49fdf95-9g9dz" Mar 11 01:16:56 crc kubenswrapper[4744]: I0311 01:16:56.319665 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d60ef156-5767-43c1-bb0b-a8c681a8a6be-log-httpd\") pod \"swift-proxy-6bb49fdf95-9g9dz\" (UID: \"d60ef156-5767-43c1-bb0b-a8c681a8a6be\") " pod="openstack/swift-proxy-6bb49fdf95-9g9dz" Mar 11 01:16:56 crc kubenswrapper[4744]: I0311 01:16:56.319801 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mgrcg\" (UniqueName: \"kubernetes.io/projected/d60ef156-5767-43c1-bb0b-a8c681a8a6be-kube-api-access-mgrcg\") pod \"swift-proxy-6bb49fdf95-9g9dz\" (UID: \"d60ef156-5767-43c1-bb0b-a8c681a8a6be\") " pod="openstack/swift-proxy-6bb49fdf95-9g9dz" Mar 11 01:16:56 crc kubenswrapper[4744]: I0311 01:16:56.319885 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" 
(UniqueName: \"kubernetes.io/empty-dir/d60ef156-5767-43c1-bb0b-a8c681a8a6be-run-httpd\") pod \"swift-proxy-6bb49fdf95-9g9dz\" (UID: \"d60ef156-5767-43c1-bb0b-a8c681a8a6be\") " pod="openstack/swift-proxy-6bb49fdf95-9g9dz" Mar 11 01:16:56 crc kubenswrapper[4744]: I0311 01:16:56.319995 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d60ef156-5767-43c1-bb0b-a8c681a8a6be-config-data\") pod \"swift-proxy-6bb49fdf95-9g9dz\" (UID: \"d60ef156-5767-43c1-bb0b-a8c681a8a6be\") " pod="openstack/swift-proxy-6bb49fdf95-9g9dz" Mar 11 01:16:56 crc kubenswrapper[4744]: I0311 01:16:56.320073 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/d60ef156-5767-43c1-bb0b-a8c681a8a6be-internal-tls-certs\") pod \"swift-proxy-6bb49fdf95-9g9dz\" (UID: \"d60ef156-5767-43c1-bb0b-a8c681a8a6be\") " pod="openstack/swift-proxy-6bb49fdf95-9g9dz" Mar 11 01:16:56 crc kubenswrapper[4744]: I0311 01:16:56.320137 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d60ef156-5767-43c1-bb0b-a8c681a8a6be-combined-ca-bundle\") pod \"swift-proxy-6bb49fdf95-9g9dz\" (UID: \"d60ef156-5767-43c1-bb0b-a8c681a8a6be\") " pod="openstack/swift-proxy-6bb49fdf95-9g9dz" Mar 11 01:16:56 crc kubenswrapper[4744]: I0311 01:16:56.320208 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/d60ef156-5767-43c1-bb0b-a8c681a8a6be-etc-swift\") pod \"swift-proxy-6bb49fdf95-9g9dz\" (UID: \"d60ef156-5767-43c1-bb0b-a8c681a8a6be\") " pod="openstack/swift-proxy-6bb49fdf95-9g9dz" Mar 11 01:16:56 crc kubenswrapper[4744]: I0311 01:16:56.320640 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: 
\"kubernetes.io/empty-dir/d60ef156-5767-43c1-bb0b-a8c681a8a6be-log-httpd\") pod \"swift-proxy-6bb49fdf95-9g9dz\" (UID: \"d60ef156-5767-43c1-bb0b-a8c681a8a6be\") " pod="openstack/swift-proxy-6bb49fdf95-9g9dz" Mar 11 01:16:56 crc kubenswrapper[4744]: I0311 01:16:56.321706 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d60ef156-5767-43c1-bb0b-a8c681a8a6be-run-httpd\") pod \"swift-proxy-6bb49fdf95-9g9dz\" (UID: \"d60ef156-5767-43c1-bb0b-a8c681a8a6be\") " pod="openstack/swift-proxy-6bb49fdf95-9g9dz" Mar 11 01:16:56 crc kubenswrapper[4744]: I0311 01:16:56.325534 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/d60ef156-5767-43c1-bb0b-a8c681a8a6be-etc-swift\") pod \"swift-proxy-6bb49fdf95-9g9dz\" (UID: \"d60ef156-5767-43c1-bb0b-a8c681a8a6be\") " pod="openstack/swift-proxy-6bb49fdf95-9g9dz" Mar 11 01:16:56 crc kubenswrapper[4744]: I0311 01:16:56.328165 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/d60ef156-5767-43c1-bb0b-a8c681a8a6be-internal-tls-certs\") pod \"swift-proxy-6bb49fdf95-9g9dz\" (UID: \"d60ef156-5767-43c1-bb0b-a8c681a8a6be\") " pod="openstack/swift-proxy-6bb49fdf95-9g9dz" Mar 11 01:16:56 crc kubenswrapper[4744]: I0311 01:16:56.328228 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/d60ef156-5767-43c1-bb0b-a8c681a8a6be-public-tls-certs\") pod \"swift-proxy-6bb49fdf95-9g9dz\" (UID: \"d60ef156-5767-43c1-bb0b-a8c681a8a6be\") " pod="openstack/swift-proxy-6bb49fdf95-9g9dz" Mar 11 01:16:56 crc kubenswrapper[4744]: I0311 01:16:56.328261 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d60ef156-5767-43c1-bb0b-a8c681a8a6be-combined-ca-bundle\") pod 
\"swift-proxy-6bb49fdf95-9g9dz\" (UID: \"d60ef156-5767-43c1-bb0b-a8c681a8a6be\") " pod="openstack/swift-proxy-6bb49fdf95-9g9dz" Mar 11 01:16:56 crc kubenswrapper[4744]: I0311 01:16:56.331055 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d60ef156-5767-43c1-bb0b-a8c681a8a6be-config-data\") pod \"swift-proxy-6bb49fdf95-9g9dz\" (UID: \"d60ef156-5767-43c1-bb0b-a8c681a8a6be\") " pod="openstack/swift-proxy-6bb49fdf95-9g9dz" Mar 11 01:16:56 crc kubenswrapper[4744]: I0311 01:16:56.339483 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mgrcg\" (UniqueName: \"kubernetes.io/projected/d60ef156-5767-43c1-bb0b-a8c681a8a6be-kube-api-access-mgrcg\") pod \"swift-proxy-6bb49fdf95-9g9dz\" (UID: \"d60ef156-5767-43c1-bb0b-a8c681a8a6be\") " pod="openstack/swift-proxy-6bb49fdf95-9g9dz" Mar 11 01:16:56 crc kubenswrapper[4744]: I0311 01:16:56.456104 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-proxy-6bb49fdf95-9g9dz" Mar 11 01:16:56 crc kubenswrapper[4744]: I0311 01:16:56.668153 4744 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 11 01:16:56 crc kubenswrapper[4744]: I0311 01:16:56.709644 4744 generic.go:334] "Generic (PLEG): container finished" podID="02d576b4-3837-45a7-ae82-8790466a57e5" containerID="35504fdcdfb50e70bdbe7605a68768479239c5ac31339d8da26ce06db0109478" exitCode=0 Mar 11 01:16:56 crc kubenswrapper[4744]: I0311 01:16:56.709670 4744 generic.go:334] "Generic (PLEG): container finished" podID="02d576b4-3837-45a7-ae82-8790466a57e5" containerID="a1d0d9e392c18f0e1f7a8fb2a2ca1229ec6bdde55933a18709ac9f5dfc14797d" exitCode=2 Mar 11 01:16:56 crc kubenswrapper[4744]: I0311 01:16:56.709678 4744 generic.go:334] "Generic (PLEG): container finished" podID="02d576b4-3837-45a7-ae82-8790466a57e5" containerID="c721d31662de7199ff6a4551c3f0cf6e79e3b15aaf72c0cb6cc33a2952d19f70" exitCode=0 Mar 11 01:16:56 crc kubenswrapper[4744]: I0311 01:16:56.709684 4744 generic.go:334] "Generic (PLEG): container finished" podID="02d576b4-3837-45a7-ae82-8790466a57e5" containerID="b26c41f5dfa25d3915bd43a8d3955b043a4c2b5a5ea7a98451ab36111298909b" exitCode=0 Mar 11 01:16:56 crc kubenswrapper[4744]: I0311 01:16:56.709704 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"02d576b4-3837-45a7-ae82-8790466a57e5","Type":"ContainerDied","Data":"35504fdcdfb50e70bdbe7605a68768479239c5ac31339d8da26ce06db0109478"} Mar 11 01:16:56 crc kubenswrapper[4744]: I0311 01:16:56.709728 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"02d576b4-3837-45a7-ae82-8790466a57e5","Type":"ContainerDied","Data":"a1d0d9e392c18f0e1f7a8fb2a2ca1229ec6bdde55933a18709ac9f5dfc14797d"} Mar 11 01:16:56 crc kubenswrapper[4744]: I0311 01:16:56.709739 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"02d576b4-3837-45a7-ae82-8790466a57e5","Type":"ContainerDied","Data":"c721d31662de7199ff6a4551c3f0cf6e79e3b15aaf72c0cb6cc33a2952d19f70"} Mar 11 01:16:56 crc 
kubenswrapper[4744]: I0311 01:16:56.709747 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"02d576b4-3837-45a7-ae82-8790466a57e5","Type":"ContainerDied","Data":"b26c41f5dfa25d3915bd43a8d3955b043a4c2b5a5ea7a98451ab36111298909b"} Mar 11 01:16:56 crc kubenswrapper[4744]: I0311 01:16:56.709755 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"02d576b4-3837-45a7-ae82-8790466a57e5","Type":"ContainerDied","Data":"3061e37bb86008ff6e84bd72723957d4267cff2712139a6e3431df7cca6179b3"} Mar 11 01:16:56 crc kubenswrapper[4744]: I0311 01:16:56.709769 4744 scope.go:117] "RemoveContainer" containerID="35504fdcdfb50e70bdbe7605a68768479239c5ac31339d8da26ce06db0109478" Mar 11 01:16:56 crc kubenswrapper[4744]: I0311 01:16:56.709911 4744 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 11 01:16:56 crc kubenswrapper[4744]: I0311 01:16:56.763095 4744 scope.go:117] "RemoveContainer" containerID="a1d0d9e392c18f0e1f7a8fb2a2ca1229ec6bdde55933a18709ac9f5dfc14797d" Mar 11 01:16:56 crc kubenswrapper[4744]: I0311 01:16:56.784792 4744 scope.go:117] "RemoveContainer" containerID="c721d31662de7199ff6a4551c3f0cf6e79e3b15aaf72c0cb6cc33a2952d19f70" Mar 11 01:16:56 crc kubenswrapper[4744]: I0311 01:16:56.821948 4744 scope.go:117] "RemoveContainer" containerID="b26c41f5dfa25d3915bd43a8d3955b043a4c2b5a5ea7a98451ab36111298909b" Mar 11 01:16:56 crc kubenswrapper[4744]: I0311 01:16:56.830013 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/02d576b4-3837-45a7-ae82-8790466a57e5-combined-ca-bundle\") pod \"02d576b4-3837-45a7-ae82-8790466a57e5\" (UID: \"02d576b4-3837-45a7-ae82-8790466a57e5\") " Mar 11 01:16:56 crc kubenswrapper[4744]: I0311 01:16:56.830164 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: 
\"kubernetes.io/empty-dir/02d576b4-3837-45a7-ae82-8790466a57e5-run-httpd\") pod \"02d576b4-3837-45a7-ae82-8790466a57e5\" (UID: \"02d576b4-3837-45a7-ae82-8790466a57e5\") " Mar 11 01:16:56 crc kubenswrapper[4744]: I0311 01:16:56.830186 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/02d576b4-3837-45a7-ae82-8790466a57e5-config-data\") pod \"02d576b4-3837-45a7-ae82-8790466a57e5\" (UID: \"02d576b4-3837-45a7-ae82-8790466a57e5\") " Mar 11 01:16:56 crc kubenswrapper[4744]: I0311 01:16:56.830234 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/02d576b4-3837-45a7-ae82-8790466a57e5-sg-core-conf-yaml\") pod \"02d576b4-3837-45a7-ae82-8790466a57e5\" (UID: \"02d576b4-3837-45a7-ae82-8790466a57e5\") " Mar 11 01:16:56 crc kubenswrapper[4744]: I0311 01:16:56.830258 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/02d576b4-3837-45a7-ae82-8790466a57e5-log-httpd\") pod \"02d576b4-3837-45a7-ae82-8790466a57e5\" (UID: \"02d576b4-3837-45a7-ae82-8790466a57e5\") " Mar 11 01:16:56 crc kubenswrapper[4744]: I0311 01:16:56.830298 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/02d576b4-3837-45a7-ae82-8790466a57e5-scripts\") pod \"02d576b4-3837-45a7-ae82-8790466a57e5\" (UID: \"02d576b4-3837-45a7-ae82-8790466a57e5\") " Mar 11 01:16:56 crc kubenswrapper[4744]: I0311 01:16:56.830387 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xswsv\" (UniqueName: \"kubernetes.io/projected/02d576b4-3837-45a7-ae82-8790466a57e5-kube-api-access-xswsv\") pod \"02d576b4-3837-45a7-ae82-8790466a57e5\" (UID: \"02d576b4-3837-45a7-ae82-8790466a57e5\") " Mar 11 01:16:56 crc kubenswrapper[4744]: I0311 01:16:56.831172 4744 
operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/02d576b4-3837-45a7-ae82-8790466a57e5-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "02d576b4-3837-45a7-ae82-8790466a57e5" (UID: "02d576b4-3837-45a7-ae82-8790466a57e5"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 11 01:16:56 crc kubenswrapper[4744]: I0311 01:16:56.831359 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/02d576b4-3837-45a7-ae82-8790466a57e5-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "02d576b4-3837-45a7-ae82-8790466a57e5" (UID: "02d576b4-3837-45a7-ae82-8790466a57e5"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 11 01:16:56 crc kubenswrapper[4744]: I0311 01:16:56.837732 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/02d576b4-3837-45a7-ae82-8790466a57e5-scripts" (OuterVolumeSpecName: "scripts") pod "02d576b4-3837-45a7-ae82-8790466a57e5" (UID: "02d576b4-3837-45a7-ae82-8790466a57e5"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 01:16:56 crc kubenswrapper[4744]: I0311 01:16:56.837733 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/02d576b4-3837-45a7-ae82-8790466a57e5-kube-api-access-xswsv" (OuterVolumeSpecName: "kube-api-access-xswsv") pod "02d576b4-3837-45a7-ae82-8790466a57e5" (UID: "02d576b4-3837-45a7-ae82-8790466a57e5"). InnerVolumeSpecName "kube-api-access-xswsv". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 01:16:56 crc kubenswrapper[4744]: I0311 01:16:56.847911 4744 scope.go:117] "RemoveContainer" containerID="35504fdcdfb50e70bdbe7605a68768479239c5ac31339d8da26ce06db0109478" Mar 11 01:16:56 crc kubenswrapper[4744]: E0311 01:16:56.848250 4744 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"35504fdcdfb50e70bdbe7605a68768479239c5ac31339d8da26ce06db0109478\": container with ID starting with 35504fdcdfb50e70bdbe7605a68768479239c5ac31339d8da26ce06db0109478 not found: ID does not exist" containerID="35504fdcdfb50e70bdbe7605a68768479239c5ac31339d8da26ce06db0109478" Mar 11 01:16:56 crc kubenswrapper[4744]: I0311 01:16:56.848289 4744 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"35504fdcdfb50e70bdbe7605a68768479239c5ac31339d8da26ce06db0109478"} err="failed to get container status \"35504fdcdfb50e70bdbe7605a68768479239c5ac31339d8da26ce06db0109478\": rpc error: code = NotFound desc = could not find container \"35504fdcdfb50e70bdbe7605a68768479239c5ac31339d8da26ce06db0109478\": container with ID starting with 35504fdcdfb50e70bdbe7605a68768479239c5ac31339d8da26ce06db0109478 not found: ID does not exist" Mar 11 01:16:56 crc kubenswrapper[4744]: I0311 01:16:56.848314 4744 scope.go:117] "RemoveContainer" containerID="a1d0d9e392c18f0e1f7a8fb2a2ca1229ec6bdde55933a18709ac9f5dfc14797d" Mar 11 01:16:56 crc kubenswrapper[4744]: E0311 01:16:56.848503 4744 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a1d0d9e392c18f0e1f7a8fb2a2ca1229ec6bdde55933a18709ac9f5dfc14797d\": container with ID starting with a1d0d9e392c18f0e1f7a8fb2a2ca1229ec6bdde55933a18709ac9f5dfc14797d not found: ID does not exist" containerID="a1d0d9e392c18f0e1f7a8fb2a2ca1229ec6bdde55933a18709ac9f5dfc14797d" Mar 11 01:16:56 crc kubenswrapper[4744]: I0311 01:16:56.848539 
4744 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a1d0d9e392c18f0e1f7a8fb2a2ca1229ec6bdde55933a18709ac9f5dfc14797d"} err="failed to get container status \"a1d0d9e392c18f0e1f7a8fb2a2ca1229ec6bdde55933a18709ac9f5dfc14797d\": rpc error: code = NotFound desc = could not find container \"a1d0d9e392c18f0e1f7a8fb2a2ca1229ec6bdde55933a18709ac9f5dfc14797d\": container with ID starting with a1d0d9e392c18f0e1f7a8fb2a2ca1229ec6bdde55933a18709ac9f5dfc14797d not found: ID does not exist" Mar 11 01:16:56 crc kubenswrapper[4744]: I0311 01:16:56.848552 4744 scope.go:117] "RemoveContainer" containerID="c721d31662de7199ff6a4551c3f0cf6e79e3b15aaf72c0cb6cc33a2952d19f70" Mar 11 01:16:56 crc kubenswrapper[4744]: E0311 01:16:56.848710 4744 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c721d31662de7199ff6a4551c3f0cf6e79e3b15aaf72c0cb6cc33a2952d19f70\": container with ID starting with c721d31662de7199ff6a4551c3f0cf6e79e3b15aaf72c0cb6cc33a2952d19f70 not found: ID does not exist" containerID="c721d31662de7199ff6a4551c3f0cf6e79e3b15aaf72c0cb6cc33a2952d19f70" Mar 11 01:16:56 crc kubenswrapper[4744]: I0311 01:16:56.848731 4744 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c721d31662de7199ff6a4551c3f0cf6e79e3b15aaf72c0cb6cc33a2952d19f70"} err="failed to get container status \"c721d31662de7199ff6a4551c3f0cf6e79e3b15aaf72c0cb6cc33a2952d19f70\": rpc error: code = NotFound desc = could not find container \"c721d31662de7199ff6a4551c3f0cf6e79e3b15aaf72c0cb6cc33a2952d19f70\": container with ID starting with c721d31662de7199ff6a4551c3f0cf6e79e3b15aaf72c0cb6cc33a2952d19f70 not found: ID does not exist" Mar 11 01:16:56 crc kubenswrapper[4744]: I0311 01:16:56.848742 4744 scope.go:117] "RemoveContainer" containerID="b26c41f5dfa25d3915bd43a8d3955b043a4c2b5a5ea7a98451ab36111298909b" Mar 11 01:16:56 crc kubenswrapper[4744]: E0311 
01:16:56.848908 4744 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b26c41f5dfa25d3915bd43a8d3955b043a4c2b5a5ea7a98451ab36111298909b\": container with ID starting with b26c41f5dfa25d3915bd43a8d3955b043a4c2b5a5ea7a98451ab36111298909b not found: ID does not exist" containerID="b26c41f5dfa25d3915bd43a8d3955b043a4c2b5a5ea7a98451ab36111298909b" Mar 11 01:16:56 crc kubenswrapper[4744]: I0311 01:16:56.848923 4744 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b26c41f5dfa25d3915bd43a8d3955b043a4c2b5a5ea7a98451ab36111298909b"} err="failed to get container status \"b26c41f5dfa25d3915bd43a8d3955b043a4c2b5a5ea7a98451ab36111298909b\": rpc error: code = NotFound desc = could not find container \"b26c41f5dfa25d3915bd43a8d3955b043a4c2b5a5ea7a98451ab36111298909b\": container with ID starting with b26c41f5dfa25d3915bd43a8d3955b043a4c2b5a5ea7a98451ab36111298909b not found: ID does not exist" Mar 11 01:16:56 crc kubenswrapper[4744]: I0311 01:16:56.848935 4744 scope.go:117] "RemoveContainer" containerID="35504fdcdfb50e70bdbe7605a68768479239c5ac31339d8da26ce06db0109478" Mar 11 01:16:56 crc kubenswrapper[4744]: I0311 01:16:56.849082 4744 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"35504fdcdfb50e70bdbe7605a68768479239c5ac31339d8da26ce06db0109478"} err="failed to get container status \"35504fdcdfb50e70bdbe7605a68768479239c5ac31339d8da26ce06db0109478\": rpc error: code = NotFound desc = could not find container \"35504fdcdfb50e70bdbe7605a68768479239c5ac31339d8da26ce06db0109478\": container with ID starting with 35504fdcdfb50e70bdbe7605a68768479239c5ac31339d8da26ce06db0109478 not found: ID does not exist" Mar 11 01:16:56 crc kubenswrapper[4744]: I0311 01:16:56.849094 4744 scope.go:117] "RemoveContainer" containerID="a1d0d9e392c18f0e1f7a8fb2a2ca1229ec6bdde55933a18709ac9f5dfc14797d" Mar 11 01:16:56 crc 
kubenswrapper[4744]: I0311 01:16:56.849227 4744 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a1d0d9e392c18f0e1f7a8fb2a2ca1229ec6bdde55933a18709ac9f5dfc14797d"} err="failed to get container status \"a1d0d9e392c18f0e1f7a8fb2a2ca1229ec6bdde55933a18709ac9f5dfc14797d\": rpc error: code = NotFound desc = could not find container \"a1d0d9e392c18f0e1f7a8fb2a2ca1229ec6bdde55933a18709ac9f5dfc14797d\": container with ID starting with a1d0d9e392c18f0e1f7a8fb2a2ca1229ec6bdde55933a18709ac9f5dfc14797d not found: ID does not exist" Mar 11 01:16:56 crc kubenswrapper[4744]: I0311 01:16:56.849239 4744 scope.go:117] "RemoveContainer" containerID="c721d31662de7199ff6a4551c3f0cf6e79e3b15aaf72c0cb6cc33a2952d19f70" Mar 11 01:16:56 crc kubenswrapper[4744]: I0311 01:16:56.849370 4744 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c721d31662de7199ff6a4551c3f0cf6e79e3b15aaf72c0cb6cc33a2952d19f70"} err="failed to get container status \"c721d31662de7199ff6a4551c3f0cf6e79e3b15aaf72c0cb6cc33a2952d19f70\": rpc error: code = NotFound desc = could not find container \"c721d31662de7199ff6a4551c3f0cf6e79e3b15aaf72c0cb6cc33a2952d19f70\": container with ID starting with c721d31662de7199ff6a4551c3f0cf6e79e3b15aaf72c0cb6cc33a2952d19f70 not found: ID does not exist" Mar 11 01:16:56 crc kubenswrapper[4744]: I0311 01:16:56.849382 4744 scope.go:117] "RemoveContainer" containerID="b26c41f5dfa25d3915bd43a8d3955b043a4c2b5a5ea7a98451ab36111298909b" Mar 11 01:16:56 crc kubenswrapper[4744]: I0311 01:16:56.849527 4744 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b26c41f5dfa25d3915bd43a8d3955b043a4c2b5a5ea7a98451ab36111298909b"} err="failed to get container status \"b26c41f5dfa25d3915bd43a8d3955b043a4c2b5a5ea7a98451ab36111298909b\": rpc error: code = NotFound desc = could not find container \"b26c41f5dfa25d3915bd43a8d3955b043a4c2b5a5ea7a98451ab36111298909b\": container 
with ID starting with b26c41f5dfa25d3915bd43a8d3955b043a4c2b5a5ea7a98451ab36111298909b not found: ID does not exist" Mar 11 01:16:56 crc kubenswrapper[4744]: I0311 01:16:56.849539 4744 scope.go:117] "RemoveContainer" containerID="35504fdcdfb50e70bdbe7605a68768479239c5ac31339d8da26ce06db0109478" Mar 11 01:16:56 crc kubenswrapper[4744]: I0311 01:16:56.849684 4744 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"35504fdcdfb50e70bdbe7605a68768479239c5ac31339d8da26ce06db0109478"} err="failed to get container status \"35504fdcdfb50e70bdbe7605a68768479239c5ac31339d8da26ce06db0109478\": rpc error: code = NotFound desc = could not find container \"35504fdcdfb50e70bdbe7605a68768479239c5ac31339d8da26ce06db0109478\": container with ID starting with 35504fdcdfb50e70bdbe7605a68768479239c5ac31339d8da26ce06db0109478 not found: ID does not exist" Mar 11 01:16:56 crc kubenswrapper[4744]: I0311 01:16:56.849695 4744 scope.go:117] "RemoveContainer" containerID="a1d0d9e392c18f0e1f7a8fb2a2ca1229ec6bdde55933a18709ac9f5dfc14797d" Mar 11 01:16:56 crc kubenswrapper[4744]: I0311 01:16:56.849849 4744 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a1d0d9e392c18f0e1f7a8fb2a2ca1229ec6bdde55933a18709ac9f5dfc14797d"} err="failed to get container status \"a1d0d9e392c18f0e1f7a8fb2a2ca1229ec6bdde55933a18709ac9f5dfc14797d\": rpc error: code = NotFound desc = could not find container \"a1d0d9e392c18f0e1f7a8fb2a2ca1229ec6bdde55933a18709ac9f5dfc14797d\": container with ID starting with a1d0d9e392c18f0e1f7a8fb2a2ca1229ec6bdde55933a18709ac9f5dfc14797d not found: ID does not exist" Mar 11 01:16:56 crc kubenswrapper[4744]: I0311 01:16:56.849863 4744 scope.go:117] "RemoveContainer" containerID="c721d31662de7199ff6a4551c3f0cf6e79e3b15aaf72c0cb6cc33a2952d19f70" Mar 11 01:16:56 crc kubenswrapper[4744]: I0311 01:16:56.850022 4744 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"c721d31662de7199ff6a4551c3f0cf6e79e3b15aaf72c0cb6cc33a2952d19f70"} err="failed to get container status \"c721d31662de7199ff6a4551c3f0cf6e79e3b15aaf72c0cb6cc33a2952d19f70\": rpc error: code = NotFound desc = could not find container \"c721d31662de7199ff6a4551c3f0cf6e79e3b15aaf72c0cb6cc33a2952d19f70\": container with ID starting with c721d31662de7199ff6a4551c3f0cf6e79e3b15aaf72c0cb6cc33a2952d19f70 not found: ID does not exist" Mar 11 01:16:56 crc kubenswrapper[4744]: I0311 01:16:56.850037 4744 scope.go:117] "RemoveContainer" containerID="b26c41f5dfa25d3915bd43a8d3955b043a4c2b5a5ea7a98451ab36111298909b" Mar 11 01:16:56 crc kubenswrapper[4744]: I0311 01:16:56.850185 4744 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b26c41f5dfa25d3915bd43a8d3955b043a4c2b5a5ea7a98451ab36111298909b"} err="failed to get container status \"b26c41f5dfa25d3915bd43a8d3955b043a4c2b5a5ea7a98451ab36111298909b\": rpc error: code = NotFound desc = could not find container \"b26c41f5dfa25d3915bd43a8d3955b043a4c2b5a5ea7a98451ab36111298909b\": container with ID starting with b26c41f5dfa25d3915bd43a8d3955b043a4c2b5a5ea7a98451ab36111298909b not found: ID does not exist" Mar 11 01:16:56 crc kubenswrapper[4744]: I0311 01:16:56.850200 4744 scope.go:117] "RemoveContainer" containerID="35504fdcdfb50e70bdbe7605a68768479239c5ac31339d8da26ce06db0109478" Mar 11 01:16:56 crc kubenswrapper[4744]: I0311 01:16:56.850338 4744 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"35504fdcdfb50e70bdbe7605a68768479239c5ac31339d8da26ce06db0109478"} err="failed to get container status \"35504fdcdfb50e70bdbe7605a68768479239c5ac31339d8da26ce06db0109478\": rpc error: code = NotFound desc = could not find container \"35504fdcdfb50e70bdbe7605a68768479239c5ac31339d8da26ce06db0109478\": container with ID starting with 35504fdcdfb50e70bdbe7605a68768479239c5ac31339d8da26ce06db0109478 not found: ID does not 
exist" Mar 11 01:16:56 crc kubenswrapper[4744]: I0311 01:16:56.850352 4744 scope.go:117] "RemoveContainer" containerID="a1d0d9e392c18f0e1f7a8fb2a2ca1229ec6bdde55933a18709ac9f5dfc14797d" Mar 11 01:16:56 crc kubenswrapper[4744]: I0311 01:16:56.850482 4744 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a1d0d9e392c18f0e1f7a8fb2a2ca1229ec6bdde55933a18709ac9f5dfc14797d"} err="failed to get container status \"a1d0d9e392c18f0e1f7a8fb2a2ca1229ec6bdde55933a18709ac9f5dfc14797d\": rpc error: code = NotFound desc = could not find container \"a1d0d9e392c18f0e1f7a8fb2a2ca1229ec6bdde55933a18709ac9f5dfc14797d\": container with ID starting with a1d0d9e392c18f0e1f7a8fb2a2ca1229ec6bdde55933a18709ac9f5dfc14797d not found: ID does not exist" Mar 11 01:16:56 crc kubenswrapper[4744]: I0311 01:16:56.850498 4744 scope.go:117] "RemoveContainer" containerID="c721d31662de7199ff6a4551c3f0cf6e79e3b15aaf72c0cb6cc33a2952d19f70" Mar 11 01:16:56 crc kubenswrapper[4744]: I0311 01:16:56.850707 4744 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c721d31662de7199ff6a4551c3f0cf6e79e3b15aaf72c0cb6cc33a2952d19f70"} err="failed to get container status \"c721d31662de7199ff6a4551c3f0cf6e79e3b15aaf72c0cb6cc33a2952d19f70\": rpc error: code = NotFound desc = could not find container \"c721d31662de7199ff6a4551c3f0cf6e79e3b15aaf72c0cb6cc33a2952d19f70\": container with ID starting with c721d31662de7199ff6a4551c3f0cf6e79e3b15aaf72c0cb6cc33a2952d19f70 not found: ID does not exist" Mar 11 01:16:56 crc kubenswrapper[4744]: I0311 01:16:56.850730 4744 scope.go:117] "RemoveContainer" containerID="b26c41f5dfa25d3915bd43a8d3955b043a4c2b5a5ea7a98451ab36111298909b" Mar 11 01:16:56 crc kubenswrapper[4744]: I0311 01:16:56.851004 4744 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b26c41f5dfa25d3915bd43a8d3955b043a4c2b5a5ea7a98451ab36111298909b"} err="failed to get container status 
\"b26c41f5dfa25d3915bd43a8d3955b043a4c2b5a5ea7a98451ab36111298909b\": rpc error: code = NotFound desc = could not find container \"b26c41f5dfa25d3915bd43a8d3955b043a4c2b5a5ea7a98451ab36111298909b\": container with ID starting with b26c41f5dfa25d3915bd43a8d3955b043a4c2b5a5ea7a98451ab36111298909b not found: ID does not exist" Mar 11 01:16:56 crc kubenswrapper[4744]: I0311 01:16:56.859747 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/02d576b4-3837-45a7-ae82-8790466a57e5-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "02d576b4-3837-45a7-ae82-8790466a57e5" (UID: "02d576b4-3837-45a7-ae82-8790466a57e5"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 01:16:56 crc kubenswrapper[4744]: I0311 01:16:56.906845 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/02d576b4-3837-45a7-ae82-8790466a57e5-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "02d576b4-3837-45a7-ae82-8790466a57e5" (UID: "02d576b4-3837-45a7-ae82-8790466a57e5"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 01:16:56 crc kubenswrapper[4744]: I0311 01:16:56.935113 4744 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/02d576b4-3837-45a7-ae82-8790466a57e5-run-httpd\") on node \"crc\" DevicePath \"\"" Mar 11 01:16:56 crc kubenswrapper[4744]: I0311 01:16:56.935140 4744 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/02d576b4-3837-45a7-ae82-8790466a57e5-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Mar 11 01:16:56 crc kubenswrapper[4744]: I0311 01:16:56.935149 4744 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/02d576b4-3837-45a7-ae82-8790466a57e5-log-httpd\") on node \"crc\" DevicePath \"\"" Mar 11 01:16:56 crc kubenswrapper[4744]: I0311 01:16:56.935157 4744 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/02d576b4-3837-45a7-ae82-8790466a57e5-scripts\") on node \"crc\" DevicePath \"\"" Mar 11 01:16:56 crc kubenswrapper[4744]: I0311 01:16:56.935166 4744 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xswsv\" (UniqueName: \"kubernetes.io/projected/02d576b4-3837-45a7-ae82-8790466a57e5-kube-api-access-xswsv\") on node \"crc\" DevicePath \"\"" Mar 11 01:16:56 crc kubenswrapper[4744]: I0311 01:16:56.935175 4744 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/02d576b4-3837-45a7-ae82-8790466a57e5-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 11 01:16:56 crc kubenswrapper[4744]: I0311 01:16:56.953622 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/02d576b4-3837-45a7-ae82-8790466a57e5-config-data" (OuterVolumeSpecName: "config-data") pod "02d576b4-3837-45a7-ae82-8790466a57e5" (UID: "02d576b4-3837-45a7-ae82-8790466a57e5"). 
InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 01:16:56 crc kubenswrapper[4744]: I0311 01:16:56.998866 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-proxy-6bb49fdf95-9g9dz"] Mar 11 01:16:57 crc kubenswrapper[4744]: I0311 01:16:57.037213 4744 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/02d576b4-3837-45a7-ae82-8790466a57e5-config-data\") on node \"crc\" DevicePath \"\"" Mar 11 01:16:57 crc kubenswrapper[4744]: I0311 01:16:57.242468 4744 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 11 01:16:57 crc kubenswrapper[4744]: I0311 01:16:57.254044 4744 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Mar 11 01:16:57 crc kubenswrapper[4744]: I0311 01:16:57.273038 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Mar 11 01:16:57 crc kubenswrapper[4744]: E0311 01:16:57.273560 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="02d576b4-3837-45a7-ae82-8790466a57e5" containerName="proxy-httpd" Mar 11 01:16:57 crc kubenswrapper[4744]: I0311 01:16:57.273585 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="02d576b4-3837-45a7-ae82-8790466a57e5" containerName="proxy-httpd" Mar 11 01:16:57 crc kubenswrapper[4744]: E0311 01:16:57.273619 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="02d576b4-3837-45a7-ae82-8790466a57e5" containerName="ceilometer-central-agent" Mar 11 01:16:57 crc kubenswrapper[4744]: I0311 01:16:57.273628 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="02d576b4-3837-45a7-ae82-8790466a57e5" containerName="ceilometer-central-agent" Mar 11 01:16:57 crc kubenswrapper[4744]: E0311 01:16:57.273638 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="02d576b4-3837-45a7-ae82-8790466a57e5" containerName="sg-core" Mar 11 01:16:57 crc kubenswrapper[4744]: I0311 
01:16:57.273643 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="02d576b4-3837-45a7-ae82-8790466a57e5" containerName="sg-core" Mar 11 01:16:57 crc kubenswrapper[4744]: E0311 01:16:57.273655 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="02d576b4-3837-45a7-ae82-8790466a57e5" containerName="ceilometer-notification-agent" Mar 11 01:16:57 crc kubenswrapper[4744]: I0311 01:16:57.273660 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="02d576b4-3837-45a7-ae82-8790466a57e5" containerName="ceilometer-notification-agent" Mar 11 01:16:57 crc kubenswrapper[4744]: I0311 01:16:57.273845 4744 memory_manager.go:354] "RemoveStaleState removing state" podUID="02d576b4-3837-45a7-ae82-8790466a57e5" containerName="ceilometer-central-agent" Mar 11 01:16:57 crc kubenswrapper[4744]: I0311 01:16:57.273869 4744 memory_manager.go:354] "RemoveStaleState removing state" podUID="02d576b4-3837-45a7-ae82-8790466a57e5" containerName="proxy-httpd" Mar 11 01:16:57 crc kubenswrapper[4744]: I0311 01:16:57.273880 4744 memory_manager.go:354] "RemoveStaleState removing state" podUID="02d576b4-3837-45a7-ae82-8790466a57e5" containerName="sg-core" Mar 11 01:16:57 crc kubenswrapper[4744]: I0311 01:16:57.273887 4744 memory_manager.go:354] "RemoveStaleState removing state" podUID="02d576b4-3837-45a7-ae82-8790466a57e5" containerName="ceilometer-notification-agent" Mar 11 01:16:57 crc kubenswrapper[4744]: I0311 01:16:57.284004 4744 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 11 01:16:57 crc kubenswrapper[4744]: I0311 01:16:57.287690 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Mar 11 01:16:57 crc kubenswrapper[4744]: I0311 01:16:57.287712 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Mar 11 01:16:57 crc kubenswrapper[4744]: I0311 01:16:57.325195 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 11 01:16:57 crc kubenswrapper[4744]: I0311 01:16:57.352042 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/d53314a9-6f4b-45ad-9715-01d2d0ae2e63-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"d53314a9-6f4b-45ad-9715-01d2d0ae2e63\") " pod="openstack/ceilometer-0" Mar 11 01:16:57 crc kubenswrapper[4744]: I0311 01:16:57.352101 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d53314a9-6f4b-45ad-9715-01d2d0ae2e63-log-httpd\") pod \"ceilometer-0\" (UID: \"d53314a9-6f4b-45ad-9715-01d2d0ae2e63\") " pod="openstack/ceilometer-0" Mar 11 01:16:57 crc kubenswrapper[4744]: I0311 01:16:57.352134 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d53314a9-6f4b-45ad-9715-01d2d0ae2e63-scripts\") pod \"ceilometer-0\" (UID: \"d53314a9-6f4b-45ad-9715-01d2d0ae2e63\") " pod="openstack/ceilometer-0" Mar 11 01:16:57 crc kubenswrapper[4744]: I0311 01:16:57.352165 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d53314a9-6f4b-45ad-9715-01d2d0ae2e63-config-data\") pod \"ceilometer-0\" (UID: \"d53314a9-6f4b-45ad-9715-01d2d0ae2e63\") " 
pod="openstack/ceilometer-0" Mar 11 01:16:57 crc kubenswrapper[4744]: I0311 01:16:57.352220 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d53314a9-6f4b-45ad-9715-01d2d0ae2e63-run-httpd\") pod \"ceilometer-0\" (UID: \"d53314a9-6f4b-45ad-9715-01d2d0ae2e63\") " pod="openstack/ceilometer-0" Mar 11 01:16:57 crc kubenswrapper[4744]: I0311 01:16:57.352315 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xqf2q\" (UniqueName: \"kubernetes.io/projected/d53314a9-6f4b-45ad-9715-01d2d0ae2e63-kube-api-access-xqf2q\") pod \"ceilometer-0\" (UID: \"d53314a9-6f4b-45ad-9715-01d2d0ae2e63\") " pod="openstack/ceilometer-0" Mar 11 01:16:57 crc kubenswrapper[4744]: I0311 01:16:57.353738 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d53314a9-6f4b-45ad-9715-01d2d0ae2e63-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"d53314a9-6f4b-45ad-9715-01d2d0ae2e63\") " pod="openstack/ceilometer-0" Mar 11 01:16:57 crc kubenswrapper[4744]: I0311 01:16:57.455241 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d53314a9-6f4b-45ad-9715-01d2d0ae2e63-log-httpd\") pod \"ceilometer-0\" (UID: \"d53314a9-6f4b-45ad-9715-01d2d0ae2e63\") " pod="openstack/ceilometer-0" Mar 11 01:16:57 crc kubenswrapper[4744]: I0311 01:16:57.455311 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d53314a9-6f4b-45ad-9715-01d2d0ae2e63-scripts\") pod \"ceilometer-0\" (UID: \"d53314a9-6f4b-45ad-9715-01d2d0ae2e63\") " pod="openstack/ceilometer-0" Mar 11 01:16:57 crc kubenswrapper[4744]: I0311 01:16:57.455344 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"config-data\" (UniqueName: \"kubernetes.io/secret/d53314a9-6f4b-45ad-9715-01d2d0ae2e63-config-data\") pod \"ceilometer-0\" (UID: \"d53314a9-6f4b-45ad-9715-01d2d0ae2e63\") " pod="openstack/ceilometer-0" Mar 11 01:16:57 crc kubenswrapper[4744]: I0311 01:16:57.455396 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d53314a9-6f4b-45ad-9715-01d2d0ae2e63-run-httpd\") pod \"ceilometer-0\" (UID: \"d53314a9-6f4b-45ad-9715-01d2d0ae2e63\") " pod="openstack/ceilometer-0" Mar 11 01:16:57 crc kubenswrapper[4744]: I0311 01:16:57.455458 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xqf2q\" (UniqueName: \"kubernetes.io/projected/d53314a9-6f4b-45ad-9715-01d2d0ae2e63-kube-api-access-xqf2q\") pod \"ceilometer-0\" (UID: \"d53314a9-6f4b-45ad-9715-01d2d0ae2e63\") " pod="openstack/ceilometer-0" Mar 11 01:16:57 crc kubenswrapper[4744]: I0311 01:16:57.455483 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d53314a9-6f4b-45ad-9715-01d2d0ae2e63-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"d53314a9-6f4b-45ad-9715-01d2d0ae2e63\") " pod="openstack/ceilometer-0" Mar 11 01:16:57 crc kubenswrapper[4744]: I0311 01:16:57.455507 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/d53314a9-6f4b-45ad-9715-01d2d0ae2e63-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"d53314a9-6f4b-45ad-9715-01d2d0ae2e63\") " pod="openstack/ceilometer-0" Mar 11 01:16:57 crc kubenswrapper[4744]: I0311 01:16:57.456644 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d53314a9-6f4b-45ad-9715-01d2d0ae2e63-log-httpd\") pod \"ceilometer-0\" (UID: \"d53314a9-6f4b-45ad-9715-01d2d0ae2e63\") " pod="openstack/ceilometer-0" Mar 11 01:16:57 
crc kubenswrapper[4744]: I0311 01:16:57.456993 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d53314a9-6f4b-45ad-9715-01d2d0ae2e63-run-httpd\") pod \"ceilometer-0\" (UID: \"d53314a9-6f4b-45ad-9715-01d2d0ae2e63\") " pod="openstack/ceilometer-0" Mar 11 01:16:57 crc kubenswrapper[4744]: I0311 01:16:57.458726 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/d53314a9-6f4b-45ad-9715-01d2d0ae2e63-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"d53314a9-6f4b-45ad-9715-01d2d0ae2e63\") " pod="openstack/ceilometer-0" Mar 11 01:16:57 crc kubenswrapper[4744]: I0311 01:16:57.462087 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d53314a9-6f4b-45ad-9715-01d2d0ae2e63-config-data\") pod \"ceilometer-0\" (UID: \"d53314a9-6f4b-45ad-9715-01d2d0ae2e63\") " pod="openstack/ceilometer-0" Mar 11 01:16:57 crc kubenswrapper[4744]: I0311 01:16:57.465078 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d53314a9-6f4b-45ad-9715-01d2d0ae2e63-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"d53314a9-6f4b-45ad-9715-01d2d0ae2e63\") " pod="openstack/ceilometer-0" Mar 11 01:16:57 crc kubenswrapper[4744]: I0311 01:16:57.466040 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d53314a9-6f4b-45ad-9715-01d2d0ae2e63-scripts\") pod \"ceilometer-0\" (UID: \"d53314a9-6f4b-45ad-9715-01d2d0ae2e63\") " pod="openstack/ceilometer-0" Mar 11 01:16:57 crc kubenswrapper[4744]: I0311 01:16:57.475180 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xqf2q\" (UniqueName: \"kubernetes.io/projected/d53314a9-6f4b-45ad-9715-01d2d0ae2e63-kube-api-access-xqf2q\") pod \"ceilometer-0\" (UID: 
\"d53314a9-6f4b-45ad-9715-01d2d0ae2e63\") " pod="openstack/ceilometer-0" Mar 11 01:16:57 crc kubenswrapper[4744]: I0311 01:16:57.628718 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 11 01:16:57 crc kubenswrapper[4744]: I0311 01:16:57.754461 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-6bb49fdf95-9g9dz" event={"ID":"d60ef156-5767-43c1-bb0b-a8c681a8a6be","Type":"ContainerStarted","Data":"9907e92d1fac188fb7c44fa1681a08073a561732eeaba685269da05b1fbc9c68"} Mar 11 01:16:57 crc kubenswrapper[4744]: I0311 01:16:57.754503 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-6bb49fdf95-9g9dz" event={"ID":"d60ef156-5767-43c1-bb0b-a8c681a8a6be","Type":"ContainerStarted","Data":"915179a7516d9702ba0c2f51a83fdc1a69ce876edb13d10542894dba2430fe71"} Mar 11 01:16:57 crc kubenswrapper[4744]: I0311 01:16:57.754524 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-6bb49fdf95-9g9dz" event={"ID":"d60ef156-5767-43c1-bb0b-a8c681a8a6be","Type":"ContainerStarted","Data":"286eb384cb18b19ed1345645d29d8040537d31182d97875d0d7b23e2df75caa2"} Mar 11 01:16:57 crc kubenswrapper[4744]: I0311 01:16:57.754577 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/swift-proxy-6bb49fdf95-9g9dz" Mar 11 01:16:57 crc kubenswrapper[4744]: I0311 01:16:57.754619 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/swift-proxy-6bb49fdf95-9g9dz" Mar 11 01:16:57 crc kubenswrapper[4744]: I0311 01:16:57.796298 4744 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-proxy-6bb49fdf95-9g9dz" podStartSLOduration=1.796279454 podStartE2EDuration="1.796279454s" podCreationTimestamp="2026-03-11 01:16:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-11 01:16:57.777670359 
+0000 UTC m=+1374.581887964" watchObservedRunningTime="2026-03-11 01:16:57.796279454 +0000 UTC m=+1374.600497059" Mar 11 01:16:57 crc kubenswrapper[4744]: I0311 01:16:57.985067 4744 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="02d576b4-3837-45a7-ae82-8790466a57e5" path="/var/lib/kubelet/pods/02d576b4-3837-45a7-ae82-8790466a57e5/volumes" Mar 11 01:16:58 crc kubenswrapper[4744]: I0311 01:16:58.090638 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 11 01:16:58 crc kubenswrapper[4744]: I0311 01:16:58.774682 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d53314a9-6f4b-45ad-9715-01d2d0ae2e63","Type":"ContainerStarted","Data":"cd1ea44b6c39d35438c572552d09db104607ad5f269a49f607a76ab2e7649bf3"} Mar 11 01:17:00 crc kubenswrapper[4744]: I0311 01:17:00.209365 4744 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-scheduler-0" Mar 11 01:17:02 crc kubenswrapper[4744]: I0311 01:17:02.033477 4744 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 11 01:17:02 crc kubenswrapper[4744]: I0311 01:17:02.855467 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-db-create-45rmx"] Mar 11 01:17:02 crc kubenswrapper[4744]: I0311 01:17:02.856832 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-45rmx" Mar 11 01:17:02 crc kubenswrapper[4744]: I0311 01:17:02.870747 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-db-create-45rmx"] Mar 11 01:17:02 crc kubenswrapper[4744]: I0311 01:17:02.956878 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-db-create-rrtn9"] Mar 11 01:17:02 crc kubenswrapper[4744]: I0311 01:17:02.958346 4744 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-db-create-rrtn9" Mar 11 01:17:02 crc kubenswrapper[4744]: I0311 01:17:02.968600 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-db-create-rrtn9"] Mar 11 01:17:02 crc kubenswrapper[4744]: I0311 01:17:02.974044 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lg7s8\" (UniqueName: \"kubernetes.io/projected/b7669ed4-670c-4a2c-8ce2-b69579c30e15-kube-api-access-lg7s8\") pod \"nova-api-db-create-45rmx\" (UID: \"b7669ed4-670c-4a2c-8ce2-b69579c30e15\") " pod="openstack/nova-api-db-create-45rmx" Mar 11 01:17:02 crc kubenswrapper[4744]: I0311 01:17:02.974101 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b7669ed4-670c-4a2c-8ce2-b69579c30e15-operator-scripts\") pod \"nova-api-db-create-45rmx\" (UID: \"b7669ed4-670c-4a2c-8ce2-b69579c30e15\") " pod="openstack/nova-api-db-create-45rmx" Mar 11 01:17:02 crc kubenswrapper[4744]: I0311 01:17:02.993570 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-95dc-account-create-update-q7hx6"] Mar 11 01:17:02 crc kubenswrapper[4744]: I0311 01:17:02.994707 4744 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-95dc-account-create-update-q7hx6" Mar 11 01:17:02 crc kubenswrapper[4744]: I0311 01:17:02.998022 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-db-secret" Mar 11 01:17:03 crc kubenswrapper[4744]: I0311 01:17:03.000612 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-95dc-account-create-update-q7hx6"] Mar 11 01:17:03 crc kubenswrapper[4744]: I0311 01:17:03.057461 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-e282-account-create-update-cq8mf"] Mar 11 01:17:03 crc kubenswrapper[4744]: I0311 01:17:03.058540 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-e282-account-create-update-cq8mf" Mar 11 01:17:03 crc kubenswrapper[4744]: I0311 01:17:03.060445 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-db-secret" Mar 11 01:17:03 crc kubenswrapper[4744]: I0311 01:17:03.070069 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-e282-account-create-update-cq8mf"] Mar 11 01:17:03 crc kubenswrapper[4744]: I0311 01:17:03.079614 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e530fb58-e471-4681-8602-6218f09b0c04-operator-scripts\") pod \"nova-cell0-db-create-rrtn9\" (UID: \"e530fb58-e471-4681-8602-6218f09b0c04\") " pod="openstack/nova-cell0-db-create-rrtn9" Mar 11 01:17:03 crc kubenswrapper[4744]: I0311 01:17:03.079836 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9wp4p\" (UniqueName: \"kubernetes.io/projected/e530fb58-e471-4681-8602-6218f09b0c04-kube-api-access-9wp4p\") pod \"nova-cell0-db-create-rrtn9\" (UID: \"e530fb58-e471-4681-8602-6218f09b0c04\") " pod="openstack/nova-cell0-db-create-rrtn9" Mar 11 01:17:03 crc 
kubenswrapper[4744]: I0311 01:17:03.079880 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zt9xm\" (UniqueName: \"kubernetes.io/projected/fbe2524a-0c58-4294-b9e2-640fe0b5d294-kube-api-access-zt9xm\") pod \"nova-api-95dc-account-create-update-q7hx6\" (UID: \"fbe2524a-0c58-4294-b9e2-640fe0b5d294\") " pod="openstack/nova-api-95dc-account-create-update-q7hx6" Mar 11 01:17:03 crc kubenswrapper[4744]: I0311 01:17:03.079905 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/fbe2524a-0c58-4294-b9e2-640fe0b5d294-operator-scripts\") pod \"nova-api-95dc-account-create-update-q7hx6\" (UID: \"fbe2524a-0c58-4294-b9e2-640fe0b5d294\") " pod="openstack/nova-api-95dc-account-create-update-q7hx6" Mar 11 01:17:03 crc kubenswrapper[4744]: I0311 01:17:03.079937 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lg7s8\" (UniqueName: \"kubernetes.io/projected/b7669ed4-670c-4a2c-8ce2-b69579c30e15-kube-api-access-lg7s8\") pod \"nova-api-db-create-45rmx\" (UID: \"b7669ed4-670c-4a2c-8ce2-b69579c30e15\") " pod="openstack/nova-api-db-create-45rmx" Mar 11 01:17:03 crc kubenswrapper[4744]: I0311 01:17:03.080017 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b7669ed4-670c-4a2c-8ce2-b69579c30e15-operator-scripts\") pod \"nova-api-db-create-45rmx\" (UID: \"b7669ed4-670c-4a2c-8ce2-b69579c30e15\") " pod="openstack/nova-api-db-create-45rmx" Mar 11 01:17:03 crc kubenswrapper[4744]: I0311 01:17:03.080654 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b7669ed4-670c-4a2c-8ce2-b69579c30e15-operator-scripts\") pod \"nova-api-db-create-45rmx\" (UID: \"b7669ed4-670c-4a2c-8ce2-b69579c30e15\") " 
pod="openstack/nova-api-db-create-45rmx" Mar 11 01:17:03 crc kubenswrapper[4744]: I0311 01:17:03.115302 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lg7s8\" (UniqueName: \"kubernetes.io/projected/b7669ed4-670c-4a2c-8ce2-b69579c30e15-kube-api-access-lg7s8\") pod \"nova-api-db-create-45rmx\" (UID: \"b7669ed4-670c-4a2c-8ce2-b69579c30e15\") " pod="openstack/nova-api-db-create-45rmx" Mar 11 01:17:03 crc kubenswrapper[4744]: I0311 01:17:03.162064 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-db-create-k9frb"] Mar 11 01:17:03 crc kubenswrapper[4744]: I0311 01:17:03.163108 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-k9frb" Mar 11 01:17:03 crc kubenswrapper[4744]: I0311 01:17:03.174913 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-45rmx" Mar 11 01:17:03 crc kubenswrapper[4744]: I0311 01:17:03.179938 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-db-create-k9frb"] Mar 11 01:17:03 crc kubenswrapper[4744]: I0311 01:17:03.180784 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7j6b2\" (UniqueName: \"kubernetes.io/projected/3a3ecb10-c01d-44af-9dc1-18a83c479f37-kube-api-access-7j6b2\") pod \"nova-cell0-e282-account-create-update-cq8mf\" (UID: \"3a3ecb10-c01d-44af-9dc1-18a83c479f37\") " pod="openstack/nova-cell0-e282-account-create-update-cq8mf" Mar 11 01:17:03 crc kubenswrapper[4744]: I0311 01:17:03.180839 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9wp4p\" (UniqueName: \"kubernetes.io/projected/e530fb58-e471-4681-8602-6218f09b0c04-kube-api-access-9wp4p\") pod \"nova-cell0-db-create-rrtn9\" (UID: \"e530fb58-e471-4681-8602-6218f09b0c04\") " pod="openstack/nova-cell0-db-create-rrtn9" Mar 11 01:17:03 crc 
kubenswrapper[4744]: I0311 01:17:03.180865 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3a3ecb10-c01d-44af-9dc1-18a83c479f37-operator-scripts\") pod \"nova-cell0-e282-account-create-update-cq8mf\" (UID: \"3a3ecb10-c01d-44af-9dc1-18a83c479f37\") " pod="openstack/nova-cell0-e282-account-create-update-cq8mf" Mar 11 01:17:03 crc kubenswrapper[4744]: I0311 01:17:03.180884 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zt9xm\" (UniqueName: \"kubernetes.io/projected/fbe2524a-0c58-4294-b9e2-640fe0b5d294-kube-api-access-zt9xm\") pod \"nova-api-95dc-account-create-update-q7hx6\" (UID: \"fbe2524a-0c58-4294-b9e2-640fe0b5d294\") " pod="openstack/nova-api-95dc-account-create-update-q7hx6" Mar 11 01:17:03 crc kubenswrapper[4744]: I0311 01:17:03.180904 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/fbe2524a-0c58-4294-b9e2-640fe0b5d294-operator-scripts\") pod \"nova-api-95dc-account-create-update-q7hx6\" (UID: \"fbe2524a-0c58-4294-b9e2-640fe0b5d294\") " pod="openstack/nova-api-95dc-account-create-update-q7hx6" Mar 11 01:17:03 crc kubenswrapper[4744]: I0311 01:17:03.180955 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e530fb58-e471-4681-8602-6218f09b0c04-operator-scripts\") pod \"nova-cell0-db-create-rrtn9\" (UID: \"e530fb58-e471-4681-8602-6218f09b0c04\") " pod="openstack/nova-cell0-db-create-rrtn9" Mar 11 01:17:03 crc kubenswrapper[4744]: I0311 01:17:03.200683 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/fbe2524a-0c58-4294-b9e2-640fe0b5d294-operator-scripts\") pod \"nova-api-95dc-account-create-update-q7hx6\" (UID: 
\"fbe2524a-0c58-4294-b9e2-640fe0b5d294\") " pod="openstack/nova-api-95dc-account-create-update-q7hx6" Mar 11 01:17:03 crc kubenswrapper[4744]: I0311 01:17:03.203983 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zt9xm\" (UniqueName: \"kubernetes.io/projected/fbe2524a-0c58-4294-b9e2-640fe0b5d294-kube-api-access-zt9xm\") pod \"nova-api-95dc-account-create-update-q7hx6\" (UID: \"fbe2524a-0c58-4294-b9e2-640fe0b5d294\") " pod="openstack/nova-api-95dc-account-create-update-q7hx6" Mar 11 01:17:03 crc kubenswrapper[4744]: I0311 01:17:03.250086 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e530fb58-e471-4681-8602-6218f09b0c04-operator-scripts\") pod \"nova-cell0-db-create-rrtn9\" (UID: \"e530fb58-e471-4681-8602-6218f09b0c04\") " pod="openstack/nova-cell0-db-create-rrtn9" Mar 11 01:17:03 crc kubenswrapper[4744]: I0311 01:17:03.252438 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9wp4p\" (UniqueName: \"kubernetes.io/projected/e530fb58-e471-4681-8602-6218f09b0c04-kube-api-access-9wp4p\") pod \"nova-cell0-db-create-rrtn9\" (UID: \"e530fb58-e471-4681-8602-6218f09b0c04\") " pod="openstack/nova-cell0-db-create-rrtn9" Mar 11 01:17:03 crc kubenswrapper[4744]: I0311 01:17:03.277391 4744 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-db-create-rrtn9" Mar 11 01:17:03 crc kubenswrapper[4744]: I0311 01:17:03.283215 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/958fb132-39c9-4989-a322-8691247f7b22-operator-scripts\") pod \"nova-cell1-db-create-k9frb\" (UID: \"958fb132-39c9-4989-a322-8691247f7b22\") " pod="openstack/nova-cell1-db-create-k9frb" Mar 11 01:17:03 crc kubenswrapper[4744]: I0311 01:17:03.283287 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7j6b2\" (UniqueName: \"kubernetes.io/projected/3a3ecb10-c01d-44af-9dc1-18a83c479f37-kube-api-access-7j6b2\") pod \"nova-cell0-e282-account-create-update-cq8mf\" (UID: \"3a3ecb10-c01d-44af-9dc1-18a83c479f37\") " pod="openstack/nova-cell0-e282-account-create-update-cq8mf" Mar 11 01:17:03 crc kubenswrapper[4744]: I0311 01:17:03.283452 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n98mb\" (UniqueName: \"kubernetes.io/projected/958fb132-39c9-4989-a322-8691247f7b22-kube-api-access-n98mb\") pod \"nova-cell1-db-create-k9frb\" (UID: \"958fb132-39c9-4989-a322-8691247f7b22\") " pod="openstack/nova-cell1-db-create-k9frb" Mar 11 01:17:03 crc kubenswrapper[4744]: I0311 01:17:03.283559 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3a3ecb10-c01d-44af-9dc1-18a83c479f37-operator-scripts\") pod \"nova-cell0-e282-account-create-update-cq8mf\" (UID: \"3a3ecb10-c01d-44af-9dc1-18a83c479f37\") " pod="openstack/nova-cell0-e282-account-create-update-cq8mf" Mar 11 01:17:03 crc kubenswrapper[4744]: I0311 01:17:03.284301 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: 
\"kubernetes.io/configmap/3a3ecb10-c01d-44af-9dc1-18a83c479f37-operator-scripts\") pod \"nova-cell0-e282-account-create-update-cq8mf\" (UID: \"3a3ecb10-c01d-44af-9dc1-18a83c479f37\") " pod="openstack/nova-cell0-e282-account-create-update-cq8mf" Mar 11 01:17:03 crc kubenswrapper[4744]: I0311 01:17:03.300100 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7j6b2\" (UniqueName: \"kubernetes.io/projected/3a3ecb10-c01d-44af-9dc1-18a83c479f37-kube-api-access-7j6b2\") pod \"nova-cell0-e282-account-create-update-cq8mf\" (UID: \"3a3ecb10-c01d-44af-9dc1-18a83c479f37\") " pod="openstack/nova-cell0-e282-account-create-update-cq8mf" Mar 11 01:17:03 crc kubenswrapper[4744]: I0311 01:17:03.308137 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-95dc-account-create-update-q7hx6" Mar 11 01:17:03 crc kubenswrapper[4744]: I0311 01:17:03.373167 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-876b-account-create-update-srkst"] Mar 11 01:17:03 crc kubenswrapper[4744]: I0311 01:17:03.374238 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-876b-account-create-update-srkst" Mar 11 01:17:03 crc kubenswrapper[4744]: I0311 01:17:03.379485 4744 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-e282-account-create-update-cq8mf" Mar 11 01:17:03 crc kubenswrapper[4744]: I0311 01:17:03.385298 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n98mb\" (UniqueName: \"kubernetes.io/projected/958fb132-39c9-4989-a322-8691247f7b22-kube-api-access-n98mb\") pod \"nova-cell1-db-create-k9frb\" (UID: \"958fb132-39c9-4989-a322-8691247f7b22\") " pod="openstack/nova-cell1-db-create-k9frb" Mar 11 01:17:03 crc kubenswrapper[4744]: I0311 01:17:03.385416 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/958fb132-39c9-4989-a322-8691247f7b22-operator-scripts\") pod \"nova-cell1-db-create-k9frb\" (UID: \"958fb132-39c9-4989-a322-8691247f7b22\") " pod="openstack/nova-cell1-db-create-k9frb" Mar 11 01:17:03 crc kubenswrapper[4744]: I0311 01:17:03.386322 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/958fb132-39c9-4989-a322-8691247f7b22-operator-scripts\") pod \"nova-cell1-db-create-k9frb\" (UID: \"958fb132-39c9-4989-a322-8691247f7b22\") " pod="openstack/nova-cell1-db-create-k9frb" Mar 11 01:17:03 crc kubenswrapper[4744]: I0311 01:17:03.387426 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-db-secret" Mar 11 01:17:03 crc kubenswrapper[4744]: I0311 01:17:03.390596 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-876b-account-create-update-srkst"] Mar 11 01:17:03 crc kubenswrapper[4744]: I0311 01:17:03.404975 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n98mb\" (UniqueName: \"kubernetes.io/projected/958fb132-39c9-4989-a322-8691247f7b22-kube-api-access-n98mb\") pod \"nova-cell1-db-create-k9frb\" (UID: \"958fb132-39c9-4989-a322-8691247f7b22\") " pod="openstack/nova-cell1-db-create-k9frb" Mar 11 
01:17:03 crc kubenswrapper[4744]: I0311 01:17:03.480387 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-k9frb" Mar 11 01:17:03 crc kubenswrapper[4744]: I0311 01:17:03.487436 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/526978e0-6809-4f61-863a-c7c1c54a7507-operator-scripts\") pod \"nova-cell1-876b-account-create-update-srkst\" (UID: \"526978e0-6809-4f61-863a-c7c1c54a7507\") " pod="openstack/nova-cell1-876b-account-create-update-srkst" Mar 11 01:17:03 crc kubenswrapper[4744]: I0311 01:17:03.487740 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gmzcn\" (UniqueName: \"kubernetes.io/projected/526978e0-6809-4f61-863a-c7c1c54a7507-kube-api-access-gmzcn\") pod \"nova-cell1-876b-account-create-update-srkst\" (UID: \"526978e0-6809-4f61-863a-c7c1c54a7507\") " pod="openstack/nova-cell1-876b-account-create-update-srkst" Mar 11 01:17:03 crc kubenswrapper[4744]: I0311 01:17:03.589532 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/526978e0-6809-4f61-863a-c7c1c54a7507-operator-scripts\") pod \"nova-cell1-876b-account-create-update-srkst\" (UID: \"526978e0-6809-4f61-863a-c7c1c54a7507\") " pod="openstack/nova-cell1-876b-account-create-update-srkst" Mar 11 01:17:03 crc kubenswrapper[4744]: I0311 01:17:03.589645 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gmzcn\" (UniqueName: \"kubernetes.io/projected/526978e0-6809-4f61-863a-c7c1c54a7507-kube-api-access-gmzcn\") pod \"nova-cell1-876b-account-create-update-srkst\" (UID: \"526978e0-6809-4f61-863a-c7c1c54a7507\") " pod="openstack/nova-cell1-876b-account-create-update-srkst" Mar 11 01:17:03 crc kubenswrapper[4744]: I0311 01:17:03.590627 4744 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/526978e0-6809-4f61-863a-c7c1c54a7507-operator-scripts\") pod \"nova-cell1-876b-account-create-update-srkst\" (UID: \"526978e0-6809-4f61-863a-c7c1c54a7507\") " pod="openstack/nova-cell1-876b-account-create-update-srkst" Mar 11 01:17:03 crc kubenswrapper[4744]: I0311 01:17:03.608443 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gmzcn\" (UniqueName: \"kubernetes.io/projected/526978e0-6809-4f61-863a-c7c1c54a7507-kube-api-access-gmzcn\") pod \"nova-cell1-876b-account-create-update-srkst\" (UID: \"526978e0-6809-4f61-863a-c7c1c54a7507\") " pod="openstack/nova-cell1-876b-account-create-update-srkst" Mar 11 01:17:03 crc kubenswrapper[4744]: I0311 01:17:03.688945 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-876b-account-create-update-srkst" Mar 11 01:17:04 crc kubenswrapper[4744]: I0311 01:17:04.239763 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-db-create-rrtn9"] Mar 11 01:17:04 crc kubenswrapper[4744]: I0311 01:17:04.444876 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-e282-account-create-update-cq8mf"] Mar 11 01:17:04 crc kubenswrapper[4744]: I0311 01:17:04.455155 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-db-create-k9frb"] Mar 11 01:17:04 crc kubenswrapper[4744]: I0311 01:17:04.465757 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-db-create-45rmx"] Mar 11 01:17:04 crc kubenswrapper[4744]: I0311 01:17:04.525196 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-db-secret" Mar 11 01:17:04 crc kubenswrapper[4744]: I0311 01:17:04.630644 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-95dc-account-create-update-q7hx6"] Mar 11 01:17:04 crc 
kubenswrapper[4744]: I0311 01:17:04.655007 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-876b-account-create-update-srkst"] Mar 11 01:17:04 crc kubenswrapper[4744]: I0311 01:17:04.674091 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-db-secret" Mar 11 01:17:04 crc kubenswrapper[4744]: I0311 01:17:04.685104 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-db-secret" Mar 11 01:17:04 crc kubenswrapper[4744]: I0311 01:17:04.841350 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-45rmx" event={"ID":"b7669ed4-670c-4a2c-8ce2-b69579c30e15","Type":"ContainerStarted","Data":"ebdc71e1a72435c8db752c01353cd2455830979afbd2fb878bea63ca113a125b"} Mar 11 01:17:04 crc kubenswrapper[4744]: I0311 01:17:04.841407 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-45rmx" event={"ID":"b7669ed4-670c-4a2c-8ce2-b69579c30e15","Type":"ContainerStarted","Data":"c87108ce29a77e5b72299ac773e4d29f9c94eb1ff616793a9159a66c2e6d7dba"} Mar 11 01:17:04 crc kubenswrapper[4744]: I0311 01:17:04.845265 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-95dc-account-create-update-q7hx6" event={"ID":"fbe2524a-0c58-4294-b9e2-640fe0b5d294","Type":"ContainerStarted","Data":"20a01619818e7dbd626af61d1d21081fef387b9758203492db97c5d65bbd5004"} Mar 11 01:17:04 crc kubenswrapper[4744]: I0311 01:17:04.847789 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-876b-account-create-update-srkst" event={"ID":"526978e0-6809-4f61-863a-c7c1c54a7507","Type":"ContainerStarted","Data":"306141ba6621f5ddc67297b0a9deeb00ad75cfa4cb3f97e794e435f1ff75f23d"} Mar 11 01:17:04 crc kubenswrapper[4744]: I0311 01:17:04.849304 4744 generic.go:334] "Generic (PLEG): container finished" podID="e530fb58-e471-4681-8602-6218f09b0c04" 
containerID="8cb2fb1eaf36c371a675c0619e8832d79d08d8d675c411d59203012135eb70ca" exitCode=0 Mar 11 01:17:04 crc kubenswrapper[4744]: I0311 01:17:04.849374 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-rrtn9" event={"ID":"e530fb58-e471-4681-8602-6218f09b0c04","Type":"ContainerDied","Data":"8cb2fb1eaf36c371a675c0619e8832d79d08d8d675c411d59203012135eb70ca"} Mar 11 01:17:04 crc kubenswrapper[4744]: I0311 01:17:04.849403 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-rrtn9" event={"ID":"e530fb58-e471-4681-8602-6218f09b0c04","Type":"ContainerStarted","Data":"0eb1b0a4e44ed6a148499dcf33e9cc32a37f712cbfa7e38f80c1f9a3ed3e1ef1"} Mar 11 01:17:04 crc kubenswrapper[4744]: I0311 01:17:04.852464 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-e282-account-create-update-cq8mf" event={"ID":"3a3ecb10-c01d-44af-9dc1-18a83c479f37","Type":"ContainerStarted","Data":"69482b178fafc05c024eb004a115beaab033032799b5cadbee23b74dfb43962a"} Mar 11 01:17:04 crc kubenswrapper[4744]: I0311 01:17:04.852519 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-e282-account-create-update-cq8mf" event={"ID":"3a3ecb10-c01d-44af-9dc1-18a83c479f37","Type":"ContainerStarted","Data":"9eb208b7f476a0eec60f40a29b0aee8270ee7fd5d267941c738152c3cf9da7d7"} Mar 11 01:17:04 crc kubenswrapper[4744]: I0311 01:17:04.857908 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d53314a9-6f4b-45ad-9715-01d2d0ae2e63","Type":"ContainerStarted","Data":"1b99a11e0412ef61fcdcf759bfdf4436873046edae4487c1e4528846a4da27e7"} Mar 11 01:17:04 crc kubenswrapper[4744]: I0311 01:17:04.858721 4744 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-db-create-45rmx" podStartSLOduration=2.858681865 podStartE2EDuration="2.858681865s" podCreationTimestamp="2026-03-11 01:17:02 +0000 UTC" 
firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-11 01:17:04.856627962 +0000 UTC m=+1381.660845567" watchObservedRunningTime="2026-03-11 01:17:04.858681865 +0000 UTC m=+1381.662899470" Mar 11 01:17:04 crc kubenswrapper[4744]: I0311 01:17:04.860523 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" event={"ID":"35ec702b-4aa6-4fa6-a770-ec3caf762d5f","Type":"ContainerStarted","Data":"fa97b985b022afa1a47a024d8d1193a6165049de48146766011f761fe39d7dce"} Mar 11 01:17:04 crc kubenswrapper[4744]: I0311 01:17:04.865852 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-k9frb" event={"ID":"958fb132-39c9-4989-a322-8691247f7b22","Type":"ContainerStarted","Data":"9142db13bbc5f82956a74594d528d0085a37cf1bf44730e738c589a0b9fa71b1"} Mar 11 01:17:04 crc kubenswrapper[4744]: I0311 01:17:04.866061 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-k9frb" event={"ID":"958fb132-39c9-4989-a322-8691247f7b22","Type":"ContainerStarted","Data":"fa4d51a1f79a51bf958eb80de3d27b0a648d0d1e3f1d2410ba3983ea972ecadc"} Mar 11 01:17:04 crc kubenswrapper[4744]: I0311 01:17:04.911411 4744 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-e282-account-create-update-cq8mf" podStartSLOduration=1.911393726 podStartE2EDuration="1.911393726s" podCreationTimestamp="2026-03-11 01:17:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-11 01:17:04.885144024 +0000 UTC m=+1381.689361649" watchObservedRunningTime="2026-03-11 01:17:04.911393726 +0000 UTC m=+1381.715611331" Mar 11 01:17:04 crc kubenswrapper[4744]: I0311 01:17:04.924728 4744 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstackclient" podStartSLOduration=2.806209453 
podStartE2EDuration="12.924711467s" podCreationTimestamp="2026-03-11 01:16:52 +0000 UTC" firstStartedPulling="2026-03-11 01:16:53.698209818 +0000 UTC m=+1370.502427423" lastFinishedPulling="2026-03-11 01:17:03.816711832 +0000 UTC m=+1380.620929437" observedRunningTime="2026-03-11 01:17:04.905047859 +0000 UTC m=+1381.709265464" watchObservedRunningTime="2026-03-11 01:17:04.924711467 +0000 UTC m=+1381.728929062" Mar 11 01:17:04 crc kubenswrapper[4744]: I0311 01:17:04.933578 4744 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-db-create-k9frb" podStartSLOduration=1.933564301 podStartE2EDuration="1.933564301s" podCreationTimestamp="2026-03-11 01:17:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-11 01:17:04.91449173 +0000 UTC m=+1381.718709335" watchObservedRunningTime="2026-03-11 01:17:04.933564301 +0000 UTC m=+1381.737781906" Mar 11 01:17:05 crc kubenswrapper[4744]: I0311 01:17:05.878954 4744 generic.go:334] "Generic (PLEG): container finished" podID="3a3ecb10-c01d-44af-9dc1-18a83c479f37" containerID="69482b178fafc05c024eb004a115beaab033032799b5cadbee23b74dfb43962a" exitCode=0 Mar 11 01:17:05 crc kubenswrapper[4744]: I0311 01:17:05.879438 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-e282-account-create-update-cq8mf" event={"ID":"3a3ecb10-c01d-44af-9dc1-18a83c479f37","Type":"ContainerDied","Data":"69482b178fafc05c024eb004a115beaab033032799b5cadbee23b74dfb43962a"} Mar 11 01:17:05 crc kubenswrapper[4744]: I0311 01:17:05.893470 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d53314a9-6f4b-45ad-9715-01d2d0ae2e63","Type":"ContainerStarted","Data":"c95f1358c6f242ba93ec6e3c9bf5157f8ade313f8633ecb8b9af1f284d336a76"} Mar 11 01:17:05 crc kubenswrapper[4744]: I0311 01:17:05.898903 4744 generic.go:334] "Generic (PLEG): container finished" 
podID="958fb132-39c9-4989-a322-8691247f7b22" containerID="9142db13bbc5f82956a74594d528d0085a37cf1bf44730e738c589a0b9fa71b1" exitCode=0 Mar 11 01:17:05 crc kubenswrapper[4744]: I0311 01:17:05.898992 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-k9frb" event={"ID":"958fb132-39c9-4989-a322-8691247f7b22","Type":"ContainerDied","Data":"9142db13bbc5f82956a74594d528d0085a37cf1bf44730e738c589a0b9fa71b1"} Mar 11 01:17:05 crc kubenswrapper[4744]: I0311 01:17:05.903270 4744 generic.go:334] "Generic (PLEG): container finished" podID="b7669ed4-670c-4a2c-8ce2-b69579c30e15" containerID="ebdc71e1a72435c8db752c01353cd2455830979afbd2fb878bea63ca113a125b" exitCode=0 Mar 11 01:17:05 crc kubenswrapper[4744]: I0311 01:17:05.903325 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-45rmx" event={"ID":"b7669ed4-670c-4a2c-8ce2-b69579c30e15","Type":"ContainerDied","Data":"ebdc71e1a72435c8db752c01353cd2455830979afbd2fb878bea63ca113a125b"} Mar 11 01:17:05 crc kubenswrapper[4744]: I0311 01:17:05.904271 4744 generic.go:334] "Generic (PLEG): container finished" podID="fbe2524a-0c58-4294-b9e2-640fe0b5d294" containerID="f8b8f7b449a25530a4e56bdbe6a657c349409b4ba50248bbde453d067a3c546c" exitCode=0 Mar 11 01:17:05 crc kubenswrapper[4744]: I0311 01:17:05.904307 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-95dc-account-create-update-q7hx6" event={"ID":"fbe2524a-0c58-4294-b9e2-640fe0b5d294","Type":"ContainerDied","Data":"f8b8f7b449a25530a4e56bdbe6a657c349409b4ba50248bbde453d067a3c546c"} Mar 11 01:17:05 crc kubenswrapper[4744]: I0311 01:17:05.905435 4744 generic.go:334] "Generic (PLEG): container finished" podID="526978e0-6809-4f61-863a-c7c1c54a7507" containerID="66c68ed9f5fec4903698aade9ec4c3fbc9295c3abb2ec1eb2b9ad1cedcf3b073" exitCode=0 Mar 11 01:17:05 crc kubenswrapper[4744]: I0311 01:17:05.905485 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/nova-cell1-876b-account-create-update-srkst" event={"ID":"526978e0-6809-4f61-863a-c7c1c54a7507","Type":"ContainerDied","Data":"66c68ed9f5fec4903698aade9ec4c3fbc9295c3abb2ec1eb2b9ad1cedcf3b073"} Mar 11 01:17:06 crc kubenswrapper[4744]: I0311 01:17:06.271167 4744 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-rrtn9" Mar 11 01:17:06 crc kubenswrapper[4744]: I0311 01:17:06.359265 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e530fb58-e471-4681-8602-6218f09b0c04-operator-scripts\") pod \"e530fb58-e471-4681-8602-6218f09b0c04\" (UID: \"e530fb58-e471-4681-8602-6218f09b0c04\") " Mar 11 01:17:06 crc kubenswrapper[4744]: I0311 01:17:06.359424 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9wp4p\" (UniqueName: \"kubernetes.io/projected/e530fb58-e471-4681-8602-6218f09b0c04-kube-api-access-9wp4p\") pod \"e530fb58-e471-4681-8602-6218f09b0c04\" (UID: \"e530fb58-e471-4681-8602-6218f09b0c04\") " Mar 11 01:17:06 crc kubenswrapper[4744]: I0311 01:17:06.360306 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e530fb58-e471-4681-8602-6218f09b0c04-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "e530fb58-e471-4681-8602-6218f09b0c04" (UID: "e530fb58-e471-4681-8602-6218f09b0c04"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 11 01:17:06 crc kubenswrapper[4744]: I0311 01:17:06.365210 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e530fb58-e471-4681-8602-6218f09b0c04-kube-api-access-9wp4p" (OuterVolumeSpecName: "kube-api-access-9wp4p") pod "e530fb58-e471-4681-8602-6218f09b0c04" (UID: "e530fb58-e471-4681-8602-6218f09b0c04"). InnerVolumeSpecName "kube-api-access-9wp4p". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 01:17:06 crc kubenswrapper[4744]: I0311 01:17:06.462785 4744 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9wp4p\" (UniqueName: \"kubernetes.io/projected/e530fb58-e471-4681-8602-6218f09b0c04-kube-api-access-9wp4p\") on node \"crc\" DevicePath \"\"" Mar 11 01:17:06 crc kubenswrapper[4744]: I0311 01:17:06.462814 4744 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e530fb58-e471-4681-8602-6218f09b0c04-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 11 01:17:06 crc kubenswrapper[4744]: I0311 01:17:06.463464 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/swift-proxy-6bb49fdf95-9g9dz" Mar 11 01:17:06 crc kubenswrapper[4744]: I0311 01:17:06.464983 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/swift-proxy-6bb49fdf95-9g9dz" Mar 11 01:17:06 crc kubenswrapper[4744]: I0311 01:17:06.935785 4744 scope.go:117] "RemoveContainer" containerID="eb0d337444e47aa6e31605c97ea9af0ca01d4b30374bc3a640200fbe0eae7572" Mar 11 01:17:06 crc kubenswrapper[4744]: I0311 01:17:06.961596 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-rrtn9" event={"ID":"e530fb58-e471-4681-8602-6218f09b0c04","Type":"ContainerDied","Data":"0eb1b0a4e44ed6a148499dcf33e9cc32a37f712cbfa7e38f80c1f9a3ed3e1ef1"} Mar 11 01:17:06 crc kubenswrapper[4744]: I0311 01:17:06.962187 4744 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0eb1b0a4e44ed6a148499dcf33e9cc32a37f712cbfa7e38f80c1f9a3ed3e1ef1" Mar 11 01:17:06 crc kubenswrapper[4744]: I0311 01:17:06.961896 4744 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-db-create-rrtn9" Mar 11 01:17:06 crc kubenswrapper[4744]: I0311 01:17:06.976008 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d53314a9-6f4b-45ad-9715-01d2d0ae2e63","Type":"ContainerStarted","Data":"32c0d779479b088f9c612d9b3507ed25a3fc8acb9dcda61c8bf9d31655ca6cf6"} Mar 11 01:17:07 crc kubenswrapper[4744]: I0311 01:17:07.488810 4744 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-876b-account-create-update-srkst" Mar 11 01:17:07 crc kubenswrapper[4744]: I0311 01:17:07.647072 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/526978e0-6809-4f61-863a-c7c1c54a7507-operator-scripts\") pod \"526978e0-6809-4f61-863a-c7c1c54a7507\" (UID: \"526978e0-6809-4f61-863a-c7c1c54a7507\") " Mar 11 01:17:07 crc kubenswrapper[4744]: I0311 01:17:07.647171 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gmzcn\" (UniqueName: \"kubernetes.io/projected/526978e0-6809-4f61-863a-c7c1c54a7507-kube-api-access-gmzcn\") pod \"526978e0-6809-4f61-863a-c7c1c54a7507\" (UID: \"526978e0-6809-4f61-863a-c7c1c54a7507\") " Mar 11 01:17:07 crc kubenswrapper[4744]: I0311 01:17:07.649048 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/526978e0-6809-4f61-863a-c7c1c54a7507-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "526978e0-6809-4f61-863a-c7c1c54a7507" (UID: "526978e0-6809-4f61-863a-c7c1c54a7507"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 11 01:17:07 crc kubenswrapper[4744]: I0311 01:17:07.653428 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/526978e0-6809-4f61-863a-c7c1c54a7507-kube-api-access-gmzcn" (OuterVolumeSpecName: "kube-api-access-gmzcn") pod "526978e0-6809-4f61-863a-c7c1c54a7507" (UID: "526978e0-6809-4f61-863a-c7c1c54a7507"). InnerVolumeSpecName "kube-api-access-gmzcn". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 01:17:07 crc kubenswrapper[4744]: I0311 01:17:07.686482 4744 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-45rmx" Mar 11 01:17:07 crc kubenswrapper[4744]: I0311 01:17:07.692076 4744 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-k9frb" Mar 11 01:17:07 crc kubenswrapper[4744]: I0311 01:17:07.712686 4744 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-e282-account-create-update-cq8mf" Mar 11 01:17:07 crc kubenswrapper[4744]: I0311 01:17:07.730871 4744 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-95dc-account-create-update-q7hx6" Mar 11 01:17:07 crc kubenswrapper[4744]: I0311 01:17:07.749157 4744 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gmzcn\" (UniqueName: \"kubernetes.io/projected/526978e0-6809-4f61-863a-c7c1c54a7507-kube-api-access-gmzcn\") on node \"crc\" DevicePath \"\"" Mar 11 01:17:07 crc kubenswrapper[4744]: I0311 01:17:07.749184 4744 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/526978e0-6809-4f61-863a-c7c1c54a7507-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 11 01:17:07 crc kubenswrapper[4744]: I0311 01:17:07.849842 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/fbe2524a-0c58-4294-b9e2-640fe0b5d294-operator-scripts\") pod \"fbe2524a-0c58-4294-b9e2-640fe0b5d294\" (UID: \"fbe2524a-0c58-4294-b9e2-640fe0b5d294\") " Mar 11 01:17:07 crc kubenswrapper[4744]: I0311 01:17:07.849890 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3a3ecb10-c01d-44af-9dc1-18a83c479f37-operator-scripts\") pod \"3a3ecb10-c01d-44af-9dc1-18a83c479f37\" (UID: \"3a3ecb10-c01d-44af-9dc1-18a83c479f37\") " Mar 11 01:17:07 crc kubenswrapper[4744]: I0311 01:17:07.849920 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7j6b2\" (UniqueName: \"kubernetes.io/projected/3a3ecb10-c01d-44af-9dc1-18a83c479f37-kube-api-access-7j6b2\") pod \"3a3ecb10-c01d-44af-9dc1-18a83c479f37\" (UID: \"3a3ecb10-c01d-44af-9dc1-18a83c479f37\") " Mar 11 01:17:07 crc kubenswrapper[4744]: I0311 01:17:07.849942 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zt9xm\" (UniqueName: \"kubernetes.io/projected/fbe2524a-0c58-4294-b9e2-640fe0b5d294-kube-api-access-zt9xm\") pod 
\"fbe2524a-0c58-4294-b9e2-640fe0b5d294\" (UID: \"fbe2524a-0c58-4294-b9e2-640fe0b5d294\") " Mar 11 01:17:07 crc kubenswrapper[4744]: I0311 01:17:07.849963 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b7669ed4-670c-4a2c-8ce2-b69579c30e15-operator-scripts\") pod \"b7669ed4-670c-4a2c-8ce2-b69579c30e15\" (UID: \"b7669ed4-670c-4a2c-8ce2-b69579c30e15\") " Mar 11 01:17:07 crc kubenswrapper[4744]: I0311 01:17:07.849998 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lg7s8\" (UniqueName: \"kubernetes.io/projected/b7669ed4-670c-4a2c-8ce2-b69579c30e15-kube-api-access-lg7s8\") pod \"b7669ed4-670c-4a2c-8ce2-b69579c30e15\" (UID: \"b7669ed4-670c-4a2c-8ce2-b69579c30e15\") " Mar 11 01:17:07 crc kubenswrapper[4744]: I0311 01:17:07.850055 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/958fb132-39c9-4989-a322-8691247f7b22-operator-scripts\") pod \"958fb132-39c9-4989-a322-8691247f7b22\" (UID: \"958fb132-39c9-4989-a322-8691247f7b22\") " Mar 11 01:17:07 crc kubenswrapper[4744]: I0311 01:17:07.850073 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-n98mb\" (UniqueName: \"kubernetes.io/projected/958fb132-39c9-4989-a322-8691247f7b22-kube-api-access-n98mb\") pod \"958fb132-39c9-4989-a322-8691247f7b22\" (UID: \"958fb132-39c9-4989-a322-8691247f7b22\") " Mar 11 01:17:07 crc kubenswrapper[4744]: I0311 01:17:07.850235 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fbe2524a-0c58-4294-b9e2-640fe0b5d294-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "fbe2524a-0c58-4294-b9e2-640fe0b5d294" (UID: "fbe2524a-0c58-4294-b9e2-640fe0b5d294"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 11 01:17:07 crc kubenswrapper[4744]: I0311 01:17:07.850322 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3a3ecb10-c01d-44af-9dc1-18a83c479f37-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "3a3ecb10-c01d-44af-9dc1-18a83c479f37" (UID: "3a3ecb10-c01d-44af-9dc1-18a83c479f37"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 11 01:17:07 crc kubenswrapper[4744]: I0311 01:17:07.850472 4744 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/fbe2524a-0c58-4294-b9e2-640fe0b5d294-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 11 01:17:07 crc kubenswrapper[4744]: I0311 01:17:07.850484 4744 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3a3ecb10-c01d-44af-9dc1-18a83c479f37-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 11 01:17:07 crc kubenswrapper[4744]: I0311 01:17:07.851379 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/958fb132-39c9-4989-a322-8691247f7b22-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "958fb132-39c9-4989-a322-8691247f7b22" (UID: "958fb132-39c9-4989-a322-8691247f7b22"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 11 01:17:07 crc kubenswrapper[4744]: I0311 01:17:07.851728 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b7669ed4-670c-4a2c-8ce2-b69579c30e15-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "b7669ed4-670c-4a2c-8ce2-b69579c30e15" (UID: "b7669ed4-670c-4a2c-8ce2-b69579c30e15"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 11 01:17:07 crc kubenswrapper[4744]: I0311 01:17:07.854105 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fbe2524a-0c58-4294-b9e2-640fe0b5d294-kube-api-access-zt9xm" (OuterVolumeSpecName: "kube-api-access-zt9xm") pod "fbe2524a-0c58-4294-b9e2-640fe0b5d294" (UID: "fbe2524a-0c58-4294-b9e2-640fe0b5d294"). InnerVolumeSpecName "kube-api-access-zt9xm". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 01:17:07 crc kubenswrapper[4744]: I0311 01:17:07.854452 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3a3ecb10-c01d-44af-9dc1-18a83c479f37-kube-api-access-7j6b2" (OuterVolumeSpecName: "kube-api-access-7j6b2") pod "3a3ecb10-c01d-44af-9dc1-18a83c479f37" (UID: "3a3ecb10-c01d-44af-9dc1-18a83c479f37"). InnerVolumeSpecName "kube-api-access-7j6b2". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 01:17:07 crc kubenswrapper[4744]: I0311 01:17:07.854463 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b7669ed4-670c-4a2c-8ce2-b69579c30e15-kube-api-access-lg7s8" (OuterVolumeSpecName: "kube-api-access-lg7s8") pod "b7669ed4-670c-4a2c-8ce2-b69579c30e15" (UID: "b7669ed4-670c-4a2c-8ce2-b69579c30e15"). InnerVolumeSpecName "kube-api-access-lg7s8". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 01:17:07 crc kubenswrapper[4744]: I0311 01:17:07.855727 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/958fb132-39c9-4989-a322-8691247f7b22-kube-api-access-n98mb" (OuterVolumeSpecName: "kube-api-access-n98mb") pod "958fb132-39c9-4989-a322-8691247f7b22" (UID: "958fb132-39c9-4989-a322-8691247f7b22"). InnerVolumeSpecName "kube-api-access-n98mb". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 01:17:07 crc kubenswrapper[4744]: I0311 01:17:07.952432 4744 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lg7s8\" (UniqueName: \"kubernetes.io/projected/b7669ed4-670c-4a2c-8ce2-b69579c30e15-kube-api-access-lg7s8\") on node \"crc\" DevicePath \"\"" Mar 11 01:17:07 crc kubenswrapper[4744]: I0311 01:17:07.952465 4744 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/958fb132-39c9-4989-a322-8691247f7b22-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 11 01:17:07 crc kubenswrapper[4744]: I0311 01:17:07.952474 4744 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-n98mb\" (UniqueName: \"kubernetes.io/projected/958fb132-39c9-4989-a322-8691247f7b22-kube-api-access-n98mb\") on node \"crc\" DevicePath \"\"" Mar 11 01:17:07 crc kubenswrapper[4744]: I0311 01:17:07.952483 4744 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7j6b2\" (UniqueName: \"kubernetes.io/projected/3a3ecb10-c01d-44af-9dc1-18a83c479f37-kube-api-access-7j6b2\") on node \"crc\" DevicePath \"\"" Mar 11 01:17:07 crc kubenswrapper[4744]: I0311 01:17:07.952492 4744 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zt9xm\" (UniqueName: \"kubernetes.io/projected/fbe2524a-0c58-4294-b9e2-640fe0b5d294-kube-api-access-zt9xm\") on node \"crc\" DevicePath \"\"" Mar 11 01:17:07 crc kubenswrapper[4744]: I0311 01:17:07.952501 4744 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b7669ed4-670c-4a2c-8ce2-b69579c30e15-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 11 01:17:08 crc kubenswrapper[4744]: I0311 01:17:08.004264 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-k9frb" 
event={"ID":"958fb132-39c9-4989-a322-8691247f7b22","Type":"ContainerDied","Data":"fa4d51a1f79a51bf958eb80de3d27b0a648d0d1e3f1d2410ba3983ea972ecadc"} Mar 11 01:17:08 crc kubenswrapper[4744]: I0311 01:17:08.004311 4744 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="fa4d51a1f79a51bf958eb80de3d27b0a648d0d1e3f1d2410ba3983ea972ecadc" Mar 11 01:17:08 crc kubenswrapper[4744]: I0311 01:17:08.005995 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-45rmx" event={"ID":"b7669ed4-670c-4a2c-8ce2-b69579c30e15","Type":"ContainerDied","Data":"c87108ce29a77e5b72299ac773e4d29f9c94eb1ff616793a9159a66c2e6d7dba"} Mar 11 01:17:08 crc kubenswrapper[4744]: I0311 01:17:08.006013 4744 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c87108ce29a77e5b72299ac773e4d29f9c94eb1ff616793a9159a66c2e6d7dba" Mar 11 01:17:08 crc kubenswrapper[4744]: I0311 01:17:08.006066 4744 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-45rmx" Mar 11 01:17:08 crc kubenswrapper[4744]: I0311 01:17:08.010005 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-876b-account-create-update-srkst" event={"ID":"526978e0-6809-4f61-863a-c7c1c54a7507","Type":"ContainerDied","Data":"306141ba6621f5ddc67297b0a9deeb00ad75cfa4cb3f97e794e435f1ff75f23d"} Mar 11 01:17:08 crc kubenswrapper[4744]: I0311 01:17:08.010026 4744 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-876b-account-create-update-srkst" Mar 11 01:17:08 crc kubenswrapper[4744]: I0311 01:17:08.010030 4744 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="306141ba6621f5ddc67297b0a9deeb00ad75cfa4cb3f97e794e435f1ff75f23d" Mar 11 01:17:08 crc kubenswrapper[4744]: I0311 01:17:08.012300 4744 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-db-create-k9frb" Mar 11 01:17:08 crc kubenswrapper[4744]: I0311 01:17:08.013387 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-e282-account-create-update-cq8mf" event={"ID":"3a3ecb10-c01d-44af-9dc1-18a83c479f37","Type":"ContainerDied","Data":"9eb208b7f476a0eec60f40a29b0aee8270ee7fd5d267941c738152c3cf9da7d7"} Mar 11 01:17:08 crc kubenswrapper[4744]: I0311 01:17:08.013438 4744 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9eb208b7f476a0eec60f40a29b0aee8270ee7fd5d267941c738152c3cf9da7d7" Mar 11 01:17:08 crc kubenswrapper[4744]: I0311 01:17:08.013503 4744 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-e282-account-create-update-cq8mf" Mar 11 01:17:08 crc kubenswrapper[4744]: I0311 01:17:08.020256 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-95dc-account-create-update-q7hx6" event={"ID":"fbe2524a-0c58-4294-b9e2-640fe0b5d294","Type":"ContainerDied","Data":"20a01619818e7dbd626af61d1d21081fef387b9758203492db97c5d65bbd5004"} Mar 11 01:17:08 crc kubenswrapper[4744]: I0311 01:17:08.020286 4744 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-95dc-account-create-update-q7hx6" Mar 11 01:17:08 crc kubenswrapper[4744]: I0311 01:17:08.020294 4744 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="20a01619818e7dbd626af61d1d21081fef387b9758203492db97c5d65bbd5004" Mar 11 01:17:09 crc kubenswrapper[4744]: I0311 01:17:09.030409 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d53314a9-6f4b-45ad-9715-01d2d0ae2e63","Type":"ContainerStarted","Data":"f4b1fe43db3071c65f379b53716d74df59186093a7ee23025d4e2e33962582b6"} Mar 11 01:17:09 crc kubenswrapper[4744]: I0311 01:17:09.030855 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Mar 11 01:17:09 crc kubenswrapper[4744]: I0311 01:17:09.030859 4744 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="d53314a9-6f4b-45ad-9715-01d2d0ae2e63" containerName="ceilometer-central-agent" containerID="cri-o://1b99a11e0412ef61fcdcf759bfdf4436873046edae4487c1e4528846a4da27e7" gracePeriod=30 Mar 11 01:17:09 crc kubenswrapper[4744]: I0311 01:17:09.031021 4744 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="d53314a9-6f4b-45ad-9715-01d2d0ae2e63" containerName="proxy-httpd" containerID="cri-o://f4b1fe43db3071c65f379b53716d74df59186093a7ee23025d4e2e33962582b6" gracePeriod=30 Mar 11 01:17:09 crc kubenswrapper[4744]: I0311 01:17:09.031086 4744 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="d53314a9-6f4b-45ad-9715-01d2d0ae2e63" containerName="sg-core" containerID="cri-o://32c0d779479b088f9c612d9b3507ed25a3fc8acb9dcda61c8bf9d31655ca6cf6" gracePeriod=30 Mar 11 01:17:09 crc kubenswrapper[4744]: I0311 01:17:09.031147 4744 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" 
podUID="d53314a9-6f4b-45ad-9715-01d2d0ae2e63" containerName="ceilometer-notification-agent" containerID="cri-o://c95f1358c6f242ba93ec6e3c9bf5157f8ade313f8633ecb8b9af1f284d336a76" gracePeriod=30 Mar 11 01:17:09 crc kubenswrapper[4744]: I0311 01:17:09.061619 4744 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.455268694 podStartE2EDuration="12.061597503s" podCreationTimestamp="2026-03-11 01:16:57 +0000 UTC" firstStartedPulling="2026-03-11 01:16:58.102883033 +0000 UTC m=+1374.907100638" lastFinishedPulling="2026-03-11 01:17:07.709211852 +0000 UTC m=+1384.513429447" observedRunningTime="2026-03-11 01:17:09.050135118 +0000 UTC m=+1385.854352723" watchObservedRunningTime="2026-03-11 01:17:09.061597503 +0000 UTC m=+1385.865815128" Mar 11 01:17:10 crc kubenswrapper[4744]: I0311 01:17:10.043935 4744 generic.go:334] "Generic (PLEG): container finished" podID="d53314a9-6f4b-45ad-9715-01d2d0ae2e63" containerID="f4b1fe43db3071c65f379b53716d74df59186093a7ee23025d4e2e33962582b6" exitCode=0 Mar 11 01:17:10 crc kubenswrapper[4744]: I0311 01:17:10.044578 4744 generic.go:334] "Generic (PLEG): container finished" podID="d53314a9-6f4b-45ad-9715-01d2d0ae2e63" containerID="32c0d779479b088f9c612d9b3507ed25a3fc8acb9dcda61c8bf9d31655ca6cf6" exitCode=2 Mar 11 01:17:10 crc kubenswrapper[4744]: I0311 01:17:10.044649 4744 generic.go:334] "Generic (PLEG): container finished" podID="d53314a9-6f4b-45ad-9715-01d2d0ae2e63" containerID="c95f1358c6f242ba93ec6e3c9bf5157f8ade313f8633ecb8b9af1f284d336a76" exitCode=0 Mar 11 01:17:10 crc kubenswrapper[4744]: I0311 01:17:10.043971 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d53314a9-6f4b-45ad-9715-01d2d0ae2e63","Type":"ContainerDied","Data":"f4b1fe43db3071c65f379b53716d74df59186093a7ee23025d4e2e33962582b6"} Mar 11 01:17:10 crc kubenswrapper[4744]: I0311 01:17:10.044802 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/ceilometer-0" event={"ID":"d53314a9-6f4b-45ad-9715-01d2d0ae2e63","Type":"ContainerDied","Data":"32c0d779479b088f9c612d9b3507ed25a3fc8acb9dcda61c8bf9d31655ca6cf6"} Mar 11 01:17:10 crc kubenswrapper[4744]: I0311 01:17:10.044862 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d53314a9-6f4b-45ad-9715-01d2d0ae2e63","Type":"ContainerDied","Data":"c95f1358c6f242ba93ec6e3c9bf5157f8ade313f8633ecb8b9af1f284d336a76"} Mar 11 01:17:10 crc kubenswrapper[4744]: I0311 01:17:10.133054 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/neutron-5959cf6645-bcjjf" Mar 11 01:17:10 crc kubenswrapper[4744]: I0311 01:17:10.217439 4744 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-5568fddbb8-2fn4w"] Mar 11 01:17:10 crc kubenswrapper[4744]: I0311 01:17:10.217992 4744 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-5568fddbb8-2fn4w" podUID="7633b6fe-9d97-4a1e-b1fb-6cf2e02ad1c9" containerName="neutron-httpd" containerID="cri-o://1a1f227b90e4b8581f9aee4646f1053a56e9395e2bb663593d46ed7803fc986d" gracePeriod=30 Mar 11 01:17:10 crc kubenswrapper[4744]: I0311 01:17:10.218289 4744 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-5568fddbb8-2fn4w" podUID="7633b6fe-9d97-4a1e-b1fb-6cf2e02ad1c9" containerName="neutron-api" containerID="cri-o://7b8d25281984fbd8762e523200998295e6a8e82c51a73cb0a34304444754f847" gracePeriod=30 Mar 11 01:17:11 crc kubenswrapper[4744]: I0311 01:17:11.054178 4744 generic.go:334] "Generic (PLEG): container finished" podID="7633b6fe-9d97-4a1e-b1fb-6cf2e02ad1c9" containerID="1a1f227b90e4b8581f9aee4646f1053a56e9395e2bb663593d46ed7803fc986d" exitCode=0 Mar 11 01:17:11 crc kubenswrapper[4744]: I0311 01:17:11.054239 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-5568fddbb8-2fn4w" 
event={"ID":"7633b6fe-9d97-4a1e-b1fb-6cf2e02ad1c9","Type":"ContainerDied","Data":"1a1f227b90e4b8581f9aee4646f1053a56e9395e2bb663593d46ed7803fc986d"} Mar 11 01:17:12 crc kubenswrapper[4744]: I0311 01:17:12.409144 4744 patch_prober.go:28] interesting pod/machine-config-daemon-678nx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 11 01:17:12 crc kubenswrapper[4744]: I0311 01:17:12.409414 4744 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-678nx" podUID="a15dc7ac-7c34-4135-b6eb-a85122800ce9" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 11 01:17:12 crc kubenswrapper[4744]: I0311 01:17:12.409454 4744 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-678nx" Mar 11 01:17:12 crc kubenswrapper[4744]: I0311 01:17:12.410437 4744 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"cf996016ced3f16e6107f678cce67e4c982c8fa30c807453262d53b1c072f436"} pod="openshift-machine-config-operator/machine-config-daemon-678nx" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 11 01:17:12 crc kubenswrapper[4744]: I0311 01:17:12.410495 4744 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-678nx" podUID="a15dc7ac-7c34-4135-b6eb-a85122800ce9" containerName="machine-config-daemon" containerID="cri-o://cf996016ced3f16e6107f678cce67e4c982c8fa30c807453262d53b1c072f436" gracePeriod=600 Mar 11 01:17:13 crc kubenswrapper[4744]: I0311 01:17:13.083096 4744 
generic.go:334] "Generic (PLEG): container finished" podID="a15dc7ac-7c34-4135-b6eb-a85122800ce9" containerID="cf996016ced3f16e6107f678cce67e4c982c8fa30c807453262d53b1c072f436" exitCode=0 Mar 11 01:17:13 crc kubenswrapper[4744]: I0311 01:17:13.083560 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-678nx" event={"ID":"a15dc7ac-7c34-4135-b6eb-a85122800ce9","Type":"ContainerDied","Data":"cf996016ced3f16e6107f678cce67e4c982c8fa30c807453262d53b1c072f436"} Mar 11 01:17:13 crc kubenswrapper[4744]: I0311 01:17:13.087065 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-678nx" event={"ID":"a15dc7ac-7c34-4135-b6eb-a85122800ce9","Type":"ContainerStarted","Data":"0473256ff02bdf69162deacfefe5387c093ffe2e643f7878e13771cc69dd585b"} Mar 11 01:17:13 crc kubenswrapper[4744]: I0311 01:17:13.087111 4744 scope.go:117] "RemoveContainer" containerID="9f82ae9e65d034b974cecba295d1e92bb34ed10ce5e057ec718abcef76965433" Mar 11 01:17:13 crc kubenswrapper[4744]: I0311 01:17:13.285477 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-conductor-db-sync-knwz7"] Mar 11 01:17:13 crc kubenswrapper[4744]: E0311 01:17:13.286129 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fbe2524a-0c58-4294-b9e2-640fe0b5d294" containerName="mariadb-account-create-update" Mar 11 01:17:13 crc kubenswrapper[4744]: I0311 01:17:13.286147 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="fbe2524a-0c58-4294-b9e2-640fe0b5d294" containerName="mariadb-account-create-update" Mar 11 01:17:13 crc kubenswrapper[4744]: E0311 01:17:13.286158 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3a3ecb10-c01d-44af-9dc1-18a83c479f37" containerName="mariadb-account-create-update" Mar 11 01:17:13 crc kubenswrapper[4744]: I0311 01:17:13.286165 4744 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="3a3ecb10-c01d-44af-9dc1-18a83c479f37" containerName="mariadb-account-create-update" Mar 11 01:17:13 crc kubenswrapper[4744]: E0311 01:17:13.286182 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="526978e0-6809-4f61-863a-c7c1c54a7507" containerName="mariadb-account-create-update" Mar 11 01:17:13 crc kubenswrapper[4744]: I0311 01:17:13.286188 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="526978e0-6809-4f61-863a-c7c1c54a7507" containerName="mariadb-account-create-update" Mar 11 01:17:13 crc kubenswrapper[4744]: E0311 01:17:13.286210 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e530fb58-e471-4681-8602-6218f09b0c04" containerName="mariadb-database-create" Mar 11 01:17:13 crc kubenswrapper[4744]: I0311 01:17:13.286217 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="e530fb58-e471-4681-8602-6218f09b0c04" containerName="mariadb-database-create" Mar 11 01:17:13 crc kubenswrapper[4744]: E0311 01:17:13.286230 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b7669ed4-670c-4a2c-8ce2-b69579c30e15" containerName="mariadb-database-create" Mar 11 01:17:13 crc kubenswrapper[4744]: I0311 01:17:13.286252 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="b7669ed4-670c-4a2c-8ce2-b69579c30e15" containerName="mariadb-database-create" Mar 11 01:17:13 crc kubenswrapper[4744]: E0311 01:17:13.286260 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="958fb132-39c9-4989-a322-8691247f7b22" containerName="mariadb-database-create" Mar 11 01:17:13 crc kubenswrapper[4744]: I0311 01:17:13.286266 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="958fb132-39c9-4989-a322-8691247f7b22" containerName="mariadb-database-create" Mar 11 01:17:13 crc kubenswrapper[4744]: I0311 01:17:13.286415 4744 memory_manager.go:354] "RemoveStaleState removing state" podUID="e530fb58-e471-4681-8602-6218f09b0c04" containerName="mariadb-database-create" Mar 11 01:17:13 crc 
kubenswrapper[4744]: I0311 01:17:13.286428 4744 memory_manager.go:354] "RemoveStaleState removing state" podUID="3a3ecb10-c01d-44af-9dc1-18a83c479f37" containerName="mariadb-account-create-update" Mar 11 01:17:13 crc kubenswrapper[4744]: I0311 01:17:13.286444 4744 memory_manager.go:354] "RemoveStaleState removing state" podUID="b7669ed4-670c-4a2c-8ce2-b69579c30e15" containerName="mariadb-database-create" Mar 11 01:17:13 crc kubenswrapper[4744]: I0311 01:17:13.286454 4744 memory_manager.go:354] "RemoveStaleState removing state" podUID="526978e0-6809-4f61-863a-c7c1c54a7507" containerName="mariadb-account-create-update" Mar 11 01:17:13 crc kubenswrapper[4744]: I0311 01:17:13.286463 4744 memory_manager.go:354] "RemoveStaleState removing state" podUID="fbe2524a-0c58-4294-b9e2-640fe0b5d294" containerName="mariadb-account-create-update" Mar 11 01:17:13 crc kubenswrapper[4744]: I0311 01:17:13.286473 4744 memory_manager.go:354] "RemoveStaleState removing state" podUID="958fb132-39c9-4989-a322-8691247f7b22" containerName="mariadb-database-create" Mar 11 01:17:13 crc kubenswrapper[4744]: I0311 01:17:13.287011 4744 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-knwz7" Mar 11 01:17:13 crc kubenswrapper[4744]: I0311 01:17:13.288877 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-scripts" Mar 11 01:17:13 crc kubenswrapper[4744]: I0311 01:17:13.289450 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-config-data" Mar 11 01:17:13 crc kubenswrapper[4744]: I0311 01:17:13.290020 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-nova-dockercfg-g7bsd" Mar 11 01:17:13 crc kubenswrapper[4744]: I0311 01:17:13.301769 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-knwz7"] Mar 11 01:17:13 crc kubenswrapper[4744]: I0311 01:17:13.464155 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c263c020-5938-4b77-b265-c297ae87f084-scripts\") pod \"nova-cell0-conductor-db-sync-knwz7\" (UID: \"c263c020-5938-4b77-b265-c297ae87f084\") " pod="openstack/nova-cell0-conductor-db-sync-knwz7" Mar 11 01:17:13 crc kubenswrapper[4744]: I0311 01:17:13.464231 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c263c020-5938-4b77-b265-c297ae87f084-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-knwz7\" (UID: \"c263c020-5938-4b77-b265-c297ae87f084\") " pod="openstack/nova-cell0-conductor-db-sync-knwz7" Mar 11 01:17:13 crc kubenswrapper[4744]: I0311 01:17:13.464297 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c263c020-5938-4b77-b265-c297ae87f084-config-data\") pod \"nova-cell0-conductor-db-sync-knwz7\" (UID: \"c263c020-5938-4b77-b265-c297ae87f084\") " 
pod="openstack/nova-cell0-conductor-db-sync-knwz7" Mar 11 01:17:13 crc kubenswrapper[4744]: I0311 01:17:13.464321 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dv2vw\" (UniqueName: \"kubernetes.io/projected/c263c020-5938-4b77-b265-c297ae87f084-kube-api-access-dv2vw\") pod \"nova-cell0-conductor-db-sync-knwz7\" (UID: \"c263c020-5938-4b77-b265-c297ae87f084\") " pod="openstack/nova-cell0-conductor-db-sync-knwz7" Mar 11 01:17:13 crc kubenswrapper[4744]: I0311 01:17:13.565536 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c263c020-5938-4b77-b265-c297ae87f084-config-data\") pod \"nova-cell0-conductor-db-sync-knwz7\" (UID: \"c263c020-5938-4b77-b265-c297ae87f084\") " pod="openstack/nova-cell0-conductor-db-sync-knwz7" Mar 11 01:17:13 crc kubenswrapper[4744]: I0311 01:17:13.565581 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dv2vw\" (UniqueName: \"kubernetes.io/projected/c263c020-5938-4b77-b265-c297ae87f084-kube-api-access-dv2vw\") pod \"nova-cell0-conductor-db-sync-knwz7\" (UID: \"c263c020-5938-4b77-b265-c297ae87f084\") " pod="openstack/nova-cell0-conductor-db-sync-knwz7" Mar 11 01:17:13 crc kubenswrapper[4744]: I0311 01:17:13.565645 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c263c020-5938-4b77-b265-c297ae87f084-scripts\") pod \"nova-cell0-conductor-db-sync-knwz7\" (UID: \"c263c020-5938-4b77-b265-c297ae87f084\") " pod="openstack/nova-cell0-conductor-db-sync-knwz7" Mar 11 01:17:13 crc kubenswrapper[4744]: I0311 01:17:13.565691 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c263c020-5938-4b77-b265-c297ae87f084-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-knwz7\" (UID: 
\"c263c020-5938-4b77-b265-c297ae87f084\") " pod="openstack/nova-cell0-conductor-db-sync-knwz7" Mar 11 01:17:13 crc kubenswrapper[4744]: I0311 01:17:13.572376 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c263c020-5938-4b77-b265-c297ae87f084-config-data\") pod \"nova-cell0-conductor-db-sync-knwz7\" (UID: \"c263c020-5938-4b77-b265-c297ae87f084\") " pod="openstack/nova-cell0-conductor-db-sync-knwz7" Mar 11 01:17:13 crc kubenswrapper[4744]: I0311 01:17:13.572835 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c263c020-5938-4b77-b265-c297ae87f084-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-knwz7\" (UID: \"c263c020-5938-4b77-b265-c297ae87f084\") " pod="openstack/nova-cell0-conductor-db-sync-knwz7" Mar 11 01:17:13 crc kubenswrapper[4744]: I0311 01:17:13.575170 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c263c020-5938-4b77-b265-c297ae87f084-scripts\") pod \"nova-cell0-conductor-db-sync-knwz7\" (UID: \"c263c020-5938-4b77-b265-c297ae87f084\") " pod="openstack/nova-cell0-conductor-db-sync-knwz7" Mar 11 01:17:13 crc kubenswrapper[4744]: I0311 01:17:13.591157 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dv2vw\" (UniqueName: \"kubernetes.io/projected/c263c020-5938-4b77-b265-c297ae87f084-kube-api-access-dv2vw\") pod \"nova-cell0-conductor-db-sync-knwz7\" (UID: \"c263c020-5938-4b77-b265-c297ae87f084\") " pod="openstack/nova-cell0-conductor-db-sync-knwz7" Mar 11 01:17:13 crc kubenswrapper[4744]: I0311 01:17:13.602932 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-knwz7" Mar 11 01:17:14 crc kubenswrapper[4744]: I0311 01:17:14.120879 4744 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 11 01:17:14 crc kubenswrapper[4744]: I0311 01:17:14.123655 4744 generic.go:334] "Generic (PLEG): container finished" podID="7633b6fe-9d97-4a1e-b1fb-6cf2e02ad1c9" containerID="7b8d25281984fbd8762e523200998295e6a8e82c51a73cb0a34304444754f847" exitCode=0 Mar 11 01:17:14 crc kubenswrapper[4744]: I0311 01:17:14.123690 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-5568fddbb8-2fn4w" event={"ID":"7633b6fe-9d97-4a1e-b1fb-6cf2e02ad1c9","Type":"ContainerDied","Data":"7b8d25281984fbd8762e523200998295e6a8e82c51a73cb0a34304444754f847"} Mar 11 01:17:14 crc kubenswrapper[4744]: I0311 01:17:14.143988 4744 generic.go:334] "Generic (PLEG): container finished" podID="d53314a9-6f4b-45ad-9715-01d2d0ae2e63" containerID="1b99a11e0412ef61fcdcf759bfdf4436873046edae4487c1e4528846a4da27e7" exitCode=0 Mar 11 01:17:14 crc kubenswrapper[4744]: I0311 01:17:14.144030 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d53314a9-6f4b-45ad-9715-01d2d0ae2e63","Type":"ContainerDied","Data":"1b99a11e0412ef61fcdcf759bfdf4436873046edae4487c1e4528846a4da27e7"} Mar 11 01:17:14 crc kubenswrapper[4744]: I0311 01:17:14.144058 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d53314a9-6f4b-45ad-9715-01d2d0ae2e63","Type":"ContainerDied","Data":"cd1ea44b6c39d35438c572552d09db104607ad5f269a49f607a76ab2e7649bf3"} Mar 11 01:17:14 crc kubenswrapper[4744]: I0311 01:17:14.144077 4744 scope.go:117] "RemoveContainer" containerID="f4b1fe43db3071c65f379b53716d74df59186093a7ee23025d4e2e33962582b6" Mar 11 01:17:14 crc kubenswrapper[4744]: I0311 01:17:14.144223 4744 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 11 01:17:14 crc kubenswrapper[4744]: I0311 01:17:14.177024 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d53314a9-6f4b-45ad-9715-01d2d0ae2e63-log-httpd\") pod \"d53314a9-6f4b-45ad-9715-01d2d0ae2e63\" (UID: \"d53314a9-6f4b-45ad-9715-01d2d0ae2e63\") " Mar 11 01:17:14 crc kubenswrapper[4744]: I0311 01:17:14.177139 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d53314a9-6f4b-45ad-9715-01d2d0ae2e63-scripts\") pod \"d53314a9-6f4b-45ad-9715-01d2d0ae2e63\" (UID: \"d53314a9-6f4b-45ad-9715-01d2d0ae2e63\") " Mar 11 01:17:14 crc kubenswrapper[4744]: I0311 01:17:14.177187 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/d53314a9-6f4b-45ad-9715-01d2d0ae2e63-sg-core-conf-yaml\") pod \"d53314a9-6f4b-45ad-9715-01d2d0ae2e63\" (UID: \"d53314a9-6f4b-45ad-9715-01d2d0ae2e63\") " Mar 11 01:17:14 crc kubenswrapper[4744]: I0311 01:17:14.177214 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d53314a9-6f4b-45ad-9715-01d2d0ae2e63-run-httpd\") pod \"d53314a9-6f4b-45ad-9715-01d2d0ae2e63\" (UID: \"d53314a9-6f4b-45ad-9715-01d2d0ae2e63\") " Mar 11 01:17:14 crc kubenswrapper[4744]: I0311 01:17:14.177347 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d53314a9-6f4b-45ad-9715-01d2d0ae2e63-config-data\") pod \"d53314a9-6f4b-45ad-9715-01d2d0ae2e63\" (UID: \"d53314a9-6f4b-45ad-9715-01d2d0ae2e63\") " Mar 11 01:17:14 crc kubenswrapper[4744]: I0311 01:17:14.177384 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/d53314a9-6f4b-45ad-9715-01d2d0ae2e63-combined-ca-bundle\") pod \"d53314a9-6f4b-45ad-9715-01d2d0ae2e63\" (UID: \"d53314a9-6f4b-45ad-9715-01d2d0ae2e63\") " Mar 11 01:17:14 crc kubenswrapper[4744]: I0311 01:17:14.177418 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xqf2q\" (UniqueName: \"kubernetes.io/projected/d53314a9-6f4b-45ad-9715-01d2d0ae2e63-kube-api-access-xqf2q\") pod \"d53314a9-6f4b-45ad-9715-01d2d0ae2e63\" (UID: \"d53314a9-6f4b-45ad-9715-01d2d0ae2e63\") " Mar 11 01:17:14 crc kubenswrapper[4744]: I0311 01:17:14.180069 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d53314a9-6f4b-45ad-9715-01d2d0ae2e63-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "d53314a9-6f4b-45ad-9715-01d2d0ae2e63" (UID: "d53314a9-6f4b-45ad-9715-01d2d0ae2e63"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 11 01:17:14 crc kubenswrapper[4744]: I0311 01:17:14.185935 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d53314a9-6f4b-45ad-9715-01d2d0ae2e63-scripts" (OuterVolumeSpecName: "scripts") pod "d53314a9-6f4b-45ad-9715-01d2d0ae2e63" (UID: "d53314a9-6f4b-45ad-9715-01d2d0ae2e63"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 01:17:14 crc kubenswrapper[4744]: I0311 01:17:14.186208 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d53314a9-6f4b-45ad-9715-01d2d0ae2e63-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "d53314a9-6f4b-45ad-9715-01d2d0ae2e63" (UID: "d53314a9-6f4b-45ad-9715-01d2d0ae2e63"). InnerVolumeSpecName "run-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 11 01:17:14 crc kubenswrapper[4744]: I0311 01:17:14.187236 4744 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d53314a9-6f4b-45ad-9715-01d2d0ae2e63-log-httpd\") on node \"crc\" DevicePath \"\"" Mar 11 01:17:14 crc kubenswrapper[4744]: I0311 01:17:14.187260 4744 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d53314a9-6f4b-45ad-9715-01d2d0ae2e63-scripts\") on node \"crc\" DevicePath \"\"" Mar 11 01:17:14 crc kubenswrapper[4744]: I0311 01:17:14.187269 4744 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d53314a9-6f4b-45ad-9715-01d2d0ae2e63-run-httpd\") on node \"crc\" DevicePath \"\"" Mar 11 01:17:14 crc kubenswrapper[4744]: I0311 01:17:14.200882 4744 scope.go:117] "RemoveContainer" containerID="32c0d779479b088f9c612d9b3507ed25a3fc8acb9dcda61c8bf9d31655ca6cf6" Mar 11 01:17:14 crc kubenswrapper[4744]: I0311 01:17:14.215635 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d53314a9-6f4b-45ad-9715-01d2d0ae2e63-kube-api-access-xqf2q" (OuterVolumeSpecName: "kube-api-access-xqf2q") pod "d53314a9-6f4b-45ad-9715-01d2d0ae2e63" (UID: "d53314a9-6f4b-45ad-9715-01d2d0ae2e63"). InnerVolumeSpecName "kube-api-access-xqf2q". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 01:17:14 crc kubenswrapper[4744]: W0311 01:17:14.221169 4744 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc263c020_5938_4b77_b265_c297ae87f084.slice/crio-eb1bfc1563630d88f6802de7f7ba17757e99ff84ba35a4b3e446fdaab503e405 WatchSource:0}: Error finding container eb1bfc1563630d88f6802de7f7ba17757e99ff84ba35a4b3e446fdaab503e405: Status 404 returned error can't find the container with id eb1bfc1563630d88f6802de7f7ba17757e99ff84ba35a4b3e446fdaab503e405 Mar 11 01:17:14 crc kubenswrapper[4744]: I0311 01:17:14.223209 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-knwz7"] Mar 11 01:17:14 crc kubenswrapper[4744]: I0311 01:17:14.235025 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d53314a9-6f4b-45ad-9715-01d2d0ae2e63-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "d53314a9-6f4b-45ad-9715-01d2d0ae2e63" (UID: "d53314a9-6f4b-45ad-9715-01d2d0ae2e63"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 01:17:14 crc kubenswrapper[4744]: I0311 01:17:14.242453 4744 scope.go:117] "RemoveContainer" containerID="c95f1358c6f242ba93ec6e3c9bf5157f8ade313f8633ecb8b9af1f284d336a76" Mar 11 01:17:14 crc kubenswrapper[4744]: I0311 01:17:14.265747 4744 scope.go:117] "RemoveContainer" containerID="1b99a11e0412ef61fcdcf759bfdf4436873046edae4487c1e4528846a4da27e7" Mar 11 01:17:14 crc kubenswrapper[4744]: I0311 01:17:14.279167 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d53314a9-6f4b-45ad-9715-01d2d0ae2e63-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "d53314a9-6f4b-45ad-9715-01d2d0ae2e63" (UID: "d53314a9-6f4b-45ad-9715-01d2d0ae2e63"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 01:17:14 crc kubenswrapper[4744]: I0311 01:17:14.287180 4744 scope.go:117] "RemoveContainer" containerID="f4b1fe43db3071c65f379b53716d74df59186093a7ee23025d4e2e33962582b6" Mar 11 01:17:14 crc kubenswrapper[4744]: E0311 01:17:14.287621 4744 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f4b1fe43db3071c65f379b53716d74df59186093a7ee23025d4e2e33962582b6\": container with ID starting with f4b1fe43db3071c65f379b53716d74df59186093a7ee23025d4e2e33962582b6 not found: ID does not exist" containerID="f4b1fe43db3071c65f379b53716d74df59186093a7ee23025d4e2e33962582b6" Mar 11 01:17:14 crc kubenswrapper[4744]: I0311 01:17:14.287646 4744 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f4b1fe43db3071c65f379b53716d74df59186093a7ee23025d4e2e33962582b6"} err="failed to get container status \"f4b1fe43db3071c65f379b53716d74df59186093a7ee23025d4e2e33962582b6\": rpc error: code = NotFound desc = could not find container \"f4b1fe43db3071c65f379b53716d74df59186093a7ee23025d4e2e33962582b6\": container with ID starting with f4b1fe43db3071c65f379b53716d74df59186093a7ee23025d4e2e33962582b6 not found: ID does not exist" Mar 11 01:17:14 crc kubenswrapper[4744]: I0311 01:17:14.287665 4744 scope.go:117] "RemoveContainer" containerID="32c0d779479b088f9c612d9b3507ed25a3fc8acb9dcda61c8bf9d31655ca6cf6" Mar 11 01:17:14 crc kubenswrapper[4744]: E0311 01:17:14.287957 4744 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"32c0d779479b088f9c612d9b3507ed25a3fc8acb9dcda61c8bf9d31655ca6cf6\": container with ID starting with 32c0d779479b088f9c612d9b3507ed25a3fc8acb9dcda61c8bf9d31655ca6cf6 not found: ID does not exist" containerID="32c0d779479b088f9c612d9b3507ed25a3fc8acb9dcda61c8bf9d31655ca6cf6" Mar 11 01:17:14 crc kubenswrapper[4744]: I0311 01:17:14.287971 
4744 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"32c0d779479b088f9c612d9b3507ed25a3fc8acb9dcda61c8bf9d31655ca6cf6"} err="failed to get container status \"32c0d779479b088f9c612d9b3507ed25a3fc8acb9dcda61c8bf9d31655ca6cf6\": rpc error: code = NotFound desc = could not find container \"32c0d779479b088f9c612d9b3507ed25a3fc8acb9dcda61c8bf9d31655ca6cf6\": container with ID starting with 32c0d779479b088f9c612d9b3507ed25a3fc8acb9dcda61c8bf9d31655ca6cf6 not found: ID does not exist" Mar 11 01:17:14 crc kubenswrapper[4744]: I0311 01:17:14.287982 4744 scope.go:117] "RemoveContainer" containerID="c95f1358c6f242ba93ec6e3c9bf5157f8ade313f8633ecb8b9af1f284d336a76" Mar 11 01:17:14 crc kubenswrapper[4744]: I0311 01:17:14.289779 4744 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/d53314a9-6f4b-45ad-9715-01d2d0ae2e63-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Mar 11 01:17:14 crc kubenswrapper[4744]: I0311 01:17:14.290371 4744 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d53314a9-6f4b-45ad-9715-01d2d0ae2e63-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 11 01:17:14 crc kubenswrapper[4744]: I0311 01:17:14.290473 4744 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xqf2q\" (UniqueName: \"kubernetes.io/projected/d53314a9-6f4b-45ad-9715-01d2d0ae2e63-kube-api-access-xqf2q\") on node \"crc\" DevicePath \"\"" Mar 11 01:17:14 crc kubenswrapper[4744]: E0311 01:17:14.290503 4744 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c95f1358c6f242ba93ec6e3c9bf5157f8ade313f8633ecb8b9af1f284d336a76\": container with ID starting with c95f1358c6f242ba93ec6e3c9bf5157f8ade313f8633ecb8b9af1f284d336a76 not found: ID does not exist" containerID="c95f1358c6f242ba93ec6e3c9bf5157f8ade313f8633ecb8b9af1f284d336a76" 
Mar 11 01:17:14 crc kubenswrapper[4744]: I0311 01:17:14.290626 4744 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c95f1358c6f242ba93ec6e3c9bf5157f8ade313f8633ecb8b9af1f284d336a76"} err="failed to get container status \"c95f1358c6f242ba93ec6e3c9bf5157f8ade313f8633ecb8b9af1f284d336a76\": rpc error: code = NotFound desc = could not find container \"c95f1358c6f242ba93ec6e3c9bf5157f8ade313f8633ecb8b9af1f284d336a76\": container with ID starting with c95f1358c6f242ba93ec6e3c9bf5157f8ade313f8633ecb8b9af1f284d336a76 not found: ID does not exist" Mar 11 01:17:14 crc kubenswrapper[4744]: I0311 01:17:14.290710 4744 scope.go:117] "RemoveContainer" containerID="1b99a11e0412ef61fcdcf759bfdf4436873046edae4487c1e4528846a4da27e7" Mar 11 01:17:14 crc kubenswrapper[4744]: E0311 01:17:14.291281 4744 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1b99a11e0412ef61fcdcf759bfdf4436873046edae4487c1e4528846a4da27e7\": container with ID starting with 1b99a11e0412ef61fcdcf759bfdf4436873046edae4487c1e4528846a4da27e7 not found: ID does not exist" containerID="1b99a11e0412ef61fcdcf759bfdf4436873046edae4487c1e4528846a4da27e7" Mar 11 01:17:14 crc kubenswrapper[4744]: I0311 01:17:14.291309 4744 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1b99a11e0412ef61fcdcf759bfdf4436873046edae4487c1e4528846a4da27e7"} err="failed to get container status \"1b99a11e0412ef61fcdcf759bfdf4436873046edae4487c1e4528846a4da27e7\": rpc error: code = NotFound desc = could not find container \"1b99a11e0412ef61fcdcf759bfdf4436873046edae4487c1e4528846a4da27e7\": container with ID starting with 1b99a11e0412ef61fcdcf759bfdf4436873046edae4487c1e4528846a4da27e7 not found: ID does not exist" Mar 11 01:17:14 crc kubenswrapper[4744]: I0311 01:17:14.297273 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/secret/d53314a9-6f4b-45ad-9715-01d2d0ae2e63-config-data" (OuterVolumeSpecName: "config-data") pod "d53314a9-6f4b-45ad-9715-01d2d0ae2e63" (UID: "d53314a9-6f4b-45ad-9715-01d2d0ae2e63"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 01:17:14 crc kubenswrapper[4744]: I0311 01:17:14.370926 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-7f78d57d44-gt8df" Mar 11 01:17:14 crc kubenswrapper[4744]: I0311 01:17:14.395329 4744 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d53314a9-6f4b-45ad-9715-01d2d0ae2e63-config-data\") on node \"crc\" DevicePath \"\"" Mar 11 01:17:14 crc kubenswrapper[4744]: I0311 01:17:14.451751 4744 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-5568fddbb8-2fn4w" Mar 11 01:17:14 crc kubenswrapper[4744]: I0311 01:17:14.494462 4744 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 11 01:17:14 crc kubenswrapper[4744]: I0311 01:17:14.520184 4744 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Mar 11 01:17:14 crc kubenswrapper[4744]: I0311 01:17:14.532370 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Mar 11 01:17:14 crc kubenswrapper[4744]: E0311 01:17:14.532728 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d53314a9-6f4b-45ad-9715-01d2d0ae2e63" containerName="proxy-httpd" Mar 11 01:17:14 crc kubenswrapper[4744]: I0311 01:17:14.532749 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="d53314a9-6f4b-45ad-9715-01d2d0ae2e63" containerName="proxy-httpd" Mar 11 01:17:14 crc kubenswrapper[4744]: E0311 01:17:14.532767 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d53314a9-6f4b-45ad-9715-01d2d0ae2e63" containerName="sg-core" Mar 11 01:17:14 crc kubenswrapper[4744]: I0311 
01:17:14.532773 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="d53314a9-6f4b-45ad-9715-01d2d0ae2e63" containerName="sg-core" Mar 11 01:17:14 crc kubenswrapper[4744]: E0311 01:17:14.532785 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7633b6fe-9d97-4a1e-b1fb-6cf2e02ad1c9" containerName="neutron-httpd" Mar 11 01:17:14 crc kubenswrapper[4744]: I0311 01:17:14.532791 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="7633b6fe-9d97-4a1e-b1fb-6cf2e02ad1c9" containerName="neutron-httpd" Mar 11 01:17:14 crc kubenswrapper[4744]: E0311 01:17:14.532802 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d53314a9-6f4b-45ad-9715-01d2d0ae2e63" containerName="ceilometer-central-agent" Mar 11 01:17:14 crc kubenswrapper[4744]: I0311 01:17:14.532809 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="d53314a9-6f4b-45ad-9715-01d2d0ae2e63" containerName="ceilometer-central-agent" Mar 11 01:17:14 crc kubenswrapper[4744]: E0311 01:17:14.532819 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7633b6fe-9d97-4a1e-b1fb-6cf2e02ad1c9" containerName="neutron-api" Mar 11 01:17:14 crc kubenswrapper[4744]: I0311 01:17:14.532825 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="7633b6fe-9d97-4a1e-b1fb-6cf2e02ad1c9" containerName="neutron-api" Mar 11 01:17:14 crc kubenswrapper[4744]: E0311 01:17:14.532840 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d53314a9-6f4b-45ad-9715-01d2d0ae2e63" containerName="ceilometer-notification-agent" Mar 11 01:17:14 crc kubenswrapper[4744]: I0311 01:17:14.532846 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="d53314a9-6f4b-45ad-9715-01d2d0ae2e63" containerName="ceilometer-notification-agent" Mar 11 01:17:14 crc kubenswrapper[4744]: I0311 01:17:14.533005 4744 memory_manager.go:354] "RemoveStaleState removing state" podUID="d53314a9-6f4b-45ad-9715-01d2d0ae2e63" containerName="ceilometer-central-agent" Mar 11 01:17:14 crc 
kubenswrapper[4744]: I0311 01:17:14.533016 4744 memory_manager.go:354] "RemoveStaleState removing state" podUID="d53314a9-6f4b-45ad-9715-01d2d0ae2e63" containerName="proxy-httpd" Mar 11 01:17:14 crc kubenswrapper[4744]: I0311 01:17:14.533024 4744 memory_manager.go:354] "RemoveStaleState removing state" podUID="7633b6fe-9d97-4a1e-b1fb-6cf2e02ad1c9" containerName="neutron-api" Mar 11 01:17:14 crc kubenswrapper[4744]: I0311 01:17:14.533039 4744 memory_manager.go:354] "RemoveStaleState removing state" podUID="7633b6fe-9d97-4a1e-b1fb-6cf2e02ad1c9" containerName="neutron-httpd" Mar 11 01:17:14 crc kubenswrapper[4744]: I0311 01:17:14.533047 4744 memory_manager.go:354] "RemoveStaleState removing state" podUID="d53314a9-6f4b-45ad-9715-01d2d0ae2e63" containerName="sg-core" Mar 11 01:17:14 crc kubenswrapper[4744]: I0311 01:17:14.533059 4744 memory_manager.go:354] "RemoveStaleState removing state" podUID="d53314a9-6f4b-45ad-9715-01d2d0ae2e63" containerName="ceilometer-notification-agent" Mar 11 01:17:14 crc kubenswrapper[4744]: I0311 01:17:14.534723 4744 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 11 01:17:14 crc kubenswrapper[4744]: I0311 01:17:14.547959 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Mar 11 01:17:14 crc kubenswrapper[4744]: I0311 01:17:14.547982 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Mar 11 01:17:14 crc kubenswrapper[4744]: I0311 01:17:14.572429 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 11 01:17:14 crc kubenswrapper[4744]: I0311 01:17:14.599385 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-czhps\" (UniqueName: \"kubernetes.io/projected/7633b6fe-9d97-4a1e-b1fb-6cf2e02ad1c9-kube-api-access-czhps\") pod \"7633b6fe-9d97-4a1e-b1fb-6cf2e02ad1c9\" (UID: \"7633b6fe-9d97-4a1e-b1fb-6cf2e02ad1c9\") " Mar 11 01:17:14 crc kubenswrapper[4744]: I0311 01:17:14.599478 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7633b6fe-9d97-4a1e-b1fb-6cf2e02ad1c9-combined-ca-bundle\") pod \"7633b6fe-9d97-4a1e-b1fb-6cf2e02ad1c9\" (UID: \"7633b6fe-9d97-4a1e-b1fb-6cf2e02ad1c9\") " Mar 11 01:17:14 crc kubenswrapper[4744]: I0311 01:17:14.599584 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/7633b6fe-9d97-4a1e-b1fb-6cf2e02ad1c9-httpd-config\") pod \"7633b6fe-9d97-4a1e-b1fb-6cf2e02ad1c9\" (UID: \"7633b6fe-9d97-4a1e-b1fb-6cf2e02ad1c9\") " Mar 11 01:17:14 crc kubenswrapper[4744]: I0311 01:17:14.599653 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/7633b6fe-9d97-4a1e-b1fb-6cf2e02ad1c9-ovndb-tls-certs\") pod \"7633b6fe-9d97-4a1e-b1fb-6cf2e02ad1c9\" (UID: \"7633b6fe-9d97-4a1e-b1fb-6cf2e02ad1c9\") " Mar 11 01:17:14 crc kubenswrapper[4744]: 
I0311 01:17:14.599701 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/7633b6fe-9d97-4a1e-b1fb-6cf2e02ad1c9-config\") pod \"7633b6fe-9d97-4a1e-b1fb-6cf2e02ad1c9\" (UID: \"7633b6fe-9d97-4a1e-b1fb-6cf2e02ad1c9\") " Mar 11 01:17:14 crc kubenswrapper[4744]: I0311 01:17:14.604084 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7633b6fe-9d97-4a1e-b1fb-6cf2e02ad1c9-kube-api-access-czhps" (OuterVolumeSpecName: "kube-api-access-czhps") pod "7633b6fe-9d97-4a1e-b1fb-6cf2e02ad1c9" (UID: "7633b6fe-9d97-4a1e-b1fb-6cf2e02ad1c9"). InnerVolumeSpecName "kube-api-access-czhps". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 01:17:14 crc kubenswrapper[4744]: I0311 01:17:14.633281 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7633b6fe-9d97-4a1e-b1fb-6cf2e02ad1c9-httpd-config" (OuterVolumeSpecName: "httpd-config") pod "7633b6fe-9d97-4a1e-b1fb-6cf2e02ad1c9" (UID: "7633b6fe-9d97-4a1e-b1fb-6cf2e02ad1c9"). InnerVolumeSpecName "httpd-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 01:17:14 crc kubenswrapper[4744]: I0311 01:17:14.673600 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7633b6fe-9d97-4a1e-b1fb-6cf2e02ad1c9-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "7633b6fe-9d97-4a1e-b1fb-6cf2e02ad1c9" (UID: "7633b6fe-9d97-4a1e-b1fb-6cf2e02ad1c9"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 01:17:14 crc kubenswrapper[4744]: I0311 01:17:14.688952 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-7f78d57d44-gt8df" Mar 11 01:17:14 crc kubenswrapper[4744]: I0311 01:17:14.694592 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7633b6fe-9d97-4a1e-b1fb-6cf2e02ad1c9-config" (OuterVolumeSpecName: "config") pod "7633b6fe-9d97-4a1e-b1fb-6cf2e02ad1c9" (UID: "7633b6fe-9d97-4a1e-b1fb-6cf2e02ad1c9"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 01:17:14 crc kubenswrapper[4744]: I0311 01:17:14.696364 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7633b6fe-9d97-4a1e-b1fb-6cf2e02ad1c9-ovndb-tls-certs" (OuterVolumeSpecName: "ovndb-tls-certs") pod "7633b6fe-9d97-4a1e-b1fb-6cf2e02ad1c9" (UID: "7633b6fe-9d97-4a1e-b1fb-6cf2e02ad1c9"). InnerVolumeSpecName "ovndb-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 01:17:14 crc kubenswrapper[4744]: I0311 01:17:14.701363 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bkkth\" (UniqueName: \"kubernetes.io/projected/a0e6d46c-3f23-4e10-be5c-483f55d51052-kube-api-access-bkkth\") pod \"ceilometer-0\" (UID: \"a0e6d46c-3f23-4e10-be5c-483f55d51052\") " pod="openstack/ceilometer-0" Mar 11 01:17:14 crc kubenswrapper[4744]: I0311 01:17:14.701405 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a0e6d46c-3f23-4e10-be5c-483f55d51052-run-httpd\") pod \"ceilometer-0\" (UID: \"a0e6d46c-3f23-4e10-be5c-483f55d51052\") " pod="openstack/ceilometer-0" Mar 11 01:17:14 crc kubenswrapper[4744]: I0311 01:17:14.701456 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a0e6d46c-3f23-4e10-be5c-483f55d51052-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"a0e6d46c-3f23-4e10-be5c-483f55d51052\") " pod="openstack/ceilometer-0" Mar 11 01:17:14 crc kubenswrapper[4744]: I0311 01:17:14.701502 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a0e6d46c-3f23-4e10-be5c-483f55d51052-config-data\") pod \"ceilometer-0\" (UID: \"a0e6d46c-3f23-4e10-be5c-483f55d51052\") " pod="openstack/ceilometer-0" Mar 11 01:17:14 crc kubenswrapper[4744]: I0311 01:17:14.701692 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/a0e6d46c-3f23-4e10-be5c-483f55d51052-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"a0e6d46c-3f23-4e10-be5c-483f55d51052\") " pod="openstack/ceilometer-0" Mar 11 01:17:14 crc kubenswrapper[4744]: I0311 01:17:14.701727 
4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a0e6d46c-3f23-4e10-be5c-483f55d51052-scripts\") pod \"ceilometer-0\" (UID: \"a0e6d46c-3f23-4e10-be5c-483f55d51052\") " pod="openstack/ceilometer-0" Mar 11 01:17:14 crc kubenswrapper[4744]: I0311 01:17:14.701747 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a0e6d46c-3f23-4e10-be5c-483f55d51052-log-httpd\") pod \"ceilometer-0\" (UID: \"a0e6d46c-3f23-4e10-be5c-483f55d51052\") " pod="openstack/ceilometer-0" Mar 11 01:17:14 crc kubenswrapper[4744]: I0311 01:17:14.701823 4744 reconciler_common.go:293] "Volume detached for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/7633b6fe-9d97-4a1e-b1fb-6cf2e02ad1c9-ovndb-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 11 01:17:14 crc kubenswrapper[4744]: I0311 01:17:14.701833 4744 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/7633b6fe-9d97-4a1e-b1fb-6cf2e02ad1c9-config\") on node \"crc\" DevicePath \"\"" Mar 11 01:17:14 crc kubenswrapper[4744]: I0311 01:17:14.701843 4744 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-czhps\" (UniqueName: \"kubernetes.io/projected/7633b6fe-9d97-4a1e-b1fb-6cf2e02ad1c9-kube-api-access-czhps\") on node \"crc\" DevicePath \"\"" Mar 11 01:17:14 crc kubenswrapper[4744]: I0311 01:17:14.701854 4744 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7633b6fe-9d97-4a1e-b1fb-6cf2e02ad1c9-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 11 01:17:14 crc kubenswrapper[4744]: I0311 01:17:14.701861 4744 reconciler_common.go:293] "Volume detached for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/7633b6fe-9d97-4a1e-b1fb-6cf2e02ad1c9-httpd-config\") on node \"crc\" DevicePath \"\"" Mar 
11 01:17:14 crc kubenswrapper[4744]: I0311 01:17:14.743956 4744 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-55fb7645db-dh4kb"] Mar 11 01:17:14 crc kubenswrapper[4744]: I0311 01:17:14.744185 4744 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/placement-55fb7645db-dh4kb" podUID="5d40e511-126b-428d-aad8-c7c6ca90ec9a" containerName="placement-log" containerID="cri-o://8e2224b4b9b7987e2d126a73f2bc506ebe6f656270eba4312826b4f951a2cc8a" gracePeriod=30 Mar 11 01:17:14 crc kubenswrapper[4744]: I0311 01:17:14.744601 4744 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/placement-55fb7645db-dh4kb" podUID="5d40e511-126b-428d-aad8-c7c6ca90ec9a" containerName="placement-api" containerID="cri-o://0a4949e41ac540c50bbdd22a80ed256ec6d9d7caf5e686eda3841b02bb821d2c" gracePeriod=30 Mar 11 01:17:14 crc kubenswrapper[4744]: I0311 01:17:14.803154 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a0e6d46c-3f23-4e10-be5c-483f55d51052-config-data\") pod \"ceilometer-0\" (UID: \"a0e6d46c-3f23-4e10-be5c-483f55d51052\") " pod="openstack/ceilometer-0" Mar 11 01:17:14 crc kubenswrapper[4744]: I0311 01:17:14.803414 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/a0e6d46c-3f23-4e10-be5c-483f55d51052-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"a0e6d46c-3f23-4e10-be5c-483f55d51052\") " pod="openstack/ceilometer-0" Mar 11 01:17:14 crc kubenswrapper[4744]: I0311 01:17:14.803579 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a0e6d46c-3f23-4e10-be5c-483f55d51052-scripts\") pod \"ceilometer-0\" (UID: \"a0e6d46c-3f23-4e10-be5c-483f55d51052\") " pod="openstack/ceilometer-0" Mar 11 01:17:14 crc kubenswrapper[4744]: I0311 01:17:14.804013 4744 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a0e6d46c-3f23-4e10-be5c-483f55d51052-log-httpd\") pod \"ceilometer-0\" (UID: \"a0e6d46c-3f23-4e10-be5c-483f55d51052\") " pod="openstack/ceilometer-0" Mar 11 01:17:14 crc kubenswrapper[4744]: I0311 01:17:14.804384 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bkkth\" (UniqueName: \"kubernetes.io/projected/a0e6d46c-3f23-4e10-be5c-483f55d51052-kube-api-access-bkkth\") pod \"ceilometer-0\" (UID: \"a0e6d46c-3f23-4e10-be5c-483f55d51052\") " pod="openstack/ceilometer-0" Mar 11 01:17:14 crc kubenswrapper[4744]: I0311 01:17:14.804489 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a0e6d46c-3f23-4e10-be5c-483f55d51052-run-httpd\") pod \"ceilometer-0\" (UID: \"a0e6d46c-3f23-4e10-be5c-483f55d51052\") " pod="openstack/ceilometer-0" Mar 11 01:17:14 crc kubenswrapper[4744]: I0311 01:17:14.804439 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a0e6d46c-3f23-4e10-be5c-483f55d51052-log-httpd\") pod \"ceilometer-0\" (UID: \"a0e6d46c-3f23-4e10-be5c-483f55d51052\") " pod="openstack/ceilometer-0" Mar 11 01:17:14 crc kubenswrapper[4744]: I0311 01:17:14.805844 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a0e6d46c-3f23-4e10-be5c-483f55d51052-run-httpd\") pod \"ceilometer-0\" (UID: \"a0e6d46c-3f23-4e10-be5c-483f55d51052\") " pod="openstack/ceilometer-0" Mar 11 01:17:14 crc kubenswrapper[4744]: I0311 01:17:14.805940 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a0e6d46c-3f23-4e10-be5c-483f55d51052-combined-ca-bundle\") pod \"ceilometer-0\" (UID: 
\"a0e6d46c-3f23-4e10-be5c-483f55d51052\") " pod="openstack/ceilometer-0" Mar 11 01:17:14 crc kubenswrapper[4744]: I0311 01:17:14.809638 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a0e6d46c-3f23-4e10-be5c-483f55d51052-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"a0e6d46c-3f23-4e10-be5c-483f55d51052\") " pod="openstack/ceilometer-0" Mar 11 01:17:14 crc kubenswrapper[4744]: I0311 01:17:14.809967 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a0e6d46c-3f23-4e10-be5c-483f55d51052-config-data\") pod \"ceilometer-0\" (UID: \"a0e6d46c-3f23-4e10-be5c-483f55d51052\") " pod="openstack/ceilometer-0" Mar 11 01:17:14 crc kubenswrapper[4744]: I0311 01:17:14.815427 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/a0e6d46c-3f23-4e10-be5c-483f55d51052-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"a0e6d46c-3f23-4e10-be5c-483f55d51052\") " pod="openstack/ceilometer-0" Mar 11 01:17:14 crc kubenswrapper[4744]: I0311 01:17:14.817458 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a0e6d46c-3f23-4e10-be5c-483f55d51052-scripts\") pod \"ceilometer-0\" (UID: \"a0e6d46c-3f23-4e10-be5c-483f55d51052\") " pod="openstack/ceilometer-0" Mar 11 01:17:14 crc kubenswrapper[4744]: I0311 01:17:14.822666 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bkkth\" (UniqueName: \"kubernetes.io/projected/a0e6d46c-3f23-4e10-be5c-483f55d51052-kube-api-access-bkkth\") pod \"ceilometer-0\" (UID: \"a0e6d46c-3f23-4e10-be5c-483f55d51052\") " pod="openstack/ceilometer-0" Mar 11 01:17:14 crc kubenswrapper[4744]: E0311 01:17:14.877166 4744 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: 
[\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5d40e511_126b_428d_aad8_c7c6ca90ec9a.slice/crio-8e2224b4b9b7987e2d126a73f2bc506ebe6f656270eba4312826b4f951a2cc8a.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5d40e511_126b_428d_aad8_c7c6ca90ec9a.slice/crio-conmon-8e2224b4b9b7987e2d126a73f2bc506ebe6f656270eba4312826b4f951a2cc8a.scope\": RecentStats: unable to find data in memory cache]" Mar 11 01:17:15 crc kubenswrapper[4744]: I0311 01:17:15.079432 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 11 01:17:15 crc kubenswrapper[4744]: I0311 01:17:15.171422 4744 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-5568fddbb8-2fn4w" Mar 11 01:17:15 crc kubenswrapper[4744]: I0311 01:17:15.171417 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-5568fddbb8-2fn4w" event={"ID":"7633b6fe-9d97-4a1e-b1fb-6cf2e02ad1c9","Type":"ContainerDied","Data":"f724975883cb9d897dce411c8f17e1c7380b82aa102e103367e27d2bea058e3a"} Mar 11 01:17:15 crc kubenswrapper[4744]: I0311 01:17:15.171914 4744 scope.go:117] "RemoveContainer" containerID="1a1f227b90e4b8581f9aee4646f1053a56e9395e2bb663593d46ed7803fc986d" Mar 11 01:17:15 crc kubenswrapper[4744]: I0311 01:17:15.183643 4744 generic.go:334] "Generic (PLEG): container finished" podID="5d40e511-126b-428d-aad8-c7c6ca90ec9a" containerID="8e2224b4b9b7987e2d126a73f2bc506ebe6f656270eba4312826b4f951a2cc8a" exitCode=143 Mar 11 01:17:15 crc kubenswrapper[4744]: I0311 01:17:15.183798 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-55fb7645db-dh4kb" event={"ID":"5d40e511-126b-428d-aad8-c7c6ca90ec9a","Type":"ContainerDied","Data":"8e2224b4b9b7987e2d126a73f2bc506ebe6f656270eba4312826b4f951a2cc8a"} Mar 11 01:17:15 crc kubenswrapper[4744]: I0311 01:17:15.188372 4744 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-knwz7" event={"ID":"c263c020-5938-4b77-b265-c297ae87f084","Type":"ContainerStarted","Data":"eb1bfc1563630d88f6802de7f7ba17757e99ff84ba35a4b3e446fdaab503e405"}
Mar 11 01:17:15 crc kubenswrapper[4744]: I0311 01:17:15.221903 4744 scope.go:117] "RemoveContainer" containerID="7b8d25281984fbd8762e523200998295e6a8e82c51a73cb0a34304444754f847"
Mar 11 01:17:15 crc kubenswrapper[4744]: I0311 01:17:15.225946 4744 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-5568fddbb8-2fn4w"]
Mar 11 01:17:15 crc kubenswrapper[4744]: I0311 01:17:15.250789 4744 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-5568fddbb8-2fn4w"]
Mar 11 01:17:15 crc kubenswrapper[4744]: I0311 01:17:15.540310 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"]
Mar 11 01:17:15 crc kubenswrapper[4744]: I0311 01:17:15.984378 4744 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7633b6fe-9d97-4a1e-b1fb-6cf2e02ad1c9" path="/var/lib/kubelet/pods/7633b6fe-9d97-4a1e-b1fb-6cf2e02ad1c9/volumes"
Mar 11 01:17:15 crc kubenswrapper[4744]: I0311 01:17:15.985041 4744 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d53314a9-6f4b-45ad-9715-01d2d0ae2e63" path="/var/lib/kubelet/pods/d53314a9-6f4b-45ad-9715-01d2d0ae2e63/volumes"
Mar 11 01:17:16 crc kubenswrapper[4744]: I0311 01:17:16.200827 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"a0e6d46c-3f23-4e10-be5c-483f55d51052","Type":"ContainerStarted","Data":"8e32d6eec611441f955f42d3bdeef06c03c1b89490522a9349b78ee112ebe25a"}
Mar 11 01:17:16 crc kubenswrapper[4744]: I0311 01:17:16.201185 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"a0e6d46c-3f23-4e10-be5c-483f55d51052","Type":"ContainerStarted","Data":"9ae143481043db77511c1035185e126af12850fc27f1d91ee5749ea1c137aa1c"}
Mar 11 01:17:17 crc kubenswrapper[4744]: I0311 01:17:17.213800 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"a0e6d46c-3f23-4e10-be5c-483f55d51052","Type":"ContainerStarted","Data":"25e9bff23e82866b0ed9e4f81b7991840eca422b9d0c79ef775be2aa1291f207"}
Mar 11 01:17:18 crc kubenswrapper[4744]: I0311 01:17:18.225996 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-55fb7645db-dh4kb" event={"ID":"5d40e511-126b-428d-aad8-c7c6ca90ec9a","Type":"ContainerDied","Data":"0a4949e41ac540c50bbdd22a80ed256ec6d9d7caf5e686eda3841b02bb821d2c"}
Mar 11 01:17:18 crc kubenswrapper[4744]: I0311 01:17:18.225896 4744 generic.go:334] "Generic (PLEG): container finished" podID="5d40e511-126b-428d-aad8-c7c6ca90ec9a" containerID="0a4949e41ac540c50bbdd22a80ed256ec6d9d7caf5e686eda3841b02bb821d2c" exitCode=0
Mar 11 01:17:22 crc kubenswrapper[4744]: I0311 01:17:22.264336 4744 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-55fb7645db-dh4kb"
Mar 11 01:17:22 crc kubenswrapper[4744]: I0311 01:17:22.300766 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-55fb7645db-dh4kb" event={"ID":"5d40e511-126b-428d-aad8-c7c6ca90ec9a","Type":"ContainerDied","Data":"52433837c70b6c098da56e50528d55dbb5db2a0c92a88f0a940dfe0f071bd418"}
Mar 11 01:17:22 crc kubenswrapper[4744]: I0311 01:17:22.300820 4744 scope.go:117] "RemoveContainer" containerID="0a4949e41ac540c50bbdd22a80ed256ec6d9d7caf5e686eda3841b02bb821d2c"
Mar 11 01:17:22 crc kubenswrapper[4744]: I0311 01:17:22.300999 4744 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-55fb7645db-dh4kb"
Mar 11 01:17:22 crc kubenswrapper[4744]: I0311 01:17:22.305987 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-knwz7" event={"ID":"c263c020-5938-4b77-b265-c297ae87f084","Type":"ContainerStarted","Data":"5609d95b861d80c9c856a5d5ae721e3df23258d3dac31b4a8d829352cafed8e3"}
Mar 11 01:17:22 crc kubenswrapper[4744]: I0311 01:17:22.319593 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"a0e6d46c-3f23-4e10-be5c-483f55d51052","Type":"ContainerStarted","Data":"a38d68ad33aedf913ed04a1387d20eae6d05acc0a7714ef9e0b9bce8a2d21749"}
Mar 11 01:17:22 crc kubenswrapper[4744]: I0311 01:17:22.339053 4744 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-conductor-db-sync-knwz7" podStartSLOduration=1.5851683890000001 podStartE2EDuration="9.339026819s" podCreationTimestamp="2026-03-11 01:17:13 +0000 UTC" firstStartedPulling="2026-03-11 01:17:14.224289503 +0000 UTC m=+1391.028507108" lastFinishedPulling="2026-03-11 01:17:21.978147933 +0000 UTC m=+1398.782365538" observedRunningTime="2026-03-11 01:17:22.332979973 +0000 UTC m=+1399.137197578" watchObservedRunningTime="2026-03-11 01:17:22.339026819 +0000 UTC m=+1399.143244424"
Mar 11 01:17:22 crc kubenswrapper[4744]: I0311 01:17:22.341826 4744 scope.go:117] "RemoveContainer" containerID="8e2224b4b9b7987e2d126a73f2bc506ebe6f656270eba4312826b4f951a2cc8a"
Mar 11 01:17:22 crc kubenswrapper[4744]: I0311 01:17:22.381135 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5d40e511-126b-428d-aad8-c7c6ca90ec9a-logs\") pod \"5d40e511-126b-428d-aad8-c7c6ca90ec9a\" (UID: \"5d40e511-126b-428d-aad8-c7c6ca90ec9a\") "
Mar 11 01:17:22 crc kubenswrapper[4744]: I0311 01:17:22.381207 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/5d40e511-126b-428d-aad8-c7c6ca90ec9a-internal-tls-certs\") pod \"5d40e511-126b-428d-aad8-c7c6ca90ec9a\" (UID: \"5d40e511-126b-428d-aad8-c7c6ca90ec9a\") "
Mar 11 01:17:22 crc kubenswrapper[4744]: I0311 01:17:22.381293 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cpsvr\" (UniqueName: \"kubernetes.io/projected/5d40e511-126b-428d-aad8-c7c6ca90ec9a-kube-api-access-cpsvr\") pod \"5d40e511-126b-428d-aad8-c7c6ca90ec9a\" (UID: \"5d40e511-126b-428d-aad8-c7c6ca90ec9a\") "
Mar 11 01:17:22 crc kubenswrapper[4744]: I0311 01:17:22.381363 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5d40e511-126b-428d-aad8-c7c6ca90ec9a-config-data\") pod \"5d40e511-126b-428d-aad8-c7c6ca90ec9a\" (UID: \"5d40e511-126b-428d-aad8-c7c6ca90ec9a\") "
Mar 11 01:17:22 crc kubenswrapper[4744]: I0311 01:17:22.381411 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/5d40e511-126b-428d-aad8-c7c6ca90ec9a-public-tls-certs\") pod \"5d40e511-126b-428d-aad8-c7c6ca90ec9a\" (UID: \"5d40e511-126b-428d-aad8-c7c6ca90ec9a\") "
Mar 11 01:17:22 crc kubenswrapper[4744]: I0311 01:17:22.381451 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5d40e511-126b-428d-aad8-c7c6ca90ec9a-combined-ca-bundle\") pod \"5d40e511-126b-428d-aad8-c7c6ca90ec9a\" (UID: \"5d40e511-126b-428d-aad8-c7c6ca90ec9a\") "
Mar 11 01:17:22 crc kubenswrapper[4744]: I0311 01:17:22.381477 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5d40e511-126b-428d-aad8-c7c6ca90ec9a-scripts\") pod \"5d40e511-126b-428d-aad8-c7c6ca90ec9a\" (UID: \"5d40e511-126b-428d-aad8-c7c6ca90ec9a\") "
Mar 11 01:17:22 crc kubenswrapper[4744]: I0311 01:17:22.382315 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5d40e511-126b-428d-aad8-c7c6ca90ec9a-logs" (OuterVolumeSpecName: "logs") pod "5d40e511-126b-428d-aad8-c7c6ca90ec9a" (UID: "5d40e511-126b-428d-aad8-c7c6ca90ec9a"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 11 01:17:22 crc kubenswrapper[4744]: I0311 01:17:22.386696 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5d40e511-126b-428d-aad8-c7c6ca90ec9a-scripts" (OuterVolumeSpecName: "scripts") pod "5d40e511-126b-428d-aad8-c7c6ca90ec9a" (UID: "5d40e511-126b-428d-aad8-c7c6ca90ec9a"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 11 01:17:22 crc kubenswrapper[4744]: I0311 01:17:22.387327 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5d40e511-126b-428d-aad8-c7c6ca90ec9a-kube-api-access-cpsvr" (OuterVolumeSpecName: "kube-api-access-cpsvr") pod "5d40e511-126b-428d-aad8-c7c6ca90ec9a" (UID: "5d40e511-126b-428d-aad8-c7c6ca90ec9a"). InnerVolumeSpecName "kube-api-access-cpsvr". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 11 01:17:22 crc kubenswrapper[4744]: I0311 01:17:22.430046 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5d40e511-126b-428d-aad8-c7c6ca90ec9a-config-data" (OuterVolumeSpecName: "config-data") pod "5d40e511-126b-428d-aad8-c7c6ca90ec9a" (UID: "5d40e511-126b-428d-aad8-c7c6ca90ec9a"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 11 01:17:22 crc kubenswrapper[4744]: I0311 01:17:22.432930 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5d40e511-126b-428d-aad8-c7c6ca90ec9a-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "5d40e511-126b-428d-aad8-c7c6ca90ec9a" (UID: "5d40e511-126b-428d-aad8-c7c6ca90ec9a"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 11 01:17:22 crc kubenswrapper[4744]: I0311 01:17:22.480277 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5d40e511-126b-428d-aad8-c7c6ca90ec9a-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "5d40e511-126b-428d-aad8-c7c6ca90ec9a" (UID: "5d40e511-126b-428d-aad8-c7c6ca90ec9a"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 11 01:17:22 crc kubenswrapper[4744]: I0311 01:17:22.483784 4744 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5d40e511-126b-428d-aad8-c7c6ca90ec9a-config-data\") on node \"crc\" DevicePath \"\""
Mar 11 01:17:22 crc kubenswrapper[4744]: I0311 01:17:22.483815 4744 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5d40e511-126b-428d-aad8-c7c6ca90ec9a-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 11 01:17:22 crc kubenswrapper[4744]: I0311 01:17:22.483827 4744 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5d40e511-126b-428d-aad8-c7c6ca90ec9a-scripts\") on node \"crc\" DevicePath \"\""
Mar 11 01:17:22 crc kubenswrapper[4744]: I0311 01:17:22.483838 4744 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5d40e511-126b-428d-aad8-c7c6ca90ec9a-logs\") on node \"crc\" DevicePath \"\""
Mar 11 01:17:22 crc kubenswrapper[4744]: I0311 01:17:22.483848 4744 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/5d40e511-126b-428d-aad8-c7c6ca90ec9a-internal-tls-certs\") on node \"crc\" DevicePath \"\""
Mar 11 01:17:22 crc kubenswrapper[4744]: I0311 01:17:22.483859 4744 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cpsvr\" (UniqueName: \"kubernetes.io/projected/5d40e511-126b-428d-aad8-c7c6ca90ec9a-kube-api-access-cpsvr\") on node \"crc\" DevicePath \"\""
Mar 11 01:17:22 crc kubenswrapper[4744]: I0311 01:17:22.504839 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5d40e511-126b-428d-aad8-c7c6ca90ec9a-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "5d40e511-126b-428d-aad8-c7c6ca90ec9a" (UID: "5d40e511-126b-428d-aad8-c7c6ca90ec9a"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 11 01:17:22 crc kubenswrapper[4744]: I0311 01:17:22.585690 4744 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/5d40e511-126b-428d-aad8-c7c6ca90ec9a-public-tls-certs\") on node \"crc\" DevicePath \"\""
Mar 11 01:17:22 crc kubenswrapper[4744]: I0311 01:17:22.642792 4744 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-55fb7645db-dh4kb"]
Mar 11 01:17:22 crc kubenswrapper[4744]: I0311 01:17:22.650695 4744 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-55fb7645db-dh4kb"]
Mar 11 01:17:23 crc kubenswrapper[4744]: I0311 01:17:23.556115 4744 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"]
Mar 11 01:17:23 crc kubenswrapper[4744]: I0311 01:17:23.556566 4744 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="3a87446a-b7cc-4068-91e2-b5dbbc3cda71" containerName="glance-log" containerID="cri-o://d81d67c0c0b038d838a1c58d63d77cb43337c0b25c6edd1ee93578a354f525fb" gracePeriod=30
Mar 11 01:17:23 crc kubenswrapper[4744]: I0311 01:17:23.556695 4744 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="3a87446a-b7cc-4068-91e2-b5dbbc3cda71" containerName="glance-httpd" containerID="cri-o://c34020c1c08a3e59f5915d2c9c5efef80c6c178d17557caed339439d1ac2e9f9" gracePeriod=30
Mar 11 01:17:23 crc kubenswrapper[4744]: I0311 01:17:23.985677 4744 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5d40e511-126b-428d-aad8-c7c6ca90ec9a" path="/var/lib/kubelet/pods/5d40e511-126b-428d-aad8-c7c6ca90ec9a/volumes"
Mar 11 01:17:24 crc kubenswrapper[4744]: I0311 01:17:24.374749 4744 generic.go:334] "Generic (PLEG): container finished" podID="3a87446a-b7cc-4068-91e2-b5dbbc3cda71" containerID="d81d67c0c0b038d838a1c58d63d77cb43337c0b25c6edd1ee93578a354f525fb" exitCode=143
Mar 11 01:17:24 crc kubenswrapper[4744]: I0311 01:17:24.374815 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"3a87446a-b7cc-4068-91e2-b5dbbc3cda71","Type":"ContainerDied","Data":"d81d67c0c0b038d838a1c58d63d77cb43337c0b25c6edd1ee93578a354f525fb"}
Mar 11 01:17:24 crc kubenswrapper[4744]: I0311 01:17:24.386958 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"a0e6d46c-3f23-4e10-be5c-483f55d51052","Type":"ContainerStarted","Data":"a23fbb5f49dfca97f34d67d6a0c877c3e7947d50716d539b0e60f3c0afa7bb91"}
Mar 11 01:17:24 crc kubenswrapper[4744]: I0311 01:17:24.387598 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0"
Mar 11 01:17:24 crc kubenswrapper[4744]: I0311 01:17:24.413617 4744 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.33244964 podStartE2EDuration="10.413602707s" podCreationTimestamp="2026-03-11 01:17:14 +0000 UTC" firstStartedPulling="2026-03-11 01:17:15.548946697 +0000 UTC m=+1392.353164302" lastFinishedPulling="2026-03-11 01:17:23.630099764 +0000 UTC m=+1400.434317369" observedRunningTime="2026-03-11 01:17:24.410382609 +0000 UTC m=+1401.214600214" watchObservedRunningTime="2026-03-11 01:17:24.413602707 +0000 UTC m=+1401.217820312"
Mar 11 01:17:25 crc kubenswrapper[4744]: I0311 01:17:25.052627 4744 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"]
Mar 11 01:17:25 crc kubenswrapper[4744]: I0311 01:17:25.053055 4744 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="8b84adb2-ba84-4796-b8c6-3bf51e850b3f" containerName="glance-log" containerID="cri-o://8fbde8a2a9b6059d4fbff5c56a5a963290f4efef4a41a866b702d93ee81bb0ff" gracePeriod=30
Mar 11 01:17:25 crc kubenswrapper[4744]: I0311 01:17:25.053157 4744 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="8b84adb2-ba84-4796-b8c6-3bf51e850b3f" containerName="glance-httpd" containerID="cri-o://60cba145747ef116205a3e302ad50bb5d4b00de1456716486451fbb9e79a9eb5" gracePeriod=30
Mar 11 01:17:25 crc kubenswrapper[4744]: I0311 01:17:25.396985 4744 generic.go:334] "Generic (PLEG): container finished" podID="8b84adb2-ba84-4796-b8c6-3bf51e850b3f" containerID="8fbde8a2a9b6059d4fbff5c56a5a963290f4efef4a41a866b702d93ee81bb0ff" exitCode=143
Mar 11 01:17:25 crc kubenswrapper[4744]: I0311 01:17:25.397084 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"8b84adb2-ba84-4796-b8c6-3bf51e850b3f","Type":"ContainerDied","Data":"8fbde8a2a9b6059d4fbff5c56a5a963290f4efef4a41a866b702d93ee81bb0ff"}
Mar 11 01:17:25 crc kubenswrapper[4744]: I0311 01:17:25.800012 4744 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"]
Mar 11 01:17:26 crc kubenswrapper[4744]: I0311 01:17:26.404293 4744 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="a0e6d46c-3f23-4e10-be5c-483f55d51052" containerName="ceilometer-central-agent" containerID="cri-o://8e32d6eec611441f955f42d3bdeef06c03c1b89490522a9349b78ee112ebe25a" gracePeriod=30
Mar 11 01:17:26 crc kubenswrapper[4744]: I0311 01:17:26.404345 4744 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="a0e6d46c-3f23-4e10-be5c-483f55d51052" containerName="sg-core" containerID="cri-o://a38d68ad33aedf913ed04a1387d20eae6d05acc0a7714ef9e0b9bce8a2d21749" gracePeriod=30
Mar 11 01:17:26 crc kubenswrapper[4744]: I0311 01:17:26.404345 4744 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="a0e6d46c-3f23-4e10-be5c-483f55d51052" containerName="proxy-httpd" containerID="cri-o://a23fbb5f49dfca97f34d67d6a0c877c3e7947d50716d539b0e60f3c0afa7bb91" gracePeriod=30
Mar 11 01:17:26 crc kubenswrapper[4744]: I0311 01:17:26.404383 4744 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="a0e6d46c-3f23-4e10-be5c-483f55d51052" containerName="ceilometer-notification-agent" containerID="cri-o://25e9bff23e82866b0ed9e4f81b7991840eca422b9d0c79ef775be2aa1291f207" gracePeriod=30
Mar 11 01:17:27 crc kubenswrapper[4744]: I0311 01:17:27.195816 4744 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Mar 11 01:17:27 crc kubenswrapper[4744]: I0311 01:17:27.369210 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a0e6d46c-3f23-4e10-be5c-483f55d51052-run-httpd\") pod \"a0e6d46c-3f23-4e10-be5c-483f55d51052\" (UID: \"a0e6d46c-3f23-4e10-be5c-483f55d51052\") "
Mar 11 01:17:27 crc kubenswrapper[4744]: I0311 01:17:27.369303 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/a0e6d46c-3f23-4e10-be5c-483f55d51052-sg-core-conf-yaml\") pod \"a0e6d46c-3f23-4e10-be5c-483f55d51052\" (UID: \"a0e6d46c-3f23-4e10-be5c-483f55d51052\") "
Mar 11 01:17:27 crc kubenswrapper[4744]: I0311 01:17:27.369369 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bkkth\" (UniqueName: \"kubernetes.io/projected/a0e6d46c-3f23-4e10-be5c-483f55d51052-kube-api-access-bkkth\") pod \"a0e6d46c-3f23-4e10-be5c-483f55d51052\" (UID: \"a0e6d46c-3f23-4e10-be5c-483f55d51052\") "
Mar 11 01:17:27 crc kubenswrapper[4744]: I0311 01:17:27.369395 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a0e6d46c-3f23-4e10-be5c-483f55d51052-log-httpd\") pod \"a0e6d46c-3f23-4e10-be5c-483f55d51052\" (UID: \"a0e6d46c-3f23-4e10-be5c-483f55d51052\") "
Mar 11 01:17:27 crc kubenswrapper[4744]: I0311 01:17:27.369454 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a0e6d46c-3f23-4e10-be5c-483f55d51052-config-data\") pod \"a0e6d46c-3f23-4e10-be5c-483f55d51052\" (UID: \"a0e6d46c-3f23-4e10-be5c-483f55d51052\") "
Mar 11 01:17:27 crc kubenswrapper[4744]: I0311 01:17:27.369487 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a0e6d46c-3f23-4e10-be5c-483f55d51052-combined-ca-bundle\") pod \"a0e6d46c-3f23-4e10-be5c-483f55d51052\" (UID: \"a0e6d46c-3f23-4e10-be5c-483f55d51052\") "
Mar 11 01:17:27 crc kubenswrapper[4744]: I0311 01:17:27.369538 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a0e6d46c-3f23-4e10-be5c-483f55d51052-scripts\") pod \"a0e6d46c-3f23-4e10-be5c-483f55d51052\" (UID: \"a0e6d46c-3f23-4e10-be5c-483f55d51052\") "
Mar 11 01:17:27 crc kubenswrapper[4744]: I0311 01:17:27.376104 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a0e6d46c-3f23-4e10-be5c-483f55d51052-scripts" (OuterVolumeSpecName: "scripts") pod "a0e6d46c-3f23-4e10-be5c-483f55d51052" (UID: "a0e6d46c-3f23-4e10-be5c-483f55d51052"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 11 01:17:27 crc kubenswrapper[4744]: I0311 01:17:27.376440 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a0e6d46c-3f23-4e10-be5c-483f55d51052-kube-api-access-bkkth" (OuterVolumeSpecName: "kube-api-access-bkkth") pod "a0e6d46c-3f23-4e10-be5c-483f55d51052" (UID: "a0e6d46c-3f23-4e10-be5c-483f55d51052"). InnerVolumeSpecName "kube-api-access-bkkth". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 11 01:17:27 crc kubenswrapper[4744]: I0311 01:17:27.376580 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a0e6d46c-3f23-4e10-be5c-483f55d51052-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "a0e6d46c-3f23-4e10-be5c-483f55d51052" (UID: "a0e6d46c-3f23-4e10-be5c-483f55d51052"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 11 01:17:27 crc kubenswrapper[4744]: I0311 01:17:27.376865 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a0e6d46c-3f23-4e10-be5c-483f55d51052-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "a0e6d46c-3f23-4e10-be5c-483f55d51052" (UID: "a0e6d46c-3f23-4e10-be5c-483f55d51052"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 11 01:17:27 crc kubenswrapper[4744]: I0311 01:17:27.406392 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a0e6d46c-3f23-4e10-be5c-483f55d51052-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "a0e6d46c-3f23-4e10-be5c-483f55d51052" (UID: "a0e6d46c-3f23-4e10-be5c-483f55d51052"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 11 01:17:27 crc kubenswrapper[4744]: I0311 01:17:27.418444 4744 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0"
Mar 11 01:17:27 crc kubenswrapper[4744]: I0311 01:17:27.418490 4744 generic.go:334] "Generic (PLEG): container finished" podID="a0e6d46c-3f23-4e10-be5c-483f55d51052" containerID="a23fbb5f49dfca97f34d67d6a0c877c3e7947d50716d539b0e60f3c0afa7bb91" exitCode=0
Mar 11 01:17:27 crc kubenswrapper[4744]: I0311 01:17:27.418537 4744 generic.go:334] "Generic (PLEG): container finished" podID="a0e6d46c-3f23-4e10-be5c-483f55d51052" containerID="a38d68ad33aedf913ed04a1387d20eae6d05acc0a7714ef9e0b9bce8a2d21749" exitCode=2
Mar 11 01:17:27 crc kubenswrapper[4744]: I0311 01:17:27.418547 4744 generic.go:334] "Generic (PLEG): container finished" podID="a0e6d46c-3f23-4e10-be5c-483f55d51052" containerID="25e9bff23e82866b0ed9e4f81b7991840eca422b9d0c79ef775be2aa1291f207" exitCode=0
Mar 11 01:17:27 crc kubenswrapper[4744]: I0311 01:17:27.418553 4744 generic.go:334] "Generic (PLEG): container finished" podID="a0e6d46c-3f23-4e10-be5c-483f55d51052" containerID="8e32d6eec611441f955f42d3bdeef06c03c1b89490522a9349b78ee112ebe25a" exitCode=0
Mar 11 01:17:27 crc kubenswrapper[4744]: I0311 01:17:27.418579 4744 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Mar 11 01:17:27 crc kubenswrapper[4744]: I0311 01:17:27.418599 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"a0e6d46c-3f23-4e10-be5c-483f55d51052","Type":"ContainerDied","Data":"a23fbb5f49dfca97f34d67d6a0c877c3e7947d50716d539b0e60f3c0afa7bb91"}
Mar 11 01:17:27 crc kubenswrapper[4744]: I0311 01:17:27.418661 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"a0e6d46c-3f23-4e10-be5c-483f55d51052","Type":"ContainerDied","Data":"a38d68ad33aedf913ed04a1387d20eae6d05acc0a7714ef9e0b9bce8a2d21749"}
Mar 11 01:17:27 crc kubenswrapper[4744]: I0311 01:17:27.418689 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"a0e6d46c-3f23-4e10-be5c-483f55d51052","Type":"ContainerDied","Data":"25e9bff23e82866b0ed9e4f81b7991840eca422b9d0c79ef775be2aa1291f207"}
Mar 11 01:17:27 crc kubenswrapper[4744]: I0311 01:17:27.418706 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"a0e6d46c-3f23-4e10-be5c-483f55d51052","Type":"ContainerDied","Data":"8e32d6eec611441f955f42d3bdeef06c03c1b89490522a9349b78ee112ebe25a"}
Mar 11 01:17:27 crc kubenswrapper[4744]: I0311 01:17:27.418718 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"a0e6d46c-3f23-4e10-be5c-483f55d51052","Type":"ContainerDied","Data":"9ae143481043db77511c1035185e126af12850fc27f1d91ee5749ea1c137aa1c"}
Mar 11 01:17:27 crc kubenswrapper[4744]: I0311 01:17:27.418715 4744 scope.go:117] "RemoveContainer" containerID="a23fbb5f49dfca97f34d67d6a0c877c3e7947d50716d539b0e60f3c0afa7bb91"
Mar 11 01:17:27 crc kubenswrapper[4744]: I0311 01:17:27.422325 4744 generic.go:334] "Generic (PLEG): container finished" podID="3a87446a-b7cc-4068-91e2-b5dbbc3cda71" containerID="c34020c1c08a3e59f5915d2c9c5efef80c6c178d17557caed339439d1ac2e9f9" exitCode=0
Mar 11 01:17:27 crc kubenswrapper[4744]: I0311 01:17:27.422376 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"3a87446a-b7cc-4068-91e2-b5dbbc3cda71","Type":"ContainerDied","Data":"c34020c1c08a3e59f5915d2c9c5efef80c6c178d17557caed339439d1ac2e9f9"}
Mar 11 01:17:27 crc kubenswrapper[4744]: I0311 01:17:27.460703 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a0e6d46c-3f23-4e10-be5c-483f55d51052-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "a0e6d46c-3f23-4e10-be5c-483f55d51052" (UID: "a0e6d46c-3f23-4e10-be5c-483f55d51052"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 11 01:17:27 crc kubenswrapper[4744]: I0311 01:17:27.471928 4744 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a0e6d46c-3f23-4e10-be5c-483f55d51052-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 11 01:17:27 crc kubenswrapper[4744]: I0311 01:17:27.471958 4744 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a0e6d46c-3f23-4e10-be5c-483f55d51052-scripts\") on node \"crc\" DevicePath \"\""
Mar 11 01:17:27 crc kubenswrapper[4744]: I0311 01:17:27.471970 4744 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a0e6d46c-3f23-4e10-be5c-483f55d51052-run-httpd\") on node \"crc\" DevicePath \"\""
Mar 11 01:17:27 crc kubenswrapper[4744]: I0311 01:17:27.471977 4744 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/a0e6d46c-3f23-4e10-be5c-483f55d51052-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\""
Mar 11 01:17:27 crc kubenswrapper[4744]: I0311 01:17:27.471987 4744 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bkkth\" (UniqueName: \"kubernetes.io/projected/a0e6d46c-3f23-4e10-be5c-483f55d51052-kube-api-access-bkkth\") on node \"crc\" DevicePath \"\""
Mar 11 01:17:27 crc kubenswrapper[4744]: I0311 01:17:27.471997 4744 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a0e6d46c-3f23-4e10-be5c-483f55d51052-log-httpd\") on node \"crc\" DevicePath \"\""
Mar 11 01:17:27 crc kubenswrapper[4744]: I0311 01:17:27.482848 4744 scope.go:117] "RemoveContainer" containerID="a38d68ad33aedf913ed04a1387d20eae6d05acc0a7714ef9e0b9bce8a2d21749"
Mar 11 01:17:27 crc kubenswrapper[4744]: I0311 01:17:27.505800 4744 scope.go:117] "RemoveContainer" containerID="25e9bff23e82866b0ed9e4f81b7991840eca422b9d0c79ef775be2aa1291f207"
Mar 11 01:17:27 crc kubenswrapper[4744]: I0311 01:17:27.516070 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a0e6d46c-3f23-4e10-be5c-483f55d51052-config-data" (OuterVolumeSpecName: "config-data") pod "a0e6d46c-3f23-4e10-be5c-483f55d51052" (UID: "a0e6d46c-3f23-4e10-be5c-483f55d51052"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 11 01:17:27 crc kubenswrapper[4744]: I0311 01:17:27.574040 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3a87446a-b7cc-4068-91e2-b5dbbc3cda71-combined-ca-bundle\") pod \"3a87446a-b7cc-4068-91e2-b5dbbc3cda71\" (UID: \"3a87446a-b7cc-4068-91e2-b5dbbc3cda71\") "
Mar 11 01:17:27 crc kubenswrapper[4744]: I0311 01:17:27.574222 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3a87446a-b7cc-4068-91e2-b5dbbc3cda71-logs\") pod \"3a87446a-b7cc-4068-91e2-b5dbbc3cda71\" (UID: \"3a87446a-b7cc-4068-91e2-b5dbbc3cda71\") "
Mar 11 01:17:27 crc kubenswrapper[4744]: I0311 01:17:27.574360 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bbs9b\" (UniqueName: \"kubernetes.io/projected/3a87446a-b7cc-4068-91e2-b5dbbc3cda71-kube-api-access-bbs9b\") pod \"3a87446a-b7cc-4068-91e2-b5dbbc3cda71\" (UID: \"3a87446a-b7cc-4068-91e2-b5dbbc3cda71\") "
Mar 11 01:17:27 crc kubenswrapper[4744]: I0311 01:17:27.574455 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3a87446a-b7cc-4068-91e2-b5dbbc3cda71-scripts\") pod \"3a87446a-b7cc-4068-91e2-b5dbbc3cda71\" (UID: \"3a87446a-b7cc-4068-91e2-b5dbbc3cda71\") "
Mar 11 01:17:27 crc kubenswrapper[4744]: I0311 01:17:27.574564 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3a87446a-b7cc-4068-91e2-b5dbbc3cda71-config-data\") pod \"3a87446a-b7cc-4068-91e2-b5dbbc3cda71\" (UID: \"3a87446a-b7cc-4068-91e2-b5dbbc3cda71\") "
Mar 11 01:17:27 crc kubenswrapper[4744]: I0311 01:17:27.574657 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"3a87446a-b7cc-4068-91e2-b5dbbc3cda71\" (UID: \"3a87446a-b7cc-4068-91e2-b5dbbc3cda71\") "
Mar 11 01:17:27 crc kubenswrapper[4744]: I0311 01:17:27.574717 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3a87446a-b7cc-4068-91e2-b5dbbc3cda71-logs" (OuterVolumeSpecName: "logs") pod "3a87446a-b7cc-4068-91e2-b5dbbc3cda71" (UID: "3a87446a-b7cc-4068-91e2-b5dbbc3cda71"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 11 01:17:27 crc kubenswrapper[4744]: I0311 01:17:27.574745 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/3a87446a-b7cc-4068-91e2-b5dbbc3cda71-public-tls-certs\") pod \"3a87446a-b7cc-4068-91e2-b5dbbc3cda71\" (UID: \"3a87446a-b7cc-4068-91e2-b5dbbc3cda71\") "
Mar 11 01:17:27 crc kubenswrapper[4744]: I0311 01:17:27.574849 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/3a87446a-b7cc-4068-91e2-b5dbbc3cda71-httpd-run\") pod \"3a87446a-b7cc-4068-91e2-b5dbbc3cda71\" (UID: \"3a87446a-b7cc-4068-91e2-b5dbbc3cda71\") "
Mar 11 01:17:27 crc kubenswrapper[4744]: I0311 01:17:27.576108 4744 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3a87446a-b7cc-4068-91e2-b5dbbc3cda71-logs\") on node \"crc\" DevicePath \"\""
Mar 11 01:17:27 crc kubenswrapper[4744]: I0311 01:17:27.576129 4744 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a0e6d46c-3f23-4e10-be5c-483f55d51052-config-data\") on node \"crc\" DevicePath \"\""
Mar 11 01:17:27 crc kubenswrapper[4744]: I0311 01:17:27.576382 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3a87446a-b7cc-4068-91e2-b5dbbc3cda71-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "3a87446a-b7cc-4068-91e2-b5dbbc3cda71" (UID: "3a87446a-b7cc-4068-91e2-b5dbbc3cda71"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 11 01:17:27 crc kubenswrapper[4744]: I0311 01:17:27.578563 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3a87446a-b7cc-4068-91e2-b5dbbc3cda71-kube-api-access-bbs9b" (OuterVolumeSpecName: "kube-api-access-bbs9b") pod "3a87446a-b7cc-4068-91e2-b5dbbc3cda71" (UID: "3a87446a-b7cc-4068-91e2-b5dbbc3cda71"). InnerVolumeSpecName "kube-api-access-bbs9b". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 11 01:17:27 crc kubenswrapper[4744]: I0311 01:17:27.579831 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage04-crc" (OuterVolumeSpecName: "glance") pod "3a87446a-b7cc-4068-91e2-b5dbbc3cda71" (UID: "3a87446a-b7cc-4068-91e2-b5dbbc3cda71"). InnerVolumeSpecName "local-storage04-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue ""
Mar 11 01:17:27 crc kubenswrapper[4744]: I0311 01:17:27.583869 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3a87446a-b7cc-4068-91e2-b5dbbc3cda71-scripts" (OuterVolumeSpecName: "scripts") pod "3a87446a-b7cc-4068-91e2-b5dbbc3cda71" (UID: "3a87446a-b7cc-4068-91e2-b5dbbc3cda71"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 11 01:17:27 crc kubenswrapper[4744]: I0311 01:17:27.607666 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3a87446a-b7cc-4068-91e2-b5dbbc3cda71-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "3a87446a-b7cc-4068-91e2-b5dbbc3cda71" (UID: "3a87446a-b7cc-4068-91e2-b5dbbc3cda71"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 11 01:17:27 crc kubenswrapper[4744]: I0311 01:17:27.623388 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3a87446a-b7cc-4068-91e2-b5dbbc3cda71-config-data" (OuterVolumeSpecName: "config-data") pod "3a87446a-b7cc-4068-91e2-b5dbbc3cda71" (UID: "3a87446a-b7cc-4068-91e2-b5dbbc3cda71"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 11 01:17:27 crc kubenswrapper[4744]: I0311 01:17:27.635022 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3a87446a-b7cc-4068-91e2-b5dbbc3cda71-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "3a87446a-b7cc-4068-91e2-b5dbbc3cda71" (UID: "3a87446a-b7cc-4068-91e2-b5dbbc3cda71"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 11 01:17:27 crc kubenswrapper[4744]: I0311 01:17:27.677229 4744 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bbs9b\" (UniqueName: \"kubernetes.io/projected/3a87446a-b7cc-4068-91e2-b5dbbc3cda71-kube-api-access-bbs9b\") on node \"crc\" DevicePath \"\""
Mar 11 01:17:27 crc kubenswrapper[4744]: I0311 01:17:27.677275 4744 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3a87446a-b7cc-4068-91e2-b5dbbc3cda71-scripts\") on node \"crc\" DevicePath \"\""
Mar 11 01:17:27 crc kubenswrapper[4744]: I0311 01:17:27.677287 4744 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3a87446a-b7cc-4068-91e2-b5dbbc3cda71-config-data\") on node \"crc\" DevicePath \"\""
Mar 11 01:17:27 crc kubenswrapper[4744]: I0311 01:17:27.677324 4744 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") on node \"crc\" "
Mar 11 01:17:27 crc kubenswrapper[4744]: I0311 01:17:27.677333 4744 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/3a87446a-b7cc-4068-91e2-b5dbbc3cda71-public-tls-certs\") on node \"crc\" DevicePath \"\""
Mar 11 01:17:27 crc kubenswrapper[4744]: I0311 01:17:27.677358 4744 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/3a87446a-b7cc-4068-91e2-b5dbbc3cda71-httpd-run\") on node \"crc\" DevicePath \"\""
Mar 11 01:17:27 crc kubenswrapper[4744]: I0311 01:17:27.682987 4744 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3a87446a-b7cc-4068-91e2-b5dbbc3cda71-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 11 01:17:27 crc kubenswrapper[4744]: I0311 01:17:27.701691 4744 scope.go:117] "RemoveContainer" containerID="8e32d6eec611441f955f42d3bdeef06c03c1b89490522a9349b78ee112ebe25a"
Mar 11 01:17:27 crc kubenswrapper[4744]: I0311 01:17:27.709684 4744 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage04-crc" (UniqueName: "kubernetes.io/local-volume/local-storage04-crc") on node "crc"
Mar 11 01:17:27 crc kubenswrapper[4744]: I0311 01:17:27.722723 4744 scope.go:117] "RemoveContainer" containerID="a23fbb5f49dfca97f34d67d6a0c877c3e7947d50716d539b0e60f3c0afa7bb91"
Mar 11 01:17:27 crc kubenswrapper[4744]: E0311 01:17:27.723059 4744 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a23fbb5f49dfca97f34d67d6a0c877c3e7947d50716d539b0e60f3c0afa7bb91\": container with ID starting with a23fbb5f49dfca97f34d67d6a0c877c3e7947d50716d539b0e60f3c0afa7bb91 not found: ID does not exist" containerID="a23fbb5f49dfca97f34d67d6a0c877c3e7947d50716d539b0e60f3c0afa7bb91"
Mar 11 01:17:27 crc kubenswrapper[4744]: I0311 01:17:27.723091 4744 pod_container_deletor.go:53] "DeleteContainer returned error"
containerID={"Type":"cri-o","ID":"a23fbb5f49dfca97f34d67d6a0c877c3e7947d50716d539b0e60f3c0afa7bb91"} err="failed to get container status \"a23fbb5f49dfca97f34d67d6a0c877c3e7947d50716d539b0e60f3c0afa7bb91\": rpc error: code = NotFound desc = could not find container \"a23fbb5f49dfca97f34d67d6a0c877c3e7947d50716d539b0e60f3c0afa7bb91\": container with ID starting with a23fbb5f49dfca97f34d67d6a0c877c3e7947d50716d539b0e60f3c0afa7bb91 not found: ID does not exist" Mar 11 01:17:27 crc kubenswrapper[4744]: I0311 01:17:27.723112 4744 scope.go:117] "RemoveContainer" containerID="a38d68ad33aedf913ed04a1387d20eae6d05acc0a7714ef9e0b9bce8a2d21749" Mar 11 01:17:27 crc kubenswrapper[4744]: E0311 01:17:27.723283 4744 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a38d68ad33aedf913ed04a1387d20eae6d05acc0a7714ef9e0b9bce8a2d21749\": container with ID starting with a38d68ad33aedf913ed04a1387d20eae6d05acc0a7714ef9e0b9bce8a2d21749 not found: ID does not exist" containerID="a38d68ad33aedf913ed04a1387d20eae6d05acc0a7714ef9e0b9bce8a2d21749" Mar 11 01:17:27 crc kubenswrapper[4744]: I0311 01:17:27.723305 4744 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a38d68ad33aedf913ed04a1387d20eae6d05acc0a7714ef9e0b9bce8a2d21749"} err="failed to get container status \"a38d68ad33aedf913ed04a1387d20eae6d05acc0a7714ef9e0b9bce8a2d21749\": rpc error: code = NotFound desc = could not find container \"a38d68ad33aedf913ed04a1387d20eae6d05acc0a7714ef9e0b9bce8a2d21749\": container with ID starting with a38d68ad33aedf913ed04a1387d20eae6d05acc0a7714ef9e0b9bce8a2d21749 not found: ID does not exist" Mar 11 01:17:27 crc kubenswrapper[4744]: I0311 01:17:27.723318 4744 scope.go:117] "RemoveContainer" containerID="25e9bff23e82866b0ed9e4f81b7991840eca422b9d0c79ef775be2aa1291f207" Mar 11 01:17:27 crc kubenswrapper[4744]: E0311 01:17:27.723483 4744 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"25e9bff23e82866b0ed9e4f81b7991840eca422b9d0c79ef775be2aa1291f207\": container with ID starting with 25e9bff23e82866b0ed9e4f81b7991840eca422b9d0c79ef775be2aa1291f207 not found: ID does not exist" containerID="25e9bff23e82866b0ed9e4f81b7991840eca422b9d0c79ef775be2aa1291f207" Mar 11 01:17:27 crc kubenswrapper[4744]: I0311 01:17:27.723533 4744 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"25e9bff23e82866b0ed9e4f81b7991840eca422b9d0c79ef775be2aa1291f207"} err="failed to get container status \"25e9bff23e82866b0ed9e4f81b7991840eca422b9d0c79ef775be2aa1291f207\": rpc error: code = NotFound desc = could not find container \"25e9bff23e82866b0ed9e4f81b7991840eca422b9d0c79ef775be2aa1291f207\": container with ID starting with 25e9bff23e82866b0ed9e4f81b7991840eca422b9d0c79ef775be2aa1291f207 not found: ID does not exist" Mar 11 01:17:27 crc kubenswrapper[4744]: I0311 01:17:27.723547 4744 scope.go:117] "RemoveContainer" containerID="8e32d6eec611441f955f42d3bdeef06c03c1b89490522a9349b78ee112ebe25a" Mar 11 01:17:27 crc kubenswrapper[4744]: E0311 01:17:27.723709 4744 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8e32d6eec611441f955f42d3bdeef06c03c1b89490522a9349b78ee112ebe25a\": container with ID starting with 8e32d6eec611441f955f42d3bdeef06c03c1b89490522a9349b78ee112ebe25a not found: ID does not exist" containerID="8e32d6eec611441f955f42d3bdeef06c03c1b89490522a9349b78ee112ebe25a" Mar 11 01:17:27 crc kubenswrapper[4744]: I0311 01:17:27.723732 4744 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8e32d6eec611441f955f42d3bdeef06c03c1b89490522a9349b78ee112ebe25a"} err="failed to get container status \"8e32d6eec611441f955f42d3bdeef06c03c1b89490522a9349b78ee112ebe25a\": rpc error: code = NotFound desc = could not find container 
\"8e32d6eec611441f955f42d3bdeef06c03c1b89490522a9349b78ee112ebe25a\": container with ID starting with 8e32d6eec611441f955f42d3bdeef06c03c1b89490522a9349b78ee112ebe25a not found: ID does not exist" Mar 11 01:17:27 crc kubenswrapper[4744]: I0311 01:17:27.723744 4744 scope.go:117] "RemoveContainer" containerID="a23fbb5f49dfca97f34d67d6a0c877c3e7947d50716d539b0e60f3c0afa7bb91" Mar 11 01:17:27 crc kubenswrapper[4744]: I0311 01:17:27.723895 4744 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a23fbb5f49dfca97f34d67d6a0c877c3e7947d50716d539b0e60f3c0afa7bb91"} err="failed to get container status \"a23fbb5f49dfca97f34d67d6a0c877c3e7947d50716d539b0e60f3c0afa7bb91\": rpc error: code = NotFound desc = could not find container \"a23fbb5f49dfca97f34d67d6a0c877c3e7947d50716d539b0e60f3c0afa7bb91\": container with ID starting with a23fbb5f49dfca97f34d67d6a0c877c3e7947d50716d539b0e60f3c0afa7bb91 not found: ID does not exist" Mar 11 01:17:27 crc kubenswrapper[4744]: I0311 01:17:27.723916 4744 scope.go:117] "RemoveContainer" containerID="a38d68ad33aedf913ed04a1387d20eae6d05acc0a7714ef9e0b9bce8a2d21749" Mar 11 01:17:27 crc kubenswrapper[4744]: I0311 01:17:27.724066 4744 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a38d68ad33aedf913ed04a1387d20eae6d05acc0a7714ef9e0b9bce8a2d21749"} err="failed to get container status \"a38d68ad33aedf913ed04a1387d20eae6d05acc0a7714ef9e0b9bce8a2d21749\": rpc error: code = NotFound desc = could not find container \"a38d68ad33aedf913ed04a1387d20eae6d05acc0a7714ef9e0b9bce8a2d21749\": container with ID starting with a38d68ad33aedf913ed04a1387d20eae6d05acc0a7714ef9e0b9bce8a2d21749 not found: ID does not exist" Mar 11 01:17:27 crc kubenswrapper[4744]: I0311 01:17:27.724086 4744 scope.go:117] "RemoveContainer" containerID="25e9bff23e82866b0ed9e4f81b7991840eca422b9d0c79ef775be2aa1291f207" Mar 11 01:17:27 crc kubenswrapper[4744]: I0311 01:17:27.724233 4744 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"25e9bff23e82866b0ed9e4f81b7991840eca422b9d0c79ef775be2aa1291f207"} err="failed to get container status \"25e9bff23e82866b0ed9e4f81b7991840eca422b9d0c79ef775be2aa1291f207\": rpc error: code = NotFound desc = could not find container \"25e9bff23e82866b0ed9e4f81b7991840eca422b9d0c79ef775be2aa1291f207\": container with ID starting with 25e9bff23e82866b0ed9e4f81b7991840eca422b9d0c79ef775be2aa1291f207 not found: ID does not exist" Mar 11 01:17:27 crc kubenswrapper[4744]: I0311 01:17:27.724253 4744 scope.go:117] "RemoveContainer" containerID="8e32d6eec611441f955f42d3bdeef06c03c1b89490522a9349b78ee112ebe25a" Mar 11 01:17:27 crc kubenswrapper[4744]: I0311 01:17:27.724408 4744 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8e32d6eec611441f955f42d3bdeef06c03c1b89490522a9349b78ee112ebe25a"} err="failed to get container status \"8e32d6eec611441f955f42d3bdeef06c03c1b89490522a9349b78ee112ebe25a\": rpc error: code = NotFound desc = could not find container \"8e32d6eec611441f955f42d3bdeef06c03c1b89490522a9349b78ee112ebe25a\": container with ID starting with 8e32d6eec611441f955f42d3bdeef06c03c1b89490522a9349b78ee112ebe25a not found: ID does not exist" Mar 11 01:17:27 crc kubenswrapper[4744]: I0311 01:17:27.724427 4744 scope.go:117] "RemoveContainer" containerID="a23fbb5f49dfca97f34d67d6a0c877c3e7947d50716d539b0e60f3c0afa7bb91" Mar 11 01:17:27 crc kubenswrapper[4744]: I0311 01:17:27.724598 4744 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a23fbb5f49dfca97f34d67d6a0c877c3e7947d50716d539b0e60f3c0afa7bb91"} err="failed to get container status \"a23fbb5f49dfca97f34d67d6a0c877c3e7947d50716d539b0e60f3c0afa7bb91\": rpc error: code = NotFound desc = could not find container \"a23fbb5f49dfca97f34d67d6a0c877c3e7947d50716d539b0e60f3c0afa7bb91\": container with ID starting with 
a23fbb5f49dfca97f34d67d6a0c877c3e7947d50716d539b0e60f3c0afa7bb91 not found: ID does not exist" Mar 11 01:17:27 crc kubenswrapper[4744]: I0311 01:17:27.724618 4744 scope.go:117] "RemoveContainer" containerID="a38d68ad33aedf913ed04a1387d20eae6d05acc0a7714ef9e0b9bce8a2d21749" Mar 11 01:17:27 crc kubenswrapper[4744]: I0311 01:17:27.724778 4744 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a38d68ad33aedf913ed04a1387d20eae6d05acc0a7714ef9e0b9bce8a2d21749"} err="failed to get container status \"a38d68ad33aedf913ed04a1387d20eae6d05acc0a7714ef9e0b9bce8a2d21749\": rpc error: code = NotFound desc = could not find container \"a38d68ad33aedf913ed04a1387d20eae6d05acc0a7714ef9e0b9bce8a2d21749\": container with ID starting with a38d68ad33aedf913ed04a1387d20eae6d05acc0a7714ef9e0b9bce8a2d21749 not found: ID does not exist" Mar 11 01:17:27 crc kubenswrapper[4744]: I0311 01:17:27.724796 4744 scope.go:117] "RemoveContainer" containerID="25e9bff23e82866b0ed9e4f81b7991840eca422b9d0c79ef775be2aa1291f207" Mar 11 01:17:27 crc kubenswrapper[4744]: I0311 01:17:27.724943 4744 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"25e9bff23e82866b0ed9e4f81b7991840eca422b9d0c79ef775be2aa1291f207"} err="failed to get container status \"25e9bff23e82866b0ed9e4f81b7991840eca422b9d0c79ef775be2aa1291f207\": rpc error: code = NotFound desc = could not find container \"25e9bff23e82866b0ed9e4f81b7991840eca422b9d0c79ef775be2aa1291f207\": container with ID starting with 25e9bff23e82866b0ed9e4f81b7991840eca422b9d0c79ef775be2aa1291f207 not found: ID does not exist" Mar 11 01:17:27 crc kubenswrapper[4744]: I0311 01:17:27.724960 4744 scope.go:117] "RemoveContainer" containerID="8e32d6eec611441f955f42d3bdeef06c03c1b89490522a9349b78ee112ebe25a" Mar 11 01:17:27 crc kubenswrapper[4744]: I0311 01:17:27.725107 4744 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"8e32d6eec611441f955f42d3bdeef06c03c1b89490522a9349b78ee112ebe25a"} err="failed to get container status \"8e32d6eec611441f955f42d3bdeef06c03c1b89490522a9349b78ee112ebe25a\": rpc error: code = NotFound desc = could not find container \"8e32d6eec611441f955f42d3bdeef06c03c1b89490522a9349b78ee112ebe25a\": container with ID starting with 8e32d6eec611441f955f42d3bdeef06c03c1b89490522a9349b78ee112ebe25a not found: ID does not exist" Mar 11 01:17:27 crc kubenswrapper[4744]: I0311 01:17:27.725121 4744 scope.go:117] "RemoveContainer" containerID="a23fbb5f49dfca97f34d67d6a0c877c3e7947d50716d539b0e60f3c0afa7bb91" Mar 11 01:17:27 crc kubenswrapper[4744]: I0311 01:17:27.725295 4744 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a23fbb5f49dfca97f34d67d6a0c877c3e7947d50716d539b0e60f3c0afa7bb91"} err="failed to get container status \"a23fbb5f49dfca97f34d67d6a0c877c3e7947d50716d539b0e60f3c0afa7bb91\": rpc error: code = NotFound desc = could not find container \"a23fbb5f49dfca97f34d67d6a0c877c3e7947d50716d539b0e60f3c0afa7bb91\": container with ID starting with a23fbb5f49dfca97f34d67d6a0c877c3e7947d50716d539b0e60f3c0afa7bb91 not found: ID does not exist" Mar 11 01:17:27 crc kubenswrapper[4744]: I0311 01:17:27.725314 4744 scope.go:117] "RemoveContainer" containerID="a38d68ad33aedf913ed04a1387d20eae6d05acc0a7714ef9e0b9bce8a2d21749" Mar 11 01:17:27 crc kubenswrapper[4744]: I0311 01:17:27.725506 4744 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a38d68ad33aedf913ed04a1387d20eae6d05acc0a7714ef9e0b9bce8a2d21749"} err="failed to get container status \"a38d68ad33aedf913ed04a1387d20eae6d05acc0a7714ef9e0b9bce8a2d21749\": rpc error: code = NotFound desc = could not find container \"a38d68ad33aedf913ed04a1387d20eae6d05acc0a7714ef9e0b9bce8a2d21749\": container with ID starting with a38d68ad33aedf913ed04a1387d20eae6d05acc0a7714ef9e0b9bce8a2d21749 not found: ID does not 
exist" Mar 11 01:17:27 crc kubenswrapper[4744]: I0311 01:17:27.725545 4744 scope.go:117] "RemoveContainer" containerID="25e9bff23e82866b0ed9e4f81b7991840eca422b9d0c79ef775be2aa1291f207" Mar 11 01:17:27 crc kubenswrapper[4744]: I0311 01:17:27.729572 4744 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"25e9bff23e82866b0ed9e4f81b7991840eca422b9d0c79ef775be2aa1291f207"} err="failed to get container status \"25e9bff23e82866b0ed9e4f81b7991840eca422b9d0c79ef775be2aa1291f207\": rpc error: code = NotFound desc = could not find container \"25e9bff23e82866b0ed9e4f81b7991840eca422b9d0c79ef775be2aa1291f207\": container with ID starting with 25e9bff23e82866b0ed9e4f81b7991840eca422b9d0c79ef775be2aa1291f207 not found: ID does not exist" Mar 11 01:17:27 crc kubenswrapper[4744]: I0311 01:17:27.729599 4744 scope.go:117] "RemoveContainer" containerID="8e32d6eec611441f955f42d3bdeef06c03c1b89490522a9349b78ee112ebe25a" Mar 11 01:17:27 crc kubenswrapper[4744]: I0311 01:17:27.729812 4744 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8e32d6eec611441f955f42d3bdeef06c03c1b89490522a9349b78ee112ebe25a"} err="failed to get container status \"8e32d6eec611441f955f42d3bdeef06c03c1b89490522a9349b78ee112ebe25a\": rpc error: code = NotFound desc = could not find container \"8e32d6eec611441f955f42d3bdeef06c03c1b89490522a9349b78ee112ebe25a\": container with ID starting with 8e32d6eec611441f955f42d3bdeef06c03c1b89490522a9349b78ee112ebe25a not found: ID does not exist" Mar 11 01:17:27 crc kubenswrapper[4744]: I0311 01:17:27.729825 4744 scope.go:117] "RemoveContainer" containerID="c34020c1c08a3e59f5915d2c9c5efef80c6c178d17557caed339439d1ac2e9f9" Mar 11 01:17:27 crc kubenswrapper[4744]: I0311 01:17:27.764301 4744 scope.go:117] "RemoveContainer" containerID="d81d67c0c0b038d838a1c58d63d77cb43337c0b25c6edd1ee93578a354f525fb" Mar 11 01:17:27 crc kubenswrapper[4744]: I0311 01:17:27.766857 4744 kubelet.go:2437] 
"SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 11 01:17:27 crc kubenswrapper[4744]: I0311 01:17:27.773152 4744 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Mar 11 01:17:27 crc kubenswrapper[4744]: I0311 01:17:27.784117 4744 reconciler_common.go:293] "Volume detached for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") on node \"crc\" DevicePath \"\"" Mar 11 01:17:27 crc kubenswrapper[4744]: I0311 01:17:27.796284 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Mar 11 01:17:27 crc kubenswrapper[4744]: E0311 01:17:27.796665 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a0e6d46c-3f23-4e10-be5c-483f55d51052" containerName="sg-core" Mar 11 01:17:27 crc kubenswrapper[4744]: I0311 01:17:27.796687 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="a0e6d46c-3f23-4e10-be5c-483f55d51052" containerName="sg-core" Mar 11 01:17:27 crc kubenswrapper[4744]: E0311 01:17:27.796705 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5d40e511-126b-428d-aad8-c7c6ca90ec9a" containerName="placement-api" Mar 11 01:17:27 crc kubenswrapper[4744]: I0311 01:17:27.796712 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="5d40e511-126b-428d-aad8-c7c6ca90ec9a" containerName="placement-api" Mar 11 01:17:27 crc kubenswrapper[4744]: E0311 01:17:27.796728 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a0e6d46c-3f23-4e10-be5c-483f55d51052" containerName="ceilometer-central-agent" Mar 11 01:17:27 crc kubenswrapper[4744]: I0311 01:17:27.796735 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="a0e6d46c-3f23-4e10-be5c-483f55d51052" containerName="ceilometer-central-agent" Mar 11 01:17:27 crc kubenswrapper[4744]: E0311 01:17:27.796748 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3a87446a-b7cc-4068-91e2-b5dbbc3cda71" containerName="glance-httpd" Mar 11 
01:17:27 crc kubenswrapper[4744]: I0311 01:17:27.796754 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="3a87446a-b7cc-4068-91e2-b5dbbc3cda71" containerName="glance-httpd" Mar 11 01:17:27 crc kubenswrapper[4744]: E0311 01:17:27.796763 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5d40e511-126b-428d-aad8-c7c6ca90ec9a" containerName="placement-log" Mar 11 01:17:27 crc kubenswrapper[4744]: I0311 01:17:27.796769 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="5d40e511-126b-428d-aad8-c7c6ca90ec9a" containerName="placement-log" Mar 11 01:17:27 crc kubenswrapper[4744]: E0311 01:17:27.796784 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a0e6d46c-3f23-4e10-be5c-483f55d51052" containerName="ceilometer-notification-agent" Mar 11 01:17:27 crc kubenswrapper[4744]: I0311 01:17:27.796789 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="a0e6d46c-3f23-4e10-be5c-483f55d51052" containerName="ceilometer-notification-agent" Mar 11 01:17:27 crc kubenswrapper[4744]: E0311 01:17:27.796801 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a0e6d46c-3f23-4e10-be5c-483f55d51052" containerName="proxy-httpd" Mar 11 01:17:27 crc kubenswrapper[4744]: I0311 01:17:27.796808 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="a0e6d46c-3f23-4e10-be5c-483f55d51052" containerName="proxy-httpd" Mar 11 01:17:27 crc kubenswrapper[4744]: E0311 01:17:27.796817 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3a87446a-b7cc-4068-91e2-b5dbbc3cda71" containerName="glance-log" Mar 11 01:17:27 crc kubenswrapper[4744]: I0311 01:17:27.796823 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="3a87446a-b7cc-4068-91e2-b5dbbc3cda71" containerName="glance-log" Mar 11 01:17:27 crc kubenswrapper[4744]: I0311 01:17:27.796969 4744 memory_manager.go:354] "RemoveStaleState removing state" podUID="3a87446a-b7cc-4068-91e2-b5dbbc3cda71" containerName="glance-httpd" Mar 11 01:17:27 crc 
kubenswrapper[4744]: I0311 01:17:27.796982 4744 memory_manager.go:354] "RemoveStaleState removing state" podUID="a0e6d46c-3f23-4e10-be5c-483f55d51052" containerName="sg-core" Mar 11 01:17:27 crc kubenswrapper[4744]: I0311 01:17:27.796997 4744 memory_manager.go:354] "RemoveStaleState removing state" podUID="5d40e511-126b-428d-aad8-c7c6ca90ec9a" containerName="placement-log" Mar 11 01:17:27 crc kubenswrapper[4744]: I0311 01:17:27.797009 4744 memory_manager.go:354] "RemoveStaleState removing state" podUID="a0e6d46c-3f23-4e10-be5c-483f55d51052" containerName="proxy-httpd" Mar 11 01:17:27 crc kubenswrapper[4744]: I0311 01:17:27.797016 4744 memory_manager.go:354] "RemoveStaleState removing state" podUID="5d40e511-126b-428d-aad8-c7c6ca90ec9a" containerName="placement-api" Mar 11 01:17:27 crc kubenswrapper[4744]: I0311 01:17:27.797028 4744 memory_manager.go:354] "RemoveStaleState removing state" podUID="a0e6d46c-3f23-4e10-be5c-483f55d51052" containerName="ceilometer-central-agent" Mar 11 01:17:27 crc kubenswrapper[4744]: I0311 01:17:27.797036 4744 memory_manager.go:354] "RemoveStaleState removing state" podUID="a0e6d46c-3f23-4e10-be5c-483f55d51052" containerName="ceilometer-notification-agent" Mar 11 01:17:27 crc kubenswrapper[4744]: I0311 01:17:27.797045 4744 memory_manager.go:354] "RemoveStaleState removing state" podUID="3a87446a-b7cc-4068-91e2-b5dbbc3cda71" containerName="glance-log" Mar 11 01:17:27 crc kubenswrapper[4744]: I0311 01:17:27.798574 4744 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 11 01:17:27 crc kubenswrapper[4744]: I0311 01:17:27.801888 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Mar 11 01:17:27 crc kubenswrapper[4744]: I0311 01:17:27.802160 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Mar 11 01:17:27 crc kubenswrapper[4744]: I0311 01:17:27.818599 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 11 01:17:27 crc kubenswrapper[4744]: I0311 01:17:27.989168 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e299a80d-c2a8-4190-b2cb-a404ee99b9f5-config-data\") pod \"ceilometer-0\" (UID: \"e299a80d-c2a8-4190-b2cb-a404ee99b9f5\") " pod="openstack/ceilometer-0" Mar 11 01:17:27 crc kubenswrapper[4744]: I0311 01:17:27.989216 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e299a80d-c2a8-4190-b2cb-a404ee99b9f5-run-httpd\") pod \"ceilometer-0\" (UID: \"e299a80d-c2a8-4190-b2cb-a404ee99b9f5\") " pod="openstack/ceilometer-0" Mar 11 01:17:27 crc kubenswrapper[4744]: I0311 01:17:27.989252 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e299a80d-c2a8-4190-b2cb-a404ee99b9f5-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"e299a80d-c2a8-4190-b2cb-a404ee99b9f5\") " pod="openstack/ceilometer-0" Mar 11 01:17:27 crc kubenswrapper[4744]: I0311 01:17:27.989545 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e299a80d-c2a8-4190-b2cb-a404ee99b9f5-scripts\") pod \"ceilometer-0\" (UID: \"e299a80d-c2a8-4190-b2cb-a404ee99b9f5\") " 
pod="openstack/ceilometer-0" Mar 11 01:17:27 crc kubenswrapper[4744]: I0311 01:17:27.989741 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e299a80d-c2a8-4190-b2cb-a404ee99b9f5-log-httpd\") pod \"ceilometer-0\" (UID: \"e299a80d-c2a8-4190-b2cb-a404ee99b9f5\") " pod="openstack/ceilometer-0" Mar 11 01:17:27 crc kubenswrapper[4744]: I0311 01:17:27.989827 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sjlvr\" (UniqueName: \"kubernetes.io/projected/e299a80d-c2a8-4190-b2cb-a404ee99b9f5-kube-api-access-sjlvr\") pod \"ceilometer-0\" (UID: \"e299a80d-c2a8-4190-b2cb-a404ee99b9f5\") " pod="openstack/ceilometer-0" Mar 11 01:17:27 crc kubenswrapper[4744]: I0311 01:17:27.990038 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/e299a80d-c2a8-4190-b2cb-a404ee99b9f5-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"e299a80d-c2a8-4190-b2cb-a404ee99b9f5\") " pod="openstack/ceilometer-0" Mar 11 01:17:27 crc kubenswrapper[4744]: I0311 01:17:27.993348 4744 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a0e6d46c-3f23-4e10-be5c-483f55d51052" path="/var/lib/kubelet/pods/a0e6d46c-3f23-4e10-be5c-483f55d51052/volumes" Mar 11 01:17:28 crc kubenswrapper[4744]: I0311 01:17:28.091276 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/e299a80d-c2a8-4190-b2cb-a404ee99b9f5-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"e299a80d-c2a8-4190-b2cb-a404ee99b9f5\") " pod="openstack/ceilometer-0" Mar 11 01:17:28 crc kubenswrapper[4744]: I0311 01:17:28.091339 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/e299a80d-c2a8-4190-b2cb-a404ee99b9f5-config-data\") pod \"ceilometer-0\" (UID: \"e299a80d-c2a8-4190-b2cb-a404ee99b9f5\") " pod="openstack/ceilometer-0" Mar 11 01:17:28 crc kubenswrapper[4744]: I0311 01:17:28.091358 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e299a80d-c2a8-4190-b2cb-a404ee99b9f5-run-httpd\") pod \"ceilometer-0\" (UID: \"e299a80d-c2a8-4190-b2cb-a404ee99b9f5\") " pod="openstack/ceilometer-0" Mar 11 01:17:28 crc kubenswrapper[4744]: I0311 01:17:28.091410 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e299a80d-c2a8-4190-b2cb-a404ee99b9f5-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"e299a80d-c2a8-4190-b2cb-a404ee99b9f5\") " pod="openstack/ceilometer-0" Mar 11 01:17:28 crc kubenswrapper[4744]: I0311 01:17:28.091448 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e299a80d-c2a8-4190-b2cb-a404ee99b9f5-scripts\") pod \"ceilometer-0\" (UID: \"e299a80d-c2a8-4190-b2cb-a404ee99b9f5\") " pod="openstack/ceilometer-0" Mar 11 01:17:28 crc kubenswrapper[4744]: I0311 01:17:28.091512 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e299a80d-c2a8-4190-b2cb-a404ee99b9f5-log-httpd\") pod \"ceilometer-0\" (UID: \"e299a80d-c2a8-4190-b2cb-a404ee99b9f5\") " pod="openstack/ceilometer-0" Mar 11 01:17:28 crc kubenswrapper[4744]: I0311 01:17:28.091558 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sjlvr\" (UniqueName: \"kubernetes.io/projected/e299a80d-c2a8-4190-b2cb-a404ee99b9f5-kube-api-access-sjlvr\") pod \"ceilometer-0\" (UID: \"e299a80d-c2a8-4190-b2cb-a404ee99b9f5\") " pod="openstack/ceilometer-0" Mar 11 01:17:28 crc kubenswrapper[4744]: I0311 
01:17:28.093265 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e299a80d-c2a8-4190-b2cb-a404ee99b9f5-log-httpd\") pod \"ceilometer-0\" (UID: \"e299a80d-c2a8-4190-b2cb-a404ee99b9f5\") " pod="openstack/ceilometer-0" Mar 11 01:17:28 crc kubenswrapper[4744]: I0311 01:17:28.093468 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e299a80d-c2a8-4190-b2cb-a404ee99b9f5-run-httpd\") pod \"ceilometer-0\" (UID: \"e299a80d-c2a8-4190-b2cb-a404ee99b9f5\") " pod="openstack/ceilometer-0" Mar 11 01:17:28 crc kubenswrapper[4744]: I0311 01:17:28.097291 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/e299a80d-c2a8-4190-b2cb-a404ee99b9f5-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"e299a80d-c2a8-4190-b2cb-a404ee99b9f5\") " pod="openstack/ceilometer-0" Mar 11 01:17:28 crc kubenswrapper[4744]: I0311 01:17:28.098136 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e299a80d-c2a8-4190-b2cb-a404ee99b9f5-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"e299a80d-c2a8-4190-b2cb-a404ee99b9f5\") " pod="openstack/ceilometer-0" Mar 11 01:17:28 crc kubenswrapper[4744]: I0311 01:17:28.098207 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e299a80d-c2a8-4190-b2cb-a404ee99b9f5-scripts\") pod \"ceilometer-0\" (UID: \"e299a80d-c2a8-4190-b2cb-a404ee99b9f5\") " pod="openstack/ceilometer-0" Mar 11 01:17:28 crc kubenswrapper[4744]: I0311 01:17:28.098973 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e299a80d-c2a8-4190-b2cb-a404ee99b9f5-config-data\") pod \"ceilometer-0\" (UID: \"e299a80d-c2a8-4190-b2cb-a404ee99b9f5\") " 
pod="openstack/ceilometer-0" Mar 11 01:17:28 crc kubenswrapper[4744]: I0311 01:17:28.115294 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sjlvr\" (UniqueName: \"kubernetes.io/projected/e299a80d-c2a8-4190-b2cb-a404ee99b9f5-kube-api-access-sjlvr\") pod \"ceilometer-0\" (UID: \"e299a80d-c2a8-4190-b2cb-a404ee99b9f5\") " pod="openstack/ceilometer-0" Mar 11 01:17:28 crc kubenswrapper[4744]: I0311 01:17:28.414026 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 11 01:17:28 crc kubenswrapper[4744]: I0311 01:17:28.433328 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"3a87446a-b7cc-4068-91e2-b5dbbc3cda71","Type":"ContainerDied","Data":"e6983e3cb72c0c0a687dce9ef014fce5474ff36947b30a6ca0a0535243d6e751"} Mar 11 01:17:28 crc kubenswrapper[4744]: I0311 01:17:28.433481 4744 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Mar 11 01:17:28 crc kubenswrapper[4744]: I0311 01:17:28.439031 4744 generic.go:334] "Generic (PLEG): container finished" podID="8b84adb2-ba84-4796-b8c6-3bf51e850b3f" containerID="60cba145747ef116205a3e302ad50bb5d4b00de1456716486451fbb9e79a9eb5" exitCode=0 Mar 11 01:17:28 crc kubenswrapper[4744]: I0311 01:17:28.439117 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"8b84adb2-ba84-4796-b8c6-3bf51e850b3f","Type":"ContainerDied","Data":"60cba145747ef116205a3e302ad50bb5d4b00de1456716486451fbb9e79a9eb5"} Mar 11 01:17:28 crc kubenswrapper[4744]: I0311 01:17:28.481625 4744 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 11 01:17:28 crc kubenswrapper[4744]: I0311 01:17:28.526097 4744 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 11 01:17:28 crc 
kubenswrapper[4744]: I0311 01:17:28.540482 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Mar 11 01:17:28 crc kubenswrapper[4744]: I0311 01:17:28.548846 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 11 01:17:28 crc kubenswrapper[4744]: I0311 01:17:28.549098 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Mar 11 01:17:28 crc kubenswrapper[4744]: I0311 01:17:28.555642 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Mar 11 01:17:28 crc kubenswrapper[4744]: I0311 01:17:28.556316 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-public-svc" Mar 11 01:17:28 crc kubenswrapper[4744]: I0311 01:17:28.705717 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"glance-default-external-api-0\" (UID: \"ae4f1d0a-32b7-4ec5-9f5b-a43589af4119\") " pod="openstack/glance-default-external-api-0" Mar 11 01:17:28 crc kubenswrapper[4744]: I0311 01:17:28.706104 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/ae4f1d0a-32b7-4ec5-9f5b-a43589af4119-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"ae4f1d0a-32b7-4ec5-9f5b-a43589af4119\") " pod="openstack/glance-default-external-api-0" Mar 11 01:17:28 crc kubenswrapper[4744]: I0311 01:17:28.706170 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-997jj\" (UniqueName: \"kubernetes.io/projected/ae4f1d0a-32b7-4ec5-9f5b-a43589af4119-kube-api-access-997jj\") pod \"glance-default-external-api-0\" (UID: 
\"ae4f1d0a-32b7-4ec5-9f5b-a43589af4119\") " pod="openstack/glance-default-external-api-0" Mar 11 01:17:28 crc kubenswrapper[4744]: I0311 01:17:28.706203 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ae4f1d0a-32b7-4ec5-9f5b-a43589af4119-config-data\") pod \"glance-default-external-api-0\" (UID: \"ae4f1d0a-32b7-4ec5-9f5b-a43589af4119\") " pod="openstack/glance-default-external-api-0" Mar 11 01:17:28 crc kubenswrapper[4744]: I0311 01:17:28.706252 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ae4f1d0a-32b7-4ec5-9f5b-a43589af4119-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"ae4f1d0a-32b7-4ec5-9f5b-a43589af4119\") " pod="openstack/glance-default-external-api-0" Mar 11 01:17:28 crc kubenswrapper[4744]: I0311 01:17:28.706273 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ae4f1d0a-32b7-4ec5-9f5b-a43589af4119-scripts\") pod \"glance-default-external-api-0\" (UID: \"ae4f1d0a-32b7-4ec5-9f5b-a43589af4119\") " pod="openstack/glance-default-external-api-0" Mar 11 01:17:28 crc kubenswrapper[4744]: I0311 01:17:28.706340 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/ae4f1d0a-32b7-4ec5-9f5b-a43589af4119-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"ae4f1d0a-32b7-4ec5-9f5b-a43589af4119\") " pod="openstack/glance-default-external-api-0" Mar 11 01:17:28 crc kubenswrapper[4744]: I0311 01:17:28.706383 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ae4f1d0a-32b7-4ec5-9f5b-a43589af4119-logs\") pod 
\"glance-default-external-api-0\" (UID: \"ae4f1d0a-32b7-4ec5-9f5b-a43589af4119\") " pod="openstack/glance-default-external-api-0" Mar 11 01:17:28 crc kubenswrapper[4744]: I0311 01:17:28.808177 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-997jj\" (UniqueName: \"kubernetes.io/projected/ae4f1d0a-32b7-4ec5-9f5b-a43589af4119-kube-api-access-997jj\") pod \"glance-default-external-api-0\" (UID: \"ae4f1d0a-32b7-4ec5-9f5b-a43589af4119\") " pod="openstack/glance-default-external-api-0" Mar 11 01:17:28 crc kubenswrapper[4744]: I0311 01:17:28.808231 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ae4f1d0a-32b7-4ec5-9f5b-a43589af4119-config-data\") pod \"glance-default-external-api-0\" (UID: \"ae4f1d0a-32b7-4ec5-9f5b-a43589af4119\") " pod="openstack/glance-default-external-api-0" Mar 11 01:17:28 crc kubenswrapper[4744]: I0311 01:17:28.808272 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ae4f1d0a-32b7-4ec5-9f5b-a43589af4119-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"ae4f1d0a-32b7-4ec5-9f5b-a43589af4119\") " pod="openstack/glance-default-external-api-0" Mar 11 01:17:28 crc kubenswrapper[4744]: I0311 01:17:28.808291 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ae4f1d0a-32b7-4ec5-9f5b-a43589af4119-scripts\") pod \"glance-default-external-api-0\" (UID: \"ae4f1d0a-32b7-4ec5-9f5b-a43589af4119\") " pod="openstack/glance-default-external-api-0" Mar 11 01:17:28 crc kubenswrapper[4744]: I0311 01:17:28.808332 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/ae4f1d0a-32b7-4ec5-9f5b-a43589af4119-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: 
\"ae4f1d0a-32b7-4ec5-9f5b-a43589af4119\") " pod="openstack/glance-default-external-api-0" Mar 11 01:17:28 crc kubenswrapper[4744]: I0311 01:17:28.808364 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ae4f1d0a-32b7-4ec5-9f5b-a43589af4119-logs\") pod \"glance-default-external-api-0\" (UID: \"ae4f1d0a-32b7-4ec5-9f5b-a43589af4119\") " pod="openstack/glance-default-external-api-0" Mar 11 01:17:28 crc kubenswrapper[4744]: I0311 01:17:28.808396 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"glance-default-external-api-0\" (UID: \"ae4f1d0a-32b7-4ec5-9f5b-a43589af4119\") " pod="openstack/glance-default-external-api-0" Mar 11 01:17:28 crc kubenswrapper[4744]: I0311 01:17:28.808418 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/ae4f1d0a-32b7-4ec5-9f5b-a43589af4119-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"ae4f1d0a-32b7-4ec5-9f5b-a43589af4119\") " pod="openstack/glance-default-external-api-0" Mar 11 01:17:28 crc kubenswrapper[4744]: I0311 01:17:28.809292 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/ae4f1d0a-32b7-4ec5-9f5b-a43589af4119-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"ae4f1d0a-32b7-4ec5-9f5b-a43589af4119\") " pod="openstack/glance-default-external-api-0" Mar 11 01:17:28 crc kubenswrapper[4744]: I0311 01:17:28.810112 4744 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"glance-default-external-api-0\" (UID: \"ae4f1d0a-32b7-4ec5-9f5b-a43589af4119\") device mount path \"/mnt/openstack/pv04\"" pod="openstack/glance-default-external-api-0" Mar 11 
01:17:28 crc kubenswrapper[4744]: I0311 01:17:28.810166 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ae4f1d0a-32b7-4ec5-9f5b-a43589af4119-logs\") pod \"glance-default-external-api-0\" (UID: \"ae4f1d0a-32b7-4ec5-9f5b-a43589af4119\") " pod="openstack/glance-default-external-api-0" Mar 11 01:17:28 crc kubenswrapper[4744]: I0311 01:17:28.811171 4744 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Mar 11 01:17:28 crc kubenswrapper[4744]: I0311 01:17:28.814576 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ae4f1d0a-32b7-4ec5-9f5b-a43589af4119-config-data\") pod \"glance-default-external-api-0\" (UID: \"ae4f1d0a-32b7-4ec5-9f5b-a43589af4119\") " pod="openstack/glance-default-external-api-0" Mar 11 01:17:28 crc kubenswrapper[4744]: I0311 01:17:28.815220 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ae4f1d0a-32b7-4ec5-9f5b-a43589af4119-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"ae4f1d0a-32b7-4ec5-9f5b-a43589af4119\") " pod="openstack/glance-default-external-api-0" Mar 11 01:17:28 crc kubenswrapper[4744]: I0311 01:17:28.815317 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/ae4f1d0a-32b7-4ec5-9f5b-a43589af4119-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"ae4f1d0a-32b7-4ec5-9f5b-a43589af4119\") " pod="openstack/glance-default-external-api-0" Mar 11 01:17:28 crc kubenswrapper[4744]: I0311 01:17:28.817229 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ae4f1d0a-32b7-4ec5-9f5b-a43589af4119-scripts\") pod \"glance-default-external-api-0\" (UID: 
\"ae4f1d0a-32b7-4ec5-9f5b-a43589af4119\") " pod="openstack/glance-default-external-api-0" Mar 11 01:17:28 crc kubenswrapper[4744]: I0311 01:17:28.829935 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-997jj\" (UniqueName: \"kubernetes.io/projected/ae4f1d0a-32b7-4ec5-9f5b-a43589af4119-kube-api-access-997jj\") pod \"glance-default-external-api-0\" (UID: \"ae4f1d0a-32b7-4ec5-9f5b-a43589af4119\") " pod="openstack/glance-default-external-api-0" Mar 11 01:17:28 crc kubenswrapper[4744]: I0311 01:17:28.856072 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"glance-default-external-api-0\" (UID: \"ae4f1d0a-32b7-4ec5-9f5b-a43589af4119\") " pod="openstack/glance-default-external-api-0" Mar 11 01:17:28 crc kubenswrapper[4744]: I0311 01:17:28.882635 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Mar 11 01:17:28 crc kubenswrapper[4744]: I0311 01:17:28.910840 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/8b84adb2-ba84-4796-b8c6-3bf51e850b3f-internal-tls-certs\") pod \"8b84adb2-ba84-4796-b8c6-3bf51e850b3f\" (UID: \"8b84adb2-ba84-4796-b8c6-3bf51e850b3f\") " Mar 11 01:17:28 crc kubenswrapper[4744]: I0311 01:17:28.911686 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/8b84adb2-ba84-4796-b8c6-3bf51e850b3f-httpd-run\") pod \"8b84adb2-ba84-4796-b8c6-3bf51e850b3f\" (UID: \"8b84adb2-ba84-4796-b8c6-3bf51e850b3f\") " Mar 11 01:17:28 crc kubenswrapper[4744]: I0311 01:17:28.911757 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8b84adb2-ba84-4796-b8c6-3bf51e850b3f-combined-ca-bundle\") 
pod \"8b84adb2-ba84-4796-b8c6-3bf51e850b3f\" (UID: \"8b84adb2-ba84-4796-b8c6-3bf51e850b3f\") " Mar 11 01:17:28 crc kubenswrapper[4744]: I0311 01:17:28.911800 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"8b84adb2-ba84-4796-b8c6-3bf51e850b3f\" (UID: \"8b84adb2-ba84-4796-b8c6-3bf51e850b3f\") " Mar 11 01:17:28 crc kubenswrapper[4744]: I0311 01:17:28.911862 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-p272k\" (UniqueName: \"kubernetes.io/projected/8b84adb2-ba84-4796-b8c6-3bf51e850b3f-kube-api-access-p272k\") pod \"8b84adb2-ba84-4796-b8c6-3bf51e850b3f\" (UID: \"8b84adb2-ba84-4796-b8c6-3bf51e850b3f\") " Mar 11 01:17:28 crc kubenswrapper[4744]: I0311 01:17:28.911926 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8b84adb2-ba84-4796-b8c6-3bf51e850b3f-config-data\") pod \"8b84adb2-ba84-4796-b8c6-3bf51e850b3f\" (UID: \"8b84adb2-ba84-4796-b8c6-3bf51e850b3f\") " Mar 11 01:17:28 crc kubenswrapper[4744]: I0311 01:17:28.911979 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8b84adb2-ba84-4796-b8c6-3bf51e850b3f-logs\") pod \"8b84adb2-ba84-4796-b8c6-3bf51e850b3f\" (UID: \"8b84adb2-ba84-4796-b8c6-3bf51e850b3f\") " Mar 11 01:17:28 crc kubenswrapper[4744]: I0311 01:17:28.912008 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8b84adb2-ba84-4796-b8c6-3bf51e850b3f-scripts\") pod \"8b84adb2-ba84-4796-b8c6-3bf51e850b3f\" (UID: \"8b84adb2-ba84-4796-b8c6-3bf51e850b3f\") " Mar 11 01:17:28 crc kubenswrapper[4744]: I0311 01:17:28.912450 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/8b84adb2-ba84-4796-b8c6-3bf51e850b3f-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "8b84adb2-ba84-4796-b8c6-3bf51e850b3f" (UID: "8b84adb2-ba84-4796-b8c6-3bf51e850b3f"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 11 01:17:28 crc kubenswrapper[4744]: I0311 01:17:28.912554 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8b84adb2-ba84-4796-b8c6-3bf51e850b3f-logs" (OuterVolumeSpecName: "logs") pod "8b84adb2-ba84-4796-b8c6-3bf51e850b3f" (UID: "8b84adb2-ba84-4796-b8c6-3bf51e850b3f"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 11 01:17:28 crc kubenswrapper[4744]: I0311 01:17:28.912760 4744 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/8b84adb2-ba84-4796-b8c6-3bf51e850b3f-httpd-run\") on node \"crc\" DevicePath \"\"" Mar 11 01:17:28 crc kubenswrapper[4744]: I0311 01:17:28.912775 4744 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8b84adb2-ba84-4796-b8c6-3bf51e850b3f-logs\") on node \"crc\" DevicePath \"\"" Mar 11 01:17:28 crc kubenswrapper[4744]: I0311 01:17:28.916308 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8b84adb2-ba84-4796-b8c6-3bf51e850b3f-kube-api-access-p272k" (OuterVolumeSpecName: "kube-api-access-p272k") pod "8b84adb2-ba84-4796-b8c6-3bf51e850b3f" (UID: "8b84adb2-ba84-4796-b8c6-3bf51e850b3f"). InnerVolumeSpecName "kube-api-access-p272k". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 01:17:28 crc kubenswrapper[4744]: I0311 01:17:28.918164 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8b84adb2-ba84-4796-b8c6-3bf51e850b3f-scripts" (OuterVolumeSpecName: "scripts") pod "8b84adb2-ba84-4796-b8c6-3bf51e850b3f" (UID: "8b84adb2-ba84-4796-b8c6-3bf51e850b3f"). 
InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 01:17:28 crc kubenswrapper[4744]: I0311 01:17:28.918744 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage01-crc" (OuterVolumeSpecName: "glance") pod "8b84adb2-ba84-4796-b8c6-3bf51e850b3f" (UID: "8b84adb2-ba84-4796-b8c6-3bf51e850b3f"). InnerVolumeSpecName "local-storage01-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Mar 11 01:17:28 crc kubenswrapper[4744]: I0311 01:17:28.956286 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 11 01:17:28 crc kubenswrapper[4744]: W0311 01:17:28.962065 4744 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode299a80d_c2a8_4190_b2cb_a404ee99b9f5.slice/crio-1667b5c3c910322929cfee772eae6ea8b39545b4538631bae274cfcadaaceb33 WatchSource:0}: Error finding container 1667b5c3c910322929cfee772eae6ea8b39545b4538631bae274cfcadaaceb33: Status 404 returned error can't find the container with id 1667b5c3c910322929cfee772eae6ea8b39545b4538631bae274cfcadaaceb33 Mar 11 01:17:28 crc kubenswrapper[4744]: I0311 01:17:28.964956 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8b84adb2-ba84-4796-b8c6-3bf51e850b3f-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "8b84adb2-ba84-4796-b8c6-3bf51e850b3f" (UID: "8b84adb2-ba84-4796-b8c6-3bf51e850b3f"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 01:17:28 crc kubenswrapper[4744]: I0311 01:17:28.981017 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8b84adb2-ba84-4796-b8c6-3bf51e850b3f-config-data" (OuterVolumeSpecName: "config-data") pod "8b84adb2-ba84-4796-b8c6-3bf51e850b3f" (UID: "8b84adb2-ba84-4796-b8c6-3bf51e850b3f"). 
InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 01:17:29 crc kubenswrapper[4744]: I0311 01:17:28.998716 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8b84adb2-ba84-4796-b8c6-3bf51e850b3f-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "8b84adb2-ba84-4796-b8c6-3bf51e850b3f" (UID: "8b84adb2-ba84-4796-b8c6-3bf51e850b3f"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 01:17:29 crc kubenswrapper[4744]: I0311 01:17:29.016103 4744 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-p272k\" (UniqueName: \"kubernetes.io/projected/8b84adb2-ba84-4796-b8c6-3bf51e850b3f-kube-api-access-p272k\") on node \"crc\" DevicePath \"\"" Mar 11 01:17:29 crc kubenswrapper[4744]: I0311 01:17:29.016152 4744 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8b84adb2-ba84-4796-b8c6-3bf51e850b3f-config-data\") on node \"crc\" DevicePath \"\"" Mar 11 01:17:29 crc kubenswrapper[4744]: I0311 01:17:29.016164 4744 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8b84adb2-ba84-4796-b8c6-3bf51e850b3f-scripts\") on node \"crc\" DevicePath \"\"" Mar 11 01:17:29 crc kubenswrapper[4744]: I0311 01:17:29.016176 4744 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/8b84adb2-ba84-4796-b8c6-3bf51e850b3f-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 11 01:17:29 crc kubenswrapper[4744]: I0311 01:17:29.016188 4744 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8b84adb2-ba84-4796-b8c6-3bf51e850b3f-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 11 01:17:29 crc kubenswrapper[4744]: I0311 01:17:29.016226 4744 reconciler_common.go:286] 
"operationExecutor.UnmountDevice started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") on node \"crc\" " Mar 11 01:17:29 crc kubenswrapper[4744]: I0311 01:17:29.037487 4744 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage01-crc" (UniqueName: "kubernetes.io/local-volume/local-storage01-crc") on node "crc" Mar 11 01:17:29 crc kubenswrapper[4744]: I0311 01:17:29.117580 4744 reconciler_common.go:293] "Volume detached for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") on node \"crc\" DevicePath \"\"" Mar 11 01:17:29 crc kubenswrapper[4744]: I0311 01:17:29.430924 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 11 01:17:29 crc kubenswrapper[4744]: I0311 01:17:29.453471 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"ae4f1d0a-32b7-4ec5-9f5b-a43589af4119","Type":"ContainerStarted","Data":"84b62c373e1da76600505f19a32878af7c4c1da368de43305fffec91ec07eea2"} Mar 11 01:17:29 crc kubenswrapper[4744]: I0311 01:17:29.458093 4744 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Mar 11 01:17:29 crc kubenswrapper[4744]: I0311 01:17:29.458098 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"8b84adb2-ba84-4796-b8c6-3bf51e850b3f","Type":"ContainerDied","Data":"bf6c891570ec1c484bd853f4d92849d190a304a07d71269d825c24136203ac8a"} Mar 11 01:17:29 crc kubenswrapper[4744]: I0311 01:17:29.458201 4744 scope.go:117] "RemoveContainer" containerID="60cba145747ef116205a3e302ad50bb5d4b00de1456716486451fbb9e79a9eb5" Mar 11 01:17:29 crc kubenswrapper[4744]: I0311 01:17:29.462007 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e299a80d-c2a8-4190-b2cb-a404ee99b9f5","Type":"ContainerStarted","Data":"1667b5c3c910322929cfee772eae6ea8b39545b4538631bae274cfcadaaceb33"} Mar 11 01:17:29 crc kubenswrapper[4744]: I0311 01:17:29.538993 4744 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 11 01:17:29 crc kubenswrapper[4744]: I0311 01:17:29.606566 4744 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 11 01:17:29 crc kubenswrapper[4744]: I0311 01:17:29.627010 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 11 01:17:29 crc kubenswrapper[4744]: E0311 01:17:29.629314 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8b84adb2-ba84-4796-b8c6-3bf51e850b3f" containerName="glance-httpd" Mar 11 01:17:29 crc kubenswrapper[4744]: I0311 01:17:29.629341 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="8b84adb2-ba84-4796-b8c6-3bf51e850b3f" containerName="glance-httpd" Mar 11 01:17:29 crc kubenswrapper[4744]: E0311 01:17:29.629375 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8b84adb2-ba84-4796-b8c6-3bf51e850b3f" containerName="glance-log" Mar 11 01:17:29 crc kubenswrapper[4744]: I0311 
01:17:29.629381 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="8b84adb2-ba84-4796-b8c6-3bf51e850b3f" containerName="glance-log" Mar 11 01:17:29 crc kubenswrapper[4744]: I0311 01:17:29.631931 4744 memory_manager.go:354] "RemoveStaleState removing state" podUID="8b84adb2-ba84-4796-b8c6-3bf51e850b3f" containerName="glance-httpd" Mar 11 01:17:29 crc kubenswrapper[4744]: I0311 01:17:29.631959 4744 memory_manager.go:354] "RemoveStaleState removing state" podUID="8b84adb2-ba84-4796-b8c6-3bf51e850b3f" containerName="glance-log" Mar 11 01:17:29 crc kubenswrapper[4744]: I0311 01:17:29.638534 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Mar 11 01:17:29 crc kubenswrapper[4744]: I0311 01:17:29.643275 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Mar 11 01:17:29 crc kubenswrapper[4744]: I0311 01:17:29.647030 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-internal-svc" Mar 11 01:17:29 crc kubenswrapper[4744]: I0311 01:17:29.653879 4744 scope.go:117] "RemoveContainer" containerID="8fbde8a2a9b6059d4fbff5c56a5a963290f4efef4a41a866b702d93ee81bb0ff" Mar 11 01:17:29 crc kubenswrapper[4744]: I0311 01:17:29.674894 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 11 01:17:29 crc kubenswrapper[4744]: I0311 01:17:29.745494 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t5hh9\" (UniqueName: \"kubernetes.io/projected/4767cbee-21c4-4deb-871a-9c6169f5741d-kube-api-access-t5hh9\") pod \"glance-default-internal-api-0\" (UID: \"4767cbee-21c4-4deb-871a-9c6169f5741d\") " pod="openstack/glance-default-internal-api-0" Mar 11 01:17:29 crc kubenswrapper[4744]: I0311 01:17:29.745788 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4767cbee-21c4-4deb-871a-9c6169f5741d-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"4767cbee-21c4-4deb-871a-9c6169f5741d\") " pod="openstack/glance-default-internal-api-0" Mar 11 01:17:29 crc kubenswrapper[4744]: I0311 01:17:29.745836 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4767cbee-21c4-4deb-871a-9c6169f5741d-scripts\") pod \"glance-default-internal-api-0\" (UID: \"4767cbee-21c4-4deb-871a-9c6169f5741d\") " pod="openstack/glance-default-internal-api-0" Mar 11 01:17:29 crc kubenswrapper[4744]: I0311 01:17:29.746091 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4767cbee-21c4-4deb-871a-9c6169f5741d-config-data\") pod \"glance-default-internal-api-0\" (UID: \"4767cbee-21c4-4deb-871a-9c6169f5741d\") " pod="openstack/glance-default-internal-api-0" Mar 11 01:17:29 crc kubenswrapper[4744]: I0311 01:17:29.746116 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4767cbee-21c4-4deb-871a-9c6169f5741d-logs\") pod \"glance-default-internal-api-0\" (UID: \"4767cbee-21c4-4deb-871a-9c6169f5741d\") " pod="openstack/glance-default-internal-api-0" Mar 11 01:17:29 crc kubenswrapper[4744]: I0311 01:17:29.746140 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/4767cbee-21c4-4deb-871a-9c6169f5741d-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"4767cbee-21c4-4deb-871a-9c6169f5741d\") " pod="openstack/glance-default-internal-api-0" Mar 11 01:17:29 crc kubenswrapper[4744]: I0311 01:17:29.746161 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"glance-default-internal-api-0\" (UID: \"4767cbee-21c4-4deb-871a-9c6169f5741d\") " pod="openstack/glance-default-internal-api-0" Mar 11 01:17:29 crc kubenswrapper[4744]: I0311 01:17:29.748819 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/4767cbee-21c4-4deb-871a-9c6169f5741d-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"4767cbee-21c4-4deb-871a-9c6169f5741d\") " pod="openstack/glance-default-internal-api-0" Mar 11 01:17:29 crc kubenswrapper[4744]: I0311 01:17:29.851195 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t5hh9\" (UniqueName: \"kubernetes.io/projected/4767cbee-21c4-4deb-871a-9c6169f5741d-kube-api-access-t5hh9\") pod \"glance-default-internal-api-0\" (UID: \"4767cbee-21c4-4deb-871a-9c6169f5741d\") " pod="openstack/glance-default-internal-api-0" Mar 11 01:17:29 crc kubenswrapper[4744]: I0311 01:17:29.851257 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4767cbee-21c4-4deb-871a-9c6169f5741d-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"4767cbee-21c4-4deb-871a-9c6169f5741d\") " pod="openstack/glance-default-internal-api-0" Mar 11 01:17:29 crc kubenswrapper[4744]: I0311 01:17:29.851295 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4767cbee-21c4-4deb-871a-9c6169f5741d-scripts\") pod \"glance-default-internal-api-0\" (UID: \"4767cbee-21c4-4deb-871a-9c6169f5741d\") " pod="openstack/glance-default-internal-api-0" Mar 11 01:17:29 crc kubenswrapper[4744]: I0311 01:17:29.851322 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/4767cbee-21c4-4deb-871a-9c6169f5741d-config-data\") pod \"glance-default-internal-api-0\" (UID: \"4767cbee-21c4-4deb-871a-9c6169f5741d\") " pod="openstack/glance-default-internal-api-0" Mar 11 01:17:29 crc kubenswrapper[4744]: I0311 01:17:29.851343 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4767cbee-21c4-4deb-871a-9c6169f5741d-logs\") pod \"glance-default-internal-api-0\" (UID: \"4767cbee-21c4-4deb-871a-9c6169f5741d\") " pod="openstack/glance-default-internal-api-0" Mar 11 01:17:29 crc kubenswrapper[4744]: I0311 01:17:29.851365 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/4767cbee-21c4-4deb-871a-9c6169f5741d-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"4767cbee-21c4-4deb-871a-9c6169f5741d\") " pod="openstack/glance-default-internal-api-0" Mar 11 01:17:29 crc kubenswrapper[4744]: I0311 01:17:29.851380 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"glance-default-internal-api-0\" (UID: \"4767cbee-21c4-4deb-871a-9c6169f5741d\") " pod="openstack/glance-default-internal-api-0" Mar 11 01:17:29 crc kubenswrapper[4744]: I0311 01:17:29.851406 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/4767cbee-21c4-4deb-871a-9c6169f5741d-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"4767cbee-21c4-4deb-871a-9c6169f5741d\") " pod="openstack/glance-default-internal-api-0" Mar 11 01:17:29 crc kubenswrapper[4744]: I0311 01:17:29.852820 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/4767cbee-21c4-4deb-871a-9c6169f5741d-httpd-run\") pod 
\"glance-default-internal-api-0\" (UID: \"4767cbee-21c4-4deb-871a-9c6169f5741d\") " pod="openstack/glance-default-internal-api-0" Mar 11 01:17:29 crc kubenswrapper[4744]: I0311 01:17:29.853128 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4767cbee-21c4-4deb-871a-9c6169f5741d-logs\") pod \"glance-default-internal-api-0\" (UID: \"4767cbee-21c4-4deb-871a-9c6169f5741d\") " pod="openstack/glance-default-internal-api-0" Mar 11 01:17:29 crc kubenswrapper[4744]: I0311 01:17:29.853348 4744 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"glance-default-internal-api-0\" (UID: \"4767cbee-21c4-4deb-871a-9c6169f5741d\") device mount path \"/mnt/openstack/pv01\"" pod="openstack/glance-default-internal-api-0" Mar 11 01:17:29 crc kubenswrapper[4744]: I0311 01:17:29.857877 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4767cbee-21c4-4deb-871a-9c6169f5741d-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"4767cbee-21c4-4deb-871a-9c6169f5741d\") " pod="openstack/glance-default-internal-api-0" Mar 11 01:17:29 crc kubenswrapper[4744]: I0311 01:17:29.864398 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/4767cbee-21c4-4deb-871a-9c6169f5741d-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"4767cbee-21c4-4deb-871a-9c6169f5741d\") " pod="openstack/glance-default-internal-api-0" Mar 11 01:17:29 crc kubenswrapper[4744]: I0311 01:17:29.864519 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4767cbee-21c4-4deb-871a-9c6169f5741d-scripts\") pod \"glance-default-internal-api-0\" (UID: \"4767cbee-21c4-4deb-871a-9c6169f5741d\") " 
pod="openstack/glance-default-internal-api-0"
Mar 11 01:17:29 crc kubenswrapper[4744]: I0311 01:17:29.867921 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4767cbee-21c4-4deb-871a-9c6169f5741d-config-data\") pod \"glance-default-internal-api-0\" (UID: \"4767cbee-21c4-4deb-871a-9c6169f5741d\") " pod="openstack/glance-default-internal-api-0"
Mar 11 01:17:29 crc kubenswrapper[4744]: I0311 01:17:29.871067 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t5hh9\" (UniqueName: \"kubernetes.io/projected/4767cbee-21c4-4deb-871a-9c6169f5741d-kube-api-access-t5hh9\") pod \"glance-default-internal-api-0\" (UID: \"4767cbee-21c4-4deb-871a-9c6169f5741d\") " pod="openstack/glance-default-internal-api-0"
Mar 11 01:17:29 crc kubenswrapper[4744]: I0311 01:17:29.887923 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"glance-default-internal-api-0\" (UID: \"4767cbee-21c4-4deb-871a-9c6169f5741d\") " pod="openstack/glance-default-internal-api-0"
Mar 11 01:17:29 crc kubenswrapper[4744]: I0311 01:17:29.989016 4744 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3a87446a-b7cc-4068-91e2-b5dbbc3cda71" path="/var/lib/kubelet/pods/3a87446a-b7cc-4068-91e2-b5dbbc3cda71/volumes"
Mar 11 01:17:29 crc kubenswrapper[4744]: I0311 01:17:29.995958 4744 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8b84adb2-ba84-4796-b8c6-3bf51e850b3f" path="/var/lib/kubelet/pods/8b84adb2-ba84-4796-b8c6-3bf51e850b3f/volumes"
Mar 11 01:17:30 crc kubenswrapper[4744]: I0311 01:17:30.021542 4744 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0"
Mar 11 01:17:30 crc kubenswrapper[4744]: I0311 01:17:30.479737 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e299a80d-c2a8-4190-b2cb-a404ee99b9f5","Type":"ContainerStarted","Data":"b7274dc9b470e8a8adf0b1898dfd5c73a4f7caa83f9024a193d2a3365a9703bd"}
Mar 11 01:17:30 crc kubenswrapper[4744]: I0311 01:17:30.481854 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"ae4f1d0a-32b7-4ec5-9f5b-a43589af4119","Type":"ContainerStarted","Data":"dda45a2a6beb4d51c68422dc1406aaa80a32f7291668639a011f39ec42b2fc96"}
Mar 11 01:17:30 crc kubenswrapper[4744]: I0311 01:17:30.584224 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"]
Mar 11 01:17:31 crc kubenswrapper[4744]: I0311 01:17:31.495770 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e299a80d-c2a8-4190-b2cb-a404ee99b9f5","Type":"ContainerStarted","Data":"27d321a2123e96f7f11cd7464ce98813df92a1151e05ab39f0c79b6c42f650c7"}
Mar 11 01:17:31 crc kubenswrapper[4744]: I0311 01:17:31.496216 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e299a80d-c2a8-4190-b2cb-a404ee99b9f5","Type":"ContainerStarted","Data":"26f1374ee9bcb35d8f9c8b0d52f93d739569de3c5aa6c00bda3ffedc20a99f38"}
Mar 11 01:17:31 crc kubenswrapper[4744]: I0311 01:17:31.505090 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"4767cbee-21c4-4deb-871a-9c6169f5741d","Type":"ContainerStarted","Data":"160c31ddfbf19b31394b583802d9b0b99a645d4e25dc64bc9887035b9c0eac27"}
Mar 11 01:17:31 crc kubenswrapper[4744]: I0311 01:17:31.505164 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" 
event={"ID":"4767cbee-21c4-4deb-871a-9c6169f5741d","Type":"ContainerStarted","Data":"b86d7ab1af8b5ed8530de243a360bfc4c90e4b884444c461b1c0070bbaa146fe"}
Mar 11 01:17:31 crc kubenswrapper[4744]: I0311 01:17:31.507913 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"ae4f1d0a-32b7-4ec5-9f5b-a43589af4119","Type":"ContainerStarted","Data":"5be8527746802ca45e3d648c31b7d6bd21a5227a4e2fd3aeb4816ee96bea245a"}
Mar 11 01:17:31 crc kubenswrapper[4744]: I0311 01:17:31.532808 4744 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=3.532793165 podStartE2EDuration="3.532793165s" podCreationTimestamp="2026-03-11 01:17:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-11 01:17:31.527841852 +0000 UTC m=+1408.332059457" watchObservedRunningTime="2026-03-11 01:17:31.532793165 +0000 UTC m=+1408.337010770"
Mar 11 01:17:32 crc kubenswrapper[4744]: I0311 01:17:32.520395 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"4767cbee-21c4-4deb-871a-9c6169f5741d","Type":"ContainerStarted","Data":"42a9b4781df7a35e2423fc3e40c26d692ed2ab1efd0387967762e554fa2952fa"}
Mar 11 01:17:32 crc kubenswrapper[4744]: I0311 01:17:32.542355 4744 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=3.542334266 podStartE2EDuration="3.542334266s" podCreationTimestamp="2026-03-11 01:17:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-11 01:17:32.540092387 +0000 UTC m=+1409.344310002" watchObservedRunningTime="2026-03-11 01:17:32.542334266 +0000 UTC m=+1409.346551871"
Mar 11 01:17:33 crc kubenswrapper[4744]: I0311 01:17:33.541484 4744 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e299a80d-c2a8-4190-b2cb-a404ee99b9f5","Type":"ContainerStarted","Data":"941b0df49d3da04acb2ac798557e24a4e2d133a647f0e6fd039b6c4cde93013c"}
Mar 11 01:17:33 crc kubenswrapper[4744]: I0311 01:17:33.573462 4744 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.676302861 podStartE2EDuration="6.573447285s" podCreationTimestamp="2026-03-11 01:17:27 +0000 UTC" firstStartedPulling="2026-03-11 01:17:28.964885786 +0000 UTC m=+1405.769103391" lastFinishedPulling="2026-03-11 01:17:32.86203019 +0000 UTC m=+1409.666247815" observedRunningTime="2026-03-11 01:17:33.568441019 +0000 UTC m=+1410.372658664" watchObservedRunningTime="2026-03-11 01:17:33.573447285 +0000 UTC m=+1410.377664880"
Mar 11 01:17:34 crc kubenswrapper[4744]: I0311 01:17:34.552331 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0"
Mar 11 01:17:35 crc kubenswrapper[4744]: I0311 01:17:35.565745 4744 generic.go:334] "Generic (PLEG): container finished" podID="c263c020-5938-4b77-b265-c297ae87f084" containerID="5609d95b861d80c9c856a5d5ae721e3df23258d3dac31b4a8d829352cafed8e3" exitCode=0
Mar 11 01:17:35 crc kubenswrapper[4744]: I0311 01:17:35.565828 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-knwz7" event={"ID":"c263c020-5938-4b77-b265-c297ae87f084","Type":"ContainerDied","Data":"5609d95b861d80c9c856a5d5ae721e3df23258d3dac31b4a8d829352cafed8e3"}
Mar 11 01:17:36 crc kubenswrapper[4744]: I0311 01:17:36.994887 4744 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-knwz7"
Mar 11 01:17:37 crc kubenswrapper[4744]: I0311 01:17:37.087467 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c263c020-5938-4b77-b265-c297ae87f084-config-data\") pod \"c263c020-5938-4b77-b265-c297ae87f084\" (UID: \"c263c020-5938-4b77-b265-c297ae87f084\") "
Mar 11 01:17:37 crc kubenswrapper[4744]: I0311 01:17:37.087534 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c263c020-5938-4b77-b265-c297ae87f084-combined-ca-bundle\") pod \"c263c020-5938-4b77-b265-c297ae87f084\" (UID: \"c263c020-5938-4b77-b265-c297ae87f084\") "
Mar 11 01:17:37 crc kubenswrapper[4744]: I0311 01:17:37.087752 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c263c020-5938-4b77-b265-c297ae87f084-scripts\") pod \"c263c020-5938-4b77-b265-c297ae87f084\" (UID: \"c263c020-5938-4b77-b265-c297ae87f084\") "
Mar 11 01:17:37 crc kubenswrapper[4744]: I0311 01:17:37.087824 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dv2vw\" (UniqueName: \"kubernetes.io/projected/c263c020-5938-4b77-b265-c297ae87f084-kube-api-access-dv2vw\") pod \"c263c020-5938-4b77-b265-c297ae87f084\" (UID: \"c263c020-5938-4b77-b265-c297ae87f084\") "
Mar 11 01:17:37 crc kubenswrapper[4744]: I0311 01:17:37.094301 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c263c020-5938-4b77-b265-c297ae87f084-scripts" (OuterVolumeSpecName: "scripts") pod "c263c020-5938-4b77-b265-c297ae87f084" (UID: "c263c020-5938-4b77-b265-c297ae87f084"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 11 01:17:37 crc kubenswrapper[4744]: I0311 01:17:37.094557 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c263c020-5938-4b77-b265-c297ae87f084-kube-api-access-dv2vw" (OuterVolumeSpecName: "kube-api-access-dv2vw") pod "c263c020-5938-4b77-b265-c297ae87f084" (UID: "c263c020-5938-4b77-b265-c297ae87f084"). InnerVolumeSpecName "kube-api-access-dv2vw". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 11 01:17:37 crc kubenswrapper[4744]: I0311 01:17:37.119568 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c263c020-5938-4b77-b265-c297ae87f084-config-data" (OuterVolumeSpecName: "config-data") pod "c263c020-5938-4b77-b265-c297ae87f084" (UID: "c263c020-5938-4b77-b265-c297ae87f084"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 11 01:17:37 crc kubenswrapper[4744]: I0311 01:17:37.119812 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c263c020-5938-4b77-b265-c297ae87f084-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "c263c020-5938-4b77-b265-c297ae87f084" (UID: "c263c020-5938-4b77-b265-c297ae87f084"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 11 01:17:37 crc kubenswrapper[4744]: I0311 01:17:37.189989 4744 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c263c020-5938-4b77-b265-c297ae87f084-scripts\") on node \"crc\" DevicePath \"\""
Mar 11 01:17:37 crc kubenswrapper[4744]: I0311 01:17:37.190023 4744 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dv2vw\" (UniqueName: \"kubernetes.io/projected/c263c020-5938-4b77-b265-c297ae87f084-kube-api-access-dv2vw\") on node \"crc\" DevicePath \"\""
Mar 11 01:17:37 crc kubenswrapper[4744]: I0311 01:17:37.190034 4744 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c263c020-5938-4b77-b265-c297ae87f084-config-data\") on node \"crc\" DevicePath \"\""
Mar 11 01:17:37 crc kubenswrapper[4744]: I0311 01:17:37.190042 4744 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c263c020-5938-4b77-b265-c297ae87f084-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 11 01:17:37 crc kubenswrapper[4744]: I0311 01:17:37.586469 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-knwz7" event={"ID":"c263c020-5938-4b77-b265-c297ae87f084","Type":"ContainerDied","Data":"eb1bfc1563630d88f6802de7f7ba17757e99ff84ba35a4b3e446fdaab503e405"}
Mar 11 01:17:37 crc kubenswrapper[4744]: I0311 01:17:37.586535 4744 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="eb1bfc1563630d88f6802de7f7ba17757e99ff84ba35a4b3e446fdaab503e405"
Mar 11 01:17:37 crc kubenswrapper[4744]: I0311 01:17:37.586587 4744 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-knwz7"
Mar 11 01:17:37 crc kubenswrapper[4744]: I0311 01:17:37.741370 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-conductor-0"]
Mar 11 01:17:37 crc kubenswrapper[4744]: E0311 01:17:37.741804 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c263c020-5938-4b77-b265-c297ae87f084" containerName="nova-cell0-conductor-db-sync"
Mar 11 01:17:37 crc kubenswrapper[4744]: I0311 01:17:37.741826 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="c263c020-5938-4b77-b265-c297ae87f084" containerName="nova-cell0-conductor-db-sync"
Mar 11 01:17:37 crc kubenswrapper[4744]: I0311 01:17:37.742077 4744 memory_manager.go:354] "RemoveStaleState removing state" podUID="c263c020-5938-4b77-b265-c297ae87f084" containerName="nova-cell0-conductor-db-sync"
Mar 11 01:17:37 crc kubenswrapper[4744]: I0311 01:17:37.742807 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-0"
Mar 11 01:17:37 crc kubenswrapper[4744]: I0311 01:17:37.746442 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-nova-dockercfg-g7bsd"
Mar 11 01:17:37 crc kubenswrapper[4744]: I0311 01:17:37.746716 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-config-data"
Mar 11 01:17:37 crc kubenswrapper[4744]: I0311 01:17:37.764505 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"]
Mar 11 01:17:37 crc kubenswrapper[4744]: I0311 01:17:37.798824 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-959b9\" (UniqueName: \"kubernetes.io/projected/c8caed76-baba-4ad3-b95a-e428132f2021-kube-api-access-959b9\") pod \"nova-cell0-conductor-0\" (UID: \"c8caed76-baba-4ad3-b95a-e428132f2021\") " pod="openstack/nova-cell0-conductor-0"
Mar 11 01:17:37 crc 
kubenswrapper[4744]: I0311 01:17:37.798874 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c8caed76-baba-4ad3-b95a-e428132f2021-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"c8caed76-baba-4ad3-b95a-e428132f2021\") " pod="openstack/nova-cell0-conductor-0"
Mar 11 01:17:37 crc kubenswrapper[4744]: I0311 01:17:37.798993 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c8caed76-baba-4ad3-b95a-e428132f2021-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"c8caed76-baba-4ad3-b95a-e428132f2021\") " pod="openstack/nova-cell0-conductor-0"
Mar 11 01:17:37 crc kubenswrapper[4744]: I0311 01:17:37.900570 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c8caed76-baba-4ad3-b95a-e428132f2021-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"c8caed76-baba-4ad3-b95a-e428132f2021\") " pod="openstack/nova-cell0-conductor-0"
Mar 11 01:17:37 crc kubenswrapper[4744]: I0311 01:17:37.900688 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-959b9\" (UniqueName: \"kubernetes.io/projected/c8caed76-baba-4ad3-b95a-e428132f2021-kube-api-access-959b9\") pod \"nova-cell0-conductor-0\" (UID: \"c8caed76-baba-4ad3-b95a-e428132f2021\") " pod="openstack/nova-cell0-conductor-0"
Mar 11 01:17:37 crc kubenswrapper[4744]: I0311 01:17:37.900718 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c8caed76-baba-4ad3-b95a-e428132f2021-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"c8caed76-baba-4ad3-b95a-e428132f2021\") " pod="openstack/nova-cell0-conductor-0"
Mar 11 01:17:37 crc kubenswrapper[4744]: I0311 01:17:37.904408 4744 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c8caed76-baba-4ad3-b95a-e428132f2021-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"c8caed76-baba-4ad3-b95a-e428132f2021\") " pod="openstack/nova-cell0-conductor-0"
Mar 11 01:17:37 crc kubenswrapper[4744]: I0311 01:17:37.905043 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c8caed76-baba-4ad3-b95a-e428132f2021-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"c8caed76-baba-4ad3-b95a-e428132f2021\") " pod="openstack/nova-cell0-conductor-0"
Mar 11 01:17:37 crc kubenswrapper[4744]: I0311 01:17:37.919066 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-959b9\" (UniqueName: \"kubernetes.io/projected/c8caed76-baba-4ad3-b95a-e428132f2021-kube-api-access-959b9\") pod \"nova-cell0-conductor-0\" (UID: \"c8caed76-baba-4ad3-b95a-e428132f2021\") " pod="openstack/nova-cell0-conductor-0"
Mar 11 01:17:38 crc kubenswrapper[4744]: I0311 01:17:38.066912 4744 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-0"
Mar 11 01:17:38 crc kubenswrapper[4744]: I0311 01:17:38.596308 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"]
Mar 11 01:17:38 crc kubenswrapper[4744]: I0311 01:17:38.882939 4744 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0"
Mar 11 01:17:38 crc kubenswrapper[4744]: I0311 01:17:38.883198 4744 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0"
Mar 11 01:17:38 crc kubenswrapper[4744]: I0311 01:17:38.931443 4744 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0"
Mar 11 01:17:38 crc kubenswrapper[4744]: I0311 01:17:38.951418 4744 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0"
Mar 11 01:17:39 crc kubenswrapper[4744]: I0311 01:17:39.606333 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"c8caed76-baba-4ad3-b95a-e428132f2021","Type":"ContainerStarted","Data":"a7377528de2e5a9a7450056f3a281f5a1113ad08b0bc9f7c476cd009942572ee"}
Mar 11 01:17:39 crc kubenswrapper[4744]: I0311 01:17:39.607551 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"c8caed76-baba-4ad3-b95a-e428132f2021","Type":"ContainerStarted","Data":"4066205a4d4f854a933e6ee6717e3772cc1b97e03da805734227a2224bdc1194"}
Mar 11 01:17:39 crc kubenswrapper[4744]: I0311 01:17:39.607779 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0"
Mar 11 01:17:39 crc kubenswrapper[4744]: I0311 01:17:39.608228 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0"
Mar 11 01:17:39 crc kubenswrapper[4744]: I0311 01:17:39.608305 4744 
kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell0-conductor-0"
Mar 11 01:17:40 crc kubenswrapper[4744]: I0311 01:17:40.022166 4744 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0"
Mar 11 01:17:40 crc kubenswrapper[4744]: I0311 01:17:40.022506 4744 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0"
Mar 11 01:17:40 crc kubenswrapper[4744]: I0311 01:17:40.072953 4744 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0"
Mar 11 01:17:40 crc kubenswrapper[4744]: I0311 01:17:40.082974 4744 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0"
Mar 11 01:17:40 crc kubenswrapper[4744]: I0311 01:17:40.107916 4744 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-conductor-0" podStartSLOduration=3.107893543 podStartE2EDuration="3.107893543s" podCreationTimestamp="2026-03-11 01:17:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-11 01:17:39.633422565 +0000 UTC m=+1416.437640170" watchObservedRunningTime="2026-03-11 01:17:40.107893543 +0000 UTC m=+1416.912111148"
Mar 11 01:17:40 crc kubenswrapper[4744]: I0311 01:17:40.616327 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0"
Mar 11 01:17:40 crc kubenswrapper[4744]: I0311 01:17:40.616378 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0"
Mar 11 01:17:41 crc kubenswrapper[4744]: I0311 01:17:41.624193 4744 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness"
Mar 11 01:17:41 crc kubenswrapper[4744]: I0311 01:17:41.624497 4744 prober_manager.go:312] "Failed 
to trigger a manual run" probe="Readiness"
Mar 11 01:17:42 crc kubenswrapper[4744]: I0311 01:17:42.104575 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0"
Mar 11 01:17:42 crc kubenswrapper[4744]: I0311 01:17:42.124824 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0"
Mar 11 01:17:42 crc kubenswrapper[4744]: I0311 01:17:42.412274 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0"
Mar 11 01:17:42 crc kubenswrapper[4744]: I0311 01:17:42.479693 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0"
Mar 11 01:17:43 crc kubenswrapper[4744]: I0311 01:17:43.115289 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell0-conductor-0"
Mar 11 01:17:43 crc kubenswrapper[4744]: I0311 01:17:43.634371 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-cell-mapping-687vh"]
Mar 11 01:17:43 crc kubenswrapper[4744]: I0311 01:17:43.636731 4744 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-cell-mapping-687vh"
Mar 11 01:17:43 crc kubenswrapper[4744]: I0311 01:17:43.640675 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-manage-scripts"
Mar 11 01:17:43 crc kubenswrapper[4744]: I0311 01:17:43.640749 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-manage-config-data"
Mar 11 01:17:43 crc kubenswrapper[4744]: I0311 01:17:43.652872 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-cell-mapping-687vh"]
Mar 11 01:17:43 crc kubenswrapper[4744]: I0311 01:17:43.723564 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b2a44cf2-a3b1-4b65-aba3-b0a5d939b303-scripts\") pod \"nova-cell0-cell-mapping-687vh\" (UID: \"b2a44cf2-a3b1-4b65-aba3-b0a5d939b303\") " pod="openstack/nova-cell0-cell-mapping-687vh"
Mar 11 01:17:43 crc kubenswrapper[4744]: I0311 01:17:43.723603 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b2a44cf2-a3b1-4b65-aba3-b0a5d939b303-config-data\") pod \"nova-cell0-cell-mapping-687vh\" (UID: \"b2a44cf2-a3b1-4b65-aba3-b0a5d939b303\") " pod="openstack/nova-cell0-cell-mapping-687vh"
Mar 11 01:17:43 crc kubenswrapper[4744]: I0311 01:17:43.723628 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b2a44cf2-a3b1-4b65-aba3-b0a5d939b303-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-687vh\" (UID: \"b2a44cf2-a3b1-4b65-aba3-b0a5d939b303\") " pod="openstack/nova-cell0-cell-mapping-687vh"
Mar 11 01:17:43 crc kubenswrapper[4744]: I0311 01:17:43.723656 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vvjkg\" (UniqueName: 
\"kubernetes.io/projected/b2a44cf2-a3b1-4b65-aba3-b0a5d939b303-kube-api-access-vvjkg\") pod \"nova-cell0-cell-mapping-687vh\" (UID: \"b2a44cf2-a3b1-4b65-aba3-b0a5d939b303\") " pod="openstack/nova-cell0-cell-mapping-687vh"
Mar 11 01:17:43 crc kubenswrapper[4744]: I0311 01:17:43.740826 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"]
Mar 11 01:17:43 crc kubenswrapper[4744]: I0311 01:17:43.787244 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"]
Mar 11 01:17:43 crc kubenswrapper[4744]: I0311 01:17:43.787662 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0"
Mar 11 01:17:43 crc kubenswrapper[4744]: I0311 01:17:43.791384 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data"
Mar 11 01:17:43 crc kubenswrapper[4744]: I0311 01:17:43.793411 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"]
Mar 11 01:17:43 crc kubenswrapper[4744]: I0311 01:17:43.794794 4744 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0"
Mar 11 01:17:43 crc kubenswrapper[4744]: I0311 01:17:43.804700 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data"
Mar 11 01:17:43 crc kubenswrapper[4744]: I0311 01:17:43.810447 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"]
Mar 11 01:17:43 crc kubenswrapper[4744]: I0311 01:17:43.825341 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-45dnm\" (UniqueName: \"kubernetes.io/projected/58321737-f97a-47e0-9d0d-07f0c0da801c-kube-api-access-45dnm\") pod \"nova-api-0\" (UID: \"58321737-f97a-47e0-9d0d-07f0c0da801c\") " pod="openstack/nova-api-0"
Mar 11 01:17:43 crc kubenswrapper[4744]: I0311 01:17:43.825619 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/58321737-f97a-47e0-9d0d-07f0c0da801c-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"58321737-f97a-47e0-9d0d-07f0c0da801c\") " pod="openstack/nova-api-0"
Mar 11 01:17:43 crc kubenswrapper[4744]: I0311 01:17:43.825730 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b2a44cf2-a3b1-4b65-aba3-b0a5d939b303-scripts\") pod \"nova-cell0-cell-mapping-687vh\" (UID: \"b2a44cf2-a3b1-4b65-aba3-b0a5d939b303\") " pod="openstack/nova-cell0-cell-mapping-687vh"
Mar 11 01:17:43 crc kubenswrapper[4744]: I0311 01:17:43.825828 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/58321737-f97a-47e0-9d0d-07f0c0da801c-config-data\") pod \"nova-api-0\" (UID: \"58321737-f97a-47e0-9d0d-07f0c0da801c\") " pod="openstack/nova-api-0"
Mar 11 01:17:43 crc kubenswrapper[4744]: I0311 01:17:43.825928 4744 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b2a44cf2-a3b1-4b65-aba3-b0a5d939b303-config-data\") pod \"nova-cell0-cell-mapping-687vh\" (UID: \"b2a44cf2-a3b1-4b65-aba3-b0a5d939b303\") " pod="openstack/nova-cell0-cell-mapping-687vh"
Mar 11 01:17:43 crc kubenswrapper[4744]: I0311 01:17:43.826026 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8bba38e8-f452-48af-a4ad-faaff2a073e1-config-data\") pod \"nova-scheduler-0\" (UID: \"8bba38e8-f452-48af-a4ad-faaff2a073e1\") " pod="openstack/nova-scheduler-0"
Mar 11 01:17:43 crc kubenswrapper[4744]: I0311 01:17:43.826119 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vfdvf\" (UniqueName: \"kubernetes.io/projected/8bba38e8-f452-48af-a4ad-faaff2a073e1-kube-api-access-vfdvf\") pod \"nova-scheduler-0\" (UID: \"8bba38e8-f452-48af-a4ad-faaff2a073e1\") " pod="openstack/nova-scheduler-0"
Mar 11 01:17:43 crc kubenswrapper[4744]: I0311 01:17:43.826214 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8bba38e8-f452-48af-a4ad-faaff2a073e1-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"8bba38e8-f452-48af-a4ad-faaff2a073e1\") " pod="openstack/nova-scheduler-0"
Mar 11 01:17:43 crc kubenswrapper[4744]: I0311 01:17:43.826302 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b2a44cf2-a3b1-4b65-aba3-b0a5d939b303-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-687vh\" (UID: \"b2a44cf2-a3b1-4b65-aba3-b0a5d939b303\") " pod="openstack/nova-cell0-cell-mapping-687vh"
Mar 11 01:17:43 crc kubenswrapper[4744]: I0311 01:17:43.826403 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-vvjkg\" (UniqueName: \"kubernetes.io/projected/b2a44cf2-a3b1-4b65-aba3-b0a5d939b303-kube-api-access-vvjkg\") pod \"nova-cell0-cell-mapping-687vh\" (UID: \"b2a44cf2-a3b1-4b65-aba3-b0a5d939b303\") " pod="openstack/nova-cell0-cell-mapping-687vh"
Mar 11 01:17:43 crc kubenswrapper[4744]: I0311 01:17:43.826488 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/58321737-f97a-47e0-9d0d-07f0c0da801c-logs\") pod \"nova-api-0\" (UID: \"58321737-f97a-47e0-9d0d-07f0c0da801c\") " pod="openstack/nova-api-0"
Mar 11 01:17:43 crc kubenswrapper[4744]: I0311 01:17:43.836282 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b2a44cf2-a3b1-4b65-aba3-b0a5d939b303-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-687vh\" (UID: \"b2a44cf2-a3b1-4b65-aba3-b0a5d939b303\") " pod="openstack/nova-cell0-cell-mapping-687vh"
Mar 11 01:17:43 crc kubenswrapper[4744]: I0311 01:17:43.837110 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b2a44cf2-a3b1-4b65-aba3-b0a5d939b303-scripts\") pod \"nova-cell0-cell-mapping-687vh\" (UID: \"b2a44cf2-a3b1-4b65-aba3-b0a5d939b303\") " pod="openstack/nova-cell0-cell-mapping-687vh"
Mar 11 01:17:43 crc kubenswrapper[4744]: I0311 01:17:43.858607 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b2a44cf2-a3b1-4b65-aba3-b0a5d939b303-config-data\") pod \"nova-cell0-cell-mapping-687vh\" (UID: \"b2a44cf2-a3b1-4b65-aba3-b0a5d939b303\") " pod="openstack/nova-cell0-cell-mapping-687vh"
Mar 11 01:17:43 crc kubenswrapper[4744]: I0311 01:17:43.868776 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vvjkg\" (UniqueName: 
\"kubernetes.io/projected/b2a44cf2-a3b1-4b65-aba3-b0a5d939b303-kube-api-access-vvjkg\") pod \"nova-cell0-cell-mapping-687vh\" (UID: \"b2a44cf2-a3b1-4b65-aba3-b0a5d939b303\") " pod="openstack/nova-cell0-cell-mapping-687vh"
Mar 11 01:17:43 crc kubenswrapper[4744]: I0311 01:17:43.887580 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"]
Mar 11 01:17:43 crc kubenswrapper[4744]: I0311 01:17:43.889077 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0"
Mar 11 01:17:43 crc kubenswrapper[4744]: I0311 01:17:43.898180 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data"
Mar 11 01:17:43 crc kubenswrapper[4744]: I0311 01:17:43.928678 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b3533d92-44d2-4d39-98af-b144b8a57d24-logs\") pod \"nova-metadata-0\" (UID: \"b3533d92-44d2-4d39-98af-b144b8a57d24\") " pod="openstack/nova-metadata-0"
Mar 11 01:17:43 crc kubenswrapper[4744]: I0311 01:17:43.928776 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7bpkn\" (UniqueName: \"kubernetes.io/projected/b3533d92-44d2-4d39-98af-b144b8a57d24-kube-api-access-7bpkn\") pod \"nova-metadata-0\" (UID: \"b3533d92-44d2-4d39-98af-b144b8a57d24\") " pod="openstack/nova-metadata-0"
Mar 11 01:17:43 crc kubenswrapper[4744]: I0311 01:17:43.928833 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-45dnm\" (UniqueName: \"kubernetes.io/projected/58321737-f97a-47e0-9d0d-07f0c0da801c-kube-api-access-45dnm\") pod \"nova-api-0\" (UID: \"58321737-f97a-47e0-9d0d-07f0c0da801c\") " pod="openstack/nova-api-0"
Mar 11 01:17:43 crc kubenswrapper[4744]: I0311 01:17:43.928880 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/58321737-f97a-47e0-9d0d-07f0c0da801c-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"58321737-f97a-47e0-9d0d-07f0c0da801c\") " pod="openstack/nova-api-0" Mar 11 01:17:43 crc kubenswrapper[4744]: I0311 01:17:43.928919 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/58321737-f97a-47e0-9d0d-07f0c0da801c-config-data\") pod \"nova-api-0\" (UID: \"58321737-f97a-47e0-9d0d-07f0c0da801c\") " pod="openstack/nova-api-0" Mar 11 01:17:43 crc kubenswrapper[4744]: I0311 01:17:43.928940 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vfdvf\" (UniqueName: \"kubernetes.io/projected/8bba38e8-f452-48af-a4ad-faaff2a073e1-kube-api-access-vfdvf\") pod \"nova-scheduler-0\" (UID: \"8bba38e8-f452-48af-a4ad-faaff2a073e1\") " pod="openstack/nova-scheduler-0" Mar 11 01:17:43 crc kubenswrapper[4744]: I0311 01:17:43.928959 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8bba38e8-f452-48af-a4ad-faaff2a073e1-config-data\") pod \"nova-scheduler-0\" (UID: \"8bba38e8-f452-48af-a4ad-faaff2a073e1\") " pod="openstack/nova-scheduler-0" Mar 11 01:17:43 crc kubenswrapper[4744]: I0311 01:17:43.928988 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8bba38e8-f452-48af-a4ad-faaff2a073e1-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"8bba38e8-f452-48af-a4ad-faaff2a073e1\") " pod="openstack/nova-scheduler-0" Mar 11 01:17:43 crc kubenswrapper[4744]: I0311 01:17:43.929012 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b3533d92-44d2-4d39-98af-b144b8a57d24-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: 
\"b3533d92-44d2-4d39-98af-b144b8a57d24\") " pod="openstack/nova-metadata-0" Mar 11 01:17:43 crc kubenswrapper[4744]: I0311 01:17:43.929048 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/58321737-f97a-47e0-9d0d-07f0c0da801c-logs\") pod \"nova-api-0\" (UID: \"58321737-f97a-47e0-9d0d-07f0c0da801c\") " pod="openstack/nova-api-0" Mar 11 01:17:43 crc kubenswrapper[4744]: I0311 01:17:43.929070 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b3533d92-44d2-4d39-98af-b144b8a57d24-config-data\") pod \"nova-metadata-0\" (UID: \"b3533d92-44d2-4d39-98af-b144b8a57d24\") " pod="openstack/nova-metadata-0" Mar 11 01:17:43 crc kubenswrapper[4744]: I0311 01:17:43.933082 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/58321737-f97a-47e0-9d0d-07f0c0da801c-logs\") pod \"nova-api-0\" (UID: \"58321737-f97a-47e0-9d0d-07f0c0da801c\") " pod="openstack/nova-api-0" Mar 11 01:17:43 crc kubenswrapper[4744]: I0311 01:17:43.939099 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8bba38e8-f452-48af-a4ad-faaff2a073e1-config-data\") pod \"nova-scheduler-0\" (UID: \"8bba38e8-f452-48af-a4ad-faaff2a073e1\") " pod="openstack/nova-scheduler-0" Mar 11 01:17:43 crc kubenswrapper[4744]: I0311 01:17:43.940115 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/58321737-f97a-47e0-9d0d-07f0c0da801c-config-data\") pod \"nova-api-0\" (UID: \"58321737-f97a-47e0-9d0d-07f0c0da801c\") " pod="openstack/nova-api-0" Mar 11 01:17:43 crc kubenswrapper[4744]: I0311 01:17:43.966279 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/58321737-f97a-47e0-9d0d-07f0c0da801c-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"58321737-f97a-47e0-9d0d-07f0c0da801c\") " pod="openstack/nova-api-0" Mar 11 01:17:43 crc kubenswrapper[4744]: I0311 01:17:43.966481 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8bba38e8-f452-48af-a4ad-faaff2a073e1-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"8bba38e8-f452-48af-a4ad-faaff2a073e1\") " pod="openstack/nova-scheduler-0" Mar 11 01:17:43 crc kubenswrapper[4744]: I0311 01:17:43.975125 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-45dnm\" (UniqueName: \"kubernetes.io/projected/58321737-f97a-47e0-9d0d-07f0c0da801c-kube-api-access-45dnm\") pod \"nova-api-0\" (UID: \"58321737-f97a-47e0-9d0d-07f0c0da801c\") " pod="openstack/nova-api-0" Mar 11 01:17:43 crc kubenswrapper[4744]: I0311 01:17:43.975502 4744 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-cell-mapping-687vh" Mar 11 01:17:44 crc kubenswrapper[4744]: I0311 01:17:44.012352 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vfdvf\" (UniqueName: \"kubernetes.io/projected/8bba38e8-f452-48af-a4ad-faaff2a073e1-kube-api-access-vfdvf\") pod \"nova-scheduler-0\" (UID: \"8bba38e8-f452-48af-a4ad-faaff2a073e1\") " pod="openstack/nova-scheduler-0" Mar 11 01:17:44 crc kubenswrapper[4744]: I0311 01:17:44.030971 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b3533d92-44d2-4d39-98af-b144b8a57d24-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"b3533d92-44d2-4d39-98af-b144b8a57d24\") " pod="openstack/nova-metadata-0" Mar 11 01:17:44 crc kubenswrapper[4744]: I0311 01:17:44.031023 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b3533d92-44d2-4d39-98af-b144b8a57d24-config-data\") pod \"nova-metadata-0\" (UID: \"b3533d92-44d2-4d39-98af-b144b8a57d24\") " pod="openstack/nova-metadata-0" Mar 11 01:17:44 crc kubenswrapper[4744]: I0311 01:17:44.031061 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b3533d92-44d2-4d39-98af-b144b8a57d24-logs\") pod \"nova-metadata-0\" (UID: \"b3533d92-44d2-4d39-98af-b144b8a57d24\") " pod="openstack/nova-metadata-0" Mar 11 01:17:44 crc kubenswrapper[4744]: I0311 01:17:44.031117 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7bpkn\" (UniqueName: \"kubernetes.io/projected/b3533d92-44d2-4d39-98af-b144b8a57d24-kube-api-access-7bpkn\") pod \"nova-metadata-0\" (UID: \"b3533d92-44d2-4d39-98af-b144b8a57d24\") " pod="openstack/nova-metadata-0" Mar 11 01:17:44 crc kubenswrapper[4744]: I0311 01:17:44.043022 4744 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b3533d92-44d2-4d39-98af-b144b8a57d24-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"b3533d92-44d2-4d39-98af-b144b8a57d24\") " pod="openstack/nova-metadata-0" Mar 11 01:17:44 crc kubenswrapper[4744]: I0311 01:17:44.044113 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b3533d92-44d2-4d39-98af-b144b8a57d24-logs\") pod \"nova-metadata-0\" (UID: \"b3533d92-44d2-4d39-98af-b144b8a57d24\") " pod="openstack/nova-metadata-0" Mar 11 01:17:44 crc kubenswrapper[4744]: I0311 01:17:44.052385 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b3533d92-44d2-4d39-98af-b144b8a57d24-config-data\") pod \"nova-metadata-0\" (UID: \"b3533d92-44d2-4d39-98af-b144b8a57d24\") " pod="openstack/nova-metadata-0" Mar 11 01:17:44 crc kubenswrapper[4744]: I0311 01:17:44.079068 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7bpkn\" (UniqueName: \"kubernetes.io/projected/b3533d92-44d2-4d39-98af-b144b8a57d24-kube-api-access-7bpkn\") pod \"nova-metadata-0\" (UID: \"b3533d92-44d2-4d39-98af-b144b8a57d24\") " pod="openstack/nova-metadata-0" Mar 11 01:17:44 crc kubenswrapper[4744]: I0311 01:17:44.079286 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Mar 11 01:17:44 crc kubenswrapper[4744]: I0311 01:17:44.084989 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Mar 11 01:17:44 crc kubenswrapper[4744]: I0311 01:17:44.086140 4744 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Mar 11 01:17:44 crc kubenswrapper[4744]: I0311 01:17:44.089890 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-novncproxy-config-data" Mar 11 01:17:44 crc kubenswrapper[4744]: I0311 01:17:44.099014 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Mar 11 01:17:44 crc kubenswrapper[4744]: I0311 01:17:44.192331 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Mar 11 01:17:44 crc kubenswrapper[4744]: I0311 01:17:44.193196 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-6d5695c9cc-mrlhc"] Mar 11 01:17:44 crc kubenswrapper[4744]: I0311 01:17:44.195897 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6d5695c9cc-mrlhc" Mar 11 01:17:44 crc kubenswrapper[4744]: I0311 01:17:44.220316 4744 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Mar 11 01:17:44 crc kubenswrapper[4744]: I0311 01:17:44.273275 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dc84843f-62da-435f-ab7a-2f0b97cb2ec7-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"dc84843f-62da-435f-ab7a-2f0b97cb2ec7\") " pod="openstack/nova-cell1-novncproxy-0" Mar 11 01:17:44 crc kubenswrapper[4744]: I0311 01:17:44.273378 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xnlz7\" (UniqueName: \"kubernetes.io/projected/dc84843f-62da-435f-ab7a-2f0b97cb2ec7-kube-api-access-xnlz7\") pod \"nova-cell1-novncproxy-0\" (UID: \"dc84843f-62da-435f-ab7a-2f0b97cb2ec7\") " pod="openstack/nova-cell1-novncproxy-0" Mar 11 01:17:44 crc kubenswrapper[4744]: I0311 01:17:44.273410 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dc84843f-62da-435f-ab7a-2f0b97cb2ec7-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"dc84843f-62da-435f-ab7a-2f0b97cb2ec7\") " pod="openstack/nova-cell1-novncproxy-0" Mar 11 01:17:44 crc kubenswrapper[4744]: I0311 01:17:44.291575 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6d5695c9cc-mrlhc"] Mar 11 01:17:44 crc kubenswrapper[4744]: I0311 01:17:44.367985 4744 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Mar 11 01:17:44 crc kubenswrapper[4744]: I0311 01:17:44.376273 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xnlz7\" (UniqueName: \"kubernetes.io/projected/dc84843f-62da-435f-ab7a-2f0b97cb2ec7-kube-api-access-xnlz7\") pod \"nova-cell1-novncproxy-0\" (UID: \"dc84843f-62da-435f-ab7a-2f0b97cb2ec7\") " pod="openstack/nova-cell1-novncproxy-0" Mar 11 01:17:44 crc kubenswrapper[4744]: I0311 01:17:44.376315 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dc84843f-62da-435f-ab7a-2f0b97cb2ec7-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"dc84843f-62da-435f-ab7a-2f0b97cb2ec7\") " pod="openstack/nova-cell1-novncproxy-0" Mar 11 01:17:44 crc kubenswrapper[4744]: I0311 01:17:44.376351 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/fdec9153-74cd-4e53-8667-f96ed2dad143-dns-swift-storage-0\") pod \"dnsmasq-dns-6d5695c9cc-mrlhc\" (UID: \"fdec9153-74cd-4e53-8667-f96ed2dad143\") " pod="openstack/dnsmasq-dns-6d5695c9cc-mrlhc" Mar 11 01:17:44 crc kubenswrapper[4744]: I0311 01:17:44.376389 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/fdec9153-74cd-4e53-8667-f96ed2dad143-ovsdbserver-sb\") pod \"dnsmasq-dns-6d5695c9cc-mrlhc\" (UID: \"fdec9153-74cd-4e53-8667-f96ed2dad143\") " pod="openstack/dnsmasq-dns-6d5695c9cc-mrlhc" Mar 11 01:17:44 crc kubenswrapper[4744]: I0311 01:17:44.376430 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fdec9153-74cd-4e53-8667-f96ed2dad143-config\") pod \"dnsmasq-dns-6d5695c9cc-mrlhc\" (UID: \"fdec9153-74cd-4e53-8667-f96ed2dad143\") " 
pod="openstack/dnsmasq-dns-6d5695c9cc-mrlhc" Mar 11 01:17:44 crc kubenswrapper[4744]: I0311 01:17:44.376449 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/fdec9153-74cd-4e53-8667-f96ed2dad143-ovsdbserver-nb\") pod \"dnsmasq-dns-6d5695c9cc-mrlhc\" (UID: \"fdec9153-74cd-4e53-8667-f96ed2dad143\") " pod="openstack/dnsmasq-dns-6d5695c9cc-mrlhc" Mar 11 01:17:44 crc kubenswrapper[4744]: I0311 01:17:44.376465 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dc84843f-62da-435f-ab7a-2f0b97cb2ec7-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"dc84843f-62da-435f-ab7a-2f0b97cb2ec7\") " pod="openstack/nova-cell1-novncproxy-0" Mar 11 01:17:44 crc kubenswrapper[4744]: I0311 01:17:44.376484 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ttmfj\" (UniqueName: \"kubernetes.io/projected/fdec9153-74cd-4e53-8667-f96ed2dad143-kube-api-access-ttmfj\") pod \"dnsmasq-dns-6d5695c9cc-mrlhc\" (UID: \"fdec9153-74cd-4e53-8667-f96ed2dad143\") " pod="openstack/dnsmasq-dns-6d5695c9cc-mrlhc" Mar 11 01:17:44 crc kubenswrapper[4744]: I0311 01:17:44.376538 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/fdec9153-74cd-4e53-8667-f96ed2dad143-dns-svc\") pod \"dnsmasq-dns-6d5695c9cc-mrlhc\" (UID: \"fdec9153-74cd-4e53-8667-f96ed2dad143\") " pod="openstack/dnsmasq-dns-6d5695c9cc-mrlhc" Mar 11 01:17:44 crc kubenswrapper[4744]: I0311 01:17:44.380382 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dc84843f-62da-435f-ab7a-2f0b97cb2ec7-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"dc84843f-62da-435f-ab7a-2f0b97cb2ec7\") " 
pod="openstack/nova-cell1-novncproxy-0" Mar 11 01:17:44 crc kubenswrapper[4744]: I0311 01:17:44.385077 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dc84843f-62da-435f-ab7a-2f0b97cb2ec7-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"dc84843f-62da-435f-ab7a-2f0b97cb2ec7\") " pod="openstack/nova-cell1-novncproxy-0" Mar 11 01:17:44 crc kubenswrapper[4744]: I0311 01:17:44.396473 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xnlz7\" (UniqueName: \"kubernetes.io/projected/dc84843f-62da-435f-ab7a-2f0b97cb2ec7-kube-api-access-xnlz7\") pod \"nova-cell1-novncproxy-0\" (UID: \"dc84843f-62da-435f-ab7a-2f0b97cb2ec7\") " pod="openstack/nova-cell1-novncproxy-0" Mar 11 01:17:44 crc kubenswrapper[4744]: I0311 01:17:44.477889 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fdec9153-74cd-4e53-8667-f96ed2dad143-config\") pod \"dnsmasq-dns-6d5695c9cc-mrlhc\" (UID: \"fdec9153-74cd-4e53-8667-f96ed2dad143\") " pod="openstack/dnsmasq-dns-6d5695c9cc-mrlhc" Mar 11 01:17:44 crc kubenswrapper[4744]: I0311 01:17:44.477941 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/fdec9153-74cd-4e53-8667-f96ed2dad143-ovsdbserver-nb\") pod \"dnsmasq-dns-6d5695c9cc-mrlhc\" (UID: \"fdec9153-74cd-4e53-8667-f96ed2dad143\") " pod="openstack/dnsmasq-dns-6d5695c9cc-mrlhc" Mar 11 01:17:44 crc kubenswrapper[4744]: I0311 01:17:44.477980 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ttmfj\" (UniqueName: \"kubernetes.io/projected/fdec9153-74cd-4e53-8667-f96ed2dad143-kube-api-access-ttmfj\") pod \"dnsmasq-dns-6d5695c9cc-mrlhc\" (UID: \"fdec9153-74cd-4e53-8667-f96ed2dad143\") " pod="openstack/dnsmasq-dns-6d5695c9cc-mrlhc" Mar 11 01:17:44 crc 
kubenswrapper[4744]: I0311 01:17:44.478039 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/fdec9153-74cd-4e53-8667-f96ed2dad143-dns-svc\") pod \"dnsmasq-dns-6d5695c9cc-mrlhc\" (UID: \"fdec9153-74cd-4e53-8667-f96ed2dad143\") " pod="openstack/dnsmasq-dns-6d5695c9cc-mrlhc" Mar 11 01:17:44 crc kubenswrapper[4744]: I0311 01:17:44.478104 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/fdec9153-74cd-4e53-8667-f96ed2dad143-dns-swift-storage-0\") pod \"dnsmasq-dns-6d5695c9cc-mrlhc\" (UID: \"fdec9153-74cd-4e53-8667-f96ed2dad143\") " pod="openstack/dnsmasq-dns-6d5695c9cc-mrlhc" Mar 11 01:17:44 crc kubenswrapper[4744]: I0311 01:17:44.478140 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/fdec9153-74cd-4e53-8667-f96ed2dad143-ovsdbserver-sb\") pod \"dnsmasq-dns-6d5695c9cc-mrlhc\" (UID: \"fdec9153-74cd-4e53-8667-f96ed2dad143\") " pod="openstack/dnsmasq-dns-6d5695c9cc-mrlhc" Mar 11 01:17:44 crc kubenswrapper[4744]: I0311 01:17:44.478871 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/fdec9153-74cd-4e53-8667-f96ed2dad143-ovsdbserver-sb\") pod \"dnsmasq-dns-6d5695c9cc-mrlhc\" (UID: \"fdec9153-74cd-4e53-8667-f96ed2dad143\") " pod="openstack/dnsmasq-dns-6d5695c9cc-mrlhc" Mar 11 01:17:44 crc kubenswrapper[4744]: I0311 01:17:44.478868 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fdec9153-74cd-4e53-8667-f96ed2dad143-config\") pod \"dnsmasq-dns-6d5695c9cc-mrlhc\" (UID: \"fdec9153-74cd-4e53-8667-f96ed2dad143\") " pod="openstack/dnsmasq-dns-6d5695c9cc-mrlhc" Mar 11 01:17:44 crc kubenswrapper[4744]: I0311 01:17:44.479486 4744 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/fdec9153-74cd-4e53-8667-f96ed2dad143-dns-svc\") pod \"dnsmasq-dns-6d5695c9cc-mrlhc\" (UID: \"fdec9153-74cd-4e53-8667-f96ed2dad143\") " pod="openstack/dnsmasq-dns-6d5695c9cc-mrlhc" Mar 11 01:17:44 crc kubenswrapper[4744]: I0311 01:17:44.479666 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/fdec9153-74cd-4e53-8667-f96ed2dad143-ovsdbserver-nb\") pod \"dnsmasq-dns-6d5695c9cc-mrlhc\" (UID: \"fdec9153-74cd-4e53-8667-f96ed2dad143\") " pod="openstack/dnsmasq-dns-6d5695c9cc-mrlhc" Mar 11 01:17:44 crc kubenswrapper[4744]: I0311 01:17:44.480124 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/fdec9153-74cd-4e53-8667-f96ed2dad143-dns-swift-storage-0\") pod \"dnsmasq-dns-6d5695c9cc-mrlhc\" (UID: \"fdec9153-74cd-4e53-8667-f96ed2dad143\") " pod="openstack/dnsmasq-dns-6d5695c9cc-mrlhc" Mar 11 01:17:44 crc kubenswrapper[4744]: I0311 01:17:44.499754 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Mar 11 01:17:44 crc kubenswrapper[4744]: I0311 01:17:44.502160 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ttmfj\" (UniqueName: \"kubernetes.io/projected/fdec9153-74cd-4e53-8667-f96ed2dad143-kube-api-access-ttmfj\") pod \"dnsmasq-dns-6d5695c9cc-mrlhc\" (UID: \"fdec9153-74cd-4e53-8667-f96ed2dad143\") " pod="openstack/dnsmasq-dns-6d5695c9cc-mrlhc" Mar 11 01:17:44 crc kubenswrapper[4744]: I0311 01:17:44.543170 4744 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6d5695c9cc-mrlhc" Mar 11 01:17:44 crc kubenswrapper[4744]: I0311 01:17:44.694021 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-cell-mapping-687vh"] Mar 11 01:17:44 crc kubenswrapper[4744]: I0311 01:17:44.831265 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Mar 11 01:17:44 crc kubenswrapper[4744]: I0311 01:17:44.838141 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Mar 11 01:17:44 crc kubenswrapper[4744]: I0311 01:17:44.884575 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-conductor-db-sync-49nrf"] Mar 11 01:17:44 crc kubenswrapper[4744]: I0311 01:17:44.885775 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-49nrf" Mar 11 01:17:44 crc kubenswrapper[4744]: I0311 01:17:44.890487 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-config-data" Mar 11 01:17:44 crc kubenswrapper[4744]: I0311 01:17:44.890809 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-scripts" Mar 11 01:17:44 crc kubenswrapper[4744]: I0311 01:17:44.895571 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-49nrf"] Mar 11 01:17:45 crc kubenswrapper[4744]: I0311 01:17:45.000016 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8c80e1ba-a26a-4368-902b-a725bc2052d8-config-data\") pod \"nova-cell1-conductor-db-sync-49nrf\" (UID: \"8c80e1ba-a26a-4368-902b-a725bc2052d8\") " pod="openstack/nova-cell1-conductor-db-sync-49nrf" Mar 11 01:17:45 crc kubenswrapper[4744]: I0311 01:17:45.000370 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8c80e1ba-a26a-4368-902b-a725bc2052d8-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-49nrf\" (UID: \"8c80e1ba-a26a-4368-902b-a725bc2052d8\") " pod="openstack/nova-cell1-conductor-db-sync-49nrf" Mar 11 01:17:45 crc kubenswrapper[4744]: I0311 01:17:45.000398 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8c80e1ba-a26a-4368-902b-a725bc2052d8-scripts\") pod \"nova-cell1-conductor-db-sync-49nrf\" (UID: \"8c80e1ba-a26a-4368-902b-a725bc2052d8\") " pod="openstack/nova-cell1-conductor-db-sync-49nrf" Mar 11 01:17:45 crc kubenswrapper[4744]: I0311 01:17:45.000479 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4pr87\" (UniqueName: \"kubernetes.io/projected/8c80e1ba-a26a-4368-902b-a725bc2052d8-kube-api-access-4pr87\") pod \"nova-cell1-conductor-db-sync-49nrf\" (UID: \"8c80e1ba-a26a-4368-902b-a725bc2052d8\") " pod="openstack/nova-cell1-conductor-db-sync-49nrf" Mar 11 01:17:45 crc kubenswrapper[4744]: I0311 01:17:45.014473 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Mar 11 01:17:45 crc kubenswrapper[4744]: I0311 01:17:45.071416 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Mar 11 01:17:45 crc kubenswrapper[4744]: W0311 01:17:45.072172 4744 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poddc84843f_62da_435f_ab7a_2f0b97cb2ec7.slice/crio-8d33f41d8ca74edf782e308424498846720f14e130498581905dcdd9f5fb6125 WatchSource:0}: Error finding container 8d33f41d8ca74edf782e308424498846720f14e130498581905dcdd9f5fb6125: Status 404 returned error can't find the container with id 8d33f41d8ca74edf782e308424498846720f14e130498581905dcdd9f5fb6125 Mar 11 01:17:45 crc kubenswrapper[4744]: I0311 
01:17:45.081963 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6d5695c9cc-mrlhc"] Mar 11 01:17:45 crc kubenswrapper[4744]: I0311 01:17:45.102268 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8c80e1ba-a26a-4368-902b-a725bc2052d8-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-49nrf\" (UID: \"8c80e1ba-a26a-4368-902b-a725bc2052d8\") " pod="openstack/nova-cell1-conductor-db-sync-49nrf" Mar 11 01:17:45 crc kubenswrapper[4744]: I0311 01:17:45.102547 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8c80e1ba-a26a-4368-902b-a725bc2052d8-scripts\") pod \"nova-cell1-conductor-db-sync-49nrf\" (UID: \"8c80e1ba-a26a-4368-902b-a725bc2052d8\") " pod="openstack/nova-cell1-conductor-db-sync-49nrf" Mar 11 01:17:45 crc kubenswrapper[4744]: I0311 01:17:45.102711 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4pr87\" (UniqueName: \"kubernetes.io/projected/8c80e1ba-a26a-4368-902b-a725bc2052d8-kube-api-access-4pr87\") pod \"nova-cell1-conductor-db-sync-49nrf\" (UID: \"8c80e1ba-a26a-4368-902b-a725bc2052d8\") " pod="openstack/nova-cell1-conductor-db-sync-49nrf" Mar 11 01:17:45 crc kubenswrapper[4744]: I0311 01:17:45.102986 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8c80e1ba-a26a-4368-902b-a725bc2052d8-config-data\") pod \"nova-cell1-conductor-db-sync-49nrf\" (UID: \"8c80e1ba-a26a-4368-902b-a725bc2052d8\") " pod="openstack/nova-cell1-conductor-db-sync-49nrf" Mar 11 01:17:45 crc kubenswrapper[4744]: I0311 01:17:45.105050 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8c80e1ba-a26a-4368-902b-a725bc2052d8-combined-ca-bundle\") pod 
\"nova-cell1-conductor-db-sync-49nrf\" (UID: \"8c80e1ba-a26a-4368-902b-a725bc2052d8\") " pod="openstack/nova-cell1-conductor-db-sync-49nrf" Mar 11 01:17:45 crc kubenswrapper[4744]: I0311 01:17:45.106569 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8c80e1ba-a26a-4368-902b-a725bc2052d8-config-data\") pod \"nova-cell1-conductor-db-sync-49nrf\" (UID: \"8c80e1ba-a26a-4368-902b-a725bc2052d8\") " pod="openstack/nova-cell1-conductor-db-sync-49nrf" Mar 11 01:17:45 crc kubenswrapper[4744]: I0311 01:17:45.109470 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8c80e1ba-a26a-4368-902b-a725bc2052d8-scripts\") pod \"nova-cell1-conductor-db-sync-49nrf\" (UID: \"8c80e1ba-a26a-4368-902b-a725bc2052d8\") " pod="openstack/nova-cell1-conductor-db-sync-49nrf" Mar 11 01:17:45 crc kubenswrapper[4744]: I0311 01:17:45.120312 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4pr87\" (UniqueName: \"kubernetes.io/projected/8c80e1ba-a26a-4368-902b-a725bc2052d8-kube-api-access-4pr87\") pod \"nova-cell1-conductor-db-sync-49nrf\" (UID: \"8c80e1ba-a26a-4368-902b-a725bc2052d8\") " pod="openstack/nova-cell1-conductor-db-sync-49nrf" Mar 11 01:17:45 crc kubenswrapper[4744]: I0311 01:17:45.258600 4744 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-49nrf" Mar 11 01:17:45 crc kubenswrapper[4744]: I0311 01:17:45.666360 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-49nrf"] Mar 11 01:17:45 crc kubenswrapper[4744]: W0311 01:17:45.679027 4744 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8c80e1ba_a26a_4368_902b_a725bc2052d8.slice/crio-afcb941d5bf7ef06289e1c0d7fa0fb8669145513b4300707aaa40dfc915bc66a WatchSource:0}: Error finding container afcb941d5bf7ef06289e1c0d7fa0fb8669145513b4300707aaa40dfc915bc66a: Status 404 returned error can't find the container with id afcb941d5bf7ef06289e1c0d7fa0fb8669145513b4300707aaa40dfc915bc66a Mar 11 01:17:45 crc kubenswrapper[4744]: I0311 01:17:45.690886 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"58321737-f97a-47e0-9d0d-07f0c0da801c","Type":"ContainerStarted","Data":"c795bb478b11c6dacec86e35e98682a85a679a58624dea4346a1ed2edd5be5e8"} Mar 11 01:17:45 crc kubenswrapper[4744]: I0311 01:17:45.692430 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"b3533d92-44d2-4d39-98af-b144b8a57d24","Type":"ContainerStarted","Data":"d13a0e7a6a5c088f3c4e47b3dec6ecc876f03e4032b9f4b6334c3866f56f6e38"} Mar 11 01:17:45 crc kubenswrapper[4744]: I0311 01:17:45.696995 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"8bba38e8-f452-48af-a4ad-faaff2a073e1","Type":"ContainerStarted","Data":"9e293e674f6a0c588b1add9d93c93217309b04c3a1b1533127ce328075fdbf51"} Mar 11 01:17:45 crc kubenswrapper[4744]: I0311 01:17:45.698844 4744 generic.go:334] "Generic (PLEG): container finished" podID="fdec9153-74cd-4e53-8667-f96ed2dad143" containerID="faebff480caa3f73170f473f21cac0b165adaaae53a55bde98bc11afd3fc6201" exitCode=0 Mar 11 01:17:45 crc kubenswrapper[4744]: I0311 
01:17:45.698882 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6d5695c9cc-mrlhc" event={"ID":"fdec9153-74cd-4e53-8667-f96ed2dad143","Type":"ContainerDied","Data":"faebff480caa3f73170f473f21cac0b165adaaae53a55bde98bc11afd3fc6201"} Mar 11 01:17:45 crc kubenswrapper[4744]: I0311 01:17:45.698897 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6d5695c9cc-mrlhc" event={"ID":"fdec9153-74cd-4e53-8667-f96ed2dad143","Type":"ContainerStarted","Data":"ab6b5a85661b8cd2622da648690b45d22058217032f6cb51b351ed6142c1b8a7"} Mar 11 01:17:45 crc kubenswrapper[4744]: I0311 01:17:45.700560 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"dc84843f-62da-435f-ab7a-2f0b97cb2ec7","Type":"ContainerStarted","Data":"8d33f41d8ca74edf782e308424498846720f14e130498581905dcdd9f5fb6125"} Mar 11 01:17:45 crc kubenswrapper[4744]: I0311 01:17:45.702326 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-687vh" event={"ID":"b2a44cf2-a3b1-4b65-aba3-b0a5d939b303","Type":"ContainerStarted","Data":"4adb24658e3a9f0e113d48ab8079e1c019d2d01a9440a17a80fbe8fa0e414acf"} Mar 11 01:17:45 crc kubenswrapper[4744]: I0311 01:17:45.702348 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-687vh" event={"ID":"b2a44cf2-a3b1-4b65-aba3-b0a5d939b303","Type":"ContainerStarted","Data":"2d0b969d4f158855beb380402618e90a04cc17e71ec93df572c7c861902ef4bc"} Mar 11 01:17:45 crc kubenswrapper[4744]: I0311 01:17:45.736772 4744 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-cell-mapping-687vh" podStartSLOduration=2.736755536 podStartE2EDuration="2.736755536s" podCreationTimestamp="2026-03-11 01:17:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-11 01:17:45.734390813 +0000 UTC 
m=+1422.538608428" watchObservedRunningTime="2026-03-11 01:17:45.736755536 +0000 UTC m=+1422.540973141" Mar 11 01:17:46 crc kubenswrapper[4744]: I0311 01:17:46.716179 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6d5695c9cc-mrlhc" event={"ID":"fdec9153-74cd-4e53-8667-f96ed2dad143","Type":"ContainerStarted","Data":"0caaea6d05fbcba96a10135749cd92f5a5e02dfbefa1e25d0c526e1a6a0e3626"} Mar 11 01:17:46 crc kubenswrapper[4744]: I0311 01:17:46.716553 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-6d5695c9cc-mrlhc" Mar 11 01:17:46 crc kubenswrapper[4744]: I0311 01:17:46.718587 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-49nrf" event={"ID":"8c80e1ba-a26a-4368-902b-a725bc2052d8","Type":"ContainerStarted","Data":"b3c2039b5d3c821ce56b152f8a4817de58be05051c1b9159d5f0d6fc2571c7eb"} Mar 11 01:17:46 crc kubenswrapper[4744]: I0311 01:17:46.718619 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-49nrf" event={"ID":"8c80e1ba-a26a-4368-902b-a725bc2052d8","Type":"ContainerStarted","Data":"afcb941d5bf7ef06289e1c0d7fa0fb8669145513b4300707aaa40dfc915bc66a"} Mar 11 01:17:46 crc kubenswrapper[4744]: I0311 01:17:46.743466 4744 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-6d5695c9cc-mrlhc" podStartSLOduration=2.743447819 podStartE2EDuration="2.743447819s" podCreationTimestamp="2026-03-11 01:17:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-11 01:17:46.741959643 +0000 UTC m=+1423.546177258" watchObservedRunningTime="2026-03-11 01:17:46.743447819 +0000 UTC m=+1423.547665424" Mar 11 01:17:47 crc kubenswrapper[4744]: I0311 01:17:47.875930 4744 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-conductor-db-sync-49nrf" 
podStartSLOduration=3.875916651 podStartE2EDuration="3.875916651s" podCreationTimestamp="2026-03-11 01:17:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-11 01:17:46.77548475 +0000 UTC m=+1423.579702355" watchObservedRunningTime="2026-03-11 01:17:47.875916651 +0000 UTC m=+1424.680134256" Mar 11 01:17:47 crc kubenswrapper[4744]: I0311 01:17:47.883667 4744 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Mar 11 01:17:47 crc kubenswrapper[4744]: I0311 01:17:47.933254 4744 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Mar 11 01:17:48 crc kubenswrapper[4744]: I0311 01:17:48.756904 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"b3533d92-44d2-4d39-98af-b144b8a57d24","Type":"ContainerStarted","Data":"25c1da7394be4605cbf94d9355b0b895fb9fdc9afff53546f8e0d667b5f8079e"} Mar 11 01:17:48 crc kubenswrapper[4744]: I0311 01:17:48.757248 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"b3533d92-44d2-4d39-98af-b144b8a57d24","Type":"ContainerStarted","Data":"dff26498a77553141f4341df2dfeb49a40cf70ec9c2f38dcec47d27d6f5caeb8"} Mar 11 01:17:48 crc kubenswrapper[4744]: I0311 01:17:48.757087 4744 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="b3533d92-44d2-4d39-98af-b144b8a57d24" containerName="nova-metadata-metadata" containerID="cri-o://25c1da7394be4605cbf94d9355b0b895fb9fdc9afff53546f8e0d667b5f8079e" gracePeriod=30 Mar 11 01:17:48 crc kubenswrapper[4744]: I0311 01:17:48.757008 4744 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="b3533d92-44d2-4d39-98af-b144b8a57d24" containerName="nova-metadata-log" containerID="cri-o://dff26498a77553141f4341df2dfeb49a40cf70ec9c2f38dcec47d27d6f5caeb8" 
gracePeriod=30 Mar 11 01:17:48 crc kubenswrapper[4744]: I0311 01:17:48.761149 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"8bba38e8-f452-48af-a4ad-faaff2a073e1","Type":"ContainerStarted","Data":"f671b28b3ee9b4cdab7c31f8abce45e38f5f93c13ddae0c389624e0ccaa2c799"} Mar 11 01:17:48 crc kubenswrapper[4744]: I0311 01:17:48.763555 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"dc84843f-62da-435f-ab7a-2f0b97cb2ec7","Type":"ContainerStarted","Data":"bcc387b1f7c3301e27c6c065a87239db00c147780619df2a1d2b02142eb7d24a"} Mar 11 01:17:48 crc kubenswrapper[4744]: I0311 01:17:48.763669 4744 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-cell1-novncproxy-0" podUID="dc84843f-62da-435f-ab7a-2f0b97cb2ec7" containerName="nova-cell1-novncproxy-novncproxy" containerID="cri-o://bcc387b1f7c3301e27c6c065a87239db00c147780619df2a1d2b02142eb7d24a" gracePeriod=30 Mar 11 01:17:48 crc kubenswrapper[4744]: I0311 01:17:48.769887 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"58321737-f97a-47e0-9d0d-07f0c0da801c","Type":"ContainerStarted","Data":"06f5f6362a043cce313cb040a3b791d7e0d5abe4b264e0fa53e9884c7bbfbade"} Mar 11 01:17:48 crc kubenswrapper[4744]: I0311 01:17:48.769937 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"58321737-f97a-47e0-9d0d-07f0c0da801c","Type":"ContainerStarted","Data":"41ee13882dd112c3caf87f564785d3ec291e97d6c3853b8dc41f5aaeb3bed496"} Mar 11 01:17:48 crc kubenswrapper[4744]: I0311 01:17:48.792246 4744 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.655226995 podStartE2EDuration="5.792214138s" podCreationTimestamp="2026-03-11 01:17:43 +0000 UTC" firstStartedPulling="2026-03-11 01:17:45.024756594 +0000 UTC m=+1421.828974199" lastFinishedPulling="2026-03-11 
01:17:48.161743737 +0000 UTC m=+1424.965961342" observedRunningTime="2026-03-11 01:17:48.783003034 +0000 UTC m=+1425.587220629" watchObservedRunningTime="2026-03-11 01:17:48.792214138 +0000 UTC m=+1425.596431733" Mar 11 01:17:48 crc kubenswrapper[4744]: I0311 01:17:48.802707 4744 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=2.476673555 podStartE2EDuration="5.802689883s" podCreationTimestamp="2026-03-11 01:17:43 +0000 UTC" firstStartedPulling="2026-03-11 01:17:44.838576198 +0000 UTC m=+1421.642793803" lastFinishedPulling="2026-03-11 01:17:48.164592516 +0000 UTC m=+1424.968810131" observedRunningTime="2026-03-11 01:17:48.800711991 +0000 UTC m=+1425.604929596" watchObservedRunningTime="2026-03-11 01:17:48.802689883 +0000 UTC m=+1425.606907488" Mar 11 01:17:48 crc kubenswrapper[4744]: I0311 01:17:48.827555 4744 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-novncproxy-0" podStartSLOduration=2.745949461 podStartE2EDuration="5.82750582s" podCreationTimestamp="2026-03-11 01:17:43 +0000 UTC" firstStartedPulling="2026-03-11 01:17:45.078751003 +0000 UTC m=+1421.882968608" lastFinishedPulling="2026-03-11 01:17:48.160307342 +0000 UTC m=+1424.964524967" observedRunningTime="2026-03-11 01:17:48.820364299 +0000 UTC m=+1425.624581904" watchObservedRunningTime="2026-03-11 01:17:48.82750582 +0000 UTC m=+1425.631723425" Mar 11 01:17:48 crc kubenswrapper[4744]: I0311 01:17:48.845253 4744 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.530079235 podStartE2EDuration="5.845236547s" podCreationTimestamp="2026-03-11 01:17:43 +0000 UTC" firstStartedPulling="2026-03-11 01:17:44.845581224 +0000 UTC m=+1421.649798829" lastFinishedPulling="2026-03-11 01:17:48.160738536 +0000 UTC m=+1424.964956141" observedRunningTime="2026-03-11 01:17:48.844104423 +0000 UTC m=+1425.648322028" 
watchObservedRunningTime="2026-03-11 01:17:48.845236547 +0000 UTC m=+1425.649454152" Mar 11 01:17:49 crc kubenswrapper[4744]: I0311 01:17:49.220732 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Mar 11 01:17:49 crc kubenswrapper[4744]: I0311 01:17:49.369269 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Mar 11 01:17:49 crc kubenswrapper[4744]: I0311 01:17:49.369332 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Mar 11 01:17:49 crc kubenswrapper[4744]: I0311 01:17:49.500478 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-novncproxy-0" Mar 11 01:17:49 crc kubenswrapper[4744]: I0311 01:17:49.785246 4744 generic.go:334] "Generic (PLEG): container finished" podID="b3533d92-44d2-4d39-98af-b144b8a57d24" containerID="dff26498a77553141f4341df2dfeb49a40cf70ec9c2f38dcec47d27d6f5caeb8" exitCode=143 Mar 11 01:17:49 crc kubenswrapper[4744]: I0311 01:17:49.786454 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"b3533d92-44d2-4d39-98af-b144b8a57d24","Type":"ContainerDied","Data":"dff26498a77553141f4341df2dfeb49a40cf70ec9c2f38dcec47d27d6f5caeb8"} Mar 11 01:17:51 crc kubenswrapper[4744]: I0311 01:17:51.807936 4744 generic.go:334] "Generic (PLEG): container finished" podID="b2a44cf2-a3b1-4b65-aba3-b0a5d939b303" containerID="4adb24658e3a9f0e113d48ab8079e1c019d2d01a9440a17a80fbe8fa0e414acf" exitCode=0 Mar 11 01:17:51 crc kubenswrapper[4744]: I0311 01:17:51.808056 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-687vh" event={"ID":"b2a44cf2-a3b1-4b65-aba3-b0a5d939b303","Type":"ContainerDied","Data":"4adb24658e3a9f0e113d48ab8079e1c019d2d01a9440a17a80fbe8fa0e414acf"} Mar 11 01:17:52 crc kubenswrapper[4744]: I0311 01:17:52.822895 4744 generic.go:334] "Generic (PLEG): container finished" 
podID="8c80e1ba-a26a-4368-902b-a725bc2052d8" containerID="b3c2039b5d3c821ce56b152f8a4817de58be05051c1b9159d5f0d6fc2571c7eb" exitCode=0 Mar 11 01:17:52 crc kubenswrapper[4744]: I0311 01:17:52.823031 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-49nrf" event={"ID":"8c80e1ba-a26a-4368-902b-a725bc2052d8","Type":"ContainerDied","Data":"b3c2039b5d3c821ce56b152f8a4817de58be05051c1b9159d5f0d6fc2571c7eb"} Mar 11 01:17:53 crc kubenswrapper[4744]: I0311 01:17:53.339890 4744 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-687vh" Mar 11 01:17:53 crc kubenswrapper[4744]: I0311 01:17:53.413244 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b2a44cf2-a3b1-4b65-aba3-b0a5d939b303-config-data\") pod \"b2a44cf2-a3b1-4b65-aba3-b0a5d939b303\" (UID: \"b2a44cf2-a3b1-4b65-aba3-b0a5d939b303\") " Mar 11 01:17:53 crc kubenswrapper[4744]: I0311 01:17:53.413398 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b2a44cf2-a3b1-4b65-aba3-b0a5d939b303-scripts\") pod \"b2a44cf2-a3b1-4b65-aba3-b0a5d939b303\" (UID: \"b2a44cf2-a3b1-4b65-aba3-b0a5d939b303\") " Mar 11 01:17:53 crc kubenswrapper[4744]: I0311 01:17:53.413441 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vvjkg\" (UniqueName: \"kubernetes.io/projected/b2a44cf2-a3b1-4b65-aba3-b0a5d939b303-kube-api-access-vvjkg\") pod \"b2a44cf2-a3b1-4b65-aba3-b0a5d939b303\" (UID: \"b2a44cf2-a3b1-4b65-aba3-b0a5d939b303\") " Mar 11 01:17:53 crc kubenswrapper[4744]: I0311 01:17:53.413476 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b2a44cf2-a3b1-4b65-aba3-b0a5d939b303-combined-ca-bundle\") pod 
\"b2a44cf2-a3b1-4b65-aba3-b0a5d939b303\" (UID: \"b2a44cf2-a3b1-4b65-aba3-b0a5d939b303\") " Mar 11 01:17:53 crc kubenswrapper[4744]: I0311 01:17:53.422209 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b2a44cf2-a3b1-4b65-aba3-b0a5d939b303-kube-api-access-vvjkg" (OuterVolumeSpecName: "kube-api-access-vvjkg") pod "b2a44cf2-a3b1-4b65-aba3-b0a5d939b303" (UID: "b2a44cf2-a3b1-4b65-aba3-b0a5d939b303"). InnerVolumeSpecName "kube-api-access-vvjkg". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 01:17:53 crc kubenswrapper[4744]: I0311 01:17:53.422705 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b2a44cf2-a3b1-4b65-aba3-b0a5d939b303-scripts" (OuterVolumeSpecName: "scripts") pod "b2a44cf2-a3b1-4b65-aba3-b0a5d939b303" (UID: "b2a44cf2-a3b1-4b65-aba3-b0a5d939b303"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 01:17:53 crc kubenswrapper[4744]: I0311 01:17:53.453082 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b2a44cf2-a3b1-4b65-aba3-b0a5d939b303-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "b2a44cf2-a3b1-4b65-aba3-b0a5d939b303" (UID: "b2a44cf2-a3b1-4b65-aba3-b0a5d939b303"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 01:17:53 crc kubenswrapper[4744]: I0311 01:17:53.464976 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b2a44cf2-a3b1-4b65-aba3-b0a5d939b303-config-data" (OuterVolumeSpecName: "config-data") pod "b2a44cf2-a3b1-4b65-aba3-b0a5d939b303" (UID: "b2a44cf2-a3b1-4b65-aba3-b0a5d939b303"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 01:17:53 crc kubenswrapper[4744]: I0311 01:17:53.516742 4744 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b2a44cf2-a3b1-4b65-aba3-b0a5d939b303-scripts\") on node \"crc\" DevicePath \"\"" Mar 11 01:17:53 crc kubenswrapper[4744]: I0311 01:17:53.516792 4744 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vvjkg\" (UniqueName: \"kubernetes.io/projected/b2a44cf2-a3b1-4b65-aba3-b0a5d939b303-kube-api-access-vvjkg\") on node \"crc\" DevicePath \"\"" Mar 11 01:17:53 crc kubenswrapper[4744]: I0311 01:17:53.516813 4744 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b2a44cf2-a3b1-4b65-aba3-b0a5d939b303-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 11 01:17:53 crc kubenswrapper[4744]: I0311 01:17:53.516830 4744 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b2a44cf2-a3b1-4b65-aba3-b0a5d939b303-config-data\") on node \"crc\" DevicePath \"\"" Mar 11 01:17:53 crc kubenswrapper[4744]: I0311 01:17:53.845050 4744 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-cell-mapping-687vh" Mar 11 01:17:53 crc kubenswrapper[4744]: I0311 01:17:53.845146 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-687vh" event={"ID":"b2a44cf2-a3b1-4b65-aba3-b0a5d939b303","Type":"ContainerDied","Data":"2d0b969d4f158855beb380402618e90a04cc17e71ec93df572c7c861902ef4bc"} Mar 11 01:17:53 crc kubenswrapper[4744]: I0311 01:17:53.845196 4744 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2d0b969d4f158855beb380402618e90a04cc17e71ec93df572c7c861902ef4bc" Mar 11 01:17:54 crc kubenswrapper[4744]: I0311 01:17:54.142875 4744 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Mar 11 01:17:54 crc kubenswrapper[4744]: I0311 01:17:54.143264 4744 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="58321737-f97a-47e0-9d0d-07f0c0da801c" containerName="nova-api-log" containerID="cri-o://41ee13882dd112c3caf87f564785d3ec291e97d6c3853b8dc41f5aaeb3bed496" gracePeriod=30 Mar 11 01:17:54 crc kubenswrapper[4744]: I0311 01:17:54.143789 4744 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="58321737-f97a-47e0-9d0d-07f0c0da801c" containerName="nova-api-api" containerID="cri-o://06f5f6362a043cce313cb040a3b791d7e0d5abe4b264e0fa53e9884c7bbfbade" gracePeriod=30 Mar 11 01:17:54 crc kubenswrapper[4744]: I0311 01:17:54.153014 4744 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Mar 11 01:17:54 crc kubenswrapper[4744]: I0311 01:17:54.153764 4744 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-scheduler-0" podUID="8bba38e8-f452-48af-a4ad-faaff2a073e1" containerName="nova-scheduler-scheduler" containerID="cri-o://f671b28b3ee9b4cdab7c31f8abce45e38f5f93c13ddae0c389624e0ccaa2c799" gracePeriod=30 Mar 11 01:17:54 crc kubenswrapper[4744]: I0311 
01:17:54.309557 4744 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-49nrf" Mar 11 01:17:54 crc kubenswrapper[4744]: I0311 01:17:54.334261 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8c80e1ba-a26a-4368-902b-a725bc2052d8-config-data\") pod \"8c80e1ba-a26a-4368-902b-a725bc2052d8\" (UID: \"8c80e1ba-a26a-4368-902b-a725bc2052d8\") " Mar 11 01:17:54 crc kubenswrapper[4744]: I0311 01:17:54.334322 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8c80e1ba-a26a-4368-902b-a725bc2052d8-combined-ca-bundle\") pod \"8c80e1ba-a26a-4368-902b-a725bc2052d8\" (UID: \"8c80e1ba-a26a-4368-902b-a725bc2052d8\") " Mar 11 01:17:54 crc kubenswrapper[4744]: I0311 01:17:54.334379 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4pr87\" (UniqueName: \"kubernetes.io/projected/8c80e1ba-a26a-4368-902b-a725bc2052d8-kube-api-access-4pr87\") pod \"8c80e1ba-a26a-4368-902b-a725bc2052d8\" (UID: \"8c80e1ba-a26a-4368-902b-a725bc2052d8\") " Mar 11 01:17:54 crc kubenswrapper[4744]: I0311 01:17:54.334405 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8c80e1ba-a26a-4368-902b-a725bc2052d8-scripts\") pod \"8c80e1ba-a26a-4368-902b-a725bc2052d8\" (UID: \"8c80e1ba-a26a-4368-902b-a725bc2052d8\") " Mar 11 01:17:54 crc kubenswrapper[4744]: I0311 01:17:54.340753 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8c80e1ba-a26a-4368-902b-a725bc2052d8-scripts" (OuterVolumeSpecName: "scripts") pod "8c80e1ba-a26a-4368-902b-a725bc2052d8" (UID: "8c80e1ba-a26a-4368-902b-a725bc2052d8"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 01:17:54 crc kubenswrapper[4744]: I0311 01:17:54.347865 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8c80e1ba-a26a-4368-902b-a725bc2052d8-kube-api-access-4pr87" (OuterVolumeSpecName: "kube-api-access-4pr87") pod "8c80e1ba-a26a-4368-902b-a725bc2052d8" (UID: "8c80e1ba-a26a-4368-902b-a725bc2052d8"). InnerVolumeSpecName "kube-api-access-4pr87". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 01:17:54 crc kubenswrapper[4744]: I0311 01:17:54.368738 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8c80e1ba-a26a-4368-902b-a725bc2052d8-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "8c80e1ba-a26a-4368-902b-a725bc2052d8" (UID: "8c80e1ba-a26a-4368-902b-a725bc2052d8"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 01:17:54 crc kubenswrapper[4744]: I0311 01:17:54.394255 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8c80e1ba-a26a-4368-902b-a725bc2052d8-config-data" (OuterVolumeSpecName: "config-data") pod "8c80e1ba-a26a-4368-902b-a725bc2052d8" (UID: "8c80e1ba-a26a-4368-902b-a725bc2052d8"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 01:17:54 crc kubenswrapper[4744]: I0311 01:17:54.436376 4744 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4pr87\" (UniqueName: \"kubernetes.io/projected/8c80e1ba-a26a-4368-902b-a725bc2052d8-kube-api-access-4pr87\") on node \"crc\" DevicePath \"\"" Mar 11 01:17:54 crc kubenswrapper[4744]: I0311 01:17:54.436414 4744 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8c80e1ba-a26a-4368-902b-a725bc2052d8-scripts\") on node \"crc\" DevicePath \"\"" Mar 11 01:17:54 crc kubenswrapper[4744]: I0311 01:17:54.436424 4744 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8c80e1ba-a26a-4368-902b-a725bc2052d8-config-data\") on node \"crc\" DevicePath \"\"" Mar 11 01:17:54 crc kubenswrapper[4744]: I0311 01:17:54.436434 4744 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8c80e1ba-a26a-4368-902b-a725bc2052d8-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 11 01:17:54 crc kubenswrapper[4744]: I0311 01:17:54.545461 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-6d5695c9cc-mrlhc" Mar 11 01:17:54 crc kubenswrapper[4744]: I0311 01:17:54.618644 4744 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-86dc97b969-rxvv5"] Mar 11 01:17:54 crc kubenswrapper[4744]: I0311 01:17:54.619102 4744 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-86dc97b969-rxvv5" podUID="f0da3325-8792-44bf-8c25-ea1648998ce0" containerName="dnsmasq-dns" containerID="cri-o://61f19b7521fd2006eb5a4290593fd4b6486d40a70ee7549d87d0824f9c940f4b" gracePeriod=10 Mar 11 01:17:54 crc kubenswrapper[4744]: I0311 01:17:54.742077 4744 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Mar 11 01:17:54 crc kubenswrapper[4744]: I0311 01:17:54.843418 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/58321737-f97a-47e0-9d0d-07f0c0da801c-config-data\") pod \"58321737-f97a-47e0-9d0d-07f0c0da801c\" (UID: \"58321737-f97a-47e0-9d0d-07f0c0da801c\") " Mar 11 01:17:54 crc kubenswrapper[4744]: I0311 01:17:54.843499 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/58321737-f97a-47e0-9d0d-07f0c0da801c-logs\") pod \"58321737-f97a-47e0-9d0d-07f0c0da801c\" (UID: \"58321737-f97a-47e0-9d0d-07f0c0da801c\") " Mar 11 01:17:54 crc kubenswrapper[4744]: I0311 01:17:54.843574 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/58321737-f97a-47e0-9d0d-07f0c0da801c-combined-ca-bundle\") pod \"58321737-f97a-47e0-9d0d-07f0c0da801c\" (UID: \"58321737-f97a-47e0-9d0d-07f0c0da801c\") " Mar 11 01:17:54 crc kubenswrapper[4744]: I0311 01:17:54.843595 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-45dnm\" (UniqueName: \"kubernetes.io/projected/58321737-f97a-47e0-9d0d-07f0c0da801c-kube-api-access-45dnm\") pod \"58321737-f97a-47e0-9d0d-07f0c0da801c\" (UID: \"58321737-f97a-47e0-9d0d-07f0c0da801c\") " Mar 11 01:17:54 crc kubenswrapper[4744]: I0311 01:17:54.843764 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/58321737-f97a-47e0-9d0d-07f0c0da801c-logs" (OuterVolumeSpecName: "logs") pod "58321737-f97a-47e0-9d0d-07f0c0da801c" (UID: "58321737-f97a-47e0-9d0d-07f0c0da801c"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 11 01:17:54 crc kubenswrapper[4744]: I0311 01:17:54.844239 4744 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/58321737-f97a-47e0-9d0d-07f0c0da801c-logs\") on node \"crc\" DevicePath \"\"" Mar 11 01:17:54 crc kubenswrapper[4744]: I0311 01:17:54.850032 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/58321737-f97a-47e0-9d0d-07f0c0da801c-kube-api-access-45dnm" (OuterVolumeSpecName: "kube-api-access-45dnm") pod "58321737-f97a-47e0-9d0d-07f0c0da801c" (UID: "58321737-f97a-47e0-9d0d-07f0c0da801c"). InnerVolumeSpecName "kube-api-access-45dnm". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 01:17:54 crc kubenswrapper[4744]: I0311 01:17:54.853917 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-49nrf" event={"ID":"8c80e1ba-a26a-4368-902b-a725bc2052d8","Type":"ContainerDied","Data":"afcb941d5bf7ef06289e1c0d7fa0fb8669145513b4300707aaa40dfc915bc66a"} Mar 11 01:17:54 crc kubenswrapper[4744]: I0311 01:17:54.853952 4744 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="afcb941d5bf7ef06289e1c0d7fa0fb8669145513b4300707aaa40dfc915bc66a" Mar 11 01:17:54 crc kubenswrapper[4744]: I0311 01:17:54.854009 4744 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-49nrf" Mar 11 01:17:54 crc kubenswrapper[4744]: I0311 01:17:54.857332 4744 generic.go:334] "Generic (PLEG): container finished" podID="f0da3325-8792-44bf-8c25-ea1648998ce0" containerID="61f19b7521fd2006eb5a4290593fd4b6486d40a70ee7549d87d0824f9c940f4b" exitCode=0 Mar 11 01:17:54 crc kubenswrapper[4744]: I0311 01:17:54.857393 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-86dc97b969-rxvv5" event={"ID":"f0da3325-8792-44bf-8c25-ea1648998ce0","Type":"ContainerDied","Data":"61f19b7521fd2006eb5a4290593fd4b6486d40a70ee7549d87d0824f9c940f4b"} Mar 11 01:17:54 crc kubenswrapper[4744]: I0311 01:17:54.860046 4744 generic.go:334] "Generic (PLEG): container finished" podID="58321737-f97a-47e0-9d0d-07f0c0da801c" containerID="06f5f6362a043cce313cb040a3b791d7e0d5abe4b264e0fa53e9884c7bbfbade" exitCode=0 Mar 11 01:17:54 crc kubenswrapper[4744]: I0311 01:17:54.860087 4744 generic.go:334] "Generic (PLEG): container finished" podID="58321737-f97a-47e0-9d0d-07f0c0da801c" containerID="41ee13882dd112c3caf87f564785d3ec291e97d6c3853b8dc41f5aaeb3bed496" exitCode=143 Mar 11 01:17:54 crc kubenswrapper[4744]: I0311 01:17:54.860101 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"58321737-f97a-47e0-9d0d-07f0c0da801c","Type":"ContainerDied","Data":"06f5f6362a043cce313cb040a3b791d7e0d5abe4b264e0fa53e9884c7bbfbade"} Mar 11 01:17:54 crc kubenswrapper[4744]: I0311 01:17:54.860162 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"58321737-f97a-47e0-9d0d-07f0c0da801c","Type":"ContainerDied","Data":"41ee13882dd112c3caf87f564785d3ec291e97d6c3853b8dc41f5aaeb3bed496"} Mar 11 01:17:54 crc kubenswrapper[4744]: I0311 01:17:54.860176 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" 
event={"ID":"58321737-f97a-47e0-9d0d-07f0c0da801c","Type":"ContainerDied","Data":"c795bb478b11c6dacec86e35e98682a85a679a58624dea4346a1ed2edd5be5e8"} Mar 11 01:17:54 crc kubenswrapper[4744]: I0311 01:17:54.860191 4744 scope.go:117] "RemoveContainer" containerID="06f5f6362a043cce313cb040a3b791d7e0d5abe4b264e0fa53e9884c7bbfbade" Mar 11 01:17:54 crc kubenswrapper[4744]: I0311 01:17:54.860325 4744 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Mar 11 01:17:54 crc kubenswrapper[4744]: I0311 01:17:54.871483 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/58321737-f97a-47e0-9d0d-07f0c0da801c-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "58321737-f97a-47e0-9d0d-07f0c0da801c" (UID: "58321737-f97a-47e0-9d0d-07f0c0da801c"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 01:17:54 crc kubenswrapper[4744]: I0311 01:17:54.882717 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/58321737-f97a-47e0-9d0d-07f0c0da801c-config-data" (OuterVolumeSpecName: "config-data") pod "58321737-f97a-47e0-9d0d-07f0c0da801c" (UID: "58321737-f97a-47e0-9d0d-07f0c0da801c"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 11 01:17:54 crc kubenswrapper[4744]: I0311 01:17:54.893114 4744 scope.go:117] "RemoveContainer" containerID="41ee13882dd112c3caf87f564785d3ec291e97d6c3853b8dc41f5aaeb3bed496"
Mar 11 01:17:54 crc kubenswrapper[4744]: I0311 01:17:54.939873 4744 scope.go:117] "RemoveContainer" containerID="06f5f6362a043cce313cb040a3b791d7e0d5abe4b264e0fa53e9884c7bbfbade"
Mar 11 01:17:54 crc kubenswrapper[4744]: E0311 01:17:54.940300 4744 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"06f5f6362a043cce313cb040a3b791d7e0d5abe4b264e0fa53e9884c7bbfbade\": container with ID starting with 06f5f6362a043cce313cb040a3b791d7e0d5abe4b264e0fa53e9884c7bbfbade not found: ID does not exist" containerID="06f5f6362a043cce313cb040a3b791d7e0d5abe4b264e0fa53e9884c7bbfbade"
Mar 11 01:17:54 crc kubenswrapper[4744]: I0311 01:17:54.940427 4744 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"06f5f6362a043cce313cb040a3b791d7e0d5abe4b264e0fa53e9884c7bbfbade"} err="failed to get container status \"06f5f6362a043cce313cb040a3b791d7e0d5abe4b264e0fa53e9884c7bbfbade\": rpc error: code = NotFound desc = could not find container \"06f5f6362a043cce313cb040a3b791d7e0d5abe4b264e0fa53e9884c7bbfbade\": container with ID starting with 06f5f6362a043cce313cb040a3b791d7e0d5abe4b264e0fa53e9884c7bbfbade not found: ID does not exist"
Mar 11 01:17:54 crc kubenswrapper[4744]: I0311 01:17:54.940453 4744 scope.go:117] "RemoveContainer" containerID="41ee13882dd112c3caf87f564785d3ec291e97d6c3853b8dc41f5aaeb3bed496"
Mar 11 01:17:54 crc kubenswrapper[4744]: E0311 01:17:54.940833 4744 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"41ee13882dd112c3caf87f564785d3ec291e97d6c3853b8dc41f5aaeb3bed496\": container with ID starting with 41ee13882dd112c3caf87f564785d3ec291e97d6c3853b8dc41f5aaeb3bed496 not found: ID does not exist" containerID="41ee13882dd112c3caf87f564785d3ec291e97d6c3853b8dc41f5aaeb3bed496"
Mar 11 01:17:54 crc kubenswrapper[4744]: I0311 01:17:54.940893 4744 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"41ee13882dd112c3caf87f564785d3ec291e97d6c3853b8dc41f5aaeb3bed496"} err="failed to get container status \"41ee13882dd112c3caf87f564785d3ec291e97d6c3853b8dc41f5aaeb3bed496\": rpc error: code = NotFound desc = could not find container \"41ee13882dd112c3caf87f564785d3ec291e97d6c3853b8dc41f5aaeb3bed496\": container with ID starting with 41ee13882dd112c3caf87f564785d3ec291e97d6c3853b8dc41f5aaeb3bed496 not found: ID does not exist"
Mar 11 01:17:54 crc kubenswrapper[4744]: I0311 01:17:54.940920 4744 scope.go:117] "RemoveContainer" containerID="06f5f6362a043cce313cb040a3b791d7e0d5abe4b264e0fa53e9884c7bbfbade"
Mar 11 01:17:54 crc kubenswrapper[4744]: I0311 01:17:54.941190 4744 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"06f5f6362a043cce313cb040a3b791d7e0d5abe4b264e0fa53e9884c7bbfbade"} err="failed to get container status \"06f5f6362a043cce313cb040a3b791d7e0d5abe4b264e0fa53e9884c7bbfbade\": rpc error: code = NotFound desc = could not find container \"06f5f6362a043cce313cb040a3b791d7e0d5abe4b264e0fa53e9884c7bbfbade\": container with ID starting with 06f5f6362a043cce313cb040a3b791d7e0d5abe4b264e0fa53e9884c7bbfbade not found: ID does not exist"
Mar 11 01:17:54 crc kubenswrapper[4744]: I0311 01:17:54.941262 4744 scope.go:117] "RemoveContainer" containerID="41ee13882dd112c3caf87f564785d3ec291e97d6c3853b8dc41f5aaeb3bed496"
Mar 11 01:17:54 crc kubenswrapper[4744]: I0311 01:17:54.941692 4744 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"41ee13882dd112c3caf87f564785d3ec291e97d6c3853b8dc41f5aaeb3bed496"} err="failed to get container status \"41ee13882dd112c3caf87f564785d3ec291e97d6c3853b8dc41f5aaeb3bed496\": rpc error: code = NotFound desc = could not find container \"41ee13882dd112c3caf87f564785d3ec291e97d6c3853b8dc41f5aaeb3bed496\": container with ID starting with 41ee13882dd112c3caf87f564785d3ec291e97d6c3853b8dc41f5aaeb3bed496 not found: ID does not exist"
Mar 11 01:17:54 crc kubenswrapper[4744]: I0311 01:17:54.945256 4744 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/58321737-f97a-47e0-9d0d-07f0c0da801c-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 11 01:17:54 crc kubenswrapper[4744]: I0311 01:17:54.945288 4744 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-45dnm\" (UniqueName: \"kubernetes.io/projected/58321737-f97a-47e0-9d0d-07f0c0da801c-kube-api-access-45dnm\") on node \"crc\" DevicePath \"\""
Mar 11 01:17:54 crc kubenswrapper[4744]: I0311 01:17:54.945299 4744 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/58321737-f97a-47e0-9d0d-07f0c0da801c-config-data\") on node \"crc\" DevicePath \"\""
Mar 11 01:17:54 crc kubenswrapper[4744]: I0311 01:17:54.946834 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-conductor-0"]
Mar 11 01:17:54 crc kubenswrapper[4744]: E0311 01:17:54.947220 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8c80e1ba-a26a-4368-902b-a725bc2052d8" containerName="nova-cell1-conductor-db-sync"
Mar 11 01:17:54 crc kubenswrapper[4744]: I0311 01:17:54.947240 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="8c80e1ba-a26a-4368-902b-a725bc2052d8" containerName="nova-cell1-conductor-db-sync"
Mar 11 01:17:54 crc kubenswrapper[4744]: E0311 01:17:54.947253 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="58321737-f97a-47e0-9d0d-07f0c0da801c" containerName="nova-api-log"
Mar 11 01:17:54 crc kubenswrapper[4744]: I0311 01:17:54.947260 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="58321737-f97a-47e0-9d0d-07f0c0da801c" containerName="nova-api-log"
Mar 11 01:17:54 crc kubenswrapper[4744]: E0311 01:17:54.947277 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="58321737-f97a-47e0-9d0d-07f0c0da801c" containerName="nova-api-api"
Mar 11 01:17:54 crc kubenswrapper[4744]: I0311 01:17:54.947283 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="58321737-f97a-47e0-9d0d-07f0c0da801c" containerName="nova-api-api"
Mar 11 01:17:54 crc kubenswrapper[4744]: E0311 01:17:54.947295 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b2a44cf2-a3b1-4b65-aba3-b0a5d939b303" containerName="nova-manage"
Mar 11 01:17:54 crc kubenswrapper[4744]: I0311 01:17:54.947301 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="b2a44cf2-a3b1-4b65-aba3-b0a5d939b303" containerName="nova-manage"
Mar 11 01:17:54 crc kubenswrapper[4744]: I0311 01:17:54.947467 4744 memory_manager.go:354] "RemoveStaleState removing state" podUID="b2a44cf2-a3b1-4b65-aba3-b0a5d939b303" containerName="nova-manage"
Mar 11 01:17:54 crc kubenswrapper[4744]: I0311 01:17:54.947480 4744 memory_manager.go:354] "RemoveStaleState removing state" podUID="58321737-f97a-47e0-9d0d-07f0c0da801c" containerName="nova-api-log"
Mar 11 01:17:54 crc kubenswrapper[4744]: I0311 01:17:54.947493 4744 memory_manager.go:354] "RemoveStaleState removing state" podUID="8c80e1ba-a26a-4368-902b-a725bc2052d8" containerName="nova-cell1-conductor-db-sync"
Mar 11 01:17:54 crc kubenswrapper[4744]: I0311 01:17:54.947521 4744 memory_manager.go:354] "RemoveStaleState removing state" podUID="58321737-f97a-47e0-9d0d-07f0c0da801c" containerName="nova-api-api"
Mar 11 01:17:54 crc kubenswrapper[4744]: I0311 01:17:54.948113 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-0"
Mar 11 01:17:54 crc kubenswrapper[4744]: I0311 01:17:54.950809 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-config-data"
Mar 11 01:17:54 crc kubenswrapper[4744]: I0311 01:17:54.958530 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-0"]
Mar 11 01:17:55 crc kubenswrapper[4744]: I0311 01:17:55.018994 4744 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-86dc97b969-rxvv5"
Mar 11 01:17:55 crc kubenswrapper[4744]: I0311 01:17:55.151626 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f0da3325-8792-44bf-8c25-ea1648998ce0-ovsdbserver-nb\") pod \"f0da3325-8792-44bf-8c25-ea1648998ce0\" (UID: \"f0da3325-8792-44bf-8c25-ea1648998ce0\") "
Mar 11 01:17:55 crc kubenswrapper[4744]: I0311 01:17:55.151666 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f0da3325-8792-44bf-8c25-ea1648998ce0-dns-svc\") pod \"f0da3325-8792-44bf-8c25-ea1648998ce0\" (UID: \"f0da3325-8792-44bf-8c25-ea1648998ce0\") "
Mar 11 01:17:55 crc kubenswrapper[4744]: I0311 01:17:55.151723 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f0da3325-8792-44bf-8c25-ea1648998ce0-config\") pod \"f0da3325-8792-44bf-8c25-ea1648998ce0\" (UID: \"f0da3325-8792-44bf-8c25-ea1648998ce0\") "
Mar 11 01:17:55 crc kubenswrapper[4744]: I0311 01:17:55.151747 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/f0da3325-8792-44bf-8c25-ea1648998ce0-dns-swift-storage-0\") pod \"f0da3325-8792-44bf-8c25-ea1648998ce0\" (UID: \"f0da3325-8792-44bf-8c25-ea1648998ce0\") "
Mar 11 01:17:55 crc kubenswrapper[4744]: I0311 01:17:55.151852 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qrndz\" (UniqueName: \"kubernetes.io/projected/f0da3325-8792-44bf-8c25-ea1648998ce0-kube-api-access-qrndz\") pod \"f0da3325-8792-44bf-8c25-ea1648998ce0\" (UID: \"f0da3325-8792-44bf-8c25-ea1648998ce0\") "
Mar 11 01:17:55 crc kubenswrapper[4744]: I0311 01:17:55.151925 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f0da3325-8792-44bf-8c25-ea1648998ce0-ovsdbserver-sb\") pod \"f0da3325-8792-44bf-8c25-ea1648998ce0\" (UID: \"f0da3325-8792-44bf-8c25-ea1648998ce0\") "
Mar 11 01:17:55 crc kubenswrapper[4744]: I0311 01:17:55.152463 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/05c279dc-d915-4688-b2c2-c43ff96ad81c-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"05c279dc-d915-4688-b2c2-c43ff96ad81c\") " pod="openstack/nova-cell1-conductor-0"
Mar 11 01:17:55 crc kubenswrapper[4744]: I0311 01:17:55.152564 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/05c279dc-d915-4688-b2c2-c43ff96ad81c-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"05c279dc-d915-4688-b2c2-c43ff96ad81c\") " pod="openstack/nova-cell1-conductor-0"
Mar 11 01:17:55 crc kubenswrapper[4744]: I0311 01:17:55.152692 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-khdhz\" (UniqueName: \"kubernetes.io/projected/05c279dc-d915-4688-b2c2-c43ff96ad81c-kube-api-access-khdhz\") pod \"nova-cell1-conductor-0\" (UID: \"05c279dc-d915-4688-b2c2-c43ff96ad81c\") " pod="openstack/nova-cell1-conductor-0"
Mar 11 01:17:55 crc kubenswrapper[4744]: I0311 01:17:55.154804 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f0da3325-8792-44bf-8c25-ea1648998ce0-kube-api-access-qrndz" (OuterVolumeSpecName: "kube-api-access-qrndz") pod "f0da3325-8792-44bf-8c25-ea1648998ce0" (UID: "f0da3325-8792-44bf-8c25-ea1648998ce0"). InnerVolumeSpecName "kube-api-access-qrndz". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 11 01:17:55 crc kubenswrapper[4744]: I0311 01:17:55.201009 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f0da3325-8792-44bf-8c25-ea1648998ce0-config" (OuterVolumeSpecName: "config") pod "f0da3325-8792-44bf-8c25-ea1648998ce0" (UID: "f0da3325-8792-44bf-8c25-ea1648998ce0"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 11 01:17:55 crc kubenswrapper[4744]: I0311 01:17:55.216013 4744 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"]
Mar 11 01:17:55 crc kubenswrapper[4744]: I0311 01:17:55.217191 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f0da3325-8792-44bf-8c25-ea1648998ce0-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "f0da3325-8792-44bf-8c25-ea1648998ce0" (UID: "f0da3325-8792-44bf-8c25-ea1648998ce0"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 11 01:17:55 crc kubenswrapper[4744]: I0311 01:17:55.231769 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f0da3325-8792-44bf-8c25-ea1648998ce0-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "f0da3325-8792-44bf-8c25-ea1648998ce0" (UID: "f0da3325-8792-44bf-8c25-ea1648998ce0"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 11 01:17:55 crc kubenswrapper[4744]: I0311 01:17:55.232856 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f0da3325-8792-44bf-8c25-ea1648998ce0-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "f0da3325-8792-44bf-8c25-ea1648998ce0" (UID: "f0da3325-8792-44bf-8c25-ea1648998ce0"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 11 01:17:55 crc kubenswrapper[4744]: I0311 01:17:55.236773 4744 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"]
Mar 11 01:17:55 crc kubenswrapper[4744]: I0311 01:17:55.248144 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f0da3325-8792-44bf-8c25-ea1648998ce0-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "f0da3325-8792-44bf-8c25-ea1648998ce0" (UID: "f0da3325-8792-44bf-8c25-ea1648998ce0"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 11 01:17:55 crc kubenswrapper[4744]: I0311 01:17:55.250866 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"]
Mar 11 01:17:55 crc kubenswrapper[4744]: E0311 01:17:55.251385 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f0da3325-8792-44bf-8c25-ea1648998ce0" containerName="init"
Mar 11 01:17:55 crc kubenswrapper[4744]: I0311 01:17:55.251411 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="f0da3325-8792-44bf-8c25-ea1648998ce0" containerName="init"
Mar 11 01:17:55 crc kubenswrapper[4744]: E0311 01:17:55.251435 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f0da3325-8792-44bf-8c25-ea1648998ce0" containerName="dnsmasq-dns"
Mar 11 01:17:55 crc kubenswrapper[4744]: I0311 01:17:55.251447 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="f0da3325-8792-44bf-8c25-ea1648998ce0" containerName="dnsmasq-dns"
Mar 11 01:17:55 crc kubenswrapper[4744]: I0311 01:17:55.251821 4744 memory_manager.go:354] "RemoveStaleState removing state" podUID="f0da3325-8792-44bf-8c25-ea1648998ce0" containerName="dnsmasq-dns"
Mar 11 01:17:55 crc kubenswrapper[4744]: I0311 01:17:55.253499 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0"
Mar 11 01:17:55 crc kubenswrapper[4744]: I0311 01:17:55.257203 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0160a3e2-e1dd-4526-9280-1645846cee12-logs\") pod \"nova-api-0\" (UID: \"0160a3e2-e1dd-4526-9280-1645846cee12\") " pod="openstack/nova-api-0"
Mar 11 01:17:55 crc kubenswrapper[4744]: I0311 01:17:55.257282 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6pwc2\" (UniqueName: \"kubernetes.io/projected/0160a3e2-e1dd-4526-9280-1645846cee12-kube-api-access-6pwc2\") pod \"nova-api-0\" (UID: \"0160a3e2-e1dd-4526-9280-1645846cee12\") " pod="openstack/nova-api-0"
Mar 11 01:17:55 crc kubenswrapper[4744]: I0311 01:17:55.257397 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0160a3e2-e1dd-4526-9280-1645846cee12-config-data\") pod \"nova-api-0\" (UID: \"0160a3e2-e1dd-4526-9280-1645846cee12\") " pod="openstack/nova-api-0"
Mar 11 01:17:55 crc kubenswrapper[4744]: I0311 01:17:55.257463 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/05c279dc-d915-4688-b2c2-c43ff96ad81c-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"05c279dc-d915-4688-b2c2-c43ff96ad81c\") " pod="openstack/nova-cell1-conductor-0"
Mar 11 01:17:55 crc kubenswrapper[4744]: I0311 01:17:55.257550 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0160a3e2-e1dd-4526-9280-1645846cee12-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"0160a3e2-e1dd-4526-9280-1645846cee12\") " pod="openstack/nova-api-0"
Mar 11 01:17:55 crc kubenswrapper[4744]: I0311 01:17:55.257594 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/05c279dc-d915-4688-b2c2-c43ff96ad81c-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"05c279dc-d915-4688-b2c2-c43ff96ad81c\") " pod="openstack/nova-cell1-conductor-0"
Mar 11 01:17:55 crc kubenswrapper[4744]: I0311 01:17:55.257656 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-khdhz\" (UniqueName: \"kubernetes.io/projected/05c279dc-d915-4688-b2c2-c43ff96ad81c-kube-api-access-khdhz\") pod \"nova-cell1-conductor-0\" (UID: \"05c279dc-d915-4688-b2c2-c43ff96ad81c\") " pod="openstack/nova-cell1-conductor-0"
Mar 11 01:17:55 crc kubenswrapper[4744]: I0311 01:17:55.257771 4744 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qrndz\" (UniqueName: \"kubernetes.io/projected/f0da3325-8792-44bf-8c25-ea1648998ce0-kube-api-access-qrndz\") on node \"crc\" DevicePath \"\""
Mar 11 01:17:55 crc kubenswrapper[4744]: I0311 01:17:55.257793 4744 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f0da3325-8792-44bf-8c25-ea1648998ce0-ovsdbserver-sb\") on node \"crc\" DevicePath \"\""
Mar 11 01:17:55 crc kubenswrapper[4744]: I0311 01:17:55.257813 4744 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f0da3325-8792-44bf-8c25-ea1648998ce0-ovsdbserver-nb\") on node \"crc\" DevicePath \"\""
Mar 11 01:17:55 crc kubenswrapper[4744]: I0311 01:17:55.257830 4744 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f0da3325-8792-44bf-8c25-ea1648998ce0-dns-svc\") on node \"crc\" DevicePath \"\""
Mar 11 01:17:55 crc kubenswrapper[4744]: I0311 01:17:55.257846 4744 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f0da3325-8792-44bf-8c25-ea1648998ce0-config\") on node \"crc\" DevicePath \"\""
Mar 11 01:17:55 crc kubenswrapper[4744]: I0311 01:17:55.257862 4744 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/f0da3325-8792-44bf-8c25-ea1648998ce0-dns-swift-storage-0\") on node \"crc\" DevicePath \"\""
Mar 11 01:17:55 crc kubenswrapper[4744]: I0311 01:17:55.262279 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data"
Mar 11 01:17:55 crc kubenswrapper[4744]: I0311 01:17:55.262413 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/05c279dc-d915-4688-b2c2-c43ff96ad81c-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"05c279dc-d915-4688-b2c2-c43ff96ad81c\") " pod="openstack/nova-cell1-conductor-0"
Mar 11 01:17:55 crc kubenswrapper[4744]: I0311 01:17:55.265797 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"]
Mar 11 01:17:55 crc kubenswrapper[4744]: I0311 01:17:55.276334 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/05c279dc-d915-4688-b2c2-c43ff96ad81c-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"05c279dc-d915-4688-b2c2-c43ff96ad81c\") " pod="openstack/nova-cell1-conductor-0"
Mar 11 01:17:55 crc kubenswrapper[4744]: I0311 01:17:55.280081 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-khdhz\" (UniqueName: \"kubernetes.io/projected/05c279dc-d915-4688-b2c2-c43ff96ad81c-kube-api-access-khdhz\") pod \"nova-cell1-conductor-0\" (UID: \"05c279dc-d915-4688-b2c2-c43ff96ad81c\") " pod="openstack/nova-cell1-conductor-0"
Mar 11 01:17:55 crc kubenswrapper[4744]: I0311 01:17:55.359715 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0160a3e2-e1dd-4526-9280-1645846cee12-logs\") pod \"nova-api-0\" (UID: \"0160a3e2-e1dd-4526-9280-1645846cee12\") " pod="openstack/nova-api-0"
Mar 11 01:17:55 crc kubenswrapper[4744]: I0311 01:17:55.360157 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6pwc2\" (UniqueName: \"kubernetes.io/projected/0160a3e2-e1dd-4526-9280-1645846cee12-kube-api-access-6pwc2\") pod \"nova-api-0\" (UID: \"0160a3e2-e1dd-4526-9280-1645846cee12\") " pod="openstack/nova-api-0"
Mar 11 01:17:55 crc kubenswrapper[4744]: I0311 01:17:55.360243 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0160a3e2-e1dd-4526-9280-1645846cee12-config-data\") pod \"nova-api-0\" (UID: \"0160a3e2-e1dd-4526-9280-1645846cee12\") " pod="openstack/nova-api-0"
Mar 11 01:17:55 crc kubenswrapper[4744]: I0311 01:17:55.360308 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0160a3e2-e1dd-4526-9280-1645846cee12-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"0160a3e2-e1dd-4526-9280-1645846cee12\") " pod="openstack/nova-api-0"
Mar 11 01:17:55 crc kubenswrapper[4744]: I0311 01:17:55.364196 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0160a3e2-e1dd-4526-9280-1645846cee12-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"0160a3e2-e1dd-4526-9280-1645846cee12\") " pod="openstack/nova-api-0"
Mar 11 01:17:55 crc kubenswrapper[4744]: I0311 01:17:55.364498 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0160a3e2-e1dd-4526-9280-1645846cee12-logs\") pod \"nova-api-0\" (UID: \"0160a3e2-e1dd-4526-9280-1645846cee12\") " pod="openstack/nova-api-0"
Mar 11 01:17:55 crc kubenswrapper[4744]: I0311 01:17:55.368921 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0160a3e2-e1dd-4526-9280-1645846cee12-config-data\") pod \"nova-api-0\" (UID: \"0160a3e2-e1dd-4526-9280-1645846cee12\") " pod="openstack/nova-api-0"
Mar 11 01:17:55 crc kubenswrapper[4744]: I0311 01:17:55.386546 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6pwc2\" (UniqueName: \"kubernetes.io/projected/0160a3e2-e1dd-4526-9280-1645846cee12-kube-api-access-6pwc2\") pod \"nova-api-0\" (UID: \"0160a3e2-e1dd-4526-9280-1645846cee12\") " pod="openstack/nova-api-0"
Mar 11 01:17:55 crc kubenswrapper[4744]: I0311 01:17:55.571116 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-0"
Mar 11 01:17:55 crc kubenswrapper[4744]: I0311 01:17:55.650960 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0"
Mar 11 01:17:55 crc kubenswrapper[4744]: I0311 01:17:55.824041 4744 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0"
Mar 11 01:17:55 crc kubenswrapper[4744]: I0311 01:17:55.877981 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8bba38e8-f452-48af-a4ad-faaff2a073e1-config-data\") pod \"8bba38e8-f452-48af-a4ad-faaff2a073e1\" (UID: \"8bba38e8-f452-48af-a4ad-faaff2a073e1\") "
Mar 11 01:17:55 crc kubenswrapper[4744]: I0311 01:17:55.878079 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8bba38e8-f452-48af-a4ad-faaff2a073e1-combined-ca-bundle\") pod \"8bba38e8-f452-48af-a4ad-faaff2a073e1\" (UID: \"8bba38e8-f452-48af-a4ad-faaff2a073e1\") "
Mar 11 01:17:55 crc kubenswrapper[4744]: I0311 01:17:55.878358 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vfdvf\" (UniqueName: \"kubernetes.io/projected/8bba38e8-f452-48af-a4ad-faaff2a073e1-kube-api-access-vfdvf\") pod \"8bba38e8-f452-48af-a4ad-faaff2a073e1\" (UID: \"8bba38e8-f452-48af-a4ad-faaff2a073e1\") "
Mar 11 01:17:55 crc kubenswrapper[4744]: I0311 01:17:55.884051 4744 generic.go:334] "Generic (PLEG): container finished" podID="8bba38e8-f452-48af-a4ad-faaff2a073e1" containerID="f671b28b3ee9b4cdab7c31f8abce45e38f5f93c13ddae0c389624e0ccaa2c799" exitCode=0
Mar 11 01:17:55 crc kubenswrapper[4744]: I0311 01:17:55.884119 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"8bba38e8-f452-48af-a4ad-faaff2a073e1","Type":"ContainerDied","Data":"f671b28b3ee9b4cdab7c31f8abce45e38f5f93c13ddae0c389624e0ccaa2c799"}
Mar 11 01:17:55 crc kubenswrapper[4744]: I0311 01:17:55.884148 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"8bba38e8-f452-48af-a4ad-faaff2a073e1","Type":"ContainerDied","Data":"9e293e674f6a0c588b1add9d93c93217309b04c3a1b1533127ce328075fdbf51"}
Mar 11 01:17:55 crc kubenswrapper[4744]: I0311 01:17:55.884172 4744 scope.go:117] "RemoveContainer" containerID="f671b28b3ee9b4cdab7c31f8abce45e38f5f93c13ddae0c389624e0ccaa2c799"
Mar 11 01:17:55 crc kubenswrapper[4744]: I0311 01:17:55.884317 4744 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0"
Mar 11 01:17:55 crc kubenswrapper[4744]: I0311 01:17:55.886776 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8bba38e8-f452-48af-a4ad-faaff2a073e1-kube-api-access-vfdvf" (OuterVolumeSpecName: "kube-api-access-vfdvf") pod "8bba38e8-f452-48af-a4ad-faaff2a073e1" (UID: "8bba38e8-f452-48af-a4ad-faaff2a073e1"). InnerVolumeSpecName "kube-api-access-vfdvf". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 11 01:17:55 crc kubenswrapper[4744]: I0311 01:17:55.887795 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-86dc97b969-rxvv5" event={"ID":"f0da3325-8792-44bf-8c25-ea1648998ce0","Type":"ContainerDied","Data":"4d1d6320f4a5a49cef0e6a90e4d4eb21c6a7f97d68920e4d072b31cf62d39bef"}
Mar 11 01:17:55 crc kubenswrapper[4744]: I0311 01:17:55.887918 4744 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-86dc97b969-rxvv5"
Mar 11 01:17:55 crc kubenswrapper[4744]: I0311 01:17:55.904304 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8bba38e8-f452-48af-a4ad-faaff2a073e1-config-data" (OuterVolumeSpecName: "config-data") pod "8bba38e8-f452-48af-a4ad-faaff2a073e1" (UID: "8bba38e8-f452-48af-a4ad-faaff2a073e1"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 11 01:17:55 crc kubenswrapper[4744]: I0311 01:17:55.917102 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8bba38e8-f452-48af-a4ad-faaff2a073e1-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "8bba38e8-f452-48af-a4ad-faaff2a073e1" (UID: "8bba38e8-f452-48af-a4ad-faaff2a073e1"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 11 01:17:55 crc kubenswrapper[4744]: I0311 01:17:55.917257 4744 scope.go:117] "RemoveContainer" containerID="f671b28b3ee9b4cdab7c31f8abce45e38f5f93c13ddae0c389624e0ccaa2c799"
Mar 11 01:17:55 crc kubenswrapper[4744]: E0311 01:17:55.920887 4744 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f671b28b3ee9b4cdab7c31f8abce45e38f5f93c13ddae0c389624e0ccaa2c799\": container with ID starting with f671b28b3ee9b4cdab7c31f8abce45e38f5f93c13ddae0c389624e0ccaa2c799 not found: ID does not exist" containerID="f671b28b3ee9b4cdab7c31f8abce45e38f5f93c13ddae0c389624e0ccaa2c799"
Mar 11 01:17:55 crc kubenswrapper[4744]: I0311 01:17:55.920919 4744 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f671b28b3ee9b4cdab7c31f8abce45e38f5f93c13ddae0c389624e0ccaa2c799"} err="failed to get container status \"f671b28b3ee9b4cdab7c31f8abce45e38f5f93c13ddae0c389624e0ccaa2c799\": rpc error: code = NotFound desc = could not find container \"f671b28b3ee9b4cdab7c31f8abce45e38f5f93c13ddae0c389624e0ccaa2c799\": container with ID starting with f671b28b3ee9b4cdab7c31f8abce45e38f5f93c13ddae0c389624e0ccaa2c799 not found: ID does not exist"
Mar 11 01:17:55 crc kubenswrapper[4744]: I0311 01:17:55.920941 4744 scope.go:117] "RemoveContainer" containerID="61f19b7521fd2006eb5a4290593fd4b6486d40a70ee7549d87d0824f9c940f4b"
Mar 11 01:17:55 crc kubenswrapper[4744]: I0311 01:17:55.927951 4744 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-86dc97b969-rxvv5"]
Mar 11 01:17:55 crc kubenswrapper[4744]: I0311 01:17:55.938964 4744 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-86dc97b969-rxvv5"]
Mar 11 01:17:55 crc kubenswrapper[4744]: I0311 01:17:55.945831 4744 scope.go:117] "RemoveContainer" containerID="1bcc4d3e8cc07e39ed1ba82d03612482772b57681aefd2a6680379ab70250b00"
Mar 11 01:17:55 crc kubenswrapper[4744]: I0311 01:17:55.981805 4744 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vfdvf\" (UniqueName: \"kubernetes.io/projected/8bba38e8-f452-48af-a4ad-faaff2a073e1-kube-api-access-vfdvf\") on node \"crc\" DevicePath \"\""
Mar 11 01:17:55 crc kubenswrapper[4744]: I0311 01:17:55.981839 4744 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8bba38e8-f452-48af-a4ad-faaff2a073e1-config-data\") on node \"crc\" DevicePath \"\""
Mar 11 01:17:55 crc kubenswrapper[4744]: I0311 01:17:55.981849 4744 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8bba38e8-f452-48af-a4ad-faaff2a073e1-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 11 01:17:55 crc kubenswrapper[4744]: I0311 01:17:55.987741 4744 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="58321737-f97a-47e0-9d0d-07f0c0da801c" path="/var/lib/kubelet/pods/58321737-f97a-47e0-9d0d-07f0c0da801c/volumes"
Mar 11 01:17:55 crc kubenswrapper[4744]: I0311 01:17:55.988322 4744 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f0da3325-8792-44bf-8c25-ea1648998ce0" path="/var/lib/kubelet/pods/f0da3325-8792-44bf-8c25-ea1648998ce0/volumes"
Mar 11 01:17:56 crc kubenswrapper[4744]: I0311 01:17:56.056174 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-0"]
Mar 11 01:17:56 crc kubenswrapper[4744]: I0311 01:17:56.207477 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"]
Mar 11 01:17:56 crc kubenswrapper[4744]: I0311 01:17:56.225350 4744 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"]
Mar 11 01:17:56 crc kubenswrapper[4744]: I0311 01:17:56.239658 4744 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-scheduler-0"]
Mar 11 01:17:56 crc kubenswrapper[4744]: I0311 01:17:56.256365 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"]
Mar 11 01:17:56 crc kubenswrapper[4744]: E0311 01:17:56.257129 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8bba38e8-f452-48af-a4ad-faaff2a073e1" containerName="nova-scheduler-scheduler"
Mar 11 01:17:56 crc kubenswrapper[4744]: I0311 01:17:56.257177 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="8bba38e8-f452-48af-a4ad-faaff2a073e1" containerName="nova-scheduler-scheduler"
Mar 11 01:17:56 crc kubenswrapper[4744]: I0311 01:17:56.257703 4744 memory_manager.go:354] "RemoveStaleState removing state" podUID="8bba38e8-f452-48af-a4ad-faaff2a073e1" containerName="nova-scheduler-scheduler"
Mar 11 01:17:56 crc kubenswrapper[4744]: I0311 01:17:56.275724 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"]
Mar 11 01:17:56 crc kubenswrapper[4744]: I0311 01:17:56.275854 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0"
Mar 11 01:17:56 crc kubenswrapper[4744]: I0311 01:17:56.279204 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data"
Mar 11 01:17:56 crc kubenswrapper[4744]: I0311 01:17:56.390900 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0c264104-8890-4de4-bb6e-451dcfeb5d4c-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"0c264104-8890-4de4-bb6e-451dcfeb5d4c\") " pod="openstack/nova-scheduler-0"
Mar 11 01:17:56 crc kubenswrapper[4744]: I0311 01:17:56.390949 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0c264104-8890-4de4-bb6e-451dcfeb5d4c-config-data\") pod \"nova-scheduler-0\" (UID: \"0c264104-8890-4de4-bb6e-451dcfeb5d4c\") " pod="openstack/nova-scheduler-0"
Mar 11 01:17:56 crc kubenswrapper[4744]: I0311 01:17:56.390974 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sw6rs\" (UniqueName: \"kubernetes.io/projected/0c264104-8890-4de4-bb6e-451dcfeb5d4c-kube-api-access-sw6rs\") pod \"nova-scheduler-0\" (UID: \"0c264104-8890-4de4-bb6e-451dcfeb5d4c\") " pod="openstack/nova-scheduler-0"
Mar 11 01:17:56 crc kubenswrapper[4744]: I0311 01:17:56.493332 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0c264104-8890-4de4-bb6e-451dcfeb5d4c-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"0c264104-8890-4de4-bb6e-451dcfeb5d4c\") " pod="openstack/nova-scheduler-0"
Mar 11 01:17:56 crc kubenswrapper[4744]: I0311 01:17:56.493658 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0c264104-8890-4de4-bb6e-451dcfeb5d4c-config-data\") pod \"nova-scheduler-0\" (UID: \"0c264104-8890-4de4-bb6e-451dcfeb5d4c\") " pod="openstack/nova-scheduler-0"
Mar 11 01:17:56 crc kubenswrapper[4744]: I0311 01:17:56.493689 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sw6rs\" (UniqueName: \"kubernetes.io/projected/0c264104-8890-4de4-bb6e-451dcfeb5d4c-kube-api-access-sw6rs\") pod \"nova-scheduler-0\" (UID: \"0c264104-8890-4de4-bb6e-451dcfeb5d4c\") " pod="openstack/nova-scheduler-0"
Mar 11 01:17:56 crc kubenswrapper[4744]: I0311 01:17:56.498538 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0c264104-8890-4de4-bb6e-451dcfeb5d4c-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"0c264104-8890-4de4-bb6e-451dcfeb5d4c\") " pod="openstack/nova-scheduler-0"
Mar 11 01:17:56 crc kubenswrapper[4744]: I0311 01:17:56.498953 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0c264104-8890-4de4-bb6e-451dcfeb5d4c-config-data\") pod \"nova-scheduler-0\" (UID: \"0c264104-8890-4de4-bb6e-451dcfeb5d4c\") " pod="openstack/nova-scheduler-0"
Mar 11 01:17:56 crc kubenswrapper[4744]: I0311 01:17:56.521126 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sw6rs\" (UniqueName: \"kubernetes.io/projected/0c264104-8890-4de4-bb6e-451dcfeb5d4c-kube-api-access-sw6rs\") pod \"nova-scheduler-0\" (UID: \"0c264104-8890-4de4-bb6e-451dcfeb5d4c\") " pod="openstack/nova-scheduler-0"
Mar 11 01:17:56 crc kubenswrapper[4744]: I0311 01:17:56.659109 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0"
Mar 11 01:17:56 crc kubenswrapper[4744]: I0311 01:17:56.919485 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"05c279dc-d915-4688-b2c2-c43ff96ad81c","Type":"ContainerStarted","Data":"9fbe6aaee4ddc8846bfb132b1b5d6d6e18bbaed4c0eefbf91efd571ce47a8e6a"}
Mar 11 01:17:56 crc kubenswrapper[4744]: I0311 01:17:56.919806 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"05c279dc-d915-4688-b2c2-c43ff96ad81c","Type":"ContainerStarted","Data":"dd4cb4f38a20994d21f06ee2df11baf41bc2a4c201f5816dee201c234f762b5e"}
Mar 11 01:17:56 crc kubenswrapper[4744]: I0311 01:17:56.921026 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-conductor-0"
Mar 11 01:17:56 crc kubenswrapper[4744]: I0311 01:17:56.948093 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"0160a3e2-e1dd-4526-9280-1645846cee12","Type":"ContainerStarted","Data":"ebddc0586887b8e1b258bb855afecb531e54592cb034d8ab733121c6c2b0e3ab"}
Mar 11 01:17:56 crc kubenswrapper[4744]: I0311 01:17:56.948133 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"0160a3e2-e1dd-4526-9280-1645846cee12","Type":"ContainerStarted","Data":"6d4e750ef38a2bd9997b7929eb792b4bf47d1020fd0488a54a44b9a96efb1a9c"}
Mar 11 01:17:56 crc kubenswrapper[4744]: I0311 01:17:56.948145 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"0160a3e2-e1dd-4526-9280-1645846cee12","Type":"ContainerStarted","Data":"7cb8ccc865629d7c5966d50f75339a2b93f7a84552e98ec2bd0136d6af973a49"}
Mar 11 01:17:56 crc kubenswrapper[4744]: I0311 01:17:56.950371 4744 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-conductor-0" podStartSLOduration=2.950357017 podStartE2EDuration="2.950357017s" podCreationTimestamp="2026-03-11 01:17:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-11 01:17:56.941103552 +0000 UTC m=+1433.745321157" watchObservedRunningTime="2026-03-11 01:17:56.950357017 +0000 UTC m=+1433.754574622"
Mar 11 01:17:56 crc kubenswrapper[4744]: I0311 01:17:56.964086 4744 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=1.964072162 podStartE2EDuration="1.964072162s" podCreationTimestamp="2026-03-11 01:17:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-11 01:17:56.96108022 +0000 UTC m=+1433.765297825" watchObservedRunningTime="2026-03-11 01:17:56.964072162 +0000 UTC m=+1433.768289767"
Mar 11 01:17:57 crc kubenswrapper[4744]: I0311 01:17:57.208004 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"]
Mar 11 01:17:57 crc kubenswrapper[4744]: W0311 01:17:57.216391 4744 manager.go:1169] Failed to process watch event {EventType:0
Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0c264104_8890_4de4_bb6e_451dcfeb5d4c.slice/crio-e5c4074cdccb6f7b737e8968321fbed22b5fca5a70511c8b5fcc6a60effdeb74 WatchSource:0}: Error finding container e5c4074cdccb6f7b737e8968321fbed22b5fca5a70511c8b5fcc6a60effdeb74: Status 404 returned error can't find the container with id e5c4074cdccb6f7b737e8968321fbed22b5fca5a70511c8b5fcc6a60effdeb74 Mar 11 01:17:57 crc kubenswrapper[4744]: I0311 01:17:57.960467 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"0c264104-8890-4de4-bb6e-451dcfeb5d4c","Type":"ContainerStarted","Data":"8051c02f9fe6225ef189ca6f441fcfb1b8e9a1ad9bf4f8c2ce0b2b41f29d27a3"} Mar 11 01:17:57 crc kubenswrapper[4744]: I0311 01:17:57.960884 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"0c264104-8890-4de4-bb6e-451dcfeb5d4c","Type":"ContainerStarted","Data":"e5c4074cdccb6f7b737e8968321fbed22b5fca5a70511c8b5fcc6a60effdeb74"} Mar 11 01:17:57 crc kubenswrapper[4744]: I0311 01:17:57.981825 4744 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=1.981784268 podStartE2EDuration="1.981784268s" podCreationTimestamp="2026-03-11 01:17:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-11 01:17:57.981551901 +0000 UTC m=+1434.785769536" watchObservedRunningTime="2026-03-11 01:17:57.981784268 +0000 UTC m=+1434.786001883" Mar 11 01:17:57 crc kubenswrapper[4744]: I0311 01:17:57.990977 4744 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8bba38e8-f452-48af-a4ad-faaff2a073e1" path="/var/lib/kubelet/pods/8bba38e8-f452-48af-a4ad-faaff2a073e1/volumes" Mar 11 01:17:58 crc kubenswrapper[4744]: I0311 01:17:58.420287 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0" Mar 11 
01:18:00 crc kubenswrapper[4744]: I0311 01:18:00.147643 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29553198-skc7b"] Mar 11 01:18:00 crc kubenswrapper[4744]: I0311 01:18:00.150569 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29553198-skc7b" Mar 11 01:18:00 crc kubenswrapper[4744]: I0311 01:18:00.152730 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 11 01:18:00 crc kubenswrapper[4744]: I0311 01:18:00.152984 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 11 01:18:00 crc kubenswrapper[4744]: I0311 01:18:00.153111 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-rs77s" Mar 11 01:18:00 crc kubenswrapper[4744]: I0311 01:18:00.159891 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29553198-skc7b"] Mar 11 01:18:00 crc kubenswrapper[4744]: I0311 01:18:00.269453 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t6h25\" (UniqueName: \"kubernetes.io/projected/29b550d6-9fea-497a-84d0-4f05d34f3013-kube-api-access-t6h25\") pod \"auto-csr-approver-29553198-skc7b\" (UID: \"29b550d6-9fea-497a-84d0-4f05d34f3013\") " pod="openshift-infra/auto-csr-approver-29553198-skc7b" Mar 11 01:18:00 crc kubenswrapper[4744]: I0311 01:18:00.375270 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t6h25\" (UniqueName: \"kubernetes.io/projected/29b550d6-9fea-497a-84d0-4f05d34f3013-kube-api-access-t6h25\") pod \"auto-csr-approver-29553198-skc7b\" (UID: \"29b550d6-9fea-497a-84d0-4f05d34f3013\") " pod="openshift-infra/auto-csr-approver-29553198-skc7b" Mar 11 01:18:00 crc kubenswrapper[4744]: I0311 01:18:00.414305 4744 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t6h25\" (UniqueName: \"kubernetes.io/projected/29b550d6-9fea-497a-84d0-4f05d34f3013-kube-api-access-t6h25\") pod \"auto-csr-approver-29553198-skc7b\" (UID: \"29b550d6-9fea-497a-84d0-4f05d34f3013\") " pod="openshift-infra/auto-csr-approver-29553198-skc7b" Mar 11 01:18:00 crc kubenswrapper[4744]: I0311 01:18:00.477889 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29553198-skc7b" Mar 11 01:18:01 crc kubenswrapper[4744]: I0311 01:18:01.003636 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29553198-skc7b"] Mar 11 01:18:01 crc kubenswrapper[4744]: I0311 01:18:01.021348 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29553198-skc7b" event={"ID":"29b550d6-9fea-497a-84d0-4f05d34f3013","Type":"ContainerStarted","Data":"75032575afa2166874b8eb9463b03efb3e024a1fe83e26b107f7f6b23d5189b3"} Mar 11 01:18:01 crc kubenswrapper[4744]: I0311 01:18:01.660269 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Mar 11 01:18:02 crc kubenswrapper[4744]: I0311 01:18:02.647285 4744 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/kube-state-metrics-0"] Mar 11 01:18:02 crc kubenswrapper[4744]: I0311 01:18:02.647609 4744 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/kube-state-metrics-0" podUID="dbc15fc8-0b29-47ad-9ce1-38097df24920" containerName="kube-state-metrics" containerID="cri-o://d75b57ad203a91708de204fd623e2cbc3a859d6cce5a5c47588de99db8116945" gracePeriod=30 Mar 11 01:18:03 crc kubenswrapper[4744]: I0311 01:18:03.049437 4744 generic.go:334] "Generic (PLEG): container finished" podID="29b550d6-9fea-497a-84d0-4f05d34f3013" containerID="5525f25b06e6ff4c2d8a48534b38c68ef8ef4cf848bcb4cccfe6dca7d1cdfb4f" exitCode=0 Mar 11 01:18:03 crc 
kubenswrapper[4744]: I0311 01:18:03.050145 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29553198-skc7b" event={"ID":"29b550d6-9fea-497a-84d0-4f05d34f3013","Type":"ContainerDied","Data":"5525f25b06e6ff4c2d8a48534b38c68ef8ef4cf848bcb4cccfe6dca7d1cdfb4f"} Mar 11 01:18:03 crc kubenswrapper[4744]: I0311 01:18:03.051954 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"dbc15fc8-0b29-47ad-9ce1-38097df24920","Type":"ContainerDied","Data":"d75b57ad203a91708de204fd623e2cbc3a859d6cce5a5c47588de99db8116945"} Mar 11 01:18:03 crc kubenswrapper[4744]: I0311 01:18:03.051910 4744 generic.go:334] "Generic (PLEG): container finished" podID="dbc15fc8-0b29-47ad-9ce1-38097df24920" containerID="d75b57ad203a91708de204fd623e2cbc3a859d6cce5a5c47588de99db8116945" exitCode=2 Mar 11 01:18:03 crc kubenswrapper[4744]: I0311 01:18:03.197647 4744 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Mar 11 01:18:03 crc kubenswrapper[4744]: I0311 01:18:03.341121 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7pfp7\" (UniqueName: \"kubernetes.io/projected/dbc15fc8-0b29-47ad-9ce1-38097df24920-kube-api-access-7pfp7\") pod \"dbc15fc8-0b29-47ad-9ce1-38097df24920\" (UID: \"dbc15fc8-0b29-47ad-9ce1-38097df24920\") " Mar 11 01:18:03 crc kubenswrapper[4744]: I0311 01:18:03.351771 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dbc15fc8-0b29-47ad-9ce1-38097df24920-kube-api-access-7pfp7" (OuterVolumeSpecName: "kube-api-access-7pfp7") pod "dbc15fc8-0b29-47ad-9ce1-38097df24920" (UID: "dbc15fc8-0b29-47ad-9ce1-38097df24920"). InnerVolumeSpecName "kube-api-access-7pfp7". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 01:18:03 crc kubenswrapper[4744]: I0311 01:18:03.443974 4744 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7pfp7\" (UniqueName: \"kubernetes.io/projected/dbc15fc8-0b29-47ad-9ce1-38097df24920-kube-api-access-7pfp7\") on node \"crc\" DevicePath \"\"" Mar 11 01:18:04 crc kubenswrapper[4744]: I0311 01:18:04.068143 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"dbc15fc8-0b29-47ad-9ce1-38097df24920","Type":"ContainerDied","Data":"bfe213e4876d0e50b49fe6b5cad33d848045c2de73079970d00b4c5a897f47d0"} Mar 11 01:18:04 crc kubenswrapper[4744]: I0311 01:18:04.068180 4744 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Mar 11 01:18:04 crc kubenswrapper[4744]: I0311 01:18:04.068573 4744 scope.go:117] "RemoveContainer" containerID="d75b57ad203a91708de204fd623e2cbc3a859d6cce5a5c47588de99db8116945" Mar 11 01:18:04 crc kubenswrapper[4744]: I0311 01:18:04.111612 4744 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/kube-state-metrics-0"] Mar 11 01:18:04 crc kubenswrapper[4744]: I0311 01:18:04.125429 4744 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/kube-state-metrics-0"] Mar 11 01:18:04 crc kubenswrapper[4744]: I0311 01:18:04.135885 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/kube-state-metrics-0"] Mar 11 01:18:04 crc kubenswrapper[4744]: E0311 01:18:04.136318 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dbc15fc8-0b29-47ad-9ce1-38097df24920" containerName="kube-state-metrics" Mar 11 01:18:04 crc kubenswrapper[4744]: I0311 01:18:04.136338 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="dbc15fc8-0b29-47ad-9ce1-38097df24920" containerName="kube-state-metrics" Mar 11 01:18:04 crc kubenswrapper[4744]: I0311 01:18:04.136592 4744 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="dbc15fc8-0b29-47ad-9ce1-38097df24920" containerName="kube-state-metrics" Mar 11 01:18:04 crc kubenswrapper[4744]: I0311 01:18:04.137441 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Mar 11 01:18:04 crc kubenswrapper[4744]: I0311 01:18:04.140127 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-kube-state-metrics-svc" Mar 11 01:18:04 crc kubenswrapper[4744]: I0311 01:18:04.141263 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"kube-state-metrics-tls-config" Mar 11 01:18:04 crc kubenswrapper[4744]: I0311 01:18:04.151465 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Mar 11 01:18:04 crc kubenswrapper[4744]: I0311 01:18:04.262630 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/cc403516-137f-4bfb-badf-89b13ff0468f-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"cc403516-137f-4bfb-badf-89b13ff0468f\") " pod="openstack/kube-state-metrics-0" Mar 11 01:18:04 crc kubenswrapper[4744]: I0311 01:18:04.262715 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q22d5\" (UniqueName: \"kubernetes.io/projected/cc403516-137f-4bfb-badf-89b13ff0468f-kube-api-access-q22d5\") pod \"kube-state-metrics-0\" (UID: \"cc403516-137f-4bfb-badf-89b13ff0468f\") " pod="openstack/kube-state-metrics-0" Mar 11 01:18:04 crc kubenswrapper[4744]: I0311 01:18:04.262839 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cc403516-137f-4bfb-badf-89b13ff0468f-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"cc403516-137f-4bfb-badf-89b13ff0468f\") " pod="openstack/kube-state-metrics-0" Mar 11 01:18:04 crc 
kubenswrapper[4744]: I0311 01:18:04.262927 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/cc403516-137f-4bfb-badf-89b13ff0468f-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"cc403516-137f-4bfb-badf-89b13ff0468f\") " pod="openstack/kube-state-metrics-0" Mar 11 01:18:04 crc kubenswrapper[4744]: I0311 01:18:04.364434 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cc403516-137f-4bfb-badf-89b13ff0468f-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"cc403516-137f-4bfb-badf-89b13ff0468f\") " pod="openstack/kube-state-metrics-0" Mar 11 01:18:04 crc kubenswrapper[4744]: I0311 01:18:04.364582 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/cc403516-137f-4bfb-badf-89b13ff0468f-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"cc403516-137f-4bfb-badf-89b13ff0468f\") " pod="openstack/kube-state-metrics-0" Mar 11 01:18:04 crc kubenswrapper[4744]: I0311 01:18:04.364626 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/cc403516-137f-4bfb-badf-89b13ff0468f-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"cc403516-137f-4bfb-badf-89b13ff0468f\") " pod="openstack/kube-state-metrics-0" Mar 11 01:18:04 crc kubenswrapper[4744]: I0311 01:18:04.364678 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q22d5\" (UniqueName: \"kubernetes.io/projected/cc403516-137f-4bfb-badf-89b13ff0468f-kube-api-access-q22d5\") pod \"kube-state-metrics-0\" (UID: \"cc403516-137f-4bfb-badf-89b13ff0468f\") " pod="openstack/kube-state-metrics-0" Mar 11 01:18:04 crc 
kubenswrapper[4744]: I0311 01:18:04.370273 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/cc403516-137f-4bfb-badf-89b13ff0468f-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"cc403516-137f-4bfb-badf-89b13ff0468f\") " pod="openstack/kube-state-metrics-0" Mar 11 01:18:04 crc kubenswrapper[4744]: I0311 01:18:04.371773 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cc403516-137f-4bfb-badf-89b13ff0468f-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"cc403516-137f-4bfb-badf-89b13ff0468f\") " pod="openstack/kube-state-metrics-0" Mar 11 01:18:04 crc kubenswrapper[4744]: I0311 01:18:04.384061 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/cc403516-137f-4bfb-badf-89b13ff0468f-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"cc403516-137f-4bfb-badf-89b13ff0468f\") " pod="openstack/kube-state-metrics-0" Mar 11 01:18:04 crc kubenswrapper[4744]: I0311 01:18:04.388591 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q22d5\" (UniqueName: \"kubernetes.io/projected/cc403516-137f-4bfb-badf-89b13ff0468f-kube-api-access-q22d5\") pod \"kube-state-metrics-0\" (UID: \"cc403516-137f-4bfb-badf-89b13ff0468f\") " pod="openstack/kube-state-metrics-0" Mar 11 01:18:04 crc kubenswrapper[4744]: I0311 01:18:04.456444 4744 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29553198-skc7b" Mar 11 01:18:04 crc kubenswrapper[4744]: I0311 01:18:04.467679 4744 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/kube-state-metrics-0" Mar 11 01:18:04 crc kubenswrapper[4744]: I0311 01:18:04.511098 4744 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 11 01:18:04 crc kubenswrapper[4744]: I0311 01:18:04.511402 4744 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="e299a80d-c2a8-4190-b2cb-a404ee99b9f5" containerName="ceilometer-central-agent" containerID="cri-o://b7274dc9b470e8a8adf0b1898dfd5c73a4f7caa83f9024a193d2a3365a9703bd" gracePeriod=30 Mar 11 01:18:04 crc kubenswrapper[4744]: I0311 01:18:04.511629 4744 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="e299a80d-c2a8-4190-b2cb-a404ee99b9f5" containerName="proxy-httpd" containerID="cri-o://941b0df49d3da04acb2ac798557e24a4e2d133a647f0e6fd039b6c4cde93013c" gracePeriod=30 Mar 11 01:18:04 crc kubenswrapper[4744]: I0311 01:18:04.511982 4744 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="e299a80d-c2a8-4190-b2cb-a404ee99b9f5" containerName="ceilometer-notification-agent" containerID="cri-o://26f1374ee9bcb35d8f9c8b0d52f93d739569de3c5aa6c00bda3ffedc20a99f38" gracePeriod=30 Mar 11 01:18:04 crc kubenswrapper[4744]: I0311 01:18:04.512178 4744 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="e299a80d-c2a8-4190-b2cb-a404ee99b9f5" containerName="sg-core" containerID="cri-o://27d321a2123e96f7f11cd7464ce98813df92a1151e05ab39f0c79b6c42f650c7" gracePeriod=30 Mar 11 01:18:04 crc kubenswrapper[4744]: I0311 01:18:04.567962 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-t6h25\" (UniqueName: \"kubernetes.io/projected/29b550d6-9fea-497a-84d0-4f05d34f3013-kube-api-access-t6h25\") pod \"29b550d6-9fea-497a-84d0-4f05d34f3013\" (UID: \"29b550d6-9fea-497a-84d0-4f05d34f3013\") " Mar 11 01:18:04 crc 
kubenswrapper[4744]: I0311 01:18:04.572409 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/29b550d6-9fea-497a-84d0-4f05d34f3013-kube-api-access-t6h25" (OuterVolumeSpecName: "kube-api-access-t6h25") pod "29b550d6-9fea-497a-84d0-4f05d34f3013" (UID: "29b550d6-9fea-497a-84d0-4f05d34f3013"). InnerVolumeSpecName "kube-api-access-t6h25". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 01:18:04 crc kubenswrapper[4744]: I0311 01:18:04.670988 4744 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-t6h25\" (UniqueName: \"kubernetes.io/projected/29b550d6-9fea-497a-84d0-4f05d34f3013-kube-api-access-t6h25\") on node \"crc\" DevicePath \"\"" Mar 11 01:18:04 crc kubenswrapper[4744]: I0311 01:18:04.954828 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Mar 11 01:18:05 crc kubenswrapper[4744]: I0311 01:18:05.087458 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"cc403516-137f-4bfb-badf-89b13ff0468f","Type":"ContainerStarted","Data":"994e2dcca379b67f8c0d3e0ed206d13d02031ef6c1b9c3b81ae18aeb2f10e9a7"} Mar 11 01:18:05 crc kubenswrapper[4744]: I0311 01:18:05.089657 4744 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29553198-skc7b" Mar 11 01:18:05 crc kubenswrapper[4744]: I0311 01:18:05.089666 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29553198-skc7b" event={"ID":"29b550d6-9fea-497a-84d0-4f05d34f3013","Type":"ContainerDied","Data":"75032575afa2166874b8eb9463b03efb3e024a1fe83e26b107f7f6b23d5189b3"} Mar 11 01:18:05 crc kubenswrapper[4744]: I0311 01:18:05.089719 4744 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="75032575afa2166874b8eb9463b03efb3e024a1fe83e26b107f7f6b23d5189b3" Mar 11 01:18:05 crc kubenswrapper[4744]: I0311 01:18:05.099737 4744 generic.go:334] "Generic (PLEG): container finished" podID="e299a80d-c2a8-4190-b2cb-a404ee99b9f5" containerID="941b0df49d3da04acb2ac798557e24a4e2d133a647f0e6fd039b6c4cde93013c" exitCode=0 Mar 11 01:18:05 crc kubenswrapper[4744]: I0311 01:18:05.099786 4744 generic.go:334] "Generic (PLEG): container finished" podID="e299a80d-c2a8-4190-b2cb-a404ee99b9f5" containerID="27d321a2123e96f7f11cd7464ce98813df92a1151e05ab39f0c79b6c42f650c7" exitCode=2 Mar 11 01:18:05 crc kubenswrapper[4744]: I0311 01:18:05.099802 4744 generic.go:334] "Generic (PLEG): container finished" podID="e299a80d-c2a8-4190-b2cb-a404ee99b9f5" containerID="b7274dc9b470e8a8adf0b1898dfd5c73a4f7caa83f9024a193d2a3365a9703bd" exitCode=0 Mar 11 01:18:05 crc kubenswrapper[4744]: I0311 01:18:05.099836 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e299a80d-c2a8-4190-b2cb-a404ee99b9f5","Type":"ContainerDied","Data":"941b0df49d3da04acb2ac798557e24a4e2d133a647f0e6fd039b6c4cde93013c"} Mar 11 01:18:05 crc kubenswrapper[4744]: I0311 01:18:05.099876 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e299a80d-c2a8-4190-b2cb-a404ee99b9f5","Type":"ContainerDied","Data":"27d321a2123e96f7f11cd7464ce98813df92a1151e05ab39f0c79b6c42f650c7"} Mar 11 01:18:05 crc 
kubenswrapper[4744]: I0311 01:18:05.099896 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e299a80d-c2a8-4190-b2cb-a404ee99b9f5","Type":"ContainerDied","Data":"b7274dc9b470e8a8adf0b1898dfd5c73a4f7caa83f9024a193d2a3365a9703bd"} Mar 11 01:18:05 crc kubenswrapper[4744]: I0311 01:18:05.561174 4744 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29553192-ggv6v"] Mar 11 01:18:05 crc kubenswrapper[4744]: I0311 01:18:05.567750 4744 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29553192-ggv6v"] Mar 11 01:18:05 crc kubenswrapper[4744]: I0311 01:18:05.602099 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell1-conductor-0" Mar 11 01:18:05 crc kubenswrapper[4744]: I0311 01:18:05.651456 4744 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Mar 11 01:18:05 crc kubenswrapper[4744]: I0311 01:18:05.655104 4744 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Mar 11 01:18:05 crc kubenswrapper[4744]: I0311 01:18:05.985035 4744 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="adfdc5e7-2770-4cdb-ad17-194d0ca0fa59" path="/var/lib/kubelet/pods/adfdc5e7-2770-4cdb-ad17-194d0ca0fa59/volumes" Mar 11 01:18:05 crc kubenswrapper[4744]: I0311 01:18:05.985654 4744 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="dbc15fc8-0b29-47ad-9ce1-38097df24920" path="/var/lib/kubelet/pods/dbc15fc8-0b29-47ad-9ce1-38097df24920/volumes" Mar 11 01:18:06 crc kubenswrapper[4744]: I0311 01:18:06.110011 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"cc403516-137f-4bfb-badf-89b13ff0468f","Type":"ContainerStarted","Data":"97d144a3e5f0e96a87be2e26bd2f1bc7509fe6d661b46663a89c2c893e10f70a"} Mar 11 01:18:06 crc kubenswrapper[4744]: I0311 
01:18:06.110102 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/kube-state-metrics-0" Mar 11 01:18:06 crc kubenswrapper[4744]: I0311 01:18:06.133137 4744 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/kube-state-metrics-0" podStartSLOduration=1.7220356140000002 podStartE2EDuration="2.133118236s" podCreationTimestamp="2026-03-11 01:18:04 +0000 UTC" firstStartedPulling="2026-03-11 01:18:04.964106447 +0000 UTC m=+1441.768324052" lastFinishedPulling="2026-03-11 01:18:05.375189049 +0000 UTC m=+1442.179406674" observedRunningTime="2026-03-11 01:18:06.128559295 +0000 UTC m=+1442.932776930" watchObservedRunningTime="2026-03-11 01:18:06.133118236 +0000 UTC m=+1442.937335851" Mar 11 01:18:06 crc kubenswrapper[4744]: I0311 01:18:06.573926 4744 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 11 01:18:06 crc kubenswrapper[4744]: I0311 01:18:06.660939 4744 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Mar 11 01:18:06 crc kubenswrapper[4744]: I0311 01:18:06.703068 4744 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Mar 11 01:18:06 crc kubenswrapper[4744]: I0311 01:18:06.721980 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e299a80d-c2a8-4190-b2cb-a404ee99b9f5-combined-ca-bundle\") pod \"e299a80d-c2a8-4190-b2cb-a404ee99b9f5\" (UID: \"e299a80d-c2a8-4190-b2cb-a404ee99b9f5\") " Mar 11 01:18:06 crc kubenswrapper[4744]: I0311 01:18:06.722061 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e299a80d-c2a8-4190-b2cb-a404ee99b9f5-config-data\") pod \"e299a80d-c2a8-4190-b2cb-a404ee99b9f5\" (UID: \"e299a80d-c2a8-4190-b2cb-a404ee99b9f5\") " Mar 11 01:18:06 crc 
kubenswrapper[4744]: I0311 01:18:06.722151 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e299a80d-c2a8-4190-b2cb-a404ee99b9f5-log-httpd\") pod \"e299a80d-c2a8-4190-b2cb-a404ee99b9f5\" (UID: \"e299a80d-c2a8-4190-b2cb-a404ee99b9f5\") " Mar 11 01:18:06 crc kubenswrapper[4744]: I0311 01:18:06.722205 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sjlvr\" (UniqueName: \"kubernetes.io/projected/e299a80d-c2a8-4190-b2cb-a404ee99b9f5-kube-api-access-sjlvr\") pod \"e299a80d-c2a8-4190-b2cb-a404ee99b9f5\" (UID: \"e299a80d-c2a8-4190-b2cb-a404ee99b9f5\") " Mar 11 01:18:06 crc kubenswrapper[4744]: I0311 01:18:06.722260 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e299a80d-c2a8-4190-b2cb-a404ee99b9f5-scripts\") pod \"e299a80d-c2a8-4190-b2cb-a404ee99b9f5\" (UID: \"e299a80d-c2a8-4190-b2cb-a404ee99b9f5\") " Mar 11 01:18:06 crc kubenswrapper[4744]: I0311 01:18:06.722372 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/e299a80d-c2a8-4190-b2cb-a404ee99b9f5-sg-core-conf-yaml\") pod \"e299a80d-c2a8-4190-b2cb-a404ee99b9f5\" (UID: \"e299a80d-c2a8-4190-b2cb-a404ee99b9f5\") " Mar 11 01:18:06 crc kubenswrapper[4744]: I0311 01:18:06.722436 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e299a80d-c2a8-4190-b2cb-a404ee99b9f5-run-httpd\") pod \"e299a80d-c2a8-4190-b2cb-a404ee99b9f5\" (UID: \"e299a80d-c2a8-4190-b2cb-a404ee99b9f5\") " Mar 11 01:18:06 crc kubenswrapper[4744]: I0311 01:18:06.723747 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e299a80d-c2a8-4190-b2cb-a404ee99b9f5-run-httpd" (OuterVolumeSpecName: "run-httpd") pod 
"e299a80d-c2a8-4190-b2cb-a404ee99b9f5" (UID: "e299a80d-c2a8-4190-b2cb-a404ee99b9f5"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 11 01:18:06 crc kubenswrapper[4744]: I0311 01:18:06.724124 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e299a80d-c2a8-4190-b2cb-a404ee99b9f5-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "e299a80d-c2a8-4190-b2cb-a404ee99b9f5" (UID: "e299a80d-c2a8-4190-b2cb-a404ee99b9f5"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 11 01:18:06 crc kubenswrapper[4744]: I0311 01:18:06.729422 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e299a80d-c2a8-4190-b2cb-a404ee99b9f5-scripts" (OuterVolumeSpecName: "scripts") pod "e299a80d-c2a8-4190-b2cb-a404ee99b9f5" (UID: "e299a80d-c2a8-4190-b2cb-a404ee99b9f5"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 01:18:06 crc kubenswrapper[4744]: I0311 01:18:06.730685 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e299a80d-c2a8-4190-b2cb-a404ee99b9f5-kube-api-access-sjlvr" (OuterVolumeSpecName: "kube-api-access-sjlvr") pod "e299a80d-c2a8-4190-b2cb-a404ee99b9f5" (UID: "e299a80d-c2a8-4190-b2cb-a404ee99b9f5"). InnerVolumeSpecName "kube-api-access-sjlvr". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 01:18:06 crc kubenswrapper[4744]: I0311 01:18:06.733797 4744 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="0160a3e2-e1dd-4526-9280-1645846cee12" containerName="nova-api-api" probeResult="failure" output="Get \"http://10.217.0.198:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 11 01:18:06 crc kubenswrapper[4744]: I0311 01:18:06.734033 4744 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="0160a3e2-e1dd-4526-9280-1645846cee12" containerName="nova-api-log" probeResult="failure" output="Get \"http://10.217.0.198:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 11 01:18:06 crc kubenswrapper[4744]: I0311 01:18:06.770368 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e299a80d-c2a8-4190-b2cb-a404ee99b9f5-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "e299a80d-c2a8-4190-b2cb-a404ee99b9f5" (UID: "e299a80d-c2a8-4190-b2cb-a404ee99b9f5"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 01:18:06 crc kubenswrapper[4744]: I0311 01:18:06.822564 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e299a80d-c2a8-4190-b2cb-a404ee99b9f5-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "e299a80d-c2a8-4190-b2cb-a404ee99b9f5" (UID: "e299a80d-c2a8-4190-b2cb-a404ee99b9f5"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 01:18:06 crc kubenswrapper[4744]: I0311 01:18:06.830984 4744 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e299a80d-c2a8-4190-b2cb-a404ee99b9f5-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 11 01:18:06 crc kubenswrapper[4744]: I0311 01:18:06.831053 4744 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e299a80d-c2a8-4190-b2cb-a404ee99b9f5-log-httpd\") on node \"crc\" DevicePath \"\"" Mar 11 01:18:06 crc kubenswrapper[4744]: I0311 01:18:06.831072 4744 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sjlvr\" (UniqueName: \"kubernetes.io/projected/e299a80d-c2a8-4190-b2cb-a404ee99b9f5-kube-api-access-sjlvr\") on node \"crc\" DevicePath \"\"" Mar 11 01:18:06 crc kubenswrapper[4744]: I0311 01:18:06.831134 4744 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e299a80d-c2a8-4190-b2cb-a404ee99b9f5-scripts\") on node \"crc\" DevicePath \"\"" Mar 11 01:18:06 crc kubenswrapper[4744]: I0311 01:18:06.831156 4744 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/e299a80d-c2a8-4190-b2cb-a404ee99b9f5-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Mar 11 01:18:06 crc kubenswrapper[4744]: I0311 01:18:06.831172 4744 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e299a80d-c2a8-4190-b2cb-a404ee99b9f5-run-httpd\") on node \"crc\" DevicePath \"\"" Mar 11 01:18:06 crc kubenswrapper[4744]: I0311 01:18:06.849377 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e299a80d-c2a8-4190-b2cb-a404ee99b9f5-config-data" (OuterVolumeSpecName: "config-data") pod "e299a80d-c2a8-4190-b2cb-a404ee99b9f5" (UID: "e299a80d-c2a8-4190-b2cb-a404ee99b9f5"). 
InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 01:18:06 crc kubenswrapper[4744]: I0311 01:18:06.932739 4744 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e299a80d-c2a8-4190-b2cb-a404ee99b9f5-config-data\") on node \"crc\" DevicePath \"\"" Mar 11 01:18:07 crc kubenswrapper[4744]: I0311 01:18:07.124477 4744 generic.go:334] "Generic (PLEG): container finished" podID="e299a80d-c2a8-4190-b2cb-a404ee99b9f5" containerID="26f1374ee9bcb35d8f9c8b0d52f93d739569de3c5aa6c00bda3ffedc20a99f38" exitCode=0 Mar 11 01:18:07 crc kubenswrapper[4744]: I0311 01:18:07.125219 4744 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 11 01:18:07 crc kubenswrapper[4744]: I0311 01:18:07.130492 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e299a80d-c2a8-4190-b2cb-a404ee99b9f5","Type":"ContainerDied","Data":"26f1374ee9bcb35d8f9c8b0d52f93d739569de3c5aa6c00bda3ffedc20a99f38"} Mar 11 01:18:07 crc kubenswrapper[4744]: I0311 01:18:07.130561 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e299a80d-c2a8-4190-b2cb-a404ee99b9f5","Type":"ContainerDied","Data":"1667b5c3c910322929cfee772eae6ea8b39545b4538631bae274cfcadaaceb33"} Mar 11 01:18:07 crc kubenswrapper[4744]: I0311 01:18:07.130601 4744 scope.go:117] "RemoveContainer" containerID="941b0df49d3da04acb2ac798557e24a4e2d133a647f0e6fd039b6c4cde93013c" Mar 11 01:18:07 crc kubenswrapper[4744]: I0311 01:18:07.195008 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Mar 11 01:18:07 crc kubenswrapper[4744]: I0311 01:18:07.209988 4744 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 11 01:18:07 crc kubenswrapper[4744]: I0311 01:18:07.227744 4744 scope.go:117] "RemoveContainer" 
containerID="27d321a2123e96f7f11cd7464ce98813df92a1151e05ab39f0c79b6c42f650c7" Mar 11 01:18:07 crc kubenswrapper[4744]: I0311 01:18:07.250958 4744 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Mar 11 01:18:07 crc kubenswrapper[4744]: I0311 01:18:07.251701 4744 scope.go:117] "RemoveContainer" containerID="26f1374ee9bcb35d8f9c8b0d52f93d739569de3c5aa6c00bda3ffedc20a99f38" Mar 11 01:18:07 crc kubenswrapper[4744]: I0311 01:18:07.262552 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Mar 11 01:18:07 crc kubenswrapper[4744]: E0311 01:18:07.263091 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e299a80d-c2a8-4190-b2cb-a404ee99b9f5" containerName="proxy-httpd" Mar 11 01:18:07 crc kubenswrapper[4744]: I0311 01:18:07.263117 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="e299a80d-c2a8-4190-b2cb-a404ee99b9f5" containerName="proxy-httpd" Mar 11 01:18:07 crc kubenswrapper[4744]: E0311 01:18:07.263150 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e299a80d-c2a8-4190-b2cb-a404ee99b9f5" containerName="ceilometer-notification-agent" Mar 11 01:18:07 crc kubenswrapper[4744]: I0311 01:18:07.263161 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="e299a80d-c2a8-4190-b2cb-a404ee99b9f5" containerName="ceilometer-notification-agent" Mar 11 01:18:07 crc kubenswrapper[4744]: E0311 01:18:07.263175 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e299a80d-c2a8-4190-b2cb-a404ee99b9f5" containerName="ceilometer-central-agent" Mar 11 01:18:07 crc kubenswrapper[4744]: I0311 01:18:07.263185 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="e299a80d-c2a8-4190-b2cb-a404ee99b9f5" containerName="ceilometer-central-agent" Mar 11 01:18:07 crc kubenswrapper[4744]: E0311 01:18:07.263209 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="29b550d6-9fea-497a-84d0-4f05d34f3013" containerName="oc" Mar 11 01:18:07 crc 
kubenswrapper[4744]: I0311 01:18:07.263217 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="29b550d6-9fea-497a-84d0-4f05d34f3013" containerName="oc" Mar 11 01:18:07 crc kubenswrapper[4744]: E0311 01:18:07.263232 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e299a80d-c2a8-4190-b2cb-a404ee99b9f5" containerName="sg-core" Mar 11 01:18:07 crc kubenswrapper[4744]: I0311 01:18:07.263240 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="e299a80d-c2a8-4190-b2cb-a404ee99b9f5" containerName="sg-core" Mar 11 01:18:07 crc kubenswrapper[4744]: I0311 01:18:07.264096 4744 memory_manager.go:354] "RemoveStaleState removing state" podUID="e299a80d-c2a8-4190-b2cb-a404ee99b9f5" containerName="proxy-httpd" Mar 11 01:18:07 crc kubenswrapper[4744]: I0311 01:18:07.264136 4744 memory_manager.go:354] "RemoveStaleState removing state" podUID="29b550d6-9fea-497a-84d0-4f05d34f3013" containerName="oc" Mar 11 01:18:07 crc kubenswrapper[4744]: I0311 01:18:07.264151 4744 memory_manager.go:354] "RemoveStaleState removing state" podUID="e299a80d-c2a8-4190-b2cb-a404ee99b9f5" containerName="ceilometer-notification-agent" Mar 11 01:18:07 crc kubenswrapper[4744]: I0311 01:18:07.264161 4744 memory_manager.go:354] "RemoveStaleState removing state" podUID="e299a80d-c2a8-4190-b2cb-a404ee99b9f5" containerName="ceilometer-central-agent" Mar 11 01:18:07 crc kubenswrapper[4744]: I0311 01:18:07.264184 4744 memory_manager.go:354] "RemoveStaleState removing state" podUID="e299a80d-c2a8-4190-b2cb-a404ee99b9f5" containerName="sg-core" Mar 11 01:18:07 crc kubenswrapper[4744]: I0311 01:18:07.266482 4744 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 11 01:18:07 crc kubenswrapper[4744]: I0311 01:18:07.269882 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 11 01:18:07 crc kubenswrapper[4744]: I0311 01:18:07.272373 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Mar 11 01:18:07 crc kubenswrapper[4744]: I0311 01:18:07.272586 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Mar 11 01:18:07 crc kubenswrapper[4744]: I0311 01:18:07.272605 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ceilometer-internal-svc" Mar 11 01:18:07 crc kubenswrapper[4744]: I0311 01:18:07.275143 4744 scope.go:117] "RemoveContainer" containerID="b7274dc9b470e8a8adf0b1898dfd5c73a4f7caa83f9024a193d2a3365a9703bd" Mar 11 01:18:07 crc kubenswrapper[4744]: I0311 01:18:07.299450 4744 scope.go:117] "RemoveContainer" containerID="941b0df49d3da04acb2ac798557e24a4e2d133a647f0e6fd039b6c4cde93013c" Mar 11 01:18:07 crc kubenswrapper[4744]: E0311 01:18:07.299946 4744 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"941b0df49d3da04acb2ac798557e24a4e2d133a647f0e6fd039b6c4cde93013c\": container with ID starting with 941b0df49d3da04acb2ac798557e24a4e2d133a647f0e6fd039b6c4cde93013c not found: ID does not exist" containerID="941b0df49d3da04acb2ac798557e24a4e2d133a647f0e6fd039b6c4cde93013c" Mar 11 01:18:07 crc kubenswrapper[4744]: I0311 01:18:07.299981 4744 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"941b0df49d3da04acb2ac798557e24a4e2d133a647f0e6fd039b6c4cde93013c"} err="failed to get container status \"941b0df49d3da04acb2ac798557e24a4e2d133a647f0e6fd039b6c4cde93013c\": rpc error: code = NotFound desc = could not find container \"941b0df49d3da04acb2ac798557e24a4e2d133a647f0e6fd039b6c4cde93013c\": 
container with ID starting with 941b0df49d3da04acb2ac798557e24a4e2d133a647f0e6fd039b6c4cde93013c not found: ID does not exist" Mar 11 01:18:07 crc kubenswrapper[4744]: I0311 01:18:07.300009 4744 scope.go:117] "RemoveContainer" containerID="27d321a2123e96f7f11cd7464ce98813df92a1151e05ab39f0c79b6c42f650c7" Mar 11 01:18:07 crc kubenswrapper[4744]: E0311 01:18:07.301137 4744 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"27d321a2123e96f7f11cd7464ce98813df92a1151e05ab39f0c79b6c42f650c7\": container with ID starting with 27d321a2123e96f7f11cd7464ce98813df92a1151e05ab39f0c79b6c42f650c7 not found: ID does not exist" containerID="27d321a2123e96f7f11cd7464ce98813df92a1151e05ab39f0c79b6c42f650c7" Mar 11 01:18:07 crc kubenswrapper[4744]: I0311 01:18:07.301164 4744 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"27d321a2123e96f7f11cd7464ce98813df92a1151e05ab39f0c79b6c42f650c7"} err="failed to get container status \"27d321a2123e96f7f11cd7464ce98813df92a1151e05ab39f0c79b6c42f650c7\": rpc error: code = NotFound desc = could not find container \"27d321a2123e96f7f11cd7464ce98813df92a1151e05ab39f0c79b6c42f650c7\": container with ID starting with 27d321a2123e96f7f11cd7464ce98813df92a1151e05ab39f0c79b6c42f650c7 not found: ID does not exist" Mar 11 01:18:07 crc kubenswrapper[4744]: I0311 01:18:07.301184 4744 scope.go:117] "RemoveContainer" containerID="26f1374ee9bcb35d8f9c8b0d52f93d739569de3c5aa6c00bda3ffedc20a99f38" Mar 11 01:18:07 crc kubenswrapper[4744]: E0311 01:18:07.303268 4744 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"26f1374ee9bcb35d8f9c8b0d52f93d739569de3c5aa6c00bda3ffedc20a99f38\": container with ID starting with 26f1374ee9bcb35d8f9c8b0d52f93d739569de3c5aa6c00bda3ffedc20a99f38 not found: ID does not exist" 
containerID="26f1374ee9bcb35d8f9c8b0d52f93d739569de3c5aa6c00bda3ffedc20a99f38" Mar 11 01:18:07 crc kubenswrapper[4744]: I0311 01:18:07.303318 4744 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"26f1374ee9bcb35d8f9c8b0d52f93d739569de3c5aa6c00bda3ffedc20a99f38"} err="failed to get container status \"26f1374ee9bcb35d8f9c8b0d52f93d739569de3c5aa6c00bda3ffedc20a99f38\": rpc error: code = NotFound desc = could not find container \"26f1374ee9bcb35d8f9c8b0d52f93d739569de3c5aa6c00bda3ffedc20a99f38\": container with ID starting with 26f1374ee9bcb35d8f9c8b0d52f93d739569de3c5aa6c00bda3ffedc20a99f38 not found: ID does not exist" Mar 11 01:18:07 crc kubenswrapper[4744]: I0311 01:18:07.303351 4744 scope.go:117] "RemoveContainer" containerID="b7274dc9b470e8a8adf0b1898dfd5c73a4f7caa83f9024a193d2a3365a9703bd" Mar 11 01:18:07 crc kubenswrapper[4744]: E0311 01:18:07.303679 4744 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b7274dc9b470e8a8adf0b1898dfd5c73a4f7caa83f9024a193d2a3365a9703bd\": container with ID starting with b7274dc9b470e8a8adf0b1898dfd5c73a4f7caa83f9024a193d2a3365a9703bd not found: ID does not exist" containerID="b7274dc9b470e8a8adf0b1898dfd5c73a4f7caa83f9024a193d2a3365a9703bd" Mar 11 01:18:07 crc kubenswrapper[4744]: I0311 01:18:07.303742 4744 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b7274dc9b470e8a8adf0b1898dfd5c73a4f7caa83f9024a193d2a3365a9703bd"} err="failed to get container status \"b7274dc9b470e8a8adf0b1898dfd5c73a4f7caa83f9024a193d2a3365a9703bd\": rpc error: code = NotFound desc = could not find container \"b7274dc9b470e8a8adf0b1898dfd5c73a4f7caa83f9024a193d2a3365a9703bd\": container with ID starting with b7274dc9b470e8a8adf0b1898dfd5c73a4f7caa83f9024a193d2a3365a9703bd not found: ID does not exist" Mar 11 01:18:07 crc kubenswrapper[4744]: I0311 01:18:07.339291 4744 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1bee1784-ff56-4039-88d3-712a12673f83-config-data\") pod \"ceilometer-0\" (UID: \"1bee1784-ff56-4039-88d3-712a12673f83\") " pod="openstack/ceilometer-0" Mar 11 01:18:07 crc kubenswrapper[4744]: I0311 01:18:07.339400 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1bee1784-ff56-4039-88d3-712a12673f83-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"1bee1784-ff56-4039-88d3-712a12673f83\") " pod="openstack/ceilometer-0" Mar 11 01:18:07 crc kubenswrapper[4744]: I0311 01:18:07.339531 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q4hqr\" (UniqueName: \"kubernetes.io/projected/1bee1784-ff56-4039-88d3-712a12673f83-kube-api-access-q4hqr\") pod \"ceilometer-0\" (UID: \"1bee1784-ff56-4039-88d3-712a12673f83\") " pod="openstack/ceilometer-0" Mar 11 01:18:07 crc kubenswrapper[4744]: I0311 01:18:07.339573 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1bee1784-ff56-4039-88d3-712a12673f83-scripts\") pod \"ceilometer-0\" (UID: \"1bee1784-ff56-4039-88d3-712a12673f83\") " pod="openstack/ceilometer-0" Mar 11 01:18:07 crc kubenswrapper[4744]: I0311 01:18:07.339601 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1bee1784-ff56-4039-88d3-712a12673f83-run-httpd\") pod \"ceilometer-0\" (UID: \"1bee1784-ff56-4039-88d3-712a12673f83\") " pod="openstack/ceilometer-0" Mar 11 01:18:07 crc kubenswrapper[4744]: I0311 01:18:07.339624 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: 
\"kubernetes.io/empty-dir/1bee1784-ff56-4039-88d3-712a12673f83-log-httpd\") pod \"ceilometer-0\" (UID: \"1bee1784-ff56-4039-88d3-712a12673f83\") " pod="openstack/ceilometer-0" Mar 11 01:18:07 crc kubenswrapper[4744]: I0311 01:18:07.339966 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/1bee1784-ff56-4039-88d3-712a12673f83-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"1bee1784-ff56-4039-88d3-712a12673f83\") " pod="openstack/ceilometer-0" Mar 11 01:18:07 crc kubenswrapper[4744]: I0311 01:18:07.340026 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/1bee1784-ff56-4039-88d3-712a12673f83-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"1bee1784-ff56-4039-88d3-712a12673f83\") " pod="openstack/ceilometer-0" Mar 11 01:18:07 crc kubenswrapper[4744]: I0311 01:18:07.422196 4744 scope.go:117] "RemoveContainer" containerID="9c6dc98a464ea9f49c43579ba760e067869199edfe8854675ae6108fd84e037d" Mar 11 01:18:07 crc kubenswrapper[4744]: I0311 01:18:07.441732 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/1bee1784-ff56-4039-88d3-712a12673f83-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"1bee1784-ff56-4039-88d3-712a12673f83\") " pod="openstack/ceilometer-0" Mar 11 01:18:07 crc kubenswrapper[4744]: I0311 01:18:07.441779 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/1bee1784-ff56-4039-88d3-712a12673f83-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"1bee1784-ff56-4039-88d3-712a12673f83\") " pod="openstack/ceilometer-0" Mar 11 01:18:07 crc kubenswrapper[4744]: I0311 01:18:07.441892 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"config-data\" (UniqueName: \"kubernetes.io/secret/1bee1784-ff56-4039-88d3-712a12673f83-config-data\") pod \"ceilometer-0\" (UID: \"1bee1784-ff56-4039-88d3-712a12673f83\") " pod="openstack/ceilometer-0" Mar 11 01:18:07 crc kubenswrapper[4744]: I0311 01:18:07.441963 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1bee1784-ff56-4039-88d3-712a12673f83-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"1bee1784-ff56-4039-88d3-712a12673f83\") " pod="openstack/ceilometer-0" Mar 11 01:18:07 crc kubenswrapper[4744]: I0311 01:18:07.442026 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q4hqr\" (UniqueName: \"kubernetes.io/projected/1bee1784-ff56-4039-88d3-712a12673f83-kube-api-access-q4hqr\") pod \"ceilometer-0\" (UID: \"1bee1784-ff56-4039-88d3-712a12673f83\") " pod="openstack/ceilometer-0" Mar 11 01:18:07 crc kubenswrapper[4744]: I0311 01:18:07.442065 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1bee1784-ff56-4039-88d3-712a12673f83-scripts\") pod \"ceilometer-0\" (UID: \"1bee1784-ff56-4039-88d3-712a12673f83\") " pod="openstack/ceilometer-0" Mar 11 01:18:07 crc kubenswrapper[4744]: I0311 01:18:07.442085 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1bee1784-ff56-4039-88d3-712a12673f83-log-httpd\") pod \"ceilometer-0\" (UID: \"1bee1784-ff56-4039-88d3-712a12673f83\") " pod="openstack/ceilometer-0" Mar 11 01:18:07 crc kubenswrapper[4744]: I0311 01:18:07.442106 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1bee1784-ff56-4039-88d3-712a12673f83-run-httpd\") pod \"ceilometer-0\" (UID: \"1bee1784-ff56-4039-88d3-712a12673f83\") " pod="openstack/ceilometer-0" Mar 11 01:18:07 crc 
kubenswrapper[4744]: I0311 01:18:07.443918 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1bee1784-ff56-4039-88d3-712a12673f83-run-httpd\") pod \"ceilometer-0\" (UID: \"1bee1784-ff56-4039-88d3-712a12673f83\") " pod="openstack/ceilometer-0" Mar 11 01:18:07 crc kubenswrapper[4744]: I0311 01:18:07.444788 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1bee1784-ff56-4039-88d3-712a12673f83-log-httpd\") pod \"ceilometer-0\" (UID: \"1bee1784-ff56-4039-88d3-712a12673f83\") " pod="openstack/ceilometer-0" Mar 11 01:18:07 crc kubenswrapper[4744]: I0311 01:18:07.450137 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1bee1784-ff56-4039-88d3-712a12673f83-scripts\") pod \"ceilometer-0\" (UID: \"1bee1784-ff56-4039-88d3-712a12673f83\") " pod="openstack/ceilometer-0" Mar 11 01:18:07 crc kubenswrapper[4744]: I0311 01:18:07.451344 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1bee1784-ff56-4039-88d3-712a12673f83-config-data\") pod \"ceilometer-0\" (UID: \"1bee1784-ff56-4039-88d3-712a12673f83\") " pod="openstack/ceilometer-0" Mar 11 01:18:07 crc kubenswrapper[4744]: I0311 01:18:07.451477 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/1bee1784-ff56-4039-88d3-712a12673f83-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"1bee1784-ff56-4039-88d3-712a12673f83\") " pod="openstack/ceilometer-0" Mar 11 01:18:07 crc kubenswrapper[4744]: I0311 01:18:07.454389 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/1bee1784-ff56-4039-88d3-712a12673f83-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: 
\"1bee1784-ff56-4039-88d3-712a12673f83\") " pod="openstack/ceilometer-0" Mar 11 01:18:07 crc kubenswrapper[4744]: I0311 01:18:07.456720 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1bee1784-ff56-4039-88d3-712a12673f83-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"1bee1784-ff56-4039-88d3-712a12673f83\") " pod="openstack/ceilometer-0" Mar 11 01:18:07 crc kubenswrapper[4744]: I0311 01:18:07.465837 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q4hqr\" (UniqueName: \"kubernetes.io/projected/1bee1784-ff56-4039-88d3-712a12673f83-kube-api-access-q4hqr\") pod \"ceilometer-0\" (UID: \"1bee1784-ff56-4039-88d3-712a12673f83\") " pod="openstack/ceilometer-0" Mar 11 01:18:07 crc kubenswrapper[4744]: I0311 01:18:07.591622 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 11 01:18:07 crc kubenswrapper[4744]: I0311 01:18:07.944115 4744 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/kube-state-metrics-0" podUID="dbc15fc8-0b29-47ad-9ce1-38097df24920" containerName="kube-state-metrics" probeResult="failure" output="Get \"http://10.217.0.108:8081/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 11 01:18:07 crc kubenswrapper[4744]: I0311 01:18:07.985105 4744 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e299a80d-c2a8-4190-b2cb-a404ee99b9f5" path="/var/lib/kubelet/pods/e299a80d-c2a8-4190-b2cb-a404ee99b9f5/volumes" Mar 11 01:18:08 crc kubenswrapper[4744]: I0311 01:18:08.106021 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 11 01:18:08 crc kubenswrapper[4744]: I0311 01:18:08.133255 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"1bee1784-ff56-4039-88d3-712a12673f83","Type":"ContainerStarted","Data":"d411d1627762ad3f660bf03062026d28e1a3dcab9b4f882ef3a07d5933e2a73a"} Mar 11 01:18:09 crc kubenswrapper[4744]: I0311 01:18:09.158703 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"1bee1784-ff56-4039-88d3-712a12673f83","Type":"ContainerStarted","Data":"5c8650307b1a8dfabd69bc4e53acceafe7fc6c4f04b8a68fe4486a9686450aa4"} Mar 11 01:18:10 crc kubenswrapper[4744]: I0311 01:18:10.174141 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"1bee1784-ff56-4039-88d3-712a12673f83","Type":"ContainerStarted","Data":"fe5bf707cc7cffd0ef3b3eec3df56eba54b5f2e86ea757dfaf7a54a13d9b0ca5"} Mar 11 01:18:10 crc kubenswrapper[4744]: I0311 01:18:10.174949 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"1bee1784-ff56-4039-88d3-712a12673f83","Type":"ContainerStarted","Data":"be7d3138b53e64e82042ff5e16061e4d40ad8c23934828c967856a0dbe84411d"} Mar 11 01:18:12 crc kubenswrapper[4744]: I0311 01:18:12.193383 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"1bee1784-ff56-4039-88d3-712a12673f83","Type":"ContainerStarted","Data":"ec4e280db95c84686de34a625cacc26d4cf4bc1dd750752fd8c14d891fe45fef"} Mar 11 01:18:12 crc kubenswrapper[4744]: I0311 01:18:12.193764 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Mar 11 01:18:12 crc kubenswrapper[4744]: I0311 01:18:12.222211 4744 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=1.644873199 podStartE2EDuration="5.22219436s" podCreationTimestamp="2026-03-11 01:18:07 +0000 UTC" firstStartedPulling="2026-03-11 01:18:08.116414075 +0000 UTC m=+1444.920631680" lastFinishedPulling="2026-03-11 01:18:11.693735206 +0000 UTC m=+1448.497952841" observedRunningTime="2026-03-11 
01:18:12.216850435 +0000 UTC m=+1449.021068040" watchObservedRunningTime="2026-03-11 01:18:12.22219436 +0000 UTC m=+1449.026411965" Mar 11 01:18:14 crc kubenswrapper[4744]: I0311 01:18:14.489834 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/kube-state-metrics-0" Mar 11 01:18:15 crc kubenswrapper[4744]: I0311 01:18:15.657582 4744 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Mar 11 01:18:15 crc kubenswrapper[4744]: I0311 01:18:15.658348 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Mar 11 01:18:15 crc kubenswrapper[4744]: I0311 01:18:15.658679 4744 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Mar 11 01:18:15 crc kubenswrapper[4744]: I0311 01:18:15.662211 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Mar 11 01:18:16 crc kubenswrapper[4744]: I0311 01:18:16.241310 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Mar 11 01:18:16 crc kubenswrapper[4744]: I0311 01:18:16.244468 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Mar 11 01:18:16 crc kubenswrapper[4744]: I0311 01:18:16.424977 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-57db588689-85hjj"] Mar 11 01:18:16 crc kubenswrapper[4744]: I0311 01:18:16.426552 4744 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-57db588689-85hjj" Mar 11 01:18:16 crc kubenswrapper[4744]: I0311 01:18:16.457450 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-57db588689-85hjj"] Mar 11 01:18:16 crc kubenswrapper[4744]: I0311 01:18:16.541889 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a90e479a-2c1d-4a55-9f51-eadbc3c0b333-config\") pod \"dnsmasq-dns-57db588689-85hjj\" (UID: \"a90e479a-2c1d-4a55-9f51-eadbc3c0b333\") " pod="openstack/dnsmasq-dns-57db588689-85hjj" Mar 11 01:18:16 crc kubenswrapper[4744]: I0311 01:18:16.541930 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a90e479a-2c1d-4a55-9f51-eadbc3c0b333-dns-svc\") pod \"dnsmasq-dns-57db588689-85hjj\" (UID: \"a90e479a-2c1d-4a55-9f51-eadbc3c0b333\") " pod="openstack/dnsmasq-dns-57db588689-85hjj" Mar 11 01:18:16 crc kubenswrapper[4744]: I0311 01:18:16.541952 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8qxqx\" (UniqueName: \"kubernetes.io/projected/a90e479a-2c1d-4a55-9f51-eadbc3c0b333-kube-api-access-8qxqx\") pod \"dnsmasq-dns-57db588689-85hjj\" (UID: \"a90e479a-2c1d-4a55-9f51-eadbc3c0b333\") " pod="openstack/dnsmasq-dns-57db588689-85hjj" Mar 11 01:18:16 crc kubenswrapper[4744]: I0311 01:18:16.541979 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/a90e479a-2c1d-4a55-9f51-eadbc3c0b333-dns-swift-storage-0\") pod \"dnsmasq-dns-57db588689-85hjj\" (UID: \"a90e479a-2c1d-4a55-9f51-eadbc3c0b333\") " pod="openstack/dnsmasq-dns-57db588689-85hjj" Mar 11 01:18:16 crc kubenswrapper[4744]: I0311 01:18:16.542029 4744 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/a90e479a-2c1d-4a55-9f51-eadbc3c0b333-ovsdbserver-sb\") pod \"dnsmasq-dns-57db588689-85hjj\" (UID: \"a90e479a-2c1d-4a55-9f51-eadbc3c0b333\") " pod="openstack/dnsmasq-dns-57db588689-85hjj" Mar 11 01:18:16 crc kubenswrapper[4744]: I0311 01:18:16.542079 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/a90e479a-2c1d-4a55-9f51-eadbc3c0b333-ovsdbserver-nb\") pod \"dnsmasq-dns-57db588689-85hjj\" (UID: \"a90e479a-2c1d-4a55-9f51-eadbc3c0b333\") " pod="openstack/dnsmasq-dns-57db588689-85hjj" Mar 11 01:18:16 crc kubenswrapper[4744]: I0311 01:18:16.643854 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/a90e479a-2c1d-4a55-9f51-eadbc3c0b333-ovsdbserver-sb\") pod \"dnsmasq-dns-57db588689-85hjj\" (UID: \"a90e479a-2c1d-4a55-9f51-eadbc3c0b333\") " pod="openstack/dnsmasq-dns-57db588689-85hjj" Mar 11 01:18:16 crc kubenswrapper[4744]: I0311 01:18:16.643934 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/a90e479a-2c1d-4a55-9f51-eadbc3c0b333-ovsdbserver-nb\") pod \"dnsmasq-dns-57db588689-85hjj\" (UID: \"a90e479a-2c1d-4a55-9f51-eadbc3c0b333\") " pod="openstack/dnsmasq-dns-57db588689-85hjj" Mar 11 01:18:16 crc kubenswrapper[4744]: I0311 01:18:16.643997 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a90e479a-2c1d-4a55-9f51-eadbc3c0b333-config\") pod \"dnsmasq-dns-57db588689-85hjj\" (UID: \"a90e479a-2c1d-4a55-9f51-eadbc3c0b333\") " pod="openstack/dnsmasq-dns-57db588689-85hjj" Mar 11 01:18:16 crc kubenswrapper[4744]: I0311 01:18:16.644019 4744 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a90e479a-2c1d-4a55-9f51-eadbc3c0b333-dns-svc\") pod \"dnsmasq-dns-57db588689-85hjj\" (UID: \"a90e479a-2c1d-4a55-9f51-eadbc3c0b333\") " pod="openstack/dnsmasq-dns-57db588689-85hjj" Mar 11 01:18:16 crc kubenswrapper[4744]: I0311 01:18:16.644035 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8qxqx\" (UniqueName: \"kubernetes.io/projected/a90e479a-2c1d-4a55-9f51-eadbc3c0b333-kube-api-access-8qxqx\") pod \"dnsmasq-dns-57db588689-85hjj\" (UID: \"a90e479a-2c1d-4a55-9f51-eadbc3c0b333\") " pod="openstack/dnsmasq-dns-57db588689-85hjj" Mar 11 01:18:16 crc kubenswrapper[4744]: I0311 01:18:16.644058 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/a90e479a-2c1d-4a55-9f51-eadbc3c0b333-dns-swift-storage-0\") pod \"dnsmasq-dns-57db588689-85hjj\" (UID: \"a90e479a-2c1d-4a55-9f51-eadbc3c0b333\") " pod="openstack/dnsmasq-dns-57db588689-85hjj" Mar 11 01:18:16 crc kubenswrapper[4744]: I0311 01:18:16.645061 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/a90e479a-2c1d-4a55-9f51-eadbc3c0b333-dns-swift-storage-0\") pod \"dnsmasq-dns-57db588689-85hjj\" (UID: \"a90e479a-2c1d-4a55-9f51-eadbc3c0b333\") " pod="openstack/dnsmasq-dns-57db588689-85hjj" Mar 11 01:18:16 crc kubenswrapper[4744]: I0311 01:18:16.645261 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/a90e479a-2c1d-4a55-9f51-eadbc3c0b333-ovsdbserver-sb\") pod \"dnsmasq-dns-57db588689-85hjj\" (UID: \"a90e479a-2c1d-4a55-9f51-eadbc3c0b333\") " pod="openstack/dnsmasq-dns-57db588689-85hjj" Mar 11 01:18:16 crc kubenswrapper[4744]: I0311 01:18:16.645280 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: 
\"kubernetes.io/configmap/a90e479a-2c1d-4a55-9f51-eadbc3c0b333-dns-svc\") pod \"dnsmasq-dns-57db588689-85hjj\" (UID: \"a90e479a-2c1d-4a55-9f51-eadbc3c0b333\") " pod="openstack/dnsmasq-dns-57db588689-85hjj" Mar 11 01:18:16 crc kubenswrapper[4744]: I0311 01:18:16.645720 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a90e479a-2c1d-4a55-9f51-eadbc3c0b333-config\") pod \"dnsmasq-dns-57db588689-85hjj\" (UID: \"a90e479a-2c1d-4a55-9f51-eadbc3c0b333\") " pod="openstack/dnsmasq-dns-57db588689-85hjj" Mar 11 01:18:16 crc kubenswrapper[4744]: I0311 01:18:16.645745 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/a90e479a-2c1d-4a55-9f51-eadbc3c0b333-ovsdbserver-nb\") pod \"dnsmasq-dns-57db588689-85hjj\" (UID: \"a90e479a-2c1d-4a55-9f51-eadbc3c0b333\") " pod="openstack/dnsmasq-dns-57db588689-85hjj" Mar 11 01:18:16 crc kubenswrapper[4744]: I0311 01:18:16.665692 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8qxqx\" (UniqueName: \"kubernetes.io/projected/a90e479a-2c1d-4a55-9f51-eadbc3c0b333-kube-api-access-8qxqx\") pod \"dnsmasq-dns-57db588689-85hjj\" (UID: \"a90e479a-2c1d-4a55-9f51-eadbc3c0b333\") " pod="openstack/dnsmasq-dns-57db588689-85hjj" Mar 11 01:18:16 crc kubenswrapper[4744]: I0311 01:18:16.764597 4744 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-57db588689-85hjj" Mar 11 01:18:17 crc kubenswrapper[4744]: I0311 01:18:17.252058 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-57db588689-85hjj"] Mar 11 01:18:17 crc kubenswrapper[4744]: I0311 01:18:17.971884 4744 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 11 01:18:17 crc kubenswrapper[4744]: I0311 01:18:17.973557 4744 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="1bee1784-ff56-4039-88d3-712a12673f83" containerName="ceilometer-central-agent" containerID="cri-o://5c8650307b1a8dfabd69bc4e53acceafe7fc6c4f04b8a68fe4486a9686450aa4" gracePeriod=30 Mar 11 01:18:17 crc kubenswrapper[4744]: I0311 01:18:17.973731 4744 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="1bee1784-ff56-4039-88d3-712a12673f83" containerName="proxy-httpd" containerID="cri-o://ec4e280db95c84686de34a625cacc26d4cf4bc1dd750752fd8c14d891fe45fef" gracePeriod=30 Mar 11 01:18:17 crc kubenswrapper[4744]: I0311 01:18:17.973628 4744 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="1bee1784-ff56-4039-88d3-712a12673f83" containerName="sg-core" containerID="cri-o://fe5bf707cc7cffd0ef3b3eec3df56eba54b5f2e86ea757dfaf7a54a13d9b0ca5" gracePeriod=30 Mar 11 01:18:17 crc kubenswrapper[4744]: I0311 01:18:17.973645 4744 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="1bee1784-ff56-4039-88d3-712a12673f83" containerName="ceilometer-notification-agent" containerID="cri-o://be7d3138b53e64e82042ff5e16061e4d40ad8c23934828c967856a0dbe84411d" gracePeriod=30 Mar 11 01:18:18 crc kubenswrapper[4744]: I0311 01:18:18.259688 4744 generic.go:334] "Generic (PLEG): container finished" podID="1bee1784-ff56-4039-88d3-712a12673f83" 
containerID="ec4e280db95c84686de34a625cacc26d4cf4bc1dd750752fd8c14d891fe45fef" exitCode=0 Mar 11 01:18:18 crc kubenswrapper[4744]: I0311 01:18:18.259964 4744 generic.go:334] "Generic (PLEG): container finished" podID="1bee1784-ff56-4039-88d3-712a12673f83" containerID="fe5bf707cc7cffd0ef3b3eec3df56eba54b5f2e86ea757dfaf7a54a13d9b0ca5" exitCode=2 Mar 11 01:18:18 crc kubenswrapper[4744]: I0311 01:18:18.260010 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"1bee1784-ff56-4039-88d3-712a12673f83","Type":"ContainerDied","Data":"ec4e280db95c84686de34a625cacc26d4cf4bc1dd750752fd8c14d891fe45fef"} Mar 11 01:18:18 crc kubenswrapper[4744]: I0311 01:18:18.260041 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"1bee1784-ff56-4039-88d3-712a12673f83","Type":"ContainerDied","Data":"fe5bf707cc7cffd0ef3b3eec3df56eba54b5f2e86ea757dfaf7a54a13d9b0ca5"} Mar 11 01:18:18 crc kubenswrapper[4744]: I0311 01:18:18.261796 4744 generic.go:334] "Generic (PLEG): container finished" podID="a90e479a-2c1d-4a55-9f51-eadbc3c0b333" containerID="2209bfaa2ae01c45df64e0b09d4046efe85b860f64ac4e58bdaf01ff673e8e61" exitCode=0 Mar 11 01:18:18 crc kubenswrapper[4744]: I0311 01:18:18.263239 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57db588689-85hjj" event={"ID":"a90e479a-2c1d-4a55-9f51-eadbc3c0b333","Type":"ContainerDied","Data":"2209bfaa2ae01c45df64e0b09d4046efe85b860f64ac4e58bdaf01ff673e8e61"} Mar 11 01:18:18 crc kubenswrapper[4744]: I0311 01:18:18.263275 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57db588689-85hjj" event={"ID":"a90e479a-2c1d-4a55-9f51-eadbc3c0b333","Type":"ContainerStarted","Data":"86516c5d8e12a17e965bdf690e72c90c2e9beacbec6d679fd93897701dc40234"} Mar 11 01:18:18 crc kubenswrapper[4744]: I0311 01:18:18.750169 4744 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 11 01:18:18 crc kubenswrapper[4744]: I0311 01:18:18.875914 4744 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Mar 11 01:18:18 crc kubenswrapper[4744]: I0311 01:18:18.884590 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-q4hqr\" (UniqueName: \"kubernetes.io/projected/1bee1784-ff56-4039-88d3-712a12673f83-kube-api-access-q4hqr\") pod \"1bee1784-ff56-4039-88d3-712a12673f83\" (UID: \"1bee1784-ff56-4039-88d3-712a12673f83\") " Mar 11 01:18:18 crc kubenswrapper[4744]: I0311 01:18:18.884668 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1bee1784-ff56-4039-88d3-712a12673f83-combined-ca-bundle\") pod \"1bee1784-ff56-4039-88d3-712a12673f83\" (UID: \"1bee1784-ff56-4039-88d3-712a12673f83\") " Mar 11 01:18:18 crc kubenswrapper[4744]: I0311 01:18:18.884701 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1bee1784-ff56-4039-88d3-712a12673f83-config-data\") pod \"1bee1784-ff56-4039-88d3-712a12673f83\" (UID: \"1bee1784-ff56-4039-88d3-712a12673f83\") " Mar 11 01:18:18 crc kubenswrapper[4744]: I0311 01:18:18.884845 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/1bee1784-ff56-4039-88d3-712a12673f83-ceilometer-tls-certs\") pod \"1bee1784-ff56-4039-88d3-712a12673f83\" (UID: \"1bee1784-ff56-4039-88d3-712a12673f83\") " Mar 11 01:18:18 crc kubenswrapper[4744]: I0311 01:18:18.884885 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1bee1784-ff56-4039-88d3-712a12673f83-log-httpd\") pod \"1bee1784-ff56-4039-88d3-712a12673f83\" (UID: \"1bee1784-ff56-4039-88d3-712a12673f83\") " Mar 11 01:18:18 crc 
kubenswrapper[4744]: I0311 01:18:18.884916 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1bee1784-ff56-4039-88d3-712a12673f83-run-httpd\") pod \"1bee1784-ff56-4039-88d3-712a12673f83\" (UID: \"1bee1784-ff56-4039-88d3-712a12673f83\") " Mar 11 01:18:18 crc kubenswrapper[4744]: I0311 01:18:18.884989 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1bee1784-ff56-4039-88d3-712a12673f83-scripts\") pod \"1bee1784-ff56-4039-88d3-712a12673f83\" (UID: \"1bee1784-ff56-4039-88d3-712a12673f83\") " Mar 11 01:18:18 crc kubenswrapper[4744]: I0311 01:18:18.885034 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/1bee1784-ff56-4039-88d3-712a12673f83-sg-core-conf-yaml\") pod \"1bee1784-ff56-4039-88d3-712a12673f83\" (UID: \"1bee1784-ff56-4039-88d3-712a12673f83\") " Mar 11 01:18:18 crc kubenswrapper[4744]: I0311 01:18:18.885756 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1bee1784-ff56-4039-88d3-712a12673f83-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "1bee1784-ff56-4039-88d3-712a12673f83" (UID: "1bee1784-ff56-4039-88d3-712a12673f83"). InnerVolumeSpecName "run-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 11 01:18:18 crc kubenswrapper[4744]: I0311 01:18:18.886243 4744 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1bee1784-ff56-4039-88d3-712a12673f83-run-httpd\") on node \"crc\" DevicePath \"\"" Mar 11 01:18:18 crc kubenswrapper[4744]: I0311 01:18:18.886709 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1bee1784-ff56-4039-88d3-712a12673f83-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "1bee1784-ff56-4039-88d3-712a12673f83" (UID: "1bee1784-ff56-4039-88d3-712a12673f83"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 11 01:18:18 crc kubenswrapper[4744]: I0311 01:18:18.901963 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1bee1784-ff56-4039-88d3-712a12673f83-kube-api-access-q4hqr" (OuterVolumeSpecName: "kube-api-access-q4hqr") pod "1bee1784-ff56-4039-88d3-712a12673f83" (UID: "1bee1784-ff56-4039-88d3-712a12673f83"). InnerVolumeSpecName "kube-api-access-q4hqr". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 01:18:18 crc kubenswrapper[4744]: I0311 01:18:18.910931 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bee1784-ff56-4039-88d3-712a12673f83-scripts" (OuterVolumeSpecName: "scripts") pod "1bee1784-ff56-4039-88d3-712a12673f83" (UID: "1bee1784-ff56-4039-88d3-712a12673f83"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 01:18:18 crc kubenswrapper[4744]: I0311 01:18:18.932479 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bee1784-ff56-4039-88d3-712a12673f83-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "1bee1784-ff56-4039-88d3-712a12673f83" (UID: "1bee1784-ff56-4039-88d3-712a12673f83"). 
InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 01:18:18 crc kubenswrapper[4744]: I0311 01:18:18.974500 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bee1784-ff56-4039-88d3-712a12673f83-ceilometer-tls-certs" (OuterVolumeSpecName: "ceilometer-tls-certs") pod "1bee1784-ff56-4039-88d3-712a12673f83" (UID: "1bee1784-ff56-4039-88d3-712a12673f83"). InnerVolumeSpecName "ceilometer-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 01:18:18 crc kubenswrapper[4744]: I0311 01:18:18.984617 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bee1784-ff56-4039-88d3-712a12673f83-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "1bee1784-ff56-4039-88d3-712a12673f83" (UID: "1bee1784-ff56-4039-88d3-712a12673f83"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 01:18:18 crc kubenswrapper[4744]: I0311 01:18:18.987374 4744 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-q4hqr\" (UniqueName: \"kubernetes.io/projected/1bee1784-ff56-4039-88d3-712a12673f83-kube-api-access-q4hqr\") on node \"crc\" DevicePath \"\"" Mar 11 01:18:18 crc kubenswrapper[4744]: I0311 01:18:18.987404 4744 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1bee1784-ff56-4039-88d3-712a12673f83-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 11 01:18:18 crc kubenswrapper[4744]: I0311 01:18:18.987414 4744 reconciler_common.go:293] "Volume detached for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/1bee1784-ff56-4039-88d3-712a12673f83-ceilometer-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 11 01:18:18 crc kubenswrapper[4744]: I0311 01:18:18.987425 4744 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: 
\"kubernetes.io/empty-dir/1bee1784-ff56-4039-88d3-712a12673f83-log-httpd\") on node \"crc\" DevicePath \"\"" Mar 11 01:18:18 crc kubenswrapper[4744]: I0311 01:18:18.987433 4744 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1bee1784-ff56-4039-88d3-712a12673f83-scripts\") on node \"crc\" DevicePath \"\"" Mar 11 01:18:18 crc kubenswrapper[4744]: I0311 01:18:18.987441 4744 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/1bee1784-ff56-4039-88d3-712a12673f83-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Mar 11 01:18:19 crc kubenswrapper[4744]: I0311 01:18:19.026741 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bee1784-ff56-4039-88d3-712a12673f83-config-data" (OuterVolumeSpecName: "config-data") pod "1bee1784-ff56-4039-88d3-712a12673f83" (UID: "1bee1784-ff56-4039-88d3-712a12673f83"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 01:18:19 crc kubenswrapper[4744]: I0311 01:18:19.090844 4744 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1bee1784-ff56-4039-88d3-712a12673f83-config-data\") on node \"crc\" DevicePath \"\"" Mar 11 01:18:19 crc kubenswrapper[4744]: E0311 01:18:19.167793 4744 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb3533d92_44d2_4d39_98af_b144b8a57d24.slice/crio-conmon-25c1da7394be4605cbf94d9355b0b895fb9fdc9afff53546f8e0d667b5f8079e.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poddc84843f_62da_435f_ab7a_2f0b97cb2ec7.slice/crio-bcc387b1f7c3301e27c6c065a87239db00c147780619df2a1d2b02142eb7d24a.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poddc84843f_62da_435f_ab7a_2f0b97cb2ec7.slice/crio-conmon-bcc387b1f7c3301e27c6c065a87239db00c147780619df2a1d2b02142eb7d24a.scope\": RecentStats: unable to find data in memory cache]" Mar 11 01:18:19 crc kubenswrapper[4744]: I0311 01:18:19.181606 4744 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Mar 11 01:18:19 crc kubenswrapper[4744]: I0311 01:18:19.239017 4744 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Mar 11 01:18:19 crc kubenswrapper[4744]: I0311 01:18:19.271617 4744 generic.go:334] "Generic (PLEG): container finished" podID="dc84843f-62da-435f-ab7a-2f0b97cb2ec7" containerID="bcc387b1f7c3301e27c6c065a87239db00c147780619df2a1d2b02142eb7d24a" exitCode=137 Mar 11 01:18:19 crc kubenswrapper[4744]: I0311 01:18:19.271671 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"dc84843f-62da-435f-ab7a-2f0b97cb2ec7","Type":"ContainerDied","Data":"bcc387b1f7c3301e27c6c065a87239db00c147780619df2a1d2b02142eb7d24a"} Mar 11 01:18:19 crc kubenswrapper[4744]: I0311 01:18:19.271695 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"dc84843f-62da-435f-ab7a-2f0b97cb2ec7","Type":"ContainerDied","Data":"8d33f41d8ca74edf782e308424498846720f14e130498581905dcdd9f5fb6125"} Mar 11 01:18:19 crc kubenswrapper[4744]: I0311 01:18:19.271712 4744 scope.go:117] "RemoveContainer" containerID="bcc387b1f7c3301e27c6c065a87239db00c147780619df2a1d2b02142eb7d24a" Mar 11 01:18:19 crc kubenswrapper[4744]: I0311 01:18:19.271798 4744 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Mar 11 01:18:19 crc kubenswrapper[4744]: I0311 01:18:19.277873 4744 generic.go:334] "Generic (PLEG): container finished" podID="b3533d92-44d2-4d39-98af-b144b8a57d24" containerID="25c1da7394be4605cbf94d9355b0b895fb9fdc9afff53546f8e0d667b5f8079e" exitCode=137 Mar 11 01:18:19 crc kubenswrapper[4744]: I0311 01:18:19.277917 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"b3533d92-44d2-4d39-98af-b144b8a57d24","Type":"ContainerDied","Data":"25c1da7394be4605cbf94d9355b0b895fb9fdc9afff53546f8e0d667b5f8079e"} Mar 11 01:18:19 crc kubenswrapper[4744]: I0311 01:18:19.277938 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"b3533d92-44d2-4d39-98af-b144b8a57d24","Type":"ContainerDied","Data":"d13a0e7a6a5c088f3c4e47b3dec6ecc876f03e4032b9f4b6334c3866f56f6e38"} Mar 11 01:18:19 crc kubenswrapper[4744]: I0311 01:18:19.277988 4744 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Mar 11 01:18:19 crc kubenswrapper[4744]: I0311 01:18:19.292821 4744 generic.go:334] "Generic (PLEG): container finished" podID="1bee1784-ff56-4039-88d3-712a12673f83" containerID="be7d3138b53e64e82042ff5e16061e4d40ad8c23934828c967856a0dbe84411d" exitCode=0 Mar 11 01:18:19 crc kubenswrapper[4744]: I0311 01:18:19.292848 4744 generic.go:334] "Generic (PLEG): container finished" podID="1bee1784-ff56-4039-88d3-712a12673f83" containerID="5c8650307b1a8dfabd69bc4e53acceafe7fc6c4f04b8a68fe4486a9686450aa4" exitCode=0 Mar 11 01:18:19 crc kubenswrapper[4744]: I0311 01:18:19.292947 4744 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 11 01:18:19 crc kubenswrapper[4744]: I0311 01:18:19.293902 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"1bee1784-ff56-4039-88d3-712a12673f83","Type":"ContainerDied","Data":"be7d3138b53e64e82042ff5e16061e4d40ad8c23934828c967856a0dbe84411d"} Mar 11 01:18:19 crc kubenswrapper[4744]: I0311 01:18:19.293936 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"1bee1784-ff56-4039-88d3-712a12673f83","Type":"ContainerDied","Data":"5c8650307b1a8dfabd69bc4e53acceafe7fc6c4f04b8a68fe4486a9686450aa4"} Mar 11 01:18:19 crc kubenswrapper[4744]: I0311 01:18:19.293966 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"1bee1784-ff56-4039-88d3-712a12673f83","Type":"ContainerDied","Data":"d411d1627762ad3f660bf03062026d28e1a3dcab9b4f882ef3a07d5933e2a73a"} Mar 11 01:18:19 crc kubenswrapper[4744]: I0311 01:18:19.294876 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b3533d92-44d2-4d39-98af-b144b8a57d24-combined-ca-bundle\") pod \"b3533d92-44d2-4d39-98af-b144b8a57d24\" (UID: \"b3533d92-44d2-4d39-98af-b144b8a57d24\") " Mar 11 01:18:19 crc kubenswrapper[4744]: I0311 01:18:19.294972 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b3533d92-44d2-4d39-98af-b144b8a57d24-config-data\") pod \"b3533d92-44d2-4d39-98af-b144b8a57d24\" (UID: \"b3533d92-44d2-4d39-98af-b144b8a57d24\") " Mar 11 01:18:19 crc kubenswrapper[4744]: I0311 01:18:19.294995 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7bpkn\" (UniqueName: \"kubernetes.io/projected/b3533d92-44d2-4d39-98af-b144b8a57d24-kube-api-access-7bpkn\") pod \"b3533d92-44d2-4d39-98af-b144b8a57d24\" (UID: 
\"b3533d92-44d2-4d39-98af-b144b8a57d24\") " Mar 11 01:18:19 crc kubenswrapper[4744]: I0311 01:18:19.295139 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b3533d92-44d2-4d39-98af-b144b8a57d24-logs\") pod \"b3533d92-44d2-4d39-98af-b144b8a57d24\" (UID: \"b3533d92-44d2-4d39-98af-b144b8a57d24\") " Mar 11 01:18:19 crc kubenswrapper[4744]: I0311 01:18:19.295956 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b3533d92-44d2-4d39-98af-b144b8a57d24-logs" (OuterVolumeSpecName: "logs") pod "b3533d92-44d2-4d39-98af-b144b8a57d24" (UID: "b3533d92-44d2-4d39-98af-b144b8a57d24"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 11 01:18:19 crc kubenswrapper[4744]: I0311 01:18:19.303729 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b3533d92-44d2-4d39-98af-b144b8a57d24-kube-api-access-7bpkn" (OuterVolumeSpecName: "kube-api-access-7bpkn") pod "b3533d92-44d2-4d39-98af-b144b8a57d24" (UID: "b3533d92-44d2-4d39-98af-b144b8a57d24"). InnerVolumeSpecName "kube-api-access-7bpkn". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 01:18:19 crc kubenswrapper[4744]: I0311 01:18:19.307801 4744 scope.go:117] "RemoveContainer" containerID="bcc387b1f7c3301e27c6c065a87239db00c147780619df2a1d2b02142eb7d24a" Mar 11 01:18:19 crc kubenswrapper[4744]: I0311 01:18:19.308558 4744 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="0160a3e2-e1dd-4526-9280-1645846cee12" containerName="nova-api-log" containerID="cri-o://6d4e750ef38a2bd9997b7929eb792b4bf47d1020fd0488a54a44b9a96efb1a9c" gracePeriod=30 Mar 11 01:18:19 crc kubenswrapper[4744]: E0311 01:18:19.308906 4744 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bcc387b1f7c3301e27c6c065a87239db00c147780619df2a1d2b02142eb7d24a\": container with ID starting with bcc387b1f7c3301e27c6c065a87239db00c147780619df2a1d2b02142eb7d24a not found: ID does not exist" containerID="bcc387b1f7c3301e27c6c065a87239db00c147780619df2a1d2b02142eb7d24a" Mar 11 01:18:19 crc kubenswrapper[4744]: I0311 01:18:19.308940 4744 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bcc387b1f7c3301e27c6c065a87239db00c147780619df2a1d2b02142eb7d24a"} err="failed to get container status \"bcc387b1f7c3301e27c6c065a87239db00c147780619df2a1d2b02142eb7d24a\": rpc error: code = NotFound desc = could not find container \"bcc387b1f7c3301e27c6c065a87239db00c147780619df2a1d2b02142eb7d24a\": container with ID starting with bcc387b1f7c3301e27c6c065a87239db00c147780619df2a1d2b02142eb7d24a not found: ID does not exist" Mar 11 01:18:19 crc kubenswrapper[4744]: I0311 01:18:19.308965 4744 scope.go:117] "RemoveContainer" containerID="25c1da7394be4605cbf94d9355b0b895fb9fdc9afff53546f8e0d667b5f8079e" Mar 11 01:18:19 crc kubenswrapper[4744]: I0311 01:18:19.309051 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57db588689-85hjj" 
event={"ID":"a90e479a-2c1d-4a55-9f51-eadbc3c0b333","Type":"ContainerStarted","Data":"7d9f021a66654b3256ddc109f1f5944e4eaa779a0ef70e35b1d3260089fa29e0"} Mar 11 01:18:19 crc kubenswrapper[4744]: I0311 01:18:19.309353 4744 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="0160a3e2-e1dd-4526-9280-1645846cee12" containerName="nova-api-api" containerID="cri-o://ebddc0586887b8e1b258bb855afecb531e54592cb034d8ab733121c6c2b0e3ab" gracePeriod=30 Mar 11 01:18:19 crc kubenswrapper[4744]: I0311 01:18:19.310197 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-57db588689-85hjj" Mar 11 01:18:19 crc kubenswrapper[4744]: I0311 01:18:19.331214 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b3533d92-44d2-4d39-98af-b144b8a57d24-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "b3533d92-44d2-4d39-98af-b144b8a57d24" (UID: "b3533d92-44d2-4d39-98af-b144b8a57d24"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 01:18:19 crc kubenswrapper[4744]: I0311 01:18:19.348653 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b3533d92-44d2-4d39-98af-b144b8a57d24-config-data" (OuterVolumeSpecName: "config-data") pod "b3533d92-44d2-4d39-98af-b144b8a57d24" (UID: "b3533d92-44d2-4d39-98af-b144b8a57d24"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 01:18:19 crc kubenswrapper[4744]: I0311 01:18:19.351936 4744 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-57db588689-85hjj" podStartSLOduration=3.3519148420000002 podStartE2EDuration="3.351914842s" podCreationTimestamp="2026-03-11 01:18:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-11 01:18:19.329296262 +0000 UTC m=+1456.133513867" watchObservedRunningTime="2026-03-11 01:18:19.351914842 +0000 UTC m=+1456.156132447" Mar 11 01:18:19 crc kubenswrapper[4744]: I0311 01:18:19.355901 4744 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 11 01:18:19 crc kubenswrapper[4744]: I0311 01:18:19.364137 4744 scope.go:117] "RemoveContainer" containerID="dff26498a77553141f4341df2dfeb49a40cf70ec9c2f38dcec47d27d6f5caeb8" Mar 11 01:18:19 crc kubenswrapper[4744]: I0311 01:18:19.365468 4744 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Mar 11 01:18:19 crc kubenswrapper[4744]: I0311 01:18:19.373422 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Mar 11 01:18:19 crc kubenswrapper[4744]: E0311 01:18:19.373788 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1bee1784-ff56-4039-88d3-712a12673f83" containerName="proxy-httpd" Mar 11 01:18:19 crc kubenswrapper[4744]: I0311 01:18:19.373806 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="1bee1784-ff56-4039-88d3-712a12673f83" containerName="proxy-httpd" Mar 11 01:18:19 crc kubenswrapper[4744]: E0311 01:18:19.373817 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1bee1784-ff56-4039-88d3-712a12673f83" containerName="sg-core" Mar 11 01:18:19 crc kubenswrapper[4744]: I0311 01:18:19.373824 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="1bee1784-ff56-4039-88d3-712a12673f83" 
containerName="sg-core" Mar 11 01:18:19 crc kubenswrapper[4744]: E0311 01:18:19.373838 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b3533d92-44d2-4d39-98af-b144b8a57d24" containerName="nova-metadata-log" Mar 11 01:18:19 crc kubenswrapper[4744]: I0311 01:18:19.373844 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="b3533d92-44d2-4d39-98af-b144b8a57d24" containerName="nova-metadata-log" Mar 11 01:18:19 crc kubenswrapper[4744]: E0311 01:18:19.373853 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b3533d92-44d2-4d39-98af-b144b8a57d24" containerName="nova-metadata-metadata" Mar 11 01:18:19 crc kubenswrapper[4744]: I0311 01:18:19.373859 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="b3533d92-44d2-4d39-98af-b144b8a57d24" containerName="nova-metadata-metadata" Mar 11 01:18:19 crc kubenswrapper[4744]: E0311 01:18:19.373870 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dc84843f-62da-435f-ab7a-2f0b97cb2ec7" containerName="nova-cell1-novncproxy-novncproxy" Mar 11 01:18:19 crc kubenswrapper[4744]: I0311 01:18:19.373875 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="dc84843f-62da-435f-ab7a-2f0b97cb2ec7" containerName="nova-cell1-novncproxy-novncproxy" Mar 11 01:18:19 crc kubenswrapper[4744]: E0311 01:18:19.373885 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1bee1784-ff56-4039-88d3-712a12673f83" containerName="ceilometer-notification-agent" Mar 11 01:18:19 crc kubenswrapper[4744]: I0311 01:18:19.373892 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="1bee1784-ff56-4039-88d3-712a12673f83" containerName="ceilometer-notification-agent" Mar 11 01:18:19 crc kubenswrapper[4744]: E0311 01:18:19.373911 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1bee1784-ff56-4039-88d3-712a12673f83" containerName="ceilometer-central-agent" Mar 11 01:18:19 crc kubenswrapper[4744]: I0311 01:18:19.373917 4744 state_mem.go:107] 
"Deleted CPUSet assignment" podUID="1bee1784-ff56-4039-88d3-712a12673f83" containerName="ceilometer-central-agent" Mar 11 01:18:19 crc kubenswrapper[4744]: I0311 01:18:19.374091 4744 memory_manager.go:354] "RemoveStaleState removing state" podUID="1bee1784-ff56-4039-88d3-712a12673f83" containerName="ceilometer-central-agent" Mar 11 01:18:19 crc kubenswrapper[4744]: I0311 01:18:19.374105 4744 memory_manager.go:354] "RemoveStaleState removing state" podUID="b3533d92-44d2-4d39-98af-b144b8a57d24" containerName="nova-metadata-metadata" Mar 11 01:18:19 crc kubenswrapper[4744]: I0311 01:18:19.374116 4744 memory_manager.go:354] "RemoveStaleState removing state" podUID="dc84843f-62da-435f-ab7a-2f0b97cb2ec7" containerName="nova-cell1-novncproxy-novncproxy" Mar 11 01:18:19 crc kubenswrapper[4744]: I0311 01:18:19.374128 4744 memory_manager.go:354] "RemoveStaleState removing state" podUID="b3533d92-44d2-4d39-98af-b144b8a57d24" containerName="nova-metadata-log" Mar 11 01:18:19 crc kubenswrapper[4744]: I0311 01:18:19.374147 4744 memory_manager.go:354] "RemoveStaleState removing state" podUID="1bee1784-ff56-4039-88d3-712a12673f83" containerName="sg-core" Mar 11 01:18:19 crc kubenswrapper[4744]: I0311 01:18:19.374155 4744 memory_manager.go:354] "RemoveStaleState removing state" podUID="1bee1784-ff56-4039-88d3-712a12673f83" containerName="proxy-httpd" Mar 11 01:18:19 crc kubenswrapper[4744]: I0311 01:18:19.374168 4744 memory_manager.go:354] "RemoveStaleState removing state" podUID="1bee1784-ff56-4039-88d3-712a12673f83" containerName="ceilometer-notification-agent" Mar 11 01:18:19 crc kubenswrapper[4744]: I0311 01:18:19.375665 4744 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 11 01:18:19 crc kubenswrapper[4744]: I0311 01:18:19.380647 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Mar 11 01:18:19 crc kubenswrapper[4744]: I0311 01:18:19.381366 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Mar 11 01:18:19 crc kubenswrapper[4744]: I0311 01:18:19.381621 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ceilometer-internal-svc" Mar 11 01:18:19 crc kubenswrapper[4744]: I0311 01:18:19.385677 4744 scope.go:117] "RemoveContainer" containerID="25c1da7394be4605cbf94d9355b0b895fb9fdc9afff53546f8e0d667b5f8079e" Mar 11 01:18:19 crc kubenswrapper[4744]: E0311 01:18:19.391313 4744 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"25c1da7394be4605cbf94d9355b0b895fb9fdc9afff53546f8e0d667b5f8079e\": container with ID starting with 25c1da7394be4605cbf94d9355b0b895fb9fdc9afff53546f8e0d667b5f8079e not found: ID does not exist" containerID="25c1da7394be4605cbf94d9355b0b895fb9fdc9afff53546f8e0d667b5f8079e" Mar 11 01:18:19 crc kubenswrapper[4744]: I0311 01:18:19.391372 4744 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"25c1da7394be4605cbf94d9355b0b895fb9fdc9afff53546f8e0d667b5f8079e"} err="failed to get container status \"25c1da7394be4605cbf94d9355b0b895fb9fdc9afff53546f8e0d667b5f8079e\": rpc error: code = NotFound desc = could not find container \"25c1da7394be4605cbf94d9355b0b895fb9fdc9afff53546f8e0d667b5f8079e\": container with ID starting with 25c1da7394be4605cbf94d9355b0b895fb9fdc9afff53546f8e0d667b5f8079e not found: ID does not exist" Mar 11 01:18:19 crc kubenswrapper[4744]: I0311 01:18:19.391402 4744 scope.go:117] "RemoveContainer" containerID="dff26498a77553141f4341df2dfeb49a40cf70ec9c2f38dcec47d27d6f5caeb8" Mar 11 01:18:19 crc 
kubenswrapper[4744]: E0311 01:18:19.391890 4744 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"dff26498a77553141f4341df2dfeb49a40cf70ec9c2f38dcec47d27d6f5caeb8\": container with ID starting with dff26498a77553141f4341df2dfeb49a40cf70ec9c2f38dcec47d27d6f5caeb8 not found: ID does not exist" containerID="dff26498a77553141f4341df2dfeb49a40cf70ec9c2f38dcec47d27d6f5caeb8" Mar 11 01:18:19 crc kubenswrapper[4744]: I0311 01:18:19.391914 4744 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dff26498a77553141f4341df2dfeb49a40cf70ec9c2f38dcec47d27d6f5caeb8"} err="failed to get container status \"dff26498a77553141f4341df2dfeb49a40cf70ec9c2f38dcec47d27d6f5caeb8\": rpc error: code = NotFound desc = could not find container \"dff26498a77553141f4341df2dfeb49a40cf70ec9c2f38dcec47d27d6f5caeb8\": container with ID starting with dff26498a77553141f4341df2dfeb49a40cf70ec9c2f38dcec47d27d6f5caeb8 not found: ID does not exist" Mar 11 01:18:19 crc kubenswrapper[4744]: I0311 01:18:19.391930 4744 scope.go:117] "RemoveContainer" containerID="ec4e280db95c84686de34a625cacc26d4cf4bc1dd750752fd8c14d891fe45fef" Mar 11 01:18:19 crc kubenswrapper[4744]: I0311 01:18:19.393191 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 11 01:18:19 crc kubenswrapper[4744]: I0311 01:18:19.396489 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dc84843f-62da-435f-ab7a-2f0b97cb2ec7-config-data\") pod \"dc84843f-62da-435f-ab7a-2f0b97cb2ec7\" (UID: \"dc84843f-62da-435f-ab7a-2f0b97cb2ec7\") " Mar 11 01:18:19 crc kubenswrapper[4744]: I0311 01:18:19.396617 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xnlz7\" (UniqueName: \"kubernetes.io/projected/dc84843f-62da-435f-ab7a-2f0b97cb2ec7-kube-api-access-xnlz7\") pod 
\"dc84843f-62da-435f-ab7a-2f0b97cb2ec7\" (UID: \"dc84843f-62da-435f-ab7a-2f0b97cb2ec7\") " Mar 11 01:18:19 crc kubenswrapper[4744]: I0311 01:18:19.396728 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dc84843f-62da-435f-ab7a-2f0b97cb2ec7-combined-ca-bundle\") pod \"dc84843f-62da-435f-ab7a-2f0b97cb2ec7\" (UID: \"dc84843f-62da-435f-ab7a-2f0b97cb2ec7\") " Mar 11 01:18:19 crc kubenswrapper[4744]: I0311 01:18:19.397427 4744 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b3533d92-44d2-4d39-98af-b144b8a57d24-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 11 01:18:19 crc kubenswrapper[4744]: I0311 01:18:19.397445 4744 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b3533d92-44d2-4d39-98af-b144b8a57d24-config-data\") on node \"crc\" DevicePath \"\"" Mar 11 01:18:19 crc kubenswrapper[4744]: I0311 01:18:19.397458 4744 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7bpkn\" (UniqueName: \"kubernetes.io/projected/b3533d92-44d2-4d39-98af-b144b8a57d24-kube-api-access-7bpkn\") on node \"crc\" DevicePath \"\"" Mar 11 01:18:19 crc kubenswrapper[4744]: I0311 01:18:19.397473 4744 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b3533d92-44d2-4d39-98af-b144b8a57d24-logs\") on node \"crc\" DevicePath \"\"" Mar 11 01:18:19 crc kubenswrapper[4744]: I0311 01:18:19.405868 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dc84843f-62da-435f-ab7a-2f0b97cb2ec7-kube-api-access-xnlz7" (OuterVolumeSpecName: "kube-api-access-xnlz7") pod "dc84843f-62da-435f-ab7a-2f0b97cb2ec7" (UID: "dc84843f-62da-435f-ab7a-2f0b97cb2ec7"). InnerVolumeSpecName "kube-api-access-xnlz7". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 01:18:19 crc kubenswrapper[4744]: I0311 01:18:19.412102 4744 scope.go:117] "RemoveContainer" containerID="fe5bf707cc7cffd0ef3b3eec3df56eba54b5f2e86ea757dfaf7a54a13d9b0ca5" Mar 11 01:18:19 crc kubenswrapper[4744]: I0311 01:18:19.424051 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dc84843f-62da-435f-ab7a-2f0b97cb2ec7-config-data" (OuterVolumeSpecName: "config-data") pod "dc84843f-62da-435f-ab7a-2f0b97cb2ec7" (UID: "dc84843f-62da-435f-ab7a-2f0b97cb2ec7"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 01:18:19 crc kubenswrapper[4744]: I0311 01:18:19.430649 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dc84843f-62da-435f-ab7a-2f0b97cb2ec7-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "dc84843f-62da-435f-ab7a-2f0b97cb2ec7" (UID: "dc84843f-62da-435f-ab7a-2f0b97cb2ec7"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 01:18:19 crc kubenswrapper[4744]: I0311 01:18:19.434274 4744 scope.go:117] "RemoveContainer" containerID="be7d3138b53e64e82042ff5e16061e4d40ad8c23934828c967856a0dbe84411d" Mar 11 01:18:19 crc kubenswrapper[4744]: I0311 01:18:19.451055 4744 scope.go:117] "RemoveContainer" containerID="5c8650307b1a8dfabd69bc4e53acceafe7fc6c4f04b8a68fe4486a9686450aa4" Mar 11 01:18:19 crc kubenswrapper[4744]: I0311 01:18:19.467251 4744 scope.go:117] "RemoveContainer" containerID="ec4e280db95c84686de34a625cacc26d4cf4bc1dd750752fd8c14d891fe45fef" Mar 11 01:18:19 crc kubenswrapper[4744]: E0311 01:18:19.467765 4744 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ec4e280db95c84686de34a625cacc26d4cf4bc1dd750752fd8c14d891fe45fef\": container with ID starting with ec4e280db95c84686de34a625cacc26d4cf4bc1dd750752fd8c14d891fe45fef not found: ID does not exist" containerID="ec4e280db95c84686de34a625cacc26d4cf4bc1dd750752fd8c14d891fe45fef" Mar 11 01:18:19 crc kubenswrapper[4744]: I0311 01:18:19.467803 4744 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ec4e280db95c84686de34a625cacc26d4cf4bc1dd750752fd8c14d891fe45fef"} err="failed to get container status \"ec4e280db95c84686de34a625cacc26d4cf4bc1dd750752fd8c14d891fe45fef\": rpc error: code = NotFound desc = could not find container \"ec4e280db95c84686de34a625cacc26d4cf4bc1dd750752fd8c14d891fe45fef\": container with ID starting with ec4e280db95c84686de34a625cacc26d4cf4bc1dd750752fd8c14d891fe45fef not found: ID does not exist" Mar 11 01:18:19 crc kubenswrapper[4744]: I0311 01:18:19.467831 4744 scope.go:117] "RemoveContainer" containerID="fe5bf707cc7cffd0ef3b3eec3df56eba54b5f2e86ea757dfaf7a54a13d9b0ca5" Mar 11 01:18:19 crc kubenswrapper[4744]: E0311 01:18:19.468127 4744 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could 
not find container \"fe5bf707cc7cffd0ef3b3eec3df56eba54b5f2e86ea757dfaf7a54a13d9b0ca5\": container with ID starting with fe5bf707cc7cffd0ef3b3eec3df56eba54b5f2e86ea757dfaf7a54a13d9b0ca5 not found: ID does not exist" containerID="fe5bf707cc7cffd0ef3b3eec3df56eba54b5f2e86ea757dfaf7a54a13d9b0ca5" Mar 11 01:18:19 crc kubenswrapper[4744]: I0311 01:18:19.468221 4744 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fe5bf707cc7cffd0ef3b3eec3df56eba54b5f2e86ea757dfaf7a54a13d9b0ca5"} err="failed to get container status \"fe5bf707cc7cffd0ef3b3eec3df56eba54b5f2e86ea757dfaf7a54a13d9b0ca5\": rpc error: code = NotFound desc = could not find container \"fe5bf707cc7cffd0ef3b3eec3df56eba54b5f2e86ea757dfaf7a54a13d9b0ca5\": container with ID starting with fe5bf707cc7cffd0ef3b3eec3df56eba54b5f2e86ea757dfaf7a54a13d9b0ca5 not found: ID does not exist" Mar 11 01:18:19 crc kubenswrapper[4744]: I0311 01:18:19.468306 4744 scope.go:117] "RemoveContainer" containerID="be7d3138b53e64e82042ff5e16061e4d40ad8c23934828c967856a0dbe84411d" Mar 11 01:18:19 crc kubenswrapper[4744]: E0311 01:18:19.468835 4744 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"be7d3138b53e64e82042ff5e16061e4d40ad8c23934828c967856a0dbe84411d\": container with ID starting with be7d3138b53e64e82042ff5e16061e4d40ad8c23934828c967856a0dbe84411d not found: ID does not exist" containerID="be7d3138b53e64e82042ff5e16061e4d40ad8c23934828c967856a0dbe84411d" Mar 11 01:18:19 crc kubenswrapper[4744]: I0311 01:18:19.468877 4744 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"be7d3138b53e64e82042ff5e16061e4d40ad8c23934828c967856a0dbe84411d"} err="failed to get container status \"be7d3138b53e64e82042ff5e16061e4d40ad8c23934828c967856a0dbe84411d\": rpc error: code = NotFound desc = could not find container \"be7d3138b53e64e82042ff5e16061e4d40ad8c23934828c967856a0dbe84411d\": 
container with ID starting with be7d3138b53e64e82042ff5e16061e4d40ad8c23934828c967856a0dbe84411d not found: ID does not exist" Mar 11 01:18:19 crc kubenswrapper[4744]: I0311 01:18:19.468905 4744 scope.go:117] "RemoveContainer" containerID="5c8650307b1a8dfabd69bc4e53acceafe7fc6c4f04b8a68fe4486a9686450aa4" Mar 11 01:18:19 crc kubenswrapper[4744]: E0311 01:18:19.469171 4744 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5c8650307b1a8dfabd69bc4e53acceafe7fc6c4f04b8a68fe4486a9686450aa4\": container with ID starting with 5c8650307b1a8dfabd69bc4e53acceafe7fc6c4f04b8a68fe4486a9686450aa4 not found: ID does not exist" containerID="5c8650307b1a8dfabd69bc4e53acceafe7fc6c4f04b8a68fe4486a9686450aa4" Mar 11 01:18:19 crc kubenswrapper[4744]: I0311 01:18:19.469192 4744 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5c8650307b1a8dfabd69bc4e53acceafe7fc6c4f04b8a68fe4486a9686450aa4"} err="failed to get container status \"5c8650307b1a8dfabd69bc4e53acceafe7fc6c4f04b8a68fe4486a9686450aa4\": rpc error: code = NotFound desc = could not find container \"5c8650307b1a8dfabd69bc4e53acceafe7fc6c4f04b8a68fe4486a9686450aa4\": container with ID starting with 5c8650307b1a8dfabd69bc4e53acceafe7fc6c4f04b8a68fe4486a9686450aa4 not found: ID does not exist" Mar 11 01:18:19 crc kubenswrapper[4744]: I0311 01:18:19.469236 4744 scope.go:117] "RemoveContainer" containerID="ec4e280db95c84686de34a625cacc26d4cf4bc1dd750752fd8c14d891fe45fef" Mar 11 01:18:19 crc kubenswrapper[4744]: I0311 01:18:19.469526 4744 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ec4e280db95c84686de34a625cacc26d4cf4bc1dd750752fd8c14d891fe45fef"} err="failed to get container status \"ec4e280db95c84686de34a625cacc26d4cf4bc1dd750752fd8c14d891fe45fef\": rpc error: code = NotFound desc = could not find container 
\"ec4e280db95c84686de34a625cacc26d4cf4bc1dd750752fd8c14d891fe45fef\": container with ID starting with ec4e280db95c84686de34a625cacc26d4cf4bc1dd750752fd8c14d891fe45fef not found: ID does not exist" Mar 11 01:18:19 crc kubenswrapper[4744]: I0311 01:18:19.469550 4744 scope.go:117] "RemoveContainer" containerID="fe5bf707cc7cffd0ef3b3eec3df56eba54b5f2e86ea757dfaf7a54a13d9b0ca5" Mar 11 01:18:19 crc kubenswrapper[4744]: I0311 01:18:19.469805 4744 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fe5bf707cc7cffd0ef3b3eec3df56eba54b5f2e86ea757dfaf7a54a13d9b0ca5"} err="failed to get container status \"fe5bf707cc7cffd0ef3b3eec3df56eba54b5f2e86ea757dfaf7a54a13d9b0ca5\": rpc error: code = NotFound desc = could not find container \"fe5bf707cc7cffd0ef3b3eec3df56eba54b5f2e86ea757dfaf7a54a13d9b0ca5\": container with ID starting with fe5bf707cc7cffd0ef3b3eec3df56eba54b5f2e86ea757dfaf7a54a13d9b0ca5 not found: ID does not exist" Mar 11 01:18:19 crc kubenswrapper[4744]: I0311 01:18:19.469884 4744 scope.go:117] "RemoveContainer" containerID="be7d3138b53e64e82042ff5e16061e4d40ad8c23934828c967856a0dbe84411d" Mar 11 01:18:19 crc kubenswrapper[4744]: I0311 01:18:19.470210 4744 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"be7d3138b53e64e82042ff5e16061e4d40ad8c23934828c967856a0dbe84411d"} err="failed to get container status \"be7d3138b53e64e82042ff5e16061e4d40ad8c23934828c967856a0dbe84411d\": rpc error: code = NotFound desc = could not find container \"be7d3138b53e64e82042ff5e16061e4d40ad8c23934828c967856a0dbe84411d\": container with ID starting with be7d3138b53e64e82042ff5e16061e4d40ad8c23934828c967856a0dbe84411d not found: ID does not exist" Mar 11 01:18:19 crc kubenswrapper[4744]: I0311 01:18:19.470227 4744 scope.go:117] "RemoveContainer" containerID="5c8650307b1a8dfabd69bc4e53acceafe7fc6c4f04b8a68fe4486a9686450aa4" Mar 11 01:18:19 crc kubenswrapper[4744]: I0311 01:18:19.470438 4744 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5c8650307b1a8dfabd69bc4e53acceafe7fc6c4f04b8a68fe4486a9686450aa4"} err="failed to get container status \"5c8650307b1a8dfabd69bc4e53acceafe7fc6c4f04b8a68fe4486a9686450aa4\": rpc error: code = NotFound desc = could not find container \"5c8650307b1a8dfabd69bc4e53acceafe7fc6c4f04b8a68fe4486a9686450aa4\": container with ID starting with 5c8650307b1a8dfabd69bc4e53acceafe7fc6c4f04b8a68fe4486a9686450aa4 not found: ID does not exist" Mar 11 01:18:19 crc kubenswrapper[4744]: I0311 01:18:19.499666 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b7268c12-3602-488a-8ef9-606bdf629c99-run-httpd\") pod \"ceilometer-0\" (UID: \"b7268c12-3602-488a-8ef9-606bdf629c99\") " pod="openstack/ceilometer-0" Mar 11 01:18:19 crc kubenswrapper[4744]: I0311 01:18:19.499714 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b7268c12-3602-488a-8ef9-606bdf629c99-scripts\") pod \"ceilometer-0\" (UID: \"b7268c12-3602-488a-8ef9-606bdf629c99\") " pod="openstack/ceilometer-0" Mar 11 01:18:19 crc kubenswrapper[4744]: I0311 01:18:19.499733 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b7268c12-3602-488a-8ef9-606bdf629c99-config-data\") pod \"ceilometer-0\" (UID: \"b7268c12-3602-488a-8ef9-606bdf629c99\") " pod="openstack/ceilometer-0" Mar 11 01:18:19 crc kubenswrapper[4744]: I0311 01:18:19.499759 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b7268c12-3602-488a-8ef9-606bdf629c99-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"b7268c12-3602-488a-8ef9-606bdf629c99\") " pod="openstack/ceilometer-0" 
Mar 11 01:18:19 crc kubenswrapper[4744]: I0311 01:18:19.499901 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l8p8b\" (UniqueName: \"kubernetes.io/projected/b7268c12-3602-488a-8ef9-606bdf629c99-kube-api-access-l8p8b\") pod \"ceilometer-0\" (UID: \"b7268c12-3602-488a-8ef9-606bdf629c99\") " pod="openstack/ceilometer-0" Mar 11 01:18:19 crc kubenswrapper[4744]: I0311 01:18:19.499953 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/b7268c12-3602-488a-8ef9-606bdf629c99-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"b7268c12-3602-488a-8ef9-606bdf629c99\") " pod="openstack/ceilometer-0" Mar 11 01:18:19 crc kubenswrapper[4744]: I0311 01:18:19.500071 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b7268c12-3602-488a-8ef9-606bdf629c99-log-httpd\") pod \"ceilometer-0\" (UID: \"b7268c12-3602-488a-8ef9-606bdf629c99\") " pod="openstack/ceilometer-0" Mar 11 01:18:19 crc kubenswrapper[4744]: I0311 01:18:19.500274 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/b7268c12-3602-488a-8ef9-606bdf629c99-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"b7268c12-3602-488a-8ef9-606bdf629c99\") " pod="openstack/ceilometer-0" Mar 11 01:18:19 crc kubenswrapper[4744]: I0311 01:18:19.500407 4744 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dc84843f-62da-435f-ab7a-2f0b97cb2ec7-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 11 01:18:19 crc kubenswrapper[4744]: I0311 01:18:19.500425 4744 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/dc84843f-62da-435f-ab7a-2f0b97cb2ec7-config-data\") on node \"crc\" DevicePath \"\"" Mar 11 01:18:19 crc kubenswrapper[4744]: I0311 01:18:19.500436 4744 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xnlz7\" (UniqueName: \"kubernetes.io/projected/dc84843f-62da-435f-ab7a-2f0b97cb2ec7-kube-api-access-xnlz7\") on node \"crc\" DevicePath \"\"" Mar 11 01:18:19 crc kubenswrapper[4744]: I0311 01:18:19.601808 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b7268c12-3602-488a-8ef9-606bdf629c99-log-httpd\") pod \"ceilometer-0\" (UID: \"b7268c12-3602-488a-8ef9-606bdf629c99\") " pod="openstack/ceilometer-0" Mar 11 01:18:19 crc kubenswrapper[4744]: I0311 01:18:19.601896 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/b7268c12-3602-488a-8ef9-606bdf629c99-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"b7268c12-3602-488a-8ef9-606bdf629c99\") " pod="openstack/ceilometer-0" Mar 11 01:18:19 crc kubenswrapper[4744]: I0311 01:18:19.601990 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b7268c12-3602-488a-8ef9-606bdf629c99-run-httpd\") pod \"ceilometer-0\" (UID: \"b7268c12-3602-488a-8ef9-606bdf629c99\") " pod="openstack/ceilometer-0" Mar 11 01:18:19 crc kubenswrapper[4744]: I0311 01:18:19.602026 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b7268c12-3602-488a-8ef9-606bdf629c99-scripts\") pod \"ceilometer-0\" (UID: \"b7268c12-3602-488a-8ef9-606bdf629c99\") " pod="openstack/ceilometer-0" Mar 11 01:18:19 crc kubenswrapper[4744]: I0311 01:18:19.602050 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/b7268c12-3602-488a-8ef9-606bdf629c99-config-data\") pod \"ceilometer-0\" (UID: \"b7268c12-3602-488a-8ef9-606bdf629c99\") " pod="openstack/ceilometer-0" Mar 11 01:18:19 crc kubenswrapper[4744]: I0311 01:18:19.602082 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b7268c12-3602-488a-8ef9-606bdf629c99-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"b7268c12-3602-488a-8ef9-606bdf629c99\") " pod="openstack/ceilometer-0" Mar 11 01:18:19 crc kubenswrapper[4744]: I0311 01:18:19.602153 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l8p8b\" (UniqueName: \"kubernetes.io/projected/b7268c12-3602-488a-8ef9-606bdf629c99-kube-api-access-l8p8b\") pod \"ceilometer-0\" (UID: \"b7268c12-3602-488a-8ef9-606bdf629c99\") " pod="openstack/ceilometer-0" Mar 11 01:18:19 crc kubenswrapper[4744]: I0311 01:18:19.602582 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/b7268c12-3602-488a-8ef9-606bdf629c99-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"b7268c12-3602-488a-8ef9-606bdf629c99\") " pod="openstack/ceilometer-0" Mar 11 01:18:19 crc kubenswrapper[4744]: I0311 01:18:19.602646 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b7268c12-3602-488a-8ef9-606bdf629c99-run-httpd\") pod \"ceilometer-0\" (UID: \"b7268c12-3602-488a-8ef9-606bdf629c99\") " pod="openstack/ceilometer-0" Mar 11 01:18:19 crc kubenswrapper[4744]: I0311 01:18:19.602424 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b7268c12-3602-488a-8ef9-606bdf629c99-log-httpd\") pod \"ceilometer-0\" (UID: \"b7268c12-3602-488a-8ef9-606bdf629c99\") " pod="openstack/ceilometer-0" Mar 11 01:18:19 crc kubenswrapper[4744]: I0311 
01:18:19.606007 4744 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Mar 11 01:18:19 crc kubenswrapper[4744]: I0311 01:18:19.606416 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b7268c12-3602-488a-8ef9-606bdf629c99-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"b7268c12-3602-488a-8ef9-606bdf629c99\") " pod="openstack/ceilometer-0" Mar 11 01:18:19 crc kubenswrapper[4744]: I0311 01:18:19.607502 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/b7268c12-3602-488a-8ef9-606bdf629c99-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"b7268c12-3602-488a-8ef9-606bdf629c99\") " pod="openstack/ceilometer-0" Mar 11 01:18:19 crc kubenswrapper[4744]: I0311 01:18:19.612710 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/b7268c12-3602-488a-8ef9-606bdf629c99-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"b7268c12-3602-488a-8ef9-606bdf629c99\") " pod="openstack/ceilometer-0" Mar 11 01:18:19 crc kubenswrapper[4744]: I0311 01:18:19.616949 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b7268c12-3602-488a-8ef9-606bdf629c99-scripts\") pod \"ceilometer-0\" (UID: \"b7268c12-3602-488a-8ef9-606bdf629c99\") " pod="openstack/ceilometer-0" Mar 11 01:18:19 crc kubenswrapper[4744]: I0311 01:18:19.618916 4744 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Mar 11 01:18:19 crc kubenswrapper[4744]: I0311 01:18:19.619295 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b7268c12-3602-488a-8ef9-606bdf629c99-config-data\") pod \"ceilometer-0\" (UID: \"b7268c12-3602-488a-8ef9-606bdf629c99\") " 
pod="openstack/ceilometer-0" Mar 11 01:18:19 crc kubenswrapper[4744]: I0311 01:18:19.631778 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l8p8b\" (UniqueName: \"kubernetes.io/projected/b7268c12-3602-488a-8ef9-606bdf629c99-kube-api-access-l8p8b\") pod \"ceilometer-0\" (UID: \"b7268c12-3602-488a-8ef9-606bdf629c99\") " pod="openstack/ceilometer-0" Mar 11 01:18:19 crc kubenswrapper[4744]: I0311 01:18:19.634232 4744 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Mar 11 01:18:19 crc kubenswrapper[4744]: I0311 01:18:19.642459 4744 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Mar 11 01:18:19 crc kubenswrapper[4744]: I0311 01:18:19.650192 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Mar 11 01:18:19 crc kubenswrapper[4744]: I0311 01:18:19.651607 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Mar 11 01:18:19 crc kubenswrapper[4744]: I0311 01:18:19.653479 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-novncproxy-cell1-vencrypt" Mar 11 01:18:19 crc kubenswrapper[4744]: I0311 01:18:19.653861 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-novncproxy-cell1-public-svc" Mar 11 01:18:19 crc kubenswrapper[4744]: I0311 01:18:19.654041 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-novncproxy-config-data" Mar 11 01:18:19 crc kubenswrapper[4744]: I0311 01:18:19.670454 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Mar 11 01:18:19 crc kubenswrapper[4744]: I0311 01:18:19.679360 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Mar 11 01:18:19 crc kubenswrapper[4744]: I0311 01:18:19.681546 4744 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Mar 11 01:18:19 crc kubenswrapper[4744]: I0311 01:18:19.684206 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc" Mar 11 01:18:19 crc kubenswrapper[4744]: I0311 01:18:19.684279 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Mar 11 01:18:19 crc kubenswrapper[4744]: I0311 01:18:19.688546 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Mar 11 01:18:19 crc kubenswrapper[4744]: I0311 01:18:19.711446 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 11 01:18:19 crc kubenswrapper[4744]: I0311 01:18:19.807029 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/755410bb-361b-47e2-8a7a-317119eec983-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"755410bb-361b-47e2-8a7a-317119eec983\") " pod="openstack/nova-cell1-novncproxy-0" Mar 11 01:18:19 crc kubenswrapper[4744]: I0311 01:18:19.807074 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/c5eee60b-3a4a-45b9-b25b-d3073e8e64f6-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"c5eee60b-3a4a-45b9-b25b-d3073e8e64f6\") " pod="openstack/nova-metadata-0" Mar 11 01:18:19 crc kubenswrapper[4744]: I0311 01:18:19.807119 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/755410bb-361b-47e2-8a7a-317119eec983-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"755410bb-361b-47e2-8a7a-317119eec983\") " pod="openstack/nova-cell1-novncproxy-0" Mar 11 01:18:19 crc kubenswrapper[4744]: I0311 
01:18:19.807137 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/755410bb-361b-47e2-8a7a-317119eec983-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"755410bb-361b-47e2-8a7a-317119eec983\") " pod="openstack/nova-cell1-novncproxy-0" Mar 11 01:18:19 crc kubenswrapper[4744]: I0311 01:18:19.807190 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c5eee60b-3a4a-45b9-b25b-d3073e8e64f6-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"c5eee60b-3a4a-45b9-b25b-d3073e8e64f6\") " pod="openstack/nova-metadata-0" Mar 11 01:18:19 crc kubenswrapper[4744]: I0311 01:18:19.807215 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dbv6n\" (UniqueName: \"kubernetes.io/projected/c5eee60b-3a4a-45b9-b25b-d3073e8e64f6-kube-api-access-dbv6n\") pod \"nova-metadata-0\" (UID: \"c5eee60b-3a4a-45b9-b25b-d3073e8e64f6\") " pod="openstack/nova-metadata-0" Mar 11 01:18:19 crc kubenswrapper[4744]: I0311 01:18:19.807247 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c5eee60b-3a4a-45b9-b25b-d3073e8e64f6-logs\") pod \"nova-metadata-0\" (UID: \"c5eee60b-3a4a-45b9-b25b-d3073e8e64f6\") " pod="openstack/nova-metadata-0" Mar 11 01:18:19 crc kubenswrapper[4744]: I0311 01:18:19.807271 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c5eee60b-3a4a-45b9-b25b-d3073e8e64f6-config-data\") pod \"nova-metadata-0\" (UID: \"c5eee60b-3a4a-45b9-b25b-d3073e8e64f6\") " pod="openstack/nova-metadata-0" Mar 11 01:18:19 crc kubenswrapper[4744]: I0311 01:18:19.807301 4744 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/755410bb-361b-47e2-8a7a-317119eec983-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"755410bb-361b-47e2-8a7a-317119eec983\") " pod="openstack/nova-cell1-novncproxy-0" Mar 11 01:18:19 crc kubenswrapper[4744]: I0311 01:18:19.807566 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q8kpr\" (UniqueName: \"kubernetes.io/projected/755410bb-361b-47e2-8a7a-317119eec983-kube-api-access-q8kpr\") pod \"nova-cell1-novncproxy-0\" (UID: \"755410bb-361b-47e2-8a7a-317119eec983\") " pod="openstack/nova-cell1-novncproxy-0" Mar 11 01:18:19 crc kubenswrapper[4744]: I0311 01:18:19.909599 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/755410bb-361b-47e2-8a7a-317119eec983-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"755410bb-361b-47e2-8a7a-317119eec983\") " pod="openstack/nova-cell1-novncproxy-0" Mar 11 01:18:19 crc kubenswrapper[4744]: I0311 01:18:19.909649 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/755410bb-361b-47e2-8a7a-317119eec983-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"755410bb-361b-47e2-8a7a-317119eec983\") " pod="openstack/nova-cell1-novncproxy-0" Mar 11 01:18:19 crc kubenswrapper[4744]: I0311 01:18:19.909670 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c5eee60b-3a4a-45b9-b25b-d3073e8e64f6-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"c5eee60b-3a4a-45b9-b25b-d3073e8e64f6\") " pod="openstack/nova-metadata-0" Mar 11 01:18:19 crc kubenswrapper[4744]: I0311 01:18:19.909694 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"kube-api-access-dbv6n\" (UniqueName: \"kubernetes.io/projected/c5eee60b-3a4a-45b9-b25b-d3073e8e64f6-kube-api-access-dbv6n\") pod \"nova-metadata-0\" (UID: \"c5eee60b-3a4a-45b9-b25b-d3073e8e64f6\") " pod="openstack/nova-metadata-0" Mar 11 01:18:19 crc kubenswrapper[4744]: I0311 01:18:19.909729 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c5eee60b-3a4a-45b9-b25b-d3073e8e64f6-logs\") pod \"nova-metadata-0\" (UID: \"c5eee60b-3a4a-45b9-b25b-d3073e8e64f6\") " pod="openstack/nova-metadata-0" Mar 11 01:18:19 crc kubenswrapper[4744]: I0311 01:18:19.909751 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c5eee60b-3a4a-45b9-b25b-d3073e8e64f6-config-data\") pod \"nova-metadata-0\" (UID: \"c5eee60b-3a4a-45b9-b25b-d3073e8e64f6\") " pod="openstack/nova-metadata-0" Mar 11 01:18:19 crc kubenswrapper[4744]: I0311 01:18:19.909780 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/755410bb-361b-47e2-8a7a-317119eec983-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"755410bb-361b-47e2-8a7a-317119eec983\") " pod="openstack/nova-cell1-novncproxy-0" Mar 11 01:18:19 crc kubenswrapper[4744]: I0311 01:18:19.909810 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q8kpr\" (UniqueName: \"kubernetes.io/projected/755410bb-361b-47e2-8a7a-317119eec983-kube-api-access-q8kpr\") pod \"nova-cell1-novncproxy-0\" (UID: \"755410bb-361b-47e2-8a7a-317119eec983\") " pod="openstack/nova-cell1-novncproxy-0" Mar 11 01:18:19 crc kubenswrapper[4744]: I0311 01:18:19.909862 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/755410bb-361b-47e2-8a7a-317119eec983-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: 
\"755410bb-361b-47e2-8a7a-317119eec983\") " pod="openstack/nova-cell1-novncproxy-0" Mar 11 01:18:19 crc kubenswrapper[4744]: I0311 01:18:19.909881 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/c5eee60b-3a4a-45b9-b25b-d3073e8e64f6-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"c5eee60b-3a4a-45b9-b25b-d3073e8e64f6\") " pod="openstack/nova-metadata-0" Mar 11 01:18:19 crc kubenswrapper[4744]: I0311 01:18:19.910432 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c5eee60b-3a4a-45b9-b25b-d3073e8e64f6-logs\") pod \"nova-metadata-0\" (UID: \"c5eee60b-3a4a-45b9-b25b-d3073e8e64f6\") " pod="openstack/nova-metadata-0" Mar 11 01:18:19 crc kubenswrapper[4744]: I0311 01:18:19.914205 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/755410bb-361b-47e2-8a7a-317119eec983-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"755410bb-361b-47e2-8a7a-317119eec983\") " pod="openstack/nova-cell1-novncproxy-0" Mar 11 01:18:19 crc kubenswrapper[4744]: I0311 01:18:19.917400 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/755410bb-361b-47e2-8a7a-317119eec983-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"755410bb-361b-47e2-8a7a-317119eec983\") " pod="openstack/nova-cell1-novncproxy-0" Mar 11 01:18:19 crc kubenswrapper[4744]: I0311 01:18:19.917504 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/c5eee60b-3a4a-45b9-b25b-d3073e8e64f6-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"c5eee60b-3a4a-45b9-b25b-d3073e8e64f6\") " pod="openstack/nova-metadata-0" Mar 11 01:18:19 crc kubenswrapper[4744]: I0311 01:18:19.920287 
4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/755410bb-361b-47e2-8a7a-317119eec983-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"755410bb-361b-47e2-8a7a-317119eec983\") " pod="openstack/nova-cell1-novncproxy-0" Mar 11 01:18:19 crc kubenswrapper[4744]: I0311 01:18:19.921830 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/755410bb-361b-47e2-8a7a-317119eec983-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"755410bb-361b-47e2-8a7a-317119eec983\") " pod="openstack/nova-cell1-novncproxy-0" Mar 11 01:18:19 crc kubenswrapper[4744]: I0311 01:18:19.921923 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c5eee60b-3a4a-45b9-b25b-d3073e8e64f6-config-data\") pod \"nova-metadata-0\" (UID: \"c5eee60b-3a4a-45b9-b25b-d3073e8e64f6\") " pod="openstack/nova-metadata-0" Mar 11 01:18:19 crc kubenswrapper[4744]: I0311 01:18:19.937227 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c5eee60b-3a4a-45b9-b25b-d3073e8e64f6-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"c5eee60b-3a4a-45b9-b25b-d3073e8e64f6\") " pod="openstack/nova-metadata-0" Mar 11 01:18:19 crc kubenswrapper[4744]: I0311 01:18:19.939319 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q8kpr\" (UniqueName: \"kubernetes.io/projected/755410bb-361b-47e2-8a7a-317119eec983-kube-api-access-q8kpr\") pod \"nova-cell1-novncproxy-0\" (UID: \"755410bb-361b-47e2-8a7a-317119eec983\") " pod="openstack/nova-cell1-novncproxy-0" Mar 11 01:18:19 crc kubenswrapper[4744]: I0311 01:18:19.948140 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dbv6n\" (UniqueName: 
\"kubernetes.io/projected/c5eee60b-3a4a-45b9-b25b-d3073e8e64f6-kube-api-access-dbv6n\") pod \"nova-metadata-0\" (UID: \"c5eee60b-3a4a-45b9-b25b-d3073e8e64f6\") " pod="openstack/nova-metadata-0" Mar 11 01:18:19 crc kubenswrapper[4744]: I0311 01:18:19.993488 4744 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1bee1784-ff56-4039-88d3-712a12673f83" path="/var/lib/kubelet/pods/1bee1784-ff56-4039-88d3-712a12673f83/volumes" Mar 11 01:18:19 crc kubenswrapper[4744]: I0311 01:18:19.994534 4744 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b3533d92-44d2-4d39-98af-b144b8a57d24" path="/var/lib/kubelet/pods/b3533d92-44d2-4d39-98af-b144b8a57d24/volumes" Mar 11 01:18:19 crc kubenswrapper[4744]: I0311 01:18:19.995072 4744 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="dc84843f-62da-435f-ab7a-2f0b97cb2ec7" path="/var/lib/kubelet/pods/dc84843f-62da-435f-ab7a-2f0b97cb2ec7/volumes" Mar 11 01:18:20 crc kubenswrapper[4744]: I0311 01:18:20.135182 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Mar 11 01:18:20 crc kubenswrapper[4744]: I0311 01:18:20.142745 4744 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Mar 11 01:18:20 crc kubenswrapper[4744]: I0311 01:18:20.189061 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 11 01:18:20 crc kubenswrapper[4744]: W0311 01:18:20.219662 4744 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb7268c12_3602_488a_8ef9_606bdf629c99.slice/crio-cb0a2054a5692fc75a0ec24bb05c20b763e3484d4f2527532b6e4394beb864dd WatchSource:0}: Error finding container cb0a2054a5692fc75a0ec24bb05c20b763e3484d4f2527532b6e4394beb864dd: Status 404 returned error can't find the container with id cb0a2054a5692fc75a0ec24bb05c20b763e3484d4f2527532b6e4394beb864dd Mar 11 01:18:20 crc kubenswrapper[4744]: I0311 01:18:20.329915 4744 generic.go:334] "Generic (PLEG): container finished" podID="0160a3e2-e1dd-4526-9280-1645846cee12" containerID="6d4e750ef38a2bd9997b7929eb792b4bf47d1020fd0488a54a44b9a96efb1a9c" exitCode=143 Mar 11 01:18:20 crc kubenswrapper[4744]: I0311 01:18:20.329981 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"0160a3e2-e1dd-4526-9280-1645846cee12","Type":"ContainerDied","Data":"6d4e750ef38a2bd9997b7929eb792b4bf47d1020fd0488a54a44b9a96efb1a9c"} Mar 11 01:18:20 crc kubenswrapper[4744]: I0311 01:18:20.331201 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"b7268c12-3602-488a-8ef9-606bdf629c99","Type":"ContainerStarted","Data":"cb0a2054a5692fc75a0ec24bb05c20b763e3484d4f2527532b6e4394beb864dd"} Mar 11 01:18:20 crc kubenswrapper[4744]: I0311 01:18:20.695121 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Mar 11 01:18:20 crc kubenswrapper[4744]: I0311 01:18:20.705401 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Mar 11 01:18:21 crc kubenswrapper[4744]: I0311 01:18:21.347015 4744 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"b7268c12-3602-488a-8ef9-606bdf629c99","Type":"ContainerStarted","Data":"ed6bbcf4cf54a519f10c91d716632ee4fb9dcf21de0115d4e485910375920b05"} Mar 11 01:18:21 crc kubenswrapper[4744]: I0311 01:18:21.349307 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"c5eee60b-3a4a-45b9-b25b-d3073e8e64f6","Type":"ContainerStarted","Data":"c51e044d8f85dd7af471f5b5d2f6420d2fa0ffb598a7139c948f2fdc1c97c354"} Mar 11 01:18:21 crc kubenswrapper[4744]: I0311 01:18:21.349342 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"c5eee60b-3a4a-45b9-b25b-d3073e8e64f6","Type":"ContainerStarted","Data":"8f7469ee4f9958606072cd8b2e275f93ead748d0d2bcd05491f6cddb8511e988"} Mar 11 01:18:21 crc kubenswrapper[4744]: I0311 01:18:21.349355 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"c5eee60b-3a4a-45b9-b25b-d3073e8e64f6","Type":"ContainerStarted","Data":"807a9412bd19874bec1274e32a33626c4ff50e2eac32261c4c24c34125cdedbf"} Mar 11 01:18:21 crc kubenswrapper[4744]: I0311 01:18:21.352313 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"755410bb-361b-47e2-8a7a-317119eec983","Type":"ContainerStarted","Data":"fd95bdd6dfcd2af9ddc8f3dd4babec6abaf1feb9cf63d194943025c2b4b217ae"} Mar 11 01:18:21 crc kubenswrapper[4744]: I0311 01:18:21.352345 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"755410bb-361b-47e2-8a7a-317119eec983","Type":"ContainerStarted","Data":"a3fe2f00533c3bbd5584921efe34313f88a72aac2e80a0dc9792ce9767bc2e5a"} Mar 11 01:18:21 crc kubenswrapper[4744]: I0311 01:18:21.395023 4744 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.394994201 podStartE2EDuration="2.394994201s" 
podCreationTimestamp="2026-03-11 01:18:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-11 01:18:21.378892473 +0000 UTC m=+1458.183110088" watchObservedRunningTime="2026-03-11 01:18:21.394994201 +0000 UTC m=+1458.199211826" Mar 11 01:18:21 crc kubenswrapper[4744]: I0311 01:18:21.410884 4744 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-novncproxy-0" podStartSLOduration=2.410869542 podStartE2EDuration="2.410869542s" podCreationTimestamp="2026-03-11 01:18:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-11 01:18:21.407034643 +0000 UTC m=+1458.211252248" watchObservedRunningTime="2026-03-11 01:18:21.410869542 +0000 UTC m=+1458.215087147" Mar 11 01:18:21 crc kubenswrapper[4744]: I0311 01:18:21.830352 4744 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 11 01:18:22 crc kubenswrapper[4744]: I0311 01:18:22.371469 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"b7268c12-3602-488a-8ef9-606bdf629c99","Type":"ContainerStarted","Data":"a709b7f7989928cd941295bf2b17bba6771bd6e57e2b98bebfe372bab0638bfb"} Mar 11 01:18:22 crc kubenswrapper[4744]: I0311 01:18:22.953154 4744 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Mar 11 01:18:23 crc kubenswrapper[4744]: I0311 01:18:23.068032 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0160a3e2-e1dd-4526-9280-1645846cee12-combined-ca-bundle\") pod \"0160a3e2-e1dd-4526-9280-1645846cee12\" (UID: \"0160a3e2-e1dd-4526-9280-1645846cee12\") " Mar 11 01:18:23 crc kubenswrapper[4744]: I0311 01:18:23.068092 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6pwc2\" (UniqueName: \"kubernetes.io/projected/0160a3e2-e1dd-4526-9280-1645846cee12-kube-api-access-6pwc2\") pod \"0160a3e2-e1dd-4526-9280-1645846cee12\" (UID: \"0160a3e2-e1dd-4526-9280-1645846cee12\") " Mar 11 01:18:23 crc kubenswrapper[4744]: I0311 01:18:23.068181 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0160a3e2-e1dd-4526-9280-1645846cee12-logs\") pod \"0160a3e2-e1dd-4526-9280-1645846cee12\" (UID: \"0160a3e2-e1dd-4526-9280-1645846cee12\") " Mar 11 01:18:23 crc kubenswrapper[4744]: I0311 01:18:23.068264 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0160a3e2-e1dd-4526-9280-1645846cee12-config-data\") pod \"0160a3e2-e1dd-4526-9280-1645846cee12\" (UID: \"0160a3e2-e1dd-4526-9280-1645846cee12\") " Mar 11 01:18:23 crc kubenswrapper[4744]: I0311 01:18:23.069227 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0160a3e2-e1dd-4526-9280-1645846cee12-logs" (OuterVolumeSpecName: "logs") pod "0160a3e2-e1dd-4526-9280-1645846cee12" (UID: "0160a3e2-e1dd-4526-9280-1645846cee12"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 11 01:18:23 crc kubenswrapper[4744]: I0311 01:18:23.096114 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0160a3e2-e1dd-4526-9280-1645846cee12-kube-api-access-6pwc2" (OuterVolumeSpecName: "kube-api-access-6pwc2") pod "0160a3e2-e1dd-4526-9280-1645846cee12" (UID: "0160a3e2-e1dd-4526-9280-1645846cee12"). InnerVolumeSpecName "kube-api-access-6pwc2". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 01:18:23 crc kubenswrapper[4744]: I0311 01:18:23.112815 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0160a3e2-e1dd-4526-9280-1645846cee12-config-data" (OuterVolumeSpecName: "config-data") pod "0160a3e2-e1dd-4526-9280-1645846cee12" (UID: "0160a3e2-e1dd-4526-9280-1645846cee12"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 01:18:23 crc kubenswrapper[4744]: I0311 01:18:23.145227 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0160a3e2-e1dd-4526-9280-1645846cee12-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "0160a3e2-e1dd-4526-9280-1645846cee12" (UID: "0160a3e2-e1dd-4526-9280-1645846cee12"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 01:18:23 crc kubenswrapper[4744]: I0311 01:18:23.172617 4744 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0160a3e2-e1dd-4526-9280-1645846cee12-config-data\") on node \"crc\" DevicePath \"\"" Mar 11 01:18:23 crc kubenswrapper[4744]: I0311 01:18:23.172651 4744 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0160a3e2-e1dd-4526-9280-1645846cee12-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 11 01:18:23 crc kubenswrapper[4744]: I0311 01:18:23.172663 4744 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6pwc2\" (UniqueName: \"kubernetes.io/projected/0160a3e2-e1dd-4526-9280-1645846cee12-kube-api-access-6pwc2\") on node \"crc\" DevicePath \"\"" Mar 11 01:18:23 crc kubenswrapper[4744]: I0311 01:18:23.172672 4744 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0160a3e2-e1dd-4526-9280-1645846cee12-logs\") on node \"crc\" DevicePath \"\"" Mar 11 01:18:23 crc kubenswrapper[4744]: I0311 01:18:23.378983 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"b7268c12-3602-488a-8ef9-606bdf629c99","Type":"ContainerStarted","Data":"c2793377c0ca5ce05edccec9e4d03d1a1f0b2c742156faccf67df9cdf71392bc"} Mar 11 01:18:23 crc kubenswrapper[4744]: I0311 01:18:23.380721 4744 generic.go:334] "Generic (PLEG): container finished" podID="0160a3e2-e1dd-4526-9280-1645846cee12" containerID="ebddc0586887b8e1b258bb855afecb531e54592cb034d8ab733121c6c2b0e3ab" exitCode=0 Mar 11 01:18:23 crc kubenswrapper[4744]: I0311 01:18:23.380761 4744 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Mar 11 01:18:23 crc kubenswrapper[4744]: I0311 01:18:23.380763 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"0160a3e2-e1dd-4526-9280-1645846cee12","Type":"ContainerDied","Data":"ebddc0586887b8e1b258bb855afecb531e54592cb034d8ab733121c6c2b0e3ab"} Mar 11 01:18:23 crc kubenswrapper[4744]: I0311 01:18:23.380795 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"0160a3e2-e1dd-4526-9280-1645846cee12","Type":"ContainerDied","Data":"7cb8ccc865629d7c5966d50f75339a2b93f7a84552e98ec2bd0136d6af973a49"} Mar 11 01:18:23 crc kubenswrapper[4744]: I0311 01:18:23.380812 4744 scope.go:117] "RemoveContainer" containerID="ebddc0586887b8e1b258bb855afecb531e54592cb034d8ab733121c6c2b0e3ab" Mar 11 01:18:23 crc kubenswrapper[4744]: I0311 01:18:23.400709 4744 scope.go:117] "RemoveContainer" containerID="6d4e750ef38a2bd9997b7929eb792b4bf47d1020fd0488a54a44b9a96efb1a9c" Mar 11 01:18:23 crc kubenswrapper[4744]: I0311 01:18:23.412436 4744 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Mar 11 01:18:23 crc kubenswrapper[4744]: I0311 01:18:23.421912 4744 scope.go:117] "RemoveContainer" containerID="ebddc0586887b8e1b258bb855afecb531e54592cb034d8ab733121c6c2b0e3ab" Mar 11 01:18:23 crc kubenswrapper[4744]: E0311 01:18:23.422364 4744 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ebddc0586887b8e1b258bb855afecb531e54592cb034d8ab733121c6c2b0e3ab\": container with ID starting with ebddc0586887b8e1b258bb855afecb531e54592cb034d8ab733121c6c2b0e3ab not found: ID does not exist" containerID="ebddc0586887b8e1b258bb855afecb531e54592cb034d8ab733121c6c2b0e3ab" Mar 11 01:18:23 crc kubenswrapper[4744]: I0311 01:18:23.422406 4744 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"ebddc0586887b8e1b258bb855afecb531e54592cb034d8ab733121c6c2b0e3ab"} err="failed to get container status \"ebddc0586887b8e1b258bb855afecb531e54592cb034d8ab733121c6c2b0e3ab\": rpc error: code = NotFound desc = could not find container \"ebddc0586887b8e1b258bb855afecb531e54592cb034d8ab733121c6c2b0e3ab\": container with ID starting with ebddc0586887b8e1b258bb855afecb531e54592cb034d8ab733121c6c2b0e3ab not found: ID does not exist" Mar 11 01:18:23 crc kubenswrapper[4744]: I0311 01:18:23.422435 4744 scope.go:117] "RemoveContainer" containerID="6d4e750ef38a2bd9997b7929eb792b4bf47d1020fd0488a54a44b9a96efb1a9c" Mar 11 01:18:23 crc kubenswrapper[4744]: E0311 01:18:23.422911 4744 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6d4e750ef38a2bd9997b7929eb792b4bf47d1020fd0488a54a44b9a96efb1a9c\": container with ID starting with 6d4e750ef38a2bd9997b7929eb792b4bf47d1020fd0488a54a44b9a96efb1a9c not found: ID does not exist" containerID="6d4e750ef38a2bd9997b7929eb792b4bf47d1020fd0488a54a44b9a96efb1a9c" Mar 11 01:18:23 crc kubenswrapper[4744]: I0311 01:18:23.422944 4744 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6d4e750ef38a2bd9997b7929eb792b4bf47d1020fd0488a54a44b9a96efb1a9c"} err="failed to get container status \"6d4e750ef38a2bd9997b7929eb792b4bf47d1020fd0488a54a44b9a96efb1a9c\": rpc error: code = NotFound desc = could not find container \"6d4e750ef38a2bd9997b7929eb792b4bf47d1020fd0488a54a44b9a96efb1a9c\": container with ID starting with 6d4e750ef38a2bd9997b7929eb792b4bf47d1020fd0488a54a44b9a96efb1a9c not found: ID does not exist" Mar 11 01:18:23 crc kubenswrapper[4744]: I0311 01:18:23.427770 4744 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Mar 11 01:18:23 crc kubenswrapper[4744]: I0311 01:18:23.436764 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Mar 11 
01:18:23 crc kubenswrapper[4744]: E0311 01:18:23.437153 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0160a3e2-e1dd-4526-9280-1645846cee12" containerName="nova-api-api" Mar 11 01:18:23 crc kubenswrapper[4744]: I0311 01:18:23.437166 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="0160a3e2-e1dd-4526-9280-1645846cee12" containerName="nova-api-api" Mar 11 01:18:23 crc kubenswrapper[4744]: E0311 01:18:23.437199 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0160a3e2-e1dd-4526-9280-1645846cee12" containerName="nova-api-log" Mar 11 01:18:23 crc kubenswrapper[4744]: I0311 01:18:23.437206 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="0160a3e2-e1dd-4526-9280-1645846cee12" containerName="nova-api-log" Mar 11 01:18:23 crc kubenswrapper[4744]: I0311 01:18:23.437368 4744 memory_manager.go:354] "RemoveStaleState removing state" podUID="0160a3e2-e1dd-4526-9280-1645846cee12" containerName="nova-api-log" Mar 11 01:18:23 crc kubenswrapper[4744]: I0311 01:18:23.437391 4744 memory_manager.go:354] "RemoveStaleState removing state" podUID="0160a3e2-e1dd-4526-9280-1645846cee12" containerName="nova-api-api" Mar 11 01:18:23 crc kubenswrapper[4744]: I0311 01:18:23.438293 4744 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Mar 11 01:18:23 crc kubenswrapper[4744]: I0311 01:18:23.440803 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Mar 11 01:18:23 crc kubenswrapper[4744]: I0311 01:18:23.440976 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-internal-svc" Mar 11 01:18:23 crc kubenswrapper[4744]: I0311 01:18:23.441095 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-public-svc" Mar 11 01:18:23 crc kubenswrapper[4744]: I0311 01:18:23.445442 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Mar 11 01:18:23 crc kubenswrapper[4744]: I0311 01:18:23.578632 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f5pc4\" (UniqueName: \"kubernetes.io/projected/a8e3c2f5-67c1-4e20-ab17-99638fa00963-kube-api-access-f5pc4\") pod \"nova-api-0\" (UID: \"a8e3c2f5-67c1-4e20-ab17-99638fa00963\") " pod="openstack/nova-api-0" Mar 11 01:18:23 crc kubenswrapper[4744]: I0311 01:18:23.578673 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a8e3c2f5-67c1-4e20-ab17-99638fa00963-logs\") pod \"nova-api-0\" (UID: \"a8e3c2f5-67c1-4e20-ab17-99638fa00963\") " pod="openstack/nova-api-0" Mar 11 01:18:23 crc kubenswrapper[4744]: I0311 01:18:23.578880 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/a8e3c2f5-67c1-4e20-ab17-99638fa00963-public-tls-certs\") pod \"nova-api-0\" (UID: \"a8e3c2f5-67c1-4e20-ab17-99638fa00963\") " pod="openstack/nova-api-0" Mar 11 01:18:23 crc kubenswrapper[4744]: I0311 01:18:23.579148 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/a8e3c2f5-67c1-4e20-ab17-99638fa00963-config-data\") pod \"nova-api-0\" (UID: \"a8e3c2f5-67c1-4e20-ab17-99638fa00963\") " pod="openstack/nova-api-0" Mar 11 01:18:23 crc kubenswrapper[4744]: I0311 01:18:23.579232 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a8e3c2f5-67c1-4e20-ab17-99638fa00963-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"a8e3c2f5-67c1-4e20-ab17-99638fa00963\") " pod="openstack/nova-api-0" Mar 11 01:18:23 crc kubenswrapper[4744]: I0311 01:18:23.579432 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/a8e3c2f5-67c1-4e20-ab17-99638fa00963-internal-tls-certs\") pod \"nova-api-0\" (UID: \"a8e3c2f5-67c1-4e20-ab17-99638fa00963\") " pod="openstack/nova-api-0" Mar 11 01:18:23 crc kubenswrapper[4744]: I0311 01:18:23.681728 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a8e3c2f5-67c1-4e20-ab17-99638fa00963-config-data\") pod \"nova-api-0\" (UID: \"a8e3c2f5-67c1-4e20-ab17-99638fa00963\") " pod="openstack/nova-api-0" Mar 11 01:18:23 crc kubenswrapper[4744]: I0311 01:18:23.681788 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a8e3c2f5-67c1-4e20-ab17-99638fa00963-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"a8e3c2f5-67c1-4e20-ab17-99638fa00963\") " pod="openstack/nova-api-0" Mar 11 01:18:23 crc kubenswrapper[4744]: I0311 01:18:23.681831 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/a8e3c2f5-67c1-4e20-ab17-99638fa00963-internal-tls-certs\") pod \"nova-api-0\" (UID: \"a8e3c2f5-67c1-4e20-ab17-99638fa00963\") " pod="openstack/nova-api-0" Mar 11 
01:18:23 crc kubenswrapper[4744]: I0311 01:18:23.681864 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f5pc4\" (UniqueName: \"kubernetes.io/projected/a8e3c2f5-67c1-4e20-ab17-99638fa00963-kube-api-access-f5pc4\") pod \"nova-api-0\" (UID: \"a8e3c2f5-67c1-4e20-ab17-99638fa00963\") " pod="openstack/nova-api-0" Mar 11 01:18:23 crc kubenswrapper[4744]: I0311 01:18:23.681882 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a8e3c2f5-67c1-4e20-ab17-99638fa00963-logs\") pod \"nova-api-0\" (UID: \"a8e3c2f5-67c1-4e20-ab17-99638fa00963\") " pod="openstack/nova-api-0" Mar 11 01:18:23 crc kubenswrapper[4744]: I0311 01:18:23.681919 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/a8e3c2f5-67c1-4e20-ab17-99638fa00963-public-tls-certs\") pod \"nova-api-0\" (UID: \"a8e3c2f5-67c1-4e20-ab17-99638fa00963\") " pod="openstack/nova-api-0" Mar 11 01:18:23 crc kubenswrapper[4744]: I0311 01:18:23.682798 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a8e3c2f5-67c1-4e20-ab17-99638fa00963-logs\") pod \"nova-api-0\" (UID: \"a8e3c2f5-67c1-4e20-ab17-99638fa00963\") " pod="openstack/nova-api-0" Mar 11 01:18:23 crc kubenswrapper[4744]: I0311 01:18:23.686463 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a8e3c2f5-67c1-4e20-ab17-99638fa00963-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"a8e3c2f5-67c1-4e20-ab17-99638fa00963\") " pod="openstack/nova-api-0" Mar 11 01:18:23 crc kubenswrapper[4744]: I0311 01:18:23.686917 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/a8e3c2f5-67c1-4e20-ab17-99638fa00963-internal-tls-certs\") pod \"nova-api-0\" 
(UID: \"a8e3c2f5-67c1-4e20-ab17-99638fa00963\") " pod="openstack/nova-api-0" Mar 11 01:18:23 crc kubenswrapper[4744]: I0311 01:18:23.686991 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/a8e3c2f5-67c1-4e20-ab17-99638fa00963-public-tls-certs\") pod \"nova-api-0\" (UID: \"a8e3c2f5-67c1-4e20-ab17-99638fa00963\") " pod="openstack/nova-api-0" Mar 11 01:18:23 crc kubenswrapper[4744]: I0311 01:18:23.688407 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a8e3c2f5-67c1-4e20-ab17-99638fa00963-config-data\") pod \"nova-api-0\" (UID: \"a8e3c2f5-67c1-4e20-ab17-99638fa00963\") " pod="openstack/nova-api-0" Mar 11 01:18:23 crc kubenswrapper[4744]: I0311 01:18:23.702142 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f5pc4\" (UniqueName: \"kubernetes.io/projected/a8e3c2f5-67c1-4e20-ab17-99638fa00963-kube-api-access-f5pc4\") pod \"nova-api-0\" (UID: \"a8e3c2f5-67c1-4e20-ab17-99638fa00963\") " pod="openstack/nova-api-0" Mar 11 01:18:23 crc kubenswrapper[4744]: I0311 01:18:23.787854 4744 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Mar 11 01:18:23 crc kubenswrapper[4744]: I0311 01:18:23.992147 4744 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0160a3e2-e1dd-4526-9280-1645846cee12" path="/var/lib/kubelet/pods/0160a3e2-e1dd-4526-9280-1645846cee12/volumes" Mar 11 01:18:24 crc kubenswrapper[4744]: I0311 01:18:24.241914 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Mar 11 01:18:24 crc kubenswrapper[4744]: W0311 01:18:24.252281 4744 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda8e3c2f5_67c1_4e20_ab17_99638fa00963.slice/crio-15265107423e927b6bdc79df3a3ef75b1f099234a494eec2c86eeba9ff5e682e WatchSource:0}: Error finding container 15265107423e927b6bdc79df3a3ef75b1f099234a494eec2c86eeba9ff5e682e: Status 404 returned error can't find the container with id 15265107423e927b6bdc79df3a3ef75b1f099234a494eec2c86eeba9ff5e682e Mar 11 01:18:24 crc kubenswrapper[4744]: I0311 01:18:24.397190 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"b7268c12-3602-488a-8ef9-606bdf629c99","Type":"ContainerStarted","Data":"d029e25bc26b2d86f29d6c0a65d281983dda59b6ce3ee62f24f6b8bf2dac6563"} Mar 11 01:18:24 crc kubenswrapper[4744]: I0311 01:18:24.397380 4744 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="b7268c12-3602-488a-8ef9-606bdf629c99" containerName="ceilometer-central-agent" containerID="cri-o://ed6bbcf4cf54a519f10c91d716632ee4fb9dcf21de0115d4e485910375920b05" gracePeriod=30 Mar 11 01:18:24 crc kubenswrapper[4744]: I0311 01:18:24.397676 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Mar 11 01:18:24 crc kubenswrapper[4744]: I0311 01:18:24.398027 4744 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" 
podUID="b7268c12-3602-488a-8ef9-606bdf629c99" containerName="proxy-httpd" containerID="cri-o://d029e25bc26b2d86f29d6c0a65d281983dda59b6ce3ee62f24f6b8bf2dac6563" gracePeriod=30 Mar 11 01:18:24 crc kubenswrapper[4744]: I0311 01:18:24.398085 4744 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="b7268c12-3602-488a-8ef9-606bdf629c99" containerName="sg-core" containerID="cri-o://c2793377c0ca5ce05edccec9e4d03d1a1f0b2c742156faccf67df9cdf71392bc" gracePeriod=30 Mar 11 01:18:24 crc kubenswrapper[4744]: I0311 01:18:24.398131 4744 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="b7268c12-3602-488a-8ef9-606bdf629c99" containerName="ceilometer-notification-agent" containerID="cri-o://a709b7f7989928cd941295bf2b17bba6771bd6e57e2b98bebfe372bab0638bfb" gracePeriod=30 Mar 11 01:18:24 crc kubenswrapper[4744]: I0311 01:18:24.404393 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"a8e3c2f5-67c1-4e20-ab17-99638fa00963","Type":"ContainerStarted","Data":"15265107423e927b6bdc79df3a3ef75b1f099234a494eec2c86eeba9ff5e682e"} Mar 11 01:18:24 crc kubenswrapper[4744]: I0311 01:18:24.435465 4744 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=1.553755577 podStartE2EDuration="5.435447517s" podCreationTimestamp="2026-03-11 01:18:19 +0000 UTC" firstStartedPulling="2026-03-11 01:18:20.222328079 +0000 UTC m=+1457.026545694" lastFinishedPulling="2026-03-11 01:18:24.104020029 +0000 UTC m=+1460.908237634" observedRunningTime="2026-03-11 01:18:24.426164049 +0000 UTC m=+1461.230381664" watchObservedRunningTime="2026-03-11 01:18:24.435447517 +0000 UTC m=+1461.239665132" Mar 11 01:18:25 crc kubenswrapper[4744]: I0311 01:18:25.135635 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-novncproxy-0" Mar 11 01:18:25 crc kubenswrapper[4744]: 
I0311 01:18:25.143550 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Mar 11 01:18:25 crc kubenswrapper[4744]: I0311 01:18:25.143606 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Mar 11 01:18:25 crc kubenswrapper[4744]: I0311 01:18:25.418180 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"a8e3c2f5-67c1-4e20-ab17-99638fa00963","Type":"ContainerStarted","Data":"85d053e33e94f64e212b16bf47d122be6ce781e756c1b75d39305bd8e7791276"} Mar 11 01:18:25 crc kubenswrapper[4744]: I0311 01:18:25.418227 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"a8e3c2f5-67c1-4e20-ab17-99638fa00963","Type":"ContainerStarted","Data":"eabf8cb5892c59b5607319c157da84e1c564f5113a8e0deb5471cf04d5265f04"} Mar 11 01:18:25 crc kubenswrapper[4744]: I0311 01:18:25.436138 4744 generic.go:334] "Generic (PLEG): container finished" podID="b7268c12-3602-488a-8ef9-606bdf629c99" containerID="c2793377c0ca5ce05edccec9e4d03d1a1f0b2c742156faccf67df9cdf71392bc" exitCode=2 Mar 11 01:18:25 crc kubenswrapper[4744]: I0311 01:18:25.436184 4744 generic.go:334] "Generic (PLEG): container finished" podID="b7268c12-3602-488a-8ef9-606bdf629c99" containerID="a709b7f7989928cd941295bf2b17bba6771bd6e57e2b98bebfe372bab0638bfb" exitCode=0 Mar 11 01:18:25 crc kubenswrapper[4744]: I0311 01:18:25.436214 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"b7268c12-3602-488a-8ef9-606bdf629c99","Type":"ContainerDied","Data":"c2793377c0ca5ce05edccec9e4d03d1a1f0b2c742156faccf67df9cdf71392bc"} Mar 11 01:18:25 crc kubenswrapper[4744]: I0311 01:18:25.436274 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"b7268c12-3602-488a-8ef9-606bdf629c99","Type":"ContainerDied","Data":"a709b7f7989928cd941295bf2b17bba6771bd6e57e2b98bebfe372bab0638bfb"} Mar 11 
01:18:25 crc kubenswrapper[4744]: I0311 01:18:25.448578 4744 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.44855136 podStartE2EDuration="2.44855136s" podCreationTimestamp="2026-03-11 01:18:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-11 01:18:25.437469117 +0000 UTC m=+1462.241686722" watchObservedRunningTime="2026-03-11 01:18:25.44855136 +0000 UTC m=+1462.252768975" Mar 11 01:18:26 crc kubenswrapper[4744]: I0311 01:18:26.766838 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-57db588689-85hjj" Mar 11 01:18:26 crc kubenswrapper[4744]: I0311 01:18:26.847879 4744 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6d5695c9cc-mrlhc"] Mar 11 01:18:26 crc kubenswrapper[4744]: I0311 01:18:26.848358 4744 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-6d5695c9cc-mrlhc" podUID="fdec9153-74cd-4e53-8667-f96ed2dad143" containerName="dnsmasq-dns" containerID="cri-o://0caaea6d05fbcba96a10135749cd92f5a5e02dfbefa1e25d0c526e1a6a0e3626" gracePeriod=10 Mar 11 01:18:27 crc kubenswrapper[4744]: I0311 01:18:27.390599 4744 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6d5695c9cc-mrlhc" Mar 11 01:18:27 crc kubenswrapper[4744]: I0311 01:18:27.465634 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/fdec9153-74cd-4e53-8667-f96ed2dad143-dns-swift-storage-0\") pod \"fdec9153-74cd-4e53-8667-f96ed2dad143\" (UID: \"fdec9153-74cd-4e53-8667-f96ed2dad143\") " Mar 11 01:18:27 crc kubenswrapper[4744]: I0311 01:18:27.465733 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fdec9153-74cd-4e53-8667-f96ed2dad143-config\") pod \"fdec9153-74cd-4e53-8667-f96ed2dad143\" (UID: \"fdec9153-74cd-4e53-8667-f96ed2dad143\") " Mar 11 01:18:27 crc kubenswrapper[4744]: I0311 01:18:27.465901 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ttmfj\" (UniqueName: \"kubernetes.io/projected/fdec9153-74cd-4e53-8667-f96ed2dad143-kube-api-access-ttmfj\") pod \"fdec9153-74cd-4e53-8667-f96ed2dad143\" (UID: \"fdec9153-74cd-4e53-8667-f96ed2dad143\") " Mar 11 01:18:27 crc kubenswrapper[4744]: I0311 01:18:27.466015 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/fdec9153-74cd-4e53-8667-f96ed2dad143-ovsdbserver-sb\") pod \"fdec9153-74cd-4e53-8667-f96ed2dad143\" (UID: \"fdec9153-74cd-4e53-8667-f96ed2dad143\") " Mar 11 01:18:27 crc kubenswrapper[4744]: I0311 01:18:27.466047 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/fdec9153-74cd-4e53-8667-f96ed2dad143-dns-svc\") pod \"fdec9153-74cd-4e53-8667-f96ed2dad143\" (UID: \"fdec9153-74cd-4e53-8667-f96ed2dad143\") " Mar 11 01:18:27 crc kubenswrapper[4744]: I0311 01:18:27.466087 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" 
(UniqueName: \"kubernetes.io/configmap/fdec9153-74cd-4e53-8667-f96ed2dad143-ovsdbserver-nb\") pod \"fdec9153-74cd-4e53-8667-f96ed2dad143\" (UID: \"fdec9153-74cd-4e53-8667-f96ed2dad143\") " Mar 11 01:18:27 crc kubenswrapper[4744]: I0311 01:18:27.478438 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fdec9153-74cd-4e53-8667-f96ed2dad143-kube-api-access-ttmfj" (OuterVolumeSpecName: "kube-api-access-ttmfj") pod "fdec9153-74cd-4e53-8667-f96ed2dad143" (UID: "fdec9153-74cd-4e53-8667-f96ed2dad143"). InnerVolumeSpecName "kube-api-access-ttmfj". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 01:18:27 crc kubenswrapper[4744]: I0311 01:18:27.490592 4744 generic.go:334] "Generic (PLEG): container finished" podID="fdec9153-74cd-4e53-8667-f96ed2dad143" containerID="0caaea6d05fbcba96a10135749cd92f5a5e02dfbefa1e25d0c526e1a6a0e3626" exitCode=0 Mar 11 01:18:27 crc kubenswrapper[4744]: I0311 01:18:27.490659 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6d5695c9cc-mrlhc" event={"ID":"fdec9153-74cd-4e53-8667-f96ed2dad143","Type":"ContainerDied","Data":"0caaea6d05fbcba96a10135749cd92f5a5e02dfbefa1e25d0c526e1a6a0e3626"} Mar 11 01:18:27 crc kubenswrapper[4744]: I0311 01:18:27.490686 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6d5695c9cc-mrlhc" event={"ID":"fdec9153-74cd-4e53-8667-f96ed2dad143","Type":"ContainerDied","Data":"ab6b5a85661b8cd2622da648690b45d22058217032f6cb51b351ed6142c1b8a7"} Mar 11 01:18:27 crc kubenswrapper[4744]: I0311 01:18:27.490704 4744 scope.go:117] "RemoveContainer" containerID="0caaea6d05fbcba96a10135749cd92f5a5e02dfbefa1e25d0c526e1a6a0e3626" Mar 11 01:18:27 crc kubenswrapper[4744]: I0311 01:18:27.490785 4744 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6d5695c9cc-mrlhc" Mar 11 01:18:27 crc kubenswrapper[4744]: I0311 01:18:27.494033 4744 generic.go:334] "Generic (PLEG): container finished" podID="b7268c12-3602-488a-8ef9-606bdf629c99" containerID="ed6bbcf4cf54a519f10c91d716632ee4fb9dcf21de0115d4e485910375920b05" exitCode=0 Mar 11 01:18:27 crc kubenswrapper[4744]: I0311 01:18:27.494076 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"b7268c12-3602-488a-8ef9-606bdf629c99","Type":"ContainerDied","Data":"ed6bbcf4cf54a519f10c91d716632ee4fb9dcf21de0115d4e485910375920b05"} Mar 11 01:18:27 crc kubenswrapper[4744]: I0311 01:18:27.524185 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fdec9153-74cd-4e53-8667-f96ed2dad143-config" (OuterVolumeSpecName: "config") pod "fdec9153-74cd-4e53-8667-f96ed2dad143" (UID: "fdec9153-74cd-4e53-8667-f96ed2dad143"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 11 01:18:27 crc kubenswrapper[4744]: I0311 01:18:27.528119 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fdec9153-74cd-4e53-8667-f96ed2dad143-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "fdec9153-74cd-4e53-8667-f96ed2dad143" (UID: "fdec9153-74cd-4e53-8667-f96ed2dad143"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 11 01:18:27 crc kubenswrapper[4744]: I0311 01:18:27.539106 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fdec9153-74cd-4e53-8667-f96ed2dad143-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "fdec9153-74cd-4e53-8667-f96ed2dad143" (UID: "fdec9153-74cd-4e53-8667-f96ed2dad143"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 11 01:18:27 crc kubenswrapper[4744]: I0311 01:18:27.540168 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fdec9153-74cd-4e53-8667-f96ed2dad143-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "fdec9153-74cd-4e53-8667-f96ed2dad143" (UID: "fdec9153-74cd-4e53-8667-f96ed2dad143"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 11 01:18:27 crc kubenswrapper[4744]: I0311 01:18:27.541428 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fdec9153-74cd-4e53-8667-f96ed2dad143-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "fdec9153-74cd-4e53-8667-f96ed2dad143" (UID: "fdec9153-74cd-4e53-8667-f96ed2dad143"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 11 01:18:27 crc kubenswrapper[4744]: I0311 01:18:27.546840 4744 scope.go:117] "RemoveContainer" containerID="faebff480caa3f73170f473f21cac0b165adaaae53a55bde98bc11afd3fc6201" Mar 11 01:18:27 crc kubenswrapper[4744]: I0311 01:18:27.565703 4744 scope.go:117] "RemoveContainer" containerID="0caaea6d05fbcba96a10135749cd92f5a5e02dfbefa1e25d0c526e1a6a0e3626" Mar 11 01:18:27 crc kubenswrapper[4744]: E0311 01:18:27.566268 4744 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0caaea6d05fbcba96a10135749cd92f5a5e02dfbefa1e25d0c526e1a6a0e3626\": container with ID starting with 0caaea6d05fbcba96a10135749cd92f5a5e02dfbefa1e25d0c526e1a6a0e3626 not found: ID does not exist" containerID="0caaea6d05fbcba96a10135749cd92f5a5e02dfbefa1e25d0c526e1a6a0e3626" Mar 11 01:18:27 crc kubenswrapper[4744]: I0311 01:18:27.566300 4744 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0caaea6d05fbcba96a10135749cd92f5a5e02dfbefa1e25d0c526e1a6a0e3626"} err="failed to get 
container status \"0caaea6d05fbcba96a10135749cd92f5a5e02dfbefa1e25d0c526e1a6a0e3626\": rpc error: code = NotFound desc = could not find container \"0caaea6d05fbcba96a10135749cd92f5a5e02dfbefa1e25d0c526e1a6a0e3626\": container with ID starting with 0caaea6d05fbcba96a10135749cd92f5a5e02dfbefa1e25d0c526e1a6a0e3626 not found: ID does not exist" Mar 11 01:18:27 crc kubenswrapper[4744]: I0311 01:18:27.566324 4744 scope.go:117] "RemoveContainer" containerID="faebff480caa3f73170f473f21cac0b165adaaae53a55bde98bc11afd3fc6201" Mar 11 01:18:27 crc kubenswrapper[4744]: E0311 01:18:27.566645 4744 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"faebff480caa3f73170f473f21cac0b165adaaae53a55bde98bc11afd3fc6201\": container with ID starting with faebff480caa3f73170f473f21cac0b165adaaae53a55bde98bc11afd3fc6201 not found: ID does not exist" containerID="faebff480caa3f73170f473f21cac0b165adaaae53a55bde98bc11afd3fc6201" Mar 11 01:18:27 crc kubenswrapper[4744]: I0311 01:18:27.566669 4744 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"faebff480caa3f73170f473f21cac0b165adaaae53a55bde98bc11afd3fc6201"} err="failed to get container status \"faebff480caa3f73170f473f21cac0b165adaaae53a55bde98bc11afd3fc6201\": rpc error: code = NotFound desc = could not find container \"faebff480caa3f73170f473f21cac0b165adaaae53a55bde98bc11afd3fc6201\": container with ID starting with faebff480caa3f73170f473f21cac0b165adaaae53a55bde98bc11afd3fc6201 not found: ID does not exist" Mar 11 01:18:27 crc kubenswrapper[4744]: I0311 01:18:27.567775 4744 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/fdec9153-74cd-4e53-8667-f96ed2dad143-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 11 01:18:27 crc kubenswrapper[4744]: I0311 01:18:27.567788 4744 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: 
\"kubernetes.io/configmap/fdec9153-74cd-4e53-8667-f96ed2dad143-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 11 01:18:27 crc kubenswrapper[4744]: I0311 01:18:27.567796 4744 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/fdec9153-74cd-4e53-8667-f96ed2dad143-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 11 01:18:27 crc kubenswrapper[4744]: I0311 01:18:27.567804 4744 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/fdec9153-74cd-4e53-8667-f96ed2dad143-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Mar 11 01:18:27 crc kubenswrapper[4744]: I0311 01:18:27.567813 4744 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fdec9153-74cd-4e53-8667-f96ed2dad143-config\") on node \"crc\" DevicePath \"\"" Mar 11 01:18:27 crc kubenswrapper[4744]: I0311 01:18:27.567822 4744 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ttmfj\" (UniqueName: \"kubernetes.io/projected/fdec9153-74cd-4e53-8667-f96ed2dad143-kube-api-access-ttmfj\") on node \"crc\" DevicePath \"\"" Mar 11 01:18:28 crc kubenswrapper[4744]: I0311 01:18:27.854565 4744 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6d5695c9cc-mrlhc"] Mar 11 01:18:28 crc kubenswrapper[4744]: I0311 01:18:27.858320 4744 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-6d5695c9cc-mrlhc"] Mar 11 01:18:28 crc kubenswrapper[4744]: I0311 01:18:27.991665 4744 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fdec9153-74cd-4e53-8667-f96ed2dad143" path="/var/lib/kubelet/pods/fdec9153-74cd-4e53-8667-f96ed2dad143/volumes" Mar 11 01:18:30 crc kubenswrapper[4744]: I0311 01:18:30.135908 4744 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-cell1-novncproxy-0" Mar 11 01:18:30 crc kubenswrapper[4744]: I0311 
01:18:30.142939 4744 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Mar 11 01:18:30 crc kubenswrapper[4744]: I0311 01:18:30.143003 4744 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Mar 11 01:18:30 crc kubenswrapper[4744]: I0311 01:18:30.167834 4744 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-cell1-novncproxy-0" Mar 11 01:18:30 crc kubenswrapper[4744]: I0311 01:18:30.553932 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell1-novncproxy-0" Mar 11 01:18:30 crc kubenswrapper[4744]: I0311 01:18:30.725297 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-cell-mapping-pw8fv"] Mar 11 01:18:30 crc kubenswrapper[4744]: E0311 01:18:30.725805 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fdec9153-74cd-4e53-8667-f96ed2dad143" containerName="init" Mar 11 01:18:30 crc kubenswrapper[4744]: I0311 01:18:30.725828 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="fdec9153-74cd-4e53-8667-f96ed2dad143" containerName="init" Mar 11 01:18:30 crc kubenswrapper[4744]: E0311 01:18:30.725861 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fdec9153-74cd-4e53-8667-f96ed2dad143" containerName="dnsmasq-dns" Mar 11 01:18:30 crc kubenswrapper[4744]: I0311 01:18:30.725871 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="fdec9153-74cd-4e53-8667-f96ed2dad143" containerName="dnsmasq-dns" Mar 11 01:18:30 crc kubenswrapper[4744]: I0311 01:18:30.726080 4744 memory_manager.go:354] "RemoveStaleState removing state" podUID="fdec9153-74cd-4e53-8667-f96ed2dad143" containerName="dnsmasq-dns" Mar 11 01:18:30 crc kubenswrapper[4744]: I0311 01:18:30.726945 4744 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-cell-mapping-pw8fv" Mar 11 01:18:30 crc kubenswrapper[4744]: I0311 01:18:30.731898 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-manage-config-data" Mar 11 01:18:30 crc kubenswrapper[4744]: I0311 01:18:30.732239 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-manage-scripts" Mar 11 01:18:30 crc kubenswrapper[4744]: I0311 01:18:30.737729 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-cell-mapping-pw8fv"] Mar 11 01:18:30 crc kubenswrapper[4744]: I0311 01:18:30.853714 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/86af3abb-ed19-4d12-9eb1-da0f54a41fcc-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-pw8fv\" (UID: \"86af3abb-ed19-4d12-9eb1-da0f54a41fcc\") " pod="openstack/nova-cell1-cell-mapping-pw8fv" Mar 11 01:18:30 crc kubenswrapper[4744]: I0311 01:18:30.853754 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cgvlj\" (UniqueName: \"kubernetes.io/projected/86af3abb-ed19-4d12-9eb1-da0f54a41fcc-kube-api-access-cgvlj\") pod \"nova-cell1-cell-mapping-pw8fv\" (UID: \"86af3abb-ed19-4d12-9eb1-da0f54a41fcc\") " pod="openstack/nova-cell1-cell-mapping-pw8fv" Mar 11 01:18:30 crc kubenswrapper[4744]: I0311 01:18:30.853819 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/86af3abb-ed19-4d12-9eb1-da0f54a41fcc-scripts\") pod \"nova-cell1-cell-mapping-pw8fv\" (UID: \"86af3abb-ed19-4d12-9eb1-da0f54a41fcc\") " pod="openstack/nova-cell1-cell-mapping-pw8fv" Mar 11 01:18:30 crc kubenswrapper[4744]: I0311 01:18:30.854057 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" 
(UniqueName: \"kubernetes.io/secret/86af3abb-ed19-4d12-9eb1-da0f54a41fcc-config-data\") pod \"nova-cell1-cell-mapping-pw8fv\" (UID: \"86af3abb-ed19-4d12-9eb1-da0f54a41fcc\") " pod="openstack/nova-cell1-cell-mapping-pw8fv" Mar 11 01:18:30 crc kubenswrapper[4744]: I0311 01:18:30.955928 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/86af3abb-ed19-4d12-9eb1-da0f54a41fcc-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-pw8fv\" (UID: \"86af3abb-ed19-4d12-9eb1-da0f54a41fcc\") " pod="openstack/nova-cell1-cell-mapping-pw8fv" Mar 11 01:18:30 crc kubenswrapper[4744]: I0311 01:18:30.955990 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cgvlj\" (UniqueName: \"kubernetes.io/projected/86af3abb-ed19-4d12-9eb1-da0f54a41fcc-kube-api-access-cgvlj\") pod \"nova-cell1-cell-mapping-pw8fv\" (UID: \"86af3abb-ed19-4d12-9eb1-da0f54a41fcc\") " pod="openstack/nova-cell1-cell-mapping-pw8fv" Mar 11 01:18:30 crc kubenswrapper[4744]: I0311 01:18:30.956588 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/86af3abb-ed19-4d12-9eb1-da0f54a41fcc-scripts\") pod \"nova-cell1-cell-mapping-pw8fv\" (UID: \"86af3abb-ed19-4d12-9eb1-da0f54a41fcc\") " pod="openstack/nova-cell1-cell-mapping-pw8fv" Mar 11 01:18:30 crc kubenswrapper[4744]: I0311 01:18:30.957207 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/86af3abb-ed19-4d12-9eb1-da0f54a41fcc-config-data\") pod \"nova-cell1-cell-mapping-pw8fv\" (UID: \"86af3abb-ed19-4d12-9eb1-da0f54a41fcc\") " pod="openstack/nova-cell1-cell-mapping-pw8fv" Mar 11 01:18:30 crc kubenswrapper[4744]: I0311 01:18:30.965242 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/86af3abb-ed19-4d12-9eb1-da0f54a41fcc-scripts\") pod \"nova-cell1-cell-mapping-pw8fv\" (UID: \"86af3abb-ed19-4d12-9eb1-da0f54a41fcc\") " pod="openstack/nova-cell1-cell-mapping-pw8fv" Mar 11 01:18:30 crc kubenswrapper[4744]: I0311 01:18:30.967447 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/86af3abb-ed19-4d12-9eb1-da0f54a41fcc-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-pw8fv\" (UID: \"86af3abb-ed19-4d12-9eb1-da0f54a41fcc\") " pod="openstack/nova-cell1-cell-mapping-pw8fv" Mar 11 01:18:30 crc kubenswrapper[4744]: I0311 01:18:30.971606 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/86af3abb-ed19-4d12-9eb1-da0f54a41fcc-config-data\") pod \"nova-cell1-cell-mapping-pw8fv\" (UID: \"86af3abb-ed19-4d12-9eb1-da0f54a41fcc\") " pod="openstack/nova-cell1-cell-mapping-pw8fv" Mar 11 01:18:30 crc kubenswrapper[4744]: I0311 01:18:30.972180 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cgvlj\" (UniqueName: \"kubernetes.io/projected/86af3abb-ed19-4d12-9eb1-da0f54a41fcc-kube-api-access-cgvlj\") pod \"nova-cell1-cell-mapping-pw8fv\" (UID: \"86af3abb-ed19-4d12-9eb1-da0f54a41fcc\") " pod="openstack/nova-cell1-cell-mapping-pw8fv" Mar 11 01:18:31 crc kubenswrapper[4744]: I0311 01:18:31.047570 4744 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-cell-mapping-pw8fv" Mar 11 01:18:31 crc kubenswrapper[4744]: I0311 01:18:31.156955 4744 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="c5eee60b-3a4a-45b9-b25b-d3073e8e64f6" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.0.206:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 11 01:18:31 crc kubenswrapper[4744]: I0311 01:18:31.164731 4744 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="c5eee60b-3a4a-45b9-b25b-d3073e8e64f6" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.0.206:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 11 01:18:31 crc kubenswrapper[4744]: I0311 01:18:31.519069 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-cell-mapping-pw8fv"] Mar 11 01:18:31 crc kubenswrapper[4744]: I0311 01:18:31.545749 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-pw8fv" event={"ID":"86af3abb-ed19-4d12-9eb1-da0f54a41fcc","Type":"ContainerStarted","Data":"4d085f2f3bcb5333c071fbda7ea02cf2e40aa262b97f29340fcd503ee9bb5021"} Mar 11 01:18:32 crc kubenswrapper[4744]: I0311 01:18:32.558387 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-pw8fv" event={"ID":"86af3abb-ed19-4d12-9eb1-da0f54a41fcc","Type":"ContainerStarted","Data":"39f29b5976ac2aa60fcfaaddf31122cfd7fb5d5f0ee9a848396ad10963415872"} Mar 11 01:18:33 crc kubenswrapper[4744]: I0311 01:18:33.788841 4744 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Mar 11 01:18:33 crc kubenswrapper[4744]: I0311 01:18:33.790850 4744 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Mar 11 01:18:34 crc kubenswrapper[4744]: I0311 
01:18:34.805708 4744 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="a8e3c2f5-67c1-4e20-ab17-99638fa00963" containerName="nova-api-log" probeResult="failure" output="Get \"https://10.217.0.207:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 11 01:18:34 crc kubenswrapper[4744]: I0311 01:18:34.805746 4744 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="a8e3c2f5-67c1-4e20-ab17-99638fa00963" containerName="nova-api-api" probeResult="failure" output="Get \"https://10.217.0.207:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 11 01:18:36 crc kubenswrapper[4744]: I0311 01:18:36.620230 4744 generic.go:334] "Generic (PLEG): container finished" podID="86af3abb-ed19-4d12-9eb1-da0f54a41fcc" containerID="39f29b5976ac2aa60fcfaaddf31122cfd7fb5d5f0ee9a848396ad10963415872" exitCode=0 Mar 11 01:18:36 crc kubenswrapper[4744]: I0311 01:18:36.620353 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-pw8fv" event={"ID":"86af3abb-ed19-4d12-9eb1-da0f54a41fcc","Type":"ContainerDied","Data":"39f29b5976ac2aa60fcfaaddf31122cfd7fb5d5f0ee9a848396ad10963415872"} Mar 11 01:18:38 crc kubenswrapper[4744]: I0311 01:18:38.077594 4744 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-cell-mapping-pw8fv" Mar 11 01:18:38 crc kubenswrapper[4744]: I0311 01:18:38.219778 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cgvlj\" (UniqueName: \"kubernetes.io/projected/86af3abb-ed19-4d12-9eb1-da0f54a41fcc-kube-api-access-cgvlj\") pod \"86af3abb-ed19-4d12-9eb1-da0f54a41fcc\" (UID: \"86af3abb-ed19-4d12-9eb1-da0f54a41fcc\") " Mar 11 01:18:38 crc kubenswrapper[4744]: I0311 01:18:38.219914 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/86af3abb-ed19-4d12-9eb1-da0f54a41fcc-config-data\") pod \"86af3abb-ed19-4d12-9eb1-da0f54a41fcc\" (UID: \"86af3abb-ed19-4d12-9eb1-da0f54a41fcc\") " Mar 11 01:18:38 crc kubenswrapper[4744]: I0311 01:18:38.220036 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/86af3abb-ed19-4d12-9eb1-da0f54a41fcc-combined-ca-bundle\") pod \"86af3abb-ed19-4d12-9eb1-da0f54a41fcc\" (UID: \"86af3abb-ed19-4d12-9eb1-da0f54a41fcc\") " Mar 11 01:18:38 crc kubenswrapper[4744]: I0311 01:18:38.220101 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/86af3abb-ed19-4d12-9eb1-da0f54a41fcc-scripts\") pod \"86af3abb-ed19-4d12-9eb1-da0f54a41fcc\" (UID: \"86af3abb-ed19-4d12-9eb1-da0f54a41fcc\") " Mar 11 01:18:38 crc kubenswrapper[4744]: I0311 01:18:38.225649 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/86af3abb-ed19-4d12-9eb1-da0f54a41fcc-scripts" (OuterVolumeSpecName: "scripts") pod "86af3abb-ed19-4d12-9eb1-da0f54a41fcc" (UID: "86af3abb-ed19-4d12-9eb1-da0f54a41fcc"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 01:18:38 crc kubenswrapper[4744]: I0311 01:18:38.225817 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/86af3abb-ed19-4d12-9eb1-da0f54a41fcc-kube-api-access-cgvlj" (OuterVolumeSpecName: "kube-api-access-cgvlj") pod "86af3abb-ed19-4d12-9eb1-da0f54a41fcc" (UID: "86af3abb-ed19-4d12-9eb1-da0f54a41fcc"). InnerVolumeSpecName "kube-api-access-cgvlj". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 01:18:38 crc kubenswrapper[4744]: I0311 01:18:38.247675 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/86af3abb-ed19-4d12-9eb1-da0f54a41fcc-config-data" (OuterVolumeSpecName: "config-data") pod "86af3abb-ed19-4d12-9eb1-da0f54a41fcc" (UID: "86af3abb-ed19-4d12-9eb1-da0f54a41fcc"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 01:18:38 crc kubenswrapper[4744]: I0311 01:18:38.248112 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/86af3abb-ed19-4d12-9eb1-da0f54a41fcc-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "86af3abb-ed19-4d12-9eb1-da0f54a41fcc" (UID: "86af3abb-ed19-4d12-9eb1-da0f54a41fcc"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 01:18:38 crc kubenswrapper[4744]: I0311 01:18:38.323248 4744 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/86af3abb-ed19-4d12-9eb1-da0f54a41fcc-config-data\") on node \"crc\" DevicePath \"\"" Mar 11 01:18:38 crc kubenswrapper[4744]: I0311 01:18:38.323298 4744 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/86af3abb-ed19-4d12-9eb1-da0f54a41fcc-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 11 01:18:38 crc kubenswrapper[4744]: I0311 01:18:38.323318 4744 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/86af3abb-ed19-4d12-9eb1-da0f54a41fcc-scripts\") on node \"crc\" DevicePath \"\"" Mar 11 01:18:38 crc kubenswrapper[4744]: I0311 01:18:38.323336 4744 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cgvlj\" (UniqueName: \"kubernetes.io/projected/86af3abb-ed19-4d12-9eb1-da0f54a41fcc-kube-api-access-cgvlj\") on node \"crc\" DevicePath \"\"" Mar 11 01:18:38 crc kubenswrapper[4744]: I0311 01:18:38.648778 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-pw8fv" event={"ID":"86af3abb-ed19-4d12-9eb1-da0f54a41fcc","Type":"ContainerDied","Data":"4d085f2f3bcb5333c071fbda7ea02cf2e40aa262b97f29340fcd503ee9bb5021"} Mar 11 01:18:38 crc kubenswrapper[4744]: I0311 01:18:38.648838 4744 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4d085f2f3bcb5333c071fbda7ea02cf2e40aa262b97f29340fcd503ee9bb5021" Mar 11 01:18:38 crc kubenswrapper[4744]: I0311 01:18:38.648863 4744 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-cell-mapping-pw8fv" Mar 11 01:18:38 crc kubenswrapper[4744]: I0311 01:18:38.923115 4744 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Mar 11 01:18:38 crc kubenswrapper[4744]: I0311 01:18:38.923400 4744 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="a8e3c2f5-67c1-4e20-ab17-99638fa00963" containerName="nova-api-log" containerID="cri-o://85d053e33e94f64e212b16bf47d122be6ce781e756c1b75d39305bd8e7791276" gracePeriod=30 Mar 11 01:18:38 crc kubenswrapper[4744]: I0311 01:18:38.923547 4744 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="a8e3c2f5-67c1-4e20-ab17-99638fa00963" containerName="nova-api-api" containerID="cri-o://eabf8cb5892c59b5607319c157da84e1c564f5113a8e0deb5471cf04d5265f04" gracePeriod=30 Mar 11 01:18:38 crc kubenswrapper[4744]: I0311 01:18:38.937735 4744 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Mar 11 01:18:38 crc kubenswrapper[4744]: I0311 01:18:38.937928 4744 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-scheduler-0" podUID="0c264104-8890-4de4-bb6e-451dcfeb5d4c" containerName="nova-scheduler-scheduler" containerID="cri-o://8051c02f9fe6225ef189ca6f441fcfb1b8e9a1ad9bf4f8c2ce0b2b41f29d27a3" gracePeriod=30 Mar 11 01:18:39 crc kubenswrapper[4744]: I0311 01:18:39.014665 4744 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Mar 11 01:18:39 crc kubenswrapper[4744]: I0311 01:18:39.015135 4744 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="c5eee60b-3a4a-45b9-b25b-d3073e8e64f6" containerName="nova-metadata-log" containerID="cri-o://8f7469ee4f9958606072cd8b2e275f93ead748d0d2bcd05491f6cddb8511e988" gracePeriod=30 Mar 11 01:18:39 crc kubenswrapper[4744]: I0311 01:18:39.015197 4744 
kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="c5eee60b-3a4a-45b9-b25b-d3073e8e64f6" containerName="nova-metadata-metadata" containerID="cri-o://c51e044d8f85dd7af471f5b5d2f6420d2fa0ffb598a7139c948f2fdc1c97c354" gracePeriod=30 Mar 11 01:18:39 crc kubenswrapper[4744]: I0311 01:18:39.669171 4744 generic.go:334] "Generic (PLEG): container finished" podID="a8e3c2f5-67c1-4e20-ab17-99638fa00963" containerID="85d053e33e94f64e212b16bf47d122be6ce781e756c1b75d39305bd8e7791276" exitCode=143 Mar 11 01:18:39 crc kubenswrapper[4744]: I0311 01:18:39.669278 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"a8e3c2f5-67c1-4e20-ab17-99638fa00963","Type":"ContainerDied","Data":"85d053e33e94f64e212b16bf47d122be6ce781e756c1b75d39305bd8e7791276"} Mar 11 01:18:39 crc kubenswrapper[4744]: I0311 01:18:39.671113 4744 generic.go:334] "Generic (PLEG): container finished" podID="c5eee60b-3a4a-45b9-b25b-d3073e8e64f6" containerID="8f7469ee4f9958606072cd8b2e275f93ead748d0d2bcd05491f6cddb8511e988" exitCode=143 Mar 11 01:18:39 crc kubenswrapper[4744]: I0311 01:18:39.671162 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"c5eee60b-3a4a-45b9-b25b-d3073e8e64f6","Type":"ContainerDied","Data":"8f7469ee4f9958606072cd8b2e275f93ead748d0d2bcd05491f6cddb8511e988"} Mar 11 01:18:40 crc kubenswrapper[4744]: I0311 01:18:40.685188 4744 generic.go:334] "Generic (PLEG): container finished" podID="0c264104-8890-4de4-bb6e-451dcfeb5d4c" containerID="8051c02f9fe6225ef189ca6f441fcfb1b8e9a1ad9bf4f8c2ce0b2b41f29d27a3" exitCode=0 Mar 11 01:18:40 crc kubenswrapper[4744]: I0311 01:18:40.685422 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"0c264104-8890-4de4-bb6e-451dcfeb5d4c","Type":"ContainerDied","Data":"8051c02f9fe6225ef189ca6f441fcfb1b8e9a1ad9bf4f8c2ce0b2b41f29d27a3"} Mar 11 01:18:40 crc kubenswrapper[4744]: 
I0311 01:18:40.967906 4744 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Mar 11 01:18:41 crc kubenswrapper[4744]: I0311 01:18:41.121766 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0c264104-8890-4de4-bb6e-451dcfeb5d4c-config-data\") pod \"0c264104-8890-4de4-bb6e-451dcfeb5d4c\" (UID: \"0c264104-8890-4de4-bb6e-451dcfeb5d4c\") " Mar 11 01:18:41 crc kubenswrapper[4744]: I0311 01:18:41.121814 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sw6rs\" (UniqueName: \"kubernetes.io/projected/0c264104-8890-4de4-bb6e-451dcfeb5d4c-kube-api-access-sw6rs\") pod \"0c264104-8890-4de4-bb6e-451dcfeb5d4c\" (UID: \"0c264104-8890-4de4-bb6e-451dcfeb5d4c\") " Mar 11 01:18:41 crc kubenswrapper[4744]: I0311 01:18:41.121906 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0c264104-8890-4de4-bb6e-451dcfeb5d4c-combined-ca-bundle\") pod \"0c264104-8890-4de4-bb6e-451dcfeb5d4c\" (UID: \"0c264104-8890-4de4-bb6e-451dcfeb5d4c\") " Mar 11 01:18:41 crc kubenswrapper[4744]: I0311 01:18:41.137846 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0c264104-8890-4de4-bb6e-451dcfeb5d4c-kube-api-access-sw6rs" (OuterVolumeSpecName: "kube-api-access-sw6rs") pod "0c264104-8890-4de4-bb6e-451dcfeb5d4c" (UID: "0c264104-8890-4de4-bb6e-451dcfeb5d4c"). InnerVolumeSpecName "kube-api-access-sw6rs". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 01:18:41 crc kubenswrapper[4744]: I0311 01:18:41.162558 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0c264104-8890-4de4-bb6e-451dcfeb5d4c-config-data" (OuterVolumeSpecName: "config-data") pod "0c264104-8890-4de4-bb6e-451dcfeb5d4c" (UID: "0c264104-8890-4de4-bb6e-451dcfeb5d4c"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 01:18:41 crc kubenswrapper[4744]: I0311 01:18:41.172715 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0c264104-8890-4de4-bb6e-451dcfeb5d4c-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "0c264104-8890-4de4-bb6e-451dcfeb5d4c" (UID: "0c264104-8890-4de4-bb6e-451dcfeb5d4c"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 01:18:41 crc kubenswrapper[4744]: I0311 01:18:41.223962 4744 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0c264104-8890-4de4-bb6e-451dcfeb5d4c-config-data\") on node \"crc\" DevicePath \"\"" Mar 11 01:18:41 crc kubenswrapper[4744]: I0311 01:18:41.224000 4744 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sw6rs\" (UniqueName: \"kubernetes.io/projected/0c264104-8890-4de4-bb6e-451dcfeb5d4c-kube-api-access-sw6rs\") on node \"crc\" DevicePath \"\"" Mar 11 01:18:41 crc kubenswrapper[4744]: I0311 01:18:41.224015 4744 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0c264104-8890-4de4-bb6e-451dcfeb5d4c-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 11 01:18:41 crc kubenswrapper[4744]: I0311 01:18:41.697571 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" 
event={"ID":"0c264104-8890-4de4-bb6e-451dcfeb5d4c","Type":"ContainerDied","Data":"e5c4074cdccb6f7b737e8968321fbed22b5fca5a70511c8b5fcc6a60effdeb74"} Mar 11 01:18:41 crc kubenswrapper[4744]: I0311 01:18:41.697637 4744 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Mar 11 01:18:41 crc kubenswrapper[4744]: I0311 01:18:41.697651 4744 scope.go:117] "RemoveContainer" containerID="8051c02f9fe6225ef189ca6f441fcfb1b8e9a1ad9bf4f8c2ce0b2b41f29d27a3" Mar 11 01:18:41 crc kubenswrapper[4744]: I0311 01:18:41.787559 4744 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Mar 11 01:18:41 crc kubenswrapper[4744]: I0311 01:18:41.800606 4744 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-scheduler-0"] Mar 11 01:18:41 crc kubenswrapper[4744]: I0311 01:18:41.809803 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Mar 11 01:18:41 crc kubenswrapper[4744]: E0311 01:18:41.810174 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0c264104-8890-4de4-bb6e-451dcfeb5d4c" containerName="nova-scheduler-scheduler" Mar 11 01:18:41 crc kubenswrapper[4744]: I0311 01:18:41.810191 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="0c264104-8890-4de4-bb6e-451dcfeb5d4c" containerName="nova-scheduler-scheduler" Mar 11 01:18:41 crc kubenswrapper[4744]: E0311 01:18:41.810232 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="86af3abb-ed19-4d12-9eb1-da0f54a41fcc" containerName="nova-manage" Mar 11 01:18:41 crc kubenswrapper[4744]: I0311 01:18:41.810241 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="86af3abb-ed19-4d12-9eb1-da0f54a41fcc" containerName="nova-manage" Mar 11 01:18:41 crc kubenswrapper[4744]: I0311 01:18:41.810454 4744 memory_manager.go:354] "RemoveStaleState removing state" podUID="0c264104-8890-4de4-bb6e-451dcfeb5d4c" containerName="nova-scheduler-scheduler" Mar 11 01:18:41 crc 
kubenswrapper[4744]: I0311 01:18:41.810482 4744 memory_manager.go:354] "RemoveStaleState removing state" podUID="86af3abb-ed19-4d12-9eb1-da0f54a41fcc" containerName="nova-manage" Mar 11 01:18:41 crc kubenswrapper[4744]: I0311 01:18:41.811141 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Mar 11 01:18:41 crc kubenswrapper[4744]: I0311 01:18:41.813981 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Mar 11 01:18:41 crc kubenswrapper[4744]: I0311 01:18:41.823215 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Mar 11 01:18:41 crc kubenswrapper[4744]: I0311 01:18:41.936172 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zfkrs\" (UniqueName: \"kubernetes.io/projected/b62ac51a-a222-4e7b-b465-9e71c3d34b1f-kube-api-access-zfkrs\") pod \"nova-scheduler-0\" (UID: \"b62ac51a-a222-4e7b-b465-9e71c3d34b1f\") " pod="openstack/nova-scheduler-0" Mar 11 01:18:41 crc kubenswrapper[4744]: I0311 01:18:41.936556 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b62ac51a-a222-4e7b-b465-9e71c3d34b1f-config-data\") pod \"nova-scheduler-0\" (UID: \"b62ac51a-a222-4e7b-b465-9e71c3d34b1f\") " pod="openstack/nova-scheduler-0" Mar 11 01:18:41 crc kubenswrapper[4744]: I0311 01:18:41.936610 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b62ac51a-a222-4e7b-b465-9e71c3d34b1f-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"b62ac51a-a222-4e7b-b465-9e71c3d34b1f\") " pod="openstack/nova-scheduler-0" Mar 11 01:18:41 crc kubenswrapper[4744]: I0311 01:18:41.994736 4744 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="0c264104-8890-4de4-bb6e-451dcfeb5d4c" path="/var/lib/kubelet/pods/0c264104-8890-4de4-bb6e-451dcfeb5d4c/volumes" Mar 11 01:18:42 crc kubenswrapper[4744]: I0311 01:18:42.038536 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b62ac51a-a222-4e7b-b465-9e71c3d34b1f-config-data\") pod \"nova-scheduler-0\" (UID: \"b62ac51a-a222-4e7b-b465-9e71c3d34b1f\") " pod="openstack/nova-scheduler-0" Mar 11 01:18:42 crc kubenswrapper[4744]: I0311 01:18:42.038584 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b62ac51a-a222-4e7b-b465-9e71c3d34b1f-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"b62ac51a-a222-4e7b-b465-9e71c3d34b1f\") " pod="openstack/nova-scheduler-0" Mar 11 01:18:42 crc kubenswrapper[4744]: I0311 01:18:42.038670 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zfkrs\" (UniqueName: \"kubernetes.io/projected/b62ac51a-a222-4e7b-b465-9e71c3d34b1f-kube-api-access-zfkrs\") pod \"nova-scheduler-0\" (UID: \"b62ac51a-a222-4e7b-b465-9e71c3d34b1f\") " pod="openstack/nova-scheduler-0" Mar 11 01:18:42 crc kubenswrapper[4744]: I0311 01:18:42.050247 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b62ac51a-a222-4e7b-b465-9e71c3d34b1f-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"b62ac51a-a222-4e7b-b465-9e71c3d34b1f\") " pod="openstack/nova-scheduler-0" Mar 11 01:18:42 crc kubenswrapper[4744]: I0311 01:18:42.050664 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b62ac51a-a222-4e7b-b465-9e71c3d34b1f-config-data\") pod \"nova-scheduler-0\" (UID: \"b62ac51a-a222-4e7b-b465-9e71c3d34b1f\") " pod="openstack/nova-scheduler-0" Mar 11 01:18:42 crc kubenswrapper[4744]: I0311 01:18:42.056347 
4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zfkrs\" (UniqueName: \"kubernetes.io/projected/b62ac51a-a222-4e7b-b465-9e71c3d34b1f-kube-api-access-zfkrs\") pod \"nova-scheduler-0\" (UID: \"b62ac51a-a222-4e7b-b465-9e71c3d34b1f\") " pod="openstack/nova-scheduler-0" Mar 11 01:18:42 crc kubenswrapper[4744]: I0311 01:18:42.133669 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Mar 11 01:18:42 crc kubenswrapper[4744]: I0311 01:18:42.556091 4744 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Mar 11 01:18:42 crc kubenswrapper[4744]: I0311 01:18:42.603608 4744 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Mar 11 01:18:42 crc kubenswrapper[4744]: I0311 01:18:42.652892 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/a8e3c2f5-67c1-4e20-ab17-99638fa00963-internal-tls-certs\") pod \"a8e3c2f5-67c1-4e20-ab17-99638fa00963\" (UID: \"a8e3c2f5-67c1-4e20-ab17-99638fa00963\") " Mar 11 01:18:42 crc kubenswrapper[4744]: I0311 01:18:42.653129 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a8e3c2f5-67c1-4e20-ab17-99638fa00963-config-data\") pod \"a8e3c2f5-67c1-4e20-ab17-99638fa00963\" (UID: \"a8e3c2f5-67c1-4e20-ab17-99638fa00963\") " Mar 11 01:18:42 crc kubenswrapper[4744]: I0311 01:18:42.653233 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-f5pc4\" (UniqueName: \"kubernetes.io/projected/a8e3c2f5-67c1-4e20-ab17-99638fa00963-kube-api-access-f5pc4\") pod \"a8e3c2f5-67c1-4e20-ab17-99638fa00963\" (UID: \"a8e3c2f5-67c1-4e20-ab17-99638fa00963\") " Mar 11 01:18:42 crc kubenswrapper[4744]: I0311 01:18:42.653269 4744 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a8e3c2f5-67c1-4e20-ab17-99638fa00963-logs\") pod \"a8e3c2f5-67c1-4e20-ab17-99638fa00963\" (UID: \"a8e3c2f5-67c1-4e20-ab17-99638fa00963\") " Mar 11 01:18:42 crc kubenswrapper[4744]: I0311 01:18:42.653294 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a8e3c2f5-67c1-4e20-ab17-99638fa00963-combined-ca-bundle\") pod \"a8e3c2f5-67c1-4e20-ab17-99638fa00963\" (UID: \"a8e3c2f5-67c1-4e20-ab17-99638fa00963\") " Mar 11 01:18:42 crc kubenswrapper[4744]: I0311 01:18:42.653321 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/a8e3c2f5-67c1-4e20-ab17-99638fa00963-public-tls-certs\") pod \"a8e3c2f5-67c1-4e20-ab17-99638fa00963\" (UID: \"a8e3c2f5-67c1-4e20-ab17-99638fa00963\") " Mar 11 01:18:42 crc kubenswrapper[4744]: I0311 01:18:42.653870 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a8e3c2f5-67c1-4e20-ab17-99638fa00963-logs" (OuterVolumeSpecName: "logs") pod "a8e3c2f5-67c1-4e20-ab17-99638fa00963" (UID: "a8e3c2f5-67c1-4e20-ab17-99638fa00963"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 11 01:18:42 crc kubenswrapper[4744]: I0311 01:18:42.659133 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a8e3c2f5-67c1-4e20-ab17-99638fa00963-kube-api-access-f5pc4" (OuterVolumeSpecName: "kube-api-access-f5pc4") pod "a8e3c2f5-67c1-4e20-ab17-99638fa00963" (UID: "a8e3c2f5-67c1-4e20-ab17-99638fa00963"). InnerVolumeSpecName "kube-api-access-f5pc4". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 01:18:42 crc kubenswrapper[4744]: I0311 01:18:42.693227 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a8e3c2f5-67c1-4e20-ab17-99638fa00963-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "a8e3c2f5-67c1-4e20-ab17-99638fa00963" (UID: "a8e3c2f5-67c1-4e20-ab17-99638fa00963"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 01:18:42 crc kubenswrapper[4744]: I0311 01:18:42.710945 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a8e3c2f5-67c1-4e20-ab17-99638fa00963-config-data" (OuterVolumeSpecName: "config-data") pod "a8e3c2f5-67c1-4e20-ab17-99638fa00963" (UID: "a8e3c2f5-67c1-4e20-ab17-99638fa00963"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 01:18:42 crc kubenswrapper[4744]: I0311 01:18:42.711026 4744 generic.go:334] "Generic (PLEG): container finished" podID="c5eee60b-3a4a-45b9-b25b-d3073e8e64f6" containerID="c51e044d8f85dd7af471f5b5d2f6420d2fa0ffb598a7139c948f2fdc1c97c354" exitCode=0 Mar 11 01:18:42 crc kubenswrapper[4744]: I0311 01:18:42.711089 4744 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Mar 11 01:18:42 crc kubenswrapper[4744]: I0311 01:18:42.711110 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"c5eee60b-3a4a-45b9-b25b-d3073e8e64f6","Type":"ContainerDied","Data":"c51e044d8f85dd7af471f5b5d2f6420d2fa0ffb598a7139c948f2fdc1c97c354"} Mar 11 01:18:42 crc kubenswrapper[4744]: I0311 01:18:42.711141 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"c5eee60b-3a4a-45b9-b25b-d3073e8e64f6","Type":"ContainerDied","Data":"807a9412bd19874bec1274e32a33626c4ff50e2eac32261c4c24c34125cdedbf"} Mar 11 01:18:42 crc kubenswrapper[4744]: I0311 01:18:42.711164 4744 scope.go:117] "RemoveContainer" containerID="c51e044d8f85dd7af471f5b5d2f6420d2fa0ffb598a7139c948f2fdc1c97c354" Mar 11 01:18:42 crc kubenswrapper[4744]: I0311 01:18:42.716110 4744 generic.go:334] "Generic (PLEG): container finished" podID="a8e3c2f5-67c1-4e20-ab17-99638fa00963" containerID="eabf8cb5892c59b5607319c157da84e1c564f5113a8e0deb5471cf04d5265f04" exitCode=0 Mar 11 01:18:42 crc kubenswrapper[4744]: I0311 01:18:42.716164 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"a8e3c2f5-67c1-4e20-ab17-99638fa00963","Type":"ContainerDied","Data":"eabf8cb5892c59b5607319c157da84e1c564f5113a8e0deb5471cf04d5265f04"} Mar 11 01:18:42 crc kubenswrapper[4744]: I0311 01:18:42.716187 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"a8e3c2f5-67c1-4e20-ab17-99638fa00963","Type":"ContainerDied","Data":"15265107423e927b6bdc79df3a3ef75b1f099234a494eec2c86eeba9ff5e682e"} Mar 11 01:18:42 crc kubenswrapper[4744]: I0311 01:18:42.716305 4744 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Mar 11 01:18:42 crc kubenswrapper[4744]: I0311 01:18:42.718652 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a8e3c2f5-67c1-4e20-ab17-99638fa00963-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "a8e3c2f5-67c1-4e20-ab17-99638fa00963" (UID: "a8e3c2f5-67c1-4e20-ab17-99638fa00963"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 01:18:42 crc kubenswrapper[4744]: I0311 01:18:42.727641 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a8e3c2f5-67c1-4e20-ab17-99638fa00963-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "a8e3c2f5-67c1-4e20-ab17-99638fa00963" (UID: "a8e3c2f5-67c1-4e20-ab17-99638fa00963"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 01:18:42 crc kubenswrapper[4744]: I0311 01:18:42.735766 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Mar 11 01:18:42 crc kubenswrapper[4744]: I0311 01:18:42.744638 4744 scope.go:117] "RemoveContainer" containerID="8f7469ee4f9958606072cd8b2e275f93ead748d0d2bcd05491f6cddb8511e988" Mar 11 01:18:42 crc kubenswrapper[4744]: W0311 01:18:42.748931 4744 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb62ac51a_a222_4e7b_b465_9e71c3d34b1f.slice/crio-6344ff515d5d03e1c58a3b38592f19215b4ffbf6761bba85ef02c15059c6f620 WatchSource:0}: Error finding container 6344ff515d5d03e1c58a3b38592f19215b4ffbf6761bba85ef02c15059c6f620: Status 404 returned error can't find the container with id 6344ff515d5d03e1c58a3b38592f19215b4ffbf6761bba85ef02c15059c6f620 Mar 11 01:18:42 crc kubenswrapper[4744]: I0311 01:18:42.754554 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/c5eee60b-3a4a-45b9-b25b-d3073e8e64f6-config-data\") pod \"c5eee60b-3a4a-45b9-b25b-d3073e8e64f6\" (UID: \"c5eee60b-3a4a-45b9-b25b-d3073e8e64f6\") " Mar 11 01:18:42 crc kubenswrapper[4744]: I0311 01:18:42.754703 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c5eee60b-3a4a-45b9-b25b-d3073e8e64f6-logs\") pod \"c5eee60b-3a4a-45b9-b25b-d3073e8e64f6\" (UID: \"c5eee60b-3a4a-45b9-b25b-d3073e8e64f6\") " Mar 11 01:18:42 crc kubenswrapper[4744]: I0311 01:18:42.754766 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c5eee60b-3a4a-45b9-b25b-d3073e8e64f6-combined-ca-bundle\") pod \"c5eee60b-3a4a-45b9-b25b-d3073e8e64f6\" (UID: \"c5eee60b-3a4a-45b9-b25b-d3073e8e64f6\") " Mar 11 01:18:42 crc kubenswrapper[4744]: I0311 01:18:42.754878 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/c5eee60b-3a4a-45b9-b25b-d3073e8e64f6-nova-metadata-tls-certs\") pod \"c5eee60b-3a4a-45b9-b25b-d3073e8e64f6\" (UID: \"c5eee60b-3a4a-45b9-b25b-d3073e8e64f6\") " Mar 11 01:18:42 crc kubenswrapper[4744]: I0311 01:18:42.754914 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dbv6n\" (UniqueName: \"kubernetes.io/projected/c5eee60b-3a4a-45b9-b25b-d3073e8e64f6-kube-api-access-dbv6n\") pod \"c5eee60b-3a4a-45b9-b25b-d3073e8e64f6\" (UID: \"c5eee60b-3a4a-45b9-b25b-d3073e8e64f6\") " Mar 11 01:18:42 crc kubenswrapper[4744]: I0311 01:18:42.755205 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c5eee60b-3a4a-45b9-b25b-d3073e8e64f6-logs" (OuterVolumeSpecName: "logs") pod "c5eee60b-3a4a-45b9-b25b-d3073e8e64f6" (UID: "c5eee60b-3a4a-45b9-b25b-d3073e8e64f6"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 11 01:18:42 crc kubenswrapper[4744]: I0311 01:18:42.755503 4744 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c5eee60b-3a4a-45b9-b25b-d3073e8e64f6-logs\") on node \"crc\" DevicePath \"\"" Mar 11 01:18:42 crc kubenswrapper[4744]: I0311 01:18:42.755540 4744 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a8e3c2f5-67c1-4e20-ab17-99638fa00963-config-data\") on node \"crc\" DevicePath \"\"" Mar 11 01:18:42 crc kubenswrapper[4744]: I0311 01:18:42.755553 4744 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-f5pc4\" (UniqueName: \"kubernetes.io/projected/a8e3c2f5-67c1-4e20-ab17-99638fa00963-kube-api-access-f5pc4\") on node \"crc\" DevicePath \"\"" Mar 11 01:18:42 crc kubenswrapper[4744]: I0311 01:18:42.755562 4744 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a8e3c2f5-67c1-4e20-ab17-99638fa00963-logs\") on node \"crc\" DevicePath \"\"" Mar 11 01:18:42 crc kubenswrapper[4744]: I0311 01:18:42.755571 4744 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a8e3c2f5-67c1-4e20-ab17-99638fa00963-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 11 01:18:42 crc kubenswrapper[4744]: I0311 01:18:42.755581 4744 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/a8e3c2f5-67c1-4e20-ab17-99638fa00963-public-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 11 01:18:42 crc kubenswrapper[4744]: I0311 01:18:42.755589 4744 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/a8e3c2f5-67c1-4e20-ab17-99638fa00963-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 11 01:18:42 crc kubenswrapper[4744]: I0311 01:18:42.758044 4744 
operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c5eee60b-3a4a-45b9-b25b-d3073e8e64f6-kube-api-access-dbv6n" (OuterVolumeSpecName: "kube-api-access-dbv6n") pod "c5eee60b-3a4a-45b9-b25b-d3073e8e64f6" (UID: "c5eee60b-3a4a-45b9-b25b-d3073e8e64f6"). InnerVolumeSpecName "kube-api-access-dbv6n". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 01:18:42 crc kubenswrapper[4744]: I0311 01:18:42.764395 4744 scope.go:117] "RemoveContainer" containerID="c51e044d8f85dd7af471f5b5d2f6420d2fa0ffb598a7139c948f2fdc1c97c354" Mar 11 01:18:42 crc kubenswrapper[4744]: E0311 01:18:42.764885 4744 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c51e044d8f85dd7af471f5b5d2f6420d2fa0ffb598a7139c948f2fdc1c97c354\": container with ID starting with c51e044d8f85dd7af471f5b5d2f6420d2fa0ffb598a7139c948f2fdc1c97c354 not found: ID does not exist" containerID="c51e044d8f85dd7af471f5b5d2f6420d2fa0ffb598a7139c948f2fdc1c97c354" Mar 11 01:18:42 crc kubenswrapper[4744]: I0311 01:18:42.764925 4744 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c51e044d8f85dd7af471f5b5d2f6420d2fa0ffb598a7139c948f2fdc1c97c354"} err="failed to get container status \"c51e044d8f85dd7af471f5b5d2f6420d2fa0ffb598a7139c948f2fdc1c97c354\": rpc error: code = NotFound desc = could not find container \"c51e044d8f85dd7af471f5b5d2f6420d2fa0ffb598a7139c948f2fdc1c97c354\": container with ID starting with c51e044d8f85dd7af471f5b5d2f6420d2fa0ffb598a7139c948f2fdc1c97c354 not found: ID does not exist" Mar 11 01:18:42 crc kubenswrapper[4744]: I0311 01:18:42.764951 4744 scope.go:117] "RemoveContainer" containerID="8f7469ee4f9958606072cd8b2e275f93ead748d0d2bcd05491f6cddb8511e988" Mar 11 01:18:42 crc kubenswrapper[4744]: E0311 01:18:42.765261 4744 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"8f7469ee4f9958606072cd8b2e275f93ead748d0d2bcd05491f6cddb8511e988\": container with ID starting with 8f7469ee4f9958606072cd8b2e275f93ead748d0d2bcd05491f6cddb8511e988 not found: ID does not exist" containerID="8f7469ee4f9958606072cd8b2e275f93ead748d0d2bcd05491f6cddb8511e988" Mar 11 01:18:42 crc kubenswrapper[4744]: I0311 01:18:42.765301 4744 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8f7469ee4f9958606072cd8b2e275f93ead748d0d2bcd05491f6cddb8511e988"} err="failed to get container status \"8f7469ee4f9958606072cd8b2e275f93ead748d0d2bcd05491f6cddb8511e988\": rpc error: code = NotFound desc = could not find container \"8f7469ee4f9958606072cd8b2e275f93ead748d0d2bcd05491f6cddb8511e988\": container with ID starting with 8f7469ee4f9958606072cd8b2e275f93ead748d0d2bcd05491f6cddb8511e988 not found: ID does not exist" Mar 11 01:18:42 crc kubenswrapper[4744]: I0311 01:18:42.765327 4744 scope.go:117] "RemoveContainer" containerID="eabf8cb5892c59b5607319c157da84e1c564f5113a8e0deb5471cf04d5265f04" Mar 11 01:18:42 crc kubenswrapper[4744]: I0311 01:18:42.775876 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c5eee60b-3a4a-45b9-b25b-d3073e8e64f6-config-data" (OuterVolumeSpecName: "config-data") pod "c5eee60b-3a4a-45b9-b25b-d3073e8e64f6" (UID: "c5eee60b-3a4a-45b9-b25b-d3073e8e64f6"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 01:18:42 crc kubenswrapper[4744]: I0311 01:18:42.781642 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c5eee60b-3a4a-45b9-b25b-d3073e8e64f6-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "c5eee60b-3a4a-45b9-b25b-d3073e8e64f6" (UID: "c5eee60b-3a4a-45b9-b25b-d3073e8e64f6"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 01:18:42 crc kubenswrapper[4744]: I0311 01:18:42.787136 4744 scope.go:117] "RemoveContainer" containerID="85d053e33e94f64e212b16bf47d122be6ce781e756c1b75d39305bd8e7791276" Mar 11 01:18:42 crc kubenswrapper[4744]: I0311 01:18:42.827732 4744 scope.go:117] "RemoveContainer" containerID="eabf8cb5892c59b5607319c157da84e1c564f5113a8e0deb5471cf04d5265f04" Mar 11 01:18:42 crc kubenswrapper[4744]: E0311 01:18:42.830874 4744 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"eabf8cb5892c59b5607319c157da84e1c564f5113a8e0deb5471cf04d5265f04\": container with ID starting with eabf8cb5892c59b5607319c157da84e1c564f5113a8e0deb5471cf04d5265f04 not found: ID does not exist" containerID="eabf8cb5892c59b5607319c157da84e1c564f5113a8e0deb5471cf04d5265f04" Mar 11 01:18:42 crc kubenswrapper[4744]: I0311 01:18:42.830885 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c5eee60b-3a4a-45b9-b25b-d3073e8e64f6-nova-metadata-tls-certs" (OuterVolumeSpecName: "nova-metadata-tls-certs") pod "c5eee60b-3a4a-45b9-b25b-d3073e8e64f6" (UID: "c5eee60b-3a4a-45b9-b25b-d3073e8e64f6"). InnerVolumeSpecName "nova-metadata-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 01:18:42 crc kubenswrapper[4744]: I0311 01:18:42.830907 4744 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"eabf8cb5892c59b5607319c157da84e1c564f5113a8e0deb5471cf04d5265f04"} err="failed to get container status \"eabf8cb5892c59b5607319c157da84e1c564f5113a8e0deb5471cf04d5265f04\": rpc error: code = NotFound desc = could not find container \"eabf8cb5892c59b5607319c157da84e1c564f5113a8e0deb5471cf04d5265f04\": container with ID starting with eabf8cb5892c59b5607319c157da84e1c564f5113a8e0deb5471cf04d5265f04 not found: ID does not exist" Mar 11 01:18:42 crc kubenswrapper[4744]: I0311 01:18:42.830933 4744 scope.go:117] "RemoveContainer" containerID="85d053e33e94f64e212b16bf47d122be6ce781e756c1b75d39305bd8e7791276" Mar 11 01:18:42 crc kubenswrapper[4744]: E0311 01:18:42.831415 4744 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"85d053e33e94f64e212b16bf47d122be6ce781e756c1b75d39305bd8e7791276\": container with ID starting with 85d053e33e94f64e212b16bf47d122be6ce781e756c1b75d39305bd8e7791276 not found: ID does not exist" containerID="85d053e33e94f64e212b16bf47d122be6ce781e756c1b75d39305bd8e7791276" Mar 11 01:18:42 crc kubenswrapper[4744]: I0311 01:18:42.831462 4744 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"85d053e33e94f64e212b16bf47d122be6ce781e756c1b75d39305bd8e7791276"} err="failed to get container status \"85d053e33e94f64e212b16bf47d122be6ce781e756c1b75d39305bd8e7791276\": rpc error: code = NotFound desc = could not find container \"85d053e33e94f64e212b16bf47d122be6ce781e756c1b75d39305bd8e7791276\": container with ID starting with 85d053e33e94f64e212b16bf47d122be6ce781e756c1b75d39305bd8e7791276 not found: ID does not exist" Mar 11 01:18:42 crc kubenswrapper[4744]: I0311 01:18:42.856863 4744 reconciler_common.go:293] "Volume detached for 
volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c5eee60b-3a4a-45b9-b25b-d3073e8e64f6-config-data\") on node \"crc\" DevicePath \"\"" Mar 11 01:18:42 crc kubenswrapper[4744]: I0311 01:18:42.856885 4744 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c5eee60b-3a4a-45b9-b25b-d3073e8e64f6-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 11 01:18:42 crc kubenswrapper[4744]: I0311 01:18:42.856894 4744 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/c5eee60b-3a4a-45b9-b25b-d3073e8e64f6-nova-metadata-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 11 01:18:42 crc kubenswrapper[4744]: I0311 01:18:42.856903 4744 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dbv6n\" (UniqueName: \"kubernetes.io/projected/c5eee60b-3a4a-45b9-b25b-d3073e8e64f6-kube-api-access-dbv6n\") on node \"crc\" DevicePath \"\"" Mar 11 01:18:43 crc kubenswrapper[4744]: I0311 01:18:43.062156 4744 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Mar 11 01:18:43 crc kubenswrapper[4744]: I0311 01:18:43.080623 4744 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Mar 11 01:18:43 crc kubenswrapper[4744]: I0311 01:18:43.101068 4744 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Mar 11 01:18:43 crc kubenswrapper[4744]: I0311 01:18:43.125633 4744 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Mar 11 01:18:43 crc kubenswrapper[4744]: I0311 01:18:43.193322 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Mar 11 01:18:43 crc kubenswrapper[4744]: E0311 01:18:43.194065 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c5eee60b-3a4a-45b9-b25b-d3073e8e64f6" containerName="nova-metadata-log" Mar 11 01:18:43 crc kubenswrapper[4744]: I0311 01:18:43.194078 4744 
state_mem.go:107] "Deleted CPUSet assignment" podUID="c5eee60b-3a4a-45b9-b25b-d3073e8e64f6" containerName="nova-metadata-log" Mar 11 01:18:43 crc kubenswrapper[4744]: E0311 01:18:43.194091 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a8e3c2f5-67c1-4e20-ab17-99638fa00963" containerName="nova-api-api" Mar 11 01:18:43 crc kubenswrapper[4744]: I0311 01:18:43.194098 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="a8e3c2f5-67c1-4e20-ab17-99638fa00963" containerName="nova-api-api" Mar 11 01:18:43 crc kubenswrapper[4744]: E0311 01:18:43.194122 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a8e3c2f5-67c1-4e20-ab17-99638fa00963" containerName="nova-api-log" Mar 11 01:18:43 crc kubenswrapper[4744]: I0311 01:18:43.194128 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="a8e3c2f5-67c1-4e20-ab17-99638fa00963" containerName="nova-api-log" Mar 11 01:18:43 crc kubenswrapper[4744]: E0311 01:18:43.194159 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c5eee60b-3a4a-45b9-b25b-d3073e8e64f6" containerName="nova-metadata-metadata" Mar 11 01:18:43 crc kubenswrapper[4744]: I0311 01:18:43.194164 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="c5eee60b-3a4a-45b9-b25b-d3073e8e64f6" containerName="nova-metadata-metadata" Mar 11 01:18:43 crc kubenswrapper[4744]: I0311 01:18:43.194473 4744 memory_manager.go:354] "RemoveStaleState removing state" podUID="a8e3c2f5-67c1-4e20-ab17-99638fa00963" containerName="nova-api-log" Mar 11 01:18:43 crc kubenswrapper[4744]: I0311 01:18:43.194498 4744 memory_manager.go:354] "RemoveStaleState removing state" podUID="c5eee60b-3a4a-45b9-b25b-d3073e8e64f6" containerName="nova-metadata-log" Mar 11 01:18:43 crc kubenswrapper[4744]: I0311 01:18:43.194529 4744 memory_manager.go:354] "RemoveStaleState removing state" podUID="a8e3c2f5-67c1-4e20-ab17-99638fa00963" containerName="nova-api-api" Mar 11 01:18:43 crc kubenswrapper[4744]: I0311 01:18:43.194545 4744 
memory_manager.go:354] "RemoveStaleState removing state" podUID="c5eee60b-3a4a-45b9-b25b-d3073e8e64f6" containerName="nova-metadata-metadata" Mar 11 01:18:43 crc kubenswrapper[4744]: I0311 01:18:43.195977 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Mar 11 01:18:43 crc kubenswrapper[4744]: I0311 01:18:43.198534 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Mar 11 01:18:43 crc kubenswrapper[4744]: I0311 01:18:43.198645 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc" Mar 11 01:18:43 crc kubenswrapper[4744]: I0311 01:18:43.212371 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Mar 11 01:18:43 crc kubenswrapper[4744]: I0311 01:18:43.221623 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Mar 11 01:18:43 crc kubenswrapper[4744]: I0311 01:18:43.230996 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Mar 11 01:18:43 crc kubenswrapper[4744]: I0311 01:18:43.231090 4744 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Mar 11 01:18:43 crc kubenswrapper[4744]: I0311 01:18:43.234409 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-internal-svc" Mar 11 01:18:43 crc kubenswrapper[4744]: I0311 01:18:43.234551 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-public-svc" Mar 11 01:18:43 crc kubenswrapper[4744]: I0311 01:18:43.234557 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Mar 11 01:18:43 crc kubenswrapper[4744]: I0311 01:18:43.370397 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b9edcd5c-3634-45f9-914a-0d8e4f425302-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"b9edcd5c-3634-45f9-914a-0d8e4f425302\") " pod="openstack/nova-metadata-0" Mar 11 01:18:43 crc kubenswrapper[4744]: I0311 01:18:43.370457 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b9edcd5c-3634-45f9-914a-0d8e4f425302-config-data\") pod \"nova-metadata-0\" (UID: \"b9edcd5c-3634-45f9-914a-0d8e4f425302\") " pod="openstack/nova-metadata-0" Mar 11 01:18:43 crc kubenswrapper[4744]: I0311 01:18:43.370485 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bbw4k\" (UniqueName: \"kubernetes.io/projected/23fcfdba-12bc-4a94-94cd-fb703f2e632c-kube-api-access-bbw4k\") pod \"nova-api-0\" (UID: \"23fcfdba-12bc-4a94-94cd-fb703f2e632c\") " pod="openstack/nova-api-0" Mar 11 01:18:43 crc kubenswrapper[4744]: I0311 01:18:43.370536 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b9edcd5c-3634-45f9-914a-0d8e4f425302-logs\") pod \"nova-metadata-0\" (UID: 
\"b9edcd5c-3634-45f9-914a-0d8e4f425302\") " pod="openstack/nova-metadata-0" Mar 11 01:18:43 crc kubenswrapper[4744]: I0311 01:18:43.370631 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/23fcfdba-12bc-4a94-94cd-fb703f2e632c-logs\") pod \"nova-api-0\" (UID: \"23fcfdba-12bc-4a94-94cd-fb703f2e632c\") " pod="openstack/nova-api-0" Mar 11 01:18:43 crc kubenswrapper[4744]: I0311 01:18:43.370684 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/23fcfdba-12bc-4a94-94cd-fb703f2e632c-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"23fcfdba-12bc-4a94-94cd-fb703f2e632c\") " pod="openstack/nova-api-0" Mar 11 01:18:43 crc kubenswrapper[4744]: I0311 01:18:43.370724 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/23fcfdba-12bc-4a94-94cd-fb703f2e632c-internal-tls-certs\") pod \"nova-api-0\" (UID: \"23fcfdba-12bc-4a94-94cd-fb703f2e632c\") " pod="openstack/nova-api-0" Mar 11 01:18:43 crc kubenswrapper[4744]: I0311 01:18:43.370787 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/b9edcd5c-3634-45f9-914a-0d8e4f425302-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"b9edcd5c-3634-45f9-914a-0d8e4f425302\") " pod="openstack/nova-metadata-0" Mar 11 01:18:43 crc kubenswrapper[4744]: I0311 01:18:43.370822 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/23fcfdba-12bc-4a94-94cd-fb703f2e632c-public-tls-certs\") pod \"nova-api-0\" (UID: \"23fcfdba-12bc-4a94-94cd-fb703f2e632c\") " pod="openstack/nova-api-0" Mar 11 01:18:43 crc kubenswrapper[4744]: I0311 
01:18:43.370845 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/23fcfdba-12bc-4a94-94cd-fb703f2e632c-config-data\") pod \"nova-api-0\" (UID: \"23fcfdba-12bc-4a94-94cd-fb703f2e632c\") " pod="openstack/nova-api-0" Mar 11 01:18:43 crc kubenswrapper[4744]: I0311 01:18:43.370868 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f6pmg\" (UniqueName: \"kubernetes.io/projected/b9edcd5c-3634-45f9-914a-0d8e4f425302-kube-api-access-f6pmg\") pod \"nova-metadata-0\" (UID: \"b9edcd5c-3634-45f9-914a-0d8e4f425302\") " pod="openstack/nova-metadata-0" Mar 11 01:18:43 crc kubenswrapper[4744]: I0311 01:18:43.473975 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/23fcfdba-12bc-4a94-94cd-fb703f2e632c-internal-tls-certs\") pod \"nova-api-0\" (UID: \"23fcfdba-12bc-4a94-94cd-fb703f2e632c\") " pod="openstack/nova-api-0" Mar 11 01:18:43 crc kubenswrapper[4744]: I0311 01:18:43.474184 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/b9edcd5c-3634-45f9-914a-0d8e4f425302-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"b9edcd5c-3634-45f9-914a-0d8e4f425302\") " pod="openstack/nova-metadata-0" Mar 11 01:18:43 crc kubenswrapper[4744]: I0311 01:18:43.475467 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/23fcfdba-12bc-4a94-94cd-fb703f2e632c-public-tls-certs\") pod \"nova-api-0\" (UID: \"23fcfdba-12bc-4a94-94cd-fb703f2e632c\") " pod="openstack/nova-api-0" Mar 11 01:18:43 crc kubenswrapper[4744]: I0311 01:18:43.475554 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/23fcfdba-12bc-4a94-94cd-fb703f2e632c-config-data\") pod \"nova-api-0\" (UID: \"23fcfdba-12bc-4a94-94cd-fb703f2e632c\") " pod="openstack/nova-api-0" Mar 11 01:18:43 crc kubenswrapper[4744]: I0311 01:18:43.475603 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f6pmg\" (UniqueName: \"kubernetes.io/projected/b9edcd5c-3634-45f9-914a-0d8e4f425302-kube-api-access-f6pmg\") pod \"nova-metadata-0\" (UID: \"b9edcd5c-3634-45f9-914a-0d8e4f425302\") " pod="openstack/nova-metadata-0" Mar 11 01:18:43 crc kubenswrapper[4744]: I0311 01:18:43.475658 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b9edcd5c-3634-45f9-914a-0d8e4f425302-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"b9edcd5c-3634-45f9-914a-0d8e4f425302\") " pod="openstack/nova-metadata-0" Mar 11 01:18:43 crc kubenswrapper[4744]: I0311 01:18:43.475712 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b9edcd5c-3634-45f9-914a-0d8e4f425302-config-data\") pod \"nova-metadata-0\" (UID: \"b9edcd5c-3634-45f9-914a-0d8e4f425302\") " pod="openstack/nova-metadata-0" Mar 11 01:18:43 crc kubenswrapper[4744]: I0311 01:18:43.475766 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bbw4k\" (UniqueName: \"kubernetes.io/projected/23fcfdba-12bc-4a94-94cd-fb703f2e632c-kube-api-access-bbw4k\") pod \"nova-api-0\" (UID: \"23fcfdba-12bc-4a94-94cd-fb703f2e632c\") " pod="openstack/nova-api-0" Mar 11 01:18:43 crc kubenswrapper[4744]: I0311 01:18:43.475822 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b9edcd5c-3634-45f9-914a-0d8e4f425302-logs\") pod \"nova-metadata-0\" (UID: \"b9edcd5c-3634-45f9-914a-0d8e4f425302\") " pod="openstack/nova-metadata-0" Mar 11 01:18:43 crc 
kubenswrapper[4744]: I0311 01:18:43.476398 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/23fcfdba-12bc-4a94-94cd-fb703f2e632c-logs\") pod \"nova-api-0\" (UID: \"23fcfdba-12bc-4a94-94cd-fb703f2e632c\") " pod="openstack/nova-api-0" Mar 11 01:18:43 crc kubenswrapper[4744]: I0311 01:18:43.476482 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/23fcfdba-12bc-4a94-94cd-fb703f2e632c-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"23fcfdba-12bc-4a94-94cd-fb703f2e632c\") " pod="openstack/nova-api-0" Mar 11 01:18:43 crc kubenswrapper[4744]: I0311 01:18:43.477280 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b9edcd5c-3634-45f9-914a-0d8e4f425302-logs\") pod \"nova-metadata-0\" (UID: \"b9edcd5c-3634-45f9-914a-0d8e4f425302\") " pod="openstack/nova-metadata-0" Mar 11 01:18:43 crc kubenswrapper[4744]: I0311 01:18:43.477648 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/23fcfdba-12bc-4a94-94cd-fb703f2e632c-logs\") pod \"nova-api-0\" (UID: \"23fcfdba-12bc-4a94-94cd-fb703f2e632c\") " pod="openstack/nova-api-0" Mar 11 01:18:43 crc kubenswrapper[4744]: I0311 01:18:43.480666 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/23fcfdba-12bc-4a94-94cd-fb703f2e632c-internal-tls-certs\") pod \"nova-api-0\" (UID: \"23fcfdba-12bc-4a94-94cd-fb703f2e632c\") " pod="openstack/nova-api-0" Mar 11 01:18:43 crc kubenswrapper[4744]: I0311 01:18:43.481486 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/b9edcd5c-3634-45f9-914a-0d8e4f425302-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: 
\"b9edcd5c-3634-45f9-914a-0d8e4f425302\") " pod="openstack/nova-metadata-0" Mar 11 01:18:43 crc kubenswrapper[4744]: I0311 01:18:43.482272 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/23fcfdba-12bc-4a94-94cd-fb703f2e632c-config-data\") pod \"nova-api-0\" (UID: \"23fcfdba-12bc-4a94-94cd-fb703f2e632c\") " pod="openstack/nova-api-0" Mar 11 01:18:43 crc kubenswrapper[4744]: I0311 01:18:43.482978 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b9edcd5c-3634-45f9-914a-0d8e4f425302-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"b9edcd5c-3634-45f9-914a-0d8e4f425302\") " pod="openstack/nova-metadata-0" Mar 11 01:18:43 crc kubenswrapper[4744]: I0311 01:18:43.484090 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/23fcfdba-12bc-4a94-94cd-fb703f2e632c-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"23fcfdba-12bc-4a94-94cd-fb703f2e632c\") " pod="openstack/nova-api-0" Mar 11 01:18:43 crc kubenswrapper[4744]: I0311 01:18:43.490046 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b9edcd5c-3634-45f9-914a-0d8e4f425302-config-data\") pod \"nova-metadata-0\" (UID: \"b9edcd5c-3634-45f9-914a-0d8e4f425302\") " pod="openstack/nova-metadata-0" Mar 11 01:18:43 crc kubenswrapper[4744]: I0311 01:18:43.492984 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/23fcfdba-12bc-4a94-94cd-fb703f2e632c-public-tls-certs\") pod \"nova-api-0\" (UID: \"23fcfdba-12bc-4a94-94cd-fb703f2e632c\") " pod="openstack/nova-api-0" Mar 11 01:18:43 crc kubenswrapper[4744]: I0311 01:18:43.499788 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f6pmg\" (UniqueName: 
\"kubernetes.io/projected/b9edcd5c-3634-45f9-914a-0d8e4f425302-kube-api-access-f6pmg\") pod \"nova-metadata-0\" (UID: \"b9edcd5c-3634-45f9-914a-0d8e4f425302\") " pod="openstack/nova-metadata-0" Mar 11 01:18:43 crc kubenswrapper[4744]: I0311 01:18:43.504649 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bbw4k\" (UniqueName: \"kubernetes.io/projected/23fcfdba-12bc-4a94-94cd-fb703f2e632c-kube-api-access-bbw4k\") pod \"nova-api-0\" (UID: \"23fcfdba-12bc-4a94-94cd-fb703f2e632c\") " pod="openstack/nova-api-0" Mar 11 01:18:43 crc kubenswrapper[4744]: I0311 01:18:43.523134 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Mar 11 01:18:43 crc kubenswrapper[4744]: I0311 01:18:43.546040 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Mar 11 01:18:43 crc kubenswrapper[4744]: I0311 01:18:43.739337 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"b62ac51a-a222-4e7b-b465-9e71c3d34b1f","Type":"ContainerStarted","Data":"fd3a94b5310ccf7a9c7dd7207db3d75ffedf02fcd045db30b5914808a12e1cc6"} Mar 11 01:18:43 crc kubenswrapper[4744]: I0311 01:18:43.739661 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"b62ac51a-a222-4e7b-b465-9e71c3d34b1f","Type":"ContainerStarted","Data":"6344ff515d5d03e1c58a3b38592f19215b4ffbf6761bba85ef02c15059c6f620"} Mar 11 01:18:43 crc kubenswrapper[4744]: I0311 01:18:43.771927 4744 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=2.771906292 podStartE2EDuration="2.771906292s" podCreationTimestamp="2026-03-11 01:18:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-11 01:18:43.765314517 +0000 UTC m=+1480.569532112" 
watchObservedRunningTime="2026-03-11 01:18:43.771906292 +0000 UTC m=+1480.576123897" Mar 11 01:18:43 crc kubenswrapper[4744]: I0311 01:18:43.987106 4744 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a8e3c2f5-67c1-4e20-ab17-99638fa00963" path="/var/lib/kubelet/pods/a8e3c2f5-67c1-4e20-ab17-99638fa00963/volumes" Mar 11 01:18:43 crc kubenswrapper[4744]: I0311 01:18:43.987969 4744 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c5eee60b-3a4a-45b9-b25b-d3073e8e64f6" path="/var/lib/kubelet/pods/c5eee60b-3a4a-45b9-b25b-d3073e8e64f6/volumes" Mar 11 01:18:44 crc kubenswrapper[4744]: I0311 01:18:44.121258 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Mar 11 01:18:44 crc kubenswrapper[4744]: I0311 01:18:44.136760 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Mar 11 01:18:44 crc kubenswrapper[4744]: I0311 01:18:44.771091 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"23fcfdba-12bc-4a94-94cd-fb703f2e632c","Type":"ContainerStarted","Data":"d169bd1e2373ec6a70d9c8ea39075ed35856b33455affec0b49a8f185690d2f6"} Mar 11 01:18:44 crc kubenswrapper[4744]: I0311 01:18:44.771401 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"23fcfdba-12bc-4a94-94cd-fb703f2e632c","Type":"ContainerStarted","Data":"45fbeb5c4f5cda1cead9ba249d13f0bd4d1407bc16d3e8ba3521b8e0f14fca3f"} Mar 11 01:18:44 crc kubenswrapper[4744]: I0311 01:18:44.771416 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"23fcfdba-12bc-4a94-94cd-fb703f2e632c","Type":"ContainerStarted","Data":"45928c1c60e24f1cec1e772f91c0360d621398cc7ddaec6f9413dcb87f97c21b"} Mar 11 01:18:44 crc kubenswrapper[4744]: I0311 01:18:44.773417 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" 
event={"ID":"b9edcd5c-3634-45f9-914a-0d8e4f425302","Type":"ContainerStarted","Data":"55772af813a2f508c881ce1d4bcdd6c3b028b1ce12f20693da7faae23420077f"} Mar 11 01:18:44 crc kubenswrapper[4744]: I0311 01:18:44.773463 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"b9edcd5c-3634-45f9-914a-0d8e4f425302","Type":"ContainerStarted","Data":"0139bf0559237b374402a2f0ca5c12edd7eeaa5764a0ccb62bf6e9201da581b6"} Mar 11 01:18:44 crc kubenswrapper[4744]: I0311 01:18:44.773478 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"b9edcd5c-3634-45f9-914a-0d8e4f425302","Type":"ContainerStarted","Data":"ad20406abed1d97d21a11032cbec2fef3caf71dbf971517a038b8a1a756e3205"} Mar 11 01:18:44 crc kubenswrapper[4744]: I0311 01:18:44.787238 4744 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=1.787205223 podStartE2EDuration="1.787205223s" podCreationTimestamp="2026-03-11 01:18:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-11 01:18:44.785779339 +0000 UTC m=+1481.589996944" watchObservedRunningTime="2026-03-11 01:18:44.787205223 +0000 UTC m=+1481.591422818" Mar 11 01:18:44 crc kubenswrapper[4744]: I0311 01:18:44.812829 4744 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=1.812808005 podStartE2EDuration="1.812808005s" podCreationTimestamp="2026-03-11 01:18:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-11 01:18:44.806321264 +0000 UTC m=+1481.610538869" watchObservedRunningTime="2026-03-11 01:18:44.812808005 +0000 UTC m=+1481.617025610" Mar 11 01:18:45 crc kubenswrapper[4744]: I0311 01:18:45.802982 4744 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-marketplace/certified-operators-8t6v4"] Mar 11 01:18:45 crc kubenswrapper[4744]: I0311 01:18:45.806573 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-8t6v4" Mar 11 01:18:45 crc kubenswrapper[4744]: I0311 01:18:45.834919 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-8t6v4"] Mar 11 01:18:45 crc kubenswrapper[4744]: I0311 01:18:45.922794 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/893dfdf0-66ea-493f-96c0-d3b7600c2f29-utilities\") pod \"certified-operators-8t6v4\" (UID: \"893dfdf0-66ea-493f-96c0-d3b7600c2f29\") " pod="openshift-marketplace/certified-operators-8t6v4" Mar 11 01:18:45 crc kubenswrapper[4744]: I0311 01:18:45.922850 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qg4kd\" (UniqueName: \"kubernetes.io/projected/893dfdf0-66ea-493f-96c0-d3b7600c2f29-kube-api-access-qg4kd\") pod \"certified-operators-8t6v4\" (UID: \"893dfdf0-66ea-493f-96c0-d3b7600c2f29\") " pod="openshift-marketplace/certified-operators-8t6v4" Mar 11 01:18:45 crc kubenswrapper[4744]: I0311 01:18:45.922912 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/893dfdf0-66ea-493f-96c0-d3b7600c2f29-catalog-content\") pod \"certified-operators-8t6v4\" (UID: \"893dfdf0-66ea-493f-96c0-d3b7600c2f29\") " pod="openshift-marketplace/certified-operators-8t6v4" Mar 11 01:18:46 crc kubenswrapper[4744]: I0311 01:18:46.025361 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/893dfdf0-66ea-493f-96c0-d3b7600c2f29-utilities\") pod \"certified-operators-8t6v4\" (UID: \"893dfdf0-66ea-493f-96c0-d3b7600c2f29\") 
" pod="openshift-marketplace/certified-operators-8t6v4" Mar 11 01:18:46 crc kubenswrapper[4744]: I0311 01:18:46.025418 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qg4kd\" (UniqueName: \"kubernetes.io/projected/893dfdf0-66ea-493f-96c0-d3b7600c2f29-kube-api-access-qg4kd\") pod \"certified-operators-8t6v4\" (UID: \"893dfdf0-66ea-493f-96c0-d3b7600c2f29\") " pod="openshift-marketplace/certified-operators-8t6v4" Mar 11 01:18:46 crc kubenswrapper[4744]: I0311 01:18:46.025474 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/893dfdf0-66ea-493f-96c0-d3b7600c2f29-catalog-content\") pod \"certified-operators-8t6v4\" (UID: \"893dfdf0-66ea-493f-96c0-d3b7600c2f29\") " pod="openshift-marketplace/certified-operators-8t6v4" Mar 11 01:18:46 crc kubenswrapper[4744]: I0311 01:18:46.025869 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/893dfdf0-66ea-493f-96c0-d3b7600c2f29-utilities\") pod \"certified-operators-8t6v4\" (UID: \"893dfdf0-66ea-493f-96c0-d3b7600c2f29\") " pod="openshift-marketplace/certified-operators-8t6v4" Mar 11 01:18:46 crc kubenswrapper[4744]: I0311 01:18:46.026043 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/893dfdf0-66ea-493f-96c0-d3b7600c2f29-catalog-content\") pod \"certified-operators-8t6v4\" (UID: \"893dfdf0-66ea-493f-96c0-d3b7600c2f29\") " pod="openshift-marketplace/certified-operators-8t6v4" Mar 11 01:18:46 crc kubenswrapper[4744]: I0311 01:18:46.056542 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qg4kd\" (UniqueName: \"kubernetes.io/projected/893dfdf0-66ea-493f-96c0-d3b7600c2f29-kube-api-access-qg4kd\") pod \"certified-operators-8t6v4\" (UID: \"893dfdf0-66ea-493f-96c0-d3b7600c2f29\") " 
pod="openshift-marketplace/certified-operators-8t6v4" Mar 11 01:18:46 crc kubenswrapper[4744]: I0311 01:18:46.134411 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-8t6v4" Mar 11 01:18:46 crc kubenswrapper[4744]: W0311 01:18:46.623120 4744 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod893dfdf0_66ea_493f_96c0_d3b7600c2f29.slice/crio-cc4915587d7f9b5f597eb68fb9db8e61460dd26ae1c0b1971cbfa832d26575b0 WatchSource:0}: Error finding container cc4915587d7f9b5f597eb68fb9db8e61460dd26ae1c0b1971cbfa832d26575b0: Status 404 returned error can't find the container with id cc4915587d7f9b5f597eb68fb9db8e61460dd26ae1c0b1971cbfa832d26575b0 Mar 11 01:18:46 crc kubenswrapper[4744]: I0311 01:18:46.625310 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-8t6v4"] Mar 11 01:18:46 crc kubenswrapper[4744]: I0311 01:18:46.821636 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-8t6v4" event={"ID":"893dfdf0-66ea-493f-96c0-d3b7600c2f29","Type":"ContainerStarted","Data":"072199f6437095b34cc813fa157bd24ee274043696ab6e89cfb6111c53701759"} Mar 11 01:18:46 crc kubenswrapper[4744]: I0311 01:18:46.821675 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-8t6v4" event={"ID":"893dfdf0-66ea-493f-96c0-d3b7600c2f29","Type":"ContainerStarted","Data":"cc4915587d7f9b5f597eb68fb9db8e61460dd26ae1c0b1971cbfa832d26575b0"} Mar 11 01:18:47 crc kubenswrapper[4744]: I0311 01:18:47.134093 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Mar 11 01:18:47 crc kubenswrapper[4744]: I0311 01:18:47.834739 4744 generic.go:334] "Generic (PLEG): container finished" podID="893dfdf0-66ea-493f-96c0-d3b7600c2f29" 
containerID="072199f6437095b34cc813fa157bd24ee274043696ab6e89cfb6111c53701759" exitCode=0 Mar 11 01:18:47 crc kubenswrapper[4744]: I0311 01:18:47.834857 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-8t6v4" event={"ID":"893dfdf0-66ea-493f-96c0-d3b7600c2f29","Type":"ContainerDied","Data":"072199f6437095b34cc813fa157bd24ee274043696ab6e89cfb6111c53701759"} Mar 11 01:18:48 crc kubenswrapper[4744]: I0311 01:18:48.523447 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Mar 11 01:18:48 crc kubenswrapper[4744]: I0311 01:18:48.523584 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Mar 11 01:18:49 crc kubenswrapper[4744]: I0311 01:18:49.721565 4744 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ceilometer-0" podUID="b7268c12-3602-488a-8ef9-606bdf629c99" containerName="proxy-httpd" probeResult="failure" output="HTTP probe failed with statuscode: 503" Mar 11 01:18:49 crc kubenswrapper[4744]: I0311 01:18:49.858912 4744 generic.go:334] "Generic (PLEG): container finished" podID="893dfdf0-66ea-493f-96c0-d3b7600c2f29" containerID="9571aed871c9082712882539dff0a4de76e948e81bcb7a9aa9bd78edc67bc258" exitCode=0 Mar 11 01:18:49 crc kubenswrapper[4744]: I0311 01:18:49.859127 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-8t6v4" event={"ID":"893dfdf0-66ea-493f-96c0-d3b7600c2f29","Type":"ContainerDied","Data":"9571aed871c9082712882539dff0a4de76e948e81bcb7a9aa9bd78edc67bc258"} Mar 11 01:18:50 crc kubenswrapper[4744]: I0311 01:18:50.870153 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-8t6v4" event={"ID":"893dfdf0-66ea-493f-96c0-d3b7600c2f29","Type":"ContainerStarted","Data":"e8277b7598cd7acfdfc39b874aedea1bd9443b71ee0dfbbca9f082205cdc4fa2"} Mar 11 01:18:50 crc kubenswrapper[4744]: I0311 01:18:50.939230 
4744 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-8t6v4" podStartSLOduration=3.528502519 podStartE2EDuration="5.939207876s" podCreationTimestamp="2026-03-11 01:18:45 +0000 UTC" firstStartedPulling="2026-03-11 01:18:47.836650847 +0000 UTC m=+1484.640868472" lastFinishedPulling="2026-03-11 01:18:50.247356214 +0000 UTC m=+1487.051573829" observedRunningTime="2026-03-11 01:18:50.927215534 +0000 UTC m=+1487.731433159" watchObservedRunningTime="2026-03-11 01:18:50.939207876 +0000 UTC m=+1487.743425491" Mar 11 01:18:52 crc kubenswrapper[4744]: I0311 01:18:52.134883 4744 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Mar 11 01:18:52 crc kubenswrapper[4744]: I0311 01:18:52.171560 4744 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Mar 11 01:18:52 crc kubenswrapper[4744]: I0311 01:18:52.942390 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Mar 11 01:18:53 crc kubenswrapper[4744]: I0311 01:18:53.523921 4744 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Mar 11 01:18:53 crc kubenswrapper[4744]: I0311 01:18:53.523994 4744 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Mar 11 01:18:53 crc kubenswrapper[4744]: I0311 01:18:53.547354 4744 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Mar 11 01:18:53 crc kubenswrapper[4744]: I0311 01:18:53.547770 4744 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Mar 11 01:18:54 crc kubenswrapper[4744]: I0311 01:18:54.537760 4744 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="b9edcd5c-3634-45f9-914a-0d8e4f425302" containerName="nova-metadata-metadata" 
probeResult="failure" output="Get \"https://10.217.0.210:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 11 01:18:54 crc kubenswrapper[4744]: I0311 01:18:54.537795 4744 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="b9edcd5c-3634-45f9-914a-0d8e4f425302" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.0.210:8775/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 11 01:18:54 crc kubenswrapper[4744]: I0311 01:18:54.562667 4744 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="23fcfdba-12bc-4a94-94cd-fb703f2e632c" containerName="nova-api-log" probeResult="failure" output="Get \"https://10.217.0.211:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 11 01:18:54 crc kubenswrapper[4744]: I0311 01:18:54.562677 4744 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="23fcfdba-12bc-4a94-94cd-fb703f2e632c" containerName="nova-api-api" probeResult="failure" output="Get \"https://10.217.0.211:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 11 01:18:54 crc kubenswrapper[4744]: I0311 01:18:54.916256 4744 generic.go:334] "Generic (PLEG): container finished" podID="b7268c12-3602-488a-8ef9-606bdf629c99" containerID="d029e25bc26b2d86f29d6c0a65d281983dda59b6ce3ee62f24f6b8bf2dac6563" exitCode=137 Mar 11 01:18:54 crc kubenswrapper[4744]: I0311 01:18:54.916311 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"b7268c12-3602-488a-8ef9-606bdf629c99","Type":"ContainerDied","Data":"d029e25bc26b2d86f29d6c0a65d281983dda59b6ce3ee62f24f6b8bf2dac6563"} Mar 11 01:18:55 crc kubenswrapper[4744]: I0311 01:18:55.377465 4744 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 11 01:18:55 crc kubenswrapper[4744]: I0311 01:18:55.523225 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b7268c12-3602-488a-8ef9-606bdf629c99-scripts\") pod \"b7268c12-3602-488a-8ef9-606bdf629c99\" (UID: \"b7268c12-3602-488a-8ef9-606bdf629c99\") " Mar 11 01:18:55 crc kubenswrapper[4744]: I0311 01:18:55.523309 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-l8p8b\" (UniqueName: \"kubernetes.io/projected/b7268c12-3602-488a-8ef9-606bdf629c99-kube-api-access-l8p8b\") pod \"b7268c12-3602-488a-8ef9-606bdf629c99\" (UID: \"b7268c12-3602-488a-8ef9-606bdf629c99\") " Mar 11 01:18:55 crc kubenswrapper[4744]: I0311 01:18:55.523342 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/b7268c12-3602-488a-8ef9-606bdf629c99-sg-core-conf-yaml\") pod \"b7268c12-3602-488a-8ef9-606bdf629c99\" (UID: \"b7268c12-3602-488a-8ef9-606bdf629c99\") " Mar 11 01:18:55 crc kubenswrapper[4744]: I0311 01:18:55.523433 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b7268c12-3602-488a-8ef9-606bdf629c99-config-data\") pod \"b7268c12-3602-488a-8ef9-606bdf629c99\" (UID: \"b7268c12-3602-488a-8ef9-606bdf629c99\") " Mar 11 01:18:55 crc kubenswrapper[4744]: I0311 01:18:55.523497 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/b7268c12-3602-488a-8ef9-606bdf629c99-ceilometer-tls-certs\") pod \"b7268c12-3602-488a-8ef9-606bdf629c99\" (UID: \"b7268c12-3602-488a-8ef9-606bdf629c99\") " Mar 11 01:18:55 crc kubenswrapper[4744]: I0311 01:18:55.523609 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: 
\"kubernetes.io/empty-dir/b7268c12-3602-488a-8ef9-606bdf629c99-run-httpd\") pod \"b7268c12-3602-488a-8ef9-606bdf629c99\" (UID: \"b7268c12-3602-488a-8ef9-606bdf629c99\") " Mar 11 01:18:55 crc kubenswrapper[4744]: I0311 01:18:55.523642 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b7268c12-3602-488a-8ef9-606bdf629c99-log-httpd\") pod \"b7268c12-3602-488a-8ef9-606bdf629c99\" (UID: \"b7268c12-3602-488a-8ef9-606bdf629c99\") " Mar 11 01:18:55 crc kubenswrapper[4744]: I0311 01:18:55.523707 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b7268c12-3602-488a-8ef9-606bdf629c99-combined-ca-bundle\") pod \"b7268c12-3602-488a-8ef9-606bdf629c99\" (UID: \"b7268c12-3602-488a-8ef9-606bdf629c99\") " Mar 11 01:18:55 crc kubenswrapper[4744]: I0311 01:18:55.525455 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b7268c12-3602-488a-8ef9-606bdf629c99-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "b7268c12-3602-488a-8ef9-606bdf629c99" (UID: "b7268c12-3602-488a-8ef9-606bdf629c99"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 11 01:18:55 crc kubenswrapper[4744]: I0311 01:18:55.525737 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b7268c12-3602-488a-8ef9-606bdf629c99-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "b7268c12-3602-488a-8ef9-606bdf629c99" (UID: "b7268c12-3602-488a-8ef9-606bdf629c99"). InnerVolumeSpecName "run-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 11 01:18:55 crc kubenswrapper[4744]: I0311 01:18:55.557457 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b7268c12-3602-488a-8ef9-606bdf629c99-kube-api-access-l8p8b" (OuterVolumeSpecName: "kube-api-access-l8p8b") pod "b7268c12-3602-488a-8ef9-606bdf629c99" (UID: "b7268c12-3602-488a-8ef9-606bdf629c99"). InnerVolumeSpecName "kube-api-access-l8p8b". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 01:18:55 crc kubenswrapper[4744]: I0311 01:18:55.560834 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b7268c12-3602-488a-8ef9-606bdf629c99-scripts" (OuterVolumeSpecName: "scripts") pod "b7268c12-3602-488a-8ef9-606bdf629c99" (UID: "b7268c12-3602-488a-8ef9-606bdf629c99"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 01:18:55 crc kubenswrapper[4744]: I0311 01:18:55.578971 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b7268c12-3602-488a-8ef9-606bdf629c99-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "b7268c12-3602-488a-8ef9-606bdf629c99" (UID: "b7268c12-3602-488a-8ef9-606bdf629c99"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 01:18:55 crc kubenswrapper[4744]: I0311 01:18:55.602227 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b7268c12-3602-488a-8ef9-606bdf629c99-ceilometer-tls-certs" (OuterVolumeSpecName: "ceilometer-tls-certs") pod "b7268c12-3602-488a-8ef9-606bdf629c99" (UID: "b7268c12-3602-488a-8ef9-606bdf629c99"). InnerVolumeSpecName "ceilometer-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 01:18:55 crc kubenswrapper[4744]: I0311 01:18:55.625581 4744 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b7268c12-3602-488a-8ef9-606bdf629c99-run-httpd\") on node \"crc\" DevicePath \"\"" Mar 11 01:18:55 crc kubenswrapper[4744]: I0311 01:18:55.625613 4744 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b7268c12-3602-488a-8ef9-606bdf629c99-log-httpd\") on node \"crc\" DevicePath \"\"" Mar 11 01:18:55 crc kubenswrapper[4744]: I0311 01:18:55.625623 4744 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b7268c12-3602-488a-8ef9-606bdf629c99-scripts\") on node \"crc\" DevicePath \"\"" Mar 11 01:18:55 crc kubenswrapper[4744]: I0311 01:18:55.625631 4744 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-l8p8b\" (UniqueName: \"kubernetes.io/projected/b7268c12-3602-488a-8ef9-606bdf629c99-kube-api-access-l8p8b\") on node \"crc\" DevicePath \"\"" Mar 11 01:18:55 crc kubenswrapper[4744]: I0311 01:18:55.625640 4744 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/b7268c12-3602-488a-8ef9-606bdf629c99-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Mar 11 01:18:55 crc kubenswrapper[4744]: I0311 01:18:55.625650 4744 reconciler_common.go:293] "Volume detached for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/b7268c12-3602-488a-8ef9-606bdf629c99-ceilometer-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 11 01:18:55 crc kubenswrapper[4744]: I0311 01:18:55.629853 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b7268c12-3602-488a-8ef9-606bdf629c99-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "b7268c12-3602-488a-8ef9-606bdf629c99" (UID: 
"b7268c12-3602-488a-8ef9-606bdf629c99"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 01:18:55 crc kubenswrapper[4744]: I0311 01:18:55.664079 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b7268c12-3602-488a-8ef9-606bdf629c99-config-data" (OuterVolumeSpecName: "config-data") pod "b7268c12-3602-488a-8ef9-606bdf629c99" (UID: "b7268c12-3602-488a-8ef9-606bdf629c99"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 01:18:55 crc kubenswrapper[4744]: I0311 01:18:55.727088 4744 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b7268c12-3602-488a-8ef9-606bdf629c99-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 11 01:18:55 crc kubenswrapper[4744]: I0311 01:18:55.727125 4744 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b7268c12-3602-488a-8ef9-606bdf629c99-config-data\") on node \"crc\" DevicePath \"\"" Mar 11 01:18:55 crc kubenswrapper[4744]: I0311 01:18:55.934732 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"b7268c12-3602-488a-8ef9-606bdf629c99","Type":"ContainerDied","Data":"cb0a2054a5692fc75a0ec24bb05c20b763e3484d4f2527532b6e4394beb864dd"} Mar 11 01:18:55 crc kubenswrapper[4744]: I0311 01:18:55.934823 4744 scope.go:117] "RemoveContainer" containerID="d029e25bc26b2d86f29d6c0a65d281983dda59b6ce3ee62f24f6b8bf2dac6563" Mar 11 01:18:55 crc kubenswrapper[4744]: I0311 01:18:55.935058 4744 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 11 01:18:55 crc kubenswrapper[4744]: I0311 01:18:55.985869 4744 scope.go:117] "RemoveContainer" containerID="c2793377c0ca5ce05edccec9e4d03d1a1f0b2c742156faccf67df9cdf71392bc" Mar 11 01:18:56 crc kubenswrapper[4744]: I0311 01:18:56.007826 4744 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 11 01:18:56 crc kubenswrapper[4744]: I0311 01:18:56.029424 4744 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Mar 11 01:18:56 crc kubenswrapper[4744]: I0311 01:18:56.039289 4744 scope.go:117] "RemoveContainer" containerID="a709b7f7989928cd941295bf2b17bba6771bd6e57e2b98bebfe372bab0638bfb" Mar 11 01:18:56 crc kubenswrapper[4744]: I0311 01:18:56.044392 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Mar 11 01:18:56 crc kubenswrapper[4744]: E0311 01:18:56.045022 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b7268c12-3602-488a-8ef9-606bdf629c99" containerName="ceilometer-notification-agent" Mar 11 01:18:56 crc kubenswrapper[4744]: I0311 01:18:56.045052 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="b7268c12-3602-488a-8ef9-606bdf629c99" containerName="ceilometer-notification-agent" Mar 11 01:18:56 crc kubenswrapper[4744]: E0311 01:18:56.045084 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b7268c12-3602-488a-8ef9-606bdf629c99" containerName="sg-core" Mar 11 01:18:56 crc kubenswrapper[4744]: I0311 01:18:56.045099 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="b7268c12-3602-488a-8ef9-606bdf629c99" containerName="sg-core" Mar 11 01:18:56 crc kubenswrapper[4744]: E0311 01:18:56.045136 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b7268c12-3602-488a-8ef9-606bdf629c99" containerName="proxy-httpd" Mar 11 01:18:56 crc kubenswrapper[4744]: I0311 01:18:56.045148 4744 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="b7268c12-3602-488a-8ef9-606bdf629c99" containerName="proxy-httpd" Mar 11 01:18:56 crc kubenswrapper[4744]: E0311 01:18:56.045167 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b7268c12-3602-488a-8ef9-606bdf629c99" containerName="ceilometer-central-agent" Mar 11 01:18:56 crc kubenswrapper[4744]: I0311 01:18:56.045179 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="b7268c12-3602-488a-8ef9-606bdf629c99" containerName="ceilometer-central-agent" Mar 11 01:18:56 crc kubenswrapper[4744]: I0311 01:18:56.045541 4744 memory_manager.go:354] "RemoveStaleState removing state" podUID="b7268c12-3602-488a-8ef9-606bdf629c99" containerName="ceilometer-central-agent" Mar 11 01:18:56 crc kubenswrapper[4744]: I0311 01:18:56.045568 4744 memory_manager.go:354] "RemoveStaleState removing state" podUID="b7268c12-3602-488a-8ef9-606bdf629c99" containerName="proxy-httpd" Mar 11 01:18:56 crc kubenswrapper[4744]: I0311 01:18:56.045616 4744 memory_manager.go:354] "RemoveStaleState removing state" podUID="b7268c12-3602-488a-8ef9-606bdf629c99" containerName="sg-core" Mar 11 01:18:56 crc kubenswrapper[4744]: I0311 01:18:56.045635 4744 memory_manager.go:354] "RemoveStaleState removing state" podUID="b7268c12-3602-488a-8ef9-606bdf629c99" containerName="ceilometer-notification-agent" Mar 11 01:18:56 crc kubenswrapper[4744]: I0311 01:18:56.048592 4744 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 11 01:18:56 crc kubenswrapper[4744]: I0311 01:18:56.052100 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Mar 11 01:18:56 crc kubenswrapper[4744]: I0311 01:18:56.052333 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ceilometer-internal-svc" Mar 11 01:18:56 crc kubenswrapper[4744]: I0311 01:18:56.052893 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Mar 11 01:18:56 crc kubenswrapper[4744]: I0311 01:18:56.054966 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 11 01:18:56 crc kubenswrapper[4744]: I0311 01:18:56.113760 4744 scope.go:117] "RemoveContainer" containerID="ed6bbcf4cf54a519f10c91d716632ee4fb9dcf21de0115d4e485910375920b05" Mar 11 01:18:56 crc kubenswrapper[4744]: I0311 01:18:56.136641 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-8t6v4" Mar 11 01:18:56 crc kubenswrapper[4744]: I0311 01:18:56.136696 4744 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-8t6v4" Mar 11 01:18:56 crc kubenswrapper[4744]: I0311 01:18:56.137890 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cbr42\" (UniqueName: \"kubernetes.io/projected/33c0b5dd-192a-4fd2-bbaa-b483399724df-kube-api-access-cbr42\") pod \"ceilometer-0\" (UID: \"33c0b5dd-192a-4fd2-bbaa-b483399724df\") " pod="openstack/ceilometer-0" Mar 11 01:18:56 crc kubenswrapper[4744]: I0311 01:18:56.137934 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/33c0b5dd-192a-4fd2-bbaa-b483399724df-log-httpd\") pod \"ceilometer-0\" (UID: \"33c0b5dd-192a-4fd2-bbaa-b483399724df\") " 
pod="openstack/ceilometer-0" Mar 11 01:18:56 crc kubenswrapper[4744]: I0311 01:18:56.137962 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/33c0b5dd-192a-4fd2-bbaa-b483399724df-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"33c0b5dd-192a-4fd2-bbaa-b483399724df\") " pod="openstack/ceilometer-0" Mar 11 01:18:56 crc kubenswrapper[4744]: I0311 01:18:56.138010 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/33c0b5dd-192a-4fd2-bbaa-b483399724df-config-data\") pod \"ceilometer-0\" (UID: \"33c0b5dd-192a-4fd2-bbaa-b483399724df\") " pod="openstack/ceilometer-0" Mar 11 01:18:56 crc kubenswrapper[4744]: I0311 01:18:56.138059 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/33c0b5dd-192a-4fd2-bbaa-b483399724df-run-httpd\") pod \"ceilometer-0\" (UID: \"33c0b5dd-192a-4fd2-bbaa-b483399724df\") " pod="openstack/ceilometer-0" Mar 11 01:18:56 crc kubenswrapper[4744]: I0311 01:18:56.138125 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/33c0b5dd-192a-4fd2-bbaa-b483399724df-scripts\") pod \"ceilometer-0\" (UID: \"33c0b5dd-192a-4fd2-bbaa-b483399724df\") " pod="openstack/ceilometer-0" Mar 11 01:18:56 crc kubenswrapper[4744]: I0311 01:18:56.138168 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/33c0b5dd-192a-4fd2-bbaa-b483399724df-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"33c0b5dd-192a-4fd2-bbaa-b483399724df\") " pod="openstack/ceilometer-0" Mar 11 01:18:56 crc kubenswrapper[4744]: I0311 01:18:56.138205 4744 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/33c0b5dd-192a-4fd2-bbaa-b483399724df-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"33c0b5dd-192a-4fd2-bbaa-b483399724df\") " pod="openstack/ceilometer-0" Mar 11 01:18:56 crc kubenswrapper[4744]: I0311 01:18:56.205567 4744 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-8t6v4" Mar 11 01:18:56 crc kubenswrapper[4744]: I0311 01:18:56.240446 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/33c0b5dd-192a-4fd2-bbaa-b483399724df-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"33c0b5dd-192a-4fd2-bbaa-b483399724df\") " pod="openstack/ceilometer-0" Mar 11 01:18:56 crc kubenswrapper[4744]: I0311 01:18:56.240588 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/33c0b5dd-192a-4fd2-bbaa-b483399724df-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"33c0b5dd-192a-4fd2-bbaa-b483399724df\") " pod="openstack/ceilometer-0" Mar 11 01:18:56 crc kubenswrapper[4744]: I0311 01:18:56.240719 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cbr42\" (UniqueName: \"kubernetes.io/projected/33c0b5dd-192a-4fd2-bbaa-b483399724df-kube-api-access-cbr42\") pod \"ceilometer-0\" (UID: \"33c0b5dd-192a-4fd2-bbaa-b483399724df\") " pod="openstack/ceilometer-0" Mar 11 01:18:56 crc kubenswrapper[4744]: I0311 01:18:56.240757 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/33c0b5dd-192a-4fd2-bbaa-b483399724df-log-httpd\") pod \"ceilometer-0\" (UID: \"33c0b5dd-192a-4fd2-bbaa-b483399724df\") " pod="openstack/ceilometer-0" Mar 11 01:18:56 crc kubenswrapper[4744]: I0311 01:18:56.240793 4744 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/33c0b5dd-192a-4fd2-bbaa-b483399724df-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"33c0b5dd-192a-4fd2-bbaa-b483399724df\") " pod="openstack/ceilometer-0" Mar 11 01:18:56 crc kubenswrapper[4744]: I0311 01:18:56.240865 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/33c0b5dd-192a-4fd2-bbaa-b483399724df-config-data\") pod \"ceilometer-0\" (UID: \"33c0b5dd-192a-4fd2-bbaa-b483399724df\") " pod="openstack/ceilometer-0" Mar 11 01:18:56 crc kubenswrapper[4744]: I0311 01:18:56.240956 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/33c0b5dd-192a-4fd2-bbaa-b483399724df-run-httpd\") pod \"ceilometer-0\" (UID: \"33c0b5dd-192a-4fd2-bbaa-b483399724df\") " pod="openstack/ceilometer-0" Mar 11 01:18:56 crc kubenswrapper[4744]: I0311 01:18:56.241060 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/33c0b5dd-192a-4fd2-bbaa-b483399724df-scripts\") pod \"ceilometer-0\" (UID: \"33c0b5dd-192a-4fd2-bbaa-b483399724df\") " pod="openstack/ceilometer-0" Mar 11 01:18:56 crc kubenswrapper[4744]: I0311 01:18:56.241476 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/33c0b5dd-192a-4fd2-bbaa-b483399724df-log-httpd\") pod \"ceilometer-0\" (UID: \"33c0b5dd-192a-4fd2-bbaa-b483399724df\") " pod="openstack/ceilometer-0" Mar 11 01:18:56 crc kubenswrapper[4744]: I0311 01:18:56.241914 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/33c0b5dd-192a-4fd2-bbaa-b483399724df-run-httpd\") pod \"ceilometer-0\" (UID: \"33c0b5dd-192a-4fd2-bbaa-b483399724df\") " 
pod="openstack/ceilometer-0" Mar 11 01:18:56 crc kubenswrapper[4744]: I0311 01:18:56.246105 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/33c0b5dd-192a-4fd2-bbaa-b483399724df-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"33c0b5dd-192a-4fd2-bbaa-b483399724df\") " pod="openstack/ceilometer-0" Mar 11 01:18:56 crc kubenswrapper[4744]: I0311 01:18:56.246737 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/33c0b5dd-192a-4fd2-bbaa-b483399724df-scripts\") pod \"ceilometer-0\" (UID: \"33c0b5dd-192a-4fd2-bbaa-b483399724df\") " pod="openstack/ceilometer-0" Mar 11 01:18:56 crc kubenswrapper[4744]: I0311 01:18:56.247565 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/33c0b5dd-192a-4fd2-bbaa-b483399724df-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"33c0b5dd-192a-4fd2-bbaa-b483399724df\") " pod="openstack/ceilometer-0" Mar 11 01:18:56 crc kubenswrapper[4744]: I0311 01:18:56.247610 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/33c0b5dd-192a-4fd2-bbaa-b483399724df-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"33c0b5dd-192a-4fd2-bbaa-b483399724df\") " pod="openstack/ceilometer-0" Mar 11 01:18:56 crc kubenswrapper[4744]: I0311 01:18:56.250117 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/33c0b5dd-192a-4fd2-bbaa-b483399724df-config-data\") pod \"ceilometer-0\" (UID: \"33c0b5dd-192a-4fd2-bbaa-b483399724df\") " pod="openstack/ceilometer-0" Mar 11 01:18:56 crc kubenswrapper[4744]: I0311 01:18:56.266105 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cbr42\" (UniqueName: 
\"kubernetes.io/projected/33c0b5dd-192a-4fd2-bbaa-b483399724df-kube-api-access-cbr42\") pod \"ceilometer-0\" (UID: \"33c0b5dd-192a-4fd2-bbaa-b483399724df\") " pod="openstack/ceilometer-0" Mar 11 01:18:56 crc kubenswrapper[4744]: I0311 01:18:56.415874 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 11 01:18:56 crc kubenswrapper[4744]: I0311 01:18:56.935810 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 11 01:18:56 crc kubenswrapper[4744]: W0311 01:18:56.968532 4744 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod33c0b5dd_192a_4fd2_bbaa_b483399724df.slice/crio-723b7d781dd11751a0df17474f1ed7775b80a5f78ad3d6efc2754e757cce5b54 WatchSource:0}: Error finding container 723b7d781dd11751a0df17474f1ed7775b80a5f78ad3d6efc2754e757cce5b54: Status 404 returned error can't find the container with id 723b7d781dd11751a0df17474f1ed7775b80a5f78ad3d6efc2754e757cce5b54 Mar 11 01:18:57 crc kubenswrapper[4744]: I0311 01:18:57.035983 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-8t6v4" Mar 11 01:18:57 crc kubenswrapper[4744]: I0311 01:18:57.113667 4744 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-8t6v4"] Mar 11 01:18:57 crc kubenswrapper[4744]: I0311 01:18:57.964758 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"33c0b5dd-192a-4fd2-bbaa-b483399724df","Type":"ContainerStarted","Data":"6940392b1cc755e200810e7dda5a0cdf602c2bc1bee93993b5d7b849c244decb"} Mar 11 01:18:57 crc kubenswrapper[4744]: I0311 01:18:57.965144 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"33c0b5dd-192a-4fd2-bbaa-b483399724df","Type":"ContainerStarted","Data":"723b7d781dd11751a0df17474f1ed7775b80a5f78ad3d6efc2754e757cce5b54"} Mar 11 01:18:57 crc kubenswrapper[4744]: I0311 01:18:57.991065 4744 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b7268c12-3602-488a-8ef9-606bdf629c99" path="/var/lib/kubelet/pods/b7268c12-3602-488a-8ef9-606bdf629c99/volumes" Mar 11 01:18:58 crc kubenswrapper[4744]: I0311 01:18:58.990579 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"33c0b5dd-192a-4fd2-bbaa-b483399724df","Type":"ContainerStarted","Data":"1252cd0d658ee3e2e06be9771503f0ab0664f8d814b22e653a029a2d6d6716c4"} Mar 11 01:18:58 crc kubenswrapper[4744]: I0311 01:18:58.990737 4744 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-8t6v4" podUID="893dfdf0-66ea-493f-96c0-d3b7600c2f29" containerName="registry-server" containerID="cri-o://e8277b7598cd7acfdfc39b874aedea1bd9443b71ee0dfbbca9f082205cdc4fa2" gracePeriod=2 Mar 11 01:18:59 crc kubenswrapper[4744]: I0311 01:18:59.495165 4744 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-8t6v4" Mar 11 01:18:59 crc kubenswrapper[4744]: I0311 01:18:59.613293 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/893dfdf0-66ea-493f-96c0-d3b7600c2f29-utilities\") pod \"893dfdf0-66ea-493f-96c0-d3b7600c2f29\" (UID: \"893dfdf0-66ea-493f-96c0-d3b7600c2f29\") " Mar 11 01:18:59 crc kubenswrapper[4744]: I0311 01:18:59.613349 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/893dfdf0-66ea-493f-96c0-d3b7600c2f29-catalog-content\") pod \"893dfdf0-66ea-493f-96c0-d3b7600c2f29\" (UID: \"893dfdf0-66ea-493f-96c0-d3b7600c2f29\") " Mar 11 01:18:59 crc kubenswrapper[4744]: I0311 01:18:59.613421 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qg4kd\" (UniqueName: \"kubernetes.io/projected/893dfdf0-66ea-493f-96c0-d3b7600c2f29-kube-api-access-qg4kd\") pod \"893dfdf0-66ea-493f-96c0-d3b7600c2f29\" (UID: \"893dfdf0-66ea-493f-96c0-d3b7600c2f29\") " Mar 11 01:18:59 crc kubenswrapper[4744]: I0311 01:18:59.614547 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/893dfdf0-66ea-493f-96c0-d3b7600c2f29-utilities" (OuterVolumeSpecName: "utilities") pod "893dfdf0-66ea-493f-96c0-d3b7600c2f29" (UID: "893dfdf0-66ea-493f-96c0-d3b7600c2f29"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 11 01:18:59 crc kubenswrapper[4744]: I0311 01:18:59.619297 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/893dfdf0-66ea-493f-96c0-d3b7600c2f29-kube-api-access-qg4kd" (OuterVolumeSpecName: "kube-api-access-qg4kd") pod "893dfdf0-66ea-493f-96c0-d3b7600c2f29" (UID: "893dfdf0-66ea-493f-96c0-d3b7600c2f29"). InnerVolumeSpecName "kube-api-access-qg4kd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 01:18:59 crc kubenswrapper[4744]: I0311 01:18:59.694898 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/893dfdf0-66ea-493f-96c0-d3b7600c2f29-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "893dfdf0-66ea-493f-96c0-d3b7600c2f29" (UID: "893dfdf0-66ea-493f-96c0-d3b7600c2f29"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 11 01:18:59 crc kubenswrapper[4744]: I0311 01:18:59.715348 4744 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/893dfdf0-66ea-493f-96c0-d3b7600c2f29-utilities\") on node \"crc\" DevicePath \"\"" Mar 11 01:18:59 crc kubenswrapper[4744]: I0311 01:18:59.715395 4744 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/893dfdf0-66ea-493f-96c0-d3b7600c2f29-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 11 01:18:59 crc kubenswrapper[4744]: I0311 01:18:59.715414 4744 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qg4kd\" (UniqueName: \"kubernetes.io/projected/893dfdf0-66ea-493f-96c0-d3b7600c2f29-kube-api-access-qg4kd\") on node \"crc\" DevicePath \"\"" Mar 11 01:19:00 crc kubenswrapper[4744]: I0311 01:19:00.005749 4744 generic.go:334] "Generic (PLEG): container finished" podID="893dfdf0-66ea-493f-96c0-d3b7600c2f29" containerID="e8277b7598cd7acfdfc39b874aedea1bd9443b71ee0dfbbca9f082205cdc4fa2" exitCode=0 Mar 11 01:19:00 crc kubenswrapper[4744]: I0311 01:19:00.005847 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-8t6v4" event={"ID":"893dfdf0-66ea-493f-96c0-d3b7600c2f29","Type":"ContainerDied","Data":"e8277b7598cd7acfdfc39b874aedea1bd9443b71ee0dfbbca9f082205cdc4fa2"} Mar 11 01:19:00 crc kubenswrapper[4744]: I0311 01:19:00.005912 4744 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openshift-marketplace/certified-operators-8t6v4" event={"ID":"893dfdf0-66ea-493f-96c0-d3b7600c2f29","Type":"ContainerDied","Data":"cc4915587d7f9b5f597eb68fb9db8e61460dd26ae1c0b1971cbfa832d26575b0"} Mar 11 01:19:00 crc kubenswrapper[4744]: I0311 01:19:00.005941 4744 scope.go:117] "RemoveContainer" containerID="e8277b7598cd7acfdfc39b874aedea1bd9443b71ee0dfbbca9f082205cdc4fa2" Mar 11 01:19:00 crc kubenswrapper[4744]: I0311 01:19:00.006119 4744 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-8t6v4" Mar 11 01:19:00 crc kubenswrapper[4744]: I0311 01:19:00.013085 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"33c0b5dd-192a-4fd2-bbaa-b483399724df","Type":"ContainerStarted","Data":"3d1ff7ea2ca1a4a7692ee87d6f5ba883dc249a72e783a373c386078d938baf27"} Mar 11 01:19:00 crc kubenswrapper[4744]: I0311 01:19:00.052842 4744 scope.go:117] "RemoveContainer" containerID="9571aed871c9082712882539dff0a4de76e948e81bcb7a9aa9bd78edc67bc258" Mar 11 01:19:00 crc kubenswrapper[4744]: I0311 01:19:00.080244 4744 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-8t6v4"] Mar 11 01:19:00 crc kubenswrapper[4744]: I0311 01:19:00.091504 4744 scope.go:117] "RemoveContainer" containerID="072199f6437095b34cc813fa157bd24ee274043696ab6e89cfb6111c53701759" Mar 11 01:19:00 crc kubenswrapper[4744]: I0311 01:19:00.096058 4744 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-8t6v4"] Mar 11 01:19:00 crc kubenswrapper[4744]: I0311 01:19:00.143893 4744 scope.go:117] "RemoveContainer" containerID="e8277b7598cd7acfdfc39b874aedea1bd9443b71ee0dfbbca9f082205cdc4fa2" Mar 11 01:19:00 crc kubenswrapper[4744]: E0311 01:19:00.144460 4744 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"e8277b7598cd7acfdfc39b874aedea1bd9443b71ee0dfbbca9f082205cdc4fa2\": container with ID starting with e8277b7598cd7acfdfc39b874aedea1bd9443b71ee0dfbbca9f082205cdc4fa2 not found: ID does not exist" containerID="e8277b7598cd7acfdfc39b874aedea1bd9443b71ee0dfbbca9f082205cdc4fa2" Mar 11 01:19:00 crc kubenswrapper[4744]: I0311 01:19:00.144492 4744 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e8277b7598cd7acfdfc39b874aedea1bd9443b71ee0dfbbca9f082205cdc4fa2"} err="failed to get container status \"e8277b7598cd7acfdfc39b874aedea1bd9443b71ee0dfbbca9f082205cdc4fa2\": rpc error: code = NotFound desc = could not find container \"e8277b7598cd7acfdfc39b874aedea1bd9443b71ee0dfbbca9f082205cdc4fa2\": container with ID starting with e8277b7598cd7acfdfc39b874aedea1bd9443b71ee0dfbbca9f082205cdc4fa2 not found: ID does not exist" Mar 11 01:19:00 crc kubenswrapper[4744]: I0311 01:19:00.144529 4744 scope.go:117] "RemoveContainer" containerID="9571aed871c9082712882539dff0a4de76e948e81bcb7a9aa9bd78edc67bc258" Mar 11 01:19:00 crc kubenswrapper[4744]: E0311 01:19:00.145458 4744 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9571aed871c9082712882539dff0a4de76e948e81bcb7a9aa9bd78edc67bc258\": container with ID starting with 9571aed871c9082712882539dff0a4de76e948e81bcb7a9aa9bd78edc67bc258 not found: ID does not exist" containerID="9571aed871c9082712882539dff0a4de76e948e81bcb7a9aa9bd78edc67bc258" Mar 11 01:19:00 crc kubenswrapper[4744]: I0311 01:19:00.145504 4744 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9571aed871c9082712882539dff0a4de76e948e81bcb7a9aa9bd78edc67bc258"} err="failed to get container status \"9571aed871c9082712882539dff0a4de76e948e81bcb7a9aa9bd78edc67bc258\": rpc error: code = NotFound desc = could not find container \"9571aed871c9082712882539dff0a4de76e948e81bcb7a9aa9bd78edc67bc258\": container with ID 
starting with 9571aed871c9082712882539dff0a4de76e948e81bcb7a9aa9bd78edc67bc258 not found: ID does not exist" Mar 11 01:19:00 crc kubenswrapper[4744]: I0311 01:19:00.145551 4744 scope.go:117] "RemoveContainer" containerID="072199f6437095b34cc813fa157bd24ee274043696ab6e89cfb6111c53701759" Mar 11 01:19:00 crc kubenswrapper[4744]: E0311 01:19:00.146069 4744 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"072199f6437095b34cc813fa157bd24ee274043696ab6e89cfb6111c53701759\": container with ID starting with 072199f6437095b34cc813fa157bd24ee274043696ab6e89cfb6111c53701759 not found: ID does not exist" containerID="072199f6437095b34cc813fa157bd24ee274043696ab6e89cfb6111c53701759" Mar 11 01:19:00 crc kubenswrapper[4744]: I0311 01:19:00.146162 4744 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"072199f6437095b34cc813fa157bd24ee274043696ab6e89cfb6111c53701759"} err="failed to get container status \"072199f6437095b34cc813fa157bd24ee274043696ab6e89cfb6111c53701759\": rpc error: code = NotFound desc = could not find container \"072199f6437095b34cc813fa157bd24ee274043696ab6e89cfb6111c53701759\": container with ID starting with 072199f6437095b34cc813fa157bd24ee274043696ab6e89cfb6111c53701759 not found: ID does not exist" Mar 11 01:19:00 crc kubenswrapper[4744]: E0311 01:19:00.256434 4744 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod893dfdf0_66ea_493f_96c0_d3b7600c2f29.slice\": RecentStats: unable to find data in memory cache]" Mar 11 01:19:01 crc kubenswrapper[4744]: I0311 01:19:01.996655 4744 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="893dfdf0-66ea-493f-96c0-d3b7600c2f29" path="/var/lib/kubelet/pods/893dfdf0-66ea-493f-96c0-d3b7600c2f29/volumes" Mar 11 01:19:02 crc kubenswrapper[4744]: I0311 01:19:02.043484 
4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"33c0b5dd-192a-4fd2-bbaa-b483399724df","Type":"ContainerStarted","Data":"2e90fe156899af91f60cc58374a4704b928fb93dc5a2b8a016f190e0a0897fe6"} Mar 11 01:19:02 crc kubenswrapper[4744]: I0311 01:19:02.043779 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Mar 11 01:19:02 crc kubenswrapper[4744]: I0311 01:19:02.079836 4744 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.958914541 podStartE2EDuration="7.079803604s" podCreationTimestamp="2026-03-11 01:18:55 +0000 UTC" firstStartedPulling="2026-03-11 01:18:56.974833136 +0000 UTC m=+1493.779050771" lastFinishedPulling="2026-03-11 01:19:01.095722219 +0000 UTC m=+1497.899939834" observedRunningTime="2026-03-11 01:19:02.068942838 +0000 UTC m=+1498.873160513" watchObservedRunningTime="2026-03-11 01:19:02.079803604 +0000 UTC m=+1498.884021279" Mar 11 01:19:03 crc kubenswrapper[4744]: I0311 01:19:03.540632 4744 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Mar 11 01:19:03 crc kubenswrapper[4744]: I0311 01:19:03.541089 4744 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Mar 11 01:19:03 crc kubenswrapper[4744]: I0311 01:19:03.555182 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Mar 11 01:19:03 crc kubenswrapper[4744]: I0311 01:19:03.559451 4744 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Mar 11 01:19:03 crc kubenswrapper[4744]: I0311 01:19:03.560396 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Mar 11 01:19:03 crc kubenswrapper[4744]: I0311 01:19:03.568312 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openstack/nova-api-0" Mar 11 01:19:03 crc kubenswrapper[4744]: I0311 01:19:03.577213 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Mar 11 01:19:03 crc kubenswrapper[4744]: I0311 01:19:03.594464 4744 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Mar 11 01:19:04 crc kubenswrapper[4744]: I0311 01:19:04.066669 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Mar 11 01:19:04 crc kubenswrapper[4744]: I0311 01:19:04.073153 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Mar 11 01:19:12 crc kubenswrapper[4744]: I0311 01:19:12.409499 4744 patch_prober.go:28] interesting pod/machine-config-daemon-678nx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 11 01:19:12 crc kubenswrapper[4744]: I0311 01:19:12.410224 4744 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-678nx" podUID="a15dc7ac-7c34-4135-b6eb-a85122800ce9" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 11 01:19:26 crc kubenswrapper[4744]: I0311 01:19:26.431485 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0" Mar 11 01:19:34 crc kubenswrapper[4744]: I0311 01:19:34.162843 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-qnrv8"] Mar 11 01:19:34 crc kubenswrapper[4744]: E0311 01:19:34.164050 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="893dfdf0-66ea-493f-96c0-d3b7600c2f29" containerName="extract-utilities" Mar 11 01:19:34 crc 
kubenswrapper[4744]: I0311 01:19:34.164072 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="893dfdf0-66ea-493f-96c0-d3b7600c2f29" containerName="extract-utilities" Mar 11 01:19:34 crc kubenswrapper[4744]: E0311 01:19:34.164119 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="893dfdf0-66ea-493f-96c0-d3b7600c2f29" containerName="registry-server" Mar 11 01:19:34 crc kubenswrapper[4744]: I0311 01:19:34.164132 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="893dfdf0-66ea-493f-96c0-d3b7600c2f29" containerName="registry-server" Mar 11 01:19:34 crc kubenswrapper[4744]: E0311 01:19:34.164158 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="893dfdf0-66ea-493f-96c0-d3b7600c2f29" containerName="extract-content" Mar 11 01:19:34 crc kubenswrapper[4744]: I0311 01:19:34.164174 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="893dfdf0-66ea-493f-96c0-d3b7600c2f29" containerName="extract-content" Mar 11 01:19:34 crc kubenswrapper[4744]: I0311 01:19:34.164553 4744 memory_manager.go:354] "RemoveStaleState removing state" podUID="893dfdf0-66ea-493f-96c0-d3b7600c2f29" containerName="registry-server" Mar 11 01:19:34 crc kubenswrapper[4744]: I0311 01:19:34.166862 4744 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-qnrv8" Mar 11 01:19:34 crc kubenswrapper[4744]: I0311 01:19:34.176323 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-qnrv8"] Mar 11 01:19:34 crc kubenswrapper[4744]: I0311 01:19:34.260364 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7sqhr\" (UniqueName: \"kubernetes.io/projected/03b55bf1-d3fe-4ba9-bc5b-3ac4cf87e4f8-kube-api-access-7sqhr\") pod \"redhat-operators-qnrv8\" (UID: \"03b55bf1-d3fe-4ba9-bc5b-3ac4cf87e4f8\") " pod="openshift-marketplace/redhat-operators-qnrv8" Mar 11 01:19:34 crc kubenswrapper[4744]: I0311 01:19:34.260530 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/03b55bf1-d3fe-4ba9-bc5b-3ac4cf87e4f8-utilities\") pod \"redhat-operators-qnrv8\" (UID: \"03b55bf1-d3fe-4ba9-bc5b-3ac4cf87e4f8\") " pod="openshift-marketplace/redhat-operators-qnrv8" Mar 11 01:19:34 crc kubenswrapper[4744]: I0311 01:19:34.260568 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/03b55bf1-d3fe-4ba9-bc5b-3ac4cf87e4f8-catalog-content\") pod \"redhat-operators-qnrv8\" (UID: \"03b55bf1-d3fe-4ba9-bc5b-3ac4cf87e4f8\") " pod="openshift-marketplace/redhat-operators-qnrv8" Mar 11 01:19:34 crc kubenswrapper[4744]: I0311 01:19:34.362392 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7sqhr\" (UniqueName: \"kubernetes.io/projected/03b55bf1-d3fe-4ba9-bc5b-3ac4cf87e4f8-kube-api-access-7sqhr\") pod \"redhat-operators-qnrv8\" (UID: \"03b55bf1-d3fe-4ba9-bc5b-3ac4cf87e4f8\") " pod="openshift-marketplace/redhat-operators-qnrv8" Mar 11 01:19:34 crc kubenswrapper[4744]: I0311 01:19:34.362725 4744 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/03b55bf1-d3fe-4ba9-bc5b-3ac4cf87e4f8-utilities\") pod \"redhat-operators-qnrv8\" (UID: \"03b55bf1-d3fe-4ba9-bc5b-3ac4cf87e4f8\") " pod="openshift-marketplace/redhat-operators-qnrv8" Mar 11 01:19:34 crc kubenswrapper[4744]: I0311 01:19:34.362828 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/03b55bf1-d3fe-4ba9-bc5b-3ac4cf87e4f8-catalog-content\") pod \"redhat-operators-qnrv8\" (UID: \"03b55bf1-d3fe-4ba9-bc5b-3ac4cf87e4f8\") " pod="openshift-marketplace/redhat-operators-qnrv8" Mar 11 01:19:34 crc kubenswrapper[4744]: I0311 01:19:34.363252 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/03b55bf1-d3fe-4ba9-bc5b-3ac4cf87e4f8-utilities\") pod \"redhat-operators-qnrv8\" (UID: \"03b55bf1-d3fe-4ba9-bc5b-3ac4cf87e4f8\") " pod="openshift-marketplace/redhat-operators-qnrv8" Mar 11 01:19:34 crc kubenswrapper[4744]: I0311 01:19:34.363895 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/03b55bf1-d3fe-4ba9-bc5b-3ac4cf87e4f8-catalog-content\") pod \"redhat-operators-qnrv8\" (UID: \"03b55bf1-d3fe-4ba9-bc5b-3ac4cf87e4f8\") " pod="openshift-marketplace/redhat-operators-qnrv8" Mar 11 01:19:34 crc kubenswrapper[4744]: I0311 01:19:34.406098 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7sqhr\" (UniqueName: \"kubernetes.io/projected/03b55bf1-d3fe-4ba9-bc5b-3ac4cf87e4f8-kube-api-access-7sqhr\") pod \"redhat-operators-qnrv8\" (UID: \"03b55bf1-d3fe-4ba9-bc5b-3ac4cf87e4f8\") " pod="openshift-marketplace/redhat-operators-qnrv8" Mar 11 01:19:34 crc kubenswrapper[4744]: I0311 01:19:34.491727 4744 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-qnrv8" Mar 11 01:19:34 crc kubenswrapper[4744]: W0311 01:19:34.991261 4744 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod03b55bf1_d3fe_4ba9_bc5b_3ac4cf87e4f8.slice/crio-6791e2dbcf7890a4ea4e236b98c40266f3bf25278f27196841bea8157595304e WatchSource:0}: Error finding container 6791e2dbcf7890a4ea4e236b98c40266f3bf25278f27196841bea8157595304e: Status 404 returned error can't find the container with id 6791e2dbcf7890a4ea4e236b98c40266f3bf25278f27196841bea8157595304e Mar 11 01:19:35 crc kubenswrapper[4744]: I0311 01:19:35.000698 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-qnrv8"] Mar 11 01:19:35 crc kubenswrapper[4744]: I0311 01:19:35.411069 4744 generic.go:334] "Generic (PLEG): container finished" podID="03b55bf1-d3fe-4ba9-bc5b-3ac4cf87e4f8" containerID="6c5c4ca7e44bce16a98eebd9c3a855e4b799a3c001bf897d0cecf3bf6bb06fdc" exitCode=0 Mar 11 01:19:35 crc kubenswrapper[4744]: I0311 01:19:35.411142 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-qnrv8" event={"ID":"03b55bf1-d3fe-4ba9-bc5b-3ac4cf87e4f8","Type":"ContainerDied","Data":"6c5c4ca7e44bce16a98eebd9c3a855e4b799a3c001bf897d0cecf3bf6bb06fdc"} Mar 11 01:19:35 crc kubenswrapper[4744]: I0311 01:19:35.411333 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-qnrv8" event={"ID":"03b55bf1-d3fe-4ba9-bc5b-3ac4cf87e4f8","Type":"ContainerStarted","Data":"6791e2dbcf7890a4ea4e236b98c40266f3bf25278f27196841bea8157595304e"} Mar 11 01:19:35 crc kubenswrapper[4744]: I0311 01:19:35.413164 4744 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 11 01:19:36 crc kubenswrapper[4744]: I0311 01:19:36.425033 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/redhat-operators-qnrv8" event={"ID":"03b55bf1-d3fe-4ba9-bc5b-3ac4cf87e4f8","Type":"ContainerStarted","Data":"2bedcd4d8553a08b966348929afde0364ddec647eddeed161dba7e9f9765b64f"} Mar 11 01:19:37 crc kubenswrapper[4744]: I0311 01:19:37.442986 4744 generic.go:334] "Generic (PLEG): container finished" podID="03b55bf1-d3fe-4ba9-bc5b-3ac4cf87e4f8" containerID="2bedcd4d8553a08b966348929afde0364ddec647eddeed161dba7e9f9765b64f" exitCode=0 Mar 11 01:19:37 crc kubenswrapper[4744]: I0311 01:19:37.443062 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-qnrv8" event={"ID":"03b55bf1-d3fe-4ba9-bc5b-3ac4cf87e4f8","Type":"ContainerDied","Data":"2bedcd4d8553a08b966348929afde0364ddec647eddeed161dba7e9f9765b64f"} Mar 11 01:19:38 crc kubenswrapper[4744]: I0311 01:19:38.455846 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-qnrv8" event={"ID":"03b55bf1-d3fe-4ba9-bc5b-3ac4cf87e4f8","Type":"ContainerStarted","Data":"7fe48392686fc634dabbfe9990397d2cb61cef30311e93bee85265aa5ef9c32a"} Mar 11 01:19:38 crc kubenswrapper[4744]: I0311 01:19:38.482675 4744 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-qnrv8" podStartSLOduration=1.9881780980000001 podStartE2EDuration="4.482646937s" podCreationTimestamp="2026-03-11 01:19:34 +0000 UTC" firstStartedPulling="2026-03-11 01:19:35.412935836 +0000 UTC m=+1532.217153431" lastFinishedPulling="2026-03-11 01:19:37.907404655 +0000 UTC m=+1534.711622270" observedRunningTime="2026-03-11 01:19:38.476414064 +0000 UTC m=+1535.280631709" watchObservedRunningTime="2026-03-11 01:19:38.482646937 +0000 UTC m=+1535.286864582" Mar 11 01:19:42 crc kubenswrapper[4744]: I0311 01:19:42.409315 4744 patch_prober.go:28] interesting pod/machine-config-daemon-678nx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get 
\"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 11 01:19:42 crc kubenswrapper[4744]: I0311 01:19:42.410856 4744 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-678nx" podUID="a15dc7ac-7c34-4135-b6eb-a85122800ce9" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 11 01:19:44 crc kubenswrapper[4744]: I0311 01:19:44.492741 4744 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-qnrv8" Mar 11 01:19:44 crc kubenswrapper[4744]: I0311 01:19:44.493250 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-qnrv8" Mar 11 01:19:45 crc kubenswrapper[4744]: I0311 01:19:45.567007 4744 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-qnrv8" podUID="03b55bf1-d3fe-4ba9-bc5b-3ac4cf87e4f8" containerName="registry-server" probeResult="failure" output=< Mar 11 01:19:45 crc kubenswrapper[4744]: timeout: failed to connect service ":50051" within 1s Mar 11 01:19:45 crc kubenswrapper[4744]: > Mar 11 01:19:46 crc kubenswrapper[4744]: I0311 01:19:46.716682 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-mw9fv"] Mar 11 01:19:46 crc kubenswrapper[4744]: I0311 01:19:46.719389 4744 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-mw9fv" Mar 11 01:19:46 crc kubenswrapper[4744]: I0311 01:19:46.751203 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-mw9fv"] Mar 11 01:19:46 crc kubenswrapper[4744]: I0311 01:19:46.805973 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e910960b-a434-4830-b4be-96571fa4dd54-catalog-content\") pod \"redhat-marketplace-mw9fv\" (UID: \"e910960b-a434-4830-b4be-96571fa4dd54\") " pod="openshift-marketplace/redhat-marketplace-mw9fv" Mar 11 01:19:46 crc kubenswrapper[4744]: I0311 01:19:46.806038 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e910960b-a434-4830-b4be-96571fa4dd54-utilities\") pod \"redhat-marketplace-mw9fv\" (UID: \"e910960b-a434-4830-b4be-96571fa4dd54\") " pod="openshift-marketplace/redhat-marketplace-mw9fv" Mar 11 01:19:46 crc kubenswrapper[4744]: I0311 01:19:46.806101 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4n27k\" (UniqueName: \"kubernetes.io/projected/e910960b-a434-4830-b4be-96571fa4dd54-kube-api-access-4n27k\") pod \"redhat-marketplace-mw9fv\" (UID: \"e910960b-a434-4830-b4be-96571fa4dd54\") " pod="openshift-marketplace/redhat-marketplace-mw9fv" Mar 11 01:19:46 crc kubenswrapper[4744]: I0311 01:19:46.906542 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e910960b-a434-4830-b4be-96571fa4dd54-utilities\") pod \"redhat-marketplace-mw9fv\" (UID: \"e910960b-a434-4830-b4be-96571fa4dd54\") " pod="openshift-marketplace/redhat-marketplace-mw9fv" Mar 11 01:19:46 crc kubenswrapper[4744]: I0311 01:19:46.906823 4744 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"kube-api-access-4n27k\" (UniqueName: \"kubernetes.io/projected/e910960b-a434-4830-b4be-96571fa4dd54-kube-api-access-4n27k\") pod \"redhat-marketplace-mw9fv\" (UID: \"e910960b-a434-4830-b4be-96571fa4dd54\") " pod="openshift-marketplace/redhat-marketplace-mw9fv" Mar 11 01:19:46 crc kubenswrapper[4744]: I0311 01:19:46.906929 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e910960b-a434-4830-b4be-96571fa4dd54-catalog-content\") pod \"redhat-marketplace-mw9fv\" (UID: \"e910960b-a434-4830-b4be-96571fa4dd54\") " pod="openshift-marketplace/redhat-marketplace-mw9fv" Mar 11 01:19:46 crc kubenswrapper[4744]: I0311 01:19:46.907211 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e910960b-a434-4830-b4be-96571fa4dd54-utilities\") pod \"redhat-marketplace-mw9fv\" (UID: \"e910960b-a434-4830-b4be-96571fa4dd54\") " pod="openshift-marketplace/redhat-marketplace-mw9fv" Mar 11 01:19:46 crc kubenswrapper[4744]: I0311 01:19:46.907269 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e910960b-a434-4830-b4be-96571fa4dd54-catalog-content\") pod \"redhat-marketplace-mw9fv\" (UID: \"e910960b-a434-4830-b4be-96571fa4dd54\") " pod="openshift-marketplace/redhat-marketplace-mw9fv" Mar 11 01:19:46 crc kubenswrapper[4744]: I0311 01:19:46.931483 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4n27k\" (UniqueName: \"kubernetes.io/projected/e910960b-a434-4830-b4be-96571fa4dd54-kube-api-access-4n27k\") pod \"redhat-marketplace-mw9fv\" (UID: \"e910960b-a434-4830-b4be-96571fa4dd54\") " pod="openshift-marketplace/redhat-marketplace-mw9fv" Mar 11 01:19:47 crc kubenswrapper[4744]: I0311 01:19:47.049745 4744 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-mw9fv" Mar 11 01:19:47 crc kubenswrapper[4744]: I0311 01:19:47.337347 4744 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/root-account-create-update-vw5rd"] Mar 11 01:19:47 crc kubenswrapper[4744]: I0311 01:19:47.366635 4744 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/root-account-create-update-vw5rd"] Mar 11 01:19:47 crc kubenswrapper[4744]: I0311 01:19:47.376347 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/root-account-create-update-rplmm"] Mar 11 01:19:47 crc kubenswrapper[4744]: I0311 01:19:47.377409 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-rplmm" Mar 11 01:19:47 crc kubenswrapper[4744]: I0311 01:19:47.384789 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-mariadb-root-db-secret" Mar 11 01:19:47 crc kubenswrapper[4744]: I0311 01:19:47.414057 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-rplmm"] Mar 11 01:19:47 crc kubenswrapper[4744]: I0311 01:19:47.453585 4744 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-5453-account-create-update-69w9d"] Mar 11 01:19:47 crc kubenswrapper[4744]: I0311 01:19:47.475488 4744 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/openstackclient"] Mar 11 01:19:47 crc kubenswrapper[4744]: I0311 01:19:47.475838 4744 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/openstackclient" podUID="35ec702b-4aa6-4fa6-a770-ec3caf762d5f" containerName="openstackclient" containerID="cri-o://fa97b985b022afa1a47a024d8d1193a6165049de48146766011f761fe39d7dce" gracePeriod=2 Mar 11 01:19:47 crc kubenswrapper[4744]: I0311 01:19:47.533136 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m9b57\" (UniqueName: 
\"kubernetes.io/projected/7f3aa5cc-eae2-4d60-96cf-6d847a5599ec-kube-api-access-m9b57\") pod \"root-account-create-update-rplmm\" (UID: \"7f3aa5cc-eae2-4d60-96cf-6d847a5599ec\") " pod="openstack/root-account-create-update-rplmm" Mar 11 01:19:47 crc kubenswrapper[4744]: I0311 01:19:47.533410 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7f3aa5cc-eae2-4d60-96cf-6d847a5599ec-operator-scripts\") pod \"root-account-create-update-rplmm\" (UID: \"7f3aa5cc-eae2-4d60-96cf-6d847a5599ec\") " pod="openstack/root-account-create-update-rplmm" Mar 11 01:19:47 crc kubenswrapper[4744]: I0311 01:19:47.546573 4744 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-5453-account-create-update-69w9d"] Mar 11 01:19:47 crc kubenswrapper[4744]: I0311 01:19:47.606773 4744 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/openstackclient"] Mar 11 01:19:47 crc kubenswrapper[4744]: I0311 01:19:47.637479 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7f3aa5cc-eae2-4d60-96cf-6d847a5599ec-operator-scripts\") pod \"root-account-create-update-rplmm\" (UID: \"7f3aa5cc-eae2-4d60-96cf-6d847a5599ec\") " pod="openstack/root-account-create-update-rplmm" Mar 11 01:19:47 crc kubenswrapper[4744]: I0311 01:19:47.637833 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m9b57\" (UniqueName: \"kubernetes.io/projected/7f3aa5cc-eae2-4d60-96cf-6d847a5599ec-kube-api-access-m9b57\") pod \"root-account-create-update-rplmm\" (UID: \"7f3aa5cc-eae2-4d60-96cf-6d847a5599ec\") " pod="openstack/root-account-create-update-rplmm" Mar 11 01:19:47 crc kubenswrapper[4744]: I0311 01:19:47.639017 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: 
\"kubernetes.io/configmap/7f3aa5cc-eae2-4d60-96cf-6d847a5599ec-operator-scripts\") pod \"root-account-create-update-rplmm\" (UID: \"7f3aa5cc-eae2-4d60-96cf-6d847a5599ec\") " pod="openstack/root-account-create-update-rplmm" Mar 11 01:19:47 crc kubenswrapper[4744]: I0311 01:19:47.643600 4744 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-95dc-account-create-update-q7hx6"] Mar 11 01:19:47 crc kubenswrapper[4744]: I0311 01:19:47.652658 4744 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-95dc-account-create-update-q7hx6"] Mar 11 01:19:47 crc kubenswrapper[4744]: I0311 01:19:47.671863 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-5453-account-create-update-q47sg"] Mar 11 01:19:47 crc kubenswrapper[4744]: E0311 01:19:47.672266 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="35ec702b-4aa6-4fa6-a770-ec3caf762d5f" containerName="openstackclient" Mar 11 01:19:47 crc kubenswrapper[4744]: I0311 01:19:47.672277 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="35ec702b-4aa6-4fa6-a770-ec3caf762d5f" containerName="openstackclient" Mar 11 01:19:47 crc kubenswrapper[4744]: I0311 01:19:47.672434 4744 memory_manager.go:354] "RemoveStaleState removing state" podUID="35ec702b-4aa6-4fa6-a770-ec3caf762d5f" containerName="openstackclient" Mar 11 01:19:47 crc kubenswrapper[4744]: I0311 01:19:47.673077 4744 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-5453-account-create-update-q47sg" Mar 11 01:19:47 crc kubenswrapper[4744]: I0311 01:19:47.687305 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m9b57\" (UniqueName: \"kubernetes.io/projected/7f3aa5cc-eae2-4d60-96cf-6d847a5599ec-kube-api-access-m9b57\") pod \"root-account-create-update-rplmm\" (UID: \"7f3aa5cc-eae2-4d60-96cf-6d847a5599ec\") " pod="openstack/root-account-create-update-rplmm" Mar 11 01:19:47 crc kubenswrapper[4744]: I0311 01:19:47.699808 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-db-secret" Mar 11 01:19:47 crc kubenswrapper[4744]: I0311 01:19:47.704894 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-5453-account-create-update-q47sg"] Mar 11 01:19:47 crc kubenswrapper[4744]: I0311 01:19:47.789057 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-95dc-account-create-update-x25mp"] Mar 11 01:19:47 crc kubenswrapper[4744]: I0311 01:19:47.790276 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-95dc-account-create-update-x25mp" Mar 11 01:19:47 crc kubenswrapper[4744]: I0311 01:19:47.793415 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-db-secret" Mar 11 01:19:47 crc kubenswrapper[4744]: I0311 01:19:47.794645 4744 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-rplmm" Mar 11 01:19:47 crc kubenswrapper[4744]: I0311 01:19:47.833172 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-95dc-account-create-update-x25mp"] Mar 11 01:19:47 crc kubenswrapper[4744]: I0311 01:19:47.851490 4744 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-metrics-nrcjs"] Mar 11 01:19:47 crc kubenswrapper[4744]: I0311 01:19:47.851731 4744 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ovn-controller-metrics-nrcjs" podUID="ebd1c76c-75f8-411f-9350-a0e31f1721cd" containerName="openstack-network-exporter" containerID="cri-o://a76d8dbc969a328faa9315afa2eb3f3d73314211ba9392a01e5df03bb7391b1e" gracePeriod=30 Mar 11 01:19:47 crc kubenswrapper[4744]: I0311 01:19:47.854716 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d8b9e96a-e188-4f50-b2a8-95729300c2d3-operator-scripts\") pod \"nova-api-95dc-account-create-update-x25mp\" (UID: \"d8b9e96a-e188-4f50-b2a8-95729300c2d3\") " pod="openstack/nova-api-95dc-account-create-update-x25mp" Mar 11 01:19:47 crc kubenswrapper[4744]: I0311 01:19:47.854766 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wfc6b\" (UniqueName: \"kubernetes.io/projected/d8b9e96a-e188-4f50-b2a8-95729300c2d3-kube-api-access-wfc6b\") pod \"nova-api-95dc-account-create-update-x25mp\" (UID: \"d8b9e96a-e188-4f50-b2a8-95729300c2d3\") " pod="openstack/nova-api-95dc-account-create-update-x25mp" Mar 11 01:19:47 crc kubenswrapper[4744]: I0311 01:19:47.854824 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m5jhv\" (UniqueName: \"kubernetes.io/projected/4ed881ae-cd59-4830-99e9-34ed7708ed83-kube-api-access-m5jhv\") pod 
\"barbican-5453-account-create-update-q47sg\" (UID: \"4ed881ae-cd59-4830-99e9-34ed7708ed83\") " pod="openstack/barbican-5453-account-create-update-q47sg" Mar 11 01:19:47 crc kubenswrapper[4744]: I0311 01:19:47.854889 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4ed881ae-cd59-4830-99e9-34ed7708ed83-operator-scripts\") pod \"barbican-5453-account-create-update-q47sg\" (UID: \"4ed881ae-cd59-4830-99e9-34ed7708ed83\") " pod="openstack/barbican-5453-account-create-update-q47sg" Mar 11 01:19:47 crc kubenswrapper[4744]: I0311 01:19:47.877906 4744 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-ovs-88ffp"] Mar 11 01:19:47 crc kubenswrapper[4744]: I0311 01:19:47.915331 4744 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-2mjl7"] Mar 11 01:19:47 crc kubenswrapper[4744]: I0311 01:19:47.921546 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Mar 11 01:19:47 crc kubenswrapper[4744]: I0311 01:19:47.946217 4744 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-876b-account-create-update-srkst"] Mar 11 01:19:47 crc kubenswrapper[4744]: I0311 01:19:47.958859 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m5jhv\" (UniqueName: \"kubernetes.io/projected/4ed881ae-cd59-4830-99e9-34ed7708ed83-kube-api-access-m5jhv\") pod \"barbican-5453-account-create-update-q47sg\" (UID: \"4ed881ae-cd59-4830-99e9-34ed7708ed83\") " pod="openstack/barbican-5453-account-create-update-q47sg" Mar 11 01:19:47 crc kubenswrapper[4744]: I0311 01:19:47.958940 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4ed881ae-cd59-4830-99e9-34ed7708ed83-operator-scripts\") pod \"barbican-5453-account-create-update-q47sg\" (UID: 
\"4ed881ae-cd59-4830-99e9-34ed7708ed83\") " pod="openstack/barbican-5453-account-create-update-q47sg" Mar 11 01:19:47 crc kubenswrapper[4744]: I0311 01:19:47.959819 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d8b9e96a-e188-4f50-b2a8-95729300c2d3-operator-scripts\") pod \"nova-api-95dc-account-create-update-x25mp\" (UID: \"d8b9e96a-e188-4f50-b2a8-95729300c2d3\") " pod="openstack/nova-api-95dc-account-create-update-x25mp" Mar 11 01:19:47 crc kubenswrapper[4744]: I0311 01:19:47.959869 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wfc6b\" (UniqueName: \"kubernetes.io/projected/d8b9e96a-e188-4f50-b2a8-95729300c2d3-kube-api-access-wfc6b\") pod \"nova-api-95dc-account-create-update-x25mp\" (UID: \"d8b9e96a-e188-4f50-b2a8-95729300c2d3\") " pod="openstack/nova-api-95dc-account-create-update-x25mp" Mar 11 01:19:47 crc kubenswrapper[4744]: I0311 01:19:47.960854 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4ed881ae-cd59-4830-99e9-34ed7708ed83-operator-scripts\") pod \"barbican-5453-account-create-update-q47sg\" (UID: \"4ed881ae-cd59-4830-99e9-34ed7708ed83\") " pod="openstack/barbican-5453-account-create-update-q47sg" Mar 11 01:19:47 crc kubenswrapper[4744]: I0311 01:19:47.961285 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d8b9e96a-e188-4f50-b2a8-95729300c2d3-operator-scripts\") pod \"nova-api-95dc-account-create-update-x25mp\" (UID: \"d8b9e96a-e188-4f50-b2a8-95729300c2d3\") " pod="openstack/nova-api-95dc-account-create-update-x25mp" Mar 11 01:19:47 crc kubenswrapper[4744]: I0311 01:19:47.981677 4744 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-876b-account-create-update-srkst"] Mar 11 01:19:48 crc kubenswrapper[4744]: I0311 
01:19:48.001331 4744 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4e05323b-e6d6-49f9-8cac-1fa036a98097" path="/var/lib/kubelet/pods/4e05323b-e6d6-49f9-8cac-1fa036a98097/volumes" Mar 11 01:19:48 crc kubenswrapper[4744]: I0311 01:19:48.001995 4744 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="526978e0-6809-4f61-863a-c7c1c54a7507" path="/var/lib/kubelet/pods/526978e0-6809-4f61-863a-c7c1c54a7507/volumes" Mar 11 01:19:48 crc kubenswrapper[4744]: I0311 01:19:48.005078 4744 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c55d174a-b4e5-4c03-a180-b93ba3d49f1a" path="/var/lib/kubelet/pods/c55d174a-b4e5-4c03-a180-b93ba3d49f1a/volumes" Mar 11 01:19:48 crc kubenswrapper[4744]: I0311 01:19:48.005777 4744 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fbe2524a-0c58-4294-b9e2-640fe0b5d294" path="/var/lib/kubelet/pods/fbe2524a-0c58-4294-b9e2-640fe0b5d294/volumes" Mar 11 01:19:48 crc kubenswrapper[4744]: I0311 01:19:48.024729 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wfc6b\" (UniqueName: \"kubernetes.io/projected/d8b9e96a-e188-4f50-b2a8-95729300c2d3-kube-api-access-wfc6b\") pod \"nova-api-95dc-account-create-update-x25mp\" (UID: \"d8b9e96a-e188-4f50-b2a8-95729300c2d3\") " pod="openstack/nova-api-95dc-account-create-update-x25mp" Mar 11 01:19:48 crc kubenswrapper[4744]: I0311 01:19:48.065026 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-b08a-account-create-update-7dvx8"] Mar 11 01:19:48 crc kubenswrapper[4744]: I0311 01:19:48.067145 4744 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-b08a-account-create-update-7dvx8" Mar 11 01:19:48 crc kubenswrapper[4744]: I0311 01:19:48.069399 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-db-secret" Mar 11 01:19:48 crc kubenswrapper[4744]: E0311 01:19:48.075989 4744 configmap.go:193] Couldn't get configMap openstack/rabbitmq-config-data: configmap "rabbitmq-config-data" not found Mar 11 01:19:48 crc kubenswrapper[4744]: E0311 01:19:48.076055 4744 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/fb920bf7-eae5-4f7a-9af7-bde85bfb4ee9-config-data podName:fb920bf7-eae5-4f7a-9af7-bde85bfb4ee9 nodeName:}" failed. No retries permitted until 2026-03-11 01:19:48.576037364 +0000 UTC m=+1545.380254969 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/fb920bf7-eae5-4f7a-9af7-bde85bfb4ee9-config-data") pod "rabbitmq-server-0" (UID: "fb920bf7-eae5-4f7a-9af7-bde85bfb4ee9") : configmap "rabbitmq-config-data" not found Mar 11 01:19:48 crc kubenswrapper[4744]: I0311 01:19:48.085202 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m5jhv\" (UniqueName: \"kubernetes.io/projected/4ed881ae-cd59-4830-99e9-34ed7708ed83-kube-api-access-m5jhv\") pod \"barbican-5453-account-create-update-q47sg\" (UID: \"4ed881ae-cd59-4830-99e9-34ed7708ed83\") " pod="openstack/barbican-5453-account-create-update-q47sg" Mar 11 01:19:48 crc kubenswrapper[4744]: I0311 01:19:48.123943 4744 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-95dc-account-create-update-x25mp" Mar 11 01:19:48 crc kubenswrapper[4744]: I0311 01:19:48.124749 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-b08a-account-create-update-7dvx8"] Mar 11 01:19:48 crc kubenswrapper[4744]: I0311 01:19:48.176038 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/20141525-666b-4609-b145-d38380a5d7c7-operator-scripts\") pod \"neutron-b08a-account-create-update-7dvx8\" (UID: \"20141525-666b-4609-b145-d38380a5d7c7\") " pod="openstack/neutron-b08a-account-create-update-7dvx8" Mar 11 01:19:48 crc kubenswrapper[4744]: I0311 01:19:48.176372 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q22rr\" (UniqueName: \"kubernetes.io/projected/20141525-666b-4609-b145-d38380a5d7c7-kube-api-access-q22rr\") pod \"neutron-b08a-account-create-update-7dvx8\" (UID: \"20141525-666b-4609-b145-d38380a5d7c7\") " pod="openstack/neutron-b08a-account-create-update-7dvx8" Mar 11 01:19:48 crc kubenswrapper[4744]: I0311 01:19:48.189900 4744 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-northd-0"] Mar 11 01:19:48 crc kubenswrapper[4744]: I0311 01:19:48.190148 4744 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ovn-northd-0" podUID="4f48c5b8-9cda-4c2c-9244-9bb71b9dc05f" containerName="ovn-northd" containerID="cri-o://4313c32f70e334310d9c7d0fb323d92eab706c09609f6e86a47d81e663224202" gracePeriod=30 Mar 11 01:19:48 crc kubenswrapper[4744]: I0311 01:19:48.190542 4744 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ovn-northd-0" podUID="4f48c5b8-9cda-4c2c-9244-9bb71b9dc05f" containerName="openstack-network-exporter" containerID="cri-o://e79f185b18de9324d312d6637694c9a5d88669a575ff2e95d23425a83f97b67b" gracePeriod=30 Mar 11 
01:19:48 crc kubenswrapper[4744]: I0311 01:19:48.245998 4744 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-b08a-account-create-update-bqm5v"] Mar 11 01:19:48 crc kubenswrapper[4744]: I0311 01:19:48.289089 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q22rr\" (UniqueName: \"kubernetes.io/projected/20141525-666b-4609-b145-d38380a5d7c7-kube-api-access-q22rr\") pod \"neutron-b08a-account-create-update-7dvx8\" (UID: \"20141525-666b-4609-b145-d38380a5d7c7\") " pod="openstack/neutron-b08a-account-create-update-7dvx8" Mar 11 01:19:48 crc kubenswrapper[4744]: I0311 01:19:48.289132 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/20141525-666b-4609-b145-d38380a5d7c7-operator-scripts\") pod \"neutron-b08a-account-create-update-7dvx8\" (UID: \"20141525-666b-4609-b145-d38380a5d7c7\") " pod="openstack/neutron-b08a-account-create-update-7dvx8" Mar 11 01:19:48 crc kubenswrapper[4744]: I0311 01:19:48.290067 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/20141525-666b-4609-b145-d38380a5d7c7-operator-scripts\") pod \"neutron-b08a-account-create-update-7dvx8\" (UID: \"20141525-666b-4609-b145-d38380a5d7c7\") " pod="openstack/neutron-b08a-account-create-update-7dvx8" Mar 11 01:19:48 crc kubenswrapper[4744]: I0311 01:19:48.297192 4744 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-b08a-account-create-update-bqm5v"] Mar 11 01:19:48 crc kubenswrapper[4744]: I0311 01:19:48.338972 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q22rr\" (UniqueName: \"kubernetes.io/projected/20141525-666b-4609-b145-d38380a5d7c7-kube-api-access-q22rr\") pod \"neutron-b08a-account-create-update-7dvx8\" (UID: \"20141525-666b-4609-b145-d38380a5d7c7\") " 
pod="openstack/neutron-b08a-account-create-update-7dvx8" Mar 11 01:19:48 crc kubenswrapper[4744]: I0311 01:19:48.373178 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-5453-account-create-update-q47sg" Mar 11 01:19:48 crc kubenswrapper[4744]: I0311 01:19:48.402295 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Mar 11 01:19:48 crc kubenswrapper[4744]: I0311 01:19:48.438519 4744 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-61e2-account-create-update-6wjlk"] Mar 11 01:19:48 crc kubenswrapper[4744]: I0311 01:19:48.470913 4744 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-61e2-account-create-update-6wjlk"] Mar 11 01:19:48 crc kubenswrapper[4744]: I0311 01:19:48.496554 4744 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-e282-account-create-update-cq8mf"] Mar 11 01:19:48 crc kubenswrapper[4744]: I0311 01:19:48.534600 4744 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-e282-account-create-update-cq8mf"] Mar 11 01:19:48 crc kubenswrapper[4744]: I0311 01:19:48.574354 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-mw9fv"] Mar 11 01:19:48 crc kubenswrapper[4744]: E0311 01:19:48.605083 4744 configmap.go:193] Couldn't get configMap openstack/rabbitmq-cell1-config-data: configmap "rabbitmq-cell1-config-data" not found Mar 11 01:19:48 crc kubenswrapper[4744]: E0311 01:19:48.605300 4744 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/714c91e5-04c5-4f95-97e3-a3c08664944d-config-data podName:714c91e5-04c5-4f95-97e3-a3c08664944d nodeName:}" failed. No retries permitted until 2026-03-11 01:19:49.105283542 +0000 UTC m=+1545.909501147 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/714c91e5-04c5-4f95-97e3-a3c08664944d-config-data") pod "rabbitmq-cell1-server-0" (UID: "714c91e5-04c5-4f95-97e3-a3c08664944d") : configmap "rabbitmq-cell1-config-data" not found Mar 11 01:19:48 crc kubenswrapper[4744]: E0311 01:19:48.607722 4744 configmap.go:193] Couldn't get configMap openstack/rabbitmq-config-data: configmap "rabbitmq-config-data" not found Mar 11 01:19:48 crc kubenswrapper[4744]: E0311 01:19:48.607864 4744 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/fb920bf7-eae5-4f7a-9af7-bde85bfb4ee9-config-data podName:fb920bf7-eae5-4f7a-9af7-bde85bfb4ee9 nodeName:}" failed. No retries permitted until 2026-03-11 01:19:49.607834462 +0000 UTC m=+1546.412052067 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/fb920bf7-eae5-4f7a-9af7-bde85bfb4ee9-config-data") pod "rabbitmq-server-0" (UID: "fb920bf7-eae5-4f7a-9af7-bde85bfb4ee9") : configmap "rabbitmq-config-data" not found Mar 11 01:19:48 crc kubenswrapper[4744]: I0311 01:19:48.610195 4744 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-db-sync-vkz2r"] Mar 11 01:19:48 crc kubenswrapper[4744]: I0311 01:19:48.612597 4744 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-b08a-account-create-update-7dvx8" Mar 11 01:19:48 crc kubenswrapper[4744]: I0311 01:19:48.633279 4744 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-db-sync-vkz2r"] Mar 11 01:19:48 crc kubenswrapper[4744]: I0311 01:19:48.658061 4744 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-cell-mapping-pw8fv"] Mar 11 01:19:48 crc kubenswrapper[4744]: I0311 01:19:48.669725 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-mw9fv" event={"ID":"e910960b-a434-4830-b4be-96571fa4dd54","Type":"ContainerStarted","Data":"b9b29139c92bbef3351895f00577cbdfe29d8ac1a349dba92cd6e5c004a53080"} Mar 11 01:19:48 crc kubenswrapper[4744]: I0311 01:19:48.671101 4744 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-cell-mapping-pw8fv"] Mar 11 01:19:48 crc kubenswrapper[4744]: I0311 01:19:48.688420 4744 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-b858-account-create-update-hl4l5"] Mar 11 01:19:48 crc kubenswrapper[4744]: I0311 01:19:48.705866 4744 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-b858-account-create-update-hl4l5"] Mar 11 01:19:48 crc kubenswrapper[4744]: I0311 01:19:48.719443 4744 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-cell-mapping-687vh"] Mar 11 01:19:48 crc kubenswrapper[4744]: I0311 01:19:48.729362 4744 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-bd14-account-create-update-bhb66"] Mar 11 01:19:48 crc kubenswrapper[4744]: I0311 01:19:48.750792 4744 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-bd14-account-create-update-bhb66"] Mar 11 01:19:48 crc kubenswrapper[4744]: I0311 01:19:48.762870 4744 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-cell-mapping-687vh"] Mar 11 01:19:48 crc kubenswrapper[4744]: I0311 01:19:48.769674 4744 
kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-db-sync-hhwh7"] Mar 11 01:19:48 crc kubenswrapper[4744]: I0311 01:19:48.781717 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-metrics-nrcjs_ebd1c76c-75f8-411f-9350-a0e31f1721cd/openstack-network-exporter/0.log" Mar 11 01:19:48 crc kubenswrapper[4744]: I0311 01:19:48.781788 4744 generic.go:334] "Generic (PLEG): container finished" podID="ebd1c76c-75f8-411f-9350-a0e31f1721cd" containerID="a76d8dbc969a328faa9315afa2eb3f3d73314211ba9392a01e5df03bb7391b1e" exitCode=2 Mar 11 01:19:48 crc kubenswrapper[4744]: I0311 01:19:48.781945 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-metrics-nrcjs" event={"ID":"ebd1c76c-75f8-411f-9350-a0e31f1721cd","Type":"ContainerDied","Data":"a76d8dbc969a328faa9315afa2eb3f3d73314211ba9392a01e5df03bb7391b1e"} Mar 11 01:19:48 crc kubenswrapper[4744]: I0311 01:19:48.791885 4744 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-db-sync-hhwh7"] Mar 11 01:19:48 crc kubenswrapper[4744]: I0311 01:19:48.821670 4744 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-db-sync-g9cnb"] Mar 11 01:19:48 crc kubenswrapper[4744]: I0311 01:19:48.833668 4744 generic.go:334] "Generic (PLEG): container finished" podID="4f48c5b8-9cda-4c2c-9244-9bb71b9dc05f" containerID="e79f185b18de9324d312d6637694c9a5d88669a575ff2e95d23425a83f97b67b" exitCode=2 Mar 11 01:19:48 crc kubenswrapper[4744]: I0311 01:19:48.833717 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"4f48c5b8-9cda-4c2c-9244-9bb71b9dc05f","Type":"ContainerDied","Data":"e79f185b18de9324d312d6637694c9a5d88669a575ff2e95d23425a83f97b67b"} Mar 11 01:19:48 crc kubenswrapper[4744]: I0311 01:19:48.849279 4744 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-db-sync-g9cnb"] Mar 11 01:19:48 crc kubenswrapper[4744]: I0311 01:19:48.857525 4744 kubelet.go:2437] 
"SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-57db588689-85hjj"] Mar 11 01:19:48 crc kubenswrapper[4744]: I0311 01:19:48.859185 4744 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-57db588689-85hjj" podUID="a90e479a-2c1d-4a55-9f51-eadbc3c0b333" containerName="dnsmasq-dns" containerID="cri-o://7d9f021a66654b3256ddc109f1f5944e4eaa779a0ef70e35b1d3260089fa29e0" gracePeriod=10 Mar 11 01:19:48 crc kubenswrapper[4744]: I0311 01:19:48.930051 4744 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-db-sync-hqbzq"] Mar 11 01:19:48 crc kubenswrapper[4744]: E0311 01:19:48.930210 4744 handlers.go:78] "Exec lifecycle hook for Container in Pod failed" err="command '/usr/share/ovn/scripts/ovn-ctl stop_controller' exited with 137: " execCommand=["/usr/share/ovn/scripts/ovn-ctl","stop_controller"] containerName="ovn-controller" pod="openstack/ovn-controller-2mjl7" message=< Mar 11 01:19:48 crc kubenswrapper[4744]: Exiting ovn-controller (1) [ OK ] Mar 11 01:19:48 crc kubenswrapper[4744]: > Mar 11 01:19:48 crc kubenswrapper[4744]: E0311 01:19:48.930241 4744 kuberuntime_container.go:691] "PreStop hook failed" err="command '/usr/share/ovn/scripts/ovn-ctl stop_controller' exited with 137: " pod="openstack/ovn-controller-2mjl7" podUID="fe2603a1-fdea-44d4-8188-f5f93324575c" containerName="ovn-controller" containerID="cri-o://50901ae2cb674520b24a492f9dec7c16fa12e3ec15657bcafeef0653de88602d" Mar 11 01:19:48 crc kubenswrapper[4744]: I0311 01:19:48.930271 4744 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ovn-controller-2mjl7" podUID="fe2603a1-fdea-44d4-8188-f5f93324575c" containerName="ovn-controller" containerID="cri-o://50901ae2cb674520b24a492f9dec7c16fa12e3ec15657bcafeef0653de88602d" gracePeriod=29 Mar 11 01:19:48 crc kubenswrapper[4744]: I0311 01:19:48.948079 4744 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-db-sync-hqbzq"] Mar 11 01:19:49 
crc kubenswrapper[4744]: I0311 01:19:49.075293 4744 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-db-sync-qkr98"] Mar 11 01:19:49 crc kubenswrapper[4744]: I0311 01:19:49.091597 4744 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-db-sync-qkr98"] Mar 11 01:19:49 crc kubenswrapper[4744]: I0311 01:19:49.107100 4744 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovsdbserver-nb-0"] Mar 11 01:19:49 crc kubenswrapper[4744]: I0311 01:19:49.107579 4744 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ovsdbserver-nb-0" podUID="97c4f09f-eb97-40a0-b06c-80a5a922c986" containerName="openstack-network-exporter" containerID="cri-o://7fd05cd78a9ef3fd32c5308b2e3464c3c04f1b11a481396b59397efe66dc1c90" gracePeriod=300 Mar 11 01:19:49 crc kubenswrapper[4744]: E0311 01:19:49.122004 4744 configmap.go:193] Couldn't get configMap openstack/rabbitmq-cell1-config-data: configmap "rabbitmq-cell1-config-data" not found Mar 11 01:19:49 crc kubenswrapper[4744]: E0311 01:19:49.122083 4744 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/714c91e5-04c5-4f95-97e3-a3c08664944d-config-data podName:714c91e5-04c5-4f95-97e3-a3c08664944d nodeName:}" failed. No retries permitted until 2026-03-11 01:19:50.122065036 +0000 UTC m=+1546.926282641 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/714c91e5-04c5-4f95-97e3-a3c08664944d-config-data") pod "rabbitmq-cell1-server-0" (UID: "714c91e5-04c5-4f95-97e3-a3c08664944d") : configmap "rabbitmq-cell1-config-data" not found Mar 11 01:19:49 crc kubenswrapper[4744]: I0311 01:19:49.203809 4744 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ovsdbserver-nb-0" podUID="97c4f09f-eb97-40a0-b06c-80a5a922c986" containerName="ovsdbserver-nb" containerID="cri-o://9d84530f46578e0e43683833f9c6f2305fe1b6a881fac74e31a0651183712b14" gracePeriod=300 Mar 11 01:19:49 crc kubenswrapper[4744]: I0311 01:19:49.216555 4744 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/swift-ring-rebalance-vgn45"] Mar 11 01:19:49 crc kubenswrapper[4744]: I0311 01:19:49.241840 4744 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/swift-ring-rebalance-vgn45"] Mar 11 01:19:49 crc kubenswrapper[4744]: I0311 01:19:49.357742 4744 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovsdbserver-sb-0"] Mar 11 01:19:49 crc kubenswrapper[4744]: I0311 01:19:49.359315 4744 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ovsdbserver-sb-0" podUID="1ca73c89-992f-4b36-9a70-5d67bace9cd2" containerName="openstack-network-exporter" containerID="cri-o://223e84951f31263619072ddbe2e0f719244b2be18159a6ed4fd2baa8d2a4de50" gracePeriod=300 Mar 11 01:19:49 crc kubenswrapper[4744]: I0311 01:19:49.373504 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-metrics-nrcjs_ebd1c76c-75f8-411f-9350-a0e31f1721cd/openstack-network-exporter/0.log" Mar 11 01:19:49 crc kubenswrapper[4744]: I0311 01:19:49.373727 4744 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-metrics-nrcjs" Mar 11 01:19:49 crc kubenswrapper[4744]: I0311 01:19:49.376703 4744 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/swift-storage-0"] Mar 11 01:19:49 crc kubenswrapper[4744]: I0311 01:19:49.377123 4744 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="524fac10-b874-465e-b4aa-221b6c689959" containerName="account-server" containerID="cri-o://b78b35fad65a6104f3b70ff15a556ecab18834b6ed00582485d71f455ffb4854" gracePeriod=30 Mar 11 01:19:49 crc kubenswrapper[4744]: I0311 01:19:49.377226 4744 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="524fac10-b874-465e-b4aa-221b6c689959" containerName="swift-recon-cron" containerID="cri-o://25e3543d3b3d14862a73ba6ffdaeec7e7e8cb26ca742becd4587bae22c3b8432" gracePeriod=30 Mar 11 01:19:49 crc kubenswrapper[4744]: I0311 01:19:49.377265 4744 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="524fac10-b874-465e-b4aa-221b6c689959" containerName="rsync" containerID="cri-o://5bbb97d3f04c59bd0734d58c134b140d749bfb617a14047f4d2be2a03bcf9bd5" gracePeriod=30 Mar 11 01:19:49 crc kubenswrapper[4744]: I0311 01:19:49.377295 4744 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="524fac10-b874-465e-b4aa-221b6c689959" containerName="object-expirer" containerID="cri-o://57f15e76688f17356669b85c46315dd0dffc814d43f7f2c52af90d0784301949" gracePeriod=30 Mar 11 01:19:49 crc kubenswrapper[4744]: I0311 01:19:49.377344 4744 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="524fac10-b874-465e-b4aa-221b6c689959" containerName="object-updater" containerID="cri-o://683071c11cad26f904a5f7a36fcb33b138ae8c1681f0a7ff0d1c59caa91adfa5" gracePeriod=30 Mar 11 01:19:49 crc kubenswrapper[4744]: 
I0311 01:19:49.377375 4744 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="524fac10-b874-465e-b4aa-221b6c689959" containerName="object-auditor" containerID="cri-o://7325adf5421afd7c3a21ffc84f42f4176496bf0df0f8bbb48940ea537474b52d" gracePeriod=30 Mar 11 01:19:49 crc kubenswrapper[4744]: I0311 01:19:49.377401 4744 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="524fac10-b874-465e-b4aa-221b6c689959" containerName="object-replicator" containerID="cri-o://1dc3103a150112ffa802284194bdc5ad25c73127fe6b5ddb013e5409a1028b69" gracePeriod=30 Mar 11 01:19:49 crc kubenswrapper[4744]: I0311 01:19:49.377447 4744 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="524fac10-b874-465e-b4aa-221b6c689959" containerName="object-server" containerID="cri-o://98bb403c28de1a3a422f752fd836eeaf91ab8123e3a1415917dfa6a935d3dad7" gracePeriod=30 Mar 11 01:19:49 crc kubenswrapper[4744]: I0311 01:19:49.377480 4744 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="524fac10-b874-465e-b4aa-221b6c689959" containerName="container-updater" containerID="cri-o://7d4d68ba9b886d9742d463d33fa1cc87d5cbb6630ca1df77c63ccccdfc56e184" gracePeriod=30 Mar 11 01:19:49 crc kubenswrapper[4744]: I0311 01:19:49.377522 4744 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="524fac10-b874-465e-b4aa-221b6c689959" containerName="container-auditor" containerID="cri-o://2a2081a399521c0c43979716a406e0e99df32e512b18971fd342f84cf6c0c784" gracePeriod=30 Mar 11 01:19:49 crc kubenswrapper[4744]: I0311 01:19:49.378050 4744 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="524fac10-b874-465e-b4aa-221b6c689959" containerName="container-replicator" 
containerID="cri-o://2b4eef62494e9560a6468f7258be6cbee2afabc30627c4cd424f6256bf882bd9" gracePeriod=30 Mar 11 01:19:49 crc kubenswrapper[4744]: I0311 01:19:49.378081 4744 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="524fac10-b874-465e-b4aa-221b6c689959" containerName="container-server" containerID="cri-o://d0f959b12a9512cb3f5a0776eb000de08da42f09e11825b7a95d1b85cdeb9533" gracePeriod=30 Mar 11 01:19:49 crc kubenswrapper[4744]: I0311 01:19:49.378130 4744 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="524fac10-b874-465e-b4aa-221b6c689959" containerName="account-reaper" containerID="cri-o://657db772f7e222c18d1d14b7b5c9643c0ec7e79c4adc1d26309d817f501de327" gracePeriod=30 Mar 11 01:19:49 crc kubenswrapper[4744]: I0311 01:19:49.378159 4744 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="524fac10-b874-465e-b4aa-221b6c689959" containerName="account-auditor" containerID="cri-o://0520adffd7bb3fd0a12c9f2003a6119d39241234f65942bbd93b143144e91dff" gracePeriod=30 Mar 11 01:19:49 crc kubenswrapper[4744]: I0311 01:19:49.378185 4744 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="524fac10-b874-465e-b4aa-221b6c689959" containerName="account-replicator" containerID="cri-o://661223f28b0201765a8971851fdcdfc8ce86ba64df01908f6086c416839db484" gracePeriod=30 Mar 11 01:19:49 crc kubenswrapper[4744]: I0311 01:19:49.393779 4744 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-5959cf6645-bcjjf"] Mar 11 01:19:49 crc kubenswrapper[4744]: I0311 01:19:49.394026 4744 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-5959cf6645-bcjjf" podUID="cb4eb051-94b3-42d1-87ff-669ad8251b4f" containerName="neutron-api" 
containerID="cri-o://b1b1e7e9a3f9e195c5c8ffc0f9ba222b2dd152ff67cad9587e2f46e9f7c8f240" gracePeriod=30 Mar 11 01:19:49 crc kubenswrapper[4744]: I0311 01:19:49.394174 4744 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-5959cf6645-bcjjf" podUID="cb4eb051-94b3-42d1-87ff-669ad8251b4f" containerName="neutron-httpd" containerID="cri-o://90d7864d9f44afb54c45c8f83816565799a968b37e50716fa7d940d17e944838" gracePeriod=30 Mar 11 01:19:49 crc kubenswrapper[4744]: I0311 01:19:49.426926 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-rplmm"] Mar 11 01:19:49 crc kubenswrapper[4744]: I0311 01:19:49.442817 4744 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ovsdbserver-sb-0" podUID="1ca73c89-992f-4b36-9a70-5d67bace9cd2" containerName="ovsdbserver-sb" containerID="cri-o://21669e2f41b8409f07be3ecda90e7e2a179ecbcdbcef99ca2a140c11a5fb6bc4" gracePeriod=300 Mar 11 01:19:49 crc kubenswrapper[4744]: W0311 01:19:49.453875 4744 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7f3aa5cc_eae2_4d60_96cf_6d847a5599ec.slice/crio-68f27e12070e3e9f4e8ddf9e7621e4bd7c6c7dea2fa37e86cf38abbb967764f7 WatchSource:0}: Error finding container 68f27e12070e3e9f4e8ddf9e7621e4bd7c6c7dea2fa37e86cf38abbb967764f7: Status 404 returned error can't find the container with id 68f27e12070e3e9f4e8ddf9e7621e4bd7c6c7dea2fa37e86cf38abbb967764f7 Mar 11 01:19:49 crc kubenswrapper[4744]: I0311 01:19:49.458938 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ebd1c76c-75f8-411f-9350-a0e31f1721cd-combined-ca-bundle\") pod \"ebd1c76c-75f8-411f-9350-a0e31f1721cd\" (UID: \"ebd1c76c-75f8-411f-9350-a0e31f1721cd\") " Mar 11 01:19:49 crc kubenswrapper[4744]: I0311 01:19:49.459020 4744 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/ebd1c76c-75f8-411f-9350-a0e31f1721cd-ovn-rundir\") pod \"ebd1c76c-75f8-411f-9350-a0e31f1721cd\" (UID: \"ebd1c76c-75f8-411f-9350-a0e31f1721cd\") " Mar 11 01:19:49 crc kubenswrapper[4744]: I0311 01:19:49.459102 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zh7rw\" (UniqueName: \"kubernetes.io/projected/ebd1c76c-75f8-411f-9350-a0e31f1721cd-kube-api-access-zh7rw\") pod \"ebd1c76c-75f8-411f-9350-a0e31f1721cd\" (UID: \"ebd1c76c-75f8-411f-9350-a0e31f1721cd\") " Mar 11 01:19:49 crc kubenswrapper[4744]: I0311 01:19:49.459137 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/ebd1c76c-75f8-411f-9350-a0e31f1721cd-metrics-certs-tls-certs\") pod \"ebd1c76c-75f8-411f-9350-a0e31f1721cd\" (UID: \"ebd1c76c-75f8-411f-9350-a0e31f1721cd\") " Mar 11 01:19:49 crc kubenswrapper[4744]: I0311 01:19:49.459206 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/ebd1c76c-75f8-411f-9350-a0e31f1721cd-ovs-rundir\") pod \"ebd1c76c-75f8-411f-9350-a0e31f1721cd\" (UID: \"ebd1c76c-75f8-411f-9350-a0e31f1721cd\") " Mar 11 01:19:49 crc kubenswrapper[4744]: I0311 01:19:49.459280 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ebd1c76c-75f8-411f-9350-a0e31f1721cd-config\") pod \"ebd1c76c-75f8-411f-9350-a0e31f1721cd\" (UID: \"ebd1c76c-75f8-411f-9350-a0e31f1721cd\") " Mar 11 01:19:49 crc kubenswrapper[4744]: I0311 01:19:49.463320 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/ebd1c76c-75f8-411f-9350-a0e31f1721cd-ovn-rundir" (OuterVolumeSpecName: "ovn-rundir") pod "ebd1c76c-75f8-411f-9350-a0e31f1721cd" (UID: 
"ebd1c76c-75f8-411f-9350-a0e31f1721cd"). InnerVolumeSpecName "ovn-rundir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 11 01:19:49 crc kubenswrapper[4744]: I0311 01:19:49.463411 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/ebd1c76c-75f8-411f-9350-a0e31f1721cd-ovs-rundir" (OuterVolumeSpecName: "ovs-rundir") pod "ebd1c76c-75f8-411f-9350-a0e31f1721cd" (UID: "ebd1c76c-75f8-411f-9350-a0e31f1721cd"). InnerVolumeSpecName "ovs-rundir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 11 01:19:49 crc kubenswrapper[4744]: I0311 01:19:49.464374 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ebd1c76c-75f8-411f-9350-a0e31f1721cd-config" (OuterVolumeSpecName: "config") pod "ebd1c76c-75f8-411f-9350-a0e31f1721cd" (UID: "ebd1c76c-75f8-411f-9350-a0e31f1721cd"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 11 01:19:49 crc kubenswrapper[4744]: I0311 01:19:49.475057 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ebd1c76c-75f8-411f-9350-a0e31f1721cd-kube-api-access-zh7rw" (OuterVolumeSpecName: "kube-api-access-zh7rw") pod "ebd1c76c-75f8-411f-9350-a0e31f1721cd" (UID: "ebd1c76c-75f8-411f-9350-a0e31f1721cd"). InnerVolumeSpecName "kube-api-access-zh7rw". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 01:19:49 crc kubenswrapper[4744]: I0311 01:19:49.480690 4744 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 11 01:19:49 crc kubenswrapper[4744]: I0311 01:19:49.480979 4744 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="ae4f1d0a-32b7-4ec5-9f5b-a43589af4119" containerName="glance-log" containerID="cri-o://dda45a2a6beb4d51c68422dc1406aaa80a32f7291668639a011f39ec42b2fc96" gracePeriod=30 Mar 11 01:19:49 crc kubenswrapper[4744]: I0311 01:19:49.481374 4744 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="ae4f1d0a-32b7-4ec5-9f5b-a43589af4119" containerName="glance-httpd" containerID="cri-o://5be8527746802ca45e3d648c31b7d6bd21a5227a4e2fd3aeb4816ee96bea245a" gracePeriod=30 Mar 11 01:19:49 crc kubenswrapper[4744]: I0311 01:19:49.483693 4744 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ovn-controller-ovs-88ffp" podUID="1ee48ea6-67ea-4da3-af92-82b9d0e5b67d" containerName="ovs-vswitchd" containerID="cri-o://acfad524609d9f297101d39eb6bbc89088706371a3d119c57c6f3e38150b4144" gracePeriod=29 Mar 11 01:19:49 crc kubenswrapper[4744]: I0311 01:19:49.518018 4744 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Mar 11 01:19:49 crc kubenswrapper[4744]: I0311 01:19:49.519539 4744 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="b9edcd5c-3634-45f9-914a-0d8e4f425302" containerName="nova-metadata-metadata" containerID="cri-o://55772af813a2f508c881ce1d4bcdd6c3b028b1ce12f20693da7faae23420077f" gracePeriod=30 Mar 11 01:19:49 crc kubenswrapper[4744]: I0311 01:19:49.519230 4744 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" 
podUID="b9edcd5c-3634-45f9-914a-0d8e4f425302" containerName="nova-metadata-log" containerID="cri-o://0139bf0559237b374402a2f0ca5c12edd7eeaa5764a0ccb62bf6e9201da581b6" gracePeriod=30 Mar 11 01:19:49 crc kubenswrapper[4744]: I0311 01:19:49.525898 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ebd1c76c-75f8-411f-9350-a0e31f1721cd-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "ebd1c76c-75f8-411f-9350-a0e31f1721cd" (UID: "ebd1c76c-75f8-411f-9350-a0e31f1721cd"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 01:19:49 crc kubenswrapper[4744]: I0311 01:19:49.539399 4744 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-db-create-rrtn9"] Mar 11 01:19:49 crc kubenswrapper[4744]: I0311 01:19:49.553482 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-95dc-account-create-update-x25mp"] Mar 11 01:19:49 crc kubenswrapper[4744]: I0311 01:19:49.565183 4744 reconciler_common.go:293] "Volume detached for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/ebd1c76c-75f8-411f-9350-a0e31f1721cd-ovs-rundir\") on node \"crc\" DevicePath \"\"" Mar 11 01:19:49 crc kubenswrapper[4744]: I0311 01:19:49.565331 4744 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ebd1c76c-75f8-411f-9350-a0e31f1721cd-config\") on node \"crc\" DevicePath \"\"" Mar 11 01:19:49 crc kubenswrapper[4744]: I0311 01:19:49.565413 4744 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ebd1c76c-75f8-411f-9350-a0e31f1721cd-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 11 01:19:49 crc kubenswrapper[4744]: I0311 01:19:49.565495 4744 reconciler_common.go:293] "Volume detached for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/ebd1c76c-75f8-411f-9350-a0e31f1721cd-ovn-rundir\") on node \"crc\" 
DevicePath \"\"" Mar 11 01:19:49 crc kubenswrapper[4744]: I0311 01:19:49.565804 4744 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zh7rw\" (UniqueName: \"kubernetes.io/projected/ebd1c76c-75f8-411f-9350-a0e31f1721cd-kube-api-access-zh7rw\") on node \"crc\" DevicePath \"\"" Mar 11 01:19:50 crc kubenswrapper[4744]: I0311 01:19:49.571762 4744 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-db-create-rrtn9"] Mar 11 01:19:50 crc kubenswrapper[4744]: I0311 01:19:49.589727 4744 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/openstack-cell1-galera-0"] Mar 11 01:19:50 crc kubenswrapper[4744]: I0311 01:19:49.600463 4744 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Mar 11 01:19:50 crc kubenswrapper[4744]: I0311 01:19:49.600835 4744 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="23fcfdba-12bc-4a94-94cd-fb703f2e632c" containerName="nova-api-log" containerID="cri-o://45fbeb5c4f5cda1cead9ba249d13f0bd4d1407bc16d3e8ba3521b8e0f14fca3f" gracePeriod=30 Mar 11 01:19:50 crc kubenswrapper[4744]: I0311 01:19:49.601365 4744 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="23fcfdba-12bc-4a94-94cd-fb703f2e632c" containerName="nova-api-api" containerID="cri-o://d169bd1e2373ec6a70d9c8ea39075ed35856b33455affec0b49a8f185690d2f6" gracePeriod=30 Mar 11 01:19:50 crc kubenswrapper[4744]: E0311 01:19:49.630989 4744 handlers.go:78] "Exec lifecycle hook for Container in Pod failed" err=< Mar 11 01:19:50 crc kubenswrapper[4744]: command '/usr/local/bin/container-scripts/stop-ovsdb-server.sh' exited with 137: ++ dirname /usr/local/bin/container-scripts/stop-ovsdb-server.sh Mar 11 01:19:50 crc kubenswrapper[4744]: + source /usr/local/bin/container-scripts/functions Mar 11 01:19:50 crc kubenswrapper[4744]: ++ OVNBridge=br-int Mar 11 01:19:50 crc kubenswrapper[4744]: ++ 
OVNRemote=tcp:localhost:6642 Mar 11 01:19:50 crc kubenswrapper[4744]: ++ OVNEncapType=geneve Mar 11 01:19:50 crc kubenswrapper[4744]: ++ OVNAvailabilityZones= Mar 11 01:19:50 crc kubenswrapper[4744]: ++ EnableChassisAsGateway=true Mar 11 01:19:50 crc kubenswrapper[4744]: ++ PhysicalNetworks= Mar 11 01:19:50 crc kubenswrapper[4744]: ++ OVNHostName= Mar 11 01:19:50 crc kubenswrapper[4744]: ++ DB_FILE=/etc/openvswitch/conf.db Mar 11 01:19:50 crc kubenswrapper[4744]: ++ ovs_dir=/var/lib/openvswitch Mar 11 01:19:50 crc kubenswrapper[4744]: ++ FLOWS_RESTORE_SCRIPT=/var/lib/openvswitch/flows-script Mar 11 01:19:50 crc kubenswrapper[4744]: ++ FLOWS_RESTORE_DIR=/var/lib/openvswitch/saved-flows Mar 11 01:19:50 crc kubenswrapper[4744]: ++ SAFE_TO_STOP_OVSDB_SERVER_SEMAPHORE=/var/lib/openvswitch/is_safe_to_stop_ovsdb_server Mar 11 01:19:50 crc kubenswrapper[4744]: + '[' '!' -f /var/lib/openvswitch/is_safe_to_stop_ovsdb_server ']' Mar 11 01:19:50 crc kubenswrapper[4744]: + sleep 0.5 Mar 11 01:19:50 crc kubenswrapper[4744]: + '[' '!' -f /var/lib/openvswitch/is_safe_to_stop_ovsdb_server ']' Mar 11 01:19:50 crc kubenswrapper[4744]: + sleep 0.5 Mar 11 01:19:50 crc kubenswrapper[4744]: + '[' '!' -f /var/lib/openvswitch/is_safe_to_stop_ovsdb_server ']' Mar 11 01:19:50 crc kubenswrapper[4744]: + sleep 0.5 Mar 11 01:19:50 crc kubenswrapper[4744]: + '[' '!' 
-f /var/lib/openvswitch/is_safe_to_stop_ovsdb_server ']' Mar 11 01:19:50 crc kubenswrapper[4744]: + cleanup_ovsdb_server_semaphore Mar 11 01:19:50 crc kubenswrapper[4744]: + rm -f /var/lib/openvswitch/is_safe_to_stop_ovsdb_server Mar 11 01:19:50 crc kubenswrapper[4744]: + /usr/share/openvswitch/scripts/ovs-ctl stop --no-ovs-vswitchd Mar 11 01:19:50 crc kubenswrapper[4744]: > execCommand=["/usr/local/bin/container-scripts/stop-ovsdb-server.sh"] containerName="ovsdb-server" pod="openstack/ovn-controller-ovs-88ffp" message=< Mar 11 01:19:50 crc kubenswrapper[4744]: Exiting ovsdb-server (5) [ OK ] Mar 11 01:19:50 crc kubenswrapper[4744]: ++ dirname /usr/local/bin/container-scripts/stop-ovsdb-server.sh Mar 11 01:19:50 crc kubenswrapper[4744]: + source /usr/local/bin/container-scripts/functions Mar 11 01:19:50 crc kubenswrapper[4744]: ++ OVNBridge=br-int Mar 11 01:19:50 crc kubenswrapper[4744]: ++ OVNRemote=tcp:localhost:6642 Mar 11 01:19:50 crc kubenswrapper[4744]: ++ OVNEncapType=geneve Mar 11 01:19:50 crc kubenswrapper[4744]: ++ OVNAvailabilityZones= Mar 11 01:19:50 crc kubenswrapper[4744]: ++ EnableChassisAsGateway=true Mar 11 01:19:50 crc kubenswrapper[4744]: ++ PhysicalNetworks= Mar 11 01:19:50 crc kubenswrapper[4744]: ++ OVNHostName= Mar 11 01:19:50 crc kubenswrapper[4744]: ++ DB_FILE=/etc/openvswitch/conf.db Mar 11 01:19:50 crc kubenswrapper[4744]: ++ ovs_dir=/var/lib/openvswitch Mar 11 01:19:50 crc kubenswrapper[4744]: ++ FLOWS_RESTORE_SCRIPT=/var/lib/openvswitch/flows-script Mar 11 01:19:50 crc kubenswrapper[4744]: ++ FLOWS_RESTORE_DIR=/var/lib/openvswitch/saved-flows Mar 11 01:19:50 crc kubenswrapper[4744]: ++ SAFE_TO_STOP_OVSDB_SERVER_SEMAPHORE=/var/lib/openvswitch/is_safe_to_stop_ovsdb_server Mar 11 01:19:50 crc kubenswrapper[4744]: + '[' '!' -f /var/lib/openvswitch/is_safe_to_stop_ovsdb_server ']' Mar 11 01:19:50 crc kubenswrapper[4744]: + sleep 0.5 Mar 11 01:19:50 crc kubenswrapper[4744]: + '[' '!' 
-f /var/lib/openvswitch/is_safe_to_stop_ovsdb_server ']' Mar 11 01:19:50 crc kubenswrapper[4744]: + sleep 0.5 Mar 11 01:19:50 crc kubenswrapper[4744]: + '[' '!' -f /var/lib/openvswitch/is_safe_to_stop_ovsdb_server ']' Mar 11 01:19:50 crc kubenswrapper[4744]: + sleep 0.5 Mar 11 01:19:50 crc kubenswrapper[4744]: + '[' '!' -f /var/lib/openvswitch/is_safe_to_stop_ovsdb_server ']' Mar 11 01:19:50 crc kubenswrapper[4744]: + cleanup_ovsdb_server_semaphore Mar 11 01:19:50 crc kubenswrapper[4744]: + rm -f /var/lib/openvswitch/is_safe_to_stop_ovsdb_server Mar 11 01:19:50 crc kubenswrapper[4744]: + /usr/share/openvswitch/scripts/ovs-ctl stop --no-ovs-vswitchd Mar 11 01:19:50 crc kubenswrapper[4744]: > Mar 11 01:19:50 crc kubenswrapper[4744]: E0311 01:19:49.631028 4744 kuberuntime_container.go:691] "PreStop hook failed" err=< Mar 11 01:19:50 crc kubenswrapper[4744]: command '/usr/local/bin/container-scripts/stop-ovsdb-server.sh' exited with 137: ++ dirname /usr/local/bin/container-scripts/stop-ovsdb-server.sh Mar 11 01:19:50 crc kubenswrapper[4744]: + source /usr/local/bin/container-scripts/functions Mar 11 01:19:50 crc kubenswrapper[4744]: ++ OVNBridge=br-int Mar 11 01:19:50 crc kubenswrapper[4744]: ++ OVNRemote=tcp:localhost:6642 Mar 11 01:19:50 crc kubenswrapper[4744]: ++ OVNEncapType=geneve Mar 11 01:19:50 crc kubenswrapper[4744]: ++ OVNAvailabilityZones= Mar 11 01:19:50 crc kubenswrapper[4744]: ++ EnableChassisAsGateway=true Mar 11 01:19:50 crc kubenswrapper[4744]: ++ PhysicalNetworks= Mar 11 01:19:50 crc kubenswrapper[4744]: ++ OVNHostName= Mar 11 01:19:50 crc kubenswrapper[4744]: ++ DB_FILE=/etc/openvswitch/conf.db Mar 11 01:19:50 crc kubenswrapper[4744]: ++ ovs_dir=/var/lib/openvswitch Mar 11 01:19:50 crc kubenswrapper[4744]: ++ FLOWS_RESTORE_SCRIPT=/var/lib/openvswitch/flows-script Mar 11 01:19:50 crc kubenswrapper[4744]: ++ FLOWS_RESTORE_DIR=/var/lib/openvswitch/saved-flows Mar 11 01:19:50 crc kubenswrapper[4744]: ++ 
SAFE_TO_STOP_OVSDB_SERVER_SEMAPHORE=/var/lib/openvswitch/is_safe_to_stop_ovsdb_server Mar 11 01:19:50 crc kubenswrapper[4744]: + '[' '!' -f /var/lib/openvswitch/is_safe_to_stop_ovsdb_server ']' Mar 11 01:19:50 crc kubenswrapper[4744]: + sleep 0.5 Mar 11 01:19:50 crc kubenswrapper[4744]: + '[' '!' -f /var/lib/openvswitch/is_safe_to_stop_ovsdb_server ']' Mar 11 01:19:50 crc kubenswrapper[4744]: + sleep 0.5 Mar 11 01:19:50 crc kubenswrapper[4744]: + '[' '!' -f /var/lib/openvswitch/is_safe_to_stop_ovsdb_server ']' Mar 11 01:19:50 crc kubenswrapper[4744]: + sleep 0.5 Mar 11 01:19:50 crc kubenswrapper[4744]: + '[' '!' -f /var/lib/openvswitch/is_safe_to_stop_ovsdb_server ']' Mar 11 01:19:50 crc kubenswrapper[4744]: + cleanup_ovsdb_server_semaphore Mar 11 01:19:50 crc kubenswrapper[4744]: + rm -f /var/lib/openvswitch/is_safe_to_stop_ovsdb_server Mar 11 01:19:50 crc kubenswrapper[4744]: + /usr/share/openvswitch/scripts/ovs-ctl stop --no-ovs-vswitchd Mar 11 01:19:50 crc kubenswrapper[4744]: > pod="openstack/ovn-controller-ovs-88ffp" podUID="1ee48ea6-67ea-4da3-af92-82b9d0e5b67d" containerName="ovsdb-server" containerID="cri-o://376ac00d3a5176903cb096caf4aa6fb34bc54b52bfdfcffda0f5db681e50ec05" Mar 11 01:19:50 crc kubenswrapper[4744]: I0311 01:19:49.631074 4744 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ovn-controller-ovs-88ffp" podUID="1ee48ea6-67ea-4da3-af92-82b9d0e5b67d" containerName="ovsdb-server" containerID="cri-o://376ac00d3a5176903cb096caf4aa6fb34bc54b52bfdfcffda0f5db681e50ec05" gracePeriod=29 Mar 11 01:19:50 crc kubenswrapper[4744]: I0311 01:19:49.632645 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ebd1c76c-75f8-411f-9350-a0e31f1721cd-metrics-certs-tls-certs" (OuterVolumeSpecName: "metrics-certs-tls-certs") pod "ebd1c76c-75f8-411f-9350-a0e31f1721cd" (UID: "ebd1c76c-75f8-411f-9350-a0e31f1721cd"). InnerVolumeSpecName "metrics-certs-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 01:19:50 crc kubenswrapper[4744]: I0311 01:19:49.639651 4744 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-db-create-k9frb"] Mar 11 01:19:50 crc kubenswrapper[4744]: I0311 01:19:49.656547 4744 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-db-create-k9frb"] Mar 11 01:19:50 crc kubenswrapper[4744]: I0311 01:19:49.671414 4744 reconciler_common.go:293] "Volume detached for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/ebd1c76c-75f8-411f-9350-a0e31f1721cd-metrics-certs-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 11 01:19:50 crc kubenswrapper[4744]: E0311 01:19:49.671497 4744 configmap.go:193] Couldn't get configMap openstack/rabbitmq-config-data: configmap "rabbitmq-config-data" not found Mar 11 01:19:50 crc kubenswrapper[4744]: E0311 01:19:49.671559 4744 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/fb920bf7-eae5-4f7a-9af7-bde85bfb4ee9-config-data podName:fb920bf7-eae5-4f7a-9af7-bde85bfb4ee9 nodeName:}" failed. No retries permitted until 2026-03-11 01:19:51.671541762 +0000 UTC m=+1548.475759367 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/fb920bf7-eae5-4f7a-9af7-bde85bfb4ee9-config-data") pod "rabbitmq-server-0" (UID: "fb920bf7-eae5-4f7a-9af7-bde85bfb4ee9") : configmap "rabbitmq-config-data" not found Mar 11 01:19:50 crc kubenswrapper[4744]: I0311 01:19:49.699551 4744 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 11 01:19:50 crc kubenswrapper[4744]: I0311 01:19:49.699787 4744 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="4767cbee-21c4-4deb-871a-9c6169f5741d" containerName="glance-log" containerID="cri-o://160c31ddfbf19b31394b583802d9b0b99a645d4e25dc64bc9887035b9c0eac27" gracePeriod=30 Mar 11 01:19:50 crc kubenswrapper[4744]: I0311 01:19:49.700155 4744 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="4767cbee-21c4-4deb-871a-9c6169f5741d" containerName="glance-httpd" containerID="cri-o://42a9b4781df7a35e2423fc3e40c26d692ed2ab1efd0387967762e554fa2952fa" gracePeriod=30 Mar 11 01:19:50 crc kubenswrapper[4744]: I0311 01:19:49.722162 4744 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-db-create-kh4gc"] Mar 11 01:19:50 crc kubenswrapper[4744]: I0311 01:19:49.752406 4744 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-db-create-kh4gc"] Mar 11 01:19:50 crc kubenswrapper[4744]: I0311 01:19:49.773221 4744 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-scheduler-0"] Mar 11 01:19:50 crc kubenswrapper[4744]: I0311 01:19:49.773767 4744 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-scheduler-0" podUID="a336a32d-e322-4261-8a29-ce0f30435d83" containerName="cinder-scheduler" containerID="cri-o://e407aadc6fa1cd45a9aa2b8ef3203e0c6ec01e80327e2c3206a5ebcd6d919055" gracePeriod=30 Mar 11 01:19:50 crc 
kubenswrapper[4744]: I0311 01:19:49.774195 4744 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-scheduler-0" podUID="a336a32d-e322-4261-8a29-ce0f30435d83" containerName="probe" containerID="cri-o://8a970e26c9b1c228eccbd5179f834fcbf2574e2fb70d018dcd7783d99b6ddc44" gracePeriod=30 Mar 11 01:19:50 crc kubenswrapper[4744]: I0311 01:19:49.800639 4744 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-db-create-qnfhv"] Mar 11 01:19:50 crc kubenswrapper[4744]: I0311 01:19:49.808768 4744 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-db-create-qnfhv"] Mar 11 01:19:50 crc kubenswrapper[4744]: I0311 01:19:49.859978 4744 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"] Mar 11 01:19:50 crc kubenswrapper[4744]: I0311 01:19:49.860210 4744 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-api-0" podUID="f11c8953-d88f-4d37-8366-b0b61606fa8a" containerName="cinder-api-log" containerID="cri-o://9aea2a3292e66ea11857925fc0390ee0f713bcd4e833fe65a989d82320b82cad" gracePeriod=30 Mar 11 01:19:50 crc kubenswrapper[4744]: I0311 01:19:49.860725 4744 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-api-0" podUID="f11c8953-d88f-4d37-8366-b0b61606fa8a" containerName="cinder-api" containerID="cri-o://c2b142e64db7ed4cb71b1ebbbbe9200442217ded53b3157e1c635748af055ca9" gracePeriod=30 Mar 11 01:19:50 crc kubenswrapper[4744]: I0311 01:19:49.892013 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_1ca73c89-992f-4b36-9a70-5d67bace9cd2/ovsdbserver-sb/0.log" Mar 11 01:19:50 crc kubenswrapper[4744]: I0311 01:19:49.892050 4744 generic.go:334] "Generic (PLEG): container finished" podID="1ca73c89-992f-4b36-9a70-5d67bace9cd2" containerID="223e84951f31263619072ddbe2e0f719244b2be18159a6ed4fd2baa8d2a4de50" exitCode=2 Mar 11 01:19:50 crc kubenswrapper[4744]: I0311 
01:19:49.892066 4744 generic.go:334] "Generic (PLEG): container finished" podID="1ca73c89-992f-4b36-9a70-5d67bace9cd2" containerID="21669e2f41b8409f07be3ecda90e7e2a179ecbcdbcef99ca2a140c11a5fb6bc4" exitCode=143 Mar 11 01:19:50 crc kubenswrapper[4744]: I0311 01:19:49.892130 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"1ca73c89-992f-4b36-9a70-5d67bace9cd2","Type":"ContainerDied","Data":"223e84951f31263619072ddbe2e0f719244b2be18159a6ed4fd2baa8d2a4de50"} Mar 11 01:19:50 crc kubenswrapper[4744]: I0311 01:19:49.892156 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"1ca73c89-992f-4b36-9a70-5d67bace9cd2","Type":"ContainerDied","Data":"21669e2f41b8409f07be3ecda90e7e2a179ecbcdbcef99ca2a140c11a5fb6bc4"} Mar 11 01:19:50 crc kubenswrapper[4744]: I0311 01:19:49.905758 4744 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-5453-account-create-update-q47sg"] Mar 11 01:19:50 crc kubenswrapper[4744]: I0311 01:19:49.928327 4744 generic.go:334] "Generic (PLEG): container finished" podID="35ec702b-4aa6-4fa6-a770-ec3caf762d5f" containerID="fa97b985b022afa1a47a024d8d1193a6165049de48146766011f761fe39d7dce" exitCode=137 Mar 11 01:19:50 crc kubenswrapper[4744]: I0311 01:19:49.931702 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-metrics-nrcjs_ebd1c76c-75f8-411f-9350-a0e31f1721cd/openstack-network-exporter/0.log" Mar 11 01:19:50 crc kubenswrapper[4744]: I0311 01:19:49.931753 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-metrics-nrcjs" event={"ID":"ebd1c76c-75f8-411f-9350-a0e31f1721cd","Type":"ContainerDied","Data":"fe98758294a82d15794b05005d068b1924d78b40d0c6f694c099b78f635c4589"} Mar 11 01:19:50 crc kubenswrapper[4744]: I0311 01:19:49.931785 4744 scope.go:117] "RemoveContainer" containerID="a76d8dbc969a328faa9315afa2eb3f3d73314211ba9392a01e5df03bb7391b1e" Mar 11 01:19:50 crc 
kubenswrapper[4744]: I0311 01:19:49.938629 4744 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-metrics-nrcjs" Mar 11 01:19:50 crc kubenswrapper[4744]: I0311 01:19:49.942398 4744 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-7f78d57d44-gt8df"] Mar 11 01:19:50 crc kubenswrapper[4744]: I0311 01:19:49.942638 4744 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/placement-7f78d57d44-gt8df" podUID="a6b56953-c881-474c-a21f-4a39102d89ab" containerName="placement-log" containerID="cri-o://8485a4680d7e9bd7479321ad2c60fbdd63c7941a99f12f5159677293b38ca6db" gracePeriod=30 Mar 11 01:19:50 crc kubenswrapper[4744]: I0311 01:19:49.942746 4744 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/placement-7f78d57d44-gt8df" podUID="a6b56953-c881-474c-a21f-4a39102d89ab" containerName="placement-api" containerID="cri-o://d932b2416d71a86f80ef3581b5216ce6ba10ab543bf647b40daeebe0c83edbaa" gracePeriod=30 Mar 11 01:19:50 crc kubenswrapper[4744]: I0311 01:19:49.964452 4744 generic.go:334] "Generic (PLEG): container finished" podID="4767cbee-21c4-4deb-871a-9c6169f5741d" containerID="160c31ddfbf19b31394b583802d9b0b99a645d4e25dc64bc9887035b9c0eac27" exitCode=143 Mar 11 01:19:50 crc kubenswrapper[4744]: I0311 01:19:49.964693 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"4767cbee-21c4-4deb-871a-9c6169f5741d","Type":"ContainerDied","Data":"160c31ddfbf19b31394b583802d9b0b99a645d4e25dc64bc9887035b9c0eac27"} Mar 11 01:19:50 crc kubenswrapper[4744]: I0311 01:19:49.965906 4744 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-db-create-hcnrn"] Mar 11 01:19:50 crc kubenswrapper[4744]: I0311 01:19:49.969687 4744 generic.go:334] "Generic (PLEG): container finished" podID="b9edcd5c-3634-45f9-914a-0d8e4f425302" 
containerID="0139bf0559237b374402a2f0ca5c12edd7eeaa5764a0ccb62bf6e9201da581b6" exitCode=143 Mar 11 01:19:50 crc kubenswrapper[4744]: I0311 01:19:49.969741 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"b9edcd5c-3634-45f9-914a-0d8e4f425302","Type":"ContainerDied","Data":"0139bf0559237b374402a2f0ca5c12edd7eeaa5764a0ccb62bf6e9201da581b6"} Mar 11 01:19:50 crc kubenswrapper[4744]: I0311 01:19:49.972910 4744 generic.go:334] "Generic (PLEG): container finished" podID="1ee48ea6-67ea-4da3-af92-82b9d0e5b67d" containerID="376ac00d3a5176903cb096caf4aa6fb34bc54b52bfdfcffda0f5db681e50ec05" exitCode=0 Mar 11 01:19:50 crc kubenswrapper[4744]: I0311 01:19:49.972968 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-88ffp" event={"ID":"1ee48ea6-67ea-4da3-af92-82b9d0e5b67d","Type":"ContainerDied","Data":"376ac00d3a5176903cb096caf4aa6fb34bc54b52bfdfcffda0f5db681e50ec05"} Mar 11 01:19:50 crc kubenswrapper[4744]: I0311 01:19:49.984183 4744 generic.go:334] "Generic (PLEG): container finished" podID="e910960b-a434-4830-b4be-96571fa4dd54" containerID="c5f596f841b9f26617f30aeb9ccbde51b078a88e906c112d087c0809704f0274" exitCode=0 Mar 11 01:19:50 crc kubenswrapper[4744]: I0311 01:19:50.008143 4744 generic.go:334] "Generic (PLEG): container finished" podID="23fcfdba-12bc-4a94-94cd-fb703f2e632c" containerID="45fbeb5c4f5cda1cead9ba249d13f0bd4d1407bc16d3e8ba3521b8e0f14fca3f" exitCode=143 Mar 11 01:19:50 crc kubenswrapper[4744]: I0311 01:19:50.058591 4744 generic.go:334] "Generic (PLEG): container finished" podID="524fac10-b874-465e-b4aa-221b6c689959" containerID="57f15e76688f17356669b85c46315dd0dffc814d43f7f2c52af90d0784301949" exitCode=0 Mar 11 01:19:50 crc kubenswrapper[4744]: I0311 01:19:50.059279 4744 generic.go:334] "Generic (PLEG): container finished" podID="524fac10-b874-465e-b4aa-221b6c689959" containerID="683071c11cad26f904a5f7a36fcb33b138ae8c1681f0a7ff0d1c59caa91adfa5" exitCode=0 Mar 11 01:19:50 
crc kubenswrapper[4744]: I0311 01:19:50.059292 4744 generic.go:334] "Generic (PLEG): container finished" podID="524fac10-b874-465e-b4aa-221b6c689959" containerID="7325adf5421afd7c3a21ffc84f42f4176496bf0df0f8bbb48940ea537474b52d" exitCode=0 Mar 11 01:19:50 crc kubenswrapper[4744]: I0311 01:19:50.059353 4744 generic.go:334] "Generic (PLEG): container finished" podID="524fac10-b874-465e-b4aa-221b6c689959" containerID="1dc3103a150112ffa802284194bdc5ad25c73127fe6b5ddb013e5409a1028b69" exitCode=0 Mar 11 01:19:50 crc kubenswrapper[4744]: I0311 01:19:50.059361 4744 generic.go:334] "Generic (PLEG): container finished" podID="524fac10-b874-465e-b4aa-221b6c689959" containerID="7d4d68ba9b886d9742d463d33fa1cc87d5cbb6630ca1df77c63ccccdfc56e184" exitCode=0 Mar 11 01:19:50 crc kubenswrapper[4744]: I0311 01:19:50.061128 4744 generic.go:334] "Generic (PLEG): container finished" podID="524fac10-b874-465e-b4aa-221b6c689959" containerID="2a2081a399521c0c43979716a406e0e99df32e512b18971fd342f84cf6c0c784" exitCode=0 Mar 11 01:19:50 crc kubenswrapper[4744]: I0311 01:19:50.073616 4744 generic.go:334] "Generic (PLEG): container finished" podID="524fac10-b874-465e-b4aa-221b6c689959" containerID="2b4eef62494e9560a6468f7258be6cbee2afabc30627c4cd424f6256bf882bd9" exitCode=0 Mar 11 01:19:50 crc kubenswrapper[4744]: I0311 01:19:50.073650 4744 generic.go:334] "Generic (PLEG): container finished" podID="524fac10-b874-465e-b4aa-221b6c689959" containerID="657db772f7e222c18d1d14b7b5c9643c0ec7e79c4adc1d26309d817f501de327" exitCode=0 Mar 11 01:19:50 crc kubenswrapper[4744]: I0311 01:19:50.073658 4744 generic.go:334] "Generic (PLEG): container finished" podID="524fac10-b874-465e-b4aa-221b6c689959" containerID="0520adffd7bb3fd0a12c9f2003a6119d39241234f65942bbd93b143144e91dff" exitCode=0 Mar 11 01:19:50 crc kubenswrapper[4744]: I0311 01:19:50.073664 4744 generic.go:334] "Generic (PLEG): container finished" podID="524fac10-b874-465e-b4aa-221b6c689959" 
containerID="661223f28b0201765a8971851fdcdfc8ce86ba64df01908f6086c416839db484" exitCode=0 Mar 11 01:19:50 crc kubenswrapper[4744]: I0311 01:19:50.120962 4744 generic.go:334] "Generic (PLEG): container finished" podID="fe2603a1-fdea-44d4-8188-f5f93324575c" containerID="50901ae2cb674520b24a492f9dec7c16fa12e3ec15657bcafeef0653de88602d" exitCode=0 Mar 11 01:19:50 crc kubenswrapper[4744]: I0311 01:19:50.139868 4744 generic.go:334] "Generic (PLEG): container finished" podID="a90e479a-2c1d-4a55-9f51-eadbc3c0b333" containerID="7d9f021a66654b3256ddc109f1f5944e4eaa779a0ef70e35b1d3260089fa29e0" exitCode=0 Mar 11 01:19:50 crc kubenswrapper[4744]: I0311 01:19:50.166859 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_97c4f09f-eb97-40a0-b06c-80a5a922c986/ovsdbserver-nb/0.log" Mar 11 01:19:50 crc kubenswrapper[4744]: I0311 01:19:50.166895 4744 generic.go:334] "Generic (PLEG): container finished" podID="97c4f09f-eb97-40a0-b06c-80a5a922c986" containerID="7fd05cd78a9ef3fd32c5308b2e3464c3c04f1b11a481396b59397efe66dc1c90" exitCode=2 Mar 11 01:19:50 crc kubenswrapper[4744]: I0311 01:19:50.166912 4744 generic.go:334] "Generic (PLEG): container finished" podID="97c4f09f-eb97-40a0-b06c-80a5a922c986" containerID="9d84530f46578e0e43683833f9c6f2305fe1b6a881fac74e31a0651183712b14" exitCode=143 Mar 11 01:19:50 crc kubenswrapper[4744]: W0311 01:19:50.207420 4744 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd8b9e96a_e188_4f50_b2a8_95729300c2d3.slice/crio-faf2e6fd8a6d74abf8367960f261bea53496305814e7976ff1fa97c8962adb63 WatchSource:0}: Error finding container faf2e6fd8a6d74abf8367960f261bea53496305814e7976ff1fa97c8962adb63: Status 404 returned error can't find the container with id faf2e6fd8a6d74abf8367960f261bea53496305814e7976ff1fa97c8962adb63 Mar 11 01:19:50 crc kubenswrapper[4744]: E0311 01:19:50.208085 4744 configmap.go:193] Couldn't get configMap 
openstack/rabbitmq-cell1-config-data: configmap "rabbitmq-cell1-config-data" not found Mar 11 01:19:50 crc kubenswrapper[4744]: E0311 01:19:50.208141 4744 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/714c91e5-04c5-4f95-97e3-a3c08664944d-config-data podName:714c91e5-04c5-4f95-97e3-a3c08664944d nodeName:}" failed. No retries permitted until 2026-03-11 01:19:52.208127068 +0000 UTC m=+1549.012344673 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/714c91e5-04c5-4f95-97e3-a3c08664944d-config-data") pod "rabbitmq-cell1-server-0" (UID: "714c91e5-04c5-4f95-97e3-a3c08664944d") : configmap "rabbitmq-cell1-config-data" not found Mar 11 01:19:50 crc kubenswrapper[4744]: E0311 01:19:50.211229 4744 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 11 01:19:50 crc kubenswrapper[4744]: container &Container{Name:mariadb-account-create-update,Image:quay.io/podified-antelope-centos9/openstack-mariadb@sha256:763d1f1e8a1cf877c151c59609960fd2fa29e7e50001f8818122a2d51878befa,Command:[/bin/sh -c #!/bin/bash Mar 11 01:19:50 crc kubenswrapper[4744]: Mar 11 01:19:50 crc kubenswrapper[4744]: MYSQL_REMOTE_HOST="" source /var/lib/operator-scripts/mysql_root_auth.sh Mar 11 01:19:50 crc kubenswrapper[4744]: Mar 11 01:19:50 crc kubenswrapper[4744]: export DatabasePassword=${DatabasePassword:?"Please specify a DatabasePassword variable."} Mar 11 01:19:50 crc kubenswrapper[4744]: Mar 11 01:19:50 crc kubenswrapper[4744]: MYSQL_CMD="mysql -h -u root -P 3306" Mar 11 01:19:50 crc kubenswrapper[4744]: Mar 11 01:19:50 crc kubenswrapper[4744]: if [ -n "barbican" ]; then Mar 11 01:19:50 crc kubenswrapper[4744]: GRANT_DATABASE="barbican" Mar 11 01:19:50 crc kubenswrapper[4744]: else Mar 11 01:19:50 crc kubenswrapper[4744]: GRANT_DATABASE="*" Mar 11 01:19:50 crc kubenswrapper[4744]: fi Mar 11 01:19:50 crc kubenswrapper[4744]: Mar 11 01:19:50 crc kubenswrapper[4744]: # going for maximum 
compatibility here: Mar 11 01:19:50 crc kubenswrapper[4744]: # 1. MySQL 8 no longer allows implicit create user when GRANT is used Mar 11 01:19:50 crc kubenswrapper[4744]: # 2. MariaDB has "CREATE OR REPLACE", but MySQL does not Mar 11 01:19:50 crc kubenswrapper[4744]: # 3. create user with CREATE but then do all password and TLS with ALTER to Mar 11 01:19:50 crc kubenswrapper[4744]: # support updates Mar 11 01:19:50 crc kubenswrapper[4744]: Mar 11 01:19:50 crc kubenswrapper[4744]: $MYSQL_CMD < logger="UnhandledError" Mar 11 01:19:50 crc kubenswrapper[4744]: E0311 01:19:50.217948 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"mariadb-account-create-update\" with CreateContainerConfigError: \"secret \\\"barbican-db-secret\\\" not found\"" pod="openstack/barbican-5453-account-create-update-q47sg" podUID="4ed881ae-cd59-4830-99e9-34ed7708ed83" Mar 11 01:19:50 crc kubenswrapper[4744]: I0311 01:19:50.228116 4744 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/openstack-cell1-galera-0" podUID="382c7504-68d5-4132-adc7-fc2c804e5d3e" containerName="galera" containerID="cri-o://58fdd5d28d27f99ee68a950bea9a0ce4833daad80356b7d09eee6123d0427f28" gracePeriod=30 Mar 11 01:19:50 crc kubenswrapper[4744]: E0311 01:19:50.241302 4744 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 11 01:19:50 crc kubenswrapper[4744]: container &Container{Name:mariadb-account-create-update,Image:quay.io/podified-antelope-centos9/openstack-mariadb@sha256:763d1f1e8a1cf877c151c59609960fd2fa29e7e50001f8818122a2d51878befa,Command:[/bin/sh -c #!/bin/bash Mar 11 01:19:50 crc kubenswrapper[4744]: Mar 11 01:19:50 crc kubenswrapper[4744]: MYSQL_REMOTE_HOST="" source /var/lib/operator-scripts/mysql_root_auth.sh Mar 11 01:19:50 crc kubenswrapper[4744]: Mar 11 01:19:50 crc kubenswrapper[4744]: export DatabasePassword=${DatabasePassword:?"Please specify a DatabasePassword variable."} Mar 11 01:19:50 crc kubenswrapper[4744]: 
Mar 11 01:19:50 crc kubenswrapper[4744]: MYSQL_CMD="mysql -h -u root -P 3306" Mar 11 01:19:50 crc kubenswrapper[4744]: Mar 11 01:19:50 crc kubenswrapper[4744]: if [ -n "nova_api" ]; then Mar 11 01:19:50 crc kubenswrapper[4744]: GRANT_DATABASE="nova_api" Mar 11 01:19:50 crc kubenswrapper[4744]: else Mar 11 01:19:50 crc kubenswrapper[4744]: GRANT_DATABASE="*" Mar 11 01:19:50 crc kubenswrapper[4744]: fi Mar 11 01:19:50 crc kubenswrapper[4744]: Mar 11 01:19:50 crc kubenswrapper[4744]: # going for maximum compatibility here: Mar 11 01:19:50 crc kubenswrapper[4744]: # 1. MySQL 8 no longer allows implicit create user when GRANT is used Mar 11 01:19:50 crc kubenswrapper[4744]: # 2. MariaDB has "CREATE OR REPLACE", but MySQL does not Mar 11 01:19:50 crc kubenswrapper[4744]: # 3. create user with CREATE but then do all password and TLS with ALTER to Mar 11 01:19:50 crc kubenswrapper[4744]: # support updates Mar 11 01:19:50 crc kubenswrapper[4744]: Mar 11 01:19:50 crc kubenswrapper[4744]: $MYSQL_CMD < logger="UnhandledError" Mar 11 01:19:50 crc kubenswrapper[4744]: E0311 01:19:50.243300 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"mariadb-account-create-update\" with CreateContainerConfigError: \"secret \\\"nova-api-db-secret\\\" not found\"" pod="openstack/nova-api-95dc-account-create-update-x25mp" podUID="d8b9e96a-e188-4f50-b2a8-95729300c2d3" Mar 11 01:19:50 crc kubenswrapper[4744]: I0311 01:19:50.251785 4744 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0fbc16eb-59fb-4814-b3b7-944573b75d23" path="/var/lib/kubelet/pods/0fbc16eb-59fb-4814-b3b7-944573b75d23/volumes" Mar 11 01:19:50 crc kubenswrapper[4744]: I0311 01:19:50.252536 4744 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="20d64d92-7649-4a5e-af99-a92a08b47ecf" path="/var/lib/kubelet/pods/20d64d92-7649-4a5e-af99-a92a08b47ecf/volumes" Mar 11 01:19:50 crc kubenswrapper[4744]: I0311 01:19:50.253101 4744 kubelet_volumes.go:163] 
"Cleaned up orphaned pod volumes dir" podUID="3946cf42-7442-4fad-b561-2050f9d26d8f" path="/var/lib/kubelet/pods/3946cf42-7442-4fad-b561-2050f9d26d8f/volumes" Mar 11 01:19:50 crc kubenswrapper[4744]: I0311 01:19:50.272700 4744 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3a3ecb10-c01d-44af-9dc1-18a83c479f37" path="/var/lib/kubelet/pods/3a3ecb10-c01d-44af-9dc1-18a83c479f37/volumes" Mar 11 01:19:50 crc kubenswrapper[4744]: I0311 01:19:50.273524 4744 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4891eca2-6c40-4d64-a625-23217932094a" path="/var/lib/kubelet/pods/4891eca2-6c40-4d64-a625-23217932094a/volumes" Mar 11 01:19:50 crc kubenswrapper[4744]: I0311 01:19:50.274033 4744 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5df37d98-3dbc-4977-add0-525bda3d679b" path="/var/lib/kubelet/pods/5df37d98-3dbc-4977-add0-525bda3d679b/volumes" Mar 11 01:19:50 crc kubenswrapper[4744]: I0311 01:19:50.274692 4744 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="64205e5d-2853-49f7-9928-8362fc9210ea" path="/var/lib/kubelet/pods/64205e5d-2853-49f7-9928-8362fc9210ea/volumes" Mar 11 01:19:50 crc kubenswrapper[4744]: I0311 01:19:50.299219 4744 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="74059d02-5e86-4b55-835e-b9dec89b45d3" path="/var/lib/kubelet/pods/74059d02-5e86-4b55-835e-b9dec89b45d3/volumes" Mar 11 01:19:50 crc kubenswrapper[4744]: I0311 01:19:50.299751 4744 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="86af3abb-ed19-4d12-9eb1-da0f54a41fcc" path="/var/lib/kubelet/pods/86af3abb-ed19-4d12-9eb1-da0f54a41fcc/volumes" Mar 11 01:19:50 crc kubenswrapper[4744]: I0311 01:19:50.300263 4744 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="93f326d0-b9cf-4124-b5e1-ce8e1c50b9c8" path="/var/lib/kubelet/pods/93f326d0-b9cf-4124-b5e1-ce8e1c50b9c8/volumes" Mar 11 01:19:50 crc kubenswrapper[4744]: I0311 01:19:50.318650 4744 kubelet_volumes.go:163] 
"Cleaned up orphaned pod volumes dir" podUID="958fb132-39c9-4989-a322-8691247f7b22" path="/var/lib/kubelet/pods/958fb132-39c9-4989-a322-8691247f7b22/volumes" Mar 11 01:19:50 crc kubenswrapper[4744]: I0311 01:19:50.319500 4744 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b09baca5-8198-404f-8b8d-8f58db34f975" path="/var/lib/kubelet/pods/b09baca5-8198-404f-8b8d-8f58db34f975/volumes" Mar 11 01:19:50 crc kubenswrapper[4744]: I0311 01:19:50.320051 4744 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b2a44cf2-a3b1-4b65-aba3-b0a5d939b303" path="/var/lib/kubelet/pods/b2a44cf2-a3b1-4b65-aba3-b0a5d939b303/volumes" Mar 11 01:19:50 crc kubenswrapper[4744]: I0311 01:19:50.320556 4744 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d138bcef-d88d-4af3-8f41-f4804e583670" path="/var/lib/kubelet/pods/d138bcef-d88d-4af3-8f41-f4804e583670/volumes" Mar 11 01:19:50 crc kubenswrapper[4744]: I0311 01:19:50.345016 4744 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e530fb58-e471-4681-8602-6218f09b0c04" path="/var/lib/kubelet/pods/e530fb58-e471-4681-8602-6218f09b0c04/volumes" Mar 11 01:19:50 crc kubenswrapper[4744]: I0311 01:19:50.345646 4744 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ec0bae40-f9bc-4bc7-81e3-684d3f8a6512" path="/var/lib/kubelet/pods/ec0bae40-f9bc-4bc7-81e3-684d3f8a6512/volumes" Mar 11 01:19:50 crc kubenswrapper[4744]: I0311 01:19:50.346265 4744 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f1b26cd4-ca83-4f3a-ac9d-2dbfd1e6de1b" path="/var/lib/kubelet/pods/f1b26cd4-ca83-4f3a-ac9d-2dbfd1e6de1b/volumes" Mar 11 01:19:50 crc kubenswrapper[4744]: E0311 01:19:50.351962 4744 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 21669e2f41b8409f07be3ecda90e7e2a179ecbcdbcef99ca2a140c11a5fb6bc4 is running failed: container process not found" 
containerID="21669e2f41b8409f07be3ecda90e7e2a179ecbcdbcef99ca2a140c11a5fb6bc4" cmd=["/usr/bin/pidof","ovsdb-server"] Mar 11 01:19:50 crc kubenswrapper[4744]: E0311 01:19:50.352446 4744 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 21669e2f41b8409f07be3ecda90e7e2a179ecbcdbcef99ca2a140c11a5fb6bc4 is running failed: container process not found" containerID="21669e2f41b8409f07be3ecda90e7e2a179ecbcdbcef99ca2a140c11a5fb6bc4" cmd=["/usr/bin/pidof","ovsdb-server"] Mar 11 01:19:50 crc kubenswrapper[4744]: E0311 01:19:50.352798 4744 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 21669e2f41b8409f07be3ecda90e7e2a179ecbcdbcef99ca2a140c11a5fb6bc4 is running failed: container process not found" containerID="21669e2f41b8409f07be3ecda90e7e2a179ecbcdbcef99ca2a140c11a5fb6bc4" cmd=["/usr/bin/pidof","ovsdb-server"] Mar 11 01:19:50 crc kubenswrapper[4744]: E0311 01:19:50.352829 4744 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 21669e2f41b8409f07be3ecda90e7e2a179ecbcdbcef99ca2a140c11a5fb6bc4 is running failed: container process not found" probeType="Readiness" pod="openstack/ovsdbserver-sb-0" podUID="1ca73c89-992f-4b36-9a70-5d67bace9cd2" containerName="ovsdbserver-sb" Mar 11 01:19:50 crc kubenswrapper[4744]: I0311 01:19:50.372668 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-mw9fv" event={"ID":"e910960b-a434-4830-b4be-96571fa4dd54","Type":"ContainerDied","Data":"c5f596f841b9f26617f30aeb9ccbde51b078a88e906c112d087c0809704f0274"} Mar 11 01:19:50 crc kubenswrapper[4744]: I0311 01:19:50.372716 4744 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-db-create-hcnrn"] Mar 11 01:19:50 crc kubenswrapper[4744]: I0311 01:19:50.372739 4744 
kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-db-create-6q4fk"] Mar 11 01:19:50 crc kubenswrapper[4744]: I0311 01:19:50.372759 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"23fcfdba-12bc-4a94-94cd-fb703f2e632c","Type":"ContainerDied","Data":"45fbeb5c4f5cda1cead9ba249d13f0bd4d1407bc16d3e8ba3521b8e0f14fca3f"} Mar 11 01:19:50 crc kubenswrapper[4744]: I0311 01:19:50.372775 4744 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-db-create-6q4fk"] Mar 11 01:19:50 crc kubenswrapper[4744]: I0311 01:19:50.372787 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"524fac10-b874-465e-b4aa-221b6c689959","Type":"ContainerDied","Data":"57f15e76688f17356669b85c46315dd0dffc814d43f7f2c52af90d0784301949"} Mar 11 01:19:50 crc kubenswrapper[4744]: I0311 01:19:50.372810 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"524fac10-b874-465e-b4aa-221b6c689959","Type":"ContainerDied","Data":"683071c11cad26f904a5f7a36fcb33b138ae8c1681f0a7ff0d1c59caa91adfa5"} Mar 11 01:19:50 crc kubenswrapper[4744]: I0311 01:19:50.372821 4744 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-db-create-45rmx"] Mar 11 01:19:50 crc kubenswrapper[4744]: I0311 01:19:50.372842 4744 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-db-create-45rmx"] Mar 11 01:19:50 crc kubenswrapper[4744]: I0311 01:19:50.372854 4744 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-95dc-account-create-update-x25mp"] Mar 11 01:19:50 crc kubenswrapper[4744]: I0311 01:19:50.372866 4744 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-server-0"] Mar 11 01:19:50 crc kubenswrapper[4744]: I0311 01:19:50.372878 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" 
event={"ID":"524fac10-b874-465e-b4aa-221b6c689959","Type":"ContainerDied","Data":"7325adf5421afd7c3a21ffc84f42f4176496bf0df0f8bbb48940ea537474b52d"} Mar 11 01:19:50 crc kubenswrapper[4744]: I0311 01:19:50.372895 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"524fac10-b874-465e-b4aa-221b6c689959","Type":"ContainerDied","Data":"1dc3103a150112ffa802284194bdc5ad25c73127fe6b5ddb013e5409a1028b69"} Mar 11 01:19:50 crc kubenswrapper[4744]: I0311 01:19:50.372905 4744 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-db-create-tld22"] Mar 11 01:19:50 crc kubenswrapper[4744]: I0311 01:19:50.372916 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"524fac10-b874-465e-b4aa-221b6c689959","Type":"ContainerDied","Data":"7d4d68ba9b886d9742d463d33fa1cc87d5cbb6630ca1df77c63ccccdfc56e184"} Mar 11 01:19:50 crc kubenswrapper[4744]: I0311 01:19:50.372926 4744 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-b08a-account-create-update-7dvx8"] Mar 11 01:19:50 crc kubenswrapper[4744]: I0311 01:19:50.372939 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"524fac10-b874-465e-b4aa-221b6c689959","Type":"ContainerDied","Data":"2a2081a399521c0c43979716a406e0e99df32e512b18971fd342f84cf6c0c784"} Mar 11 01:19:50 crc kubenswrapper[4744]: I0311 01:19:50.372956 4744 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/swift-proxy-6bb49fdf95-9g9dz"] Mar 11 01:19:50 crc kubenswrapper[4744]: I0311 01:19:50.372974 4744 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-db-create-tld22"] Mar 11 01:19:50 crc kubenswrapper[4744]: I0311 01:19:50.372985 4744 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-worker-bc7567ff7-gl658"] Mar 11 01:19:50 crc kubenswrapper[4744]: I0311 01:19:50.372999 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/swift-storage-0" event={"ID":"524fac10-b874-465e-b4aa-221b6c689959","Type":"ContainerDied","Data":"2b4eef62494e9560a6468f7258be6cbee2afabc30627c4cd424f6256bf882bd9"} Mar 11 01:19:50 crc kubenswrapper[4744]: I0311 01:19:50.373014 4744 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-api-7c68976bb4-299gh"] Mar 11 01:19:50 crc kubenswrapper[4744]: I0311 01:19:50.373026 4744 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-keystone-listener-7647d7b844-j6gcn"] Mar 11 01:19:50 crc kubenswrapper[4744]: I0311 01:19:50.373038 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"524fac10-b874-465e-b4aa-221b6c689959","Type":"ContainerDied","Data":"657db772f7e222c18d1d14b7b5c9643c0ec7e79c4adc1d26309d817f501de327"} Mar 11 01:19:50 crc kubenswrapper[4744]: I0311 01:19:50.373048 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"524fac10-b874-465e-b4aa-221b6c689959","Type":"ContainerDied","Data":"0520adffd7bb3fd0a12c9f2003a6119d39241234f65942bbd93b143144e91dff"} Mar 11 01:19:50 crc kubenswrapper[4744]: I0311 01:19:50.373057 4744 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Mar 11 01:19:50 crc kubenswrapper[4744]: I0311 01:19:50.373076 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"524fac10-b874-465e-b4aa-221b6c689959","Type":"ContainerDied","Data":"661223f28b0201765a8971851fdcdfc8ce86ba64df01908f6086c416839db484"} Mar 11 01:19:50 crc kubenswrapper[4744]: I0311 01:19:50.373086 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-2mjl7" event={"ID":"fe2603a1-fdea-44d4-8188-f5f93324575c","Type":"ContainerDied","Data":"50901ae2cb674520b24a492f9dec7c16fa12e3ec15657bcafeef0653de88602d"} Mar 11 01:19:50 crc kubenswrapper[4744]: I0311 01:19:50.373098 4744 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openstack/dnsmasq-dns-57db588689-85hjj" event={"ID":"a90e479a-2c1d-4a55-9f51-eadbc3c0b333","Type":"ContainerDied","Data":"7d9f021a66654b3256ddc109f1f5944e4eaa779a0ef70e35b1d3260089fa29e0"} Mar 11 01:19:50 crc kubenswrapper[4744]: I0311 01:19:50.373111 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-rplmm" event={"ID":"7f3aa5cc-eae2-4d60-96cf-6d847a5599ec","Type":"ContainerStarted","Data":"68f27e12070e3e9f4e8ddf9e7621e4bd7c6c7dea2fa37e86cf38abbb967764f7"} Mar 11 01:19:50 crc kubenswrapper[4744]: I0311 01:19:50.373121 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"97c4f09f-eb97-40a0-b06c-80a5a922c986","Type":"ContainerDied","Data":"7fd05cd78a9ef3fd32c5308b2e3464c3c04f1b11a481396b59397efe66dc1c90"} Mar 11 01:19:50 crc kubenswrapper[4744]: I0311 01:19:50.373131 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"97c4f09f-eb97-40a0-b06c-80a5a922c986","Type":"ContainerDied","Data":"9d84530f46578e0e43683833f9c6f2305fe1b6a881fac74e31a0651183712b14"} Mar 11 01:19:50 crc kubenswrapper[4744]: I0311 01:19:50.373361 4744 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-cell1-novncproxy-0" podUID="755410bb-361b-47e2-8a7a-317119eec983" containerName="nova-cell1-novncproxy-novncproxy" containerID="cri-o://fd95bdd6dfcd2af9ddc8f3dd4babec6abaf1feb9cf63d194943025c2b4b217ae" gracePeriod=30 Mar 11 01:19:50 crc kubenswrapper[4744]: I0311 01:19:50.377004 4744 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-proxy-6bb49fdf95-9g9dz" podUID="d60ef156-5767-43c1-bb0b-a8c681a8a6be" containerName="proxy-httpd" containerID="cri-o://915179a7516d9702ba0c2f51a83fdc1a69ce876edb13d10542894dba2430fe71" gracePeriod=30 Mar 11 01:19:50 crc kubenswrapper[4744]: I0311 01:19:50.377454 4744 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openstack/swift-proxy-6bb49fdf95-9g9dz" podUID="d60ef156-5767-43c1-bb0b-a8c681a8a6be" containerName="proxy-server" containerID="cri-o://9907e92d1fac188fb7c44fa1681a08073a561732eeaba685269da05b1fbc9c68" gracePeriod=30 Mar 11 01:19:50 crc kubenswrapper[4744]: I0311 01:19:50.379252 4744 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-keystone-listener-7647d7b844-j6gcn" podUID="8df98dbd-473b-4630-81ab-edd6419feb0d" containerName="barbican-keystone-listener" containerID="cri-o://0a9c241eccc912eedc93ca498ce49b3438428bba9d63396a61de516e8822d3fa" gracePeriod=30 Mar 11 01:19:50 crc kubenswrapper[4744]: I0311 01:19:50.379891 4744 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-keystone-listener-7647d7b844-j6gcn" podUID="8df98dbd-473b-4630-81ab-edd6419feb0d" containerName="barbican-keystone-listener-log" containerID="cri-o://eb1793ab7701df1a2087e9693648ed7fc1c77b3aadb94a5a3e17509e9cd8767b" gracePeriod=30 Mar 11 01:19:50 crc kubenswrapper[4744]: I0311 01:19:50.382964 4744 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-api-7c68976bb4-299gh" podUID="44461324-fa82-4476-a621-c560a3c89e0f" containerName="barbican-api-log" containerID="cri-o://eb9d234dbc5c2b62e281936ae9110a93aa09d1211656f0987d1551eeec652ae8" gracePeriod=30 Mar 11 01:19:50 crc kubenswrapper[4744]: I0311 01:19:50.383595 4744 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-api-7c68976bb4-299gh" podUID="44461324-fa82-4476-a621-c560a3c89e0f" containerName="barbican-api" containerID="cri-o://347f55903065cc04c7bddba453ffd2605f2f89f0b3744727e7f308c9383fd8e9" gracePeriod=30 Mar 11 01:19:50 crc kubenswrapper[4744]: I0311 01:19:50.383803 4744 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-worker-bc7567ff7-gl658" podUID="4fb8af9e-ef1e-45b0-b842-2647fe75510e" containerName="barbican-worker" 
containerID="cri-o://9a1f61db53cc92beec4b876c5f655cbd7b0389b0b62b4ff5bc3cc4d0c15ec01a" gracePeriod=30 Mar 11 01:19:50 crc kubenswrapper[4744]: I0311 01:19:50.390560 4744 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Mar 11 01:19:50 crc kubenswrapper[4744]: I0311 01:19:50.383190 4744 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-worker-bc7567ff7-gl658" podUID="4fb8af9e-ef1e-45b0-b842-2647fe75510e" containerName="barbican-worker-log" containerID="cri-o://0f15de8f909facf4d53d7ef48aa1e9dab867da2d0c8b16f8059f69e45a208436" gracePeriod=30 Mar 11 01:19:50 crc kubenswrapper[4744]: I0311 01:19:50.431842 4744 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-5453-account-create-update-q47sg"] Mar 11 01:19:50 crc kubenswrapper[4744]: I0311 01:19:50.471961 4744 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/rabbitmq-server-0" podUID="fb920bf7-eae5-4f7a-9af7-bde85bfb4ee9" containerName="rabbitmq" containerID="cri-o://f517c72839363553e8b786dec7b9824c28bc5f5e37822956cd45b454cd30e224" gracePeriod=604800 Mar 11 01:19:50 crc kubenswrapper[4744]: I0311 01:19:50.481784 4744 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-57db588689-85hjj" Mar 11 01:19:50 crc kubenswrapper[4744]: I0311 01:19:50.486429 4744 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/rabbitmq-cell1-server-0" podUID="714c91e5-04c5-4f95-97e3-a3c08664944d" containerName="rabbitmq" containerID="cri-o://2d7e9342156b6a7e0b5782247ec6e299cdb60a6da7997fe5146c00f779c615e6" gracePeriod=604800 Mar 11 01:19:50 crc kubenswrapper[4744]: I0311 01:19:50.520991 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a90e479a-2c1d-4a55-9f51-eadbc3c0b333-config\") pod \"a90e479a-2c1d-4a55-9f51-eadbc3c0b333\" (UID: \"a90e479a-2c1d-4a55-9f51-eadbc3c0b333\") " Mar 11 01:19:50 crc kubenswrapper[4744]: I0311 01:19:50.521119 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8qxqx\" (UniqueName: \"kubernetes.io/projected/a90e479a-2c1d-4a55-9f51-eadbc3c0b333-kube-api-access-8qxqx\") pod \"a90e479a-2c1d-4a55-9f51-eadbc3c0b333\" (UID: \"a90e479a-2c1d-4a55-9f51-eadbc3c0b333\") " Mar 11 01:19:50 crc kubenswrapper[4744]: I0311 01:19:50.521168 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/a90e479a-2c1d-4a55-9f51-eadbc3c0b333-ovsdbserver-sb\") pod \"a90e479a-2c1d-4a55-9f51-eadbc3c0b333\" (UID: \"a90e479a-2c1d-4a55-9f51-eadbc3c0b333\") " Mar 11 01:19:50 crc kubenswrapper[4744]: I0311 01:19:50.521225 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/a90e479a-2c1d-4a55-9f51-eadbc3c0b333-ovsdbserver-nb\") pod \"a90e479a-2c1d-4a55-9f51-eadbc3c0b333\" (UID: \"a90e479a-2c1d-4a55-9f51-eadbc3c0b333\") " Mar 11 01:19:50 crc kubenswrapper[4744]: I0311 01:19:50.521313 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" 
(UniqueName: \"kubernetes.io/configmap/a90e479a-2c1d-4a55-9f51-eadbc3c0b333-dns-svc\") pod \"a90e479a-2c1d-4a55-9f51-eadbc3c0b333\" (UID: \"a90e479a-2c1d-4a55-9f51-eadbc3c0b333\") " Mar 11 01:19:50 crc kubenswrapper[4744]: I0311 01:19:50.521356 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/a90e479a-2c1d-4a55-9f51-eadbc3c0b333-dns-swift-storage-0\") pod \"a90e479a-2c1d-4a55-9f51-eadbc3c0b333\" (UID: \"a90e479a-2c1d-4a55-9f51-eadbc3c0b333\") " Mar 11 01:19:50 crc kubenswrapper[4744]: I0311 01:19:50.565741 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a90e479a-2c1d-4a55-9f51-eadbc3c0b333-kube-api-access-8qxqx" (OuterVolumeSpecName: "kube-api-access-8qxqx") pod "a90e479a-2c1d-4a55-9f51-eadbc3c0b333" (UID: "a90e479a-2c1d-4a55-9f51-eadbc3c0b333"). InnerVolumeSpecName "kube-api-access-8qxqx". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 01:19:50 crc kubenswrapper[4744]: I0311 01:19:50.581110 4744 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-conductor-0"] Mar 11 01:19:50 crc kubenswrapper[4744]: I0311 01:19:50.581305 4744 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-cell1-conductor-0" podUID="05c279dc-d915-4688-b2c2-c43ff96ad81c" containerName="nova-cell1-conductor-conductor" containerID="cri-o://9fbe6aaee4ddc8846bfb132b1b5d6d6e18bbaed4c0eefbf91efd571ce47a8e6a" gracePeriod=30 Mar 11 01:19:50 crc kubenswrapper[4744]: E0311 01:19:50.607681 4744 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="9fbe6aaee4ddc8846bfb132b1b5d6d6e18bbaed4c0eefbf91efd571ce47a8e6a" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Mar 11 01:19:50 crc kubenswrapper[4744]: I0311 01:19:50.620584 4744 
kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-49nrf"] Mar 11 01:19:50 crc kubenswrapper[4744]: I0311 01:19:50.630286 4744 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8qxqx\" (UniqueName: \"kubernetes.io/projected/a90e479a-2c1d-4a55-9f51-eadbc3c0b333-kube-api-access-8qxqx\") on node \"crc\" DevicePath \"\"" Mar 11 01:19:50 crc kubenswrapper[4744]: E0311 01:19:50.661481 4744 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="9fbe6aaee4ddc8846bfb132b1b5d6d6e18bbaed4c0eefbf91efd571ce47a8e6a" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Mar 11 01:19:50 crc kubenswrapper[4744]: E0311 01:19:50.670426 4744 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="9fbe6aaee4ddc8846bfb132b1b5d6d6e18bbaed4c0eefbf91efd571ce47a8e6a" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Mar 11 01:19:50 crc kubenswrapper[4744]: E0311 01:19:50.670692 4744 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/nova-cell1-conductor-0" podUID="05c279dc-d915-4688-b2c2-c43ff96ad81c" containerName="nova-cell1-conductor-conductor" Mar 11 01:19:50 crc kubenswrapper[4744]: I0311 01:19:50.673590 4744 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-49nrf"] Mar 11 01:19:50 crc kubenswrapper[4744]: E0311 01:19:50.677133 4744 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 11 01:19:50 crc kubenswrapper[4744]: container 
&Container{Name:mariadb-account-create-update,Image:quay.io/podified-antelope-centos9/openstack-mariadb@sha256:763d1f1e8a1cf877c151c59609960fd2fa29e7e50001f8818122a2d51878befa,Command:[/bin/sh -c #!/bin/bash Mar 11 01:19:50 crc kubenswrapper[4744]: Mar 11 01:19:50 crc kubenswrapper[4744]: MYSQL_REMOTE_HOST="" source /var/lib/operator-scripts/mysql_root_auth.sh Mar 11 01:19:50 crc kubenswrapper[4744]: Mar 11 01:19:50 crc kubenswrapper[4744]: export DatabasePassword=${DatabasePassword:?"Please specify a DatabasePassword variable."} Mar 11 01:19:50 crc kubenswrapper[4744]: Mar 11 01:19:50 crc kubenswrapper[4744]: MYSQL_CMD="mysql -h -u root -P 3306" Mar 11 01:19:50 crc kubenswrapper[4744]: Mar 11 01:19:50 crc kubenswrapper[4744]: if [ -n "neutron" ]; then Mar 11 01:19:50 crc kubenswrapper[4744]: GRANT_DATABASE="neutron" Mar 11 01:19:50 crc kubenswrapper[4744]: else Mar 11 01:19:50 crc kubenswrapper[4744]: GRANT_DATABASE="*" Mar 11 01:19:50 crc kubenswrapper[4744]: fi Mar 11 01:19:50 crc kubenswrapper[4744]: Mar 11 01:19:50 crc kubenswrapper[4744]: # going for maximum compatibility here: Mar 11 01:19:50 crc kubenswrapper[4744]: # 1. MySQL 8 no longer allows implicit create user when GRANT is used Mar 11 01:19:50 crc kubenswrapper[4744]: # 2. MariaDB has "CREATE OR REPLACE", but MySQL does not Mar 11 01:19:50 crc kubenswrapper[4744]: # 3. 
create user with CREATE but then do all password and TLS with ALTER to Mar 11 01:19:50 crc kubenswrapper[4744]: # support updates Mar 11 01:19:50 crc kubenswrapper[4744]: Mar 11 01:19:50 crc kubenswrapper[4744]: $MYSQL_CMD < logger="UnhandledError" Mar 11 01:19:50 crc kubenswrapper[4744]: E0311 01:19:50.678205 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"mariadb-account-create-update\" with CreateContainerConfigError: \"secret \\\"neutron-db-secret\\\" not found\"" pod="openstack/neutron-b08a-account-create-update-7dvx8" podUID="20141525-666b-4609-b145-d38380a5d7c7" Mar 11 01:19:50 crc kubenswrapper[4744]: I0311 01:19:50.684251 4744 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-knwz7"] Mar 11 01:19:50 crc kubenswrapper[4744]: I0311 01:19:50.714674 4744 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-conductor-0"] Mar 11 01:19:50 crc kubenswrapper[4744]: I0311 01:19:50.714876 4744 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-cell0-conductor-0" podUID="c8caed76-baba-4ad3-b95a-e428132f2021" containerName="nova-cell0-conductor-conductor" containerID="cri-o://a7377528de2e5a9a7450056f3a281f5a1113ad08b0bc9f7c476cd009942572ee" gracePeriod=30 Mar 11 01:19:50 crc kubenswrapper[4744]: E0311 01:19:50.731911 4744 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="58fdd5d28d27f99ee68a950bea9a0ce4833daad80356b7d09eee6123d0427f28" cmd=["/bin/bash","/var/lib/operator-scripts/mysql_probe.sh","readiness"] Mar 11 01:19:50 crc kubenswrapper[4744]: I0311 01:19:50.734039 4744 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstackclient" Mar 11 01:19:50 crc kubenswrapper[4744]: I0311 01:19:50.734437 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_97c4f09f-eb97-40a0-b06c-80a5a922c986/ovsdbserver-nb/0.log" Mar 11 01:19:50 crc kubenswrapper[4744]: I0311 01:19:50.734477 4744 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-nb-0" Mar 11 01:19:50 crc kubenswrapper[4744]: E0311 01:19:50.735554 4744 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="58fdd5d28d27f99ee68a950bea9a0ce4833daad80356b7d09eee6123d0427f28" cmd=["/bin/bash","/var/lib/operator-scripts/mysql_probe.sh","readiness"] Mar 11 01:19:50 crc kubenswrapper[4744]: E0311 01:19:50.740198 4744 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="58fdd5d28d27f99ee68a950bea9a0ce4833daad80356b7d09eee6123d0427f28" cmd=["/bin/bash","/var/lib/operator-scripts/mysql_probe.sh","readiness"] Mar 11 01:19:50 crc kubenswrapper[4744]: E0311 01:19:50.740240 4744 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/openstack-cell1-galera-0" podUID="382c7504-68d5-4132-adc7-fc2c804e5d3e" containerName="galera" Mar 11 01:19:50 crc kubenswrapper[4744]: I0311 01:19:50.762576 4744 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-knwz7"] Mar 11 01:19:50 crc kubenswrapper[4744]: I0311 01:19:50.772689 4744 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Mar 11 01:19:50 crc 
kubenswrapper[4744]: I0311 01:19:50.773021 4744 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-scheduler-0" podUID="b62ac51a-a222-4e7b-b465-9e71c3d34b1f" containerName="nova-scheduler-scheduler" containerID="cri-o://fd3a94b5310ccf7a9c7dd7207db3d75ffedf02fcd045db30b5914808a12e1cc6" gracePeriod=30 Mar 11 01:19:50 crc kubenswrapper[4744]: I0311 01:19:50.795579 4744 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-b08a-account-create-update-7dvx8"] Mar 11 01:19:50 crc kubenswrapper[4744]: I0311 01:19:50.803730 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a90e479a-2c1d-4a55-9f51-eadbc3c0b333-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "a90e479a-2c1d-4a55-9f51-eadbc3c0b333" (UID: "a90e479a-2c1d-4a55-9f51-eadbc3c0b333"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 11 01:19:50 crc kubenswrapper[4744]: I0311 01:19:50.818380 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a90e479a-2c1d-4a55-9f51-eadbc3c0b333-config" (OuterVolumeSpecName: "config") pod "a90e479a-2c1d-4a55-9f51-eadbc3c0b333" (UID: "a90e479a-2c1d-4a55-9f51-eadbc3c0b333"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 11 01:19:50 crc kubenswrapper[4744]: I0311 01:19:50.824271 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_1ca73c89-992f-4b36-9a70-5d67bace9cd2/ovsdbserver-sb/0.log" Mar 11 01:19:50 crc kubenswrapper[4744]: I0311 01:19:50.824344 4744 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovsdbserver-sb-0" Mar 11 01:19:50 crc kubenswrapper[4744]: I0311 01:19:50.824886 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a90e479a-2c1d-4a55-9f51-eadbc3c0b333-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "a90e479a-2c1d-4a55-9f51-eadbc3c0b333" (UID: "a90e479a-2c1d-4a55-9f51-eadbc3c0b333"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 11 01:19:50 crc kubenswrapper[4744]: I0311 01:19:50.824947 4744 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-2mjl7" Mar 11 01:19:50 crc kubenswrapper[4744]: I0311 01:19:50.832350 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/97c4f09f-eb97-40a0-b06c-80a5a922c986-ovsdbserver-nb-tls-certs\") pod \"97c4f09f-eb97-40a0-b06c-80a5a922c986\" (UID: \"97c4f09f-eb97-40a0-b06c-80a5a922c986\") " Mar 11 01:19:50 crc kubenswrapper[4744]: I0311 01:19:50.832392 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/97c4f09f-eb97-40a0-b06c-80a5a922c986-metrics-certs-tls-certs\") pod \"97c4f09f-eb97-40a0-b06c-80a5a922c986\" (UID: \"97c4f09f-eb97-40a0-b06c-80a5a922c986\") " Mar 11 01:19:50 crc kubenswrapper[4744]: I0311 01:19:50.832423 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovndbcluster-nb-etc-ovn\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"97c4f09f-eb97-40a0-b06c-80a5a922c986\" (UID: \"97c4f09f-eb97-40a0-b06c-80a5a922c986\") " Mar 11 01:19:50 crc kubenswrapper[4744]: I0311 01:19:50.832469 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/configmap/97c4f09f-eb97-40a0-b06c-80a5a922c986-scripts\") pod \"97c4f09f-eb97-40a0-b06c-80a5a922c986\" (UID: \"97c4f09f-eb97-40a0-b06c-80a5a922c986\") " Mar 11 01:19:50 crc kubenswrapper[4744]: I0311 01:19:50.832502 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vth9r\" (UniqueName: \"kubernetes.io/projected/35ec702b-4aa6-4fa6-a770-ec3caf762d5f-kube-api-access-vth9r\") pod \"35ec702b-4aa6-4fa6-a770-ec3caf762d5f\" (UID: \"35ec702b-4aa6-4fa6-a770-ec3caf762d5f\") " Mar 11 01:19:50 crc kubenswrapper[4744]: I0311 01:19:50.832558 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/35ec702b-4aa6-4fa6-a770-ec3caf762d5f-openstack-config-secret\") pod \"35ec702b-4aa6-4fa6-a770-ec3caf762d5f\" (UID: \"35ec702b-4aa6-4fa6-a770-ec3caf762d5f\") " Mar 11 01:19:50 crc kubenswrapper[4744]: I0311 01:19:50.832577 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kgd8d\" (UniqueName: \"kubernetes.io/projected/97c4f09f-eb97-40a0-b06c-80a5a922c986-kube-api-access-kgd8d\") pod \"97c4f09f-eb97-40a0-b06c-80a5a922c986\" (UID: \"97c4f09f-eb97-40a0-b06c-80a5a922c986\") " Mar 11 01:19:50 crc kubenswrapper[4744]: I0311 01:19:50.832594 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/35ec702b-4aa6-4fa6-a770-ec3caf762d5f-combined-ca-bundle\") pod \"35ec702b-4aa6-4fa6-a770-ec3caf762d5f\" (UID: \"35ec702b-4aa6-4fa6-a770-ec3caf762d5f\") " Mar 11 01:19:50 crc kubenswrapper[4744]: I0311 01:19:50.832631 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/97c4f09f-eb97-40a0-b06c-80a5a922c986-combined-ca-bundle\") pod \"97c4f09f-eb97-40a0-b06c-80a5a922c986\" (UID: \"97c4f09f-eb97-40a0-b06c-80a5a922c986\") " Mar 11 
01:19:50 crc kubenswrapper[4744]: I0311 01:19:50.832656 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/35ec702b-4aa6-4fa6-a770-ec3caf762d5f-openstack-config\") pod \"35ec702b-4aa6-4fa6-a770-ec3caf762d5f\" (UID: \"35ec702b-4aa6-4fa6-a770-ec3caf762d5f\") " Mar 11 01:19:50 crc kubenswrapper[4744]: I0311 01:19:50.832675 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/97c4f09f-eb97-40a0-b06c-80a5a922c986-ovsdb-rundir\") pod \"97c4f09f-eb97-40a0-b06c-80a5a922c986\" (UID: \"97c4f09f-eb97-40a0-b06c-80a5a922c986\") " Mar 11 01:19:50 crc kubenswrapper[4744]: I0311 01:19:50.832713 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/97c4f09f-eb97-40a0-b06c-80a5a922c986-config\") pod \"97c4f09f-eb97-40a0-b06c-80a5a922c986\" (UID: \"97c4f09f-eb97-40a0-b06c-80a5a922c986\") " Mar 11 01:19:50 crc kubenswrapper[4744]: I0311 01:19:50.833133 4744 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a90e479a-2c1d-4a55-9f51-eadbc3c0b333-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 11 01:19:50 crc kubenswrapper[4744]: I0311 01:19:50.833150 4744 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a90e479a-2c1d-4a55-9f51-eadbc3c0b333-config\") on node \"crc\" DevicePath \"\"" Mar 11 01:19:50 crc kubenswrapper[4744]: I0311 01:19:50.833160 4744 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/a90e479a-2c1d-4a55-9f51-eadbc3c0b333-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 11 01:19:50 crc kubenswrapper[4744]: I0311 01:19:50.836835 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/configmap/97c4f09f-eb97-40a0-b06c-80a5a922c986-config" (OuterVolumeSpecName: "config") pod "97c4f09f-eb97-40a0-b06c-80a5a922c986" (UID: "97c4f09f-eb97-40a0-b06c-80a5a922c986"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 11 01:19:50 crc kubenswrapper[4744]: I0311 01:19:50.842151 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/97c4f09f-eb97-40a0-b06c-80a5a922c986-ovsdb-rundir" (OuterVolumeSpecName: "ovsdb-rundir") pod "97c4f09f-eb97-40a0-b06c-80a5a922c986" (UID: "97c4f09f-eb97-40a0-b06c-80a5a922c986"). InnerVolumeSpecName "ovsdb-rundir". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 11 01:19:50 crc kubenswrapper[4744]: I0311 01:19:50.844538 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/97c4f09f-eb97-40a0-b06c-80a5a922c986-scripts" (OuterVolumeSpecName: "scripts") pod "97c4f09f-eb97-40a0-b06c-80a5a922c986" (UID: "97c4f09f-eb97-40a0-b06c-80a5a922c986"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 11 01:19:50 crc kubenswrapper[4744]: I0311 01:19:50.916198 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage02-crc" (OuterVolumeSpecName: "ovndbcluster-nb-etc-ovn") pod "97c4f09f-eb97-40a0-b06c-80a5a922c986" (UID: "97c4f09f-eb97-40a0-b06c-80a5a922c986"). InnerVolumeSpecName "local-storage02-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Mar 11 01:19:50 crc kubenswrapper[4744]: I0311 01:19:50.923778 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/97c4f09f-eb97-40a0-b06c-80a5a922c986-kube-api-access-kgd8d" (OuterVolumeSpecName: "kube-api-access-kgd8d") pod "97c4f09f-eb97-40a0-b06c-80a5a922c986" (UID: "97c4f09f-eb97-40a0-b06c-80a5a922c986"). InnerVolumeSpecName "kube-api-access-kgd8d". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 01:19:50 crc kubenswrapper[4744]: I0311 01:19:50.926007 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/35ec702b-4aa6-4fa6-a770-ec3caf762d5f-kube-api-access-vth9r" (OuterVolumeSpecName: "kube-api-access-vth9r") pod "35ec702b-4aa6-4fa6-a770-ec3caf762d5f" (UID: "35ec702b-4aa6-4fa6-a770-ec3caf762d5f"). InnerVolumeSpecName "kube-api-access-vth9r". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 01:19:50 crc kubenswrapper[4744]: I0311 01:19:50.935183 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/fe2603a1-fdea-44d4-8188-f5f93324575c-var-run\") pod \"fe2603a1-fdea-44d4-8188-f5f93324575c\" (UID: \"fe2603a1-fdea-44d4-8188-f5f93324575c\") " Mar 11 01:19:50 crc kubenswrapper[4744]: I0311 01:19:50.935290 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mkcdj\" (UniqueName: \"kubernetes.io/projected/fe2603a1-fdea-44d4-8188-f5f93324575c-kube-api-access-mkcdj\") pod \"fe2603a1-fdea-44d4-8188-f5f93324575c\" (UID: \"fe2603a1-fdea-44d4-8188-f5f93324575c\") " Mar 11 01:19:50 crc kubenswrapper[4744]: I0311 01:19:50.935314 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/1ca73c89-992f-4b36-9a70-5d67bace9cd2-scripts\") pod \"1ca73c89-992f-4b36-9a70-5d67bace9cd2\" (UID: \"1ca73c89-992f-4b36-9a70-5d67bace9cd2\") " Mar 11 01:19:50 crc kubenswrapper[4744]: I0311 01:19:50.935340 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/fe2603a1-fdea-44d4-8188-f5f93324575c-var-run-ovn\") pod \"fe2603a1-fdea-44d4-8188-f5f93324575c\" (UID: \"fe2603a1-fdea-44d4-8188-f5f93324575c\") " Mar 11 01:19:50 crc kubenswrapper[4744]: I0311 01:19:50.935383 4744 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1ca73c89-992f-4b36-9a70-5d67bace9cd2-config\") pod \"1ca73c89-992f-4b36-9a70-5d67bace9cd2\" (UID: \"1ca73c89-992f-4b36-9a70-5d67bace9cd2\") " Mar 11 01:19:50 crc kubenswrapper[4744]: I0311 01:19:50.935393 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/fe2603a1-fdea-44d4-8188-f5f93324575c-var-run" (OuterVolumeSpecName: "var-run") pod "fe2603a1-fdea-44d4-8188-f5f93324575c" (UID: "fe2603a1-fdea-44d4-8188-f5f93324575c"). InnerVolumeSpecName "var-run". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 11 01:19:50 crc kubenswrapper[4744]: I0311 01:19:50.935422 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/fe2603a1-fdea-44d4-8188-f5f93324575c-scripts\") pod \"fe2603a1-fdea-44d4-8188-f5f93324575c\" (UID: \"fe2603a1-fdea-44d4-8188-f5f93324575c\") " Mar 11 01:19:50 crc kubenswrapper[4744]: I0311 01:19:50.935457 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6gj26\" (UniqueName: \"kubernetes.io/projected/1ca73c89-992f-4b36-9a70-5d67bace9cd2-kube-api-access-6gj26\") pod \"1ca73c89-992f-4b36-9a70-5d67bace9cd2\" (UID: \"1ca73c89-992f-4b36-9a70-5d67bace9cd2\") " Mar 11 01:19:50 crc kubenswrapper[4744]: I0311 01:19:50.935476 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fe2603a1-fdea-44d4-8188-f5f93324575c-combined-ca-bundle\") pod \"fe2603a1-fdea-44d4-8188-f5f93324575c\" (UID: \"fe2603a1-fdea-44d4-8188-f5f93324575c\") " Mar 11 01:19:50 crc kubenswrapper[4744]: I0311 01:19:50.935500 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-controller-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/fe2603a1-fdea-44d4-8188-f5f93324575c-ovn-controller-tls-certs\") pod \"fe2603a1-fdea-44d4-8188-f5f93324575c\" (UID: \"fe2603a1-fdea-44d4-8188-f5f93324575c\") " Mar 11 01:19:50 crc kubenswrapper[4744]: I0311 01:19:50.935538 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/fe2603a1-fdea-44d4-8188-f5f93324575c-var-log-ovn\") pod \"fe2603a1-fdea-44d4-8188-f5f93324575c\" (UID: \"fe2603a1-fdea-44d4-8188-f5f93324575c\") " Mar 11 01:19:50 crc kubenswrapper[4744]: I0311 01:19:50.935596 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/1ca73c89-992f-4b36-9a70-5d67bace9cd2-ovsdb-rundir\") pod \"1ca73c89-992f-4b36-9a70-5d67bace9cd2\" (UID: \"1ca73c89-992f-4b36-9a70-5d67bace9cd2\") " Mar 11 01:19:50 crc kubenswrapper[4744]: I0311 01:19:50.935614 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovndbcluster-sb-etc-ovn\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"1ca73c89-992f-4b36-9a70-5d67bace9cd2\" (UID: \"1ca73c89-992f-4b36-9a70-5d67bace9cd2\") " Mar 11 01:19:50 crc kubenswrapper[4744]: I0311 01:19:50.935640 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1ca73c89-992f-4b36-9a70-5d67bace9cd2-combined-ca-bundle\") pod \"1ca73c89-992f-4b36-9a70-5d67bace9cd2\" (UID: \"1ca73c89-992f-4b36-9a70-5d67bace9cd2\") " Mar 11 01:19:50 crc kubenswrapper[4744]: I0311 01:19:50.935665 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/1ca73c89-992f-4b36-9a70-5d67bace9cd2-ovsdbserver-sb-tls-certs\") pod \"1ca73c89-992f-4b36-9a70-5d67bace9cd2\" (UID: \"1ca73c89-992f-4b36-9a70-5d67bace9cd2\") " Mar 11 01:19:50 crc kubenswrapper[4744]: I0311 
01:19:50.935746 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/1ca73c89-992f-4b36-9a70-5d67bace9cd2-metrics-certs-tls-certs\") pod \"1ca73c89-992f-4b36-9a70-5d67bace9cd2\" (UID: \"1ca73c89-992f-4b36-9a70-5d67bace9cd2\") " Mar 11 01:19:50 crc kubenswrapper[4744]: I0311 01:19:50.935840 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1ca73c89-992f-4b36-9a70-5d67bace9cd2-scripts" (OuterVolumeSpecName: "scripts") pod "1ca73c89-992f-4b36-9a70-5d67bace9cd2" (UID: "1ca73c89-992f-4b36-9a70-5d67bace9cd2"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 11 01:19:50 crc kubenswrapper[4744]: I0311 01:19:50.936166 4744 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/97c4f09f-eb97-40a0-b06c-80a5a922c986-scripts\") on node \"crc\" DevicePath \"\"" Mar 11 01:19:50 crc kubenswrapper[4744]: I0311 01:19:50.936185 4744 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vth9r\" (UniqueName: \"kubernetes.io/projected/35ec702b-4aa6-4fa6-a770-ec3caf762d5f-kube-api-access-vth9r\") on node \"crc\" DevicePath \"\"" Mar 11 01:19:50 crc kubenswrapper[4744]: I0311 01:19:50.936197 4744 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kgd8d\" (UniqueName: \"kubernetes.io/projected/97c4f09f-eb97-40a0-b06c-80a5a922c986-kube-api-access-kgd8d\") on node \"crc\" DevicePath \"\"" Mar 11 01:19:50 crc kubenswrapper[4744]: I0311 01:19:50.936205 4744 reconciler_common.go:293] "Volume detached for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/97c4f09f-eb97-40a0-b06c-80a5a922c986-ovsdb-rundir\") on node \"crc\" DevicePath \"\"" Mar 11 01:19:50 crc kubenswrapper[4744]: I0311 01:19:50.936213 4744 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/97c4f09f-eb97-40a0-b06c-80a5a922c986-config\") on node \"crc\" DevicePath \"\"" Mar 11 01:19:50 crc kubenswrapper[4744]: I0311 01:19:50.936221 4744 reconciler_common.go:293] "Volume detached for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/fe2603a1-fdea-44d4-8188-f5f93324575c-var-run\") on node \"crc\" DevicePath \"\"" Mar 11 01:19:50 crc kubenswrapper[4744]: I0311 01:19:50.936229 4744 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/1ca73c89-992f-4b36-9a70-5d67bace9cd2-scripts\") on node \"crc\" DevicePath \"\"" Mar 11 01:19:50 crc kubenswrapper[4744]: I0311 01:19:50.936246 4744 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") on node \"crc\" " Mar 11 01:19:50 crc kubenswrapper[4744]: I0311 01:19:50.938046 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fe2603a1-fdea-44d4-8188-f5f93324575c-scripts" (OuterVolumeSpecName: "scripts") pod "fe2603a1-fdea-44d4-8188-f5f93324575c" (UID: "fe2603a1-fdea-44d4-8188-f5f93324575c"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 11 01:19:50 crc kubenswrapper[4744]: I0311 01:19:50.938164 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/fe2603a1-fdea-44d4-8188-f5f93324575c-var-run-ovn" (OuterVolumeSpecName: "var-run-ovn") pod "fe2603a1-fdea-44d4-8188-f5f93324575c" (UID: "fe2603a1-fdea-44d4-8188-f5f93324575c"). InnerVolumeSpecName "var-run-ovn". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 11 01:19:50 crc kubenswrapper[4744]: I0311 01:19:50.939293 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1ca73c89-992f-4b36-9a70-5d67bace9cd2-config" (OuterVolumeSpecName: "config") pod "1ca73c89-992f-4b36-9a70-5d67bace9cd2" (UID: "1ca73c89-992f-4b36-9a70-5d67bace9cd2"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 11 01:19:50 crc kubenswrapper[4744]: I0311 01:19:50.940624 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/fe2603a1-fdea-44d4-8188-f5f93324575c-var-log-ovn" (OuterVolumeSpecName: "var-log-ovn") pod "fe2603a1-fdea-44d4-8188-f5f93324575c" (UID: "fe2603a1-fdea-44d4-8188-f5f93324575c"). InnerVolumeSpecName "var-log-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 11 01:19:50 crc kubenswrapper[4744]: I0311 01:19:50.941023 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1ca73c89-992f-4b36-9a70-5d67bace9cd2-ovsdb-rundir" (OuterVolumeSpecName: "ovsdb-rundir") pod "1ca73c89-992f-4b36-9a70-5d67bace9cd2" (UID: "1ca73c89-992f-4b36-9a70-5d67bace9cd2"). InnerVolumeSpecName "ovsdb-rundir". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 11 01:19:50 crc kubenswrapper[4744]: I0311 01:19:50.972769 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1ca73c89-992f-4b36-9a70-5d67bace9cd2-kube-api-access-6gj26" (OuterVolumeSpecName: "kube-api-access-6gj26") pod "1ca73c89-992f-4b36-9a70-5d67bace9cd2" (UID: "1ca73c89-992f-4b36-9a70-5d67bace9cd2"). InnerVolumeSpecName "kube-api-access-6gj26". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 01:19:50 crc kubenswrapper[4744]: I0311 01:19:50.987620 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fe2603a1-fdea-44d4-8188-f5f93324575c-kube-api-access-mkcdj" (OuterVolumeSpecName: "kube-api-access-mkcdj") pod "fe2603a1-fdea-44d4-8188-f5f93324575c" (UID: "fe2603a1-fdea-44d4-8188-f5f93324575c"). InnerVolumeSpecName "kube-api-access-mkcdj". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 01:19:50 crc kubenswrapper[4744]: I0311 01:19:50.996793 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage07-crc" (OuterVolumeSpecName: "ovndbcluster-sb-etc-ovn") pod "1ca73c89-992f-4b36-9a70-5d67bace9cd2" (UID: "1ca73c89-992f-4b36-9a70-5d67bace9cd2"). InnerVolumeSpecName "local-storage07-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Mar 11 01:19:51 crc kubenswrapper[4744]: I0311 01:19:51.037991 4744 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1ca73c89-992f-4b36-9a70-5d67bace9cd2-config\") on node \"crc\" DevicePath \"\"" Mar 11 01:19:51 crc kubenswrapper[4744]: I0311 01:19:51.038166 4744 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/fe2603a1-fdea-44d4-8188-f5f93324575c-scripts\") on node \"crc\" DevicePath \"\"" Mar 11 01:19:51 crc kubenswrapper[4744]: I0311 01:19:51.038244 4744 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6gj26\" (UniqueName: \"kubernetes.io/projected/1ca73c89-992f-4b36-9a70-5d67bace9cd2-kube-api-access-6gj26\") on node \"crc\" DevicePath \"\"" Mar 11 01:19:51 crc kubenswrapper[4744]: I0311 01:19:51.038303 4744 reconciler_common.go:293] "Volume detached for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/fe2603a1-fdea-44d4-8188-f5f93324575c-var-log-ovn\") on node \"crc\" DevicePath \"\"" 
Mar 11 01:19:51 crc kubenswrapper[4744]: I0311 01:19:51.038354 4744 reconciler_common.go:293] "Volume detached for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/1ca73c89-992f-4b36-9a70-5d67bace9cd2-ovsdb-rundir\") on node \"crc\" DevicePath \"\"" Mar 11 01:19:51 crc kubenswrapper[4744]: I0311 01:19:51.038414 4744 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") on node \"crc\" " Mar 11 01:19:51 crc kubenswrapper[4744]: I0311 01:19:51.038467 4744 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mkcdj\" (UniqueName: \"kubernetes.io/projected/fe2603a1-fdea-44d4-8188-f5f93324575c-kube-api-access-mkcdj\") on node \"crc\" DevicePath \"\"" Mar 11 01:19:51 crc kubenswrapper[4744]: I0311 01:19:51.038538 4744 reconciler_common.go:293] "Volume detached for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/fe2603a1-fdea-44d4-8188-f5f93324575c-var-run-ovn\") on node \"crc\" DevicePath \"\"" Mar 11 01:19:51 crc kubenswrapper[4744]: I0311 01:19:51.064174 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a90e479a-2c1d-4a55-9f51-eadbc3c0b333-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "a90e479a-2c1d-4a55-9f51-eadbc3c0b333" (UID: "a90e479a-2c1d-4a55-9f51-eadbc3c0b333"). InnerVolumeSpecName "dns-swift-storage-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 11 01:19:51 crc kubenswrapper[4744]: I0311 01:19:51.070582 4744 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage02-crc" (UniqueName: "kubernetes.io/local-volume/local-storage02-crc") on node "crc" Mar 11 01:19:51 crc kubenswrapper[4744]: I0311 01:19:51.085760 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/97c4f09f-eb97-40a0-b06c-80a5a922c986-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "97c4f09f-eb97-40a0-b06c-80a5a922c986" (UID: "97c4f09f-eb97-40a0-b06c-80a5a922c986"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 01:19:51 crc kubenswrapper[4744]: I0311 01:19:51.097271 4744 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage07-crc" (UniqueName: "kubernetes.io/local-volume/local-storage07-crc") on node "crc" Mar 11 01:19:51 crc kubenswrapper[4744]: I0311 01:19:51.124906 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/35ec702b-4aa6-4fa6-a770-ec3caf762d5f-openstack-config-secret" (OuterVolumeSpecName: "openstack-config-secret") pod "35ec702b-4aa6-4fa6-a770-ec3caf762d5f" (UID: "35ec702b-4aa6-4fa6-a770-ec3caf762d5f"). InnerVolumeSpecName "openstack-config-secret". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 01:19:51 crc kubenswrapper[4744]: I0311 01:19:51.140058 4744 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/a90e479a-2c1d-4a55-9f51-eadbc3c0b333-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Mar 11 01:19:51 crc kubenswrapper[4744]: I0311 01:19:51.140239 4744 reconciler_common.go:293] "Volume detached for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") on node \"crc\" DevicePath \"\"" Mar 11 01:19:51 crc kubenswrapper[4744]: I0311 01:19:51.140312 4744 reconciler_common.go:293] "Volume detached for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/35ec702b-4aa6-4fa6-a770-ec3caf762d5f-openstack-config-secret\") on node \"crc\" DevicePath \"\"" Mar 11 01:19:51 crc kubenswrapper[4744]: I0311 01:19:51.140492 4744 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/97c4f09f-eb97-40a0-b06c-80a5a922c986-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 11 01:19:51 crc kubenswrapper[4744]: I0311 01:19:51.140569 4744 reconciler_common.go:293] "Volume detached for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") on node \"crc\" DevicePath \"\"" Mar 11 01:19:51 crc kubenswrapper[4744]: I0311 01:19:51.149360 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/35ec702b-4aa6-4fa6-a770-ec3caf762d5f-openstack-config" (OuterVolumeSpecName: "openstack-config") pod "35ec702b-4aa6-4fa6-a770-ec3caf762d5f" (UID: "35ec702b-4aa6-4fa6-a770-ec3caf762d5f"). InnerVolumeSpecName "openstack-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 11 01:19:51 crc kubenswrapper[4744]: I0311 01:19:51.178314 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a90e479a-2c1d-4a55-9f51-eadbc3c0b333-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "a90e479a-2c1d-4a55-9f51-eadbc3c0b333" (UID: "a90e479a-2c1d-4a55-9f51-eadbc3c0b333"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 11 01:19:51 crc kubenswrapper[4744]: I0311 01:19:51.202170 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/35ec702b-4aa6-4fa6-a770-ec3caf762d5f-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "35ec702b-4aa6-4fa6-a770-ec3caf762d5f" (UID: "35ec702b-4aa6-4fa6-a770-ec3caf762d5f"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 01:19:51 crc kubenswrapper[4744]: I0311 01:19:51.217691 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1ca73c89-992f-4b36-9a70-5d67bace9cd2-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "1ca73c89-992f-4b36-9a70-5d67bace9cd2" (UID: "1ca73c89-992f-4b36-9a70-5d67bace9cd2"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 01:19:51 crc kubenswrapper[4744]: I0311 01:19:51.229691 4744 generic.go:334] "Generic (PLEG): container finished" podID="8df98dbd-473b-4630-81ab-edd6419feb0d" containerID="eb1793ab7701df1a2087e9693648ed7fc1c77b3aadb94a5a3e17509e9cd8767b" exitCode=143 Mar 11 01:19:51 crc kubenswrapper[4744]: I0311 01:19:51.230170 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-7647d7b844-j6gcn" event={"ID":"8df98dbd-473b-4630-81ab-edd6419feb0d","Type":"ContainerDied","Data":"eb1793ab7701df1a2087e9693648ed7fc1c77b3aadb94a5a3e17509e9cd8767b"} Mar 11 01:19:51 crc kubenswrapper[4744]: I0311 01:19:51.242180 4744 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/a90e479a-2c1d-4a55-9f51-eadbc3c0b333-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 11 01:19:51 crc kubenswrapper[4744]: I0311 01:19:51.242206 4744 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/35ec702b-4aa6-4fa6-a770-ec3caf762d5f-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 11 01:19:51 crc kubenswrapper[4744]: I0311 01:19:51.242215 4744 reconciler_common.go:293] "Volume detached for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/35ec702b-4aa6-4fa6-a770-ec3caf762d5f-openstack-config\") on node \"crc\" DevicePath \"\"" Mar 11 01:19:51 crc kubenswrapper[4744]: I0311 01:19:51.242223 4744 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1ca73c89-992f-4b36-9a70-5d67bace9cd2-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 11 01:19:51 crc kubenswrapper[4744]: I0311 01:19:51.247335 4744 generic.go:334] "Generic (PLEG): container finished" podID="a336a32d-e322-4261-8a29-ce0f30435d83" containerID="8a970e26c9b1c228eccbd5179f834fcbf2574e2fb70d018dcd7783d99b6ddc44" exitCode=0 Mar 11 01:19:51 crc 
kubenswrapper[4744]: I0311 01:19:51.247394 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"a336a32d-e322-4261-8a29-ce0f30435d83","Type":"ContainerDied","Data":"8a970e26c9b1c228eccbd5179f834fcbf2574e2fb70d018dcd7783d99b6ddc44"} Mar 11 01:19:51 crc kubenswrapper[4744]: I0311 01:19:51.247653 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fe2603a1-fdea-44d4-8188-f5f93324575c-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "fe2603a1-fdea-44d4-8188-f5f93324575c" (UID: "fe2603a1-fdea-44d4-8188-f5f93324575c"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 01:19:51 crc kubenswrapper[4744]: I0311 01:19:51.250933 4744 generic.go:334] "Generic (PLEG): container finished" podID="a6b56953-c881-474c-a21f-4a39102d89ab" containerID="8485a4680d7e9bd7479321ad2c60fbdd63c7941a99f12f5159677293b38ca6db" exitCode=143 Mar 11 01:19:51 crc kubenswrapper[4744]: I0311 01:19:51.250998 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-7f78d57d44-gt8df" event={"ID":"a6b56953-c881-474c-a21f-4a39102d89ab","Type":"ContainerDied","Data":"8485a4680d7e9bd7479321ad2c60fbdd63c7941a99f12f5159677293b38ca6db"} Mar 11 01:19:51 crc kubenswrapper[4744]: I0311 01:19:51.252962 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-5453-account-create-update-q47sg" event={"ID":"4ed881ae-cd59-4830-99e9-34ed7708ed83","Type":"ContainerStarted","Data":"cada4d6a90a4c141db95e4f972f5ac5697df9a11b2616f59441dbfb7a05f99de"} Mar 11 01:19:51 crc kubenswrapper[4744]: I0311 01:19:51.258411 4744 generic.go:334] "Generic (PLEG): container finished" podID="d60ef156-5767-43c1-bb0b-a8c681a8a6be" containerID="9907e92d1fac188fb7c44fa1681a08073a561732eeaba685269da05b1fbc9c68" exitCode=0 Mar 11 01:19:51 crc kubenswrapper[4744]: I0311 01:19:51.258434 4744 generic.go:334] "Generic (PLEG): container 
finished" podID="d60ef156-5767-43c1-bb0b-a8c681a8a6be" containerID="915179a7516d9702ba0c2f51a83fdc1a69ce876edb13d10542894dba2430fe71" exitCode=0 Mar 11 01:19:51 crc kubenswrapper[4744]: I0311 01:19:51.258479 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-6bb49fdf95-9g9dz" event={"ID":"d60ef156-5767-43c1-bb0b-a8c681a8a6be","Type":"ContainerDied","Data":"9907e92d1fac188fb7c44fa1681a08073a561732eeaba685269da05b1fbc9c68"} Mar 11 01:19:51 crc kubenswrapper[4744]: I0311 01:19:51.258501 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-6bb49fdf95-9g9dz" event={"ID":"d60ef156-5767-43c1-bb0b-a8c681a8a6be","Type":"ContainerDied","Data":"915179a7516d9702ba0c2f51a83fdc1a69ce876edb13d10542894dba2430fe71"} Mar 11 01:19:51 crc kubenswrapper[4744]: I0311 01:19:51.261367 4744 generic.go:334] "Generic (PLEG): container finished" podID="f11c8953-d88f-4d37-8366-b0b61606fa8a" containerID="9aea2a3292e66ea11857925fc0390ee0f713bcd4e833fe65a989d82320b82cad" exitCode=143 Mar 11 01:19:51 crc kubenswrapper[4744]: I0311 01:19:51.261415 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"f11c8953-d88f-4d37-8366-b0b61606fa8a","Type":"ContainerDied","Data":"9aea2a3292e66ea11857925fc0390ee0f713bcd4e833fe65a989d82320b82cad"} Mar 11 01:19:51 crc kubenswrapper[4744]: I0311 01:19:51.262443 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-b08a-account-create-update-7dvx8" event={"ID":"20141525-666b-4609-b145-d38380a5d7c7","Type":"ContainerStarted","Data":"4a98a9ea3d5f0b7ee573e02e403c987318c602b610629a060fec7b36bf3cd7b8"} Mar 11 01:19:51 crc kubenswrapper[4744]: I0311 01:19:51.267392 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/97c4f09f-eb97-40a0-b06c-80a5a922c986-ovsdbserver-nb-tls-certs" (OuterVolumeSpecName: "ovsdbserver-nb-tls-certs") pod "97c4f09f-eb97-40a0-b06c-80a5a922c986" (UID: 
"97c4f09f-eb97-40a0-b06c-80a5a922c986"). InnerVolumeSpecName "ovsdbserver-nb-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 01:19:51 crc kubenswrapper[4744]: I0311 01:19:51.279366 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1ca73c89-992f-4b36-9a70-5d67bace9cd2-metrics-certs-tls-certs" (OuterVolumeSpecName: "metrics-certs-tls-certs") pod "1ca73c89-992f-4b36-9a70-5d67bace9cd2" (UID: "1ca73c89-992f-4b36-9a70-5d67bace9cd2"). InnerVolumeSpecName "metrics-certs-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 01:19:51 crc kubenswrapper[4744]: I0311 01:19:51.300739 4744 generic.go:334] "Generic (PLEG): container finished" podID="44461324-fa82-4476-a621-c560a3c89e0f" containerID="eb9d234dbc5c2b62e281936ae9110a93aa09d1211656f0987d1551eeec652ae8" exitCode=143 Mar 11 01:19:51 crc kubenswrapper[4744]: I0311 01:19:51.300815 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-7c68976bb4-299gh" event={"ID":"44461324-fa82-4476-a621-c560a3c89e0f","Type":"ContainerDied","Data":"eb9d234dbc5c2b62e281936ae9110a93aa09d1211656f0987d1551eeec652ae8"} Mar 11 01:19:51 crc kubenswrapper[4744]: I0311 01:19:51.307537 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-2mjl7" event={"ID":"fe2603a1-fdea-44d4-8188-f5f93324575c","Type":"ContainerDied","Data":"a39b7ab94f8b38ae1a9f49a5c072232b53543de80f123e4ecc83052d77c4e49c"} Mar 11 01:19:51 crc kubenswrapper[4744]: I0311 01:19:51.307587 4744 scope.go:117] "RemoveContainer" containerID="50901ae2cb674520b24a492f9dec7c16fa12e3ec15657bcafeef0653de88602d" Mar 11 01:19:51 crc kubenswrapper[4744]: I0311 01:19:51.307700 4744 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-2mjl7" Mar 11 01:19:51 crc kubenswrapper[4744]: I0311 01:19:51.322608 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/97c4f09f-eb97-40a0-b06c-80a5a922c986-metrics-certs-tls-certs" (OuterVolumeSpecName: "metrics-certs-tls-certs") pod "97c4f09f-eb97-40a0-b06c-80a5a922c986" (UID: "97c4f09f-eb97-40a0-b06c-80a5a922c986"). InnerVolumeSpecName "metrics-certs-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 01:19:51 crc kubenswrapper[4744]: I0311 01:19:51.335915 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fe2603a1-fdea-44d4-8188-f5f93324575c-ovn-controller-tls-certs" (OuterVolumeSpecName: "ovn-controller-tls-certs") pod "fe2603a1-fdea-44d4-8188-f5f93324575c" (UID: "fe2603a1-fdea-44d4-8188-f5f93324575c"). InnerVolumeSpecName "ovn-controller-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 01:19:51 crc kubenswrapper[4744]: I0311 01:19:51.342673 4744 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-57db588689-85hjj" Mar 11 01:19:51 crc kubenswrapper[4744]: I0311 01:19:51.342777 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57db588689-85hjj" event={"ID":"a90e479a-2c1d-4a55-9f51-eadbc3c0b333","Type":"ContainerDied","Data":"86516c5d8e12a17e965bdf690e72c90c2e9beacbec6d679fd93897701dc40234"} Mar 11 01:19:51 crc kubenswrapper[4744]: I0311 01:19:51.345769 4744 reconciler_common.go:293] "Volume detached for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/1ca73c89-992f-4b36-9a70-5d67bace9cd2-metrics-certs-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 11 01:19:51 crc kubenswrapper[4744]: I0311 01:19:51.345837 4744 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/97c4f09f-eb97-40a0-b06c-80a5a922c986-ovsdbserver-nb-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 11 01:19:51 crc kubenswrapper[4744]: I0311 01:19:51.345846 4744 reconciler_common.go:293] "Volume detached for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/97c4f09f-eb97-40a0-b06c-80a5a922c986-metrics-certs-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 11 01:19:51 crc kubenswrapper[4744]: I0311 01:19:51.345856 4744 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fe2603a1-fdea-44d4-8188-f5f93324575c-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 11 01:19:51 crc kubenswrapper[4744]: I0311 01:19:51.345866 4744 reconciler_common.go:293] "Volume detached for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/fe2603a1-fdea-44d4-8188-f5f93324575c-ovn-controller-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 11 01:19:51 crc kubenswrapper[4744]: I0311 01:19:51.352707 4744 generic.go:334] "Generic (PLEG): container finished" podID="4fb8af9e-ef1e-45b0-b842-2647fe75510e" 
containerID="0f15de8f909facf4d53d7ef48aa1e9dab867da2d0c8b16f8059f69e45a208436" exitCode=143 Mar 11 01:19:51 crc kubenswrapper[4744]: I0311 01:19:51.352784 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-bc7567ff7-gl658" event={"ID":"4fb8af9e-ef1e-45b0-b842-2647fe75510e","Type":"ContainerDied","Data":"0f15de8f909facf4d53d7ef48aa1e9dab867da2d0c8b16f8059f69e45a208436"} Mar 11 01:19:51 crc kubenswrapper[4744]: I0311 01:19:51.365683 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1ca73c89-992f-4b36-9a70-5d67bace9cd2-ovsdbserver-sb-tls-certs" (OuterVolumeSpecName: "ovsdbserver-sb-tls-certs") pod "1ca73c89-992f-4b36-9a70-5d67bace9cd2" (UID: "1ca73c89-992f-4b36-9a70-5d67bace9cd2"). InnerVolumeSpecName "ovsdbserver-sb-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 01:19:51 crc kubenswrapper[4744]: I0311 01:19:51.370018 4744 generic.go:334] "Generic (PLEG): container finished" podID="cb4eb051-94b3-42d1-87ff-669ad8251b4f" containerID="90d7864d9f44afb54c45c8f83816565799a968b37e50716fa7d940d17e944838" exitCode=0 Mar 11 01:19:51 crc kubenswrapper[4744]: I0311 01:19:51.370078 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-5959cf6645-bcjjf" event={"ID":"cb4eb051-94b3-42d1-87ff-669ad8251b4f","Type":"ContainerDied","Data":"90d7864d9f44afb54c45c8f83816565799a968b37e50716fa7d940d17e944838"} Mar 11 01:19:51 crc kubenswrapper[4744]: I0311 01:19:51.388467 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_97c4f09f-eb97-40a0-b06c-80a5a922c986/ovsdbserver-nb/0.log" Mar 11 01:19:51 crc kubenswrapper[4744]: I0311 01:19:51.388560 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"97c4f09f-eb97-40a0-b06c-80a5a922c986","Type":"ContainerDied","Data":"c315daedfc27461657005479c8ea62043ffb53a3d4ad9aea00f2979b4a09e719"} Mar 11 01:19:51 crc kubenswrapper[4744]: I0311 
01:19:51.388639 4744 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-nb-0" Mar 11 01:19:51 crc kubenswrapper[4744]: I0311 01:19:51.399962 4744 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient" Mar 11 01:19:51 crc kubenswrapper[4744]: I0311 01:19:51.416450 4744 generic.go:334] "Generic (PLEG): container finished" podID="524fac10-b874-465e-b4aa-221b6c689959" containerID="5bbb97d3f04c59bd0734d58c134b140d749bfb617a14047f4d2be2a03bcf9bd5" exitCode=0 Mar 11 01:19:51 crc kubenswrapper[4744]: I0311 01:19:51.416478 4744 generic.go:334] "Generic (PLEG): container finished" podID="524fac10-b874-465e-b4aa-221b6c689959" containerID="98bb403c28de1a3a422f752fd836eeaf91ab8123e3a1415917dfa6a935d3dad7" exitCode=0 Mar 11 01:19:51 crc kubenswrapper[4744]: I0311 01:19:51.416487 4744 generic.go:334] "Generic (PLEG): container finished" podID="524fac10-b874-465e-b4aa-221b6c689959" containerID="d0f959b12a9512cb3f5a0776eb000de08da42f09e11825b7a95d1b85cdeb9533" exitCode=0 Mar 11 01:19:51 crc kubenswrapper[4744]: I0311 01:19:51.416494 4744 generic.go:334] "Generic (PLEG): container finished" podID="524fac10-b874-465e-b4aa-221b6c689959" containerID="b78b35fad65a6104f3b70ff15a556ecab18834b6ed00582485d71f455ffb4854" exitCode=0 Mar 11 01:19:51 crc kubenswrapper[4744]: I0311 01:19:51.416542 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"524fac10-b874-465e-b4aa-221b6c689959","Type":"ContainerDied","Data":"5bbb97d3f04c59bd0734d58c134b140d749bfb617a14047f4d2be2a03bcf9bd5"} Mar 11 01:19:51 crc kubenswrapper[4744]: I0311 01:19:51.416568 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"524fac10-b874-465e-b4aa-221b6c689959","Type":"ContainerDied","Data":"98bb403c28de1a3a422f752fd836eeaf91ab8123e3a1415917dfa6a935d3dad7"} Mar 11 01:19:51 crc kubenswrapper[4744]: I0311 01:19:51.416579 4744 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"524fac10-b874-465e-b4aa-221b6c689959","Type":"ContainerDied","Data":"d0f959b12a9512cb3f5a0776eb000de08da42f09e11825b7a95d1b85cdeb9533"} Mar 11 01:19:51 crc kubenswrapper[4744]: I0311 01:19:51.416587 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"524fac10-b874-465e-b4aa-221b6c689959","Type":"ContainerDied","Data":"b78b35fad65a6104f3b70ff15a556ecab18834b6ed00582485d71f455ffb4854"} Mar 11 01:19:51 crc kubenswrapper[4744]: I0311 01:19:51.423689 4744 generic.go:334] "Generic (PLEG): container finished" podID="755410bb-361b-47e2-8a7a-317119eec983" containerID="fd95bdd6dfcd2af9ddc8f3dd4babec6abaf1feb9cf63d194943025c2b4b217ae" exitCode=0 Mar 11 01:19:51 crc kubenswrapper[4744]: I0311 01:19:51.423758 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"755410bb-361b-47e2-8a7a-317119eec983","Type":"ContainerDied","Data":"fd95bdd6dfcd2af9ddc8f3dd4babec6abaf1feb9cf63d194943025c2b4b217ae"} Mar 11 01:19:51 crc kubenswrapper[4744]: I0311 01:19:51.430721 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-rplmm" event={"ID":"7f3aa5cc-eae2-4d60-96cf-6d847a5599ec","Type":"ContainerStarted","Data":"551326012aadac601469c35ef336268bf20b74279ceb1776275c88a0a8b34b6c"} Mar 11 01:19:51 crc kubenswrapper[4744]: I0311 01:19:51.431216 4744 scope.go:117] "RemoveContainer" containerID="551326012aadac601469c35ef336268bf20b74279ceb1776275c88a0a8b34b6c" Mar 11 01:19:51 crc kubenswrapper[4744]: I0311 01:19:51.436938 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-95dc-account-create-update-x25mp" event={"ID":"d8b9e96a-e188-4f50-b2a8-95729300c2d3","Type":"ContainerStarted","Data":"faf2e6fd8a6d74abf8367960f261bea53496305814e7976ff1fa97c8962adb63"} Mar 11 01:19:51 crc kubenswrapper[4744]: E0311 01:19:51.439824 4744 
kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 11 01:19:51 crc kubenswrapper[4744]: container &Container{Name:mariadb-account-create-update,Image:quay.io/podified-antelope-centos9/openstack-mariadb@sha256:763d1f1e8a1cf877c151c59609960fd2fa29e7e50001f8818122a2d51878befa,Command:[/bin/sh -c #!/bin/bash Mar 11 01:19:51 crc kubenswrapper[4744]: Mar 11 01:19:51 crc kubenswrapper[4744]: MYSQL_REMOTE_HOST="" source /var/lib/operator-scripts/mysql_root_auth.sh Mar 11 01:19:51 crc kubenswrapper[4744]: Mar 11 01:19:51 crc kubenswrapper[4744]: export DatabasePassword=${DatabasePassword:?"Please specify a DatabasePassword variable."} Mar 11 01:19:51 crc kubenswrapper[4744]: Mar 11 01:19:51 crc kubenswrapper[4744]: MYSQL_CMD="mysql -h -u root -P 3306" Mar 11 01:19:51 crc kubenswrapper[4744]: Mar 11 01:19:51 crc kubenswrapper[4744]: if [ -n "nova_api" ]; then Mar 11 01:19:51 crc kubenswrapper[4744]: GRANT_DATABASE="nova_api" Mar 11 01:19:51 crc kubenswrapper[4744]: else Mar 11 01:19:51 crc kubenswrapper[4744]: GRANT_DATABASE="*" Mar 11 01:19:51 crc kubenswrapper[4744]: fi Mar 11 01:19:51 crc kubenswrapper[4744]: Mar 11 01:19:51 crc kubenswrapper[4744]: # going for maximum compatibility here: Mar 11 01:19:51 crc kubenswrapper[4744]: # 1. MySQL 8 no longer allows implicit create user when GRANT is used Mar 11 01:19:51 crc kubenswrapper[4744]: # 2. MariaDB has "CREATE OR REPLACE", but MySQL does not Mar 11 01:19:51 crc kubenswrapper[4744]: # 3. 
create user with CREATE but then do all password and TLS with ALTER to Mar 11 01:19:51 crc kubenswrapper[4744]: # support updates Mar 11 01:19:51 crc kubenswrapper[4744]: Mar 11 01:19:51 crc kubenswrapper[4744]: $MYSQL_CMD < logger="UnhandledError" Mar 11 01:19:51 crc kubenswrapper[4744]: E0311 01:19:51.441028 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"mariadb-account-create-update\" with CreateContainerConfigError: \"secret \\\"nova-api-db-secret\\\" not found\"" pod="openstack/nova-api-95dc-account-create-update-x25mp" podUID="d8b9e96a-e188-4f50-b2a8-95729300c2d3" Mar 11 01:19:51 crc kubenswrapper[4744]: I0311 01:19:51.449972 4744 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/1ca73c89-992f-4b36-9a70-5d67bace9cd2-ovsdbserver-sb-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 11 01:19:51 crc kubenswrapper[4744]: I0311 01:19:51.457293 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_1ca73c89-992f-4b36-9a70-5d67bace9cd2/ovsdbserver-sb/0.log" Mar 11 01:19:51 crc kubenswrapper[4744]: I0311 01:19:51.457564 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"1ca73c89-992f-4b36-9a70-5d67bace9cd2","Type":"ContainerDied","Data":"fbf657f064b1024b88a33df69bdbf53ecf6487764aaf86d97c7474510786a863"} Mar 11 01:19:51 crc kubenswrapper[4744]: I0311 01:19:51.457813 4744 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovsdbserver-sb-0" Mar 11 01:19:51 crc kubenswrapper[4744]: I0311 01:19:51.479212 4744 generic.go:334] "Generic (PLEG): container finished" podID="ae4f1d0a-32b7-4ec5-9f5b-a43589af4119" containerID="dda45a2a6beb4d51c68422dc1406aaa80a32f7291668639a011f39ec42b2fc96" exitCode=143 Mar 11 01:19:51 crc kubenswrapper[4744]: I0311 01:19:51.479286 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"ae4f1d0a-32b7-4ec5-9f5b-a43589af4119","Type":"ContainerDied","Data":"dda45a2a6beb4d51c68422dc1406aaa80a32f7291668639a011f39ec42b2fc96"} Mar 11 01:19:51 crc kubenswrapper[4744]: I0311 01:19:51.509924 4744 scope.go:117] "RemoveContainer" containerID="7d9f021a66654b3256ddc109f1f5944e4eaa779a0ef70e35b1d3260089fa29e0" Mar 11 01:19:51 crc kubenswrapper[4744]: I0311 01:19:51.550008 4744 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/swift-proxy-6bb49fdf95-9g9dz" Mar 11 01:19:51 crc kubenswrapper[4744]: I0311 01:19:51.568072 4744 scope.go:117] "RemoveContainer" containerID="2209bfaa2ae01c45df64e0b09d4046efe85b860f64ac4e58bdaf01ff673e8e61" Mar 11 01:19:51 crc kubenswrapper[4744]: I0311 01:19:51.602684 4744 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovsdbserver-nb-0"] Mar 11 01:19:51 crc kubenswrapper[4744]: I0311 01:19:51.610206 4744 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovsdbserver-nb-0"] Mar 11 01:19:51 crc kubenswrapper[4744]: I0311 01:19:51.624243 4744 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-57db588689-85hjj"] Mar 11 01:19:51 crc kubenswrapper[4744]: I0311 01:19:51.658860 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mgrcg\" (UniqueName: \"kubernetes.io/projected/d60ef156-5767-43c1-bb0b-a8c681a8a6be-kube-api-access-mgrcg\") pod \"d60ef156-5767-43c1-bb0b-a8c681a8a6be\" (UID: 
\"d60ef156-5767-43c1-bb0b-a8c681a8a6be\") " Mar 11 01:19:51 crc kubenswrapper[4744]: I0311 01:19:51.658940 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d60ef156-5767-43c1-bb0b-a8c681a8a6be-run-httpd\") pod \"d60ef156-5767-43c1-bb0b-a8c681a8a6be\" (UID: \"d60ef156-5767-43c1-bb0b-a8c681a8a6be\") " Mar 11 01:19:51 crc kubenswrapper[4744]: I0311 01:19:51.659079 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/d60ef156-5767-43c1-bb0b-a8c681a8a6be-public-tls-certs\") pod \"d60ef156-5767-43c1-bb0b-a8c681a8a6be\" (UID: \"d60ef156-5767-43c1-bb0b-a8c681a8a6be\") " Mar 11 01:19:51 crc kubenswrapper[4744]: I0311 01:19:51.659132 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/d60ef156-5767-43c1-bb0b-a8c681a8a6be-etc-swift\") pod \"d60ef156-5767-43c1-bb0b-a8c681a8a6be\" (UID: \"d60ef156-5767-43c1-bb0b-a8c681a8a6be\") " Mar 11 01:19:51 crc kubenswrapper[4744]: I0311 01:19:51.659197 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d60ef156-5767-43c1-bb0b-a8c681a8a6be-log-httpd\") pod \"d60ef156-5767-43c1-bb0b-a8c681a8a6be\" (UID: \"d60ef156-5767-43c1-bb0b-a8c681a8a6be\") " Mar 11 01:19:51 crc kubenswrapper[4744]: I0311 01:19:51.659219 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d60ef156-5767-43c1-bb0b-a8c681a8a6be-config-data\") pod \"d60ef156-5767-43c1-bb0b-a8c681a8a6be\" (UID: \"d60ef156-5767-43c1-bb0b-a8c681a8a6be\") " Mar 11 01:19:51 crc kubenswrapper[4744]: I0311 01:19:51.659284 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/d60ef156-5767-43c1-bb0b-a8c681a8a6be-internal-tls-certs\") pod \"d60ef156-5767-43c1-bb0b-a8c681a8a6be\" (UID: \"d60ef156-5767-43c1-bb0b-a8c681a8a6be\") " Mar 11 01:19:51 crc kubenswrapper[4744]: I0311 01:19:51.659336 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d60ef156-5767-43c1-bb0b-a8c681a8a6be-combined-ca-bundle\") pod \"d60ef156-5767-43c1-bb0b-a8c681a8a6be\" (UID: \"d60ef156-5767-43c1-bb0b-a8c681a8a6be\") " Mar 11 01:19:51 crc kubenswrapper[4744]: I0311 01:19:51.669927 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d60ef156-5767-43c1-bb0b-a8c681a8a6be-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "d60ef156-5767-43c1-bb0b-a8c681a8a6be" (UID: "d60ef156-5767-43c1-bb0b-a8c681a8a6be"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 11 01:19:51 crc kubenswrapper[4744]: I0311 01:19:51.674139 4744 scope.go:117] "RemoveContainer" containerID="7fd05cd78a9ef3fd32c5308b2e3464c3c04f1b11a481396b59397efe66dc1c90" Mar 11 01:19:51 crc kubenswrapper[4744]: I0311 01:19:51.677793 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d60ef156-5767-43c1-bb0b-a8c681a8a6be-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "d60ef156-5767-43c1-bb0b-a8c681a8a6be" (UID: "d60ef156-5767-43c1-bb0b-a8c681a8a6be"). InnerVolumeSpecName "log-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 11 01:19:51 crc kubenswrapper[4744]: I0311 01:19:51.680632 4744 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-57db588689-85hjj"] Mar 11 01:19:51 crc kubenswrapper[4744]: I0311 01:19:51.682036 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d60ef156-5767-43c1-bb0b-a8c681a8a6be-kube-api-access-mgrcg" (OuterVolumeSpecName: "kube-api-access-mgrcg") pod "d60ef156-5767-43c1-bb0b-a8c681a8a6be" (UID: "d60ef156-5767-43c1-bb0b-a8c681a8a6be"). InnerVolumeSpecName "kube-api-access-mgrcg". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 01:19:51 crc kubenswrapper[4744]: I0311 01:19:51.684687 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d60ef156-5767-43c1-bb0b-a8c681a8a6be-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "d60ef156-5767-43c1-bb0b-a8c681a8a6be" (UID: "d60ef156-5767-43c1-bb0b-a8c681a8a6be"). InnerVolumeSpecName "etc-swift". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 01:19:51 crc kubenswrapper[4744]: I0311 01:19:51.708865 4744 scope.go:117] "RemoveContainer" containerID="9d84530f46578e0e43683833f9c6f2305fe1b6a881fac74e31a0651183712b14" Mar 11 01:19:51 crc kubenswrapper[4744]: E0311 01:19:51.712344 4744 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7f3aa5cc_eae2_4d60_96cf_6d847a5599ec.slice/crio-conmon-551326012aadac601469c35ef336268bf20b74279ceb1776275c88a0a8b34b6c.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod382c7504_68d5_4132_adc7_fc2c804e5d3e.slice/crio-conmon-58fdd5d28d27f99ee68a950bea9a0ce4833daad80356b7d09eee6123d0427f28.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod382c7504_68d5_4132_adc7_fc2c804e5d3e.slice/crio-58fdd5d28d27f99ee68a950bea9a0ce4833daad80356b7d09eee6123d0427f28.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7f3aa5cc_eae2_4d60_96cf_6d847a5599ec.slice/crio-551326012aadac601469c35ef336268bf20b74279ceb1776275c88a0a8b34b6c.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1ca73c89_992f_4b36_9a70_5d67bace9cd2.slice/crio-fbf657f064b1024b88a33df69bdbf53ecf6487764aaf86d97c7474510786a863\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod35ec702b_4aa6_4fa6_a770_ec3caf762d5f.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podfe2603a1_fdea_44d4_8188_f5f93324575c.slice\": RecentStats: unable to find data in memory cache], 
[\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda90e479a_2c1d_4a55_9f51_eadbc3c0b333.slice/crio-86516c5d8e12a17e965bdf690e72c90c2e9beacbec6d679fd93897701dc40234\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod97c4f09f_eb97_40a0_b06c_80a5a922c986.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1ca73c89_992f_4b36_9a70_5d67bace9cd2.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod97c4f09f_eb97_40a0_b06c_80a5a922c986.slice/crio-c315daedfc27461657005479c8ea62043ffb53a3d4ad9aea00f2979b4a09e719\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod35ec702b_4aa6_4fa6_a770_ec3caf762d5f.slice/crio-254b5d84d26672423311ca593c8fbaea7b9fe875fc7a778ac56027267934c2ec\": RecentStats: unable to find data in memory cache]" Mar 11 01:19:51 crc kubenswrapper[4744]: I0311 01:19:51.727756 4744 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovsdbserver-sb-0"] Mar 11 01:19:51 crc kubenswrapper[4744]: I0311 01:19:51.742888 4744 scope.go:117] "RemoveContainer" containerID="fa97b985b022afa1a47a024d8d1193a6165049de48146766011f761fe39d7dce" Mar 11 01:19:51 crc kubenswrapper[4744]: I0311 01:19:51.749038 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d60ef156-5767-43c1-bb0b-a8c681a8a6be-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "d60ef156-5767-43c1-bb0b-a8c681a8a6be" (UID: "d60ef156-5767-43c1-bb0b-a8c681a8a6be"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 01:19:51 crc kubenswrapper[4744]: I0311 01:19:51.751445 4744 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovsdbserver-sb-0"] Mar 11 01:19:51 crc kubenswrapper[4744]: I0311 01:19:51.756777 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d60ef156-5767-43c1-bb0b-a8c681a8a6be-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "d60ef156-5767-43c1-bb0b-a8c681a8a6be" (UID: "d60ef156-5767-43c1-bb0b-a8c681a8a6be"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 01:19:51 crc kubenswrapper[4744]: I0311 01:19:51.762786 4744 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d60ef156-5767-43c1-bb0b-a8c681a8a6be-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 11 01:19:51 crc kubenswrapper[4744]: I0311 01:19:51.764341 4744 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mgrcg\" (UniqueName: \"kubernetes.io/projected/d60ef156-5767-43c1-bb0b-a8c681a8a6be-kube-api-access-mgrcg\") on node \"crc\" DevicePath \"\"" Mar 11 01:19:51 crc kubenswrapper[4744]: I0311 01:19:51.764442 4744 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d60ef156-5767-43c1-bb0b-a8c681a8a6be-run-httpd\") on node \"crc\" DevicePath \"\"" Mar 11 01:19:51 crc kubenswrapper[4744]: I0311 01:19:51.764526 4744 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/d60ef156-5767-43c1-bb0b-a8c681a8a6be-etc-swift\") on node \"crc\" DevicePath \"\"" Mar 11 01:19:51 crc kubenswrapper[4744]: I0311 01:19:51.764583 4744 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d60ef156-5767-43c1-bb0b-a8c681a8a6be-log-httpd\") on node \"crc\" DevicePath \"\"" Mar 11 01:19:51 crc 
kubenswrapper[4744]: E0311 01:19:51.764858 4744 configmap.go:193] Couldn't get configMap openstack/rabbitmq-config-data: configmap "rabbitmq-config-data" not found Mar 11 01:19:51 crc kubenswrapper[4744]: E0311 01:19:51.764914 4744 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/fb920bf7-eae5-4f7a-9af7-bde85bfb4ee9-config-data podName:fb920bf7-eae5-4f7a-9af7-bde85bfb4ee9 nodeName:}" failed. No retries permitted until 2026-03-11 01:19:55.764898997 +0000 UTC m=+1552.569116602 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/fb920bf7-eae5-4f7a-9af7-bde85bfb4ee9-config-data") pod "rabbitmq-server-0" (UID: "fb920bf7-eae5-4f7a-9af7-bde85bfb4ee9") : configmap "rabbitmq-config-data" not found Mar 11 01:19:51 crc kubenswrapper[4744]: I0311 01:19:51.784062 4744 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-2mjl7"] Mar 11 01:19:51 crc kubenswrapper[4744]: I0311 01:19:51.787819 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d60ef156-5767-43c1-bb0b-a8c681a8a6be-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "d60ef156-5767-43c1-bb0b-a8c681a8a6be" (UID: "d60ef156-5767-43c1-bb0b-a8c681a8a6be"). InnerVolumeSpecName "internal-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 01:19:51 crc kubenswrapper[4744]: I0311 01:19:51.796327 4744 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovn-controller-2mjl7"] Mar 11 01:19:51 crc kubenswrapper[4744]: I0311 01:19:51.802689 4744 scope.go:117] "RemoveContainer" containerID="223e84951f31263619072ddbe2e0f719244b2be18159a6ed4fd2baa8d2a4de50" Mar 11 01:19:51 crc kubenswrapper[4744]: I0311 01:19:51.817900 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d60ef156-5767-43c1-bb0b-a8c681a8a6be-config-data" (OuterVolumeSpecName: "config-data") pod "d60ef156-5767-43c1-bb0b-a8c681a8a6be" (UID: "d60ef156-5767-43c1-bb0b-a8c681a8a6be"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 01:19:51 crc kubenswrapper[4744]: I0311 01:19:51.879761 4744 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/d60ef156-5767-43c1-bb0b-a8c681a8a6be-public-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 11 01:19:51 crc kubenswrapper[4744]: I0311 01:19:51.879872 4744 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d60ef156-5767-43c1-bb0b-a8c681a8a6be-config-data\") on node \"crc\" DevicePath \"\"" Mar 11 01:19:51 crc kubenswrapper[4744]: I0311 01:19:51.879925 4744 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/d60ef156-5767-43c1-bb0b-a8c681a8a6be-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 11 01:19:51 crc kubenswrapper[4744]: I0311 01:19:51.898689 4744 scope.go:117] "RemoveContainer" containerID="21669e2f41b8409f07be3ecda90e7e2a179ecbcdbcef99ca2a140c11a5fb6bc4" Mar 11 01:19:51 crc kubenswrapper[4744]: I0311 01:19:51.970181 4744 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Mar 11 01:19:51 crc kubenswrapper[4744]: E0311 01:19:51.979627 4744 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="acfad524609d9f297101d39eb6bbc89088706371a3d119c57c6f3e38150b4144" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Mar 11 01:19:51 crc kubenswrapper[4744]: E0311 01:19:51.979737 4744 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 376ac00d3a5176903cb096caf4aa6fb34bc54b52bfdfcffda0f5db681e50ec05 is running failed: container process not found" containerID="376ac00d3a5176903cb096caf4aa6fb34bc54b52bfdfcffda0f5db681e50ec05" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Mar 11 01:19:51 crc kubenswrapper[4744]: E0311 01:19:51.987106 4744 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="acfad524609d9f297101d39eb6bbc89088706371a3d119c57c6f3e38150b4144" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Mar 11 01:19:51 crc kubenswrapper[4744]: E0311 01:19:51.987260 4744 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 376ac00d3a5176903cb096caf4aa6fb34bc54b52bfdfcffda0f5db681e50ec05 is running failed: container process not found" containerID="376ac00d3a5176903cb096caf4aa6fb34bc54b52bfdfcffda0f5db681e50ec05" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Mar 11 01:19:51 crc kubenswrapper[4744]: E0311 01:19:51.988417 4744 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is 
not created or running: checking if PID of 376ac00d3a5176903cb096caf4aa6fb34bc54b52bfdfcffda0f5db681e50ec05 is running failed: container process not found" containerID="376ac00d3a5176903cb096caf4aa6fb34bc54b52bfdfcffda0f5db681e50ec05" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Mar 11 01:19:51 crc kubenswrapper[4744]: E0311 01:19:51.988449 4744 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 376ac00d3a5176903cb096caf4aa6fb34bc54b52bfdfcffda0f5db681e50ec05 is running failed: container process not found" probeType="Readiness" pod="openstack/ovn-controller-ovs-88ffp" podUID="1ee48ea6-67ea-4da3-af92-82b9d0e5b67d" containerName="ovsdb-server" Mar 11 01:19:51 crc kubenswrapper[4744]: E0311 01:19:51.988933 4744 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="acfad524609d9f297101d39eb6bbc89088706371a3d119c57c6f3e38150b4144" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Mar 11 01:19:51 crc kubenswrapper[4744]: E0311 01:19:51.988958 4744 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/ovn-controller-ovs-88ffp" podUID="1ee48ea6-67ea-4da3-af92-82b9d0e5b67d" containerName="ovs-vswitchd" Mar 11 01:19:51 crc kubenswrapper[4744]: I0311 01:19:51.990750 4744 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-b08a-account-create-update-7dvx8" Mar 11 01:19:52 crc kubenswrapper[4744]: I0311 01:19:52.011854 4744 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1ca73c89-992f-4b36-9a70-5d67bace9cd2" path="/var/lib/kubelet/pods/1ca73c89-992f-4b36-9a70-5d67bace9cd2/volumes" Mar 11 01:19:52 crc kubenswrapper[4744]: I0311 01:19:52.012275 4744 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-5453-account-create-update-q47sg" Mar 11 01:19:52 crc kubenswrapper[4744]: I0311 01:19:52.013010 4744 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="35ec702b-4aa6-4fa6-a770-ec3caf762d5f" path="/var/lib/kubelet/pods/35ec702b-4aa6-4fa6-a770-ec3caf762d5f/volumes" Mar 11 01:19:52 crc kubenswrapper[4744]: I0311 01:19:52.015587 4744 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4391b558-59d1-4f5c-8e1e-cbf9667d6544" path="/var/lib/kubelet/pods/4391b558-59d1-4f5c-8e1e-cbf9667d6544/volumes" Mar 11 01:19:52 crc kubenswrapper[4744]: I0311 01:19:52.016055 4744 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8c80e1ba-a26a-4368-902b-a725bc2052d8" path="/var/lib/kubelet/pods/8c80e1ba-a26a-4368-902b-a725bc2052d8/volumes" Mar 11 01:19:52 crc kubenswrapper[4744]: I0311 01:19:52.016639 4744 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="97c4f09f-eb97-40a0-b06c-80a5a922c986" path="/var/lib/kubelet/pods/97c4f09f-eb97-40a0-b06c-80a5a922c986/volumes" Mar 11 01:19:52 crc kubenswrapper[4744]: I0311 01:19:52.017649 4744 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9aafed50-ec97-470d-b2b3-ed2984c5bc7e" path="/var/lib/kubelet/pods/9aafed50-ec97-470d-b2b3-ed2984c5bc7e/volumes" Mar 11 01:19:52 crc kubenswrapper[4744]: I0311 01:19:52.018136 4744 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a90e479a-2c1d-4a55-9f51-eadbc3c0b333" 
path="/var/lib/kubelet/pods/a90e479a-2c1d-4a55-9f51-eadbc3c0b333/volumes" Mar 11 01:19:52 crc kubenswrapper[4744]: I0311 01:19:52.018698 4744 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b7669ed4-670c-4a2c-8ce2-b69579c30e15" path="/var/lib/kubelet/pods/b7669ed4-670c-4a2c-8ce2-b69579c30e15/volumes" Mar 11 01:19:52 crc kubenswrapper[4744]: I0311 01:19:52.019596 4744 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c263c020-5938-4b77-b265-c297ae87f084" path="/var/lib/kubelet/pods/c263c020-5938-4b77-b265-c297ae87f084/volumes" Mar 11 01:19:52 crc kubenswrapper[4744]: I0311 01:19:52.020065 4744 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e122b7f6-6664-4484-afc0-c5629ad3a7e3" path="/var/lib/kubelet/pods/e122b7f6-6664-4484-afc0-c5629ad3a7e3/volumes" Mar 11 01:19:52 crc kubenswrapper[4744]: I0311 01:19:52.020576 4744 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fe2603a1-fdea-44d4-8188-f5f93324575c" path="/var/lib/kubelet/pods/fe2603a1-fdea-44d4-8188-f5f93324575c/volumes" Mar 11 01:19:52 crc kubenswrapper[4744]: I0311 01:19:52.021941 4744 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstack-cell1-galera-0" Mar 11 01:19:52 crc kubenswrapper[4744]: I0311 01:19:52.082591 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-q22rr\" (UniqueName: \"kubernetes.io/projected/20141525-666b-4609-b145-d38380a5d7c7-kube-api-access-q22rr\") pod \"20141525-666b-4609-b145-d38380a5d7c7\" (UID: \"20141525-666b-4609-b145-d38380a5d7c7\") " Mar 11 01:19:52 crc kubenswrapper[4744]: I0311 01:19:52.082907 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/755410bb-361b-47e2-8a7a-317119eec983-config-data\") pod \"755410bb-361b-47e2-8a7a-317119eec983\" (UID: \"755410bb-361b-47e2-8a7a-317119eec983\") " Mar 11 01:19:52 crc kubenswrapper[4744]: I0311 01:19:52.091690 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/20141525-666b-4609-b145-d38380a5d7c7-operator-scripts\") pod \"20141525-666b-4609-b145-d38380a5d7c7\" (UID: \"20141525-666b-4609-b145-d38380a5d7c7\") " Mar 11 01:19:52 crc kubenswrapper[4744]: I0311 01:19:52.091772 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/755410bb-361b-47e2-8a7a-317119eec983-vencrypt-tls-certs\") pod \"755410bb-361b-47e2-8a7a-317119eec983\" (UID: \"755410bb-361b-47e2-8a7a-317119eec983\") " Mar 11 01:19:52 crc kubenswrapper[4744]: I0311 01:19:52.091845 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/755410bb-361b-47e2-8a7a-317119eec983-combined-ca-bundle\") pod \"755410bb-361b-47e2-8a7a-317119eec983\" (UID: \"755410bb-361b-47e2-8a7a-317119eec983\") " Mar 11 01:19:52 crc kubenswrapper[4744]: I0311 01:19:52.091889 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for 
volume \"kube-api-access-q8kpr\" (UniqueName: \"kubernetes.io/projected/755410bb-361b-47e2-8a7a-317119eec983-kube-api-access-q8kpr\") pod \"755410bb-361b-47e2-8a7a-317119eec983\" (UID: \"755410bb-361b-47e2-8a7a-317119eec983\") " Mar 11 01:19:52 crc kubenswrapper[4744]: I0311 01:19:52.091925 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/755410bb-361b-47e2-8a7a-317119eec983-nova-novncproxy-tls-certs\") pod \"755410bb-361b-47e2-8a7a-317119eec983\" (UID: \"755410bb-361b-47e2-8a7a-317119eec983\") " Mar 11 01:19:52 crc kubenswrapper[4744]: I0311 01:19:52.093236 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/20141525-666b-4609-b145-d38380a5d7c7-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "20141525-666b-4609-b145-d38380a5d7c7" (UID: "20141525-666b-4609-b145-d38380a5d7c7"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 11 01:19:52 crc kubenswrapper[4744]: I0311 01:19:52.095249 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/20141525-666b-4609-b145-d38380a5d7c7-kube-api-access-q22rr" (OuterVolumeSpecName: "kube-api-access-q22rr") pod "20141525-666b-4609-b145-d38380a5d7c7" (UID: "20141525-666b-4609-b145-d38380a5d7c7"). InnerVolumeSpecName "kube-api-access-q22rr". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 01:19:52 crc kubenswrapper[4744]: I0311 01:19:52.115018 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/755410bb-361b-47e2-8a7a-317119eec983-kube-api-access-q8kpr" (OuterVolumeSpecName: "kube-api-access-q8kpr") pod "755410bb-361b-47e2-8a7a-317119eec983" (UID: "755410bb-361b-47e2-8a7a-317119eec983"). InnerVolumeSpecName "kube-api-access-q8kpr". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 01:19:52 crc kubenswrapper[4744]: I0311 01:19:52.118425 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/755410bb-361b-47e2-8a7a-317119eec983-config-data" (OuterVolumeSpecName: "config-data") pod "755410bb-361b-47e2-8a7a-317119eec983" (UID: "755410bb-361b-47e2-8a7a-317119eec983"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 01:19:52 crc kubenswrapper[4744]: I0311 01:19:52.127246 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/755410bb-361b-47e2-8a7a-317119eec983-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "755410bb-361b-47e2-8a7a-317119eec983" (UID: "755410bb-361b-47e2-8a7a-317119eec983"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 01:19:52 crc kubenswrapper[4744]: E0311 01:19:52.136541 4744 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="fd3a94b5310ccf7a9c7dd7207db3d75ffedf02fcd045db30b5914808a12e1cc6" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Mar 11 01:19:52 crc kubenswrapper[4744]: E0311 01:19:52.141304 4744 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="fd3a94b5310ccf7a9c7dd7207db3d75ffedf02fcd045db30b5914808a12e1cc6" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Mar 11 01:19:52 crc kubenswrapper[4744]: E0311 01:19:52.172998 4744 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" 
containerID="fd3a94b5310ccf7a9c7dd7207db3d75ffedf02fcd045db30b5914808a12e1cc6" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Mar 11 01:19:52 crc kubenswrapper[4744]: E0311 01:19:52.173064 4744 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/nova-scheduler-0" podUID="b62ac51a-a222-4e7b-b465-9e71c3d34b1f" containerName="nova-scheduler-scheduler" Mar 11 01:19:52 crc kubenswrapper[4744]: I0311 01:19:52.195000 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/382c7504-68d5-4132-adc7-fc2c804e5d3e-galera-tls-certs\") pod \"382c7504-68d5-4132-adc7-fc2c804e5d3e\" (UID: \"382c7504-68d5-4132-adc7-fc2c804e5d3e\") " Mar 11 01:19:52 crc kubenswrapper[4744]: I0311 01:19:52.195105 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/382c7504-68d5-4132-adc7-fc2c804e5d3e-kolla-config\") pod \"382c7504-68d5-4132-adc7-fc2c804e5d3e\" (UID: \"382c7504-68d5-4132-adc7-fc2c804e5d3e\") " Mar 11 01:19:52 crc kubenswrapper[4744]: I0311 01:19:52.195177 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-m5jhv\" (UniqueName: \"kubernetes.io/projected/4ed881ae-cd59-4830-99e9-34ed7708ed83-kube-api-access-m5jhv\") pod \"4ed881ae-cd59-4830-99e9-34ed7708ed83\" (UID: \"4ed881ae-cd59-4830-99e9-34ed7708ed83\") " Mar 11 01:19:52 crc kubenswrapper[4744]: I0311 01:19:52.195242 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/382c7504-68d5-4132-adc7-fc2c804e5d3e-config-data-default\") pod \"382c7504-68d5-4132-adc7-fc2c804e5d3e\" (UID: \"382c7504-68d5-4132-adc7-fc2c804e5d3e\") " Mar 11 01:19:52 crc kubenswrapper[4744]: I0311 
01:19:52.195280 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/382c7504-68d5-4132-adc7-fc2c804e5d3e-operator-scripts\") pod \"382c7504-68d5-4132-adc7-fc2c804e5d3e\" (UID: \"382c7504-68d5-4132-adc7-fc2c804e5d3e\") " Mar 11 01:19:52 crc kubenswrapper[4744]: I0311 01:19:52.195303 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mysql-db\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"382c7504-68d5-4132-adc7-fc2c804e5d3e\" (UID: \"382c7504-68d5-4132-adc7-fc2c804e5d3e\") " Mar 11 01:19:52 crc kubenswrapper[4744]: I0311 01:19:52.195324 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5lqpw\" (UniqueName: \"kubernetes.io/projected/382c7504-68d5-4132-adc7-fc2c804e5d3e-kube-api-access-5lqpw\") pod \"382c7504-68d5-4132-adc7-fc2c804e5d3e\" (UID: \"382c7504-68d5-4132-adc7-fc2c804e5d3e\") " Mar 11 01:19:52 crc kubenswrapper[4744]: I0311 01:19:52.195367 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/382c7504-68d5-4132-adc7-fc2c804e5d3e-config-data-generated\") pod \"382c7504-68d5-4132-adc7-fc2c804e5d3e\" (UID: \"382c7504-68d5-4132-adc7-fc2c804e5d3e\") " Mar 11 01:19:52 crc kubenswrapper[4744]: I0311 01:19:52.195403 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4ed881ae-cd59-4830-99e9-34ed7708ed83-operator-scripts\") pod \"4ed881ae-cd59-4830-99e9-34ed7708ed83\" (UID: \"4ed881ae-cd59-4830-99e9-34ed7708ed83\") " Mar 11 01:19:52 crc kubenswrapper[4744]: I0311 01:19:52.195436 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/382c7504-68d5-4132-adc7-fc2c804e5d3e-combined-ca-bundle\") pod 
\"382c7504-68d5-4132-adc7-fc2c804e5d3e\" (UID: \"382c7504-68d5-4132-adc7-fc2c804e5d3e\") " Mar 11 01:19:52 crc kubenswrapper[4744]: I0311 01:19:52.195799 4744 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-q22rr\" (UniqueName: \"kubernetes.io/projected/20141525-666b-4609-b145-d38380a5d7c7-kube-api-access-q22rr\") on node \"crc\" DevicePath \"\"" Mar 11 01:19:52 crc kubenswrapper[4744]: I0311 01:19:52.195810 4744 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/755410bb-361b-47e2-8a7a-317119eec983-config-data\") on node \"crc\" DevicePath \"\"" Mar 11 01:19:52 crc kubenswrapper[4744]: I0311 01:19:52.195819 4744 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/20141525-666b-4609-b145-d38380a5d7c7-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 11 01:19:52 crc kubenswrapper[4744]: I0311 01:19:52.195827 4744 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/755410bb-361b-47e2-8a7a-317119eec983-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 11 01:19:52 crc kubenswrapper[4744]: I0311 01:19:52.195836 4744 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-q8kpr\" (UniqueName: \"kubernetes.io/projected/755410bb-361b-47e2-8a7a-317119eec983-kube-api-access-q8kpr\") on node \"crc\" DevicePath \"\"" Mar 11 01:19:52 crc kubenswrapper[4744]: I0311 01:19:52.197041 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/382c7504-68d5-4132-adc7-fc2c804e5d3e-config-data-default" (OuterVolumeSpecName: "config-data-default") pod "382c7504-68d5-4132-adc7-fc2c804e5d3e" (UID: "382c7504-68d5-4132-adc7-fc2c804e5d3e"). InnerVolumeSpecName "config-data-default". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 11 01:19:52 crc kubenswrapper[4744]: I0311 01:19:52.197831 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/382c7504-68d5-4132-adc7-fc2c804e5d3e-config-data-generated" (OuterVolumeSpecName: "config-data-generated") pod "382c7504-68d5-4132-adc7-fc2c804e5d3e" (UID: "382c7504-68d5-4132-adc7-fc2c804e5d3e"). InnerVolumeSpecName "config-data-generated". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 11 01:19:52 crc kubenswrapper[4744]: I0311 01:19:52.198592 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4ed881ae-cd59-4830-99e9-34ed7708ed83-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "4ed881ae-cd59-4830-99e9-34ed7708ed83" (UID: "4ed881ae-cd59-4830-99e9-34ed7708ed83"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 11 01:19:52 crc kubenswrapper[4744]: I0311 01:19:52.200791 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/382c7504-68d5-4132-adc7-fc2c804e5d3e-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "382c7504-68d5-4132-adc7-fc2c804e5d3e" (UID: "382c7504-68d5-4132-adc7-fc2c804e5d3e"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 11 01:19:52 crc kubenswrapper[4744]: I0311 01:19:52.205194 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/382c7504-68d5-4132-adc7-fc2c804e5d3e-kolla-config" (OuterVolumeSpecName: "kolla-config") pod "382c7504-68d5-4132-adc7-fc2c804e5d3e" (UID: "382c7504-68d5-4132-adc7-fc2c804e5d3e"). InnerVolumeSpecName "kolla-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 11 01:19:52 crc kubenswrapper[4744]: I0311 01:19:52.227984 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/382c7504-68d5-4132-adc7-fc2c804e5d3e-kube-api-access-5lqpw" (OuterVolumeSpecName: "kube-api-access-5lqpw") pod "382c7504-68d5-4132-adc7-fc2c804e5d3e" (UID: "382c7504-68d5-4132-adc7-fc2c804e5d3e"). InnerVolumeSpecName "kube-api-access-5lqpw". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 01:19:52 crc kubenswrapper[4744]: I0311 01:19:52.255681 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4ed881ae-cd59-4830-99e9-34ed7708ed83-kube-api-access-m5jhv" (OuterVolumeSpecName: "kube-api-access-m5jhv") pod "4ed881ae-cd59-4830-99e9-34ed7708ed83" (UID: "4ed881ae-cd59-4830-99e9-34ed7708ed83"). InnerVolumeSpecName "kube-api-access-m5jhv". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 01:19:52 crc kubenswrapper[4744]: I0311 01:19:52.261723 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/755410bb-361b-47e2-8a7a-317119eec983-nova-novncproxy-tls-certs" (OuterVolumeSpecName: "nova-novncproxy-tls-certs") pod "755410bb-361b-47e2-8a7a-317119eec983" (UID: "755410bb-361b-47e2-8a7a-317119eec983"). InnerVolumeSpecName "nova-novncproxy-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 01:19:52 crc kubenswrapper[4744]: I0311 01:19:52.297748 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage11-crc" (OuterVolumeSpecName: "mysql-db") pod "382c7504-68d5-4132-adc7-fc2c804e5d3e" (UID: "382c7504-68d5-4132-adc7-fc2c804e5d3e"). InnerVolumeSpecName "local-storage11-crc". 
PluginName "kubernetes.io/local-volume", VolumeGidValue "" Mar 11 01:19:52 crc kubenswrapper[4744]: I0311 01:19:52.298917 4744 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4ed881ae-cd59-4830-99e9-34ed7708ed83-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 11 01:19:52 crc kubenswrapper[4744]: I0311 01:19:52.298930 4744 reconciler_common.go:293] "Volume detached for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/382c7504-68d5-4132-adc7-fc2c804e5d3e-kolla-config\") on node \"crc\" DevicePath \"\"" Mar 11 01:19:52 crc kubenswrapper[4744]: I0311 01:19:52.298940 4744 reconciler_common.go:293] "Volume detached for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/755410bb-361b-47e2-8a7a-317119eec983-nova-novncproxy-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 11 01:19:52 crc kubenswrapper[4744]: I0311 01:19:52.298950 4744 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-m5jhv\" (UniqueName: \"kubernetes.io/projected/4ed881ae-cd59-4830-99e9-34ed7708ed83-kube-api-access-m5jhv\") on node \"crc\" DevicePath \"\"" Mar 11 01:19:52 crc kubenswrapper[4744]: I0311 01:19:52.298958 4744 reconciler_common.go:293] "Volume detached for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/382c7504-68d5-4132-adc7-fc2c804e5d3e-config-data-default\") on node \"crc\" DevicePath \"\"" Mar 11 01:19:52 crc kubenswrapper[4744]: I0311 01:19:52.298966 4744 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/382c7504-68d5-4132-adc7-fc2c804e5d3e-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 11 01:19:52 crc kubenswrapper[4744]: I0311 01:19:52.298982 4744 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") on node \"crc\" " Mar 11 01:19:52 crc 
kubenswrapper[4744]: I0311 01:19:52.298990 4744 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5lqpw\" (UniqueName: \"kubernetes.io/projected/382c7504-68d5-4132-adc7-fc2c804e5d3e-kube-api-access-5lqpw\") on node \"crc\" DevicePath \"\"" Mar 11 01:19:52 crc kubenswrapper[4744]: I0311 01:19:52.298998 4744 reconciler_common.go:293] "Volume detached for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/382c7504-68d5-4132-adc7-fc2c804e5d3e-config-data-generated\") on node \"crc\" DevicePath \"\"" Mar 11 01:19:52 crc kubenswrapper[4744]: E0311 01:19:52.299404 4744 configmap.go:193] Couldn't get configMap openstack/rabbitmq-cell1-config-data: configmap "rabbitmq-cell1-config-data" not found Mar 11 01:19:52 crc kubenswrapper[4744]: E0311 01:19:52.299458 4744 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/714c91e5-04c5-4f95-97e3-a3c08664944d-config-data podName:714c91e5-04c5-4f95-97e3-a3c08664944d nodeName:}" failed. No retries permitted until 2026-03-11 01:19:56.29944173 +0000 UTC m=+1553.103659335 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/714c91e5-04c5-4f95-97e3-a3c08664944d-config-data") pod "rabbitmq-cell1-server-0" (UID: "714c91e5-04c5-4f95-97e3-a3c08664944d") : configmap "rabbitmq-cell1-config-data" not found Mar 11 01:19:52 crc kubenswrapper[4744]: I0311 01:19:52.418712 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/755410bb-361b-47e2-8a7a-317119eec983-vencrypt-tls-certs" (OuterVolumeSpecName: "vencrypt-tls-certs") pod "755410bb-361b-47e2-8a7a-317119eec983" (UID: "755410bb-361b-47e2-8a7a-317119eec983"). InnerVolumeSpecName "vencrypt-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 01:19:52 crc kubenswrapper[4744]: I0311 01:19:52.429542 4744 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage11-crc" (UniqueName: "kubernetes.io/local-volume/local-storage11-crc") on node "crc" Mar 11 01:19:52 crc kubenswrapper[4744]: I0311 01:19:52.433450 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/382c7504-68d5-4132-adc7-fc2c804e5d3e-galera-tls-certs" (OuterVolumeSpecName: "galera-tls-certs") pod "382c7504-68d5-4132-adc7-fc2c804e5d3e" (UID: "382c7504-68d5-4132-adc7-fc2c804e5d3e"). InnerVolumeSpecName "galera-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 01:19:52 crc kubenswrapper[4744]: I0311 01:19:52.436930 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/382c7504-68d5-4132-adc7-fc2c804e5d3e-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "382c7504-68d5-4132-adc7-fc2c804e5d3e" (UID: "382c7504-68d5-4132-adc7-fc2c804e5d3e"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 01:19:52 crc kubenswrapper[4744]: I0311 01:19:52.490168 4744 generic.go:334] "Generic (PLEG): container finished" podID="382c7504-68d5-4132-adc7-fc2c804e5d3e" containerID="58fdd5d28d27f99ee68a950bea9a0ce4833daad80356b7d09eee6123d0427f28" exitCode=0 Mar 11 01:19:52 crc kubenswrapper[4744]: I0311 01:19:52.490303 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"382c7504-68d5-4132-adc7-fc2c804e5d3e","Type":"ContainerDied","Data":"58fdd5d28d27f99ee68a950bea9a0ce4833daad80356b7d09eee6123d0427f28"} Mar 11 01:19:52 crc kubenswrapper[4744]: I0311 01:19:52.490402 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"382c7504-68d5-4132-adc7-fc2c804e5d3e","Type":"ContainerDied","Data":"ac3725eaefc0d00cf9a08b71208875c6bd6da5cb355883a516a0d2d44c9a7b9d"} Mar 11 01:19:52 crc kubenswrapper[4744]: I0311 01:19:52.490476 4744 scope.go:117] "RemoveContainer" containerID="58fdd5d28d27f99ee68a950bea9a0ce4833daad80356b7d09eee6123d0427f28" Mar 11 01:19:52 crc kubenswrapper[4744]: I0311 01:19:52.490644 4744 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstack-cell1-galera-0" Mar 11 01:19:52 crc kubenswrapper[4744]: I0311 01:19:52.498429 4744 generic.go:334] "Generic (PLEG): container finished" podID="7f3aa5cc-eae2-4d60-96cf-6d847a5599ec" containerID="551326012aadac601469c35ef336268bf20b74279ceb1776275c88a0a8b34b6c" exitCode=1 Mar 11 01:19:52 crc kubenswrapper[4744]: I0311 01:19:52.498564 4744 generic.go:334] "Generic (PLEG): container finished" podID="7f3aa5cc-eae2-4d60-96cf-6d847a5599ec" containerID="b93a05d9c05b761944f0a54e2a10841bc55777d4b1e2fe855855e9122f7a3fdb" exitCode=1 Mar 11 01:19:52 crc kubenswrapper[4744]: I0311 01:19:52.498654 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-rplmm" event={"ID":"7f3aa5cc-eae2-4d60-96cf-6d847a5599ec","Type":"ContainerDied","Data":"551326012aadac601469c35ef336268bf20b74279ceb1776275c88a0a8b34b6c"} Mar 11 01:19:52 crc kubenswrapper[4744]: I0311 01:19:52.498775 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-rplmm" event={"ID":"7f3aa5cc-eae2-4d60-96cf-6d847a5599ec","Type":"ContainerDied","Data":"b93a05d9c05b761944f0a54e2a10841bc55777d4b1e2fe855855e9122f7a3fdb"} Mar 11 01:19:52 crc kubenswrapper[4744]: I0311 01:19:52.500437 4744 generic.go:334] "Generic (PLEG): container finished" podID="e910960b-a434-4830-b4be-96571fa4dd54" containerID="ade6821c0859f0f387be46b25964041ce990fd43c109b143af360cbbc7e5576b" exitCode=0 Mar 11 01:19:52 crc kubenswrapper[4744]: I0311 01:19:52.500537 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-mw9fv" event={"ID":"e910960b-a434-4830-b4be-96571fa4dd54","Type":"ContainerDied","Data":"ade6821c0859f0f387be46b25964041ce990fd43c109b143af360cbbc7e5576b"} Mar 11 01:19:52 crc kubenswrapper[4744]: I0311 01:19:52.505697 4744 scope.go:117] "RemoveContainer" containerID="b93a05d9c05b761944f0a54e2a10841bc55777d4b1e2fe855855e9122f7a3fdb" Mar 11 01:19:52 crc 
kubenswrapper[4744]: E0311 01:19:52.505937 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"mariadb-account-create-update\" with CrashLoopBackOff: \"back-off 10s restarting failed container=mariadb-account-create-update pod=root-account-create-update-rplmm_openstack(7f3aa5cc-eae2-4d60-96cf-6d847a5599ec)\"" pod="openstack/root-account-create-update-rplmm" podUID="7f3aa5cc-eae2-4d60-96cf-6d847a5599ec" Mar 11 01:19:52 crc kubenswrapper[4744]: I0311 01:19:52.506703 4744 reconciler_common.go:293] "Volume detached for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") on node \"crc\" DevicePath \"\"" Mar 11 01:19:52 crc kubenswrapper[4744]: I0311 01:19:52.506714 4744 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/382c7504-68d5-4132-adc7-fc2c804e5d3e-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 11 01:19:52 crc kubenswrapper[4744]: I0311 01:19:52.506724 4744 reconciler_common.go:293] "Volume detached for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/755410bb-361b-47e2-8a7a-317119eec983-vencrypt-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 11 01:19:52 crc kubenswrapper[4744]: I0311 01:19:52.506731 4744 reconciler_common.go:293] "Volume detached for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/382c7504-68d5-4132-adc7-fc2c804e5d3e-galera-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 11 01:19:52 crc kubenswrapper[4744]: I0311 01:19:52.510727 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-5453-account-create-update-q47sg" event={"ID":"4ed881ae-cd59-4830-99e9-34ed7708ed83","Type":"ContainerDied","Data":"cada4d6a90a4c141db95e4f972f5ac5697df9a11b2616f59441dbfb7a05f99de"} Mar 11 01:19:52 crc kubenswrapper[4744]: I0311 01:19:52.510842 4744 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-5453-account-create-update-q47sg" Mar 11 01:19:52 crc kubenswrapper[4744]: I0311 01:19:52.528557 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-6bb49fdf95-9g9dz" event={"ID":"d60ef156-5767-43c1-bb0b-a8c681a8a6be","Type":"ContainerDied","Data":"286eb384cb18b19ed1345645d29d8040537d31182d97875d0d7b23e2df75caa2"} Mar 11 01:19:52 crc kubenswrapper[4744]: I0311 01:19:52.528662 4744 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/swift-proxy-6bb49fdf95-9g9dz" Mar 11 01:19:52 crc kubenswrapper[4744]: I0311 01:19:52.530722 4744 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-b08a-account-create-update-7dvx8" Mar 11 01:19:52 crc kubenswrapper[4744]: I0311 01:19:52.530733 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-b08a-account-create-update-7dvx8" event={"ID":"20141525-666b-4609-b145-d38380a5d7c7","Type":"ContainerDied","Data":"4a98a9ea3d5f0b7ee573e02e403c987318c602b610629a060fec7b36bf3cd7b8"} Mar 11 01:19:52 crc kubenswrapper[4744]: I0311 01:19:52.532596 4744 scope.go:117] "RemoveContainer" containerID="c27361c775bdd047e5a87d9b776a1160e98c796ab972f2c1582f331a91ebe454" Mar 11 01:19:52 crc kubenswrapper[4744]: I0311 01:19:52.538777 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"755410bb-361b-47e2-8a7a-317119eec983","Type":"ContainerDied","Data":"a3fe2f00533c3bbd5584921efe34313f88a72aac2e80a0dc9792ce9767bc2e5a"} Mar 11 01:19:52 crc kubenswrapper[4744]: I0311 01:19:52.538856 4744 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Mar 11 01:19:52 crc kubenswrapper[4744]: I0311 01:19:52.589596 4744 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/openstack-cell1-galera-0"] Mar 11 01:19:52 crc kubenswrapper[4744]: I0311 01:19:52.614462 4744 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/openstack-cell1-galera-0"] Mar 11 01:19:52 crc kubenswrapper[4744]: I0311 01:19:52.617394 4744 scope.go:117] "RemoveContainer" containerID="58fdd5d28d27f99ee68a950bea9a0ce4833daad80356b7d09eee6123d0427f28" Mar 11 01:19:52 crc kubenswrapper[4744]: E0311 01:19:52.617836 4744 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"58fdd5d28d27f99ee68a950bea9a0ce4833daad80356b7d09eee6123d0427f28\": container with ID starting with 58fdd5d28d27f99ee68a950bea9a0ce4833daad80356b7d09eee6123d0427f28 not found: ID does not exist" containerID="58fdd5d28d27f99ee68a950bea9a0ce4833daad80356b7d09eee6123d0427f28" Mar 11 01:19:52 crc kubenswrapper[4744]: I0311 01:19:52.617883 4744 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"58fdd5d28d27f99ee68a950bea9a0ce4833daad80356b7d09eee6123d0427f28"} err="failed to get container status \"58fdd5d28d27f99ee68a950bea9a0ce4833daad80356b7d09eee6123d0427f28\": rpc error: code = NotFound desc = could not find container \"58fdd5d28d27f99ee68a950bea9a0ce4833daad80356b7d09eee6123d0427f28\": container with ID starting with 58fdd5d28d27f99ee68a950bea9a0ce4833daad80356b7d09eee6123d0427f28 not found: ID does not exist" Mar 11 01:19:52 crc kubenswrapper[4744]: I0311 01:19:52.617909 4744 scope.go:117] "RemoveContainer" containerID="c27361c775bdd047e5a87d9b776a1160e98c796ab972f2c1582f331a91ebe454" Mar 11 01:19:52 crc kubenswrapper[4744]: E0311 01:19:52.618177 4744 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"c27361c775bdd047e5a87d9b776a1160e98c796ab972f2c1582f331a91ebe454\": container with ID starting with c27361c775bdd047e5a87d9b776a1160e98c796ab972f2c1582f331a91ebe454 not found: ID does not exist" containerID="c27361c775bdd047e5a87d9b776a1160e98c796ab972f2c1582f331a91ebe454" Mar 11 01:19:52 crc kubenswrapper[4744]: I0311 01:19:52.618200 4744 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c27361c775bdd047e5a87d9b776a1160e98c796ab972f2c1582f331a91ebe454"} err="failed to get container status \"c27361c775bdd047e5a87d9b776a1160e98c796ab972f2c1582f331a91ebe454\": rpc error: code = NotFound desc = could not find container \"c27361c775bdd047e5a87d9b776a1160e98c796ab972f2c1582f331a91ebe454\": container with ID starting with c27361c775bdd047e5a87d9b776a1160e98c796ab972f2c1582f331a91ebe454 not found: ID does not exist" Mar 11 01:19:52 crc kubenswrapper[4744]: I0311 01:19:52.618213 4744 scope.go:117] "RemoveContainer" containerID="551326012aadac601469c35ef336268bf20b74279ceb1776275c88a0a8b34b6c" Mar 11 01:19:52 crc kubenswrapper[4744]: I0311 01:19:52.632754 4744 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/swift-proxy-6bb49fdf95-9g9dz"] Mar 11 01:19:52 crc kubenswrapper[4744]: I0311 01:19:52.638917 4744 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/swift-proxy-6bb49fdf95-9g9dz"] Mar 11 01:19:52 crc kubenswrapper[4744]: I0311 01:19:52.651444 4744 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-5453-account-create-update-q47sg"] Mar 11 01:19:52 crc kubenswrapper[4744]: I0311 01:19:52.658338 4744 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-5453-account-create-update-q47sg"] Mar 11 01:19:52 crc kubenswrapper[4744]: I0311 01:19:52.664037 4744 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Mar 11 01:19:52 crc kubenswrapper[4744]: I0311 01:19:52.669569 4744 kubelet.go:2431] "SyncLoop REMOVE" 
source="api" pods=["openstack/nova-cell1-novncproxy-0"] Mar 11 01:19:52 crc kubenswrapper[4744]: I0311 01:19:52.670967 4744 scope.go:117] "RemoveContainer" containerID="551326012aadac601469c35ef336268bf20b74279ceb1776275c88a0a8b34b6c" Mar 11 01:19:52 crc kubenswrapper[4744]: E0311 01:19:52.671338 4744 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"551326012aadac601469c35ef336268bf20b74279ceb1776275c88a0a8b34b6c\": container with ID starting with 551326012aadac601469c35ef336268bf20b74279ceb1776275c88a0a8b34b6c not found: ID does not exist" containerID="551326012aadac601469c35ef336268bf20b74279ceb1776275c88a0a8b34b6c" Mar 11 01:19:52 crc kubenswrapper[4744]: I0311 01:19:52.671389 4744 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"551326012aadac601469c35ef336268bf20b74279ceb1776275c88a0a8b34b6c"} err="failed to get container status \"551326012aadac601469c35ef336268bf20b74279ceb1776275c88a0a8b34b6c\": rpc error: code = NotFound desc = could not find container \"551326012aadac601469c35ef336268bf20b74279ceb1776275c88a0a8b34b6c\": container with ID starting with 551326012aadac601469c35ef336268bf20b74279ceb1776275c88a0a8b34b6c not found: ID does not exist" Mar 11 01:19:52 crc kubenswrapper[4744]: I0311 01:19:52.671416 4744 scope.go:117] "RemoveContainer" containerID="9907e92d1fac188fb7c44fa1681a08073a561732eeaba685269da05b1fbc9c68" Mar 11 01:19:52 crc kubenswrapper[4744]: I0311 01:19:52.680716 4744 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-b08a-account-create-update-7dvx8"] Mar 11 01:19:52 crc kubenswrapper[4744]: I0311 01:19:52.702145 4744 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-b08a-account-create-update-7dvx8"] Mar 11 01:19:52 crc kubenswrapper[4744]: I0311 01:19:52.702746 4744 scope.go:117] "RemoveContainer" 
containerID="915179a7516d9702ba0c2f51a83fdc1a69ce876edb13d10542894dba2430fe71" Mar 11 01:19:52 crc kubenswrapper[4744]: I0311 01:19:52.756665 4744 scope.go:117] "RemoveContainer" containerID="fd95bdd6dfcd2af9ddc8f3dd4babec6abaf1feb9cf63d194943025c2b4b217ae" Mar 11 01:19:52 crc kubenswrapper[4744]: I0311 01:19:52.842647 4744 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 11 01:19:52 crc kubenswrapper[4744]: I0311 01:19:52.842909 4744 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="33c0b5dd-192a-4fd2-bbaa-b483399724df" containerName="ceilometer-central-agent" containerID="cri-o://6940392b1cc755e200810e7dda5a0cdf602c2bc1bee93993b5d7b849c244decb" gracePeriod=30 Mar 11 01:19:52 crc kubenswrapper[4744]: I0311 01:19:52.843038 4744 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="33c0b5dd-192a-4fd2-bbaa-b483399724df" containerName="proxy-httpd" containerID="cri-o://2e90fe156899af91f60cc58374a4704b928fb93dc5a2b8a016f190e0a0897fe6" gracePeriod=30 Mar 11 01:19:52 crc kubenswrapper[4744]: I0311 01:19:52.843073 4744 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="33c0b5dd-192a-4fd2-bbaa-b483399724df" containerName="sg-core" containerID="cri-o://3d1ff7ea2ca1a4a7692ee87d6f5ba883dc249a72e783a373c386078d938baf27" gracePeriod=30 Mar 11 01:19:52 crc kubenswrapper[4744]: I0311 01:19:52.843104 4744 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="33c0b5dd-192a-4fd2-bbaa-b483399724df" containerName="ceilometer-notification-agent" containerID="cri-o://1252cd0d658ee3e2e06be9771503f0ab0664f8d814b22e653a029a2d6d6716c4" gracePeriod=30 Mar 11 01:19:52 crc kubenswrapper[4744]: I0311 01:19:52.866478 4744 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/kube-state-metrics-0"] Mar 11 01:19:52 crc 
kubenswrapper[4744]: I0311 01:19:52.869686 4744 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/kube-state-metrics-0" podUID="cc403516-137f-4bfb-badf-89b13ff0468f" containerName="kube-state-metrics" containerID="cri-o://97d144a3e5f0e96a87be2e26bd2f1bc7509fe6d661b46663a89c2c893e10f70a" gracePeriod=30 Mar 11 01:19:52 crc kubenswrapper[4744]: I0311 01:19:52.970010 4744 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-ab0f-account-create-update-fj8mx"] Mar 11 01:19:53 crc kubenswrapper[4744]: I0311 01:19:53.007245 4744 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-ab0f-account-create-update-fj8mx"] Mar 11 01:19:53 crc kubenswrapper[4744]: I0311 01:19:53.057357 4744 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/memcached-0"] Mar 11 01:19:53 crc kubenswrapper[4744]: I0311 01:19:53.057558 4744 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/memcached-0" podUID="730c901d-c3c5-46c5-b618-00cdcc17bef2" containerName="memcached" containerID="cri-o://041f61378d1b6288e78cda19110d0e5a7b5f6590d353d62737e351291c23ef4b" gracePeriod=30 Mar 11 01:19:53 crc kubenswrapper[4744]: I0311 01:19:53.079669 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-ab0f-account-create-update-n4gnt"] Mar 11 01:19:53 crc kubenswrapper[4744]: E0311 01:19:53.080352 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a90e479a-2c1d-4a55-9f51-eadbc3c0b333" containerName="init" Mar 11 01:19:53 crc kubenswrapper[4744]: I0311 01:19:53.080383 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="a90e479a-2c1d-4a55-9f51-eadbc3c0b333" containerName="init" Mar 11 01:19:53 crc kubenswrapper[4744]: E0311 01:19:53.080397 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fe2603a1-fdea-44d4-8188-f5f93324575c" containerName="ovn-controller" Mar 11 01:19:53 crc kubenswrapper[4744]: I0311 01:19:53.080403 4744 
state_mem.go:107] "Deleted CPUSet assignment" podUID="fe2603a1-fdea-44d4-8188-f5f93324575c" containerName="ovn-controller" Mar 11 01:19:53 crc kubenswrapper[4744]: E0311 01:19:53.080414 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1ca73c89-992f-4b36-9a70-5d67bace9cd2" containerName="ovsdbserver-sb" Mar 11 01:19:53 crc kubenswrapper[4744]: I0311 01:19:53.080420 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="1ca73c89-992f-4b36-9a70-5d67bace9cd2" containerName="ovsdbserver-sb" Mar 11 01:19:53 crc kubenswrapper[4744]: E0311 01:19:53.080428 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d60ef156-5767-43c1-bb0b-a8c681a8a6be" containerName="proxy-httpd" Mar 11 01:19:53 crc kubenswrapper[4744]: I0311 01:19:53.080450 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="d60ef156-5767-43c1-bb0b-a8c681a8a6be" containerName="proxy-httpd" Mar 11 01:19:53 crc kubenswrapper[4744]: E0311 01:19:53.080543 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d60ef156-5767-43c1-bb0b-a8c681a8a6be" containerName="proxy-server" Mar 11 01:19:53 crc kubenswrapper[4744]: I0311 01:19:53.080550 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="d60ef156-5767-43c1-bb0b-a8c681a8a6be" containerName="proxy-server" Mar 11 01:19:53 crc kubenswrapper[4744]: E0311 01:19:53.080562 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a90e479a-2c1d-4a55-9f51-eadbc3c0b333" containerName="dnsmasq-dns" Mar 11 01:19:53 crc kubenswrapper[4744]: I0311 01:19:53.080569 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="a90e479a-2c1d-4a55-9f51-eadbc3c0b333" containerName="dnsmasq-dns" Mar 11 01:19:53 crc kubenswrapper[4744]: E0311 01:19:53.080591 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="97c4f09f-eb97-40a0-b06c-80a5a922c986" containerName="ovsdbserver-nb" Mar 11 01:19:53 crc kubenswrapper[4744]: I0311 01:19:53.080597 4744 state_mem.go:107] "Deleted CPUSet 
assignment" podUID="97c4f09f-eb97-40a0-b06c-80a5a922c986" containerName="ovsdbserver-nb" Mar 11 01:19:53 crc kubenswrapper[4744]: E0311 01:19:53.080608 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="755410bb-361b-47e2-8a7a-317119eec983" containerName="nova-cell1-novncproxy-novncproxy" Mar 11 01:19:53 crc kubenswrapper[4744]: I0311 01:19:53.080614 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="755410bb-361b-47e2-8a7a-317119eec983" containerName="nova-cell1-novncproxy-novncproxy" Mar 11 01:19:53 crc kubenswrapper[4744]: E0311 01:19:53.080624 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="97c4f09f-eb97-40a0-b06c-80a5a922c986" containerName="openstack-network-exporter" Mar 11 01:19:53 crc kubenswrapper[4744]: I0311 01:19:53.080630 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="97c4f09f-eb97-40a0-b06c-80a5a922c986" containerName="openstack-network-exporter" Mar 11 01:19:53 crc kubenswrapper[4744]: E0311 01:19:53.080643 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="382c7504-68d5-4132-adc7-fc2c804e5d3e" containerName="galera" Mar 11 01:19:53 crc kubenswrapper[4744]: I0311 01:19:53.080694 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="382c7504-68d5-4132-adc7-fc2c804e5d3e" containerName="galera" Mar 11 01:19:53 crc kubenswrapper[4744]: E0311 01:19:53.080702 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1ca73c89-992f-4b36-9a70-5d67bace9cd2" containerName="openstack-network-exporter" Mar 11 01:19:53 crc kubenswrapper[4744]: I0311 01:19:53.080710 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="1ca73c89-992f-4b36-9a70-5d67bace9cd2" containerName="openstack-network-exporter" Mar 11 01:19:53 crc kubenswrapper[4744]: E0311 01:19:53.080719 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="382c7504-68d5-4132-adc7-fc2c804e5d3e" containerName="mysql-bootstrap" Mar 11 01:19:53 crc kubenswrapper[4744]: I0311 
01:19:53.080725 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="382c7504-68d5-4132-adc7-fc2c804e5d3e" containerName="mysql-bootstrap" Mar 11 01:19:53 crc kubenswrapper[4744]: E0311 01:19:53.080775 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ebd1c76c-75f8-411f-9350-a0e31f1721cd" containerName="openstack-network-exporter" Mar 11 01:19:53 crc kubenswrapper[4744]: I0311 01:19:53.080782 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="ebd1c76c-75f8-411f-9350-a0e31f1721cd" containerName="openstack-network-exporter" Mar 11 01:19:53 crc kubenswrapper[4744]: I0311 01:19:53.083080 4744 memory_manager.go:354] "RemoveStaleState removing state" podUID="755410bb-361b-47e2-8a7a-317119eec983" containerName="nova-cell1-novncproxy-novncproxy" Mar 11 01:19:53 crc kubenswrapper[4744]: I0311 01:19:53.083117 4744 memory_manager.go:354] "RemoveStaleState removing state" podUID="d60ef156-5767-43c1-bb0b-a8c681a8a6be" containerName="proxy-server" Mar 11 01:19:53 crc kubenswrapper[4744]: I0311 01:19:53.083131 4744 memory_manager.go:354] "RemoveStaleState removing state" podUID="97c4f09f-eb97-40a0-b06c-80a5a922c986" containerName="openstack-network-exporter" Mar 11 01:19:53 crc kubenswrapper[4744]: I0311 01:19:53.083148 4744 memory_manager.go:354] "RemoveStaleState removing state" podUID="1ca73c89-992f-4b36-9a70-5d67bace9cd2" containerName="openstack-network-exporter" Mar 11 01:19:53 crc kubenswrapper[4744]: I0311 01:19:53.083160 4744 memory_manager.go:354] "RemoveStaleState removing state" podUID="382c7504-68d5-4132-adc7-fc2c804e5d3e" containerName="galera" Mar 11 01:19:53 crc kubenswrapper[4744]: I0311 01:19:53.083170 4744 memory_manager.go:354] "RemoveStaleState removing state" podUID="d60ef156-5767-43c1-bb0b-a8c681a8a6be" containerName="proxy-httpd" Mar 11 01:19:53 crc kubenswrapper[4744]: I0311 01:19:53.083187 4744 memory_manager.go:354] "RemoveStaleState removing state" podUID="ebd1c76c-75f8-411f-9350-a0e31f1721cd" 
containerName="openstack-network-exporter" Mar 11 01:19:53 crc kubenswrapper[4744]: I0311 01:19:53.083194 4744 memory_manager.go:354] "RemoveStaleState removing state" podUID="fe2603a1-fdea-44d4-8188-f5f93324575c" containerName="ovn-controller" Mar 11 01:19:53 crc kubenswrapper[4744]: I0311 01:19:53.083205 4744 memory_manager.go:354] "RemoveStaleState removing state" podUID="a90e479a-2c1d-4a55-9f51-eadbc3c0b333" containerName="dnsmasq-dns" Mar 11 01:19:53 crc kubenswrapper[4744]: I0311 01:19:53.083217 4744 memory_manager.go:354] "RemoveStaleState removing state" podUID="97c4f09f-eb97-40a0-b06c-80a5a922c986" containerName="ovsdbserver-nb" Mar 11 01:19:53 crc kubenswrapper[4744]: I0311 01:19:53.083229 4744 memory_manager.go:354] "RemoveStaleState removing state" podUID="1ca73c89-992f-4b36-9a70-5d67bace9cd2" containerName="ovsdbserver-sb" Mar 11 01:19:53 crc kubenswrapper[4744]: I0311 01:19:53.084212 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-ab0f-account-create-update-n4gnt" Mar 11 01:19:53 crc kubenswrapper[4744]: I0311 01:19:53.088850 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-db-secret" Mar 11 01:19:53 crc kubenswrapper[4744]: I0311 01:19:53.100389 4744 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-95dc-account-create-update-x25mp" Mar 11 01:19:53 crc kubenswrapper[4744]: I0311 01:19:53.118431 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-ab0f-account-create-update-n4gnt"] Mar 11 01:19:53 crc kubenswrapper[4744]: E0311 01:19:53.130447 4744 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="a7377528de2e5a9a7450056f3a281f5a1113ad08b0bc9f7c476cd009942572ee" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Mar 11 01:19:53 crc kubenswrapper[4744]: E0311 01:19:53.173679 4744 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="a7377528de2e5a9a7450056f3a281f5a1113ad08b0bc9f7c476cd009942572ee" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Mar 11 01:19:53 crc kubenswrapper[4744]: E0311 01:19:53.178217 4744 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="a7377528de2e5a9a7450056f3a281f5a1113ad08b0bc9f7c476cd009942572ee" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Mar 11 01:19:53 crc kubenswrapper[4744]: E0311 01:19:53.178278 4744 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/nova-cell0-conductor-0" podUID="c8caed76-baba-4ad3-b95a-e428132f2021" containerName="nova-cell0-conductor-conductor" Mar 11 01:19:53 crc kubenswrapper[4744]: I0311 01:19:53.186469 4744 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-bootstrap-p8czs"] Mar 11 
01:19:53 crc kubenswrapper[4744]: I0311 01:19:53.231484 4744 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-bootstrap-p8czs"] Mar 11 01:19:53 crc kubenswrapper[4744]: I0311 01:19:53.253577 4744 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-db-sync-r8hsb"] Mar 11 01:19:53 crc kubenswrapper[4744]: I0311 01:19:53.259224 4744 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-db-sync-r8hsb"] Mar 11 01:19:53 crc kubenswrapper[4744]: I0311 01:19:53.274568 4744 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/openstack-galera-0"] Mar 11 01:19:53 crc kubenswrapper[4744]: I0311 01:19:53.290575 4744 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-db-create-4wphb"] Mar 11 01:19:53 crc kubenswrapper[4744]: I0311 01:19:53.295938 4744 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-ab0f-account-create-update-n4gnt"] Mar 11 01:19:53 crc kubenswrapper[4744]: I0311 01:19:53.298198 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wfc6b\" (UniqueName: \"kubernetes.io/projected/d8b9e96a-e188-4f50-b2a8-95729300c2d3-kube-api-access-wfc6b\") pod \"d8b9e96a-e188-4f50-b2a8-95729300c2d3\" (UID: \"d8b9e96a-e188-4f50-b2a8-95729300c2d3\") " Mar 11 01:19:53 crc kubenswrapper[4744]: I0311 01:19:53.298275 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d8b9e96a-e188-4f50-b2a8-95729300c2d3-operator-scripts\") pod \"d8b9e96a-e188-4f50-b2a8-95729300c2d3\" (UID: \"d8b9e96a-e188-4f50-b2a8-95729300c2d3\") " Mar 11 01:19:53 crc kubenswrapper[4744]: I0311 01:19:53.298695 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4a418ad0-7ff0-474b-944a-71418601beb9-operator-scripts\") pod 
\"keystone-ab0f-account-create-update-n4gnt\" (UID: \"4a418ad0-7ff0-474b-944a-71418601beb9\") " pod="openstack/keystone-ab0f-account-create-update-n4gnt" Mar 11 01:19:53 crc kubenswrapper[4744]: I0311 01:19:53.298776 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sc9w5\" (UniqueName: \"kubernetes.io/projected/4a418ad0-7ff0-474b-944a-71418601beb9-kube-api-access-sc9w5\") pod \"keystone-ab0f-account-create-update-n4gnt\" (UID: \"4a418ad0-7ff0-474b-944a-71418601beb9\") " pod="openstack/keystone-ab0f-account-create-update-n4gnt" Mar 11 01:19:53 crc kubenswrapper[4744]: I0311 01:19:53.299743 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d8b9e96a-e188-4f50-b2a8-95729300c2d3-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "d8b9e96a-e188-4f50-b2a8-95729300c2d3" (UID: "d8b9e96a-e188-4f50-b2a8-95729300c2d3"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 11 01:19:53 crc kubenswrapper[4744]: I0311 01:19:53.301469 4744 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-7cc5cb746-kmb4g"] Mar 11 01:19:53 crc kubenswrapper[4744]: I0311 01:19:53.301705 4744 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/keystone-7cc5cb746-kmb4g" podUID="2ee4fd0f-1e90-4771-bca2-2eb17df0b0b1" containerName="keystone-api" containerID="cri-o://a5f82d457fa59f44d576003aefd017ef2285b45e6c69abd14e7eb7f7df02fc09" gracePeriod=30 Mar 11 01:19:53 crc kubenswrapper[4744]: E0311 01:19:53.308487 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[kube-api-access-sc9w5 operator-scripts], unattached volumes=[], failed to process volumes=[]: context canceled" pod="openstack/keystone-ab0f-account-create-update-n4gnt" podUID="4a418ad0-7ff0-474b-944a-71418601beb9" Mar 11 01:19:53 crc kubenswrapper[4744]: I0311 01:19:53.308582 4744 
kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-db-create-4wphb"] Mar 11 01:19:53 crc kubenswrapper[4744]: I0311 01:19:53.313967 4744 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/root-account-create-update-rplmm"] Mar 11 01:19:53 crc kubenswrapper[4744]: I0311 01:19:53.323095 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d8b9e96a-e188-4f50-b2a8-95729300c2d3-kube-api-access-wfc6b" (OuterVolumeSpecName: "kube-api-access-wfc6b") pod "d8b9e96a-e188-4f50-b2a8-95729300c2d3" (UID: "d8b9e96a-e188-4f50-b2a8-95729300c2d3"). InnerVolumeSpecName "kube-api-access-wfc6b". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 01:19:53 crc kubenswrapper[4744]: I0311 01:19:53.400793 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4a418ad0-7ff0-474b-944a-71418601beb9-operator-scripts\") pod \"keystone-ab0f-account-create-update-n4gnt\" (UID: \"4a418ad0-7ff0-474b-944a-71418601beb9\") " pod="openstack/keystone-ab0f-account-create-update-n4gnt" Mar 11 01:19:53 crc kubenswrapper[4744]: I0311 01:19:53.400896 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sc9w5\" (UniqueName: \"kubernetes.io/projected/4a418ad0-7ff0-474b-944a-71418601beb9-kube-api-access-sc9w5\") pod \"keystone-ab0f-account-create-update-n4gnt\" (UID: \"4a418ad0-7ff0-474b-944a-71418601beb9\") " pod="openstack/keystone-ab0f-account-create-update-n4gnt" Mar 11 01:19:53 crc kubenswrapper[4744]: I0311 01:19:53.400981 4744 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wfc6b\" (UniqueName: \"kubernetes.io/projected/d8b9e96a-e188-4f50-b2a8-95729300c2d3-kube-api-access-wfc6b\") on node \"crc\" DevicePath \"\"" Mar 11 01:19:53 crc kubenswrapper[4744]: I0311 01:19:53.400993 4744 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" 
(UniqueName: \"kubernetes.io/configmap/d8b9e96a-e188-4f50-b2a8-95729300c2d3-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 11 01:19:53 crc kubenswrapper[4744]: E0311 01:19:53.402669 4744 configmap.go:193] Couldn't get configMap openstack/openstack-scripts: configmap "openstack-scripts" not found Mar 11 01:19:53 crc kubenswrapper[4744]: E0311 01:19:53.402745 4744 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/4a418ad0-7ff0-474b-944a-71418601beb9-operator-scripts podName:4a418ad0-7ff0-474b-944a-71418601beb9 nodeName:}" failed. No retries permitted until 2026-03-11 01:19:53.902728034 +0000 UTC m=+1550.706945629 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "operator-scripts" (UniqueName: "kubernetes.io/configmap/4a418ad0-7ff0-474b-944a-71418601beb9-operator-scripts") pod "keystone-ab0f-account-create-update-n4gnt" (UID: "4a418ad0-7ff0-474b-944a-71418601beb9") : configmap "openstack-scripts" not found Mar 11 01:19:53 crc kubenswrapper[4744]: E0311 01:19:53.406536 4744 projected.go:194] Error preparing data for projected volume kube-api-access-sc9w5 for pod openstack/keystone-ab0f-account-create-update-n4gnt: failed to fetch token: serviceaccounts "galera-openstack" not found Mar 11 01:19:53 crc kubenswrapper[4744]: E0311 01:19:53.406596 4744 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/4a418ad0-7ff0-474b-944a-71418601beb9-kube-api-access-sc9w5 podName:4a418ad0-7ff0-474b-944a-71418601beb9 nodeName:}" failed. No retries permitted until 2026-03-11 01:19:53.906580143 +0000 UTC m=+1550.710797748 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "kube-api-access-sc9w5" (UniqueName: "kubernetes.io/projected/4a418ad0-7ff0-474b-944a-71418601beb9-kube-api-access-sc9w5") pod "keystone-ab0f-account-create-update-n4gnt" (UID: "4a418ad0-7ff0-474b-944a-71418601beb9") : failed to fetch token: serviceaccounts "galera-openstack" not found Mar 11 01:19:53 crc kubenswrapper[4744]: I0311 01:19:53.524181 4744 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/nova-metadata-0" podUID="b9edcd5c-3634-45f9-914a-0d8e4f425302" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.0.210:8775/\": dial tcp 10.217.0.210:8775: connect: connection refused" Mar 11 01:19:53 crc kubenswrapper[4744]: I0311 01:19:53.524751 4744 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/nova-metadata-0" podUID="b9edcd5c-3634-45f9-914a-0d8e4f425302" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.0.210:8775/\": dial tcp 10.217.0.210:8775: connect: connection refused" Mar 11 01:19:53 crc kubenswrapper[4744]: I0311 01:19:53.586076 4744 generic.go:334] "Generic (PLEG): container finished" podID="33c0b5dd-192a-4fd2-bbaa-b483399724df" containerID="3d1ff7ea2ca1a4a7692ee87d6f5ba883dc249a72e783a373c386078d938baf27" exitCode=2 Mar 11 01:19:53 crc kubenswrapper[4744]: I0311 01:19:53.586132 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"33c0b5dd-192a-4fd2-bbaa-b483399724df","Type":"ContainerDied","Data":"3d1ff7ea2ca1a4a7692ee87d6f5ba883dc249a72e783a373c386078d938baf27"} Mar 11 01:19:53 crc kubenswrapper[4744]: I0311 01:19:53.599778 4744 generic.go:334] "Generic (PLEG): container finished" podID="a6b56953-c881-474c-a21f-4a39102d89ab" containerID="d932b2416d71a86f80ef3581b5216ce6ba10ab543bf647b40daeebe0c83edbaa" exitCode=0 Mar 11 01:19:53 crc kubenswrapper[4744]: I0311 01:19:53.599849 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/placement-7f78d57d44-gt8df" event={"ID":"a6b56953-c881-474c-a21f-4a39102d89ab","Type":"ContainerDied","Data":"d932b2416d71a86f80ef3581b5216ce6ba10ab543bf647b40daeebe0c83edbaa"} Mar 11 01:19:53 crc kubenswrapper[4744]: I0311 01:19:53.761250 4744 kubelet_pods.go:1007] "Unable to retrieve pull secret, the image pull may not succeed." pod="openstack/root-account-create-update-rplmm" secret="" err="secret \"galera-openstack-dockercfg-znjfv\" not found" Mar 11 01:19:53 crc kubenswrapper[4744]: I0311 01:19:53.761295 4744 scope.go:117] "RemoveContainer" containerID="b93a05d9c05b761944f0a54e2a10841bc55777d4b1e2fe855855e9122f7a3fdb" Mar 11 01:19:53 crc kubenswrapper[4744]: E0311 01:19:53.761576 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"mariadb-account-create-update\" with CrashLoopBackOff: \"back-off 10s restarting failed container=mariadb-account-create-update pod=root-account-create-update-rplmm_openstack(7f3aa5cc-eae2-4d60-96cf-6d847a5599ec)\"" pod="openstack/root-account-create-update-rplmm" podUID="7f3aa5cc-eae2-4d60-96cf-6d847a5599ec" Mar 11 01:19:53 crc kubenswrapper[4744]: I0311 01:19:53.799013 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-95dc-account-create-update-x25mp" event={"ID":"d8b9e96a-e188-4f50-b2a8-95729300c2d3","Type":"ContainerDied","Data":"faf2e6fd8a6d74abf8367960f261bea53496305814e7976ff1fa97c8962adb63"} Mar 11 01:19:53 crc kubenswrapper[4744]: I0311 01:19:53.799130 4744 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-95dc-account-create-update-x25mp" Mar 11 01:19:53 crc kubenswrapper[4744]: I0311 01:19:53.847973 4744 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/kube-state-metrics-0" Mar 11 01:19:53 crc kubenswrapper[4744]: I0311 01:19:53.857984 4744 generic.go:334] "Generic (PLEG): container finished" podID="4767cbee-21c4-4deb-871a-9c6169f5741d" containerID="42a9b4781df7a35e2423fc3e40c26d692ed2ab1efd0387967762e554fa2952fa" exitCode=0 Mar 11 01:19:53 crc kubenswrapper[4744]: I0311 01:19:53.858065 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"4767cbee-21c4-4deb-871a-9c6169f5741d","Type":"ContainerDied","Data":"42a9b4781df7a35e2423fc3e40c26d692ed2ab1efd0387967762e554fa2952fa"} Mar 11 01:19:53 crc kubenswrapper[4744]: I0311 01:19:53.861025 4744 generic.go:334] "Generic (PLEG): container finished" podID="b9edcd5c-3634-45f9-914a-0d8e4f425302" containerID="55772af813a2f508c881ce1d4bcdd6c3b028b1ce12f20693da7faae23420077f" exitCode=0 Mar 11 01:19:53 crc kubenswrapper[4744]: I0311 01:19:53.861091 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"b9edcd5c-3634-45f9-914a-0d8e4f425302","Type":"ContainerDied","Data":"55772af813a2f508c881ce1d4bcdd6c3b028b1ce12f20693da7faae23420077f"} Mar 11 01:19:53 crc kubenswrapper[4744]: I0311 01:19:53.864354 4744 generic.go:334] "Generic (PLEG): container finished" podID="ae4f1d0a-32b7-4ec5-9f5b-a43589af4119" containerID="5be8527746802ca45e3d648c31b7d6bd21a5227a4e2fd3aeb4816ee96bea245a" exitCode=0 Mar 11 01:19:53 crc kubenswrapper[4744]: I0311 01:19:53.864420 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"ae4f1d0a-32b7-4ec5-9f5b-a43589af4119","Type":"ContainerDied","Data":"5be8527746802ca45e3d648c31b7d6bd21a5227a4e2fd3aeb4816ee96bea245a"} Mar 11 01:19:53 crc kubenswrapper[4744]: I0311 01:19:53.866897 4744 generic.go:334] "Generic (PLEG): container finished" podID="23fcfdba-12bc-4a94-94cd-fb703f2e632c" containerID="d169bd1e2373ec6a70d9c8ea39075ed35856b33455affec0b49a8f185690d2f6" 
exitCode=0 Mar 11 01:19:53 crc kubenswrapper[4744]: I0311 01:19:53.866937 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"23fcfdba-12bc-4a94-94cd-fb703f2e632c","Type":"ContainerDied","Data":"d169bd1e2373ec6a70d9c8ea39075ed35856b33455affec0b49a8f185690d2f6"} Mar 11 01:19:53 crc kubenswrapper[4744]: I0311 01:19:53.876577 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-mw9fv" event={"ID":"e910960b-a434-4830-b4be-96571fa4dd54","Type":"ContainerStarted","Data":"efa0f7e22d4db8192c397671ab3de9108246f3496641eb434b36d2d1b72c1dfb"} Mar 11 01:19:53 crc kubenswrapper[4744]: I0311 01:19:53.878935 4744 generic.go:334] "Generic (PLEG): container finished" podID="cc403516-137f-4bfb-badf-89b13ff0468f" containerID="97d144a3e5f0e96a87be2e26bd2f1bc7509fe6d661b46663a89c2c893e10f70a" exitCode=2 Mar 11 01:19:53 crc kubenswrapper[4744]: I0311 01:19:53.878975 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-ab0f-account-create-update-n4gnt" Mar 11 01:19:53 crc kubenswrapper[4744]: I0311 01:19:53.879041 4744 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/kube-state-metrics-0" Mar 11 01:19:53 crc kubenswrapper[4744]: I0311 01:19:53.879157 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"cc403516-137f-4bfb-badf-89b13ff0468f","Type":"ContainerDied","Data":"97d144a3e5f0e96a87be2e26bd2f1bc7509fe6d661b46663a89c2c893e10f70a"} Mar 11 01:19:53 crc kubenswrapper[4744]: I0311 01:19:53.879189 4744 scope.go:117] "RemoveContainer" containerID="97d144a3e5f0e96a87be2e26bd2f1bc7509fe6d661b46663a89c2c893e10f70a" Mar 11 01:19:53 crc kubenswrapper[4744]: I0311 01:19:53.880050 4744 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/cinder-api-0" podUID="f11c8953-d88f-4d37-8366-b0b61606fa8a" containerName="cinder-api" probeResult="failure" output="Get \"https://10.217.0.170:8776/healthcheck\": dial tcp 10.217.0.170:8776: connect: connection refused" Mar 11 01:19:53 crc kubenswrapper[4744]: I0311 01:19:53.917698 4744 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Mar 11 01:19:53 crc kubenswrapper[4744]: I0311 01:19:53.921744 4744 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-ab0f-account-create-update-n4gnt" Mar 11 01:19:53 crc kubenswrapper[4744]: I0311 01:19:53.930136 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sc9w5\" (UniqueName: \"kubernetes.io/projected/4a418ad0-7ff0-474b-944a-71418601beb9-kube-api-access-sc9w5\") pod \"keystone-ab0f-account-create-update-n4gnt\" (UID: \"4a418ad0-7ff0-474b-944a-71418601beb9\") " pod="openstack/keystone-ab0f-account-create-update-n4gnt" Mar 11 01:19:53 crc kubenswrapper[4744]: E0311 01:19:53.931288 4744 configmap.go:193] Couldn't get configMap openstack/openstack-scripts: configmap "openstack-scripts" not found Mar 11 01:19:53 crc kubenswrapper[4744]: E0311 01:19:53.931363 4744 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/7f3aa5cc-eae2-4d60-96cf-6d847a5599ec-operator-scripts podName:7f3aa5cc-eae2-4d60-96cf-6d847a5599ec nodeName:}" failed. No retries permitted until 2026-03-11 01:19:54.431345704 +0000 UTC m=+1551.235563309 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "operator-scripts" (UniqueName: "kubernetes.io/configmap/7f3aa5cc-eae2-4d60-96cf-6d847a5599ec-operator-scripts") pod "root-account-create-update-rplmm" (UID: "7f3aa5cc-eae2-4d60-96cf-6d847a5599ec") : configmap "openstack-scripts" not found Mar 11 01:19:53 crc kubenswrapper[4744]: I0311 01:19:53.934317 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4a418ad0-7ff0-474b-944a-71418601beb9-operator-scripts\") pod \"keystone-ab0f-account-create-update-n4gnt\" (UID: \"4a418ad0-7ff0-474b-944a-71418601beb9\") " pod="openstack/keystone-ab0f-account-create-update-n4gnt" Mar 11 01:19:53 crc kubenswrapper[4744]: E0311 01:19:53.934575 4744 projected.go:194] Error preparing data for projected volume kube-api-access-sc9w5 for pod openstack/keystone-ab0f-account-create-update-n4gnt: failed to fetch token: serviceaccounts "galera-openstack" not found Mar 11 01:19:53 crc kubenswrapper[4744]: E0311 01:19:53.934635 4744 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/4a418ad0-7ff0-474b-944a-71418601beb9-kube-api-access-sc9w5 podName:4a418ad0-7ff0-474b-944a-71418601beb9 nodeName:}" failed. No retries permitted until 2026-03-11 01:19:54.934619375 +0000 UTC m=+1551.738836970 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-sc9w5" (UniqueName: "kubernetes.io/projected/4a418ad0-7ff0-474b-944a-71418601beb9-kube-api-access-sc9w5") pod "keystone-ab0f-account-create-update-n4gnt" (UID: "4a418ad0-7ff0-474b-944a-71418601beb9") : failed to fetch token: serviceaccounts "galera-openstack" not found Mar 11 01:19:53 crc kubenswrapper[4744]: E0311 01:19:53.934681 4744 configmap.go:193] Couldn't get configMap openstack/openstack-scripts: configmap "openstack-scripts" not found Mar 11 01:19:53 crc kubenswrapper[4744]: E0311 01:19:53.934709 4744 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/4a418ad0-7ff0-474b-944a-71418601beb9-operator-scripts podName:4a418ad0-7ff0-474b-944a-71418601beb9 nodeName:}" failed. No retries permitted until 2026-03-11 01:19:54.934703118 +0000 UTC m=+1551.738920723 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "operator-scripts" (UniqueName: "kubernetes.io/configmap/4a418ad0-7ff0-474b-944a-71418601beb9-operator-scripts") pod "keystone-ab0f-account-create-update-n4gnt" (UID: "4a418ad0-7ff0-474b-944a-71418601beb9") : configmap "openstack-scripts" not found Mar 11 01:19:53 crc kubenswrapper[4744]: I0311 01:19:53.939030 4744 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-95dc-account-create-update-x25mp"] Mar 11 01:19:53 crc kubenswrapper[4744]: I0311 01:19:53.945197 4744 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-95dc-account-create-update-x25mp"] Mar 11 01:19:54 crc kubenswrapper[4744]: I0311 01:19:54.015310 4744 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="20141525-666b-4609-b145-d38380a5d7c7" path="/var/lib/kubelet/pods/20141525-666b-4609-b145-d38380a5d7c7/volumes" Mar 11 01:19:54 crc kubenswrapper[4744]: I0311 01:19:54.015722 4744 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="30905e0e-95fa-4d7c-b586-f02ef591dc1d" 
path="/var/lib/kubelet/pods/30905e0e-95fa-4d7c-b586-f02ef591dc1d/volumes" Mar 11 01:19:54 crc kubenswrapper[4744]: I0311 01:19:54.016346 4744 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="382c7504-68d5-4132-adc7-fc2c804e5d3e" path="/var/lib/kubelet/pods/382c7504-68d5-4132-adc7-fc2c804e5d3e/volumes" Mar 11 01:19:54 crc kubenswrapper[4744]: I0311 01:19:54.018728 4744 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/openstack-galera-0" podUID="6b383e05-0440-49ee-8add-708ea04e9ce7" containerName="galera" containerID="cri-o://364c3fa121dddb8ffc883e3753ab0a397a68bab1a3f4f81bef538a6bf6da07b9" gracePeriod=30 Mar 11 01:19:54 crc kubenswrapper[4744]: I0311 01:19:54.019392 4744 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3d70370e-fce0-48d8-9856-06e04916e905" path="/var/lib/kubelet/pods/3d70370e-fce0-48d8-9856-06e04916e905/volumes" Mar 11 01:19:54 crc kubenswrapper[4744]: I0311 01:19:54.019579 4744 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-mw9fv" podStartSLOduration=5.067812522 podStartE2EDuration="8.019558543s" podCreationTimestamp="2026-03-11 01:19:46 +0000 UTC" firstStartedPulling="2026-03-11 01:19:50.042213473 +0000 UTC m=+1546.846431068" lastFinishedPulling="2026-03-11 01:19:52.993959484 +0000 UTC m=+1549.798177089" observedRunningTime="2026-03-11 01:19:53.970705162 +0000 UTC m=+1550.774922767" watchObservedRunningTime="2026-03-11 01:19:54.019558543 +0000 UTC m=+1550.823776148" Mar 11 01:19:54 crc kubenswrapper[4744]: I0311 01:19:54.019878 4744 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4ed881ae-cd59-4830-99e9-34ed7708ed83" path="/var/lib/kubelet/pods/4ed881ae-cd59-4830-99e9-34ed7708ed83/volumes" Mar 11 01:19:54 crc kubenswrapper[4744]: I0311 01:19:54.020212 4744 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="755410bb-361b-47e2-8a7a-317119eec983" 
path="/var/lib/kubelet/pods/755410bb-361b-47e2-8a7a-317119eec983/volumes" Mar 11 01:19:54 crc kubenswrapper[4744]: I0311 01:19:54.021155 4744 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="810da0cb-5013-4997-84ba-4437bce2a20d" path="/var/lib/kubelet/pods/810da0cb-5013-4997-84ba-4437bce2a20d/volumes" Mar 11 01:19:54 crc kubenswrapper[4744]: I0311 01:19:54.024427 4744 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d60ef156-5767-43c1-bb0b-a8c681a8a6be" path="/var/lib/kubelet/pods/d60ef156-5767-43c1-bb0b-a8c681a8a6be/volumes" Mar 11 01:19:54 crc kubenswrapper[4744]: I0311 01:19:54.024953 4744 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d8b9e96a-e188-4f50-b2a8-95729300c2d3" path="/var/lib/kubelet/pods/d8b9e96a-e188-4f50-b2a8-95729300c2d3/volumes" Mar 11 01:19:54 crc kubenswrapper[4744]: I0311 01:19:54.025286 4744 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="da71c584-ebbd-42c0-96c9-716bbd47efce" path="/var/lib/kubelet/pods/da71c584-ebbd-42c0-96c9-716bbd47efce/volumes" Mar 11 01:19:54 crc kubenswrapper[4744]: I0311 01:19:54.043151 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/23fcfdba-12bc-4a94-94cd-fb703f2e632c-config-data\") pod \"23fcfdba-12bc-4a94-94cd-fb703f2e632c\" (UID: \"23fcfdba-12bc-4a94-94cd-fb703f2e632c\") " Mar 11 01:19:54 crc kubenswrapper[4744]: I0311 01:19:54.043199 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bbw4k\" (UniqueName: \"kubernetes.io/projected/23fcfdba-12bc-4a94-94cd-fb703f2e632c-kube-api-access-bbw4k\") pod \"23fcfdba-12bc-4a94-94cd-fb703f2e632c\" (UID: \"23fcfdba-12bc-4a94-94cd-fb703f2e632c\") " Mar 11 01:19:54 crc kubenswrapper[4744]: I0311 01:19:54.043238 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/23fcfdba-12bc-4a94-94cd-fb703f2e632c-internal-tls-certs\") pod \"23fcfdba-12bc-4a94-94cd-fb703f2e632c\" (UID: \"23fcfdba-12bc-4a94-94cd-fb703f2e632c\") " Mar 11 01:19:54 crc kubenswrapper[4744]: I0311 01:19:54.043293 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/23fcfdba-12bc-4a94-94cd-fb703f2e632c-combined-ca-bundle\") pod \"23fcfdba-12bc-4a94-94cd-fb703f2e632c\" (UID: \"23fcfdba-12bc-4a94-94cd-fb703f2e632c\") " Mar 11 01:19:54 crc kubenswrapper[4744]: I0311 01:19:54.043308 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/cc403516-137f-4bfb-badf-89b13ff0468f-kube-state-metrics-tls-config\") pod \"cc403516-137f-4bfb-badf-89b13ff0468f\" (UID: \"cc403516-137f-4bfb-badf-89b13ff0468f\") " Mar 11 01:19:54 crc kubenswrapper[4744]: I0311 01:19:54.043338 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/cc403516-137f-4bfb-badf-89b13ff0468f-kube-state-metrics-tls-certs\") pod \"cc403516-137f-4bfb-badf-89b13ff0468f\" (UID: \"cc403516-137f-4bfb-badf-89b13ff0468f\") " Mar 11 01:19:54 crc kubenswrapper[4744]: I0311 01:19:54.043382 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-q22d5\" (UniqueName: \"kubernetes.io/projected/cc403516-137f-4bfb-badf-89b13ff0468f-kube-api-access-q22d5\") pod \"cc403516-137f-4bfb-badf-89b13ff0468f\" (UID: \"cc403516-137f-4bfb-badf-89b13ff0468f\") " Mar 11 01:19:54 crc kubenswrapper[4744]: I0311 01:19:54.043407 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/23fcfdba-12bc-4a94-94cd-fb703f2e632c-logs\") pod \"23fcfdba-12bc-4a94-94cd-fb703f2e632c\" (UID: \"23fcfdba-12bc-4a94-94cd-fb703f2e632c\") " 
Mar 11 01:19:54 crc kubenswrapper[4744]: I0311 01:19:54.043438 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cc403516-137f-4bfb-badf-89b13ff0468f-combined-ca-bundle\") pod \"cc403516-137f-4bfb-badf-89b13ff0468f\" (UID: \"cc403516-137f-4bfb-badf-89b13ff0468f\") " Mar 11 01:19:54 crc kubenswrapper[4744]: I0311 01:19:54.043483 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/23fcfdba-12bc-4a94-94cd-fb703f2e632c-public-tls-certs\") pod \"23fcfdba-12bc-4a94-94cd-fb703f2e632c\" (UID: \"23fcfdba-12bc-4a94-94cd-fb703f2e632c\") " Mar 11 01:19:54 crc kubenswrapper[4744]: I0311 01:19:54.044996 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/23fcfdba-12bc-4a94-94cd-fb703f2e632c-logs" (OuterVolumeSpecName: "logs") pod "23fcfdba-12bc-4a94-94cd-fb703f2e632c" (UID: "23fcfdba-12bc-4a94-94cd-fb703f2e632c"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 11 01:19:54 crc kubenswrapper[4744]: I0311 01:19:54.063876 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/23fcfdba-12bc-4a94-94cd-fb703f2e632c-kube-api-access-bbw4k" (OuterVolumeSpecName: "kube-api-access-bbw4k") pod "23fcfdba-12bc-4a94-94cd-fb703f2e632c" (UID: "23fcfdba-12bc-4a94-94cd-fb703f2e632c"). InnerVolumeSpecName "kube-api-access-bbw4k". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 01:19:54 crc kubenswrapper[4744]: I0311 01:19:54.068118 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cc403516-137f-4bfb-badf-89b13ff0468f-kube-api-access-q22d5" (OuterVolumeSpecName: "kube-api-access-q22d5") pod "cc403516-137f-4bfb-badf-89b13ff0468f" (UID: "cc403516-137f-4bfb-badf-89b13ff0468f"). InnerVolumeSpecName "kube-api-access-q22d5". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 01:19:54 crc kubenswrapper[4744]: I0311 01:19:54.118886 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/23fcfdba-12bc-4a94-94cd-fb703f2e632c-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "23fcfdba-12bc-4a94-94cd-fb703f2e632c" (UID: "23fcfdba-12bc-4a94-94cd-fb703f2e632c"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 01:19:54 crc kubenswrapper[4744]: I0311 01:19:54.145970 4744 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/23fcfdba-12bc-4a94-94cd-fb703f2e632c-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 11 01:19:54 crc kubenswrapper[4744]: I0311 01:19:54.146002 4744 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-q22d5\" (UniqueName: \"kubernetes.io/projected/cc403516-137f-4bfb-badf-89b13ff0468f-kube-api-access-q22d5\") on node \"crc\" DevicePath \"\"" Mar 11 01:19:54 crc kubenswrapper[4744]: I0311 01:19:54.146015 4744 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/23fcfdba-12bc-4a94-94cd-fb703f2e632c-logs\") on node \"crc\" DevicePath \"\"" Mar 11 01:19:54 crc kubenswrapper[4744]: I0311 01:19:54.146027 4744 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bbw4k\" (UniqueName: \"kubernetes.io/projected/23fcfdba-12bc-4a94-94cd-fb703f2e632c-kube-api-access-bbw4k\") on node \"crc\" DevicePath \"\"" Mar 11 01:19:54 crc kubenswrapper[4744]: I0311 01:19:54.157816 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/23fcfdba-12bc-4a94-94cd-fb703f2e632c-config-data" (OuterVolumeSpecName: "config-data") pod "23fcfdba-12bc-4a94-94cd-fb703f2e632c" (UID: "23fcfdba-12bc-4a94-94cd-fb703f2e632c"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 01:19:54 crc kubenswrapper[4744]: I0311 01:19:54.161251 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cc403516-137f-4bfb-badf-89b13ff0468f-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "cc403516-137f-4bfb-badf-89b13ff0468f" (UID: "cc403516-137f-4bfb-badf-89b13ff0468f"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 01:19:54 crc kubenswrapper[4744]: I0311 01:19:54.199649 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cc403516-137f-4bfb-badf-89b13ff0468f-kube-state-metrics-tls-config" (OuterVolumeSpecName: "kube-state-metrics-tls-config") pod "cc403516-137f-4bfb-badf-89b13ff0468f" (UID: "cc403516-137f-4bfb-badf-89b13ff0468f"). InnerVolumeSpecName "kube-state-metrics-tls-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 01:19:54 crc kubenswrapper[4744]: I0311 01:19:54.223408 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/23fcfdba-12bc-4a94-94cd-fb703f2e632c-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "23fcfdba-12bc-4a94-94cd-fb703f2e632c" (UID: "23fcfdba-12bc-4a94-94cd-fb703f2e632c"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 01:19:54 crc kubenswrapper[4744]: I0311 01:19:54.278082 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cc403516-137f-4bfb-badf-89b13ff0468f-kube-state-metrics-tls-certs" (OuterVolumeSpecName: "kube-state-metrics-tls-certs") pod "cc403516-137f-4bfb-badf-89b13ff0468f" (UID: "cc403516-137f-4bfb-badf-89b13ff0468f"). InnerVolumeSpecName "kube-state-metrics-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 01:19:54 crc kubenswrapper[4744]: I0311 01:19:54.283083 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/23fcfdba-12bc-4a94-94cd-fb703f2e632c-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "23fcfdba-12bc-4a94-94cd-fb703f2e632c" (UID: "23fcfdba-12bc-4a94-94cd-fb703f2e632c"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 01:19:54 crc kubenswrapper[4744]: I0311 01:19:54.297348 4744 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/23fcfdba-12bc-4a94-94cd-fb703f2e632c-public-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 11 01:19:54 crc kubenswrapper[4744]: I0311 01:19:54.297399 4744 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/23fcfdba-12bc-4a94-94cd-fb703f2e632c-config-data\") on node \"crc\" DevicePath \"\"" Mar 11 01:19:54 crc kubenswrapper[4744]: I0311 01:19:54.297413 4744 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/23fcfdba-12bc-4a94-94cd-fb703f2e632c-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 11 01:19:54 crc kubenswrapper[4744]: I0311 01:19:54.297426 4744 reconciler_common.go:293] "Volume detached for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/cc403516-137f-4bfb-badf-89b13ff0468f-kube-state-metrics-tls-config\") on node \"crc\" DevicePath \"\"" Mar 11 01:19:54 crc kubenswrapper[4744]: I0311 01:19:54.297445 4744 reconciler_common.go:293] "Volume detached for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/cc403516-137f-4bfb-badf-89b13ff0468f-kube-state-metrics-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 11 01:19:54 crc kubenswrapper[4744]: I0311 01:19:54.297459 4744 reconciler_common.go:293] "Volume detached for 
volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cc403516-137f-4bfb-badf-89b13ff0468f-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 11 01:19:54 crc kubenswrapper[4744]: E0311 01:19:54.500288 4744 configmap.go:193] Couldn't get configMap openstack/openstack-scripts: configmap "openstack-scripts" not found Mar 11 01:19:54 crc kubenswrapper[4744]: E0311 01:19:54.500659 4744 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/7f3aa5cc-eae2-4d60-96cf-6d847a5599ec-operator-scripts podName:7f3aa5cc-eae2-4d60-96cf-6d847a5599ec nodeName:}" failed. No retries permitted until 2026-03-11 01:19:55.500644663 +0000 UTC m=+1552.304862268 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "operator-scripts" (UniqueName: "kubernetes.io/configmap/7f3aa5cc-eae2-4d60-96cf-6d847a5599ec-operator-scripts") pod "root-account-create-update-rplmm" (UID: "7f3aa5cc-eae2-4d60-96cf-6d847a5599ec") : configmap "openstack-scripts" not found Mar 11 01:19:54 crc kubenswrapper[4744]: I0311 01:19:54.563085 4744 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-qnrv8" Mar 11 01:19:54 crc kubenswrapper[4744]: I0311 01:19:54.606706 4744 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-7f78d57d44-gt8df" Mar 11 01:19:54 crc kubenswrapper[4744]: I0311 01:19:54.620634 4744 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Mar 11 01:19:54 crc kubenswrapper[4744]: I0311 01:19:54.646382 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-qnrv8" Mar 11 01:19:54 crc kubenswrapper[4744]: I0311 01:19:54.663126 4744 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Mar 11 01:19:54 crc kubenswrapper[4744]: I0311 01:19:54.665780 4744 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Mar 11 01:19:54 crc kubenswrapper[4744]: I0311 01:19:54.665890 4744 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Mar 11 01:19:54 crc kubenswrapper[4744]: I0311 01:19:54.677176 4744 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/kube-state-metrics-0"] Mar 11 01:19:54 crc kubenswrapper[4744]: I0311 01:19:54.684737 4744 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/kube-state-metrics-0"] Mar 11 01:19:54 crc kubenswrapper[4744]: I0311 01:19:54.704357 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4767cbee-21c4-4deb-871a-9c6169f5741d-logs\") pod \"4767cbee-21c4-4deb-871a-9c6169f5741d\" (UID: \"4767cbee-21c4-4deb-871a-9c6169f5741d\") " Mar 11 01:19:54 crc kubenswrapper[4744]: I0311 01:19:54.704396 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/f11c8953-d88f-4d37-8366-b0b61606fa8a-internal-tls-certs\") pod \"f11c8953-d88f-4d37-8366-b0b61606fa8a\" (UID: \"f11c8953-d88f-4d37-8366-b0b61606fa8a\") " Mar 11 01:19:54 crc kubenswrapper[4744]: I0311 01:19:54.704417 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ae4f1d0a-32b7-4ec5-9f5b-a43589af4119-scripts\") pod \"ae4f1d0a-32b7-4ec5-9f5b-a43589af4119\" (UID: \"ae4f1d0a-32b7-4ec5-9f5b-a43589af4119\") " Mar 11 01:19:54 crc kubenswrapper[4744]: I0311 01:19:54.704440 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/b9edcd5c-3634-45f9-914a-0d8e4f425302-logs\") pod \"b9edcd5c-3634-45f9-914a-0d8e4f425302\" (UID: \"b9edcd5c-3634-45f9-914a-0d8e4f425302\") " Mar 11 01:19:54 crc kubenswrapper[4744]: I0311 01:19:54.704467 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-97rcj\" (UniqueName: \"kubernetes.io/projected/a6b56953-c881-474c-a21f-4a39102d89ab-kube-api-access-97rcj\") pod \"a6b56953-c881-474c-a21f-4a39102d89ab\" (UID: \"a6b56953-c881-474c-a21f-4a39102d89ab\") " Mar 11 01:19:54 crc kubenswrapper[4744]: I0311 01:19:54.704487 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f11c8953-d88f-4d37-8366-b0b61606fa8a-combined-ca-bundle\") pod \"f11c8953-d88f-4d37-8366-b0b61606fa8a\" (UID: \"f11c8953-d88f-4d37-8366-b0b61606fa8a\") " Mar 11 01:19:54 crc kubenswrapper[4744]: I0311 01:19:54.704522 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/f11c8953-d88f-4d37-8366-b0b61606fa8a-etc-machine-id\") pod \"f11c8953-d88f-4d37-8366-b0b61606fa8a\" (UID: \"f11c8953-d88f-4d37-8366-b0b61606fa8a\") " Mar 11 01:19:54 crc kubenswrapper[4744]: I0311 01:19:54.704541 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/4767cbee-21c4-4deb-871a-9c6169f5741d-internal-tls-certs\") pod \"4767cbee-21c4-4deb-871a-9c6169f5741d\" (UID: \"4767cbee-21c4-4deb-871a-9c6169f5741d\") " Mar 11 01:19:54 crc kubenswrapper[4744]: I0311 01:19:54.704554 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lwpz8\" (UniqueName: \"kubernetes.io/projected/f11c8953-d88f-4d37-8366-b0b61606fa8a-kube-api-access-lwpz8\") pod \"f11c8953-d88f-4d37-8366-b0b61606fa8a\" (UID: \"f11c8953-d88f-4d37-8366-b0b61606fa8a\") " Mar 11 01:19:54 crc 
kubenswrapper[4744]: I0311 01:19:54.704570 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a6b56953-c881-474c-a21f-4a39102d89ab-logs\") pod \"a6b56953-c881-474c-a21f-4a39102d89ab\" (UID: \"a6b56953-c881-474c-a21f-4a39102d89ab\") " Mar 11 01:19:54 crc kubenswrapper[4744]: I0311 01:19:54.704587 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-f6pmg\" (UniqueName: \"kubernetes.io/projected/b9edcd5c-3634-45f9-914a-0d8e4f425302-kube-api-access-f6pmg\") pod \"b9edcd5c-3634-45f9-914a-0d8e4f425302\" (UID: \"b9edcd5c-3634-45f9-914a-0d8e4f425302\") " Mar 11 01:19:54 crc kubenswrapper[4744]: I0311 01:19:54.704602 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/ae4f1d0a-32b7-4ec5-9f5b-a43589af4119-public-tls-certs\") pod \"ae4f1d0a-32b7-4ec5-9f5b-a43589af4119\" (UID: \"ae4f1d0a-32b7-4ec5-9f5b-a43589af4119\") " Mar 11 01:19:54 crc kubenswrapper[4744]: I0311 01:19:54.704659 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4767cbee-21c4-4deb-871a-9c6169f5741d-scripts\") pod \"4767cbee-21c4-4deb-871a-9c6169f5741d\" (UID: \"4767cbee-21c4-4deb-871a-9c6169f5741d\") " Mar 11 01:19:54 crc kubenswrapper[4744]: I0311 01:19:54.704675 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b9edcd5c-3634-45f9-914a-0d8e4f425302-combined-ca-bundle\") pod \"b9edcd5c-3634-45f9-914a-0d8e4f425302\" (UID: \"b9edcd5c-3634-45f9-914a-0d8e4f425302\") " Mar 11 01:19:54 crc kubenswrapper[4744]: I0311 01:19:54.704698 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a6b56953-c881-474c-a21f-4a39102d89ab-combined-ca-bundle\") 
pod \"a6b56953-c881-474c-a21f-4a39102d89ab\" (UID: \"a6b56953-c881-474c-a21f-4a39102d89ab\") " Mar 11 01:19:54 crc kubenswrapper[4744]: I0311 01:19:54.704717 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-t5hh9\" (UniqueName: \"kubernetes.io/projected/4767cbee-21c4-4deb-871a-9c6169f5741d-kube-api-access-t5hh9\") pod \"4767cbee-21c4-4deb-871a-9c6169f5741d\" (UID: \"4767cbee-21c4-4deb-871a-9c6169f5741d\") " Mar 11 01:19:54 crc kubenswrapper[4744]: I0311 01:19:54.704734 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/4767cbee-21c4-4deb-871a-9c6169f5741d-httpd-run\") pod \"4767cbee-21c4-4deb-871a-9c6169f5741d\" (UID: \"4767cbee-21c4-4deb-871a-9c6169f5741d\") " Mar 11 01:19:54 crc kubenswrapper[4744]: I0311 01:19:54.704759 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ae4f1d0a-32b7-4ec5-9f5b-a43589af4119-logs\") pod \"ae4f1d0a-32b7-4ec5-9f5b-a43589af4119\" (UID: \"ae4f1d0a-32b7-4ec5-9f5b-a43589af4119\") " Mar 11 01:19:54 crc kubenswrapper[4744]: I0311 01:19:54.704775 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ae4f1d0a-32b7-4ec5-9f5b-a43589af4119-combined-ca-bundle\") pod \"ae4f1d0a-32b7-4ec5-9f5b-a43589af4119\" (UID: \"ae4f1d0a-32b7-4ec5-9f5b-a43589af4119\") " Mar 11 01:19:54 crc kubenswrapper[4744]: I0311 01:19:54.704790 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a6b56953-c881-474c-a21f-4a39102d89ab-config-data\") pod \"a6b56953-c881-474c-a21f-4a39102d89ab\" (UID: \"a6b56953-c881-474c-a21f-4a39102d89ab\") " Mar 11 01:19:54 crc kubenswrapper[4744]: I0311 01:19:54.704816 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"4767cbee-21c4-4deb-871a-9c6169f5741d\" (UID: \"4767cbee-21c4-4deb-871a-9c6169f5741d\") " Mar 11 01:19:54 crc kubenswrapper[4744]: I0311 01:19:54.704832 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/a6b56953-c881-474c-a21f-4a39102d89ab-internal-tls-certs\") pod \"a6b56953-c881-474c-a21f-4a39102d89ab\" (UID: \"a6b56953-c881-474c-a21f-4a39102d89ab\") " Mar 11 01:19:54 crc kubenswrapper[4744]: I0311 01:19:54.704866 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"ae4f1d0a-32b7-4ec5-9f5b-a43589af4119\" (UID: \"ae4f1d0a-32b7-4ec5-9f5b-a43589af4119\") " Mar 11 01:19:54 crc kubenswrapper[4744]: I0311 01:19:54.704884 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4767cbee-21c4-4deb-871a-9c6169f5741d-combined-ca-bundle\") pod \"4767cbee-21c4-4deb-871a-9c6169f5741d\" (UID: \"4767cbee-21c4-4deb-871a-9c6169f5741d\") " Mar 11 01:19:54 crc kubenswrapper[4744]: I0311 01:19:54.704901 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f11c8953-d88f-4d37-8366-b0b61606fa8a-scripts\") pod \"f11c8953-d88f-4d37-8366-b0b61606fa8a\" (UID: \"f11c8953-d88f-4d37-8366-b0b61606fa8a\") " Mar 11 01:19:54 crc kubenswrapper[4744]: I0311 01:19:54.704928 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a6b56953-c881-474c-a21f-4a39102d89ab-scripts\") pod \"a6b56953-c881-474c-a21f-4a39102d89ab\" (UID: \"a6b56953-c881-474c-a21f-4a39102d89ab\") " Mar 11 01:19:54 crc kubenswrapper[4744]: I0311 01:19:54.704956 4744 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"kube-api-access-997jj\" (UniqueName: \"kubernetes.io/projected/ae4f1d0a-32b7-4ec5-9f5b-a43589af4119-kube-api-access-997jj\") pod \"ae4f1d0a-32b7-4ec5-9f5b-a43589af4119\" (UID: \"ae4f1d0a-32b7-4ec5-9f5b-a43589af4119\") " Mar 11 01:19:54 crc kubenswrapper[4744]: I0311 01:19:54.704982 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/b9edcd5c-3634-45f9-914a-0d8e4f425302-nova-metadata-tls-certs\") pod \"b9edcd5c-3634-45f9-914a-0d8e4f425302\" (UID: \"b9edcd5c-3634-45f9-914a-0d8e4f425302\") " Mar 11 01:19:54 crc kubenswrapper[4744]: I0311 01:19:54.705002 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/f11c8953-d88f-4d37-8366-b0b61606fa8a-config-data-custom\") pod \"f11c8953-d88f-4d37-8366-b0b61606fa8a\" (UID: \"f11c8953-d88f-4d37-8366-b0b61606fa8a\") " Mar 11 01:19:54 crc kubenswrapper[4744]: I0311 01:19:54.705027 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/ae4f1d0a-32b7-4ec5-9f5b-a43589af4119-httpd-run\") pod \"ae4f1d0a-32b7-4ec5-9f5b-a43589af4119\" (UID: \"ae4f1d0a-32b7-4ec5-9f5b-a43589af4119\") " Mar 11 01:19:54 crc kubenswrapper[4744]: I0311 01:19:54.705047 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b9edcd5c-3634-45f9-914a-0d8e4f425302-config-data\") pod \"b9edcd5c-3634-45f9-914a-0d8e4f425302\" (UID: \"b9edcd5c-3634-45f9-914a-0d8e4f425302\") " Mar 11 01:19:54 crc kubenswrapper[4744]: I0311 01:19:54.705066 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f11c8953-d88f-4d37-8366-b0b61606fa8a-logs\") pod \"f11c8953-d88f-4d37-8366-b0b61606fa8a\" (UID: 
\"f11c8953-d88f-4d37-8366-b0b61606fa8a\") " Mar 11 01:19:54 crc kubenswrapper[4744]: I0311 01:19:54.705108 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/f11c8953-d88f-4d37-8366-b0b61606fa8a-public-tls-certs\") pod \"f11c8953-d88f-4d37-8366-b0b61606fa8a\" (UID: \"f11c8953-d88f-4d37-8366-b0b61606fa8a\") " Mar 11 01:19:54 crc kubenswrapper[4744]: I0311 01:19:54.705133 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4767cbee-21c4-4deb-871a-9c6169f5741d-config-data\") pod \"4767cbee-21c4-4deb-871a-9c6169f5741d\" (UID: \"4767cbee-21c4-4deb-871a-9c6169f5741d\") " Mar 11 01:19:54 crc kubenswrapper[4744]: I0311 01:19:54.705151 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f11c8953-d88f-4d37-8366-b0b61606fa8a-config-data\") pod \"f11c8953-d88f-4d37-8366-b0b61606fa8a\" (UID: \"f11c8953-d88f-4d37-8366-b0b61606fa8a\") " Mar 11 01:19:54 crc kubenswrapper[4744]: I0311 01:19:54.705167 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/a6b56953-c881-474c-a21f-4a39102d89ab-public-tls-certs\") pod \"a6b56953-c881-474c-a21f-4a39102d89ab\" (UID: \"a6b56953-c881-474c-a21f-4a39102d89ab\") " Mar 11 01:19:54 crc kubenswrapper[4744]: I0311 01:19:54.705184 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ae4f1d0a-32b7-4ec5-9f5b-a43589af4119-config-data\") pod \"ae4f1d0a-32b7-4ec5-9f5b-a43589af4119\" (UID: \"ae4f1d0a-32b7-4ec5-9f5b-a43589af4119\") " Mar 11 01:19:54 crc kubenswrapper[4744]: I0311 01:19:54.714572 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ae4f1d0a-32b7-4ec5-9f5b-a43589af4119-logs" 
(OuterVolumeSpecName: "logs") pod "ae4f1d0a-32b7-4ec5-9f5b-a43589af4119" (UID: "ae4f1d0a-32b7-4ec5-9f5b-a43589af4119"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 11 01:19:54 crc kubenswrapper[4744]: I0311 01:19:54.714820 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b9edcd5c-3634-45f9-914a-0d8e4f425302-logs" (OuterVolumeSpecName: "logs") pod "b9edcd5c-3634-45f9-914a-0d8e4f425302" (UID: "b9edcd5c-3634-45f9-914a-0d8e4f425302"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 11 01:19:54 crc kubenswrapper[4744]: I0311 01:19:54.714995 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a6b56953-c881-474c-a21f-4a39102d89ab-logs" (OuterVolumeSpecName: "logs") pod "a6b56953-c881-474c-a21f-4a39102d89ab" (UID: "a6b56953-c881-474c-a21f-4a39102d89ab"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 11 01:19:54 crc kubenswrapper[4744]: I0311 01:19:54.715315 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4767cbee-21c4-4deb-871a-9c6169f5741d-logs" (OuterVolumeSpecName: "logs") pod "4767cbee-21c4-4deb-871a-9c6169f5741d" (UID: "4767cbee-21c4-4deb-871a-9c6169f5741d"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 11 01:19:54 crc kubenswrapper[4744]: I0311 01:19:54.718107 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ae4f1d0a-32b7-4ec5-9f5b-a43589af4119-kube-api-access-997jj" (OuterVolumeSpecName: "kube-api-access-997jj") pod "ae4f1d0a-32b7-4ec5-9f5b-a43589af4119" (UID: "ae4f1d0a-32b7-4ec5-9f5b-a43589af4119"). InnerVolumeSpecName "kube-api-access-997jj". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 01:19:54 crc kubenswrapper[4744]: I0311 01:19:54.720064 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f11c8953-d88f-4d37-8366-b0b61606fa8a-logs" (OuterVolumeSpecName: "logs") pod "f11c8953-d88f-4d37-8366-b0b61606fa8a" (UID: "f11c8953-d88f-4d37-8366-b0b61606fa8a"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 11 01:19:54 crc kubenswrapper[4744]: I0311 01:19:54.720674 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f11c8953-d88f-4d37-8366-b0b61606fa8a-kube-api-access-lwpz8" (OuterVolumeSpecName: "kube-api-access-lwpz8") pod "f11c8953-d88f-4d37-8366-b0b61606fa8a" (UID: "f11c8953-d88f-4d37-8366-b0b61606fa8a"). InnerVolumeSpecName "kube-api-access-lwpz8". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 01:19:54 crc kubenswrapper[4744]: I0311 01:19:54.720714 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f11c8953-d88f-4d37-8366-b0b61606fa8a-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "f11c8953-d88f-4d37-8366-b0b61606fa8a" (UID: "f11c8953-d88f-4d37-8366-b0b61606fa8a"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 11 01:19:54 crc kubenswrapper[4744]: I0311 01:19:54.735590 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4767cbee-21c4-4deb-871a-9c6169f5741d-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "4767cbee-21c4-4deb-871a-9c6169f5741d" (UID: "4767cbee-21c4-4deb-871a-9c6169f5741d"). InnerVolumeSpecName "httpd-run". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 11 01:19:54 crc kubenswrapper[4744]: I0311 01:19:54.765537 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4767cbee-21c4-4deb-871a-9c6169f5741d-kube-api-access-t5hh9" (OuterVolumeSpecName: "kube-api-access-t5hh9") pod "4767cbee-21c4-4deb-871a-9c6169f5741d" (UID: "4767cbee-21c4-4deb-871a-9c6169f5741d"). InnerVolumeSpecName "kube-api-access-t5hh9". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 01:19:54 crc kubenswrapper[4744]: I0311 01:19:54.767548 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4767cbee-21c4-4deb-871a-9c6169f5741d-scripts" (OuterVolumeSpecName: "scripts") pod "4767cbee-21c4-4deb-871a-9c6169f5741d" (UID: "4767cbee-21c4-4deb-871a-9c6169f5741d"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 01:19:54 crc kubenswrapper[4744]: I0311 01:19:54.768620 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b9edcd5c-3634-45f9-914a-0d8e4f425302-kube-api-access-f6pmg" (OuterVolumeSpecName: "kube-api-access-f6pmg") pod "b9edcd5c-3634-45f9-914a-0d8e4f425302" (UID: "b9edcd5c-3634-45f9-914a-0d8e4f425302"). InnerVolumeSpecName "kube-api-access-f6pmg". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 01:19:54 crc kubenswrapper[4744]: I0311 01:19:54.769161 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ae4f1d0a-32b7-4ec5-9f5b-a43589af4119-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "ae4f1d0a-32b7-4ec5-9f5b-a43589af4119" (UID: "ae4f1d0a-32b7-4ec5-9f5b-a43589af4119"). InnerVolumeSpecName "httpd-run". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 11 01:19:54 crc kubenswrapper[4744]: I0311 01:19:54.794234 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ae4f1d0a-32b7-4ec5-9f5b-a43589af4119-scripts" (OuterVolumeSpecName: "scripts") pod "ae4f1d0a-32b7-4ec5-9f5b-a43589af4119" (UID: "ae4f1d0a-32b7-4ec5-9f5b-a43589af4119"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 01:19:55 crc kubenswrapper[4744]: I0311 01:19:54.810359 4744 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ae4f1d0a-32b7-4ec5-9f5b-a43589af4119-logs\") on node \"crc\" DevicePath \"\"" Mar 11 01:19:55 crc kubenswrapper[4744]: I0311 01:19:54.810384 4744 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-997jj\" (UniqueName: \"kubernetes.io/projected/ae4f1d0a-32b7-4ec5-9f5b-a43589af4119-kube-api-access-997jj\") on node \"crc\" DevicePath \"\"" Mar 11 01:19:55 crc kubenswrapper[4744]: I0311 01:19:54.810396 4744 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/ae4f1d0a-32b7-4ec5-9f5b-a43589af4119-httpd-run\") on node \"crc\" DevicePath \"\"" Mar 11 01:19:55 crc kubenswrapper[4744]: I0311 01:19:54.810405 4744 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f11c8953-d88f-4d37-8366-b0b61606fa8a-logs\") on node \"crc\" DevicePath \"\"" Mar 11 01:19:55 crc kubenswrapper[4744]: I0311 01:19:54.810413 4744 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4767cbee-21c4-4deb-871a-9c6169f5741d-logs\") on node \"crc\" DevicePath \"\"" Mar 11 01:19:55 crc kubenswrapper[4744]: I0311 01:19:54.810424 4744 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ae4f1d0a-32b7-4ec5-9f5b-a43589af4119-scripts\") on node \"crc\" DevicePath 
\"\"" Mar 11 01:19:55 crc kubenswrapper[4744]: I0311 01:19:54.810433 4744 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b9edcd5c-3634-45f9-914a-0d8e4f425302-logs\") on node \"crc\" DevicePath \"\"" Mar 11 01:19:55 crc kubenswrapper[4744]: I0311 01:19:54.810442 4744 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/f11c8953-d88f-4d37-8366-b0b61606fa8a-etc-machine-id\") on node \"crc\" DevicePath \"\"" Mar 11 01:19:55 crc kubenswrapper[4744]: I0311 01:19:54.810453 4744 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lwpz8\" (UniqueName: \"kubernetes.io/projected/f11c8953-d88f-4d37-8366-b0b61606fa8a-kube-api-access-lwpz8\") on node \"crc\" DevicePath \"\"" Mar 11 01:19:55 crc kubenswrapper[4744]: I0311 01:19:54.810462 4744 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a6b56953-c881-474c-a21f-4a39102d89ab-logs\") on node \"crc\" DevicePath \"\"" Mar 11 01:19:55 crc kubenswrapper[4744]: I0311 01:19:54.810471 4744 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-f6pmg\" (UniqueName: \"kubernetes.io/projected/b9edcd5c-3634-45f9-914a-0d8e4f425302-kube-api-access-f6pmg\") on node \"crc\" DevicePath \"\"" Mar 11 01:19:55 crc kubenswrapper[4744]: I0311 01:19:54.810479 4744 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4767cbee-21c4-4deb-871a-9c6169f5741d-scripts\") on node \"crc\" DevicePath \"\"" Mar 11 01:19:55 crc kubenswrapper[4744]: I0311 01:19:54.810490 4744 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-t5hh9\" (UniqueName: \"kubernetes.io/projected/4767cbee-21c4-4deb-871a-9c6169f5741d-kube-api-access-t5hh9\") on node \"crc\" DevicePath \"\"" Mar 11 01:19:55 crc kubenswrapper[4744]: I0311 01:19:54.810500 4744 reconciler_common.go:293] "Volume detached for volume 
\"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/4767cbee-21c4-4deb-871a-9c6169f5741d-httpd-run\") on node \"crc\" DevicePath \"\"" Mar 11 01:19:55 crc kubenswrapper[4744]: I0311 01:19:54.817676 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f11c8953-d88f-4d37-8366-b0b61606fa8a-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "f11c8953-d88f-4d37-8366-b0b61606fa8a" (UID: "f11c8953-d88f-4d37-8366-b0b61606fa8a"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 01:19:55 crc kubenswrapper[4744]: I0311 01:19:54.824204 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f11c8953-d88f-4d37-8366-b0b61606fa8a-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "f11c8953-d88f-4d37-8366-b0b61606fa8a" (UID: "f11c8953-d88f-4d37-8366-b0b61606fa8a"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 01:19:55 crc kubenswrapper[4744]: I0311 01:19:54.829854 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a6b56953-c881-474c-a21f-4a39102d89ab-scripts" (OuterVolumeSpecName: "scripts") pod "a6b56953-c881-474c-a21f-4a39102d89ab" (UID: "a6b56953-c881-474c-a21f-4a39102d89ab"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 01:19:55 crc kubenswrapper[4744]: I0311 01:19:54.834766 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f11c8953-d88f-4d37-8366-b0b61606fa8a-scripts" (OuterVolumeSpecName: "scripts") pod "f11c8953-d88f-4d37-8366-b0b61606fa8a" (UID: "f11c8953-d88f-4d37-8366-b0b61606fa8a"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 01:19:55 crc kubenswrapper[4744]: I0311 01:19:54.845772 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage01-crc" (OuterVolumeSpecName: "glance") pod "4767cbee-21c4-4deb-871a-9c6169f5741d" (UID: "4767cbee-21c4-4deb-871a-9c6169f5741d"). InnerVolumeSpecName "local-storage01-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Mar 11 01:19:55 crc kubenswrapper[4744]: I0311 01:19:54.845878 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage04-crc" (OuterVolumeSpecName: "glance") pod "ae4f1d0a-32b7-4ec5-9f5b-a43589af4119" (UID: "ae4f1d0a-32b7-4ec5-9f5b-a43589af4119"). InnerVolumeSpecName "local-storage04-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Mar 11 01:19:55 crc kubenswrapper[4744]: I0311 01:19:54.848646 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a6b56953-c881-474c-a21f-4a39102d89ab-kube-api-access-97rcj" (OuterVolumeSpecName: "kube-api-access-97rcj") pod "a6b56953-c881-474c-a21f-4a39102d89ab" (UID: "a6b56953-c881-474c-a21f-4a39102d89ab"). InnerVolumeSpecName "kube-api-access-97rcj". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 01:19:55 crc kubenswrapper[4744]: I0311 01:19:54.894852 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"4767cbee-21c4-4deb-871a-9c6169f5741d","Type":"ContainerDied","Data":"b86d7ab1af8b5ed8530de243a360bfc4c90e4b884444c461b1c0070bbaa146fe"} Mar 11 01:19:55 crc kubenswrapper[4744]: I0311 01:19:54.895085 4744 scope.go:117] "RemoveContainer" containerID="42a9b4781df7a35e2423fc3e40c26d692ed2ab1efd0387967762e554fa2952fa" Mar 11 01:19:55 crc kubenswrapper[4744]: I0311 01:19:54.895234 4744 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Mar 11 01:19:55 crc kubenswrapper[4744]: I0311 01:19:54.901178 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"b9edcd5c-3634-45f9-914a-0d8e4f425302","Type":"ContainerDied","Data":"ad20406abed1d97d21a11032cbec2fef3caf71dbf971517a038b8a1a756e3205"} Mar 11 01:19:55 crc kubenswrapper[4744]: I0311 01:19:54.901225 4744 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Mar 11 01:19:55 crc kubenswrapper[4744]: I0311 01:19:54.905379 4744 generic.go:334] "Generic (PLEG): container finished" podID="33c0b5dd-192a-4fd2-bbaa-b483399724df" containerID="2e90fe156899af91f60cc58374a4704b928fb93dc5a2b8a016f190e0a0897fe6" exitCode=0 Mar 11 01:19:55 crc kubenswrapper[4744]: I0311 01:19:54.905401 4744 generic.go:334] "Generic (PLEG): container finished" podID="33c0b5dd-192a-4fd2-bbaa-b483399724df" containerID="6940392b1cc755e200810e7dda5a0cdf602c2bc1bee93993b5d7b849c244decb" exitCode=0 Mar 11 01:19:55 crc kubenswrapper[4744]: I0311 01:19:54.905429 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"33c0b5dd-192a-4fd2-bbaa-b483399724df","Type":"ContainerDied","Data":"2e90fe156899af91f60cc58374a4704b928fb93dc5a2b8a016f190e0a0897fe6"} Mar 11 01:19:55 crc kubenswrapper[4744]: I0311 01:19:54.905445 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"33c0b5dd-192a-4fd2-bbaa-b483399724df","Type":"ContainerDied","Data":"6940392b1cc755e200810e7dda5a0cdf602c2bc1bee93993b5d7b849c244decb"} Mar 11 01:19:55 crc kubenswrapper[4744]: I0311 01:19:54.907100 4744 generic.go:334] "Generic (PLEG): container finished" podID="44461324-fa82-4476-a621-c560a3c89e0f" containerID="347f55903065cc04c7bddba453ffd2605f2f89f0b3744727e7f308c9383fd8e9" exitCode=0 Mar 11 01:19:55 crc kubenswrapper[4744]: I0311 01:19:54.907127 4744 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-7c68976bb4-299gh" event={"ID":"44461324-fa82-4476-a621-c560a3c89e0f","Type":"ContainerDied","Data":"347f55903065cc04c7bddba453ffd2605f2f89f0b3744727e7f308c9383fd8e9"} Mar 11 01:19:55 crc kubenswrapper[4744]: I0311 01:19:54.908782 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-7f78d57d44-gt8df" event={"ID":"a6b56953-c881-474c-a21f-4a39102d89ab","Type":"ContainerDied","Data":"10ffbd2dc9aa299a1bcdfba5f1bc20555f968f1ff6c4e1e8f6e696a4f9bc8bfe"} Mar 11 01:19:55 crc kubenswrapper[4744]: I0311 01:19:54.908862 4744 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-7f78d57d44-gt8df" Mar 11 01:19:55 crc kubenswrapper[4744]: I0311 01:19:54.911991 4744 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") on node \"crc\" " Mar 11 01:19:55 crc kubenswrapper[4744]: I0311 01:19:54.912013 4744 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f11c8953-d88f-4d37-8366-b0b61606fa8a-scripts\") on node \"crc\" DevicePath \"\"" Mar 11 01:19:55 crc kubenswrapper[4744]: I0311 01:19:54.912023 4744 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a6b56953-c881-474c-a21f-4a39102d89ab-scripts\") on node \"crc\" DevicePath \"\"" Mar 11 01:19:55 crc kubenswrapper[4744]: I0311 01:19:54.912032 4744 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/f11c8953-d88f-4d37-8366-b0b61606fa8a-config-data-custom\") on node \"crc\" DevicePath \"\"" Mar 11 01:19:55 crc kubenswrapper[4744]: I0311 01:19:54.912042 4744 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-97rcj\" (UniqueName: 
\"kubernetes.io/projected/a6b56953-c881-474c-a21f-4a39102d89ab-kube-api-access-97rcj\") on node \"crc\" DevicePath \"\"" Mar 11 01:19:55 crc kubenswrapper[4744]: I0311 01:19:54.912051 4744 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f11c8953-d88f-4d37-8366-b0b61606fa8a-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 11 01:19:55 crc kubenswrapper[4744]: I0311 01:19:54.912064 4744 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") on node \"crc\" " Mar 11 01:19:55 crc kubenswrapper[4744]: I0311 01:19:54.916601 4744 generic.go:334] "Generic (PLEG): container finished" podID="8df98dbd-473b-4630-81ab-edd6419feb0d" containerID="0a9c241eccc912eedc93ca498ce49b3438428bba9d63396a61de516e8822d3fa" exitCode=0 Mar 11 01:19:55 crc kubenswrapper[4744]: I0311 01:19:54.916638 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-7647d7b844-j6gcn" event={"ID":"8df98dbd-473b-4630-81ab-edd6419feb0d","Type":"ContainerDied","Data":"0a9c241eccc912eedc93ca498ce49b3438428bba9d63396a61de516e8822d3fa"} Mar 11 01:19:55 crc kubenswrapper[4744]: I0311 01:19:54.916657 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-7647d7b844-j6gcn" event={"ID":"8df98dbd-473b-4630-81ab-edd6419feb0d","Type":"ContainerDied","Data":"0618174e67d0f1492cd70819d82f71b1f61c9a68db51461c8c56789026a191ea"} Mar 11 01:19:55 crc kubenswrapper[4744]: I0311 01:19:54.916667 4744 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0618174e67d0f1492cd70819d82f71b1f61c9a68db51461c8c56789026a191ea" Mar 11 01:19:55 crc kubenswrapper[4744]: I0311 01:19:54.919165 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" 
event={"ID":"23fcfdba-12bc-4a94-94cd-fb703f2e632c","Type":"ContainerDied","Data":"45928c1c60e24f1cec1e772f91c0360d621398cc7ddaec6f9413dcb87f97c21b"} Mar 11 01:19:55 crc kubenswrapper[4744]: I0311 01:19:54.919265 4744 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Mar 11 01:19:55 crc kubenswrapper[4744]: I0311 01:19:54.924789 4744 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-keystone-listener-7647d7b844-j6gcn" Mar 11 01:19:55 crc kubenswrapper[4744]: I0311 01:19:54.926094 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"ae4f1d0a-32b7-4ec5-9f5b-a43589af4119","Type":"ContainerDied","Data":"84b62c373e1da76600505f19a32878af7c4c1da368de43305fffec91ec07eea2"} Mar 11 01:19:55 crc kubenswrapper[4744]: I0311 01:19:54.926120 4744 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Mar 11 01:19:55 crc kubenswrapper[4744]: I0311 01:19:54.930884 4744 generic.go:334] "Generic (PLEG): container finished" podID="c8caed76-baba-4ad3-b95a-e428132f2021" containerID="a7377528de2e5a9a7450056f3a281f5a1113ad08b0bc9f7c476cd009942572ee" exitCode=0 Mar 11 01:19:55 crc kubenswrapper[4744]: I0311 01:19:54.930929 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"c8caed76-baba-4ad3-b95a-e428132f2021","Type":"ContainerDied","Data":"a7377528de2e5a9a7450056f3a281f5a1113ad08b0bc9f7c476cd009942572ee"} Mar 11 01:19:55 crc kubenswrapper[4744]: I0311 01:19:54.932107 4744 generic.go:334] "Generic (PLEG): container finished" podID="4fb8af9e-ef1e-45b0-b842-2647fe75510e" containerID="9a1f61db53cc92beec4b876c5f655cbd7b0389b0b62b4ff5bc3cc4d0c15ec01a" exitCode=0 Mar 11 01:19:55 crc kubenswrapper[4744]: I0311 01:19:54.932167 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/barbican-worker-bc7567ff7-gl658" event={"ID":"4fb8af9e-ef1e-45b0-b842-2647fe75510e","Type":"ContainerDied","Data":"9a1f61db53cc92beec4b876c5f655cbd7b0389b0b62b4ff5bc3cc4d0c15ec01a"} Mar 11 01:19:55 crc kubenswrapper[4744]: I0311 01:19:54.933389 4744 generic.go:334] "Generic (PLEG): container finished" podID="730c901d-c3c5-46c5-b618-00cdcc17bef2" containerID="041f61378d1b6288e78cda19110d0e5a7b5f6590d353d62737e351291c23ef4b" exitCode=0 Mar 11 01:19:55 crc kubenswrapper[4744]: I0311 01:19:54.933449 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" event={"ID":"730c901d-c3c5-46c5-b618-00cdcc17bef2","Type":"ContainerDied","Data":"041f61378d1b6288e78cda19110d0e5a7b5f6590d353d62737e351291c23ef4b"} Mar 11 01:19:55 crc kubenswrapper[4744]: I0311 01:19:54.939816 4744 generic.go:334] "Generic (PLEG): container finished" podID="f11c8953-d88f-4d37-8366-b0b61606fa8a" containerID="c2b142e64db7ed4cb71b1ebbbbe9200442217ded53b3157e1c635748af055ca9" exitCode=0 Mar 11 01:19:55 crc kubenswrapper[4744]: I0311 01:19:54.940159 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-ab0f-account-create-update-n4gnt" Mar 11 01:19:55 crc kubenswrapper[4744]: I0311 01:19:54.940225 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"f11c8953-d88f-4d37-8366-b0b61606fa8a","Type":"ContainerDied","Data":"c2b142e64db7ed4cb71b1ebbbbe9200442217ded53b3157e1c635748af055ca9"} Mar 11 01:19:55 crc kubenswrapper[4744]: I0311 01:19:54.940253 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"f11c8953-d88f-4d37-8366-b0b61606fa8a","Type":"ContainerDied","Data":"9558ffd0f1554b64add098d174b47d8e3e27443e41618f4f6bfcf13fbb94142b"} Mar 11 01:19:55 crc kubenswrapper[4744]: I0311 01:19:54.941795 4744 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Mar 11 01:19:55 crc kubenswrapper[4744]: I0311 01:19:54.948850 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b9edcd5c-3634-45f9-914a-0d8e4f425302-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "b9edcd5c-3634-45f9-914a-0d8e4f425302" (UID: "b9edcd5c-3634-45f9-914a-0d8e4f425302"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 01:19:55 crc kubenswrapper[4744]: I0311 01:19:54.949929 4744 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-7c68976bb4-299gh" Mar 11 01:19:55 crc kubenswrapper[4744]: I0311 01:19:54.954776 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b9edcd5c-3634-45f9-914a-0d8e4f425302-config-data" (OuterVolumeSpecName: "config-data") pod "b9edcd5c-3634-45f9-914a-0d8e4f425302" (UID: "b9edcd5c-3634-45f9-914a-0d8e4f425302"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 01:19:55 crc kubenswrapper[4744]: I0311 01:19:54.978655 4744 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage04-crc" (UniqueName: "kubernetes.io/local-volume/local-storage04-crc") on node "crc" Mar 11 01:19:55 crc kubenswrapper[4744]: I0311 01:19:54.992876 4744 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Mar 11 01:19:55 crc kubenswrapper[4744]: I0311 01:19:55.000843 4744 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Mar 11 01:19:55 crc kubenswrapper[4744]: I0311 01:19:55.012563 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8df98dbd-473b-4630-81ab-edd6419feb0d-logs\") pod \"8df98dbd-473b-4630-81ab-edd6419feb0d\" (UID: \"8df98dbd-473b-4630-81ab-edd6419feb0d\") " Mar 11 01:19:55 crc kubenswrapper[4744]: I0311 01:19:55.012667 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zp6cx\" (UniqueName: \"kubernetes.io/projected/8df98dbd-473b-4630-81ab-edd6419feb0d-kube-api-access-zp6cx\") pod \"8df98dbd-473b-4630-81ab-edd6419feb0d\" (UID: \"8df98dbd-473b-4630-81ab-edd6419feb0d\") " Mar 11 01:19:55 crc kubenswrapper[4744]: I0311 01:19:55.012701 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/44461324-fa82-4476-a621-c560a3c89e0f-config-data\") pod \"44461324-fa82-4476-a621-c560a3c89e0f\" (UID: \"44461324-fa82-4476-a621-c560a3c89e0f\") " Mar 11 01:19:55 crc kubenswrapper[4744]: I0311 01:19:55.012725 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/44461324-fa82-4476-a621-c560a3c89e0f-internal-tls-certs\") pod \"44461324-fa82-4476-a621-c560a3c89e0f\" (UID: \"44461324-fa82-4476-a621-c560a3c89e0f\") " Mar 11 01:19:55 
crc kubenswrapper[4744]: I0311 01:19:55.012800 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8df98dbd-473b-4630-81ab-edd6419feb0d-combined-ca-bundle\") pod \"8df98dbd-473b-4630-81ab-edd6419feb0d\" (UID: \"8df98dbd-473b-4630-81ab-edd6419feb0d\") " Mar 11 01:19:55 crc kubenswrapper[4744]: I0311 01:19:55.012843 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nqb9d\" (UniqueName: \"kubernetes.io/projected/44461324-fa82-4476-a621-c560a3c89e0f-kube-api-access-nqb9d\") pod \"44461324-fa82-4476-a621-c560a3c89e0f\" (UID: \"44461324-fa82-4476-a621-c560a3c89e0f\") " Mar 11 01:19:55 crc kubenswrapper[4744]: I0311 01:19:55.012863 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/44461324-fa82-4476-a621-c560a3c89e0f-combined-ca-bundle\") pod \"44461324-fa82-4476-a621-c560a3c89e0f\" (UID: \"44461324-fa82-4476-a621-c560a3c89e0f\") " Mar 11 01:19:55 crc kubenswrapper[4744]: I0311 01:19:55.012883 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/8df98dbd-473b-4630-81ab-edd6419feb0d-config-data-custom\") pod \"8df98dbd-473b-4630-81ab-edd6419feb0d\" (UID: \"8df98dbd-473b-4630-81ab-edd6419feb0d\") " Mar 11 01:19:55 crc kubenswrapper[4744]: I0311 01:19:55.012915 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/44461324-fa82-4476-a621-c560a3c89e0f-public-tls-certs\") pod \"44461324-fa82-4476-a621-c560a3c89e0f\" (UID: \"44461324-fa82-4476-a621-c560a3c89e0f\") " Mar 11 01:19:55 crc kubenswrapper[4744]: I0311 01:19:55.012968 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/8df98dbd-473b-4630-81ab-edd6419feb0d-config-data\") pod \"8df98dbd-473b-4630-81ab-edd6419feb0d\" (UID: \"8df98dbd-473b-4630-81ab-edd6419feb0d\") " Mar 11 01:19:55 crc kubenswrapper[4744]: I0311 01:19:55.012995 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/44461324-fa82-4476-a621-c560a3c89e0f-logs\") pod \"44461324-fa82-4476-a621-c560a3c89e0f\" (UID: \"44461324-fa82-4476-a621-c560a3c89e0f\") " Mar 11 01:19:55 crc kubenswrapper[4744]: I0311 01:19:55.013017 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/44461324-fa82-4476-a621-c560a3c89e0f-config-data-custom\") pod \"44461324-fa82-4476-a621-c560a3c89e0f\" (UID: \"44461324-fa82-4476-a621-c560a3c89e0f\") " Mar 11 01:19:55 crc kubenswrapper[4744]: I0311 01:19:55.013031 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8df98dbd-473b-4630-81ab-edd6419feb0d-logs" (OuterVolumeSpecName: "logs") pod "8df98dbd-473b-4630-81ab-edd6419feb0d" (UID: "8df98dbd-473b-4630-81ab-edd6419feb0d"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 11 01:19:55 crc kubenswrapper[4744]: I0311 01:19:55.013265 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sc9w5\" (UniqueName: \"kubernetes.io/projected/4a418ad0-7ff0-474b-944a-71418601beb9-kube-api-access-sc9w5\") pod \"keystone-ab0f-account-create-update-n4gnt\" (UID: \"4a418ad0-7ff0-474b-944a-71418601beb9\") " pod="openstack/keystone-ab0f-account-create-update-n4gnt" Mar 11 01:19:55 crc kubenswrapper[4744]: I0311 01:19:55.013351 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4a418ad0-7ff0-474b-944a-71418601beb9-operator-scripts\") pod \"keystone-ab0f-account-create-update-n4gnt\" (UID: \"4a418ad0-7ff0-474b-944a-71418601beb9\") " pod="openstack/keystone-ab0f-account-create-update-n4gnt" Mar 11 01:19:55 crc kubenswrapper[4744]: I0311 01:19:55.013400 4744 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b9edcd5c-3634-45f9-914a-0d8e4f425302-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 11 01:19:55 crc kubenswrapper[4744]: I0311 01:19:55.013410 4744 reconciler_common.go:293] "Volume detached for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") on node \"crc\" DevicePath \"\"" Mar 11 01:19:55 crc kubenswrapper[4744]: I0311 01:19:55.013419 4744 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b9edcd5c-3634-45f9-914a-0d8e4f425302-config-data\") on node \"crc\" DevicePath \"\"" Mar 11 01:19:55 crc kubenswrapper[4744]: I0311 01:19:55.013433 4744 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8df98dbd-473b-4630-81ab-edd6419feb0d-logs\") on node \"crc\" DevicePath \"\"" Mar 11 01:19:55 crc kubenswrapper[4744]: E0311 01:19:55.013488 4744 
configmap.go:193] Couldn't get configMap openstack/openstack-scripts: configmap "openstack-scripts" not found Mar 11 01:19:55 crc kubenswrapper[4744]: E0311 01:19:55.013544 4744 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/4a418ad0-7ff0-474b-944a-71418601beb9-operator-scripts podName:4a418ad0-7ff0-474b-944a-71418601beb9 nodeName:}" failed. No retries permitted until 2026-03-11 01:19:57.013529995 +0000 UTC m=+1553.817747600 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "operator-scripts" (UniqueName: "kubernetes.io/configmap/4a418ad0-7ff0-474b-944a-71418601beb9-operator-scripts") pod "keystone-ab0f-account-create-update-n4gnt" (UID: "4a418ad0-7ff0-474b-944a-71418601beb9") : configmap "openstack-scripts" not found Mar 11 01:19:55 crc kubenswrapper[4744]: I0311 01:19:55.018413 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/44461324-fa82-4476-a621-c560a3c89e0f-logs" (OuterVolumeSpecName: "logs") pod "44461324-fa82-4476-a621-c560a3c89e0f" (UID: "44461324-fa82-4476-a621-c560a3c89e0f"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 11 01:19:55 crc kubenswrapper[4744]: E0311 01:19:55.019225 4744 projected.go:194] Error preparing data for projected volume kube-api-access-sc9w5 for pod openstack/keystone-ab0f-account-create-update-n4gnt: failed to fetch token: pod "keystone-ab0f-account-create-update-n4gnt" not found Mar 11 01:19:55 crc kubenswrapper[4744]: E0311 01:19:55.019268 4744 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/4a418ad0-7ff0-474b-944a-71418601beb9-kube-api-access-sc9w5 podName:4a418ad0-7ff0-474b-944a-71418601beb9 nodeName:}" failed. No retries permitted until 2026-03-11 01:19:57.019252882 +0000 UTC m=+1553.823470487 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-sc9w5" (UniqueName: "kubernetes.io/projected/4a418ad0-7ff0-474b-944a-71418601beb9-kube-api-access-sc9w5") pod "keystone-ab0f-account-create-update-n4gnt" (UID: "4a418ad0-7ff0-474b-944a-71418601beb9") : failed to fetch token: pod "keystone-ab0f-account-create-update-n4gnt" not found Mar 11 01:19:55 crc kubenswrapper[4744]: I0311 01:19:55.021226 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ae4f1d0a-32b7-4ec5-9f5b-a43589af4119-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "ae4f1d0a-32b7-4ec5-9f5b-a43589af4119" (UID: "ae4f1d0a-32b7-4ec5-9f5b-a43589af4119"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 01:19:55 crc kubenswrapper[4744]: I0311 01:19:55.026916 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8df98dbd-473b-4630-81ab-edd6419feb0d-kube-api-access-zp6cx" (OuterVolumeSpecName: "kube-api-access-zp6cx") pod "8df98dbd-473b-4630-81ab-edd6419feb0d" (UID: "8df98dbd-473b-4630-81ab-edd6419feb0d"). InnerVolumeSpecName "kube-api-access-zp6cx". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 01:19:55 crc kubenswrapper[4744]: I0311 01:19:55.029629 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/44461324-fa82-4476-a621-c560a3c89e0f-kube-api-access-nqb9d" (OuterVolumeSpecName: "kube-api-access-nqb9d") pod "44461324-fa82-4476-a621-c560a3c89e0f" (UID: "44461324-fa82-4476-a621-c560a3c89e0f"). InnerVolumeSpecName "kube-api-access-nqb9d". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 01:19:55 crc kubenswrapper[4744]: I0311 01:19:55.034641 4744 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-ab0f-account-create-update-n4gnt"] Mar 11 01:19:55 crc kubenswrapper[4744]: I0311 01:19:55.035814 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/44461324-fa82-4476-a621-c560a3c89e0f-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "44461324-fa82-4476-a621-c560a3c89e0f" (UID: "44461324-fa82-4476-a621-c560a3c89e0f"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 01:19:55 crc kubenswrapper[4744]: I0311 01:19:55.035888 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8df98dbd-473b-4630-81ab-edd6419feb0d-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "8df98dbd-473b-4630-81ab-edd6419feb0d" (UID: "8df98dbd-473b-4630-81ab-edd6419feb0d"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 01:19:55 crc kubenswrapper[4744]: I0311 01:19:55.053905 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ae4f1d0a-32b7-4ec5-9f5b-a43589af4119-config-data" (OuterVolumeSpecName: "config-data") pod "ae4f1d0a-32b7-4ec5-9f5b-a43589af4119" (UID: "ae4f1d0a-32b7-4ec5-9f5b-a43589af4119"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 01:19:55 crc kubenswrapper[4744]: I0311 01:19:55.054141 4744 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage01-crc" (UniqueName: "kubernetes.io/local-volume/local-storage01-crc") on node "crc" Mar 11 01:19:55 crc kubenswrapper[4744]: I0311 01:19:55.056191 4744 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-ab0f-account-create-update-n4gnt"] Mar 11 01:19:55 crc kubenswrapper[4744]: I0311 01:19:55.068831 4744 scope.go:117] "RemoveContainer" containerID="160c31ddfbf19b31394b583802d9b0b99a645d4e25dc64bc9887035b9c0eac27" Mar 11 01:19:55 crc kubenswrapper[4744]: I0311 01:19:55.068851 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8df98dbd-473b-4630-81ab-edd6419feb0d-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "8df98dbd-473b-4630-81ab-edd6419feb0d" (UID: "8df98dbd-473b-4630-81ab-edd6419feb0d"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 01:19:55 crc kubenswrapper[4744]: I0311 01:19:55.079943 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4767cbee-21c4-4deb-871a-9c6169f5741d-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "4767cbee-21c4-4deb-871a-9c6169f5741d" (UID: "4767cbee-21c4-4deb-871a-9c6169f5741d"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 01:19:55 crc kubenswrapper[4744]: I0311 01:19:55.090073 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a6b56953-c881-474c-a21f-4a39102d89ab-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "a6b56953-c881-474c-a21f-4a39102d89ab" (UID: "a6b56953-c881-474c-a21f-4a39102d89ab"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 01:19:55 crc kubenswrapper[4744]: I0311 01:19:55.114279 4744 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4767cbee-21c4-4deb-871a-9c6169f5741d-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 11 01:19:55 crc kubenswrapper[4744]: I0311 01:19:55.114303 4744 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sc9w5\" (UniqueName: \"kubernetes.io/projected/4a418ad0-7ff0-474b-944a-71418601beb9-kube-api-access-sc9w5\") on node \"crc\" DevicePath \"\"" Mar 11 01:19:55 crc kubenswrapper[4744]: I0311 01:19:55.114316 4744 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/44461324-fa82-4476-a621-c560a3c89e0f-logs\") on node \"crc\" DevicePath \"\"" Mar 11 01:19:55 crc kubenswrapper[4744]: I0311 01:19:55.114326 4744 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/44461324-fa82-4476-a621-c560a3c89e0f-config-data-custom\") on node \"crc\" DevicePath \"\"" Mar 11 01:19:55 crc kubenswrapper[4744]: I0311 01:19:55.114337 4744 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ae4f1d0a-32b7-4ec5-9f5b-a43589af4119-config-data\") on node \"crc\" DevicePath \"\"" Mar 11 01:19:55 crc kubenswrapper[4744]: I0311 01:19:55.114348 4744 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zp6cx\" (UniqueName: \"kubernetes.io/projected/8df98dbd-473b-4630-81ab-edd6419feb0d-kube-api-access-zp6cx\") on node \"crc\" DevicePath \"\"" Mar 11 01:19:55 crc kubenswrapper[4744]: I0311 01:19:55.114358 4744 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4a418ad0-7ff0-474b-944a-71418601beb9-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 11 01:19:55 crc kubenswrapper[4744]: I0311 
01:19:55.114366 4744 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a6b56953-c881-474c-a21f-4a39102d89ab-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 11 01:19:55 crc kubenswrapper[4744]: I0311 01:19:55.114375 4744 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8df98dbd-473b-4630-81ab-edd6419feb0d-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 11 01:19:55 crc kubenswrapper[4744]: I0311 01:19:55.114383 4744 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ae4f1d0a-32b7-4ec5-9f5b-a43589af4119-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 11 01:19:55 crc kubenswrapper[4744]: I0311 01:19:55.114392 4744 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nqb9d\" (UniqueName: \"kubernetes.io/projected/44461324-fa82-4476-a621-c560a3c89e0f-kube-api-access-nqb9d\") on node \"crc\" DevicePath \"\"" Mar 11 01:19:55 crc kubenswrapper[4744]: I0311 01:19:55.114400 4744 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/8df98dbd-473b-4630-81ab-edd6419feb0d-config-data-custom\") on node \"crc\" DevicePath \"\"" Mar 11 01:19:55 crc kubenswrapper[4744]: I0311 01:19:55.114408 4744 reconciler_common.go:293] "Volume detached for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") on node \"crc\" DevicePath \"\"" Mar 11 01:19:55 crc kubenswrapper[4744]: I0311 01:19:55.126638 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/44461324-fa82-4476-a621-c560a3c89e0f-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "44461324-fa82-4476-a621-c560a3c89e0f" (UID: "44461324-fa82-4476-a621-c560a3c89e0f"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 01:19:55 crc kubenswrapper[4744]: I0311 01:19:55.148314 4744 scope.go:117] "RemoveContainer" containerID="55772af813a2f508c881ce1d4bcdd6c3b028b1ce12f20693da7faae23420077f" Mar 11 01:19:55 crc kubenswrapper[4744]: I0311 01:19:55.148460 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f11c8953-d88f-4d37-8366-b0b61606fa8a-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "f11c8953-d88f-4d37-8366-b0b61606fa8a" (UID: "f11c8953-d88f-4d37-8366-b0b61606fa8a"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 01:19:55 crc kubenswrapper[4744]: I0311 01:19:55.148463 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b9edcd5c-3634-45f9-914a-0d8e4f425302-nova-metadata-tls-certs" (OuterVolumeSpecName: "nova-metadata-tls-certs") pod "b9edcd5c-3634-45f9-914a-0d8e4f425302" (UID: "b9edcd5c-3634-45f9-914a-0d8e4f425302"). InnerVolumeSpecName "nova-metadata-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 01:19:55 crc kubenswrapper[4744]: I0311 01:19:55.155225 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a6b56953-c881-474c-a21f-4a39102d89ab-config-data" (OuterVolumeSpecName: "config-data") pod "a6b56953-c881-474c-a21f-4a39102d89ab" (UID: "a6b56953-c881-474c-a21f-4a39102d89ab"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 01:19:55 crc kubenswrapper[4744]: I0311 01:19:55.158467 4744 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-worker-bc7567ff7-gl658" Mar 11 01:19:55 crc kubenswrapper[4744]: I0311 01:19:55.172988 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ae4f1d0a-32b7-4ec5-9f5b-a43589af4119-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "ae4f1d0a-32b7-4ec5-9f5b-a43589af4119" (UID: "ae4f1d0a-32b7-4ec5-9f5b-a43589af4119"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 01:19:55 crc kubenswrapper[4744]: I0311 01:19:55.187018 4744 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-qnrv8"] Mar 11 01:19:55 crc kubenswrapper[4744]: I0311 01:19:55.194069 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f11c8953-d88f-4d37-8366-b0b61606fa8a-config-data" (OuterVolumeSpecName: "config-data") pod "f11c8953-d88f-4d37-8366-b0b61606fa8a" (UID: "f11c8953-d88f-4d37-8366-b0b61606fa8a"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 01:19:55 crc kubenswrapper[4744]: I0311 01:19:55.206485 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4767cbee-21c4-4deb-871a-9c6169f5741d-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "4767cbee-21c4-4deb-871a-9c6169f5741d" (UID: "4767cbee-21c4-4deb-871a-9c6169f5741d"). InnerVolumeSpecName "internal-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 01:19:55 crc kubenswrapper[4744]: I0311 01:19:55.206631 4744 scope.go:117] "RemoveContainer" containerID="0139bf0559237b374402a2f0ca5c12edd7eeaa5764a0ccb62bf6e9201da581b6" Mar 11 01:19:55 crc kubenswrapper[4744]: I0311 01:19:55.215600 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4fb8af9e-ef1e-45b0-b842-2647fe75510e-combined-ca-bundle\") pod \"4fb8af9e-ef1e-45b0-b842-2647fe75510e\" (UID: \"4fb8af9e-ef1e-45b0-b842-2647fe75510e\") " Mar 11 01:19:55 crc kubenswrapper[4744]: I0311 01:19:55.215642 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/4fb8af9e-ef1e-45b0-b842-2647fe75510e-config-data-custom\") pod \"4fb8af9e-ef1e-45b0-b842-2647fe75510e\" (UID: \"4fb8af9e-ef1e-45b0-b842-2647fe75510e\") " Mar 11 01:19:55 crc kubenswrapper[4744]: I0311 01:19:55.215680 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8vvch\" (UniqueName: \"kubernetes.io/projected/4fb8af9e-ef1e-45b0-b842-2647fe75510e-kube-api-access-8vvch\") pod \"4fb8af9e-ef1e-45b0-b842-2647fe75510e\" (UID: \"4fb8af9e-ef1e-45b0-b842-2647fe75510e\") " Mar 11 01:19:55 crc kubenswrapper[4744]: I0311 01:19:55.215721 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4fb8af9e-ef1e-45b0-b842-2647fe75510e-config-data\") pod \"4fb8af9e-ef1e-45b0-b842-2647fe75510e\" (UID: \"4fb8af9e-ef1e-45b0-b842-2647fe75510e\") " Mar 11 01:19:55 crc kubenswrapper[4744]: I0311 01:19:55.215779 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4fb8af9e-ef1e-45b0-b842-2647fe75510e-logs\") pod \"4fb8af9e-ef1e-45b0-b842-2647fe75510e\" (UID: \"4fb8af9e-ef1e-45b0-b842-2647fe75510e\") " 
Mar 11 01:19:55 crc kubenswrapper[4744]: I0311 01:19:55.216110 4744 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/44461324-fa82-4476-a621-c560a3c89e0f-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 11 01:19:55 crc kubenswrapper[4744]: I0311 01:19:55.216127 4744 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/b9edcd5c-3634-45f9-914a-0d8e4f425302-nova-metadata-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 11 01:19:55 crc kubenswrapper[4744]: I0311 01:19:55.216137 4744 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f11c8953-d88f-4d37-8366-b0b61606fa8a-config-data\") on node \"crc\" DevicePath \"\"" Mar 11 01:19:55 crc kubenswrapper[4744]: I0311 01:19:55.216148 4744 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/f11c8953-d88f-4d37-8366-b0b61606fa8a-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 11 01:19:55 crc kubenswrapper[4744]: I0311 01:19:55.216157 4744 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/4767cbee-21c4-4deb-871a-9c6169f5741d-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 11 01:19:55 crc kubenswrapper[4744]: I0311 01:19:55.216421 4744 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/ae4f1d0a-32b7-4ec5-9f5b-a43589af4119-public-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 11 01:19:55 crc kubenswrapper[4744]: I0311 01:19:55.216457 4744 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a6b56953-c881-474c-a21f-4a39102d89ab-config-data\") on node \"crc\" DevicePath \"\"" Mar 11 01:19:55 crc kubenswrapper[4744]: I0311 01:19:55.217180 4744 operation_generator.go:803] 
UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4fb8af9e-ef1e-45b0-b842-2647fe75510e-logs" (OuterVolumeSpecName: "logs") pod "4fb8af9e-ef1e-45b0-b842-2647fe75510e" (UID: "4fb8af9e-ef1e-45b0-b842-2647fe75510e"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 11 01:19:55 crc kubenswrapper[4744]: I0311 01:19:55.220409 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/44461324-fa82-4476-a621-c560a3c89e0f-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "44461324-fa82-4476-a621-c560a3c89e0f" (UID: "44461324-fa82-4476-a621-c560a3c89e0f"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 01:19:55 crc kubenswrapper[4744]: I0311 01:19:55.224763 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4fb8af9e-ef1e-45b0-b842-2647fe75510e-kube-api-access-8vvch" (OuterVolumeSpecName: "kube-api-access-8vvch") pod "4fb8af9e-ef1e-45b0-b842-2647fe75510e" (UID: "4fb8af9e-ef1e-45b0-b842-2647fe75510e"). InnerVolumeSpecName "kube-api-access-8vvch". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 01:19:55 crc kubenswrapper[4744]: I0311 01:19:55.235639 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4fb8af9e-ef1e-45b0-b842-2647fe75510e-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "4fb8af9e-ef1e-45b0-b842-2647fe75510e" (UID: "4fb8af9e-ef1e-45b0-b842-2647fe75510e"). InnerVolumeSpecName "config-data-custom". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 01:19:55 crc kubenswrapper[4744]: I0311 01:19:55.271225 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/44461324-fa82-4476-a621-c560a3c89e0f-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "44461324-fa82-4476-a621-c560a3c89e0f" (UID: "44461324-fa82-4476-a621-c560a3c89e0f"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 01:19:55 crc kubenswrapper[4744]: I0311 01:19:55.274944 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4fb8af9e-ef1e-45b0-b842-2647fe75510e-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "4fb8af9e-ef1e-45b0-b842-2647fe75510e" (UID: "4fb8af9e-ef1e-45b0-b842-2647fe75510e"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 01:19:55 crc kubenswrapper[4744]: I0311 01:19:55.274963 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f11c8953-d88f-4d37-8366-b0b61606fa8a-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "f11c8953-d88f-4d37-8366-b0b61606fa8a" (UID: "f11c8953-d88f-4d37-8366-b0b61606fa8a"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 01:19:55 crc kubenswrapper[4744]: I0311 01:19:55.291837 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4767cbee-21c4-4deb-871a-9c6169f5741d-config-data" (OuterVolumeSpecName: "config-data") pod "4767cbee-21c4-4deb-871a-9c6169f5741d" (UID: "4767cbee-21c4-4deb-871a-9c6169f5741d"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 01:19:55 crc kubenswrapper[4744]: I0311 01:19:55.309986 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/44461324-fa82-4476-a621-c560a3c89e0f-config-data" (OuterVolumeSpecName: "config-data") pod "44461324-fa82-4476-a621-c560a3c89e0f" (UID: "44461324-fa82-4476-a621-c560a3c89e0f"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 01:19:55 crc kubenswrapper[4744]: I0311 01:19:55.316460 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4fb8af9e-ef1e-45b0-b842-2647fe75510e-config-data" (OuterVolumeSpecName: "config-data") pod "4fb8af9e-ef1e-45b0-b842-2647fe75510e" (UID: "4fb8af9e-ef1e-45b0-b842-2647fe75510e"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 01:19:55 crc kubenswrapper[4744]: I0311 01:19:55.318809 4744 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/4fb8af9e-ef1e-45b0-b842-2647fe75510e-config-data-custom\") on node \"crc\" DevicePath \"\"" Mar 11 01:19:55 crc kubenswrapper[4744]: I0311 01:19:55.318832 4744 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/44461324-fa82-4476-a621-c560a3c89e0f-config-data\") on node \"crc\" DevicePath \"\"" Mar 11 01:19:55 crc kubenswrapper[4744]: I0311 01:19:55.318840 4744 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/44461324-fa82-4476-a621-c560a3c89e0f-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 11 01:19:55 crc kubenswrapper[4744]: I0311 01:19:55.318849 4744 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8vvch\" (UniqueName: \"kubernetes.io/projected/4fb8af9e-ef1e-45b0-b842-2647fe75510e-kube-api-access-8vvch\") on node \"crc\" DevicePath \"\"" Mar 
11 01:19:55 crc kubenswrapper[4744]: I0311 01:19:55.318859 4744 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4fb8af9e-ef1e-45b0-b842-2647fe75510e-config-data\") on node \"crc\" DevicePath \"\"" Mar 11 01:19:55 crc kubenswrapper[4744]: I0311 01:19:55.318867 4744 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/44461324-fa82-4476-a621-c560a3c89e0f-public-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 11 01:19:55 crc kubenswrapper[4744]: I0311 01:19:55.318875 4744 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4fb8af9e-ef1e-45b0-b842-2647fe75510e-logs\") on node \"crc\" DevicePath \"\"" Mar 11 01:19:55 crc kubenswrapper[4744]: I0311 01:19:55.318884 4744 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/f11c8953-d88f-4d37-8366-b0b61606fa8a-public-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 11 01:19:55 crc kubenswrapper[4744]: I0311 01:19:55.318893 4744 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4767cbee-21c4-4deb-871a-9c6169f5741d-config-data\") on node \"crc\" DevicePath \"\"" Mar 11 01:19:55 crc kubenswrapper[4744]: I0311 01:19:55.318900 4744 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4fb8af9e-ef1e-45b0-b842-2647fe75510e-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 11 01:19:55 crc kubenswrapper[4744]: I0311 01:19:55.329115 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8df98dbd-473b-4630-81ab-edd6419feb0d-config-data" (OuterVolumeSpecName: "config-data") pod "8df98dbd-473b-4630-81ab-edd6419feb0d" (UID: "8df98dbd-473b-4630-81ab-edd6419feb0d"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 01:19:55 crc kubenswrapper[4744]: I0311 01:19:55.354621 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a6b56953-c881-474c-a21f-4a39102d89ab-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "a6b56953-c881-474c-a21f-4a39102d89ab" (UID: "a6b56953-c881-474c-a21f-4a39102d89ab"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 01:19:55 crc kubenswrapper[4744]: I0311 01:19:55.365064 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a6b56953-c881-474c-a21f-4a39102d89ab-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "a6b56953-c881-474c-a21f-4a39102d89ab" (UID: "a6b56953-c881-474c-a21f-4a39102d89ab"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 01:19:55 crc kubenswrapper[4744]: I0311 01:19:55.421711 4744 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/a6b56953-c881-474c-a21f-4a39102d89ab-public-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 11 01:19:55 crc kubenswrapper[4744]: I0311 01:19:55.421743 4744 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/a6b56953-c881-474c-a21f-4a39102d89ab-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 11 01:19:55 crc kubenswrapper[4744]: I0311 01:19:55.421754 4744 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8df98dbd-473b-4630-81ab-edd6419feb0d-config-data\") on node \"crc\" DevicePath \"\"" Mar 11 01:19:55 crc kubenswrapper[4744]: I0311 01:19:55.451279 4744 scope.go:117] "RemoveContainer" containerID="d932b2416d71a86f80ef3581b5216ce6ba10ab543bf647b40daeebe0c83edbaa" Mar 11 01:19:55 crc kubenswrapper[4744]: I0311 01:19:55.491057 4744 
kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Mar 11 01:19:55 crc kubenswrapper[4744]: I0311 01:19:55.493480 4744 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Mar 11 01:19:55 crc kubenswrapper[4744]: I0311 01:19:55.497106 4744 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/memcached-0" Mar 11 01:19:55 crc kubenswrapper[4744]: I0311 01:19:55.506105 4744 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 11 01:19:55 crc kubenswrapper[4744]: I0311 01:19:55.510026 4744 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 11 01:19:55 crc kubenswrapper[4744]: I0311 01:19:55.522114 4744 scope.go:117] "RemoveContainer" containerID="8485a4680d7e9bd7479321ad2c60fbdd63c7941a99f12f5159677293b38ca6db" Mar 11 01:19:55 crc kubenswrapper[4744]: I0311 01:19:55.522965 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/730c901d-c3c5-46c5-b618-00cdcc17bef2-combined-ca-bundle\") pod \"730c901d-c3c5-46c5-b618-00cdcc17bef2\" (UID: \"730c901d-c3c5-46c5-b618-00cdcc17bef2\") " Mar 11 01:19:55 crc kubenswrapper[4744]: I0311 01:19:55.523074 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/730c901d-c3c5-46c5-b618-00cdcc17bef2-memcached-tls-certs\") pod \"730c901d-c3c5-46c5-b618-00cdcc17bef2\" (UID: \"730c901d-c3c5-46c5-b618-00cdcc17bef2\") " Mar 11 01:19:55 crc kubenswrapper[4744]: I0311 01:19:55.523177 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/730c901d-c3c5-46c5-b618-00cdcc17bef2-config-data\") pod \"730c901d-c3c5-46c5-b618-00cdcc17bef2\" (UID: \"730c901d-c3c5-46c5-b618-00cdcc17bef2\") " Mar 11 01:19:55 
crc kubenswrapper[4744]: I0311 01:19:55.523329 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-c8djm\" (UniqueName: \"kubernetes.io/projected/730c901d-c3c5-46c5-b618-00cdcc17bef2-kube-api-access-c8djm\") pod \"730c901d-c3c5-46c5-b618-00cdcc17bef2\" (UID: \"730c901d-c3c5-46c5-b618-00cdcc17bef2\") " Mar 11 01:19:55 crc kubenswrapper[4744]: I0311 01:19:55.524080 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/730c901d-c3c5-46c5-b618-00cdcc17bef2-kolla-config\") pod \"730c901d-c3c5-46c5-b618-00cdcc17bef2\" (UID: \"730c901d-c3c5-46c5-b618-00cdcc17bef2\") " Mar 11 01:19:55 crc kubenswrapper[4744]: I0311 01:19:55.524130 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/730c901d-c3c5-46c5-b618-00cdcc17bef2-config-data" (OuterVolumeSpecName: "config-data") pod "730c901d-c3c5-46c5-b618-00cdcc17bef2" (UID: "730c901d-c3c5-46c5-b618-00cdcc17bef2"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 11 01:19:55 crc kubenswrapper[4744]: I0311 01:19:55.524498 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/730c901d-c3c5-46c5-b618-00cdcc17bef2-kolla-config" (OuterVolumeSpecName: "kolla-config") pod "730c901d-c3c5-46c5-b618-00cdcc17bef2" (UID: "730c901d-c3c5-46c5-b618-00cdcc17bef2"). InnerVolumeSpecName "kolla-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 11 01:19:55 crc kubenswrapper[4744]: E0311 01:19:55.524681 4744 configmap.go:193] Couldn't get configMap openstack/openstack-scripts: configmap "openstack-scripts" not found Mar 11 01:19:55 crc kubenswrapper[4744]: E0311 01:19:55.524770 4744 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/7f3aa5cc-eae2-4d60-96cf-6d847a5599ec-operator-scripts podName:7f3aa5cc-eae2-4d60-96cf-6d847a5599ec nodeName:}" failed. No retries permitted until 2026-03-11 01:19:57.524756487 +0000 UTC m=+1554.328974092 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "operator-scripts" (UniqueName: "kubernetes.io/configmap/7f3aa5cc-eae2-4d60-96cf-6d847a5599ec-operator-scripts") pod "root-account-create-update-rplmm" (UID: "7f3aa5cc-eae2-4d60-96cf-6d847a5599ec") : configmap "openstack-scripts" not found Mar 11 01:19:55 crc kubenswrapper[4744]: I0311 01:19:55.528776 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/730c901d-c3c5-46c5-b618-00cdcc17bef2-kube-api-access-c8djm" (OuterVolumeSpecName: "kube-api-access-c8djm") pod "730c901d-c3c5-46c5-b618-00cdcc17bef2" (UID: "730c901d-c3c5-46c5-b618-00cdcc17bef2"). InnerVolumeSpecName "kube-api-access-c8djm". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 01:19:55 crc kubenswrapper[4744]: I0311 01:19:55.569542 4744 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 11 01:19:55 crc kubenswrapper[4744]: E0311 01:19:55.574412 4744 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="9fbe6aaee4ddc8846bfb132b1b5d6d6e18bbaed4c0eefbf91efd571ce47a8e6a" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Mar 11 01:19:55 crc kubenswrapper[4744]: E0311 01:19:55.585125 4744 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="9fbe6aaee4ddc8846bfb132b1b5d6d6e18bbaed4c0eefbf91efd571ce47a8e6a" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Mar 11 01:19:55 crc kubenswrapper[4744]: I0311 01:19:55.602227 4744 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 11 01:19:55 crc kubenswrapper[4744]: E0311 01:19:55.619034 4744 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="9fbe6aaee4ddc8846bfb132b1b5d6d6e18bbaed4c0eefbf91efd571ce47a8e6a" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Mar 11 01:19:55 crc kubenswrapper[4744]: E0311 01:19:55.619240 4744 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/nova-cell1-conductor-0" podUID="05c279dc-d915-4688-b2c2-c43ff96ad81c" containerName="nova-cell1-conductor-conductor" Mar 11 01:19:55 crc kubenswrapper[4744]: 
E0311 01:19:55.619470 4744 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 4313c32f70e334310d9c7d0fb323d92eab706c09609f6e86a47d81e663224202 is running failed: container process not found" containerID="4313c32f70e334310d9c7d0fb323d92eab706c09609f6e86a47d81e663224202" cmd=["/usr/local/bin/container-scripts/status_check.sh"] Mar 11 01:19:55 crc kubenswrapper[4744]: E0311 01:19:55.619942 4744 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 4313c32f70e334310d9c7d0fb323d92eab706c09609f6e86a47d81e663224202 is running failed: container process not found" containerID="4313c32f70e334310d9c7d0fb323d92eab706c09609f6e86a47d81e663224202" cmd=["/usr/local/bin/container-scripts/status_check.sh"] Mar 11 01:19:55 crc kubenswrapper[4744]: I0311 01:19:55.627094 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/730c901d-c3c5-46c5-b618-00cdcc17bef2-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "730c901d-c3c5-46c5-b618-00cdcc17bef2" (UID: "730c901d-c3c5-46c5-b618-00cdcc17bef2"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 01:19:55 crc kubenswrapper[4744]: I0311 01:19:55.628012 4744 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-c8djm\" (UniqueName: \"kubernetes.io/projected/730c901d-c3c5-46c5-b618-00cdcc17bef2-kube-api-access-c8djm\") on node \"crc\" DevicePath \"\"" Mar 11 01:19:55 crc kubenswrapper[4744]: I0311 01:19:55.628035 4744 reconciler_common.go:293] "Volume detached for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/730c901d-c3c5-46c5-b618-00cdcc17bef2-kolla-config\") on node \"crc\" DevicePath \"\"" Mar 11 01:19:55 crc kubenswrapper[4744]: I0311 01:19:55.628044 4744 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/730c901d-c3c5-46c5-b618-00cdcc17bef2-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 11 01:19:55 crc kubenswrapper[4744]: I0311 01:19:55.628052 4744 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/730c901d-c3c5-46c5-b618-00cdcc17bef2-config-data\") on node \"crc\" DevicePath \"\"" Mar 11 01:19:55 crc kubenswrapper[4744]: I0311 01:19:55.636568 4744 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"] Mar 11 01:19:55 crc kubenswrapper[4744]: E0311 01:19:55.645737 4744 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 4313c32f70e334310d9c7d0fb323d92eab706c09609f6e86a47d81e663224202 is running failed: container process not found" containerID="4313c32f70e334310d9c7d0fb323d92eab706c09609f6e86a47d81e663224202" cmd=["/usr/local/bin/container-scripts/status_check.sh"] Mar 11 01:19:55 crc kubenswrapper[4744]: E0311 01:19:55.645813 4744 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 
4313c32f70e334310d9c7d0fb323d92eab706c09609f6e86a47d81e663224202 is running failed: container process not found" probeType="Readiness" pod="openstack/ovn-northd-0" podUID="4f48c5b8-9cda-4c2c-9244-9bb71b9dc05f" containerName="ovn-northd" Mar 11 01:19:55 crc kubenswrapper[4744]: I0311 01:19:55.646088 4744 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-api-0"] Mar 11 01:19:55 crc kubenswrapper[4744]: I0311 01:19:55.649681 4744 scope.go:117] "RemoveContainer" containerID="d169bd1e2373ec6a70d9c8ea39075ed35856b33455affec0b49a8f185690d2f6" Mar 11 01:19:55 crc kubenswrapper[4744]: I0311 01:19:55.671823 4744 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-7f78d57d44-gt8df"] Mar 11 01:19:55 crc kubenswrapper[4744]: I0311 01:19:55.674159 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/730c901d-c3c5-46c5-b618-00cdcc17bef2-memcached-tls-certs" (OuterVolumeSpecName: "memcached-tls-certs") pod "730c901d-c3c5-46c5-b618-00cdcc17bef2" (UID: "730c901d-c3c5-46c5-b618-00cdcc17bef2"). InnerVolumeSpecName "memcached-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 01:19:55 crc kubenswrapper[4744]: I0311 01:19:55.708319 4744 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-0" Mar 11 01:19:55 crc kubenswrapper[4744]: I0311 01:19:55.719650 4744 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-rplmm" Mar 11 01:19:55 crc kubenswrapper[4744]: I0311 01:19:55.733763 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-959b9\" (UniqueName: \"kubernetes.io/projected/c8caed76-baba-4ad3-b95a-e428132f2021-kube-api-access-959b9\") pod \"c8caed76-baba-4ad3-b95a-e428132f2021\" (UID: \"c8caed76-baba-4ad3-b95a-e428132f2021\") " Mar 11 01:19:55 crc kubenswrapper[4744]: I0311 01:19:55.733822 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c8caed76-baba-4ad3-b95a-e428132f2021-combined-ca-bundle\") pod \"c8caed76-baba-4ad3-b95a-e428132f2021\" (UID: \"c8caed76-baba-4ad3-b95a-e428132f2021\") " Mar 11 01:19:55 crc kubenswrapper[4744]: I0311 01:19:55.733923 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c8caed76-baba-4ad3-b95a-e428132f2021-config-data\") pod \"c8caed76-baba-4ad3-b95a-e428132f2021\" (UID: \"c8caed76-baba-4ad3-b95a-e428132f2021\") " Mar 11 01:19:55 crc kubenswrapper[4744]: I0311 01:19:55.734458 4744 reconciler_common.go:293] "Volume detached for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/730c901d-c3c5-46c5-b618-00cdcc17bef2-memcached-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 11 01:19:55 crc kubenswrapper[4744]: I0311 01:19:55.739538 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c8caed76-baba-4ad3-b95a-e428132f2021-kube-api-access-959b9" (OuterVolumeSpecName: "kube-api-access-959b9") pod "c8caed76-baba-4ad3-b95a-e428132f2021" (UID: "c8caed76-baba-4ad3-b95a-e428132f2021"). InnerVolumeSpecName "kube-api-access-959b9". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 01:19:55 crc kubenswrapper[4744]: I0311 01:19:55.763970 4744 scope.go:117] "RemoveContainer" containerID="45fbeb5c4f5cda1cead9ba249d13f0bd4d1407bc16d3e8ba3521b8e0f14fca3f" Mar 11 01:19:55 crc kubenswrapper[4744]: I0311 01:19:55.776066 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c8caed76-baba-4ad3-b95a-e428132f2021-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "c8caed76-baba-4ad3-b95a-e428132f2021" (UID: "c8caed76-baba-4ad3-b95a-e428132f2021"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 01:19:55 crc kubenswrapper[4744]: I0311 01:19:55.776135 4744 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-7f78d57d44-gt8df"] Mar 11 01:19:55 crc kubenswrapper[4744]: I0311 01:19:55.829653 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c8caed76-baba-4ad3-b95a-e428132f2021-config-data" (OuterVolumeSpecName: "config-data") pod "c8caed76-baba-4ad3-b95a-e428132f2021" (UID: "c8caed76-baba-4ad3-b95a-e428132f2021"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 01:19:55 crc kubenswrapper[4744]: I0311 01:19:55.833419 4744 scope.go:117] "RemoveContainer" containerID="5be8527746802ca45e3d648c31b7d6bd21a5227a4e2fd3aeb4816ee96bea245a" Mar 11 01:19:55 crc kubenswrapper[4744]: I0311 01:19:55.843311 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7f3aa5cc-eae2-4d60-96cf-6d847a5599ec-operator-scripts\") pod \"7f3aa5cc-eae2-4d60-96cf-6d847a5599ec\" (UID: \"7f3aa5cc-eae2-4d60-96cf-6d847a5599ec\") " Mar 11 01:19:55 crc kubenswrapper[4744]: I0311 01:19:55.843384 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-m9b57\" (UniqueName: \"kubernetes.io/projected/7f3aa5cc-eae2-4d60-96cf-6d847a5599ec-kube-api-access-m9b57\") pod \"7f3aa5cc-eae2-4d60-96cf-6d847a5599ec\" (UID: \"7f3aa5cc-eae2-4d60-96cf-6d847a5599ec\") " Mar 11 01:19:55 crc kubenswrapper[4744]: I0311 01:19:55.843743 4744 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-959b9\" (UniqueName: \"kubernetes.io/projected/c8caed76-baba-4ad3-b95a-e428132f2021-kube-api-access-959b9\") on node \"crc\" DevicePath \"\"" Mar 11 01:19:55 crc kubenswrapper[4744]: I0311 01:19:55.843759 4744 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c8caed76-baba-4ad3-b95a-e428132f2021-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 11 01:19:55 crc kubenswrapper[4744]: I0311 01:19:55.843768 4744 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c8caed76-baba-4ad3-b95a-e428132f2021-config-data\") on node \"crc\" DevicePath \"\"" Mar 11 01:19:55 crc kubenswrapper[4744]: E0311 01:19:55.843831 4744 configmap.go:193] Couldn't get configMap openstack/rabbitmq-config-data: configmap "rabbitmq-config-data" not found Mar 11 01:19:55 crc kubenswrapper[4744]: 
E0311 01:19:55.843876 4744 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/fb920bf7-eae5-4f7a-9af7-bde85bfb4ee9-config-data podName:fb920bf7-eae5-4f7a-9af7-bde85bfb4ee9 nodeName:}" failed. No retries permitted until 2026-03-11 01:20:03.843861483 +0000 UTC m=+1560.648079088 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/fb920bf7-eae5-4f7a-9af7-bde85bfb4ee9-config-data") pod "rabbitmq-server-0" (UID: "fb920bf7-eae5-4f7a-9af7-bde85bfb4ee9") : configmap "rabbitmq-config-data" not found Mar 11 01:19:55 crc kubenswrapper[4744]: I0311 01:19:55.847405 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7f3aa5cc-eae2-4d60-96cf-6d847a5599ec-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "7f3aa5cc-eae2-4d60-96cf-6d847a5599ec" (UID: "7f3aa5cc-eae2-4d60-96cf-6d847a5599ec"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 11 01:19:55 crc kubenswrapper[4744]: I0311 01:19:55.847765 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7f3aa5cc-eae2-4d60-96cf-6d847a5599ec-kube-api-access-m9b57" (OuterVolumeSpecName: "kube-api-access-m9b57") pod "7f3aa5cc-eae2-4d60-96cf-6d847a5599ec" (UID: "7f3aa5cc-eae2-4d60-96cf-6d847a5599ec"). InnerVolumeSpecName "kube-api-access-m9b57". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 01:19:55 crc kubenswrapper[4744]: I0311 01:19:55.881817 4744 scope.go:117] "RemoveContainer" containerID="dda45a2a6beb4d51c68422dc1406aaa80a32f7291668639a011f39ec42b2fc96" Mar 11 01:19:55 crc kubenswrapper[4744]: I0311 01:19:55.899207 4744 scope.go:117] "RemoveContainer" containerID="c2b142e64db7ed4cb71b1ebbbbe9200442217ded53b3157e1c635748af055ca9" Mar 11 01:19:55 crc kubenswrapper[4744]: I0311 01:19:55.920898 4744 scope.go:117] "RemoveContainer" containerID="9aea2a3292e66ea11857925fc0390ee0f713bcd4e833fe65a989d82320b82cad" Mar 11 01:19:55 crc kubenswrapper[4744]: I0311 01:19:55.930388 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_4f48c5b8-9cda-4c2c-9244-9bb71b9dc05f/ovn-northd/0.log" Mar 11 01:19:55 crc kubenswrapper[4744]: I0311 01:19:55.930455 4744 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-northd-0" Mar 11 01:19:55 crc kubenswrapper[4744]: I0311 01:19:55.945743 4744 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7f3aa5cc-eae2-4d60-96cf-6d847a5599ec-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 11 01:19:55 crc kubenswrapper[4744]: I0311 01:19:55.945800 4744 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-m9b57\" (UniqueName: \"kubernetes.io/projected/7f3aa5cc-eae2-4d60-96cf-6d847a5599ec-kube-api-access-m9b57\") on node \"crc\" DevicePath \"\"" Mar 11 01:19:55 crc kubenswrapper[4744]: I0311 01:19:55.951670 4744 scope.go:117] "RemoveContainer" containerID="c2b142e64db7ed4cb71b1ebbbbe9200442217ded53b3157e1c635748af055ca9" Mar 11 01:19:55 crc kubenswrapper[4744]: E0311 01:19:55.952300 4744 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c2b142e64db7ed4cb71b1ebbbbe9200442217ded53b3157e1c635748af055ca9\": container with ID starting with 
c2b142e64db7ed4cb71b1ebbbbe9200442217ded53b3157e1c635748af055ca9 not found: ID does not exist" containerID="c2b142e64db7ed4cb71b1ebbbbe9200442217ded53b3157e1c635748af055ca9" Mar 11 01:19:55 crc kubenswrapper[4744]: I0311 01:19:55.952347 4744 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c2b142e64db7ed4cb71b1ebbbbe9200442217ded53b3157e1c635748af055ca9"} err="failed to get container status \"c2b142e64db7ed4cb71b1ebbbbe9200442217ded53b3157e1c635748af055ca9\": rpc error: code = NotFound desc = could not find container \"c2b142e64db7ed4cb71b1ebbbbe9200442217ded53b3157e1c635748af055ca9\": container with ID starting with c2b142e64db7ed4cb71b1ebbbbe9200442217ded53b3157e1c635748af055ca9 not found: ID does not exist" Mar 11 01:19:55 crc kubenswrapper[4744]: I0311 01:19:55.952373 4744 scope.go:117] "RemoveContainer" containerID="9aea2a3292e66ea11857925fc0390ee0f713bcd4e833fe65a989d82320b82cad" Mar 11 01:19:55 crc kubenswrapper[4744]: E0311 01:19:55.956822 4744 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9aea2a3292e66ea11857925fc0390ee0f713bcd4e833fe65a989d82320b82cad\": container with ID starting with 9aea2a3292e66ea11857925fc0390ee0f713bcd4e833fe65a989d82320b82cad not found: ID does not exist" containerID="9aea2a3292e66ea11857925fc0390ee0f713bcd4e833fe65a989d82320b82cad" Mar 11 01:19:55 crc kubenswrapper[4744]: I0311 01:19:55.956852 4744 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9aea2a3292e66ea11857925fc0390ee0f713bcd4e833fe65a989d82320b82cad"} err="failed to get container status \"9aea2a3292e66ea11857925fc0390ee0f713bcd4e833fe65a989d82320b82cad\": rpc error: code = NotFound desc = could not find container \"9aea2a3292e66ea11857925fc0390ee0f713bcd4e833fe65a989d82320b82cad\": container with ID starting with 9aea2a3292e66ea11857925fc0390ee0f713bcd4e833fe65a989d82320b82cad not found: ID does not 
exist" Mar 11 01:19:55 crc kubenswrapper[4744]: I0311 01:19:55.957205 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" event={"ID":"730c901d-c3c5-46c5-b618-00cdcc17bef2","Type":"ContainerDied","Data":"1b26f542b992a0c808fd6085d2325d9b6aa8290cf4c2b4d0581787fb7abed8f0"} Mar 11 01:19:55 crc kubenswrapper[4744]: I0311 01:19:55.957257 4744 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/memcached-0" Mar 11 01:19:55 crc kubenswrapper[4744]: I0311 01:19:55.957292 4744 scope.go:117] "RemoveContainer" containerID="041f61378d1b6288e78cda19110d0e5a7b5f6590d353d62737e351291c23ef4b" Mar 11 01:19:55 crc kubenswrapper[4744]: I0311 01:19:55.963440 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-7c68976bb4-299gh" event={"ID":"44461324-fa82-4476-a621-c560a3c89e0f","Type":"ContainerDied","Data":"cd2685be8c2515724b30ea52e7364dd64a1e93a5cdd26d77f671fc74d691afb1"} Mar 11 01:19:55 crc kubenswrapper[4744]: I0311 01:19:55.963619 4744 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-7c68976bb4-299gh" Mar 11 01:19:55 crc kubenswrapper[4744]: I0311 01:19:55.968985 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-rplmm" event={"ID":"7f3aa5cc-eae2-4d60-96cf-6d847a5599ec","Type":"ContainerDied","Data":"68f27e12070e3e9f4e8ddf9e7621e4bd7c6c7dea2fa37e86cf38abbb967764f7"} Mar 11 01:19:55 crc kubenswrapper[4744]: I0311 01:19:55.969076 4744 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-rplmm" Mar 11 01:19:55 crc kubenswrapper[4744]: I0311 01:19:55.979010 4744 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-worker-bc7567ff7-gl658" Mar 11 01:19:55 crc kubenswrapper[4744]: I0311 01:19:55.981577 4744 scope.go:117] "RemoveContainer" containerID="347f55903065cc04c7bddba453ffd2605f2f89f0b3744727e7f308c9383fd8e9" Mar 11 01:19:56 crc kubenswrapper[4744]: I0311 01:19:56.013961 4744 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-0" Mar 11 01:19:56 crc kubenswrapper[4744]: I0311 01:19:56.021850 4744 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="23fcfdba-12bc-4a94-94cd-fb703f2e632c" path="/var/lib/kubelet/pods/23fcfdba-12bc-4a94-94cd-fb703f2e632c/volumes" Mar 11 01:19:56 crc kubenswrapper[4744]: I0311 01:19:56.022609 4744 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4767cbee-21c4-4deb-871a-9c6169f5741d" path="/var/lib/kubelet/pods/4767cbee-21c4-4deb-871a-9c6169f5741d/volumes" Mar 11 01:19:56 crc kubenswrapper[4744]: I0311 01:19:56.025605 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_4f48c5b8-9cda-4c2c-9244-9bb71b9dc05f/ovn-northd/0.log" Mar 11 01:19:56 crc kubenswrapper[4744]: I0311 01:19:56.025819 4744 generic.go:334] "Generic (PLEG): container finished" podID="4f48c5b8-9cda-4c2c-9244-9bb71b9dc05f" containerID="4313c32f70e334310d9c7d0fb323d92eab706c09609f6e86a47d81e663224202" exitCode=139 Mar 11 01:19:56 crc kubenswrapper[4744]: I0311 01:19:56.025962 4744 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-northd-0" Mar 11 01:19:56 crc kubenswrapper[4744]: I0311 01:19:56.026087 4744 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-keystone-listener-7647d7b844-j6gcn" Mar 11 01:19:56 crc kubenswrapper[4744]: I0311 01:19:56.026096 4744 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-qnrv8" podUID="03b55bf1-d3fe-4ba9-bc5b-3ac4cf87e4f8" containerName="registry-server" containerID="cri-o://7fe48392686fc634dabbfe9990397d2cb61cef30311e93bee85265aa5ef9c32a" gracePeriod=2 Mar 11 01:19:56 crc kubenswrapper[4744]: I0311 01:19:56.026959 4744 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4a418ad0-7ff0-474b-944a-71418601beb9" path="/var/lib/kubelet/pods/4a418ad0-7ff0-474b-944a-71418601beb9/volumes" Mar 11 01:19:56 crc kubenswrapper[4744]: I0311 01:19:56.027356 4744 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a6b56953-c881-474c-a21f-4a39102d89ab" path="/var/lib/kubelet/pods/a6b56953-c881-474c-a21f-4a39102d89ab/volumes" Mar 11 01:19:56 crc kubenswrapper[4744]: I0311 01:19:56.028002 4744 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ae4f1d0a-32b7-4ec5-9f5b-a43589af4119" path="/var/lib/kubelet/pods/ae4f1d0a-32b7-4ec5-9f5b-a43589af4119/volumes" Mar 11 01:19:56 crc kubenswrapper[4744]: I0311 01:19:56.030060 4744 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b9edcd5c-3634-45f9-914a-0d8e4f425302" path="/var/lib/kubelet/pods/b9edcd5c-3634-45f9-914a-0d8e4f425302/volumes" Mar 11 01:19:56 crc kubenswrapper[4744]: I0311 01:19:56.031769 4744 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cc403516-137f-4bfb-badf-89b13ff0468f" path="/var/lib/kubelet/pods/cc403516-137f-4bfb-badf-89b13ff0468f/volumes" Mar 11 01:19:56 crc kubenswrapper[4744]: I0311 01:19:56.032327 4744 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f11c8953-d88f-4d37-8366-b0b61606fa8a" path="/var/lib/kubelet/pods/f11c8953-d88f-4d37-8366-b0b61606fa8a/volumes" Mar 11 01:19:56 crc kubenswrapper[4744]: I0311 
01:19:56.041623 4744 scope.go:117] "RemoveContainer" containerID="eb9d234dbc5c2b62e281936ae9110a93aa09d1211656f0987d1551eeec652ae8" Mar 11 01:19:56 crc kubenswrapper[4744]: I0311 01:19:56.046265 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/4f48c5b8-9cda-4c2c-9244-9bb71b9dc05f-scripts\") pod \"4f48c5b8-9cda-4c2c-9244-9bb71b9dc05f\" (UID: \"4f48c5b8-9cda-4c2c-9244-9bb71b9dc05f\") " Mar 11 01:19:56 crc kubenswrapper[4744]: I0311 01:19:56.046368 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4f48c5b8-9cda-4c2c-9244-9bb71b9dc05f-config\") pod \"4f48c5b8-9cda-4c2c-9244-9bb71b9dc05f\" (UID: \"4f48c5b8-9cda-4c2c-9244-9bb71b9dc05f\") " Mar 11 01:19:56 crc kubenswrapper[4744]: I0311 01:19:56.046419 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zbmgd\" (UniqueName: \"kubernetes.io/projected/4f48c5b8-9cda-4c2c-9244-9bb71b9dc05f-kube-api-access-zbmgd\") pod \"4f48c5b8-9cda-4c2c-9244-9bb71b9dc05f\" (UID: \"4f48c5b8-9cda-4c2c-9244-9bb71b9dc05f\") " Mar 11 01:19:56 crc kubenswrapper[4744]: I0311 01:19:56.046495 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/4f48c5b8-9cda-4c2c-9244-9bb71b9dc05f-ovn-northd-tls-certs\") pod \"4f48c5b8-9cda-4c2c-9244-9bb71b9dc05f\" (UID: \"4f48c5b8-9cda-4c2c-9244-9bb71b9dc05f\") " Mar 11 01:19:56 crc kubenswrapper[4744]: I0311 01:19:56.046585 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4f48c5b8-9cda-4c2c-9244-9bb71b9dc05f-combined-ca-bundle\") pod \"4f48c5b8-9cda-4c2c-9244-9bb71b9dc05f\" (UID: \"4f48c5b8-9cda-4c2c-9244-9bb71b9dc05f\") " Mar 11 01:19:56 crc kubenswrapper[4744]: I0311 01:19:56.046624 4744 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/4f48c5b8-9cda-4c2c-9244-9bb71b9dc05f-ovn-rundir\") pod \"4f48c5b8-9cda-4c2c-9244-9bb71b9dc05f\" (UID: \"4f48c5b8-9cda-4c2c-9244-9bb71b9dc05f\") " Mar 11 01:19:56 crc kubenswrapper[4744]: I0311 01:19:56.046644 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/4f48c5b8-9cda-4c2c-9244-9bb71b9dc05f-metrics-certs-tls-certs\") pod \"4f48c5b8-9cda-4c2c-9244-9bb71b9dc05f\" (UID: \"4f48c5b8-9cda-4c2c-9244-9bb71b9dc05f\") " Mar 11 01:19:56 crc kubenswrapper[4744]: I0311 01:19:56.046691 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4f48c5b8-9cda-4c2c-9244-9bb71b9dc05f-scripts" (OuterVolumeSpecName: "scripts") pod "4f48c5b8-9cda-4c2c-9244-9bb71b9dc05f" (UID: "4f48c5b8-9cda-4c2c-9244-9bb71b9dc05f"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 11 01:19:56 crc kubenswrapper[4744]: I0311 01:19:56.046927 4744 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/4f48c5b8-9cda-4c2c-9244-9bb71b9dc05f-scripts\") on node \"crc\" DevicePath \"\"" Mar 11 01:19:56 crc kubenswrapper[4744]: I0311 01:19:56.048370 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4f48c5b8-9cda-4c2c-9244-9bb71b9dc05f-ovn-rundir" (OuterVolumeSpecName: "ovn-rundir") pod "4f48c5b8-9cda-4c2c-9244-9bb71b9dc05f" (UID: "4f48c5b8-9cda-4c2c-9244-9bb71b9dc05f"). InnerVolumeSpecName "ovn-rundir". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 11 01:19:56 crc kubenswrapper[4744]: I0311 01:19:56.048973 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4f48c5b8-9cda-4c2c-9244-9bb71b9dc05f-config" (OuterVolumeSpecName: "config") pod "4f48c5b8-9cda-4c2c-9244-9bb71b9dc05f" (UID: "4f48c5b8-9cda-4c2c-9244-9bb71b9dc05f"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 11 01:19:56 crc kubenswrapper[4744]: I0311 01:19:56.066156 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4f48c5b8-9cda-4c2c-9244-9bb71b9dc05f-kube-api-access-zbmgd" (OuterVolumeSpecName: "kube-api-access-zbmgd") pod "4f48c5b8-9cda-4c2c-9244-9bb71b9dc05f" (UID: "4f48c5b8-9cda-4c2c-9244-9bb71b9dc05f"). InnerVolumeSpecName "kube-api-access-zbmgd". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 01:19:56 crc kubenswrapper[4744]: I0311 01:19:56.082674 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4f48c5b8-9cda-4c2c-9244-9bb71b9dc05f-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "4f48c5b8-9cda-4c2c-9244-9bb71b9dc05f" (UID: "4f48c5b8-9cda-4c2c-9244-9bb71b9dc05f"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 01:19:56 crc kubenswrapper[4744]: I0311 01:19:56.108305 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4f48c5b8-9cda-4c2c-9244-9bb71b9dc05f-metrics-certs-tls-certs" (OuterVolumeSpecName: "metrics-certs-tls-certs") pod "4f48c5b8-9cda-4c2c-9244-9bb71b9dc05f" (UID: "4f48c5b8-9cda-4c2c-9244-9bb71b9dc05f"). InnerVolumeSpecName "metrics-certs-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 01:19:56 crc kubenswrapper[4744]: I0311 01:19:56.119107 4744 scope.go:117] "RemoveContainer" containerID="b93a05d9c05b761944f0a54e2a10841bc55777d4b1e2fe855855e9122f7a3fdb" Mar 11 01:19:56 crc kubenswrapper[4744]: I0311 01:19:56.122664 4744 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-api-7c68976bb4-299gh"] Mar 11 01:19:56 crc kubenswrapper[4744]: I0311 01:19:56.122710 4744 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-api-7c68976bb4-299gh"] Mar 11 01:19:56 crc kubenswrapper[4744]: I0311 01:19:56.122728 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-bc7567ff7-gl658" event={"ID":"4fb8af9e-ef1e-45b0-b842-2647fe75510e","Type":"ContainerDied","Data":"00cd22cfbd7690b23506a55b55085f3deb61172a61c332f09aad3e95bb5e8a18"} Mar 11 01:19:56 crc kubenswrapper[4744]: I0311 01:19:56.122763 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"c8caed76-baba-4ad3-b95a-e428132f2021","Type":"ContainerDied","Data":"4066205a4d4f854a933e6ee6717e3772cc1b97e03da805734227a2224bdc1194"} Mar 11 01:19:56 crc kubenswrapper[4744]: I0311 01:19:56.122780 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"4f48c5b8-9cda-4c2c-9244-9bb71b9dc05f","Type":"ContainerDied","Data":"4313c32f70e334310d9c7d0fb323d92eab706c09609f6e86a47d81e663224202"} Mar 11 01:19:56 crc kubenswrapper[4744]: I0311 01:19:56.122795 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"4f48c5b8-9cda-4c2c-9244-9bb71b9dc05f","Type":"ContainerDied","Data":"bbcdff835555aeb6b4dbb296e3fd0876b11930670cbee761696cab7a332be634"} Mar 11 01:19:56 crc kubenswrapper[4744]: I0311 01:19:56.124169 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4f48c5b8-9cda-4c2c-9244-9bb71b9dc05f-ovn-northd-tls-certs" 
(OuterVolumeSpecName: "ovn-northd-tls-certs") pod "4f48c5b8-9cda-4c2c-9244-9bb71b9dc05f" (UID: "4f48c5b8-9cda-4c2c-9244-9bb71b9dc05f"). InnerVolumeSpecName "ovn-northd-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 01:19:56 crc kubenswrapper[4744]: I0311 01:19:56.148190 4744 reconciler_common.go:293] "Volume detached for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/4f48c5b8-9cda-4c2c-9244-9bb71b9dc05f-ovn-northd-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 11 01:19:56 crc kubenswrapper[4744]: I0311 01:19:56.148217 4744 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4f48c5b8-9cda-4c2c-9244-9bb71b9dc05f-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 11 01:19:56 crc kubenswrapper[4744]: I0311 01:19:56.148250 4744 reconciler_common.go:293] "Volume detached for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/4f48c5b8-9cda-4c2c-9244-9bb71b9dc05f-ovn-rundir\") on node \"crc\" DevicePath \"\"" Mar 11 01:19:56 crc kubenswrapper[4744]: I0311 01:19:56.148259 4744 reconciler_common.go:293] "Volume detached for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/4f48c5b8-9cda-4c2c-9244-9bb71b9dc05f-metrics-certs-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 11 01:19:56 crc kubenswrapper[4744]: I0311 01:19:56.148268 4744 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4f48c5b8-9cda-4c2c-9244-9bb71b9dc05f-config\") on node \"crc\" DevicePath \"\"" Mar 11 01:19:56 crc kubenswrapper[4744]: I0311 01:19:56.148279 4744 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zbmgd\" (UniqueName: \"kubernetes.io/projected/4f48c5b8-9cda-4c2c-9244-9bb71b9dc05f-kube-api-access-zbmgd\") on node \"crc\" DevicePath \"\"" Mar 11 01:19:56 crc kubenswrapper[4744]: I0311 01:19:56.321730 4744 scope.go:117] "RemoveContainer" 
containerID="9a1f61db53cc92beec4b876c5f655cbd7b0389b0b62b4ff5bc3cc4d0c15ec01a" Mar 11 01:19:56 crc kubenswrapper[4744]: I0311 01:19:56.352589 4744 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/root-account-create-update-rplmm"] Mar 11 01:19:56 crc kubenswrapper[4744]: E0311 01:19:56.356156 4744 configmap.go:193] Couldn't get configMap openstack/rabbitmq-cell1-config-data: configmap "rabbitmq-cell1-config-data" not found Mar 11 01:19:56 crc kubenswrapper[4744]: E0311 01:19:56.356210 4744 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/714c91e5-04c5-4f95-97e3-a3c08664944d-config-data podName:714c91e5-04c5-4f95-97e3-a3c08664944d nodeName:}" failed. No retries permitted until 2026-03-11 01:20:04.356197078 +0000 UTC m=+1561.160414683 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/714c91e5-04c5-4f95-97e3-a3c08664944d-config-data") pod "rabbitmq-cell1-server-0" (UID: "714c91e5-04c5-4f95-97e3-a3c08664944d") : configmap "rabbitmq-cell1-config-data" not found Mar 11 01:19:56 crc kubenswrapper[4744]: I0311 01:19:56.356948 4744 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/root-account-create-update-rplmm"] Mar 11 01:19:56 crc kubenswrapper[4744]: I0311 01:19:56.368664 4744 scope.go:117] "RemoveContainer" containerID="0f15de8f909facf4d53d7ef48aa1e9dab867da2d0c8b16f8059f69e45a208436" Mar 11 01:19:56 crc kubenswrapper[4744]: I0311 01:19:56.375077 4744 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/memcached-0"] Mar 11 01:19:56 crc kubenswrapper[4744]: I0311 01:19:56.382094 4744 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/memcached-0"] Mar 11 01:19:56 crc kubenswrapper[4744]: I0311 01:19:56.390055 4744 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-keystone-listener-7647d7b844-j6gcn"] Mar 11 01:19:56 crc kubenswrapper[4744]: I0311 01:19:56.393775 4744 scope.go:117] 
"RemoveContainer" containerID="a7377528de2e5a9a7450056f3a281f5a1113ad08b0bc9f7c476cd009942572ee" Mar 11 01:19:56 crc kubenswrapper[4744]: I0311 01:19:56.396614 4744 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-keystone-listener-7647d7b844-j6gcn"] Mar 11 01:19:56 crc kubenswrapper[4744]: I0311 01:19:56.405240 4744 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-northd-0"] Mar 11 01:19:56 crc kubenswrapper[4744]: I0311 01:19:56.411111 4744 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovn-northd-0"] Mar 11 01:19:56 crc kubenswrapper[4744]: I0311 01:19:56.418441 4744 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-conductor-0"] Mar 11 01:19:56 crc kubenswrapper[4744]: I0311 01:19:56.418885 4744 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ceilometer-0" podUID="33c0b5dd-192a-4fd2-bbaa-b483399724df" containerName="proxy-httpd" probeResult="failure" output="Get \"https://10.217.0.213:3000/\": dial tcp 10.217.0.213:3000: connect: connection refused" Mar 11 01:19:56 crc kubenswrapper[4744]: I0311 01:19:56.421701 4744 scope.go:117] "RemoveContainer" containerID="e79f185b18de9324d312d6637694c9a5d88669a575ff2e95d23425a83f97b67b" Mar 11 01:19:56 crc kubenswrapper[4744]: I0311 01:19:56.424367 4744 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-conductor-0"] Mar 11 01:19:56 crc kubenswrapper[4744]: I0311 01:19:56.437599 4744 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-worker-bc7567ff7-gl658"] Mar 11 01:19:56 crc kubenswrapper[4744]: I0311 01:19:56.440761 4744 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-worker-bc7567ff7-gl658"] Mar 11 01:19:56 crc kubenswrapper[4744]: I0311 01:19:56.456669 4744 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/swift-proxy-6bb49fdf95-9g9dz" podUID="d60ef156-5767-43c1-bb0b-a8c681a8a6be" containerName="proxy-httpd" 
probeResult="failure" output="Get \"https://10.217.0.176:8080/healthcheck\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 11 01:19:56 crc kubenswrapper[4744]: I0311 01:19:56.456804 4744 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/swift-proxy-6bb49fdf95-9g9dz" podUID="d60ef156-5767-43c1-bb0b-a8c681a8a6be" containerName="proxy-server" probeResult="failure" output="Get \"https://10.217.0.176:8080/healthcheck\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 11 01:19:56 crc kubenswrapper[4744]: I0311 01:19:56.494730 4744 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-qnrv8" Mar 11 01:19:56 crc kubenswrapper[4744]: I0311 01:19:56.500354 4744 scope.go:117] "RemoveContainer" containerID="4313c32f70e334310d9c7d0fb323d92eab706c09609f6e86a47d81e663224202" Mar 11 01:19:56 crc kubenswrapper[4744]: I0311 01:19:56.529448 4744 scope.go:117] "RemoveContainer" containerID="e79f185b18de9324d312d6637694c9a5d88669a575ff2e95d23425a83f97b67b" Mar 11 01:19:56 crc kubenswrapper[4744]: E0311 01:19:56.543171 4744 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e79f185b18de9324d312d6637694c9a5d88669a575ff2e95d23425a83f97b67b\": container with ID starting with e79f185b18de9324d312d6637694c9a5d88669a575ff2e95d23425a83f97b67b not found: ID does not exist" containerID="e79f185b18de9324d312d6637694c9a5d88669a575ff2e95d23425a83f97b67b" Mar 11 01:19:56 crc kubenswrapper[4744]: I0311 01:19:56.543221 4744 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e79f185b18de9324d312d6637694c9a5d88669a575ff2e95d23425a83f97b67b"} err="failed to get container status \"e79f185b18de9324d312d6637694c9a5d88669a575ff2e95d23425a83f97b67b\": rpc error: code = NotFound desc = 
could not find container \"e79f185b18de9324d312d6637694c9a5d88669a575ff2e95d23425a83f97b67b\": container with ID starting with e79f185b18de9324d312d6637694c9a5d88669a575ff2e95d23425a83f97b67b not found: ID does not exist" Mar 11 01:19:56 crc kubenswrapper[4744]: I0311 01:19:56.543249 4744 scope.go:117] "RemoveContainer" containerID="4313c32f70e334310d9c7d0fb323d92eab706c09609f6e86a47d81e663224202" Mar 11 01:19:56 crc kubenswrapper[4744]: E0311 01:19:56.543561 4744 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4313c32f70e334310d9c7d0fb323d92eab706c09609f6e86a47d81e663224202\": container with ID starting with 4313c32f70e334310d9c7d0fb323d92eab706c09609f6e86a47d81e663224202 not found: ID does not exist" containerID="4313c32f70e334310d9c7d0fb323d92eab706c09609f6e86a47d81e663224202" Mar 11 01:19:56 crc kubenswrapper[4744]: I0311 01:19:56.543582 4744 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4313c32f70e334310d9c7d0fb323d92eab706c09609f6e86a47d81e663224202"} err="failed to get container status \"4313c32f70e334310d9c7d0fb323d92eab706c09609f6e86a47d81e663224202\": rpc error: code = NotFound desc = could not find container \"4313c32f70e334310d9c7d0fb323d92eab706c09609f6e86a47d81e663224202\": container with ID starting with 4313c32f70e334310d9c7d0fb323d92eab706c09609f6e86a47d81e663224202 not found: ID does not exist" Mar 11 01:19:56 crc kubenswrapper[4744]: I0311 01:19:56.559220 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7sqhr\" (UniqueName: \"kubernetes.io/projected/03b55bf1-d3fe-4ba9-bc5b-3ac4cf87e4f8-kube-api-access-7sqhr\") pod \"03b55bf1-d3fe-4ba9-bc5b-3ac4cf87e4f8\" (UID: \"03b55bf1-d3fe-4ba9-bc5b-3ac4cf87e4f8\") " Mar 11 01:19:56 crc kubenswrapper[4744]: I0311 01:19:56.559385 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/03b55bf1-d3fe-4ba9-bc5b-3ac4cf87e4f8-utilities\") pod \"03b55bf1-d3fe-4ba9-bc5b-3ac4cf87e4f8\" (UID: \"03b55bf1-d3fe-4ba9-bc5b-3ac4cf87e4f8\") " Mar 11 01:19:56 crc kubenswrapper[4744]: I0311 01:19:56.559454 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/03b55bf1-d3fe-4ba9-bc5b-3ac4cf87e4f8-catalog-content\") pod \"03b55bf1-d3fe-4ba9-bc5b-3ac4cf87e4f8\" (UID: \"03b55bf1-d3fe-4ba9-bc5b-3ac4cf87e4f8\") " Mar 11 01:19:56 crc kubenswrapper[4744]: I0311 01:19:56.562150 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/03b55bf1-d3fe-4ba9-bc5b-3ac4cf87e4f8-utilities" (OuterVolumeSpecName: "utilities") pod "03b55bf1-d3fe-4ba9-bc5b-3ac4cf87e4f8" (UID: "03b55bf1-d3fe-4ba9-bc5b-3ac4cf87e4f8"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 11 01:19:56 crc kubenswrapper[4744]: I0311 01:19:56.569735 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/03b55bf1-d3fe-4ba9-bc5b-3ac4cf87e4f8-kube-api-access-7sqhr" (OuterVolumeSpecName: "kube-api-access-7sqhr") pod "03b55bf1-d3fe-4ba9-bc5b-3ac4cf87e4f8" (UID: "03b55bf1-d3fe-4ba9-bc5b-3ac4cf87e4f8"). InnerVolumeSpecName "kube-api-access-7sqhr". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 01:19:56 crc kubenswrapper[4744]: I0311 01:19:56.661505 4744 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7sqhr\" (UniqueName: \"kubernetes.io/projected/03b55bf1-d3fe-4ba9-bc5b-3ac4cf87e4f8-kube-api-access-7sqhr\") on node \"crc\" DevicePath \"\"" Mar 11 01:19:56 crc kubenswrapper[4744]: I0311 01:19:56.661551 4744 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/03b55bf1-d3fe-4ba9-bc5b-3ac4cf87e4f8-utilities\") on node \"crc\" DevicePath \"\"" Mar 11 01:19:56 crc kubenswrapper[4744]: I0311 01:19:56.694432 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/03b55bf1-d3fe-4ba9-bc5b-3ac4cf87e4f8-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "03b55bf1-d3fe-4ba9-bc5b-3ac4cf87e4f8" (UID: "03b55bf1-d3fe-4ba9-bc5b-3ac4cf87e4f8"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 11 01:19:56 crc kubenswrapper[4744]: I0311 01:19:56.763675 4744 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/03b55bf1-d3fe-4ba9-bc5b-3ac4cf87e4f8-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 11 01:19:56 crc kubenswrapper[4744]: E0311 01:19:56.989568 4744 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="acfad524609d9f297101d39eb6bbc89088706371a3d119c57c6f3e38150b4144" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Mar 11 01:19:56 crc kubenswrapper[4744]: E0311 01:19:56.989650 4744 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 376ac00d3a5176903cb096caf4aa6fb34bc54b52bfdfcffda0f5db681e50ec05 is 
running failed: container process not found" containerID="376ac00d3a5176903cb096caf4aa6fb34bc54b52bfdfcffda0f5db681e50ec05" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Mar 11 01:19:56 crc kubenswrapper[4744]: E0311 01:19:56.990133 4744 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 376ac00d3a5176903cb096caf4aa6fb34bc54b52bfdfcffda0f5db681e50ec05 is running failed: container process not found" containerID="376ac00d3a5176903cb096caf4aa6fb34bc54b52bfdfcffda0f5db681e50ec05" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Mar 11 01:19:56 crc kubenswrapper[4744]: E0311 01:19:56.990902 4744 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="acfad524609d9f297101d39eb6bbc89088706371a3d119c57c6f3e38150b4144" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Mar 11 01:19:56 crc kubenswrapper[4744]: E0311 01:19:56.992441 4744 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="acfad524609d9f297101d39eb6bbc89088706371a3d119c57c6f3e38150b4144" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Mar 11 01:19:56 crc kubenswrapper[4744]: E0311 01:19:56.992476 4744 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/ovn-controller-ovs-88ffp" podUID="1ee48ea6-67ea-4da3-af92-82b9d0e5b67d" containerName="ovs-vswitchd" Mar 11 01:19:56 crc kubenswrapper[4744]: E0311 01:19:56.992490 4744 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = 
NotFound desc = container is not created or running: checking if PID of 376ac00d3a5176903cb096caf4aa6fb34bc54b52bfdfcffda0f5db681e50ec05 is running failed: container process not found" containerID="376ac00d3a5176903cb096caf4aa6fb34bc54b52bfdfcffda0f5db681e50ec05" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Mar 11 01:19:56 crc kubenswrapper[4744]: E0311 01:19:56.992533 4744 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 376ac00d3a5176903cb096caf4aa6fb34bc54b52bfdfcffda0f5db681e50ec05 is running failed: container process not found" probeType="Readiness" pod="openstack/ovn-controller-ovs-88ffp" podUID="1ee48ea6-67ea-4da3-af92-82b9d0e5b67d" containerName="ovsdb-server" Mar 11 01:19:57 crc kubenswrapper[4744]: I0311 01:19:57.049155 4744 generic.go:334] "Generic (PLEG): container finished" podID="6b383e05-0440-49ee-8add-708ea04e9ce7" containerID="364c3fa121dddb8ffc883e3753ab0a397a68bab1a3f4f81bef538a6bf6da07b9" exitCode=0 Mar 11 01:19:57 crc kubenswrapper[4744]: I0311 01:19:57.049234 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"6b383e05-0440-49ee-8add-708ea04e9ce7","Type":"ContainerDied","Data":"364c3fa121dddb8ffc883e3753ab0a397a68bab1a3f4f81bef538a6bf6da07b9"} Mar 11 01:19:57 crc kubenswrapper[4744]: I0311 01:19:57.049989 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-mw9fv" Mar 11 01:19:57 crc kubenswrapper[4744]: I0311 01:19:57.050032 4744 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-mw9fv" Mar 11 01:19:57 crc kubenswrapper[4744]: I0311 01:19:57.051472 4744 generic.go:334] "Generic (PLEG): container finished" podID="03b55bf1-d3fe-4ba9-bc5b-3ac4cf87e4f8" containerID="7fe48392686fc634dabbfe9990397d2cb61cef30311e93bee85265aa5ef9c32a" exitCode=0 Mar 11 01:19:57 crc 
kubenswrapper[4744]: I0311 01:19:57.051536 4744 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-qnrv8" Mar 11 01:19:57 crc kubenswrapper[4744]: I0311 01:19:57.051559 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-qnrv8" event={"ID":"03b55bf1-d3fe-4ba9-bc5b-3ac4cf87e4f8","Type":"ContainerDied","Data":"7fe48392686fc634dabbfe9990397d2cb61cef30311e93bee85265aa5ef9c32a"} Mar 11 01:19:57 crc kubenswrapper[4744]: I0311 01:19:57.051587 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-qnrv8" event={"ID":"03b55bf1-d3fe-4ba9-bc5b-3ac4cf87e4f8","Type":"ContainerDied","Data":"6791e2dbcf7890a4ea4e236b98c40266f3bf25278f27196841bea8157595304e"} Mar 11 01:19:57 crc kubenswrapper[4744]: I0311 01:19:57.051607 4744 scope.go:117] "RemoveContainer" containerID="7fe48392686fc634dabbfe9990397d2cb61cef30311e93bee85265aa5ef9c32a" Mar 11 01:19:57 crc kubenswrapper[4744]: I0311 01:19:57.057736 4744 generic.go:334] "Generic (PLEG): container finished" podID="714c91e5-04c5-4f95-97e3-a3c08664944d" containerID="2d7e9342156b6a7e0b5782247ec6e299cdb60a6da7997fe5146c00f779c615e6" exitCode=0 Mar 11 01:19:57 crc kubenswrapper[4744]: I0311 01:19:57.057821 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"714c91e5-04c5-4f95-97e3-a3c08664944d","Type":"ContainerDied","Data":"2d7e9342156b6a7e0b5782247ec6e299cdb60a6da7997fe5146c00f779c615e6"} Mar 11 01:19:57 crc kubenswrapper[4744]: I0311 01:19:57.059930 4744 generic.go:334] "Generic (PLEG): container finished" podID="fb920bf7-eae5-4f7a-9af7-bde85bfb4ee9" containerID="f517c72839363553e8b786dec7b9824c28bc5f5e37822956cd45b454cd30e224" exitCode=0 Mar 11 01:19:57 crc kubenswrapper[4744]: I0311 01:19:57.059980 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" 
event={"ID":"fb920bf7-eae5-4f7a-9af7-bde85bfb4ee9","Type":"ContainerDied","Data":"f517c72839363553e8b786dec7b9824c28bc5f5e37822956cd45b454cd30e224"} Mar 11 01:19:57 crc kubenswrapper[4744]: I0311 01:19:57.064759 4744 generic.go:334] "Generic (PLEG): container finished" podID="2ee4fd0f-1e90-4771-bca2-2eb17df0b0b1" containerID="a5f82d457fa59f44d576003aefd017ef2285b45e6c69abd14e7eb7f7df02fc09" exitCode=0 Mar 11 01:19:57 crc kubenswrapper[4744]: I0311 01:19:57.064837 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-7cc5cb746-kmb4g" event={"ID":"2ee4fd0f-1e90-4771-bca2-2eb17df0b0b1","Type":"ContainerDied","Data":"a5f82d457fa59f44d576003aefd017ef2285b45e6c69abd14e7eb7f7df02fc09"} Mar 11 01:19:57 crc kubenswrapper[4744]: I0311 01:19:57.135852 4744 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-qnrv8"] Mar 11 01:19:57 crc kubenswrapper[4744]: E0311 01:19:57.135838 4744 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="fd3a94b5310ccf7a9c7dd7207db3d75ffedf02fcd045db30b5914808a12e1cc6" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Mar 11 01:19:57 crc kubenswrapper[4744]: E0311 01:19:57.152733 4744 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="fd3a94b5310ccf7a9c7dd7207db3d75ffedf02fcd045db30b5914808a12e1cc6" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Mar 11 01:19:57 crc kubenswrapper[4744]: I0311 01:19:57.152759 4744 scope.go:117] "RemoveContainer" containerID="2bedcd4d8553a08b966348929afde0364ddec647eddeed161dba7e9f9765b64f" Mar 11 01:19:57 crc kubenswrapper[4744]: I0311 01:19:57.152925 4744 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" 
pod="openshift-marketplace/redhat-marketplace-mw9fv" Mar 11 01:19:57 crc kubenswrapper[4744]: I0311 01:19:57.153382 4744 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Mar 11 01:19:57 crc kubenswrapper[4744]: I0311 01:19:57.158040 4744 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-qnrv8"] Mar 11 01:19:57 crc kubenswrapper[4744]: E0311 01:19:57.159053 4744 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="fd3a94b5310ccf7a9c7dd7207db3d75ffedf02fcd045db30b5914808a12e1cc6" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Mar 11 01:19:57 crc kubenswrapper[4744]: E0311 01:19:57.159107 4744 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/nova-scheduler-0" podUID="b62ac51a-a222-4e7b-b465-9e71c3d34b1f" containerName="nova-scheduler-scheduler" Mar 11 01:19:57 crc kubenswrapper[4744]: I0311 01:19:57.173874 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/fb920bf7-eae5-4f7a-9af7-bde85bfb4ee9-server-conf\") pod \"fb920bf7-eae5-4f7a-9af7-bde85bfb4ee9\" (UID: \"fb920bf7-eae5-4f7a-9af7-bde85bfb4ee9\") " Mar 11 01:19:57 crc kubenswrapper[4744]: I0311 01:19:57.174097 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/fb920bf7-eae5-4f7a-9af7-bde85bfb4ee9-config-data\") pod \"fb920bf7-eae5-4f7a-9af7-bde85bfb4ee9\" (UID: \"fb920bf7-eae5-4f7a-9af7-bde85bfb4ee9\") " Mar 11 01:19:57 crc kubenswrapper[4744]: I0311 01:19:57.174254 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume 
started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/fb920bf7-eae5-4f7a-9af7-bde85bfb4ee9-erlang-cookie-secret\") pod \"fb920bf7-eae5-4f7a-9af7-bde85bfb4ee9\" (UID: \"fb920bf7-eae5-4f7a-9af7-bde85bfb4ee9\") " Mar 11 01:19:57 crc kubenswrapper[4744]: I0311 01:19:57.174328 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/fb920bf7-eae5-4f7a-9af7-bde85bfb4ee9-plugins-conf\") pod \"fb920bf7-eae5-4f7a-9af7-bde85bfb4ee9\" (UID: \"fb920bf7-eae5-4f7a-9af7-bde85bfb4ee9\") " Mar 11 01:19:57 crc kubenswrapper[4744]: I0311 01:19:57.174396 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/fb920bf7-eae5-4f7a-9af7-bde85bfb4ee9-rabbitmq-tls\") pod \"fb920bf7-eae5-4f7a-9af7-bde85bfb4ee9\" (UID: \"fb920bf7-eae5-4f7a-9af7-bde85bfb4ee9\") " Mar 11 01:19:57 crc kubenswrapper[4744]: I0311 01:19:57.174465 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/fb920bf7-eae5-4f7a-9af7-bde85bfb4ee9-pod-info\") pod \"fb920bf7-eae5-4f7a-9af7-bde85bfb4ee9\" (UID: \"fb920bf7-eae5-4f7a-9af7-bde85bfb4ee9\") " Mar 11 01:19:57 crc kubenswrapper[4744]: I0311 01:19:57.175675 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nl7w2\" (UniqueName: \"kubernetes.io/projected/fb920bf7-eae5-4f7a-9af7-bde85bfb4ee9-kube-api-access-nl7w2\") pod \"fb920bf7-eae5-4f7a-9af7-bde85bfb4ee9\" (UID: \"fb920bf7-eae5-4f7a-9af7-bde85bfb4ee9\") " Mar 11 01:19:57 crc kubenswrapper[4744]: I0311 01:19:57.175791 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/fb920bf7-eae5-4f7a-9af7-bde85bfb4ee9-rabbitmq-plugins\") pod \"fb920bf7-eae5-4f7a-9af7-bde85bfb4ee9\" (UID: 
\"fb920bf7-eae5-4f7a-9af7-bde85bfb4ee9\") " Mar 11 01:19:57 crc kubenswrapper[4744]: I0311 01:19:57.175879 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"persistence\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"fb920bf7-eae5-4f7a-9af7-bde85bfb4ee9\" (UID: \"fb920bf7-eae5-4f7a-9af7-bde85bfb4ee9\") " Mar 11 01:19:57 crc kubenswrapper[4744]: I0311 01:19:57.175958 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/fb920bf7-eae5-4f7a-9af7-bde85bfb4ee9-rabbitmq-erlang-cookie\") pod \"fb920bf7-eae5-4f7a-9af7-bde85bfb4ee9\" (UID: \"fb920bf7-eae5-4f7a-9af7-bde85bfb4ee9\") " Mar 11 01:19:57 crc kubenswrapper[4744]: I0311 01:19:57.176027 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/fb920bf7-eae5-4f7a-9af7-bde85bfb4ee9-rabbitmq-confd\") pod \"fb920bf7-eae5-4f7a-9af7-bde85bfb4ee9\" (UID: \"fb920bf7-eae5-4f7a-9af7-bde85bfb4ee9\") " Mar 11 01:19:57 crc kubenswrapper[4744]: I0311 01:19:57.176500 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fb920bf7-eae5-4f7a-9af7-bde85bfb4ee9-plugins-conf" (OuterVolumeSpecName: "plugins-conf") pod "fb920bf7-eae5-4f7a-9af7-bde85bfb4ee9" (UID: "fb920bf7-eae5-4f7a-9af7-bde85bfb4ee9"). InnerVolumeSpecName "plugins-conf". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 11 01:19:57 crc kubenswrapper[4744]: I0311 01:19:57.177542 4744 reconciler_common.go:293] "Volume detached for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/fb920bf7-eae5-4f7a-9af7-bde85bfb4ee9-plugins-conf\") on node \"crc\" DevicePath \"\"" Mar 11 01:19:57 crc kubenswrapper[4744]: I0311 01:19:57.178806 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fb920bf7-eae5-4f7a-9af7-bde85bfb4ee9-rabbitmq-plugins" (OuterVolumeSpecName: "rabbitmq-plugins") pod "fb920bf7-eae5-4f7a-9af7-bde85bfb4ee9" (UID: "fb920bf7-eae5-4f7a-9af7-bde85bfb4ee9"). InnerVolumeSpecName "rabbitmq-plugins". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 11 01:19:57 crc kubenswrapper[4744]: I0311 01:19:57.181141 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage03-crc" (OuterVolumeSpecName: "persistence") pod "fb920bf7-eae5-4f7a-9af7-bde85bfb4ee9" (UID: "fb920bf7-eae5-4f7a-9af7-bde85bfb4ee9"). InnerVolumeSpecName "local-storage03-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Mar 11 01:19:57 crc kubenswrapper[4744]: I0311 01:19:57.184311 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fb920bf7-eae5-4f7a-9af7-bde85bfb4ee9-rabbitmq-erlang-cookie" (OuterVolumeSpecName: "rabbitmq-erlang-cookie") pod "fb920bf7-eae5-4f7a-9af7-bde85bfb4ee9" (UID: "fb920bf7-eae5-4f7a-9af7-bde85bfb4ee9"). InnerVolumeSpecName "rabbitmq-erlang-cookie". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 11 01:19:57 crc kubenswrapper[4744]: I0311 01:19:57.184492 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fb920bf7-eae5-4f7a-9af7-bde85bfb4ee9-kube-api-access-nl7w2" (OuterVolumeSpecName: "kube-api-access-nl7w2") pod "fb920bf7-eae5-4f7a-9af7-bde85bfb4ee9" (UID: "fb920bf7-eae5-4f7a-9af7-bde85bfb4ee9"). InnerVolumeSpecName "kube-api-access-nl7w2". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 01:19:57 crc kubenswrapper[4744]: I0311 01:19:57.190244 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fb920bf7-eae5-4f7a-9af7-bde85bfb4ee9-erlang-cookie-secret" (OuterVolumeSpecName: "erlang-cookie-secret") pod "fb920bf7-eae5-4f7a-9af7-bde85bfb4ee9" (UID: "fb920bf7-eae5-4f7a-9af7-bde85bfb4ee9"). InnerVolumeSpecName "erlang-cookie-secret". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 01:19:57 crc kubenswrapper[4744]: I0311 01:19:57.211502 4744 scope.go:117] "RemoveContainer" containerID="6c5c4ca7e44bce16a98eebd9c3a855e4b799a3c001bf897d0cecf3bf6bb06fdc" Mar 11 01:19:57 crc kubenswrapper[4744]: I0311 01:19:57.235467 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fb920bf7-eae5-4f7a-9af7-bde85bfb4ee9-config-data" (OuterVolumeSpecName: "config-data") pod "fb920bf7-eae5-4f7a-9af7-bde85bfb4ee9" (UID: "fb920bf7-eae5-4f7a-9af7-bde85bfb4ee9"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 11 01:19:57 crc kubenswrapper[4744]: I0311 01:19:57.239929 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/downward-api/fb920bf7-eae5-4f7a-9af7-bde85bfb4ee9-pod-info" (OuterVolumeSpecName: "pod-info") pod "fb920bf7-eae5-4f7a-9af7-bde85bfb4ee9" (UID: "fb920bf7-eae5-4f7a-9af7-bde85bfb4ee9"). InnerVolumeSpecName "pod-info". 
PluginName "kubernetes.io/downward-api", VolumeGidValue "" Mar 11 01:19:57 crc kubenswrapper[4744]: I0311 01:19:57.240240 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fb920bf7-eae5-4f7a-9af7-bde85bfb4ee9-rabbitmq-tls" (OuterVolumeSpecName: "rabbitmq-tls") pod "fb920bf7-eae5-4f7a-9af7-bde85bfb4ee9" (UID: "fb920bf7-eae5-4f7a-9af7-bde85bfb4ee9"). InnerVolumeSpecName "rabbitmq-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 01:19:57 crc kubenswrapper[4744]: I0311 01:19:57.255176 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fb920bf7-eae5-4f7a-9af7-bde85bfb4ee9-server-conf" (OuterVolumeSpecName: "server-conf") pod "fb920bf7-eae5-4f7a-9af7-bde85bfb4ee9" (UID: "fb920bf7-eae5-4f7a-9af7-bde85bfb4ee9"). InnerVolumeSpecName "server-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 11 01:19:57 crc kubenswrapper[4744]: I0311 01:19:57.278824 4744 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") on node \"crc\" " Mar 11 01:19:57 crc kubenswrapper[4744]: I0311 01:19:57.278877 4744 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/fb920bf7-eae5-4f7a-9af7-bde85bfb4ee9-rabbitmq-erlang-cookie\") on node \"crc\" DevicePath \"\"" Mar 11 01:19:57 crc kubenswrapper[4744]: I0311 01:19:57.278892 4744 reconciler_common.go:293] "Volume detached for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/fb920bf7-eae5-4f7a-9af7-bde85bfb4ee9-server-conf\") on node \"crc\" DevicePath \"\"" Mar 11 01:19:57 crc kubenswrapper[4744]: I0311 01:19:57.278901 4744 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/fb920bf7-eae5-4f7a-9af7-bde85bfb4ee9-config-data\") on node \"crc\" DevicePath \"\"" Mar 11 
01:19:57 crc kubenswrapper[4744]: I0311 01:19:57.278910 4744 reconciler_common.go:293] "Volume detached for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/fb920bf7-eae5-4f7a-9af7-bde85bfb4ee9-erlang-cookie-secret\") on node \"crc\" DevicePath \"\"" Mar 11 01:19:57 crc kubenswrapper[4744]: I0311 01:19:57.278919 4744 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/fb920bf7-eae5-4f7a-9af7-bde85bfb4ee9-rabbitmq-tls\") on node \"crc\" DevicePath \"\"" Mar 11 01:19:57 crc kubenswrapper[4744]: I0311 01:19:57.278927 4744 reconciler_common.go:293] "Volume detached for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/fb920bf7-eae5-4f7a-9af7-bde85bfb4ee9-pod-info\") on node \"crc\" DevicePath \"\"" Mar 11 01:19:57 crc kubenswrapper[4744]: I0311 01:19:57.278935 4744 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nl7w2\" (UniqueName: \"kubernetes.io/projected/fb920bf7-eae5-4f7a-9af7-bde85bfb4ee9-kube-api-access-nl7w2\") on node \"crc\" DevicePath \"\"" Mar 11 01:19:57 crc kubenswrapper[4744]: I0311 01:19:57.278944 4744 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/fb920bf7-eae5-4f7a-9af7-bde85bfb4ee9-rabbitmq-plugins\") on node \"crc\" DevicePath \"\"" Mar 11 01:19:57 crc kubenswrapper[4744]: I0311 01:19:57.293319 4744 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage03-crc" (UniqueName: "kubernetes.io/local-volume/local-storage03-crc") on node "crc" Mar 11 01:19:57 crc kubenswrapper[4744]: I0311 01:19:57.299629 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fb920bf7-eae5-4f7a-9af7-bde85bfb4ee9-rabbitmq-confd" (OuterVolumeSpecName: "rabbitmq-confd") pod "fb920bf7-eae5-4f7a-9af7-bde85bfb4ee9" (UID: "fb920bf7-eae5-4f7a-9af7-bde85bfb4ee9"). InnerVolumeSpecName "rabbitmq-confd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 01:19:57 crc kubenswrapper[4744]: I0311 01:19:57.361210 4744 scope.go:117] "RemoveContainer" containerID="7fe48392686fc634dabbfe9990397d2cb61cef30311e93bee85265aa5ef9c32a" Mar 11 01:19:57 crc kubenswrapper[4744]: E0311 01:19:57.362324 4744 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7fe48392686fc634dabbfe9990397d2cb61cef30311e93bee85265aa5ef9c32a\": container with ID starting with 7fe48392686fc634dabbfe9990397d2cb61cef30311e93bee85265aa5ef9c32a not found: ID does not exist" containerID="7fe48392686fc634dabbfe9990397d2cb61cef30311e93bee85265aa5ef9c32a" Mar 11 01:19:57 crc kubenswrapper[4744]: I0311 01:19:57.362381 4744 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7fe48392686fc634dabbfe9990397d2cb61cef30311e93bee85265aa5ef9c32a"} err="failed to get container status \"7fe48392686fc634dabbfe9990397d2cb61cef30311e93bee85265aa5ef9c32a\": rpc error: code = NotFound desc = could not find container \"7fe48392686fc634dabbfe9990397d2cb61cef30311e93bee85265aa5ef9c32a\": container with ID starting with 7fe48392686fc634dabbfe9990397d2cb61cef30311e93bee85265aa5ef9c32a not found: ID does not exist" Mar 11 01:19:57 crc kubenswrapper[4744]: I0311 01:19:57.362412 4744 scope.go:117] "RemoveContainer" containerID="2bedcd4d8553a08b966348929afde0364ddec647eddeed161dba7e9f9765b64f" Mar 11 01:19:57 crc kubenswrapper[4744]: E0311 01:19:57.362833 4744 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2bedcd4d8553a08b966348929afde0364ddec647eddeed161dba7e9f9765b64f\": container with ID starting with 2bedcd4d8553a08b966348929afde0364ddec647eddeed161dba7e9f9765b64f not found: ID does not exist" containerID="2bedcd4d8553a08b966348929afde0364ddec647eddeed161dba7e9f9765b64f" Mar 11 01:19:57 crc kubenswrapper[4744]: I0311 01:19:57.362873 
4744 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2bedcd4d8553a08b966348929afde0364ddec647eddeed161dba7e9f9765b64f"} err="failed to get container status \"2bedcd4d8553a08b966348929afde0364ddec647eddeed161dba7e9f9765b64f\": rpc error: code = NotFound desc = could not find container \"2bedcd4d8553a08b966348929afde0364ddec647eddeed161dba7e9f9765b64f\": container with ID starting with 2bedcd4d8553a08b966348929afde0364ddec647eddeed161dba7e9f9765b64f not found: ID does not exist" Mar 11 01:19:57 crc kubenswrapper[4744]: I0311 01:19:57.362921 4744 scope.go:117] "RemoveContainer" containerID="6c5c4ca7e44bce16a98eebd9c3a855e4b799a3c001bf897d0cecf3bf6bb06fdc" Mar 11 01:19:57 crc kubenswrapper[4744]: E0311 01:19:57.369015 4744 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6c5c4ca7e44bce16a98eebd9c3a855e4b799a3c001bf897d0cecf3bf6bb06fdc\": container with ID starting with 6c5c4ca7e44bce16a98eebd9c3a855e4b799a3c001bf897d0cecf3bf6bb06fdc not found: ID does not exist" containerID="6c5c4ca7e44bce16a98eebd9c3a855e4b799a3c001bf897d0cecf3bf6bb06fdc" Mar 11 01:19:57 crc kubenswrapper[4744]: I0311 01:19:57.369073 4744 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6c5c4ca7e44bce16a98eebd9c3a855e4b799a3c001bf897d0cecf3bf6bb06fdc"} err="failed to get container status \"6c5c4ca7e44bce16a98eebd9c3a855e4b799a3c001bf897d0cecf3bf6bb06fdc\": rpc error: code = NotFound desc = could not find container \"6c5c4ca7e44bce16a98eebd9c3a855e4b799a3c001bf897d0cecf3bf6bb06fdc\": container with ID starting with 6c5c4ca7e44bce16a98eebd9c3a855e4b799a3c001bf897d0cecf3bf6bb06fdc not found: ID does not exist" Mar 11 01:19:57 crc kubenswrapper[4744]: I0311 01:19:57.380735 4744 reconciler_common.go:293] "Volume detached for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") on node \"crc\" 
DevicePath \"\"" Mar 11 01:19:57 crc kubenswrapper[4744]: I0311 01:19:57.380853 4744 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/fb920bf7-eae5-4f7a-9af7-bde85bfb4ee9-rabbitmq-confd\") on node \"crc\" DevicePath \"\"" Mar 11 01:19:57 crc kubenswrapper[4744]: I0311 01:19:57.385295 4744 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-galera-0" Mar 11 01:19:57 crc kubenswrapper[4744]: I0311 01:19:57.391443 4744 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Mar 11 01:19:57 crc kubenswrapper[4744]: I0311 01:19:57.482097 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6b383e05-0440-49ee-8add-708ea04e9ce7-operator-scripts\") pod \"6b383e05-0440-49ee-8add-708ea04e9ce7\" (UID: \"6b383e05-0440-49ee-8add-708ea04e9ce7\") " Mar 11 01:19:57 crc kubenswrapper[4744]: I0311 01:19:57.482142 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/714c91e5-04c5-4f95-97e3-a3c08664944d-rabbitmq-confd\") pod \"714c91e5-04c5-4f95-97e3-a3c08664944d\" (UID: \"714c91e5-04c5-4f95-97e3-a3c08664944d\") " Mar 11 01:19:57 crc kubenswrapper[4744]: I0311 01:19:57.482166 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mysql-db\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"6b383e05-0440-49ee-8add-708ea04e9ce7\" (UID: \"6b383e05-0440-49ee-8add-708ea04e9ce7\") " Mar 11 01:19:57 crc kubenswrapper[4744]: I0311 01:19:57.482192 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zkczr\" (UniqueName: \"kubernetes.io/projected/714c91e5-04c5-4f95-97e3-a3c08664944d-kube-api-access-zkczr\") pod \"714c91e5-04c5-4f95-97e3-a3c08664944d\" 
(UID: \"714c91e5-04c5-4f95-97e3-a3c08664944d\") " Mar 11 01:19:57 crc kubenswrapper[4744]: I0311 01:19:57.482217 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/6b383e05-0440-49ee-8add-708ea04e9ce7-galera-tls-certs\") pod \"6b383e05-0440-49ee-8add-708ea04e9ce7\" (UID: \"6b383e05-0440-49ee-8add-708ea04e9ce7\") " Mar 11 01:19:57 crc kubenswrapper[4744]: I0311 01:19:57.482242 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rxgt6\" (UniqueName: \"kubernetes.io/projected/6b383e05-0440-49ee-8add-708ea04e9ce7-kube-api-access-rxgt6\") pod \"6b383e05-0440-49ee-8add-708ea04e9ce7\" (UID: \"6b383e05-0440-49ee-8add-708ea04e9ce7\") " Mar 11 01:19:57 crc kubenswrapper[4744]: I0311 01:19:57.482260 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/714c91e5-04c5-4f95-97e3-a3c08664944d-rabbitmq-tls\") pod \"714c91e5-04c5-4f95-97e3-a3c08664944d\" (UID: \"714c91e5-04c5-4f95-97e3-a3c08664944d\") " Mar 11 01:19:57 crc kubenswrapper[4744]: I0311 01:19:57.482278 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/714c91e5-04c5-4f95-97e3-a3c08664944d-erlang-cookie-secret\") pod \"714c91e5-04c5-4f95-97e3-a3c08664944d\" (UID: \"714c91e5-04c5-4f95-97e3-a3c08664944d\") " Mar 11 01:19:57 crc kubenswrapper[4744]: I0311 01:19:57.482298 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/6b383e05-0440-49ee-8add-708ea04e9ce7-kolla-config\") pod \"6b383e05-0440-49ee-8add-708ea04e9ce7\" (UID: \"6b383e05-0440-49ee-8add-708ea04e9ce7\") " Mar 11 01:19:57 crc kubenswrapper[4744]: I0311 01:19:57.482344 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" 
(UniqueName: \"kubernetes.io/configmap/714c91e5-04c5-4f95-97e3-a3c08664944d-config-data\") pod \"714c91e5-04c5-4f95-97e3-a3c08664944d\" (UID: \"714c91e5-04c5-4f95-97e3-a3c08664944d\") " Mar 11 01:19:57 crc kubenswrapper[4744]: I0311 01:19:57.482377 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"persistence\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"714c91e5-04c5-4f95-97e3-a3c08664944d\" (UID: \"714c91e5-04c5-4f95-97e3-a3c08664944d\") " Mar 11 01:19:57 crc kubenswrapper[4744]: I0311 01:19:57.482394 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/714c91e5-04c5-4f95-97e3-a3c08664944d-server-conf\") pod \"714c91e5-04c5-4f95-97e3-a3c08664944d\" (UID: \"714c91e5-04c5-4f95-97e3-a3c08664944d\") " Mar 11 01:19:57 crc kubenswrapper[4744]: I0311 01:19:57.482449 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/714c91e5-04c5-4f95-97e3-a3c08664944d-rabbitmq-erlang-cookie\") pod \"714c91e5-04c5-4f95-97e3-a3c08664944d\" (UID: \"714c91e5-04c5-4f95-97e3-a3c08664944d\") " Mar 11 01:19:57 crc kubenswrapper[4744]: I0311 01:19:57.482472 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/6b383e05-0440-49ee-8add-708ea04e9ce7-config-data-default\") pod \"6b383e05-0440-49ee-8add-708ea04e9ce7\" (UID: \"6b383e05-0440-49ee-8add-708ea04e9ce7\") " Mar 11 01:19:57 crc kubenswrapper[4744]: I0311 01:19:57.482495 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/714c91e5-04c5-4f95-97e3-a3c08664944d-pod-info\") pod \"714c91e5-04c5-4f95-97e3-a3c08664944d\" (UID: \"714c91e5-04c5-4f95-97e3-a3c08664944d\") " Mar 11 01:19:57 crc kubenswrapper[4744]: I0311 01:19:57.482523 
4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/714c91e5-04c5-4f95-97e3-a3c08664944d-plugins-conf\") pod \"714c91e5-04c5-4f95-97e3-a3c08664944d\" (UID: \"714c91e5-04c5-4f95-97e3-a3c08664944d\") " Mar 11 01:19:57 crc kubenswrapper[4744]: I0311 01:19:57.482544 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6b383e05-0440-49ee-8add-708ea04e9ce7-combined-ca-bundle\") pod \"6b383e05-0440-49ee-8add-708ea04e9ce7\" (UID: \"6b383e05-0440-49ee-8add-708ea04e9ce7\") " Mar 11 01:19:57 crc kubenswrapper[4744]: I0311 01:19:57.482564 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/6b383e05-0440-49ee-8add-708ea04e9ce7-config-data-generated\") pod \"6b383e05-0440-49ee-8add-708ea04e9ce7\" (UID: \"6b383e05-0440-49ee-8add-708ea04e9ce7\") " Mar 11 01:19:57 crc kubenswrapper[4744]: I0311 01:19:57.482601 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/714c91e5-04c5-4f95-97e3-a3c08664944d-rabbitmq-plugins\") pod \"714c91e5-04c5-4f95-97e3-a3c08664944d\" (UID: \"714c91e5-04c5-4f95-97e3-a3c08664944d\") " Mar 11 01:19:57 crc kubenswrapper[4744]: I0311 01:19:57.483149 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/714c91e5-04c5-4f95-97e3-a3c08664944d-rabbitmq-plugins" (OuterVolumeSpecName: "rabbitmq-plugins") pod "714c91e5-04c5-4f95-97e3-a3c08664944d" (UID: "714c91e5-04c5-4f95-97e3-a3c08664944d"). InnerVolumeSpecName "rabbitmq-plugins". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 11 01:19:57 crc kubenswrapper[4744]: I0311 01:19:57.483537 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6b383e05-0440-49ee-8add-708ea04e9ce7-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "6b383e05-0440-49ee-8add-708ea04e9ce7" (UID: "6b383e05-0440-49ee-8add-708ea04e9ce7"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 11 01:19:57 crc kubenswrapper[4744]: I0311 01:19:57.485334 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/714c91e5-04c5-4f95-97e3-a3c08664944d-rabbitmq-erlang-cookie" (OuterVolumeSpecName: "rabbitmq-erlang-cookie") pod "714c91e5-04c5-4f95-97e3-a3c08664944d" (UID: "714c91e5-04c5-4f95-97e3-a3c08664944d"). InnerVolumeSpecName "rabbitmq-erlang-cookie". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 11 01:19:57 crc kubenswrapper[4744]: I0311 01:19:57.485759 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6b383e05-0440-49ee-8add-708ea04e9ce7-config-data-generated" (OuterVolumeSpecName: "config-data-generated") pod "6b383e05-0440-49ee-8add-708ea04e9ce7" (UID: "6b383e05-0440-49ee-8add-708ea04e9ce7"). InnerVolumeSpecName "config-data-generated". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 11 01:19:57 crc kubenswrapper[4744]: I0311 01:19:57.486144 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6b383e05-0440-49ee-8add-708ea04e9ce7-kolla-config" (OuterVolumeSpecName: "kolla-config") pod "6b383e05-0440-49ee-8add-708ea04e9ce7" (UID: "6b383e05-0440-49ee-8add-708ea04e9ce7"). InnerVolumeSpecName "kolla-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 11 01:19:57 crc kubenswrapper[4744]: I0311 01:19:57.486582 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/714c91e5-04c5-4f95-97e3-a3c08664944d-plugins-conf" (OuterVolumeSpecName: "plugins-conf") pod "714c91e5-04c5-4f95-97e3-a3c08664944d" (UID: "714c91e5-04c5-4f95-97e3-a3c08664944d"). InnerVolumeSpecName "plugins-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 11 01:19:57 crc kubenswrapper[4744]: I0311 01:19:57.487670 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6b383e05-0440-49ee-8add-708ea04e9ce7-config-data-default" (OuterVolumeSpecName: "config-data-default") pod "6b383e05-0440-49ee-8add-708ea04e9ce7" (UID: "6b383e05-0440-49ee-8add-708ea04e9ce7"). InnerVolumeSpecName "config-data-default". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 11 01:19:57 crc kubenswrapper[4744]: I0311 01:19:57.488809 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage05-crc" (OuterVolumeSpecName: "persistence") pod "714c91e5-04c5-4f95-97e3-a3c08664944d" (UID: "714c91e5-04c5-4f95-97e3-a3c08664944d"). InnerVolumeSpecName "local-storage05-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Mar 11 01:19:57 crc kubenswrapper[4744]: I0311 01:19:57.489639 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/714c91e5-04c5-4f95-97e3-a3c08664944d-erlang-cookie-secret" (OuterVolumeSpecName: "erlang-cookie-secret") pod "714c91e5-04c5-4f95-97e3-a3c08664944d" (UID: "714c91e5-04c5-4f95-97e3-a3c08664944d"). InnerVolumeSpecName "erlang-cookie-secret". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 01:19:57 crc kubenswrapper[4744]: I0311 01:19:57.490566 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/714c91e5-04c5-4f95-97e3-a3c08664944d-rabbitmq-tls" (OuterVolumeSpecName: "rabbitmq-tls") pod "714c91e5-04c5-4f95-97e3-a3c08664944d" (UID: "714c91e5-04c5-4f95-97e3-a3c08664944d"). InnerVolumeSpecName "rabbitmq-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 01:19:57 crc kubenswrapper[4744]: I0311 01:19:57.498748 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6b383e05-0440-49ee-8add-708ea04e9ce7-kube-api-access-rxgt6" (OuterVolumeSpecName: "kube-api-access-rxgt6") pod "6b383e05-0440-49ee-8add-708ea04e9ce7" (UID: "6b383e05-0440-49ee-8add-708ea04e9ce7"). InnerVolumeSpecName "kube-api-access-rxgt6". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 01:19:57 crc kubenswrapper[4744]: I0311 01:19:57.502549 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/714c91e5-04c5-4f95-97e3-a3c08664944d-kube-api-access-zkczr" (OuterVolumeSpecName: "kube-api-access-zkczr") pod "714c91e5-04c5-4f95-97e3-a3c08664944d" (UID: "714c91e5-04c5-4f95-97e3-a3c08664944d"). InnerVolumeSpecName "kube-api-access-zkczr". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 01:19:57 crc kubenswrapper[4744]: I0311 01:19:57.517406 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage06-crc" (OuterVolumeSpecName: "mysql-db") pod "6b383e05-0440-49ee-8add-708ea04e9ce7" (UID: "6b383e05-0440-49ee-8add-708ea04e9ce7"). InnerVolumeSpecName "local-storage06-crc". 
PluginName "kubernetes.io/local-volume", VolumeGidValue "" Mar 11 01:19:57 crc kubenswrapper[4744]: I0311 01:19:57.519158 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/downward-api/714c91e5-04c5-4f95-97e3-a3c08664944d-pod-info" (OuterVolumeSpecName: "pod-info") pod "714c91e5-04c5-4f95-97e3-a3c08664944d" (UID: "714c91e5-04c5-4f95-97e3-a3c08664944d"). InnerVolumeSpecName "pod-info". PluginName "kubernetes.io/downward-api", VolumeGidValue "" Mar 11 01:19:57 crc kubenswrapper[4744]: I0311 01:19:57.543815 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6b383e05-0440-49ee-8add-708ea04e9ce7-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "6b383e05-0440-49ee-8add-708ea04e9ce7" (UID: "6b383e05-0440-49ee-8add-708ea04e9ce7"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 01:19:57 crc kubenswrapper[4744]: I0311 01:19:57.552314 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/714c91e5-04c5-4f95-97e3-a3c08664944d-config-data" (OuterVolumeSpecName: "config-data") pod "714c91e5-04c5-4f95-97e3-a3c08664944d" (UID: "714c91e5-04c5-4f95-97e3-a3c08664944d"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 11 01:19:57 crc kubenswrapper[4744]: I0311 01:19:57.570760 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6b383e05-0440-49ee-8add-708ea04e9ce7-galera-tls-certs" (OuterVolumeSpecName: "galera-tls-certs") pod "6b383e05-0440-49ee-8add-708ea04e9ce7" (UID: "6b383e05-0440-49ee-8add-708ea04e9ce7"). InnerVolumeSpecName "galera-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 01:19:57 crc kubenswrapper[4744]: I0311 01:19:57.573212 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/714c91e5-04c5-4f95-97e3-a3c08664944d-server-conf" (OuterVolumeSpecName: "server-conf") pod "714c91e5-04c5-4f95-97e3-a3c08664944d" (UID: "714c91e5-04c5-4f95-97e3-a3c08664944d"). InnerVolumeSpecName "server-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 11 01:19:57 crc kubenswrapper[4744]: I0311 01:19:57.584075 4744 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") on node \"crc\" " Mar 11 01:19:57 crc kubenswrapper[4744]: I0311 01:19:57.584101 4744 reconciler_common.go:293] "Volume detached for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/714c91e5-04c5-4f95-97e3-a3c08664944d-server-conf\") on node \"crc\" DevicePath \"\"" Mar 11 01:19:57 crc kubenswrapper[4744]: I0311 01:19:57.584112 4744 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/714c91e5-04c5-4f95-97e3-a3c08664944d-rabbitmq-erlang-cookie\") on node \"crc\" DevicePath \"\"" Mar 11 01:19:57 crc kubenswrapper[4744]: I0311 01:19:57.584124 4744 reconciler_common.go:293] "Volume detached for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/6b383e05-0440-49ee-8add-708ea04e9ce7-config-data-default\") on node \"crc\" DevicePath \"\"" Mar 11 01:19:57 crc kubenswrapper[4744]: I0311 01:19:57.584132 4744 reconciler_common.go:293] "Volume detached for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/714c91e5-04c5-4f95-97e3-a3c08664944d-pod-info\") on node \"crc\" DevicePath \"\"" Mar 11 01:19:57 crc kubenswrapper[4744]: I0311 01:19:57.584141 4744 reconciler_common.go:293] "Volume detached for volume \"plugins-conf\" (UniqueName: 
\"kubernetes.io/configmap/714c91e5-04c5-4f95-97e3-a3c08664944d-plugins-conf\") on node \"crc\" DevicePath \"\"" Mar 11 01:19:57 crc kubenswrapper[4744]: I0311 01:19:57.584151 4744 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6b383e05-0440-49ee-8add-708ea04e9ce7-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 11 01:19:57 crc kubenswrapper[4744]: I0311 01:19:57.584159 4744 reconciler_common.go:293] "Volume detached for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/6b383e05-0440-49ee-8add-708ea04e9ce7-config-data-generated\") on node \"crc\" DevicePath \"\"" Mar 11 01:19:57 crc kubenswrapper[4744]: I0311 01:19:57.584168 4744 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/714c91e5-04c5-4f95-97e3-a3c08664944d-rabbitmq-plugins\") on node \"crc\" DevicePath \"\"" Mar 11 01:19:57 crc kubenswrapper[4744]: I0311 01:19:57.584176 4744 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6b383e05-0440-49ee-8add-708ea04e9ce7-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 11 01:19:57 crc kubenswrapper[4744]: I0311 01:19:57.584190 4744 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") on node \"crc\" " Mar 11 01:19:57 crc kubenswrapper[4744]: I0311 01:19:57.584199 4744 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zkczr\" (UniqueName: \"kubernetes.io/projected/714c91e5-04c5-4f95-97e3-a3c08664944d-kube-api-access-zkczr\") on node \"crc\" DevicePath \"\"" Mar 11 01:19:57 crc kubenswrapper[4744]: I0311 01:19:57.584207 4744 reconciler_common.go:293] "Volume detached for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/6b383e05-0440-49ee-8add-708ea04e9ce7-galera-tls-certs\") on node \"crc\" 
DevicePath \"\"" Mar 11 01:19:57 crc kubenswrapper[4744]: I0311 01:19:57.584216 4744 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rxgt6\" (UniqueName: \"kubernetes.io/projected/6b383e05-0440-49ee-8add-708ea04e9ce7-kube-api-access-rxgt6\") on node \"crc\" DevicePath \"\"" Mar 11 01:19:57 crc kubenswrapper[4744]: I0311 01:19:57.584225 4744 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/714c91e5-04c5-4f95-97e3-a3c08664944d-rabbitmq-tls\") on node \"crc\" DevicePath \"\"" Mar 11 01:19:57 crc kubenswrapper[4744]: I0311 01:19:57.584234 4744 reconciler_common.go:293] "Volume detached for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/714c91e5-04c5-4f95-97e3-a3c08664944d-erlang-cookie-secret\") on node \"crc\" DevicePath \"\"" Mar 11 01:19:57 crc kubenswrapper[4744]: I0311 01:19:57.584242 4744 reconciler_common.go:293] "Volume detached for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/6b383e05-0440-49ee-8add-708ea04e9ce7-kolla-config\") on node \"crc\" DevicePath \"\"" Mar 11 01:19:57 crc kubenswrapper[4744]: I0311 01:19:57.584250 4744 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/714c91e5-04c5-4f95-97e3-a3c08664944d-config-data\") on node \"crc\" DevicePath \"\"" Mar 11 01:19:57 crc kubenswrapper[4744]: I0311 01:19:57.600488 4744 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage05-crc" (UniqueName: "kubernetes.io/local-volume/local-storage05-crc") on node "crc" Mar 11 01:19:57 crc kubenswrapper[4744]: I0311 01:19:57.600785 4744 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage06-crc" (UniqueName: "kubernetes.io/local-volume/local-storage06-crc") on node "crc" Mar 11 01:19:57 crc kubenswrapper[4744]: I0311 01:19:57.620591 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/projected/714c91e5-04c5-4f95-97e3-a3c08664944d-rabbitmq-confd" (OuterVolumeSpecName: "rabbitmq-confd") pod "714c91e5-04c5-4f95-97e3-a3c08664944d" (UID: "714c91e5-04c5-4f95-97e3-a3c08664944d"). InnerVolumeSpecName "rabbitmq-confd". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 01:19:57 crc kubenswrapper[4744]: I0311 01:19:57.685395 4744 reconciler_common.go:293] "Volume detached for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") on node \"crc\" DevicePath \"\"" Mar 11 01:19:57 crc kubenswrapper[4744]: I0311 01:19:57.685438 4744 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/714c91e5-04c5-4f95-97e3-a3c08664944d-rabbitmq-confd\") on node \"crc\" DevicePath \"\"" Mar 11 01:19:57 crc kubenswrapper[4744]: I0311 01:19:57.685452 4744 reconciler_common.go:293] "Volume detached for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") on node \"crc\" DevicePath \"\"" Mar 11 01:19:57 crc kubenswrapper[4744]: I0311 01:19:57.733211 4744 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-7cc5cb746-kmb4g" Mar 11 01:19:57 crc kubenswrapper[4744]: I0311 01:19:57.786804 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2ee4fd0f-1e90-4771-bca2-2eb17df0b0b1-scripts\") pod \"2ee4fd0f-1e90-4771-bca2-2eb17df0b0b1\" (UID: \"2ee4fd0f-1e90-4771-bca2-2eb17df0b0b1\") " Mar 11 01:19:57 crc kubenswrapper[4744]: I0311 01:19:57.786873 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2ee4fd0f-1e90-4771-bca2-2eb17df0b0b1-config-data\") pod \"2ee4fd0f-1e90-4771-bca2-2eb17df0b0b1\" (UID: \"2ee4fd0f-1e90-4771-bca2-2eb17df0b0b1\") " Mar 11 01:19:57 crc kubenswrapper[4744]: I0311 01:19:57.786919 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2ee4fd0f-1e90-4771-bca2-2eb17df0b0b1-combined-ca-bundle\") pod \"2ee4fd0f-1e90-4771-bca2-2eb17df0b0b1\" (UID: \"2ee4fd0f-1e90-4771-bca2-2eb17df0b0b1\") " Mar 11 01:19:57 crc kubenswrapper[4744]: I0311 01:19:57.786941 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/2ee4fd0f-1e90-4771-bca2-2eb17df0b0b1-fernet-keys\") pod \"2ee4fd0f-1e90-4771-bca2-2eb17df0b0b1\" (UID: \"2ee4fd0f-1e90-4771-bca2-2eb17df0b0b1\") " Mar 11 01:19:57 crc kubenswrapper[4744]: I0311 01:19:57.787011 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/2ee4fd0f-1e90-4771-bca2-2eb17df0b0b1-internal-tls-certs\") pod \"2ee4fd0f-1e90-4771-bca2-2eb17df0b0b1\" (UID: \"2ee4fd0f-1e90-4771-bca2-2eb17df0b0b1\") " Mar 11 01:19:57 crc kubenswrapper[4744]: I0311 01:19:57.787027 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: 
\"kubernetes.io/secret/2ee4fd0f-1e90-4771-bca2-2eb17df0b0b1-credential-keys\") pod \"2ee4fd0f-1e90-4771-bca2-2eb17df0b0b1\" (UID: \"2ee4fd0f-1e90-4771-bca2-2eb17df0b0b1\") " Mar 11 01:19:57 crc kubenswrapper[4744]: I0311 01:19:57.787052 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-75lzm\" (UniqueName: \"kubernetes.io/projected/2ee4fd0f-1e90-4771-bca2-2eb17df0b0b1-kube-api-access-75lzm\") pod \"2ee4fd0f-1e90-4771-bca2-2eb17df0b0b1\" (UID: \"2ee4fd0f-1e90-4771-bca2-2eb17df0b0b1\") " Mar 11 01:19:57 crc kubenswrapper[4744]: I0311 01:19:57.787069 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/2ee4fd0f-1e90-4771-bca2-2eb17df0b0b1-public-tls-certs\") pod \"2ee4fd0f-1e90-4771-bca2-2eb17df0b0b1\" (UID: \"2ee4fd0f-1e90-4771-bca2-2eb17df0b0b1\") " Mar 11 01:19:57 crc kubenswrapper[4744]: I0311 01:19:57.791007 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2ee4fd0f-1e90-4771-bca2-2eb17df0b0b1-scripts" (OuterVolumeSpecName: "scripts") pod "2ee4fd0f-1e90-4771-bca2-2eb17df0b0b1" (UID: "2ee4fd0f-1e90-4771-bca2-2eb17df0b0b1"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 01:19:57 crc kubenswrapper[4744]: I0311 01:19:57.793003 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2ee4fd0f-1e90-4771-bca2-2eb17df0b0b1-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "2ee4fd0f-1e90-4771-bca2-2eb17df0b0b1" (UID: "2ee4fd0f-1e90-4771-bca2-2eb17df0b0b1"). InnerVolumeSpecName "fernet-keys". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 01:19:57 crc kubenswrapper[4744]: I0311 01:19:57.796679 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2ee4fd0f-1e90-4771-bca2-2eb17df0b0b1-kube-api-access-75lzm" (OuterVolumeSpecName: "kube-api-access-75lzm") pod "2ee4fd0f-1e90-4771-bca2-2eb17df0b0b1" (UID: "2ee4fd0f-1e90-4771-bca2-2eb17df0b0b1"). InnerVolumeSpecName "kube-api-access-75lzm". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 01:19:57 crc kubenswrapper[4744]: I0311 01:19:57.803946 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2ee4fd0f-1e90-4771-bca2-2eb17df0b0b1-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "2ee4fd0f-1e90-4771-bca2-2eb17df0b0b1" (UID: "2ee4fd0f-1e90-4771-bca2-2eb17df0b0b1"). InnerVolumeSpecName "credential-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 01:19:57 crc kubenswrapper[4744]: I0311 01:19:57.807647 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2ee4fd0f-1e90-4771-bca2-2eb17df0b0b1-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "2ee4fd0f-1e90-4771-bca2-2eb17df0b0b1" (UID: "2ee4fd0f-1e90-4771-bca2-2eb17df0b0b1"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 01:19:57 crc kubenswrapper[4744]: I0311 01:19:57.818583 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2ee4fd0f-1e90-4771-bca2-2eb17df0b0b1-config-data" (OuterVolumeSpecName: "config-data") pod "2ee4fd0f-1e90-4771-bca2-2eb17df0b0b1" (UID: "2ee4fd0f-1e90-4771-bca2-2eb17df0b0b1"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 01:19:57 crc kubenswrapper[4744]: I0311 01:19:57.831353 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2ee4fd0f-1e90-4771-bca2-2eb17df0b0b1-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "2ee4fd0f-1e90-4771-bca2-2eb17df0b0b1" (UID: "2ee4fd0f-1e90-4771-bca2-2eb17df0b0b1"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 01:19:57 crc kubenswrapper[4744]: I0311 01:19:57.845999 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2ee4fd0f-1e90-4771-bca2-2eb17df0b0b1-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "2ee4fd0f-1e90-4771-bca2-2eb17df0b0b1" (UID: "2ee4fd0f-1e90-4771-bca2-2eb17df0b0b1"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 01:19:57 crc kubenswrapper[4744]: I0311 01:19:57.888431 4744 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2ee4fd0f-1e90-4771-bca2-2eb17df0b0b1-config-data\") on node \"crc\" DevicePath \"\"" Mar 11 01:19:57 crc kubenswrapper[4744]: I0311 01:19:57.888467 4744 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2ee4fd0f-1e90-4771-bca2-2eb17df0b0b1-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 11 01:19:57 crc kubenswrapper[4744]: I0311 01:19:57.888477 4744 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/2ee4fd0f-1e90-4771-bca2-2eb17df0b0b1-fernet-keys\") on node \"crc\" DevicePath \"\"" Mar 11 01:19:57 crc kubenswrapper[4744]: I0311 01:19:57.888486 4744 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/2ee4fd0f-1e90-4771-bca2-2eb17df0b0b1-internal-tls-certs\") on node \"crc\" 
DevicePath \"\"" Mar 11 01:19:57 crc kubenswrapper[4744]: I0311 01:19:57.888494 4744 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/2ee4fd0f-1e90-4771-bca2-2eb17df0b0b1-credential-keys\") on node \"crc\" DevicePath \"\"" Mar 11 01:19:57 crc kubenswrapper[4744]: I0311 01:19:57.888503 4744 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-75lzm\" (UniqueName: \"kubernetes.io/projected/2ee4fd0f-1e90-4771-bca2-2eb17df0b0b1-kube-api-access-75lzm\") on node \"crc\" DevicePath \"\"" Mar 11 01:19:57 crc kubenswrapper[4744]: I0311 01:19:57.888527 4744 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/2ee4fd0f-1e90-4771-bca2-2eb17df0b0b1-public-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 11 01:19:57 crc kubenswrapper[4744]: I0311 01:19:57.888535 4744 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2ee4fd0f-1e90-4771-bca2-2eb17df0b0b1-scripts\") on node \"crc\" DevicePath \"\"" Mar 11 01:19:57 crc kubenswrapper[4744]: I0311 01:19:57.982634 4744 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="03b55bf1-d3fe-4ba9-bc5b-3ac4cf87e4f8" path="/var/lib/kubelet/pods/03b55bf1-d3fe-4ba9-bc5b-3ac4cf87e4f8/volumes" Mar 11 01:19:57 crc kubenswrapper[4744]: I0311 01:19:57.983244 4744 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="44461324-fa82-4476-a621-c560a3c89e0f" path="/var/lib/kubelet/pods/44461324-fa82-4476-a621-c560a3c89e0f/volumes" Mar 11 01:19:57 crc kubenswrapper[4744]: I0311 01:19:57.983843 4744 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4f48c5b8-9cda-4c2c-9244-9bb71b9dc05f" path="/var/lib/kubelet/pods/4f48c5b8-9cda-4c2c-9244-9bb71b9dc05f/volumes" Mar 11 01:19:57 crc kubenswrapper[4744]: I0311 01:19:57.984839 4744 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="4fb8af9e-ef1e-45b0-b842-2647fe75510e" path="/var/lib/kubelet/pods/4fb8af9e-ef1e-45b0-b842-2647fe75510e/volumes" Mar 11 01:19:57 crc kubenswrapper[4744]: I0311 01:19:57.985419 4744 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="730c901d-c3c5-46c5-b618-00cdcc17bef2" path="/var/lib/kubelet/pods/730c901d-c3c5-46c5-b618-00cdcc17bef2/volumes" Mar 11 01:19:57 crc kubenswrapper[4744]: I0311 01:19:57.986262 4744 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7f3aa5cc-eae2-4d60-96cf-6d847a5599ec" path="/var/lib/kubelet/pods/7f3aa5cc-eae2-4d60-96cf-6d847a5599ec/volumes" Mar 11 01:19:57 crc kubenswrapper[4744]: I0311 01:19:57.986942 4744 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8df98dbd-473b-4630-81ab-edd6419feb0d" path="/var/lib/kubelet/pods/8df98dbd-473b-4630-81ab-edd6419feb0d/volumes" Mar 11 01:19:57 crc kubenswrapper[4744]: I0311 01:19:57.987439 4744 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c8caed76-baba-4ad3-b95a-e428132f2021" path="/var/lib/kubelet/pods/c8caed76-baba-4ad3-b95a-e428132f2021/volumes" Mar 11 01:19:58 crc kubenswrapper[4744]: I0311 01:19:58.160603 4744 generic.go:334] "Generic (PLEG): container finished" podID="b62ac51a-a222-4e7b-b465-9e71c3d34b1f" containerID="fd3a94b5310ccf7a9c7dd7207db3d75ffedf02fcd045db30b5914808a12e1cc6" exitCode=0 Mar 11 01:19:58 crc kubenswrapper[4744]: I0311 01:19:58.160865 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"b62ac51a-a222-4e7b-b465-9e71c3d34b1f","Type":"ContainerDied","Data":"fd3a94b5310ccf7a9c7dd7207db3d75ffedf02fcd045db30b5914808a12e1cc6"} Mar 11 01:19:58 crc kubenswrapper[4744]: I0311 01:19:58.170536 4744 generic.go:334] "Generic (PLEG): container finished" podID="33c0b5dd-192a-4fd2-bbaa-b483399724df" containerID="1252cd0d658ee3e2e06be9771503f0ab0664f8d814b22e653a029a2d6d6716c4" exitCode=0 Mar 11 01:19:58 crc kubenswrapper[4744]: I0311 01:19:58.170585 4744 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"33c0b5dd-192a-4fd2-bbaa-b483399724df","Type":"ContainerDied","Data":"1252cd0d658ee3e2e06be9771503f0ab0664f8d814b22e653a029a2d6d6716c4"} Mar 11 01:19:58 crc kubenswrapper[4744]: I0311 01:19:58.171865 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"6b383e05-0440-49ee-8add-708ea04e9ce7","Type":"ContainerDied","Data":"d87191ac09c3db8a6f76e7da327659cedbea082e3b6af5f2072d6a56758695f2"} Mar 11 01:19:58 crc kubenswrapper[4744]: I0311 01:19:58.171893 4744 scope.go:117] "RemoveContainer" containerID="364c3fa121dddb8ffc883e3753ab0a397a68bab1a3f4f81bef538a6bf6da07b9" Mar 11 01:19:58 crc kubenswrapper[4744]: I0311 01:19:58.171992 4744 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-galera-0" Mar 11 01:19:58 crc kubenswrapper[4744]: I0311 01:19:58.194707 4744 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Mar 11 01:19:58 crc kubenswrapper[4744]: I0311 01:19:58.194716 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"714c91e5-04c5-4f95-97e3-a3c08664944d","Type":"ContainerDied","Data":"b08670cddf5cc4a5ba911f41e6483fd3df91a1dd8c03560914da4bff08006f78"} Mar 11 01:19:58 crc kubenswrapper[4744]: I0311 01:19:58.201586 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"fb920bf7-eae5-4f7a-9af7-bde85bfb4ee9","Type":"ContainerDied","Data":"fd34d5b21cf096881733f60c0d418472d3d5afcd53181014727879c731dbaa67"} Mar 11 01:19:58 crc kubenswrapper[4744]: I0311 01:19:58.201692 4744 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Mar 11 01:19:58 crc kubenswrapper[4744]: I0311 01:19:58.207966 4744 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-7cc5cb746-kmb4g" Mar 11 01:19:58 crc kubenswrapper[4744]: I0311 01:19:58.208456 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-7cc5cb746-kmb4g" event={"ID":"2ee4fd0f-1e90-4771-bca2-2eb17df0b0b1","Type":"ContainerDied","Data":"53604dfa6ef728013e5531a56d6c6b91cb1becd895524058d82c2d5522b089d0"} Mar 11 01:19:58 crc kubenswrapper[4744]: I0311 01:19:58.225784 4744 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/openstack-galera-0"] Mar 11 01:19:58 crc kubenswrapper[4744]: I0311 01:19:58.232811 4744 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/openstack-galera-0"] Mar 11 01:19:58 crc kubenswrapper[4744]: I0311 01:19:58.244772 4744 scope.go:117] "RemoveContainer" containerID="f17d6132fb9aad59781a44370c2084f529634cf0ee10583babf1eb3b469f6924" Mar 11 01:19:58 crc kubenswrapper[4744]: I0311 01:19:58.272541 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-mw9fv" Mar 11 01:19:58 crc kubenswrapper[4744]: I0311 01:19:58.354338 4744 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Mar 11 01:19:58 crc kubenswrapper[4744]: I0311 01:19:58.360153 4744 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Mar 11 01:19:58 crc kubenswrapper[4744]: I0311 01:19:58.363093 4744 scope.go:117] "RemoveContainer" containerID="2d7e9342156b6a7e0b5782247ec6e299cdb60a6da7997fe5146c00f779c615e6" Mar 11 01:19:58 crc kubenswrapper[4744]: I0311 01:19:58.377900 4744 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 11 01:19:58 crc kubenswrapper[4744]: I0311 01:19:58.380418 4744 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-server-0"] Mar 11 01:19:58 crc kubenswrapper[4744]: I0311 01:19:58.401075 4744 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/rabbitmq-server-0"] Mar 11 01:19:58 crc kubenswrapper[4744]: I0311 01:19:58.413090 4744 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-7cc5cb746-kmb4g"] Mar 11 01:19:58 crc kubenswrapper[4744]: I0311 01:19:58.420892 4744 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-7cc5cb746-kmb4g"] Mar 11 01:19:58 crc kubenswrapper[4744]: I0311 01:19:58.426387 4744 scope.go:117] "RemoveContainer" containerID="e724fad610e3cb354b224dbc23638db68990df9c737ed272890fd1779688fc45" Mar 11 01:19:58 crc kubenswrapper[4744]: I0311 01:19:58.443747 4744 scope.go:117] "RemoveContainer" containerID="f517c72839363553e8b786dec7b9824c28bc5f5e37822956cd45b454cd30e224" Mar 11 01:19:58 crc kubenswrapper[4744]: I0311 01:19:58.479448 4744 scope.go:117] "RemoveContainer" containerID="dd9f74256d9d36d7d93b0c687c50126a3012a98632ea369b61ca6eb2ada71f31" Mar 11 01:19:58 crc kubenswrapper[4744]: I0311 01:19:58.503171 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/33c0b5dd-192a-4fd2-bbaa-b483399724df-sg-core-conf-yaml\") pod \"33c0b5dd-192a-4fd2-bbaa-b483399724df\" (UID: \"33c0b5dd-192a-4fd2-bbaa-b483399724df\") " Mar 11 01:19:58 crc kubenswrapper[4744]: I0311 01:19:58.503249 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/33c0b5dd-192a-4fd2-bbaa-b483399724df-log-httpd\") pod \"33c0b5dd-192a-4fd2-bbaa-b483399724df\" (UID: \"33c0b5dd-192a-4fd2-bbaa-b483399724df\") " Mar 11 01:19:58 crc kubenswrapper[4744]: I0311 01:19:58.503275 4744 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/33c0b5dd-192a-4fd2-bbaa-b483399724df-combined-ca-bundle\") pod \"33c0b5dd-192a-4fd2-bbaa-b483399724df\" (UID: \"33c0b5dd-192a-4fd2-bbaa-b483399724df\") " Mar 11 01:19:58 crc kubenswrapper[4744]: I0311 01:19:58.503306 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/33c0b5dd-192a-4fd2-bbaa-b483399724df-scripts\") pod \"33c0b5dd-192a-4fd2-bbaa-b483399724df\" (UID: \"33c0b5dd-192a-4fd2-bbaa-b483399724df\") " Mar 11 01:19:58 crc kubenswrapper[4744]: I0311 01:19:58.503345 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/33c0b5dd-192a-4fd2-bbaa-b483399724df-config-data\") pod \"33c0b5dd-192a-4fd2-bbaa-b483399724df\" (UID: \"33c0b5dd-192a-4fd2-bbaa-b483399724df\") " Mar 11 01:19:58 crc kubenswrapper[4744]: I0311 01:19:58.503387 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/33c0b5dd-192a-4fd2-bbaa-b483399724df-run-httpd\") pod \"33c0b5dd-192a-4fd2-bbaa-b483399724df\" (UID: \"33c0b5dd-192a-4fd2-bbaa-b483399724df\") " Mar 11 01:19:58 crc kubenswrapper[4744]: I0311 01:19:58.503420 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cbr42\" (UniqueName: \"kubernetes.io/projected/33c0b5dd-192a-4fd2-bbaa-b483399724df-kube-api-access-cbr42\") pod \"33c0b5dd-192a-4fd2-bbaa-b483399724df\" (UID: \"33c0b5dd-192a-4fd2-bbaa-b483399724df\") " Mar 11 01:19:58 crc kubenswrapper[4744]: I0311 01:19:58.503434 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/33c0b5dd-192a-4fd2-bbaa-b483399724df-ceilometer-tls-certs\") pod \"33c0b5dd-192a-4fd2-bbaa-b483399724df\" 
(UID: \"33c0b5dd-192a-4fd2-bbaa-b483399724df\") " Mar 11 01:19:58 crc kubenswrapper[4744]: I0311 01:19:58.504761 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/33c0b5dd-192a-4fd2-bbaa-b483399724df-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "33c0b5dd-192a-4fd2-bbaa-b483399724df" (UID: "33c0b5dd-192a-4fd2-bbaa-b483399724df"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 11 01:19:58 crc kubenswrapper[4744]: I0311 01:19:58.504886 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/33c0b5dd-192a-4fd2-bbaa-b483399724df-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "33c0b5dd-192a-4fd2-bbaa-b483399724df" (UID: "33c0b5dd-192a-4fd2-bbaa-b483399724df"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 11 01:19:58 crc kubenswrapper[4744]: I0311 01:19:58.519695 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/33c0b5dd-192a-4fd2-bbaa-b483399724df-scripts" (OuterVolumeSpecName: "scripts") pod "33c0b5dd-192a-4fd2-bbaa-b483399724df" (UID: "33c0b5dd-192a-4fd2-bbaa-b483399724df"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 01:19:58 crc kubenswrapper[4744]: I0311 01:19:58.519747 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/33c0b5dd-192a-4fd2-bbaa-b483399724df-kube-api-access-cbr42" (OuterVolumeSpecName: "kube-api-access-cbr42") pod "33c0b5dd-192a-4fd2-bbaa-b483399724df" (UID: "33c0b5dd-192a-4fd2-bbaa-b483399724df"). InnerVolumeSpecName "kube-api-access-cbr42". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 01:19:58 crc kubenswrapper[4744]: I0311 01:19:58.523291 4744 scope.go:117] "RemoveContainer" containerID="a5f82d457fa59f44d576003aefd017ef2285b45e6c69abd14e7eb7f7df02fc09" Mar 11 01:19:58 crc kubenswrapper[4744]: I0311 01:19:58.523943 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/33c0b5dd-192a-4fd2-bbaa-b483399724df-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "33c0b5dd-192a-4fd2-bbaa-b483399724df" (UID: "33c0b5dd-192a-4fd2-bbaa-b483399724df"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 01:19:58 crc kubenswrapper[4744]: I0311 01:19:58.544089 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/33c0b5dd-192a-4fd2-bbaa-b483399724df-ceilometer-tls-certs" (OuterVolumeSpecName: "ceilometer-tls-certs") pod "33c0b5dd-192a-4fd2-bbaa-b483399724df" (UID: "33c0b5dd-192a-4fd2-bbaa-b483399724df"). InnerVolumeSpecName "ceilometer-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 01:19:58 crc kubenswrapper[4744]: I0311 01:19:58.577593 4744 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Mar 11 01:19:58 crc kubenswrapper[4744]: I0311 01:19:58.585791 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/33c0b5dd-192a-4fd2-bbaa-b483399724df-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "33c0b5dd-192a-4fd2-bbaa-b483399724df" (UID: "33c0b5dd-192a-4fd2-bbaa-b483399724df"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 01:19:58 crc kubenswrapper[4744]: I0311 01:19:58.611617 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/33c0b5dd-192a-4fd2-bbaa-b483399724df-config-data" (OuterVolumeSpecName: "config-data") pod "33c0b5dd-192a-4fd2-bbaa-b483399724df" (UID: "33c0b5dd-192a-4fd2-bbaa-b483399724df"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 01:19:58 crc kubenswrapper[4744]: I0311 01:19:58.612602 4744 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/33c0b5dd-192a-4fd2-bbaa-b483399724df-log-httpd\") on node \"crc\" DevicePath \"\"" Mar 11 01:19:58 crc kubenswrapper[4744]: I0311 01:19:58.612640 4744 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/33c0b5dd-192a-4fd2-bbaa-b483399724df-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 11 01:19:58 crc kubenswrapper[4744]: I0311 01:19:58.612652 4744 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/33c0b5dd-192a-4fd2-bbaa-b483399724df-scripts\") on node \"crc\" DevicePath \"\"" Mar 11 01:19:58 crc kubenswrapper[4744]: I0311 01:19:58.612662 4744 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/33c0b5dd-192a-4fd2-bbaa-b483399724df-config-data\") on node \"crc\" DevicePath \"\"" Mar 11 01:19:58 crc kubenswrapper[4744]: I0311 01:19:58.612674 4744 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/33c0b5dd-192a-4fd2-bbaa-b483399724df-run-httpd\") on node \"crc\" DevicePath \"\"" Mar 11 01:19:58 crc kubenswrapper[4744]: I0311 01:19:58.612686 4744 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cbr42\" (UniqueName: 
\"kubernetes.io/projected/33c0b5dd-192a-4fd2-bbaa-b483399724df-kube-api-access-cbr42\") on node \"crc\" DevicePath \"\"" Mar 11 01:19:58 crc kubenswrapper[4744]: I0311 01:19:58.612696 4744 reconciler_common.go:293] "Volume detached for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/33c0b5dd-192a-4fd2-bbaa-b483399724df-ceilometer-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 11 01:19:58 crc kubenswrapper[4744]: I0311 01:19:58.612706 4744 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/33c0b5dd-192a-4fd2-bbaa-b483399724df-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Mar 11 01:19:58 crc kubenswrapper[4744]: I0311 01:19:58.620184 4744 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-0" Mar 11 01:19:58 crc kubenswrapper[4744]: I0311 01:19:58.713435 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b62ac51a-a222-4e7b-b465-9e71c3d34b1f-config-data\") pod \"b62ac51a-a222-4e7b-b465-9e71c3d34b1f\" (UID: \"b62ac51a-a222-4e7b-b465-9e71c3d34b1f\") " Mar 11 01:19:58 crc kubenswrapper[4744]: I0311 01:19:58.713570 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/05c279dc-d915-4688-b2c2-c43ff96ad81c-config-data\") pod \"05c279dc-d915-4688-b2c2-c43ff96ad81c\" (UID: \"05c279dc-d915-4688-b2c2-c43ff96ad81c\") " Mar 11 01:19:58 crc kubenswrapper[4744]: I0311 01:19:58.713640 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zfkrs\" (UniqueName: \"kubernetes.io/projected/b62ac51a-a222-4e7b-b465-9e71c3d34b1f-kube-api-access-zfkrs\") pod \"b62ac51a-a222-4e7b-b465-9e71c3d34b1f\" (UID: \"b62ac51a-a222-4e7b-b465-9e71c3d34b1f\") " Mar 11 01:19:58 crc kubenswrapper[4744]: I0311 01:19:58.713683 4744 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-khdhz\" (UniqueName: \"kubernetes.io/projected/05c279dc-d915-4688-b2c2-c43ff96ad81c-kube-api-access-khdhz\") pod \"05c279dc-d915-4688-b2c2-c43ff96ad81c\" (UID: \"05c279dc-d915-4688-b2c2-c43ff96ad81c\") " Mar 11 01:19:58 crc kubenswrapper[4744]: I0311 01:19:58.713721 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b62ac51a-a222-4e7b-b465-9e71c3d34b1f-combined-ca-bundle\") pod \"b62ac51a-a222-4e7b-b465-9e71c3d34b1f\" (UID: \"b62ac51a-a222-4e7b-b465-9e71c3d34b1f\") " Mar 11 01:19:58 crc kubenswrapper[4744]: I0311 01:19:58.713751 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/05c279dc-d915-4688-b2c2-c43ff96ad81c-combined-ca-bundle\") pod \"05c279dc-d915-4688-b2c2-c43ff96ad81c\" (UID: \"05c279dc-d915-4688-b2c2-c43ff96ad81c\") " Mar 11 01:19:58 crc kubenswrapper[4744]: I0311 01:19:58.717149 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/05c279dc-d915-4688-b2c2-c43ff96ad81c-kube-api-access-khdhz" (OuterVolumeSpecName: "kube-api-access-khdhz") pod "05c279dc-d915-4688-b2c2-c43ff96ad81c" (UID: "05c279dc-d915-4688-b2c2-c43ff96ad81c"). InnerVolumeSpecName "kube-api-access-khdhz". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 01:19:58 crc kubenswrapper[4744]: I0311 01:19:58.717802 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b62ac51a-a222-4e7b-b465-9e71c3d34b1f-kube-api-access-zfkrs" (OuterVolumeSpecName: "kube-api-access-zfkrs") pod "b62ac51a-a222-4e7b-b465-9e71c3d34b1f" (UID: "b62ac51a-a222-4e7b-b465-9e71c3d34b1f"). InnerVolumeSpecName "kube-api-access-zfkrs". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 01:19:58 crc kubenswrapper[4744]: I0311 01:19:58.730434 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b62ac51a-a222-4e7b-b465-9e71c3d34b1f-config-data" (OuterVolumeSpecName: "config-data") pod "b62ac51a-a222-4e7b-b465-9e71c3d34b1f" (UID: "b62ac51a-a222-4e7b-b465-9e71c3d34b1f"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 01:19:58 crc kubenswrapper[4744]: I0311 01:19:58.731504 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/05c279dc-d915-4688-b2c2-c43ff96ad81c-config-data" (OuterVolumeSpecName: "config-data") pod "05c279dc-d915-4688-b2c2-c43ff96ad81c" (UID: "05c279dc-d915-4688-b2c2-c43ff96ad81c"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 01:19:58 crc kubenswrapper[4744]: I0311 01:19:58.739608 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/05c279dc-d915-4688-b2c2-c43ff96ad81c-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "05c279dc-d915-4688-b2c2-c43ff96ad81c" (UID: "05c279dc-d915-4688-b2c2-c43ff96ad81c"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 01:19:58 crc kubenswrapper[4744]: I0311 01:19:58.741419 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b62ac51a-a222-4e7b-b465-9e71c3d34b1f-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "b62ac51a-a222-4e7b-b465-9e71c3d34b1f" (UID: "b62ac51a-a222-4e7b-b465-9e71c3d34b1f"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 01:19:58 crc kubenswrapper[4744]: I0311 01:19:58.815303 4744 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b62ac51a-a222-4e7b-b465-9e71c3d34b1f-config-data\") on node \"crc\" DevicePath \"\"" Mar 11 01:19:58 crc kubenswrapper[4744]: I0311 01:19:58.815529 4744 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/05c279dc-d915-4688-b2c2-c43ff96ad81c-config-data\") on node \"crc\" DevicePath \"\"" Mar 11 01:19:58 crc kubenswrapper[4744]: I0311 01:19:58.815540 4744 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zfkrs\" (UniqueName: \"kubernetes.io/projected/b62ac51a-a222-4e7b-b465-9e71c3d34b1f-kube-api-access-zfkrs\") on node \"crc\" DevicePath \"\"" Mar 11 01:19:58 crc kubenswrapper[4744]: I0311 01:19:58.815552 4744 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-khdhz\" (UniqueName: \"kubernetes.io/projected/05c279dc-d915-4688-b2c2-c43ff96ad81c-kube-api-access-khdhz\") on node \"crc\" DevicePath \"\"" Mar 11 01:19:58 crc kubenswrapper[4744]: I0311 01:19:58.815560 4744 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b62ac51a-a222-4e7b-b465-9e71c3d34b1f-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 11 01:19:58 crc kubenswrapper[4744]: I0311 01:19:58.815571 4744 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/05c279dc-d915-4688-b2c2-c43ff96ad81c-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 11 01:19:59 crc kubenswrapper[4744]: I0311 01:19:59.236182 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"b62ac51a-a222-4e7b-b465-9e71c3d34b1f","Type":"ContainerDied","Data":"6344ff515d5d03e1c58a3b38592f19215b4ffbf6761bba85ef02c15059c6f620"} Mar 11 01:19:59 
crc kubenswrapper[4744]: I0311 01:19:59.236232 4744 scope.go:117] "RemoveContainer" containerID="fd3a94b5310ccf7a9c7dd7207db3d75ffedf02fcd045db30b5914808a12e1cc6" Mar 11 01:19:59 crc kubenswrapper[4744]: I0311 01:19:59.236303 4744 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Mar 11 01:19:59 crc kubenswrapper[4744]: I0311 01:19:59.239098 4744 generic.go:334] "Generic (PLEG): container finished" podID="05c279dc-d915-4688-b2c2-c43ff96ad81c" containerID="9fbe6aaee4ddc8846bfb132b1b5d6d6e18bbaed4c0eefbf91efd571ce47a8e6a" exitCode=0 Mar 11 01:19:59 crc kubenswrapper[4744]: I0311 01:19:59.239178 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"05c279dc-d915-4688-b2c2-c43ff96ad81c","Type":"ContainerDied","Data":"9fbe6aaee4ddc8846bfb132b1b5d6d6e18bbaed4c0eefbf91efd571ce47a8e6a"} Mar 11 01:19:59 crc kubenswrapper[4744]: I0311 01:19:59.239202 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"05c279dc-d915-4688-b2c2-c43ff96ad81c","Type":"ContainerDied","Data":"dd4cb4f38a20994d21f06ee2df11baf41bc2a4c201f5816dee201c234f762b5e"} Mar 11 01:19:59 crc kubenswrapper[4744]: I0311 01:19:59.239235 4744 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-0" Mar 11 01:19:59 crc kubenswrapper[4744]: I0311 01:19:59.244095 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"33c0b5dd-192a-4fd2-bbaa-b483399724df","Type":"ContainerDied","Data":"723b7d781dd11751a0df17474f1ed7775b80a5f78ad3d6efc2754e757cce5b54"} Mar 11 01:19:59 crc kubenswrapper[4744]: I0311 01:19:59.244184 4744 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 11 01:19:59 crc kubenswrapper[4744]: I0311 01:19:59.285963 4744 scope.go:117] "RemoveContainer" containerID="9fbe6aaee4ddc8846bfb132b1b5d6d6e18bbaed4c0eefbf91efd571ce47a8e6a" Mar 11 01:19:59 crc kubenswrapper[4744]: I0311 01:19:59.367479 4744 scope.go:117] "RemoveContainer" containerID="9fbe6aaee4ddc8846bfb132b1b5d6d6e18bbaed4c0eefbf91efd571ce47a8e6a" Mar 11 01:19:59 crc kubenswrapper[4744]: I0311 01:19:59.382927 4744 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Mar 11 01:19:59 crc kubenswrapper[4744]: I0311 01:19:59.387958 4744 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-scheduler-0"] Mar 11 01:19:59 crc kubenswrapper[4744]: I0311 01:19:59.402391 4744 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-conductor-0"] Mar 11 01:19:59 crc kubenswrapper[4744]: I0311 01:19:59.408051 4744 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-conductor-0"] Mar 11 01:19:59 crc kubenswrapper[4744]: E0311 01:19:59.414358 4744 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9fbe6aaee4ddc8846bfb132b1b5d6d6e18bbaed4c0eefbf91efd571ce47a8e6a\": container with ID starting with 9fbe6aaee4ddc8846bfb132b1b5d6d6e18bbaed4c0eefbf91efd571ce47a8e6a not found: ID does not exist" containerID="9fbe6aaee4ddc8846bfb132b1b5d6d6e18bbaed4c0eefbf91efd571ce47a8e6a" Mar 11 01:19:59 crc kubenswrapper[4744]: I0311 01:19:59.414428 4744 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9fbe6aaee4ddc8846bfb132b1b5d6d6e18bbaed4c0eefbf91efd571ce47a8e6a"} err="failed to get container status \"9fbe6aaee4ddc8846bfb132b1b5d6d6e18bbaed4c0eefbf91efd571ce47a8e6a\": rpc error: code = NotFound desc = could not find container \"9fbe6aaee4ddc8846bfb132b1b5d6d6e18bbaed4c0eefbf91efd571ce47a8e6a\": container with ID starting with 
9fbe6aaee4ddc8846bfb132b1b5d6d6e18bbaed4c0eefbf91efd571ce47a8e6a not found: ID does not exist" Mar 11 01:19:59 crc kubenswrapper[4744]: I0311 01:19:59.414470 4744 scope.go:117] "RemoveContainer" containerID="2e90fe156899af91f60cc58374a4704b928fb93dc5a2b8a016f190e0a0897fe6" Mar 11 01:19:59 crc kubenswrapper[4744]: I0311 01:19:59.438968 4744 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 11 01:19:59 crc kubenswrapper[4744]: I0311 01:19:59.449368 4744 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Mar 11 01:19:59 crc kubenswrapper[4744]: I0311 01:19:59.453539 4744 scope.go:117] "RemoveContainer" containerID="3d1ff7ea2ca1a4a7692ee87d6f5ba883dc249a72e783a373c386078d938baf27" Mar 11 01:19:59 crc kubenswrapper[4744]: I0311 01:19:59.479888 4744 scope.go:117] "RemoveContainer" containerID="1252cd0d658ee3e2e06be9771503f0ab0664f8d814b22e653a029a2d6d6716c4" Mar 11 01:19:59 crc kubenswrapper[4744]: I0311 01:19:59.516608 4744 scope.go:117] "RemoveContainer" containerID="6940392b1cc755e200810e7dda5a0cdf602c2bc1bee93993b5d7b849c244decb" Mar 11 01:19:59 crc kubenswrapper[4744]: I0311 01:19:59.575263 4744 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-mw9fv"] Mar 11 01:19:59 crc kubenswrapper[4744]: I0311 01:19:59.992832 4744 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="05c279dc-d915-4688-b2c2-c43ff96ad81c" path="/var/lib/kubelet/pods/05c279dc-d915-4688-b2c2-c43ff96ad81c/volumes" Mar 11 01:19:59 crc kubenswrapper[4744]: I0311 01:19:59.994078 4744 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2ee4fd0f-1e90-4771-bca2-2eb17df0b0b1" path="/var/lib/kubelet/pods/2ee4fd0f-1e90-4771-bca2-2eb17df0b0b1/volumes" Mar 11 01:19:59 crc kubenswrapper[4744]: I0311 01:19:59.995129 4744 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="33c0b5dd-192a-4fd2-bbaa-b483399724df" 
path="/var/lib/kubelet/pods/33c0b5dd-192a-4fd2-bbaa-b483399724df/volumes" Mar 11 01:19:59 crc kubenswrapper[4744]: I0311 01:19:59.997740 4744 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6b383e05-0440-49ee-8add-708ea04e9ce7" path="/var/lib/kubelet/pods/6b383e05-0440-49ee-8add-708ea04e9ce7/volumes" Mar 11 01:19:59 crc kubenswrapper[4744]: I0311 01:19:59.999636 4744 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="714c91e5-04c5-4f95-97e3-a3c08664944d" path="/var/lib/kubelet/pods/714c91e5-04c5-4f95-97e3-a3c08664944d/volumes" Mar 11 01:20:00 crc kubenswrapper[4744]: I0311 01:20:00.001721 4744 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b62ac51a-a222-4e7b-b465-9e71c3d34b1f" path="/var/lib/kubelet/pods/b62ac51a-a222-4e7b-b465-9e71c3d34b1f/volumes" Mar 11 01:20:00 crc kubenswrapper[4744]: I0311 01:20:00.003157 4744 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fb920bf7-eae5-4f7a-9af7-bde85bfb4ee9" path="/var/lib/kubelet/pods/fb920bf7-eae5-4f7a-9af7-bde85bfb4ee9/volumes" Mar 11 01:20:00 crc kubenswrapper[4744]: I0311 01:20:00.146345 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29553200-7v85t"] Mar 11 01:20:00 crc kubenswrapper[4744]: E0311 01:20:00.146813 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a6b56953-c881-474c-a21f-4a39102d89ab" containerName="placement-api" Mar 11 01:20:00 crc kubenswrapper[4744]: I0311 01:20:00.146841 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="a6b56953-c881-474c-a21f-4a39102d89ab" containerName="placement-api" Mar 11 01:20:00 crc kubenswrapper[4744]: E0311 01:20:00.146859 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8df98dbd-473b-4630-81ab-edd6419feb0d" containerName="barbican-keystone-listener-log" Mar 11 01:20:00 crc kubenswrapper[4744]: I0311 01:20:00.146872 4744 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="8df98dbd-473b-4630-81ab-edd6419feb0d" containerName="barbican-keystone-listener-log" Mar 11 01:20:00 crc kubenswrapper[4744]: E0311 01:20:00.146890 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="23fcfdba-12bc-4a94-94cd-fb703f2e632c" containerName="nova-api-api" Mar 11 01:20:00 crc kubenswrapper[4744]: I0311 01:20:00.146903 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="23fcfdba-12bc-4a94-94cd-fb703f2e632c" containerName="nova-api-api" Mar 11 01:20:00 crc kubenswrapper[4744]: E0311 01:20:00.146920 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="03b55bf1-d3fe-4ba9-bc5b-3ac4cf87e4f8" containerName="extract-utilities" Mar 11 01:20:00 crc kubenswrapper[4744]: I0311 01:20:00.146933 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="03b55bf1-d3fe-4ba9-bc5b-3ac4cf87e4f8" containerName="extract-utilities" Mar 11 01:20:00 crc kubenswrapper[4744]: E0311 01:20:00.146948 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7f3aa5cc-eae2-4d60-96cf-6d847a5599ec" containerName="mariadb-account-create-update" Mar 11 01:20:00 crc kubenswrapper[4744]: I0311 01:20:00.146960 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="7f3aa5cc-eae2-4d60-96cf-6d847a5599ec" containerName="mariadb-account-create-update" Mar 11 01:20:00 crc kubenswrapper[4744]: E0311 01:20:00.146979 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fb920bf7-eae5-4f7a-9af7-bde85bfb4ee9" containerName="rabbitmq" Mar 11 01:20:00 crc kubenswrapper[4744]: I0311 01:20:00.146990 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="fb920bf7-eae5-4f7a-9af7-bde85bfb4ee9" containerName="rabbitmq" Mar 11 01:20:00 crc kubenswrapper[4744]: E0311 01:20:00.147014 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="714c91e5-04c5-4f95-97e3-a3c08664944d" containerName="rabbitmq" Mar 11 01:20:00 crc kubenswrapper[4744]: I0311 01:20:00.147026 4744 state_mem.go:107] "Deleted CPUSet 
assignment" podUID="714c91e5-04c5-4f95-97e3-a3c08664944d" containerName="rabbitmq" Mar 11 01:20:00 crc kubenswrapper[4744]: E0311 01:20:00.147046 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="33c0b5dd-192a-4fd2-bbaa-b483399724df" containerName="ceilometer-notification-agent" Mar 11 01:20:00 crc kubenswrapper[4744]: I0311 01:20:00.147059 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="33c0b5dd-192a-4fd2-bbaa-b483399724df" containerName="ceilometer-notification-agent" Mar 11 01:20:00 crc kubenswrapper[4744]: E0311 01:20:00.147072 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fb920bf7-eae5-4f7a-9af7-bde85bfb4ee9" containerName="setup-container" Mar 11 01:20:00 crc kubenswrapper[4744]: I0311 01:20:00.147085 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="fb920bf7-eae5-4f7a-9af7-bde85bfb4ee9" containerName="setup-container" Mar 11 01:20:00 crc kubenswrapper[4744]: E0311 01:20:00.147104 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4f48c5b8-9cda-4c2c-9244-9bb71b9dc05f" containerName="openstack-network-exporter" Mar 11 01:20:00 crc kubenswrapper[4744]: I0311 01:20:00.147115 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="4f48c5b8-9cda-4c2c-9244-9bb71b9dc05f" containerName="openstack-network-exporter" Mar 11 01:20:00 crc kubenswrapper[4744]: E0311 01:20:00.147138 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="33c0b5dd-192a-4fd2-bbaa-b483399724df" containerName="ceilometer-central-agent" Mar 11 01:20:00 crc kubenswrapper[4744]: I0311 01:20:00.147150 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="33c0b5dd-192a-4fd2-bbaa-b483399724df" containerName="ceilometer-central-agent" Mar 11 01:20:00 crc kubenswrapper[4744]: E0311 01:20:00.147173 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4fb8af9e-ef1e-45b0-b842-2647fe75510e" containerName="barbican-worker" Mar 11 01:20:00 crc kubenswrapper[4744]: I0311 
01:20:00.147184 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="4fb8af9e-ef1e-45b0-b842-2647fe75510e" containerName="barbican-worker" Mar 11 01:20:00 crc kubenswrapper[4744]: E0311 01:20:00.147200 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6b383e05-0440-49ee-8add-708ea04e9ce7" containerName="galera" Mar 11 01:20:00 crc kubenswrapper[4744]: I0311 01:20:00.147212 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="6b383e05-0440-49ee-8add-708ea04e9ce7" containerName="galera" Mar 11 01:20:00 crc kubenswrapper[4744]: E0311 01:20:00.147231 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8df98dbd-473b-4630-81ab-edd6419feb0d" containerName="barbican-keystone-listener" Mar 11 01:20:00 crc kubenswrapper[4744]: I0311 01:20:00.147243 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="8df98dbd-473b-4630-81ab-edd6419feb0d" containerName="barbican-keystone-listener" Mar 11 01:20:00 crc kubenswrapper[4744]: E0311 01:20:00.147260 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7f3aa5cc-eae2-4d60-96cf-6d847a5599ec" containerName="mariadb-account-create-update" Mar 11 01:20:00 crc kubenswrapper[4744]: I0311 01:20:00.147271 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="7f3aa5cc-eae2-4d60-96cf-6d847a5599ec" containerName="mariadb-account-create-update" Mar 11 01:20:00 crc kubenswrapper[4744]: E0311 01:20:00.147290 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="730c901d-c3c5-46c5-b618-00cdcc17bef2" containerName="memcached" Mar 11 01:20:00 crc kubenswrapper[4744]: I0311 01:20:00.147302 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="730c901d-c3c5-46c5-b618-00cdcc17bef2" containerName="memcached" Mar 11 01:20:00 crc kubenswrapper[4744]: E0311 01:20:00.147325 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f11c8953-d88f-4d37-8366-b0b61606fa8a" containerName="cinder-api" Mar 11 01:20:00 crc kubenswrapper[4744]: I0311 
01:20:00.147337 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="f11c8953-d88f-4d37-8366-b0b61606fa8a" containerName="cinder-api" Mar 11 01:20:00 crc kubenswrapper[4744]: E0311 01:20:00.147351 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c8caed76-baba-4ad3-b95a-e428132f2021" containerName="nova-cell0-conductor-conductor" Mar 11 01:20:00 crc kubenswrapper[4744]: I0311 01:20:00.147363 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="c8caed76-baba-4ad3-b95a-e428132f2021" containerName="nova-cell0-conductor-conductor" Mar 11 01:20:00 crc kubenswrapper[4744]: E0311 01:20:00.147381 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="23fcfdba-12bc-4a94-94cd-fb703f2e632c" containerName="nova-api-log" Mar 11 01:20:00 crc kubenswrapper[4744]: I0311 01:20:00.147392 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="23fcfdba-12bc-4a94-94cd-fb703f2e632c" containerName="nova-api-log" Mar 11 01:20:00 crc kubenswrapper[4744]: E0311 01:20:00.147415 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ae4f1d0a-32b7-4ec5-9f5b-a43589af4119" containerName="glance-httpd" Mar 11 01:20:00 crc kubenswrapper[4744]: I0311 01:20:00.147426 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="ae4f1d0a-32b7-4ec5-9f5b-a43589af4119" containerName="glance-httpd" Mar 11 01:20:00 crc kubenswrapper[4744]: E0311 01:20:00.147444 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cc403516-137f-4bfb-badf-89b13ff0468f" containerName="kube-state-metrics" Mar 11 01:20:00 crc kubenswrapper[4744]: I0311 01:20:00.147456 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="cc403516-137f-4bfb-badf-89b13ff0468f" containerName="kube-state-metrics" Mar 11 01:20:00 crc kubenswrapper[4744]: E0311 01:20:00.147470 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b62ac51a-a222-4e7b-b465-9e71c3d34b1f" containerName="nova-scheduler-scheduler" Mar 11 01:20:00 crc 
kubenswrapper[4744]: I0311 01:20:00.147481 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="b62ac51a-a222-4e7b-b465-9e71c3d34b1f" containerName="nova-scheduler-scheduler" Mar 11 01:20:00 crc kubenswrapper[4744]: E0311 01:20:00.147542 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f11c8953-d88f-4d37-8366-b0b61606fa8a" containerName="cinder-api-log" Mar 11 01:20:00 crc kubenswrapper[4744]: I0311 01:20:00.147556 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="f11c8953-d88f-4d37-8366-b0b61606fa8a" containerName="cinder-api-log" Mar 11 01:20:00 crc kubenswrapper[4744]: E0311 01:20:00.147568 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4767cbee-21c4-4deb-871a-9c6169f5741d" containerName="glance-httpd" Mar 11 01:20:00 crc kubenswrapper[4744]: I0311 01:20:00.147580 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="4767cbee-21c4-4deb-871a-9c6169f5741d" containerName="glance-httpd" Mar 11 01:20:00 crc kubenswrapper[4744]: E0311 01:20:00.147593 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4f48c5b8-9cda-4c2c-9244-9bb71b9dc05f" containerName="ovn-northd" Mar 11 01:20:00 crc kubenswrapper[4744]: I0311 01:20:00.147604 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="4f48c5b8-9cda-4c2c-9244-9bb71b9dc05f" containerName="ovn-northd" Mar 11 01:20:00 crc kubenswrapper[4744]: E0311 01:20:00.147618 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="714c91e5-04c5-4f95-97e3-a3c08664944d" containerName="setup-container" Mar 11 01:20:00 crc kubenswrapper[4744]: I0311 01:20:00.147629 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="714c91e5-04c5-4f95-97e3-a3c08664944d" containerName="setup-container" Mar 11 01:20:00 crc kubenswrapper[4744]: E0311 01:20:00.147643 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4767cbee-21c4-4deb-871a-9c6169f5741d" containerName="glance-log" Mar 11 01:20:00 crc kubenswrapper[4744]: I0311 
01:20:00.147655 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="4767cbee-21c4-4deb-871a-9c6169f5741d" containerName="glance-log" Mar 11 01:20:00 crc kubenswrapper[4744]: E0311 01:20:00.147670 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a6b56953-c881-474c-a21f-4a39102d89ab" containerName="placement-log" Mar 11 01:20:00 crc kubenswrapper[4744]: I0311 01:20:00.147682 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="a6b56953-c881-474c-a21f-4a39102d89ab" containerName="placement-log" Mar 11 01:20:00 crc kubenswrapper[4744]: E0311 01:20:00.147699 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="05c279dc-d915-4688-b2c2-c43ff96ad81c" containerName="nova-cell1-conductor-conductor" Mar 11 01:20:00 crc kubenswrapper[4744]: I0311 01:20:00.147711 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="05c279dc-d915-4688-b2c2-c43ff96ad81c" containerName="nova-cell1-conductor-conductor" Mar 11 01:20:00 crc kubenswrapper[4744]: E0311 01:20:00.147723 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ae4f1d0a-32b7-4ec5-9f5b-a43589af4119" containerName="glance-log" Mar 11 01:20:00 crc kubenswrapper[4744]: I0311 01:20:00.147734 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="ae4f1d0a-32b7-4ec5-9f5b-a43589af4119" containerName="glance-log" Mar 11 01:20:00 crc kubenswrapper[4744]: E0311 01:20:00.147756 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="03b55bf1-d3fe-4ba9-bc5b-3ac4cf87e4f8" containerName="extract-content" Mar 11 01:20:00 crc kubenswrapper[4744]: I0311 01:20:00.147771 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="03b55bf1-d3fe-4ba9-bc5b-3ac4cf87e4f8" containerName="extract-content" Mar 11 01:20:00 crc kubenswrapper[4744]: E0311 01:20:00.147793 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b9edcd5c-3634-45f9-914a-0d8e4f425302" containerName="nova-metadata-log" Mar 11 01:20:00 crc kubenswrapper[4744]: I0311 
01:20:00.147805 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="b9edcd5c-3634-45f9-914a-0d8e4f425302" containerName="nova-metadata-log" Mar 11 01:20:00 crc kubenswrapper[4744]: E0311 01:20:00.147824 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="44461324-fa82-4476-a621-c560a3c89e0f" containerName="barbican-api" Mar 11 01:20:00 crc kubenswrapper[4744]: I0311 01:20:00.147835 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="44461324-fa82-4476-a621-c560a3c89e0f" containerName="barbican-api" Mar 11 01:20:00 crc kubenswrapper[4744]: E0311 01:20:00.147849 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4fb8af9e-ef1e-45b0-b842-2647fe75510e" containerName="barbican-worker-log" Mar 11 01:20:00 crc kubenswrapper[4744]: I0311 01:20:00.147862 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="4fb8af9e-ef1e-45b0-b842-2647fe75510e" containerName="barbican-worker-log" Mar 11 01:20:00 crc kubenswrapper[4744]: E0311 01:20:00.147885 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6b383e05-0440-49ee-8add-708ea04e9ce7" containerName="mysql-bootstrap" Mar 11 01:20:00 crc kubenswrapper[4744]: I0311 01:20:00.147897 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="6b383e05-0440-49ee-8add-708ea04e9ce7" containerName="mysql-bootstrap" Mar 11 01:20:00 crc kubenswrapper[4744]: E0311 01:20:00.147910 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2ee4fd0f-1e90-4771-bca2-2eb17df0b0b1" containerName="keystone-api" Mar 11 01:20:00 crc kubenswrapper[4744]: I0311 01:20:00.147921 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="2ee4fd0f-1e90-4771-bca2-2eb17df0b0b1" containerName="keystone-api" Mar 11 01:20:00 crc kubenswrapper[4744]: E0311 01:20:00.147942 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="33c0b5dd-192a-4fd2-bbaa-b483399724df" containerName="sg-core" Mar 11 01:20:00 crc kubenswrapper[4744]: I0311 01:20:00.147954 4744 
state_mem.go:107] "Deleted CPUSet assignment" podUID="33c0b5dd-192a-4fd2-bbaa-b483399724df" containerName="sg-core" Mar 11 01:20:00 crc kubenswrapper[4744]: E0311 01:20:00.147976 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="33c0b5dd-192a-4fd2-bbaa-b483399724df" containerName="proxy-httpd" Mar 11 01:20:00 crc kubenswrapper[4744]: I0311 01:20:00.147988 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="33c0b5dd-192a-4fd2-bbaa-b483399724df" containerName="proxy-httpd" Mar 11 01:20:00 crc kubenswrapper[4744]: E0311 01:20:00.148006 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b9edcd5c-3634-45f9-914a-0d8e4f425302" containerName="nova-metadata-metadata" Mar 11 01:20:00 crc kubenswrapper[4744]: I0311 01:20:00.148018 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="b9edcd5c-3634-45f9-914a-0d8e4f425302" containerName="nova-metadata-metadata" Mar 11 01:20:00 crc kubenswrapper[4744]: E0311 01:20:00.148035 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="44461324-fa82-4476-a621-c560a3c89e0f" containerName="barbican-api-log" Mar 11 01:20:00 crc kubenswrapper[4744]: I0311 01:20:00.148046 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="44461324-fa82-4476-a621-c560a3c89e0f" containerName="barbican-api-log" Mar 11 01:20:00 crc kubenswrapper[4744]: E0311 01:20:00.148065 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="03b55bf1-d3fe-4ba9-bc5b-3ac4cf87e4f8" containerName="registry-server" Mar 11 01:20:00 crc kubenswrapper[4744]: I0311 01:20:00.148077 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="03b55bf1-d3fe-4ba9-bc5b-3ac4cf87e4f8" containerName="registry-server" Mar 11 01:20:00 crc kubenswrapper[4744]: I0311 01:20:00.148343 4744 memory_manager.go:354] "RemoveStaleState removing state" podUID="23fcfdba-12bc-4a94-94cd-fb703f2e632c" containerName="nova-api-api" Mar 11 01:20:00 crc kubenswrapper[4744]: I0311 01:20:00.148365 4744 
memory_manager.go:354] "RemoveStaleState removing state" podUID="fb920bf7-eae5-4f7a-9af7-bde85bfb4ee9" containerName="rabbitmq" Mar 11 01:20:00 crc kubenswrapper[4744]: I0311 01:20:00.148378 4744 memory_manager.go:354] "RemoveStaleState removing state" podUID="730c901d-c3c5-46c5-b618-00cdcc17bef2" containerName="memcached" Mar 11 01:20:00 crc kubenswrapper[4744]: I0311 01:20:00.148401 4744 memory_manager.go:354] "RemoveStaleState removing state" podUID="c8caed76-baba-4ad3-b95a-e428132f2021" containerName="nova-cell0-conductor-conductor" Mar 11 01:20:00 crc kubenswrapper[4744]: I0311 01:20:00.148417 4744 memory_manager.go:354] "RemoveStaleState removing state" podUID="44461324-fa82-4476-a621-c560a3c89e0f" containerName="barbican-api-log" Mar 11 01:20:00 crc kubenswrapper[4744]: I0311 01:20:00.148429 4744 memory_manager.go:354] "RemoveStaleState removing state" podUID="f11c8953-d88f-4d37-8366-b0b61606fa8a" containerName="cinder-api-log" Mar 11 01:20:00 crc kubenswrapper[4744]: I0311 01:20:00.148446 4744 memory_manager.go:354] "RemoveStaleState removing state" podUID="33c0b5dd-192a-4fd2-bbaa-b483399724df" containerName="ceilometer-notification-agent" Mar 11 01:20:00 crc kubenswrapper[4744]: I0311 01:20:00.148459 4744 memory_manager.go:354] "RemoveStaleState removing state" podUID="f11c8953-d88f-4d37-8366-b0b61606fa8a" containerName="cinder-api" Mar 11 01:20:00 crc kubenswrapper[4744]: I0311 01:20:00.148473 4744 memory_manager.go:354] "RemoveStaleState removing state" podUID="ae4f1d0a-32b7-4ec5-9f5b-a43589af4119" containerName="glance-log" Mar 11 01:20:00 crc kubenswrapper[4744]: I0311 01:20:00.148493 4744 memory_manager.go:354] "RemoveStaleState removing state" podUID="8df98dbd-473b-4630-81ab-edd6419feb0d" containerName="barbican-keystone-listener" Mar 11 01:20:00 crc kubenswrapper[4744]: I0311 01:20:00.148536 4744 memory_manager.go:354] "RemoveStaleState removing state" podUID="33c0b5dd-192a-4fd2-bbaa-b483399724df" containerName="ceilometer-central-agent" Mar 11 
01:20:00 crc kubenswrapper[4744]: I0311 01:20:00.148554 4744 memory_manager.go:354] "RemoveStaleState removing state" podUID="6b383e05-0440-49ee-8add-708ea04e9ce7" containerName="galera" Mar 11 01:20:00 crc kubenswrapper[4744]: I0311 01:20:00.148575 4744 memory_manager.go:354] "RemoveStaleState removing state" podUID="8df98dbd-473b-4630-81ab-edd6419feb0d" containerName="barbican-keystone-listener-log" Mar 11 01:20:00 crc kubenswrapper[4744]: I0311 01:20:00.148587 4744 memory_manager.go:354] "RemoveStaleState removing state" podUID="4fb8af9e-ef1e-45b0-b842-2647fe75510e" containerName="barbican-worker-log" Mar 11 01:20:00 crc kubenswrapper[4744]: I0311 01:20:00.148603 4744 memory_manager.go:354] "RemoveStaleState removing state" podUID="cc403516-137f-4bfb-badf-89b13ff0468f" containerName="kube-state-metrics" Mar 11 01:20:00 crc kubenswrapper[4744]: I0311 01:20:00.148620 4744 memory_manager.go:354] "RemoveStaleState removing state" podUID="4f48c5b8-9cda-4c2c-9244-9bb71b9dc05f" containerName="openstack-network-exporter" Mar 11 01:20:00 crc kubenswrapper[4744]: I0311 01:20:00.148633 4744 memory_manager.go:354] "RemoveStaleState removing state" podUID="714c91e5-04c5-4f95-97e3-a3c08664944d" containerName="rabbitmq" Mar 11 01:20:00 crc kubenswrapper[4744]: I0311 01:20:00.148649 4744 memory_manager.go:354] "RemoveStaleState removing state" podUID="a6b56953-c881-474c-a21f-4a39102d89ab" containerName="placement-log" Mar 11 01:20:00 crc kubenswrapper[4744]: I0311 01:20:00.148665 4744 memory_manager.go:354] "RemoveStaleState removing state" podUID="4767cbee-21c4-4deb-871a-9c6169f5741d" containerName="glance-log" Mar 11 01:20:00 crc kubenswrapper[4744]: I0311 01:20:00.148682 4744 memory_manager.go:354] "RemoveStaleState removing state" podUID="7f3aa5cc-eae2-4d60-96cf-6d847a5599ec" containerName="mariadb-account-create-update" Mar 11 01:20:00 crc kubenswrapper[4744]: I0311 01:20:00.148861 4744 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="b9edcd5c-3634-45f9-914a-0d8e4f425302" containerName="nova-metadata-log" Mar 11 01:20:00 crc kubenswrapper[4744]: I0311 01:20:00.148876 4744 memory_manager.go:354] "RemoveStaleState removing state" podUID="33c0b5dd-192a-4fd2-bbaa-b483399724df" containerName="sg-core" Mar 11 01:20:00 crc kubenswrapper[4744]: I0311 01:20:00.148893 4744 memory_manager.go:354] "RemoveStaleState removing state" podUID="4fb8af9e-ef1e-45b0-b842-2647fe75510e" containerName="barbican-worker" Mar 11 01:20:00 crc kubenswrapper[4744]: I0311 01:20:00.148908 4744 memory_manager.go:354] "RemoveStaleState removing state" podUID="b62ac51a-a222-4e7b-b465-9e71c3d34b1f" containerName="nova-scheduler-scheduler" Mar 11 01:20:00 crc kubenswrapper[4744]: I0311 01:20:00.148933 4744 memory_manager.go:354] "RemoveStaleState removing state" podUID="44461324-fa82-4476-a621-c560a3c89e0f" containerName="barbican-api" Mar 11 01:20:00 crc kubenswrapper[4744]: I0311 01:20:00.148952 4744 memory_manager.go:354] "RemoveStaleState removing state" podUID="a6b56953-c881-474c-a21f-4a39102d89ab" containerName="placement-api" Mar 11 01:20:00 crc kubenswrapper[4744]: I0311 01:20:00.148966 4744 memory_manager.go:354] "RemoveStaleState removing state" podUID="b9edcd5c-3634-45f9-914a-0d8e4f425302" containerName="nova-metadata-metadata" Mar 11 01:20:00 crc kubenswrapper[4744]: I0311 01:20:00.148985 4744 memory_manager.go:354] "RemoveStaleState removing state" podUID="2ee4fd0f-1e90-4771-bca2-2eb17df0b0b1" containerName="keystone-api" Mar 11 01:20:00 crc kubenswrapper[4744]: I0311 01:20:00.149000 4744 memory_manager.go:354] "RemoveStaleState removing state" podUID="ae4f1d0a-32b7-4ec5-9f5b-a43589af4119" containerName="glance-httpd" Mar 11 01:20:00 crc kubenswrapper[4744]: I0311 01:20:00.149019 4744 memory_manager.go:354] "RemoveStaleState removing state" podUID="4767cbee-21c4-4deb-871a-9c6169f5741d" containerName="glance-httpd" Mar 11 01:20:00 crc kubenswrapper[4744]: I0311 01:20:00.149035 4744 memory_manager.go:354] 
"RemoveStaleState removing state" podUID="23fcfdba-12bc-4a94-94cd-fb703f2e632c" containerName="nova-api-log" Mar 11 01:20:00 crc kubenswrapper[4744]: I0311 01:20:00.149058 4744 memory_manager.go:354] "RemoveStaleState removing state" podUID="03b55bf1-d3fe-4ba9-bc5b-3ac4cf87e4f8" containerName="registry-server" Mar 11 01:20:00 crc kubenswrapper[4744]: I0311 01:20:00.149087 4744 memory_manager.go:354] "RemoveStaleState removing state" podUID="33c0b5dd-192a-4fd2-bbaa-b483399724df" containerName="proxy-httpd" Mar 11 01:20:00 crc kubenswrapper[4744]: I0311 01:20:00.149114 4744 memory_manager.go:354] "RemoveStaleState removing state" podUID="05c279dc-d915-4688-b2c2-c43ff96ad81c" containerName="nova-cell1-conductor-conductor" Mar 11 01:20:00 crc kubenswrapper[4744]: I0311 01:20:00.149135 4744 memory_manager.go:354] "RemoveStaleState removing state" podUID="7f3aa5cc-eae2-4d60-96cf-6d847a5599ec" containerName="mariadb-account-create-update" Mar 11 01:20:00 crc kubenswrapper[4744]: I0311 01:20:00.149150 4744 memory_manager.go:354] "RemoveStaleState removing state" podUID="4f48c5b8-9cda-4c2c-9244-9bb71b9dc05f" containerName="ovn-northd" Mar 11 01:20:00 crc kubenswrapper[4744]: I0311 01:20:00.150185 4744 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29553200-7v85t" Mar 11 01:20:00 crc kubenswrapper[4744]: I0311 01:20:00.156506 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-rs77s" Mar 11 01:20:00 crc kubenswrapper[4744]: I0311 01:20:00.156600 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 11 01:20:00 crc kubenswrapper[4744]: I0311 01:20:00.157121 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 11 01:20:00 crc kubenswrapper[4744]: I0311 01:20:00.167563 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29553200-7v85t"] Mar 11 01:20:00 crc kubenswrapper[4744]: I0311 01:20:00.246160 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s47vz\" (UniqueName: \"kubernetes.io/projected/c5c9f281-5118-4349-bd48-43287ffb8059-kube-api-access-s47vz\") pod \"auto-csr-approver-29553200-7v85t\" (UID: \"c5c9f281-5118-4349-bd48-43287ffb8059\") " pod="openshift-infra/auto-csr-approver-29553200-7v85t" Mar 11 01:20:00 crc kubenswrapper[4744]: I0311 01:20:00.270642 4744 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-mw9fv" podUID="e910960b-a434-4830-b4be-96571fa4dd54" containerName="registry-server" containerID="cri-o://efa0f7e22d4db8192c397671ab3de9108246f3496641eb434b36d2d1b72c1dfb" gracePeriod=2 Mar 11 01:20:00 crc kubenswrapper[4744]: I0311 01:20:00.347925 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s47vz\" (UniqueName: \"kubernetes.io/projected/c5c9f281-5118-4349-bd48-43287ffb8059-kube-api-access-s47vz\") pod \"auto-csr-approver-29553200-7v85t\" (UID: \"c5c9f281-5118-4349-bd48-43287ffb8059\") " 
pod="openshift-infra/auto-csr-approver-29553200-7v85t" Mar 11 01:20:00 crc kubenswrapper[4744]: I0311 01:20:00.396989 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s47vz\" (UniqueName: \"kubernetes.io/projected/c5c9f281-5118-4349-bd48-43287ffb8059-kube-api-access-s47vz\") pod \"auto-csr-approver-29553200-7v85t\" (UID: \"c5c9f281-5118-4349-bd48-43287ffb8059\") " pod="openshift-infra/auto-csr-approver-29553200-7v85t" Mar 11 01:20:00 crc kubenswrapper[4744]: I0311 01:20:00.480774 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29553200-7v85t" Mar 11 01:20:00 crc kubenswrapper[4744]: I0311 01:20:00.848782 4744 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-mw9fv" Mar 11 01:20:00 crc kubenswrapper[4744]: I0311 01:20:00.962192 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4n27k\" (UniqueName: \"kubernetes.io/projected/e910960b-a434-4830-b4be-96571fa4dd54-kube-api-access-4n27k\") pod \"e910960b-a434-4830-b4be-96571fa4dd54\" (UID: \"e910960b-a434-4830-b4be-96571fa4dd54\") " Mar 11 01:20:00 crc kubenswrapper[4744]: I0311 01:20:00.962278 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e910960b-a434-4830-b4be-96571fa4dd54-catalog-content\") pod \"e910960b-a434-4830-b4be-96571fa4dd54\" (UID: \"e910960b-a434-4830-b4be-96571fa4dd54\") " Mar 11 01:20:00 crc kubenswrapper[4744]: I0311 01:20:00.962467 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e910960b-a434-4830-b4be-96571fa4dd54-utilities\") pod \"e910960b-a434-4830-b4be-96571fa4dd54\" (UID: \"e910960b-a434-4830-b4be-96571fa4dd54\") " Mar 11 01:20:00 crc kubenswrapper[4744]: I0311 01:20:00.964681 4744 
operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e910960b-a434-4830-b4be-96571fa4dd54-utilities" (OuterVolumeSpecName: "utilities") pod "e910960b-a434-4830-b4be-96571fa4dd54" (UID: "e910960b-a434-4830-b4be-96571fa4dd54"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 11 01:20:00 crc kubenswrapper[4744]: I0311 01:20:00.977177 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e910960b-a434-4830-b4be-96571fa4dd54-kube-api-access-4n27k" (OuterVolumeSpecName: "kube-api-access-4n27k") pod "e910960b-a434-4830-b4be-96571fa4dd54" (UID: "e910960b-a434-4830-b4be-96571fa4dd54"). InnerVolumeSpecName "kube-api-access-4n27k". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 01:20:00 crc kubenswrapper[4744]: I0311 01:20:00.993907 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29553200-7v85t"] Mar 11 01:20:01 crc kubenswrapper[4744]: I0311 01:20:01.008485 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e910960b-a434-4830-b4be-96571fa4dd54-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "e910960b-a434-4830-b4be-96571fa4dd54" (UID: "e910960b-a434-4830-b4be-96571fa4dd54"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 11 01:20:01 crc kubenswrapper[4744]: I0311 01:20:01.065402 4744 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e910960b-a434-4830-b4be-96571fa4dd54-utilities\") on node \"crc\" DevicePath \"\"" Mar 11 01:20:01 crc kubenswrapper[4744]: I0311 01:20:01.065447 4744 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4n27k\" (UniqueName: \"kubernetes.io/projected/e910960b-a434-4830-b4be-96571fa4dd54-kube-api-access-4n27k\") on node \"crc\" DevicePath \"\"" Mar 11 01:20:01 crc kubenswrapper[4744]: I0311 01:20:01.065466 4744 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e910960b-a434-4830-b4be-96571fa4dd54-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 11 01:20:01 crc kubenswrapper[4744]: I0311 01:20:01.284365 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29553200-7v85t" event={"ID":"c5c9f281-5118-4349-bd48-43287ffb8059","Type":"ContainerStarted","Data":"37fa35ce862ac0333998240a0964a86c67d97222054bd197c3ea1b96750d686c"} Mar 11 01:20:01 crc kubenswrapper[4744]: I0311 01:20:01.288094 4744 generic.go:334] "Generic (PLEG): container finished" podID="e910960b-a434-4830-b4be-96571fa4dd54" containerID="efa0f7e22d4db8192c397671ab3de9108246f3496641eb434b36d2d1b72c1dfb" exitCode=0 Mar 11 01:20:01 crc kubenswrapper[4744]: I0311 01:20:01.288160 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-mw9fv" event={"ID":"e910960b-a434-4830-b4be-96571fa4dd54","Type":"ContainerDied","Data":"efa0f7e22d4db8192c397671ab3de9108246f3496641eb434b36d2d1b72c1dfb"} Mar 11 01:20:01 crc kubenswrapper[4744]: I0311 01:20:01.288203 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-mw9fv" 
event={"ID":"e910960b-a434-4830-b4be-96571fa4dd54","Type":"ContainerDied","Data":"b9b29139c92bbef3351895f00577cbdfe29d8ac1a349dba92cd6e5c004a53080"} Mar 11 01:20:01 crc kubenswrapper[4744]: I0311 01:20:01.288233 4744 scope.go:117] "RemoveContainer" containerID="efa0f7e22d4db8192c397671ab3de9108246f3496641eb434b36d2d1b72c1dfb" Mar 11 01:20:01 crc kubenswrapper[4744]: I0311 01:20:01.288378 4744 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-mw9fv" Mar 11 01:20:01 crc kubenswrapper[4744]: I0311 01:20:01.348822 4744 scope.go:117] "RemoveContainer" containerID="ade6821c0859f0f387be46b25964041ce990fd43c109b143af360cbbc7e5576b" Mar 11 01:20:01 crc kubenswrapper[4744]: I0311 01:20:01.348994 4744 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-mw9fv"] Mar 11 01:20:01 crc kubenswrapper[4744]: I0311 01:20:01.358619 4744 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-mw9fv"] Mar 11 01:20:01 crc kubenswrapper[4744]: I0311 01:20:01.379183 4744 scope.go:117] "RemoveContainer" containerID="c5f596f841b9f26617f30aeb9ccbde51b078a88e906c112d087c0809704f0274" Mar 11 01:20:01 crc kubenswrapper[4744]: I0311 01:20:01.413036 4744 scope.go:117] "RemoveContainer" containerID="efa0f7e22d4db8192c397671ab3de9108246f3496641eb434b36d2d1b72c1dfb" Mar 11 01:20:01 crc kubenswrapper[4744]: E0311 01:20:01.414442 4744 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"efa0f7e22d4db8192c397671ab3de9108246f3496641eb434b36d2d1b72c1dfb\": container with ID starting with efa0f7e22d4db8192c397671ab3de9108246f3496641eb434b36d2d1b72c1dfb not found: ID does not exist" containerID="efa0f7e22d4db8192c397671ab3de9108246f3496641eb434b36d2d1b72c1dfb" Mar 11 01:20:01 crc kubenswrapper[4744]: I0311 01:20:01.414504 4744 pod_container_deletor.go:53] "DeleteContainer returned 
error" containerID={"Type":"cri-o","ID":"efa0f7e22d4db8192c397671ab3de9108246f3496641eb434b36d2d1b72c1dfb"} err="failed to get container status \"efa0f7e22d4db8192c397671ab3de9108246f3496641eb434b36d2d1b72c1dfb\": rpc error: code = NotFound desc = could not find container \"efa0f7e22d4db8192c397671ab3de9108246f3496641eb434b36d2d1b72c1dfb\": container with ID starting with efa0f7e22d4db8192c397671ab3de9108246f3496641eb434b36d2d1b72c1dfb not found: ID does not exist" Mar 11 01:20:01 crc kubenswrapper[4744]: I0311 01:20:01.414575 4744 scope.go:117] "RemoveContainer" containerID="ade6821c0859f0f387be46b25964041ce990fd43c109b143af360cbbc7e5576b" Mar 11 01:20:01 crc kubenswrapper[4744]: E0311 01:20:01.415475 4744 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ade6821c0859f0f387be46b25964041ce990fd43c109b143af360cbbc7e5576b\": container with ID starting with ade6821c0859f0f387be46b25964041ce990fd43c109b143af360cbbc7e5576b not found: ID does not exist" containerID="ade6821c0859f0f387be46b25964041ce990fd43c109b143af360cbbc7e5576b" Mar 11 01:20:01 crc kubenswrapper[4744]: I0311 01:20:01.415578 4744 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ade6821c0859f0f387be46b25964041ce990fd43c109b143af360cbbc7e5576b"} err="failed to get container status \"ade6821c0859f0f387be46b25964041ce990fd43c109b143af360cbbc7e5576b\": rpc error: code = NotFound desc = could not find container \"ade6821c0859f0f387be46b25964041ce990fd43c109b143af360cbbc7e5576b\": container with ID starting with ade6821c0859f0f387be46b25964041ce990fd43c109b143af360cbbc7e5576b not found: ID does not exist" Mar 11 01:20:01 crc kubenswrapper[4744]: I0311 01:20:01.415617 4744 scope.go:117] "RemoveContainer" containerID="c5f596f841b9f26617f30aeb9ccbde51b078a88e906c112d087c0809704f0274" Mar 11 01:20:01 crc kubenswrapper[4744]: E0311 01:20:01.416354 4744 log.go:32] "ContainerStatus from runtime 
service failed" err="rpc error: code = NotFound desc = could not find container \"c5f596f841b9f26617f30aeb9ccbde51b078a88e906c112d087c0809704f0274\": container with ID starting with c5f596f841b9f26617f30aeb9ccbde51b078a88e906c112d087c0809704f0274 not found: ID does not exist" containerID="c5f596f841b9f26617f30aeb9ccbde51b078a88e906c112d087c0809704f0274" Mar 11 01:20:01 crc kubenswrapper[4744]: I0311 01:20:01.416395 4744 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c5f596f841b9f26617f30aeb9ccbde51b078a88e906c112d087c0809704f0274"} err="failed to get container status \"c5f596f841b9f26617f30aeb9ccbde51b078a88e906c112d087c0809704f0274\": rpc error: code = NotFound desc = could not find container \"c5f596f841b9f26617f30aeb9ccbde51b078a88e906c112d087c0809704f0274\": container with ID starting with c5f596f841b9f26617f30aeb9ccbde51b078a88e906c112d087c0809704f0274 not found: ID does not exist" Mar 11 01:20:01 crc kubenswrapper[4744]: E0311 01:20:01.972192 4744 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 376ac00d3a5176903cb096caf4aa6fb34bc54b52bfdfcffda0f5db681e50ec05 is running failed: container process not found" containerID="376ac00d3a5176903cb096caf4aa6fb34bc54b52bfdfcffda0f5db681e50ec05" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Mar 11 01:20:01 crc kubenswrapper[4744]: E0311 01:20:01.973824 4744 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 376ac00d3a5176903cb096caf4aa6fb34bc54b52bfdfcffda0f5db681e50ec05 is running failed: container process not found" containerID="376ac00d3a5176903cb096caf4aa6fb34bc54b52bfdfcffda0f5db681e50ec05" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Mar 11 01:20:01 crc kubenswrapper[4744]: E0311 01:20:01.974017 4744 log.go:32] "ExecSync cmd from 
runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="acfad524609d9f297101d39eb6bbc89088706371a3d119c57c6f3e38150b4144" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Mar 11 01:20:01 crc kubenswrapper[4744]: E0311 01:20:01.975211 4744 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 376ac00d3a5176903cb096caf4aa6fb34bc54b52bfdfcffda0f5db681e50ec05 is running failed: container process not found" containerID="376ac00d3a5176903cb096caf4aa6fb34bc54b52bfdfcffda0f5db681e50ec05" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Mar 11 01:20:01 crc kubenswrapper[4744]: E0311 01:20:01.975264 4744 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 376ac00d3a5176903cb096caf4aa6fb34bc54b52bfdfcffda0f5db681e50ec05 is running failed: container process not found" probeType="Readiness" pod="openstack/ovn-controller-ovs-88ffp" podUID="1ee48ea6-67ea-4da3-af92-82b9d0e5b67d" containerName="ovsdb-server" Mar 11 01:20:01 crc kubenswrapper[4744]: E0311 01:20:01.980253 4744 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="acfad524609d9f297101d39eb6bbc89088706371a3d119c57c6f3e38150b4144" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Mar 11 01:20:01 crc kubenswrapper[4744]: E0311 01:20:01.985209 4744 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="acfad524609d9f297101d39eb6bbc89088706371a3d119c57c6f3e38150b4144" 
cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Mar 11 01:20:01 crc kubenswrapper[4744]: E0311 01:20:01.985254 4744 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/ovn-controller-ovs-88ffp" podUID="1ee48ea6-67ea-4da3-af92-82b9d0e5b67d" containerName="ovs-vswitchd" Mar 11 01:20:01 crc kubenswrapper[4744]: I0311 01:20:01.992287 4744 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e910960b-a434-4830-b4be-96571fa4dd54" path="/var/lib/kubelet/pods/e910960b-a434-4830-b4be-96571fa4dd54/volumes" Mar 11 01:20:03 crc kubenswrapper[4744]: I0311 01:20:03.315026 4744 generic.go:334] "Generic (PLEG): container finished" podID="c5c9f281-5118-4349-bd48-43287ffb8059" containerID="ff841945ddeea975698cf30b38f5d17e3d5bfd015350bd4a068055055d217192" exitCode=0 Mar 11 01:20:03 crc kubenswrapper[4744]: I0311 01:20:03.315278 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29553200-7v85t" event={"ID":"c5c9f281-5118-4349-bd48-43287ffb8059","Type":"ContainerDied","Data":"ff841945ddeea975698cf30b38f5d17e3d5bfd015350bd4a068055055d217192"} Mar 11 01:20:04 crc kubenswrapper[4744]: I0311 01:20:04.856187 4744 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29553200-7v85t" Mar 11 01:20:05 crc kubenswrapper[4744]: I0311 01:20:05.039567 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s47vz\" (UniqueName: \"kubernetes.io/projected/c5c9f281-5118-4349-bd48-43287ffb8059-kube-api-access-s47vz\") pod \"c5c9f281-5118-4349-bd48-43287ffb8059\" (UID: \"c5c9f281-5118-4349-bd48-43287ffb8059\") " Mar 11 01:20:05 crc kubenswrapper[4744]: I0311 01:20:05.056748 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c5c9f281-5118-4349-bd48-43287ffb8059-kube-api-access-s47vz" (OuterVolumeSpecName: "kube-api-access-s47vz") pod "c5c9f281-5118-4349-bd48-43287ffb8059" (UID: "c5c9f281-5118-4349-bd48-43287ffb8059"). InnerVolumeSpecName "kube-api-access-s47vz". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 01:20:05 crc kubenswrapper[4744]: I0311 01:20:05.141872 4744 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s47vz\" (UniqueName: \"kubernetes.io/projected/c5c9f281-5118-4349-bd48-43287ffb8059-kube-api-access-s47vz\") on node \"crc\" DevicePath \"\"" Mar 11 01:20:05 crc kubenswrapper[4744]: I0311 01:20:05.337967 4744 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29553200-7v85t" Mar 11 01:20:05 crc kubenswrapper[4744]: I0311 01:20:05.337974 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29553200-7v85t" event={"ID":"c5c9f281-5118-4349-bd48-43287ffb8059","Type":"ContainerDied","Data":"37fa35ce862ac0333998240a0964a86c67d97222054bd197c3ea1b96750d686c"} Mar 11 01:20:05 crc kubenswrapper[4744]: I0311 01:20:05.338014 4744 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="37fa35ce862ac0333998240a0964a86c67d97222054bd197c3ea1b96750d686c" Mar 11 01:20:05 crc kubenswrapper[4744]: I0311 01:20:05.340248 4744 generic.go:334] "Generic (PLEG): container finished" podID="cb4eb051-94b3-42d1-87ff-669ad8251b4f" containerID="b1b1e7e9a3f9e195c5c8ffc0f9ba222b2dd152ff67cad9587e2f46e9f7c8f240" exitCode=0 Mar 11 01:20:05 crc kubenswrapper[4744]: I0311 01:20:05.340288 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-5959cf6645-bcjjf" event={"ID":"cb4eb051-94b3-42d1-87ff-669ad8251b4f","Type":"ContainerDied","Data":"b1b1e7e9a3f9e195c5c8ffc0f9ba222b2dd152ff67cad9587e2f46e9f7c8f240"} Mar 11 01:20:05 crc kubenswrapper[4744]: I0311 01:20:05.340312 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-5959cf6645-bcjjf" event={"ID":"cb4eb051-94b3-42d1-87ff-669ad8251b4f","Type":"ContainerDied","Data":"f9918c070f6e96f38876d180b01dc9c8faada162d9c533b05275555037db32b5"} Mar 11 01:20:05 crc kubenswrapper[4744]: I0311 01:20:05.340326 4744 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f9918c070f6e96f38876d180b01dc9c8faada162d9c533b05275555037db32b5" Mar 11 01:20:05 crc kubenswrapper[4744]: I0311 01:20:05.348469 4744 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-5959cf6645-bcjjf" Mar 11 01:20:05 crc kubenswrapper[4744]: I0311 01:20:05.547361 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/cb4eb051-94b3-42d1-87ff-669ad8251b4f-config\") pod \"cb4eb051-94b3-42d1-87ff-669ad8251b4f\" (UID: \"cb4eb051-94b3-42d1-87ff-669ad8251b4f\") " Mar 11 01:20:05 crc kubenswrapper[4744]: I0311 01:20:05.547422 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/cb4eb051-94b3-42d1-87ff-669ad8251b4f-httpd-config\") pod \"cb4eb051-94b3-42d1-87ff-669ad8251b4f\" (UID: \"cb4eb051-94b3-42d1-87ff-669ad8251b4f\") " Mar 11 01:20:05 crc kubenswrapper[4744]: I0311 01:20:05.547466 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4tjbr\" (UniqueName: \"kubernetes.io/projected/cb4eb051-94b3-42d1-87ff-669ad8251b4f-kube-api-access-4tjbr\") pod \"cb4eb051-94b3-42d1-87ff-669ad8251b4f\" (UID: \"cb4eb051-94b3-42d1-87ff-669ad8251b4f\") " Mar 11 01:20:05 crc kubenswrapper[4744]: I0311 01:20:05.547588 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cb4eb051-94b3-42d1-87ff-669ad8251b4f-combined-ca-bundle\") pod \"cb4eb051-94b3-42d1-87ff-669ad8251b4f\" (UID: \"cb4eb051-94b3-42d1-87ff-669ad8251b4f\") " Mar 11 01:20:05 crc kubenswrapper[4744]: I0311 01:20:05.547676 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/cb4eb051-94b3-42d1-87ff-669ad8251b4f-public-tls-certs\") pod \"cb4eb051-94b3-42d1-87ff-669ad8251b4f\" (UID: \"cb4eb051-94b3-42d1-87ff-669ad8251b4f\") " Mar 11 01:20:05 crc kubenswrapper[4744]: I0311 01:20:05.547710 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovndb-tls-certs\" 
(UniqueName: \"kubernetes.io/secret/cb4eb051-94b3-42d1-87ff-669ad8251b4f-ovndb-tls-certs\") pod \"cb4eb051-94b3-42d1-87ff-669ad8251b4f\" (UID: \"cb4eb051-94b3-42d1-87ff-669ad8251b4f\") " Mar 11 01:20:05 crc kubenswrapper[4744]: I0311 01:20:05.547752 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/cb4eb051-94b3-42d1-87ff-669ad8251b4f-internal-tls-certs\") pod \"cb4eb051-94b3-42d1-87ff-669ad8251b4f\" (UID: \"cb4eb051-94b3-42d1-87ff-669ad8251b4f\") " Mar 11 01:20:05 crc kubenswrapper[4744]: I0311 01:20:05.552865 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cb4eb051-94b3-42d1-87ff-669ad8251b4f-httpd-config" (OuterVolumeSpecName: "httpd-config") pod "cb4eb051-94b3-42d1-87ff-669ad8251b4f" (UID: "cb4eb051-94b3-42d1-87ff-669ad8251b4f"). InnerVolumeSpecName "httpd-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 01:20:05 crc kubenswrapper[4744]: I0311 01:20:05.553926 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cb4eb051-94b3-42d1-87ff-669ad8251b4f-kube-api-access-4tjbr" (OuterVolumeSpecName: "kube-api-access-4tjbr") pod "cb4eb051-94b3-42d1-87ff-669ad8251b4f" (UID: "cb4eb051-94b3-42d1-87ff-669ad8251b4f"). InnerVolumeSpecName "kube-api-access-4tjbr". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 01:20:05 crc kubenswrapper[4744]: I0311 01:20:05.592654 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cb4eb051-94b3-42d1-87ff-669ad8251b4f-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "cb4eb051-94b3-42d1-87ff-669ad8251b4f" (UID: "cb4eb051-94b3-42d1-87ff-669ad8251b4f"). InnerVolumeSpecName "public-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 01:20:05 crc kubenswrapper[4744]: I0311 01:20:05.601493 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cb4eb051-94b3-42d1-87ff-669ad8251b4f-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "cb4eb051-94b3-42d1-87ff-669ad8251b4f" (UID: "cb4eb051-94b3-42d1-87ff-669ad8251b4f"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 01:20:05 crc kubenswrapper[4744]: I0311 01:20:05.619224 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cb4eb051-94b3-42d1-87ff-669ad8251b4f-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "cb4eb051-94b3-42d1-87ff-669ad8251b4f" (UID: "cb4eb051-94b3-42d1-87ff-669ad8251b4f"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 01:20:05 crc kubenswrapper[4744]: I0311 01:20:05.624865 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cb4eb051-94b3-42d1-87ff-669ad8251b4f-config" (OuterVolumeSpecName: "config") pod "cb4eb051-94b3-42d1-87ff-669ad8251b4f" (UID: "cb4eb051-94b3-42d1-87ff-669ad8251b4f"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 01:20:05 crc kubenswrapper[4744]: I0311 01:20:05.649777 4744 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cb4eb051-94b3-42d1-87ff-669ad8251b4f-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 11 01:20:05 crc kubenswrapper[4744]: I0311 01:20:05.650058 4744 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/cb4eb051-94b3-42d1-87ff-669ad8251b4f-public-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 11 01:20:05 crc kubenswrapper[4744]: I0311 01:20:05.650182 4744 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/cb4eb051-94b3-42d1-87ff-669ad8251b4f-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 11 01:20:05 crc kubenswrapper[4744]: I0311 01:20:05.650296 4744 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/cb4eb051-94b3-42d1-87ff-669ad8251b4f-config\") on node \"crc\" DevicePath \"\"" Mar 11 01:20:05 crc kubenswrapper[4744]: I0311 01:20:05.650412 4744 reconciler_common.go:293] "Volume detached for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/cb4eb051-94b3-42d1-87ff-669ad8251b4f-httpd-config\") on node \"crc\" DevicePath \"\"" Mar 11 01:20:05 crc kubenswrapper[4744]: I0311 01:20:05.650578 4744 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4tjbr\" (UniqueName: \"kubernetes.io/projected/cb4eb051-94b3-42d1-87ff-669ad8251b4f-kube-api-access-4tjbr\") on node \"crc\" DevicePath \"\"" Mar 11 01:20:05 crc kubenswrapper[4744]: I0311 01:20:05.670954 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cb4eb051-94b3-42d1-87ff-669ad8251b4f-ovndb-tls-certs" (OuterVolumeSpecName: "ovndb-tls-certs") pod "cb4eb051-94b3-42d1-87ff-669ad8251b4f" (UID: 
"cb4eb051-94b3-42d1-87ff-669ad8251b4f"). InnerVolumeSpecName "ovndb-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 01:20:05 crc kubenswrapper[4744]: I0311 01:20:05.752170 4744 reconciler_common.go:293] "Volume detached for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/cb4eb051-94b3-42d1-87ff-669ad8251b4f-ovndb-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 11 01:20:05 crc kubenswrapper[4744]: I0311 01:20:05.940070 4744 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29553194-5pbl5"] Mar 11 01:20:05 crc kubenswrapper[4744]: I0311 01:20:05.949740 4744 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29553194-5pbl5"] Mar 11 01:20:05 crc kubenswrapper[4744]: I0311 01:20:05.991554 4744 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="754f2936-6e2b-47b4-85c8-3c0e9db82b51" path="/var/lib/kubelet/pods/754f2936-6e2b-47b4-85c8-3c0e9db82b51/volumes" Mar 11 01:20:06 crc kubenswrapper[4744]: I0311 01:20:06.352441 4744 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-5959cf6645-bcjjf" Mar 11 01:20:06 crc kubenswrapper[4744]: I0311 01:20:06.392083 4744 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-5959cf6645-bcjjf"] Mar 11 01:20:06 crc kubenswrapper[4744]: I0311 01:20:06.401596 4744 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-5959cf6645-bcjjf"] Mar 11 01:20:06 crc kubenswrapper[4744]: E0311 01:20:06.972131 4744 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 376ac00d3a5176903cb096caf4aa6fb34bc54b52bfdfcffda0f5db681e50ec05 is running failed: container process not found" containerID="376ac00d3a5176903cb096caf4aa6fb34bc54b52bfdfcffda0f5db681e50ec05" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Mar 11 01:20:06 crc kubenswrapper[4744]: E0311 01:20:06.972845 4744 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 376ac00d3a5176903cb096caf4aa6fb34bc54b52bfdfcffda0f5db681e50ec05 is running failed: container process not found" containerID="376ac00d3a5176903cb096caf4aa6fb34bc54b52bfdfcffda0f5db681e50ec05" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Mar 11 01:20:06 crc kubenswrapper[4744]: E0311 01:20:06.973342 4744 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 376ac00d3a5176903cb096caf4aa6fb34bc54b52bfdfcffda0f5db681e50ec05 is running failed: container process not found" containerID="376ac00d3a5176903cb096caf4aa6fb34bc54b52bfdfcffda0f5db681e50ec05" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Mar 11 01:20:06 crc kubenswrapper[4744]: E0311 01:20:06.973443 4744 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking 
if PID of 376ac00d3a5176903cb096caf4aa6fb34bc54b52bfdfcffda0f5db681e50ec05 is running failed: container process not found" probeType="Readiness" pod="openstack/ovn-controller-ovs-88ffp" podUID="1ee48ea6-67ea-4da3-af92-82b9d0e5b67d" containerName="ovsdb-server" Mar 11 01:20:06 crc kubenswrapper[4744]: E0311 01:20:06.974414 4744 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="acfad524609d9f297101d39eb6bbc89088706371a3d119c57c6f3e38150b4144" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Mar 11 01:20:06 crc kubenswrapper[4744]: E0311 01:20:06.976736 4744 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="acfad524609d9f297101d39eb6bbc89088706371a3d119c57c6f3e38150b4144" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Mar 11 01:20:06 crc kubenswrapper[4744]: E0311 01:20:06.980255 4744 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="acfad524609d9f297101d39eb6bbc89088706371a3d119c57c6f3e38150b4144" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Mar 11 01:20:06 crc kubenswrapper[4744]: E0311 01:20:06.980337 4744 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/ovn-controller-ovs-88ffp" podUID="1ee48ea6-67ea-4da3-af92-82b9d0e5b67d" containerName="ovs-vswitchd" Mar 11 01:20:07 crc kubenswrapper[4744]: I0311 01:20:07.992694 4744 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="cb4eb051-94b3-42d1-87ff-669ad8251b4f" path="/var/lib/kubelet/pods/cb4eb051-94b3-42d1-87ff-669ad8251b4f/volumes" Mar 11 01:20:08 crc kubenswrapper[4744]: I0311 01:20:08.060416 4744 scope.go:117] "RemoveContainer" containerID="5411455bceeecea05cde5ae8ff02a93d3a0c41fb95c9dd82603494c2dfeb96a3" Mar 11 01:20:11 crc kubenswrapper[4744]: E0311 01:20:11.972448 4744 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 376ac00d3a5176903cb096caf4aa6fb34bc54b52bfdfcffda0f5db681e50ec05 is running failed: container process not found" containerID="376ac00d3a5176903cb096caf4aa6fb34bc54b52bfdfcffda0f5db681e50ec05" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Mar 11 01:20:11 crc kubenswrapper[4744]: E0311 01:20:11.973670 4744 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 376ac00d3a5176903cb096caf4aa6fb34bc54b52bfdfcffda0f5db681e50ec05 is running failed: container process not found" containerID="376ac00d3a5176903cb096caf4aa6fb34bc54b52bfdfcffda0f5db681e50ec05" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Mar 11 01:20:11 crc kubenswrapper[4744]: E0311 01:20:11.974081 4744 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="acfad524609d9f297101d39eb6bbc89088706371a3d119c57c6f3e38150b4144" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Mar 11 01:20:11 crc kubenswrapper[4744]: E0311 01:20:11.974402 4744 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 376ac00d3a5176903cb096caf4aa6fb34bc54b52bfdfcffda0f5db681e50ec05 is running failed: container process not found" 
containerID="376ac00d3a5176903cb096caf4aa6fb34bc54b52bfdfcffda0f5db681e50ec05" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Mar 11 01:20:11 crc kubenswrapper[4744]: E0311 01:20:11.974478 4744 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 376ac00d3a5176903cb096caf4aa6fb34bc54b52bfdfcffda0f5db681e50ec05 is running failed: container process not found" probeType="Readiness" pod="openstack/ovn-controller-ovs-88ffp" podUID="1ee48ea6-67ea-4da3-af92-82b9d0e5b67d" containerName="ovsdb-server" Mar 11 01:20:11 crc kubenswrapper[4744]: E0311 01:20:11.976398 4744 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="acfad524609d9f297101d39eb6bbc89088706371a3d119c57c6f3e38150b4144" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Mar 11 01:20:11 crc kubenswrapper[4744]: E0311 01:20:11.980777 4744 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="acfad524609d9f297101d39eb6bbc89088706371a3d119c57c6f3e38150b4144" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Mar 11 01:20:11 crc kubenswrapper[4744]: E0311 01:20:11.980912 4744 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/ovn-controller-ovs-88ffp" podUID="1ee48ea6-67ea-4da3-af92-82b9d0e5b67d" containerName="ovs-vswitchd" Mar 11 01:20:12 crc kubenswrapper[4744]: I0311 01:20:12.409607 4744 patch_prober.go:28] interesting pod/machine-config-daemon-678nx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe 
status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 11 01:20:12 crc kubenswrapper[4744]: I0311 01:20:12.410112 4744 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-678nx" podUID="a15dc7ac-7c34-4135-b6eb-a85122800ce9" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 11 01:20:12 crc kubenswrapper[4744]: I0311 01:20:12.410184 4744 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-678nx" Mar 11 01:20:12 crc kubenswrapper[4744]: I0311 01:20:12.411248 4744 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"0473256ff02bdf69162deacfefe5387c093ffe2e643f7878e13771cc69dd585b"} pod="openshift-machine-config-operator/machine-config-daemon-678nx" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 11 01:20:12 crc kubenswrapper[4744]: I0311 01:20:12.411383 4744 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-678nx" podUID="a15dc7ac-7c34-4135-b6eb-a85122800ce9" containerName="machine-config-daemon" containerID="cri-o://0473256ff02bdf69162deacfefe5387c093ffe2e643f7878e13771cc69dd585b" gracePeriod=600 Mar 11 01:20:12 crc kubenswrapper[4744]: E0311 01:20:12.547250 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-678nx_openshift-machine-config-operator(a15dc7ac-7c34-4135-b6eb-a85122800ce9)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-678nx" podUID="a15dc7ac-7c34-4135-b6eb-a85122800ce9" Mar 11 01:20:13 crc kubenswrapper[4744]: I0311 01:20:13.448731 4744 generic.go:334] "Generic (PLEG): container finished" podID="a15dc7ac-7c34-4135-b6eb-a85122800ce9" containerID="0473256ff02bdf69162deacfefe5387c093ffe2e643f7878e13771cc69dd585b" exitCode=0 Mar 11 01:20:13 crc kubenswrapper[4744]: I0311 01:20:13.448819 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-678nx" event={"ID":"a15dc7ac-7c34-4135-b6eb-a85122800ce9","Type":"ContainerDied","Data":"0473256ff02bdf69162deacfefe5387c093ffe2e643f7878e13771cc69dd585b"} Mar 11 01:20:13 crc kubenswrapper[4744]: I0311 01:20:13.449300 4744 scope.go:117] "RemoveContainer" containerID="cf996016ced3f16e6107f678cce67e4c982c8fa30c807453262d53b1c072f436" Mar 11 01:20:13 crc kubenswrapper[4744]: I0311 01:20:13.450113 4744 scope.go:117] "RemoveContainer" containerID="0473256ff02bdf69162deacfefe5387c093ffe2e643f7878e13771cc69dd585b" Mar 11 01:20:13 crc kubenswrapper[4744]: E0311 01:20:13.450486 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-678nx_openshift-machine-config-operator(a15dc7ac-7c34-4135-b6eb-a85122800ce9)\"" pod="openshift-machine-config-operator/machine-config-daemon-678nx" podUID="a15dc7ac-7c34-4135-b6eb-a85122800ce9" Mar 11 01:20:16 crc kubenswrapper[4744]: E0311 01:20:16.972989 4744 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 376ac00d3a5176903cb096caf4aa6fb34bc54b52bfdfcffda0f5db681e50ec05 is running failed: container process not found" containerID="376ac00d3a5176903cb096caf4aa6fb34bc54b52bfdfcffda0f5db681e50ec05" 
cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Mar 11 01:20:16 crc kubenswrapper[4744]: E0311 01:20:16.973942 4744 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 376ac00d3a5176903cb096caf4aa6fb34bc54b52bfdfcffda0f5db681e50ec05 is running failed: container process not found" containerID="376ac00d3a5176903cb096caf4aa6fb34bc54b52bfdfcffda0f5db681e50ec05" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Mar 11 01:20:16 crc kubenswrapper[4744]: E0311 01:20:16.975880 4744 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 376ac00d3a5176903cb096caf4aa6fb34bc54b52bfdfcffda0f5db681e50ec05 is running failed: container process not found" containerID="376ac00d3a5176903cb096caf4aa6fb34bc54b52bfdfcffda0f5db681e50ec05" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Mar 11 01:20:16 crc kubenswrapper[4744]: E0311 01:20:16.976010 4744 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 376ac00d3a5176903cb096caf4aa6fb34bc54b52bfdfcffda0f5db681e50ec05 is running failed: container process not found" probeType="Readiness" pod="openstack/ovn-controller-ovs-88ffp" podUID="1ee48ea6-67ea-4da3-af92-82b9d0e5b67d" containerName="ovsdb-server" Mar 11 01:20:16 crc kubenswrapper[4744]: E0311 01:20:16.979040 4744 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="acfad524609d9f297101d39eb6bbc89088706371a3d119c57c6f3e38150b4144" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Mar 11 01:20:16 crc kubenswrapper[4744]: E0311 01:20:16.981103 4744 log.go:32] "ExecSync cmd from runtime service failed" err="rpc 
error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="acfad524609d9f297101d39eb6bbc89088706371a3d119c57c6f3e38150b4144" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Mar 11 01:20:16 crc kubenswrapper[4744]: E0311 01:20:16.986095 4744 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="acfad524609d9f297101d39eb6bbc89088706371a3d119c57c6f3e38150b4144" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Mar 11 01:20:16 crc kubenswrapper[4744]: E0311 01:20:16.986207 4744 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/ovn-controller-ovs-88ffp" podUID="1ee48ea6-67ea-4da3-af92-82b9d0e5b67d" containerName="ovs-vswitchd" Mar 11 01:20:19 crc kubenswrapper[4744]: I0311 01:20:19.063203 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-88ffp_1ee48ea6-67ea-4da3-af92-82b9d0e5b67d/ovs-vswitchd/0.log" Mar 11 01:20:19 crc kubenswrapper[4744]: I0311 01:20:19.065287 4744 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-ovs-88ffp" Mar 11 01:20:19 crc kubenswrapper[4744]: I0311 01:20:19.267580 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wtmgl\" (UniqueName: \"kubernetes.io/projected/1ee48ea6-67ea-4da3-af92-82b9d0e5b67d-kube-api-access-wtmgl\") pod \"1ee48ea6-67ea-4da3-af92-82b9d0e5b67d\" (UID: \"1ee48ea6-67ea-4da3-af92-82b9d0e5b67d\") " Mar 11 01:20:19 crc kubenswrapper[4744]: I0311 01:20:19.267659 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/1ee48ea6-67ea-4da3-af92-82b9d0e5b67d-scripts\") pod \"1ee48ea6-67ea-4da3-af92-82b9d0e5b67d\" (UID: \"1ee48ea6-67ea-4da3-af92-82b9d0e5b67d\") " Mar 11 01:20:19 crc kubenswrapper[4744]: I0311 01:20:19.267697 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/1ee48ea6-67ea-4da3-af92-82b9d0e5b67d-etc-ovs\") pod \"1ee48ea6-67ea-4da3-af92-82b9d0e5b67d\" (UID: \"1ee48ea6-67ea-4da3-af92-82b9d0e5b67d\") " Mar 11 01:20:19 crc kubenswrapper[4744]: I0311 01:20:19.267768 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/1ee48ea6-67ea-4da3-af92-82b9d0e5b67d-var-log\") pod \"1ee48ea6-67ea-4da3-af92-82b9d0e5b67d\" (UID: \"1ee48ea6-67ea-4da3-af92-82b9d0e5b67d\") " Mar 11 01:20:19 crc kubenswrapper[4744]: I0311 01:20:19.267825 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/1ee48ea6-67ea-4da3-af92-82b9d0e5b67d-var-run\") pod \"1ee48ea6-67ea-4da3-af92-82b9d0e5b67d\" (UID: \"1ee48ea6-67ea-4da3-af92-82b9d0e5b67d\") " Mar 11 01:20:19 crc kubenswrapper[4744]: I0311 01:20:19.267880 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lib\" (UniqueName: 
\"kubernetes.io/host-path/1ee48ea6-67ea-4da3-af92-82b9d0e5b67d-var-lib\") pod \"1ee48ea6-67ea-4da3-af92-82b9d0e5b67d\" (UID: \"1ee48ea6-67ea-4da3-af92-82b9d0e5b67d\") " Mar 11 01:20:19 crc kubenswrapper[4744]: I0311 01:20:19.268210 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/1ee48ea6-67ea-4da3-af92-82b9d0e5b67d-var-log" (OuterVolumeSpecName: "var-log") pod "1ee48ea6-67ea-4da3-af92-82b9d0e5b67d" (UID: "1ee48ea6-67ea-4da3-af92-82b9d0e5b67d"). InnerVolumeSpecName "var-log". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 11 01:20:19 crc kubenswrapper[4744]: I0311 01:20:19.268277 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/1ee48ea6-67ea-4da3-af92-82b9d0e5b67d-var-run" (OuterVolumeSpecName: "var-run") pod "1ee48ea6-67ea-4da3-af92-82b9d0e5b67d" (UID: "1ee48ea6-67ea-4da3-af92-82b9d0e5b67d"). InnerVolumeSpecName "var-run". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 11 01:20:19 crc kubenswrapper[4744]: I0311 01:20:19.268294 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/1ee48ea6-67ea-4da3-af92-82b9d0e5b67d-var-lib" (OuterVolumeSpecName: "var-lib") pod "1ee48ea6-67ea-4da3-af92-82b9d0e5b67d" (UID: "1ee48ea6-67ea-4da3-af92-82b9d0e5b67d"). InnerVolumeSpecName "var-lib". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 11 01:20:19 crc kubenswrapper[4744]: I0311 01:20:19.268497 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/1ee48ea6-67ea-4da3-af92-82b9d0e5b67d-etc-ovs" (OuterVolumeSpecName: "etc-ovs") pod "1ee48ea6-67ea-4da3-af92-82b9d0e5b67d" (UID: "1ee48ea6-67ea-4da3-af92-82b9d0e5b67d"). InnerVolumeSpecName "etc-ovs". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 11 01:20:19 crc kubenswrapper[4744]: I0311 01:20:19.269101 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1ee48ea6-67ea-4da3-af92-82b9d0e5b67d-scripts" (OuterVolumeSpecName: "scripts") pod "1ee48ea6-67ea-4da3-af92-82b9d0e5b67d" (UID: "1ee48ea6-67ea-4da3-af92-82b9d0e5b67d"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 11 01:20:19 crc kubenswrapper[4744]: I0311 01:20:19.272464 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1ee48ea6-67ea-4da3-af92-82b9d0e5b67d-kube-api-access-wtmgl" (OuterVolumeSpecName: "kube-api-access-wtmgl") pod "1ee48ea6-67ea-4da3-af92-82b9d0e5b67d" (UID: "1ee48ea6-67ea-4da3-af92-82b9d0e5b67d"). InnerVolumeSpecName "kube-api-access-wtmgl". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 01:20:19 crc kubenswrapper[4744]: I0311 01:20:19.369561 4744 reconciler_common.go:293] "Volume detached for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/1ee48ea6-67ea-4da3-af92-82b9d0e5b67d-var-log\") on node \"crc\" DevicePath \"\"" Mar 11 01:20:19 crc kubenswrapper[4744]: I0311 01:20:19.369617 4744 reconciler_common.go:293] "Volume detached for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/1ee48ea6-67ea-4da3-af92-82b9d0e5b67d-var-run\") on node \"crc\" DevicePath \"\"" Mar 11 01:20:19 crc kubenswrapper[4744]: I0311 01:20:19.369635 4744 reconciler_common.go:293] "Volume detached for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/1ee48ea6-67ea-4da3-af92-82b9d0e5b67d-var-lib\") on node \"crc\" DevicePath \"\"" Mar 11 01:20:19 crc kubenswrapper[4744]: I0311 01:20:19.369655 4744 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wtmgl\" (UniqueName: \"kubernetes.io/projected/1ee48ea6-67ea-4da3-af92-82b9d0e5b67d-kube-api-access-wtmgl\") on node \"crc\" DevicePath \"\"" Mar 11 01:20:19 
crc kubenswrapper[4744]: I0311 01:20:19.369675 4744 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/1ee48ea6-67ea-4da3-af92-82b9d0e5b67d-scripts\") on node \"crc\" DevicePath \"\"" Mar 11 01:20:19 crc kubenswrapper[4744]: I0311 01:20:19.369693 4744 reconciler_common.go:293] "Volume detached for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/1ee48ea6-67ea-4da3-af92-82b9d0e5b67d-etc-ovs\") on node \"crc\" DevicePath \"\"" Mar 11 01:20:19 crc kubenswrapper[4744]: I0311 01:20:19.513791 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-88ffp_1ee48ea6-67ea-4da3-af92-82b9d0e5b67d/ovs-vswitchd/0.log" Mar 11 01:20:19 crc kubenswrapper[4744]: I0311 01:20:19.515121 4744 generic.go:334] "Generic (PLEG): container finished" podID="1ee48ea6-67ea-4da3-af92-82b9d0e5b67d" containerID="acfad524609d9f297101d39eb6bbc89088706371a3d119c57c6f3e38150b4144" exitCode=137 Mar 11 01:20:19 crc kubenswrapper[4744]: I0311 01:20:19.515223 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-88ffp" event={"ID":"1ee48ea6-67ea-4da3-af92-82b9d0e5b67d","Type":"ContainerDied","Data":"acfad524609d9f297101d39eb6bbc89088706371a3d119c57c6f3e38150b4144"} Mar 11 01:20:19 crc kubenswrapper[4744]: I0311 01:20:19.515280 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-88ffp" event={"ID":"1ee48ea6-67ea-4da3-af92-82b9d0e5b67d","Type":"ContainerDied","Data":"b4de8d22e0b094669f0815f7f5bada4bfca5e329c693edd03a9580de0e2c958e"} Mar 11 01:20:19 crc kubenswrapper[4744]: I0311 01:20:19.515329 4744 scope.go:117] "RemoveContainer" containerID="acfad524609d9f297101d39eb6bbc89088706371a3d119c57c6f3e38150b4144" Mar 11 01:20:19 crc kubenswrapper[4744]: I0311 01:20:19.515563 4744 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-ovs-88ffp" Mar 11 01:20:19 crc kubenswrapper[4744]: I0311 01:20:19.530169 4744 generic.go:334] "Generic (PLEG): container finished" podID="524fac10-b874-465e-b4aa-221b6c689959" containerID="25e3543d3b3d14862a73ba6ffdaeec7e7e8cb26ca742becd4587bae22c3b8432" exitCode=137 Mar 11 01:20:19 crc kubenswrapper[4744]: I0311 01:20:19.530225 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"524fac10-b874-465e-b4aa-221b6c689959","Type":"ContainerDied","Data":"25e3543d3b3d14862a73ba6ffdaeec7e7e8cb26ca742becd4587bae22c3b8432"} Mar 11 01:20:19 crc kubenswrapper[4744]: I0311 01:20:19.579991 4744 scope.go:117] "RemoveContainer" containerID="376ac00d3a5176903cb096caf4aa6fb34bc54b52bfdfcffda0f5db681e50ec05" Mar 11 01:20:19 crc kubenswrapper[4744]: I0311 01:20:19.580795 4744 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-ovs-88ffp"] Mar 11 01:20:19 crc kubenswrapper[4744]: I0311 01:20:19.596930 4744 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovn-controller-ovs-88ffp"] Mar 11 01:20:19 crc kubenswrapper[4744]: I0311 01:20:19.604987 4744 scope.go:117] "RemoveContainer" containerID="aaa497f2031029fd775e54985005baafae8116b7da7e04e50aa695b21c4f58a6" Mar 11 01:20:19 crc kubenswrapper[4744]: I0311 01:20:19.655545 4744 scope.go:117] "RemoveContainer" containerID="acfad524609d9f297101d39eb6bbc89088706371a3d119c57c6f3e38150b4144" Mar 11 01:20:19 crc kubenswrapper[4744]: E0311 01:20:19.656260 4744 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"acfad524609d9f297101d39eb6bbc89088706371a3d119c57c6f3e38150b4144\": container with ID starting with acfad524609d9f297101d39eb6bbc89088706371a3d119c57c6f3e38150b4144 not found: ID does not exist" containerID="acfad524609d9f297101d39eb6bbc89088706371a3d119c57c6f3e38150b4144" Mar 11 01:20:19 crc kubenswrapper[4744]: I0311 
01:20:19.656323 4744 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"acfad524609d9f297101d39eb6bbc89088706371a3d119c57c6f3e38150b4144"} err="failed to get container status \"acfad524609d9f297101d39eb6bbc89088706371a3d119c57c6f3e38150b4144\": rpc error: code = NotFound desc = could not find container \"acfad524609d9f297101d39eb6bbc89088706371a3d119c57c6f3e38150b4144\": container with ID starting with acfad524609d9f297101d39eb6bbc89088706371a3d119c57c6f3e38150b4144 not found: ID does not exist" Mar 11 01:20:19 crc kubenswrapper[4744]: I0311 01:20:19.656357 4744 scope.go:117] "RemoveContainer" containerID="376ac00d3a5176903cb096caf4aa6fb34bc54b52bfdfcffda0f5db681e50ec05" Mar 11 01:20:19 crc kubenswrapper[4744]: E0311 01:20:19.657299 4744 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"376ac00d3a5176903cb096caf4aa6fb34bc54b52bfdfcffda0f5db681e50ec05\": container with ID starting with 376ac00d3a5176903cb096caf4aa6fb34bc54b52bfdfcffda0f5db681e50ec05 not found: ID does not exist" containerID="376ac00d3a5176903cb096caf4aa6fb34bc54b52bfdfcffda0f5db681e50ec05" Mar 11 01:20:19 crc kubenswrapper[4744]: I0311 01:20:19.657343 4744 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"376ac00d3a5176903cb096caf4aa6fb34bc54b52bfdfcffda0f5db681e50ec05"} err="failed to get container status \"376ac00d3a5176903cb096caf4aa6fb34bc54b52bfdfcffda0f5db681e50ec05\": rpc error: code = NotFound desc = could not find container \"376ac00d3a5176903cb096caf4aa6fb34bc54b52bfdfcffda0f5db681e50ec05\": container with ID starting with 376ac00d3a5176903cb096caf4aa6fb34bc54b52bfdfcffda0f5db681e50ec05 not found: ID does not exist" Mar 11 01:20:19 crc kubenswrapper[4744]: I0311 01:20:19.657371 4744 scope.go:117] "RemoveContainer" containerID="aaa497f2031029fd775e54985005baafae8116b7da7e04e50aa695b21c4f58a6" Mar 11 01:20:19 crc 
kubenswrapper[4744]: E0311 01:20:19.658127 4744 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"aaa497f2031029fd775e54985005baafae8116b7da7e04e50aa695b21c4f58a6\": container with ID starting with aaa497f2031029fd775e54985005baafae8116b7da7e04e50aa695b21c4f58a6 not found: ID does not exist" containerID="aaa497f2031029fd775e54985005baafae8116b7da7e04e50aa695b21c4f58a6" Mar 11 01:20:19 crc kubenswrapper[4744]: I0311 01:20:19.658158 4744 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"aaa497f2031029fd775e54985005baafae8116b7da7e04e50aa695b21c4f58a6"} err="failed to get container status \"aaa497f2031029fd775e54985005baafae8116b7da7e04e50aa695b21c4f58a6\": rpc error: code = NotFound desc = could not find container \"aaa497f2031029fd775e54985005baafae8116b7da7e04e50aa695b21c4f58a6\": container with ID starting with aaa497f2031029fd775e54985005baafae8116b7da7e04e50aa695b21c4f58a6 not found: ID does not exist" Mar 11 01:20:19 crc kubenswrapper[4744]: I0311 01:20:19.974608 4744 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-storage-0" Mar 11 01:20:19 crc kubenswrapper[4744]: I0311 01:20:19.995963 4744 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1ee48ea6-67ea-4da3-af92-82b9d0e5b67d" path="/var/lib/kubelet/pods/1ee48ea6-67ea-4da3-af92-82b9d0e5b67d/volumes" Mar 11 01:20:20 crc kubenswrapper[4744]: I0311 01:20:20.080189 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/524fac10-b874-465e-b4aa-221b6c689959-etc-swift\") pod \"524fac10-b874-465e-b4aa-221b6c689959\" (UID: \"524fac10-b874-465e-b4aa-221b6c689959\") " Mar 11 01:20:20 crc kubenswrapper[4744]: I0311 01:20:20.080269 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/524fac10-b874-465e-b4aa-221b6c689959-lock\") pod \"524fac10-b874-465e-b4aa-221b6c689959\" (UID: \"524fac10-b874-465e-b4aa-221b6c689959\") " Mar 11 01:20:20 crc kubenswrapper[4744]: I0311 01:20:20.080316 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/524fac10-b874-465e-b4aa-221b6c689959-combined-ca-bundle\") pod \"524fac10-b874-465e-b4aa-221b6c689959\" (UID: \"524fac10-b874-465e-b4aa-221b6c689959\") " Mar 11 01:20:20 crc kubenswrapper[4744]: I0311 01:20:20.080395 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-64l2g\" (UniqueName: \"kubernetes.io/projected/524fac10-b874-465e-b4aa-221b6c689959-kube-api-access-64l2g\") pod \"524fac10-b874-465e-b4aa-221b6c689959\" (UID: \"524fac10-b874-465e-b4aa-221b6c689959\") " Mar 11 01:20:20 crc kubenswrapper[4744]: I0311 01:20:20.080437 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swift\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"524fac10-b874-465e-b4aa-221b6c689959\" (UID: 
\"524fac10-b874-465e-b4aa-221b6c689959\") " Mar 11 01:20:20 crc kubenswrapper[4744]: I0311 01:20:20.080523 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/524fac10-b874-465e-b4aa-221b6c689959-cache\") pod \"524fac10-b874-465e-b4aa-221b6c689959\" (UID: \"524fac10-b874-465e-b4aa-221b6c689959\") " Mar 11 01:20:20 crc kubenswrapper[4744]: I0311 01:20:20.080876 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/524fac10-b874-465e-b4aa-221b6c689959-lock" (OuterVolumeSpecName: "lock") pod "524fac10-b874-465e-b4aa-221b6c689959" (UID: "524fac10-b874-465e-b4aa-221b6c689959"). InnerVolumeSpecName "lock". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 11 01:20:20 crc kubenswrapper[4744]: I0311 01:20:20.081310 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/524fac10-b874-465e-b4aa-221b6c689959-cache" (OuterVolumeSpecName: "cache") pod "524fac10-b874-465e-b4aa-221b6c689959" (UID: "524fac10-b874-465e-b4aa-221b6c689959"). InnerVolumeSpecName "cache". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 11 01:20:20 crc kubenswrapper[4744]: I0311 01:20:20.083920 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage09-crc" (OuterVolumeSpecName: "swift") pod "524fac10-b874-465e-b4aa-221b6c689959" (UID: "524fac10-b874-465e-b4aa-221b6c689959"). InnerVolumeSpecName "local-storage09-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Mar 11 01:20:20 crc kubenswrapper[4744]: I0311 01:20:20.094629 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/524fac10-b874-465e-b4aa-221b6c689959-kube-api-access-64l2g" (OuterVolumeSpecName: "kube-api-access-64l2g") pod "524fac10-b874-465e-b4aa-221b6c689959" (UID: "524fac10-b874-465e-b4aa-221b6c689959"). 
InnerVolumeSpecName "kube-api-access-64l2g". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 01:20:20 crc kubenswrapper[4744]: I0311 01:20:20.097332 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/524fac10-b874-465e-b4aa-221b6c689959-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "524fac10-b874-465e-b4aa-221b6c689959" (UID: "524fac10-b874-465e-b4aa-221b6c689959"). InnerVolumeSpecName "etc-swift". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 01:20:20 crc kubenswrapper[4744]: I0311 01:20:20.155140 4744 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Mar 11 01:20:20 crc kubenswrapper[4744]: I0311 01:20:20.181895 4744 reconciler_common.go:293] "Volume detached for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/524fac10-b874-465e-b4aa-221b6c689959-cache\") on node \"crc\" DevicePath \"\"" Mar 11 01:20:20 crc kubenswrapper[4744]: I0311 01:20:20.181931 4744 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/524fac10-b874-465e-b4aa-221b6c689959-etc-swift\") on node \"crc\" DevicePath \"\"" Mar 11 01:20:20 crc kubenswrapper[4744]: I0311 01:20:20.181944 4744 reconciler_common.go:293] "Volume detached for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/524fac10-b874-465e-b4aa-221b6c689959-lock\") on node \"crc\" DevicePath \"\"" Mar 11 01:20:20 crc kubenswrapper[4744]: I0311 01:20:20.181983 4744 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-64l2g\" (UniqueName: \"kubernetes.io/projected/524fac10-b874-465e-b4aa-221b6c689959-kube-api-access-64l2g\") on node \"crc\" DevicePath \"\"" Mar 11 01:20:20 crc kubenswrapper[4744]: I0311 01:20:20.182011 4744 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") on node \"crc\" " 
Mar 11 01:20:20 crc kubenswrapper[4744]: I0311 01:20:20.201269 4744 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage09-crc" (UniqueName: "kubernetes.io/local-volume/local-storage09-crc") on node "crc" Mar 11 01:20:20 crc kubenswrapper[4744]: I0311 01:20:20.282907 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a336a32d-e322-4261-8a29-ce0f30435d83-config-data\") pod \"a336a32d-e322-4261-8a29-ce0f30435d83\" (UID: \"a336a32d-e322-4261-8a29-ce0f30435d83\") " Mar 11 01:20:20 crc kubenswrapper[4744]: I0311 01:20:20.282971 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/a336a32d-e322-4261-8a29-ce0f30435d83-etc-machine-id\") pod \"a336a32d-e322-4261-8a29-ce0f30435d83\" (UID: \"a336a32d-e322-4261-8a29-ce0f30435d83\") " Mar 11 01:20:20 crc kubenswrapper[4744]: I0311 01:20:20.283045 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5x82h\" (UniqueName: \"kubernetes.io/projected/a336a32d-e322-4261-8a29-ce0f30435d83-kube-api-access-5x82h\") pod \"a336a32d-e322-4261-8a29-ce0f30435d83\" (UID: \"a336a32d-e322-4261-8a29-ce0f30435d83\") " Mar 11 01:20:20 crc kubenswrapper[4744]: I0311 01:20:20.283077 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a336a32d-e322-4261-8a29-ce0f30435d83-combined-ca-bundle\") pod \"a336a32d-e322-4261-8a29-ce0f30435d83\" (UID: \"a336a32d-e322-4261-8a29-ce0f30435d83\") " Mar 11 01:20:20 crc kubenswrapper[4744]: I0311 01:20:20.283135 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a336a32d-e322-4261-8a29-ce0f30435d83-scripts\") pod \"a336a32d-e322-4261-8a29-ce0f30435d83\" (UID: \"a336a32d-e322-4261-8a29-ce0f30435d83\") " Mar 11 
01:20:20 crc kubenswrapper[4744]: I0311 01:20:20.283180 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/a336a32d-e322-4261-8a29-ce0f30435d83-config-data-custom\") pod \"a336a32d-e322-4261-8a29-ce0f30435d83\" (UID: \"a336a32d-e322-4261-8a29-ce0f30435d83\") " Mar 11 01:20:20 crc kubenswrapper[4744]: I0311 01:20:20.283461 4744 reconciler_common.go:293] "Volume detached for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") on node \"crc\" DevicePath \"\"" Mar 11 01:20:20 crc kubenswrapper[4744]: I0311 01:20:20.284087 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/a336a32d-e322-4261-8a29-ce0f30435d83-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "a336a32d-e322-4261-8a29-ce0f30435d83" (UID: "a336a32d-e322-4261-8a29-ce0f30435d83"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 11 01:20:20 crc kubenswrapper[4744]: I0311 01:20:20.286830 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a336a32d-e322-4261-8a29-ce0f30435d83-kube-api-access-5x82h" (OuterVolumeSpecName: "kube-api-access-5x82h") pod "a336a32d-e322-4261-8a29-ce0f30435d83" (UID: "a336a32d-e322-4261-8a29-ce0f30435d83"). InnerVolumeSpecName "kube-api-access-5x82h". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 01:20:20 crc kubenswrapper[4744]: I0311 01:20:20.287171 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a336a32d-e322-4261-8a29-ce0f30435d83-scripts" (OuterVolumeSpecName: "scripts") pod "a336a32d-e322-4261-8a29-ce0f30435d83" (UID: "a336a32d-e322-4261-8a29-ce0f30435d83"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 01:20:20 crc kubenswrapper[4744]: I0311 01:20:20.287926 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a336a32d-e322-4261-8a29-ce0f30435d83-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "a336a32d-e322-4261-8a29-ce0f30435d83" (UID: "a336a32d-e322-4261-8a29-ce0f30435d83"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 01:20:20 crc kubenswrapper[4744]: I0311 01:20:20.328113 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a336a32d-e322-4261-8a29-ce0f30435d83-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "a336a32d-e322-4261-8a29-ce0f30435d83" (UID: "a336a32d-e322-4261-8a29-ce0f30435d83"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 01:20:20 crc kubenswrapper[4744]: I0311 01:20:20.352791 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/524fac10-b874-465e-b4aa-221b6c689959-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "524fac10-b874-465e-b4aa-221b6c689959" (UID: "524fac10-b874-465e-b4aa-221b6c689959"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 01:20:20 crc kubenswrapper[4744]: I0311 01:20:20.371364 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a336a32d-e322-4261-8a29-ce0f30435d83-config-data" (OuterVolumeSpecName: "config-data") pod "a336a32d-e322-4261-8a29-ce0f30435d83" (UID: "a336a32d-e322-4261-8a29-ce0f30435d83"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 01:20:20 crc kubenswrapper[4744]: I0311 01:20:20.385269 4744 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a336a32d-e322-4261-8a29-ce0f30435d83-config-data\") on node \"crc\" DevicePath \"\"" Mar 11 01:20:20 crc kubenswrapper[4744]: I0311 01:20:20.385300 4744 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/a336a32d-e322-4261-8a29-ce0f30435d83-etc-machine-id\") on node \"crc\" DevicePath \"\"" Mar 11 01:20:20 crc kubenswrapper[4744]: I0311 01:20:20.385333 4744 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5x82h\" (UniqueName: \"kubernetes.io/projected/a336a32d-e322-4261-8a29-ce0f30435d83-kube-api-access-5x82h\") on node \"crc\" DevicePath \"\"" Mar 11 01:20:20 crc kubenswrapper[4744]: I0311 01:20:20.385348 4744 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a336a32d-e322-4261-8a29-ce0f30435d83-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 11 01:20:20 crc kubenswrapper[4744]: I0311 01:20:20.385360 4744 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/524fac10-b874-465e-b4aa-221b6c689959-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 11 01:20:20 crc kubenswrapper[4744]: I0311 01:20:20.385372 4744 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a336a32d-e322-4261-8a29-ce0f30435d83-scripts\") on node \"crc\" DevicePath \"\"" Mar 11 01:20:20 crc kubenswrapper[4744]: I0311 01:20:20.385384 4744 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/a336a32d-e322-4261-8a29-ce0f30435d83-config-data-custom\") on node \"crc\" DevicePath \"\"" Mar 11 01:20:20 crc kubenswrapper[4744]: I0311 01:20:20.438088 4744 
pod_container_manager_linux.go:210] "Failed to delete cgroup paths" cgroupName=["kubepods","besteffort","podebd1c76c-75f8-411f-9350-a0e31f1721cd"] err="unable to destroy cgroup paths for cgroup [kubepods besteffort podebd1c76c-75f8-411f-9350-a0e31f1721cd] : Timed out while waiting for systemd to remove kubepods-besteffort-podebd1c76c_75f8_411f_9350_a0e31f1721cd.slice" Mar 11 01:20:20 crc kubenswrapper[4744]: E0311 01:20:20.438476 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to delete cgroup paths for [kubepods besteffort podebd1c76c-75f8-411f-9350-a0e31f1721cd] : unable to destroy cgroup paths for cgroup [kubepods besteffort podebd1c76c-75f8-411f-9350-a0e31f1721cd] : Timed out while waiting for systemd to remove kubepods-besteffort-podebd1c76c_75f8_411f_9350_a0e31f1721cd.slice" pod="openstack/ovn-controller-metrics-nrcjs" podUID="ebd1c76c-75f8-411f-9350-a0e31f1721cd" Mar 11 01:20:20 crc kubenswrapper[4744]: I0311 01:20:20.539560 4744 generic.go:334] "Generic (PLEG): container finished" podID="a336a32d-e322-4261-8a29-ce0f30435d83" containerID="e407aadc6fa1cd45a9aa2b8ef3203e0c6ec01e80327e2c3206a5ebcd6d919055" exitCode=137 Mar 11 01:20:20 crc kubenswrapper[4744]: I0311 01:20:20.539612 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"a336a32d-e322-4261-8a29-ce0f30435d83","Type":"ContainerDied","Data":"e407aadc6fa1cd45a9aa2b8ef3203e0c6ec01e80327e2c3206a5ebcd6d919055"} Mar 11 01:20:20 crc kubenswrapper[4744]: I0311 01:20:20.539638 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"a336a32d-e322-4261-8a29-ce0f30435d83","Type":"ContainerDied","Data":"bcececd1efdbb237ee0d6c1f37f9d3dbe0f983db79fd2f2dcbd36e9ec1bc9cce"} Mar 11 01:20:20 crc kubenswrapper[4744]: I0311 01:20:20.539652 4744 scope.go:117] "RemoveContainer" containerID="8a970e26c9b1c228eccbd5179f834fcbf2574e2fb70d018dcd7783d99b6ddc44" Mar 11 01:20:20 crc kubenswrapper[4744]: I0311 
01:20:20.539762 4744 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Mar 11 01:20:20 crc kubenswrapper[4744]: I0311 01:20:20.575081 4744 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-scheduler-0"] Mar 11 01:20:20 crc kubenswrapper[4744]: I0311 01:20:20.587875 4744 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-scheduler-0"] Mar 11 01:20:20 crc kubenswrapper[4744]: I0311 01:20:20.588024 4744 scope.go:117] "RemoveContainer" containerID="e407aadc6fa1cd45a9aa2b8ef3203e0c6ec01e80327e2c3206a5ebcd6d919055" Mar 11 01:20:20 crc kubenswrapper[4744]: I0311 01:20:20.613103 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"524fac10-b874-465e-b4aa-221b6c689959","Type":"ContainerDied","Data":"9837290bf3ac08505bb72377e073fa947a1426eeec53c677c31fef53c43ad429"} Mar 11 01:20:20 crc kubenswrapper[4744]: I0311 01:20:20.613265 4744 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/swift-storage-0" Mar 11 01:20:20 crc kubenswrapper[4744]: I0311 01:20:20.618580 4744 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-metrics-nrcjs" Mar 11 01:20:20 crc kubenswrapper[4744]: I0311 01:20:20.640839 4744 scope.go:117] "RemoveContainer" containerID="8a970e26c9b1c228eccbd5179f834fcbf2574e2fb70d018dcd7783d99b6ddc44" Mar 11 01:20:20 crc kubenswrapper[4744]: E0311 01:20:20.641656 4744 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8a970e26c9b1c228eccbd5179f834fcbf2574e2fb70d018dcd7783d99b6ddc44\": container with ID starting with 8a970e26c9b1c228eccbd5179f834fcbf2574e2fb70d018dcd7783d99b6ddc44 not found: ID does not exist" containerID="8a970e26c9b1c228eccbd5179f834fcbf2574e2fb70d018dcd7783d99b6ddc44" Mar 11 01:20:20 crc kubenswrapper[4744]: I0311 01:20:20.641741 4744 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8a970e26c9b1c228eccbd5179f834fcbf2574e2fb70d018dcd7783d99b6ddc44"} err="failed to get container status \"8a970e26c9b1c228eccbd5179f834fcbf2574e2fb70d018dcd7783d99b6ddc44\": rpc error: code = NotFound desc = could not find container \"8a970e26c9b1c228eccbd5179f834fcbf2574e2fb70d018dcd7783d99b6ddc44\": container with ID starting with 8a970e26c9b1c228eccbd5179f834fcbf2574e2fb70d018dcd7783d99b6ddc44 not found: ID does not exist" Mar 11 01:20:20 crc kubenswrapper[4744]: I0311 01:20:20.641804 4744 scope.go:117] "RemoveContainer" containerID="e407aadc6fa1cd45a9aa2b8ef3203e0c6ec01e80327e2c3206a5ebcd6d919055" Mar 11 01:20:20 crc kubenswrapper[4744]: E0311 01:20:20.647118 4744 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e407aadc6fa1cd45a9aa2b8ef3203e0c6ec01e80327e2c3206a5ebcd6d919055\": container with ID starting with e407aadc6fa1cd45a9aa2b8ef3203e0c6ec01e80327e2c3206a5ebcd6d919055 not found: ID does not exist" containerID="e407aadc6fa1cd45a9aa2b8ef3203e0c6ec01e80327e2c3206a5ebcd6d919055" Mar 11 01:20:20 crc kubenswrapper[4744]: I0311 
01:20:20.647201 4744 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e407aadc6fa1cd45a9aa2b8ef3203e0c6ec01e80327e2c3206a5ebcd6d919055"} err="failed to get container status \"e407aadc6fa1cd45a9aa2b8ef3203e0c6ec01e80327e2c3206a5ebcd6d919055\": rpc error: code = NotFound desc = could not find container \"e407aadc6fa1cd45a9aa2b8ef3203e0c6ec01e80327e2c3206a5ebcd6d919055\": container with ID starting with e407aadc6fa1cd45a9aa2b8ef3203e0c6ec01e80327e2c3206a5ebcd6d919055 not found: ID does not exist" Mar 11 01:20:20 crc kubenswrapper[4744]: I0311 01:20:20.647228 4744 scope.go:117] "RemoveContainer" containerID="25e3543d3b3d14862a73ba6ffdaeec7e7e8cb26ca742becd4587bae22c3b8432" Mar 11 01:20:20 crc kubenswrapper[4744]: I0311 01:20:20.668046 4744 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-metrics-nrcjs"] Mar 11 01:20:20 crc kubenswrapper[4744]: I0311 01:20:20.674797 4744 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovn-controller-metrics-nrcjs"] Mar 11 01:20:20 crc kubenswrapper[4744]: I0311 01:20:20.689021 4744 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/swift-storage-0"] Mar 11 01:20:20 crc kubenswrapper[4744]: I0311 01:20:20.694387 4744 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/swift-storage-0"] Mar 11 01:20:20 crc kubenswrapper[4744]: I0311 01:20:20.699260 4744 scope.go:117] "RemoveContainer" containerID="5bbb97d3f04c59bd0734d58c134b140d749bfb617a14047f4d2be2a03bcf9bd5" Mar 11 01:20:20 crc kubenswrapper[4744]: I0311 01:20:20.715139 4744 scope.go:117] "RemoveContainer" containerID="57f15e76688f17356669b85c46315dd0dffc814d43f7f2c52af90d0784301949" Mar 11 01:20:20 crc kubenswrapper[4744]: I0311 01:20:20.733555 4744 scope.go:117] "RemoveContainer" containerID="683071c11cad26f904a5f7a36fcb33b138ae8c1681f0a7ff0d1c59caa91adfa5" Mar 11 01:20:20 crc kubenswrapper[4744]: I0311 01:20:20.749823 4744 scope.go:117] "RemoveContainer" 
containerID="7325adf5421afd7c3a21ffc84f42f4176496bf0df0f8bbb48940ea537474b52d" Mar 11 01:20:20 crc kubenswrapper[4744]: I0311 01:20:20.765405 4744 scope.go:117] "RemoveContainer" containerID="1dc3103a150112ffa802284194bdc5ad25c73127fe6b5ddb013e5409a1028b69" Mar 11 01:20:20 crc kubenswrapper[4744]: I0311 01:20:20.782323 4744 scope.go:117] "RemoveContainer" containerID="98bb403c28de1a3a422f752fd836eeaf91ab8123e3a1415917dfa6a935d3dad7" Mar 11 01:20:20 crc kubenswrapper[4744]: I0311 01:20:20.798309 4744 scope.go:117] "RemoveContainer" containerID="7d4d68ba9b886d9742d463d33fa1cc87d5cbb6630ca1df77c63ccccdfc56e184" Mar 11 01:20:20 crc kubenswrapper[4744]: I0311 01:20:20.812918 4744 scope.go:117] "RemoveContainer" containerID="2a2081a399521c0c43979716a406e0e99df32e512b18971fd342f84cf6c0c784" Mar 11 01:20:20 crc kubenswrapper[4744]: I0311 01:20:20.828712 4744 scope.go:117] "RemoveContainer" containerID="2b4eef62494e9560a6468f7258be6cbee2afabc30627c4cd424f6256bf882bd9" Mar 11 01:20:20 crc kubenswrapper[4744]: I0311 01:20:20.847704 4744 scope.go:117] "RemoveContainer" containerID="d0f959b12a9512cb3f5a0776eb000de08da42f09e11825b7a95d1b85cdeb9533" Mar 11 01:20:20 crc kubenswrapper[4744]: I0311 01:20:20.865589 4744 scope.go:117] "RemoveContainer" containerID="657db772f7e222c18d1d14b7b5c9643c0ec7e79c4adc1d26309d817f501de327" Mar 11 01:20:20 crc kubenswrapper[4744]: I0311 01:20:20.879996 4744 scope.go:117] "RemoveContainer" containerID="0520adffd7bb3fd0a12c9f2003a6119d39241234f65942bbd93b143144e91dff" Mar 11 01:20:20 crc kubenswrapper[4744]: I0311 01:20:20.904290 4744 scope.go:117] "RemoveContainer" containerID="661223f28b0201765a8971851fdcdfc8ce86ba64df01908f6086c416839db484" Mar 11 01:20:20 crc kubenswrapper[4744]: I0311 01:20:20.929103 4744 scope.go:117] "RemoveContainer" containerID="b78b35fad65a6104f3b70ff15a556ecab18834b6ed00582485d71f455ffb4854" Mar 11 01:20:21 crc kubenswrapper[4744]: I0311 01:20:21.993681 4744 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="524fac10-b874-465e-b4aa-221b6c689959" path="/var/lib/kubelet/pods/524fac10-b874-465e-b4aa-221b6c689959/volumes" Mar 11 01:20:21 crc kubenswrapper[4744]: I0311 01:20:21.997821 4744 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a336a32d-e322-4261-8a29-ce0f30435d83" path="/var/lib/kubelet/pods/a336a32d-e322-4261-8a29-ce0f30435d83/volumes" Mar 11 01:20:21 crc kubenswrapper[4744]: I0311 01:20:21.998986 4744 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ebd1c76c-75f8-411f-9350-a0e31f1721cd" path="/var/lib/kubelet/pods/ebd1c76c-75f8-411f-9350-a0e31f1721cd/volumes" Mar 11 01:20:27 crc kubenswrapper[4744]: I0311 01:20:27.975942 4744 scope.go:117] "RemoveContainer" containerID="0473256ff02bdf69162deacfefe5387c093ffe2e643f7878e13771cc69dd585b" Mar 11 01:20:27 crc kubenswrapper[4744]: E0311 01:20:27.977139 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-678nx_openshift-machine-config-operator(a15dc7ac-7c34-4135-b6eb-a85122800ce9)\"" pod="openshift-machine-config-operator/machine-config-daemon-678nx" podUID="a15dc7ac-7c34-4135-b6eb-a85122800ce9" Mar 11 01:20:40 crc kubenswrapper[4744]: I0311 01:20:40.976729 4744 scope.go:117] "RemoveContainer" containerID="0473256ff02bdf69162deacfefe5387c093ffe2e643f7878e13771cc69dd585b" Mar 11 01:20:40 crc kubenswrapper[4744]: E0311 01:20:40.977702 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-678nx_openshift-machine-config-operator(a15dc7ac-7c34-4135-b6eb-a85122800ce9)\"" pod="openshift-machine-config-operator/machine-config-daemon-678nx" podUID="a15dc7ac-7c34-4135-b6eb-a85122800ce9" Mar 11 01:20:51 crc 
kubenswrapper[4744]: I0311 01:20:51.975790 4744 scope.go:117] "RemoveContainer" containerID="0473256ff02bdf69162deacfefe5387c093ffe2e643f7878e13771cc69dd585b" Mar 11 01:20:51 crc kubenswrapper[4744]: E0311 01:20:51.976731 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-678nx_openshift-machine-config-operator(a15dc7ac-7c34-4135-b6eb-a85122800ce9)\"" pod="openshift-machine-config-operator/machine-config-daemon-678nx" podUID="a15dc7ac-7c34-4135-b6eb-a85122800ce9" Mar 11 01:21:06 crc kubenswrapper[4744]: I0311 01:21:06.974786 4744 scope.go:117] "RemoveContainer" containerID="0473256ff02bdf69162deacfefe5387c093ffe2e643f7878e13771cc69dd585b" Mar 11 01:21:06 crc kubenswrapper[4744]: E0311 01:21:06.975544 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-678nx_openshift-machine-config-operator(a15dc7ac-7c34-4135-b6eb-a85122800ce9)\"" pod="openshift-machine-config-operator/machine-config-daemon-678nx" podUID="a15dc7ac-7c34-4135-b6eb-a85122800ce9" Mar 11 01:21:08 crc kubenswrapper[4744]: I0311 01:21:08.914075 4744 scope.go:117] "RemoveContainer" containerID="063a7c95da290d544b7923701cac3b4e437141eacc1fff4c7a6f62b0e75c7ba4" Mar 11 01:21:08 crc kubenswrapper[4744]: I0311 01:21:08.958262 4744 scope.go:117] "RemoveContainer" containerID="d90c7a6c9305d356e7403d291e2f081e429e4966cb7e9f20d9b471cf46eddf16" Mar 11 01:21:09 crc kubenswrapper[4744]: I0311 01:21:09.002332 4744 scope.go:117] "RemoveContainer" containerID="06a3af48518031c156f51eb5607cfcb96d799cb1d1c4bc03f941155347f77f71" Mar 11 01:21:09 crc kubenswrapper[4744]: I0311 01:21:09.031503 4744 scope.go:117] "RemoveContainer" 
containerID="d4e5360567689983645dc7f299ef53ab8dbb200c7bcb992f0ff7fcc3a8bffd18" Mar 11 01:21:09 crc kubenswrapper[4744]: I0311 01:21:09.066121 4744 scope.go:117] "RemoveContainer" containerID="eec6090a40721da764b37de1f739da92e8819f033338f4cf2e5731f03131daae" Mar 11 01:21:09 crc kubenswrapper[4744]: I0311 01:21:09.095976 4744 scope.go:117] "RemoveContainer" containerID="6277045b67820d0a1857d93e6e8e3ca7197495ee0c3cd0064b91a9e4d115b55b" Mar 11 01:21:09 crc kubenswrapper[4744]: I0311 01:21:09.130030 4744 scope.go:117] "RemoveContainer" containerID="1aa521d4cbd225ff73bcece76cd081aee159ae63e3b4b1ec498f897e70556651" Mar 11 01:21:09 crc kubenswrapper[4744]: I0311 01:21:09.165892 4744 scope.go:117] "RemoveContainer" containerID="3b99519301672447b89814ce6fee64db32f0b7eb950348bbc647f1820b678ac0" Mar 11 01:21:19 crc kubenswrapper[4744]: I0311 01:21:19.975016 4744 scope.go:117] "RemoveContainer" containerID="0473256ff02bdf69162deacfefe5387c093ffe2e643f7878e13771cc69dd585b" Mar 11 01:21:19 crc kubenswrapper[4744]: E0311 01:21:19.976059 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-678nx_openshift-machine-config-operator(a15dc7ac-7c34-4135-b6eb-a85122800ce9)\"" pod="openshift-machine-config-operator/machine-config-daemon-678nx" podUID="a15dc7ac-7c34-4135-b6eb-a85122800ce9" Mar 11 01:21:31 crc kubenswrapper[4744]: I0311 01:21:31.975632 4744 scope.go:117] "RemoveContainer" containerID="0473256ff02bdf69162deacfefe5387c093ffe2e643f7878e13771cc69dd585b" Mar 11 01:21:31 crc kubenswrapper[4744]: E0311 01:21:31.976791 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-678nx_openshift-machine-config-operator(a15dc7ac-7c34-4135-b6eb-a85122800ce9)\"" pod="openshift-machine-config-operator/machine-config-daemon-678nx" podUID="a15dc7ac-7c34-4135-b6eb-a85122800ce9" Mar 11 01:21:45 crc kubenswrapper[4744]: I0311 01:21:45.975254 4744 scope.go:117] "RemoveContainer" containerID="0473256ff02bdf69162deacfefe5387c093ffe2e643f7878e13771cc69dd585b" Mar 11 01:21:45 crc kubenswrapper[4744]: E0311 01:21:45.978566 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-678nx_openshift-machine-config-operator(a15dc7ac-7c34-4135-b6eb-a85122800ce9)\"" pod="openshift-machine-config-operator/machine-config-daemon-678nx" podUID="a15dc7ac-7c34-4135-b6eb-a85122800ce9" Mar 11 01:21:57 crc kubenswrapper[4744]: I0311 01:21:57.975054 4744 scope.go:117] "RemoveContainer" containerID="0473256ff02bdf69162deacfefe5387c093ffe2e643f7878e13771cc69dd585b" Mar 11 01:21:57 crc kubenswrapper[4744]: E0311 01:21:57.976142 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-678nx_openshift-machine-config-operator(a15dc7ac-7c34-4135-b6eb-a85122800ce9)\"" pod="openshift-machine-config-operator/machine-config-daemon-678nx" podUID="a15dc7ac-7c34-4135-b6eb-a85122800ce9" Mar 11 01:22:00 crc kubenswrapper[4744]: I0311 01:22:00.166976 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29553202-cqp9j"] Mar 11 01:22:00 crc kubenswrapper[4744]: E0311 01:22:00.168349 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1ee48ea6-67ea-4da3-af92-82b9d0e5b67d" containerName="ovsdb-server" Mar 11 01:22:00 crc kubenswrapper[4744]: I0311 
01:22:00.168397 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="1ee48ea6-67ea-4da3-af92-82b9d0e5b67d" containerName="ovsdb-server" Mar 11 01:22:00 crc kubenswrapper[4744]: E0311 01:22:00.168431 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="524fac10-b874-465e-b4aa-221b6c689959" containerName="container-server" Mar 11 01:22:00 crc kubenswrapper[4744]: I0311 01:22:00.168447 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="524fac10-b874-465e-b4aa-221b6c689959" containerName="container-server" Mar 11 01:22:00 crc kubenswrapper[4744]: E0311 01:22:00.168475 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a336a32d-e322-4261-8a29-ce0f30435d83" containerName="probe" Mar 11 01:22:00 crc kubenswrapper[4744]: I0311 01:22:00.168494 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="a336a32d-e322-4261-8a29-ce0f30435d83" containerName="probe" Mar 11 01:22:00 crc kubenswrapper[4744]: E0311 01:22:00.168560 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="524fac10-b874-465e-b4aa-221b6c689959" containerName="object-updater" Mar 11 01:22:00 crc kubenswrapper[4744]: I0311 01:22:00.168579 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="524fac10-b874-465e-b4aa-221b6c689959" containerName="object-updater" Mar 11 01:22:00 crc kubenswrapper[4744]: E0311 01:22:00.168608 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="524fac10-b874-465e-b4aa-221b6c689959" containerName="swift-recon-cron" Mar 11 01:22:00 crc kubenswrapper[4744]: I0311 01:22:00.168624 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="524fac10-b874-465e-b4aa-221b6c689959" containerName="swift-recon-cron" Mar 11 01:22:00 crc kubenswrapper[4744]: E0311 01:22:00.168657 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cb4eb051-94b3-42d1-87ff-669ad8251b4f" containerName="neutron-api" Mar 11 01:22:00 crc kubenswrapper[4744]: I0311 01:22:00.168690 4744 state_mem.go:107] 
"Deleted CPUSet assignment" podUID="cb4eb051-94b3-42d1-87ff-669ad8251b4f" containerName="neutron-api" Mar 11 01:22:00 crc kubenswrapper[4744]: E0311 01:22:00.168720 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cb4eb051-94b3-42d1-87ff-669ad8251b4f" containerName="neutron-httpd" Mar 11 01:22:00 crc kubenswrapper[4744]: I0311 01:22:00.168736 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="cb4eb051-94b3-42d1-87ff-669ad8251b4f" containerName="neutron-httpd" Mar 11 01:22:00 crc kubenswrapper[4744]: E0311 01:22:00.168759 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="524fac10-b874-465e-b4aa-221b6c689959" containerName="container-updater" Mar 11 01:22:00 crc kubenswrapper[4744]: I0311 01:22:00.168776 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="524fac10-b874-465e-b4aa-221b6c689959" containerName="container-updater" Mar 11 01:22:00 crc kubenswrapper[4744]: E0311 01:22:00.168812 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1ee48ea6-67ea-4da3-af92-82b9d0e5b67d" containerName="ovsdb-server-init" Mar 11 01:22:00 crc kubenswrapper[4744]: I0311 01:22:00.168829 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="1ee48ea6-67ea-4da3-af92-82b9d0e5b67d" containerName="ovsdb-server-init" Mar 11 01:22:00 crc kubenswrapper[4744]: E0311 01:22:00.168855 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="524fac10-b874-465e-b4aa-221b6c689959" containerName="account-reaper" Mar 11 01:22:00 crc kubenswrapper[4744]: I0311 01:22:00.168871 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="524fac10-b874-465e-b4aa-221b6c689959" containerName="account-reaper" Mar 11 01:22:00 crc kubenswrapper[4744]: E0311 01:22:00.168900 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="524fac10-b874-465e-b4aa-221b6c689959" containerName="account-replicator" Mar 11 01:22:00 crc kubenswrapper[4744]: I0311 01:22:00.168917 4744 state_mem.go:107] "Deleted 
CPUSet assignment" podUID="524fac10-b874-465e-b4aa-221b6c689959" containerName="account-replicator" Mar 11 01:22:00 crc kubenswrapper[4744]: E0311 01:22:00.168950 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1ee48ea6-67ea-4da3-af92-82b9d0e5b67d" containerName="ovs-vswitchd" Mar 11 01:22:00 crc kubenswrapper[4744]: I0311 01:22:00.168966 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="1ee48ea6-67ea-4da3-af92-82b9d0e5b67d" containerName="ovs-vswitchd" Mar 11 01:22:00 crc kubenswrapper[4744]: E0311 01:22:00.168993 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="524fac10-b874-465e-b4aa-221b6c689959" containerName="object-auditor" Mar 11 01:22:00 crc kubenswrapper[4744]: I0311 01:22:00.169009 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="524fac10-b874-465e-b4aa-221b6c689959" containerName="object-auditor" Mar 11 01:22:00 crc kubenswrapper[4744]: E0311 01:22:00.169043 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="524fac10-b874-465e-b4aa-221b6c689959" containerName="account-auditor" Mar 11 01:22:00 crc kubenswrapper[4744]: I0311 01:22:00.169059 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="524fac10-b874-465e-b4aa-221b6c689959" containerName="account-auditor" Mar 11 01:22:00 crc kubenswrapper[4744]: E0311 01:22:00.169087 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c5c9f281-5118-4349-bd48-43287ffb8059" containerName="oc" Mar 11 01:22:00 crc kubenswrapper[4744]: I0311 01:22:00.169104 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="c5c9f281-5118-4349-bd48-43287ffb8059" containerName="oc" Mar 11 01:22:00 crc kubenswrapper[4744]: E0311 01:22:00.169133 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="524fac10-b874-465e-b4aa-221b6c689959" containerName="account-server" Mar 11 01:22:00 crc kubenswrapper[4744]: I0311 01:22:00.169149 4744 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="524fac10-b874-465e-b4aa-221b6c689959" containerName="account-server" Mar 11 01:22:00 crc kubenswrapper[4744]: E0311 01:22:00.169174 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="524fac10-b874-465e-b4aa-221b6c689959" containerName="object-server" Mar 11 01:22:00 crc kubenswrapper[4744]: I0311 01:22:00.169190 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="524fac10-b874-465e-b4aa-221b6c689959" containerName="object-server" Mar 11 01:22:00 crc kubenswrapper[4744]: E0311 01:22:00.169208 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e910960b-a434-4830-b4be-96571fa4dd54" containerName="extract-content" Mar 11 01:22:00 crc kubenswrapper[4744]: I0311 01:22:00.169226 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="e910960b-a434-4830-b4be-96571fa4dd54" containerName="extract-content" Mar 11 01:22:00 crc kubenswrapper[4744]: E0311 01:22:00.169255 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e910960b-a434-4830-b4be-96571fa4dd54" containerName="extract-utilities" Mar 11 01:22:00 crc kubenswrapper[4744]: I0311 01:22:00.169272 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="e910960b-a434-4830-b4be-96571fa4dd54" containerName="extract-utilities" Mar 11 01:22:00 crc kubenswrapper[4744]: E0311 01:22:00.169293 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e910960b-a434-4830-b4be-96571fa4dd54" containerName="registry-server" Mar 11 01:22:00 crc kubenswrapper[4744]: I0311 01:22:00.169310 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="e910960b-a434-4830-b4be-96571fa4dd54" containerName="registry-server" Mar 11 01:22:00 crc kubenswrapper[4744]: E0311 01:22:00.169338 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="524fac10-b874-465e-b4aa-221b6c689959" containerName="container-replicator" Mar 11 01:22:00 crc kubenswrapper[4744]: I0311 01:22:00.169356 4744 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="524fac10-b874-465e-b4aa-221b6c689959" containerName="container-replicator" Mar 11 01:22:00 crc kubenswrapper[4744]: E0311 01:22:00.169385 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="524fac10-b874-465e-b4aa-221b6c689959" containerName="object-replicator" Mar 11 01:22:00 crc kubenswrapper[4744]: I0311 01:22:00.169400 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="524fac10-b874-465e-b4aa-221b6c689959" containerName="object-replicator" Mar 11 01:22:00 crc kubenswrapper[4744]: E0311 01:22:00.169433 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="524fac10-b874-465e-b4aa-221b6c689959" containerName="rsync" Mar 11 01:22:00 crc kubenswrapper[4744]: I0311 01:22:00.169449 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="524fac10-b874-465e-b4aa-221b6c689959" containerName="rsync" Mar 11 01:22:00 crc kubenswrapper[4744]: E0311 01:22:00.169476 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="524fac10-b874-465e-b4aa-221b6c689959" containerName="container-auditor" Mar 11 01:22:00 crc kubenswrapper[4744]: I0311 01:22:00.169492 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="524fac10-b874-465e-b4aa-221b6c689959" containerName="container-auditor" Mar 11 01:22:00 crc kubenswrapper[4744]: E0311 01:22:00.171324 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a336a32d-e322-4261-8a29-ce0f30435d83" containerName="cinder-scheduler" Mar 11 01:22:00 crc kubenswrapper[4744]: I0311 01:22:00.171354 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="a336a32d-e322-4261-8a29-ce0f30435d83" containerName="cinder-scheduler" Mar 11 01:22:00 crc kubenswrapper[4744]: E0311 01:22:00.171385 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="524fac10-b874-465e-b4aa-221b6c689959" containerName="object-expirer" Mar 11 01:22:00 crc kubenswrapper[4744]: I0311 01:22:00.171403 4744 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="524fac10-b874-465e-b4aa-221b6c689959" containerName="object-expirer" Mar 11 01:22:00 crc kubenswrapper[4744]: I0311 01:22:00.171816 4744 memory_manager.go:354] "RemoveStaleState removing state" podUID="524fac10-b874-465e-b4aa-221b6c689959" containerName="account-server" Mar 11 01:22:00 crc kubenswrapper[4744]: I0311 01:22:00.171855 4744 memory_manager.go:354] "RemoveStaleState removing state" podUID="cb4eb051-94b3-42d1-87ff-669ad8251b4f" containerName="neutron-httpd" Mar 11 01:22:00 crc kubenswrapper[4744]: I0311 01:22:00.171880 4744 memory_manager.go:354] "RemoveStaleState removing state" podUID="524fac10-b874-465e-b4aa-221b6c689959" containerName="container-updater" Mar 11 01:22:00 crc kubenswrapper[4744]: I0311 01:22:00.171898 4744 memory_manager.go:354] "RemoveStaleState removing state" podUID="cb4eb051-94b3-42d1-87ff-669ad8251b4f" containerName="neutron-api" Mar 11 01:22:00 crc kubenswrapper[4744]: I0311 01:22:00.171928 4744 memory_manager.go:354] "RemoveStaleState removing state" podUID="1ee48ea6-67ea-4da3-af92-82b9d0e5b67d" containerName="ovsdb-server" Mar 11 01:22:00 crc kubenswrapper[4744]: I0311 01:22:00.171970 4744 memory_manager.go:354] "RemoveStaleState removing state" podUID="a336a32d-e322-4261-8a29-ce0f30435d83" containerName="probe" Mar 11 01:22:00 crc kubenswrapper[4744]: I0311 01:22:00.172003 4744 memory_manager.go:354] "RemoveStaleState removing state" podUID="524fac10-b874-465e-b4aa-221b6c689959" containerName="account-auditor" Mar 11 01:22:00 crc kubenswrapper[4744]: I0311 01:22:00.172063 4744 memory_manager.go:354] "RemoveStaleState removing state" podUID="e910960b-a434-4830-b4be-96571fa4dd54" containerName="registry-server" Mar 11 01:22:00 crc kubenswrapper[4744]: I0311 01:22:00.172082 4744 memory_manager.go:354] "RemoveStaleState removing state" podUID="524fac10-b874-465e-b4aa-221b6c689959" containerName="swift-recon-cron" Mar 11 01:22:00 crc kubenswrapper[4744]: I0311 01:22:00.172099 4744 memory_manager.go:354] "RemoveStaleState 
removing state" podUID="524fac10-b874-465e-b4aa-221b6c689959" containerName="account-reaper" Mar 11 01:22:00 crc kubenswrapper[4744]: I0311 01:22:00.172119 4744 memory_manager.go:354] "RemoveStaleState removing state" podUID="524fac10-b874-465e-b4aa-221b6c689959" containerName="container-replicator" Mar 11 01:22:00 crc kubenswrapper[4744]: I0311 01:22:00.172142 4744 memory_manager.go:354] "RemoveStaleState removing state" podUID="524fac10-b874-465e-b4aa-221b6c689959" containerName="container-server" Mar 11 01:22:00 crc kubenswrapper[4744]: I0311 01:22:00.172162 4744 memory_manager.go:354] "RemoveStaleState removing state" podUID="524fac10-b874-465e-b4aa-221b6c689959" containerName="object-updater" Mar 11 01:22:00 crc kubenswrapper[4744]: I0311 01:22:00.172182 4744 memory_manager.go:354] "RemoveStaleState removing state" podUID="524fac10-b874-465e-b4aa-221b6c689959" containerName="container-auditor" Mar 11 01:22:00 crc kubenswrapper[4744]: I0311 01:22:00.172207 4744 memory_manager.go:354] "RemoveStaleState removing state" podUID="524fac10-b874-465e-b4aa-221b6c689959" containerName="object-auditor" Mar 11 01:22:00 crc kubenswrapper[4744]: I0311 01:22:00.172253 4744 memory_manager.go:354] "RemoveStaleState removing state" podUID="524fac10-b874-465e-b4aa-221b6c689959" containerName="rsync" Mar 11 01:22:00 crc kubenswrapper[4744]: I0311 01:22:00.172274 4744 memory_manager.go:354] "RemoveStaleState removing state" podUID="524fac10-b874-465e-b4aa-221b6c689959" containerName="object-expirer" Mar 11 01:22:00 crc kubenswrapper[4744]: I0311 01:22:00.172292 4744 memory_manager.go:354] "RemoveStaleState removing state" podUID="524fac10-b874-465e-b4aa-221b6c689959" containerName="object-replicator" Mar 11 01:22:00 crc kubenswrapper[4744]: I0311 01:22:00.172319 4744 memory_manager.go:354] "RemoveStaleState removing state" podUID="c5c9f281-5118-4349-bd48-43287ffb8059" containerName="oc" Mar 11 01:22:00 crc kubenswrapper[4744]: I0311 01:22:00.172339 4744 memory_manager.go:354] 
"RemoveStaleState removing state" podUID="524fac10-b874-465e-b4aa-221b6c689959" containerName="object-server" Mar 11 01:22:00 crc kubenswrapper[4744]: I0311 01:22:00.172362 4744 memory_manager.go:354] "RemoveStaleState removing state" podUID="1ee48ea6-67ea-4da3-af92-82b9d0e5b67d" containerName="ovs-vswitchd" Mar 11 01:22:00 crc kubenswrapper[4744]: I0311 01:22:00.172382 4744 memory_manager.go:354] "RemoveStaleState removing state" podUID="a336a32d-e322-4261-8a29-ce0f30435d83" containerName="cinder-scheduler" Mar 11 01:22:00 crc kubenswrapper[4744]: I0311 01:22:00.172410 4744 memory_manager.go:354] "RemoveStaleState removing state" podUID="524fac10-b874-465e-b4aa-221b6c689959" containerName="account-replicator" Mar 11 01:22:00 crc kubenswrapper[4744]: I0311 01:22:00.173427 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29553202-cqp9j" Mar 11 01:22:00 crc kubenswrapper[4744]: I0311 01:22:00.177155 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 11 01:22:00 crc kubenswrapper[4744]: I0311 01:22:00.177620 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 11 01:22:00 crc kubenswrapper[4744]: I0311 01:22:00.177866 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-rs77s" Mar 11 01:22:00 crc kubenswrapper[4744]: I0311 01:22:00.182624 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29553202-cqp9j"] Mar 11 01:22:00 crc kubenswrapper[4744]: I0311 01:22:00.202631 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x5nnr\" (UniqueName: \"kubernetes.io/projected/c0d271c5-b68d-4c0f-a525-d96c6b0053a3-kube-api-access-x5nnr\") pod \"auto-csr-approver-29553202-cqp9j\" (UID: \"c0d271c5-b68d-4c0f-a525-d96c6b0053a3\") " 
pod="openshift-infra/auto-csr-approver-29553202-cqp9j" Mar 11 01:22:00 crc kubenswrapper[4744]: I0311 01:22:00.304499 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x5nnr\" (UniqueName: \"kubernetes.io/projected/c0d271c5-b68d-4c0f-a525-d96c6b0053a3-kube-api-access-x5nnr\") pod \"auto-csr-approver-29553202-cqp9j\" (UID: \"c0d271c5-b68d-4c0f-a525-d96c6b0053a3\") " pod="openshift-infra/auto-csr-approver-29553202-cqp9j" Mar 11 01:22:00 crc kubenswrapper[4744]: I0311 01:22:00.341353 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x5nnr\" (UniqueName: \"kubernetes.io/projected/c0d271c5-b68d-4c0f-a525-d96c6b0053a3-kube-api-access-x5nnr\") pod \"auto-csr-approver-29553202-cqp9j\" (UID: \"c0d271c5-b68d-4c0f-a525-d96c6b0053a3\") " pod="openshift-infra/auto-csr-approver-29553202-cqp9j" Mar 11 01:22:00 crc kubenswrapper[4744]: I0311 01:22:00.503739 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29553202-cqp9j" Mar 11 01:22:00 crc kubenswrapper[4744]: I0311 01:22:00.991471 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29553202-cqp9j"] Mar 11 01:22:01 crc kubenswrapper[4744]: I0311 01:22:01.690262 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29553202-cqp9j" event={"ID":"c0d271c5-b68d-4c0f-a525-d96c6b0053a3","Type":"ContainerStarted","Data":"9bf65a1d9e8d415980fc65177124dbc2b0b04f55dcc335e9c99ea3be328216b5"} Mar 11 01:22:02 crc kubenswrapper[4744]: I0311 01:22:02.702983 4744 generic.go:334] "Generic (PLEG): container finished" podID="c0d271c5-b68d-4c0f-a525-d96c6b0053a3" containerID="739127757cd10c744d4c6dfba47db228b6f0026224f80cb3c0973501e5d10f1b" exitCode=0 Mar 11 01:22:02 crc kubenswrapper[4744]: I0311 01:22:02.703066 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-infra/auto-csr-approver-29553202-cqp9j" event={"ID":"c0d271c5-b68d-4c0f-a525-d96c6b0053a3","Type":"ContainerDied","Data":"739127757cd10c744d4c6dfba47db228b6f0026224f80cb3c0973501e5d10f1b"} Mar 11 01:22:04 crc kubenswrapper[4744]: I0311 01:22:04.052329 4744 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29553202-cqp9j" Mar 11 01:22:04 crc kubenswrapper[4744]: I0311 01:22:04.067013 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x5nnr\" (UniqueName: \"kubernetes.io/projected/c0d271c5-b68d-4c0f-a525-d96c6b0053a3-kube-api-access-x5nnr\") pod \"c0d271c5-b68d-4c0f-a525-d96c6b0053a3\" (UID: \"c0d271c5-b68d-4c0f-a525-d96c6b0053a3\") " Mar 11 01:22:04 crc kubenswrapper[4744]: I0311 01:22:04.074369 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c0d271c5-b68d-4c0f-a525-d96c6b0053a3-kube-api-access-x5nnr" (OuterVolumeSpecName: "kube-api-access-x5nnr") pod "c0d271c5-b68d-4c0f-a525-d96c6b0053a3" (UID: "c0d271c5-b68d-4c0f-a525-d96c6b0053a3"). InnerVolumeSpecName "kube-api-access-x5nnr". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 01:22:04 crc kubenswrapper[4744]: I0311 01:22:04.169856 4744 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x5nnr\" (UniqueName: \"kubernetes.io/projected/c0d271c5-b68d-4c0f-a525-d96c6b0053a3-kube-api-access-x5nnr\") on node \"crc\" DevicePath \"\"" Mar 11 01:22:04 crc kubenswrapper[4744]: I0311 01:22:04.752568 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29553202-cqp9j" event={"ID":"c0d271c5-b68d-4c0f-a525-d96c6b0053a3","Type":"ContainerDied","Data":"9bf65a1d9e8d415980fc65177124dbc2b0b04f55dcc335e9c99ea3be328216b5"} Mar 11 01:22:04 crc kubenswrapper[4744]: I0311 01:22:04.753027 4744 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9bf65a1d9e8d415980fc65177124dbc2b0b04f55dcc335e9c99ea3be328216b5" Mar 11 01:22:04 crc kubenswrapper[4744]: I0311 01:22:04.752668 4744 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29553202-cqp9j" Mar 11 01:22:05 crc kubenswrapper[4744]: I0311 01:22:05.152900 4744 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29553196-9jct5"] Mar 11 01:22:05 crc kubenswrapper[4744]: I0311 01:22:05.160709 4744 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29553196-9jct5"] Mar 11 01:22:05 crc kubenswrapper[4744]: I0311 01:22:05.983317 4744 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0ddd91ff-2bab-458e-b371-13bb59892f28" path="/var/lib/kubelet/pods/0ddd91ff-2bab-458e-b371-13bb59892f28/volumes" Mar 11 01:22:08 crc kubenswrapper[4744]: I0311 01:22:08.975114 4744 scope.go:117] "RemoveContainer" containerID="0473256ff02bdf69162deacfefe5387c093ffe2e643f7878e13771cc69dd585b" Mar 11 01:22:08 crc kubenswrapper[4744]: E0311 01:22:08.975722 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to 
\"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-678nx_openshift-machine-config-operator(a15dc7ac-7c34-4135-b6eb-a85122800ce9)\"" pod="openshift-machine-config-operator/machine-config-daemon-678nx" podUID="a15dc7ac-7c34-4135-b6eb-a85122800ce9" Mar 11 01:22:09 crc kubenswrapper[4744]: I0311 01:22:09.379949 4744 scope.go:117] "RemoveContainer" containerID="119366017deed13d5411b21ac7aa67584fa8eee5670f556b1938ccb084cb7a63" Mar 11 01:22:09 crc kubenswrapper[4744]: I0311 01:22:09.434154 4744 scope.go:117] "RemoveContainer" containerID="48d37240544df639a45e04fbd68879b9b5bfc51393609cb1b20a53fa838c61f3" Mar 11 01:22:09 crc kubenswrapper[4744]: I0311 01:22:09.476332 4744 scope.go:117] "RemoveContainer" containerID="d1726b9d6f03c03c356e9d6bb32bfd5fbcb952342b378d0b48932e5022673713" Mar 11 01:22:09 crc kubenswrapper[4744]: I0311 01:22:09.532855 4744 scope.go:117] "RemoveContainer" containerID="0c402c6747514bb3e0ca66a9c9737243dde53fd4f6cf51f17fc335fca19b4742" Mar 11 01:22:09 crc kubenswrapper[4744]: I0311 01:22:09.555724 4744 scope.go:117] "RemoveContainer" containerID="ea45ed7a7d4861a79b5a672dec79bdc1b140837ddb6e2ba942b9ea2973d75746" Mar 11 01:22:09 crc kubenswrapper[4744]: I0311 01:22:09.592661 4744 scope.go:117] "RemoveContainer" containerID="4fcc7b63cdb756d5d1faef2219f8ea9445e24b55a6b5b6e5dfb86462daed1a62" Mar 11 01:22:09 crc kubenswrapper[4744]: I0311 01:22:09.612268 4744 scope.go:117] "RemoveContainer" containerID="d9a6597b777989fed50af9702b46118834e4a564467d8179969dc4edc482ae74" Mar 11 01:22:09 crc kubenswrapper[4744]: I0311 01:22:09.643091 4744 scope.go:117] "RemoveContainer" containerID="e2ed15dda88956f35330c2f6ffc04931080e6522b2c86eb53ef084a7af41baec" Mar 11 01:22:09 crc kubenswrapper[4744]: I0311 01:22:09.659170 4744 scope.go:117] "RemoveContainer" containerID="e3b8eae5e6b10a6863aaecbd738bf9070b2142a94b79196c12273db23b6c7636" Mar 11 01:22:09 crc 
kubenswrapper[4744]: I0311 01:22:09.679287 4744 scope.go:117] "RemoveContainer" containerID="8264dcc7b47b99d9e0f3e4cd1ce2dce69f6b1a59b9f43863fc33c7b8c9be71ff" Mar 11 01:22:09 crc kubenswrapper[4744]: I0311 01:22:09.700538 4744 scope.go:117] "RemoveContainer" containerID="82fa8eb24907701acba9eb6c5271ab16d4109170acbe1d9ecb398258c67986fb" Mar 11 01:22:09 crc kubenswrapper[4744]: I0311 01:22:09.751392 4744 scope.go:117] "RemoveContainer" containerID="6599bceff68aad073272a774bd2d861c55ba5068a8f95a4f9574b99febed168a" Mar 11 01:22:09 crc kubenswrapper[4744]: I0311 01:22:09.779428 4744 scope.go:117] "RemoveContainer" containerID="29c5e48552c08092a3b4ed7d23f0734524ae1b5ed4a7bab0043222c4c3afdeda" Mar 11 01:22:09 crc kubenswrapper[4744]: I0311 01:22:09.814813 4744 scope.go:117] "RemoveContainer" containerID="0db3f9268dd586840a750ce3aab8eee17063d00b89aeb58328fea2ea809f3bd6" Mar 11 01:22:09 crc kubenswrapper[4744]: I0311 01:22:09.836883 4744 scope.go:117] "RemoveContainer" containerID="fb286387721bec0a1f91359d2c30c79ca49f4b4898995988b7e9e90f9a29caf6" Mar 11 01:22:09 crc kubenswrapper[4744]: I0311 01:22:09.874692 4744 scope.go:117] "RemoveContainer" containerID="fee28609ee0ea1a337fd4c5d5e10b1778640610d9c12fde1c9fb6d378bc3dee1" Mar 11 01:22:09 crc kubenswrapper[4744]: I0311 01:22:09.907813 4744 scope.go:117] "RemoveContainer" containerID="53dabf5222da9591bfd90f74338a980cda9124c6f3b99ab148f16d2ce8f3d265" Mar 11 01:22:09 crc kubenswrapper[4744]: I0311 01:22:09.941030 4744 scope.go:117] "RemoveContainer" containerID="a21a680118862f3c8728a36beb870d693dfd1b445663c3f01d10b1a3ddde25ad" Mar 11 01:22:09 crc kubenswrapper[4744]: I0311 01:22:09.967808 4744 scope.go:117] "RemoveContainer" containerID="a44ee64c6826fd9da076bf430593b7dd5a4963b74804af985f86a5ed00395e74" Mar 11 01:22:23 crc kubenswrapper[4744]: I0311 01:22:23.976369 4744 scope.go:117] "RemoveContainer" containerID="0473256ff02bdf69162deacfefe5387c093ffe2e643f7878e13771cc69dd585b" Mar 11 01:22:23 crc kubenswrapper[4744]: E0311 
01:22:23.978018 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-678nx_openshift-machine-config-operator(a15dc7ac-7c34-4135-b6eb-a85122800ce9)\"" pod="openshift-machine-config-operator/machine-config-daemon-678nx" podUID="a15dc7ac-7c34-4135-b6eb-a85122800ce9" Mar 11 01:22:36 crc kubenswrapper[4744]: I0311 01:22:36.975398 4744 scope.go:117] "RemoveContainer" containerID="0473256ff02bdf69162deacfefe5387c093ffe2e643f7878e13771cc69dd585b" Mar 11 01:22:36 crc kubenswrapper[4744]: E0311 01:22:36.976507 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-678nx_openshift-machine-config-operator(a15dc7ac-7c34-4135-b6eb-a85122800ce9)\"" pod="openshift-machine-config-operator/machine-config-daemon-678nx" podUID="a15dc7ac-7c34-4135-b6eb-a85122800ce9" Mar 11 01:22:50 crc kubenswrapper[4744]: I0311 01:22:50.974933 4744 scope.go:117] "RemoveContainer" containerID="0473256ff02bdf69162deacfefe5387c093ffe2e643f7878e13771cc69dd585b" Mar 11 01:22:50 crc kubenswrapper[4744]: E0311 01:22:50.975986 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-678nx_openshift-machine-config-operator(a15dc7ac-7c34-4135-b6eb-a85122800ce9)\"" pod="openshift-machine-config-operator/machine-config-daemon-678nx" podUID="a15dc7ac-7c34-4135-b6eb-a85122800ce9" Mar 11 01:22:51 crc kubenswrapper[4744]: I0311 01:22:51.290873 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-94pmr"] Mar 11 01:22:51 crc 
kubenswrapper[4744]: E0311 01:22:51.291391 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c0d271c5-b68d-4c0f-a525-d96c6b0053a3" containerName="oc" Mar 11 01:22:51 crc kubenswrapper[4744]: I0311 01:22:51.291422 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="c0d271c5-b68d-4c0f-a525-d96c6b0053a3" containerName="oc" Mar 11 01:22:51 crc kubenswrapper[4744]: I0311 01:22:51.291663 4744 memory_manager.go:354] "RemoveStaleState removing state" podUID="c0d271c5-b68d-4c0f-a525-d96c6b0053a3" containerName="oc" Mar 11 01:22:51 crc kubenswrapper[4744]: I0311 01:22:51.293023 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-94pmr" Mar 11 01:22:51 crc kubenswrapper[4744]: I0311 01:22:51.303685 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-94pmr"] Mar 11 01:22:51 crc kubenswrapper[4744]: I0311 01:22:51.461671 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7ec5fd1b-0c2a-4783-b967-c57dfa73ba0f-catalog-content\") pod \"community-operators-94pmr\" (UID: \"7ec5fd1b-0c2a-4783-b967-c57dfa73ba0f\") " pod="openshift-marketplace/community-operators-94pmr" Mar 11 01:22:51 crc kubenswrapper[4744]: I0311 01:22:51.461772 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7ec5fd1b-0c2a-4783-b967-c57dfa73ba0f-utilities\") pod \"community-operators-94pmr\" (UID: \"7ec5fd1b-0c2a-4783-b967-c57dfa73ba0f\") " pod="openshift-marketplace/community-operators-94pmr" Mar 11 01:22:51 crc kubenswrapper[4744]: I0311 01:22:51.461819 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m6qnb\" (UniqueName: 
\"kubernetes.io/projected/7ec5fd1b-0c2a-4783-b967-c57dfa73ba0f-kube-api-access-m6qnb\") pod \"community-operators-94pmr\" (UID: \"7ec5fd1b-0c2a-4783-b967-c57dfa73ba0f\") " pod="openshift-marketplace/community-operators-94pmr" Mar 11 01:22:51 crc kubenswrapper[4744]: I0311 01:22:51.562712 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7ec5fd1b-0c2a-4783-b967-c57dfa73ba0f-catalog-content\") pod \"community-operators-94pmr\" (UID: \"7ec5fd1b-0c2a-4783-b967-c57dfa73ba0f\") " pod="openshift-marketplace/community-operators-94pmr" Mar 11 01:22:51 crc kubenswrapper[4744]: I0311 01:22:51.562802 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7ec5fd1b-0c2a-4783-b967-c57dfa73ba0f-utilities\") pod \"community-operators-94pmr\" (UID: \"7ec5fd1b-0c2a-4783-b967-c57dfa73ba0f\") " pod="openshift-marketplace/community-operators-94pmr" Mar 11 01:22:51 crc kubenswrapper[4744]: I0311 01:22:51.562836 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m6qnb\" (UniqueName: \"kubernetes.io/projected/7ec5fd1b-0c2a-4783-b967-c57dfa73ba0f-kube-api-access-m6qnb\") pod \"community-operators-94pmr\" (UID: \"7ec5fd1b-0c2a-4783-b967-c57dfa73ba0f\") " pod="openshift-marketplace/community-operators-94pmr" Mar 11 01:22:51 crc kubenswrapper[4744]: I0311 01:22:51.563806 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7ec5fd1b-0c2a-4783-b967-c57dfa73ba0f-catalog-content\") pod \"community-operators-94pmr\" (UID: \"7ec5fd1b-0c2a-4783-b967-c57dfa73ba0f\") " pod="openshift-marketplace/community-operators-94pmr" Mar 11 01:22:51 crc kubenswrapper[4744]: I0311 01:22:51.564125 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/7ec5fd1b-0c2a-4783-b967-c57dfa73ba0f-utilities\") pod \"community-operators-94pmr\" (UID: \"7ec5fd1b-0c2a-4783-b967-c57dfa73ba0f\") " pod="openshift-marketplace/community-operators-94pmr" Mar 11 01:22:51 crc kubenswrapper[4744]: I0311 01:22:51.584659 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m6qnb\" (UniqueName: \"kubernetes.io/projected/7ec5fd1b-0c2a-4783-b967-c57dfa73ba0f-kube-api-access-m6qnb\") pod \"community-operators-94pmr\" (UID: \"7ec5fd1b-0c2a-4783-b967-c57dfa73ba0f\") " pod="openshift-marketplace/community-operators-94pmr" Mar 11 01:22:51 crc kubenswrapper[4744]: I0311 01:22:51.631437 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-94pmr" Mar 11 01:22:52 crc kubenswrapper[4744]: I0311 01:22:52.220763 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-94pmr"] Mar 11 01:22:52 crc kubenswrapper[4744]: I0311 01:22:52.254885 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-94pmr" event={"ID":"7ec5fd1b-0c2a-4783-b967-c57dfa73ba0f","Type":"ContainerStarted","Data":"486af0ccf36d196ff16e6ea9a125cfa03736f438b89afb5a96949cf2c393b8e2"} Mar 11 01:22:53 crc kubenswrapper[4744]: I0311 01:22:53.296096 4744 generic.go:334] "Generic (PLEG): container finished" podID="7ec5fd1b-0c2a-4783-b967-c57dfa73ba0f" containerID="857cba965aca177c7ed06becfe395d1b1f8da09a4aec5319e30974bcef4107ca" exitCode=0 Mar 11 01:22:53 crc kubenswrapper[4744]: I0311 01:22:53.296148 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-94pmr" event={"ID":"7ec5fd1b-0c2a-4783-b967-c57dfa73ba0f","Type":"ContainerDied","Data":"857cba965aca177c7ed06becfe395d1b1f8da09a4aec5319e30974bcef4107ca"} Mar 11 01:22:54 crc kubenswrapper[4744]: I0311 01:22:54.312121 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/community-operators-94pmr" event={"ID":"7ec5fd1b-0c2a-4783-b967-c57dfa73ba0f","Type":"ContainerStarted","Data":"b4ccec1c829abcdecd5b43ef1c33078cd09c6a076a22cfa515c421edd2cd4e68"} Mar 11 01:22:55 crc kubenswrapper[4744]: I0311 01:22:55.324862 4744 generic.go:334] "Generic (PLEG): container finished" podID="7ec5fd1b-0c2a-4783-b967-c57dfa73ba0f" containerID="b4ccec1c829abcdecd5b43ef1c33078cd09c6a076a22cfa515c421edd2cd4e68" exitCode=0 Mar 11 01:22:55 crc kubenswrapper[4744]: I0311 01:22:55.325008 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-94pmr" event={"ID":"7ec5fd1b-0c2a-4783-b967-c57dfa73ba0f","Type":"ContainerDied","Data":"b4ccec1c829abcdecd5b43ef1c33078cd09c6a076a22cfa515c421edd2cd4e68"} Mar 11 01:22:56 crc kubenswrapper[4744]: I0311 01:22:56.339132 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-94pmr" event={"ID":"7ec5fd1b-0c2a-4783-b967-c57dfa73ba0f","Type":"ContainerStarted","Data":"f167ea9f9344976458fb86394434d9a08411bbd0ec265e04683d4415e373061f"} Mar 11 01:22:56 crc kubenswrapper[4744]: I0311 01:22:56.375185 4744 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-94pmr" podStartSLOduration=2.926084751 podStartE2EDuration="5.375148689s" podCreationTimestamp="2026-03-11 01:22:51 +0000 UTC" firstStartedPulling="2026-03-11 01:22:53.304665582 +0000 UTC m=+1730.108883187" lastFinishedPulling="2026-03-11 01:22:55.75372952 +0000 UTC m=+1732.557947125" observedRunningTime="2026-03-11 01:22:56.37223856 +0000 UTC m=+1733.176456195" watchObservedRunningTime="2026-03-11 01:22:56.375148689 +0000 UTC m=+1733.179366334" Mar 11 01:23:01 crc kubenswrapper[4744]: I0311 01:23:01.631991 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-94pmr" Mar 11 01:23:01 crc kubenswrapper[4744]: I0311 01:23:01.632813 4744 
kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-94pmr" Mar 11 01:23:01 crc kubenswrapper[4744]: I0311 01:23:01.710929 4744 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-94pmr" Mar 11 01:23:02 crc kubenswrapper[4744]: I0311 01:23:02.459493 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-94pmr" Mar 11 01:23:02 crc kubenswrapper[4744]: I0311 01:23:02.540335 4744 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-94pmr"] Mar 11 01:23:04 crc kubenswrapper[4744]: I0311 01:23:04.419304 4744 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-94pmr" podUID="7ec5fd1b-0c2a-4783-b967-c57dfa73ba0f" containerName="registry-server" containerID="cri-o://f167ea9f9344976458fb86394434d9a08411bbd0ec265e04683d4415e373061f" gracePeriod=2 Mar 11 01:23:05 crc kubenswrapper[4744]: I0311 01:23:05.428819 4744 generic.go:334] "Generic (PLEG): container finished" podID="7ec5fd1b-0c2a-4783-b967-c57dfa73ba0f" containerID="f167ea9f9344976458fb86394434d9a08411bbd0ec265e04683d4415e373061f" exitCode=0 Mar 11 01:23:05 crc kubenswrapper[4744]: I0311 01:23:05.428992 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-94pmr" event={"ID":"7ec5fd1b-0c2a-4783-b967-c57dfa73ba0f","Type":"ContainerDied","Data":"f167ea9f9344976458fb86394434d9a08411bbd0ec265e04683d4415e373061f"} Mar 11 01:23:05 crc kubenswrapper[4744]: I0311 01:23:05.582788 4744 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-94pmr" Mar 11 01:23:05 crc kubenswrapper[4744]: I0311 01:23:05.694988 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7ec5fd1b-0c2a-4783-b967-c57dfa73ba0f-catalog-content\") pod \"7ec5fd1b-0c2a-4783-b967-c57dfa73ba0f\" (UID: \"7ec5fd1b-0c2a-4783-b967-c57dfa73ba0f\") " Mar 11 01:23:05 crc kubenswrapper[4744]: I0311 01:23:05.695170 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7ec5fd1b-0c2a-4783-b967-c57dfa73ba0f-utilities\") pod \"7ec5fd1b-0c2a-4783-b967-c57dfa73ba0f\" (UID: \"7ec5fd1b-0c2a-4783-b967-c57dfa73ba0f\") " Mar 11 01:23:05 crc kubenswrapper[4744]: I0311 01:23:05.696960 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7ec5fd1b-0c2a-4783-b967-c57dfa73ba0f-utilities" (OuterVolumeSpecName: "utilities") pod "7ec5fd1b-0c2a-4783-b967-c57dfa73ba0f" (UID: "7ec5fd1b-0c2a-4783-b967-c57dfa73ba0f"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 11 01:23:05 crc kubenswrapper[4744]: I0311 01:23:05.695249 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-m6qnb\" (UniqueName: \"kubernetes.io/projected/7ec5fd1b-0c2a-4783-b967-c57dfa73ba0f-kube-api-access-m6qnb\") pod \"7ec5fd1b-0c2a-4783-b967-c57dfa73ba0f\" (UID: \"7ec5fd1b-0c2a-4783-b967-c57dfa73ba0f\") " Mar 11 01:23:05 crc kubenswrapper[4744]: I0311 01:23:05.697774 4744 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7ec5fd1b-0c2a-4783-b967-c57dfa73ba0f-utilities\") on node \"crc\" DevicePath \"\"" Mar 11 01:23:05 crc kubenswrapper[4744]: I0311 01:23:05.705091 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7ec5fd1b-0c2a-4783-b967-c57dfa73ba0f-kube-api-access-m6qnb" (OuterVolumeSpecName: "kube-api-access-m6qnb") pod "7ec5fd1b-0c2a-4783-b967-c57dfa73ba0f" (UID: "7ec5fd1b-0c2a-4783-b967-c57dfa73ba0f"). InnerVolumeSpecName "kube-api-access-m6qnb". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 01:23:05 crc kubenswrapper[4744]: I0311 01:23:05.775586 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7ec5fd1b-0c2a-4783-b967-c57dfa73ba0f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "7ec5fd1b-0c2a-4783-b967-c57dfa73ba0f" (UID: "7ec5fd1b-0c2a-4783-b967-c57dfa73ba0f"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 11 01:23:05 crc kubenswrapper[4744]: I0311 01:23:05.799487 4744 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7ec5fd1b-0c2a-4783-b967-c57dfa73ba0f-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 11 01:23:05 crc kubenswrapper[4744]: I0311 01:23:05.799522 4744 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-m6qnb\" (UniqueName: \"kubernetes.io/projected/7ec5fd1b-0c2a-4783-b967-c57dfa73ba0f-kube-api-access-m6qnb\") on node \"crc\" DevicePath \"\"" Mar 11 01:23:05 crc kubenswrapper[4744]: I0311 01:23:05.974736 4744 scope.go:117] "RemoveContainer" containerID="0473256ff02bdf69162deacfefe5387c093ffe2e643f7878e13771cc69dd585b" Mar 11 01:23:05 crc kubenswrapper[4744]: E0311 01:23:05.975078 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-678nx_openshift-machine-config-operator(a15dc7ac-7c34-4135-b6eb-a85122800ce9)\"" pod="openshift-machine-config-operator/machine-config-daemon-678nx" podUID="a15dc7ac-7c34-4135-b6eb-a85122800ce9" Mar 11 01:23:06 crc kubenswrapper[4744]: I0311 01:23:06.444332 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-94pmr" event={"ID":"7ec5fd1b-0c2a-4783-b967-c57dfa73ba0f","Type":"ContainerDied","Data":"486af0ccf36d196ff16e6ea9a125cfa03736f438b89afb5a96949cf2c393b8e2"} Mar 11 01:23:06 crc kubenswrapper[4744]: I0311 01:23:06.444420 4744 scope.go:117] "RemoveContainer" containerID="f167ea9f9344976458fb86394434d9a08411bbd0ec265e04683d4415e373061f" Mar 11 01:23:06 crc kubenswrapper[4744]: I0311 01:23:06.444433 4744 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-94pmr" Mar 11 01:23:06 crc kubenswrapper[4744]: I0311 01:23:06.479765 4744 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-94pmr"] Mar 11 01:23:06 crc kubenswrapper[4744]: I0311 01:23:06.488339 4744 scope.go:117] "RemoveContainer" containerID="b4ccec1c829abcdecd5b43ef1c33078cd09c6a076a22cfa515c421edd2cd4e68" Mar 11 01:23:06 crc kubenswrapper[4744]: I0311 01:23:06.491102 4744 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-94pmr"] Mar 11 01:23:06 crc kubenswrapper[4744]: I0311 01:23:06.526837 4744 scope.go:117] "RemoveContainer" containerID="857cba965aca177c7ed06becfe395d1b1f8da09a4aec5319e30974bcef4107ca" Mar 11 01:23:07 crc kubenswrapper[4744]: I0311 01:23:07.985737 4744 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7ec5fd1b-0c2a-4783-b967-c57dfa73ba0f" path="/var/lib/kubelet/pods/7ec5fd1b-0c2a-4783-b967-c57dfa73ba0f/volumes" Mar 11 01:23:10 crc kubenswrapper[4744]: I0311 01:23:10.361992 4744 scope.go:117] "RemoveContainer" containerID="66c68ed9f5fec4903698aade9ec4c3fbc9295c3abb2ec1eb2b9ad1cedcf3b073" Mar 11 01:23:10 crc kubenswrapper[4744]: I0311 01:23:10.405403 4744 scope.go:117] "RemoveContainer" containerID="0f9b7029580d55c5f783ab75363f97ad48322c0595bdb273cac92c6d226c63be" Mar 11 01:23:10 crc kubenswrapper[4744]: I0311 01:23:10.486056 4744 scope.go:117] "RemoveContainer" containerID="69482b178fafc05c024eb004a115beaab033032799b5cadbee23b74dfb43962a" Mar 11 01:23:10 crc kubenswrapper[4744]: I0311 01:23:10.518984 4744 scope.go:117] "RemoveContainer" containerID="b1b1e7e9a3f9e195c5c8ffc0f9ba222b2dd152ff67cad9587e2f46e9f7c8f240" Mar 11 01:23:10 crc kubenswrapper[4744]: I0311 01:23:10.554608 4744 scope.go:117] "RemoveContainer" containerID="f8b8f7b449a25530a4e56bdbe6a657c349409b4ba50248bbde453d067a3c546c" Mar 11 01:23:10 crc kubenswrapper[4744]: I0311 01:23:10.594315 4744 
scope.go:117] "RemoveContainer" containerID="146ff6ba72638f407c5f03e5e7f7737ad544dcacd98a41c6f2d121f5c65294e6" Mar 11 01:23:10 crc kubenswrapper[4744]: I0311 01:23:10.620812 4744 scope.go:117] "RemoveContainer" containerID="90d7864d9f44afb54c45c8f83816565799a968b37e50716fa7d940d17e944838" Mar 11 01:23:10 crc kubenswrapper[4744]: I0311 01:23:10.659027 4744 scope.go:117] "RemoveContainer" containerID="9142db13bbc5f82956a74594d528d0085a37cf1bf44730e738c589a0b9fa71b1" Mar 11 01:23:10 crc kubenswrapper[4744]: I0311 01:23:10.681995 4744 scope.go:117] "RemoveContainer" containerID="0a9c241eccc912eedc93ca498ce49b3438428bba9d63396a61de516e8822d3fa" Mar 11 01:23:10 crc kubenswrapper[4744]: I0311 01:23:10.709774 4744 scope.go:117] "RemoveContainer" containerID="ebdc71e1a72435c8db752c01353cd2455830979afbd2fb878bea63ca113a125b" Mar 11 01:23:10 crc kubenswrapper[4744]: I0311 01:23:10.735374 4744 scope.go:117] "RemoveContainer" containerID="8cb2fb1eaf36c371a675c0619e8832d79d08d8d675c411d59203012135eb70ca" Mar 11 01:23:10 crc kubenswrapper[4744]: I0311 01:23:10.766069 4744 scope.go:117] "RemoveContainer" containerID="eb1793ab7701df1a2087e9693648ed7fc1c77b3aadb94a5a3e17509e9cd8767b" Mar 11 01:23:16 crc kubenswrapper[4744]: I0311 01:23:16.974469 4744 scope.go:117] "RemoveContainer" containerID="0473256ff02bdf69162deacfefe5387c093ffe2e643f7878e13771cc69dd585b" Mar 11 01:23:16 crc kubenswrapper[4744]: E0311 01:23:16.975429 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-678nx_openshift-machine-config-operator(a15dc7ac-7c34-4135-b6eb-a85122800ce9)\"" pod="openshift-machine-config-operator/machine-config-daemon-678nx" podUID="a15dc7ac-7c34-4135-b6eb-a85122800ce9" Mar 11 01:23:29 crc kubenswrapper[4744]: I0311 01:23:29.975225 4744 scope.go:117] "RemoveContainer" 
containerID="0473256ff02bdf69162deacfefe5387c093ffe2e643f7878e13771cc69dd585b" Mar 11 01:23:29 crc kubenswrapper[4744]: E0311 01:23:29.976572 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-678nx_openshift-machine-config-operator(a15dc7ac-7c34-4135-b6eb-a85122800ce9)\"" pod="openshift-machine-config-operator/machine-config-daemon-678nx" podUID="a15dc7ac-7c34-4135-b6eb-a85122800ce9" Mar 11 01:23:43 crc kubenswrapper[4744]: I0311 01:23:43.982646 4744 scope.go:117] "RemoveContainer" containerID="0473256ff02bdf69162deacfefe5387c093ffe2e643f7878e13771cc69dd585b" Mar 11 01:23:43 crc kubenswrapper[4744]: E0311 01:23:43.983576 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-678nx_openshift-machine-config-operator(a15dc7ac-7c34-4135-b6eb-a85122800ce9)\"" pod="openshift-machine-config-operator/machine-config-daemon-678nx" podUID="a15dc7ac-7c34-4135-b6eb-a85122800ce9" Mar 11 01:23:55 crc kubenswrapper[4744]: I0311 01:23:55.975008 4744 scope.go:117] "RemoveContainer" containerID="0473256ff02bdf69162deacfefe5387c093ffe2e643f7878e13771cc69dd585b" Mar 11 01:23:55 crc kubenswrapper[4744]: E0311 01:23:55.976006 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-678nx_openshift-machine-config-operator(a15dc7ac-7c34-4135-b6eb-a85122800ce9)\"" pod="openshift-machine-config-operator/machine-config-daemon-678nx" podUID="a15dc7ac-7c34-4135-b6eb-a85122800ce9" Mar 11 01:24:00 crc kubenswrapper[4744]: I0311 01:24:00.168336 4744 
kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29553204-l2rr2"] Mar 11 01:24:00 crc kubenswrapper[4744]: E0311 01:24:00.169172 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7ec5fd1b-0c2a-4783-b967-c57dfa73ba0f" containerName="extract-content" Mar 11 01:24:00 crc kubenswrapper[4744]: I0311 01:24:00.169199 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="7ec5fd1b-0c2a-4783-b967-c57dfa73ba0f" containerName="extract-content" Mar 11 01:24:00 crc kubenswrapper[4744]: E0311 01:24:00.169224 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7ec5fd1b-0c2a-4783-b967-c57dfa73ba0f" containerName="registry-server" Mar 11 01:24:00 crc kubenswrapper[4744]: I0311 01:24:00.169236 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="7ec5fd1b-0c2a-4783-b967-c57dfa73ba0f" containerName="registry-server" Mar 11 01:24:00 crc kubenswrapper[4744]: E0311 01:24:00.169258 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7ec5fd1b-0c2a-4783-b967-c57dfa73ba0f" containerName="extract-utilities" Mar 11 01:24:00 crc kubenswrapper[4744]: I0311 01:24:00.169271 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="7ec5fd1b-0c2a-4783-b967-c57dfa73ba0f" containerName="extract-utilities" Mar 11 01:24:00 crc kubenswrapper[4744]: I0311 01:24:00.169560 4744 memory_manager.go:354] "RemoveStaleState removing state" podUID="7ec5fd1b-0c2a-4783-b967-c57dfa73ba0f" containerName="registry-server" Mar 11 01:24:00 crc kubenswrapper[4744]: I0311 01:24:00.170474 4744 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29553204-l2rr2" Mar 11 01:24:00 crc kubenswrapper[4744]: I0311 01:24:00.174161 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 11 01:24:00 crc kubenswrapper[4744]: I0311 01:24:00.175505 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 11 01:24:00 crc kubenswrapper[4744]: I0311 01:24:00.178954 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-rs77s" Mar 11 01:24:00 crc kubenswrapper[4744]: I0311 01:24:00.188903 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29553204-l2rr2"] Mar 11 01:24:00 crc kubenswrapper[4744]: I0311 01:24:00.281658 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-69npf\" (UniqueName: \"kubernetes.io/projected/3422d7c5-ba80-438a-a2e9-420fd3ba6030-kube-api-access-69npf\") pod \"auto-csr-approver-29553204-l2rr2\" (UID: \"3422d7c5-ba80-438a-a2e9-420fd3ba6030\") " pod="openshift-infra/auto-csr-approver-29553204-l2rr2" Mar 11 01:24:00 crc kubenswrapper[4744]: I0311 01:24:00.384285 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-69npf\" (UniqueName: \"kubernetes.io/projected/3422d7c5-ba80-438a-a2e9-420fd3ba6030-kube-api-access-69npf\") pod \"auto-csr-approver-29553204-l2rr2\" (UID: \"3422d7c5-ba80-438a-a2e9-420fd3ba6030\") " pod="openshift-infra/auto-csr-approver-29553204-l2rr2" Mar 11 01:24:00 crc kubenswrapper[4744]: I0311 01:24:00.413135 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-69npf\" (UniqueName: \"kubernetes.io/projected/3422d7c5-ba80-438a-a2e9-420fd3ba6030-kube-api-access-69npf\") pod \"auto-csr-approver-29553204-l2rr2\" (UID: \"3422d7c5-ba80-438a-a2e9-420fd3ba6030\") " 
pod="openshift-infra/auto-csr-approver-29553204-l2rr2" Mar 11 01:24:00 crc kubenswrapper[4744]: I0311 01:24:00.509119 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29553204-l2rr2" Mar 11 01:24:00 crc kubenswrapper[4744]: I0311 01:24:00.993106 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29553204-l2rr2"] Mar 11 01:24:01 crc kubenswrapper[4744]: I0311 01:24:01.045661 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29553204-l2rr2" event={"ID":"3422d7c5-ba80-438a-a2e9-420fd3ba6030","Type":"ContainerStarted","Data":"41cb47c95eb34f13987b5bf1b8345ef7a46d419c95d04daa2a7ad476e2df0bd5"} Mar 11 01:24:03 crc kubenswrapper[4744]: I0311 01:24:03.065416 4744 generic.go:334] "Generic (PLEG): container finished" podID="3422d7c5-ba80-438a-a2e9-420fd3ba6030" containerID="a4520cb01287ba761e907697dc32c7b92d688cd881fb12f92f86025c6d667156" exitCode=0 Mar 11 01:24:03 crc kubenswrapper[4744]: I0311 01:24:03.065607 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29553204-l2rr2" event={"ID":"3422d7c5-ba80-438a-a2e9-420fd3ba6030","Type":"ContainerDied","Data":"a4520cb01287ba761e907697dc32c7b92d688cd881fb12f92f86025c6d667156"} Mar 11 01:24:04 crc kubenswrapper[4744]: I0311 01:24:04.429539 4744 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29553204-l2rr2" Mar 11 01:24:04 crc kubenswrapper[4744]: I0311 01:24:04.554834 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-69npf\" (UniqueName: \"kubernetes.io/projected/3422d7c5-ba80-438a-a2e9-420fd3ba6030-kube-api-access-69npf\") pod \"3422d7c5-ba80-438a-a2e9-420fd3ba6030\" (UID: \"3422d7c5-ba80-438a-a2e9-420fd3ba6030\") " Mar 11 01:24:04 crc kubenswrapper[4744]: I0311 01:24:04.562758 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3422d7c5-ba80-438a-a2e9-420fd3ba6030-kube-api-access-69npf" (OuterVolumeSpecName: "kube-api-access-69npf") pod "3422d7c5-ba80-438a-a2e9-420fd3ba6030" (UID: "3422d7c5-ba80-438a-a2e9-420fd3ba6030"). InnerVolumeSpecName "kube-api-access-69npf". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 01:24:04 crc kubenswrapper[4744]: I0311 01:24:04.657703 4744 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-69npf\" (UniqueName: \"kubernetes.io/projected/3422d7c5-ba80-438a-a2e9-420fd3ba6030-kube-api-access-69npf\") on node \"crc\" DevicePath \"\"" Mar 11 01:24:05 crc kubenswrapper[4744]: I0311 01:24:05.087330 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29553204-l2rr2" event={"ID":"3422d7c5-ba80-438a-a2e9-420fd3ba6030","Type":"ContainerDied","Data":"41cb47c95eb34f13987b5bf1b8345ef7a46d419c95d04daa2a7ad476e2df0bd5"} Mar 11 01:24:05 crc kubenswrapper[4744]: I0311 01:24:05.087378 4744 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29553204-l2rr2" Mar 11 01:24:05 crc kubenswrapper[4744]: I0311 01:24:05.087392 4744 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="41cb47c95eb34f13987b5bf1b8345ef7a46d419c95d04daa2a7ad476e2df0bd5" Mar 11 01:24:05 crc kubenswrapper[4744]: I0311 01:24:05.516675 4744 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29553198-skc7b"] Mar 11 01:24:05 crc kubenswrapper[4744]: I0311 01:24:05.523582 4744 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29553198-skc7b"] Mar 11 01:24:05 crc kubenswrapper[4744]: I0311 01:24:05.986641 4744 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="29b550d6-9fea-497a-84d0-4f05d34f3013" path="/var/lib/kubelet/pods/29b550d6-9fea-497a-84d0-4f05d34f3013/volumes" Mar 11 01:24:07 crc kubenswrapper[4744]: I0311 01:24:07.974808 4744 scope.go:117] "RemoveContainer" containerID="0473256ff02bdf69162deacfefe5387c093ffe2e643f7878e13771cc69dd585b" Mar 11 01:24:07 crc kubenswrapper[4744]: E0311 01:24:07.975491 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-678nx_openshift-machine-config-operator(a15dc7ac-7c34-4135-b6eb-a85122800ce9)\"" pod="openshift-machine-config-operator/machine-config-daemon-678nx" podUID="a15dc7ac-7c34-4135-b6eb-a85122800ce9" Mar 11 01:24:10 crc kubenswrapper[4744]: I0311 01:24:10.952787 4744 scope.go:117] "RemoveContainer" containerID="4adb24658e3a9f0e113d48ab8079e1c019d2d01a9440a17a80fbe8fa0e414acf" Mar 11 01:24:11 crc kubenswrapper[4744]: I0311 01:24:11.006803 4744 scope.go:117] "RemoveContainer" containerID="5525f25b06e6ff4c2d8a48534b38c68ef8ef4cf848bcb4cccfe6dca7d1cdfb4f" Mar 11 01:24:11 crc kubenswrapper[4744]: I0311 01:24:11.059303 4744 
scope.go:117] "RemoveContainer" containerID="5609d95b861d80c9c856a5d5ae721e3df23258d3dac31b4a8d829352cafed8e3" Mar 11 01:24:11 crc kubenswrapper[4744]: I0311 01:24:11.092710 4744 scope.go:117] "RemoveContainer" containerID="b3c2039b5d3c821ce56b152f8a4817de58be05051c1b9159d5f0d6fc2571c7eb" Mar 11 01:24:22 crc kubenswrapper[4744]: I0311 01:24:22.974445 4744 scope.go:117] "RemoveContainer" containerID="0473256ff02bdf69162deacfefe5387c093ffe2e643f7878e13771cc69dd585b" Mar 11 01:24:22 crc kubenswrapper[4744]: E0311 01:24:22.975188 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-678nx_openshift-machine-config-operator(a15dc7ac-7c34-4135-b6eb-a85122800ce9)\"" pod="openshift-machine-config-operator/machine-config-daemon-678nx" podUID="a15dc7ac-7c34-4135-b6eb-a85122800ce9" Mar 11 01:24:35 crc kubenswrapper[4744]: I0311 01:24:35.975826 4744 scope.go:117] "RemoveContainer" containerID="0473256ff02bdf69162deacfefe5387c093ffe2e643f7878e13771cc69dd585b" Mar 11 01:24:35 crc kubenswrapper[4744]: E0311 01:24:35.976840 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-678nx_openshift-machine-config-operator(a15dc7ac-7c34-4135-b6eb-a85122800ce9)\"" pod="openshift-machine-config-operator/machine-config-daemon-678nx" podUID="a15dc7ac-7c34-4135-b6eb-a85122800ce9" Mar 11 01:24:46 crc kubenswrapper[4744]: I0311 01:24:46.974591 4744 scope.go:117] "RemoveContainer" containerID="0473256ff02bdf69162deacfefe5387c093ffe2e643f7878e13771cc69dd585b" Mar 11 01:24:46 crc kubenswrapper[4744]: E0311 01:24:46.975476 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" 
with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-678nx_openshift-machine-config-operator(a15dc7ac-7c34-4135-b6eb-a85122800ce9)\"" pod="openshift-machine-config-operator/machine-config-daemon-678nx" podUID="a15dc7ac-7c34-4135-b6eb-a85122800ce9" Mar 11 01:25:01 crc kubenswrapper[4744]: I0311 01:25:01.975389 4744 scope.go:117] "RemoveContainer" containerID="0473256ff02bdf69162deacfefe5387c093ffe2e643f7878e13771cc69dd585b" Mar 11 01:25:01 crc kubenswrapper[4744]: E0311 01:25:01.976355 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-678nx_openshift-machine-config-operator(a15dc7ac-7c34-4135-b6eb-a85122800ce9)\"" pod="openshift-machine-config-operator/machine-config-daemon-678nx" podUID="a15dc7ac-7c34-4135-b6eb-a85122800ce9" Mar 11 01:25:11 crc kubenswrapper[4744]: I0311 01:25:11.173164 4744 scope.go:117] "RemoveContainer" containerID="39f29b5976ac2aa60fcfaaddf31122cfd7fb5d5f0ee9a848396ad10963415872" Mar 11 01:25:16 crc kubenswrapper[4744]: I0311 01:25:16.974737 4744 scope.go:117] "RemoveContainer" containerID="0473256ff02bdf69162deacfefe5387c093ffe2e643f7878e13771cc69dd585b" Mar 11 01:25:17 crc kubenswrapper[4744]: I0311 01:25:17.779662 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-678nx" event={"ID":"a15dc7ac-7c34-4135-b6eb-a85122800ce9","Type":"ContainerStarted","Data":"69edf655e778db6d1e81f55a954739ba06410685289317ed77b6037fb685d599"} Mar 11 01:26:00 crc kubenswrapper[4744]: I0311 01:26:00.181322 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29553206-n2dqc"] Mar 11 01:26:00 crc kubenswrapper[4744]: E0311 01:26:00.182323 4744 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="3422d7c5-ba80-438a-a2e9-420fd3ba6030" containerName="oc" Mar 11 01:26:00 crc kubenswrapper[4744]: I0311 01:26:00.182345 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="3422d7c5-ba80-438a-a2e9-420fd3ba6030" containerName="oc" Mar 11 01:26:00 crc kubenswrapper[4744]: I0311 01:26:00.182635 4744 memory_manager.go:354] "RemoveStaleState removing state" podUID="3422d7c5-ba80-438a-a2e9-420fd3ba6030" containerName="oc" Mar 11 01:26:00 crc kubenswrapper[4744]: I0311 01:26:00.183331 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29553206-n2dqc" Mar 11 01:26:00 crc kubenswrapper[4744]: I0311 01:26:00.185013 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 11 01:26:00 crc kubenswrapper[4744]: I0311 01:26:00.185230 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 11 01:26:00 crc kubenswrapper[4744]: I0311 01:26:00.185635 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-rs77s" Mar 11 01:26:00 crc kubenswrapper[4744]: I0311 01:26:00.187923 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29553206-n2dqc"] Mar 11 01:26:00 crc kubenswrapper[4744]: I0311 01:26:00.288081 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j5sbh\" (UniqueName: \"kubernetes.io/projected/7b9114d1-71dd-4281-a95b-5c29066c6093-kube-api-access-j5sbh\") pod \"auto-csr-approver-29553206-n2dqc\" (UID: \"7b9114d1-71dd-4281-a95b-5c29066c6093\") " pod="openshift-infra/auto-csr-approver-29553206-n2dqc" Mar 11 01:26:00 crc kubenswrapper[4744]: I0311 01:26:00.389737 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j5sbh\" (UniqueName: 
\"kubernetes.io/projected/7b9114d1-71dd-4281-a95b-5c29066c6093-kube-api-access-j5sbh\") pod \"auto-csr-approver-29553206-n2dqc\" (UID: \"7b9114d1-71dd-4281-a95b-5c29066c6093\") " pod="openshift-infra/auto-csr-approver-29553206-n2dqc" Mar 11 01:26:00 crc kubenswrapper[4744]: I0311 01:26:00.424843 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j5sbh\" (UniqueName: \"kubernetes.io/projected/7b9114d1-71dd-4281-a95b-5c29066c6093-kube-api-access-j5sbh\") pod \"auto-csr-approver-29553206-n2dqc\" (UID: \"7b9114d1-71dd-4281-a95b-5c29066c6093\") " pod="openshift-infra/auto-csr-approver-29553206-n2dqc" Mar 11 01:26:00 crc kubenswrapper[4744]: I0311 01:26:00.514086 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29553206-n2dqc" Mar 11 01:26:01 crc kubenswrapper[4744]: I0311 01:26:01.043734 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29553206-n2dqc"] Mar 11 01:26:01 crc kubenswrapper[4744]: I0311 01:26:01.056138 4744 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 11 01:26:01 crc kubenswrapper[4744]: I0311 01:26:01.175341 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29553206-n2dqc" event={"ID":"7b9114d1-71dd-4281-a95b-5c29066c6093","Type":"ContainerStarted","Data":"2d16ff99ffb79a7f44b6b38847613bcddd6e3fc3acbfed8bec307483a0c8e3af"} Mar 11 01:26:03 crc kubenswrapper[4744]: I0311 01:26:03.195038 4744 generic.go:334] "Generic (PLEG): container finished" podID="7b9114d1-71dd-4281-a95b-5c29066c6093" containerID="d5ce76c7158b9f92c1c7120dfcf72df5750377fbe841fd6156d52d86ccb0ef77" exitCode=0 Mar 11 01:26:03 crc kubenswrapper[4744]: I0311 01:26:03.195162 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29553206-n2dqc" 
event={"ID":"7b9114d1-71dd-4281-a95b-5c29066c6093","Type":"ContainerDied","Data":"d5ce76c7158b9f92c1c7120dfcf72df5750377fbe841fd6156d52d86ccb0ef77"} Mar 11 01:26:04 crc kubenswrapper[4744]: I0311 01:26:04.567961 4744 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29553206-n2dqc" Mar 11 01:26:04 crc kubenswrapper[4744]: I0311 01:26:04.760064 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-j5sbh\" (UniqueName: \"kubernetes.io/projected/7b9114d1-71dd-4281-a95b-5c29066c6093-kube-api-access-j5sbh\") pod \"7b9114d1-71dd-4281-a95b-5c29066c6093\" (UID: \"7b9114d1-71dd-4281-a95b-5c29066c6093\") " Mar 11 01:26:04 crc kubenswrapper[4744]: I0311 01:26:04.768853 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7b9114d1-71dd-4281-a95b-5c29066c6093-kube-api-access-j5sbh" (OuterVolumeSpecName: "kube-api-access-j5sbh") pod "7b9114d1-71dd-4281-a95b-5c29066c6093" (UID: "7b9114d1-71dd-4281-a95b-5c29066c6093"). InnerVolumeSpecName "kube-api-access-j5sbh". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 01:26:04 crc kubenswrapper[4744]: I0311 01:26:04.861955 4744 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-j5sbh\" (UniqueName: \"kubernetes.io/projected/7b9114d1-71dd-4281-a95b-5c29066c6093-kube-api-access-j5sbh\") on node \"crc\" DevicePath \"\"" Mar 11 01:26:05 crc kubenswrapper[4744]: I0311 01:26:05.213038 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29553206-n2dqc" event={"ID":"7b9114d1-71dd-4281-a95b-5c29066c6093","Type":"ContainerDied","Data":"2d16ff99ffb79a7f44b6b38847613bcddd6e3fc3acbfed8bec307483a0c8e3af"} Mar 11 01:26:05 crc kubenswrapper[4744]: I0311 01:26:05.213072 4744 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2d16ff99ffb79a7f44b6b38847613bcddd6e3fc3acbfed8bec307483a0c8e3af" Mar 11 01:26:05 crc kubenswrapper[4744]: I0311 01:26:05.213104 4744 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29553206-n2dqc" Mar 11 01:26:05 crc kubenswrapper[4744]: I0311 01:26:05.653960 4744 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29553200-7v85t"] Mar 11 01:26:05 crc kubenswrapper[4744]: I0311 01:26:05.666330 4744 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29553200-7v85t"] Mar 11 01:26:05 crc kubenswrapper[4744]: I0311 01:26:05.985107 4744 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c5c9f281-5118-4349-bd48-43287ffb8059" path="/var/lib/kubelet/pods/c5c9f281-5118-4349-bd48-43287ffb8059/volumes" Mar 11 01:26:11 crc kubenswrapper[4744]: I0311 01:26:11.237487 4744 scope.go:117] "RemoveContainer" containerID="ff841945ddeea975698cf30b38f5d17e3d5bfd015350bd4a068055055d217192" Mar 11 01:27:42 crc kubenswrapper[4744]: I0311 01:27:42.409297 4744 patch_prober.go:28] interesting pod/machine-config-daemon-678nx 
container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 11 01:27:42 crc kubenswrapper[4744]: I0311 01:27:42.410002 4744 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-678nx" podUID="a15dc7ac-7c34-4135-b6eb-a85122800ce9" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 11 01:28:00 crc kubenswrapper[4744]: I0311 01:28:00.172825 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29553208-djwwv"] Mar 11 01:28:00 crc kubenswrapper[4744]: E0311 01:28:00.173913 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7b9114d1-71dd-4281-a95b-5c29066c6093" containerName="oc" Mar 11 01:28:00 crc kubenswrapper[4744]: I0311 01:28:00.173937 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="7b9114d1-71dd-4281-a95b-5c29066c6093" containerName="oc" Mar 11 01:28:00 crc kubenswrapper[4744]: I0311 01:28:00.174195 4744 memory_manager.go:354] "RemoveStaleState removing state" podUID="7b9114d1-71dd-4281-a95b-5c29066c6093" containerName="oc" Mar 11 01:28:00 crc kubenswrapper[4744]: I0311 01:28:00.174888 4744 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29553208-djwwv" Mar 11 01:28:00 crc kubenswrapper[4744]: I0311 01:28:00.178556 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 11 01:28:00 crc kubenswrapper[4744]: I0311 01:28:00.179019 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-rs77s" Mar 11 01:28:00 crc kubenswrapper[4744]: I0311 01:28:00.181725 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 11 01:28:00 crc kubenswrapper[4744]: I0311 01:28:00.190629 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29553208-djwwv"] Mar 11 01:28:00 crc kubenswrapper[4744]: I0311 01:28:00.310265 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jn74s\" (UniqueName: \"kubernetes.io/projected/16ee6953-f4fc-4b81-9141-736585fb3c37-kube-api-access-jn74s\") pod \"auto-csr-approver-29553208-djwwv\" (UID: \"16ee6953-f4fc-4b81-9141-736585fb3c37\") " pod="openshift-infra/auto-csr-approver-29553208-djwwv" Mar 11 01:28:00 crc kubenswrapper[4744]: I0311 01:28:00.412134 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jn74s\" (UniqueName: \"kubernetes.io/projected/16ee6953-f4fc-4b81-9141-736585fb3c37-kube-api-access-jn74s\") pod \"auto-csr-approver-29553208-djwwv\" (UID: \"16ee6953-f4fc-4b81-9141-736585fb3c37\") " pod="openshift-infra/auto-csr-approver-29553208-djwwv" Mar 11 01:28:00 crc kubenswrapper[4744]: I0311 01:28:00.439569 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jn74s\" (UniqueName: \"kubernetes.io/projected/16ee6953-f4fc-4b81-9141-736585fb3c37-kube-api-access-jn74s\") pod \"auto-csr-approver-29553208-djwwv\" (UID: \"16ee6953-f4fc-4b81-9141-736585fb3c37\") " 
pod="openshift-infra/auto-csr-approver-29553208-djwwv" Mar 11 01:28:00 crc kubenswrapper[4744]: I0311 01:28:00.498663 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29553208-djwwv" Mar 11 01:28:00 crc kubenswrapper[4744]: I0311 01:28:00.986089 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29553208-djwwv"] Mar 11 01:28:01 crc kubenswrapper[4744]: I0311 01:28:01.915028 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29553208-djwwv" event={"ID":"16ee6953-f4fc-4b81-9141-736585fb3c37","Type":"ContainerStarted","Data":"2e327ef96cc35fc976e24528841e1caba9e23255ae088e47911e8e06a82298f7"} Mar 11 01:28:02 crc kubenswrapper[4744]: I0311 01:28:02.926842 4744 generic.go:334] "Generic (PLEG): container finished" podID="16ee6953-f4fc-4b81-9141-736585fb3c37" containerID="d395d4fdf3fd77e7ea06c8e82cf9f9f286bbcf698facf4a1ddcb74769673d81f" exitCode=0 Mar 11 01:28:02 crc kubenswrapper[4744]: I0311 01:28:02.926987 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29553208-djwwv" event={"ID":"16ee6953-f4fc-4b81-9141-736585fb3c37","Type":"ContainerDied","Data":"d395d4fdf3fd77e7ea06c8e82cf9f9f286bbcf698facf4a1ddcb74769673d81f"} Mar 11 01:28:04 crc kubenswrapper[4744]: I0311 01:28:04.247741 4744 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29553208-djwwv" Mar 11 01:28:04 crc kubenswrapper[4744]: I0311 01:28:04.370721 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jn74s\" (UniqueName: \"kubernetes.io/projected/16ee6953-f4fc-4b81-9141-736585fb3c37-kube-api-access-jn74s\") pod \"16ee6953-f4fc-4b81-9141-736585fb3c37\" (UID: \"16ee6953-f4fc-4b81-9141-736585fb3c37\") " Mar 11 01:28:04 crc kubenswrapper[4744]: I0311 01:28:04.380276 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/16ee6953-f4fc-4b81-9141-736585fb3c37-kube-api-access-jn74s" (OuterVolumeSpecName: "kube-api-access-jn74s") pod "16ee6953-f4fc-4b81-9141-736585fb3c37" (UID: "16ee6953-f4fc-4b81-9141-736585fb3c37"). InnerVolumeSpecName "kube-api-access-jn74s". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 01:28:04 crc kubenswrapper[4744]: I0311 01:28:04.473191 4744 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jn74s\" (UniqueName: \"kubernetes.io/projected/16ee6953-f4fc-4b81-9141-736585fb3c37-kube-api-access-jn74s\") on node \"crc\" DevicePath \"\"" Mar 11 01:28:04 crc kubenswrapper[4744]: I0311 01:28:04.961143 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29553208-djwwv" event={"ID":"16ee6953-f4fc-4b81-9141-736585fb3c37","Type":"ContainerDied","Data":"2e327ef96cc35fc976e24528841e1caba9e23255ae088e47911e8e06a82298f7"} Mar 11 01:28:04 crc kubenswrapper[4744]: I0311 01:28:04.961213 4744 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2e327ef96cc35fc976e24528841e1caba9e23255ae088e47911e8e06a82298f7" Mar 11 01:28:04 crc kubenswrapper[4744]: I0311 01:28:04.961292 4744 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29553208-djwwv" Mar 11 01:28:05 crc kubenswrapper[4744]: I0311 01:28:05.334808 4744 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29553202-cqp9j"] Mar 11 01:28:05 crc kubenswrapper[4744]: I0311 01:28:05.341583 4744 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29553202-cqp9j"] Mar 11 01:28:05 crc kubenswrapper[4744]: I0311 01:28:05.989200 4744 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c0d271c5-b68d-4c0f-a525-d96c6b0053a3" path="/var/lib/kubelet/pods/c0d271c5-b68d-4c0f-a525-d96c6b0053a3/volumes" Mar 11 01:28:11 crc kubenswrapper[4744]: I0311 01:28:11.353603 4744 scope.go:117] "RemoveContainer" containerID="739127757cd10c744d4c6dfba47db228b6f0026224f80cb3c0973501e5d10f1b" Mar 11 01:28:12 crc kubenswrapper[4744]: I0311 01:28:12.410250 4744 patch_prober.go:28] interesting pod/machine-config-daemon-678nx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 11 01:28:12 crc kubenswrapper[4744]: I0311 01:28:12.410364 4744 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-678nx" podUID="a15dc7ac-7c34-4135-b6eb-a85122800ce9" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 11 01:28:42 crc kubenswrapper[4744]: I0311 01:28:42.409464 4744 patch_prober.go:28] interesting pod/machine-config-daemon-678nx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 11 01:28:42 crc kubenswrapper[4744]: 
I0311 01:28:42.410303 4744 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-678nx" podUID="a15dc7ac-7c34-4135-b6eb-a85122800ce9" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 11 01:28:42 crc kubenswrapper[4744]: I0311 01:28:42.410454 4744 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-678nx" Mar 11 01:28:42 crc kubenswrapper[4744]: I0311 01:28:42.411702 4744 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"69edf655e778db6d1e81f55a954739ba06410685289317ed77b6037fb685d599"} pod="openshift-machine-config-operator/machine-config-daemon-678nx" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 11 01:28:42 crc kubenswrapper[4744]: I0311 01:28:42.411829 4744 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-678nx" podUID="a15dc7ac-7c34-4135-b6eb-a85122800ce9" containerName="machine-config-daemon" containerID="cri-o://69edf655e778db6d1e81f55a954739ba06410685289317ed77b6037fb685d599" gracePeriod=600 Mar 11 01:28:43 crc kubenswrapper[4744]: I0311 01:28:43.323105 4744 generic.go:334] "Generic (PLEG): container finished" podID="a15dc7ac-7c34-4135-b6eb-a85122800ce9" containerID="69edf655e778db6d1e81f55a954739ba06410685289317ed77b6037fb685d599" exitCode=0 Mar 11 01:28:43 crc kubenswrapper[4744]: I0311 01:28:43.323412 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-678nx" event={"ID":"a15dc7ac-7c34-4135-b6eb-a85122800ce9","Type":"ContainerDied","Data":"69edf655e778db6d1e81f55a954739ba06410685289317ed77b6037fb685d599"} Mar 11 01:28:43 crc 
kubenswrapper[4744]: I0311 01:28:43.323958 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-678nx" event={"ID":"a15dc7ac-7c34-4135-b6eb-a85122800ce9","Type":"ContainerStarted","Data":"f347064f0cb75f8bb43f62de0ab1edab4843ef09d878578850df37cec024ff09"} Mar 11 01:28:43 crc kubenswrapper[4744]: I0311 01:28:43.323997 4744 scope.go:117] "RemoveContainer" containerID="0473256ff02bdf69162deacfefe5387c093ffe2e643f7878e13771cc69dd585b" Mar 11 01:30:00 crc kubenswrapper[4744]: I0311 01:30:00.182116 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29553210-4vgls"] Mar 11 01:30:00 crc kubenswrapper[4744]: E0311 01:30:00.183422 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="16ee6953-f4fc-4b81-9141-736585fb3c37" containerName="oc" Mar 11 01:30:00 crc kubenswrapper[4744]: I0311 01:30:00.183452 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="16ee6953-f4fc-4b81-9141-736585fb3c37" containerName="oc" Mar 11 01:30:00 crc kubenswrapper[4744]: I0311 01:30:00.183841 4744 memory_manager.go:354] "RemoveStaleState removing state" podUID="16ee6953-f4fc-4b81-9141-736585fb3c37" containerName="oc" Mar 11 01:30:00 crc kubenswrapper[4744]: I0311 01:30:00.184845 4744 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29553210-4vgls" Mar 11 01:30:00 crc kubenswrapper[4744]: I0311 01:30:00.189822 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-rs77s" Mar 11 01:30:00 crc kubenswrapper[4744]: I0311 01:30:00.190269 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 11 01:30:00 crc kubenswrapper[4744]: I0311 01:30:00.191453 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29553210-6jzxm"] Mar 11 01:30:00 crc kubenswrapper[4744]: I0311 01:30:00.192408 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29553210-6jzxm" Mar 11 01:30:00 crc kubenswrapper[4744]: I0311 01:30:00.196333 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 11 01:30:00 crc kubenswrapper[4744]: I0311 01:30:00.196807 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Mar 11 01:30:00 crc kubenswrapper[4744]: I0311 01:30:00.198577 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Mar 11 01:30:00 crc kubenswrapper[4744]: I0311 01:30:00.211485 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29553210-6jzxm"] Mar 11 01:30:00 crc kubenswrapper[4744]: I0311 01:30:00.219801 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29553210-4vgls"] Mar 11 01:30:00 crc kubenswrapper[4744]: I0311 01:30:00.367996 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: 
\"kubernetes.io/configmap/85fad93b-fbcd-4de5-a08a-50d81b03f1c3-config-volume\") pod \"collect-profiles-29553210-6jzxm\" (UID: \"85fad93b-fbcd-4de5-a08a-50d81b03f1c3\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29553210-6jzxm" Mar 11 01:30:00 crc kubenswrapper[4744]: I0311 01:30:00.368092 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6dt4w\" (UniqueName: \"kubernetes.io/projected/3d4c219b-834d-47cd-97aa-cf789dd2b8ce-kube-api-access-6dt4w\") pod \"auto-csr-approver-29553210-4vgls\" (UID: \"3d4c219b-834d-47cd-97aa-cf789dd2b8ce\") " pod="openshift-infra/auto-csr-approver-29553210-4vgls" Mar 11 01:30:00 crc kubenswrapper[4744]: I0311 01:30:00.368125 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7fkq2\" (UniqueName: \"kubernetes.io/projected/85fad93b-fbcd-4de5-a08a-50d81b03f1c3-kube-api-access-7fkq2\") pod \"collect-profiles-29553210-6jzxm\" (UID: \"85fad93b-fbcd-4de5-a08a-50d81b03f1c3\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29553210-6jzxm" Mar 11 01:30:00 crc kubenswrapper[4744]: I0311 01:30:00.368204 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/85fad93b-fbcd-4de5-a08a-50d81b03f1c3-secret-volume\") pod \"collect-profiles-29553210-6jzxm\" (UID: \"85fad93b-fbcd-4de5-a08a-50d81b03f1c3\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29553210-6jzxm" Mar 11 01:30:00 crc kubenswrapper[4744]: I0311 01:30:00.469667 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/85fad93b-fbcd-4de5-a08a-50d81b03f1c3-secret-volume\") pod \"collect-profiles-29553210-6jzxm\" (UID: \"85fad93b-fbcd-4de5-a08a-50d81b03f1c3\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29553210-6jzxm" Mar 11 
01:30:00 crc kubenswrapper[4744]: I0311 01:30:00.469766 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/85fad93b-fbcd-4de5-a08a-50d81b03f1c3-config-volume\") pod \"collect-profiles-29553210-6jzxm\" (UID: \"85fad93b-fbcd-4de5-a08a-50d81b03f1c3\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29553210-6jzxm" Mar 11 01:30:00 crc kubenswrapper[4744]: I0311 01:30:00.469804 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6dt4w\" (UniqueName: \"kubernetes.io/projected/3d4c219b-834d-47cd-97aa-cf789dd2b8ce-kube-api-access-6dt4w\") pod \"auto-csr-approver-29553210-4vgls\" (UID: \"3d4c219b-834d-47cd-97aa-cf789dd2b8ce\") " pod="openshift-infra/auto-csr-approver-29553210-4vgls" Mar 11 01:30:00 crc kubenswrapper[4744]: I0311 01:30:00.469838 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7fkq2\" (UniqueName: \"kubernetes.io/projected/85fad93b-fbcd-4de5-a08a-50d81b03f1c3-kube-api-access-7fkq2\") pod \"collect-profiles-29553210-6jzxm\" (UID: \"85fad93b-fbcd-4de5-a08a-50d81b03f1c3\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29553210-6jzxm" Mar 11 01:30:00 crc kubenswrapper[4744]: I0311 01:30:00.472263 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/85fad93b-fbcd-4de5-a08a-50d81b03f1c3-config-volume\") pod \"collect-profiles-29553210-6jzxm\" (UID: \"85fad93b-fbcd-4de5-a08a-50d81b03f1c3\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29553210-6jzxm" Mar 11 01:30:00 crc kubenswrapper[4744]: I0311 01:30:00.479355 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/85fad93b-fbcd-4de5-a08a-50d81b03f1c3-secret-volume\") pod \"collect-profiles-29553210-6jzxm\" (UID: 
\"85fad93b-fbcd-4de5-a08a-50d81b03f1c3\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29553210-6jzxm" Mar 11 01:30:00 crc kubenswrapper[4744]: I0311 01:30:00.498221 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6dt4w\" (UniqueName: \"kubernetes.io/projected/3d4c219b-834d-47cd-97aa-cf789dd2b8ce-kube-api-access-6dt4w\") pod \"auto-csr-approver-29553210-4vgls\" (UID: \"3d4c219b-834d-47cd-97aa-cf789dd2b8ce\") " pod="openshift-infra/auto-csr-approver-29553210-4vgls" Mar 11 01:30:00 crc kubenswrapper[4744]: I0311 01:30:00.501435 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7fkq2\" (UniqueName: \"kubernetes.io/projected/85fad93b-fbcd-4de5-a08a-50d81b03f1c3-kube-api-access-7fkq2\") pod \"collect-profiles-29553210-6jzxm\" (UID: \"85fad93b-fbcd-4de5-a08a-50d81b03f1c3\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29553210-6jzxm" Mar 11 01:30:00 crc kubenswrapper[4744]: I0311 01:30:00.521856 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29553210-4vgls" Mar 11 01:30:00 crc kubenswrapper[4744]: I0311 01:30:00.539786 4744 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29553210-6jzxm"
Mar 11 01:30:01 crc kubenswrapper[4744]: I0311 01:30:01.053916 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29553210-4vgls"]
Mar 11 01:30:01 crc kubenswrapper[4744]: I0311 01:30:01.113405 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29553210-6jzxm"]
Mar 11 01:30:01 crc kubenswrapper[4744]: W0311 01:30:01.115615 4744 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod85fad93b_fbcd_4de5_a08a_50d81b03f1c3.slice/crio-d9033c97ac885de292c528f4d83dbde422bc3b14eff90bc18aa6f13edd7c3727 WatchSource:0}: Error finding container d9033c97ac885de292c528f4d83dbde422bc3b14eff90bc18aa6f13edd7c3727: Status 404 returned error can't find the container with id d9033c97ac885de292c528f4d83dbde422bc3b14eff90bc18aa6f13edd7c3727
Mar 11 01:30:02 crc kubenswrapper[4744]: I0311 01:30:02.062634 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29553210-4vgls" event={"ID":"3d4c219b-834d-47cd-97aa-cf789dd2b8ce","Type":"ContainerStarted","Data":"9c243a37e39597185d7d3eec65dec299979a4dd3a79fd52a7bf33b1d1c1232a7"}
Mar 11 01:30:02 crc kubenswrapper[4744]: I0311 01:30:02.064431 4744 generic.go:334] "Generic (PLEG): container finished" podID="85fad93b-fbcd-4de5-a08a-50d81b03f1c3" containerID="68f772e4c6ce843add7dac55131b742ad8eb2c2ab6bfafc3af5b4f00ec5575e1" exitCode=0
Mar 11 01:30:02 crc kubenswrapper[4744]: I0311 01:30:02.064455 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29553210-6jzxm" event={"ID":"85fad93b-fbcd-4de5-a08a-50d81b03f1c3","Type":"ContainerDied","Data":"68f772e4c6ce843add7dac55131b742ad8eb2c2ab6bfafc3af5b4f00ec5575e1"}
Mar 11 01:30:02 crc kubenswrapper[4744]: I0311 01:30:02.064471 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29553210-6jzxm" event={"ID":"85fad93b-fbcd-4de5-a08a-50d81b03f1c3","Type":"ContainerStarted","Data":"d9033c97ac885de292c528f4d83dbde422bc3b14eff90bc18aa6f13edd7c3727"}
Mar 11 01:30:03 crc kubenswrapper[4744]: I0311 01:30:03.398734 4744 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29553210-6jzxm"
Mar 11 01:30:03 crc kubenswrapper[4744]: I0311 01:30:03.522948 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7fkq2\" (UniqueName: \"kubernetes.io/projected/85fad93b-fbcd-4de5-a08a-50d81b03f1c3-kube-api-access-7fkq2\") pod \"85fad93b-fbcd-4de5-a08a-50d81b03f1c3\" (UID: \"85fad93b-fbcd-4de5-a08a-50d81b03f1c3\") "
Mar 11 01:30:03 crc kubenswrapper[4744]: I0311 01:30:03.523118 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/85fad93b-fbcd-4de5-a08a-50d81b03f1c3-secret-volume\") pod \"85fad93b-fbcd-4de5-a08a-50d81b03f1c3\" (UID: \"85fad93b-fbcd-4de5-a08a-50d81b03f1c3\") "
Mar 11 01:30:03 crc kubenswrapper[4744]: I0311 01:30:03.523154 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/85fad93b-fbcd-4de5-a08a-50d81b03f1c3-config-volume\") pod \"85fad93b-fbcd-4de5-a08a-50d81b03f1c3\" (UID: \"85fad93b-fbcd-4de5-a08a-50d81b03f1c3\") "
Mar 11 01:30:03 crc kubenswrapper[4744]: I0311 01:30:03.524199 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/85fad93b-fbcd-4de5-a08a-50d81b03f1c3-config-volume" (OuterVolumeSpecName: "config-volume") pod "85fad93b-fbcd-4de5-a08a-50d81b03f1c3" (UID: "85fad93b-fbcd-4de5-a08a-50d81b03f1c3"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 11 01:30:03 crc kubenswrapper[4744]: I0311 01:30:03.529595 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/85fad93b-fbcd-4de5-a08a-50d81b03f1c3-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "85fad93b-fbcd-4de5-a08a-50d81b03f1c3" (UID: "85fad93b-fbcd-4de5-a08a-50d81b03f1c3"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 11 01:30:03 crc kubenswrapper[4744]: I0311 01:30:03.529820 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/85fad93b-fbcd-4de5-a08a-50d81b03f1c3-kube-api-access-7fkq2" (OuterVolumeSpecName: "kube-api-access-7fkq2") pod "85fad93b-fbcd-4de5-a08a-50d81b03f1c3" (UID: "85fad93b-fbcd-4de5-a08a-50d81b03f1c3"). InnerVolumeSpecName "kube-api-access-7fkq2". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 11 01:30:03 crc kubenswrapper[4744]: I0311 01:30:03.625812 4744 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/85fad93b-fbcd-4de5-a08a-50d81b03f1c3-secret-volume\") on node \"crc\" DevicePath \"\""
Mar 11 01:30:03 crc kubenswrapper[4744]: I0311 01:30:03.625868 4744 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/85fad93b-fbcd-4de5-a08a-50d81b03f1c3-config-volume\") on node \"crc\" DevicePath \"\""
Mar 11 01:30:03 crc kubenswrapper[4744]: I0311 01:30:03.625894 4744 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7fkq2\" (UniqueName: \"kubernetes.io/projected/85fad93b-fbcd-4de5-a08a-50d81b03f1c3-kube-api-access-7fkq2\") on node \"crc\" DevicePath \"\""
Mar 11 01:30:04 crc kubenswrapper[4744]: I0311 01:30:04.085340 4744 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29553210-6jzxm"
Mar 11 01:30:04 crc kubenswrapper[4744]: I0311 01:30:04.086652 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29553210-6jzxm" event={"ID":"85fad93b-fbcd-4de5-a08a-50d81b03f1c3","Type":"ContainerDied","Data":"d9033c97ac885de292c528f4d83dbde422bc3b14eff90bc18aa6f13edd7c3727"}
Mar 11 01:30:04 crc kubenswrapper[4744]: I0311 01:30:04.086787 4744 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d9033c97ac885de292c528f4d83dbde422bc3b14eff90bc18aa6f13edd7c3727"
Mar 11 01:30:04 crc kubenswrapper[4744]: I0311 01:30:04.503608 4744 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29553165-4gpb9"]
Mar 11 01:30:04 crc kubenswrapper[4744]: I0311 01:30:04.516452 4744 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29553165-4gpb9"]
Mar 11 01:30:05 crc kubenswrapper[4744]: I0311 01:30:05.095174 4744 generic.go:334] "Generic (PLEG): container finished" podID="3d4c219b-834d-47cd-97aa-cf789dd2b8ce" containerID="6237930b99e4514f1ce9b020264cf7e1ba6c76e65bf7a49d064b5217917812bf" exitCode=0
Mar 11 01:30:05 crc kubenswrapper[4744]: I0311 01:30:05.095346 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29553210-4vgls" event={"ID":"3d4c219b-834d-47cd-97aa-cf789dd2b8ce","Type":"ContainerDied","Data":"6237930b99e4514f1ce9b020264cf7e1ba6c76e65bf7a49d064b5217917812bf"}
Mar 11 01:30:05 crc kubenswrapper[4744]: I0311 01:30:05.992642 4744 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7886a95d-050e-4d58-baf1-e65310e95e4f" path="/var/lib/kubelet/pods/7886a95d-050e-4d58-baf1-e65310e95e4f/volumes"
Mar 11 01:30:06 crc kubenswrapper[4744]: I0311 01:30:06.837827 4744 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29553210-4vgls"
Mar 11 01:30:06 crc kubenswrapper[4744]: I0311 01:30:06.907094 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6dt4w\" (UniqueName: \"kubernetes.io/projected/3d4c219b-834d-47cd-97aa-cf789dd2b8ce-kube-api-access-6dt4w\") pod \"3d4c219b-834d-47cd-97aa-cf789dd2b8ce\" (UID: \"3d4c219b-834d-47cd-97aa-cf789dd2b8ce\") "
Mar 11 01:30:06 crc kubenswrapper[4744]: I0311 01:30:06.913618 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3d4c219b-834d-47cd-97aa-cf789dd2b8ce-kube-api-access-6dt4w" (OuterVolumeSpecName: "kube-api-access-6dt4w") pod "3d4c219b-834d-47cd-97aa-cf789dd2b8ce" (UID: "3d4c219b-834d-47cd-97aa-cf789dd2b8ce"). InnerVolumeSpecName "kube-api-access-6dt4w". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 11 01:30:07 crc kubenswrapper[4744]: I0311 01:30:07.008950 4744 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6dt4w\" (UniqueName: \"kubernetes.io/projected/3d4c219b-834d-47cd-97aa-cf789dd2b8ce-kube-api-access-6dt4w\") on node \"crc\" DevicePath \"\""
Mar 11 01:30:07 crc kubenswrapper[4744]: I0311 01:30:07.116697 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29553210-4vgls" event={"ID":"3d4c219b-834d-47cd-97aa-cf789dd2b8ce","Type":"ContainerDied","Data":"9c243a37e39597185d7d3eec65dec299979a4dd3a79fd52a7bf33b1d1c1232a7"}
Mar 11 01:30:07 crc kubenswrapper[4744]: I0311 01:30:07.116755 4744 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9c243a37e39597185d7d3eec65dec299979a4dd3a79fd52a7bf33b1d1c1232a7"
Mar 11 01:30:07 crc kubenswrapper[4744]: I0311 01:30:07.116785 4744 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29553210-4vgls"
Mar 11 01:30:07 crc kubenswrapper[4744]: I0311 01:30:07.904384 4744 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29553204-l2rr2"]
Mar 11 01:30:07 crc kubenswrapper[4744]: I0311 01:30:07.916287 4744 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29553204-l2rr2"]
Mar 11 01:30:07 crc kubenswrapper[4744]: I0311 01:30:07.984753 4744 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3422d7c5-ba80-438a-a2e9-420fd3ba6030" path="/var/lib/kubelet/pods/3422d7c5-ba80-438a-a2e9-420fd3ba6030/volumes"
Mar 11 01:30:11 crc kubenswrapper[4744]: I0311 01:30:11.447986 4744 scope.go:117] "RemoveContainer" containerID="027fe3efb836951232e1afa930e3f986b4fabc10c31a5c88ae0a8c227eca79d0"
Mar 11 01:30:11 crc kubenswrapper[4744]: I0311 01:30:11.486313 4744 scope.go:117] "RemoveContainer" containerID="a4520cb01287ba761e907697dc32c7b92d688cd881fb12f92f86025c6d667156"
Mar 11 01:30:39 crc kubenswrapper[4744]: I0311 01:30:39.186608 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-p79vw"]
Mar 11 01:30:39 crc kubenswrapper[4744]: E0311 01:30:39.191172 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="85fad93b-fbcd-4de5-a08a-50d81b03f1c3" containerName="collect-profiles"
Mar 11 01:30:39 crc kubenswrapper[4744]: I0311 01:30:39.191217 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="85fad93b-fbcd-4de5-a08a-50d81b03f1c3" containerName="collect-profiles"
Mar 11 01:30:39 crc kubenswrapper[4744]: E0311 01:30:39.191248 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3d4c219b-834d-47cd-97aa-cf789dd2b8ce" containerName="oc"
Mar 11 01:30:39 crc kubenswrapper[4744]: I0311 01:30:39.191261 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="3d4c219b-834d-47cd-97aa-cf789dd2b8ce" containerName="oc"
Mar 11 01:30:39 crc kubenswrapper[4744]: I0311 01:30:39.191576 4744 memory_manager.go:354] "RemoveStaleState removing state" podUID="3d4c219b-834d-47cd-97aa-cf789dd2b8ce" containerName="oc"
Mar 11 01:30:39 crc kubenswrapper[4744]: I0311 01:30:39.191630 4744 memory_manager.go:354] "RemoveStaleState removing state" podUID="85fad93b-fbcd-4de5-a08a-50d81b03f1c3" containerName="collect-profiles"
Mar 11 01:30:39 crc kubenswrapper[4744]: I0311 01:30:39.193333 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-p79vw"
Mar 11 01:30:39 crc kubenswrapper[4744]: I0311 01:30:39.208061 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-p79vw"]
Mar 11 01:30:39 crc kubenswrapper[4744]: I0311 01:30:39.250196 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vvthk\" (UniqueName: \"kubernetes.io/projected/7fca0c9f-b720-4626-b497-dbfe25eacd99-kube-api-access-vvthk\") pod \"redhat-marketplace-p79vw\" (UID: \"7fca0c9f-b720-4626-b497-dbfe25eacd99\") " pod="openshift-marketplace/redhat-marketplace-p79vw"
Mar 11 01:30:39 crc kubenswrapper[4744]: I0311 01:30:39.250333 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7fca0c9f-b720-4626-b497-dbfe25eacd99-catalog-content\") pod \"redhat-marketplace-p79vw\" (UID: \"7fca0c9f-b720-4626-b497-dbfe25eacd99\") " pod="openshift-marketplace/redhat-marketplace-p79vw"
Mar 11 01:30:39 crc kubenswrapper[4744]: I0311 01:30:39.250438 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7fca0c9f-b720-4626-b497-dbfe25eacd99-utilities\") pod \"redhat-marketplace-p79vw\" (UID: \"7fca0c9f-b720-4626-b497-dbfe25eacd99\") " pod="openshift-marketplace/redhat-marketplace-p79vw"
Mar 11 01:30:39 crc kubenswrapper[4744]: I0311 01:30:39.352263 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vvthk\" (UniqueName: \"kubernetes.io/projected/7fca0c9f-b720-4626-b497-dbfe25eacd99-kube-api-access-vvthk\") pod \"redhat-marketplace-p79vw\" (UID: \"7fca0c9f-b720-4626-b497-dbfe25eacd99\") " pod="openshift-marketplace/redhat-marketplace-p79vw"
Mar 11 01:30:39 crc kubenswrapper[4744]: I0311 01:30:39.352377 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7fca0c9f-b720-4626-b497-dbfe25eacd99-catalog-content\") pod \"redhat-marketplace-p79vw\" (UID: \"7fca0c9f-b720-4626-b497-dbfe25eacd99\") " pod="openshift-marketplace/redhat-marketplace-p79vw"
Mar 11 01:30:39 crc kubenswrapper[4744]: I0311 01:30:39.352591 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7fca0c9f-b720-4626-b497-dbfe25eacd99-utilities\") pod \"redhat-marketplace-p79vw\" (UID: \"7fca0c9f-b720-4626-b497-dbfe25eacd99\") " pod="openshift-marketplace/redhat-marketplace-p79vw"
Mar 11 01:30:39 crc kubenswrapper[4744]: I0311 01:30:39.352991 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7fca0c9f-b720-4626-b497-dbfe25eacd99-catalog-content\") pod \"redhat-marketplace-p79vw\" (UID: \"7fca0c9f-b720-4626-b497-dbfe25eacd99\") " pod="openshift-marketplace/redhat-marketplace-p79vw"
Mar 11 01:30:39 crc kubenswrapper[4744]: I0311 01:30:39.353297 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7fca0c9f-b720-4626-b497-dbfe25eacd99-utilities\") pod \"redhat-marketplace-p79vw\" (UID: \"7fca0c9f-b720-4626-b497-dbfe25eacd99\") " pod="openshift-marketplace/redhat-marketplace-p79vw"
Mar 11 01:30:39 crc kubenswrapper[4744]: I0311 01:30:39.383184 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vvthk\" (UniqueName: \"kubernetes.io/projected/7fca0c9f-b720-4626-b497-dbfe25eacd99-kube-api-access-vvthk\") pod \"redhat-marketplace-p79vw\" (UID: \"7fca0c9f-b720-4626-b497-dbfe25eacd99\") " pod="openshift-marketplace/redhat-marketplace-p79vw"
Mar 11 01:30:39 crc kubenswrapper[4744]: I0311 01:30:39.543405 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-p79vw"
Mar 11 01:30:39 crc kubenswrapper[4744]: I0311 01:30:39.835504 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-p79vw"]
Mar 11 01:30:40 crc kubenswrapper[4744]: I0311 01:30:40.445907 4744 generic.go:334] "Generic (PLEG): container finished" podID="7fca0c9f-b720-4626-b497-dbfe25eacd99" containerID="1bdacb2a1f7bd868ebc6794d597e51d10461dac62b4e5fc0c860e661293b30a3" exitCode=0
Mar 11 01:30:40 crc kubenswrapper[4744]: I0311 01:30:40.446148 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-p79vw" event={"ID":"7fca0c9f-b720-4626-b497-dbfe25eacd99","Type":"ContainerDied","Data":"1bdacb2a1f7bd868ebc6794d597e51d10461dac62b4e5fc0c860e661293b30a3"}
Mar 11 01:30:40 crc kubenswrapper[4744]: I0311 01:30:40.446316 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-p79vw" event={"ID":"7fca0c9f-b720-4626-b497-dbfe25eacd99","Type":"ContainerStarted","Data":"b841186a8b821ce19c254469902f0a3a1b2076236ac8f22d6f71f3078a8b6cc1"}
Mar 11 01:30:42 crc kubenswrapper[4744]: I0311 01:30:42.409153 4744 patch_prober.go:28] interesting pod/machine-config-daemon-678nx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 11 01:30:42 crc kubenswrapper[4744]: I0311 01:30:42.409615 4744 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-678nx" podUID="a15dc7ac-7c34-4135-b6eb-a85122800ce9" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 11 01:30:42 crc kubenswrapper[4744]: I0311 01:30:42.496841 4744 generic.go:334] "Generic (PLEG): container finished" podID="7fca0c9f-b720-4626-b497-dbfe25eacd99" containerID="1d67042fe7b4b0806d338125de81afa499d41f58a2609e2ee3f8b3962e9761b1" exitCode=0
Mar 11 01:30:42 crc kubenswrapper[4744]: I0311 01:30:42.496924 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-p79vw" event={"ID":"7fca0c9f-b720-4626-b497-dbfe25eacd99","Type":"ContainerDied","Data":"1d67042fe7b4b0806d338125de81afa499d41f58a2609e2ee3f8b3962e9761b1"}
Mar 11 01:30:43 crc kubenswrapper[4744]: I0311 01:30:43.506807 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-p79vw" event={"ID":"7fca0c9f-b720-4626-b497-dbfe25eacd99","Type":"ContainerStarted","Data":"963676279c255857aa62baece50eda4a38f9e8af76d09932f77561981587791f"}
Mar 11 01:30:43 crc kubenswrapper[4744]: I0311 01:30:43.537210 4744 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-p79vw" podStartSLOduration=2.047941384 podStartE2EDuration="4.537185206s" podCreationTimestamp="2026-03-11 01:30:39 +0000 UTC" firstStartedPulling="2026-03-11 01:30:40.44802097 +0000 UTC m=+2197.252238615" lastFinishedPulling="2026-03-11 01:30:42.937264792 +0000 UTC m=+2199.741482437" observedRunningTime="2026-03-11 01:30:43.525694741 +0000 UTC m=+2200.329912356" watchObservedRunningTime="2026-03-11 01:30:43.537185206 +0000 UTC m=+2200.341402841"
Mar 11 01:30:49 crc kubenswrapper[4744]: I0311 01:30:49.544498 4744 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-p79vw"
Mar 11 01:30:49 crc kubenswrapper[4744]: I0311 01:30:49.545356 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-p79vw"
Mar 11 01:30:49 crc kubenswrapper[4744]: I0311 01:30:49.628325 4744 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-p79vw"
Mar 11 01:30:49 crc kubenswrapper[4744]: I0311 01:30:49.706436 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-p79vw"
Mar 11 01:30:49 crc kubenswrapper[4744]: I0311 01:30:49.883947 4744 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-p79vw"]
Mar 11 01:30:51 crc kubenswrapper[4744]: I0311 01:30:51.581227 4744 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-p79vw" podUID="7fca0c9f-b720-4626-b497-dbfe25eacd99" containerName="registry-server" containerID="cri-o://963676279c255857aa62baece50eda4a38f9e8af76d09932f77561981587791f" gracePeriod=2
Mar 11 01:30:52 crc kubenswrapper[4744]: I0311 01:30:52.090394 4744 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-p79vw"
Mar 11 01:30:52 crc kubenswrapper[4744]: I0311 01:30:52.256856 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7fca0c9f-b720-4626-b497-dbfe25eacd99-utilities\") pod \"7fca0c9f-b720-4626-b497-dbfe25eacd99\" (UID: \"7fca0c9f-b720-4626-b497-dbfe25eacd99\") "
Mar 11 01:30:52 crc kubenswrapper[4744]: I0311 01:30:52.257109 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7fca0c9f-b720-4626-b497-dbfe25eacd99-catalog-content\") pod \"7fca0c9f-b720-4626-b497-dbfe25eacd99\" (UID: \"7fca0c9f-b720-4626-b497-dbfe25eacd99\") "
Mar 11 01:30:52 crc kubenswrapper[4744]: I0311 01:30:52.257241 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vvthk\" (UniqueName: \"kubernetes.io/projected/7fca0c9f-b720-4626-b497-dbfe25eacd99-kube-api-access-vvthk\") pod \"7fca0c9f-b720-4626-b497-dbfe25eacd99\" (UID: \"7fca0c9f-b720-4626-b497-dbfe25eacd99\") "
Mar 11 01:30:52 crc kubenswrapper[4744]: I0311 01:30:52.258194 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7fca0c9f-b720-4626-b497-dbfe25eacd99-utilities" (OuterVolumeSpecName: "utilities") pod "7fca0c9f-b720-4626-b497-dbfe25eacd99" (UID: "7fca0c9f-b720-4626-b497-dbfe25eacd99"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 11 01:30:52 crc kubenswrapper[4744]: I0311 01:30:52.267034 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7fca0c9f-b720-4626-b497-dbfe25eacd99-kube-api-access-vvthk" (OuterVolumeSpecName: "kube-api-access-vvthk") pod "7fca0c9f-b720-4626-b497-dbfe25eacd99" (UID: "7fca0c9f-b720-4626-b497-dbfe25eacd99"). InnerVolumeSpecName "kube-api-access-vvthk". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 11 01:30:52 crc kubenswrapper[4744]: I0311 01:30:52.296645 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7fca0c9f-b720-4626-b497-dbfe25eacd99-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "7fca0c9f-b720-4626-b497-dbfe25eacd99" (UID: "7fca0c9f-b720-4626-b497-dbfe25eacd99"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 11 01:30:52 crc kubenswrapper[4744]: I0311 01:30:52.376438 4744 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7fca0c9f-b720-4626-b497-dbfe25eacd99-catalog-content\") on node \"crc\" DevicePath \"\""
Mar 11 01:30:52 crc kubenswrapper[4744]: I0311 01:30:52.376493 4744 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vvthk\" (UniqueName: \"kubernetes.io/projected/7fca0c9f-b720-4626-b497-dbfe25eacd99-kube-api-access-vvthk\") on node \"crc\" DevicePath \"\""
Mar 11 01:30:52 crc kubenswrapper[4744]: I0311 01:30:52.376531 4744 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7fca0c9f-b720-4626-b497-dbfe25eacd99-utilities\") on node \"crc\" DevicePath \"\""
Mar 11 01:30:52 crc kubenswrapper[4744]: I0311 01:30:52.593391 4744 generic.go:334] "Generic (PLEG): container finished" podID="7fca0c9f-b720-4626-b497-dbfe25eacd99" containerID="963676279c255857aa62baece50eda4a38f9e8af76d09932f77561981587791f" exitCode=0
Mar 11 01:30:52 crc kubenswrapper[4744]: I0311 01:30:52.593448 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-p79vw" event={"ID":"7fca0c9f-b720-4626-b497-dbfe25eacd99","Type":"ContainerDied","Data":"963676279c255857aa62baece50eda4a38f9e8af76d09932f77561981587791f"}
Mar 11 01:30:52 crc kubenswrapper[4744]: I0311 01:30:52.593480 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-p79vw" event={"ID":"7fca0c9f-b720-4626-b497-dbfe25eacd99","Type":"ContainerDied","Data":"b841186a8b821ce19c254469902f0a3a1b2076236ac8f22d6f71f3078a8b6cc1"}
Mar 11 01:30:52 crc kubenswrapper[4744]: I0311 01:30:52.593501 4744 scope.go:117] "RemoveContainer" containerID="963676279c255857aa62baece50eda4a38f9e8af76d09932f77561981587791f"
Mar 11 01:30:52 crc kubenswrapper[4744]: I0311 01:30:52.593631 4744 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-p79vw"
Mar 11 01:30:52 crc kubenswrapper[4744]: I0311 01:30:52.621291 4744 scope.go:117] "RemoveContainer" containerID="1d67042fe7b4b0806d338125de81afa499d41f58a2609e2ee3f8b3962e9761b1"
Mar 11 01:30:52 crc kubenswrapper[4744]: I0311 01:30:52.642214 4744 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-p79vw"]
Mar 11 01:30:52 crc kubenswrapper[4744]: I0311 01:30:52.647424 4744 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-p79vw"]
Mar 11 01:30:52 crc kubenswrapper[4744]: I0311 01:30:52.664225 4744 scope.go:117] "RemoveContainer" containerID="1bdacb2a1f7bd868ebc6794d597e51d10461dac62b4e5fc0c860e661293b30a3"
Mar 11 01:30:52 crc kubenswrapper[4744]: I0311 01:30:52.688210 4744 scope.go:117] "RemoveContainer" containerID="963676279c255857aa62baece50eda4a38f9e8af76d09932f77561981587791f"
Mar 11 01:30:52 crc kubenswrapper[4744]: E0311 01:30:52.688633 4744 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"963676279c255857aa62baece50eda4a38f9e8af76d09932f77561981587791f\": container with ID starting with 963676279c255857aa62baece50eda4a38f9e8af76d09932f77561981587791f not found: ID does not exist" containerID="963676279c255857aa62baece50eda4a38f9e8af76d09932f77561981587791f"
Mar 11 01:30:52 crc kubenswrapper[4744]: I0311 01:30:52.688674 4744 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"963676279c255857aa62baece50eda4a38f9e8af76d09932f77561981587791f"} err="failed to get container status \"963676279c255857aa62baece50eda4a38f9e8af76d09932f77561981587791f\": rpc error: code = NotFound desc = could not find container \"963676279c255857aa62baece50eda4a38f9e8af76d09932f77561981587791f\": container with ID starting with 963676279c255857aa62baece50eda4a38f9e8af76d09932f77561981587791f not found: ID does not exist"
Mar 11 01:30:52 crc kubenswrapper[4744]: I0311 01:30:52.688700 4744 scope.go:117] "RemoveContainer" containerID="1d67042fe7b4b0806d338125de81afa499d41f58a2609e2ee3f8b3962e9761b1"
Mar 11 01:30:52 crc kubenswrapper[4744]: E0311 01:30:52.688936 4744 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1d67042fe7b4b0806d338125de81afa499d41f58a2609e2ee3f8b3962e9761b1\": container with ID starting with 1d67042fe7b4b0806d338125de81afa499d41f58a2609e2ee3f8b3962e9761b1 not found: ID does not exist" containerID="1d67042fe7b4b0806d338125de81afa499d41f58a2609e2ee3f8b3962e9761b1"
Mar 11 01:30:52 crc kubenswrapper[4744]: I0311 01:30:52.688962 4744 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1d67042fe7b4b0806d338125de81afa499d41f58a2609e2ee3f8b3962e9761b1"} err="failed to get container status \"1d67042fe7b4b0806d338125de81afa499d41f58a2609e2ee3f8b3962e9761b1\": rpc error: code = NotFound desc = could not find container \"1d67042fe7b4b0806d338125de81afa499d41f58a2609e2ee3f8b3962e9761b1\": container with ID starting with 1d67042fe7b4b0806d338125de81afa499d41f58a2609e2ee3f8b3962e9761b1 not found: ID does not exist"
Mar 11 01:30:52 crc kubenswrapper[4744]: I0311 01:30:52.688980 4744 scope.go:117] "RemoveContainer" containerID="1bdacb2a1f7bd868ebc6794d597e51d10461dac62b4e5fc0c860e661293b30a3"
Mar 11 01:30:52 crc kubenswrapper[4744]: E0311 01:30:52.689370 4744 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1bdacb2a1f7bd868ebc6794d597e51d10461dac62b4e5fc0c860e661293b30a3\": container with ID starting with 1bdacb2a1f7bd868ebc6794d597e51d10461dac62b4e5fc0c860e661293b30a3 not found: ID does not exist" containerID="1bdacb2a1f7bd868ebc6794d597e51d10461dac62b4e5fc0c860e661293b30a3"
Mar 11 01:30:52 crc kubenswrapper[4744]: I0311 01:30:52.689396 4744 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1bdacb2a1f7bd868ebc6794d597e51d10461dac62b4e5fc0c860e661293b30a3"} err="failed to get container status \"1bdacb2a1f7bd868ebc6794d597e51d10461dac62b4e5fc0c860e661293b30a3\": rpc error: code = NotFound desc = could not find container \"1bdacb2a1f7bd868ebc6794d597e51d10461dac62b4e5fc0c860e661293b30a3\": container with ID starting with 1bdacb2a1f7bd868ebc6794d597e51d10461dac62b4e5fc0c860e661293b30a3 not found: ID does not exist"
Mar 11 01:30:53 crc kubenswrapper[4744]: I0311 01:30:53.989300 4744 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7fca0c9f-b720-4626-b497-dbfe25eacd99" path="/var/lib/kubelet/pods/7fca0c9f-b720-4626-b497-dbfe25eacd99/volumes"
Mar 11 01:30:57 crc kubenswrapper[4744]: I0311 01:30:57.699875 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-l7852"]
Mar 11 01:30:57 crc kubenswrapper[4744]: E0311 01:30:57.700864 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7fca0c9f-b720-4626-b497-dbfe25eacd99" containerName="extract-content"
Mar 11 01:30:57 crc kubenswrapper[4744]: I0311 01:30:57.700895 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="7fca0c9f-b720-4626-b497-dbfe25eacd99" containerName="extract-content"
Mar 11 01:30:57 crc kubenswrapper[4744]: E0311 01:30:57.700931 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7fca0c9f-b720-4626-b497-dbfe25eacd99" containerName="registry-server"
Mar 11 01:30:57 crc kubenswrapper[4744]: I0311 01:30:57.700946 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="7fca0c9f-b720-4626-b497-dbfe25eacd99" containerName="registry-server"
Mar 11 01:30:57 crc kubenswrapper[4744]: E0311 01:30:57.700969 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7fca0c9f-b720-4626-b497-dbfe25eacd99" containerName="extract-utilities"
Mar 11 01:30:57 crc kubenswrapper[4744]: I0311 01:30:57.700985 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="7fca0c9f-b720-4626-b497-dbfe25eacd99" containerName="extract-utilities"
Mar 11 01:30:57 crc kubenswrapper[4744]: I0311 01:30:57.701318 4744 memory_manager.go:354] "RemoveStaleState removing state" podUID="7fca0c9f-b720-4626-b497-dbfe25eacd99" containerName="registry-server"
Mar 11 01:30:57 crc kubenswrapper[4744]: I0311 01:30:57.704229 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-l7852"
Mar 11 01:30:57 crc kubenswrapper[4744]: I0311 01:30:57.752606 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-l7852"]
Mar 11 01:30:57 crc kubenswrapper[4744]: I0311 01:30:57.759811 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/391122a8-9803-4bd7-a723-63af96eac741-utilities\") pod \"redhat-operators-l7852\" (UID: \"391122a8-9803-4bd7-a723-63af96eac741\") " pod="openshift-marketplace/redhat-operators-l7852"
Mar 11 01:30:57 crc kubenswrapper[4744]: I0311 01:30:57.760021 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/391122a8-9803-4bd7-a723-63af96eac741-catalog-content\") pod \"redhat-operators-l7852\" (UID: \"391122a8-9803-4bd7-a723-63af96eac741\") " pod="openshift-marketplace/redhat-operators-l7852"
Mar 11 01:30:57 crc kubenswrapper[4744]: I0311 01:30:57.760178 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z9z8j\" (UniqueName: \"kubernetes.io/projected/391122a8-9803-4bd7-a723-63af96eac741-kube-api-access-z9z8j\") pod \"redhat-operators-l7852\" (UID: \"391122a8-9803-4bd7-a723-63af96eac741\") " pod="openshift-marketplace/redhat-operators-l7852"
Mar 11 01:30:57 crc kubenswrapper[4744]: I0311 01:30:57.862157 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z9z8j\" (UniqueName: \"kubernetes.io/projected/391122a8-9803-4bd7-a723-63af96eac741-kube-api-access-z9z8j\") pod \"redhat-operators-l7852\" (UID: \"391122a8-9803-4bd7-a723-63af96eac741\") " pod="openshift-marketplace/redhat-operators-l7852"
Mar 11 01:30:57 crc kubenswrapper[4744]: I0311 01:30:57.862596 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/391122a8-9803-4bd7-a723-63af96eac741-utilities\") pod \"redhat-operators-l7852\" (UID: \"391122a8-9803-4bd7-a723-63af96eac741\") " pod="openshift-marketplace/redhat-operators-l7852"
Mar 11 01:30:57 crc kubenswrapper[4744]: I0311 01:30:57.862652 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/391122a8-9803-4bd7-a723-63af96eac741-catalog-content\") pod \"redhat-operators-l7852\" (UID: \"391122a8-9803-4bd7-a723-63af96eac741\") " pod="openshift-marketplace/redhat-operators-l7852"
Mar 11 01:30:57 crc kubenswrapper[4744]: I0311 01:30:57.863225 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/391122a8-9803-4bd7-a723-63af96eac741-utilities\") pod \"redhat-operators-l7852\" (UID: \"391122a8-9803-4bd7-a723-63af96eac741\") " pod="openshift-marketplace/redhat-operators-l7852"
Mar 11 01:30:57 crc kubenswrapper[4744]: I0311 01:30:57.863421 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/391122a8-9803-4bd7-a723-63af96eac741-catalog-content\") pod \"redhat-operators-l7852\" (UID: \"391122a8-9803-4bd7-a723-63af96eac741\") " pod="openshift-marketplace/redhat-operators-l7852"
Mar 11 01:30:57 crc kubenswrapper[4744]: I0311 01:30:57.893695 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z9z8j\" (UniqueName: \"kubernetes.io/projected/391122a8-9803-4bd7-a723-63af96eac741-kube-api-access-z9z8j\") pod \"redhat-operators-l7852\" (UID: \"391122a8-9803-4bd7-a723-63af96eac741\") " pod="openshift-marketplace/redhat-operators-l7852"
Mar 11 01:30:58 crc kubenswrapper[4744]: I0311 01:30:58.041580 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-l7852"
Mar 11 01:30:58 crc kubenswrapper[4744]: I0311 01:30:58.514030 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-l7852"]
Mar 11 01:30:58 crc kubenswrapper[4744]: I0311 01:30:58.653194 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-l7852" event={"ID":"391122a8-9803-4bd7-a723-63af96eac741","Type":"ContainerStarted","Data":"37b396912c0ce699f6a01e99bc4e6205cf1add6d8b239fd53c2c44b2ee72488f"}
Mar 11 01:30:59 crc kubenswrapper[4744]: I0311 01:30:59.664093 4744 generic.go:334] "Generic (PLEG): container finished" podID="391122a8-9803-4bd7-a723-63af96eac741" containerID="d17f41002cfdb6eb2abea333d9864b5e455a0731e2ff65fd87eced1f62b31e83" exitCode=0
Mar 11 01:30:59 crc kubenswrapper[4744]: I0311 01:30:59.664168 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-l7852" event={"ID":"391122a8-9803-4bd7-a723-63af96eac741","Type":"ContainerDied","Data":"d17f41002cfdb6eb2abea333d9864b5e455a0731e2ff65fd87eced1f62b31e83"}
Mar 11 01:31:00 crc kubenswrapper[4744]: I0311 01:31:00.089442 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-dzpx4"]
Mar 11 01:31:00 crc kubenswrapper[4744]: I0311 01:31:00.096462 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-dzpx4"
Mar 11 01:31:00 crc kubenswrapper[4744]: I0311 01:31:00.098611 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-dzpx4"]
Mar 11 01:31:00 crc kubenswrapper[4744]: I0311 01:31:00.195290 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8da2a7e6-763a-458f-8750-0b4eb5e1c72f-catalog-content\") pod \"certified-operators-dzpx4\" (UID: \"8da2a7e6-763a-458f-8750-0b4eb5e1c72f\") " pod="openshift-marketplace/certified-operators-dzpx4"
Mar 11 01:31:00 crc kubenswrapper[4744]: I0311 01:31:00.195335 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k4f8b\" (UniqueName: \"kubernetes.io/projected/8da2a7e6-763a-458f-8750-0b4eb5e1c72f-kube-api-access-k4f8b\") pod \"certified-operators-dzpx4\" (UID: \"8da2a7e6-763a-458f-8750-0b4eb5e1c72f\") " pod="openshift-marketplace/certified-operators-dzpx4"
Mar 11 01:31:00 crc kubenswrapper[4744]: I0311 01:31:00.195381 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8da2a7e6-763a-458f-8750-0b4eb5e1c72f-utilities\") pod \"certified-operators-dzpx4\" (UID: \"8da2a7e6-763a-458f-8750-0b4eb5e1c72f\") " pod="openshift-marketplace/certified-operators-dzpx4"
Mar 11 01:31:00 crc kubenswrapper[4744]: I0311 01:31:00.296898 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8da2a7e6-763a-458f-8750-0b4eb5e1c72f-catalog-content\") pod \"certified-operators-dzpx4\" (UID: \"8da2a7e6-763a-458f-8750-0b4eb5e1c72f\") " pod="openshift-marketplace/certified-operators-dzpx4"
Mar 11 01:31:00 crc kubenswrapper[4744]: I0311 01:31:00.296941 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k4f8b\" (UniqueName: \"kubernetes.io/projected/8da2a7e6-763a-458f-8750-0b4eb5e1c72f-kube-api-access-k4f8b\") pod \"certified-operators-dzpx4\" (UID: \"8da2a7e6-763a-458f-8750-0b4eb5e1c72f\") " pod="openshift-marketplace/certified-operators-dzpx4"
Mar 11 01:31:00 crc kubenswrapper[4744]: I0311 01:31:00.296985 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8da2a7e6-763a-458f-8750-0b4eb5e1c72f-utilities\") pod \"certified-operators-dzpx4\" (UID: \"8da2a7e6-763a-458f-8750-0b4eb5e1c72f\") " pod="openshift-marketplace/certified-operators-dzpx4"
Mar 11 01:31:00 crc kubenswrapper[4744]: I0311 01:31:00.297380 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8da2a7e6-763a-458f-8750-0b4eb5e1c72f-utilities\") pod \"certified-operators-dzpx4\" (UID: \"8da2a7e6-763a-458f-8750-0b4eb5e1c72f\") " pod="openshift-marketplace/certified-operators-dzpx4"
Mar 11 01:31:00 crc kubenswrapper[4744]: I0311 01:31:00.297643 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8da2a7e6-763a-458f-8750-0b4eb5e1c72f-catalog-content\") pod \"certified-operators-dzpx4\" (UID: \"8da2a7e6-763a-458f-8750-0b4eb5e1c72f\") " pod="openshift-marketplace/certified-operators-dzpx4"
Mar 11 01:31:00 crc kubenswrapper[4744]: I0311 01:31:00.321253 4744 operation_generator.go:637]
"MountVolume.SetUp succeeded for volume \"kube-api-access-k4f8b\" (UniqueName: \"kubernetes.io/projected/8da2a7e6-763a-458f-8750-0b4eb5e1c72f-kube-api-access-k4f8b\") pod \"certified-operators-dzpx4\" (UID: \"8da2a7e6-763a-458f-8750-0b4eb5e1c72f\") " pod="openshift-marketplace/certified-operators-dzpx4" Mar 11 01:31:00 crc kubenswrapper[4744]: I0311 01:31:00.423090 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-dzpx4" Mar 11 01:31:00 crc kubenswrapper[4744]: I0311 01:31:00.671547 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-l7852" event={"ID":"391122a8-9803-4bd7-a723-63af96eac741","Type":"ContainerStarted","Data":"9df3e8ef899627780a847fc4ee55ae46318e3a9841043dda3ddab74ffce8ddac"} Mar 11 01:31:00 crc kubenswrapper[4744]: I0311 01:31:00.853415 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-dzpx4"] Mar 11 01:31:01 crc kubenswrapper[4744]: I0311 01:31:01.683143 4744 generic.go:334] "Generic (PLEG): container finished" podID="391122a8-9803-4bd7-a723-63af96eac741" containerID="9df3e8ef899627780a847fc4ee55ae46318e3a9841043dda3ddab74ffce8ddac" exitCode=0 Mar 11 01:31:01 crc kubenswrapper[4744]: I0311 01:31:01.683292 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-l7852" event={"ID":"391122a8-9803-4bd7-a723-63af96eac741","Type":"ContainerDied","Data":"9df3e8ef899627780a847fc4ee55ae46318e3a9841043dda3ddab74ffce8ddac"} Mar 11 01:31:01 crc kubenswrapper[4744]: I0311 01:31:01.685990 4744 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 11 01:31:01 crc kubenswrapper[4744]: I0311 01:31:01.687915 4744 generic.go:334] "Generic (PLEG): container finished" podID="8da2a7e6-763a-458f-8750-0b4eb5e1c72f" containerID="20e4c6b5839e49462afa0c368f55cf4ecd7d287307ee42164a9f9f572988681b" exitCode=0 Mar 11 
01:31:01 crc kubenswrapper[4744]: I0311 01:31:01.687959 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-dzpx4" event={"ID":"8da2a7e6-763a-458f-8750-0b4eb5e1c72f","Type":"ContainerDied","Data":"20e4c6b5839e49462afa0c368f55cf4ecd7d287307ee42164a9f9f572988681b"} Mar 11 01:31:01 crc kubenswrapper[4744]: I0311 01:31:01.687987 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-dzpx4" event={"ID":"8da2a7e6-763a-458f-8750-0b4eb5e1c72f","Type":"ContainerStarted","Data":"155388285a397999599eb942ad1e071c60a67fdb8180d2734538d94029d1787c"} Mar 11 01:31:02 crc kubenswrapper[4744]: I0311 01:31:02.697080 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-dzpx4" event={"ID":"8da2a7e6-763a-458f-8750-0b4eb5e1c72f","Type":"ContainerStarted","Data":"0c41eacf071bad6e3f6390c4640fff8ca1a7368d394ea633f67168bebf62d5ef"} Mar 11 01:31:02 crc kubenswrapper[4744]: I0311 01:31:02.699079 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-l7852" event={"ID":"391122a8-9803-4bd7-a723-63af96eac741","Type":"ContainerStarted","Data":"6d4a31582aabd82c9b26e8d69f1f8120e9c54077a95e05ed3132ffad77b469f7"} Mar 11 01:31:02 crc kubenswrapper[4744]: I0311 01:31:02.766070 4744 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-l7852" podStartSLOduration=3.346679722 podStartE2EDuration="5.766044181s" podCreationTimestamp="2026-03-11 01:30:57 +0000 UTC" firstStartedPulling="2026-03-11 01:30:59.667343929 +0000 UTC m=+2216.471561534" lastFinishedPulling="2026-03-11 01:31:02.086708348 +0000 UTC m=+2218.890925993" observedRunningTime="2026-03-11 01:31:02.760445398 +0000 UTC m=+2219.564663003" watchObservedRunningTime="2026-03-11 01:31:02.766044181 +0000 UTC m=+2219.570261806" Mar 11 01:31:03 crc kubenswrapper[4744]: I0311 01:31:03.708191 4744 generic.go:334] 
"Generic (PLEG): container finished" podID="8da2a7e6-763a-458f-8750-0b4eb5e1c72f" containerID="0c41eacf071bad6e3f6390c4640fff8ca1a7368d394ea633f67168bebf62d5ef" exitCode=0 Mar 11 01:31:03 crc kubenswrapper[4744]: I0311 01:31:03.708280 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-dzpx4" event={"ID":"8da2a7e6-763a-458f-8750-0b4eb5e1c72f","Type":"ContainerDied","Data":"0c41eacf071bad6e3f6390c4640fff8ca1a7368d394ea633f67168bebf62d5ef"} Mar 11 01:31:04 crc kubenswrapper[4744]: I0311 01:31:04.716213 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-dzpx4" event={"ID":"8da2a7e6-763a-458f-8750-0b4eb5e1c72f","Type":"ContainerStarted","Data":"93b6d2113cab94702f9e60de5989734232eca0421825d5328bd4427c8aca8fd2"} Mar 11 01:31:04 crc kubenswrapper[4744]: I0311 01:31:04.737485 4744 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-dzpx4" podStartSLOduration=2.1758936 podStartE2EDuration="4.737465925s" podCreationTimestamp="2026-03-11 01:31:00 +0000 UTC" firstStartedPulling="2026-03-11 01:31:01.692984285 +0000 UTC m=+2218.497201900" lastFinishedPulling="2026-03-11 01:31:04.25455661 +0000 UTC m=+2221.058774225" observedRunningTime="2026-03-11 01:31:04.731857902 +0000 UTC m=+2221.536075507" watchObservedRunningTime="2026-03-11 01:31:04.737465925 +0000 UTC m=+2221.541683530" Mar 11 01:31:08 crc kubenswrapper[4744]: I0311 01:31:08.042033 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-l7852" Mar 11 01:31:08 crc kubenswrapper[4744]: I0311 01:31:08.042611 4744 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-l7852" Mar 11 01:31:09 crc kubenswrapper[4744]: I0311 01:31:09.102410 4744 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-l7852" 
podUID="391122a8-9803-4bd7-a723-63af96eac741" containerName="registry-server" probeResult="failure" output=< Mar 11 01:31:09 crc kubenswrapper[4744]: timeout: failed to connect service ":50051" within 1s Mar 11 01:31:09 crc kubenswrapper[4744]: > Mar 11 01:31:10 crc kubenswrapper[4744]: I0311 01:31:10.423592 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-dzpx4" Mar 11 01:31:10 crc kubenswrapper[4744]: I0311 01:31:10.424083 4744 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-dzpx4" Mar 11 01:31:10 crc kubenswrapper[4744]: I0311 01:31:10.469707 4744 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-dzpx4" Mar 11 01:31:10 crc kubenswrapper[4744]: I0311 01:31:10.809905 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-dzpx4" Mar 11 01:31:10 crc kubenswrapper[4744]: I0311 01:31:10.876421 4744 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-dzpx4"] Mar 11 01:31:12 crc kubenswrapper[4744]: I0311 01:31:12.409623 4744 patch_prober.go:28] interesting pod/machine-config-daemon-678nx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 11 01:31:12 crc kubenswrapper[4744]: I0311 01:31:12.409833 4744 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-678nx" podUID="a15dc7ac-7c34-4135-b6eb-a85122800ce9" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 11 01:31:12 crc kubenswrapper[4744]: I0311 01:31:12.781063 4744 
kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-dzpx4" podUID="8da2a7e6-763a-458f-8750-0b4eb5e1c72f" containerName="registry-server" containerID="cri-o://93b6d2113cab94702f9e60de5989734232eca0421825d5328bd4427c8aca8fd2" gracePeriod=2 Mar 11 01:31:13 crc kubenswrapper[4744]: I0311 01:31:13.306601 4744 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-dzpx4" Mar 11 01:31:13 crc kubenswrapper[4744]: I0311 01:31:13.386480 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8da2a7e6-763a-458f-8750-0b4eb5e1c72f-utilities\") pod \"8da2a7e6-763a-458f-8750-0b4eb5e1c72f\" (UID: \"8da2a7e6-763a-458f-8750-0b4eb5e1c72f\") " Mar 11 01:31:13 crc kubenswrapper[4744]: I0311 01:31:13.386766 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-k4f8b\" (UniqueName: \"kubernetes.io/projected/8da2a7e6-763a-458f-8750-0b4eb5e1c72f-kube-api-access-k4f8b\") pod \"8da2a7e6-763a-458f-8750-0b4eb5e1c72f\" (UID: \"8da2a7e6-763a-458f-8750-0b4eb5e1c72f\") " Mar 11 01:31:13 crc kubenswrapper[4744]: I0311 01:31:13.386854 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8da2a7e6-763a-458f-8750-0b4eb5e1c72f-catalog-content\") pod \"8da2a7e6-763a-458f-8750-0b4eb5e1c72f\" (UID: \"8da2a7e6-763a-458f-8750-0b4eb5e1c72f\") " Mar 11 01:31:13 crc kubenswrapper[4744]: I0311 01:31:13.391385 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8da2a7e6-763a-458f-8750-0b4eb5e1c72f-utilities" (OuterVolumeSpecName: "utilities") pod "8da2a7e6-763a-458f-8750-0b4eb5e1c72f" (UID: "8da2a7e6-763a-458f-8750-0b4eb5e1c72f"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 11 01:31:13 crc kubenswrapper[4744]: I0311 01:31:13.404870 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8da2a7e6-763a-458f-8750-0b4eb5e1c72f-kube-api-access-k4f8b" (OuterVolumeSpecName: "kube-api-access-k4f8b") pod "8da2a7e6-763a-458f-8750-0b4eb5e1c72f" (UID: "8da2a7e6-763a-458f-8750-0b4eb5e1c72f"). InnerVolumeSpecName "kube-api-access-k4f8b". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 01:31:13 crc kubenswrapper[4744]: I0311 01:31:13.489254 4744 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-k4f8b\" (UniqueName: \"kubernetes.io/projected/8da2a7e6-763a-458f-8750-0b4eb5e1c72f-kube-api-access-k4f8b\") on node \"crc\" DevicePath \"\"" Mar 11 01:31:13 crc kubenswrapper[4744]: I0311 01:31:13.489293 4744 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8da2a7e6-763a-458f-8750-0b4eb5e1c72f-utilities\") on node \"crc\" DevicePath \"\"" Mar 11 01:31:13 crc kubenswrapper[4744]: I0311 01:31:13.753539 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8da2a7e6-763a-458f-8750-0b4eb5e1c72f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "8da2a7e6-763a-458f-8750-0b4eb5e1c72f" (UID: "8da2a7e6-763a-458f-8750-0b4eb5e1c72f"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 11 01:31:13 crc kubenswrapper[4744]: I0311 01:31:13.792600 4744 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8da2a7e6-763a-458f-8750-0b4eb5e1c72f-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 11 01:31:13 crc kubenswrapper[4744]: I0311 01:31:13.796543 4744 generic.go:334] "Generic (PLEG): container finished" podID="8da2a7e6-763a-458f-8750-0b4eb5e1c72f" containerID="93b6d2113cab94702f9e60de5989734232eca0421825d5328bd4427c8aca8fd2" exitCode=0 Mar 11 01:31:13 crc kubenswrapper[4744]: I0311 01:31:13.796574 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-dzpx4" event={"ID":"8da2a7e6-763a-458f-8750-0b4eb5e1c72f","Type":"ContainerDied","Data":"93b6d2113cab94702f9e60de5989734232eca0421825d5328bd4427c8aca8fd2"} Mar 11 01:31:13 crc kubenswrapper[4744]: I0311 01:31:13.796597 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-dzpx4" event={"ID":"8da2a7e6-763a-458f-8750-0b4eb5e1c72f","Type":"ContainerDied","Data":"155388285a397999599eb942ad1e071c60a67fdb8180d2734538d94029d1787c"} Mar 11 01:31:13 crc kubenswrapper[4744]: I0311 01:31:13.796612 4744 scope.go:117] "RemoveContainer" containerID="93b6d2113cab94702f9e60de5989734232eca0421825d5328bd4427c8aca8fd2" Mar 11 01:31:13 crc kubenswrapper[4744]: I0311 01:31:13.796705 4744 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-dzpx4" Mar 11 01:31:13 crc kubenswrapper[4744]: I0311 01:31:13.827539 4744 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-dzpx4"] Mar 11 01:31:13 crc kubenswrapper[4744]: I0311 01:31:13.834422 4744 scope.go:117] "RemoveContainer" containerID="0c41eacf071bad6e3f6390c4640fff8ca1a7368d394ea633f67168bebf62d5ef" Mar 11 01:31:13 crc kubenswrapper[4744]: I0311 01:31:13.839530 4744 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-dzpx4"] Mar 11 01:31:13 crc kubenswrapper[4744]: I0311 01:31:13.869019 4744 scope.go:117] "RemoveContainer" containerID="20e4c6b5839e49462afa0c368f55cf4ecd7d287307ee42164a9f9f572988681b" Mar 11 01:31:13 crc kubenswrapper[4744]: I0311 01:31:13.899618 4744 scope.go:117] "RemoveContainer" containerID="93b6d2113cab94702f9e60de5989734232eca0421825d5328bd4427c8aca8fd2" Mar 11 01:31:13 crc kubenswrapper[4744]: E0311 01:31:13.900040 4744 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"93b6d2113cab94702f9e60de5989734232eca0421825d5328bd4427c8aca8fd2\": container with ID starting with 93b6d2113cab94702f9e60de5989734232eca0421825d5328bd4427c8aca8fd2 not found: ID does not exist" containerID="93b6d2113cab94702f9e60de5989734232eca0421825d5328bd4427c8aca8fd2" Mar 11 01:31:13 crc kubenswrapper[4744]: I0311 01:31:13.900081 4744 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"93b6d2113cab94702f9e60de5989734232eca0421825d5328bd4427c8aca8fd2"} err="failed to get container status \"93b6d2113cab94702f9e60de5989734232eca0421825d5328bd4427c8aca8fd2\": rpc error: code = NotFound desc = could not find container \"93b6d2113cab94702f9e60de5989734232eca0421825d5328bd4427c8aca8fd2\": container with ID starting with 93b6d2113cab94702f9e60de5989734232eca0421825d5328bd4427c8aca8fd2 not 
found: ID does not exist" Mar 11 01:31:13 crc kubenswrapper[4744]: I0311 01:31:13.900107 4744 scope.go:117] "RemoveContainer" containerID="0c41eacf071bad6e3f6390c4640fff8ca1a7368d394ea633f67168bebf62d5ef" Mar 11 01:31:13 crc kubenswrapper[4744]: E0311 01:31:13.900502 4744 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0c41eacf071bad6e3f6390c4640fff8ca1a7368d394ea633f67168bebf62d5ef\": container with ID starting with 0c41eacf071bad6e3f6390c4640fff8ca1a7368d394ea633f67168bebf62d5ef not found: ID does not exist" containerID="0c41eacf071bad6e3f6390c4640fff8ca1a7368d394ea633f67168bebf62d5ef" Mar 11 01:31:13 crc kubenswrapper[4744]: I0311 01:31:13.900545 4744 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0c41eacf071bad6e3f6390c4640fff8ca1a7368d394ea633f67168bebf62d5ef"} err="failed to get container status \"0c41eacf071bad6e3f6390c4640fff8ca1a7368d394ea633f67168bebf62d5ef\": rpc error: code = NotFound desc = could not find container \"0c41eacf071bad6e3f6390c4640fff8ca1a7368d394ea633f67168bebf62d5ef\": container with ID starting with 0c41eacf071bad6e3f6390c4640fff8ca1a7368d394ea633f67168bebf62d5ef not found: ID does not exist" Mar 11 01:31:13 crc kubenswrapper[4744]: I0311 01:31:13.900583 4744 scope.go:117] "RemoveContainer" containerID="20e4c6b5839e49462afa0c368f55cf4ecd7d287307ee42164a9f9f572988681b" Mar 11 01:31:13 crc kubenswrapper[4744]: E0311 01:31:13.900992 4744 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"20e4c6b5839e49462afa0c368f55cf4ecd7d287307ee42164a9f9f572988681b\": container with ID starting with 20e4c6b5839e49462afa0c368f55cf4ecd7d287307ee42164a9f9f572988681b not found: ID does not exist" containerID="20e4c6b5839e49462afa0c368f55cf4ecd7d287307ee42164a9f9f572988681b" Mar 11 01:31:13 crc kubenswrapper[4744]: I0311 01:31:13.901031 4744 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"20e4c6b5839e49462afa0c368f55cf4ecd7d287307ee42164a9f9f572988681b"} err="failed to get container status \"20e4c6b5839e49462afa0c368f55cf4ecd7d287307ee42164a9f9f572988681b\": rpc error: code = NotFound desc = could not find container \"20e4c6b5839e49462afa0c368f55cf4ecd7d287307ee42164a9f9f572988681b\": container with ID starting with 20e4c6b5839e49462afa0c368f55cf4ecd7d287307ee42164a9f9f572988681b not found: ID does not exist" Mar 11 01:31:13 crc kubenswrapper[4744]: I0311 01:31:13.993215 4744 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8da2a7e6-763a-458f-8750-0b4eb5e1c72f" path="/var/lib/kubelet/pods/8da2a7e6-763a-458f-8750-0b4eb5e1c72f/volumes" Mar 11 01:31:18 crc kubenswrapper[4744]: I0311 01:31:18.113586 4744 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-l7852" Mar 11 01:31:18 crc kubenswrapper[4744]: I0311 01:31:18.188441 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-l7852" Mar 11 01:31:18 crc kubenswrapper[4744]: I0311 01:31:18.366809 4744 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-l7852"] Mar 11 01:31:19 crc kubenswrapper[4744]: I0311 01:31:19.849764 4744 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-l7852" podUID="391122a8-9803-4bd7-a723-63af96eac741" containerName="registry-server" containerID="cri-o://6d4a31582aabd82c9b26e8d69f1f8120e9c54077a95e05ed3132ffad77b469f7" gracePeriod=2 Mar 11 01:31:20 crc kubenswrapper[4744]: I0311 01:31:20.305567 4744 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-l7852" Mar 11 01:31:20 crc kubenswrapper[4744]: I0311 01:31:20.394844 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/391122a8-9803-4bd7-a723-63af96eac741-utilities\") pod \"391122a8-9803-4bd7-a723-63af96eac741\" (UID: \"391122a8-9803-4bd7-a723-63af96eac741\") " Mar 11 01:31:20 crc kubenswrapper[4744]: I0311 01:31:20.395005 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/391122a8-9803-4bd7-a723-63af96eac741-catalog-content\") pod \"391122a8-9803-4bd7-a723-63af96eac741\" (UID: \"391122a8-9803-4bd7-a723-63af96eac741\") " Mar 11 01:31:20 crc kubenswrapper[4744]: I0311 01:31:20.395063 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-z9z8j\" (UniqueName: \"kubernetes.io/projected/391122a8-9803-4bd7-a723-63af96eac741-kube-api-access-z9z8j\") pod \"391122a8-9803-4bd7-a723-63af96eac741\" (UID: \"391122a8-9803-4bd7-a723-63af96eac741\") " Mar 11 01:31:20 crc kubenswrapper[4744]: I0311 01:31:20.395875 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/391122a8-9803-4bd7-a723-63af96eac741-utilities" (OuterVolumeSpecName: "utilities") pod "391122a8-9803-4bd7-a723-63af96eac741" (UID: "391122a8-9803-4bd7-a723-63af96eac741"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 11 01:31:20 crc kubenswrapper[4744]: I0311 01:31:20.427978 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/391122a8-9803-4bd7-a723-63af96eac741-kube-api-access-z9z8j" (OuterVolumeSpecName: "kube-api-access-z9z8j") pod "391122a8-9803-4bd7-a723-63af96eac741" (UID: "391122a8-9803-4bd7-a723-63af96eac741"). InnerVolumeSpecName "kube-api-access-z9z8j". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 01:31:20 crc kubenswrapper[4744]: I0311 01:31:20.496764 4744 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-z9z8j\" (UniqueName: \"kubernetes.io/projected/391122a8-9803-4bd7-a723-63af96eac741-kube-api-access-z9z8j\") on node \"crc\" DevicePath \"\"" Mar 11 01:31:20 crc kubenswrapper[4744]: I0311 01:31:20.496805 4744 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/391122a8-9803-4bd7-a723-63af96eac741-utilities\") on node \"crc\" DevicePath \"\"" Mar 11 01:31:20 crc kubenswrapper[4744]: I0311 01:31:20.570779 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/391122a8-9803-4bd7-a723-63af96eac741-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "391122a8-9803-4bd7-a723-63af96eac741" (UID: "391122a8-9803-4bd7-a723-63af96eac741"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 11 01:31:20 crc kubenswrapper[4744]: I0311 01:31:20.598228 4744 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/391122a8-9803-4bd7-a723-63af96eac741-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 11 01:31:20 crc kubenswrapper[4744]: I0311 01:31:20.860296 4744 generic.go:334] "Generic (PLEG): container finished" podID="391122a8-9803-4bd7-a723-63af96eac741" containerID="6d4a31582aabd82c9b26e8d69f1f8120e9c54077a95e05ed3132ffad77b469f7" exitCode=0 Mar 11 01:31:20 crc kubenswrapper[4744]: I0311 01:31:20.860391 4744 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-l7852" Mar 11 01:31:20 crc kubenswrapper[4744]: I0311 01:31:20.860420 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-l7852" event={"ID":"391122a8-9803-4bd7-a723-63af96eac741","Type":"ContainerDied","Data":"6d4a31582aabd82c9b26e8d69f1f8120e9c54077a95e05ed3132ffad77b469f7"} Mar 11 01:31:20 crc kubenswrapper[4744]: I0311 01:31:20.860813 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-l7852" event={"ID":"391122a8-9803-4bd7-a723-63af96eac741","Type":"ContainerDied","Data":"37b396912c0ce699f6a01e99bc4e6205cf1add6d8b239fd53c2c44b2ee72488f"} Mar 11 01:31:20 crc kubenswrapper[4744]: I0311 01:31:20.860849 4744 scope.go:117] "RemoveContainer" containerID="6d4a31582aabd82c9b26e8d69f1f8120e9c54077a95e05ed3132ffad77b469f7" Mar 11 01:31:20 crc kubenswrapper[4744]: I0311 01:31:20.900970 4744 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-l7852"] Mar 11 01:31:20 crc kubenswrapper[4744]: I0311 01:31:20.901610 4744 scope.go:117] "RemoveContainer" containerID="9df3e8ef899627780a847fc4ee55ae46318e3a9841043dda3ddab74ffce8ddac" Mar 11 01:31:20 crc kubenswrapper[4744]: I0311 01:31:20.906967 4744 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-l7852"] Mar 11 01:31:20 crc kubenswrapper[4744]: I0311 01:31:20.924878 4744 scope.go:117] "RemoveContainer" containerID="d17f41002cfdb6eb2abea333d9864b5e455a0731e2ff65fd87eced1f62b31e83" Mar 11 01:31:20 crc kubenswrapper[4744]: I0311 01:31:20.949504 4744 scope.go:117] "RemoveContainer" containerID="6d4a31582aabd82c9b26e8d69f1f8120e9c54077a95e05ed3132ffad77b469f7" Mar 11 01:31:20 crc kubenswrapper[4744]: E0311 01:31:20.949955 4744 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"6d4a31582aabd82c9b26e8d69f1f8120e9c54077a95e05ed3132ffad77b469f7\": container with ID starting with 6d4a31582aabd82c9b26e8d69f1f8120e9c54077a95e05ed3132ffad77b469f7 not found: ID does not exist" containerID="6d4a31582aabd82c9b26e8d69f1f8120e9c54077a95e05ed3132ffad77b469f7" Mar 11 01:31:20 crc kubenswrapper[4744]: I0311 01:31:20.949982 4744 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6d4a31582aabd82c9b26e8d69f1f8120e9c54077a95e05ed3132ffad77b469f7"} err="failed to get container status \"6d4a31582aabd82c9b26e8d69f1f8120e9c54077a95e05ed3132ffad77b469f7\": rpc error: code = NotFound desc = could not find container \"6d4a31582aabd82c9b26e8d69f1f8120e9c54077a95e05ed3132ffad77b469f7\": container with ID starting with 6d4a31582aabd82c9b26e8d69f1f8120e9c54077a95e05ed3132ffad77b469f7 not found: ID does not exist" Mar 11 01:31:20 crc kubenswrapper[4744]: I0311 01:31:20.950003 4744 scope.go:117] "RemoveContainer" containerID="9df3e8ef899627780a847fc4ee55ae46318e3a9841043dda3ddab74ffce8ddac" Mar 11 01:31:20 crc kubenswrapper[4744]: E0311 01:31:20.950456 4744 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9df3e8ef899627780a847fc4ee55ae46318e3a9841043dda3ddab74ffce8ddac\": container with ID starting with 9df3e8ef899627780a847fc4ee55ae46318e3a9841043dda3ddab74ffce8ddac not found: ID does not exist" containerID="9df3e8ef899627780a847fc4ee55ae46318e3a9841043dda3ddab74ffce8ddac" Mar 11 01:31:20 crc kubenswrapper[4744]: I0311 01:31:20.950684 4744 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9df3e8ef899627780a847fc4ee55ae46318e3a9841043dda3ddab74ffce8ddac"} err="failed to get container status \"9df3e8ef899627780a847fc4ee55ae46318e3a9841043dda3ddab74ffce8ddac\": rpc error: code = NotFound desc = could not find container \"9df3e8ef899627780a847fc4ee55ae46318e3a9841043dda3ddab74ffce8ddac\": container with ID 
starting with 9df3e8ef899627780a847fc4ee55ae46318e3a9841043dda3ddab74ffce8ddac not found: ID does not exist" Mar 11 01:31:20 crc kubenswrapper[4744]: I0311 01:31:20.950839 4744 scope.go:117] "RemoveContainer" containerID="d17f41002cfdb6eb2abea333d9864b5e455a0731e2ff65fd87eced1f62b31e83" Mar 11 01:31:20 crc kubenswrapper[4744]: E0311 01:31:20.951492 4744 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d17f41002cfdb6eb2abea333d9864b5e455a0731e2ff65fd87eced1f62b31e83\": container with ID starting with d17f41002cfdb6eb2abea333d9864b5e455a0731e2ff65fd87eced1f62b31e83 not found: ID does not exist" containerID="d17f41002cfdb6eb2abea333d9864b5e455a0731e2ff65fd87eced1f62b31e83" Mar 11 01:31:20 crc kubenswrapper[4744]: I0311 01:31:20.951558 4744 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d17f41002cfdb6eb2abea333d9864b5e455a0731e2ff65fd87eced1f62b31e83"} err="failed to get container status \"d17f41002cfdb6eb2abea333d9864b5e455a0731e2ff65fd87eced1f62b31e83\": rpc error: code = NotFound desc = could not find container \"d17f41002cfdb6eb2abea333d9864b5e455a0731e2ff65fd87eced1f62b31e83\": container with ID starting with d17f41002cfdb6eb2abea333d9864b5e455a0731e2ff65fd87eced1f62b31e83 not found: ID does not exist" Mar 11 01:31:21 crc kubenswrapper[4744]: I0311 01:31:21.990252 4744 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="391122a8-9803-4bd7-a723-63af96eac741" path="/var/lib/kubelet/pods/391122a8-9803-4bd7-a723-63af96eac741/volumes" Mar 11 01:31:42 crc kubenswrapper[4744]: I0311 01:31:42.409486 4744 patch_prober.go:28] interesting pod/machine-config-daemon-678nx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 11 01:31:42 crc kubenswrapper[4744]: I0311 
01:31:42.410166 4744 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-678nx" podUID="a15dc7ac-7c34-4135-b6eb-a85122800ce9" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 11 01:31:42 crc kubenswrapper[4744]: I0311 01:31:42.410235 4744 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-678nx" Mar 11 01:31:42 crc kubenswrapper[4744]: I0311 01:31:42.411240 4744 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"f347064f0cb75f8bb43f62de0ab1edab4843ef09d878578850df37cec024ff09"} pod="openshift-machine-config-operator/machine-config-daemon-678nx" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 11 01:31:42 crc kubenswrapper[4744]: I0311 01:31:42.411335 4744 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-678nx" podUID="a15dc7ac-7c34-4135-b6eb-a85122800ce9" containerName="machine-config-daemon" containerID="cri-o://f347064f0cb75f8bb43f62de0ab1edab4843ef09d878578850df37cec024ff09" gracePeriod=600 Mar 11 01:31:42 crc kubenswrapper[4744]: E0311 01:31:42.541280 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-678nx_openshift-machine-config-operator(a15dc7ac-7c34-4135-b6eb-a85122800ce9)\"" pod="openshift-machine-config-operator/machine-config-daemon-678nx" podUID="a15dc7ac-7c34-4135-b6eb-a85122800ce9" Mar 11 01:31:43 crc kubenswrapper[4744]: I0311 01:31:43.077153 4744 generic.go:334] "Generic (PLEG): container finished" 
podID="a15dc7ac-7c34-4135-b6eb-a85122800ce9" containerID="f347064f0cb75f8bb43f62de0ab1edab4843ef09d878578850df37cec024ff09" exitCode=0 Mar 11 01:31:43 crc kubenswrapper[4744]: I0311 01:31:43.077234 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-678nx" event={"ID":"a15dc7ac-7c34-4135-b6eb-a85122800ce9","Type":"ContainerDied","Data":"f347064f0cb75f8bb43f62de0ab1edab4843ef09d878578850df37cec024ff09"} Mar 11 01:31:43 crc kubenswrapper[4744]: I0311 01:31:43.077294 4744 scope.go:117] "RemoveContainer" containerID="69edf655e778db6d1e81f55a954739ba06410685289317ed77b6037fb685d599" Mar 11 01:31:43 crc kubenswrapper[4744]: I0311 01:31:43.078904 4744 scope.go:117] "RemoveContainer" containerID="f347064f0cb75f8bb43f62de0ab1edab4843ef09d878578850df37cec024ff09" Mar 11 01:31:43 crc kubenswrapper[4744]: E0311 01:31:43.079382 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-678nx_openshift-machine-config-operator(a15dc7ac-7c34-4135-b6eb-a85122800ce9)\"" pod="openshift-machine-config-operator/machine-config-daemon-678nx" podUID="a15dc7ac-7c34-4135-b6eb-a85122800ce9" Mar 11 01:31:54 crc kubenswrapper[4744]: I0311 01:31:54.975023 4744 scope.go:117] "RemoveContainer" containerID="f347064f0cb75f8bb43f62de0ab1edab4843ef09d878578850df37cec024ff09" Mar 11 01:31:54 crc kubenswrapper[4744]: E0311 01:31:54.976123 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-678nx_openshift-machine-config-operator(a15dc7ac-7c34-4135-b6eb-a85122800ce9)\"" pod="openshift-machine-config-operator/machine-config-daemon-678nx" podUID="a15dc7ac-7c34-4135-b6eb-a85122800ce9" Mar 11 
01:32:00 crc kubenswrapper[4744]: I0311 01:32:00.164570 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29553212-7745p"] Mar 11 01:32:00 crc kubenswrapper[4744]: E0311 01:32:00.165364 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8da2a7e6-763a-458f-8750-0b4eb5e1c72f" containerName="extract-utilities" Mar 11 01:32:00 crc kubenswrapper[4744]: I0311 01:32:00.165389 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="8da2a7e6-763a-458f-8750-0b4eb5e1c72f" containerName="extract-utilities" Mar 11 01:32:00 crc kubenswrapper[4744]: E0311 01:32:00.165416 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="391122a8-9803-4bd7-a723-63af96eac741" containerName="extract-content" Mar 11 01:32:00 crc kubenswrapper[4744]: I0311 01:32:00.165428 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="391122a8-9803-4bd7-a723-63af96eac741" containerName="extract-content" Mar 11 01:32:00 crc kubenswrapper[4744]: E0311 01:32:00.165455 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="391122a8-9803-4bd7-a723-63af96eac741" containerName="registry-server" Mar 11 01:32:00 crc kubenswrapper[4744]: I0311 01:32:00.165472 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="391122a8-9803-4bd7-a723-63af96eac741" containerName="registry-server" Mar 11 01:32:00 crc kubenswrapper[4744]: E0311 01:32:00.165509 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="391122a8-9803-4bd7-a723-63af96eac741" containerName="extract-utilities" Mar 11 01:32:00 crc kubenswrapper[4744]: I0311 01:32:00.165557 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="391122a8-9803-4bd7-a723-63af96eac741" containerName="extract-utilities" Mar 11 01:32:00 crc kubenswrapper[4744]: E0311 01:32:00.165584 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8da2a7e6-763a-458f-8750-0b4eb5e1c72f" containerName="registry-server" Mar 11 01:32:00 crc 
kubenswrapper[4744]: I0311 01:32:00.165601 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="8da2a7e6-763a-458f-8750-0b4eb5e1c72f" containerName="registry-server" Mar 11 01:32:00 crc kubenswrapper[4744]: E0311 01:32:00.165621 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8da2a7e6-763a-458f-8750-0b4eb5e1c72f" containerName="extract-content" Mar 11 01:32:00 crc kubenswrapper[4744]: I0311 01:32:00.165633 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="8da2a7e6-763a-458f-8750-0b4eb5e1c72f" containerName="extract-content" Mar 11 01:32:00 crc kubenswrapper[4744]: I0311 01:32:00.165883 4744 memory_manager.go:354] "RemoveStaleState removing state" podUID="391122a8-9803-4bd7-a723-63af96eac741" containerName="registry-server" Mar 11 01:32:00 crc kubenswrapper[4744]: I0311 01:32:00.165928 4744 memory_manager.go:354] "RemoveStaleState removing state" podUID="8da2a7e6-763a-458f-8750-0b4eb5e1c72f" containerName="registry-server" Mar 11 01:32:00 crc kubenswrapper[4744]: I0311 01:32:00.166595 4744 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29553212-7745p" Mar 11 01:32:00 crc kubenswrapper[4744]: I0311 01:32:00.177979 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 11 01:32:00 crc kubenswrapper[4744]: I0311 01:32:00.177996 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-rs77s" Mar 11 01:32:00 crc kubenswrapper[4744]: I0311 01:32:00.178051 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 11 01:32:00 crc kubenswrapper[4744]: I0311 01:32:00.186261 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29553212-7745p"] Mar 11 01:32:00 crc kubenswrapper[4744]: I0311 01:32:00.227403 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mkrdv\" (UniqueName: \"kubernetes.io/projected/8d898cd4-7543-4bff-b1e3-09ec55c8f223-kube-api-access-mkrdv\") pod \"auto-csr-approver-29553212-7745p\" (UID: \"8d898cd4-7543-4bff-b1e3-09ec55c8f223\") " pod="openshift-infra/auto-csr-approver-29553212-7745p" Mar 11 01:32:00 crc kubenswrapper[4744]: I0311 01:32:00.329248 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mkrdv\" (UniqueName: \"kubernetes.io/projected/8d898cd4-7543-4bff-b1e3-09ec55c8f223-kube-api-access-mkrdv\") pod \"auto-csr-approver-29553212-7745p\" (UID: \"8d898cd4-7543-4bff-b1e3-09ec55c8f223\") " pod="openshift-infra/auto-csr-approver-29553212-7745p" Mar 11 01:32:00 crc kubenswrapper[4744]: I0311 01:32:00.355523 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mkrdv\" (UniqueName: \"kubernetes.io/projected/8d898cd4-7543-4bff-b1e3-09ec55c8f223-kube-api-access-mkrdv\") pod \"auto-csr-approver-29553212-7745p\" (UID: \"8d898cd4-7543-4bff-b1e3-09ec55c8f223\") " 
pod="openshift-infra/auto-csr-approver-29553212-7745p" Mar 11 01:32:00 crc kubenswrapper[4744]: I0311 01:32:00.498349 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29553212-7745p" Mar 11 01:32:00 crc kubenswrapper[4744]: I0311 01:32:00.980108 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29553212-7745p"] Mar 11 01:32:01 crc kubenswrapper[4744]: I0311 01:32:01.238689 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29553212-7745p" event={"ID":"8d898cd4-7543-4bff-b1e3-09ec55c8f223","Type":"ContainerStarted","Data":"317ace4a3e83eedeb5a71d99c2a07bfdb4d2881fb2310f3ac361de1c63515573"} Mar 11 01:32:03 crc kubenswrapper[4744]: I0311 01:32:03.263274 4744 generic.go:334] "Generic (PLEG): container finished" podID="8d898cd4-7543-4bff-b1e3-09ec55c8f223" containerID="e017e42edc068278bde4a088efa25c5ed3e0a76d2c1ff8c5c049bb90b9558345" exitCode=0 Mar 11 01:32:03 crc kubenswrapper[4744]: I0311 01:32:03.263917 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29553212-7745p" event={"ID":"8d898cd4-7543-4bff-b1e3-09ec55c8f223","Type":"ContainerDied","Data":"e017e42edc068278bde4a088efa25c5ed3e0a76d2c1ff8c5c049bb90b9558345"} Mar 11 01:32:04 crc kubenswrapper[4744]: I0311 01:32:04.640837 4744 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29553212-7745p" Mar 11 01:32:04 crc kubenswrapper[4744]: I0311 01:32:04.701073 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mkrdv\" (UniqueName: \"kubernetes.io/projected/8d898cd4-7543-4bff-b1e3-09ec55c8f223-kube-api-access-mkrdv\") pod \"8d898cd4-7543-4bff-b1e3-09ec55c8f223\" (UID: \"8d898cd4-7543-4bff-b1e3-09ec55c8f223\") " Mar 11 01:32:04 crc kubenswrapper[4744]: I0311 01:32:04.710382 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8d898cd4-7543-4bff-b1e3-09ec55c8f223-kube-api-access-mkrdv" (OuterVolumeSpecName: "kube-api-access-mkrdv") pod "8d898cd4-7543-4bff-b1e3-09ec55c8f223" (UID: "8d898cd4-7543-4bff-b1e3-09ec55c8f223"). InnerVolumeSpecName "kube-api-access-mkrdv". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 01:32:04 crc kubenswrapper[4744]: I0311 01:32:04.803636 4744 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mkrdv\" (UniqueName: \"kubernetes.io/projected/8d898cd4-7543-4bff-b1e3-09ec55c8f223-kube-api-access-mkrdv\") on node \"crc\" DevicePath \"\"" Mar 11 01:32:05 crc kubenswrapper[4744]: I0311 01:32:05.285677 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29553212-7745p" event={"ID":"8d898cd4-7543-4bff-b1e3-09ec55c8f223","Type":"ContainerDied","Data":"317ace4a3e83eedeb5a71d99c2a07bfdb4d2881fb2310f3ac361de1c63515573"} Mar 11 01:32:05 crc kubenswrapper[4744]: I0311 01:32:05.285729 4744 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="317ace4a3e83eedeb5a71d99c2a07bfdb4d2881fb2310f3ac361de1c63515573" Mar 11 01:32:05 crc kubenswrapper[4744]: I0311 01:32:05.285751 4744 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29553212-7745p" Mar 11 01:32:05 crc kubenswrapper[4744]: I0311 01:32:05.736733 4744 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29553206-n2dqc"] Mar 11 01:32:05 crc kubenswrapper[4744]: I0311 01:32:05.748875 4744 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29553206-n2dqc"] Mar 11 01:32:05 crc kubenswrapper[4744]: I0311 01:32:05.991373 4744 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7b9114d1-71dd-4281-a95b-5c29066c6093" path="/var/lib/kubelet/pods/7b9114d1-71dd-4281-a95b-5c29066c6093/volumes" Mar 11 01:32:06 crc kubenswrapper[4744]: I0311 01:32:06.974793 4744 scope.go:117] "RemoveContainer" containerID="f347064f0cb75f8bb43f62de0ab1edab4843ef09d878578850df37cec024ff09" Mar 11 01:32:06 crc kubenswrapper[4744]: E0311 01:32:06.975435 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-678nx_openshift-machine-config-operator(a15dc7ac-7c34-4135-b6eb-a85122800ce9)\"" pod="openshift-machine-config-operator/machine-config-daemon-678nx" podUID="a15dc7ac-7c34-4135-b6eb-a85122800ce9" Mar 11 01:32:11 crc kubenswrapper[4744]: I0311 01:32:11.646572 4744 scope.go:117] "RemoveContainer" containerID="d5ce76c7158b9f92c1c7120dfcf72df5750377fbe841fd6156d52d86ccb0ef77" Mar 11 01:32:20 crc kubenswrapper[4744]: I0311 01:32:20.975342 4744 scope.go:117] "RemoveContainer" containerID="f347064f0cb75f8bb43f62de0ab1edab4843ef09d878578850df37cec024ff09" Mar 11 01:32:20 crc kubenswrapper[4744]: E0311 01:32:20.976441 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-678nx_openshift-machine-config-operator(a15dc7ac-7c34-4135-b6eb-a85122800ce9)\"" pod="openshift-machine-config-operator/machine-config-daemon-678nx" podUID="a15dc7ac-7c34-4135-b6eb-a85122800ce9" Mar 11 01:32:32 crc kubenswrapper[4744]: I0311 01:32:32.974971 4744 scope.go:117] "RemoveContainer" containerID="f347064f0cb75f8bb43f62de0ab1edab4843ef09d878578850df37cec024ff09" Mar 11 01:32:32 crc kubenswrapper[4744]: E0311 01:32:32.975813 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-678nx_openshift-machine-config-operator(a15dc7ac-7c34-4135-b6eb-a85122800ce9)\"" pod="openshift-machine-config-operator/machine-config-daemon-678nx" podUID="a15dc7ac-7c34-4135-b6eb-a85122800ce9" Mar 11 01:32:43 crc kubenswrapper[4744]: I0311 01:32:43.979187 4744 scope.go:117] "RemoveContainer" containerID="f347064f0cb75f8bb43f62de0ab1edab4843ef09d878578850df37cec024ff09" Mar 11 01:32:43 crc kubenswrapper[4744]: E0311 01:32:43.979899 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-678nx_openshift-machine-config-operator(a15dc7ac-7c34-4135-b6eb-a85122800ce9)\"" pod="openshift-machine-config-operator/machine-config-daemon-678nx" podUID="a15dc7ac-7c34-4135-b6eb-a85122800ce9" Mar 11 01:32:54 crc kubenswrapper[4744]: I0311 01:32:54.975440 4744 scope.go:117] "RemoveContainer" containerID="f347064f0cb75f8bb43f62de0ab1edab4843ef09d878578850df37cec024ff09" Mar 11 01:32:54 crc kubenswrapper[4744]: E0311 01:32:54.976431 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed 
container=machine-config-daemon pod=machine-config-daemon-678nx_openshift-machine-config-operator(a15dc7ac-7c34-4135-b6eb-a85122800ce9)\"" pod="openshift-machine-config-operator/machine-config-daemon-678nx" podUID="a15dc7ac-7c34-4135-b6eb-a85122800ce9" Mar 11 01:32:55 crc kubenswrapper[4744]: I0311 01:32:55.034595 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-sb5rp"] Mar 11 01:32:55 crc kubenswrapper[4744]: E0311 01:32:55.035154 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8d898cd4-7543-4bff-b1e3-09ec55c8f223" containerName="oc" Mar 11 01:32:55 crc kubenswrapper[4744]: I0311 01:32:55.035187 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="8d898cd4-7543-4bff-b1e3-09ec55c8f223" containerName="oc" Mar 11 01:32:55 crc kubenswrapper[4744]: I0311 01:32:55.035560 4744 memory_manager.go:354] "RemoveStaleState removing state" podUID="8d898cd4-7543-4bff-b1e3-09ec55c8f223" containerName="oc" Mar 11 01:32:55 crc kubenswrapper[4744]: I0311 01:32:55.037619 4744 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-sb5rp" Mar 11 01:32:55 crc kubenswrapper[4744]: I0311 01:32:55.054452 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-sb5rp"] Mar 11 01:32:55 crc kubenswrapper[4744]: I0311 01:32:55.197331 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vrm7t\" (UniqueName: \"kubernetes.io/projected/dd5429a1-b64f-4748-a7d2-3255dfef8be2-kube-api-access-vrm7t\") pod \"community-operators-sb5rp\" (UID: \"dd5429a1-b64f-4748-a7d2-3255dfef8be2\") " pod="openshift-marketplace/community-operators-sb5rp" Mar 11 01:32:55 crc kubenswrapper[4744]: I0311 01:32:55.197412 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/dd5429a1-b64f-4748-a7d2-3255dfef8be2-utilities\") pod \"community-operators-sb5rp\" (UID: \"dd5429a1-b64f-4748-a7d2-3255dfef8be2\") " pod="openshift-marketplace/community-operators-sb5rp" Mar 11 01:32:55 crc kubenswrapper[4744]: I0311 01:32:55.197446 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/dd5429a1-b64f-4748-a7d2-3255dfef8be2-catalog-content\") pod \"community-operators-sb5rp\" (UID: \"dd5429a1-b64f-4748-a7d2-3255dfef8be2\") " pod="openshift-marketplace/community-operators-sb5rp" Mar 11 01:32:55 crc kubenswrapper[4744]: I0311 01:32:55.298497 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/dd5429a1-b64f-4748-a7d2-3255dfef8be2-utilities\") pod \"community-operators-sb5rp\" (UID: \"dd5429a1-b64f-4748-a7d2-3255dfef8be2\") " pod="openshift-marketplace/community-operators-sb5rp" Mar 11 01:32:55 crc kubenswrapper[4744]: I0311 01:32:55.298578 4744 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/dd5429a1-b64f-4748-a7d2-3255dfef8be2-catalog-content\") pod \"community-operators-sb5rp\" (UID: \"dd5429a1-b64f-4748-a7d2-3255dfef8be2\") " pod="openshift-marketplace/community-operators-sb5rp" Mar 11 01:32:55 crc kubenswrapper[4744]: I0311 01:32:55.298634 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vrm7t\" (UniqueName: \"kubernetes.io/projected/dd5429a1-b64f-4748-a7d2-3255dfef8be2-kube-api-access-vrm7t\") pod \"community-operators-sb5rp\" (UID: \"dd5429a1-b64f-4748-a7d2-3255dfef8be2\") " pod="openshift-marketplace/community-operators-sb5rp" Mar 11 01:32:55 crc kubenswrapper[4744]: I0311 01:32:55.299097 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/dd5429a1-b64f-4748-a7d2-3255dfef8be2-utilities\") pod \"community-operators-sb5rp\" (UID: \"dd5429a1-b64f-4748-a7d2-3255dfef8be2\") " pod="openshift-marketplace/community-operators-sb5rp" Mar 11 01:32:55 crc kubenswrapper[4744]: I0311 01:32:55.299208 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/dd5429a1-b64f-4748-a7d2-3255dfef8be2-catalog-content\") pod \"community-operators-sb5rp\" (UID: \"dd5429a1-b64f-4748-a7d2-3255dfef8be2\") " pod="openshift-marketplace/community-operators-sb5rp" Mar 11 01:32:55 crc kubenswrapper[4744]: I0311 01:32:55.323054 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vrm7t\" (UniqueName: \"kubernetes.io/projected/dd5429a1-b64f-4748-a7d2-3255dfef8be2-kube-api-access-vrm7t\") pod \"community-operators-sb5rp\" (UID: \"dd5429a1-b64f-4748-a7d2-3255dfef8be2\") " pod="openshift-marketplace/community-operators-sb5rp" Mar 11 01:32:55 crc kubenswrapper[4744]: I0311 01:32:55.369479 4744 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-sb5rp" Mar 11 01:32:55 crc kubenswrapper[4744]: I0311 01:32:55.653224 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-sb5rp"] Mar 11 01:32:55 crc kubenswrapper[4744]: I0311 01:32:55.738116 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-sb5rp" event={"ID":"dd5429a1-b64f-4748-a7d2-3255dfef8be2","Type":"ContainerStarted","Data":"bc6797a56a200c1d91e1bf59f6572e52fd68a4af129dbea23b8c17d7f047e2a9"} Mar 11 01:32:56 crc kubenswrapper[4744]: I0311 01:32:56.746075 4744 generic.go:334] "Generic (PLEG): container finished" podID="dd5429a1-b64f-4748-a7d2-3255dfef8be2" containerID="e4b0be8a856054857d81bb12a87e9feeb015074c9eddfd885d544cff7b673624" exitCode=0 Mar 11 01:32:56 crc kubenswrapper[4744]: I0311 01:32:56.746123 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-sb5rp" event={"ID":"dd5429a1-b64f-4748-a7d2-3255dfef8be2","Type":"ContainerDied","Data":"e4b0be8a856054857d81bb12a87e9feeb015074c9eddfd885d544cff7b673624"} Mar 11 01:32:57 crc kubenswrapper[4744]: I0311 01:32:57.754561 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-sb5rp" event={"ID":"dd5429a1-b64f-4748-a7d2-3255dfef8be2","Type":"ContainerStarted","Data":"1013db6b8ba91f60e124d624079574799fe20c50d4495f162f37630312abdce4"} Mar 11 01:32:58 crc kubenswrapper[4744]: I0311 01:32:58.764346 4744 generic.go:334] "Generic (PLEG): container finished" podID="dd5429a1-b64f-4748-a7d2-3255dfef8be2" containerID="1013db6b8ba91f60e124d624079574799fe20c50d4495f162f37630312abdce4" exitCode=0 Mar 11 01:32:58 crc kubenswrapper[4744]: I0311 01:32:58.764400 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-sb5rp" 
event={"ID":"dd5429a1-b64f-4748-a7d2-3255dfef8be2","Type":"ContainerDied","Data":"1013db6b8ba91f60e124d624079574799fe20c50d4495f162f37630312abdce4"} Mar 11 01:33:00 crc kubenswrapper[4744]: I0311 01:33:00.798428 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-sb5rp" event={"ID":"dd5429a1-b64f-4748-a7d2-3255dfef8be2","Type":"ContainerStarted","Data":"f8b41d433456520a78db3106ba009daad53bd5160d2fa9ae86332e98f09d0d4b"} Mar 11 01:33:00 crc kubenswrapper[4744]: I0311 01:33:00.833947 4744 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-sb5rp" podStartSLOduration=2.18404926 podStartE2EDuration="5.833930891s" podCreationTimestamp="2026-03-11 01:32:55 +0000 UTC" firstStartedPulling="2026-03-11 01:32:56.749957202 +0000 UTC m=+2333.554174817" lastFinishedPulling="2026-03-11 01:33:00.399838813 +0000 UTC m=+2337.204056448" observedRunningTime="2026-03-11 01:33:00.831984011 +0000 UTC m=+2337.636201636" watchObservedRunningTime="2026-03-11 01:33:00.833930891 +0000 UTC m=+2337.638148496" Mar 11 01:33:05 crc kubenswrapper[4744]: I0311 01:33:05.372260 4744 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-sb5rp" Mar 11 01:33:05 crc kubenswrapper[4744]: I0311 01:33:05.373697 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-sb5rp" Mar 11 01:33:05 crc kubenswrapper[4744]: I0311 01:33:05.449414 4744 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-sb5rp" Mar 11 01:33:05 crc kubenswrapper[4744]: I0311 01:33:05.921366 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-sb5rp" Mar 11 01:33:07 crc kubenswrapper[4744]: I0311 01:33:07.974376 4744 scope.go:117] "RemoveContainer" 
containerID="f347064f0cb75f8bb43f62de0ab1edab4843ef09d878578850df37cec024ff09" Mar 11 01:33:07 crc kubenswrapper[4744]: E0311 01:33:07.974803 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-678nx_openshift-machine-config-operator(a15dc7ac-7c34-4135-b6eb-a85122800ce9)\"" pod="openshift-machine-config-operator/machine-config-daemon-678nx" podUID="a15dc7ac-7c34-4135-b6eb-a85122800ce9" Mar 11 01:33:08 crc kubenswrapper[4744]: I0311 01:33:08.220910 4744 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-sb5rp"] Mar 11 01:33:08 crc kubenswrapper[4744]: I0311 01:33:08.221242 4744 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-sb5rp" podUID="dd5429a1-b64f-4748-a7d2-3255dfef8be2" containerName="registry-server" containerID="cri-o://f8b41d433456520a78db3106ba009daad53bd5160d2fa9ae86332e98f09d0d4b" gracePeriod=2 Mar 11 01:33:08 crc kubenswrapper[4744]: I0311 01:33:08.706524 4744 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-sb5rp" Mar 11 01:33:08 crc kubenswrapper[4744]: I0311 01:33:08.799122 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/dd5429a1-b64f-4748-a7d2-3255dfef8be2-catalog-content\") pod \"dd5429a1-b64f-4748-a7d2-3255dfef8be2\" (UID: \"dd5429a1-b64f-4748-a7d2-3255dfef8be2\") " Mar 11 01:33:08 crc kubenswrapper[4744]: I0311 01:33:08.799329 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vrm7t\" (UniqueName: \"kubernetes.io/projected/dd5429a1-b64f-4748-a7d2-3255dfef8be2-kube-api-access-vrm7t\") pod \"dd5429a1-b64f-4748-a7d2-3255dfef8be2\" (UID: \"dd5429a1-b64f-4748-a7d2-3255dfef8be2\") " Mar 11 01:33:08 crc kubenswrapper[4744]: I0311 01:33:08.800577 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/dd5429a1-b64f-4748-a7d2-3255dfef8be2-utilities\") pod \"dd5429a1-b64f-4748-a7d2-3255dfef8be2\" (UID: \"dd5429a1-b64f-4748-a7d2-3255dfef8be2\") " Mar 11 01:33:08 crc kubenswrapper[4744]: I0311 01:33:08.801284 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/dd5429a1-b64f-4748-a7d2-3255dfef8be2-utilities" (OuterVolumeSpecName: "utilities") pod "dd5429a1-b64f-4748-a7d2-3255dfef8be2" (UID: "dd5429a1-b64f-4748-a7d2-3255dfef8be2"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 11 01:33:08 crc kubenswrapper[4744]: I0311 01:33:08.806167 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dd5429a1-b64f-4748-a7d2-3255dfef8be2-kube-api-access-vrm7t" (OuterVolumeSpecName: "kube-api-access-vrm7t") pod "dd5429a1-b64f-4748-a7d2-3255dfef8be2" (UID: "dd5429a1-b64f-4748-a7d2-3255dfef8be2"). InnerVolumeSpecName "kube-api-access-vrm7t". 
PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 11 01:33:08 crc kubenswrapper[4744]: I0311 01:33:08.872016 4744 generic.go:334] "Generic (PLEG): container finished" podID="dd5429a1-b64f-4748-a7d2-3255dfef8be2" containerID="f8b41d433456520a78db3106ba009daad53bd5160d2fa9ae86332e98f09d0d4b" exitCode=0
Mar 11 01:33:08 crc kubenswrapper[4744]: I0311 01:33:08.872069 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-sb5rp" event={"ID":"dd5429a1-b64f-4748-a7d2-3255dfef8be2","Type":"ContainerDied","Data":"f8b41d433456520a78db3106ba009daad53bd5160d2fa9ae86332e98f09d0d4b"}
Mar 11 01:33:08 crc kubenswrapper[4744]: I0311 01:33:08.872085 4744 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-sb5rp"
Mar 11 01:33:08 crc kubenswrapper[4744]: I0311 01:33:08.872098 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-sb5rp" event={"ID":"dd5429a1-b64f-4748-a7d2-3255dfef8be2","Type":"ContainerDied","Data":"bc6797a56a200c1d91e1bf59f6572e52fd68a4af129dbea23b8c17d7f047e2a9"}
Mar 11 01:33:08 crc kubenswrapper[4744]: I0311 01:33:08.872115 4744 scope.go:117] "RemoveContainer" containerID="f8b41d433456520a78db3106ba009daad53bd5160d2fa9ae86332e98f09d0d4b"
Mar 11 01:33:08 crc kubenswrapper[4744]: I0311 01:33:08.893300 4744 scope.go:117] "RemoveContainer" containerID="1013db6b8ba91f60e124d624079574799fe20c50d4495f162f37630312abdce4"
Mar 11 01:33:08 crc kubenswrapper[4744]: I0311 01:33:08.895195 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/dd5429a1-b64f-4748-a7d2-3255dfef8be2-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "dd5429a1-b64f-4748-a7d2-3255dfef8be2" (UID: "dd5429a1-b64f-4748-a7d2-3255dfef8be2"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 11 01:33:08 crc kubenswrapper[4744]: I0311 01:33:08.902270 4744 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/dd5429a1-b64f-4748-a7d2-3255dfef8be2-catalog-content\") on node \"crc\" DevicePath \"\""
Mar 11 01:33:08 crc kubenswrapper[4744]: I0311 01:33:08.902299 4744 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vrm7t\" (UniqueName: \"kubernetes.io/projected/dd5429a1-b64f-4748-a7d2-3255dfef8be2-kube-api-access-vrm7t\") on node \"crc\" DevicePath \"\""
Mar 11 01:33:08 crc kubenswrapper[4744]: I0311 01:33:08.902310 4744 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/dd5429a1-b64f-4748-a7d2-3255dfef8be2-utilities\") on node \"crc\" DevicePath \"\""
Mar 11 01:33:08 crc kubenswrapper[4744]: I0311 01:33:08.921309 4744 scope.go:117] "RemoveContainer" containerID="e4b0be8a856054857d81bb12a87e9feeb015074c9eddfd885d544cff7b673624"
Mar 11 01:33:08 crc kubenswrapper[4744]: I0311 01:33:08.954462 4744 scope.go:117] "RemoveContainer" containerID="f8b41d433456520a78db3106ba009daad53bd5160d2fa9ae86332e98f09d0d4b"
Mar 11 01:33:08 crc kubenswrapper[4744]: E0311 01:33:08.955228 4744 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f8b41d433456520a78db3106ba009daad53bd5160d2fa9ae86332e98f09d0d4b\": container with ID starting with f8b41d433456520a78db3106ba009daad53bd5160d2fa9ae86332e98f09d0d4b not found: ID does not exist" containerID="f8b41d433456520a78db3106ba009daad53bd5160d2fa9ae86332e98f09d0d4b"
Mar 11 01:33:08 crc kubenswrapper[4744]: I0311 01:33:08.955279 4744 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f8b41d433456520a78db3106ba009daad53bd5160d2fa9ae86332e98f09d0d4b"} err="failed to get container status \"f8b41d433456520a78db3106ba009daad53bd5160d2fa9ae86332e98f09d0d4b\": rpc error: code = NotFound desc = could not find container \"f8b41d433456520a78db3106ba009daad53bd5160d2fa9ae86332e98f09d0d4b\": container with ID starting with f8b41d433456520a78db3106ba009daad53bd5160d2fa9ae86332e98f09d0d4b not found: ID does not exist"
Mar 11 01:33:08 crc kubenswrapper[4744]: I0311 01:33:08.955309 4744 scope.go:117] "RemoveContainer" containerID="1013db6b8ba91f60e124d624079574799fe20c50d4495f162f37630312abdce4"
Mar 11 01:33:08 crc kubenswrapper[4744]: E0311 01:33:08.955804 4744 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1013db6b8ba91f60e124d624079574799fe20c50d4495f162f37630312abdce4\": container with ID starting with 1013db6b8ba91f60e124d624079574799fe20c50d4495f162f37630312abdce4 not found: ID does not exist" containerID="1013db6b8ba91f60e124d624079574799fe20c50d4495f162f37630312abdce4"
Mar 11 01:33:08 crc kubenswrapper[4744]: I0311 01:33:08.955838 4744 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1013db6b8ba91f60e124d624079574799fe20c50d4495f162f37630312abdce4"} err="failed to get container status \"1013db6b8ba91f60e124d624079574799fe20c50d4495f162f37630312abdce4\": rpc error: code = NotFound desc = could not find container \"1013db6b8ba91f60e124d624079574799fe20c50d4495f162f37630312abdce4\": container with ID starting with 1013db6b8ba91f60e124d624079574799fe20c50d4495f162f37630312abdce4 not found: ID does not exist"
Mar 11 01:33:08 crc kubenswrapper[4744]: I0311 01:33:08.955860 4744 scope.go:117] "RemoveContainer" containerID="e4b0be8a856054857d81bb12a87e9feeb015074c9eddfd885d544cff7b673624"
Mar 11 01:33:08 crc kubenswrapper[4744]: E0311 01:33:08.956282 4744 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e4b0be8a856054857d81bb12a87e9feeb015074c9eddfd885d544cff7b673624\": container with ID starting with e4b0be8a856054857d81bb12a87e9feeb015074c9eddfd885d544cff7b673624 not found: ID does not exist" containerID="e4b0be8a856054857d81bb12a87e9feeb015074c9eddfd885d544cff7b673624"
Mar 11 01:33:08 crc kubenswrapper[4744]: I0311 01:33:08.956318 4744 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e4b0be8a856054857d81bb12a87e9feeb015074c9eddfd885d544cff7b673624"} err="failed to get container status \"e4b0be8a856054857d81bb12a87e9feeb015074c9eddfd885d544cff7b673624\": rpc error: code = NotFound desc = could not find container \"e4b0be8a856054857d81bb12a87e9feeb015074c9eddfd885d544cff7b673624\": container with ID starting with e4b0be8a856054857d81bb12a87e9feeb015074c9eddfd885d544cff7b673624 not found: ID does not exist"
Mar 11 01:33:09 crc kubenswrapper[4744]: I0311 01:33:09.227698 4744 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-sb5rp"]
Mar 11 01:33:09 crc kubenswrapper[4744]: I0311 01:33:09.230757 4744 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-sb5rp"]
Mar 11 01:33:09 crc kubenswrapper[4744]: I0311 01:33:09.987904 4744 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="dd5429a1-b64f-4748-a7d2-3255dfef8be2" path="/var/lib/kubelet/pods/dd5429a1-b64f-4748-a7d2-3255dfef8be2/volumes"
Mar 11 01:33:21 crc kubenswrapper[4744]: I0311 01:33:21.975495 4744 scope.go:117] "RemoveContainer" containerID="f347064f0cb75f8bb43f62de0ab1edab4843ef09d878578850df37cec024ff09"
Mar 11 01:33:21 crc kubenswrapper[4744]: E0311 01:33:21.976792 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-678nx_openshift-machine-config-operator(a15dc7ac-7c34-4135-b6eb-a85122800ce9)\"" pod="openshift-machine-config-operator/machine-config-daemon-678nx" podUID="a15dc7ac-7c34-4135-b6eb-a85122800ce9"
Mar 11 01:33:34 crc kubenswrapper[4744]: I0311 01:33:34.975248 4744 scope.go:117] "RemoveContainer" containerID="f347064f0cb75f8bb43f62de0ab1edab4843ef09d878578850df37cec024ff09"
Mar 11 01:33:34 crc kubenswrapper[4744]: E0311 01:33:34.976718 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-678nx_openshift-machine-config-operator(a15dc7ac-7c34-4135-b6eb-a85122800ce9)\"" pod="openshift-machine-config-operator/machine-config-daemon-678nx" podUID="a15dc7ac-7c34-4135-b6eb-a85122800ce9"
Mar 11 01:33:49 crc kubenswrapper[4744]: I0311 01:33:49.974887 4744 scope.go:117] "RemoveContainer" containerID="f347064f0cb75f8bb43f62de0ab1edab4843ef09d878578850df37cec024ff09"
Mar 11 01:33:49 crc kubenswrapper[4744]: E0311 01:33:49.975858 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-678nx_openshift-machine-config-operator(a15dc7ac-7c34-4135-b6eb-a85122800ce9)\"" pod="openshift-machine-config-operator/machine-config-daemon-678nx" podUID="a15dc7ac-7c34-4135-b6eb-a85122800ce9"
Mar 11 01:34:00 crc kubenswrapper[4744]: I0311 01:34:00.163970 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29553214-gz5hd"]
Mar 11 01:34:00 crc kubenswrapper[4744]: E0311 01:34:00.165019 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dd5429a1-b64f-4748-a7d2-3255dfef8be2" containerName="registry-server"
Mar 11 01:34:00 crc kubenswrapper[4744]: I0311 01:34:00.165050 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="dd5429a1-b64f-4748-a7d2-3255dfef8be2" containerName="registry-server"
Mar 11 01:34:00 crc kubenswrapper[4744]: E0311 01:34:00.165087 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dd5429a1-b64f-4748-a7d2-3255dfef8be2" containerName="extract-utilities"
Mar 11 01:34:00 crc kubenswrapper[4744]: I0311 01:34:00.165105 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="dd5429a1-b64f-4748-a7d2-3255dfef8be2" containerName="extract-utilities"
Mar 11 01:34:00 crc kubenswrapper[4744]: E0311 01:34:00.165137 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dd5429a1-b64f-4748-a7d2-3255dfef8be2" containerName="extract-content"
Mar 11 01:34:00 crc kubenswrapper[4744]: I0311 01:34:00.165156 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="dd5429a1-b64f-4748-a7d2-3255dfef8be2" containerName="extract-content"
Mar 11 01:34:00 crc kubenswrapper[4744]: I0311 01:34:00.165441 4744 memory_manager.go:354] "RemoveStaleState removing state" podUID="dd5429a1-b64f-4748-a7d2-3255dfef8be2" containerName="registry-server"
Mar 11 01:34:00 crc kubenswrapper[4744]: I0311 01:34:00.166874 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29553214-gz5hd"
Mar 11 01:34:00 crc kubenswrapper[4744]: I0311 01:34:00.171726 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Mar 11 01:34:00 crc kubenswrapper[4744]: I0311 01:34:00.172426 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-rs77s"
Mar 11 01:34:00 crc kubenswrapper[4744]: I0311 01:34:00.178736 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Mar 11 01:34:00 crc kubenswrapper[4744]: I0311 01:34:00.184103 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29553214-gz5hd"]
Mar 11 01:34:00 crc kubenswrapper[4744]: I0311 01:34:00.313060 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pd6rq\" (UniqueName: \"kubernetes.io/projected/62a087e5-17fb-4cea-97b1-85438d94467d-kube-api-access-pd6rq\") pod \"auto-csr-approver-29553214-gz5hd\" (UID: \"62a087e5-17fb-4cea-97b1-85438d94467d\") " pod="openshift-infra/auto-csr-approver-29553214-gz5hd"
Mar 11 01:34:00 crc kubenswrapper[4744]: I0311 01:34:00.414349 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pd6rq\" (UniqueName: \"kubernetes.io/projected/62a087e5-17fb-4cea-97b1-85438d94467d-kube-api-access-pd6rq\") pod \"auto-csr-approver-29553214-gz5hd\" (UID: \"62a087e5-17fb-4cea-97b1-85438d94467d\") " pod="openshift-infra/auto-csr-approver-29553214-gz5hd"
Mar 11 01:34:00 crc kubenswrapper[4744]: I0311 01:34:00.455807 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pd6rq\" (UniqueName: \"kubernetes.io/projected/62a087e5-17fb-4cea-97b1-85438d94467d-kube-api-access-pd6rq\") pod \"auto-csr-approver-29553214-gz5hd\" (UID: \"62a087e5-17fb-4cea-97b1-85438d94467d\") " pod="openshift-infra/auto-csr-approver-29553214-gz5hd"
Mar 11 01:34:00 crc kubenswrapper[4744]: I0311 01:34:00.498661 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29553214-gz5hd"
Mar 11 01:34:01 crc kubenswrapper[4744]: I0311 01:34:01.004815 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29553214-gz5hd"]
Mar 11 01:34:01 crc kubenswrapper[4744]: W0311 01:34:01.009749 4744 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod62a087e5_17fb_4cea_97b1_85438d94467d.slice/crio-115778fc5ddf7555b1c6707cbcdfaa60a81e840bc088db951f43de838f991cea WatchSource:0}: Error finding container 115778fc5ddf7555b1c6707cbcdfaa60a81e840bc088db951f43de838f991cea: Status 404 returned error can't find the container with id 115778fc5ddf7555b1c6707cbcdfaa60a81e840bc088db951f43de838f991cea
Mar 11 01:34:01 crc kubenswrapper[4744]: I0311 01:34:01.346244 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29553214-gz5hd" event={"ID":"62a087e5-17fb-4cea-97b1-85438d94467d","Type":"ContainerStarted","Data":"115778fc5ddf7555b1c6707cbcdfaa60a81e840bc088db951f43de838f991cea"}
Mar 11 01:34:02 crc kubenswrapper[4744]: I0311 01:34:02.357506 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29553214-gz5hd" event={"ID":"62a087e5-17fb-4cea-97b1-85438d94467d","Type":"ContainerStarted","Data":"51fc74673b23717f2a6178321aad21013fd7149b90f411723b1fee62d4ba20fc"}
Mar 11 01:34:02 crc kubenswrapper[4744]: I0311 01:34:02.380951 4744 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29553214-gz5hd" podStartSLOduration=1.42480081 podStartE2EDuration="2.380924618s" podCreationTimestamp="2026-03-11 01:34:00 +0000 UTC" firstStartedPulling="2026-03-11 01:34:01.013941357 +0000 UTC m=+2397.818158992" lastFinishedPulling="2026-03-11 01:34:01.970065155 +0000 UTC m=+2398.774282800" observedRunningTime="2026-03-11 01:34:02.375039866 +0000 UTC m=+2399.179257511" watchObservedRunningTime="2026-03-11 01:34:02.380924618 +0000 UTC m=+2399.185142263"
Mar 11 01:34:02 crc kubenswrapper[4744]: I0311 01:34:02.975117 4744 scope.go:117] "RemoveContainer" containerID="f347064f0cb75f8bb43f62de0ab1edab4843ef09d878578850df37cec024ff09"
Mar 11 01:34:02 crc kubenswrapper[4744]: E0311 01:34:02.975582 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-678nx_openshift-machine-config-operator(a15dc7ac-7c34-4135-b6eb-a85122800ce9)\"" pod="openshift-machine-config-operator/machine-config-daemon-678nx" podUID="a15dc7ac-7c34-4135-b6eb-a85122800ce9"
Mar 11 01:34:03 crc kubenswrapper[4744]: I0311 01:34:03.392449 4744 generic.go:334] "Generic (PLEG): container finished" podID="62a087e5-17fb-4cea-97b1-85438d94467d" containerID="51fc74673b23717f2a6178321aad21013fd7149b90f411723b1fee62d4ba20fc" exitCode=0
Mar 11 01:34:03 crc kubenswrapper[4744]: I0311 01:34:03.392496 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29553214-gz5hd" event={"ID":"62a087e5-17fb-4cea-97b1-85438d94467d","Type":"ContainerDied","Data":"51fc74673b23717f2a6178321aad21013fd7149b90f411723b1fee62d4ba20fc"}
Mar 11 01:34:04 crc kubenswrapper[4744]: I0311 01:34:04.809370 4744 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29553214-gz5hd"
Mar 11 01:34:04 crc kubenswrapper[4744]: I0311 01:34:04.995554 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pd6rq\" (UniqueName: \"kubernetes.io/projected/62a087e5-17fb-4cea-97b1-85438d94467d-kube-api-access-pd6rq\") pod \"62a087e5-17fb-4cea-97b1-85438d94467d\" (UID: \"62a087e5-17fb-4cea-97b1-85438d94467d\") "
Mar 11 01:34:05 crc kubenswrapper[4744]: I0311 01:34:05.003804 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/62a087e5-17fb-4cea-97b1-85438d94467d-kube-api-access-pd6rq" (OuterVolumeSpecName: "kube-api-access-pd6rq") pod "62a087e5-17fb-4cea-97b1-85438d94467d" (UID: "62a087e5-17fb-4cea-97b1-85438d94467d"). InnerVolumeSpecName "kube-api-access-pd6rq". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 11 01:34:05 crc kubenswrapper[4744]: I0311 01:34:05.098291 4744 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pd6rq\" (UniqueName: \"kubernetes.io/projected/62a087e5-17fb-4cea-97b1-85438d94467d-kube-api-access-pd6rq\") on node \"crc\" DevicePath \"\""
Mar 11 01:34:05 crc kubenswrapper[4744]: I0311 01:34:05.428890 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29553214-gz5hd" event={"ID":"62a087e5-17fb-4cea-97b1-85438d94467d","Type":"ContainerDied","Data":"115778fc5ddf7555b1c6707cbcdfaa60a81e840bc088db951f43de838f991cea"}
Mar 11 01:34:05 crc kubenswrapper[4744]: I0311 01:34:05.428967 4744 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="115778fc5ddf7555b1c6707cbcdfaa60a81e840bc088db951f43de838f991cea"
Mar 11 01:34:05 crc kubenswrapper[4744]: I0311 01:34:05.429030 4744 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29553214-gz5hd"
Mar 11 01:34:05 crc kubenswrapper[4744]: I0311 01:34:05.471042 4744 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29553208-djwwv"]
Mar 11 01:34:05 crc kubenswrapper[4744]: I0311 01:34:05.486357 4744 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29553208-djwwv"]
Mar 11 01:34:05 crc kubenswrapper[4744]: I0311 01:34:05.991041 4744 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="16ee6953-f4fc-4b81-9141-736585fb3c37" path="/var/lib/kubelet/pods/16ee6953-f4fc-4b81-9141-736585fb3c37/volumes"
Mar 11 01:34:11 crc kubenswrapper[4744]: I0311 01:34:11.816468 4744 scope.go:117] "RemoveContainer" containerID="d395d4fdf3fd77e7ea06c8e82cf9f9f286bbcf698facf4a1ddcb74769673d81f"
Mar 11 01:34:13 crc kubenswrapper[4744]: I0311 01:34:13.985185 4744 scope.go:117] "RemoveContainer" containerID="f347064f0cb75f8bb43f62de0ab1edab4843ef09d878578850df37cec024ff09"
Mar 11 01:34:13 crc kubenswrapper[4744]: E0311 01:34:13.986172 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-678nx_openshift-machine-config-operator(a15dc7ac-7c34-4135-b6eb-a85122800ce9)\"" pod="openshift-machine-config-operator/machine-config-daemon-678nx" podUID="a15dc7ac-7c34-4135-b6eb-a85122800ce9"
Mar 11 01:34:25 crc kubenswrapper[4744]: I0311 01:34:25.975171 4744 scope.go:117] "RemoveContainer" containerID="f347064f0cb75f8bb43f62de0ab1edab4843ef09d878578850df37cec024ff09"
Mar 11 01:34:25 crc kubenswrapper[4744]: E0311 01:34:25.976475 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-678nx_openshift-machine-config-operator(a15dc7ac-7c34-4135-b6eb-a85122800ce9)\"" pod="openshift-machine-config-operator/machine-config-daemon-678nx" podUID="a15dc7ac-7c34-4135-b6eb-a85122800ce9"
Mar 11 01:34:37 crc kubenswrapper[4744]: I0311 01:34:37.975963 4744 scope.go:117] "RemoveContainer" containerID="f347064f0cb75f8bb43f62de0ab1edab4843ef09d878578850df37cec024ff09"
Mar 11 01:34:37 crc kubenswrapper[4744]: E0311 01:34:37.976986 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-678nx_openshift-machine-config-operator(a15dc7ac-7c34-4135-b6eb-a85122800ce9)\"" pod="openshift-machine-config-operator/machine-config-daemon-678nx" podUID="a15dc7ac-7c34-4135-b6eb-a85122800ce9"
Mar 11 01:34:48 crc kubenswrapper[4744]: I0311 01:34:48.974345 4744 scope.go:117] "RemoveContainer" containerID="f347064f0cb75f8bb43f62de0ab1edab4843ef09d878578850df37cec024ff09"
Mar 11 01:34:48 crc kubenswrapper[4744]: E0311 01:34:48.975041 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-678nx_openshift-machine-config-operator(a15dc7ac-7c34-4135-b6eb-a85122800ce9)\"" pod="openshift-machine-config-operator/machine-config-daemon-678nx" podUID="a15dc7ac-7c34-4135-b6eb-a85122800ce9"
Mar 11 01:35:00 crc kubenswrapper[4744]: I0311 01:35:00.975759 4744 scope.go:117] "RemoveContainer" containerID="f347064f0cb75f8bb43f62de0ab1edab4843ef09d878578850df37cec024ff09"
Mar 11 01:35:00 crc kubenswrapper[4744]: E0311 01:35:00.976771 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-678nx_openshift-machine-config-operator(a15dc7ac-7c34-4135-b6eb-a85122800ce9)\"" pod="openshift-machine-config-operator/machine-config-daemon-678nx" podUID="a15dc7ac-7c34-4135-b6eb-a85122800ce9"
Mar 11 01:35:11 crc kubenswrapper[4744]: I0311 01:35:11.975094 4744 scope.go:117] "RemoveContainer" containerID="f347064f0cb75f8bb43f62de0ab1edab4843ef09d878578850df37cec024ff09"
Mar 11 01:35:11 crc kubenswrapper[4744]: E0311 01:35:11.976191 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-678nx_openshift-machine-config-operator(a15dc7ac-7c34-4135-b6eb-a85122800ce9)\"" pod="openshift-machine-config-operator/machine-config-daemon-678nx" podUID="a15dc7ac-7c34-4135-b6eb-a85122800ce9"
Mar 11 01:35:22 crc kubenswrapper[4744]: I0311 01:35:22.975131 4744 scope.go:117] "RemoveContainer" containerID="f347064f0cb75f8bb43f62de0ab1edab4843ef09d878578850df37cec024ff09"
Mar 11 01:35:22 crc kubenswrapper[4744]: E0311 01:35:22.975700 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-678nx_openshift-machine-config-operator(a15dc7ac-7c34-4135-b6eb-a85122800ce9)\"" pod="openshift-machine-config-operator/machine-config-daemon-678nx" podUID="a15dc7ac-7c34-4135-b6eb-a85122800ce9"
Mar 11 01:35:35 crc kubenswrapper[4744]: I0311 01:35:35.975330 4744 scope.go:117] "RemoveContainer" containerID="f347064f0cb75f8bb43f62de0ab1edab4843ef09d878578850df37cec024ff09"
Mar 11 01:35:35 crc kubenswrapper[4744]: E0311 01:35:35.976424 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-678nx_openshift-machine-config-operator(a15dc7ac-7c34-4135-b6eb-a85122800ce9)\"" pod="openshift-machine-config-operator/machine-config-daemon-678nx" podUID="a15dc7ac-7c34-4135-b6eb-a85122800ce9"
Mar 11 01:35:49 crc kubenswrapper[4744]: I0311 01:35:49.975548 4744 scope.go:117] "RemoveContainer" containerID="f347064f0cb75f8bb43f62de0ab1edab4843ef09d878578850df37cec024ff09"
Mar 11 01:35:49 crc kubenswrapper[4744]: E0311 01:35:49.976610 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-678nx_openshift-machine-config-operator(a15dc7ac-7c34-4135-b6eb-a85122800ce9)\"" pod="openshift-machine-config-operator/machine-config-daemon-678nx" podUID="a15dc7ac-7c34-4135-b6eb-a85122800ce9"
Mar 11 01:36:00 crc kubenswrapper[4744]: I0311 01:36:00.170150 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29553216-vrss5"]
Mar 11 01:36:00 crc kubenswrapper[4744]: E0311 01:36:00.171739 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="62a087e5-17fb-4cea-97b1-85438d94467d" containerName="oc"
Mar 11 01:36:00 crc kubenswrapper[4744]: I0311 01:36:00.171768 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="62a087e5-17fb-4cea-97b1-85438d94467d" containerName="oc"
Mar 11 01:36:00 crc kubenswrapper[4744]: I0311 01:36:00.172244 4744 memory_manager.go:354] "RemoveStaleState removing state" podUID="62a087e5-17fb-4cea-97b1-85438d94467d" containerName="oc"
Mar 11 01:36:00 crc kubenswrapper[4744]: I0311 01:36:00.173411 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29553216-vrss5"
Mar 11 01:36:00 crc kubenswrapper[4744]: I0311 01:36:00.176507 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-rs77s"
Mar 11 01:36:00 crc kubenswrapper[4744]: I0311 01:36:00.177781 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29553216-vrss5"]
Mar 11 01:36:00 crc kubenswrapper[4744]: I0311 01:36:00.178704 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Mar 11 01:36:00 crc kubenswrapper[4744]: I0311 01:36:00.180639 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Mar 11 01:36:00 crc kubenswrapper[4744]: I0311 01:36:00.338635 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sxbg4\" (UniqueName: \"kubernetes.io/projected/ca000869-6b1d-4d16-93e0-c5e33148c9bd-kube-api-access-sxbg4\") pod \"auto-csr-approver-29553216-vrss5\" (UID: \"ca000869-6b1d-4d16-93e0-c5e33148c9bd\") " pod="openshift-infra/auto-csr-approver-29553216-vrss5"
Mar 11 01:36:00 crc kubenswrapper[4744]: I0311 01:36:00.440651 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sxbg4\" (UniqueName: \"kubernetes.io/projected/ca000869-6b1d-4d16-93e0-c5e33148c9bd-kube-api-access-sxbg4\") pod \"auto-csr-approver-29553216-vrss5\" (UID: \"ca000869-6b1d-4d16-93e0-c5e33148c9bd\") " pod="openshift-infra/auto-csr-approver-29553216-vrss5"
Mar 11 01:36:00 crc kubenswrapper[4744]: I0311 01:36:00.475966 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sxbg4\" (UniqueName: \"kubernetes.io/projected/ca000869-6b1d-4d16-93e0-c5e33148c9bd-kube-api-access-sxbg4\") pod \"auto-csr-approver-29553216-vrss5\" (UID: \"ca000869-6b1d-4d16-93e0-c5e33148c9bd\") " pod="openshift-infra/auto-csr-approver-29553216-vrss5"
Mar 11 01:36:00 crc kubenswrapper[4744]: I0311 01:36:00.511073 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29553216-vrss5"
Mar 11 01:36:00 crc kubenswrapper[4744]: I0311 01:36:00.794046 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29553216-vrss5"]
Mar 11 01:36:00 crc kubenswrapper[4744]: W0311 01:36:00.805816 4744 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podca000869_6b1d_4d16_93e0_c5e33148c9bd.slice/crio-db54e72bd1fb2357c141b85a818245359d822db6ef140a7667a3ff3d66eaf265 WatchSource:0}: Error finding container db54e72bd1fb2357c141b85a818245359d822db6ef140a7667a3ff3d66eaf265: Status 404 returned error can't find the container with id db54e72bd1fb2357c141b85a818245359d822db6ef140a7667a3ff3d66eaf265
Mar 11 01:36:01 crc kubenswrapper[4744]: I0311 01:36:01.725897 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29553216-vrss5" event={"ID":"ca000869-6b1d-4d16-93e0-c5e33148c9bd","Type":"ContainerStarted","Data":"db54e72bd1fb2357c141b85a818245359d822db6ef140a7667a3ff3d66eaf265"}
Mar 11 01:36:02 crc kubenswrapper[4744]: I0311 01:36:02.737993 4744 generic.go:334] "Generic (PLEG): container finished" podID="ca000869-6b1d-4d16-93e0-c5e33148c9bd" containerID="b6f0e83b8a71c3b89a6c17e9e36032dc285887855b1d3aa8361c6b3525946c3e" exitCode=0
Mar 11 01:36:02 crc kubenswrapper[4744]: I0311 01:36:02.738058 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29553216-vrss5" event={"ID":"ca000869-6b1d-4d16-93e0-c5e33148c9bd","Type":"ContainerDied","Data":"b6f0e83b8a71c3b89a6c17e9e36032dc285887855b1d3aa8361c6b3525946c3e"}
Mar 11 01:36:02 crc kubenswrapper[4744]: I0311 01:36:02.975955 4744 scope.go:117] "RemoveContainer" containerID="f347064f0cb75f8bb43f62de0ab1edab4843ef09d878578850df37cec024ff09"
Mar 11 01:36:02 crc kubenswrapper[4744]: E0311 01:36:02.976330 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-678nx_openshift-machine-config-operator(a15dc7ac-7c34-4135-b6eb-a85122800ce9)\"" pod="openshift-machine-config-operator/machine-config-daemon-678nx" podUID="a15dc7ac-7c34-4135-b6eb-a85122800ce9"
Mar 11 01:36:04 crc kubenswrapper[4744]: I0311 01:36:04.086474 4744 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29553216-vrss5"
Mar 11 01:36:04 crc kubenswrapper[4744]: I0311 01:36:04.201599 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sxbg4\" (UniqueName: \"kubernetes.io/projected/ca000869-6b1d-4d16-93e0-c5e33148c9bd-kube-api-access-sxbg4\") pod \"ca000869-6b1d-4d16-93e0-c5e33148c9bd\" (UID: \"ca000869-6b1d-4d16-93e0-c5e33148c9bd\") "
Mar 11 01:36:04 crc kubenswrapper[4744]: I0311 01:36:04.208582 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ca000869-6b1d-4d16-93e0-c5e33148c9bd-kube-api-access-sxbg4" (OuterVolumeSpecName: "kube-api-access-sxbg4") pod "ca000869-6b1d-4d16-93e0-c5e33148c9bd" (UID: "ca000869-6b1d-4d16-93e0-c5e33148c9bd"). InnerVolumeSpecName "kube-api-access-sxbg4". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 11 01:36:04 crc kubenswrapper[4744]: I0311 01:36:04.303605 4744 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sxbg4\" (UniqueName: \"kubernetes.io/projected/ca000869-6b1d-4d16-93e0-c5e33148c9bd-kube-api-access-sxbg4\") on node \"crc\" DevicePath \"\""
Mar 11 01:36:04 crc kubenswrapper[4744]: I0311 01:36:04.754372 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29553216-vrss5" event={"ID":"ca000869-6b1d-4d16-93e0-c5e33148c9bd","Type":"ContainerDied","Data":"db54e72bd1fb2357c141b85a818245359d822db6ef140a7667a3ff3d66eaf265"}
Mar 11 01:36:04 crc kubenswrapper[4744]: I0311 01:36:04.754759 4744 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="db54e72bd1fb2357c141b85a818245359d822db6ef140a7667a3ff3d66eaf265"
Mar 11 01:36:04 crc kubenswrapper[4744]: I0311 01:36:04.754410 4744 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29553216-vrss5"
Mar 11 01:36:05 crc kubenswrapper[4744]: I0311 01:36:05.154311 4744 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29553210-4vgls"]
Mar 11 01:36:05 crc kubenswrapper[4744]: I0311 01:36:05.165214 4744 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29553210-4vgls"]
Mar 11 01:36:05 crc kubenswrapper[4744]: I0311 01:36:05.984592 4744 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3d4c219b-834d-47cd-97aa-cf789dd2b8ce" path="/var/lib/kubelet/pods/3d4c219b-834d-47cd-97aa-cf789dd2b8ce/volumes"
Mar 11 01:36:11 crc kubenswrapper[4744]: I0311 01:36:11.922649 4744 scope.go:117] "RemoveContainer" containerID="6237930b99e4514f1ce9b020264cf7e1ba6c76e65bf7a49d064b5217917812bf"
Mar 11 01:36:14 crc kubenswrapper[4744]: I0311 01:36:14.975009 4744 scope.go:117] "RemoveContainer" containerID="f347064f0cb75f8bb43f62de0ab1edab4843ef09d878578850df37cec024ff09"
Mar 11 01:36:14 crc kubenswrapper[4744]: E0311 01:36:14.976276 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-678nx_openshift-machine-config-operator(a15dc7ac-7c34-4135-b6eb-a85122800ce9)\"" pod="openshift-machine-config-operator/machine-config-daemon-678nx" podUID="a15dc7ac-7c34-4135-b6eb-a85122800ce9"
Mar 11 01:36:27 crc kubenswrapper[4744]: I0311 01:36:27.975903 4744 scope.go:117] "RemoveContainer" containerID="f347064f0cb75f8bb43f62de0ab1edab4843ef09d878578850df37cec024ff09"
Mar 11 01:36:27 crc kubenswrapper[4744]: E0311 01:36:27.976372 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-678nx_openshift-machine-config-operator(a15dc7ac-7c34-4135-b6eb-a85122800ce9)\"" pod="openshift-machine-config-operator/machine-config-daemon-678nx" podUID="a15dc7ac-7c34-4135-b6eb-a85122800ce9"
Mar 11 01:36:41 crc kubenswrapper[4744]: I0311 01:36:41.975319 4744 scope.go:117] "RemoveContainer" containerID="f347064f0cb75f8bb43f62de0ab1edab4843ef09d878578850df37cec024ff09"
Mar 11 01:36:41 crc kubenswrapper[4744]: E0311 01:36:41.976292 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-678nx_openshift-machine-config-operator(a15dc7ac-7c34-4135-b6eb-a85122800ce9)\"" pod="openshift-machine-config-operator/machine-config-daemon-678nx" podUID="a15dc7ac-7c34-4135-b6eb-a85122800ce9"
Mar 11 01:36:52 crc kubenswrapper[4744]: I0311 01:36:52.974818 4744 scope.go:117] "RemoveContainer" containerID="f347064f0cb75f8bb43f62de0ab1edab4843ef09d878578850df37cec024ff09"
Mar 11 01:36:53 crc kubenswrapper[4744]: I0311 01:36:53.201665 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-678nx" event={"ID":"a15dc7ac-7c34-4135-b6eb-a85122800ce9","Type":"ContainerStarted","Data":"3ed3bf9772d622baceff46dfff72fd72ef269c51d1877ba2e252f727e4ae9695"}
Mar 11 01:38:00 crc kubenswrapper[4744]: I0311 01:38:00.171968 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29553218-xvms7"]
Mar 11 01:38:00 crc kubenswrapper[4744]: E0311 01:38:00.172858 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ca000869-6b1d-4d16-93e0-c5e33148c9bd" containerName="oc"
Mar 11 01:38:00 crc kubenswrapper[4744]: I0311 01:38:00.172874 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="ca000869-6b1d-4d16-93e0-c5e33148c9bd" containerName="oc"
Mar 11 01:38:00 crc kubenswrapper[4744]: I0311 01:38:00.173058 4744 memory_manager.go:354] "RemoveStaleState removing state" podUID="ca000869-6b1d-4d16-93e0-c5e33148c9bd" containerName="oc"
Mar 11 01:38:00 crc kubenswrapper[4744]: I0311 01:38:00.173666 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29553218-xvms7"
Mar 11 01:38:00 crc kubenswrapper[4744]: I0311 01:38:00.176901 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Mar 11 01:38:00 crc kubenswrapper[4744]: I0311 01:38:00.177035 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Mar 11 01:38:00 crc kubenswrapper[4744]: I0311 01:38:00.177985 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-rs77s"
Mar 11 01:38:00 crc kubenswrapper[4744]: I0311 01:38:00.193010 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29553218-xvms7"]
Mar 11 01:38:00 crc kubenswrapper[4744]: I0311 01:38:00.238296 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kr8jj\" (UniqueName: \"kubernetes.io/projected/27a28442-2555-46ff-9a86-a8fe2e583e5e-kube-api-access-kr8jj\") pod \"auto-csr-approver-29553218-xvms7\" (UID: \"27a28442-2555-46ff-9a86-a8fe2e583e5e\") " pod="openshift-infra/auto-csr-approver-29553218-xvms7"
Mar 11 01:38:00 crc kubenswrapper[4744]: I0311 01:38:00.340400 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kr8jj\" (UniqueName: \"kubernetes.io/projected/27a28442-2555-46ff-9a86-a8fe2e583e5e-kube-api-access-kr8jj\") pod \"auto-csr-approver-29553218-xvms7\" (UID: \"27a28442-2555-46ff-9a86-a8fe2e583e5e\") " pod="openshift-infra/auto-csr-approver-29553218-xvms7"
Mar 11 01:38:00 crc kubenswrapper[4744]: I0311 01:38:00.364487 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kr8jj\" (UniqueName: \"kubernetes.io/projected/27a28442-2555-46ff-9a86-a8fe2e583e5e-kube-api-access-kr8jj\") pod \"auto-csr-approver-29553218-xvms7\" (UID: \"27a28442-2555-46ff-9a86-a8fe2e583e5e\") " pod="openshift-infra/auto-csr-approver-29553218-xvms7"
Mar 11 01:38:00 crc kubenswrapper[4744]: I0311 01:38:00.497273 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29553218-xvms7"
Mar 11 01:38:00 crc kubenswrapper[4744]: I0311 01:38:00.775999 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29553218-xvms7"]
Mar 11 01:38:00 crc kubenswrapper[4744]: W0311 01:38:00.784602 4744 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod27a28442_2555_46ff_9a86_a8fe2e583e5e.slice/crio-8f9a3f012c91e3bbd998c4714758f1d44dcd6d411a9db427428fb472f09d0a16 WatchSource:0}: Error finding container 8f9a3f012c91e3bbd998c4714758f1d44dcd6d411a9db427428fb472f09d0a16: Status 404 returned error can't find the container with id 8f9a3f012c91e3bbd998c4714758f1d44dcd6d411a9db427428fb472f09d0a16
Mar 11 01:38:00 crc kubenswrapper[4744]: I0311 01:38:00.787543 4744 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Mar 11 01:38:00 crc kubenswrapper[4744]: I0311 01:38:00.815668 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29553218-xvms7" event={"ID":"27a28442-2555-46ff-9a86-a8fe2e583e5e","Type":"ContainerStarted","Data":"8f9a3f012c91e3bbd998c4714758f1d44dcd6d411a9db427428fb472f09d0a16"}
Mar 11 01:38:02 crc kubenswrapper[4744]: I0311 01:38:02.836864 4744 generic.go:334] "Generic (PLEG): container finished" podID="27a28442-2555-46ff-9a86-a8fe2e583e5e" containerID="d78a15f9940a6fd40154c6e1f66ce64b190f37f3669b4e84772a754903a34f60" exitCode=0
Mar 11 01:38:02 crc kubenswrapper[4744]: I0311 01:38:02.837000 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29553218-xvms7"
event={"ID":"27a28442-2555-46ff-9a86-a8fe2e583e5e","Type":"ContainerDied","Data":"d78a15f9940a6fd40154c6e1f66ce64b190f37f3669b4e84772a754903a34f60"} Mar 11 01:38:04 crc kubenswrapper[4744]: I0311 01:38:04.151987 4744 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29553218-xvms7" Mar 11 01:38:04 crc kubenswrapper[4744]: I0311 01:38:04.204614 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kr8jj\" (UniqueName: \"kubernetes.io/projected/27a28442-2555-46ff-9a86-a8fe2e583e5e-kube-api-access-kr8jj\") pod \"27a28442-2555-46ff-9a86-a8fe2e583e5e\" (UID: \"27a28442-2555-46ff-9a86-a8fe2e583e5e\") " Mar 11 01:38:04 crc kubenswrapper[4744]: I0311 01:38:04.214118 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/27a28442-2555-46ff-9a86-a8fe2e583e5e-kube-api-access-kr8jj" (OuterVolumeSpecName: "kube-api-access-kr8jj") pod "27a28442-2555-46ff-9a86-a8fe2e583e5e" (UID: "27a28442-2555-46ff-9a86-a8fe2e583e5e"). InnerVolumeSpecName "kube-api-access-kr8jj". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 01:38:04 crc kubenswrapper[4744]: I0311 01:38:04.307271 4744 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kr8jj\" (UniqueName: \"kubernetes.io/projected/27a28442-2555-46ff-9a86-a8fe2e583e5e-kube-api-access-kr8jj\") on node \"crc\" DevicePath \"\"" Mar 11 01:38:04 crc kubenswrapper[4744]: I0311 01:38:04.852838 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29553218-xvms7" event={"ID":"27a28442-2555-46ff-9a86-a8fe2e583e5e","Type":"ContainerDied","Data":"8f9a3f012c91e3bbd998c4714758f1d44dcd6d411a9db427428fb472f09d0a16"} Mar 11 01:38:04 crc kubenswrapper[4744]: I0311 01:38:04.852877 4744 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8f9a3f012c91e3bbd998c4714758f1d44dcd6d411a9db427428fb472f09d0a16" Mar 11 01:38:04 crc kubenswrapper[4744]: I0311 01:38:04.852885 4744 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29553218-xvms7" Mar 11 01:38:05 crc kubenswrapper[4744]: I0311 01:38:05.241678 4744 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29553212-7745p"] Mar 11 01:38:05 crc kubenswrapper[4744]: I0311 01:38:05.249256 4744 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29553212-7745p"] Mar 11 01:38:05 crc kubenswrapper[4744]: I0311 01:38:05.985676 4744 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8d898cd4-7543-4bff-b1e3-09ec55c8f223" path="/var/lib/kubelet/pods/8d898cd4-7543-4bff-b1e3-09ec55c8f223/volumes" Mar 11 01:38:12 crc kubenswrapper[4744]: I0311 01:38:12.027430 4744 scope.go:117] "RemoveContainer" containerID="e017e42edc068278bde4a088efa25c5ed3e0a76d2c1ff8c5c049bb90b9558345" Mar 11 01:39:12 crc kubenswrapper[4744]: I0311 01:39:12.409146 4744 patch_prober.go:28] interesting pod/machine-config-daemon-678nx 
container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 11 01:39:12 crc kubenswrapper[4744]: I0311 01:39:12.409872 4744 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-678nx" podUID="a15dc7ac-7c34-4135-b6eb-a85122800ce9" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 11 01:39:42 crc kubenswrapper[4744]: I0311 01:39:42.409345 4744 patch_prober.go:28] interesting pod/machine-config-daemon-678nx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 11 01:39:42 crc kubenswrapper[4744]: I0311 01:39:42.410100 4744 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-678nx" podUID="a15dc7ac-7c34-4135-b6eb-a85122800ce9" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 11 01:40:00 crc kubenswrapper[4744]: I0311 01:40:00.169925 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29553220-8ztsx"] Mar 11 01:40:00 crc kubenswrapper[4744]: E0311 01:40:00.170757 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="27a28442-2555-46ff-9a86-a8fe2e583e5e" containerName="oc" Mar 11 01:40:00 crc kubenswrapper[4744]: I0311 01:40:00.170771 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="27a28442-2555-46ff-9a86-a8fe2e583e5e" containerName="oc" Mar 11 01:40:00 crc kubenswrapper[4744]: I0311 01:40:00.170948 4744 memory_manager.go:354] 
"RemoveStaleState removing state" podUID="27a28442-2555-46ff-9a86-a8fe2e583e5e" containerName="oc" Mar 11 01:40:00 crc kubenswrapper[4744]: I0311 01:40:00.171445 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29553220-8ztsx" Mar 11 01:40:00 crc kubenswrapper[4744]: I0311 01:40:00.174894 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-rs77s" Mar 11 01:40:00 crc kubenswrapper[4744]: I0311 01:40:00.175631 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 11 01:40:00 crc kubenswrapper[4744]: I0311 01:40:00.179982 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 11 01:40:00 crc kubenswrapper[4744]: I0311 01:40:00.188823 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29553220-8ztsx"] Mar 11 01:40:00 crc kubenswrapper[4744]: I0311 01:40:00.282569 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7jg7m\" (UniqueName: \"kubernetes.io/projected/14492823-13cb-4a93-aa2c-e58f5e612098-kube-api-access-7jg7m\") pod \"auto-csr-approver-29553220-8ztsx\" (UID: \"14492823-13cb-4a93-aa2c-e58f5e612098\") " pod="openshift-infra/auto-csr-approver-29553220-8ztsx" Mar 11 01:40:00 crc kubenswrapper[4744]: I0311 01:40:00.384393 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7jg7m\" (UniqueName: \"kubernetes.io/projected/14492823-13cb-4a93-aa2c-e58f5e612098-kube-api-access-7jg7m\") pod \"auto-csr-approver-29553220-8ztsx\" (UID: \"14492823-13cb-4a93-aa2c-e58f5e612098\") " pod="openshift-infra/auto-csr-approver-29553220-8ztsx" Mar 11 01:40:00 crc kubenswrapper[4744]: I0311 01:40:00.411676 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-7jg7m\" (UniqueName: \"kubernetes.io/projected/14492823-13cb-4a93-aa2c-e58f5e612098-kube-api-access-7jg7m\") pod \"auto-csr-approver-29553220-8ztsx\" (UID: \"14492823-13cb-4a93-aa2c-e58f5e612098\") " pod="openshift-infra/auto-csr-approver-29553220-8ztsx" Mar 11 01:40:00 crc kubenswrapper[4744]: I0311 01:40:00.532983 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29553220-8ztsx" Mar 11 01:40:00 crc kubenswrapper[4744]: I0311 01:40:00.799846 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29553220-8ztsx"] Mar 11 01:40:00 crc kubenswrapper[4744]: W0311 01:40:00.816642 4744 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod14492823_13cb_4a93_aa2c_e58f5e612098.slice/crio-c302d6bb8dfe322b9e12b735bde6aca5657570fb87b4a4580676cb2d1f2590f6 WatchSource:0}: Error finding container c302d6bb8dfe322b9e12b735bde6aca5657570fb87b4a4580676cb2d1f2590f6: Status 404 returned error can't find the container with id c302d6bb8dfe322b9e12b735bde6aca5657570fb87b4a4580676cb2d1f2590f6 Mar 11 01:40:01 crc kubenswrapper[4744]: I0311 01:40:01.182620 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29553220-8ztsx" event={"ID":"14492823-13cb-4a93-aa2c-e58f5e612098","Type":"ContainerStarted","Data":"c302d6bb8dfe322b9e12b735bde6aca5657570fb87b4a4580676cb2d1f2590f6"} Mar 11 01:40:03 crc kubenswrapper[4744]: I0311 01:40:03.208237 4744 generic.go:334] "Generic (PLEG): container finished" podID="14492823-13cb-4a93-aa2c-e58f5e612098" containerID="e486597083f1c8cefd8f0269e7c831f2670b449d8fd439360ac49a3bd8a67c4b" exitCode=0 Mar 11 01:40:03 crc kubenswrapper[4744]: I0311 01:40:03.208327 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29553220-8ztsx" 
event={"ID":"14492823-13cb-4a93-aa2c-e58f5e612098","Type":"ContainerDied","Data":"e486597083f1c8cefd8f0269e7c831f2670b449d8fd439360ac49a3bd8a67c4b"} Mar 11 01:40:04 crc kubenswrapper[4744]: I0311 01:40:04.587217 4744 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29553220-8ztsx" Mar 11 01:40:04 crc kubenswrapper[4744]: I0311 01:40:04.789989 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7jg7m\" (UniqueName: \"kubernetes.io/projected/14492823-13cb-4a93-aa2c-e58f5e612098-kube-api-access-7jg7m\") pod \"14492823-13cb-4a93-aa2c-e58f5e612098\" (UID: \"14492823-13cb-4a93-aa2c-e58f5e612098\") " Mar 11 01:40:04 crc kubenswrapper[4744]: I0311 01:40:04.811651 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/14492823-13cb-4a93-aa2c-e58f5e612098-kube-api-access-7jg7m" (OuterVolumeSpecName: "kube-api-access-7jg7m") pod "14492823-13cb-4a93-aa2c-e58f5e612098" (UID: "14492823-13cb-4a93-aa2c-e58f5e612098"). InnerVolumeSpecName "kube-api-access-7jg7m". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 01:40:04 crc kubenswrapper[4744]: I0311 01:40:04.892254 4744 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7jg7m\" (UniqueName: \"kubernetes.io/projected/14492823-13cb-4a93-aa2c-e58f5e612098-kube-api-access-7jg7m\") on node \"crc\" DevicePath \"\"" Mar 11 01:40:05 crc kubenswrapper[4744]: I0311 01:40:05.228290 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29553220-8ztsx" event={"ID":"14492823-13cb-4a93-aa2c-e58f5e612098","Type":"ContainerDied","Data":"c302d6bb8dfe322b9e12b735bde6aca5657570fb87b4a4580676cb2d1f2590f6"} Mar 11 01:40:05 crc kubenswrapper[4744]: I0311 01:40:05.228346 4744 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c302d6bb8dfe322b9e12b735bde6aca5657570fb87b4a4580676cb2d1f2590f6" Mar 11 01:40:05 crc kubenswrapper[4744]: I0311 01:40:05.228891 4744 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29553220-8ztsx" Mar 11 01:40:05 crc kubenswrapper[4744]: I0311 01:40:05.678443 4744 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29553214-gz5hd"] Mar 11 01:40:05 crc kubenswrapper[4744]: I0311 01:40:05.689020 4744 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29553214-gz5hd"] Mar 11 01:40:05 crc kubenswrapper[4744]: I0311 01:40:05.991543 4744 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="62a087e5-17fb-4cea-97b1-85438d94467d" path="/var/lib/kubelet/pods/62a087e5-17fb-4cea-97b1-85438d94467d/volumes" Mar 11 01:40:12 crc kubenswrapper[4744]: I0311 01:40:12.113706 4744 scope.go:117] "RemoveContainer" containerID="51fc74673b23717f2a6178321aad21013fd7149b90f411723b1fee62d4ba20fc" Mar 11 01:40:12 crc kubenswrapper[4744]: I0311 01:40:12.409322 4744 patch_prober.go:28] interesting pod/machine-config-daemon-678nx 
container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 11 01:40:12 crc kubenswrapper[4744]: I0311 01:40:12.409401 4744 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-678nx" podUID="a15dc7ac-7c34-4135-b6eb-a85122800ce9" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 11 01:40:12 crc kubenswrapper[4744]: I0311 01:40:12.409462 4744 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-678nx" Mar 11 01:40:12 crc kubenswrapper[4744]: I0311 01:40:12.410637 4744 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"3ed3bf9772d622baceff46dfff72fd72ef269c51d1877ba2e252f727e4ae9695"} pod="openshift-machine-config-operator/machine-config-daemon-678nx" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 11 01:40:12 crc kubenswrapper[4744]: I0311 01:40:12.410796 4744 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-678nx" podUID="a15dc7ac-7c34-4135-b6eb-a85122800ce9" containerName="machine-config-daemon" containerID="cri-o://3ed3bf9772d622baceff46dfff72fd72ef269c51d1877ba2e252f727e4ae9695" gracePeriod=600 Mar 11 01:40:13 crc kubenswrapper[4744]: I0311 01:40:13.300842 4744 generic.go:334] "Generic (PLEG): container finished" podID="a15dc7ac-7c34-4135-b6eb-a85122800ce9" containerID="3ed3bf9772d622baceff46dfff72fd72ef269c51d1877ba2e252f727e4ae9695" exitCode=0 Mar 11 01:40:13 crc kubenswrapper[4744]: I0311 01:40:13.300919 4744 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-678nx" event={"ID":"a15dc7ac-7c34-4135-b6eb-a85122800ce9","Type":"ContainerDied","Data":"3ed3bf9772d622baceff46dfff72fd72ef269c51d1877ba2e252f727e4ae9695"} Mar 11 01:40:13 crc kubenswrapper[4744]: I0311 01:40:13.301681 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-678nx" event={"ID":"a15dc7ac-7c34-4135-b6eb-a85122800ce9","Type":"ContainerStarted","Data":"18b99f474de9d4211cad3e91ba128e1f205b77df847433803614c3a4dec5d0d1"} Mar 11 01:40:13 crc kubenswrapper[4744]: I0311 01:40:13.301702 4744 scope.go:117] "RemoveContainer" containerID="f347064f0cb75f8bb43f62de0ab1edab4843ef09d878578850df37cec024ff09" Mar 11 01:41:23 crc kubenswrapper[4744]: I0311 01:41:23.441799 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-l2684"] Mar 11 01:41:23 crc kubenswrapper[4744]: E0311 01:41:23.443018 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="14492823-13cb-4a93-aa2c-e58f5e612098" containerName="oc" Mar 11 01:41:23 crc kubenswrapper[4744]: I0311 01:41:23.443048 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="14492823-13cb-4a93-aa2c-e58f5e612098" containerName="oc" Mar 11 01:41:23 crc kubenswrapper[4744]: I0311 01:41:23.444857 4744 memory_manager.go:354] "RemoveStaleState removing state" podUID="14492823-13cb-4a93-aa2c-e58f5e612098" containerName="oc" Mar 11 01:41:23 crc kubenswrapper[4744]: I0311 01:41:23.446873 4744 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-l2684" Mar 11 01:41:23 crc kubenswrapper[4744]: I0311 01:41:23.469626 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-l2684"] Mar 11 01:41:23 crc kubenswrapper[4744]: I0311 01:41:23.505857 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c2rwp\" (UniqueName: \"kubernetes.io/projected/b6e1c215-b989-4038-acfe-3f82018cfa03-kube-api-access-c2rwp\") pod \"redhat-operators-l2684\" (UID: \"b6e1c215-b989-4038-acfe-3f82018cfa03\") " pod="openshift-marketplace/redhat-operators-l2684" Mar 11 01:41:23 crc kubenswrapper[4744]: I0311 01:41:23.505938 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b6e1c215-b989-4038-acfe-3f82018cfa03-catalog-content\") pod \"redhat-operators-l2684\" (UID: \"b6e1c215-b989-4038-acfe-3f82018cfa03\") " pod="openshift-marketplace/redhat-operators-l2684" Mar 11 01:41:23 crc kubenswrapper[4744]: I0311 01:41:23.506297 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b6e1c215-b989-4038-acfe-3f82018cfa03-utilities\") pod \"redhat-operators-l2684\" (UID: \"b6e1c215-b989-4038-acfe-3f82018cfa03\") " pod="openshift-marketplace/redhat-operators-l2684" Mar 11 01:41:23 crc kubenswrapper[4744]: I0311 01:41:23.607877 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b6e1c215-b989-4038-acfe-3f82018cfa03-utilities\") pod \"redhat-operators-l2684\" (UID: \"b6e1c215-b989-4038-acfe-3f82018cfa03\") " pod="openshift-marketplace/redhat-operators-l2684" Mar 11 01:41:23 crc kubenswrapper[4744]: I0311 01:41:23.607975 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-c2rwp\" (UniqueName: \"kubernetes.io/projected/b6e1c215-b989-4038-acfe-3f82018cfa03-kube-api-access-c2rwp\") pod \"redhat-operators-l2684\" (UID: \"b6e1c215-b989-4038-acfe-3f82018cfa03\") " pod="openshift-marketplace/redhat-operators-l2684" Mar 11 01:41:23 crc kubenswrapper[4744]: I0311 01:41:23.608056 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b6e1c215-b989-4038-acfe-3f82018cfa03-catalog-content\") pod \"redhat-operators-l2684\" (UID: \"b6e1c215-b989-4038-acfe-3f82018cfa03\") " pod="openshift-marketplace/redhat-operators-l2684" Mar 11 01:41:23 crc kubenswrapper[4744]: I0311 01:41:23.608828 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b6e1c215-b989-4038-acfe-3f82018cfa03-utilities\") pod \"redhat-operators-l2684\" (UID: \"b6e1c215-b989-4038-acfe-3f82018cfa03\") " pod="openshift-marketplace/redhat-operators-l2684" Mar 11 01:41:23 crc kubenswrapper[4744]: I0311 01:41:23.608910 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b6e1c215-b989-4038-acfe-3f82018cfa03-catalog-content\") pod \"redhat-operators-l2684\" (UID: \"b6e1c215-b989-4038-acfe-3f82018cfa03\") " pod="openshift-marketplace/redhat-operators-l2684" Mar 11 01:41:23 crc kubenswrapper[4744]: I0311 01:41:23.635958 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c2rwp\" (UniqueName: \"kubernetes.io/projected/b6e1c215-b989-4038-acfe-3f82018cfa03-kube-api-access-c2rwp\") pod \"redhat-operators-l2684\" (UID: \"b6e1c215-b989-4038-acfe-3f82018cfa03\") " pod="openshift-marketplace/redhat-operators-l2684" Mar 11 01:41:23 crc kubenswrapper[4744]: I0311 01:41:23.780054 4744 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-l2684" Mar 11 01:41:24 crc kubenswrapper[4744]: I0311 01:41:24.216304 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-l2684"] Mar 11 01:41:24 crc kubenswrapper[4744]: I0311 01:41:24.966682 4744 generic.go:334] "Generic (PLEG): container finished" podID="b6e1c215-b989-4038-acfe-3f82018cfa03" containerID="7f7e38f7d6e379e5c43f5a4515dd7a924d6e7275cd09ad3113dd9f1974425c9b" exitCode=0 Mar 11 01:41:24 crc kubenswrapper[4744]: I0311 01:41:24.966740 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-l2684" event={"ID":"b6e1c215-b989-4038-acfe-3f82018cfa03","Type":"ContainerDied","Data":"7f7e38f7d6e379e5c43f5a4515dd7a924d6e7275cd09ad3113dd9f1974425c9b"} Mar 11 01:41:24 crc kubenswrapper[4744]: I0311 01:41:24.967700 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-l2684" event={"ID":"b6e1c215-b989-4038-acfe-3f82018cfa03","Type":"ContainerStarted","Data":"b4dadeefdf0ed5785d4f56e115e40335c70dc3325b2fffa759771e60cba78192"} Mar 11 01:41:25 crc kubenswrapper[4744]: I0311 01:41:25.988017 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-l2684" event={"ID":"b6e1c215-b989-4038-acfe-3f82018cfa03","Type":"ContainerStarted","Data":"154b51deeafa5b657c89f5d0fbb7c53162316cf1406f08432d6abc84b4224418"} Mar 11 01:41:26 crc kubenswrapper[4744]: I0311 01:41:26.996337 4744 generic.go:334] "Generic (PLEG): container finished" podID="b6e1c215-b989-4038-acfe-3f82018cfa03" containerID="154b51deeafa5b657c89f5d0fbb7c53162316cf1406f08432d6abc84b4224418" exitCode=0 Mar 11 01:41:26 crc kubenswrapper[4744]: I0311 01:41:26.996428 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-l2684" 
event={"ID":"b6e1c215-b989-4038-acfe-3f82018cfa03","Type":"ContainerDied","Data":"154b51deeafa5b657c89f5d0fbb7c53162316cf1406f08432d6abc84b4224418"} Mar 11 01:41:28 crc kubenswrapper[4744]: I0311 01:41:28.006548 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-l2684" event={"ID":"b6e1c215-b989-4038-acfe-3f82018cfa03","Type":"ContainerStarted","Data":"8e99ccde3cf88165ebb8b1e1bcdd9c24f9d2c0f9cdd7621b48b6aab3983cce87"} Mar 11 01:41:28 crc kubenswrapper[4744]: I0311 01:41:28.028270 4744 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-l2684" podStartSLOduration=2.578489858 podStartE2EDuration="5.028251995s" podCreationTimestamp="2026-03-11 01:41:23 +0000 UTC" firstStartedPulling="2026-03-11 01:41:24.968222725 +0000 UTC m=+2841.772440330" lastFinishedPulling="2026-03-11 01:41:27.417984822 +0000 UTC m=+2844.222202467" observedRunningTime="2026-03-11 01:41:28.026197259 +0000 UTC m=+2844.830414874" watchObservedRunningTime="2026-03-11 01:41:28.028251995 +0000 UTC m=+2844.832469600" Mar 11 01:41:32 crc kubenswrapper[4744]: I0311 01:41:32.070019 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-jb28v"] Mar 11 01:41:32 crc kubenswrapper[4744]: I0311 01:41:32.072211 4744 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-jb28v" Mar 11 01:41:32 crc kubenswrapper[4744]: I0311 01:41:32.083987 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-jb28v"] Mar 11 01:41:32 crc kubenswrapper[4744]: I0311 01:41:32.139205 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9hfph\" (UniqueName: \"kubernetes.io/projected/b0a52ecd-2222-4a4f-a5f5-34505a89cff0-kube-api-access-9hfph\") pod \"certified-operators-jb28v\" (UID: \"b0a52ecd-2222-4a4f-a5f5-34505a89cff0\") " pod="openshift-marketplace/certified-operators-jb28v" Mar 11 01:41:32 crc kubenswrapper[4744]: I0311 01:41:32.139266 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b0a52ecd-2222-4a4f-a5f5-34505a89cff0-utilities\") pod \"certified-operators-jb28v\" (UID: \"b0a52ecd-2222-4a4f-a5f5-34505a89cff0\") " pod="openshift-marketplace/certified-operators-jb28v" Mar 11 01:41:32 crc kubenswrapper[4744]: I0311 01:41:32.139295 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b0a52ecd-2222-4a4f-a5f5-34505a89cff0-catalog-content\") pod \"certified-operators-jb28v\" (UID: \"b0a52ecd-2222-4a4f-a5f5-34505a89cff0\") " pod="openshift-marketplace/certified-operators-jb28v" Mar 11 01:41:32 crc kubenswrapper[4744]: I0311 01:41:32.240472 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b0a52ecd-2222-4a4f-a5f5-34505a89cff0-catalog-content\") pod \"certified-operators-jb28v\" (UID: \"b0a52ecd-2222-4a4f-a5f5-34505a89cff0\") " pod="openshift-marketplace/certified-operators-jb28v" Mar 11 01:41:32 crc kubenswrapper[4744]: I0311 01:41:32.240750 4744 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-9hfph\" (UniqueName: \"kubernetes.io/projected/b0a52ecd-2222-4a4f-a5f5-34505a89cff0-kube-api-access-9hfph\") pod \"certified-operators-jb28v\" (UID: \"b0a52ecd-2222-4a4f-a5f5-34505a89cff0\") " pod="openshift-marketplace/certified-operators-jb28v"
Mar 11 01:41:32 crc kubenswrapper[4744]: I0311 01:41:32.241034 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b0a52ecd-2222-4a4f-a5f5-34505a89cff0-catalog-content\") pod \"certified-operators-jb28v\" (UID: \"b0a52ecd-2222-4a4f-a5f5-34505a89cff0\") " pod="openshift-marketplace/certified-operators-jb28v"
Mar 11 01:41:32 crc kubenswrapper[4744]: I0311 01:41:32.241414 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b0a52ecd-2222-4a4f-a5f5-34505a89cff0-utilities\") pod \"certified-operators-jb28v\" (UID: \"b0a52ecd-2222-4a4f-a5f5-34505a89cff0\") " pod="openshift-marketplace/certified-operators-jb28v"
Mar 11 01:41:32 crc kubenswrapper[4744]: I0311 01:41:32.242037 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b0a52ecd-2222-4a4f-a5f5-34505a89cff0-utilities\") pod \"certified-operators-jb28v\" (UID: \"b0a52ecd-2222-4a4f-a5f5-34505a89cff0\") " pod="openshift-marketplace/certified-operators-jb28v"
Mar 11 01:41:32 crc kubenswrapper[4744]: I0311 01:41:32.267657 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9hfph\" (UniqueName: \"kubernetes.io/projected/b0a52ecd-2222-4a4f-a5f5-34505a89cff0-kube-api-access-9hfph\") pod \"certified-operators-jb28v\" (UID: \"b0a52ecd-2222-4a4f-a5f5-34505a89cff0\") " pod="openshift-marketplace/certified-operators-jb28v"
Mar 11 01:41:32 crc kubenswrapper[4744]: I0311 01:41:32.402538 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-jb28v"
Mar 11 01:41:32 crc kubenswrapper[4744]: I0311 01:41:32.950423 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-jb28v"]
Mar 11 01:41:33 crc kubenswrapper[4744]: I0311 01:41:33.053390 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-jb28v" event={"ID":"b0a52ecd-2222-4a4f-a5f5-34505a89cff0","Type":"ContainerStarted","Data":"c2e147068089fda4c81eb43574ad298f906f832608c5d7a5a8be1a078d34ecb6"}
Mar 11 01:41:33 crc kubenswrapper[4744]: I0311 01:41:33.780549 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-l2684"
Mar 11 01:41:33 crc kubenswrapper[4744]: I0311 01:41:33.781678 4744 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-l2684"
Mar 11 01:41:34 crc kubenswrapper[4744]: I0311 01:41:34.064606 4744 generic.go:334] "Generic (PLEG): container finished" podID="b0a52ecd-2222-4a4f-a5f5-34505a89cff0" containerID="d4a5b99030022f6d74a08d81df4d0a977f594dd8ec0eb7d3ac5df3a5cb238117" exitCode=0
Mar 11 01:41:34 crc kubenswrapper[4744]: I0311 01:41:34.064680 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-jb28v" event={"ID":"b0a52ecd-2222-4a4f-a5f5-34505a89cff0","Type":"ContainerDied","Data":"d4a5b99030022f6d74a08d81df4d0a977f594dd8ec0eb7d3ac5df3a5cb238117"}
Mar 11 01:41:34 crc kubenswrapper[4744]: I0311 01:41:34.850182 4744 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-l2684" podUID="b6e1c215-b989-4038-acfe-3f82018cfa03" containerName="registry-server" probeResult="failure" output=<
Mar 11 01:41:34 crc kubenswrapper[4744]: timeout: failed to connect service ":50051" within 1s
Mar 11 01:41:34 crc kubenswrapper[4744]: >
Mar 11 01:41:35 crc kubenswrapper[4744]: I0311 01:41:35.077755 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-jb28v" event={"ID":"b0a52ecd-2222-4a4f-a5f5-34505a89cff0","Type":"ContainerStarted","Data":"ff913d98974a2dd155b163ac1a7141ecbc20ca651fd77211ec76d07b8b9887a5"}
Mar 11 01:41:36 crc kubenswrapper[4744]: I0311 01:41:36.090759 4744 generic.go:334] "Generic (PLEG): container finished" podID="b0a52ecd-2222-4a4f-a5f5-34505a89cff0" containerID="ff913d98974a2dd155b163ac1a7141ecbc20ca651fd77211ec76d07b8b9887a5" exitCode=0
Mar 11 01:41:36 crc kubenswrapper[4744]: I0311 01:41:36.090833 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-jb28v" event={"ID":"b0a52ecd-2222-4a4f-a5f5-34505a89cff0","Type":"ContainerDied","Data":"ff913d98974a2dd155b163ac1a7141ecbc20ca651fd77211ec76d07b8b9887a5"}
Mar 11 01:41:37 crc kubenswrapper[4744]: I0311 01:41:37.104121 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-jb28v" event={"ID":"b0a52ecd-2222-4a4f-a5f5-34505a89cff0","Type":"ContainerStarted","Data":"cce16d092b64ed68ad557a37c1c31f9ed24fa379c7fa661bd6a7a67f3ef3ea5f"}
Mar 11 01:41:37 crc kubenswrapper[4744]: I0311 01:41:37.132888 4744 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-jb28v" podStartSLOduration=2.601071407 podStartE2EDuration="5.132860541s" podCreationTimestamp="2026-03-11 01:41:32 +0000 UTC" firstStartedPulling="2026-03-11 01:41:34.067508314 +0000 UTC m=+2850.871725959" lastFinishedPulling="2026-03-11 01:41:36.599297448 +0000 UTC m=+2853.403515093" observedRunningTime="2026-03-11 01:41:37.131478518 +0000 UTC m=+2853.935696163" watchObservedRunningTime="2026-03-11 01:41:37.132860541 +0000 UTC m=+2853.937078186"
Mar 11 01:41:42 crc kubenswrapper[4744]: I0311 01:41:42.403066 4744 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-jb28v"
Mar 11 01:41:42 crc kubenswrapper[4744]: I0311 01:41:42.403746 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-jb28v"
Mar 11 01:41:42 crc kubenswrapper[4744]: I0311 01:41:42.489811 4744 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-jb28v"
Mar 11 01:41:43 crc kubenswrapper[4744]: I0311 01:41:43.206421 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-jb28v"
Mar 11 01:41:43 crc kubenswrapper[4744]: I0311 01:41:43.259031 4744 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-jb28v"]
Mar 11 01:41:43 crc kubenswrapper[4744]: I0311 01:41:43.845898 4744 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-l2684"
Mar 11 01:41:43 crc kubenswrapper[4744]: I0311 01:41:43.899354 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-l2684"
Mar 11 01:41:45 crc kubenswrapper[4744]: I0311 01:41:45.143647 4744 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-l2684"]
Mar 11 01:41:45 crc kubenswrapper[4744]: I0311 01:41:45.176110 4744 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-jb28v" podUID="b0a52ecd-2222-4a4f-a5f5-34505a89cff0" containerName="registry-server" containerID="cri-o://cce16d092b64ed68ad557a37c1c31f9ed24fa379c7fa661bd6a7a67f3ef3ea5f" gracePeriod=2
Mar 11 01:41:45 crc kubenswrapper[4744]: I0311 01:41:45.176441 4744 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-l2684" podUID="b6e1c215-b989-4038-acfe-3f82018cfa03" containerName="registry-server" containerID="cri-o://8e99ccde3cf88165ebb8b1e1bcdd9c24f9d2c0f9cdd7621b48b6aab3983cce87" gracePeriod=2
Mar 11 01:41:46 crc kubenswrapper[4744]: I0311 01:41:46.187194 4744 generic.go:334] "Generic (PLEG): container finished" podID="b0a52ecd-2222-4a4f-a5f5-34505a89cff0" containerID="cce16d092b64ed68ad557a37c1c31f9ed24fa379c7fa661bd6a7a67f3ef3ea5f" exitCode=0
Mar 11 01:41:46 crc kubenswrapper[4744]: I0311 01:41:46.187275 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-jb28v" event={"ID":"b0a52ecd-2222-4a4f-a5f5-34505a89cff0","Type":"ContainerDied","Data":"cce16d092b64ed68ad557a37c1c31f9ed24fa379c7fa661bd6a7a67f3ef3ea5f"}
Mar 11 01:41:46 crc kubenswrapper[4744]: I0311 01:41:46.191211 4744 generic.go:334] "Generic (PLEG): container finished" podID="b6e1c215-b989-4038-acfe-3f82018cfa03" containerID="8e99ccde3cf88165ebb8b1e1bcdd9c24f9d2c0f9cdd7621b48b6aab3983cce87" exitCode=0
Mar 11 01:41:46 crc kubenswrapper[4744]: I0311 01:41:46.191255 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-l2684" event={"ID":"b6e1c215-b989-4038-acfe-3f82018cfa03","Type":"ContainerDied","Data":"8e99ccde3cf88165ebb8b1e1bcdd9c24f9d2c0f9cdd7621b48b6aab3983cce87"}
Mar 11 01:41:46 crc kubenswrapper[4744]: I0311 01:41:46.191290 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-l2684" event={"ID":"b6e1c215-b989-4038-acfe-3f82018cfa03","Type":"ContainerDied","Data":"b4dadeefdf0ed5785d4f56e115e40335c70dc3325b2fffa759771e60cba78192"}
Mar 11 01:41:46 crc kubenswrapper[4744]: I0311 01:41:46.191309 4744 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b4dadeefdf0ed5785d4f56e115e40335c70dc3325b2fffa759771e60cba78192"
Mar 11 01:41:46 crc kubenswrapper[4744]: I0311 01:41:46.213032 4744 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-l2684"
Mar 11 01:41:46 crc kubenswrapper[4744]: I0311 01:41:46.228871 4744 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-jb28v"
Mar 11 01:41:46 crc kubenswrapper[4744]: I0311 01:41:46.394149 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9hfph\" (UniqueName: \"kubernetes.io/projected/b0a52ecd-2222-4a4f-a5f5-34505a89cff0-kube-api-access-9hfph\") pod \"b0a52ecd-2222-4a4f-a5f5-34505a89cff0\" (UID: \"b0a52ecd-2222-4a4f-a5f5-34505a89cff0\") "
Mar 11 01:41:46 crc kubenswrapper[4744]: I0311 01:41:46.394693 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b0a52ecd-2222-4a4f-a5f5-34505a89cff0-catalog-content\") pod \"b0a52ecd-2222-4a4f-a5f5-34505a89cff0\" (UID: \"b0a52ecd-2222-4a4f-a5f5-34505a89cff0\") "
Mar 11 01:41:46 crc kubenswrapper[4744]: I0311 01:41:46.394938 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-c2rwp\" (UniqueName: \"kubernetes.io/projected/b6e1c215-b989-4038-acfe-3f82018cfa03-kube-api-access-c2rwp\") pod \"b6e1c215-b989-4038-acfe-3f82018cfa03\" (UID: \"b6e1c215-b989-4038-acfe-3f82018cfa03\") "
Mar 11 01:41:46 crc kubenswrapper[4744]: I0311 01:41:46.395187 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b6e1c215-b989-4038-acfe-3f82018cfa03-utilities\") pod \"b6e1c215-b989-4038-acfe-3f82018cfa03\" (UID: \"b6e1c215-b989-4038-acfe-3f82018cfa03\") "
Mar 11 01:41:46 crc kubenswrapper[4744]: I0311 01:41:46.395417 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b0a52ecd-2222-4a4f-a5f5-34505a89cff0-utilities\") pod \"b0a52ecd-2222-4a4f-a5f5-34505a89cff0\" (UID: \"b0a52ecd-2222-4a4f-a5f5-34505a89cff0\") "
Mar 11 01:41:46 crc kubenswrapper[4744]: I0311 01:41:46.395703 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b6e1c215-b989-4038-acfe-3f82018cfa03-catalog-content\") pod \"b6e1c215-b989-4038-acfe-3f82018cfa03\" (UID: \"b6e1c215-b989-4038-acfe-3f82018cfa03\") "
Mar 11 01:41:46 crc kubenswrapper[4744]: I0311 01:41:46.396663 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b0a52ecd-2222-4a4f-a5f5-34505a89cff0-utilities" (OuterVolumeSpecName: "utilities") pod "b0a52ecd-2222-4a4f-a5f5-34505a89cff0" (UID: "b0a52ecd-2222-4a4f-a5f5-34505a89cff0"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 11 01:41:46 crc kubenswrapper[4744]: I0311 01:41:46.397423 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b6e1c215-b989-4038-acfe-3f82018cfa03-utilities" (OuterVolumeSpecName: "utilities") pod "b6e1c215-b989-4038-acfe-3f82018cfa03" (UID: "b6e1c215-b989-4038-acfe-3f82018cfa03"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 11 01:41:46 crc kubenswrapper[4744]: I0311 01:41:46.402049 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b0a52ecd-2222-4a4f-a5f5-34505a89cff0-kube-api-access-9hfph" (OuterVolumeSpecName: "kube-api-access-9hfph") pod "b0a52ecd-2222-4a4f-a5f5-34505a89cff0" (UID: "b0a52ecd-2222-4a4f-a5f5-34505a89cff0"). InnerVolumeSpecName "kube-api-access-9hfph". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 11 01:41:46 crc kubenswrapper[4744]: I0311 01:41:46.406346 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6e1c215-b989-4038-acfe-3f82018cfa03-kube-api-access-c2rwp" (OuterVolumeSpecName: "kube-api-access-c2rwp") pod "b6e1c215-b989-4038-acfe-3f82018cfa03" (UID: "b6e1c215-b989-4038-acfe-3f82018cfa03"). InnerVolumeSpecName "kube-api-access-c2rwp". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 11 01:41:46 crc kubenswrapper[4744]: I0311 01:41:46.482703 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b0a52ecd-2222-4a4f-a5f5-34505a89cff0-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b0a52ecd-2222-4a4f-a5f5-34505a89cff0" (UID: "b0a52ecd-2222-4a4f-a5f5-34505a89cff0"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 11 01:41:46 crc kubenswrapper[4744]: I0311 01:41:46.497283 4744 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9hfph\" (UniqueName: \"kubernetes.io/projected/b0a52ecd-2222-4a4f-a5f5-34505a89cff0-kube-api-access-9hfph\") on node \"crc\" DevicePath \"\""
Mar 11 01:41:46 crc kubenswrapper[4744]: I0311 01:41:46.497312 4744 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b0a52ecd-2222-4a4f-a5f5-34505a89cff0-catalog-content\") on node \"crc\" DevicePath \"\""
Mar 11 01:41:46 crc kubenswrapper[4744]: I0311 01:41:46.497323 4744 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-c2rwp\" (UniqueName: \"kubernetes.io/projected/b6e1c215-b989-4038-acfe-3f82018cfa03-kube-api-access-c2rwp\") on node \"crc\" DevicePath \"\""
Mar 11 01:41:46 crc kubenswrapper[4744]: I0311 01:41:46.497333 4744 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b6e1c215-b989-4038-acfe-3f82018cfa03-utilities\") on node \"crc\" DevicePath \"\""
Mar 11 01:41:46 crc kubenswrapper[4744]: I0311 01:41:46.497343 4744 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b0a52ecd-2222-4a4f-a5f5-34505a89cff0-utilities\") on node \"crc\" DevicePath \"\""
Mar 11 01:41:46 crc kubenswrapper[4744]: I0311 01:41:46.561241 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b6e1c215-b989-4038-acfe-3f82018cfa03-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b6e1c215-b989-4038-acfe-3f82018cfa03" (UID: "b6e1c215-b989-4038-acfe-3f82018cfa03"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 11 01:41:46 crc kubenswrapper[4744]: I0311 01:41:46.598302 4744 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b6e1c215-b989-4038-acfe-3f82018cfa03-catalog-content\") on node \"crc\" DevicePath \"\""
Mar 11 01:41:47 crc kubenswrapper[4744]: I0311 01:41:47.210528 4744 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-l2684"
Mar 11 01:41:47 crc kubenswrapper[4744]: I0311 01:41:47.210504 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-jb28v" event={"ID":"b0a52ecd-2222-4a4f-a5f5-34505a89cff0","Type":"ContainerDied","Data":"c2e147068089fda4c81eb43574ad298f906f832608c5d7a5a8be1a078d34ecb6"}
Mar 11 01:41:47 crc kubenswrapper[4744]: I0311 01:41:47.210622 4744 scope.go:117] "RemoveContainer" containerID="cce16d092b64ed68ad557a37c1c31f9ed24fa379c7fa661bd6a7a67f3ef3ea5f"
Mar 11 01:41:47 crc kubenswrapper[4744]: I0311 01:41:47.210539 4744 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-jb28v"
Mar 11 01:41:47 crc kubenswrapper[4744]: I0311 01:41:47.239897 4744 scope.go:117] "RemoveContainer" containerID="ff913d98974a2dd155b163ac1a7141ecbc20ca651fd77211ec76d07b8b9887a5"
Mar 11 01:41:47 crc kubenswrapper[4744]: I0311 01:41:47.287462 4744 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-jb28v"]
Mar 11 01:41:47 crc kubenswrapper[4744]: I0311 01:41:47.295255 4744 scope.go:117] "RemoveContainer" containerID="d4a5b99030022f6d74a08d81df4d0a977f594dd8ec0eb7d3ac5df3a5cb238117"
Mar 11 01:41:47 crc kubenswrapper[4744]: I0311 01:41:47.296097 4744 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-jb28v"]
Mar 11 01:41:47 crc kubenswrapper[4744]: I0311 01:41:47.315235 4744 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-l2684"]
Mar 11 01:41:47 crc kubenswrapper[4744]: I0311 01:41:47.326378 4744 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-l2684"]
Mar 11 01:41:47 crc kubenswrapper[4744]: I0311 01:41:47.990345 4744 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b0a52ecd-2222-4a4f-a5f5-34505a89cff0" path="/var/lib/kubelet/pods/b0a52ecd-2222-4a4f-a5f5-34505a89cff0/volumes"
Mar 11 01:41:47 crc kubenswrapper[4744]: I0311 01:41:47.991702 4744 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6e1c215-b989-4038-acfe-3f82018cfa03" path="/var/lib/kubelet/pods/b6e1c215-b989-4038-acfe-3f82018cfa03/volumes"
Mar 11 01:41:53 crc kubenswrapper[4744]: I0311 01:41:53.661352 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-cjkdf"]
Mar 11 01:41:53 crc kubenswrapper[4744]: E0311 01:41:53.665184 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b0a52ecd-2222-4a4f-a5f5-34505a89cff0" containerName="registry-server"
Mar 11 01:41:53 crc kubenswrapper[4744]: I0311 01:41:53.665224 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="b0a52ecd-2222-4a4f-a5f5-34505a89cff0" containerName="registry-server"
Mar 11 01:41:53 crc kubenswrapper[4744]: E0311 01:41:53.665258 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b6e1c215-b989-4038-acfe-3f82018cfa03" containerName="registry-server"
Mar 11 01:41:53 crc kubenswrapper[4744]: I0311 01:41:53.665278 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="b6e1c215-b989-4038-acfe-3f82018cfa03" containerName="registry-server"
Mar 11 01:41:53 crc kubenswrapper[4744]: E0311 01:41:53.665314 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b6e1c215-b989-4038-acfe-3f82018cfa03" containerName="extract-content"
Mar 11 01:41:53 crc kubenswrapper[4744]: I0311 01:41:53.665333 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="b6e1c215-b989-4038-acfe-3f82018cfa03" containerName="extract-content"
Mar 11 01:41:53 crc kubenswrapper[4744]: E0311 01:41:53.665370 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b0a52ecd-2222-4a4f-a5f5-34505a89cff0" containerName="extract-utilities"
Mar 11 01:41:53 crc kubenswrapper[4744]: I0311 01:41:53.665388 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="b0a52ecd-2222-4a4f-a5f5-34505a89cff0" containerName="extract-utilities"
Mar 11 01:41:53 crc kubenswrapper[4744]: E0311 01:41:53.665410 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b0a52ecd-2222-4a4f-a5f5-34505a89cff0" containerName="extract-content"
Mar 11 01:41:53 crc kubenswrapper[4744]: I0311 01:41:53.665450 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="b0a52ecd-2222-4a4f-a5f5-34505a89cff0" containerName="extract-content"
Mar 11 01:41:53 crc kubenswrapper[4744]: E0311 01:41:53.665479 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b6e1c215-b989-4038-acfe-3f82018cfa03" containerName="extract-utilities"
Mar 11 01:41:53 crc kubenswrapper[4744]: I0311 01:41:53.665496 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="b6e1c215-b989-4038-acfe-3f82018cfa03" containerName="extract-utilities"
Mar 11 01:41:53 crc kubenswrapper[4744]: I0311 01:41:53.665868 4744 memory_manager.go:354] "RemoveStaleState removing state" podUID="b6e1c215-b989-4038-acfe-3f82018cfa03" containerName="registry-server"
Mar 11 01:41:53 crc kubenswrapper[4744]: I0311 01:41:53.665946 4744 memory_manager.go:354] "RemoveStaleState removing state" podUID="b0a52ecd-2222-4a4f-a5f5-34505a89cff0" containerName="registry-server"
Mar 11 01:41:53 crc kubenswrapper[4744]: I0311 01:41:53.668410 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-cjkdf"
Mar 11 01:41:53 crc kubenswrapper[4744]: I0311 01:41:53.676962 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-cjkdf"]
Mar 11 01:41:53 crc kubenswrapper[4744]: I0311 01:41:53.720198 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/98512824-027f-4750-9ab9-10ca0b2b598c-catalog-content\") pod \"redhat-marketplace-cjkdf\" (UID: \"98512824-027f-4750-9ab9-10ca0b2b598c\") " pod="openshift-marketplace/redhat-marketplace-cjkdf"
Mar 11 01:41:53 crc kubenswrapper[4744]: I0311 01:41:53.720393 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7nj5b\" (UniqueName: \"kubernetes.io/projected/98512824-027f-4750-9ab9-10ca0b2b598c-kube-api-access-7nj5b\") pod \"redhat-marketplace-cjkdf\" (UID: \"98512824-027f-4750-9ab9-10ca0b2b598c\") " pod="openshift-marketplace/redhat-marketplace-cjkdf"
Mar 11 01:41:53 crc kubenswrapper[4744]: I0311 01:41:53.720611 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/98512824-027f-4750-9ab9-10ca0b2b598c-utilities\") pod \"redhat-marketplace-cjkdf\" (UID: \"98512824-027f-4750-9ab9-10ca0b2b598c\") " pod="openshift-marketplace/redhat-marketplace-cjkdf"
Mar 11 01:41:53 crc kubenswrapper[4744]: I0311 01:41:53.821798 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7nj5b\" (UniqueName: \"kubernetes.io/projected/98512824-027f-4750-9ab9-10ca0b2b598c-kube-api-access-7nj5b\") pod \"redhat-marketplace-cjkdf\" (UID: \"98512824-027f-4750-9ab9-10ca0b2b598c\") " pod="openshift-marketplace/redhat-marketplace-cjkdf"
Mar 11 01:41:53 crc kubenswrapper[4744]: I0311 01:41:53.821895 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/98512824-027f-4750-9ab9-10ca0b2b598c-utilities\") pod \"redhat-marketplace-cjkdf\" (UID: \"98512824-027f-4750-9ab9-10ca0b2b598c\") " pod="openshift-marketplace/redhat-marketplace-cjkdf"
Mar 11 01:41:53 crc kubenswrapper[4744]: I0311 01:41:53.821987 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/98512824-027f-4750-9ab9-10ca0b2b598c-catalog-content\") pod \"redhat-marketplace-cjkdf\" (UID: \"98512824-027f-4750-9ab9-10ca0b2b598c\") " pod="openshift-marketplace/redhat-marketplace-cjkdf"
Mar 11 01:41:53 crc kubenswrapper[4744]: I0311 01:41:53.822764 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/98512824-027f-4750-9ab9-10ca0b2b598c-utilities\") pod \"redhat-marketplace-cjkdf\" (UID: \"98512824-027f-4750-9ab9-10ca0b2b598c\") " pod="openshift-marketplace/redhat-marketplace-cjkdf"
Mar 11 01:41:53 crc kubenswrapper[4744]: I0311 01:41:53.822819 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/98512824-027f-4750-9ab9-10ca0b2b598c-catalog-content\") pod \"redhat-marketplace-cjkdf\" (UID: \"98512824-027f-4750-9ab9-10ca0b2b598c\") " pod="openshift-marketplace/redhat-marketplace-cjkdf"
Mar 11 01:41:53 crc kubenswrapper[4744]: I0311 01:41:53.851959 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7nj5b\" (UniqueName: \"kubernetes.io/projected/98512824-027f-4750-9ab9-10ca0b2b598c-kube-api-access-7nj5b\") pod \"redhat-marketplace-cjkdf\" (UID: \"98512824-027f-4750-9ab9-10ca0b2b598c\") " pod="openshift-marketplace/redhat-marketplace-cjkdf"
Mar 11 01:41:54 crc kubenswrapper[4744]: I0311 01:41:54.011431 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-cjkdf"
Mar 11 01:41:54 crc kubenswrapper[4744]: I0311 01:41:54.258406 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-cjkdf"]
Mar 11 01:41:54 crc kubenswrapper[4744]: W0311 01:41:54.270578 4744 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod98512824_027f_4750_9ab9_10ca0b2b598c.slice/crio-c82ff0d82c19d42b503d69511e7c7370959f25ed65f241ba8b972f14669d95f7 WatchSource:0}: Error finding container c82ff0d82c19d42b503d69511e7c7370959f25ed65f241ba8b972f14669d95f7: Status 404 returned error can't find the container with id c82ff0d82c19d42b503d69511e7c7370959f25ed65f241ba8b972f14669d95f7
Mar 11 01:41:54 crc kubenswrapper[4744]: I0311 01:41:54.290273 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-cjkdf" event={"ID":"98512824-027f-4750-9ab9-10ca0b2b598c","Type":"ContainerStarted","Data":"c82ff0d82c19d42b503d69511e7c7370959f25ed65f241ba8b972f14669d95f7"}
Mar 11 01:41:55 crc kubenswrapper[4744]: I0311 01:41:55.302384 4744 generic.go:334] "Generic (PLEG): container finished" podID="98512824-027f-4750-9ab9-10ca0b2b598c" containerID="c0f0d5efd579a0626e31ede7fbccc52769c2b6f172a42a45e06b6529b82f1d1c" exitCode=0
Mar 11 01:41:55 crc kubenswrapper[4744]: I0311 01:41:55.302473 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-cjkdf" event={"ID":"98512824-027f-4750-9ab9-10ca0b2b598c","Type":"ContainerDied","Data":"c0f0d5efd579a0626e31ede7fbccc52769c2b6f172a42a45e06b6529b82f1d1c"}
Mar 11 01:41:57 crc kubenswrapper[4744]: I0311 01:41:57.324296 4744 generic.go:334] "Generic (PLEG): container finished" podID="98512824-027f-4750-9ab9-10ca0b2b598c" containerID="fc363515f6320ed8c62ef37d296c49a6f2728c39260d8a69ecf792cb0f392dee" exitCode=0
Mar 11 01:41:57 crc kubenswrapper[4744]: I0311 01:41:57.324626 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-cjkdf" event={"ID":"98512824-027f-4750-9ab9-10ca0b2b598c","Type":"ContainerDied","Data":"fc363515f6320ed8c62ef37d296c49a6f2728c39260d8a69ecf792cb0f392dee"}
Mar 11 01:41:58 crc kubenswrapper[4744]: I0311 01:41:58.339733 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-cjkdf" event={"ID":"98512824-027f-4750-9ab9-10ca0b2b598c","Type":"ContainerStarted","Data":"fe9839e4a266c717b8985d2d29e2163b5e5557d9faacbb96ea7b07abf50f8113"}
Mar 11 01:41:58 crc kubenswrapper[4744]: I0311 01:41:58.369053 4744 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-cjkdf" podStartSLOduration=2.837923939 podStartE2EDuration="5.36902384s" podCreationTimestamp="2026-03-11 01:41:53 +0000 UTC" firstStartedPulling="2026-03-11 01:41:55.304561141 +0000 UTC m=+2872.108778746" lastFinishedPulling="2026-03-11 01:41:57.835661002 +0000 UTC m=+2874.639878647" observedRunningTime="2026-03-11 01:41:58.361047098 +0000 UTC m=+2875.165264743" watchObservedRunningTime="2026-03-11 01:41:58.36902384 +0000 UTC m=+2875.173241455"
Mar 11 01:42:00 crc kubenswrapper[4744]: I0311 01:42:00.154560 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29553222-sr74p"]
Mar 11 01:42:00 crc kubenswrapper[4744]: I0311 01:42:00.155826 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29553222-sr74p"
Mar 11 01:42:00 crc kubenswrapper[4744]: I0311 01:42:00.158383 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-rs77s"
Mar 11 01:42:00 crc kubenswrapper[4744]: I0311 01:42:00.158998 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Mar 11 01:42:00 crc kubenswrapper[4744]: I0311 01:42:00.159435 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Mar 11 01:42:00 crc kubenswrapper[4744]: I0311 01:42:00.164723 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29553222-sr74p"]
Mar 11 01:42:00 crc kubenswrapper[4744]: I0311 01:42:00.221651 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-55hdv\" (UniqueName: \"kubernetes.io/projected/0888068d-7429-4c76-8a78-5c994a15b419-kube-api-access-55hdv\") pod \"auto-csr-approver-29553222-sr74p\" (UID: \"0888068d-7429-4c76-8a78-5c994a15b419\") " pod="openshift-infra/auto-csr-approver-29553222-sr74p"
Mar 11 01:42:00 crc kubenswrapper[4744]: I0311 01:42:00.323153 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-55hdv\" (UniqueName: \"kubernetes.io/projected/0888068d-7429-4c76-8a78-5c994a15b419-kube-api-access-55hdv\") pod \"auto-csr-approver-29553222-sr74p\" (UID: \"0888068d-7429-4c76-8a78-5c994a15b419\") " pod="openshift-infra/auto-csr-approver-29553222-sr74p"
Mar 11 01:42:00 crc kubenswrapper[4744]: I0311 01:42:00.362420 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-55hdv\" (UniqueName: \"kubernetes.io/projected/0888068d-7429-4c76-8a78-5c994a15b419-kube-api-access-55hdv\") pod \"auto-csr-approver-29553222-sr74p\" (UID: \"0888068d-7429-4c76-8a78-5c994a15b419\") " pod="openshift-infra/auto-csr-approver-29553222-sr74p"
Mar 11 01:42:00 crc kubenswrapper[4744]: I0311 01:42:00.473922 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29553222-sr74p"
Mar 11 01:42:00 crc kubenswrapper[4744]: I0311 01:42:00.914103 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29553222-sr74p"]
Mar 11 01:42:01 crc kubenswrapper[4744]: I0311 01:42:01.373813 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29553222-sr74p" event={"ID":"0888068d-7429-4c76-8a78-5c994a15b419","Type":"ContainerStarted","Data":"d45cfbc1b8250a42624785659273cd758a3459092e89837e43c8c83f2e205611"}
Mar 11 01:42:02 crc kubenswrapper[4744]: I0311 01:42:02.386147 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29553222-sr74p" event={"ID":"0888068d-7429-4c76-8a78-5c994a15b419","Type":"ContainerStarted","Data":"f8a9c8debdfdd78ef66cdc39d479bf69b6038c07bc3106f6f257fa3d09628f65"}
Mar 11 01:42:02 crc kubenswrapper[4744]: I0311 01:42:02.409065 4744 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29553222-sr74p" podStartSLOduration=1.323989583 podStartE2EDuration="2.409039688s" podCreationTimestamp="2026-03-11 01:42:00 +0000 UTC" firstStartedPulling="2026-03-11 01:42:00.933891391 +0000 UTC m=+2877.738109036" lastFinishedPulling="2026-03-11 01:42:02.018941526 +0000 UTC m=+2878.823159141" observedRunningTime="2026-03-11 01:42:02.40371921 +0000 UTC m=+2879.207936825" watchObservedRunningTime="2026-03-11 01:42:02.409039688 +0000 UTC m=+2879.213257283"
Mar 11 01:42:03 crc kubenswrapper[4744]: I0311 01:42:03.393602 4744 generic.go:334] "Generic (PLEG): container finished" podID="0888068d-7429-4c76-8a78-5c994a15b419" containerID="f8a9c8debdfdd78ef66cdc39d479bf69b6038c07bc3106f6f257fa3d09628f65" exitCode=0
Mar 11 01:42:03 crc kubenswrapper[4744]: I0311 01:42:03.393639 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29553222-sr74p" event={"ID":"0888068d-7429-4c76-8a78-5c994a15b419","Type":"ContainerDied","Data":"f8a9c8debdfdd78ef66cdc39d479bf69b6038c07bc3106f6f257fa3d09628f65"}
Mar 11 01:42:04 crc kubenswrapper[4744]: I0311 01:42:04.012369 4744 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-cjkdf"
Mar 11 01:42:04 crc kubenswrapper[4744]: I0311 01:42:04.014292 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-cjkdf"
Mar 11 01:42:04 crc kubenswrapper[4744]: I0311 01:42:04.087490 4744 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-cjkdf"
Mar 11 01:42:04 crc kubenswrapper[4744]: I0311 01:42:04.473531 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-cjkdf"
Mar 11 01:42:04 crc kubenswrapper[4744]: I0311 01:42:04.546931 4744 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-cjkdf"]
Mar 11 01:42:04 crc kubenswrapper[4744]: I0311 01:42:04.845913 4744 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29553222-sr74p"
Mar 11 01:42:05 crc kubenswrapper[4744]: I0311 01:42:05.009998 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-55hdv\" (UniqueName: \"kubernetes.io/projected/0888068d-7429-4c76-8a78-5c994a15b419-kube-api-access-55hdv\") pod \"0888068d-7429-4c76-8a78-5c994a15b419\" (UID: \"0888068d-7429-4c76-8a78-5c994a15b419\") "
Mar 11 01:42:05 crc kubenswrapper[4744]: I0311 01:42:05.021445 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0888068d-7429-4c76-8a78-5c994a15b419-kube-api-access-55hdv" (OuterVolumeSpecName: "kube-api-access-55hdv") pod "0888068d-7429-4c76-8a78-5c994a15b419" (UID: "0888068d-7429-4c76-8a78-5c994a15b419"). InnerVolumeSpecName "kube-api-access-55hdv". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 11 01:42:05 crc kubenswrapper[4744]: I0311 01:42:05.112653 4744 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-55hdv\" (UniqueName: \"kubernetes.io/projected/0888068d-7429-4c76-8a78-5c994a15b419-kube-api-access-55hdv\") on node \"crc\" DevicePath \"\""
Mar 11 01:42:05 crc kubenswrapper[4744]: I0311 01:42:05.416782 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29553222-sr74p" event={"ID":"0888068d-7429-4c76-8a78-5c994a15b419","Type":"ContainerDied","Data":"d45cfbc1b8250a42624785659273cd758a3459092e89837e43c8c83f2e205611"}
Mar 11 01:42:05 crc kubenswrapper[4744]: I0311 01:42:05.416831 4744 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d45cfbc1b8250a42624785659273cd758a3459092e89837e43c8c83f2e205611"
Mar 11 01:42:05 crc kubenswrapper[4744]: I0311 01:42:05.416801 4744 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29553222-sr74p"
Mar 11 01:42:05 crc kubenswrapper[4744]: I0311 01:42:05.495792 4744 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29553216-vrss5"]
Mar 11 01:42:05 crc kubenswrapper[4744]: I0311 01:42:05.501987 4744 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29553216-vrss5"]
Mar 11 01:42:05 crc kubenswrapper[4744]: I0311 01:42:05.995577 4744 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ca000869-6b1d-4d16-93e0-c5e33148c9bd" path="/var/lib/kubelet/pods/ca000869-6b1d-4d16-93e0-c5e33148c9bd/volumes"
Mar 11 01:42:06 crc kubenswrapper[4744]: I0311 01:42:06.428280 4744 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-cjkdf" podUID="98512824-027f-4750-9ab9-10ca0b2b598c" containerName="registry-server" containerID="cri-o://fe9839e4a266c717b8985d2d29e2163b5e5557d9faacbb96ea7b07abf50f8113" gracePeriod=2
Mar 11 01:42:06 crc kubenswrapper[4744]: I0311 01:42:06.907066 4744 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-cjkdf"
Mar 11 01:42:07 crc kubenswrapper[4744]: I0311 01:42:07.045082 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7nj5b\" (UniqueName: \"kubernetes.io/projected/98512824-027f-4750-9ab9-10ca0b2b598c-kube-api-access-7nj5b\") pod \"98512824-027f-4750-9ab9-10ca0b2b598c\" (UID: \"98512824-027f-4750-9ab9-10ca0b2b598c\") "
Mar 11 01:42:07 crc kubenswrapper[4744]: I0311 01:42:07.045569 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/98512824-027f-4750-9ab9-10ca0b2b598c-catalog-content\") pod \"98512824-027f-4750-9ab9-10ca0b2b598c\" (UID: \"98512824-027f-4750-9ab9-10ca0b2b598c\") "
Mar 11 01:42:07 crc kubenswrapper[4744]: I0311 01:42:07.045627 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/98512824-027f-4750-9ab9-10ca0b2b598c-utilities\") pod \"98512824-027f-4750-9ab9-10ca0b2b598c\" (UID: \"98512824-027f-4750-9ab9-10ca0b2b598c\") "
Mar 11 01:42:07 crc kubenswrapper[4744]: I0311 01:42:07.047760 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/98512824-027f-4750-9ab9-10ca0b2b598c-utilities" (OuterVolumeSpecName: "utilities") pod "98512824-027f-4750-9ab9-10ca0b2b598c" (UID: "98512824-027f-4750-9ab9-10ca0b2b598c"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 11 01:42:07 crc kubenswrapper[4744]: I0311 01:42:07.056008 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/98512824-027f-4750-9ab9-10ca0b2b598c-kube-api-access-7nj5b" (OuterVolumeSpecName: "kube-api-access-7nj5b") pod "98512824-027f-4750-9ab9-10ca0b2b598c" (UID: "98512824-027f-4750-9ab9-10ca0b2b598c"). InnerVolumeSpecName "kube-api-access-7nj5b". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 01:42:07 crc kubenswrapper[4744]: I0311 01:42:07.147747 4744 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7nj5b\" (UniqueName: \"kubernetes.io/projected/98512824-027f-4750-9ab9-10ca0b2b598c-kube-api-access-7nj5b\") on node \"crc\" DevicePath \"\"" Mar 11 01:42:07 crc kubenswrapper[4744]: I0311 01:42:07.147810 4744 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/98512824-027f-4750-9ab9-10ca0b2b598c-utilities\") on node \"crc\" DevicePath \"\"" Mar 11 01:42:07 crc kubenswrapper[4744]: I0311 01:42:07.286889 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/98512824-027f-4750-9ab9-10ca0b2b598c-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "98512824-027f-4750-9ab9-10ca0b2b598c" (UID: "98512824-027f-4750-9ab9-10ca0b2b598c"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 11 01:42:07 crc kubenswrapper[4744]: I0311 01:42:07.351304 4744 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/98512824-027f-4750-9ab9-10ca0b2b598c-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 11 01:42:07 crc kubenswrapper[4744]: I0311 01:42:07.436374 4744 generic.go:334] "Generic (PLEG): container finished" podID="98512824-027f-4750-9ab9-10ca0b2b598c" containerID="fe9839e4a266c717b8985d2d29e2163b5e5557d9faacbb96ea7b07abf50f8113" exitCode=0 Mar 11 01:42:07 crc kubenswrapper[4744]: I0311 01:42:07.436418 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-cjkdf" event={"ID":"98512824-027f-4750-9ab9-10ca0b2b598c","Type":"ContainerDied","Data":"fe9839e4a266c717b8985d2d29e2163b5e5557d9faacbb96ea7b07abf50f8113"} Mar 11 01:42:07 crc kubenswrapper[4744]: I0311 01:42:07.436449 4744 util.go:48] "No ready sandbox for pod can be 
found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-cjkdf" Mar 11 01:42:07 crc kubenswrapper[4744]: I0311 01:42:07.436467 4744 scope.go:117] "RemoveContainer" containerID="fe9839e4a266c717b8985d2d29e2163b5e5557d9faacbb96ea7b07abf50f8113" Mar 11 01:42:07 crc kubenswrapper[4744]: I0311 01:42:07.436456 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-cjkdf" event={"ID":"98512824-027f-4750-9ab9-10ca0b2b598c","Type":"ContainerDied","Data":"c82ff0d82c19d42b503d69511e7c7370959f25ed65f241ba8b972f14669d95f7"} Mar 11 01:42:07 crc kubenswrapper[4744]: I0311 01:42:07.466028 4744 scope.go:117] "RemoveContainer" containerID="fc363515f6320ed8c62ef37d296c49a6f2728c39260d8a69ecf792cb0f392dee" Mar 11 01:42:07 crc kubenswrapper[4744]: I0311 01:42:07.478084 4744 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-cjkdf"] Mar 11 01:42:07 crc kubenswrapper[4744]: I0311 01:42:07.482718 4744 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-cjkdf"] Mar 11 01:42:07 crc kubenswrapper[4744]: I0311 01:42:07.498218 4744 scope.go:117] "RemoveContainer" containerID="c0f0d5efd579a0626e31ede7fbccc52769c2b6f172a42a45e06b6529b82f1d1c" Mar 11 01:42:07 crc kubenswrapper[4744]: I0311 01:42:07.513074 4744 scope.go:117] "RemoveContainer" containerID="fe9839e4a266c717b8985d2d29e2163b5e5557d9faacbb96ea7b07abf50f8113" Mar 11 01:42:07 crc kubenswrapper[4744]: E0311 01:42:07.513450 4744 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fe9839e4a266c717b8985d2d29e2163b5e5557d9faacbb96ea7b07abf50f8113\": container with ID starting with fe9839e4a266c717b8985d2d29e2163b5e5557d9faacbb96ea7b07abf50f8113 not found: ID does not exist" containerID="fe9839e4a266c717b8985d2d29e2163b5e5557d9faacbb96ea7b07abf50f8113" Mar 11 01:42:07 crc kubenswrapper[4744]: I0311 01:42:07.513479 4744 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fe9839e4a266c717b8985d2d29e2163b5e5557d9faacbb96ea7b07abf50f8113"} err="failed to get container status \"fe9839e4a266c717b8985d2d29e2163b5e5557d9faacbb96ea7b07abf50f8113\": rpc error: code = NotFound desc = could not find container \"fe9839e4a266c717b8985d2d29e2163b5e5557d9faacbb96ea7b07abf50f8113\": container with ID starting with fe9839e4a266c717b8985d2d29e2163b5e5557d9faacbb96ea7b07abf50f8113 not found: ID does not exist" Mar 11 01:42:07 crc kubenswrapper[4744]: I0311 01:42:07.513498 4744 scope.go:117] "RemoveContainer" containerID="fc363515f6320ed8c62ef37d296c49a6f2728c39260d8a69ecf792cb0f392dee" Mar 11 01:42:07 crc kubenswrapper[4744]: E0311 01:42:07.513765 4744 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fc363515f6320ed8c62ef37d296c49a6f2728c39260d8a69ecf792cb0f392dee\": container with ID starting with fc363515f6320ed8c62ef37d296c49a6f2728c39260d8a69ecf792cb0f392dee not found: ID does not exist" containerID="fc363515f6320ed8c62ef37d296c49a6f2728c39260d8a69ecf792cb0f392dee" Mar 11 01:42:07 crc kubenswrapper[4744]: I0311 01:42:07.513811 4744 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fc363515f6320ed8c62ef37d296c49a6f2728c39260d8a69ecf792cb0f392dee"} err="failed to get container status \"fc363515f6320ed8c62ef37d296c49a6f2728c39260d8a69ecf792cb0f392dee\": rpc error: code = NotFound desc = could not find container \"fc363515f6320ed8c62ef37d296c49a6f2728c39260d8a69ecf792cb0f392dee\": container with ID starting with fc363515f6320ed8c62ef37d296c49a6f2728c39260d8a69ecf792cb0f392dee not found: ID does not exist" Mar 11 01:42:07 crc kubenswrapper[4744]: I0311 01:42:07.513843 4744 scope.go:117] "RemoveContainer" containerID="c0f0d5efd579a0626e31ede7fbccc52769c2b6f172a42a45e06b6529b82f1d1c" Mar 11 01:42:07 crc kubenswrapper[4744]: E0311 
01:42:07.514122 4744 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c0f0d5efd579a0626e31ede7fbccc52769c2b6f172a42a45e06b6529b82f1d1c\": container with ID starting with c0f0d5efd579a0626e31ede7fbccc52769c2b6f172a42a45e06b6529b82f1d1c not found: ID does not exist" containerID="c0f0d5efd579a0626e31ede7fbccc52769c2b6f172a42a45e06b6529b82f1d1c" Mar 11 01:42:07 crc kubenswrapper[4744]: I0311 01:42:07.514160 4744 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c0f0d5efd579a0626e31ede7fbccc52769c2b6f172a42a45e06b6529b82f1d1c"} err="failed to get container status \"c0f0d5efd579a0626e31ede7fbccc52769c2b6f172a42a45e06b6529b82f1d1c\": rpc error: code = NotFound desc = could not find container \"c0f0d5efd579a0626e31ede7fbccc52769c2b6f172a42a45e06b6529b82f1d1c\": container with ID starting with c0f0d5efd579a0626e31ede7fbccc52769c2b6f172a42a45e06b6529b82f1d1c not found: ID does not exist" Mar 11 01:42:08 crc kubenswrapper[4744]: I0311 01:42:08.007708 4744 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="98512824-027f-4750-9ab9-10ca0b2b598c" path="/var/lib/kubelet/pods/98512824-027f-4750-9ab9-10ca0b2b598c/volumes" Mar 11 01:42:12 crc kubenswrapper[4744]: I0311 01:42:12.234419 4744 scope.go:117] "RemoveContainer" containerID="b6f0e83b8a71c3b89a6c17e9e36032dc285887855b1d3aa8361c6b3525946c3e" Mar 11 01:42:12 crc kubenswrapper[4744]: I0311 01:42:12.409355 4744 patch_prober.go:28] interesting pod/machine-config-daemon-678nx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 11 01:42:12 crc kubenswrapper[4744]: I0311 01:42:12.409451 4744 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-678nx" 
podUID="a15dc7ac-7c34-4135-b6eb-a85122800ce9" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 11 01:42:42 crc kubenswrapper[4744]: I0311 01:42:42.409245 4744 patch_prober.go:28] interesting pod/machine-config-daemon-678nx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 11 01:42:42 crc kubenswrapper[4744]: I0311 01:42:42.409977 4744 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-678nx" podUID="a15dc7ac-7c34-4135-b6eb-a85122800ce9" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 11 01:43:12 crc kubenswrapper[4744]: I0311 01:43:12.409155 4744 patch_prober.go:28] interesting pod/machine-config-daemon-678nx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 11 01:43:12 crc kubenswrapper[4744]: I0311 01:43:12.409915 4744 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-678nx" podUID="a15dc7ac-7c34-4135-b6eb-a85122800ce9" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 11 01:43:12 crc kubenswrapper[4744]: I0311 01:43:12.409980 4744 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-678nx" Mar 11 01:43:12 crc kubenswrapper[4744]: I0311 01:43:12.410982 4744 
kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"18b99f474de9d4211cad3e91ba128e1f205b77df847433803614c3a4dec5d0d1"} pod="openshift-machine-config-operator/machine-config-daemon-678nx" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 11 01:43:12 crc kubenswrapper[4744]: I0311 01:43:12.411080 4744 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-678nx" podUID="a15dc7ac-7c34-4135-b6eb-a85122800ce9" containerName="machine-config-daemon" containerID="cri-o://18b99f474de9d4211cad3e91ba128e1f205b77df847433803614c3a4dec5d0d1" gracePeriod=600 Mar 11 01:43:12 crc kubenswrapper[4744]: E0311 01:43:12.566739 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-678nx_openshift-machine-config-operator(a15dc7ac-7c34-4135-b6eb-a85122800ce9)\"" pod="openshift-machine-config-operator/machine-config-daemon-678nx" podUID="a15dc7ac-7c34-4135-b6eb-a85122800ce9" Mar 11 01:43:13 crc kubenswrapper[4744]: I0311 01:43:13.019698 4744 generic.go:334] "Generic (PLEG): container finished" podID="a15dc7ac-7c34-4135-b6eb-a85122800ce9" containerID="18b99f474de9d4211cad3e91ba128e1f205b77df847433803614c3a4dec5d0d1" exitCode=0 Mar 11 01:43:13 crc kubenswrapper[4744]: I0311 01:43:13.019761 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-678nx" event={"ID":"a15dc7ac-7c34-4135-b6eb-a85122800ce9","Type":"ContainerDied","Data":"18b99f474de9d4211cad3e91ba128e1f205b77df847433803614c3a4dec5d0d1"} Mar 11 01:43:13 crc kubenswrapper[4744]: I0311 01:43:13.020195 4744 scope.go:117] "RemoveContainer" 
containerID="3ed3bf9772d622baceff46dfff72fd72ef269c51d1877ba2e252f727e4ae9695" Mar 11 01:43:13 crc kubenswrapper[4744]: I0311 01:43:13.020841 4744 scope.go:117] "RemoveContainer" containerID="18b99f474de9d4211cad3e91ba128e1f205b77df847433803614c3a4dec5d0d1" Mar 11 01:43:13 crc kubenswrapper[4744]: E0311 01:43:13.021240 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-678nx_openshift-machine-config-operator(a15dc7ac-7c34-4135-b6eb-a85122800ce9)\"" pod="openshift-machine-config-operator/machine-config-daemon-678nx" podUID="a15dc7ac-7c34-4135-b6eb-a85122800ce9" Mar 11 01:43:18 crc kubenswrapper[4744]: I0311 01:43:18.124093 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-mlwkf"] Mar 11 01:43:18 crc kubenswrapper[4744]: E0311 01:43:18.124953 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="98512824-027f-4750-9ab9-10ca0b2b598c" containerName="extract-utilities" Mar 11 01:43:18 crc kubenswrapper[4744]: I0311 01:43:18.124979 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="98512824-027f-4750-9ab9-10ca0b2b598c" containerName="extract-utilities" Mar 11 01:43:18 crc kubenswrapper[4744]: E0311 01:43:18.125015 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="98512824-027f-4750-9ab9-10ca0b2b598c" containerName="extract-content" Mar 11 01:43:18 crc kubenswrapper[4744]: I0311 01:43:18.125028 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="98512824-027f-4750-9ab9-10ca0b2b598c" containerName="extract-content" Mar 11 01:43:18 crc kubenswrapper[4744]: E0311 01:43:18.125060 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0888068d-7429-4c76-8a78-5c994a15b419" containerName="oc" Mar 11 01:43:18 crc kubenswrapper[4744]: I0311 01:43:18.125073 4744 state_mem.go:107] 
"Deleted CPUSet assignment" podUID="0888068d-7429-4c76-8a78-5c994a15b419" containerName="oc" Mar 11 01:43:18 crc kubenswrapper[4744]: E0311 01:43:18.125087 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="98512824-027f-4750-9ab9-10ca0b2b598c" containerName="registry-server" Mar 11 01:43:18 crc kubenswrapper[4744]: I0311 01:43:18.125099 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="98512824-027f-4750-9ab9-10ca0b2b598c" containerName="registry-server" Mar 11 01:43:18 crc kubenswrapper[4744]: I0311 01:43:18.125335 4744 memory_manager.go:354] "RemoveStaleState removing state" podUID="98512824-027f-4750-9ab9-10ca0b2b598c" containerName="registry-server" Mar 11 01:43:18 crc kubenswrapper[4744]: I0311 01:43:18.125370 4744 memory_manager.go:354] "RemoveStaleState removing state" podUID="0888068d-7429-4c76-8a78-5c994a15b419" containerName="oc" Mar 11 01:43:18 crc kubenswrapper[4744]: I0311 01:43:18.127024 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-mlwkf" Mar 11 01:43:18 crc kubenswrapper[4744]: I0311 01:43:18.160210 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-mlwkf"] Mar 11 01:43:18 crc kubenswrapper[4744]: I0311 01:43:18.305555 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4kf67\" (UniqueName: \"kubernetes.io/projected/db464b17-3e7d-4e22-9fa8-3b16ad7d43dc-kube-api-access-4kf67\") pod \"community-operators-mlwkf\" (UID: \"db464b17-3e7d-4e22-9fa8-3b16ad7d43dc\") " pod="openshift-marketplace/community-operators-mlwkf" Mar 11 01:43:18 crc kubenswrapper[4744]: I0311 01:43:18.305624 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/db464b17-3e7d-4e22-9fa8-3b16ad7d43dc-utilities\") pod \"community-operators-mlwkf\" (UID: 
\"db464b17-3e7d-4e22-9fa8-3b16ad7d43dc\") " pod="openshift-marketplace/community-operators-mlwkf" Mar 11 01:43:18 crc kubenswrapper[4744]: I0311 01:43:18.305656 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/db464b17-3e7d-4e22-9fa8-3b16ad7d43dc-catalog-content\") pod \"community-operators-mlwkf\" (UID: \"db464b17-3e7d-4e22-9fa8-3b16ad7d43dc\") " pod="openshift-marketplace/community-operators-mlwkf" Mar 11 01:43:18 crc kubenswrapper[4744]: I0311 01:43:18.406767 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/db464b17-3e7d-4e22-9fa8-3b16ad7d43dc-utilities\") pod \"community-operators-mlwkf\" (UID: \"db464b17-3e7d-4e22-9fa8-3b16ad7d43dc\") " pod="openshift-marketplace/community-operators-mlwkf" Mar 11 01:43:18 crc kubenswrapper[4744]: I0311 01:43:18.406848 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/db464b17-3e7d-4e22-9fa8-3b16ad7d43dc-catalog-content\") pod \"community-operators-mlwkf\" (UID: \"db464b17-3e7d-4e22-9fa8-3b16ad7d43dc\") " pod="openshift-marketplace/community-operators-mlwkf" Mar 11 01:43:18 crc kubenswrapper[4744]: I0311 01:43:18.407009 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4kf67\" (UniqueName: \"kubernetes.io/projected/db464b17-3e7d-4e22-9fa8-3b16ad7d43dc-kube-api-access-4kf67\") pod \"community-operators-mlwkf\" (UID: \"db464b17-3e7d-4e22-9fa8-3b16ad7d43dc\") " pod="openshift-marketplace/community-operators-mlwkf" Mar 11 01:43:18 crc kubenswrapper[4744]: I0311 01:43:18.407712 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/db464b17-3e7d-4e22-9fa8-3b16ad7d43dc-utilities\") pod \"community-operators-mlwkf\" (UID: 
\"db464b17-3e7d-4e22-9fa8-3b16ad7d43dc\") " pod="openshift-marketplace/community-operators-mlwkf" Mar 11 01:43:18 crc kubenswrapper[4744]: I0311 01:43:18.407904 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/db464b17-3e7d-4e22-9fa8-3b16ad7d43dc-catalog-content\") pod \"community-operators-mlwkf\" (UID: \"db464b17-3e7d-4e22-9fa8-3b16ad7d43dc\") " pod="openshift-marketplace/community-operators-mlwkf" Mar 11 01:43:18 crc kubenswrapper[4744]: I0311 01:43:18.426443 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4kf67\" (UniqueName: \"kubernetes.io/projected/db464b17-3e7d-4e22-9fa8-3b16ad7d43dc-kube-api-access-4kf67\") pod \"community-operators-mlwkf\" (UID: \"db464b17-3e7d-4e22-9fa8-3b16ad7d43dc\") " pod="openshift-marketplace/community-operators-mlwkf" Mar 11 01:43:18 crc kubenswrapper[4744]: I0311 01:43:18.458287 4744 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-mlwkf" Mar 11 01:43:18 crc kubenswrapper[4744]: I0311 01:43:18.783399 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-mlwkf"] Mar 11 01:43:19 crc kubenswrapper[4744]: I0311 01:43:19.085611 4744 generic.go:334] "Generic (PLEG): container finished" podID="db464b17-3e7d-4e22-9fa8-3b16ad7d43dc" containerID="28455b36bb25d8768411f5c1b9bfcead1bd18d311d616c7b6d5da353840e6736" exitCode=0 Mar 11 01:43:19 crc kubenswrapper[4744]: I0311 01:43:19.085674 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-mlwkf" event={"ID":"db464b17-3e7d-4e22-9fa8-3b16ad7d43dc","Type":"ContainerDied","Data":"28455b36bb25d8768411f5c1b9bfcead1bd18d311d616c7b6d5da353840e6736"} Mar 11 01:43:19 crc kubenswrapper[4744]: I0311 01:43:19.085888 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-mlwkf" event={"ID":"db464b17-3e7d-4e22-9fa8-3b16ad7d43dc","Type":"ContainerStarted","Data":"066ddfcd614305d8c39afac12b685fabbc331f0bb648d9753e0cc18ee41f4ec0"} Mar 11 01:43:19 crc kubenswrapper[4744]: I0311 01:43:19.087918 4744 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 11 01:43:21 crc kubenswrapper[4744]: I0311 01:43:21.106303 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-mlwkf" event={"ID":"db464b17-3e7d-4e22-9fa8-3b16ad7d43dc","Type":"ContainerStarted","Data":"330cc92cdccb045dccf28eabd1e951bc46a471bb5a4a07c23be013f8138c1a70"} Mar 11 01:43:22 crc kubenswrapper[4744]: I0311 01:43:22.114647 4744 generic.go:334] "Generic (PLEG): container finished" podID="db464b17-3e7d-4e22-9fa8-3b16ad7d43dc" containerID="330cc92cdccb045dccf28eabd1e951bc46a471bb5a4a07c23be013f8138c1a70" exitCode=0 Mar 11 01:43:22 crc kubenswrapper[4744]: I0311 01:43:22.114699 4744 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openshift-marketplace/community-operators-mlwkf" event={"ID":"db464b17-3e7d-4e22-9fa8-3b16ad7d43dc","Type":"ContainerDied","Data":"330cc92cdccb045dccf28eabd1e951bc46a471bb5a4a07c23be013f8138c1a70"} Mar 11 01:43:23 crc kubenswrapper[4744]: I0311 01:43:23.129569 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-mlwkf" event={"ID":"db464b17-3e7d-4e22-9fa8-3b16ad7d43dc","Type":"ContainerStarted","Data":"1c719221267682e75e5bf1fd595e1d307f61e4e1b6fb7a0963b51affdce61e05"} Mar 11 01:43:26 crc kubenswrapper[4744]: I0311 01:43:26.975785 4744 scope.go:117] "RemoveContainer" containerID="18b99f474de9d4211cad3e91ba128e1f205b77df847433803614c3a4dec5d0d1" Mar 11 01:43:26 crc kubenswrapper[4744]: E0311 01:43:26.976180 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-678nx_openshift-machine-config-operator(a15dc7ac-7c34-4135-b6eb-a85122800ce9)\"" pod="openshift-machine-config-operator/machine-config-daemon-678nx" podUID="a15dc7ac-7c34-4135-b6eb-a85122800ce9" Mar 11 01:43:28 crc kubenswrapper[4744]: I0311 01:43:28.458673 4744 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-mlwkf" Mar 11 01:43:28 crc kubenswrapper[4744]: I0311 01:43:28.458992 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-mlwkf" Mar 11 01:43:28 crc kubenswrapper[4744]: I0311 01:43:28.540836 4744 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-mlwkf" Mar 11 01:43:28 crc kubenswrapper[4744]: I0311 01:43:28.572430 4744 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-mlwkf" 
podStartSLOduration=6.878734676 podStartE2EDuration="10.572405499s" podCreationTimestamp="2026-03-11 01:43:18 +0000 UTC" firstStartedPulling="2026-03-11 01:43:19.087531862 +0000 UTC m=+2955.891749507" lastFinishedPulling="2026-03-11 01:43:22.781202715 +0000 UTC m=+2959.585420330" observedRunningTime="2026-03-11 01:43:23.160476234 +0000 UTC m=+2959.964693849" watchObservedRunningTime="2026-03-11 01:43:28.572405499 +0000 UTC m=+2965.376623134" Mar 11 01:43:29 crc kubenswrapper[4744]: I0311 01:43:29.243055 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-mlwkf" Mar 11 01:43:29 crc kubenswrapper[4744]: I0311 01:43:29.320936 4744 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-mlwkf"] Mar 11 01:43:31 crc kubenswrapper[4744]: I0311 01:43:31.204563 4744 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-mlwkf" podUID="db464b17-3e7d-4e22-9fa8-3b16ad7d43dc" containerName="registry-server" containerID="cri-o://1c719221267682e75e5bf1fd595e1d307f61e4e1b6fb7a0963b51affdce61e05" gracePeriod=2 Mar 11 01:43:31 crc kubenswrapper[4744]: I0311 01:43:31.593531 4744 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-mlwkf" Mar 11 01:43:31 crc kubenswrapper[4744]: I0311 01:43:31.735296 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/db464b17-3e7d-4e22-9fa8-3b16ad7d43dc-catalog-content\") pod \"db464b17-3e7d-4e22-9fa8-3b16ad7d43dc\" (UID: \"db464b17-3e7d-4e22-9fa8-3b16ad7d43dc\") " Mar 11 01:43:31 crc kubenswrapper[4744]: I0311 01:43:31.735481 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4kf67\" (UniqueName: \"kubernetes.io/projected/db464b17-3e7d-4e22-9fa8-3b16ad7d43dc-kube-api-access-4kf67\") pod \"db464b17-3e7d-4e22-9fa8-3b16ad7d43dc\" (UID: \"db464b17-3e7d-4e22-9fa8-3b16ad7d43dc\") " Mar 11 01:43:31 crc kubenswrapper[4744]: I0311 01:43:31.735638 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/db464b17-3e7d-4e22-9fa8-3b16ad7d43dc-utilities\") pod \"db464b17-3e7d-4e22-9fa8-3b16ad7d43dc\" (UID: \"db464b17-3e7d-4e22-9fa8-3b16ad7d43dc\") " Mar 11 01:43:31 crc kubenswrapper[4744]: I0311 01:43:31.737046 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/db464b17-3e7d-4e22-9fa8-3b16ad7d43dc-utilities" (OuterVolumeSpecName: "utilities") pod "db464b17-3e7d-4e22-9fa8-3b16ad7d43dc" (UID: "db464b17-3e7d-4e22-9fa8-3b16ad7d43dc"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 11 01:43:31 crc kubenswrapper[4744]: I0311 01:43:31.744029 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/db464b17-3e7d-4e22-9fa8-3b16ad7d43dc-kube-api-access-4kf67" (OuterVolumeSpecName: "kube-api-access-4kf67") pod "db464b17-3e7d-4e22-9fa8-3b16ad7d43dc" (UID: "db464b17-3e7d-4e22-9fa8-3b16ad7d43dc"). InnerVolumeSpecName "kube-api-access-4kf67". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 01:43:31 crc kubenswrapper[4744]: I0311 01:43:31.825753 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/db464b17-3e7d-4e22-9fa8-3b16ad7d43dc-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "db464b17-3e7d-4e22-9fa8-3b16ad7d43dc" (UID: "db464b17-3e7d-4e22-9fa8-3b16ad7d43dc"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 11 01:43:31 crc kubenswrapper[4744]: I0311 01:43:31.838216 4744 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4kf67\" (UniqueName: \"kubernetes.io/projected/db464b17-3e7d-4e22-9fa8-3b16ad7d43dc-kube-api-access-4kf67\") on node \"crc\" DevicePath \"\"" Mar 11 01:43:31 crc kubenswrapper[4744]: I0311 01:43:31.838266 4744 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/db464b17-3e7d-4e22-9fa8-3b16ad7d43dc-utilities\") on node \"crc\" DevicePath \"\"" Mar 11 01:43:31 crc kubenswrapper[4744]: I0311 01:43:31.838286 4744 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/db464b17-3e7d-4e22-9fa8-3b16ad7d43dc-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 11 01:43:32 crc kubenswrapper[4744]: I0311 01:43:32.217405 4744 generic.go:334] "Generic (PLEG): container finished" podID="db464b17-3e7d-4e22-9fa8-3b16ad7d43dc" containerID="1c719221267682e75e5bf1fd595e1d307f61e4e1b6fb7a0963b51affdce61e05" exitCode=0 Mar 11 01:43:32 crc kubenswrapper[4744]: I0311 01:43:32.217483 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-mlwkf" event={"ID":"db464b17-3e7d-4e22-9fa8-3b16ad7d43dc","Type":"ContainerDied","Data":"1c719221267682e75e5bf1fd595e1d307f61e4e1b6fb7a0963b51affdce61e05"} Mar 11 01:43:32 crc kubenswrapper[4744]: I0311 01:43:32.217616 4744 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openshift-marketplace/community-operators-mlwkf" event={"ID":"db464b17-3e7d-4e22-9fa8-3b16ad7d43dc","Type":"ContainerDied","Data":"066ddfcd614305d8c39afac12b685fabbc331f0bb648d9753e0cc18ee41f4ec0"} Mar 11 01:43:32 crc kubenswrapper[4744]: I0311 01:43:32.217542 4744 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-mlwkf" Mar 11 01:43:32 crc kubenswrapper[4744]: I0311 01:43:32.217663 4744 scope.go:117] "RemoveContainer" containerID="1c719221267682e75e5bf1fd595e1d307f61e4e1b6fb7a0963b51affdce61e05" Mar 11 01:43:32 crc kubenswrapper[4744]: I0311 01:43:32.259325 4744 scope.go:117] "RemoveContainer" containerID="330cc92cdccb045dccf28eabd1e951bc46a471bb5a4a07c23be013f8138c1a70" Mar 11 01:43:32 crc kubenswrapper[4744]: I0311 01:43:32.268093 4744 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-mlwkf"] Mar 11 01:43:32 crc kubenswrapper[4744]: I0311 01:43:32.287433 4744 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-mlwkf"] Mar 11 01:43:32 crc kubenswrapper[4744]: I0311 01:43:32.294039 4744 scope.go:117] "RemoveContainer" containerID="28455b36bb25d8768411f5c1b9bfcead1bd18d311d616c7b6d5da353840e6736" Mar 11 01:43:32 crc kubenswrapper[4744]: I0311 01:43:32.331225 4744 scope.go:117] "RemoveContainer" containerID="1c719221267682e75e5bf1fd595e1d307f61e4e1b6fb7a0963b51affdce61e05" Mar 11 01:43:32 crc kubenswrapper[4744]: E0311 01:43:32.334035 4744 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1c719221267682e75e5bf1fd595e1d307f61e4e1b6fb7a0963b51affdce61e05\": container with ID starting with 1c719221267682e75e5bf1fd595e1d307f61e4e1b6fb7a0963b51affdce61e05 not found: ID does not exist" containerID="1c719221267682e75e5bf1fd595e1d307f61e4e1b6fb7a0963b51affdce61e05" Mar 11 01:43:32 crc kubenswrapper[4744]: I0311 
01:43:32.334072 4744 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1c719221267682e75e5bf1fd595e1d307f61e4e1b6fb7a0963b51affdce61e05"} err="failed to get container status \"1c719221267682e75e5bf1fd595e1d307f61e4e1b6fb7a0963b51affdce61e05\": rpc error: code = NotFound desc = could not find container \"1c719221267682e75e5bf1fd595e1d307f61e4e1b6fb7a0963b51affdce61e05\": container with ID starting with 1c719221267682e75e5bf1fd595e1d307f61e4e1b6fb7a0963b51affdce61e05 not found: ID does not exist" Mar 11 01:43:32 crc kubenswrapper[4744]: I0311 01:43:32.334098 4744 scope.go:117] "RemoveContainer" containerID="330cc92cdccb045dccf28eabd1e951bc46a471bb5a4a07c23be013f8138c1a70" Mar 11 01:43:32 crc kubenswrapper[4744]: E0311 01:43:32.334625 4744 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"330cc92cdccb045dccf28eabd1e951bc46a471bb5a4a07c23be013f8138c1a70\": container with ID starting with 330cc92cdccb045dccf28eabd1e951bc46a471bb5a4a07c23be013f8138c1a70 not found: ID does not exist" containerID="330cc92cdccb045dccf28eabd1e951bc46a471bb5a4a07c23be013f8138c1a70" Mar 11 01:43:32 crc kubenswrapper[4744]: I0311 01:43:32.334683 4744 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"330cc92cdccb045dccf28eabd1e951bc46a471bb5a4a07c23be013f8138c1a70"} err="failed to get container status \"330cc92cdccb045dccf28eabd1e951bc46a471bb5a4a07c23be013f8138c1a70\": rpc error: code = NotFound desc = could not find container \"330cc92cdccb045dccf28eabd1e951bc46a471bb5a4a07c23be013f8138c1a70\": container with ID starting with 330cc92cdccb045dccf28eabd1e951bc46a471bb5a4a07c23be013f8138c1a70 not found: ID does not exist" Mar 11 01:43:32 crc kubenswrapper[4744]: I0311 01:43:32.334724 4744 scope.go:117] "RemoveContainer" containerID="28455b36bb25d8768411f5c1b9bfcead1bd18d311d616c7b6d5da353840e6736" Mar 11 01:43:32 crc 
kubenswrapper[4744]: E0311 01:43:32.335084 4744 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"28455b36bb25d8768411f5c1b9bfcead1bd18d311d616c7b6d5da353840e6736\": container with ID starting with 28455b36bb25d8768411f5c1b9bfcead1bd18d311d616c7b6d5da353840e6736 not found: ID does not exist" containerID="28455b36bb25d8768411f5c1b9bfcead1bd18d311d616c7b6d5da353840e6736" Mar 11 01:43:32 crc kubenswrapper[4744]: I0311 01:43:32.335109 4744 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"28455b36bb25d8768411f5c1b9bfcead1bd18d311d616c7b6d5da353840e6736"} err="failed to get container status \"28455b36bb25d8768411f5c1b9bfcead1bd18d311d616c7b6d5da353840e6736\": rpc error: code = NotFound desc = could not find container \"28455b36bb25d8768411f5c1b9bfcead1bd18d311d616c7b6d5da353840e6736\": container with ID starting with 28455b36bb25d8768411f5c1b9bfcead1bd18d311d616c7b6d5da353840e6736 not found: ID does not exist" Mar 11 01:43:33 crc kubenswrapper[4744]: I0311 01:43:33.989045 4744 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="db464b17-3e7d-4e22-9fa8-3b16ad7d43dc" path="/var/lib/kubelet/pods/db464b17-3e7d-4e22-9fa8-3b16ad7d43dc/volumes" Mar 11 01:43:39 crc kubenswrapper[4744]: I0311 01:43:39.975606 4744 scope.go:117] "RemoveContainer" containerID="18b99f474de9d4211cad3e91ba128e1f205b77df847433803614c3a4dec5d0d1" Mar 11 01:43:39 crc kubenswrapper[4744]: E0311 01:43:39.976442 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-678nx_openshift-machine-config-operator(a15dc7ac-7c34-4135-b6eb-a85122800ce9)\"" pod="openshift-machine-config-operator/machine-config-daemon-678nx" podUID="a15dc7ac-7c34-4135-b6eb-a85122800ce9" Mar 11 01:43:50 crc 
kubenswrapper[4744]: I0311 01:43:50.975553 4744 scope.go:117] "RemoveContainer" containerID="18b99f474de9d4211cad3e91ba128e1f205b77df847433803614c3a4dec5d0d1" Mar 11 01:43:50 crc kubenswrapper[4744]: E0311 01:43:50.976504 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-678nx_openshift-machine-config-operator(a15dc7ac-7c34-4135-b6eb-a85122800ce9)\"" pod="openshift-machine-config-operator/machine-config-daemon-678nx" podUID="a15dc7ac-7c34-4135-b6eb-a85122800ce9" Mar 11 01:44:00 crc kubenswrapper[4744]: I0311 01:44:00.162679 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29553224-4tpwf"] Mar 11 01:44:00 crc kubenswrapper[4744]: E0311 01:44:00.163805 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="db464b17-3e7d-4e22-9fa8-3b16ad7d43dc" containerName="extract-utilities" Mar 11 01:44:00 crc kubenswrapper[4744]: I0311 01:44:00.163827 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="db464b17-3e7d-4e22-9fa8-3b16ad7d43dc" containerName="extract-utilities" Mar 11 01:44:00 crc kubenswrapper[4744]: E0311 01:44:00.163866 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="db464b17-3e7d-4e22-9fa8-3b16ad7d43dc" containerName="registry-server" Mar 11 01:44:00 crc kubenswrapper[4744]: I0311 01:44:00.163878 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="db464b17-3e7d-4e22-9fa8-3b16ad7d43dc" containerName="registry-server" Mar 11 01:44:00 crc kubenswrapper[4744]: E0311 01:44:00.163918 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="db464b17-3e7d-4e22-9fa8-3b16ad7d43dc" containerName="extract-content" Mar 11 01:44:00 crc kubenswrapper[4744]: I0311 01:44:00.163931 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="db464b17-3e7d-4e22-9fa8-3b16ad7d43dc" 
containerName="extract-content" Mar 11 01:44:00 crc kubenswrapper[4744]: I0311 01:44:00.164182 4744 memory_manager.go:354] "RemoveStaleState removing state" podUID="db464b17-3e7d-4e22-9fa8-3b16ad7d43dc" containerName="registry-server" Mar 11 01:44:00 crc kubenswrapper[4744]: I0311 01:44:00.164953 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29553224-4tpwf" Mar 11 01:44:00 crc kubenswrapper[4744]: I0311 01:44:00.168208 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 11 01:44:00 crc kubenswrapper[4744]: I0311 01:44:00.169219 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 11 01:44:00 crc kubenswrapper[4744]: I0311 01:44:00.171354 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-rs77s" Mar 11 01:44:00 crc kubenswrapper[4744]: I0311 01:44:00.173212 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29553224-4tpwf"] Mar 11 01:44:00 crc kubenswrapper[4744]: I0311 01:44:00.205647 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4wtvd\" (UniqueName: \"kubernetes.io/projected/aa853d2f-bfde-4693-8051-3fe00e0765c4-kube-api-access-4wtvd\") pod \"auto-csr-approver-29553224-4tpwf\" (UID: \"aa853d2f-bfde-4693-8051-3fe00e0765c4\") " pod="openshift-infra/auto-csr-approver-29553224-4tpwf" Mar 11 01:44:00 crc kubenswrapper[4744]: I0311 01:44:00.307382 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4wtvd\" (UniqueName: \"kubernetes.io/projected/aa853d2f-bfde-4693-8051-3fe00e0765c4-kube-api-access-4wtvd\") pod \"auto-csr-approver-29553224-4tpwf\" (UID: \"aa853d2f-bfde-4693-8051-3fe00e0765c4\") " pod="openshift-infra/auto-csr-approver-29553224-4tpwf" Mar 11 
01:44:00 crc kubenswrapper[4744]: I0311 01:44:00.331548 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4wtvd\" (UniqueName: \"kubernetes.io/projected/aa853d2f-bfde-4693-8051-3fe00e0765c4-kube-api-access-4wtvd\") pod \"auto-csr-approver-29553224-4tpwf\" (UID: \"aa853d2f-bfde-4693-8051-3fe00e0765c4\") " pod="openshift-infra/auto-csr-approver-29553224-4tpwf" Mar 11 01:44:00 crc kubenswrapper[4744]: I0311 01:44:00.503594 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29553224-4tpwf" Mar 11 01:44:01 crc kubenswrapper[4744]: I0311 01:44:01.029837 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29553224-4tpwf"] Mar 11 01:44:01 crc kubenswrapper[4744]: I0311 01:44:01.475005 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29553224-4tpwf" event={"ID":"aa853d2f-bfde-4693-8051-3fe00e0765c4","Type":"ContainerStarted","Data":"033a38820ba8fa92f887fef1a9cf56d3a8394f81684a192d8bcbaf05d42e6869"} Mar 11 01:44:02 crc kubenswrapper[4744]: I0311 01:44:02.499081 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29553224-4tpwf" event={"ID":"aa853d2f-bfde-4693-8051-3fe00e0765c4","Type":"ContainerStarted","Data":"a73fd894b6aa001e8235162745d5db82e6d564ea8b8969148854f1fdc2a01287"} Mar 11 01:44:02 crc kubenswrapper[4744]: I0311 01:44:02.522877 4744 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29553224-4tpwf" podStartSLOduration=1.56953019 podStartE2EDuration="2.522844954s" podCreationTimestamp="2026-03-11 01:44:00 +0000 UTC" firstStartedPulling="2026-03-11 01:44:01.03734913 +0000 UTC m=+2997.841566765" lastFinishedPulling="2026-03-11 01:44:01.990663884 +0000 UTC m=+2998.794881529" observedRunningTime="2026-03-11 01:44:02.515791011 +0000 UTC m=+2999.320008626" 
watchObservedRunningTime="2026-03-11 01:44:02.522844954 +0000 UTC m=+2999.327062609" Mar 11 01:44:02 crc kubenswrapper[4744]: I0311 01:44:02.974249 4744 scope.go:117] "RemoveContainer" containerID="18b99f474de9d4211cad3e91ba128e1f205b77df847433803614c3a4dec5d0d1" Mar 11 01:44:02 crc kubenswrapper[4744]: E0311 01:44:02.974455 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-678nx_openshift-machine-config-operator(a15dc7ac-7c34-4135-b6eb-a85122800ce9)\"" pod="openshift-machine-config-operator/machine-config-daemon-678nx" podUID="a15dc7ac-7c34-4135-b6eb-a85122800ce9" Mar 11 01:44:03 crc kubenswrapper[4744]: I0311 01:44:03.512382 4744 generic.go:334] "Generic (PLEG): container finished" podID="aa853d2f-bfde-4693-8051-3fe00e0765c4" containerID="a73fd894b6aa001e8235162745d5db82e6d564ea8b8969148854f1fdc2a01287" exitCode=0 Mar 11 01:44:03 crc kubenswrapper[4744]: I0311 01:44:03.512442 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29553224-4tpwf" event={"ID":"aa853d2f-bfde-4693-8051-3fe00e0765c4","Type":"ContainerDied","Data":"a73fd894b6aa001e8235162745d5db82e6d564ea8b8969148854f1fdc2a01287"} Mar 11 01:44:04 crc kubenswrapper[4744]: I0311 01:44:04.838544 4744 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29553224-4tpwf" Mar 11 01:44:05 crc kubenswrapper[4744]: I0311 01:44:05.012107 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4wtvd\" (UniqueName: \"kubernetes.io/projected/aa853d2f-bfde-4693-8051-3fe00e0765c4-kube-api-access-4wtvd\") pod \"aa853d2f-bfde-4693-8051-3fe00e0765c4\" (UID: \"aa853d2f-bfde-4693-8051-3fe00e0765c4\") " Mar 11 01:44:05 crc kubenswrapper[4744]: I0311 01:44:05.019981 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/aa853d2f-bfde-4693-8051-3fe00e0765c4-kube-api-access-4wtvd" (OuterVolumeSpecName: "kube-api-access-4wtvd") pod "aa853d2f-bfde-4693-8051-3fe00e0765c4" (UID: "aa853d2f-bfde-4693-8051-3fe00e0765c4"). InnerVolumeSpecName "kube-api-access-4wtvd". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 01:44:05 crc kubenswrapper[4744]: I0311 01:44:05.114761 4744 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4wtvd\" (UniqueName: \"kubernetes.io/projected/aa853d2f-bfde-4693-8051-3fe00e0765c4-kube-api-access-4wtvd\") on node \"crc\" DevicePath \"\"" Mar 11 01:44:05 crc kubenswrapper[4744]: I0311 01:44:05.533937 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29553224-4tpwf" event={"ID":"aa853d2f-bfde-4693-8051-3fe00e0765c4","Type":"ContainerDied","Data":"033a38820ba8fa92f887fef1a9cf56d3a8394f81684a192d8bcbaf05d42e6869"} Mar 11 01:44:05 crc kubenswrapper[4744]: I0311 01:44:05.534573 4744 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="033a38820ba8fa92f887fef1a9cf56d3a8394f81684a192d8bcbaf05d42e6869" Mar 11 01:44:05 crc kubenswrapper[4744]: I0311 01:44:05.533970 4744 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29553224-4tpwf" Mar 11 01:44:05 crc kubenswrapper[4744]: I0311 01:44:05.601570 4744 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29553218-xvms7"] Mar 11 01:44:05 crc kubenswrapper[4744]: I0311 01:44:05.612137 4744 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29553218-xvms7"] Mar 11 01:44:06 crc kubenswrapper[4744]: I0311 01:44:06.011440 4744 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="27a28442-2555-46ff-9a86-a8fe2e583e5e" path="/var/lib/kubelet/pods/27a28442-2555-46ff-9a86-a8fe2e583e5e/volumes" Mar 11 01:44:12 crc kubenswrapper[4744]: I0311 01:44:12.394262 4744 scope.go:117] "RemoveContainer" containerID="d78a15f9940a6fd40154c6e1f66ce64b190f37f3669b4e84772a754903a34f60" Mar 11 01:44:14 crc kubenswrapper[4744]: I0311 01:44:14.974785 4744 scope.go:117] "RemoveContainer" containerID="18b99f474de9d4211cad3e91ba128e1f205b77df847433803614c3a4dec5d0d1" Mar 11 01:44:14 crc kubenswrapper[4744]: E0311 01:44:14.975206 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-678nx_openshift-machine-config-operator(a15dc7ac-7c34-4135-b6eb-a85122800ce9)\"" pod="openshift-machine-config-operator/machine-config-daemon-678nx" podUID="a15dc7ac-7c34-4135-b6eb-a85122800ce9" Mar 11 01:44:29 crc kubenswrapper[4744]: I0311 01:44:29.974867 4744 scope.go:117] "RemoveContainer" containerID="18b99f474de9d4211cad3e91ba128e1f205b77df847433803614c3a4dec5d0d1" Mar 11 01:44:29 crc kubenswrapper[4744]: E0311 01:44:29.977055 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-678nx_openshift-machine-config-operator(a15dc7ac-7c34-4135-b6eb-a85122800ce9)\"" pod="openshift-machine-config-operator/machine-config-daemon-678nx" podUID="a15dc7ac-7c34-4135-b6eb-a85122800ce9" Mar 11 01:44:44 crc kubenswrapper[4744]: I0311 01:44:44.975783 4744 scope.go:117] "RemoveContainer" containerID="18b99f474de9d4211cad3e91ba128e1f205b77df847433803614c3a4dec5d0d1" Mar 11 01:44:44 crc kubenswrapper[4744]: E0311 01:44:44.976668 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-678nx_openshift-machine-config-operator(a15dc7ac-7c34-4135-b6eb-a85122800ce9)\"" pod="openshift-machine-config-operator/machine-config-daemon-678nx" podUID="a15dc7ac-7c34-4135-b6eb-a85122800ce9" Mar 11 01:44:57 crc kubenswrapper[4744]: I0311 01:44:57.975239 4744 scope.go:117] "RemoveContainer" containerID="18b99f474de9d4211cad3e91ba128e1f205b77df847433803614c3a4dec5d0d1" Mar 11 01:44:57 crc kubenswrapper[4744]: E0311 01:44:57.976319 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-678nx_openshift-machine-config-operator(a15dc7ac-7c34-4135-b6eb-a85122800ce9)\"" pod="openshift-machine-config-operator/machine-config-daemon-678nx" podUID="a15dc7ac-7c34-4135-b6eb-a85122800ce9" Mar 11 01:45:00 crc kubenswrapper[4744]: I0311 01:45:00.178291 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29553225-5xsk2"] Mar 11 01:45:00 crc kubenswrapper[4744]: E0311 01:45:00.178845 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="aa853d2f-bfde-4693-8051-3fe00e0765c4" containerName="oc" Mar 11 01:45:00 crc kubenswrapper[4744]: I0311 
01:45:00.178871 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="aa853d2f-bfde-4693-8051-3fe00e0765c4" containerName="oc" Mar 11 01:45:00 crc kubenswrapper[4744]: I0311 01:45:00.179139 4744 memory_manager.go:354] "RemoveStaleState removing state" podUID="aa853d2f-bfde-4693-8051-3fe00e0765c4" containerName="oc" Mar 11 01:45:00 crc kubenswrapper[4744]: I0311 01:45:00.179860 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29553225-5xsk2" Mar 11 01:45:00 crc kubenswrapper[4744]: I0311 01:45:00.183587 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Mar 11 01:45:00 crc kubenswrapper[4744]: I0311 01:45:00.185414 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Mar 11 01:45:00 crc kubenswrapper[4744]: I0311 01:45:00.189766 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29553225-5xsk2"] Mar 11 01:45:00 crc kubenswrapper[4744]: I0311 01:45:00.271044 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7gl7p\" (UniqueName: \"kubernetes.io/projected/10a8af84-d779-4bd2-860d-adc73f3b225f-kube-api-access-7gl7p\") pod \"collect-profiles-29553225-5xsk2\" (UID: \"10a8af84-d779-4bd2-860d-adc73f3b225f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29553225-5xsk2" Mar 11 01:45:00 crc kubenswrapper[4744]: I0311 01:45:00.271113 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/10a8af84-d779-4bd2-860d-adc73f3b225f-config-volume\") pod \"collect-profiles-29553225-5xsk2\" (UID: \"10a8af84-d779-4bd2-860d-adc73f3b225f\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29553225-5xsk2" Mar 11 01:45:00 crc kubenswrapper[4744]: I0311 01:45:00.271303 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/10a8af84-d779-4bd2-860d-adc73f3b225f-secret-volume\") pod \"collect-profiles-29553225-5xsk2\" (UID: \"10a8af84-d779-4bd2-860d-adc73f3b225f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29553225-5xsk2" Mar 11 01:45:00 crc kubenswrapper[4744]: I0311 01:45:00.373824 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7gl7p\" (UniqueName: \"kubernetes.io/projected/10a8af84-d779-4bd2-860d-adc73f3b225f-kube-api-access-7gl7p\") pod \"collect-profiles-29553225-5xsk2\" (UID: \"10a8af84-d779-4bd2-860d-adc73f3b225f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29553225-5xsk2" Mar 11 01:45:00 crc kubenswrapper[4744]: I0311 01:45:00.373914 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/10a8af84-d779-4bd2-860d-adc73f3b225f-config-volume\") pod \"collect-profiles-29553225-5xsk2\" (UID: \"10a8af84-d779-4bd2-860d-adc73f3b225f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29553225-5xsk2" Mar 11 01:45:00 crc kubenswrapper[4744]: I0311 01:45:00.374018 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/10a8af84-d779-4bd2-860d-adc73f3b225f-secret-volume\") pod \"collect-profiles-29553225-5xsk2\" (UID: \"10a8af84-d779-4bd2-860d-adc73f3b225f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29553225-5xsk2" Mar 11 01:45:00 crc kubenswrapper[4744]: I0311 01:45:00.380193 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: 
\"kubernetes.io/configmap/10a8af84-d779-4bd2-860d-adc73f3b225f-config-volume\") pod \"collect-profiles-29553225-5xsk2\" (UID: \"10a8af84-d779-4bd2-860d-adc73f3b225f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29553225-5xsk2" Mar 11 01:45:00 crc kubenswrapper[4744]: I0311 01:45:00.393426 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/10a8af84-d779-4bd2-860d-adc73f3b225f-secret-volume\") pod \"collect-profiles-29553225-5xsk2\" (UID: \"10a8af84-d779-4bd2-860d-adc73f3b225f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29553225-5xsk2" Mar 11 01:45:00 crc kubenswrapper[4744]: I0311 01:45:00.414562 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7gl7p\" (UniqueName: \"kubernetes.io/projected/10a8af84-d779-4bd2-860d-adc73f3b225f-kube-api-access-7gl7p\") pod \"collect-profiles-29553225-5xsk2\" (UID: \"10a8af84-d779-4bd2-860d-adc73f3b225f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29553225-5xsk2" Mar 11 01:45:00 crc kubenswrapper[4744]: I0311 01:45:00.510492 4744 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29553225-5xsk2" Mar 11 01:45:00 crc kubenswrapper[4744]: I0311 01:45:00.998277 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29553225-5xsk2"] Mar 11 01:45:01 crc kubenswrapper[4744]: I0311 01:45:01.066243 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29553225-5xsk2" event={"ID":"10a8af84-d779-4bd2-860d-adc73f3b225f","Type":"ContainerStarted","Data":"a0d822d1b355acfac354637995b135e6d2e4452b34c219550adc5b137ce9d7ee"} Mar 11 01:45:02 crc kubenswrapper[4744]: I0311 01:45:02.078676 4744 generic.go:334] "Generic (PLEG): container finished" podID="10a8af84-d779-4bd2-860d-adc73f3b225f" containerID="1bba16946fdfb8a679db4d8e1f701b175df943377be9bd0206938c6bc0d528e1" exitCode=0 Mar 11 01:45:02 crc kubenswrapper[4744]: I0311 01:45:02.078740 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29553225-5xsk2" event={"ID":"10a8af84-d779-4bd2-860d-adc73f3b225f","Type":"ContainerDied","Data":"1bba16946fdfb8a679db4d8e1f701b175df943377be9bd0206938c6bc0d528e1"} Mar 11 01:45:03 crc kubenswrapper[4744]: I0311 01:45:03.544933 4744 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29553225-5xsk2" Mar 11 01:45:03 crc kubenswrapper[4744]: I0311 01:45:03.625071 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7gl7p\" (UniqueName: \"kubernetes.io/projected/10a8af84-d779-4bd2-860d-adc73f3b225f-kube-api-access-7gl7p\") pod \"10a8af84-d779-4bd2-860d-adc73f3b225f\" (UID: \"10a8af84-d779-4bd2-860d-adc73f3b225f\") " Mar 11 01:45:03 crc kubenswrapper[4744]: I0311 01:45:03.625182 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/10a8af84-d779-4bd2-860d-adc73f3b225f-config-volume\") pod \"10a8af84-d779-4bd2-860d-adc73f3b225f\" (UID: \"10a8af84-d779-4bd2-860d-adc73f3b225f\") " Mar 11 01:45:03 crc kubenswrapper[4744]: I0311 01:45:03.625226 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/10a8af84-d779-4bd2-860d-adc73f3b225f-secret-volume\") pod \"10a8af84-d779-4bd2-860d-adc73f3b225f\" (UID: \"10a8af84-d779-4bd2-860d-adc73f3b225f\") " Mar 11 01:45:03 crc kubenswrapper[4744]: I0311 01:45:03.626630 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/10a8af84-d779-4bd2-860d-adc73f3b225f-config-volume" (OuterVolumeSpecName: "config-volume") pod "10a8af84-d779-4bd2-860d-adc73f3b225f" (UID: "10a8af84-d779-4bd2-860d-adc73f3b225f"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 11 01:45:03 crc kubenswrapper[4744]: I0311 01:45:03.633090 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/10a8af84-d779-4bd2-860d-adc73f3b225f-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "10a8af84-d779-4bd2-860d-adc73f3b225f" (UID: "10a8af84-d779-4bd2-860d-adc73f3b225f"). InnerVolumeSpecName "secret-volume". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 01:45:03 crc kubenswrapper[4744]: I0311 01:45:03.635773 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/10a8af84-d779-4bd2-860d-adc73f3b225f-kube-api-access-7gl7p" (OuterVolumeSpecName: "kube-api-access-7gl7p") pod "10a8af84-d779-4bd2-860d-adc73f3b225f" (UID: "10a8af84-d779-4bd2-860d-adc73f3b225f"). InnerVolumeSpecName "kube-api-access-7gl7p". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 01:45:03 crc kubenswrapper[4744]: I0311 01:45:03.726579 4744 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/10a8af84-d779-4bd2-860d-adc73f3b225f-config-volume\") on node \"crc\" DevicePath \"\"" Mar 11 01:45:03 crc kubenswrapper[4744]: I0311 01:45:03.726925 4744 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/10a8af84-d779-4bd2-860d-adc73f3b225f-secret-volume\") on node \"crc\" DevicePath \"\"" Mar 11 01:45:03 crc kubenswrapper[4744]: I0311 01:45:03.726945 4744 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7gl7p\" (UniqueName: \"kubernetes.io/projected/10a8af84-d779-4bd2-860d-adc73f3b225f-kube-api-access-7gl7p\") on node \"crc\" DevicePath \"\"" Mar 11 01:45:04 crc kubenswrapper[4744]: I0311 01:45:04.103434 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29553225-5xsk2" event={"ID":"10a8af84-d779-4bd2-860d-adc73f3b225f","Type":"ContainerDied","Data":"a0d822d1b355acfac354637995b135e6d2e4452b34c219550adc5b137ce9d7ee"} Mar 11 01:45:04 crc kubenswrapper[4744]: I0311 01:45:04.103470 4744 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a0d822d1b355acfac354637995b135e6d2e4452b34c219550adc5b137ce9d7ee" Mar 11 01:45:04 crc kubenswrapper[4744]: I0311 01:45:04.103556 4744 util.go:48] "No ready sandbox for pod can be 
found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29553225-5xsk2" Mar 11 01:45:04 crc kubenswrapper[4744]: I0311 01:45:04.631141 4744 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29553180-h7hsb"] Mar 11 01:45:04 crc kubenswrapper[4744]: I0311 01:45:04.643317 4744 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29553180-h7hsb"] Mar 11 01:45:06 crc kubenswrapper[4744]: I0311 01:45:05.986000 4744 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="dde8d0af-13f0-4eda-93e5-1bb4a99ac0d6" path="/var/lib/kubelet/pods/dde8d0af-13f0-4eda-93e5-1bb4a99ac0d6/volumes" Mar 11 01:45:08 crc kubenswrapper[4744]: I0311 01:45:08.975148 4744 scope.go:117] "RemoveContainer" containerID="18b99f474de9d4211cad3e91ba128e1f205b77df847433803614c3a4dec5d0d1" Mar 11 01:45:08 crc kubenswrapper[4744]: E0311 01:45:08.976802 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-678nx_openshift-machine-config-operator(a15dc7ac-7c34-4135-b6eb-a85122800ce9)\"" pod="openshift-machine-config-operator/machine-config-daemon-678nx" podUID="a15dc7ac-7c34-4135-b6eb-a85122800ce9" Mar 11 01:45:12 crc kubenswrapper[4744]: I0311 01:45:12.524884 4744 scope.go:117] "RemoveContainer" containerID="b7de441c4af0cf24e698bfb4e00dc6fb117b9da86dde2ec839f5525f03ee4ccb" Mar 11 01:45:22 crc kubenswrapper[4744]: I0311 01:45:22.975190 4744 scope.go:117] "RemoveContainer" containerID="18b99f474de9d4211cad3e91ba128e1f205b77df847433803614c3a4dec5d0d1" Mar 11 01:45:22 crc kubenswrapper[4744]: E0311 01:45:22.976306 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s 
restarting failed container=machine-config-daemon pod=machine-config-daemon-678nx_openshift-machine-config-operator(a15dc7ac-7c34-4135-b6eb-a85122800ce9)\"" pod="openshift-machine-config-operator/machine-config-daemon-678nx" podUID="a15dc7ac-7c34-4135-b6eb-a85122800ce9" Mar 11 01:45:37 crc kubenswrapper[4744]: I0311 01:45:37.978797 4744 scope.go:117] "RemoveContainer" containerID="18b99f474de9d4211cad3e91ba128e1f205b77df847433803614c3a4dec5d0d1" Mar 11 01:45:37 crc kubenswrapper[4744]: E0311 01:45:37.979793 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-678nx_openshift-machine-config-operator(a15dc7ac-7c34-4135-b6eb-a85122800ce9)\"" pod="openshift-machine-config-operator/machine-config-daemon-678nx" podUID="a15dc7ac-7c34-4135-b6eb-a85122800ce9" Mar 11 01:45:51 crc kubenswrapper[4744]: I0311 01:45:51.975224 4744 scope.go:117] "RemoveContainer" containerID="18b99f474de9d4211cad3e91ba128e1f205b77df847433803614c3a4dec5d0d1" Mar 11 01:45:51 crc kubenswrapper[4744]: E0311 01:45:51.976840 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-678nx_openshift-machine-config-operator(a15dc7ac-7c34-4135-b6eb-a85122800ce9)\"" pod="openshift-machine-config-operator/machine-config-daemon-678nx" podUID="a15dc7ac-7c34-4135-b6eb-a85122800ce9" Mar 11 01:46:00 crc kubenswrapper[4744]: I0311 01:46:00.163489 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29553226-mqlxl"] Mar 11 01:46:00 crc kubenswrapper[4744]: E0311 01:46:00.164892 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="10a8af84-d779-4bd2-860d-adc73f3b225f" containerName="collect-profiles" Mar 11 
01:46:00 crc kubenswrapper[4744]: I0311 01:46:00.164924 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="10a8af84-d779-4bd2-860d-adc73f3b225f" containerName="collect-profiles" Mar 11 01:46:00 crc kubenswrapper[4744]: I0311 01:46:00.165264 4744 memory_manager.go:354] "RemoveStaleState removing state" podUID="10a8af84-d779-4bd2-860d-adc73f3b225f" containerName="collect-profiles" Mar 11 01:46:00 crc kubenswrapper[4744]: I0311 01:46:00.166296 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29553226-mqlxl" Mar 11 01:46:00 crc kubenswrapper[4744]: I0311 01:46:00.170541 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 11 01:46:00 crc kubenswrapper[4744]: I0311 01:46:00.170997 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-rs77s" Mar 11 01:46:00 crc kubenswrapper[4744]: I0311 01:46:00.171081 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 11 01:46:00 crc kubenswrapper[4744]: I0311 01:46:00.177128 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29553226-mqlxl"] Mar 11 01:46:00 crc kubenswrapper[4744]: I0311 01:46:00.253733 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zt2fb\" (UniqueName: \"kubernetes.io/projected/a8c1b1ee-62f1-4606-879b-21a9ce892564-kube-api-access-zt2fb\") pod \"auto-csr-approver-29553226-mqlxl\" (UID: \"a8c1b1ee-62f1-4606-879b-21a9ce892564\") " pod="openshift-infra/auto-csr-approver-29553226-mqlxl" Mar 11 01:46:00 crc kubenswrapper[4744]: I0311 01:46:00.355545 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zt2fb\" (UniqueName: \"kubernetes.io/projected/a8c1b1ee-62f1-4606-879b-21a9ce892564-kube-api-access-zt2fb\") pod 
\"auto-csr-approver-29553226-mqlxl\" (UID: \"a8c1b1ee-62f1-4606-879b-21a9ce892564\") " pod="openshift-infra/auto-csr-approver-29553226-mqlxl" Mar 11 01:46:00 crc kubenswrapper[4744]: I0311 01:46:00.390686 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zt2fb\" (UniqueName: \"kubernetes.io/projected/a8c1b1ee-62f1-4606-879b-21a9ce892564-kube-api-access-zt2fb\") pod \"auto-csr-approver-29553226-mqlxl\" (UID: \"a8c1b1ee-62f1-4606-879b-21a9ce892564\") " pod="openshift-infra/auto-csr-approver-29553226-mqlxl" Mar 11 01:46:00 crc kubenswrapper[4744]: I0311 01:46:00.502909 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29553226-mqlxl" Mar 11 01:46:01 crc kubenswrapper[4744]: I0311 01:46:01.015467 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29553226-mqlxl"] Mar 11 01:46:01 crc kubenswrapper[4744]: I0311 01:46:01.675930 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29553226-mqlxl" event={"ID":"a8c1b1ee-62f1-4606-879b-21a9ce892564","Type":"ContainerStarted","Data":"8c2d81f7bcebb6ace756393bc9a996ab795055731e87690485ae5e439fc964c4"} Mar 11 01:46:02 crc kubenswrapper[4744]: I0311 01:46:02.688295 4744 generic.go:334] "Generic (PLEG): container finished" podID="a8c1b1ee-62f1-4606-879b-21a9ce892564" containerID="a06be365853f07689b9cabd9d0bab42a4a4b63d8a15d44d86e370de6d88bfeeb" exitCode=0 Mar 11 01:46:02 crc kubenswrapper[4744]: I0311 01:46:02.688383 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29553226-mqlxl" event={"ID":"a8c1b1ee-62f1-4606-879b-21a9ce892564","Type":"ContainerDied","Data":"a06be365853f07689b9cabd9d0bab42a4a4b63d8a15d44d86e370de6d88bfeeb"} Mar 11 01:46:02 crc kubenswrapper[4744]: I0311 01:46:02.975954 4744 scope.go:117] "RemoveContainer" 
containerID="18b99f474de9d4211cad3e91ba128e1f205b77df847433803614c3a4dec5d0d1" Mar 11 01:46:02 crc kubenswrapper[4744]: E0311 01:46:02.976807 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-678nx_openshift-machine-config-operator(a15dc7ac-7c34-4135-b6eb-a85122800ce9)\"" pod="openshift-machine-config-operator/machine-config-daemon-678nx" podUID="a15dc7ac-7c34-4135-b6eb-a85122800ce9" Mar 11 01:46:03 crc kubenswrapper[4744]: I0311 01:46:03.953491 4744 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29553226-mqlxl" Mar 11 01:46:04 crc kubenswrapper[4744]: I0311 01:46:04.017330 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zt2fb\" (UniqueName: \"kubernetes.io/projected/a8c1b1ee-62f1-4606-879b-21a9ce892564-kube-api-access-zt2fb\") pod \"a8c1b1ee-62f1-4606-879b-21a9ce892564\" (UID: \"a8c1b1ee-62f1-4606-879b-21a9ce892564\") " Mar 11 01:46:04 crc kubenswrapper[4744]: I0311 01:46:04.024909 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a8c1b1ee-62f1-4606-879b-21a9ce892564-kube-api-access-zt2fb" (OuterVolumeSpecName: "kube-api-access-zt2fb") pod "a8c1b1ee-62f1-4606-879b-21a9ce892564" (UID: "a8c1b1ee-62f1-4606-879b-21a9ce892564"). InnerVolumeSpecName "kube-api-access-zt2fb". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 01:46:04 crc kubenswrapper[4744]: I0311 01:46:04.119035 4744 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zt2fb\" (UniqueName: \"kubernetes.io/projected/a8c1b1ee-62f1-4606-879b-21a9ce892564-kube-api-access-zt2fb\") on node \"crc\" DevicePath \"\"" Mar 11 01:46:04 crc kubenswrapper[4744]: I0311 01:46:04.712494 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29553226-mqlxl" event={"ID":"a8c1b1ee-62f1-4606-879b-21a9ce892564","Type":"ContainerDied","Data":"8c2d81f7bcebb6ace756393bc9a996ab795055731e87690485ae5e439fc964c4"} Mar 11 01:46:04 crc kubenswrapper[4744]: I0311 01:46:04.713011 4744 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8c2d81f7bcebb6ace756393bc9a996ab795055731e87690485ae5e439fc964c4" Mar 11 01:46:04 crc kubenswrapper[4744]: I0311 01:46:04.712635 4744 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29553226-mqlxl" Mar 11 01:46:05 crc kubenswrapper[4744]: I0311 01:46:05.040233 4744 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29553220-8ztsx"] Mar 11 01:46:05 crc kubenswrapper[4744]: I0311 01:46:05.044663 4744 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29553220-8ztsx"] Mar 11 01:46:05 crc kubenswrapper[4744]: I0311 01:46:05.991845 4744 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="14492823-13cb-4a93-aa2c-e58f5e612098" path="/var/lib/kubelet/pods/14492823-13cb-4a93-aa2c-e58f5e612098/volumes" Mar 11 01:46:12 crc kubenswrapper[4744]: I0311 01:46:12.598687 4744 scope.go:117] "RemoveContainer" containerID="e486597083f1c8cefd8f0269e7c831f2670b449d8fd439360ac49a3bd8a67c4b" Mar 11 01:46:14 crc kubenswrapper[4744]: I0311 01:46:14.975320 4744 scope.go:117] "RemoveContainer" 
containerID="18b99f474de9d4211cad3e91ba128e1f205b77df847433803614c3a4dec5d0d1" Mar 11 01:46:14 crc kubenswrapper[4744]: E0311 01:46:14.976382 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-678nx_openshift-machine-config-operator(a15dc7ac-7c34-4135-b6eb-a85122800ce9)\"" pod="openshift-machine-config-operator/machine-config-daemon-678nx" podUID="a15dc7ac-7c34-4135-b6eb-a85122800ce9" Mar 11 01:46:25 crc kubenswrapper[4744]: I0311 01:46:25.975698 4744 scope.go:117] "RemoveContainer" containerID="18b99f474de9d4211cad3e91ba128e1f205b77df847433803614c3a4dec5d0d1" Mar 11 01:46:25 crc kubenswrapper[4744]: E0311 01:46:25.976395 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-678nx_openshift-machine-config-operator(a15dc7ac-7c34-4135-b6eb-a85122800ce9)\"" pod="openshift-machine-config-operator/machine-config-daemon-678nx" podUID="a15dc7ac-7c34-4135-b6eb-a85122800ce9" Mar 11 01:46:37 crc kubenswrapper[4744]: I0311 01:46:37.975475 4744 scope.go:117] "RemoveContainer" containerID="18b99f474de9d4211cad3e91ba128e1f205b77df847433803614c3a4dec5d0d1" Mar 11 01:46:37 crc kubenswrapper[4744]: E0311 01:46:37.976568 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-678nx_openshift-machine-config-operator(a15dc7ac-7c34-4135-b6eb-a85122800ce9)\"" pod="openshift-machine-config-operator/machine-config-daemon-678nx" podUID="a15dc7ac-7c34-4135-b6eb-a85122800ce9" Mar 11 01:46:48 crc kubenswrapper[4744]: I0311 01:46:48.974889 4744 scope.go:117] 
"RemoveContainer" containerID="18b99f474de9d4211cad3e91ba128e1f205b77df847433803614c3a4dec5d0d1" Mar 11 01:46:48 crc kubenswrapper[4744]: E0311 01:46:48.977777 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-678nx_openshift-machine-config-operator(a15dc7ac-7c34-4135-b6eb-a85122800ce9)\"" pod="openshift-machine-config-operator/machine-config-daemon-678nx" podUID="a15dc7ac-7c34-4135-b6eb-a85122800ce9" Mar 11 01:47:02 crc kubenswrapper[4744]: I0311 01:47:02.975006 4744 scope.go:117] "RemoveContainer" containerID="18b99f474de9d4211cad3e91ba128e1f205b77df847433803614c3a4dec5d0d1" Mar 11 01:47:02 crc kubenswrapper[4744]: E0311 01:47:02.976505 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-678nx_openshift-machine-config-operator(a15dc7ac-7c34-4135-b6eb-a85122800ce9)\"" pod="openshift-machine-config-operator/machine-config-daemon-678nx" podUID="a15dc7ac-7c34-4135-b6eb-a85122800ce9" Mar 11 01:47:13 crc kubenswrapper[4744]: I0311 01:47:13.979395 4744 scope.go:117] "RemoveContainer" containerID="18b99f474de9d4211cad3e91ba128e1f205b77df847433803614c3a4dec5d0d1" Mar 11 01:47:13 crc kubenswrapper[4744]: E0311 01:47:13.980586 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-678nx_openshift-machine-config-operator(a15dc7ac-7c34-4135-b6eb-a85122800ce9)\"" pod="openshift-machine-config-operator/machine-config-daemon-678nx" podUID="a15dc7ac-7c34-4135-b6eb-a85122800ce9" Mar 11 01:47:28 crc kubenswrapper[4744]: I0311 01:47:28.975149 
4744 scope.go:117] "RemoveContainer" containerID="18b99f474de9d4211cad3e91ba128e1f205b77df847433803614c3a4dec5d0d1" Mar 11 01:47:28 crc kubenswrapper[4744]: E0311 01:47:28.976086 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-678nx_openshift-machine-config-operator(a15dc7ac-7c34-4135-b6eb-a85122800ce9)\"" pod="openshift-machine-config-operator/machine-config-daemon-678nx" podUID="a15dc7ac-7c34-4135-b6eb-a85122800ce9" Mar 11 01:47:41 crc kubenswrapper[4744]: I0311 01:47:41.975281 4744 scope.go:117] "RemoveContainer" containerID="18b99f474de9d4211cad3e91ba128e1f205b77df847433803614c3a4dec5d0d1" Mar 11 01:47:41 crc kubenswrapper[4744]: E0311 01:47:41.976273 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-678nx_openshift-machine-config-operator(a15dc7ac-7c34-4135-b6eb-a85122800ce9)\"" pod="openshift-machine-config-operator/machine-config-daemon-678nx" podUID="a15dc7ac-7c34-4135-b6eb-a85122800ce9" Mar 11 01:47:53 crc kubenswrapper[4744]: I0311 01:47:53.981112 4744 scope.go:117] "RemoveContainer" containerID="18b99f474de9d4211cad3e91ba128e1f205b77df847433803614c3a4dec5d0d1" Mar 11 01:47:53 crc kubenswrapper[4744]: E0311 01:47:53.982025 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-678nx_openshift-machine-config-operator(a15dc7ac-7c34-4135-b6eb-a85122800ce9)\"" pod="openshift-machine-config-operator/machine-config-daemon-678nx" podUID="a15dc7ac-7c34-4135-b6eb-a85122800ce9" Mar 11 01:48:00 crc kubenswrapper[4744]: I0311 
01:48:00.169590 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29553228-h8cj7"] Mar 11 01:48:00 crc kubenswrapper[4744]: E0311 01:48:00.171069 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a8c1b1ee-62f1-4606-879b-21a9ce892564" containerName="oc" Mar 11 01:48:00 crc kubenswrapper[4744]: I0311 01:48:00.171094 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="a8c1b1ee-62f1-4606-879b-21a9ce892564" containerName="oc" Mar 11 01:48:00 crc kubenswrapper[4744]: I0311 01:48:00.171675 4744 memory_manager.go:354] "RemoveStaleState removing state" podUID="a8c1b1ee-62f1-4606-879b-21a9ce892564" containerName="oc" Mar 11 01:48:00 crc kubenswrapper[4744]: I0311 01:48:00.172758 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29553228-h8cj7" Mar 11 01:48:00 crc kubenswrapper[4744]: I0311 01:48:00.182410 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 11 01:48:00 crc kubenswrapper[4744]: I0311 01:48:00.182511 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 11 01:48:00 crc kubenswrapper[4744]: I0311 01:48:00.192355 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29553228-h8cj7"] Mar 11 01:48:00 crc kubenswrapper[4744]: I0311 01:48:00.192855 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-rs77s" Mar 11 01:48:00 crc kubenswrapper[4744]: I0311 01:48:00.305481 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hd224\" (UniqueName: \"kubernetes.io/projected/0ea213b0-d4b8-4b01-9e10-11a8a4436b93-kube-api-access-hd224\") pod \"auto-csr-approver-29553228-h8cj7\" (UID: \"0ea213b0-d4b8-4b01-9e10-11a8a4436b93\") " 
pod="openshift-infra/auto-csr-approver-29553228-h8cj7" Mar 11 01:48:00 crc kubenswrapper[4744]: I0311 01:48:00.406983 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hd224\" (UniqueName: \"kubernetes.io/projected/0ea213b0-d4b8-4b01-9e10-11a8a4436b93-kube-api-access-hd224\") pod \"auto-csr-approver-29553228-h8cj7\" (UID: \"0ea213b0-d4b8-4b01-9e10-11a8a4436b93\") " pod="openshift-infra/auto-csr-approver-29553228-h8cj7" Mar 11 01:48:00 crc kubenswrapper[4744]: I0311 01:48:00.444212 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hd224\" (UniqueName: \"kubernetes.io/projected/0ea213b0-d4b8-4b01-9e10-11a8a4436b93-kube-api-access-hd224\") pod \"auto-csr-approver-29553228-h8cj7\" (UID: \"0ea213b0-d4b8-4b01-9e10-11a8a4436b93\") " pod="openshift-infra/auto-csr-approver-29553228-h8cj7" Mar 11 01:48:00 crc kubenswrapper[4744]: I0311 01:48:00.509983 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29553228-h8cj7" Mar 11 01:48:00 crc kubenswrapper[4744]: I0311 01:48:00.810805 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29553228-h8cj7"] Mar 11 01:48:00 crc kubenswrapper[4744]: I0311 01:48:00.856704 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29553228-h8cj7" event={"ID":"0ea213b0-d4b8-4b01-9e10-11a8a4436b93","Type":"ContainerStarted","Data":"c4dac32540c12b553a2f57170ec289e1436a32be3e5aeb75c1772870f561d38c"} Mar 11 01:48:02 crc kubenswrapper[4744]: I0311 01:48:02.881513 4744 generic.go:334] "Generic (PLEG): container finished" podID="0ea213b0-d4b8-4b01-9e10-11a8a4436b93" containerID="17f70f4a7a6c62bd3b76870eaa59e9325ad578e1685a6b378a86cf54a3ecce90" exitCode=0 Mar 11 01:48:02 crc kubenswrapper[4744]: I0311 01:48:02.881766 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-infra/auto-csr-approver-29553228-h8cj7" event={"ID":"0ea213b0-d4b8-4b01-9e10-11a8a4436b93","Type":"ContainerDied","Data":"17f70f4a7a6c62bd3b76870eaa59e9325ad578e1685a6b378a86cf54a3ecce90"} Mar 11 01:48:04 crc kubenswrapper[4744]: I0311 01:48:04.256817 4744 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29553228-h8cj7" Mar 11 01:48:04 crc kubenswrapper[4744]: I0311 01:48:04.375733 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hd224\" (UniqueName: \"kubernetes.io/projected/0ea213b0-d4b8-4b01-9e10-11a8a4436b93-kube-api-access-hd224\") pod \"0ea213b0-d4b8-4b01-9e10-11a8a4436b93\" (UID: \"0ea213b0-d4b8-4b01-9e10-11a8a4436b93\") " Mar 11 01:48:04 crc kubenswrapper[4744]: I0311 01:48:04.389779 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0ea213b0-d4b8-4b01-9e10-11a8a4436b93-kube-api-access-hd224" (OuterVolumeSpecName: "kube-api-access-hd224") pod "0ea213b0-d4b8-4b01-9e10-11a8a4436b93" (UID: "0ea213b0-d4b8-4b01-9e10-11a8a4436b93"). InnerVolumeSpecName "kube-api-access-hd224". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 01:48:04 crc kubenswrapper[4744]: I0311 01:48:04.478175 4744 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hd224\" (UniqueName: \"kubernetes.io/projected/0ea213b0-d4b8-4b01-9e10-11a8a4436b93-kube-api-access-hd224\") on node \"crc\" DevicePath \"\"" Mar 11 01:48:04 crc kubenswrapper[4744]: I0311 01:48:04.900987 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29553228-h8cj7" event={"ID":"0ea213b0-d4b8-4b01-9e10-11a8a4436b93","Type":"ContainerDied","Data":"c4dac32540c12b553a2f57170ec289e1436a32be3e5aeb75c1772870f561d38c"} Mar 11 01:48:04 crc kubenswrapper[4744]: I0311 01:48:04.901045 4744 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c4dac32540c12b553a2f57170ec289e1436a32be3e5aeb75c1772870f561d38c" Mar 11 01:48:04 crc kubenswrapper[4744]: I0311 01:48:04.901065 4744 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29553228-h8cj7" Mar 11 01:48:04 crc kubenswrapper[4744]: I0311 01:48:04.975137 4744 scope.go:117] "RemoveContainer" containerID="18b99f474de9d4211cad3e91ba128e1f205b77df847433803614c3a4dec5d0d1" Mar 11 01:48:04 crc kubenswrapper[4744]: E0311 01:48:04.975884 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-678nx_openshift-machine-config-operator(a15dc7ac-7c34-4135-b6eb-a85122800ce9)\"" pod="openshift-machine-config-operator/machine-config-daemon-678nx" podUID="a15dc7ac-7c34-4135-b6eb-a85122800ce9" Mar 11 01:48:05 crc kubenswrapper[4744]: I0311 01:48:05.349592 4744 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29553222-sr74p"] Mar 11 01:48:05 crc kubenswrapper[4744]: I0311 01:48:05.361573 4744 
kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29553222-sr74p"] Mar 11 01:48:05 crc kubenswrapper[4744]: I0311 01:48:05.991735 4744 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0888068d-7429-4c76-8a78-5c994a15b419" path="/var/lib/kubelet/pods/0888068d-7429-4c76-8a78-5c994a15b419/volumes" Mar 11 01:48:12 crc kubenswrapper[4744]: I0311 01:48:12.722486 4744 scope.go:117] "RemoveContainer" containerID="f8a9c8debdfdd78ef66cdc39d479bf69b6038c07bc3106f6f257fa3d09628f65" Mar 11 01:48:12 crc kubenswrapper[4744]: I0311 01:48:12.793938 4744 scope.go:117] "RemoveContainer" containerID="8e99ccde3cf88165ebb8b1e1bcdd9c24f9d2c0f9cdd7621b48b6aab3983cce87" Mar 11 01:48:12 crc kubenswrapper[4744]: I0311 01:48:12.824844 4744 scope.go:117] "RemoveContainer" containerID="154b51deeafa5b657c89f5d0fbb7c53162316cf1406f08432d6abc84b4224418" Mar 11 01:48:12 crc kubenswrapper[4744]: I0311 01:48:12.871410 4744 scope.go:117] "RemoveContainer" containerID="7f7e38f7d6e379e5c43f5a4515dd7a924d6e7275cd09ad3113dd9f1974425c9b" Mar 11 01:48:18 crc kubenswrapper[4744]: I0311 01:48:18.974731 4744 scope.go:117] "RemoveContainer" containerID="18b99f474de9d4211cad3e91ba128e1f205b77df847433803614c3a4dec5d0d1" Mar 11 01:48:20 crc kubenswrapper[4744]: I0311 01:48:20.047702 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-678nx" event={"ID":"a15dc7ac-7c34-4135-b6eb-a85122800ce9","Type":"ContainerStarted","Data":"2fb0586e3dccedbf88f19f5a2d18df37c5d92d22979e1714c0bb39c539d0e12b"} Mar 11 01:50:00 crc kubenswrapper[4744]: I0311 01:50:00.208267 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29553230-7bp8h"] Mar 11 01:50:00 crc kubenswrapper[4744]: E0311 01:50:00.209072 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0ea213b0-d4b8-4b01-9e10-11a8a4436b93" containerName="oc" Mar 11 01:50:00 crc kubenswrapper[4744]: I0311 
01:50:00.209088 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="0ea213b0-d4b8-4b01-9e10-11a8a4436b93" containerName="oc" Mar 11 01:50:00 crc kubenswrapper[4744]: I0311 01:50:00.209251 4744 memory_manager.go:354] "RemoveStaleState removing state" podUID="0ea213b0-d4b8-4b01-9e10-11a8a4436b93" containerName="oc" Mar 11 01:50:00 crc kubenswrapper[4744]: I0311 01:50:00.209810 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29553230-7bp8h" Mar 11 01:50:00 crc kubenswrapper[4744]: I0311 01:50:00.218411 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-rs77s" Mar 11 01:50:00 crc kubenswrapper[4744]: I0311 01:50:00.218661 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 11 01:50:00 crc kubenswrapper[4744]: I0311 01:50:00.218669 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 11 01:50:00 crc kubenswrapper[4744]: I0311 01:50:00.222750 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29553230-7bp8h"] Mar 11 01:50:00 crc kubenswrapper[4744]: I0311 01:50:00.304810 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bjnfp\" (UniqueName: \"kubernetes.io/projected/b4192f44-4f7d-4eeb-8993-7acda73ff091-kube-api-access-bjnfp\") pod \"auto-csr-approver-29553230-7bp8h\" (UID: \"b4192f44-4f7d-4eeb-8993-7acda73ff091\") " pod="openshift-infra/auto-csr-approver-29553230-7bp8h" Mar 11 01:50:00 crc kubenswrapper[4744]: I0311 01:50:00.406735 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bjnfp\" (UniqueName: \"kubernetes.io/projected/b4192f44-4f7d-4eeb-8993-7acda73ff091-kube-api-access-bjnfp\") pod \"auto-csr-approver-29553230-7bp8h\" (UID: 
\"b4192f44-4f7d-4eeb-8993-7acda73ff091\") " pod="openshift-infra/auto-csr-approver-29553230-7bp8h" Mar 11 01:50:00 crc kubenswrapper[4744]: I0311 01:50:00.440891 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bjnfp\" (UniqueName: \"kubernetes.io/projected/b4192f44-4f7d-4eeb-8993-7acda73ff091-kube-api-access-bjnfp\") pod \"auto-csr-approver-29553230-7bp8h\" (UID: \"b4192f44-4f7d-4eeb-8993-7acda73ff091\") " pod="openshift-infra/auto-csr-approver-29553230-7bp8h" Mar 11 01:50:00 crc kubenswrapper[4744]: I0311 01:50:00.575422 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29553230-7bp8h" Mar 11 01:50:01 crc kubenswrapper[4744]: I0311 01:50:01.078786 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29553230-7bp8h"] Mar 11 01:50:01 crc kubenswrapper[4744]: I0311 01:50:01.088102 4744 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 11 01:50:02 crc kubenswrapper[4744]: I0311 01:50:02.043841 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29553230-7bp8h" event={"ID":"b4192f44-4f7d-4eeb-8993-7acda73ff091","Type":"ContainerStarted","Data":"ce009e2df93894ff7a8fe3ec82fbe2f4321d584f7b2b8066ad9e5b510312f0ca"} Mar 11 01:50:03 crc kubenswrapper[4744]: I0311 01:50:03.057270 4744 generic.go:334] "Generic (PLEG): container finished" podID="b4192f44-4f7d-4eeb-8993-7acda73ff091" containerID="5b93e13a3f00d4c69fac3858533b230fbde4cf10da2e3d15bd9120b55382e0a5" exitCode=0 Mar 11 01:50:03 crc kubenswrapper[4744]: I0311 01:50:03.057405 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29553230-7bp8h" event={"ID":"b4192f44-4f7d-4eeb-8993-7acda73ff091","Type":"ContainerDied","Data":"5b93e13a3f00d4c69fac3858533b230fbde4cf10da2e3d15bd9120b55382e0a5"} Mar 11 01:50:04 crc kubenswrapper[4744]: I0311 
01:50:04.442443 4744 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29553230-7bp8h" Mar 11 01:50:04 crc kubenswrapper[4744]: I0311 01:50:04.576069 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bjnfp\" (UniqueName: \"kubernetes.io/projected/b4192f44-4f7d-4eeb-8993-7acda73ff091-kube-api-access-bjnfp\") pod \"b4192f44-4f7d-4eeb-8993-7acda73ff091\" (UID: \"b4192f44-4f7d-4eeb-8993-7acda73ff091\") " Mar 11 01:50:04 crc kubenswrapper[4744]: I0311 01:50:04.584926 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b4192f44-4f7d-4eeb-8993-7acda73ff091-kube-api-access-bjnfp" (OuterVolumeSpecName: "kube-api-access-bjnfp") pod "b4192f44-4f7d-4eeb-8993-7acda73ff091" (UID: "b4192f44-4f7d-4eeb-8993-7acda73ff091"). InnerVolumeSpecName "kube-api-access-bjnfp". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 01:50:04 crc kubenswrapper[4744]: I0311 01:50:04.679121 4744 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bjnfp\" (UniqueName: \"kubernetes.io/projected/b4192f44-4f7d-4eeb-8993-7acda73ff091-kube-api-access-bjnfp\") on node \"crc\" DevicePath \"\"" Mar 11 01:50:05 crc kubenswrapper[4744]: I0311 01:50:05.091576 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29553230-7bp8h" event={"ID":"b4192f44-4f7d-4eeb-8993-7acda73ff091","Type":"ContainerDied","Data":"ce009e2df93894ff7a8fe3ec82fbe2f4321d584f7b2b8066ad9e5b510312f0ca"} Mar 11 01:50:05 crc kubenswrapper[4744]: I0311 01:50:05.091635 4744 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ce009e2df93894ff7a8fe3ec82fbe2f4321d584f7b2b8066ad9e5b510312f0ca" Mar 11 01:50:05 crc kubenswrapper[4744]: I0311 01:50:05.091743 4744 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29553230-7bp8h" Mar 11 01:50:05 crc kubenswrapper[4744]: I0311 01:50:05.534147 4744 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29553224-4tpwf"] Mar 11 01:50:05 crc kubenswrapper[4744]: I0311 01:50:05.544989 4744 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29553224-4tpwf"] Mar 11 01:50:05 crc kubenswrapper[4744]: I0311 01:50:05.989084 4744 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="aa853d2f-bfde-4693-8051-3fe00e0765c4" path="/var/lib/kubelet/pods/aa853d2f-bfde-4693-8051-3fe00e0765c4/volumes" Mar 11 01:50:12 crc kubenswrapper[4744]: I0311 01:50:12.991767 4744 scope.go:117] "RemoveContainer" containerID="a73fd894b6aa001e8235162745d5db82e6d564ea8b8969148854f1fdc2a01287" Mar 11 01:50:42 crc kubenswrapper[4744]: I0311 01:50:42.413225 4744 patch_prober.go:28] interesting pod/machine-config-daemon-678nx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 11 01:50:42 crc kubenswrapper[4744]: I0311 01:50:42.413921 4744 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-678nx" podUID="a15dc7ac-7c34-4135-b6eb-a85122800ce9" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 11 01:51:12 crc kubenswrapper[4744]: I0311 01:51:12.409621 4744 patch_prober.go:28] interesting pod/machine-config-daemon-678nx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 11 01:51:12 crc kubenswrapper[4744]: 
I0311 01:51:12.411664 4744 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-678nx" podUID="a15dc7ac-7c34-4135-b6eb-a85122800ce9" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 11 01:51:42 crc kubenswrapper[4744]: I0311 01:51:42.409641 4744 patch_prober.go:28] interesting pod/machine-config-daemon-678nx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 11 01:51:42 crc kubenswrapper[4744]: I0311 01:51:42.410357 4744 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-678nx" podUID="a15dc7ac-7c34-4135-b6eb-a85122800ce9" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 11 01:51:42 crc kubenswrapper[4744]: I0311 01:51:42.410445 4744 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-678nx" Mar 11 01:51:42 crc kubenswrapper[4744]: I0311 01:51:42.411638 4744 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"2fb0586e3dccedbf88f19f5a2d18df37c5d92d22979e1714c0bb39c539d0e12b"} pod="openshift-machine-config-operator/machine-config-daemon-678nx" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 11 01:51:42 crc kubenswrapper[4744]: I0311 01:51:42.411823 4744 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-678nx" podUID="a15dc7ac-7c34-4135-b6eb-a85122800ce9" 
containerName="machine-config-daemon" containerID="cri-o://2fb0586e3dccedbf88f19f5a2d18df37c5d92d22979e1714c0bb39c539d0e12b" gracePeriod=600 Mar 11 01:51:43 crc kubenswrapper[4744]: I0311 01:51:43.069887 4744 generic.go:334] "Generic (PLEG): container finished" podID="a15dc7ac-7c34-4135-b6eb-a85122800ce9" containerID="2fb0586e3dccedbf88f19f5a2d18df37c5d92d22979e1714c0bb39c539d0e12b" exitCode=0 Mar 11 01:51:43 crc kubenswrapper[4744]: I0311 01:51:43.069974 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-678nx" event={"ID":"a15dc7ac-7c34-4135-b6eb-a85122800ce9","Type":"ContainerDied","Data":"2fb0586e3dccedbf88f19f5a2d18df37c5d92d22979e1714c0bb39c539d0e12b"} Mar 11 01:51:43 crc kubenswrapper[4744]: I0311 01:51:43.070660 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-678nx" event={"ID":"a15dc7ac-7c34-4135-b6eb-a85122800ce9","Type":"ContainerStarted","Data":"be3f9ef48ec9e492a2adef204936907fa56b289e46f531f87079a08adff341c6"} Mar 11 01:51:43 crc kubenswrapper[4744]: I0311 01:51:43.070711 4744 scope.go:117] "RemoveContainer" containerID="18b99f474de9d4211cad3e91ba128e1f205b77df847433803614c3a4dec5d0d1" Mar 11 01:52:00 crc kubenswrapper[4744]: I0311 01:52:00.168563 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29553232-jlqk4"] Mar 11 01:52:00 crc kubenswrapper[4744]: E0311 01:52:00.169792 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b4192f44-4f7d-4eeb-8993-7acda73ff091" containerName="oc" Mar 11 01:52:00 crc kubenswrapper[4744]: I0311 01:52:00.169816 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="b4192f44-4f7d-4eeb-8993-7acda73ff091" containerName="oc" Mar 11 01:52:00 crc kubenswrapper[4744]: I0311 01:52:00.170068 4744 memory_manager.go:354] "RemoveStaleState removing state" podUID="b4192f44-4f7d-4eeb-8993-7acda73ff091" containerName="oc" Mar 11 01:52:00 
crc kubenswrapper[4744]: I0311 01:52:00.170780 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29553232-jlqk4" Mar 11 01:52:00 crc kubenswrapper[4744]: I0311 01:52:00.174403 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 11 01:52:00 crc kubenswrapper[4744]: I0311 01:52:00.174700 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-rs77s" Mar 11 01:52:00 crc kubenswrapper[4744]: I0311 01:52:00.174901 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 11 01:52:00 crc kubenswrapper[4744]: I0311 01:52:00.184927 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29553232-jlqk4"] Mar 11 01:52:00 crc kubenswrapper[4744]: I0311 01:52:00.221697 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nkp6z\" (UniqueName: \"kubernetes.io/projected/8e20982c-bc31-4d0a-b891-70bc79483ecf-kube-api-access-nkp6z\") pod \"auto-csr-approver-29553232-jlqk4\" (UID: \"8e20982c-bc31-4d0a-b891-70bc79483ecf\") " pod="openshift-infra/auto-csr-approver-29553232-jlqk4" Mar 11 01:52:00 crc kubenswrapper[4744]: I0311 01:52:00.323050 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nkp6z\" (UniqueName: \"kubernetes.io/projected/8e20982c-bc31-4d0a-b891-70bc79483ecf-kube-api-access-nkp6z\") pod \"auto-csr-approver-29553232-jlqk4\" (UID: \"8e20982c-bc31-4d0a-b891-70bc79483ecf\") " pod="openshift-infra/auto-csr-approver-29553232-jlqk4" Mar 11 01:52:00 crc kubenswrapper[4744]: I0311 01:52:00.356780 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nkp6z\" (UniqueName: \"kubernetes.io/projected/8e20982c-bc31-4d0a-b891-70bc79483ecf-kube-api-access-nkp6z\") 
pod \"auto-csr-approver-29553232-jlqk4\" (UID: \"8e20982c-bc31-4d0a-b891-70bc79483ecf\") " pod="openshift-infra/auto-csr-approver-29553232-jlqk4" Mar 11 01:52:00 crc kubenswrapper[4744]: I0311 01:52:00.504865 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29553232-jlqk4" Mar 11 01:52:00 crc kubenswrapper[4744]: I0311 01:52:00.801440 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29553232-jlqk4"] Mar 11 01:52:01 crc kubenswrapper[4744]: I0311 01:52:01.302019 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29553232-jlqk4" event={"ID":"8e20982c-bc31-4d0a-b891-70bc79483ecf","Type":"ContainerStarted","Data":"0bb31015dc3b4f78bf1d57750714913be3644e32434ac17ec07fcde595a4ea0b"} Mar 11 01:52:02 crc kubenswrapper[4744]: I0311 01:52:02.312066 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29553232-jlqk4" event={"ID":"8e20982c-bc31-4d0a-b891-70bc79483ecf","Type":"ContainerStarted","Data":"170b009a918db3c665e0bb2a8e364d49010c87a04e005840051d7a7a56b82ec4"} Mar 11 01:52:02 crc kubenswrapper[4744]: I0311 01:52:02.333124 4744 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29553232-jlqk4" podStartSLOduration=1.331750393 podStartE2EDuration="2.333110397s" podCreationTimestamp="2026-03-11 01:52:00 +0000 UTC" firstStartedPulling="2026-03-11 01:52:00.808930147 +0000 UTC m=+3477.613147792" lastFinishedPulling="2026-03-11 01:52:01.810290151 +0000 UTC m=+3478.614507796" observedRunningTime="2026-03-11 01:52:02.3303724 +0000 UTC m=+3479.134590005" watchObservedRunningTime="2026-03-11 01:52:02.333110397 +0000 UTC m=+3479.137328002" Mar 11 01:52:03 crc kubenswrapper[4744]: I0311 01:52:03.330779 4744 generic.go:334] "Generic (PLEG): container finished" podID="8e20982c-bc31-4d0a-b891-70bc79483ecf" 
containerID="170b009a918db3c665e0bb2a8e364d49010c87a04e005840051d7a7a56b82ec4" exitCode=0 Mar 11 01:52:03 crc kubenswrapper[4744]: I0311 01:52:03.330853 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29553232-jlqk4" event={"ID":"8e20982c-bc31-4d0a-b891-70bc79483ecf","Type":"ContainerDied","Data":"170b009a918db3c665e0bb2a8e364d49010c87a04e005840051d7a7a56b82ec4"} Mar 11 01:52:04 crc kubenswrapper[4744]: I0311 01:52:04.748500 4744 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29553232-jlqk4" Mar 11 01:52:04 crc kubenswrapper[4744]: I0311 01:52:04.921596 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nkp6z\" (UniqueName: \"kubernetes.io/projected/8e20982c-bc31-4d0a-b891-70bc79483ecf-kube-api-access-nkp6z\") pod \"8e20982c-bc31-4d0a-b891-70bc79483ecf\" (UID: \"8e20982c-bc31-4d0a-b891-70bc79483ecf\") " Mar 11 01:52:04 crc kubenswrapper[4744]: I0311 01:52:04.931277 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8e20982c-bc31-4d0a-b891-70bc79483ecf-kube-api-access-nkp6z" (OuterVolumeSpecName: "kube-api-access-nkp6z") pod "8e20982c-bc31-4d0a-b891-70bc79483ecf" (UID: "8e20982c-bc31-4d0a-b891-70bc79483ecf"). InnerVolumeSpecName "kube-api-access-nkp6z". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 01:52:05 crc kubenswrapper[4744]: I0311 01:52:05.024790 4744 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nkp6z\" (UniqueName: \"kubernetes.io/projected/8e20982c-bc31-4d0a-b891-70bc79483ecf-kube-api-access-nkp6z\") on node \"crc\" DevicePath \"\"" Mar 11 01:52:05 crc kubenswrapper[4744]: I0311 01:52:05.359006 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29553232-jlqk4" event={"ID":"8e20982c-bc31-4d0a-b891-70bc79483ecf","Type":"ContainerDied","Data":"0bb31015dc3b4f78bf1d57750714913be3644e32434ac17ec07fcde595a4ea0b"} Mar 11 01:52:05 crc kubenswrapper[4744]: I0311 01:52:05.359065 4744 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0bb31015dc3b4f78bf1d57750714913be3644e32434ac17ec07fcde595a4ea0b" Mar 11 01:52:05 crc kubenswrapper[4744]: I0311 01:52:05.359087 4744 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29553232-jlqk4" Mar 11 01:52:05 crc kubenswrapper[4744]: I0311 01:52:05.425630 4744 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29553226-mqlxl"] Mar 11 01:52:05 crc kubenswrapper[4744]: I0311 01:52:05.432264 4744 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29553226-mqlxl"] Mar 11 01:52:05 crc kubenswrapper[4744]: I0311 01:52:05.991622 4744 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a8c1b1ee-62f1-4606-879b-21a9ce892564" path="/var/lib/kubelet/pods/a8c1b1ee-62f1-4606-879b-21a9ce892564/volumes" Mar 11 01:52:07 crc kubenswrapper[4744]: I0311 01:52:07.150146 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-4dk6x"] Mar 11 01:52:07 crc kubenswrapper[4744]: E0311 01:52:07.150666 4744 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="8e20982c-bc31-4d0a-b891-70bc79483ecf" containerName="oc" Mar 11 01:52:07 crc kubenswrapper[4744]: I0311 01:52:07.150973 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="8e20982c-bc31-4d0a-b891-70bc79483ecf" containerName="oc" Mar 11 01:52:07 crc kubenswrapper[4744]: I0311 01:52:07.151225 4744 memory_manager.go:354] "RemoveStaleState removing state" podUID="8e20982c-bc31-4d0a-b891-70bc79483ecf" containerName="oc" Mar 11 01:52:07 crc kubenswrapper[4744]: I0311 01:52:07.152976 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-4dk6x" Mar 11 01:52:07 crc kubenswrapper[4744]: I0311 01:52:07.158422 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-4dk6x"] Mar 11 01:52:07 crc kubenswrapper[4744]: I0311 01:52:07.291066 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/87b1a45a-8406-4a62-81f9-55bb601e3b65-utilities\") pod \"redhat-marketplace-4dk6x\" (UID: \"87b1a45a-8406-4a62-81f9-55bb601e3b65\") " pod="openshift-marketplace/redhat-marketplace-4dk6x" Mar 11 01:52:07 crc kubenswrapper[4744]: I0311 01:52:07.291757 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tlqp7\" (UniqueName: \"kubernetes.io/projected/87b1a45a-8406-4a62-81f9-55bb601e3b65-kube-api-access-tlqp7\") pod \"redhat-marketplace-4dk6x\" (UID: \"87b1a45a-8406-4a62-81f9-55bb601e3b65\") " pod="openshift-marketplace/redhat-marketplace-4dk6x" Mar 11 01:52:07 crc kubenswrapper[4744]: I0311 01:52:07.292002 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/87b1a45a-8406-4a62-81f9-55bb601e3b65-catalog-content\") pod \"redhat-marketplace-4dk6x\" (UID: \"87b1a45a-8406-4a62-81f9-55bb601e3b65\") " 
pod="openshift-marketplace/redhat-marketplace-4dk6x" Mar 11 01:52:07 crc kubenswrapper[4744]: I0311 01:52:07.394093 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/87b1a45a-8406-4a62-81f9-55bb601e3b65-utilities\") pod \"redhat-marketplace-4dk6x\" (UID: \"87b1a45a-8406-4a62-81f9-55bb601e3b65\") " pod="openshift-marketplace/redhat-marketplace-4dk6x" Mar 11 01:52:07 crc kubenswrapper[4744]: I0311 01:52:07.394191 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tlqp7\" (UniqueName: \"kubernetes.io/projected/87b1a45a-8406-4a62-81f9-55bb601e3b65-kube-api-access-tlqp7\") pod \"redhat-marketplace-4dk6x\" (UID: \"87b1a45a-8406-4a62-81f9-55bb601e3b65\") " pod="openshift-marketplace/redhat-marketplace-4dk6x" Mar 11 01:52:07 crc kubenswrapper[4744]: I0311 01:52:07.394241 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/87b1a45a-8406-4a62-81f9-55bb601e3b65-catalog-content\") pod \"redhat-marketplace-4dk6x\" (UID: \"87b1a45a-8406-4a62-81f9-55bb601e3b65\") " pod="openshift-marketplace/redhat-marketplace-4dk6x" Mar 11 01:52:07 crc kubenswrapper[4744]: I0311 01:52:07.394919 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/87b1a45a-8406-4a62-81f9-55bb601e3b65-utilities\") pod \"redhat-marketplace-4dk6x\" (UID: \"87b1a45a-8406-4a62-81f9-55bb601e3b65\") " pod="openshift-marketplace/redhat-marketplace-4dk6x" Mar 11 01:52:07 crc kubenswrapper[4744]: I0311 01:52:07.395153 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/87b1a45a-8406-4a62-81f9-55bb601e3b65-catalog-content\") pod \"redhat-marketplace-4dk6x\" (UID: \"87b1a45a-8406-4a62-81f9-55bb601e3b65\") " pod="openshift-marketplace/redhat-marketplace-4dk6x" 
Mar 11 01:52:07 crc kubenswrapper[4744]: I0311 01:52:07.416770 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tlqp7\" (UniqueName: \"kubernetes.io/projected/87b1a45a-8406-4a62-81f9-55bb601e3b65-kube-api-access-tlqp7\") pod \"redhat-marketplace-4dk6x\" (UID: \"87b1a45a-8406-4a62-81f9-55bb601e3b65\") " pod="openshift-marketplace/redhat-marketplace-4dk6x" Mar 11 01:52:07 crc kubenswrapper[4744]: I0311 01:52:07.504734 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-4dk6x" Mar 11 01:52:07 crc kubenswrapper[4744]: I0311 01:52:07.773082 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-4dk6x"] Mar 11 01:52:08 crc kubenswrapper[4744]: I0311 01:52:08.390344 4744 generic.go:334] "Generic (PLEG): container finished" podID="87b1a45a-8406-4a62-81f9-55bb601e3b65" containerID="be1272a378b2adc5e7db7a0508be60b598689a8acb7ab2271bb61b55caa2f655" exitCode=0 Mar 11 01:52:08 crc kubenswrapper[4744]: I0311 01:52:08.390410 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-4dk6x" event={"ID":"87b1a45a-8406-4a62-81f9-55bb601e3b65","Type":"ContainerDied","Data":"be1272a378b2adc5e7db7a0508be60b598689a8acb7ab2271bb61b55caa2f655"} Mar 11 01:52:08 crc kubenswrapper[4744]: I0311 01:52:08.390451 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-4dk6x" event={"ID":"87b1a45a-8406-4a62-81f9-55bb601e3b65","Type":"ContainerStarted","Data":"a21a1995947414d613b66c2828ecab5b4df9123d637844604d44acc4d43b2737"} Mar 11 01:52:09 crc kubenswrapper[4744]: I0311 01:52:09.402696 4744 generic.go:334] "Generic (PLEG): container finished" podID="87b1a45a-8406-4a62-81f9-55bb601e3b65" containerID="6edaaaa45d1abb50a260305afbc219c3b6a10a9726b0756d4c102654c8553225" exitCode=0 Mar 11 01:52:09 crc kubenswrapper[4744]: I0311 01:52:09.402814 4744 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-4dk6x" event={"ID":"87b1a45a-8406-4a62-81f9-55bb601e3b65","Type":"ContainerDied","Data":"6edaaaa45d1abb50a260305afbc219c3b6a10a9726b0756d4c102654c8553225"} Mar 11 01:52:10 crc kubenswrapper[4744]: I0311 01:52:10.412639 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-4dk6x" event={"ID":"87b1a45a-8406-4a62-81f9-55bb601e3b65","Type":"ContainerStarted","Data":"2f69c74d33d6f495406a5c0a51f27dd8b660f37972eec3d0e2dcf81dff40f7ba"} Mar 11 01:52:13 crc kubenswrapper[4744]: I0311 01:52:13.106032 4744 scope.go:117] "RemoveContainer" containerID="a06be365853f07689b9cabd9d0bab42a4a4b63d8a15d44d86e370de6d88bfeeb" Mar 11 01:52:17 crc kubenswrapper[4744]: I0311 01:52:17.505735 4744 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-4dk6x" Mar 11 01:52:17 crc kubenswrapper[4744]: I0311 01:52:17.506090 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-4dk6x" Mar 11 01:52:17 crc kubenswrapper[4744]: I0311 01:52:17.587993 4744 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-4dk6x" Mar 11 01:52:17 crc kubenswrapper[4744]: I0311 01:52:17.620879 4744 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-4dk6x" podStartSLOduration=9.201760714 podStartE2EDuration="10.62085404s" podCreationTimestamp="2026-03-11 01:52:07 +0000 UTC" firstStartedPulling="2026-03-11 01:52:08.392601283 +0000 UTC m=+3485.196818928" lastFinishedPulling="2026-03-11 01:52:09.811694619 +0000 UTC m=+3486.615912254" observedRunningTime="2026-03-11 01:52:10.440327403 +0000 UTC m=+3487.244545008" watchObservedRunningTime="2026-03-11 01:52:17.62085404 +0000 UTC m=+3494.425071685" Mar 11 01:52:18 crc kubenswrapper[4744]: 
I0311 01:52:18.569979 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-4dk6x" Mar 11 01:52:18 crc kubenswrapper[4744]: I0311 01:52:18.645012 4744 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-4dk6x"] Mar 11 01:52:20 crc kubenswrapper[4744]: I0311 01:52:20.497827 4744 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-4dk6x" podUID="87b1a45a-8406-4a62-81f9-55bb601e3b65" containerName="registry-server" containerID="cri-o://2f69c74d33d6f495406a5c0a51f27dd8b660f37972eec3d0e2dcf81dff40f7ba" gracePeriod=2 Mar 11 01:52:20 crc kubenswrapper[4744]: I0311 01:52:20.882629 4744 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-4dk6x" Mar 11 01:52:21 crc kubenswrapper[4744]: I0311 01:52:21.020642 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tlqp7\" (UniqueName: \"kubernetes.io/projected/87b1a45a-8406-4a62-81f9-55bb601e3b65-kube-api-access-tlqp7\") pod \"87b1a45a-8406-4a62-81f9-55bb601e3b65\" (UID: \"87b1a45a-8406-4a62-81f9-55bb601e3b65\") " Mar 11 01:52:21 crc kubenswrapper[4744]: I0311 01:52:21.020734 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/87b1a45a-8406-4a62-81f9-55bb601e3b65-utilities\") pod \"87b1a45a-8406-4a62-81f9-55bb601e3b65\" (UID: \"87b1a45a-8406-4a62-81f9-55bb601e3b65\") " Mar 11 01:52:21 crc kubenswrapper[4744]: I0311 01:52:21.020802 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/87b1a45a-8406-4a62-81f9-55bb601e3b65-catalog-content\") pod \"87b1a45a-8406-4a62-81f9-55bb601e3b65\" (UID: \"87b1a45a-8406-4a62-81f9-55bb601e3b65\") " Mar 11 01:52:21 crc kubenswrapper[4744]: 
I0311 01:52:21.022133 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/87b1a45a-8406-4a62-81f9-55bb601e3b65-utilities" (OuterVolumeSpecName: "utilities") pod "87b1a45a-8406-4a62-81f9-55bb601e3b65" (UID: "87b1a45a-8406-4a62-81f9-55bb601e3b65"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 11 01:52:21 crc kubenswrapper[4744]: I0311 01:52:21.026659 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/87b1a45a-8406-4a62-81f9-55bb601e3b65-kube-api-access-tlqp7" (OuterVolumeSpecName: "kube-api-access-tlqp7") pod "87b1a45a-8406-4a62-81f9-55bb601e3b65" (UID: "87b1a45a-8406-4a62-81f9-55bb601e3b65"). InnerVolumeSpecName "kube-api-access-tlqp7". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 01:52:21 crc kubenswrapper[4744]: I0311 01:52:21.051062 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/87b1a45a-8406-4a62-81f9-55bb601e3b65-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "87b1a45a-8406-4a62-81f9-55bb601e3b65" (UID: "87b1a45a-8406-4a62-81f9-55bb601e3b65"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 11 01:52:21 crc kubenswrapper[4744]: I0311 01:52:21.122862 4744 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tlqp7\" (UniqueName: \"kubernetes.io/projected/87b1a45a-8406-4a62-81f9-55bb601e3b65-kube-api-access-tlqp7\") on node \"crc\" DevicePath \"\"" Mar 11 01:52:21 crc kubenswrapper[4744]: I0311 01:52:21.123156 4744 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/87b1a45a-8406-4a62-81f9-55bb601e3b65-utilities\") on node \"crc\" DevicePath \"\"" Mar 11 01:52:21 crc kubenswrapper[4744]: I0311 01:52:21.123276 4744 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/87b1a45a-8406-4a62-81f9-55bb601e3b65-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 11 01:52:21 crc kubenswrapper[4744]: I0311 01:52:21.510847 4744 generic.go:334] "Generic (PLEG): container finished" podID="87b1a45a-8406-4a62-81f9-55bb601e3b65" containerID="2f69c74d33d6f495406a5c0a51f27dd8b660f37972eec3d0e2dcf81dff40f7ba" exitCode=0 Mar 11 01:52:21 crc kubenswrapper[4744]: I0311 01:52:21.510911 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-4dk6x" event={"ID":"87b1a45a-8406-4a62-81f9-55bb601e3b65","Type":"ContainerDied","Data":"2f69c74d33d6f495406a5c0a51f27dd8b660f37972eec3d0e2dcf81dff40f7ba"} Mar 11 01:52:21 crc kubenswrapper[4744]: I0311 01:52:21.510911 4744 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-4dk6x" Mar 11 01:52:21 crc kubenswrapper[4744]: I0311 01:52:21.510980 4744 scope.go:117] "RemoveContainer" containerID="2f69c74d33d6f495406a5c0a51f27dd8b660f37972eec3d0e2dcf81dff40f7ba" Mar 11 01:52:21 crc kubenswrapper[4744]: I0311 01:52:21.510961 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-4dk6x" event={"ID":"87b1a45a-8406-4a62-81f9-55bb601e3b65","Type":"ContainerDied","Data":"a21a1995947414d613b66c2828ecab5b4df9123d637844604d44acc4d43b2737"} Mar 11 01:52:21 crc kubenswrapper[4744]: I0311 01:52:21.550452 4744 scope.go:117] "RemoveContainer" containerID="6edaaaa45d1abb50a260305afbc219c3b6a10a9726b0756d4c102654c8553225" Mar 11 01:52:21 crc kubenswrapper[4744]: I0311 01:52:21.567325 4744 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-4dk6x"] Mar 11 01:52:21 crc kubenswrapper[4744]: I0311 01:52:21.584374 4744 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-4dk6x"] Mar 11 01:52:21 crc kubenswrapper[4744]: I0311 01:52:21.602631 4744 scope.go:117] "RemoveContainer" containerID="be1272a378b2adc5e7db7a0508be60b598689a8acb7ab2271bb61b55caa2f655" Mar 11 01:52:21 crc kubenswrapper[4744]: I0311 01:52:21.625600 4744 scope.go:117] "RemoveContainer" containerID="2f69c74d33d6f495406a5c0a51f27dd8b660f37972eec3d0e2dcf81dff40f7ba" Mar 11 01:52:21 crc kubenswrapper[4744]: E0311 01:52:21.631146 4744 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2f69c74d33d6f495406a5c0a51f27dd8b660f37972eec3d0e2dcf81dff40f7ba\": container with ID starting with 2f69c74d33d6f495406a5c0a51f27dd8b660f37972eec3d0e2dcf81dff40f7ba not found: ID does not exist" containerID="2f69c74d33d6f495406a5c0a51f27dd8b660f37972eec3d0e2dcf81dff40f7ba" Mar 11 01:52:21 crc kubenswrapper[4744]: I0311 01:52:21.631205 4744 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2f69c74d33d6f495406a5c0a51f27dd8b660f37972eec3d0e2dcf81dff40f7ba"} err="failed to get container status \"2f69c74d33d6f495406a5c0a51f27dd8b660f37972eec3d0e2dcf81dff40f7ba\": rpc error: code = NotFound desc = could not find container \"2f69c74d33d6f495406a5c0a51f27dd8b660f37972eec3d0e2dcf81dff40f7ba\": container with ID starting with 2f69c74d33d6f495406a5c0a51f27dd8b660f37972eec3d0e2dcf81dff40f7ba not found: ID does not exist" Mar 11 01:52:21 crc kubenswrapper[4744]: I0311 01:52:21.631241 4744 scope.go:117] "RemoveContainer" containerID="6edaaaa45d1abb50a260305afbc219c3b6a10a9726b0756d4c102654c8553225" Mar 11 01:52:21 crc kubenswrapper[4744]: E0311 01:52:21.631760 4744 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6edaaaa45d1abb50a260305afbc219c3b6a10a9726b0756d4c102654c8553225\": container with ID starting with 6edaaaa45d1abb50a260305afbc219c3b6a10a9726b0756d4c102654c8553225 not found: ID does not exist" containerID="6edaaaa45d1abb50a260305afbc219c3b6a10a9726b0756d4c102654c8553225" Mar 11 01:52:21 crc kubenswrapper[4744]: I0311 01:52:21.631822 4744 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6edaaaa45d1abb50a260305afbc219c3b6a10a9726b0756d4c102654c8553225"} err="failed to get container status \"6edaaaa45d1abb50a260305afbc219c3b6a10a9726b0756d4c102654c8553225\": rpc error: code = NotFound desc = could not find container \"6edaaaa45d1abb50a260305afbc219c3b6a10a9726b0756d4c102654c8553225\": container with ID starting with 6edaaaa45d1abb50a260305afbc219c3b6a10a9726b0756d4c102654c8553225 not found: ID does not exist" Mar 11 01:52:21 crc kubenswrapper[4744]: I0311 01:52:21.631851 4744 scope.go:117] "RemoveContainer" containerID="be1272a378b2adc5e7db7a0508be60b598689a8acb7ab2271bb61b55caa2f655" Mar 11 01:52:21 crc kubenswrapper[4744]: E0311 
01:52:21.632197 4744 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"be1272a378b2adc5e7db7a0508be60b598689a8acb7ab2271bb61b55caa2f655\": container with ID starting with be1272a378b2adc5e7db7a0508be60b598689a8acb7ab2271bb61b55caa2f655 not found: ID does not exist" containerID="be1272a378b2adc5e7db7a0508be60b598689a8acb7ab2271bb61b55caa2f655" Mar 11 01:52:21 crc kubenswrapper[4744]: I0311 01:52:21.632230 4744 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"be1272a378b2adc5e7db7a0508be60b598689a8acb7ab2271bb61b55caa2f655"} err="failed to get container status \"be1272a378b2adc5e7db7a0508be60b598689a8acb7ab2271bb61b55caa2f655\": rpc error: code = NotFound desc = could not find container \"be1272a378b2adc5e7db7a0508be60b598689a8acb7ab2271bb61b55caa2f655\": container with ID starting with be1272a378b2adc5e7db7a0508be60b598689a8acb7ab2271bb61b55caa2f655 not found: ID does not exist" Mar 11 01:52:22 crc kubenswrapper[4744]: I0311 01:52:22.001121 4744 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="87b1a45a-8406-4a62-81f9-55bb601e3b65" path="/var/lib/kubelet/pods/87b1a45a-8406-4a62-81f9-55bb601e3b65/volumes" Mar 11 01:52:30 crc kubenswrapper[4744]: I0311 01:52:30.487217 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-m24rz"] Mar 11 01:52:30 crc kubenswrapper[4744]: E0311 01:52:30.488145 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="87b1a45a-8406-4a62-81f9-55bb601e3b65" containerName="extract-content" Mar 11 01:52:30 crc kubenswrapper[4744]: I0311 01:52:30.488166 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="87b1a45a-8406-4a62-81f9-55bb601e3b65" containerName="extract-content" Mar 11 01:52:30 crc kubenswrapper[4744]: E0311 01:52:30.488191 4744 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="87b1a45a-8406-4a62-81f9-55bb601e3b65" containerName="registry-server" Mar 11 01:52:30 crc kubenswrapper[4744]: I0311 01:52:30.488203 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="87b1a45a-8406-4a62-81f9-55bb601e3b65" containerName="registry-server" Mar 11 01:52:30 crc kubenswrapper[4744]: E0311 01:52:30.488241 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="87b1a45a-8406-4a62-81f9-55bb601e3b65" containerName="extract-utilities" Mar 11 01:52:30 crc kubenswrapper[4744]: I0311 01:52:30.488255 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="87b1a45a-8406-4a62-81f9-55bb601e3b65" containerName="extract-utilities" Mar 11 01:52:30 crc kubenswrapper[4744]: I0311 01:52:30.488486 4744 memory_manager.go:354] "RemoveStaleState removing state" podUID="87b1a45a-8406-4a62-81f9-55bb601e3b65" containerName="registry-server" Mar 11 01:52:30 crc kubenswrapper[4744]: I0311 01:52:30.490139 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-m24rz" Mar 11 01:52:30 crc kubenswrapper[4744]: I0311 01:52:30.520725 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-m24rz"] Mar 11 01:52:30 crc kubenswrapper[4744]: I0311 01:52:30.672578 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1f31f97c-f53c-4fbe-8311-d52e64ea906b-utilities\") pod \"certified-operators-m24rz\" (UID: \"1f31f97c-f53c-4fbe-8311-d52e64ea906b\") " pod="openshift-marketplace/certified-operators-m24rz" Mar 11 01:52:30 crc kubenswrapper[4744]: I0311 01:52:30.672630 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1f31f97c-f53c-4fbe-8311-d52e64ea906b-catalog-content\") pod \"certified-operators-m24rz\" (UID: 
\"1f31f97c-f53c-4fbe-8311-d52e64ea906b\") " pod="openshift-marketplace/certified-operators-m24rz" Mar 11 01:52:30 crc kubenswrapper[4744]: I0311 01:52:30.672673 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-98f84\" (UniqueName: \"kubernetes.io/projected/1f31f97c-f53c-4fbe-8311-d52e64ea906b-kube-api-access-98f84\") pod \"certified-operators-m24rz\" (UID: \"1f31f97c-f53c-4fbe-8311-d52e64ea906b\") " pod="openshift-marketplace/certified-operators-m24rz" Mar 11 01:52:30 crc kubenswrapper[4744]: I0311 01:52:30.774883 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1f31f97c-f53c-4fbe-8311-d52e64ea906b-utilities\") pod \"certified-operators-m24rz\" (UID: \"1f31f97c-f53c-4fbe-8311-d52e64ea906b\") " pod="openshift-marketplace/certified-operators-m24rz" Mar 11 01:52:30 crc kubenswrapper[4744]: I0311 01:52:30.774939 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1f31f97c-f53c-4fbe-8311-d52e64ea906b-catalog-content\") pod \"certified-operators-m24rz\" (UID: \"1f31f97c-f53c-4fbe-8311-d52e64ea906b\") " pod="openshift-marketplace/certified-operators-m24rz" Mar 11 01:52:30 crc kubenswrapper[4744]: I0311 01:52:30.774979 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-98f84\" (UniqueName: \"kubernetes.io/projected/1f31f97c-f53c-4fbe-8311-d52e64ea906b-kube-api-access-98f84\") pod \"certified-operators-m24rz\" (UID: \"1f31f97c-f53c-4fbe-8311-d52e64ea906b\") " pod="openshift-marketplace/certified-operators-m24rz" Mar 11 01:52:30 crc kubenswrapper[4744]: I0311 01:52:30.775487 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1f31f97c-f53c-4fbe-8311-d52e64ea906b-utilities\") pod \"certified-operators-m24rz\" (UID: 
\"1f31f97c-f53c-4fbe-8311-d52e64ea906b\") " pod="openshift-marketplace/certified-operators-m24rz" Mar 11 01:52:30 crc kubenswrapper[4744]: I0311 01:52:30.775548 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1f31f97c-f53c-4fbe-8311-d52e64ea906b-catalog-content\") pod \"certified-operators-m24rz\" (UID: \"1f31f97c-f53c-4fbe-8311-d52e64ea906b\") " pod="openshift-marketplace/certified-operators-m24rz" Mar 11 01:52:30 crc kubenswrapper[4744]: I0311 01:52:30.801713 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-98f84\" (UniqueName: \"kubernetes.io/projected/1f31f97c-f53c-4fbe-8311-d52e64ea906b-kube-api-access-98f84\") pod \"certified-operators-m24rz\" (UID: \"1f31f97c-f53c-4fbe-8311-d52e64ea906b\") " pod="openshift-marketplace/certified-operators-m24rz" Mar 11 01:52:30 crc kubenswrapper[4744]: I0311 01:52:30.833147 4744 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-m24rz" Mar 11 01:52:31 crc kubenswrapper[4744]: I0311 01:52:31.316814 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-m24rz"] Mar 11 01:52:31 crc kubenswrapper[4744]: I0311 01:52:31.599192 4744 generic.go:334] "Generic (PLEG): container finished" podID="1f31f97c-f53c-4fbe-8311-d52e64ea906b" containerID="5fb8c49fc25163033d58fc4f6eee2e4cf93208fd72b2c425cd132958bf480c3b" exitCode=0 Mar 11 01:52:31 crc kubenswrapper[4744]: I0311 01:52:31.599244 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-m24rz" event={"ID":"1f31f97c-f53c-4fbe-8311-d52e64ea906b","Type":"ContainerDied","Data":"5fb8c49fc25163033d58fc4f6eee2e4cf93208fd72b2c425cd132958bf480c3b"} Mar 11 01:52:31 crc kubenswrapper[4744]: I0311 01:52:31.599450 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-m24rz" event={"ID":"1f31f97c-f53c-4fbe-8311-d52e64ea906b","Type":"ContainerStarted","Data":"0563813d18b9c5413d085bb61ba19954872f217cd095910ca301120d891f4337"} Mar 11 01:52:32 crc kubenswrapper[4744]: I0311 01:52:32.610496 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-m24rz" event={"ID":"1f31f97c-f53c-4fbe-8311-d52e64ea906b","Type":"ContainerStarted","Data":"f7ad4c2cf3ccd9088fd46322652786243388efa33303a4f4223f7cf6688b2995"} Mar 11 01:52:33 crc kubenswrapper[4744]: I0311 01:52:33.621993 4744 generic.go:334] "Generic (PLEG): container finished" podID="1f31f97c-f53c-4fbe-8311-d52e64ea906b" containerID="f7ad4c2cf3ccd9088fd46322652786243388efa33303a4f4223f7cf6688b2995" exitCode=0 Mar 11 01:52:33 crc kubenswrapper[4744]: I0311 01:52:33.622069 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-m24rz" 
event={"ID":"1f31f97c-f53c-4fbe-8311-d52e64ea906b","Type":"ContainerDied","Data":"f7ad4c2cf3ccd9088fd46322652786243388efa33303a4f4223f7cf6688b2995"} Mar 11 01:52:34 crc kubenswrapper[4744]: I0311 01:52:34.634012 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-m24rz" event={"ID":"1f31f97c-f53c-4fbe-8311-d52e64ea906b","Type":"ContainerStarted","Data":"34219bfd0c8eac6c53526dc0b8acc07511df7a4959f58e53c4f8b3bdcaace797"} Mar 11 01:52:34 crc kubenswrapper[4744]: I0311 01:52:34.662946 4744 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-m24rz" podStartSLOduration=2.0883015 podStartE2EDuration="4.662931576s" podCreationTimestamp="2026-03-11 01:52:30 +0000 UTC" firstStartedPulling="2026-03-11 01:52:31.600416226 +0000 UTC m=+3508.404633831" lastFinishedPulling="2026-03-11 01:52:34.175046252 +0000 UTC m=+3510.979263907" observedRunningTime="2026-03-11 01:52:34.658927089 +0000 UTC m=+3511.463144704" watchObservedRunningTime="2026-03-11 01:52:34.662931576 +0000 UTC m=+3511.467149181" Mar 11 01:52:40 crc kubenswrapper[4744]: I0311 01:52:40.098066 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-jvgl2"] Mar 11 01:52:40 crc kubenswrapper[4744]: I0311 01:52:40.101975 4744 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-jvgl2" Mar 11 01:52:40 crc kubenswrapper[4744]: I0311 01:52:40.125184 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-jvgl2"] Mar 11 01:52:40 crc kubenswrapper[4744]: I0311 01:52:40.230753 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ff9de606-72dd-47e2-9d5c-7ede1db0d2fd-catalog-content\") pod \"redhat-operators-jvgl2\" (UID: \"ff9de606-72dd-47e2-9d5c-7ede1db0d2fd\") " pod="openshift-marketplace/redhat-operators-jvgl2" Mar 11 01:52:40 crc kubenswrapper[4744]: I0311 01:52:40.230826 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zzg5g\" (UniqueName: \"kubernetes.io/projected/ff9de606-72dd-47e2-9d5c-7ede1db0d2fd-kube-api-access-zzg5g\") pod \"redhat-operators-jvgl2\" (UID: \"ff9de606-72dd-47e2-9d5c-7ede1db0d2fd\") " pod="openshift-marketplace/redhat-operators-jvgl2" Mar 11 01:52:40 crc kubenswrapper[4744]: I0311 01:52:40.230862 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ff9de606-72dd-47e2-9d5c-7ede1db0d2fd-utilities\") pod \"redhat-operators-jvgl2\" (UID: \"ff9de606-72dd-47e2-9d5c-7ede1db0d2fd\") " pod="openshift-marketplace/redhat-operators-jvgl2" Mar 11 01:52:40 crc kubenswrapper[4744]: I0311 01:52:40.332248 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ff9de606-72dd-47e2-9d5c-7ede1db0d2fd-catalog-content\") pod \"redhat-operators-jvgl2\" (UID: \"ff9de606-72dd-47e2-9d5c-7ede1db0d2fd\") " pod="openshift-marketplace/redhat-operators-jvgl2" Mar 11 01:52:40 crc kubenswrapper[4744]: I0311 01:52:40.332352 4744 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"kube-api-access-zzg5g\" (UniqueName: \"kubernetes.io/projected/ff9de606-72dd-47e2-9d5c-7ede1db0d2fd-kube-api-access-zzg5g\") pod \"redhat-operators-jvgl2\" (UID: \"ff9de606-72dd-47e2-9d5c-7ede1db0d2fd\") " pod="openshift-marketplace/redhat-operators-jvgl2" Mar 11 01:52:40 crc kubenswrapper[4744]: I0311 01:52:40.332408 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ff9de606-72dd-47e2-9d5c-7ede1db0d2fd-utilities\") pod \"redhat-operators-jvgl2\" (UID: \"ff9de606-72dd-47e2-9d5c-7ede1db0d2fd\") " pod="openshift-marketplace/redhat-operators-jvgl2" Mar 11 01:52:40 crc kubenswrapper[4744]: I0311 01:52:40.333247 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ff9de606-72dd-47e2-9d5c-7ede1db0d2fd-utilities\") pod \"redhat-operators-jvgl2\" (UID: \"ff9de606-72dd-47e2-9d5c-7ede1db0d2fd\") " pod="openshift-marketplace/redhat-operators-jvgl2" Mar 11 01:52:40 crc kubenswrapper[4744]: I0311 01:52:40.333769 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ff9de606-72dd-47e2-9d5c-7ede1db0d2fd-catalog-content\") pod \"redhat-operators-jvgl2\" (UID: \"ff9de606-72dd-47e2-9d5c-7ede1db0d2fd\") " pod="openshift-marketplace/redhat-operators-jvgl2" Mar 11 01:52:40 crc kubenswrapper[4744]: I0311 01:52:40.358364 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zzg5g\" (UniqueName: \"kubernetes.io/projected/ff9de606-72dd-47e2-9d5c-7ede1db0d2fd-kube-api-access-zzg5g\") pod \"redhat-operators-jvgl2\" (UID: \"ff9de606-72dd-47e2-9d5c-7ede1db0d2fd\") " pod="openshift-marketplace/redhat-operators-jvgl2" Mar 11 01:52:40 crc kubenswrapper[4744]: I0311 01:52:40.430042 4744 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-jvgl2" Mar 11 01:52:40 crc kubenswrapper[4744]: I0311 01:52:40.687069 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-jvgl2"] Mar 11 01:52:40 crc kubenswrapper[4744]: I0311 01:52:40.833690 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-m24rz" Mar 11 01:52:40 crc kubenswrapper[4744]: I0311 01:52:40.833751 4744 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-m24rz" Mar 11 01:52:40 crc kubenswrapper[4744]: I0311 01:52:40.872648 4744 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-m24rz" Mar 11 01:52:41 crc kubenswrapper[4744]: I0311 01:52:41.686358 4744 generic.go:334] "Generic (PLEG): container finished" podID="ff9de606-72dd-47e2-9d5c-7ede1db0d2fd" containerID="f3146fd60b5c6dfdccb35fe4857e27f6b04482ae5ee6d727fad7150089d9ce92" exitCode=0 Mar 11 01:52:41 crc kubenswrapper[4744]: I0311 01:52:41.686663 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-jvgl2" event={"ID":"ff9de606-72dd-47e2-9d5c-7ede1db0d2fd","Type":"ContainerDied","Data":"f3146fd60b5c6dfdccb35fe4857e27f6b04482ae5ee6d727fad7150089d9ce92"} Mar 11 01:52:41 crc kubenswrapper[4744]: I0311 01:52:41.686700 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-jvgl2" event={"ID":"ff9de606-72dd-47e2-9d5c-7ede1db0d2fd","Type":"ContainerStarted","Data":"8cfa89dc48f74b19ad50dfa45e57711d39e148d65d0fbbd47d185f30a9048efd"} Mar 11 01:52:41 crc kubenswrapper[4744]: I0311 01:52:41.754270 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-m24rz" Mar 11 01:52:42 crc kubenswrapper[4744]: I0311 01:52:42.697259 4744 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openshift-marketplace/redhat-operators-jvgl2" event={"ID":"ff9de606-72dd-47e2-9d5c-7ede1db0d2fd","Type":"ContainerStarted","Data":"b8799502aa2300ab6c880af1b4082cc426bee4152f8478803470e72a23b4797f"} Mar 11 01:52:43 crc kubenswrapper[4744]: I0311 01:52:43.259610 4744 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-m24rz"] Mar 11 01:52:43 crc kubenswrapper[4744]: I0311 01:52:43.709307 4744 generic.go:334] "Generic (PLEG): container finished" podID="ff9de606-72dd-47e2-9d5c-7ede1db0d2fd" containerID="b8799502aa2300ab6c880af1b4082cc426bee4152f8478803470e72a23b4797f" exitCode=0 Mar 11 01:52:43 crc kubenswrapper[4744]: I0311 01:52:43.709390 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-jvgl2" event={"ID":"ff9de606-72dd-47e2-9d5c-7ede1db0d2fd","Type":"ContainerDied","Data":"b8799502aa2300ab6c880af1b4082cc426bee4152f8478803470e72a23b4797f"} Mar 11 01:52:43 crc kubenswrapper[4744]: I0311 01:52:43.709696 4744 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-m24rz" podUID="1f31f97c-f53c-4fbe-8311-d52e64ea906b" containerName="registry-server" containerID="cri-o://34219bfd0c8eac6c53526dc0b8acc07511df7a4959f58e53c4f8b3bdcaace797" gracePeriod=2 Mar 11 01:52:44 crc kubenswrapper[4744]: I0311 01:52:44.178656 4744 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-m24rz" Mar 11 01:52:44 crc kubenswrapper[4744]: I0311 01:52:44.292099 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-98f84\" (UniqueName: \"kubernetes.io/projected/1f31f97c-f53c-4fbe-8311-d52e64ea906b-kube-api-access-98f84\") pod \"1f31f97c-f53c-4fbe-8311-d52e64ea906b\" (UID: \"1f31f97c-f53c-4fbe-8311-d52e64ea906b\") " Mar 11 01:52:44 crc kubenswrapper[4744]: I0311 01:52:44.292247 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1f31f97c-f53c-4fbe-8311-d52e64ea906b-utilities\") pod \"1f31f97c-f53c-4fbe-8311-d52e64ea906b\" (UID: \"1f31f97c-f53c-4fbe-8311-d52e64ea906b\") " Mar 11 01:52:44 crc kubenswrapper[4744]: I0311 01:52:44.292279 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1f31f97c-f53c-4fbe-8311-d52e64ea906b-catalog-content\") pod \"1f31f97c-f53c-4fbe-8311-d52e64ea906b\" (UID: \"1f31f97c-f53c-4fbe-8311-d52e64ea906b\") " Mar 11 01:52:44 crc kubenswrapper[4744]: I0311 01:52:44.295505 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1f31f97c-f53c-4fbe-8311-d52e64ea906b-utilities" (OuterVolumeSpecName: "utilities") pod "1f31f97c-f53c-4fbe-8311-d52e64ea906b" (UID: "1f31f97c-f53c-4fbe-8311-d52e64ea906b"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 11 01:52:44 crc kubenswrapper[4744]: I0311 01:52:44.298931 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1f31f97c-f53c-4fbe-8311-d52e64ea906b-kube-api-access-98f84" (OuterVolumeSpecName: "kube-api-access-98f84") pod "1f31f97c-f53c-4fbe-8311-d52e64ea906b" (UID: "1f31f97c-f53c-4fbe-8311-d52e64ea906b"). InnerVolumeSpecName "kube-api-access-98f84". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 01:52:44 crc kubenswrapper[4744]: I0311 01:52:44.365396 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1f31f97c-f53c-4fbe-8311-d52e64ea906b-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "1f31f97c-f53c-4fbe-8311-d52e64ea906b" (UID: "1f31f97c-f53c-4fbe-8311-d52e64ea906b"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 11 01:52:44 crc kubenswrapper[4744]: I0311 01:52:44.395049 4744 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-98f84\" (UniqueName: \"kubernetes.io/projected/1f31f97c-f53c-4fbe-8311-d52e64ea906b-kube-api-access-98f84\") on node \"crc\" DevicePath \"\"" Mar 11 01:52:44 crc kubenswrapper[4744]: I0311 01:52:44.395097 4744 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1f31f97c-f53c-4fbe-8311-d52e64ea906b-utilities\") on node \"crc\" DevicePath \"\"" Mar 11 01:52:44 crc kubenswrapper[4744]: I0311 01:52:44.395118 4744 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1f31f97c-f53c-4fbe-8311-d52e64ea906b-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 11 01:52:44 crc kubenswrapper[4744]: I0311 01:52:44.722853 4744 generic.go:334] "Generic (PLEG): container finished" podID="1f31f97c-f53c-4fbe-8311-d52e64ea906b" containerID="34219bfd0c8eac6c53526dc0b8acc07511df7a4959f58e53c4f8b3bdcaace797" exitCode=0 Mar 11 01:52:44 crc kubenswrapper[4744]: I0311 01:52:44.723316 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-m24rz" event={"ID":"1f31f97c-f53c-4fbe-8311-d52e64ea906b","Type":"ContainerDied","Data":"34219bfd0c8eac6c53526dc0b8acc07511df7a4959f58e53c4f8b3bdcaace797"} Mar 11 01:52:44 crc kubenswrapper[4744]: I0311 01:52:44.723351 4744 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openshift-marketplace/certified-operators-m24rz" event={"ID":"1f31f97c-f53c-4fbe-8311-d52e64ea906b","Type":"ContainerDied","Data":"0563813d18b9c5413d085bb61ba19954872f217cd095910ca301120d891f4337"} Mar 11 01:52:44 crc kubenswrapper[4744]: I0311 01:52:44.723371 4744 scope.go:117] "RemoveContainer" containerID="34219bfd0c8eac6c53526dc0b8acc07511df7a4959f58e53c4f8b3bdcaace797" Mar 11 01:52:44 crc kubenswrapper[4744]: I0311 01:52:44.723841 4744 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-m24rz" Mar 11 01:52:44 crc kubenswrapper[4744]: I0311 01:52:44.731941 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-jvgl2" event={"ID":"ff9de606-72dd-47e2-9d5c-7ede1db0d2fd","Type":"ContainerStarted","Data":"90f7f61821f50ecedf825de74fec8c32898cc95dbf2e0fc0ea5503cf760b8840"} Mar 11 01:52:44 crc kubenswrapper[4744]: I0311 01:52:44.755571 4744 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-jvgl2" podStartSLOduration=2.269508351 podStartE2EDuration="4.755545558s" podCreationTimestamp="2026-03-11 01:52:40 +0000 UTC" firstStartedPulling="2026-03-11 01:52:41.688171943 +0000 UTC m=+3518.492389548" lastFinishedPulling="2026-03-11 01:52:44.17420914 +0000 UTC m=+3520.978426755" observedRunningTime="2026-03-11 01:52:44.751618923 +0000 UTC m=+3521.555836558" watchObservedRunningTime="2026-03-11 01:52:44.755545558 +0000 UTC m=+3521.559763183" Mar 11 01:52:44 crc kubenswrapper[4744]: I0311 01:52:44.777266 4744 scope.go:117] "RemoveContainer" containerID="f7ad4c2cf3ccd9088fd46322652786243388efa33303a4f4223f7cf6688b2995" Mar 11 01:52:44 crc kubenswrapper[4744]: I0311 01:52:44.781871 4744 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-m24rz"] Mar 11 01:52:44 crc kubenswrapper[4744]: I0311 01:52:44.789843 4744 kubelet.go:2431] "SyncLoop REMOVE" 
source="api" pods=["openshift-marketplace/certified-operators-m24rz"] Mar 11 01:52:44 crc kubenswrapper[4744]: I0311 01:52:44.836412 4744 scope.go:117] "RemoveContainer" containerID="5fb8c49fc25163033d58fc4f6eee2e4cf93208fd72b2c425cd132958bf480c3b" Mar 11 01:52:44 crc kubenswrapper[4744]: I0311 01:52:44.868510 4744 scope.go:117] "RemoveContainer" containerID="34219bfd0c8eac6c53526dc0b8acc07511df7a4959f58e53c4f8b3bdcaace797" Mar 11 01:52:44 crc kubenswrapper[4744]: E0311 01:52:44.869058 4744 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"34219bfd0c8eac6c53526dc0b8acc07511df7a4959f58e53c4f8b3bdcaace797\": container with ID starting with 34219bfd0c8eac6c53526dc0b8acc07511df7a4959f58e53c4f8b3bdcaace797 not found: ID does not exist" containerID="34219bfd0c8eac6c53526dc0b8acc07511df7a4959f58e53c4f8b3bdcaace797" Mar 11 01:52:44 crc kubenswrapper[4744]: I0311 01:52:44.869100 4744 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"34219bfd0c8eac6c53526dc0b8acc07511df7a4959f58e53c4f8b3bdcaace797"} err="failed to get container status \"34219bfd0c8eac6c53526dc0b8acc07511df7a4959f58e53c4f8b3bdcaace797\": rpc error: code = NotFound desc = could not find container \"34219bfd0c8eac6c53526dc0b8acc07511df7a4959f58e53c4f8b3bdcaace797\": container with ID starting with 34219bfd0c8eac6c53526dc0b8acc07511df7a4959f58e53c4f8b3bdcaace797 not found: ID does not exist" Mar 11 01:52:44 crc kubenswrapper[4744]: I0311 01:52:44.869124 4744 scope.go:117] "RemoveContainer" containerID="f7ad4c2cf3ccd9088fd46322652786243388efa33303a4f4223f7cf6688b2995" Mar 11 01:52:44 crc kubenswrapper[4744]: E0311 01:52:44.869405 4744 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f7ad4c2cf3ccd9088fd46322652786243388efa33303a4f4223f7cf6688b2995\": container with ID starting with 
f7ad4c2cf3ccd9088fd46322652786243388efa33303a4f4223f7cf6688b2995 not found: ID does not exist" containerID="f7ad4c2cf3ccd9088fd46322652786243388efa33303a4f4223f7cf6688b2995" Mar 11 01:52:44 crc kubenswrapper[4744]: I0311 01:52:44.869438 4744 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f7ad4c2cf3ccd9088fd46322652786243388efa33303a4f4223f7cf6688b2995"} err="failed to get container status \"f7ad4c2cf3ccd9088fd46322652786243388efa33303a4f4223f7cf6688b2995\": rpc error: code = NotFound desc = could not find container \"f7ad4c2cf3ccd9088fd46322652786243388efa33303a4f4223f7cf6688b2995\": container with ID starting with f7ad4c2cf3ccd9088fd46322652786243388efa33303a4f4223f7cf6688b2995 not found: ID does not exist" Mar 11 01:52:44 crc kubenswrapper[4744]: I0311 01:52:44.869460 4744 scope.go:117] "RemoveContainer" containerID="5fb8c49fc25163033d58fc4f6eee2e4cf93208fd72b2c425cd132958bf480c3b" Mar 11 01:52:44 crc kubenswrapper[4744]: E0311 01:52:44.869732 4744 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5fb8c49fc25163033d58fc4f6eee2e4cf93208fd72b2c425cd132958bf480c3b\": container with ID starting with 5fb8c49fc25163033d58fc4f6eee2e4cf93208fd72b2c425cd132958bf480c3b not found: ID does not exist" containerID="5fb8c49fc25163033d58fc4f6eee2e4cf93208fd72b2c425cd132958bf480c3b" Mar 11 01:52:44 crc kubenswrapper[4744]: I0311 01:52:44.869761 4744 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5fb8c49fc25163033d58fc4f6eee2e4cf93208fd72b2c425cd132958bf480c3b"} err="failed to get container status \"5fb8c49fc25163033d58fc4f6eee2e4cf93208fd72b2c425cd132958bf480c3b\": rpc error: code = NotFound desc = could not find container \"5fb8c49fc25163033d58fc4f6eee2e4cf93208fd72b2c425cd132958bf480c3b\": container with ID starting with 5fb8c49fc25163033d58fc4f6eee2e4cf93208fd72b2c425cd132958bf480c3b not found: ID does not 
exist" Mar 11 01:52:45 crc kubenswrapper[4744]: I0311 01:52:45.982937 4744 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1f31f97c-f53c-4fbe-8311-d52e64ea906b" path="/var/lib/kubelet/pods/1f31f97c-f53c-4fbe-8311-d52e64ea906b/volumes" Mar 11 01:52:50 crc kubenswrapper[4744]: I0311 01:52:50.431320 4744 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-jvgl2" Mar 11 01:52:50 crc kubenswrapper[4744]: I0311 01:52:50.431693 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-jvgl2" Mar 11 01:52:51 crc kubenswrapper[4744]: I0311 01:52:51.483483 4744 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-jvgl2" podUID="ff9de606-72dd-47e2-9d5c-7ede1db0d2fd" containerName="registry-server" probeResult="failure" output=< Mar 11 01:52:51 crc kubenswrapper[4744]: timeout: failed to connect service ":50051" within 1s Mar 11 01:52:51 crc kubenswrapper[4744]: > Mar 11 01:53:00 crc kubenswrapper[4744]: I0311 01:53:00.480915 4744 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-jvgl2" Mar 11 01:53:00 crc kubenswrapper[4744]: I0311 01:53:00.539573 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-jvgl2" Mar 11 01:53:01 crc kubenswrapper[4744]: I0311 01:53:01.693408 4744 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-jvgl2"] Mar 11 01:53:01 crc kubenswrapper[4744]: I0311 01:53:01.871388 4744 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-jvgl2" podUID="ff9de606-72dd-47e2-9d5c-7ede1db0d2fd" containerName="registry-server" containerID="cri-o://90f7f61821f50ecedf825de74fec8c32898cc95dbf2e0fc0ea5503cf760b8840" gracePeriod=2 Mar 11 01:53:02 crc 
kubenswrapper[4744]: I0311 01:53:02.338882 4744 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-jvgl2" Mar 11 01:53:02 crc kubenswrapper[4744]: I0311 01:53:02.483091 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ff9de606-72dd-47e2-9d5c-7ede1db0d2fd-catalog-content\") pod \"ff9de606-72dd-47e2-9d5c-7ede1db0d2fd\" (UID: \"ff9de606-72dd-47e2-9d5c-7ede1db0d2fd\") " Mar 11 01:53:02 crc kubenswrapper[4744]: I0311 01:53:02.483248 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zzg5g\" (UniqueName: \"kubernetes.io/projected/ff9de606-72dd-47e2-9d5c-7ede1db0d2fd-kube-api-access-zzg5g\") pod \"ff9de606-72dd-47e2-9d5c-7ede1db0d2fd\" (UID: \"ff9de606-72dd-47e2-9d5c-7ede1db0d2fd\") " Mar 11 01:53:02 crc kubenswrapper[4744]: I0311 01:53:02.483289 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ff9de606-72dd-47e2-9d5c-7ede1db0d2fd-utilities\") pod \"ff9de606-72dd-47e2-9d5c-7ede1db0d2fd\" (UID: \"ff9de606-72dd-47e2-9d5c-7ede1db0d2fd\") " Mar 11 01:53:02 crc kubenswrapper[4744]: I0311 01:53:02.485130 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ff9de606-72dd-47e2-9d5c-7ede1db0d2fd-utilities" (OuterVolumeSpecName: "utilities") pod "ff9de606-72dd-47e2-9d5c-7ede1db0d2fd" (UID: "ff9de606-72dd-47e2-9d5c-7ede1db0d2fd"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 11 01:53:02 crc kubenswrapper[4744]: I0311 01:53:02.492244 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ff9de606-72dd-47e2-9d5c-7ede1db0d2fd-kube-api-access-zzg5g" (OuterVolumeSpecName: "kube-api-access-zzg5g") pod "ff9de606-72dd-47e2-9d5c-7ede1db0d2fd" (UID: "ff9de606-72dd-47e2-9d5c-7ede1db0d2fd"). InnerVolumeSpecName "kube-api-access-zzg5g". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 01:53:02 crc kubenswrapper[4744]: I0311 01:53:02.585204 4744 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zzg5g\" (UniqueName: \"kubernetes.io/projected/ff9de606-72dd-47e2-9d5c-7ede1db0d2fd-kube-api-access-zzg5g\") on node \"crc\" DevicePath \"\"" Mar 11 01:53:02 crc kubenswrapper[4744]: I0311 01:53:02.585292 4744 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ff9de606-72dd-47e2-9d5c-7ede1db0d2fd-utilities\") on node \"crc\" DevicePath \"\"" Mar 11 01:53:02 crc kubenswrapper[4744]: I0311 01:53:02.669716 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ff9de606-72dd-47e2-9d5c-7ede1db0d2fd-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "ff9de606-72dd-47e2-9d5c-7ede1db0d2fd" (UID: "ff9de606-72dd-47e2-9d5c-7ede1db0d2fd"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 11 01:53:02 crc kubenswrapper[4744]: I0311 01:53:02.687131 4744 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ff9de606-72dd-47e2-9d5c-7ede1db0d2fd-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 11 01:53:02 crc kubenswrapper[4744]: I0311 01:53:02.897907 4744 generic.go:334] "Generic (PLEG): container finished" podID="ff9de606-72dd-47e2-9d5c-7ede1db0d2fd" containerID="90f7f61821f50ecedf825de74fec8c32898cc95dbf2e0fc0ea5503cf760b8840" exitCode=0 Mar 11 01:53:02 crc kubenswrapper[4744]: I0311 01:53:02.897996 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-jvgl2" event={"ID":"ff9de606-72dd-47e2-9d5c-7ede1db0d2fd","Type":"ContainerDied","Data":"90f7f61821f50ecedf825de74fec8c32898cc95dbf2e0fc0ea5503cf760b8840"} Mar 11 01:53:02 crc kubenswrapper[4744]: I0311 01:53:02.898018 4744 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-jvgl2" Mar 11 01:53:02 crc kubenswrapper[4744]: I0311 01:53:02.898070 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-jvgl2" event={"ID":"ff9de606-72dd-47e2-9d5c-7ede1db0d2fd","Type":"ContainerDied","Data":"8cfa89dc48f74b19ad50dfa45e57711d39e148d65d0fbbd47d185f30a9048efd"} Mar 11 01:53:02 crc kubenswrapper[4744]: I0311 01:53:02.898110 4744 scope.go:117] "RemoveContainer" containerID="90f7f61821f50ecedf825de74fec8c32898cc95dbf2e0fc0ea5503cf760b8840" Mar 11 01:53:02 crc kubenswrapper[4744]: I0311 01:53:02.940256 4744 scope.go:117] "RemoveContainer" containerID="b8799502aa2300ab6c880af1b4082cc426bee4152f8478803470e72a23b4797f" Mar 11 01:53:02 crc kubenswrapper[4744]: I0311 01:53:02.955130 4744 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-jvgl2"] Mar 11 01:53:02 crc kubenswrapper[4744]: I0311 01:53:02.966470 4744 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-jvgl2"] Mar 11 01:53:02 crc kubenswrapper[4744]: I0311 01:53:02.972492 4744 scope.go:117] "RemoveContainer" containerID="f3146fd60b5c6dfdccb35fe4857e27f6b04482ae5ee6d727fad7150089d9ce92" Mar 11 01:53:03 crc kubenswrapper[4744]: I0311 01:53:03.054011 4744 scope.go:117] "RemoveContainer" containerID="90f7f61821f50ecedf825de74fec8c32898cc95dbf2e0fc0ea5503cf760b8840" Mar 11 01:53:03 crc kubenswrapper[4744]: E0311 01:53:03.054608 4744 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"90f7f61821f50ecedf825de74fec8c32898cc95dbf2e0fc0ea5503cf760b8840\": container with ID starting with 90f7f61821f50ecedf825de74fec8c32898cc95dbf2e0fc0ea5503cf760b8840 not found: ID does not exist" containerID="90f7f61821f50ecedf825de74fec8c32898cc95dbf2e0fc0ea5503cf760b8840" Mar 11 01:53:03 crc kubenswrapper[4744]: I0311 01:53:03.054687 4744 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"90f7f61821f50ecedf825de74fec8c32898cc95dbf2e0fc0ea5503cf760b8840"} err="failed to get container status \"90f7f61821f50ecedf825de74fec8c32898cc95dbf2e0fc0ea5503cf760b8840\": rpc error: code = NotFound desc = could not find container \"90f7f61821f50ecedf825de74fec8c32898cc95dbf2e0fc0ea5503cf760b8840\": container with ID starting with 90f7f61821f50ecedf825de74fec8c32898cc95dbf2e0fc0ea5503cf760b8840 not found: ID does not exist" Mar 11 01:53:03 crc kubenswrapper[4744]: I0311 01:53:03.054723 4744 scope.go:117] "RemoveContainer" containerID="b8799502aa2300ab6c880af1b4082cc426bee4152f8478803470e72a23b4797f" Mar 11 01:53:03 crc kubenswrapper[4744]: E0311 01:53:03.055028 4744 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b8799502aa2300ab6c880af1b4082cc426bee4152f8478803470e72a23b4797f\": container with ID starting with b8799502aa2300ab6c880af1b4082cc426bee4152f8478803470e72a23b4797f not found: ID does not exist" containerID="b8799502aa2300ab6c880af1b4082cc426bee4152f8478803470e72a23b4797f" Mar 11 01:53:03 crc kubenswrapper[4744]: I0311 01:53:03.055071 4744 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b8799502aa2300ab6c880af1b4082cc426bee4152f8478803470e72a23b4797f"} err="failed to get container status \"b8799502aa2300ab6c880af1b4082cc426bee4152f8478803470e72a23b4797f\": rpc error: code = NotFound desc = could not find container \"b8799502aa2300ab6c880af1b4082cc426bee4152f8478803470e72a23b4797f\": container with ID starting with b8799502aa2300ab6c880af1b4082cc426bee4152f8478803470e72a23b4797f not found: ID does not exist" Mar 11 01:53:03 crc kubenswrapper[4744]: I0311 01:53:03.055119 4744 scope.go:117] "RemoveContainer" containerID="f3146fd60b5c6dfdccb35fe4857e27f6b04482ae5ee6d727fad7150089d9ce92" Mar 11 01:53:03 crc kubenswrapper[4744]: E0311 
01:53:03.055634 4744 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f3146fd60b5c6dfdccb35fe4857e27f6b04482ae5ee6d727fad7150089d9ce92\": container with ID starting with f3146fd60b5c6dfdccb35fe4857e27f6b04482ae5ee6d727fad7150089d9ce92 not found: ID does not exist" containerID="f3146fd60b5c6dfdccb35fe4857e27f6b04482ae5ee6d727fad7150089d9ce92" Mar 11 01:53:03 crc kubenswrapper[4744]: I0311 01:53:03.055685 4744 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f3146fd60b5c6dfdccb35fe4857e27f6b04482ae5ee6d727fad7150089d9ce92"} err="failed to get container status \"f3146fd60b5c6dfdccb35fe4857e27f6b04482ae5ee6d727fad7150089d9ce92\": rpc error: code = NotFound desc = could not find container \"f3146fd60b5c6dfdccb35fe4857e27f6b04482ae5ee6d727fad7150089d9ce92\": container with ID starting with f3146fd60b5c6dfdccb35fe4857e27f6b04482ae5ee6d727fad7150089d9ce92 not found: ID does not exist" Mar 11 01:53:03 crc kubenswrapper[4744]: I0311 01:53:03.989560 4744 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ff9de606-72dd-47e2-9d5c-7ede1db0d2fd" path="/var/lib/kubelet/pods/ff9de606-72dd-47e2-9d5c-7ede1db0d2fd/volumes" Mar 11 01:53:34 crc kubenswrapper[4744]: I0311 01:53:34.203435 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-tttjf"] Mar 11 01:53:34 crc kubenswrapper[4744]: E0311 01:53:34.204507 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1f31f97c-f53c-4fbe-8311-d52e64ea906b" containerName="extract-content" Mar 11 01:53:34 crc kubenswrapper[4744]: I0311 01:53:34.204556 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="1f31f97c-f53c-4fbe-8311-d52e64ea906b" containerName="extract-content" Mar 11 01:53:34 crc kubenswrapper[4744]: E0311 01:53:34.204588 4744 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="ff9de606-72dd-47e2-9d5c-7ede1db0d2fd" containerName="extract-content" Mar 11 01:53:34 crc kubenswrapper[4744]: I0311 01:53:34.204600 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="ff9de606-72dd-47e2-9d5c-7ede1db0d2fd" containerName="extract-content" Mar 11 01:53:34 crc kubenswrapper[4744]: E0311 01:53:34.204623 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1f31f97c-f53c-4fbe-8311-d52e64ea906b" containerName="extract-utilities" Mar 11 01:53:34 crc kubenswrapper[4744]: I0311 01:53:34.204635 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="1f31f97c-f53c-4fbe-8311-d52e64ea906b" containerName="extract-utilities" Mar 11 01:53:34 crc kubenswrapper[4744]: E0311 01:53:34.204662 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ff9de606-72dd-47e2-9d5c-7ede1db0d2fd" containerName="extract-utilities" Mar 11 01:53:34 crc kubenswrapper[4744]: I0311 01:53:34.204675 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="ff9de606-72dd-47e2-9d5c-7ede1db0d2fd" containerName="extract-utilities" Mar 11 01:53:34 crc kubenswrapper[4744]: E0311 01:53:34.204700 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1f31f97c-f53c-4fbe-8311-d52e64ea906b" containerName="registry-server" Mar 11 01:53:34 crc kubenswrapper[4744]: I0311 01:53:34.204715 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="1f31f97c-f53c-4fbe-8311-d52e64ea906b" containerName="registry-server" Mar 11 01:53:34 crc kubenswrapper[4744]: E0311 01:53:34.204735 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ff9de606-72dd-47e2-9d5c-7ede1db0d2fd" containerName="registry-server" Mar 11 01:53:34 crc kubenswrapper[4744]: I0311 01:53:34.204747 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="ff9de606-72dd-47e2-9d5c-7ede1db0d2fd" containerName="registry-server" Mar 11 01:53:34 crc kubenswrapper[4744]: I0311 01:53:34.204986 4744 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="ff9de606-72dd-47e2-9d5c-7ede1db0d2fd" containerName="registry-server" Mar 11 01:53:34 crc kubenswrapper[4744]: I0311 01:53:34.205017 4744 memory_manager.go:354] "RemoveStaleState removing state" podUID="1f31f97c-f53c-4fbe-8311-d52e64ea906b" containerName="registry-server" Mar 11 01:53:34 crc kubenswrapper[4744]: I0311 01:53:34.206794 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-tttjf" Mar 11 01:53:34 crc kubenswrapper[4744]: I0311 01:53:34.229696 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-tttjf"] Mar 11 01:53:34 crc kubenswrapper[4744]: I0311 01:53:34.333700 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7c8a161f-d556-457e-8ec6-799821b45ba9-utilities\") pod \"community-operators-tttjf\" (UID: \"7c8a161f-d556-457e-8ec6-799821b45ba9\") " pod="openshift-marketplace/community-operators-tttjf" Mar 11 01:53:34 crc kubenswrapper[4744]: I0311 01:53:34.334172 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7c8a161f-d556-457e-8ec6-799821b45ba9-catalog-content\") pod \"community-operators-tttjf\" (UID: \"7c8a161f-d556-457e-8ec6-799821b45ba9\") " pod="openshift-marketplace/community-operators-tttjf" Mar 11 01:53:34 crc kubenswrapper[4744]: I0311 01:53:34.334392 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6z92r\" (UniqueName: \"kubernetes.io/projected/7c8a161f-d556-457e-8ec6-799821b45ba9-kube-api-access-6z92r\") pod \"community-operators-tttjf\" (UID: \"7c8a161f-d556-457e-8ec6-799821b45ba9\") " pod="openshift-marketplace/community-operators-tttjf" Mar 11 01:53:34 crc kubenswrapper[4744]: I0311 01:53:34.435604 4744 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7c8a161f-d556-457e-8ec6-799821b45ba9-catalog-content\") pod \"community-operators-tttjf\" (UID: \"7c8a161f-d556-457e-8ec6-799821b45ba9\") " pod="openshift-marketplace/community-operators-tttjf" Mar 11 01:53:34 crc kubenswrapper[4744]: I0311 01:53:34.435711 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6z92r\" (UniqueName: \"kubernetes.io/projected/7c8a161f-d556-457e-8ec6-799821b45ba9-kube-api-access-6z92r\") pod \"community-operators-tttjf\" (UID: \"7c8a161f-d556-457e-8ec6-799821b45ba9\") " pod="openshift-marketplace/community-operators-tttjf" Mar 11 01:53:34 crc kubenswrapper[4744]: I0311 01:53:34.435858 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7c8a161f-d556-457e-8ec6-799821b45ba9-utilities\") pod \"community-operators-tttjf\" (UID: \"7c8a161f-d556-457e-8ec6-799821b45ba9\") " pod="openshift-marketplace/community-operators-tttjf" Mar 11 01:53:34 crc kubenswrapper[4744]: I0311 01:53:34.436452 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7c8a161f-d556-457e-8ec6-799821b45ba9-catalog-content\") pod \"community-operators-tttjf\" (UID: \"7c8a161f-d556-457e-8ec6-799821b45ba9\") " pod="openshift-marketplace/community-operators-tttjf" Mar 11 01:53:34 crc kubenswrapper[4744]: I0311 01:53:34.437196 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7c8a161f-d556-457e-8ec6-799821b45ba9-utilities\") pod \"community-operators-tttjf\" (UID: \"7c8a161f-d556-457e-8ec6-799821b45ba9\") " pod="openshift-marketplace/community-operators-tttjf" Mar 11 01:53:34 crc kubenswrapper[4744]: I0311 01:53:34.463476 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-6z92r\" (UniqueName: \"kubernetes.io/projected/7c8a161f-d556-457e-8ec6-799821b45ba9-kube-api-access-6z92r\") pod \"community-operators-tttjf\" (UID: \"7c8a161f-d556-457e-8ec6-799821b45ba9\") " pod="openshift-marketplace/community-operators-tttjf" Mar 11 01:53:34 crc kubenswrapper[4744]: I0311 01:53:34.537208 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-tttjf" Mar 11 01:53:34 crc kubenswrapper[4744]: I0311 01:53:34.767671 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-tttjf"] Mar 11 01:53:35 crc kubenswrapper[4744]: I0311 01:53:35.338766 4744 generic.go:334] "Generic (PLEG): container finished" podID="7c8a161f-d556-457e-8ec6-799821b45ba9" containerID="102af84b1be851e50987dd327d169e6e8026d19b5a0324987aca7624d654ba35" exitCode=0 Mar 11 01:53:35 crc kubenswrapper[4744]: I0311 01:53:35.338963 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-tttjf" event={"ID":"7c8a161f-d556-457e-8ec6-799821b45ba9","Type":"ContainerDied","Data":"102af84b1be851e50987dd327d169e6e8026d19b5a0324987aca7624d654ba35"} Mar 11 01:53:35 crc kubenswrapper[4744]: I0311 01:53:35.339113 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-tttjf" event={"ID":"7c8a161f-d556-457e-8ec6-799821b45ba9","Type":"ContainerStarted","Data":"e42e80e98db9a0635f82c14cb2e6c45a2c6d65a04a719284b16a229b5365c6dd"} Mar 11 01:53:36 crc kubenswrapper[4744]: I0311 01:53:36.359195 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-tttjf" event={"ID":"7c8a161f-d556-457e-8ec6-799821b45ba9","Type":"ContainerStarted","Data":"89eb9571d365c912c8dad3ac6437d3a0450300c5c2ebff81f2828f80ceb744d0"} Mar 11 01:53:37 crc kubenswrapper[4744]: I0311 01:53:37.370446 4744 generic.go:334] "Generic (PLEG): container finished" 
podID="7c8a161f-d556-457e-8ec6-799821b45ba9" containerID="89eb9571d365c912c8dad3ac6437d3a0450300c5c2ebff81f2828f80ceb744d0" exitCode=0 Mar 11 01:53:37 crc kubenswrapper[4744]: I0311 01:53:37.370530 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-tttjf" event={"ID":"7c8a161f-d556-457e-8ec6-799821b45ba9","Type":"ContainerDied","Data":"89eb9571d365c912c8dad3ac6437d3a0450300c5c2ebff81f2828f80ceb744d0"} Mar 11 01:53:38 crc kubenswrapper[4744]: I0311 01:53:38.381770 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-tttjf" event={"ID":"7c8a161f-d556-457e-8ec6-799821b45ba9","Type":"ContainerStarted","Data":"bd3fb9f50bbacee5e18a3bd0bfbd93ade3ae05ef781bd8ec033e9287b9d74634"} Mar 11 01:53:38 crc kubenswrapper[4744]: I0311 01:53:38.407699 4744 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-tttjf" podStartSLOduration=1.9470364820000001 podStartE2EDuration="4.407678741s" podCreationTimestamp="2026-03-11 01:53:34 +0000 UTC" firstStartedPulling="2026-03-11 01:53:35.34044701 +0000 UTC m=+3572.144664655" lastFinishedPulling="2026-03-11 01:53:37.801089299 +0000 UTC m=+3574.605306914" observedRunningTime="2026-03-11 01:53:38.40704443 +0000 UTC m=+3575.211262055" watchObservedRunningTime="2026-03-11 01:53:38.407678741 +0000 UTC m=+3575.211896376" Mar 11 01:53:42 crc kubenswrapper[4744]: I0311 01:53:42.413396 4744 patch_prober.go:28] interesting pod/machine-config-daemon-678nx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 11 01:53:42 crc kubenswrapper[4744]: I0311 01:53:42.413883 4744 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-678nx" 
podUID="a15dc7ac-7c34-4135-b6eb-a85122800ce9" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 11 01:53:44 crc kubenswrapper[4744]: I0311 01:53:44.537570 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-tttjf" Mar 11 01:53:44 crc kubenswrapper[4744]: I0311 01:53:44.537943 4744 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-tttjf" Mar 11 01:53:44 crc kubenswrapper[4744]: I0311 01:53:44.602812 4744 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-tttjf" Mar 11 01:53:45 crc kubenswrapper[4744]: I0311 01:53:45.486966 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-tttjf" Mar 11 01:53:45 crc kubenswrapper[4744]: I0311 01:53:45.544897 4744 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-tttjf"] Mar 11 01:53:47 crc kubenswrapper[4744]: I0311 01:53:47.458927 4744 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-tttjf" podUID="7c8a161f-d556-457e-8ec6-799821b45ba9" containerName="registry-server" containerID="cri-o://bd3fb9f50bbacee5e18a3bd0bfbd93ade3ae05ef781bd8ec033e9287b9d74634" gracePeriod=2 Mar 11 01:53:47 crc kubenswrapper[4744]: I0311 01:53:47.970188 4744 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-tttjf" Mar 11 01:53:48 crc kubenswrapper[4744]: I0311 01:53:48.049857 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7c8a161f-d556-457e-8ec6-799821b45ba9-utilities\") pod \"7c8a161f-d556-457e-8ec6-799821b45ba9\" (UID: \"7c8a161f-d556-457e-8ec6-799821b45ba9\") " Mar 11 01:53:48 crc kubenswrapper[4744]: I0311 01:53:48.049993 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6z92r\" (UniqueName: \"kubernetes.io/projected/7c8a161f-d556-457e-8ec6-799821b45ba9-kube-api-access-6z92r\") pod \"7c8a161f-d556-457e-8ec6-799821b45ba9\" (UID: \"7c8a161f-d556-457e-8ec6-799821b45ba9\") " Mar 11 01:53:48 crc kubenswrapper[4744]: I0311 01:53:48.050269 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7c8a161f-d556-457e-8ec6-799821b45ba9-catalog-content\") pod \"7c8a161f-d556-457e-8ec6-799821b45ba9\" (UID: \"7c8a161f-d556-457e-8ec6-799821b45ba9\") " Mar 11 01:53:48 crc kubenswrapper[4744]: I0311 01:53:48.052069 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7c8a161f-d556-457e-8ec6-799821b45ba9-utilities" (OuterVolumeSpecName: "utilities") pod "7c8a161f-d556-457e-8ec6-799821b45ba9" (UID: "7c8a161f-d556-457e-8ec6-799821b45ba9"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 11 01:53:48 crc kubenswrapper[4744]: I0311 01:53:48.057611 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7c8a161f-d556-457e-8ec6-799821b45ba9-kube-api-access-6z92r" (OuterVolumeSpecName: "kube-api-access-6z92r") pod "7c8a161f-d556-457e-8ec6-799821b45ba9" (UID: "7c8a161f-d556-457e-8ec6-799821b45ba9"). InnerVolumeSpecName "kube-api-access-6z92r". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 01:53:48 crc kubenswrapper[4744]: I0311 01:53:48.066584 4744 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7c8a161f-d556-457e-8ec6-799821b45ba9-utilities\") on node \"crc\" DevicePath \"\"" Mar 11 01:53:48 crc kubenswrapper[4744]: I0311 01:53:48.066628 4744 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6z92r\" (UniqueName: \"kubernetes.io/projected/7c8a161f-d556-457e-8ec6-799821b45ba9-kube-api-access-6z92r\") on node \"crc\" DevicePath \"\"" Mar 11 01:53:48 crc kubenswrapper[4744]: I0311 01:53:48.142394 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7c8a161f-d556-457e-8ec6-799821b45ba9-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "7c8a161f-d556-457e-8ec6-799821b45ba9" (UID: "7c8a161f-d556-457e-8ec6-799821b45ba9"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 11 01:53:48 crc kubenswrapper[4744]: I0311 01:53:48.168765 4744 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7c8a161f-d556-457e-8ec6-799821b45ba9-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 11 01:53:48 crc kubenswrapper[4744]: I0311 01:53:48.471842 4744 generic.go:334] "Generic (PLEG): container finished" podID="7c8a161f-d556-457e-8ec6-799821b45ba9" containerID="bd3fb9f50bbacee5e18a3bd0bfbd93ade3ae05ef781bd8ec033e9287b9d74634" exitCode=0 Mar 11 01:53:48 crc kubenswrapper[4744]: I0311 01:53:48.471910 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-tttjf" event={"ID":"7c8a161f-d556-457e-8ec6-799821b45ba9","Type":"ContainerDied","Data":"bd3fb9f50bbacee5e18a3bd0bfbd93ade3ae05ef781bd8ec033e9287b9d74634"} Mar 11 01:53:48 crc kubenswrapper[4744]: I0311 01:53:48.471953 4744 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openshift-marketplace/community-operators-tttjf" event={"ID":"7c8a161f-d556-457e-8ec6-799821b45ba9","Type":"ContainerDied","Data":"e42e80e98db9a0635f82c14cb2e6c45a2c6d65a04a719284b16a229b5365c6dd"} Mar 11 01:53:48 crc kubenswrapper[4744]: I0311 01:53:48.471989 4744 scope.go:117] "RemoveContainer" containerID="bd3fb9f50bbacee5e18a3bd0bfbd93ade3ae05ef781bd8ec033e9287b9d74634" Mar 11 01:53:48 crc kubenswrapper[4744]: I0311 01:53:48.471917 4744 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-tttjf" Mar 11 01:53:48 crc kubenswrapper[4744]: I0311 01:53:48.503449 4744 scope.go:117] "RemoveContainer" containerID="89eb9571d365c912c8dad3ac6437d3a0450300c5c2ebff81f2828f80ceb744d0" Mar 11 01:53:48 crc kubenswrapper[4744]: I0311 01:53:48.530068 4744 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-tttjf"] Mar 11 01:53:48 crc kubenswrapper[4744]: I0311 01:53:48.542107 4744 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-tttjf"] Mar 11 01:53:48 crc kubenswrapper[4744]: I0311 01:53:48.549662 4744 scope.go:117] "RemoveContainer" containerID="102af84b1be851e50987dd327d169e6e8026d19b5a0324987aca7624d654ba35" Mar 11 01:53:48 crc kubenswrapper[4744]: I0311 01:53:48.582645 4744 scope.go:117] "RemoveContainer" containerID="bd3fb9f50bbacee5e18a3bd0bfbd93ade3ae05ef781bd8ec033e9287b9d74634" Mar 11 01:53:48 crc kubenswrapper[4744]: E0311 01:53:48.584118 4744 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bd3fb9f50bbacee5e18a3bd0bfbd93ade3ae05ef781bd8ec033e9287b9d74634\": container with ID starting with bd3fb9f50bbacee5e18a3bd0bfbd93ade3ae05ef781bd8ec033e9287b9d74634 not found: ID does not exist" containerID="bd3fb9f50bbacee5e18a3bd0bfbd93ade3ae05ef781bd8ec033e9287b9d74634" Mar 11 01:53:48 crc kubenswrapper[4744]: I0311 
01:53:48.584166 4744 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bd3fb9f50bbacee5e18a3bd0bfbd93ade3ae05ef781bd8ec033e9287b9d74634"} err="failed to get container status \"bd3fb9f50bbacee5e18a3bd0bfbd93ade3ae05ef781bd8ec033e9287b9d74634\": rpc error: code = NotFound desc = could not find container \"bd3fb9f50bbacee5e18a3bd0bfbd93ade3ae05ef781bd8ec033e9287b9d74634\": container with ID starting with bd3fb9f50bbacee5e18a3bd0bfbd93ade3ae05ef781bd8ec033e9287b9d74634 not found: ID does not exist" Mar 11 01:53:48 crc kubenswrapper[4744]: I0311 01:53:48.584193 4744 scope.go:117] "RemoveContainer" containerID="89eb9571d365c912c8dad3ac6437d3a0450300c5c2ebff81f2828f80ceb744d0" Mar 11 01:53:48 crc kubenswrapper[4744]: E0311 01:53:48.584946 4744 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"89eb9571d365c912c8dad3ac6437d3a0450300c5c2ebff81f2828f80ceb744d0\": container with ID starting with 89eb9571d365c912c8dad3ac6437d3a0450300c5c2ebff81f2828f80ceb744d0 not found: ID does not exist" containerID="89eb9571d365c912c8dad3ac6437d3a0450300c5c2ebff81f2828f80ceb744d0" Mar 11 01:53:48 crc kubenswrapper[4744]: I0311 01:53:48.585007 4744 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"89eb9571d365c912c8dad3ac6437d3a0450300c5c2ebff81f2828f80ceb744d0"} err="failed to get container status \"89eb9571d365c912c8dad3ac6437d3a0450300c5c2ebff81f2828f80ceb744d0\": rpc error: code = NotFound desc = could not find container \"89eb9571d365c912c8dad3ac6437d3a0450300c5c2ebff81f2828f80ceb744d0\": container with ID starting with 89eb9571d365c912c8dad3ac6437d3a0450300c5c2ebff81f2828f80ceb744d0 not found: ID does not exist" Mar 11 01:53:48 crc kubenswrapper[4744]: I0311 01:53:48.585079 4744 scope.go:117] "RemoveContainer" containerID="102af84b1be851e50987dd327d169e6e8026d19b5a0324987aca7624d654ba35" Mar 11 01:53:48 crc 
kubenswrapper[4744]: E0311 01:53:48.585721 4744 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"102af84b1be851e50987dd327d169e6e8026d19b5a0324987aca7624d654ba35\": container with ID starting with 102af84b1be851e50987dd327d169e6e8026d19b5a0324987aca7624d654ba35 not found: ID does not exist" containerID="102af84b1be851e50987dd327d169e6e8026d19b5a0324987aca7624d654ba35" Mar 11 01:53:48 crc kubenswrapper[4744]: I0311 01:53:48.585752 4744 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"102af84b1be851e50987dd327d169e6e8026d19b5a0324987aca7624d654ba35"} err="failed to get container status \"102af84b1be851e50987dd327d169e6e8026d19b5a0324987aca7624d654ba35\": rpc error: code = NotFound desc = could not find container \"102af84b1be851e50987dd327d169e6e8026d19b5a0324987aca7624d654ba35\": container with ID starting with 102af84b1be851e50987dd327d169e6e8026d19b5a0324987aca7624d654ba35 not found: ID does not exist" Mar 11 01:53:49 crc kubenswrapper[4744]: I0311 01:53:49.994388 4744 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7c8a161f-d556-457e-8ec6-799821b45ba9" path="/var/lib/kubelet/pods/7c8a161f-d556-457e-8ec6-799821b45ba9/volumes" Mar 11 01:54:00 crc kubenswrapper[4744]: I0311 01:54:00.166351 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29553234-929lb"] Mar 11 01:54:00 crc kubenswrapper[4744]: E0311 01:54:00.167380 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7c8a161f-d556-457e-8ec6-799821b45ba9" containerName="registry-server" Mar 11 01:54:00 crc kubenswrapper[4744]: I0311 01:54:00.167401 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="7c8a161f-d556-457e-8ec6-799821b45ba9" containerName="registry-server" Mar 11 01:54:00 crc kubenswrapper[4744]: E0311 01:54:00.167424 4744 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="7c8a161f-d556-457e-8ec6-799821b45ba9" containerName="extract-utilities" Mar 11 01:54:00 crc kubenswrapper[4744]: I0311 01:54:00.167436 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="7c8a161f-d556-457e-8ec6-799821b45ba9" containerName="extract-utilities" Mar 11 01:54:00 crc kubenswrapper[4744]: E0311 01:54:00.167486 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7c8a161f-d556-457e-8ec6-799821b45ba9" containerName="extract-content" Mar 11 01:54:00 crc kubenswrapper[4744]: I0311 01:54:00.167500 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="7c8a161f-d556-457e-8ec6-799821b45ba9" containerName="extract-content" Mar 11 01:54:00 crc kubenswrapper[4744]: I0311 01:54:00.167770 4744 memory_manager.go:354] "RemoveStaleState removing state" podUID="7c8a161f-d556-457e-8ec6-799821b45ba9" containerName="registry-server" Mar 11 01:54:00 crc kubenswrapper[4744]: I0311 01:54:00.169896 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29553234-929lb" Mar 11 01:54:00 crc kubenswrapper[4744]: I0311 01:54:00.173059 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-rs77s" Mar 11 01:54:00 crc kubenswrapper[4744]: I0311 01:54:00.174404 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 11 01:54:00 crc kubenswrapper[4744]: I0311 01:54:00.176060 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 11 01:54:00 crc kubenswrapper[4744]: I0311 01:54:00.181196 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29553234-929lb"] Mar 11 01:54:00 crc kubenswrapper[4744]: I0311 01:54:00.277701 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fxph8\" (UniqueName: 
\"kubernetes.io/projected/9ad08b6d-bbfa-4608-94bd-5fc70b2e6eaf-kube-api-access-fxph8\") pod \"auto-csr-approver-29553234-929lb\" (UID: \"9ad08b6d-bbfa-4608-94bd-5fc70b2e6eaf\") " pod="openshift-infra/auto-csr-approver-29553234-929lb" Mar 11 01:54:00 crc kubenswrapper[4744]: I0311 01:54:00.378784 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fxph8\" (UniqueName: \"kubernetes.io/projected/9ad08b6d-bbfa-4608-94bd-5fc70b2e6eaf-kube-api-access-fxph8\") pod \"auto-csr-approver-29553234-929lb\" (UID: \"9ad08b6d-bbfa-4608-94bd-5fc70b2e6eaf\") " pod="openshift-infra/auto-csr-approver-29553234-929lb" Mar 11 01:54:00 crc kubenswrapper[4744]: I0311 01:54:00.413838 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fxph8\" (UniqueName: \"kubernetes.io/projected/9ad08b6d-bbfa-4608-94bd-5fc70b2e6eaf-kube-api-access-fxph8\") pod \"auto-csr-approver-29553234-929lb\" (UID: \"9ad08b6d-bbfa-4608-94bd-5fc70b2e6eaf\") " pod="openshift-infra/auto-csr-approver-29553234-929lb" Mar 11 01:54:00 crc kubenswrapper[4744]: I0311 01:54:00.497186 4744 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29553234-929lb" Mar 11 01:54:01 crc kubenswrapper[4744]: I0311 01:54:01.020866 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29553234-929lb"] Mar 11 01:54:01 crc kubenswrapper[4744]: I0311 01:54:01.601448 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29553234-929lb" event={"ID":"9ad08b6d-bbfa-4608-94bd-5fc70b2e6eaf","Type":"ContainerStarted","Data":"5c31dc9f640a742fccee96b70b373c83217d21b0ccf17303cb21f262f9ab1298"} Mar 11 01:54:02 crc kubenswrapper[4744]: I0311 01:54:02.610161 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29553234-929lb" event={"ID":"9ad08b6d-bbfa-4608-94bd-5fc70b2e6eaf","Type":"ContainerStarted","Data":"113cf4ba5f83f3948b6cb7050999ae5902ca3d99efb39c19ee4499dfcea59e6b"} Mar 11 01:54:02 crc kubenswrapper[4744]: I0311 01:54:02.628300 4744 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29553234-929lb" podStartSLOduration=1.436917719 podStartE2EDuration="2.628286459s" podCreationTimestamp="2026-03-11 01:54:00 +0000 UTC" firstStartedPulling="2026-03-11 01:54:01.027332287 +0000 UTC m=+3597.831549902" lastFinishedPulling="2026-03-11 01:54:02.218701037 +0000 UTC m=+3599.022918642" observedRunningTime="2026-03-11 01:54:02.626377379 +0000 UTC m=+3599.430595034" watchObservedRunningTime="2026-03-11 01:54:02.628286459 +0000 UTC m=+3599.432504064" Mar 11 01:54:03 crc kubenswrapper[4744]: I0311 01:54:03.621480 4744 generic.go:334] "Generic (PLEG): container finished" podID="9ad08b6d-bbfa-4608-94bd-5fc70b2e6eaf" containerID="113cf4ba5f83f3948b6cb7050999ae5902ca3d99efb39c19ee4499dfcea59e6b" exitCode=0 Mar 11 01:54:03 crc kubenswrapper[4744]: I0311 01:54:03.621586 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29553234-929lb" 
event={"ID":"9ad08b6d-bbfa-4608-94bd-5fc70b2e6eaf","Type":"ContainerDied","Data":"113cf4ba5f83f3948b6cb7050999ae5902ca3d99efb39c19ee4499dfcea59e6b"} Mar 11 01:54:05 crc kubenswrapper[4744]: I0311 01:54:05.017364 4744 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29553234-929lb" Mar 11 01:54:05 crc kubenswrapper[4744]: I0311 01:54:05.071158 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fxph8\" (UniqueName: \"kubernetes.io/projected/9ad08b6d-bbfa-4608-94bd-5fc70b2e6eaf-kube-api-access-fxph8\") pod \"9ad08b6d-bbfa-4608-94bd-5fc70b2e6eaf\" (UID: \"9ad08b6d-bbfa-4608-94bd-5fc70b2e6eaf\") " Mar 11 01:54:05 crc kubenswrapper[4744]: I0311 01:54:05.080636 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9ad08b6d-bbfa-4608-94bd-5fc70b2e6eaf-kube-api-access-fxph8" (OuterVolumeSpecName: "kube-api-access-fxph8") pod "9ad08b6d-bbfa-4608-94bd-5fc70b2e6eaf" (UID: "9ad08b6d-bbfa-4608-94bd-5fc70b2e6eaf"). InnerVolumeSpecName "kube-api-access-fxph8". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 01:54:05 crc kubenswrapper[4744]: I0311 01:54:05.172503 4744 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fxph8\" (UniqueName: \"kubernetes.io/projected/9ad08b6d-bbfa-4608-94bd-5fc70b2e6eaf-kube-api-access-fxph8\") on node \"crc\" DevicePath \"\"" Mar 11 01:54:05 crc kubenswrapper[4744]: I0311 01:54:05.645057 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29553234-929lb" event={"ID":"9ad08b6d-bbfa-4608-94bd-5fc70b2e6eaf","Type":"ContainerDied","Data":"5c31dc9f640a742fccee96b70b373c83217d21b0ccf17303cb21f262f9ab1298"} Mar 11 01:54:05 crc kubenswrapper[4744]: I0311 01:54:05.645117 4744 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5c31dc9f640a742fccee96b70b373c83217d21b0ccf17303cb21f262f9ab1298" Mar 11 01:54:05 crc kubenswrapper[4744]: I0311 01:54:05.645193 4744 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29553234-929lb" Mar 11 01:54:05 crc kubenswrapper[4744]: I0311 01:54:05.786954 4744 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29553228-h8cj7"] Mar 11 01:54:05 crc kubenswrapper[4744]: I0311 01:54:05.799423 4744 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29553228-h8cj7"] Mar 11 01:54:05 crc kubenswrapper[4744]: I0311 01:54:05.991709 4744 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0ea213b0-d4b8-4b01-9e10-11a8a4436b93" path="/var/lib/kubelet/pods/0ea213b0-d4b8-4b01-9e10-11a8a4436b93/volumes" Mar 11 01:54:12 crc kubenswrapper[4744]: I0311 01:54:12.409489 4744 patch_prober.go:28] interesting pod/machine-config-daemon-678nx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: 
connection refused" start-of-body= Mar 11 01:54:12 crc kubenswrapper[4744]: I0311 01:54:12.410249 4744 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-678nx" podUID="a15dc7ac-7c34-4135-b6eb-a85122800ce9" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 11 01:54:13 crc kubenswrapper[4744]: I0311 01:54:13.277220 4744 scope.go:117] "RemoveContainer" containerID="17f70f4a7a6c62bd3b76870eaa59e9325ad578e1685a6b378a86cf54a3ecce90" Mar 11 01:54:42 crc kubenswrapper[4744]: I0311 01:54:42.409855 4744 patch_prober.go:28] interesting pod/machine-config-daemon-678nx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 11 01:54:42 crc kubenswrapper[4744]: I0311 01:54:42.410626 4744 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-678nx" podUID="a15dc7ac-7c34-4135-b6eb-a85122800ce9" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 11 01:54:42 crc kubenswrapper[4744]: I0311 01:54:42.410706 4744 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-678nx" Mar 11 01:54:42 crc kubenswrapper[4744]: I0311 01:54:42.411586 4744 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"be3f9ef48ec9e492a2adef204936907fa56b289e46f531f87079a08adff341c6"} pod="openshift-machine-config-operator/machine-config-daemon-678nx" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 
11 01:54:42 crc kubenswrapper[4744]: I0311 01:54:42.411774 4744 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-678nx" podUID="a15dc7ac-7c34-4135-b6eb-a85122800ce9" containerName="machine-config-daemon" containerID="cri-o://be3f9ef48ec9e492a2adef204936907fa56b289e46f531f87079a08adff341c6" gracePeriod=600 Mar 11 01:54:42 crc kubenswrapper[4744]: E0311 01:54:42.539312 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-678nx_openshift-machine-config-operator(a15dc7ac-7c34-4135-b6eb-a85122800ce9)\"" pod="openshift-machine-config-operator/machine-config-daemon-678nx" podUID="a15dc7ac-7c34-4135-b6eb-a85122800ce9" Mar 11 01:54:43 crc kubenswrapper[4744]: I0311 01:54:43.007651 4744 generic.go:334] "Generic (PLEG): container finished" podID="a15dc7ac-7c34-4135-b6eb-a85122800ce9" containerID="be3f9ef48ec9e492a2adef204936907fa56b289e46f531f87079a08adff341c6" exitCode=0 Mar 11 01:54:43 crc kubenswrapper[4744]: I0311 01:54:43.007759 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-678nx" event={"ID":"a15dc7ac-7c34-4135-b6eb-a85122800ce9","Type":"ContainerDied","Data":"be3f9ef48ec9e492a2adef204936907fa56b289e46f531f87079a08adff341c6"} Mar 11 01:54:43 crc kubenswrapper[4744]: I0311 01:54:43.008183 4744 scope.go:117] "RemoveContainer" containerID="2fb0586e3dccedbf88f19f5a2d18df37c5d92d22979e1714c0bb39c539d0e12b" Mar 11 01:54:43 crc kubenswrapper[4744]: I0311 01:54:43.010580 4744 scope.go:117] "RemoveContainer" containerID="be3f9ef48ec9e492a2adef204936907fa56b289e46f531f87079a08adff341c6" Mar 11 01:54:43 crc kubenswrapper[4744]: E0311 01:54:43.011279 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for 
\"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-678nx_openshift-machine-config-operator(a15dc7ac-7c34-4135-b6eb-a85122800ce9)\"" pod="openshift-machine-config-operator/machine-config-daemon-678nx" podUID="a15dc7ac-7c34-4135-b6eb-a85122800ce9" Mar 11 01:54:56 crc kubenswrapper[4744]: I0311 01:54:56.975432 4744 scope.go:117] "RemoveContainer" containerID="be3f9ef48ec9e492a2adef204936907fa56b289e46f531f87079a08adff341c6" Mar 11 01:54:56 crc kubenswrapper[4744]: E0311 01:54:56.976353 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-678nx_openshift-machine-config-operator(a15dc7ac-7c34-4135-b6eb-a85122800ce9)\"" pod="openshift-machine-config-operator/machine-config-daemon-678nx" podUID="a15dc7ac-7c34-4135-b6eb-a85122800ce9" Mar 11 01:55:10 crc kubenswrapper[4744]: I0311 01:55:10.975367 4744 scope.go:117] "RemoveContainer" containerID="be3f9ef48ec9e492a2adef204936907fa56b289e46f531f87079a08adff341c6" Mar 11 01:55:10 crc kubenswrapper[4744]: E0311 01:55:10.976751 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-678nx_openshift-machine-config-operator(a15dc7ac-7c34-4135-b6eb-a85122800ce9)\"" pod="openshift-machine-config-operator/machine-config-daemon-678nx" podUID="a15dc7ac-7c34-4135-b6eb-a85122800ce9" Mar 11 01:55:21 crc kubenswrapper[4744]: I0311 01:55:21.975295 4744 scope.go:117] "RemoveContainer" containerID="be3f9ef48ec9e492a2adef204936907fa56b289e46f531f87079a08adff341c6" Mar 11 01:55:21 crc kubenswrapper[4744]: E0311 01:55:21.980307 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to 
\"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-678nx_openshift-machine-config-operator(a15dc7ac-7c34-4135-b6eb-a85122800ce9)\"" pod="openshift-machine-config-operator/machine-config-daemon-678nx" podUID="a15dc7ac-7c34-4135-b6eb-a85122800ce9" Mar 11 01:55:33 crc kubenswrapper[4744]: I0311 01:55:33.981778 4744 scope.go:117] "RemoveContainer" containerID="be3f9ef48ec9e492a2adef204936907fa56b289e46f531f87079a08adff341c6" Mar 11 01:55:33 crc kubenswrapper[4744]: E0311 01:55:33.982893 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-678nx_openshift-machine-config-operator(a15dc7ac-7c34-4135-b6eb-a85122800ce9)\"" pod="openshift-machine-config-operator/machine-config-daemon-678nx" podUID="a15dc7ac-7c34-4135-b6eb-a85122800ce9" Mar 11 01:55:46 crc kubenswrapper[4744]: I0311 01:55:46.975343 4744 scope.go:117] "RemoveContainer" containerID="be3f9ef48ec9e492a2adef204936907fa56b289e46f531f87079a08adff341c6" Mar 11 01:55:46 crc kubenswrapper[4744]: E0311 01:55:46.976645 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-678nx_openshift-machine-config-operator(a15dc7ac-7c34-4135-b6eb-a85122800ce9)\"" pod="openshift-machine-config-operator/machine-config-daemon-678nx" podUID="a15dc7ac-7c34-4135-b6eb-a85122800ce9" Mar 11 01:55:57 crc kubenswrapper[4744]: I0311 01:55:57.975090 4744 scope.go:117] "RemoveContainer" containerID="be3f9ef48ec9e492a2adef204936907fa56b289e46f531f87079a08adff341c6" Mar 11 01:55:57 crc kubenswrapper[4744]: E0311 01:55:57.977279 4744 pod_workers.go:1301] "Error syncing pod, 
skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-678nx_openshift-machine-config-operator(a15dc7ac-7c34-4135-b6eb-a85122800ce9)\"" pod="openshift-machine-config-operator/machine-config-daemon-678nx" podUID="a15dc7ac-7c34-4135-b6eb-a85122800ce9" Mar 11 01:56:00 crc kubenswrapper[4744]: I0311 01:56:00.162769 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29553236-74j6k"] Mar 11 01:56:00 crc kubenswrapper[4744]: E0311 01:56:00.163854 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9ad08b6d-bbfa-4608-94bd-5fc70b2e6eaf" containerName="oc" Mar 11 01:56:00 crc kubenswrapper[4744]: I0311 01:56:00.163889 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="9ad08b6d-bbfa-4608-94bd-5fc70b2e6eaf" containerName="oc" Mar 11 01:56:00 crc kubenswrapper[4744]: I0311 01:56:00.164216 4744 memory_manager.go:354] "RemoveStaleState removing state" podUID="9ad08b6d-bbfa-4608-94bd-5fc70b2e6eaf" containerName="oc" Mar 11 01:56:00 crc kubenswrapper[4744]: I0311 01:56:00.165163 4744 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29553236-74j6k" Mar 11 01:56:00 crc kubenswrapper[4744]: I0311 01:56:00.168022 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-rs77s" Mar 11 01:56:00 crc kubenswrapper[4744]: I0311 01:56:00.172643 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 11 01:56:00 crc kubenswrapper[4744]: I0311 01:56:00.172652 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 11 01:56:00 crc kubenswrapper[4744]: I0311 01:56:00.186629 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29553236-74j6k"] Mar 11 01:56:00 crc kubenswrapper[4744]: I0311 01:56:00.361901 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8pmd4\" (UniqueName: \"kubernetes.io/projected/dae30bbc-1daa-41bd-949c-b97e8db8a318-kube-api-access-8pmd4\") pod \"auto-csr-approver-29553236-74j6k\" (UID: \"dae30bbc-1daa-41bd-949c-b97e8db8a318\") " pod="openshift-infra/auto-csr-approver-29553236-74j6k" Mar 11 01:56:00 crc kubenswrapper[4744]: I0311 01:56:00.464086 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8pmd4\" (UniqueName: \"kubernetes.io/projected/dae30bbc-1daa-41bd-949c-b97e8db8a318-kube-api-access-8pmd4\") pod \"auto-csr-approver-29553236-74j6k\" (UID: \"dae30bbc-1daa-41bd-949c-b97e8db8a318\") " pod="openshift-infra/auto-csr-approver-29553236-74j6k" Mar 11 01:56:00 crc kubenswrapper[4744]: I0311 01:56:00.497735 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8pmd4\" (UniqueName: \"kubernetes.io/projected/dae30bbc-1daa-41bd-949c-b97e8db8a318-kube-api-access-8pmd4\") pod \"auto-csr-approver-29553236-74j6k\" (UID: \"dae30bbc-1daa-41bd-949c-b97e8db8a318\") " 
pod="openshift-infra/auto-csr-approver-29553236-74j6k" Mar 11 01:56:00 crc kubenswrapper[4744]: I0311 01:56:00.797823 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29553236-74j6k" Mar 11 01:56:01 crc kubenswrapper[4744]: I0311 01:56:01.297417 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29553236-74j6k"] Mar 11 01:56:01 crc kubenswrapper[4744]: I0311 01:56:01.302603 4744 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 11 01:56:01 crc kubenswrapper[4744]: I0311 01:56:01.777394 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29553236-74j6k" event={"ID":"dae30bbc-1daa-41bd-949c-b97e8db8a318","Type":"ContainerStarted","Data":"83b77259326aba2fe16d471726573bb88428645e1d16101480a93128e55cba2c"} Mar 11 01:56:02 crc kubenswrapper[4744]: I0311 01:56:02.788604 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29553236-74j6k" event={"ID":"dae30bbc-1daa-41bd-949c-b97e8db8a318","Type":"ContainerStarted","Data":"e59c443f7b4a50658d313f92f9775cbd0d9466a72efdb2344e4a466534f6788c"} Mar 11 01:56:02 crc kubenswrapper[4744]: I0311 01:56:02.819643 4744 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29553236-74j6k" podStartSLOduration=1.717109404 podStartE2EDuration="2.819607427s" podCreationTimestamp="2026-03-11 01:56:00 +0000 UTC" firstStartedPulling="2026-03-11 01:56:01.301944254 +0000 UTC m=+3718.106161899" lastFinishedPulling="2026-03-11 01:56:02.404442307 +0000 UTC m=+3719.208659922" observedRunningTime="2026-03-11 01:56:02.816100576 +0000 UTC m=+3719.620318191" watchObservedRunningTime="2026-03-11 01:56:02.819607427 +0000 UTC m=+3719.623825112" Mar 11 01:56:03 crc kubenswrapper[4744]: I0311 01:56:03.802318 4744 generic.go:334] "Generic (PLEG): container finished" 
podID="dae30bbc-1daa-41bd-949c-b97e8db8a318" containerID="e59c443f7b4a50658d313f92f9775cbd0d9466a72efdb2344e4a466534f6788c" exitCode=0 Mar 11 01:56:03 crc kubenswrapper[4744]: I0311 01:56:03.802397 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29553236-74j6k" event={"ID":"dae30bbc-1daa-41bd-949c-b97e8db8a318","Type":"ContainerDied","Data":"e59c443f7b4a50658d313f92f9775cbd0d9466a72efdb2344e4a466534f6788c"} Mar 11 01:56:05 crc kubenswrapper[4744]: I0311 01:56:05.188528 4744 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29553236-74j6k" Mar 11 01:56:05 crc kubenswrapper[4744]: I0311 01:56:05.338544 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8pmd4\" (UniqueName: \"kubernetes.io/projected/dae30bbc-1daa-41bd-949c-b97e8db8a318-kube-api-access-8pmd4\") pod \"dae30bbc-1daa-41bd-949c-b97e8db8a318\" (UID: \"dae30bbc-1daa-41bd-949c-b97e8db8a318\") " Mar 11 01:56:05 crc kubenswrapper[4744]: I0311 01:56:05.346328 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dae30bbc-1daa-41bd-949c-b97e8db8a318-kube-api-access-8pmd4" (OuterVolumeSpecName: "kube-api-access-8pmd4") pod "dae30bbc-1daa-41bd-949c-b97e8db8a318" (UID: "dae30bbc-1daa-41bd-949c-b97e8db8a318"). InnerVolumeSpecName "kube-api-access-8pmd4". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 01:56:05 crc kubenswrapper[4744]: I0311 01:56:05.440033 4744 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8pmd4\" (UniqueName: \"kubernetes.io/projected/dae30bbc-1daa-41bd-949c-b97e8db8a318-kube-api-access-8pmd4\") on node \"crc\" DevicePath \"\"" Mar 11 01:56:05 crc kubenswrapper[4744]: I0311 01:56:05.823373 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29553236-74j6k" event={"ID":"dae30bbc-1daa-41bd-949c-b97e8db8a318","Type":"ContainerDied","Data":"83b77259326aba2fe16d471726573bb88428645e1d16101480a93128e55cba2c"} Mar 11 01:56:05 crc kubenswrapper[4744]: I0311 01:56:05.823432 4744 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="83b77259326aba2fe16d471726573bb88428645e1d16101480a93128e55cba2c" Mar 11 01:56:05 crc kubenswrapper[4744]: I0311 01:56:05.823441 4744 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29553236-74j6k" Mar 11 01:56:06 crc kubenswrapper[4744]: I0311 01:56:06.656770 4744 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29553230-7bp8h"] Mar 11 01:56:06 crc kubenswrapper[4744]: I0311 01:56:06.661014 4744 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29553230-7bp8h"] Mar 11 01:56:07 crc kubenswrapper[4744]: I0311 01:56:07.990083 4744 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b4192f44-4f7d-4eeb-8993-7acda73ff091" path="/var/lib/kubelet/pods/b4192f44-4f7d-4eeb-8993-7acda73ff091/volumes" Mar 11 01:56:08 crc kubenswrapper[4744]: I0311 01:56:08.975070 4744 scope.go:117] "RemoveContainer" containerID="be3f9ef48ec9e492a2adef204936907fa56b289e46f531f87079a08adff341c6" Mar 11 01:56:08 crc kubenswrapper[4744]: E0311 01:56:08.975836 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to 
\"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-678nx_openshift-machine-config-operator(a15dc7ac-7c34-4135-b6eb-a85122800ce9)\"" pod="openshift-machine-config-operator/machine-config-daemon-678nx" podUID="a15dc7ac-7c34-4135-b6eb-a85122800ce9" Mar 11 01:56:13 crc kubenswrapper[4744]: I0311 01:56:13.395795 4744 scope.go:117] "RemoveContainer" containerID="5b93e13a3f00d4c69fac3858533b230fbde4cf10da2e3d15bd9120b55382e0a5" Mar 11 01:56:20 crc kubenswrapper[4744]: I0311 01:56:20.975262 4744 scope.go:117] "RemoveContainer" containerID="be3f9ef48ec9e492a2adef204936907fa56b289e46f531f87079a08adff341c6" Mar 11 01:56:20 crc kubenswrapper[4744]: E0311 01:56:20.976089 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-678nx_openshift-machine-config-operator(a15dc7ac-7c34-4135-b6eb-a85122800ce9)\"" pod="openshift-machine-config-operator/machine-config-daemon-678nx" podUID="a15dc7ac-7c34-4135-b6eb-a85122800ce9" Mar 11 01:56:32 crc kubenswrapper[4744]: I0311 01:56:32.975780 4744 scope.go:117] "RemoveContainer" containerID="be3f9ef48ec9e492a2adef204936907fa56b289e46f531f87079a08adff341c6" Mar 11 01:56:32 crc kubenswrapper[4744]: E0311 01:56:32.977454 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-678nx_openshift-machine-config-operator(a15dc7ac-7c34-4135-b6eb-a85122800ce9)\"" pod="openshift-machine-config-operator/machine-config-daemon-678nx" podUID="a15dc7ac-7c34-4135-b6eb-a85122800ce9" Mar 11 01:56:44 crc kubenswrapper[4744]: I0311 01:56:44.974847 4744 scope.go:117] "RemoveContainer" 
containerID="be3f9ef48ec9e492a2adef204936907fa56b289e46f531f87079a08adff341c6" Mar 11 01:56:44 crc kubenswrapper[4744]: E0311 01:56:44.975502 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-678nx_openshift-machine-config-operator(a15dc7ac-7c34-4135-b6eb-a85122800ce9)\"" pod="openshift-machine-config-operator/machine-config-daemon-678nx" podUID="a15dc7ac-7c34-4135-b6eb-a85122800ce9" Mar 11 01:56:59 crc kubenswrapper[4744]: I0311 01:56:59.975370 4744 scope.go:117] "RemoveContainer" containerID="be3f9ef48ec9e492a2adef204936907fa56b289e46f531f87079a08adff341c6" Mar 11 01:56:59 crc kubenswrapper[4744]: E0311 01:56:59.976092 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-678nx_openshift-machine-config-operator(a15dc7ac-7c34-4135-b6eb-a85122800ce9)\"" pod="openshift-machine-config-operator/machine-config-daemon-678nx" podUID="a15dc7ac-7c34-4135-b6eb-a85122800ce9" Mar 11 01:57:13 crc kubenswrapper[4744]: I0311 01:57:13.980699 4744 scope.go:117] "RemoveContainer" containerID="be3f9ef48ec9e492a2adef204936907fa56b289e46f531f87079a08adff341c6" Mar 11 01:57:13 crc kubenswrapper[4744]: E0311 01:57:13.981697 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-678nx_openshift-machine-config-operator(a15dc7ac-7c34-4135-b6eb-a85122800ce9)\"" pod="openshift-machine-config-operator/machine-config-daemon-678nx" podUID="a15dc7ac-7c34-4135-b6eb-a85122800ce9" Mar 11 01:57:27 crc kubenswrapper[4744]: I0311 01:57:27.975126 4744 scope.go:117] 
"RemoveContainer" containerID="be3f9ef48ec9e492a2adef204936907fa56b289e46f531f87079a08adff341c6" Mar 11 01:57:27 crc kubenswrapper[4744]: E0311 01:57:27.977910 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-678nx_openshift-machine-config-operator(a15dc7ac-7c34-4135-b6eb-a85122800ce9)\"" pod="openshift-machine-config-operator/machine-config-daemon-678nx" podUID="a15dc7ac-7c34-4135-b6eb-a85122800ce9" Mar 11 01:57:42 crc kubenswrapper[4744]: I0311 01:57:42.974737 4744 scope.go:117] "RemoveContainer" containerID="be3f9ef48ec9e492a2adef204936907fa56b289e46f531f87079a08adff341c6" Mar 11 01:57:42 crc kubenswrapper[4744]: E0311 01:57:42.975893 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-678nx_openshift-machine-config-operator(a15dc7ac-7c34-4135-b6eb-a85122800ce9)\"" pod="openshift-machine-config-operator/machine-config-daemon-678nx" podUID="a15dc7ac-7c34-4135-b6eb-a85122800ce9" Mar 11 01:57:56 crc kubenswrapper[4744]: I0311 01:57:56.975252 4744 scope.go:117] "RemoveContainer" containerID="be3f9ef48ec9e492a2adef204936907fa56b289e46f531f87079a08adff341c6" Mar 11 01:57:56 crc kubenswrapper[4744]: E0311 01:57:56.976172 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-678nx_openshift-machine-config-operator(a15dc7ac-7c34-4135-b6eb-a85122800ce9)\"" pod="openshift-machine-config-operator/machine-config-daemon-678nx" podUID="a15dc7ac-7c34-4135-b6eb-a85122800ce9" Mar 11 01:58:00 crc kubenswrapper[4744]: I0311 01:58:00.169549 
4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29553238-662x9"] Mar 11 01:58:00 crc kubenswrapper[4744]: E0311 01:58:00.170236 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dae30bbc-1daa-41bd-949c-b97e8db8a318" containerName="oc" Mar 11 01:58:00 crc kubenswrapper[4744]: I0311 01:58:00.170265 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="dae30bbc-1daa-41bd-949c-b97e8db8a318" containerName="oc" Mar 11 01:58:00 crc kubenswrapper[4744]: I0311 01:58:00.170684 4744 memory_manager.go:354] "RemoveStaleState removing state" podUID="dae30bbc-1daa-41bd-949c-b97e8db8a318" containerName="oc" Mar 11 01:58:00 crc kubenswrapper[4744]: I0311 01:58:00.171732 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29553238-662x9" Mar 11 01:58:00 crc kubenswrapper[4744]: I0311 01:58:00.175344 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 11 01:58:00 crc kubenswrapper[4744]: I0311 01:58:00.175476 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-rs77s" Mar 11 01:58:00 crc kubenswrapper[4744]: I0311 01:58:00.175875 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 11 01:58:00 crc kubenswrapper[4744]: I0311 01:58:00.193598 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29553238-662x9"] Mar 11 01:58:00 crc kubenswrapper[4744]: I0311 01:58:00.338049 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d56nt\" (UniqueName: \"kubernetes.io/projected/de884a1e-19e9-49bd-b048-95dd915f6c51-kube-api-access-d56nt\") pod \"auto-csr-approver-29553238-662x9\" (UID: \"de884a1e-19e9-49bd-b048-95dd915f6c51\") " 
pod="openshift-infra/auto-csr-approver-29553238-662x9" Mar 11 01:58:00 crc kubenswrapper[4744]: I0311 01:58:00.441425 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d56nt\" (UniqueName: \"kubernetes.io/projected/de884a1e-19e9-49bd-b048-95dd915f6c51-kube-api-access-d56nt\") pod \"auto-csr-approver-29553238-662x9\" (UID: \"de884a1e-19e9-49bd-b048-95dd915f6c51\") " pod="openshift-infra/auto-csr-approver-29553238-662x9" Mar 11 01:58:00 crc kubenswrapper[4744]: I0311 01:58:00.480795 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d56nt\" (UniqueName: \"kubernetes.io/projected/de884a1e-19e9-49bd-b048-95dd915f6c51-kube-api-access-d56nt\") pod \"auto-csr-approver-29553238-662x9\" (UID: \"de884a1e-19e9-49bd-b048-95dd915f6c51\") " pod="openshift-infra/auto-csr-approver-29553238-662x9" Mar 11 01:58:00 crc kubenswrapper[4744]: I0311 01:58:00.507204 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29553238-662x9" Mar 11 01:58:00 crc kubenswrapper[4744]: I0311 01:58:00.774745 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29553238-662x9"] Mar 11 01:58:00 crc kubenswrapper[4744]: W0311 01:58:00.784716 4744 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podde884a1e_19e9_49bd_b048_95dd915f6c51.slice/crio-c3ac3822c8249fb2b711ca8bb286d3942997a20407b6c4b976ef5c465eb11afa WatchSource:0}: Error finding container c3ac3822c8249fb2b711ca8bb286d3942997a20407b6c4b976ef5c465eb11afa: Status 404 returned error can't find the container with id c3ac3822c8249fb2b711ca8bb286d3942997a20407b6c4b976ef5c465eb11afa Mar 11 01:58:00 crc kubenswrapper[4744]: I0311 01:58:00.864616 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29553238-662x9" 
event={"ID":"de884a1e-19e9-49bd-b048-95dd915f6c51","Type":"ContainerStarted","Data":"c3ac3822c8249fb2b711ca8bb286d3942997a20407b6c4b976ef5c465eb11afa"} Mar 11 01:58:02 crc kubenswrapper[4744]: I0311 01:58:02.885166 4744 generic.go:334] "Generic (PLEG): container finished" podID="de884a1e-19e9-49bd-b048-95dd915f6c51" containerID="e827d36cee92756f0c25367a0b39828d5d2770053e936ed7a5cadc9bb8e43249" exitCode=0 Mar 11 01:58:02 crc kubenswrapper[4744]: I0311 01:58:02.885249 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29553238-662x9" event={"ID":"de884a1e-19e9-49bd-b048-95dd915f6c51","Type":"ContainerDied","Data":"e827d36cee92756f0c25367a0b39828d5d2770053e936ed7a5cadc9bb8e43249"} Mar 11 01:58:04 crc kubenswrapper[4744]: I0311 01:58:04.268838 4744 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29553238-662x9" Mar 11 01:58:04 crc kubenswrapper[4744]: I0311 01:58:04.400531 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d56nt\" (UniqueName: \"kubernetes.io/projected/de884a1e-19e9-49bd-b048-95dd915f6c51-kube-api-access-d56nt\") pod \"de884a1e-19e9-49bd-b048-95dd915f6c51\" (UID: \"de884a1e-19e9-49bd-b048-95dd915f6c51\") " Mar 11 01:58:04 crc kubenswrapper[4744]: I0311 01:58:04.408742 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/de884a1e-19e9-49bd-b048-95dd915f6c51-kube-api-access-d56nt" (OuterVolumeSpecName: "kube-api-access-d56nt") pod "de884a1e-19e9-49bd-b048-95dd915f6c51" (UID: "de884a1e-19e9-49bd-b048-95dd915f6c51"). InnerVolumeSpecName "kube-api-access-d56nt". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 01:58:04 crc kubenswrapper[4744]: I0311 01:58:04.502377 4744 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d56nt\" (UniqueName: \"kubernetes.io/projected/de884a1e-19e9-49bd-b048-95dd915f6c51-kube-api-access-d56nt\") on node \"crc\" DevicePath \"\"" Mar 11 01:58:04 crc kubenswrapper[4744]: I0311 01:58:04.902085 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29553238-662x9" event={"ID":"de884a1e-19e9-49bd-b048-95dd915f6c51","Type":"ContainerDied","Data":"c3ac3822c8249fb2b711ca8bb286d3942997a20407b6c4b976ef5c465eb11afa"} Mar 11 01:58:04 crc kubenswrapper[4744]: I0311 01:58:04.902134 4744 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c3ac3822c8249fb2b711ca8bb286d3942997a20407b6c4b976ef5c465eb11afa" Mar 11 01:58:04 crc kubenswrapper[4744]: I0311 01:58:04.902150 4744 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29553238-662x9" Mar 11 01:58:05 crc kubenswrapper[4744]: I0311 01:58:05.366685 4744 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29553232-jlqk4"] Mar 11 01:58:05 crc kubenswrapper[4744]: I0311 01:58:05.376033 4744 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29553232-jlqk4"] Mar 11 01:58:05 crc kubenswrapper[4744]: I0311 01:58:05.989499 4744 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8e20982c-bc31-4d0a-b891-70bc79483ecf" path="/var/lib/kubelet/pods/8e20982c-bc31-4d0a-b891-70bc79483ecf/volumes" Mar 11 01:58:08 crc kubenswrapper[4744]: I0311 01:58:08.974414 4744 scope.go:117] "RemoveContainer" containerID="be3f9ef48ec9e492a2adef204936907fa56b289e46f531f87079a08adff341c6" Mar 11 01:58:08 crc kubenswrapper[4744]: E0311 01:58:08.974883 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to 
\"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-678nx_openshift-machine-config-operator(a15dc7ac-7c34-4135-b6eb-a85122800ce9)\"" pod="openshift-machine-config-operator/machine-config-daemon-678nx" podUID="a15dc7ac-7c34-4135-b6eb-a85122800ce9" Mar 11 01:58:13 crc kubenswrapper[4744]: I0311 01:58:13.518287 4744 scope.go:117] "RemoveContainer" containerID="170b009a918db3c665e0bb2a8e364d49010c87a04e005840051d7a7a56b82ec4" Mar 11 01:58:19 crc kubenswrapper[4744]: I0311 01:58:19.975057 4744 scope.go:117] "RemoveContainer" containerID="be3f9ef48ec9e492a2adef204936907fa56b289e46f531f87079a08adff341c6" Mar 11 01:58:19 crc kubenswrapper[4744]: E0311 01:58:19.976496 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-678nx_openshift-machine-config-operator(a15dc7ac-7c34-4135-b6eb-a85122800ce9)\"" pod="openshift-machine-config-operator/machine-config-daemon-678nx" podUID="a15dc7ac-7c34-4135-b6eb-a85122800ce9" Mar 11 01:58:30 crc kubenswrapper[4744]: I0311 01:58:30.974968 4744 scope.go:117] "RemoveContainer" containerID="be3f9ef48ec9e492a2adef204936907fa56b289e46f531f87079a08adff341c6" Mar 11 01:58:30 crc kubenswrapper[4744]: E0311 01:58:30.976096 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-678nx_openshift-machine-config-operator(a15dc7ac-7c34-4135-b6eb-a85122800ce9)\"" pod="openshift-machine-config-operator/machine-config-daemon-678nx" podUID="a15dc7ac-7c34-4135-b6eb-a85122800ce9" Mar 11 01:58:45 crc kubenswrapper[4744]: I0311 01:58:45.975229 4744 scope.go:117] "RemoveContainer" 
containerID="be3f9ef48ec9e492a2adef204936907fa56b289e46f531f87079a08adff341c6" Mar 11 01:58:45 crc kubenswrapper[4744]: E0311 01:58:45.976239 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-678nx_openshift-machine-config-operator(a15dc7ac-7c34-4135-b6eb-a85122800ce9)\"" pod="openshift-machine-config-operator/machine-config-daemon-678nx" podUID="a15dc7ac-7c34-4135-b6eb-a85122800ce9" Mar 11 01:58:59 crc kubenswrapper[4744]: I0311 01:58:59.975896 4744 scope.go:117] "RemoveContainer" containerID="be3f9ef48ec9e492a2adef204936907fa56b289e46f531f87079a08adff341c6" Mar 11 01:58:59 crc kubenswrapper[4744]: E0311 01:58:59.976854 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-678nx_openshift-machine-config-operator(a15dc7ac-7c34-4135-b6eb-a85122800ce9)\"" pod="openshift-machine-config-operator/machine-config-daemon-678nx" podUID="a15dc7ac-7c34-4135-b6eb-a85122800ce9" Mar 11 01:59:13 crc kubenswrapper[4744]: I0311 01:59:13.982648 4744 scope.go:117] "RemoveContainer" containerID="be3f9ef48ec9e492a2adef204936907fa56b289e46f531f87079a08adff341c6" Mar 11 01:59:13 crc kubenswrapper[4744]: E0311 01:59:13.985347 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-678nx_openshift-machine-config-operator(a15dc7ac-7c34-4135-b6eb-a85122800ce9)\"" pod="openshift-machine-config-operator/machine-config-daemon-678nx" podUID="a15dc7ac-7c34-4135-b6eb-a85122800ce9" Mar 11 01:59:26 crc kubenswrapper[4744]: I0311 01:59:26.975579 4744 scope.go:117] 
"RemoveContainer" containerID="be3f9ef48ec9e492a2adef204936907fa56b289e46f531f87079a08adff341c6" Mar 11 01:59:26 crc kubenswrapper[4744]: E0311 01:59:26.976557 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-678nx_openshift-machine-config-operator(a15dc7ac-7c34-4135-b6eb-a85122800ce9)\"" pod="openshift-machine-config-operator/machine-config-daemon-678nx" podUID="a15dc7ac-7c34-4135-b6eb-a85122800ce9" Mar 11 01:59:37 crc kubenswrapper[4744]: I0311 01:59:37.975414 4744 scope.go:117] "RemoveContainer" containerID="be3f9ef48ec9e492a2adef204936907fa56b289e46f531f87079a08adff341c6" Mar 11 01:59:37 crc kubenswrapper[4744]: E0311 01:59:37.976448 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-678nx_openshift-machine-config-operator(a15dc7ac-7c34-4135-b6eb-a85122800ce9)\"" pod="openshift-machine-config-operator/machine-config-daemon-678nx" podUID="a15dc7ac-7c34-4135-b6eb-a85122800ce9" Mar 11 01:59:52 crc kubenswrapper[4744]: I0311 01:59:52.975323 4744 scope.go:117] "RemoveContainer" containerID="be3f9ef48ec9e492a2adef204936907fa56b289e46f531f87079a08adff341c6" Mar 11 01:59:53 crc kubenswrapper[4744]: I0311 01:59:53.900497 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-678nx" event={"ID":"a15dc7ac-7c34-4135-b6eb-a85122800ce9","Type":"ContainerStarted","Data":"31392fc530cf8815a26953cb2dc91784522d6a632ca8ff32cfa8a61da78aa9d3"} Mar 11 02:00:00 crc kubenswrapper[4744]: I0311 02:00:00.166537 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29553240-qfzrv"] Mar 11 02:00:00 crc kubenswrapper[4744]: E0311 
02:00:00.168481 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="de884a1e-19e9-49bd-b048-95dd915f6c51" containerName="oc" Mar 11 02:00:00 crc kubenswrapper[4744]: I0311 02:00:00.168504 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="de884a1e-19e9-49bd-b048-95dd915f6c51" containerName="oc" Mar 11 02:00:00 crc kubenswrapper[4744]: I0311 02:00:00.168769 4744 memory_manager.go:354] "RemoveStaleState removing state" podUID="de884a1e-19e9-49bd-b048-95dd915f6c51" containerName="oc" Mar 11 02:00:00 crc kubenswrapper[4744]: I0311 02:00:00.169475 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29553240-qfzrv" Mar 11 02:00:00 crc kubenswrapper[4744]: I0311 02:00:00.175624 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 11 02:00:00 crc kubenswrapper[4744]: I0311 02:00:00.176651 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-rs77s" Mar 11 02:00:00 crc kubenswrapper[4744]: I0311 02:00:00.177402 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 11 02:00:00 crc kubenswrapper[4744]: I0311 02:00:00.185331 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29553240-xqm9p"] Mar 11 02:00:00 crc kubenswrapper[4744]: I0311 02:00:00.187174 4744 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29553240-xqm9p" Mar 11 02:00:00 crc kubenswrapper[4744]: I0311 02:00:00.191842 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Mar 11 02:00:00 crc kubenswrapper[4744]: I0311 02:00:00.192683 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Mar 11 02:00:00 crc kubenswrapper[4744]: I0311 02:00:00.200784 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29553240-qfzrv"] Mar 11 02:00:00 crc kubenswrapper[4744]: I0311 02:00:00.211951 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29553240-xqm9p"] Mar 11 02:00:00 crc kubenswrapper[4744]: I0311 02:00:00.311957 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/9e7e1390-6f32-40d5-adec-3769768dae25-config-volume\") pod \"collect-profiles-29553240-xqm9p\" (UID: \"9e7e1390-6f32-40d5-adec-3769768dae25\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29553240-xqm9p" Mar 11 02:00:00 crc kubenswrapper[4744]: I0311 02:00:00.312030 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c4nnw\" (UniqueName: \"kubernetes.io/projected/9e7e1390-6f32-40d5-adec-3769768dae25-kube-api-access-c4nnw\") pod \"collect-profiles-29553240-xqm9p\" (UID: \"9e7e1390-6f32-40d5-adec-3769768dae25\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29553240-xqm9p" Mar 11 02:00:00 crc kubenswrapper[4744]: I0311 02:00:00.312124 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2rjmk\" (UniqueName: 
\"kubernetes.io/projected/d5295248-4dc5-4d74-b195-aef107864ca5-kube-api-access-2rjmk\") pod \"auto-csr-approver-29553240-qfzrv\" (UID: \"d5295248-4dc5-4d74-b195-aef107864ca5\") " pod="openshift-infra/auto-csr-approver-29553240-qfzrv" Mar 11 02:00:00 crc kubenswrapper[4744]: I0311 02:00:00.312175 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/9e7e1390-6f32-40d5-adec-3769768dae25-secret-volume\") pod \"collect-profiles-29553240-xqm9p\" (UID: \"9e7e1390-6f32-40d5-adec-3769768dae25\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29553240-xqm9p" Mar 11 02:00:00 crc kubenswrapper[4744]: I0311 02:00:00.413947 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/9e7e1390-6f32-40d5-adec-3769768dae25-config-volume\") pod \"collect-profiles-29553240-xqm9p\" (UID: \"9e7e1390-6f32-40d5-adec-3769768dae25\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29553240-xqm9p" Mar 11 02:00:00 crc kubenswrapper[4744]: I0311 02:00:00.414019 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c4nnw\" (UniqueName: \"kubernetes.io/projected/9e7e1390-6f32-40d5-adec-3769768dae25-kube-api-access-c4nnw\") pod \"collect-profiles-29553240-xqm9p\" (UID: \"9e7e1390-6f32-40d5-adec-3769768dae25\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29553240-xqm9p" Mar 11 02:00:00 crc kubenswrapper[4744]: I0311 02:00:00.414073 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2rjmk\" (UniqueName: \"kubernetes.io/projected/d5295248-4dc5-4d74-b195-aef107864ca5-kube-api-access-2rjmk\") pod \"auto-csr-approver-29553240-qfzrv\" (UID: \"d5295248-4dc5-4d74-b195-aef107864ca5\") " pod="openshift-infra/auto-csr-approver-29553240-qfzrv" Mar 11 02:00:00 crc kubenswrapper[4744]: I0311 
02:00:00.414113 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/9e7e1390-6f32-40d5-adec-3769768dae25-secret-volume\") pod \"collect-profiles-29553240-xqm9p\" (UID: \"9e7e1390-6f32-40d5-adec-3769768dae25\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29553240-xqm9p" Mar 11 02:00:00 crc kubenswrapper[4744]: I0311 02:00:00.416935 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/9e7e1390-6f32-40d5-adec-3769768dae25-config-volume\") pod \"collect-profiles-29553240-xqm9p\" (UID: \"9e7e1390-6f32-40d5-adec-3769768dae25\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29553240-xqm9p" Mar 11 02:00:00 crc kubenswrapper[4744]: I0311 02:00:00.422451 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/9e7e1390-6f32-40d5-adec-3769768dae25-secret-volume\") pod \"collect-profiles-29553240-xqm9p\" (UID: \"9e7e1390-6f32-40d5-adec-3769768dae25\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29553240-xqm9p" Mar 11 02:00:00 crc kubenswrapper[4744]: I0311 02:00:00.448129 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2rjmk\" (UniqueName: \"kubernetes.io/projected/d5295248-4dc5-4d74-b195-aef107864ca5-kube-api-access-2rjmk\") pod \"auto-csr-approver-29553240-qfzrv\" (UID: \"d5295248-4dc5-4d74-b195-aef107864ca5\") " pod="openshift-infra/auto-csr-approver-29553240-qfzrv" Mar 11 02:00:00 crc kubenswrapper[4744]: I0311 02:00:00.450022 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c4nnw\" (UniqueName: \"kubernetes.io/projected/9e7e1390-6f32-40d5-adec-3769768dae25-kube-api-access-c4nnw\") pod \"collect-profiles-29553240-xqm9p\" (UID: \"9e7e1390-6f32-40d5-adec-3769768dae25\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29553240-xqm9p" Mar 11 02:00:00 crc kubenswrapper[4744]: I0311 02:00:00.501549 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29553240-qfzrv" Mar 11 02:00:00 crc kubenswrapper[4744]: I0311 02:00:00.529737 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29553240-xqm9p" Mar 11 02:00:00 crc kubenswrapper[4744]: I0311 02:00:00.978575 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29553240-qfzrv"] Mar 11 02:00:00 crc kubenswrapper[4744]: W0311 02:00:00.984408 4744 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd5295248_4dc5_4d74_b195_aef107864ca5.slice/crio-a240ca9667ee23a59b4229296fae61f9aa5263453a12c00eb3814e9333bb85d1 WatchSource:0}: Error finding container a240ca9667ee23a59b4229296fae61f9aa5263453a12c00eb3814e9333bb85d1: Status 404 returned error can't find the container with id a240ca9667ee23a59b4229296fae61f9aa5263453a12c00eb3814e9333bb85d1 Mar 11 02:00:01 crc kubenswrapper[4744]: I0311 02:00:01.063625 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29553240-xqm9p"] Mar 11 02:00:01 crc kubenswrapper[4744]: W0311 02:00:01.066474 4744 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9e7e1390_6f32_40d5_adec_3769768dae25.slice/crio-694945aae92224b157d67d0773d34cf186d92acd2dea226f0840563569d74f75 WatchSource:0}: Error finding container 694945aae92224b157d67d0773d34cf186d92acd2dea226f0840563569d74f75: Status 404 returned error can't find the container with id 694945aae92224b157d67d0773d34cf186d92acd2dea226f0840563569d74f75 Mar 11 02:00:01 crc kubenswrapper[4744]: I0311 02:00:01.986782 4744 
generic.go:334] "Generic (PLEG): container finished" podID="9e7e1390-6f32-40d5-adec-3769768dae25" containerID="4ba9951769dc9cbd8f5ca426397018c1fd0f022b4f623255d5d0354ce3981c5b" exitCode=0 Mar 11 02:00:01 crc kubenswrapper[4744]: I0311 02:00:01.992239 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29553240-xqm9p" event={"ID":"9e7e1390-6f32-40d5-adec-3769768dae25","Type":"ContainerDied","Data":"4ba9951769dc9cbd8f5ca426397018c1fd0f022b4f623255d5d0354ce3981c5b"} Mar 11 02:00:01 crc kubenswrapper[4744]: I0311 02:00:01.992322 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29553240-xqm9p" event={"ID":"9e7e1390-6f32-40d5-adec-3769768dae25","Type":"ContainerStarted","Data":"694945aae92224b157d67d0773d34cf186d92acd2dea226f0840563569d74f75"} Mar 11 02:00:01 crc kubenswrapper[4744]: I0311 02:00:01.992354 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29553240-qfzrv" event={"ID":"d5295248-4dc5-4d74-b195-aef107864ca5","Type":"ContainerStarted","Data":"a240ca9667ee23a59b4229296fae61f9aa5263453a12c00eb3814e9333bb85d1"} Mar 11 02:00:03 crc kubenswrapper[4744]: I0311 02:00:03.556781 4744 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29553240-xqm9p" Mar 11 02:00:03 crc kubenswrapper[4744]: I0311 02:00:03.661473 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/9e7e1390-6f32-40d5-adec-3769768dae25-secret-volume\") pod \"9e7e1390-6f32-40d5-adec-3769768dae25\" (UID: \"9e7e1390-6f32-40d5-adec-3769768dae25\") " Mar 11 02:00:03 crc kubenswrapper[4744]: I0311 02:00:03.661662 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-c4nnw\" (UniqueName: \"kubernetes.io/projected/9e7e1390-6f32-40d5-adec-3769768dae25-kube-api-access-c4nnw\") pod \"9e7e1390-6f32-40d5-adec-3769768dae25\" (UID: \"9e7e1390-6f32-40d5-adec-3769768dae25\") " Mar 11 02:00:03 crc kubenswrapper[4744]: I0311 02:00:03.661748 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/9e7e1390-6f32-40d5-adec-3769768dae25-config-volume\") pod \"9e7e1390-6f32-40d5-adec-3769768dae25\" (UID: \"9e7e1390-6f32-40d5-adec-3769768dae25\") " Mar 11 02:00:03 crc kubenswrapper[4744]: I0311 02:00:03.662588 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9e7e1390-6f32-40d5-adec-3769768dae25-config-volume" (OuterVolumeSpecName: "config-volume") pod "9e7e1390-6f32-40d5-adec-3769768dae25" (UID: "9e7e1390-6f32-40d5-adec-3769768dae25"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 11 02:00:03 crc kubenswrapper[4744]: I0311 02:00:03.667608 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9e7e1390-6f32-40d5-adec-3769768dae25-kube-api-access-c4nnw" (OuterVolumeSpecName: "kube-api-access-c4nnw") pod "9e7e1390-6f32-40d5-adec-3769768dae25" (UID: "9e7e1390-6f32-40d5-adec-3769768dae25"). 
InnerVolumeSpecName "kube-api-access-c4nnw". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 02:00:03 crc kubenswrapper[4744]: I0311 02:00:03.668069 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9e7e1390-6f32-40d5-adec-3769768dae25-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "9e7e1390-6f32-40d5-adec-3769768dae25" (UID: "9e7e1390-6f32-40d5-adec-3769768dae25"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 02:00:03 crc kubenswrapper[4744]: I0311 02:00:03.763234 4744 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/9e7e1390-6f32-40d5-adec-3769768dae25-secret-volume\") on node \"crc\" DevicePath \"\"" Mar 11 02:00:03 crc kubenswrapper[4744]: I0311 02:00:03.763287 4744 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-c4nnw\" (UniqueName: \"kubernetes.io/projected/9e7e1390-6f32-40d5-adec-3769768dae25-kube-api-access-c4nnw\") on node \"crc\" DevicePath \"\"" Mar 11 02:00:03 crc kubenswrapper[4744]: I0311 02:00:03.763307 4744 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/9e7e1390-6f32-40d5-adec-3769768dae25-config-volume\") on node \"crc\" DevicePath \"\"" Mar 11 02:00:04 crc kubenswrapper[4744]: I0311 02:00:04.010400 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29553240-xqm9p" event={"ID":"9e7e1390-6f32-40d5-adec-3769768dae25","Type":"ContainerDied","Data":"694945aae92224b157d67d0773d34cf186d92acd2dea226f0840563569d74f75"} Mar 11 02:00:04 crc kubenswrapper[4744]: I0311 02:00:04.010457 4744 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="694945aae92224b157d67d0773d34cf186d92acd2dea226f0840563569d74f75" Mar 11 02:00:04 crc kubenswrapper[4744]: I0311 02:00:04.010472 4744 util.go:48] "No ready 
sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29553240-xqm9p" Mar 11 02:00:04 crc kubenswrapper[4744]: I0311 02:00:04.653984 4744 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29553195-rs6nk"] Mar 11 02:00:04 crc kubenswrapper[4744]: I0311 02:00:04.663050 4744 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29553195-rs6nk"] Mar 11 02:00:05 crc kubenswrapper[4744]: I0311 02:00:05.020366 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29553240-qfzrv" event={"ID":"d5295248-4dc5-4d74-b195-aef107864ca5","Type":"ContainerStarted","Data":"ff52bdb2a1a3df95d4a31c7119c69c9c1029cb28353134523ad21090eb1558a4"} Mar 11 02:00:05 crc kubenswrapper[4744]: I0311 02:00:05.034002 4744 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29553240-qfzrv" podStartSLOduration=1.395598387 podStartE2EDuration="5.033978026s" podCreationTimestamp="2026-03-11 02:00:00 +0000 UTC" firstStartedPulling="2026-03-11 02:00:00.986051425 +0000 UTC m=+3957.790269030" lastFinishedPulling="2026-03-11 02:00:04.624431024 +0000 UTC m=+3961.428648669" observedRunningTime="2026-03-11 02:00:05.032722636 +0000 UTC m=+3961.836940271" watchObservedRunningTime="2026-03-11 02:00:05.033978026 +0000 UTC m=+3961.838195671" Mar 11 02:00:05 crc kubenswrapper[4744]: I0311 02:00:05.992007 4744 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ab006bc8-78da-42a8-9322-f52588f20622" path="/var/lib/kubelet/pods/ab006bc8-78da-42a8-9322-f52588f20622/volumes" Mar 11 02:00:06 crc kubenswrapper[4744]: I0311 02:00:06.031475 4744 generic.go:334] "Generic (PLEG): container finished" podID="d5295248-4dc5-4d74-b195-aef107864ca5" containerID="ff52bdb2a1a3df95d4a31c7119c69c9c1029cb28353134523ad21090eb1558a4" exitCode=0 Mar 11 
02:00:06 crc kubenswrapper[4744]: I0311 02:00:06.031591 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29553240-qfzrv" event={"ID":"d5295248-4dc5-4d74-b195-aef107864ca5","Type":"ContainerDied","Data":"ff52bdb2a1a3df95d4a31c7119c69c9c1029cb28353134523ad21090eb1558a4"} Mar 11 02:00:07 crc kubenswrapper[4744]: I0311 02:00:07.471448 4744 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29553240-qfzrv" Mar 11 02:00:07 crc kubenswrapper[4744]: I0311 02:00:07.667101 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2rjmk\" (UniqueName: \"kubernetes.io/projected/d5295248-4dc5-4d74-b195-aef107864ca5-kube-api-access-2rjmk\") pod \"d5295248-4dc5-4d74-b195-aef107864ca5\" (UID: \"d5295248-4dc5-4d74-b195-aef107864ca5\") " Mar 11 02:00:07 crc kubenswrapper[4744]: I0311 02:00:07.676962 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d5295248-4dc5-4d74-b195-aef107864ca5-kube-api-access-2rjmk" (OuterVolumeSpecName: "kube-api-access-2rjmk") pod "d5295248-4dc5-4d74-b195-aef107864ca5" (UID: "d5295248-4dc5-4d74-b195-aef107864ca5"). InnerVolumeSpecName "kube-api-access-2rjmk". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 02:00:07 crc kubenswrapper[4744]: I0311 02:00:07.768839 4744 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2rjmk\" (UniqueName: \"kubernetes.io/projected/d5295248-4dc5-4d74-b195-aef107864ca5-kube-api-access-2rjmk\") on node \"crc\" DevicePath \"\"" Mar 11 02:00:08 crc kubenswrapper[4744]: I0311 02:00:08.081788 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29553240-qfzrv" event={"ID":"d5295248-4dc5-4d74-b195-aef107864ca5","Type":"ContainerDied","Data":"a240ca9667ee23a59b4229296fae61f9aa5263453a12c00eb3814e9333bb85d1"} Mar 11 02:00:08 crc kubenswrapper[4744]: I0311 02:00:08.081862 4744 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a240ca9667ee23a59b4229296fae61f9aa5263453a12c00eb3814e9333bb85d1" Mar 11 02:00:08 crc kubenswrapper[4744]: I0311 02:00:08.082005 4744 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29553240-qfzrv" Mar 11 02:00:08 crc kubenswrapper[4744]: I0311 02:00:08.116303 4744 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29553234-929lb"] Mar 11 02:00:08 crc kubenswrapper[4744]: I0311 02:00:08.126733 4744 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29553234-929lb"] Mar 11 02:00:09 crc kubenswrapper[4744]: I0311 02:00:09.991858 4744 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9ad08b6d-bbfa-4608-94bd-5fc70b2e6eaf" path="/var/lib/kubelet/pods/9ad08b6d-bbfa-4608-94bd-5fc70b2e6eaf/volumes" Mar 11 02:00:13 crc kubenswrapper[4744]: I0311 02:00:13.629909 4744 scope.go:117] "RemoveContainer" containerID="cdbd07cc878eb88c67ce05d3b1694154a3da2389f594ee97e43fe4ee869192d7" Mar 11 02:00:13 crc kubenswrapper[4744]: I0311 02:00:13.662912 4744 scope.go:117] "RemoveContainer" 
containerID="113cf4ba5f83f3948b6cb7050999ae5902ca3d99efb39c19ee4499dfcea59e6b" Mar 11 02:02:00 crc kubenswrapper[4744]: I0311 02:02:00.169618 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29553242-c9ggj"] Mar 11 02:02:00 crc kubenswrapper[4744]: E0311 02:02:00.171045 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d5295248-4dc5-4d74-b195-aef107864ca5" containerName="oc" Mar 11 02:02:00 crc kubenswrapper[4744]: I0311 02:02:00.171079 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="d5295248-4dc5-4d74-b195-aef107864ca5" containerName="oc" Mar 11 02:02:00 crc kubenswrapper[4744]: E0311 02:02:00.171112 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9e7e1390-6f32-40d5-adec-3769768dae25" containerName="collect-profiles" Mar 11 02:02:00 crc kubenswrapper[4744]: I0311 02:02:00.171128 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="9e7e1390-6f32-40d5-adec-3769768dae25" containerName="collect-profiles" Mar 11 02:02:00 crc kubenswrapper[4744]: I0311 02:02:00.171739 4744 memory_manager.go:354] "RemoveStaleState removing state" podUID="d5295248-4dc5-4d74-b195-aef107864ca5" containerName="oc" Mar 11 02:02:00 crc kubenswrapper[4744]: I0311 02:02:00.171787 4744 memory_manager.go:354] "RemoveStaleState removing state" podUID="9e7e1390-6f32-40d5-adec-3769768dae25" containerName="collect-profiles" Mar 11 02:02:00 crc kubenswrapper[4744]: I0311 02:02:00.172776 4744 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29553242-c9ggj" Mar 11 02:02:00 crc kubenswrapper[4744]: I0311 02:02:00.177322 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 11 02:02:00 crc kubenswrapper[4744]: I0311 02:02:00.177537 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-rs77s" Mar 11 02:02:00 crc kubenswrapper[4744]: I0311 02:02:00.180358 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 11 02:02:00 crc kubenswrapper[4744]: I0311 02:02:00.182231 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29553242-c9ggj"] Mar 11 02:02:00 crc kubenswrapper[4744]: I0311 02:02:00.323293 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kzrdt\" (UniqueName: \"kubernetes.io/projected/e6abb1e5-6401-4c2e-862f-7762ed28775e-kube-api-access-kzrdt\") pod \"auto-csr-approver-29553242-c9ggj\" (UID: \"e6abb1e5-6401-4c2e-862f-7762ed28775e\") " pod="openshift-infra/auto-csr-approver-29553242-c9ggj" Mar 11 02:02:00 crc kubenswrapper[4744]: I0311 02:02:00.425267 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kzrdt\" (UniqueName: \"kubernetes.io/projected/e6abb1e5-6401-4c2e-862f-7762ed28775e-kube-api-access-kzrdt\") pod \"auto-csr-approver-29553242-c9ggj\" (UID: \"e6abb1e5-6401-4c2e-862f-7762ed28775e\") " pod="openshift-infra/auto-csr-approver-29553242-c9ggj" Mar 11 02:02:00 crc kubenswrapper[4744]: I0311 02:02:00.463934 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kzrdt\" (UniqueName: \"kubernetes.io/projected/e6abb1e5-6401-4c2e-862f-7762ed28775e-kube-api-access-kzrdt\") pod \"auto-csr-approver-29553242-c9ggj\" (UID: \"e6abb1e5-6401-4c2e-862f-7762ed28775e\") " 
pod="openshift-infra/auto-csr-approver-29553242-c9ggj" Mar 11 02:02:00 crc kubenswrapper[4744]: I0311 02:02:00.511315 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29553242-c9ggj" Mar 11 02:02:00 crc kubenswrapper[4744]: I0311 02:02:00.840676 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29553242-c9ggj"] Mar 11 02:02:00 crc kubenswrapper[4744]: W0311 02:02:00.841756 4744 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode6abb1e5_6401_4c2e_862f_7762ed28775e.slice/crio-8b86c40bccb8a5330dda037e0169f28c0b7bb1b13bf0fe331ae4db16f7b6eb3b WatchSource:0}: Error finding container 8b86c40bccb8a5330dda037e0169f28c0b7bb1b13bf0fe331ae4db16f7b6eb3b: Status 404 returned error can't find the container with id 8b86c40bccb8a5330dda037e0169f28c0b7bb1b13bf0fe331ae4db16f7b6eb3b Mar 11 02:02:00 crc kubenswrapper[4744]: I0311 02:02:00.844442 4744 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 11 02:02:01 crc kubenswrapper[4744]: I0311 02:02:01.106066 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29553242-c9ggj" event={"ID":"e6abb1e5-6401-4c2e-862f-7762ed28775e","Type":"ContainerStarted","Data":"8b86c40bccb8a5330dda037e0169f28c0b7bb1b13bf0fe331ae4db16f7b6eb3b"} Mar 11 02:02:03 crc kubenswrapper[4744]: I0311 02:02:03.126016 4744 generic.go:334] "Generic (PLEG): container finished" podID="e6abb1e5-6401-4c2e-862f-7762ed28775e" containerID="69ad3b48812eee0d5e9f23e72288bd4ed9437e16e887d68dc7fb0e1ccc69603a" exitCode=0 Mar 11 02:02:03 crc kubenswrapper[4744]: I0311 02:02:03.126139 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29553242-c9ggj" 
event={"ID":"e6abb1e5-6401-4c2e-862f-7762ed28775e","Type":"ContainerDied","Data":"69ad3b48812eee0d5e9f23e72288bd4ed9437e16e887d68dc7fb0e1ccc69603a"} Mar 11 02:02:04 crc kubenswrapper[4744]: I0311 02:02:04.442458 4744 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29553242-c9ggj" Mar 11 02:02:04 crc kubenswrapper[4744]: I0311 02:02:04.601240 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kzrdt\" (UniqueName: \"kubernetes.io/projected/e6abb1e5-6401-4c2e-862f-7762ed28775e-kube-api-access-kzrdt\") pod \"e6abb1e5-6401-4c2e-862f-7762ed28775e\" (UID: \"e6abb1e5-6401-4c2e-862f-7762ed28775e\") " Mar 11 02:02:04 crc kubenswrapper[4744]: I0311 02:02:04.605918 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e6abb1e5-6401-4c2e-862f-7762ed28775e-kube-api-access-kzrdt" (OuterVolumeSpecName: "kube-api-access-kzrdt") pod "e6abb1e5-6401-4c2e-862f-7762ed28775e" (UID: "e6abb1e5-6401-4c2e-862f-7762ed28775e"). InnerVolumeSpecName "kube-api-access-kzrdt". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 02:02:04 crc kubenswrapper[4744]: I0311 02:02:04.704219 4744 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kzrdt\" (UniqueName: \"kubernetes.io/projected/e6abb1e5-6401-4c2e-862f-7762ed28775e-kube-api-access-kzrdt\") on node \"crc\" DevicePath \"\"" Mar 11 02:02:05 crc kubenswrapper[4744]: I0311 02:02:05.145930 4744 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29553242-c9ggj" Mar 11 02:02:05 crc kubenswrapper[4744]: I0311 02:02:05.145945 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29553242-c9ggj" event={"ID":"e6abb1e5-6401-4c2e-862f-7762ed28775e","Type":"ContainerDied","Data":"8b86c40bccb8a5330dda037e0169f28c0b7bb1b13bf0fe331ae4db16f7b6eb3b"} Mar 11 02:02:05 crc kubenswrapper[4744]: I0311 02:02:05.146127 4744 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8b86c40bccb8a5330dda037e0169f28c0b7bb1b13bf0fe331ae4db16f7b6eb3b" Mar 11 02:02:05 crc kubenswrapper[4744]: I0311 02:02:05.529217 4744 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29553236-74j6k"] Mar 11 02:02:05 crc kubenswrapper[4744]: I0311 02:02:05.536352 4744 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29553236-74j6k"] Mar 11 02:02:05 crc kubenswrapper[4744]: I0311 02:02:05.990988 4744 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="dae30bbc-1daa-41bd-949c-b97e8db8a318" path="/var/lib/kubelet/pods/dae30bbc-1daa-41bd-949c-b97e8db8a318/volumes" Mar 11 02:02:12 crc kubenswrapper[4744]: I0311 02:02:12.409241 4744 patch_prober.go:28] interesting pod/machine-config-daemon-678nx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 11 02:02:12 crc kubenswrapper[4744]: I0311 02:02:12.409710 4744 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-678nx" podUID="a15dc7ac-7c34-4135-b6eb-a85122800ce9" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 11 02:02:13 crc 
kubenswrapper[4744]: I0311 02:02:13.840027 4744 scope.go:117] "RemoveContainer" containerID="e59c443f7b4a50658d313f92f9775cbd0d9466a72efdb2344e4a466534f6788c" Mar 11 02:02:42 crc kubenswrapper[4744]: I0311 02:02:42.409665 4744 patch_prober.go:28] interesting pod/machine-config-daemon-678nx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 11 02:02:42 crc kubenswrapper[4744]: I0311 02:02:42.410324 4744 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-678nx" podUID="a15dc7ac-7c34-4135-b6eb-a85122800ce9" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 11 02:03:11 crc kubenswrapper[4744]: I0311 02:03:11.998970 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-2tjtk"] Mar 11 02:03:12 crc kubenswrapper[4744]: E0311 02:03:12.000311 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e6abb1e5-6401-4c2e-862f-7762ed28775e" containerName="oc" Mar 11 02:03:12 crc kubenswrapper[4744]: I0311 02:03:12.000345 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="e6abb1e5-6401-4c2e-862f-7762ed28775e" containerName="oc" Mar 11 02:03:12 crc kubenswrapper[4744]: I0311 02:03:12.000732 4744 memory_manager.go:354] "RemoveStaleState removing state" podUID="e6abb1e5-6401-4c2e-862f-7762ed28775e" containerName="oc" Mar 11 02:03:12 crc kubenswrapper[4744]: I0311 02:03:12.002702 4744 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-2tjtk" Mar 11 02:03:12 crc kubenswrapper[4744]: I0311 02:03:12.009774 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-2tjtk"] Mar 11 02:03:12 crc kubenswrapper[4744]: I0311 02:03:12.124344 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9d9aac08-d760-4bee-971c-fe5be9f20063-catalog-content\") pod \"redhat-marketplace-2tjtk\" (UID: \"9d9aac08-d760-4bee-971c-fe5be9f20063\") " pod="openshift-marketplace/redhat-marketplace-2tjtk" Mar 11 02:03:12 crc kubenswrapper[4744]: I0311 02:03:12.124614 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9d9aac08-d760-4bee-971c-fe5be9f20063-utilities\") pod \"redhat-marketplace-2tjtk\" (UID: \"9d9aac08-d760-4bee-971c-fe5be9f20063\") " pod="openshift-marketplace/redhat-marketplace-2tjtk" Mar 11 02:03:12 crc kubenswrapper[4744]: I0311 02:03:12.124775 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6jf5k\" (UniqueName: \"kubernetes.io/projected/9d9aac08-d760-4bee-971c-fe5be9f20063-kube-api-access-6jf5k\") pod \"redhat-marketplace-2tjtk\" (UID: \"9d9aac08-d760-4bee-971c-fe5be9f20063\") " pod="openshift-marketplace/redhat-marketplace-2tjtk" Mar 11 02:03:12 crc kubenswrapper[4744]: I0311 02:03:12.226652 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9d9aac08-d760-4bee-971c-fe5be9f20063-utilities\") pod \"redhat-marketplace-2tjtk\" (UID: \"9d9aac08-d760-4bee-971c-fe5be9f20063\") " pod="openshift-marketplace/redhat-marketplace-2tjtk" Mar 11 02:03:12 crc kubenswrapper[4744]: I0311 02:03:12.226729 4744 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"kube-api-access-6jf5k\" (UniqueName: \"kubernetes.io/projected/9d9aac08-d760-4bee-971c-fe5be9f20063-kube-api-access-6jf5k\") pod \"redhat-marketplace-2tjtk\" (UID: \"9d9aac08-d760-4bee-971c-fe5be9f20063\") " pod="openshift-marketplace/redhat-marketplace-2tjtk" Mar 11 02:03:12 crc kubenswrapper[4744]: I0311 02:03:12.226804 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9d9aac08-d760-4bee-971c-fe5be9f20063-catalog-content\") pod \"redhat-marketplace-2tjtk\" (UID: \"9d9aac08-d760-4bee-971c-fe5be9f20063\") " pod="openshift-marketplace/redhat-marketplace-2tjtk" Mar 11 02:03:12 crc kubenswrapper[4744]: I0311 02:03:12.227299 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9d9aac08-d760-4bee-971c-fe5be9f20063-utilities\") pod \"redhat-marketplace-2tjtk\" (UID: \"9d9aac08-d760-4bee-971c-fe5be9f20063\") " pod="openshift-marketplace/redhat-marketplace-2tjtk" Mar 11 02:03:12 crc kubenswrapper[4744]: I0311 02:03:12.227370 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9d9aac08-d760-4bee-971c-fe5be9f20063-catalog-content\") pod \"redhat-marketplace-2tjtk\" (UID: \"9d9aac08-d760-4bee-971c-fe5be9f20063\") " pod="openshift-marketplace/redhat-marketplace-2tjtk" Mar 11 02:03:12 crc kubenswrapper[4744]: I0311 02:03:12.258388 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6jf5k\" (UniqueName: \"kubernetes.io/projected/9d9aac08-d760-4bee-971c-fe5be9f20063-kube-api-access-6jf5k\") pod \"redhat-marketplace-2tjtk\" (UID: \"9d9aac08-d760-4bee-971c-fe5be9f20063\") " pod="openshift-marketplace/redhat-marketplace-2tjtk" Mar 11 02:03:12 crc kubenswrapper[4744]: I0311 02:03:12.339578 4744 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-2tjtk" Mar 11 02:03:12 crc kubenswrapper[4744]: I0311 02:03:12.409838 4744 patch_prober.go:28] interesting pod/machine-config-daemon-678nx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 11 02:03:12 crc kubenswrapper[4744]: I0311 02:03:12.409930 4744 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-678nx" podUID="a15dc7ac-7c34-4135-b6eb-a85122800ce9" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 11 02:03:12 crc kubenswrapper[4744]: I0311 02:03:12.410002 4744 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-678nx" Mar 11 02:03:12 crc kubenswrapper[4744]: I0311 02:03:12.411031 4744 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"31392fc530cf8815a26953cb2dc91784522d6a632ca8ff32cfa8a61da78aa9d3"} pod="openshift-machine-config-operator/machine-config-daemon-678nx" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 11 02:03:12 crc kubenswrapper[4744]: I0311 02:03:12.411132 4744 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-678nx" podUID="a15dc7ac-7c34-4135-b6eb-a85122800ce9" containerName="machine-config-daemon" containerID="cri-o://31392fc530cf8815a26953cb2dc91784522d6a632ca8ff32cfa8a61da78aa9d3" gracePeriod=600 Mar 11 02:03:12 crc kubenswrapper[4744]: I0311 02:03:12.761147 4744 generic.go:334] "Generic (PLEG): container finished" 
podID="a15dc7ac-7c34-4135-b6eb-a85122800ce9" containerID="31392fc530cf8815a26953cb2dc91784522d6a632ca8ff32cfa8a61da78aa9d3" exitCode=0 Mar 11 02:03:12 crc kubenswrapper[4744]: I0311 02:03:12.761601 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-678nx" event={"ID":"a15dc7ac-7c34-4135-b6eb-a85122800ce9","Type":"ContainerDied","Data":"31392fc530cf8815a26953cb2dc91784522d6a632ca8ff32cfa8a61da78aa9d3"} Mar 11 02:03:12 crc kubenswrapper[4744]: I0311 02:03:12.761762 4744 scope.go:117] "RemoveContainer" containerID="be3f9ef48ec9e492a2adef204936907fa56b289e46f531f87079a08adff341c6" Mar 11 02:03:12 crc kubenswrapper[4744]: I0311 02:03:12.762466 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-678nx" event={"ID":"a15dc7ac-7c34-4135-b6eb-a85122800ce9","Type":"ContainerStarted","Data":"b19c8ef67037a89700ac31d101d1a95194ef5d85b37d3ed88aad632d7215a921"} Mar 11 02:03:12 crc kubenswrapper[4744]: I0311 02:03:12.920804 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-2tjtk"] Mar 11 02:03:12 crc kubenswrapper[4744]: W0311 02:03:12.922195 4744 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9d9aac08_d760_4bee_971c_fe5be9f20063.slice/crio-c90f7e4fb9b277b81bea22bd80a39c7aa292b5e065d58ad8d365980a42fa4993 WatchSource:0}: Error finding container c90f7e4fb9b277b81bea22bd80a39c7aa292b5e065d58ad8d365980a42fa4993: Status 404 returned error can't find the container with id c90f7e4fb9b277b81bea22bd80a39c7aa292b5e065d58ad8d365980a42fa4993 Mar 11 02:03:13 crc kubenswrapper[4744]: I0311 02:03:13.778885 4744 generic.go:334] "Generic (PLEG): container finished" podID="9d9aac08-d760-4bee-971c-fe5be9f20063" containerID="c8d7ee57e2d77a14cbdaed0285525685b647d2f026df3790505003324f19be23" exitCode=0 Mar 11 02:03:13 crc kubenswrapper[4744]: 
I0311 02:03:13.779121 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-2tjtk" event={"ID":"9d9aac08-d760-4bee-971c-fe5be9f20063","Type":"ContainerDied","Data":"c8d7ee57e2d77a14cbdaed0285525685b647d2f026df3790505003324f19be23"} Mar 11 02:03:13 crc kubenswrapper[4744]: I0311 02:03:13.779288 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-2tjtk" event={"ID":"9d9aac08-d760-4bee-971c-fe5be9f20063","Type":"ContainerStarted","Data":"c90f7e4fb9b277b81bea22bd80a39c7aa292b5e065d58ad8d365980a42fa4993"} Mar 11 02:03:15 crc kubenswrapper[4744]: I0311 02:03:15.798121 4744 generic.go:334] "Generic (PLEG): container finished" podID="9d9aac08-d760-4bee-971c-fe5be9f20063" containerID="1dda5bdbd9b904ee8fef0d6717c85d9823ed310af07f11644f0e4024bd6a8324" exitCode=0 Mar 11 02:03:15 crc kubenswrapper[4744]: I0311 02:03:15.798240 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-2tjtk" event={"ID":"9d9aac08-d760-4bee-971c-fe5be9f20063","Type":"ContainerDied","Data":"1dda5bdbd9b904ee8fef0d6717c85d9823ed310af07f11644f0e4024bd6a8324"} Mar 11 02:03:16 crc kubenswrapper[4744]: I0311 02:03:16.812716 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-2tjtk" event={"ID":"9d9aac08-d760-4bee-971c-fe5be9f20063","Type":"ContainerStarted","Data":"19776d057c0721e162c524ee2371892003f38282df4fe4abdee92944ddaca4f0"} Mar 11 02:03:16 crc kubenswrapper[4744]: I0311 02:03:16.850693 4744 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-2tjtk" podStartSLOduration=3.423694556 podStartE2EDuration="5.850668005s" podCreationTimestamp="2026-03-11 02:03:11 +0000 UTC" firstStartedPulling="2026-03-11 02:03:13.78158179 +0000 UTC m=+4150.585799435" lastFinishedPulling="2026-03-11 02:03:16.208555249 +0000 UTC m=+4153.012772884" 
observedRunningTime="2026-03-11 02:03:16.839752358 +0000 UTC m=+4153.643970003" watchObservedRunningTime="2026-03-11 02:03:16.850668005 +0000 UTC m=+4153.654885650" Mar 11 02:03:22 crc kubenswrapper[4744]: I0311 02:03:22.340618 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-2tjtk" Mar 11 02:03:22 crc kubenswrapper[4744]: I0311 02:03:22.341293 4744 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-2tjtk" Mar 11 02:03:22 crc kubenswrapper[4744]: I0311 02:03:22.424988 4744 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-2tjtk" Mar 11 02:03:22 crc kubenswrapper[4744]: I0311 02:03:22.957007 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-2tjtk" Mar 11 02:03:23 crc kubenswrapper[4744]: I0311 02:03:23.022140 4744 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-2tjtk"] Mar 11 02:03:24 crc kubenswrapper[4744]: I0311 02:03:24.895498 4744 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-2tjtk" podUID="9d9aac08-d760-4bee-971c-fe5be9f20063" containerName="registry-server" containerID="cri-o://19776d057c0721e162c524ee2371892003f38282df4fe4abdee92944ddaca4f0" gracePeriod=2 Mar 11 02:03:25 crc kubenswrapper[4744]: I0311 02:03:25.376005 4744 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-2tjtk" Mar 11 02:03:25 crc kubenswrapper[4744]: I0311 02:03:25.447202 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6jf5k\" (UniqueName: \"kubernetes.io/projected/9d9aac08-d760-4bee-971c-fe5be9f20063-kube-api-access-6jf5k\") pod \"9d9aac08-d760-4bee-971c-fe5be9f20063\" (UID: \"9d9aac08-d760-4bee-971c-fe5be9f20063\") " Mar 11 02:03:25 crc kubenswrapper[4744]: I0311 02:03:25.447307 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9d9aac08-d760-4bee-971c-fe5be9f20063-catalog-content\") pod \"9d9aac08-d760-4bee-971c-fe5be9f20063\" (UID: \"9d9aac08-d760-4bee-971c-fe5be9f20063\") " Mar 11 02:03:25 crc kubenswrapper[4744]: I0311 02:03:25.447367 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9d9aac08-d760-4bee-971c-fe5be9f20063-utilities\") pod \"9d9aac08-d760-4bee-971c-fe5be9f20063\" (UID: \"9d9aac08-d760-4bee-971c-fe5be9f20063\") " Mar 11 02:03:25 crc kubenswrapper[4744]: I0311 02:03:25.448414 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9d9aac08-d760-4bee-971c-fe5be9f20063-utilities" (OuterVolumeSpecName: "utilities") pod "9d9aac08-d760-4bee-971c-fe5be9f20063" (UID: "9d9aac08-d760-4bee-971c-fe5be9f20063"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 11 02:03:25 crc kubenswrapper[4744]: I0311 02:03:25.457034 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9d9aac08-d760-4bee-971c-fe5be9f20063-kube-api-access-6jf5k" (OuterVolumeSpecName: "kube-api-access-6jf5k") pod "9d9aac08-d760-4bee-971c-fe5be9f20063" (UID: "9d9aac08-d760-4bee-971c-fe5be9f20063"). InnerVolumeSpecName "kube-api-access-6jf5k". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 02:03:25 crc kubenswrapper[4744]: I0311 02:03:25.485444 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9d9aac08-d760-4bee-971c-fe5be9f20063-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "9d9aac08-d760-4bee-971c-fe5be9f20063" (UID: "9d9aac08-d760-4bee-971c-fe5be9f20063"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 11 02:03:25 crc kubenswrapper[4744]: I0311 02:03:25.549362 4744 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6jf5k\" (UniqueName: \"kubernetes.io/projected/9d9aac08-d760-4bee-971c-fe5be9f20063-kube-api-access-6jf5k\") on node \"crc\" DevicePath \"\"" Mar 11 02:03:25 crc kubenswrapper[4744]: I0311 02:03:25.549389 4744 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9d9aac08-d760-4bee-971c-fe5be9f20063-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 11 02:03:25 crc kubenswrapper[4744]: I0311 02:03:25.549400 4744 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9d9aac08-d760-4bee-971c-fe5be9f20063-utilities\") on node \"crc\" DevicePath \"\"" Mar 11 02:03:25 crc kubenswrapper[4744]: I0311 02:03:25.907305 4744 generic.go:334] "Generic (PLEG): container finished" podID="9d9aac08-d760-4bee-971c-fe5be9f20063" containerID="19776d057c0721e162c524ee2371892003f38282df4fe4abdee92944ddaca4f0" exitCode=0 Mar 11 02:03:25 crc kubenswrapper[4744]: I0311 02:03:25.907366 4744 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-2tjtk" Mar 11 02:03:25 crc kubenswrapper[4744]: I0311 02:03:25.907382 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-2tjtk" event={"ID":"9d9aac08-d760-4bee-971c-fe5be9f20063","Type":"ContainerDied","Data":"19776d057c0721e162c524ee2371892003f38282df4fe4abdee92944ddaca4f0"} Mar 11 02:03:25 crc kubenswrapper[4744]: I0311 02:03:25.907599 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-2tjtk" event={"ID":"9d9aac08-d760-4bee-971c-fe5be9f20063","Type":"ContainerDied","Data":"c90f7e4fb9b277b81bea22bd80a39c7aa292b5e065d58ad8d365980a42fa4993"} Mar 11 02:03:25 crc kubenswrapper[4744]: I0311 02:03:25.907631 4744 scope.go:117] "RemoveContainer" containerID="19776d057c0721e162c524ee2371892003f38282df4fe4abdee92944ddaca4f0" Mar 11 02:03:25 crc kubenswrapper[4744]: I0311 02:03:25.935246 4744 scope.go:117] "RemoveContainer" containerID="1dda5bdbd9b904ee8fef0d6717c85d9823ed310af07f11644f0e4024bd6a8324" Mar 11 02:03:25 crc kubenswrapper[4744]: I0311 02:03:25.969147 4744 scope.go:117] "RemoveContainer" containerID="c8d7ee57e2d77a14cbdaed0285525685b647d2f026df3790505003324f19be23" Mar 11 02:03:25 crc kubenswrapper[4744]: I0311 02:03:25.971222 4744 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-2tjtk"] Mar 11 02:03:25 crc kubenswrapper[4744]: I0311 02:03:25.997831 4744 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-2tjtk"] Mar 11 02:03:26 crc kubenswrapper[4744]: I0311 02:03:26.011246 4744 scope.go:117] "RemoveContainer" containerID="19776d057c0721e162c524ee2371892003f38282df4fe4abdee92944ddaca4f0" Mar 11 02:03:26 crc kubenswrapper[4744]: E0311 02:03:26.014101 4744 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"19776d057c0721e162c524ee2371892003f38282df4fe4abdee92944ddaca4f0\": container with ID starting with 19776d057c0721e162c524ee2371892003f38282df4fe4abdee92944ddaca4f0 not found: ID does not exist" containerID="19776d057c0721e162c524ee2371892003f38282df4fe4abdee92944ddaca4f0" Mar 11 02:03:26 crc kubenswrapper[4744]: I0311 02:03:26.014190 4744 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"19776d057c0721e162c524ee2371892003f38282df4fe4abdee92944ddaca4f0"} err="failed to get container status \"19776d057c0721e162c524ee2371892003f38282df4fe4abdee92944ddaca4f0\": rpc error: code = NotFound desc = could not find container \"19776d057c0721e162c524ee2371892003f38282df4fe4abdee92944ddaca4f0\": container with ID starting with 19776d057c0721e162c524ee2371892003f38282df4fe4abdee92944ddaca4f0 not found: ID does not exist" Mar 11 02:03:26 crc kubenswrapper[4744]: I0311 02:03:26.014219 4744 scope.go:117] "RemoveContainer" containerID="1dda5bdbd9b904ee8fef0d6717c85d9823ed310af07f11644f0e4024bd6a8324" Mar 11 02:03:26 crc kubenswrapper[4744]: E0311 02:03:26.015727 4744 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1dda5bdbd9b904ee8fef0d6717c85d9823ed310af07f11644f0e4024bd6a8324\": container with ID starting with 1dda5bdbd9b904ee8fef0d6717c85d9823ed310af07f11644f0e4024bd6a8324 not found: ID does not exist" containerID="1dda5bdbd9b904ee8fef0d6717c85d9823ed310af07f11644f0e4024bd6a8324" Mar 11 02:03:26 crc kubenswrapper[4744]: I0311 02:03:26.015775 4744 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1dda5bdbd9b904ee8fef0d6717c85d9823ed310af07f11644f0e4024bd6a8324"} err="failed to get container status \"1dda5bdbd9b904ee8fef0d6717c85d9823ed310af07f11644f0e4024bd6a8324\": rpc error: code = NotFound desc = could not find container \"1dda5bdbd9b904ee8fef0d6717c85d9823ed310af07f11644f0e4024bd6a8324\": container with ID 
starting with 1dda5bdbd9b904ee8fef0d6717c85d9823ed310af07f11644f0e4024bd6a8324 not found: ID does not exist" Mar 11 02:03:26 crc kubenswrapper[4744]: I0311 02:03:26.015809 4744 scope.go:117] "RemoveContainer" containerID="c8d7ee57e2d77a14cbdaed0285525685b647d2f026df3790505003324f19be23" Mar 11 02:03:26 crc kubenswrapper[4744]: E0311 02:03:26.016215 4744 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c8d7ee57e2d77a14cbdaed0285525685b647d2f026df3790505003324f19be23\": container with ID starting with c8d7ee57e2d77a14cbdaed0285525685b647d2f026df3790505003324f19be23 not found: ID does not exist" containerID="c8d7ee57e2d77a14cbdaed0285525685b647d2f026df3790505003324f19be23" Mar 11 02:03:26 crc kubenswrapper[4744]: I0311 02:03:26.016256 4744 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c8d7ee57e2d77a14cbdaed0285525685b647d2f026df3790505003324f19be23"} err="failed to get container status \"c8d7ee57e2d77a14cbdaed0285525685b647d2f026df3790505003324f19be23\": rpc error: code = NotFound desc = could not find container \"c8d7ee57e2d77a14cbdaed0285525685b647d2f026df3790505003324f19be23\": container with ID starting with c8d7ee57e2d77a14cbdaed0285525685b647d2f026df3790505003324f19be23 not found: ID does not exist" Mar 11 02:03:27 crc kubenswrapper[4744]: I0311 02:03:27.989583 4744 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9d9aac08-d760-4bee-971c-fe5be9f20063" path="/var/lib/kubelet/pods/9d9aac08-d760-4bee-971c-fe5be9f20063/volumes" Mar 11 02:03:39 crc kubenswrapper[4744]: I0311 02:03:39.669473 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-56ldm"] Mar 11 02:03:39 crc kubenswrapper[4744]: E0311 02:03:39.670650 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9d9aac08-d760-4bee-971c-fe5be9f20063" containerName="extract-utilities" Mar 11 02:03:39 crc 
kubenswrapper[4744]: I0311 02:03:39.670673 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="9d9aac08-d760-4bee-971c-fe5be9f20063" containerName="extract-utilities" Mar 11 02:03:39 crc kubenswrapper[4744]: E0311 02:03:39.670719 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9d9aac08-d760-4bee-971c-fe5be9f20063" containerName="registry-server" Mar 11 02:03:39 crc kubenswrapper[4744]: I0311 02:03:39.670731 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="9d9aac08-d760-4bee-971c-fe5be9f20063" containerName="registry-server" Mar 11 02:03:39 crc kubenswrapper[4744]: E0311 02:03:39.670755 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9d9aac08-d760-4bee-971c-fe5be9f20063" containerName="extract-content" Mar 11 02:03:39 crc kubenswrapper[4744]: I0311 02:03:39.670769 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="9d9aac08-d760-4bee-971c-fe5be9f20063" containerName="extract-content" Mar 11 02:03:39 crc kubenswrapper[4744]: I0311 02:03:39.671005 4744 memory_manager.go:354] "RemoveStaleState removing state" podUID="9d9aac08-d760-4bee-971c-fe5be9f20063" containerName="registry-server" Mar 11 02:03:39 crc kubenswrapper[4744]: I0311 02:03:39.673264 4744 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-56ldm" Mar 11 02:03:39 crc kubenswrapper[4744]: I0311 02:03:39.688968 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-56ldm"] Mar 11 02:03:39 crc kubenswrapper[4744]: I0311 02:03:39.775141 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2daea8ea-8a33-4f1a-84d4-02823bd2bbc4-catalog-content\") pod \"certified-operators-56ldm\" (UID: \"2daea8ea-8a33-4f1a-84d4-02823bd2bbc4\") " pod="openshift-marketplace/certified-operators-56ldm" Mar 11 02:03:39 crc kubenswrapper[4744]: I0311 02:03:39.775235 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f5hdd\" (UniqueName: \"kubernetes.io/projected/2daea8ea-8a33-4f1a-84d4-02823bd2bbc4-kube-api-access-f5hdd\") pod \"certified-operators-56ldm\" (UID: \"2daea8ea-8a33-4f1a-84d4-02823bd2bbc4\") " pod="openshift-marketplace/certified-operators-56ldm" Mar 11 02:03:39 crc kubenswrapper[4744]: I0311 02:03:39.775280 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2daea8ea-8a33-4f1a-84d4-02823bd2bbc4-utilities\") pod \"certified-operators-56ldm\" (UID: \"2daea8ea-8a33-4f1a-84d4-02823bd2bbc4\") " pod="openshift-marketplace/certified-operators-56ldm" Mar 11 02:03:39 crc kubenswrapper[4744]: I0311 02:03:39.876672 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2daea8ea-8a33-4f1a-84d4-02823bd2bbc4-catalog-content\") pod \"certified-operators-56ldm\" (UID: \"2daea8ea-8a33-4f1a-84d4-02823bd2bbc4\") " pod="openshift-marketplace/certified-operators-56ldm" Mar 11 02:03:39 crc kubenswrapper[4744]: I0311 02:03:39.876769 4744 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-f5hdd\" (UniqueName: \"kubernetes.io/projected/2daea8ea-8a33-4f1a-84d4-02823bd2bbc4-kube-api-access-f5hdd\") pod \"certified-operators-56ldm\" (UID: \"2daea8ea-8a33-4f1a-84d4-02823bd2bbc4\") " pod="openshift-marketplace/certified-operators-56ldm" Mar 11 02:03:39 crc kubenswrapper[4744]: I0311 02:03:39.876821 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2daea8ea-8a33-4f1a-84d4-02823bd2bbc4-utilities\") pod \"certified-operators-56ldm\" (UID: \"2daea8ea-8a33-4f1a-84d4-02823bd2bbc4\") " pod="openshift-marketplace/certified-operators-56ldm" Mar 11 02:03:39 crc kubenswrapper[4744]: I0311 02:03:39.877307 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2daea8ea-8a33-4f1a-84d4-02823bd2bbc4-catalog-content\") pod \"certified-operators-56ldm\" (UID: \"2daea8ea-8a33-4f1a-84d4-02823bd2bbc4\") " pod="openshift-marketplace/certified-operators-56ldm" Mar 11 02:03:39 crc kubenswrapper[4744]: I0311 02:03:39.877644 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2daea8ea-8a33-4f1a-84d4-02823bd2bbc4-utilities\") pod \"certified-operators-56ldm\" (UID: \"2daea8ea-8a33-4f1a-84d4-02823bd2bbc4\") " pod="openshift-marketplace/certified-operators-56ldm" Mar 11 02:03:39 crc kubenswrapper[4744]: I0311 02:03:39.910888 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f5hdd\" (UniqueName: \"kubernetes.io/projected/2daea8ea-8a33-4f1a-84d4-02823bd2bbc4-kube-api-access-f5hdd\") pod \"certified-operators-56ldm\" (UID: \"2daea8ea-8a33-4f1a-84d4-02823bd2bbc4\") " pod="openshift-marketplace/certified-operators-56ldm" Mar 11 02:03:40 crc kubenswrapper[4744]: I0311 02:03:40.001397 4744 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-56ldm" Mar 11 02:03:40 crc kubenswrapper[4744]: I0311 02:03:40.339887 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-56ldm"] Mar 11 02:03:41 crc kubenswrapper[4744]: I0311 02:03:41.097303 4744 generic.go:334] "Generic (PLEG): container finished" podID="2daea8ea-8a33-4f1a-84d4-02823bd2bbc4" containerID="b6428215a46fe224370b51add7a3ff4259fd09eda4766cd21271297e22228872" exitCode=0 Mar 11 02:03:41 crc kubenswrapper[4744]: I0311 02:03:41.097363 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-56ldm" event={"ID":"2daea8ea-8a33-4f1a-84d4-02823bd2bbc4","Type":"ContainerDied","Data":"b6428215a46fe224370b51add7a3ff4259fd09eda4766cd21271297e22228872"} Mar 11 02:03:41 crc kubenswrapper[4744]: I0311 02:03:41.097399 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-56ldm" event={"ID":"2daea8ea-8a33-4f1a-84d4-02823bd2bbc4","Type":"ContainerStarted","Data":"c23895e758481057eccaf24a534caffa346a37354bbf54654d743fc23ac36347"} Mar 11 02:03:42 crc kubenswrapper[4744]: I0311 02:03:42.109383 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-56ldm" event={"ID":"2daea8ea-8a33-4f1a-84d4-02823bd2bbc4","Type":"ContainerStarted","Data":"188a17d852a02824e4733b9f5b59acbbc2f010a2dccacc55089d297c58d7dd97"} Mar 11 02:03:43 crc kubenswrapper[4744]: I0311 02:03:43.124242 4744 generic.go:334] "Generic (PLEG): container finished" podID="2daea8ea-8a33-4f1a-84d4-02823bd2bbc4" containerID="188a17d852a02824e4733b9f5b59acbbc2f010a2dccacc55089d297c58d7dd97" exitCode=0 Mar 11 02:03:43 crc kubenswrapper[4744]: I0311 02:03:43.124359 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-56ldm" 
event={"ID":"2daea8ea-8a33-4f1a-84d4-02823bd2bbc4","Type":"ContainerDied","Data":"188a17d852a02824e4733b9f5b59acbbc2f010a2dccacc55089d297c58d7dd97"} Mar 11 02:03:44 crc kubenswrapper[4744]: I0311 02:03:44.137208 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-56ldm" event={"ID":"2daea8ea-8a33-4f1a-84d4-02823bd2bbc4","Type":"ContainerStarted","Data":"11eadc36b6621f014f557a796eda6fcd38b316e7a0b00933fc3dd833443f203f"} Mar 11 02:03:44 crc kubenswrapper[4744]: I0311 02:03:44.169594 4744 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-56ldm" podStartSLOduration=2.671758224 podStartE2EDuration="5.169570435s" podCreationTimestamp="2026-03-11 02:03:39 +0000 UTC" firstStartedPulling="2026-03-11 02:03:41.099314464 +0000 UTC m=+4177.903532099" lastFinishedPulling="2026-03-11 02:03:43.597126675 +0000 UTC m=+4180.401344310" observedRunningTime="2026-03-11 02:03:44.166263533 +0000 UTC m=+4180.970481168" watchObservedRunningTime="2026-03-11 02:03:44.169570435 +0000 UTC m=+4180.973788070" Mar 11 02:03:50 crc kubenswrapper[4744]: I0311 02:03:50.002692 4744 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-56ldm" Mar 11 02:03:50 crc kubenswrapper[4744]: I0311 02:03:50.003052 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-56ldm" Mar 11 02:03:50 crc kubenswrapper[4744]: I0311 02:03:50.084325 4744 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-56ldm" Mar 11 02:03:50 crc kubenswrapper[4744]: I0311 02:03:50.262347 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-56ldm" Mar 11 02:03:50 crc kubenswrapper[4744]: I0311 02:03:50.335953 4744 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openshift-marketplace/certified-operators-56ldm"] Mar 11 02:03:52 crc kubenswrapper[4744]: I0311 02:03:52.203783 4744 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-56ldm" podUID="2daea8ea-8a33-4f1a-84d4-02823bd2bbc4" containerName="registry-server" containerID="cri-o://11eadc36b6621f014f557a796eda6fcd38b316e7a0b00933fc3dd833443f203f" gracePeriod=2 Mar 11 02:03:52 crc kubenswrapper[4744]: I0311 02:03:52.667347 4744 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-56ldm" Mar 11 02:03:52 crc kubenswrapper[4744]: I0311 02:03:52.686708 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-f5hdd\" (UniqueName: \"kubernetes.io/projected/2daea8ea-8a33-4f1a-84d4-02823bd2bbc4-kube-api-access-f5hdd\") pod \"2daea8ea-8a33-4f1a-84d4-02823bd2bbc4\" (UID: \"2daea8ea-8a33-4f1a-84d4-02823bd2bbc4\") " Mar 11 02:03:52 crc kubenswrapper[4744]: I0311 02:03:52.686839 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2daea8ea-8a33-4f1a-84d4-02823bd2bbc4-catalog-content\") pod \"2daea8ea-8a33-4f1a-84d4-02823bd2bbc4\" (UID: \"2daea8ea-8a33-4f1a-84d4-02823bd2bbc4\") " Mar 11 02:03:52 crc kubenswrapper[4744]: I0311 02:03:52.687050 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2daea8ea-8a33-4f1a-84d4-02823bd2bbc4-utilities\") pod \"2daea8ea-8a33-4f1a-84d4-02823bd2bbc4\" (UID: \"2daea8ea-8a33-4f1a-84d4-02823bd2bbc4\") " Mar 11 02:03:52 crc kubenswrapper[4744]: I0311 02:03:52.688771 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2daea8ea-8a33-4f1a-84d4-02823bd2bbc4-utilities" (OuterVolumeSpecName: "utilities") pod "2daea8ea-8a33-4f1a-84d4-02823bd2bbc4" (UID: 
"2daea8ea-8a33-4f1a-84d4-02823bd2bbc4"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 11 02:03:52 crc kubenswrapper[4744]: I0311 02:03:52.697642 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2daea8ea-8a33-4f1a-84d4-02823bd2bbc4-kube-api-access-f5hdd" (OuterVolumeSpecName: "kube-api-access-f5hdd") pod "2daea8ea-8a33-4f1a-84d4-02823bd2bbc4" (UID: "2daea8ea-8a33-4f1a-84d4-02823bd2bbc4"). InnerVolumeSpecName "kube-api-access-f5hdd". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 02:03:52 crc kubenswrapper[4744]: I0311 02:03:52.788908 4744 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2daea8ea-8a33-4f1a-84d4-02823bd2bbc4-utilities\") on node \"crc\" DevicePath \"\"" Mar 11 02:03:52 crc kubenswrapper[4744]: I0311 02:03:52.788957 4744 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-f5hdd\" (UniqueName: \"kubernetes.io/projected/2daea8ea-8a33-4f1a-84d4-02823bd2bbc4-kube-api-access-f5hdd\") on node \"crc\" DevicePath \"\"" Mar 11 02:03:52 crc kubenswrapper[4744]: I0311 02:03:52.971286 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2daea8ea-8a33-4f1a-84d4-02823bd2bbc4-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "2daea8ea-8a33-4f1a-84d4-02823bd2bbc4" (UID: "2daea8ea-8a33-4f1a-84d4-02823bd2bbc4"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 11 02:03:52 crc kubenswrapper[4744]: I0311 02:03:52.991488 4744 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2daea8ea-8a33-4f1a-84d4-02823bd2bbc4-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 11 02:03:53 crc kubenswrapper[4744]: I0311 02:03:53.215251 4744 generic.go:334] "Generic (PLEG): container finished" podID="2daea8ea-8a33-4f1a-84d4-02823bd2bbc4" containerID="11eadc36b6621f014f557a796eda6fcd38b316e7a0b00933fc3dd833443f203f" exitCode=0 Mar 11 02:03:53 crc kubenswrapper[4744]: I0311 02:03:53.215300 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-56ldm" event={"ID":"2daea8ea-8a33-4f1a-84d4-02823bd2bbc4","Type":"ContainerDied","Data":"11eadc36b6621f014f557a796eda6fcd38b316e7a0b00933fc3dd833443f203f"} Mar 11 02:03:53 crc kubenswrapper[4744]: I0311 02:03:53.215342 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-56ldm" event={"ID":"2daea8ea-8a33-4f1a-84d4-02823bd2bbc4","Type":"ContainerDied","Data":"c23895e758481057eccaf24a534caffa346a37354bbf54654d743fc23ac36347"} Mar 11 02:03:53 crc kubenswrapper[4744]: I0311 02:03:53.215351 4744 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-56ldm" Mar 11 02:03:53 crc kubenswrapper[4744]: I0311 02:03:53.215386 4744 scope.go:117] "RemoveContainer" containerID="11eadc36b6621f014f557a796eda6fcd38b316e7a0b00933fc3dd833443f203f" Mar 11 02:03:53 crc kubenswrapper[4744]: I0311 02:03:53.251214 4744 scope.go:117] "RemoveContainer" containerID="188a17d852a02824e4733b9f5b59acbbc2f010a2dccacc55089d297c58d7dd97" Mar 11 02:03:53 crc kubenswrapper[4744]: I0311 02:03:53.293530 4744 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-56ldm"] Mar 11 02:03:53 crc kubenswrapper[4744]: I0311 02:03:53.303208 4744 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-56ldm"] Mar 11 02:03:53 crc kubenswrapper[4744]: I0311 02:03:53.309456 4744 scope.go:117] "RemoveContainer" containerID="b6428215a46fe224370b51add7a3ff4259fd09eda4766cd21271297e22228872" Mar 11 02:03:53 crc kubenswrapper[4744]: I0311 02:03:53.335791 4744 scope.go:117] "RemoveContainer" containerID="11eadc36b6621f014f557a796eda6fcd38b316e7a0b00933fc3dd833443f203f" Mar 11 02:03:53 crc kubenswrapper[4744]: E0311 02:03:53.336571 4744 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"11eadc36b6621f014f557a796eda6fcd38b316e7a0b00933fc3dd833443f203f\": container with ID starting with 11eadc36b6621f014f557a796eda6fcd38b316e7a0b00933fc3dd833443f203f not found: ID does not exist" containerID="11eadc36b6621f014f557a796eda6fcd38b316e7a0b00933fc3dd833443f203f" Mar 11 02:03:53 crc kubenswrapper[4744]: I0311 02:03:53.336610 4744 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"11eadc36b6621f014f557a796eda6fcd38b316e7a0b00933fc3dd833443f203f"} err="failed to get container status \"11eadc36b6621f014f557a796eda6fcd38b316e7a0b00933fc3dd833443f203f\": rpc error: code = NotFound desc = could not find 
container \"11eadc36b6621f014f557a796eda6fcd38b316e7a0b00933fc3dd833443f203f\": container with ID starting with 11eadc36b6621f014f557a796eda6fcd38b316e7a0b00933fc3dd833443f203f not found: ID does not exist" Mar 11 02:03:53 crc kubenswrapper[4744]: I0311 02:03:53.336635 4744 scope.go:117] "RemoveContainer" containerID="188a17d852a02824e4733b9f5b59acbbc2f010a2dccacc55089d297c58d7dd97" Mar 11 02:03:53 crc kubenswrapper[4744]: E0311 02:03:53.337071 4744 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"188a17d852a02824e4733b9f5b59acbbc2f010a2dccacc55089d297c58d7dd97\": container with ID starting with 188a17d852a02824e4733b9f5b59acbbc2f010a2dccacc55089d297c58d7dd97 not found: ID does not exist" containerID="188a17d852a02824e4733b9f5b59acbbc2f010a2dccacc55089d297c58d7dd97" Mar 11 02:03:53 crc kubenswrapper[4744]: I0311 02:03:53.337141 4744 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"188a17d852a02824e4733b9f5b59acbbc2f010a2dccacc55089d297c58d7dd97"} err="failed to get container status \"188a17d852a02824e4733b9f5b59acbbc2f010a2dccacc55089d297c58d7dd97\": rpc error: code = NotFound desc = could not find container \"188a17d852a02824e4733b9f5b59acbbc2f010a2dccacc55089d297c58d7dd97\": container with ID starting with 188a17d852a02824e4733b9f5b59acbbc2f010a2dccacc55089d297c58d7dd97 not found: ID does not exist" Mar 11 02:03:53 crc kubenswrapper[4744]: I0311 02:03:53.337186 4744 scope.go:117] "RemoveContainer" containerID="b6428215a46fe224370b51add7a3ff4259fd09eda4766cd21271297e22228872" Mar 11 02:03:53 crc kubenswrapper[4744]: E0311 02:03:53.337693 4744 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b6428215a46fe224370b51add7a3ff4259fd09eda4766cd21271297e22228872\": container with ID starting with b6428215a46fe224370b51add7a3ff4259fd09eda4766cd21271297e22228872 not found: ID does 
not exist" containerID="b6428215a46fe224370b51add7a3ff4259fd09eda4766cd21271297e22228872" Mar 11 02:03:53 crc kubenswrapper[4744]: I0311 02:03:53.337718 4744 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b6428215a46fe224370b51add7a3ff4259fd09eda4766cd21271297e22228872"} err="failed to get container status \"b6428215a46fe224370b51add7a3ff4259fd09eda4766cd21271297e22228872\": rpc error: code = NotFound desc = could not find container \"b6428215a46fe224370b51add7a3ff4259fd09eda4766cd21271297e22228872\": container with ID starting with b6428215a46fe224370b51add7a3ff4259fd09eda4766cd21271297e22228872 not found: ID does not exist" Mar 11 02:03:53 crc kubenswrapper[4744]: I0311 02:03:53.990286 4744 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2daea8ea-8a33-4f1a-84d4-02823bd2bbc4" path="/var/lib/kubelet/pods/2daea8ea-8a33-4f1a-84d4-02823bd2bbc4/volumes" Mar 11 02:04:00 crc kubenswrapper[4744]: I0311 02:04:00.169873 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29553244-s2hjr"] Mar 11 02:04:00 crc kubenswrapper[4744]: E0311 02:04:00.170953 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2daea8ea-8a33-4f1a-84d4-02823bd2bbc4" containerName="extract-content" Mar 11 02:04:00 crc kubenswrapper[4744]: I0311 02:04:00.170986 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="2daea8ea-8a33-4f1a-84d4-02823bd2bbc4" containerName="extract-content" Mar 11 02:04:00 crc kubenswrapper[4744]: E0311 02:04:00.171020 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2daea8ea-8a33-4f1a-84d4-02823bd2bbc4" containerName="registry-server" Mar 11 02:04:00 crc kubenswrapper[4744]: I0311 02:04:00.171031 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="2daea8ea-8a33-4f1a-84d4-02823bd2bbc4" containerName="registry-server" Mar 11 02:04:00 crc kubenswrapper[4744]: E0311 02:04:00.171047 4744 cpu_manager.go:410] 
"RemoveStaleState: removing container" podUID="2daea8ea-8a33-4f1a-84d4-02823bd2bbc4" containerName="extract-utilities" Mar 11 02:04:00 crc kubenswrapper[4744]: I0311 02:04:00.171056 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="2daea8ea-8a33-4f1a-84d4-02823bd2bbc4" containerName="extract-utilities" Mar 11 02:04:00 crc kubenswrapper[4744]: I0311 02:04:00.171247 4744 memory_manager.go:354] "RemoveStaleState removing state" podUID="2daea8ea-8a33-4f1a-84d4-02823bd2bbc4" containerName="registry-server" Mar 11 02:04:00 crc kubenswrapper[4744]: I0311 02:04:00.171918 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29553244-s2hjr" Mar 11 02:04:00 crc kubenswrapper[4744]: I0311 02:04:00.175190 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 11 02:04:00 crc kubenswrapper[4744]: I0311 02:04:00.175238 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 11 02:04:00 crc kubenswrapper[4744]: I0311 02:04:00.175552 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-rs77s" Mar 11 02:04:00 crc kubenswrapper[4744]: I0311 02:04:00.206588 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29553244-s2hjr"] Mar 11 02:04:00 crc kubenswrapper[4744]: I0311 02:04:00.308919 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dqtgb\" (UniqueName: \"kubernetes.io/projected/efeef355-d6c4-454e-ab54-ed70864c4866-kube-api-access-dqtgb\") pod \"auto-csr-approver-29553244-s2hjr\" (UID: \"efeef355-d6c4-454e-ab54-ed70864c4866\") " pod="openshift-infra/auto-csr-approver-29553244-s2hjr" Mar 11 02:04:00 crc kubenswrapper[4744]: I0311 02:04:00.410734 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-dqtgb\" (UniqueName: \"kubernetes.io/projected/efeef355-d6c4-454e-ab54-ed70864c4866-kube-api-access-dqtgb\") pod \"auto-csr-approver-29553244-s2hjr\" (UID: \"efeef355-d6c4-454e-ab54-ed70864c4866\") " pod="openshift-infra/auto-csr-approver-29553244-s2hjr" Mar 11 02:04:00 crc kubenswrapper[4744]: I0311 02:04:00.433754 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dqtgb\" (UniqueName: \"kubernetes.io/projected/efeef355-d6c4-454e-ab54-ed70864c4866-kube-api-access-dqtgb\") pod \"auto-csr-approver-29553244-s2hjr\" (UID: \"efeef355-d6c4-454e-ab54-ed70864c4866\") " pod="openshift-infra/auto-csr-approver-29553244-s2hjr" Mar 11 02:04:00 crc kubenswrapper[4744]: I0311 02:04:00.508429 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29553244-s2hjr" Mar 11 02:04:01 crc kubenswrapper[4744]: I0311 02:04:01.010924 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29553244-s2hjr"] Mar 11 02:04:01 crc kubenswrapper[4744]: I0311 02:04:01.297410 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29553244-s2hjr" event={"ID":"efeef355-d6c4-454e-ab54-ed70864c4866","Type":"ContainerStarted","Data":"fd04bede7ac3ef713b17e21c322f73031aa72d93f6784990e97841ed50298cc0"} Mar 11 02:04:02 crc kubenswrapper[4744]: I0311 02:04:02.307892 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29553244-s2hjr" event={"ID":"efeef355-d6c4-454e-ab54-ed70864c4866","Type":"ContainerStarted","Data":"40bc43de380583bcbbe88e0d85720b655940108ce772a4e0bf566f11a31f8297"} Mar 11 02:04:02 crc kubenswrapper[4744]: I0311 02:04:02.329265 4744 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29553244-s2hjr" podStartSLOduration=1.402510143 podStartE2EDuration="2.329241506s" podCreationTimestamp="2026-03-11 02:04:00 
+0000 UTC" firstStartedPulling="2026-03-11 02:04:01.027981955 +0000 UTC m=+4197.832199590" lastFinishedPulling="2026-03-11 02:04:01.954713338 +0000 UTC m=+4198.758930953" observedRunningTime="2026-03-11 02:04:02.328204074 +0000 UTC m=+4199.132421679" watchObservedRunningTime="2026-03-11 02:04:02.329241506 +0000 UTC m=+4199.133459151" Mar 11 02:04:03 crc kubenswrapper[4744]: I0311 02:04:03.320876 4744 generic.go:334] "Generic (PLEG): container finished" podID="efeef355-d6c4-454e-ab54-ed70864c4866" containerID="40bc43de380583bcbbe88e0d85720b655940108ce772a4e0bf566f11a31f8297" exitCode=0 Mar 11 02:04:03 crc kubenswrapper[4744]: I0311 02:04:03.320936 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29553244-s2hjr" event={"ID":"efeef355-d6c4-454e-ab54-ed70864c4866","Type":"ContainerDied","Data":"40bc43de380583bcbbe88e0d85720b655940108ce772a4e0bf566f11a31f8297"} Mar 11 02:04:04 crc kubenswrapper[4744]: I0311 02:04:04.731431 4744 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29553244-s2hjr" Mar 11 02:04:04 crc kubenswrapper[4744]: I0311 02:04:04.878050 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dqtgb\" (UniqueName: \"kubernetes.io/projected/efeef355-d6c4-454e-ab54-ed70864c4866-kube-api-access-dqtgb\") pod \"efeef355-d6c4-454e-ab54-ed70864c4866\" (UID: \"efeef355-d6c4-454e-ab54-ed70864c4866\") " Mar 11 02:04:04 crc kubenswrapper[4744]: I0311 02:04:04.883484 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/efeef355-d6c4-454e-ab54-ed70864c4866-kube-api-access-dqtgb" (OuterVolumeSpecName: "kube-api-access-dqtgb") pod "efeef355-d6c4-454e-ab54-ed70864c4866" (UID: "efeef355-d6c4-454e-ab54-ed70864c4866"). InnerVolumeSpecName "kube-api-access-dqtgb". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 02:04:04 crc kubenswrapper[4744]: I0311 02:04:04.980379 4744 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dqtgb\" (UniqueName: \"kubernetes.io/projected/efeef355-d6c4-454e-ab54-ed70864c4866-kube-api-access-dqtgb\") on node \"crc\" DevicePath \"\"" Mar 11 02:04:05 crc kubenswrapper[4744]: I0311 02:04:05.339936 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29553244-s2hjr" event={"ID":"efeef355-d6c4-454e-ab54-ed70864c4866","Type":"ContainerDied","Data":"fd04bede7ac3ef713b17e21c322f73031aa72d93f6784990e97841ed50298cc0"} Mar 11 02:04:05 crc kubenswrapper[4744]: I0311 02:04:05.339982 4744 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="fd04bede7ac3ef713b17e21c322f73031aa72d93f6784990e97841ed50298cc0" Mar 11 02:04:05 crc kubenswrapper[4744]: I0311 02:04:05.340027 4744 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29553244-s2hjr" Mar 11 02:04:05 crc kubenswrapper[4744]: I0311 02:04:05.415160 4744 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29553238-662x9"] Mar 11 02:04:05 crc kubenswrapper[4744]: I0311 02:04:05.422007 4744 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29553238-662x9"] Mar 11 02:04:05 crc kubenswrapper[4744]: I0311 02:04:05.990891 4744 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="de884a1e-19e9-49bd-b048-95dd915f6c51" path="/var/lib/kubelet/pods/de884a1e-19e9-49bd-b048-95dd915f6c51/volumes" Mar 11 02:04:13 crc kubenswrapper[4744]: I0311 02:04:13.978876 4744 scope.go:117] "RemoveContainer" containerID="e827d36cee92756f0c25367a0b39828d5d2770053e936ed7a5cadc9bb8e43249" Mar 11 02:04:39 crc kubenswrapper[4744]: I0311 02:04:39.459800 4744 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-marketplace/redhat-operators-rf74s"] Mar 11 02:04:39 crc kubenswrapper[4744]: E0311 02:04:39.463098 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="efeef355-d6c4-454e-ab54-ed70864c4866" containerName="oc" Mar 11 02:04:39 crc kubenswrapper[4744]: I0311 02:04:39.463302 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="efeef355-d6c4-454e-ab54-ed70864c4866" containerName="oc" Mar 11 02:04:39 crc kubenswrapper[4744]: I0311 02:04:39.463805 4744 memory_manager.go:354] "RemoveStaleState removing state" podUID="efeef355-d6c4-454e-ab54-ed70864c4866" containerName="oc" Mar 11 02:04:39 crc kubenswrapper[4744]: I0311 02:04:39.466271 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-rf74s" Mar 11 02:04:39 crc kubenswrapper[4744]: I0311 02:04:39.487052 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-rf74s"] Mar 11 02:04:39 crc kubenswrapper[4744]: I0311 02:04:39.563899 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ac9ac009-2f5c-473d-9fd2-2bf19331ebbf-utilities\") pod \"redhat-operators-rf74s\" (UID: \"ac9ac009-2f5c-473d-9fd2-2bf19331ebbf\") " pod="openshift-marketplace/redhat-operators-rf74s" Mar 11 02:04:39 crc kubenswrapper[4744]: I0311 02:04:39.564030 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ac9ac009-2f5c-473d-9fd2-2bf19331ebbf-catalog-content\") pod \"redhat-operators-rf74s\" (UID: \"ac9ac009-2f5c-473d-9fd2-2bf19331ebbf\") " pod="openshift-marketplace/redhat-operators-rf74s" Mar 11 02:04:39 crc kubenswrapper[4744]: I0311 02:04:39.564098 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-74g7b\" (UniqueName: 
\"kubernetes.io/projected/ac9ac009-2f5c-473d-9fd2-2bf19331ebbf-kube-api-access-74g7b\") pod \"redhat-operators-rf74s\" (UID: \"ac9ac009-2f5c-473d-9fd2-2bf19331ebbf\") " pod="openshift-marketplace/redhat-operators-rf74s" Mar 11 02:04:39 crc kubenswrapper[4744]: I0311 02:04:39.665136 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ac9ac009-2f5c-473d-9fd2-2bf19331ebbf-utilities\") pod \"redhat-operators-rf74s\" (UID: \"ac9ac009-2f5c-473d-9fd2-2bf19331ebbf\") " pod="openshift-marketplace/redhat-operators-rf74s" Mar 11 02:04:39 crc kubenswrapper[4744]: I0311 02:04:39.665412 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ac9ac009-2f5c-473d-9fd2-2bf19331ebbf-catalog-content\") pod \"redhat-operators-rf74s\" (UID: \"ac9ac009-2f5c-473d-9fd2-2bf19331ebbf\") " pod="openshift-marketplace/redhat-operators-rf74s" Mar 11 02:04:39 crc kubenswrapper[4744]: I0311 02:04:39.665508 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-74g7b\" (UniqueName: \"kubernetes.io/projected/ac9ac009-2f5c-473d-9fd2-2bf19331ebbf-kube-api-access-74g7b\") pod \"redhat-operators-rf74s\" (UID: \"ac9ac009-2f5c-473d-9fd2-2bf19331ebbf\") " pod="openshift-marketplace/redhat-operators-rf74s" Mar 11 02:04:39 crc kubenswrapper[4744]: I0311 02:04:39.665609 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ac9ac009-2f5c-473d-9fd2-2bf19331ebbf-utilities\") pod \"redhat-operators-rf74s\" (UID: \"ac9ac009-2f5c-473d-9fd2-2bf19331ebbf\") " pod="openshift-marketplace/redhat-operators-rf74s" Mar 11 02:04:39 crc kubenswrapper[4744]: I0311 02:04:39.665824 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/ac9ac009-2f5c-473d-9fd2-2bf19331ebbf-catalog-content\") pod \"redhat-operators-rf74s\" (UID: \"ac9ac009-2f5c-473d-9fd2-2bf19331ebbf\") " pod="openshift-marketplace/redhat-operators-rf74s" Mar 11 02:04:39 crc kubenswrapper[4744]: I0311 02:04:39.683775 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-74g7b\" (UniqueName: \"kubernetes.io/projected/ac9ac009-2f5c-473d-9fd2-2bf19331ebbf-kube-api-access-74g7b\") pod \"redhat-operators-rf74s\" (UID: \"ac9ac009-2f5c-473d-9fd2-2bf19331ebbf\") " pod="openshift-marketplace/redhat-operators-rf74s" Mar 11 02:04:39 crc kubenswrapper[4744]: I0311 02:04:39.797260 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-rf74s" Mar 11 02:04:40 crc kubenswrapper[4744]: I0311 02:04:40.328296 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-rf74s"] Mar 11 02:04:40 crc kubenswrapper[4744]: I0311 02:04:40.669379 4744 generic.go:334] "Generic (PLEG): container finished" podID="ac9ac009-2f5c-473d-9fd2-2bf19331ebbf" containerID="15231fcd887c8d0eb5d471b4f568631c52e9153ae409e0eee2bb5b93f8cb4753" exitCode=0 Mar 11 02:04:40 crc kubenswrapper[4744]: I0311 02:04:40.669495 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-rf74s" event={"ID":"ac9ac009-2f5c-473d-9fd2-2bf19331ebbf","Type":"ContainerDied","Data":"15231fcd887c8d0eb5d471b4f568631c52e9153ae409e0eee2bb5b93f8cb4753"} Mar 11 02:04:40 crc kubenswrapper[4744]: I0311 02:04:40.669917 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-rf74s" event={"ID":"ac9ac009-2f5c-473d-9fd2-2bf19331ebbf","Type":"ContainerStarted","Data":"708011c76473243d57736ff14eea2ea0cbcac2f6197a56697bc4d7aaea873b06"} Mar 11 02:04:41 crc kubenswrapper[4744]: I0311 02:04:41.682629 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/redhat-operators-rf74s" event={"ID":"ac9ac009-2f5c-473d-9fd2-2bf19331ebbf","Type":"ContainerStarted","Data":"2e5d54ca5b6e015510c272654dd4580fa88382938a705d6d64a64b5263eb9d17"} Mar 11 02:04:42 crc kubenswrapper[4744]: I0311 02:04:42.696137 4744 generic.go:334] "Generic (PLEG): container finished" podID="ac9ac009-2f5c-473d-9fd2-2bf19331ebbf" containerID="2e5d54ca5b6e015510c272654dd4580fa88382938a705d6d64a64b5263eb9d17" exitCode=0 Mar 11 02:04:42 crc kubenswrapper[4744]: I0311 02:04:42.696218 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-rf74s" event={"ID":"ac9ac009-2f5c-473d-9fd2-2bf19331ebbf","Type":"ContainerDied","Data":"2e5d54ca5b6e015510c272654dd4580fa88382938a705d6d64a64b5263eb9d17"} Mar 11 02:04:43 crc kubenswrapper[4744]: I0311 02:04:43.705094 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-rf74s" event={"ID":"ac9ac009-2f5c-473d-9fd2-2bf19331ebbf","Type":"ContainerStarted","Data":"7401d8c100774abdaf862b2cfa374e5ab2c45bf73cff922abf544576096cf46a"} Mar 11 02:04:43 crc kubenswrapper[4744]: I0311 02:04:43.730349 4744 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-rf74s" podStartSLOduration=2.217978587 podStartE2EDuration="4.730318597s" podCreationTimestamp="2026-03-11 02:04:39 +0000 UTC" firstStartedPulling="2026-03-11 02:04:40.671376726 +0000 UTC m=+4237.475594331" lastFinishedPulling="2026-03-11 02:04:43.183716726 +0000 UTC m=+4239.987934341" observedRunningTime="2026-03-11 02:04:43.730271546 +0000 UTC m=+4240.534489161" watchObservedRunningTime="2026-03-11 02:04:43.730318597 +0000 UTC m=+4240.534536232" Mar 11 02:04:44 crc kubenswrapper[4744]: I0311 02:04:44.667736 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-htqb8"] Mar 11 02:04:44 crc kubenswrapper[4744]: I0311 02:04:44.670731 4744 util.go:30] "No sandbox for pod 
can be found. Need to start a new one" pod="openshift-marketplace/community-operators-htqb8" Mar 11 02:04:44 crc kubenswrapper[4744]: I0311 02:04:44.688817 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-htqb8"] Mar 11 02:04:44 crc kubenswrapper[4744]: I0311 02:04:44.753698 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bd2b6b55-8f48-48e2-bfc9-9a81f5b277c9-catalog-content\") pod \"community-operators-htqb8\" (UID: \"bd2b6b55-8f48-48e2-bfc9-9a81f5b277c9\") " pod="openshift-marketplace/community-operators-htqb8" Mar 11 02:04:44 crc kubenswrapper[4744]: I0311 02:04:44.753816 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7vm7d\" (UniqueName: \"kubernetes.io/projected/bd2b6b55-8f48-48e2-bfc9-9a81f5b277c9-kube-api-access-7vm7d\") pod \"community-operators-htqb8\" (UID: \"bd2b6b55-8f48-48e2-bfc9-9a81f5b277c9\") " pod="openshift-marketplace/community-operators-htqb8" Mar 11 02:04:44 crc kubenswrapper[4744]: I0311 02:04:44.753849 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bd2b6b55-8f48-48e2-bfc9-9a81f5b277c9-utilities\") pod \"community-operators-htqb8\" (UID: \"bd2b6b55-8f48-48e2-bfc9-9a81f5b277c9\") " pod="openshift-marketplace/community-operators-htqb8" Mar 11 02:04:44 crc kubenswrapper[4744]: I0311 02:04:44.854408 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bd2b6b55-8f48-48e2-bfc9-9a81f5b277c9-utilities\") pod \"community-operators-htqb8\" (UID: \"bd2b6b55-8f48-48e2-bfc9-9a81f5b277c9\") " pod="openshift-marketplace/community-operators-htqb8" Mar 11 02:04:44 crc kubenswrapper[4744]: I0311 02:04:44.854480 4744 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bd2b6b55-8f48-48e2-bfc9-9a81f5b277c9-catalog-content\") pod \"community-operators-htqb8\" (UID: \"bd2b6b55-8f48-48e2-bfc9-9a81f5b277c9\") " pod="openshift-marketplace/community-operators-htqb8" Mar 11 02:04:44 crc kubenswrapper[4744]: I0311 02:04:44.854557 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7vm7d\" (UniqueName: \"kubernetes.io/projected/bd2b6b55-8f48-48e2-bfc9-9a81f5b277c9-kube-api-access-7vm7d\") pod \"community-operators-htqb8\" (UID: \"bd2b6b55-8f48-48e2-bfc9-9a81f5b277c9\") " pod="openshift-marketplace/community-operators-htqb8" Mar 11 02:04:44 crc kubenswrapper[4744]: I0311 02:04:44.854994 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bd2b6b55-8f48-48e2-bfc9-9a81f5b277c9-utilities\") pod \"community-operators-htqb8\" (UID: \"bd2b6b55-8f48-48e2-bfc9-9a81f5b277c9\") " pod="openshift-marketplace/community-operators-htqb8" Mar 11 02:04:44 crc kubenswrapper[4744]: I0311 02:04:44.855044 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bd2b6b55-8f48-48e2-bfc9-9a81f5b277c9-catalog-content\") pod \"community-operators-htqb8\" (UID: \"bd2b6b55-8f48-48e2-bfc9-9a81f5b277c9\") " pod="openshift-marketplace/community-operators-htqb8" Mar 11 02:04:44 crc kubenswrapper[4744]: I0311 02:04:44.891216 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7vm7d\" (UniqueName: \"kubernetes.io/projected/bd2b6b55-8f48-48e2-bfc9-9a81f5b277c9-kube-api-access-7vm7d\") pod \"community-operators-htqb8\" (UID: \"bd2b6b55-8f48-48e2-bfc9-9a81f5b277c9\") " pod="openshift-marketplace/community-operators-htqb8" Mar 11 02:04:45 crc kubenswrapper[4744]: I0311 02:04:45.038346 4744 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-htqb8" Mar 11 02:04:45 crc kubenswrapper[4744]: I0311 02:04:45.584544 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-htqb8"] Mar 11 02:04:45 crc kubenswrapper[4744]: W0311 02:04:45.588789 4744 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbd2b6b55_8f48_48e2_bfc9_9a81f5b277c9.slice/crio-17ebee2c466d85abc6968c45ac11c2459cc6dd228df0ed5f9ae6d976f9de1c7c WatchSource:0}: Error finding container 17ebee2c466d85abc6968c45ac11c2459cc6dd228df0ed5f9ae6d976f9de1c7c: Status 404 returned error can't find the container with id 17ebee2c466d85abc6968c45ac11c2459cc6dd228df0ed5f9ae6d976f9de1c7c Mar 11 02:04:45 crc kubenswrapper[4744]: I0311 02:04:45.723327 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-htqb8" event={"ID":"bd2b6b55-8f48-48e2-bfc9-9a81f5b277c9","Type":"ContainerStarted","Data":"17ebee2c466d85abc6968c45ac11c2459cc6dd228df0ed5f9ae6d976f9de1c7c"} Mar 11 02:04:46 crc kubenswrapper[4744]: I0311 02:04:46.735181 4744 generic.go:334] "Generic (PLEG): container finished" podID="bd2b6b55-8f48-48e2-bfc9-9a81f5b277c9" containerID="a197e6cb1f85e2985757976aa92660b34134b1c25b89ea245a4309fb3e5e41e9" exitCode=0 Mar 11 02:04:46 crc kubenswrapper[4744]: I0311 02:04:46.735274 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-htqb8" event={"ID":"bd2b6b55-8f48-48e2-bfc9-9a81f5b277c9","Type":"ContainerDied","Data":"a197e6cb1f85e2985757976aa92660b34134b1c25b89ea245a4309fb3e5e41e9"} Mar 11 02:04:48 crc kubenswrapper[4744]: I0311 02:04:48.771028 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-htqb8" 
event={"ID":"bd2b6b55-8f48-48e2-bfc9-9a81f5b277c9","Type":"ContainerStarted","Data":"e586838ae2edbe785f0c592f366bdbbd0ad3db1a0795449cbf87f600db07ff60"} Mar 11 02:04:49 crc kubenswrapper[4744]: I0311 02:04:49.783382 4744 generic.go:334] "Generic (PLEG): container finished" podID="bd2b6b55-8f48-48e2-bfc9-9a81f5b277c9" containerID="e586838ae2edbe785f0c592f366bdbbd0ad3db1a0795449cbf87f600db07ff60" exitCode=0 Mar 11 02:04:49 crc kubenswrapper[4744]: I0311 02:04:49.783443 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-htqb8" event={"ID":"bd2b6b55-8f48-48e2-bfc9-9a81f5b277c9","Type":"ContainerDied","Data":"e586838ae2edbe785f0c592f366bdbbd0ad3db1a0795449cbf87f600db07ff60"} Mar 11 02:04:49 crc kubenswrapper[4744]: I0311 02:04:49.798035 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-rf74s" Mar 11 02:04:49 crc kubenswrapper[4744]: I0311 02:04:49.798384 4744 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-rf74s" Mar 11 02:04:50 crc kubenswrapper[4744]: I0311 02:04:50.793026 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-htqb8" event={"ID":"bd2b6b55-8f48-48e2-bfc9-9a81f5b277c9","Type":"ContainerStarted","Data":"75d3a3c1000ed49f84fa304b5634a8ed29618da3e31f26e7569c57d4096c40ab"} Mar 11 02:04:50 crc kubenswrapper[4744]: I0311 02:04:50.831076 4744 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-htqb8" podStartSLOduration=3.34971959 podStartE2EDuration="6.83105176s" podCreationTimestamp="2026-03-11 02:04:44 +0000 UTC" firstStartedPulling="2026-03-11 02:04:46.737724465 +0000 UTC m=+4243.541942110" lastFinishedPulling="2026-03-11 02:04:50.219056635 +0000 UTC m=+4247.023274280" observedRunningTime="2026-03-11 02:04:50.809290517 +0000 UTC m=+4247.613508222" 
watchObservedRunningTime="2026-03-11 02:04:50.83105176 +0000 UTC m=+4247.635269365" Mar 11 02:04:51 crc kubenswrapper[4744]: I0311 02:04:51.437852 4744 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-rf74s" podUID="ac9ac009-2f5c-473d-9fd2-2bf19331ebbf" containerName="registry-server" probeResult="failure" output=< Mar 11 02:04:51 crc kubenswrapper[4744]: timeout: failed to connect service ":50051" within 1s Mar 11 02:04:51 crc kubenswrapper[4744]: > Mar 11 02:04:55 crc kubenswrapper[4744]: I0311 02:04:55.039393 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-htqb8" Mar 11 02:04:55 crc kubenswrapper[4744]: I0311 02:04:55.040163 4744 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-htqb8" Mar 11 02:04:55 crc kubenswrapper[4744]: I0311 02:04:55.114298 4744 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-htqb8" Mar 11 02:04:55 crc kubenswrapper[4744]: I0311 02:04:55.699054 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-htqb8" Mar 11 02:04:56 crc kubenswrapper[4744]: I0311 02:04:56.054022 4744 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-htqb8"] Mar 11 02:04:57 crc kubenswrapper[4744]: I0311 02:04:57.654886 4744 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-htqb8" podUID="bd2b6b55-8f48-48e2-bfc9-9a81f5b277c9" containerName="registry-server" containerID="cri-o://75d3a3c1000ed49f84fa304b5634a8ed29618da3e31f26e7569c57d4096c40ab" gracePeriod=2 Mar 11 02:04:58 crc kubenswrapper[4744]: I0311 02:04:58.666462 4744 generic.go:334] "Generic (PLEG): container finished" podID="bd2b6b55-8f48-48e2-bfc9-9a81f5b277c9" 
containerID="75d3a3c1000ed49f84fa304b5634a8ed29618da3e31f26e7569c57d4096c40ab" exitCode=0 Mar 11 02:04:58 crc kubenswrapper[4744]: I0311 02:04:58.666849 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-htqb8" event={"ID":"bd2b6b55-8f48-48e2-bfc9-9a81f5b277c9","Type":"ContainerDied","Data":"75d3a3c1000ed49f84fa304b5634a8ed29618da3e31f26e7569c57d4096c40ab"} Mar 11 02:04:58 crc kubenswrapper[4744]: I0311 02:04:58.666902 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-htqb8" event={"ID":"bd2b6b55-8f48-48e2-bfc9-9a81f5b277c9","Type":"ContainerDied","Data":"17ebee2c466d85abc6968c45ac11c2459cc6dd228df0ed5f9ae6d976f9de1c7c"} Mar 11 02:04:58 crc kubenswrapper[4744]: I0311 02:04:58.666920 4744 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="17ebee2c466d85abc6968c45ac11c2459cc6dd228df0ed5f9ae6d976f9de1c7c" Mar 11 02:04:58 crc kubenswrapper[4744]: I0311 02:04:58.687362 4744 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-htqb8" Mar 11 02:04:58 crc kubenswrapper[4744]: I0311 02:04:58.782397 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7vm7d\" (UniqueName: \"kubernetes.io/projected/bd2b6b55-8f48-48e2-bfc9-9a81f5b277c9-kube-api-access-7vm7d\") pod \"bd2b6b55-8f48-48e2-bfc9-9a81f5b277c9\" (UID: \"bd2b6b55-8f48-48e2-bfc9-9a81f5b277c9\") " Mar 11 02:04:58 crc kubenswrapper[4744]: I0311 02:04:58.782561 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bd2b6b55-8f48-48e2-bfc9-9a81f5b277c9-catalog-content\") pod \"bd2b6b55-8f48-48e2-bfc9-9a81f5b277c9\" (UID: \"bd2b6b55-8f48-48e2-bfc9-9a81f5b277c9\") " Mar 11 02:04:58 crc kubenswrapper[4744]: I0311 02:04:58.782617 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bd2b6b55-8f48-48e2-bfc9-9a81f5b277c9-utilities\") pod \"bd2b6b55-8f48-48e2-bfc9-9a81f5b277c9\" (UID: \"bd2b6b55-8f48-48e2-bfc9-9a81f5b277c9\") " Mar 11 02:04:58 crc kubenswrapper[4744]: I0311 02:04:58.784059 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bd2b6b55-8f48-48e2-bfc9-9a81f5b277c9-utilities" (OuterVolumeSpecName: "utilities") pod "bd2b6b55-8f48-48e2-bfc9-9a81f5b277c9" (UID: "bd2b6b55-8f48-48e2-bfc9-9a81f5b277c9"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 11 02:04:58 crc kubenswrapper[4744]: I0311 02:04:58.788908 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bd2b6b55-8f48-48e2-bfc9-9a81f5b277c9-kube-api-access-7vm7d" (OuterVolumeSpecName: "kube-api-access-7vm7d") pod "bd2b6b55-8f48-48e2-bfc9-9a81f5b277c9" (UID: "bd2b6b55-8f48-48e2-bfc9-9a81f5b277c9"). InnerVolumeSpecName "kube-api-access-7vm7d". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 02:04:58 crc kubenswrapper[4744]: I0311 02:04:58.853439 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bd2b6b55-8f48-48e2-bfc9-9a81f5b277c9-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "bd2b6b55-8f48-48e2-bfc9-9a81f5b277c9" (UID: "bd2b6b55-8f48-48e2-bfc9-9a81f5b277c9"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 11 02:04:58 crc kubenswrapper[4744]: I0311 02:04:58.884294 4744 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bd2b6b55-8f48-48e2-bfc9-9a81f5b277c9-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 11 02:04:58 crc kubenswrapper[4744]: I0311 02:04:58.884338 4744 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bd2b6b55-8f48-48e2-bfc9-9a81f5b277c9-utilities\") on node \"crc\" DevicePath \"\"" Mar 11 02:04:58 crc kubenswrapper[4744]: I0311 02:04:58.884358 4744 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7vm7d\" (UniqueName: \"kubernetes.io/projected/bd2b6b55-8f48-48e2-bfc9-9a81f5b277c9-kube-api-access-7vm7d\") on node \"crc\" DevicePath \"\"" Mar 11 02:04:59 crc kubenswrapper[4744]: I0311 02:04:59.676021 4744 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-htqb8" Mar 11 02:04:59 crc kubenswrapper[4744]: I0311 02:04:59.741003 4744 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-htqb8"] Mar 11 02:04:59 crc kubenswrapper[4744]: I0311 02:04:59.753745 4744 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-htqb8"] Mar 11 02:04:59 crc kubenswrapper[4744]: I0311 02:04:59.880435 4744 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-rf74s" Mar 11 02:04:59 crc kubenswrapper[4744]: I0311 02:04:59.955573 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-rf74s" Mar 11 02:04:59 crc kubenswrapper[4744]: I0311 02:04:59.994902 4744 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bd2b6b55-8f48-48e2-bfc9-9a81f5b277c9" path="/var/lib/kubelet/pods/bd2b6b55-8f48-48e2-bfc9-9a81f5b277c9/volumes" Mar 11 02:05:01 crc kubenswrapper[4744]: I0311 02:05:01.453246 4744 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-rf74s"] Mar 11 02:05:01 crc kubenswrapper[4744]: I0311 02:05:01.699859 4744 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-rf74s" podUID="ac9ac009-2f5c-473d-9fd2-2bf19331ebbf" containerName="registry-server" containerID="cri-o://7401d8c100774abdaf862b2cfa374e5ab2c45bf73cff922abf544576096cf46a" gracePeriod=2 Mar 11 02:05:02 crc kubenswrapper[4744]: I0311 02:05:02.180421 4744 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-rf74s" Mar 11 02:05:02 crc kubenswrapper[4744]: I0311 02:05:02.254202 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ac9ac009-2f5c-473d-9fd2-2bf19331ebbf-utilities\") pod \"ac9ac009-2f5c-473d-9fd2-2bf19331ebbf\" (UID: \"ac9ac009-2f5c-473d-9fd2-2bf19331ebbf\") " Mar 11 02:05:02 crc kubenswrapper[4744]: I0311 02:05:02.254329 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-74g7b\" (UniqueName: \"kubernetes.io/projected/ac9ac009-2f5c-473d-9fd2-2bf19331ebbf-kube-api-access-74g7b\") pod \"ac9ac009-2f5c-473d-9fd2-2bf19331ebbf\" (UID: \"ac9ac009-2f5c-473d-9fd2-2bf19331ebbf\") " Mar 11 02:05:02 crc kubenswrapper[4744]: I0311 02:05:02.254351 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ac9ac009-2f5c-473d-9fd2-2bf19331ebbf-catalog-content\") pod \"ac9ac009-2f5c-473d-9fd2-2bf19331ebbf\" (UID: \"ac9ac009-2f5c-473d-9fd2-2bf19331ebbf\") " Mar 11 02:05:02 crc kubenswrapper[4744]: I0311 02:05:02.255759 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ac9ac009-2f5c-473d-9fd2-2bf19331ebbf-utilities" (OuterVolumeSpecName: "utilities") pod "ac9ac009-2f5c-473d-9fd2-2bf19331ebbf" (UID: "ac9ac009-2f5c-473d-9fd2-2bf19331ebbf"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 11 02:05:02 crc kubenswrapper[4744]: I0311 02:05:02.263757 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ac9ac009-2f5c-473d-9fd2-2bf19331ebbf-kube-api-access-74g7b" (OuterVolumeSpecName: "kube-api-access-74g7b") pod "ac9ac009-2f5c-473d-9fd2-2bf19331ebbf" (UID: "ac9ac009-2f5c-473d-9fd2-2bf19331ebbf"). InnerVolumeSpecName "kube-api-access-74g7b". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 02:05:02 crc kubenswrapper[4744]: I0311 02:05:02.355991 4744 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-74g7b\" (UniqueName: \"kubernetes.io/projected/ac9ac009-2f5c-473d-9fd2-2bf19331ebbf-kube-api-access-74g7b\") on node \"crc\" DevicePath \"\"" Mar 11 02:05:02 crc kubenswrapper[4744]: I0311 02:05:02.356020 4744 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ac9ac009-2f5c-473d-9fd2-2bf19331ebbf-utilities\") on node \"crc\" DevicePath \"\"" Mar 11 02:05:02 crc kubenswrapper[4744]: I0311 02:05:02.431529 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ac9ac009-2f5c-473d-9fd2-2bf19331ebbf-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "ac9ac009-2f5c-473d-9fd2-2bf19331ebbf" (UID: "ac9ac009-2f5c-473d-9fd2-2bf19331ebbf"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 11 02:05:02 crc kubenswrapper[4744]: I0311 02:05:02.457295 4744 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ac9ac009-2f5c-473d-9fd2-2bf19331ebbf-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 11 02:05:02 crc kubenswrapper[4744]: I0311 02:05:02.709966 4744 generic.go:334] "Generic (PLEG): container finished" podID="ac9ac009-2f5c-473d-9fd2-2bf19331ebbf" containerID="7401d8c100774abdaf862b2cfa374e5ab2c45bf73cff922abf544576096cf46a" exitCode=0 Mar 11 02:05:02 crc kubenswrapper[4744]: I0311 02:05:02.710029 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-rf74s" event={"ID":"ac9ac009-2f5c-473d-9fd2-2bf19331ebbf","Type":"ContainerDied","Data":"7401d8c100774abdaf862b2cfa374e5ab2c45bf73cff922abf544576096cf46a"} Mar 11 02:05:02 crc kubenswrapper[4744]: I0311 02:05:02.710069 4744 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openshift-marketplace/redhat-operators-rf74s" event={"ID":"ac9ac009-2f5c-473d-9fd2-2bf19331ebbf","Type":"ContainerDied","Data":"708011c76473243d57736ff14eea2ea0cbcac2f6197a56697bc4d7aaea873b06"} Mar 11 02:05:02 crc kubenswrapper[4744]: I0311 02:05:02.710105 4744 scope.go:117] "RemoveContainer" containerID="7401d8c100774abdaf862b2cfa374e5ab2c45bf73cff922abf544576096cf46a" Mar 11 02:05:02 crc kubenswrapper[4744]: I0311 02:05:02.710144 4744 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-rf74s" Mar 11 02:05:02 crc kubenswrapper[4744]: I0311 02:05:02.732143 4744 scope.go:117] "RemoveContainer" containerID="2e5d54ca5b6e015510c272654dd4580fa88382938a705d6d64a64b5263eb9d17" Mar 11 02:05:02 crc kubenswrapper[4744]: I0311 02:05:02.762262 4744 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-rf74s"] Mar 11 02:05:02 crc kubenswrapper[4744]: I0311 02:05:02.769192 4744 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-rf74s"] Mar 11 02:05:02 crc kubenswrapper[4744]: I0311 02:05:02.771136 4744 scope.go:117] "RemoveContainer" containerID="15231fcd887c8d0eb5d471b4f568631c52e9153ae409e0eee2bb5b93f8cb4753" Mar 11 02:05:02 crc kubenswrapper[4744]: I0311 02:05:02.807957 4744 scope.go:117] "RemoveContainer" containerID="7401d8c100774abdaf862b2cfa374e5ab2c45bf73cff922abf544576096cf46a" Mar 11 02:05:02 crc kubenswrapper[4744]: E0311 02:05:02.808599 4744 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7401d8c100774abdaf862b2cfa374e5ab2c45bf73cff922abf544576096cf46a\": container with ID starting with 7401d8c100774abdaf862b2cfa374e5ab2c45bf73cff922abf544576096cf46a not found: ID does not exist" containerID="7401d8c100774abdaf862b2cfa374e5ab2c45bf73cff922abf544576096cf46a" Mar 11 02:05:02 crc kubenswrapper[4744]: I0311 02:05:02.808647 4744 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7401d8c100774abdaf862b2cfa374e5ab2c45bf73cff922abf544576096cf46a"} err="failed to get container status \"7401d8c100774abdaf862b2cfa374e5ab2c45bf73cff922abf544576096cf46a\": rpc error: code = NotFound desc = could not find container \"7401d8c100774abdaf862b2cfa374e5ab2c45bf73cff922abf544576096cf46a\": container with ID starting with 7401d8c100774abdaf862b2cfa374e5ab2c45bf73cff922abf544576096cf46a not found: ID does not exist" Mar 11 02:05:02 crc kubenswrapper[4744]: I0311 02:05:02.808671 4744 scope.go:117] "RemoveContainer" containerID="2e5d54ca5b6e015510c272654dd4580fa88382938a705d6d64a64b5263eb9d17" Mar 11 02:05:02 crc kubenswrapper[4744]: E0311 02:05:02.809191 4744 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2e5d54ca5b6e015510c272654dd4580fa88382938a705d6d64a64b5263eb9d17\": container with ID starting with 2e5d54ca5b6e015510c272654dd4580fa88382938a705d6d64a64b5263eb9d17 not found: ID does not exist" containerID="2e5d54ca5b6e015510c272654dd4580fa88382938a705d6d64a64b5263eb9d17" Mar 11 02:05:02 crc kubenswrapper[4744]: I0311 02:05:02.809236 4744 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2e5d54ca5b6e015510c272654dd4580fa88382938a705d6d64a64b5263eb9d17"} err="failed to get container status \"2e5d54ca5b6e015510c272654dd4580fa88382938a705d6d64a64b5263eb9d17\": rpc error: code = NotFound desc = could not find container \"2e5d54ca5b6e015510c272654dd4580fa88382938a705d6d64a64b5263eb9d17\": container with ID starting with 2e5d54ca5b6e015510c272654dd4580fa88382938a705d6d64a64b5263eb9d17 not found: ID does not exist" Mar 11 02:05:02 crc kubenswrapper[4744]: I0311 02:05:02.809276 4744 scope.go:117] "RemoveContainer" containerID="15231fcd887c8d0eb5d471b4f568631c52e9153ae409e0eee2bb5b93f8cb4753" Mar 11 02:05:02 crc kubenswrapper[4744]: E0311 
02:05:02.809832 4744 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"15231fcd887c8d0eb5d471b4f568631c52e9153ae409e0eee2bb5b93f8cb4753\": container with ID starting with 15231fcd887c8d0eb5d471b4f568631c52e9153ae409e0eee2bb5b93f8cb4753 not found: ID does not exist" containerID="15231fcd887c8d0eb5d471b4f568631c52e9153ae409e0eee2bb5b93f8cb4753" Mar 11 02:05:02 crc kubenswrapper[4744]: I0311 02:05:02.809986 4744 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"15231fcd887c8d0eb5d471b4f568631c52e9153ae409e0eee2bb5b93f8cb4753"} err="failed to get container status \"15231fcd887c8d0eb5d471b4f568631c52e9153ae409e0eee2bb5b93f8cb4753\": rpc error: code = NotFound desc = could not find container \"15231fcd887c8d0eb5d471b4f568631c52e9153ae409e0eee2bb5b93f8cb4753\": container with ID starting with 15231fcd887c8d0eb5d471b4f568631c52e9153ae409e0eee2bb5b93f8cb4753 not found: ID does not exist" Mar 11 02:05:03 crc kubenswrapper[4744]: I0311 02:05:03.990767 4744 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ac9ac009-2f5c-473d-9fd2-2bf19331ebbf" path="/var/lib/kubelet/pods/ac9ac009-2f5c-473d-9fd2-2bf19331ebbf/volumes" Mar 11 02:05:12 crc kubenswrapper[4744]: I0311 02:05:12.409372 4744 patch_prober.go:28] interesting pod/machine-config-daemon-678nx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 11 02:05:12 crc kubenswrapper[4744]: I0311 02:05:12.411644 4744 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-678nx" podUID="a15dc7ac-7c34-4135-b6eb-a85122800ce9" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection 
refused" Mar 11 02:05:42 crc kubenswrapper[4744]: I0311 02:05:42.409481 4744 patch_prober.go:28] interesting pod/machine-config-daemon-678nx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 11 02:05:42 crc kubenswrapper[4744]: I0311 02:05:42.410165 4744 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-678nx" podUID="a15dc7ac-7c34-4135-b6eb-a85122800ce9" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 11 02:06:00 crc kubenswrapper[4744]: I0311 02:06:00.167985 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29553246-9rjnf"] Mar 11 02:06:00 crc kubenswrapper[4744]: E0311 02:06:00.169126 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bd2b6b55-8f48-48e2-bfc9-9a81f5b277c9" containerName="extract-content" Mar 11 02:06:00 crc kubenswrapper[4744]: I0311 02:06:00.169150 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="bd2b6b55-8f48-48e2-bfc9-9a81f5b277c9" containerName="extract-content" Mar 11 02:06:00 crc kubenswrapper[4744]: E0311 02:06:00.169184 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ac9ac009-2f5c-473d-9fd2-2bf19331ebbf" containerName="extract-utilities" Mar 11 02:06:00 crc kubenswrapper[4744]: I0311 02:06:00.169199 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="ac9ac009-2f5c-473d-9fd2-2bf19331ebbf" containerName="extract-utilities" Mar 11 02:06:00 crc kubenswrapper[4744]: E0311 02:06:00.169226 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bd2b6b55-8f48-48e2-bfc9-9a81f5b277c9" containerName="registry-server" Mar 11 02:06:00 crc kubenswrapper[4744]: I0311 02:06:00.169240 4744 
state_mem.go:107] "Deleted CPUSet assignment" podUID="bd2b6b55-8f48-48e2-bfc9-9a81f5b277c9" containerName="registry-server" Mar 11 02:06:00 crc kubenswrapper[4744]: E0311 02:06:00.169263 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bd2b6b55-8f48-48e2-bfc9-9a81f5b277c9" containerName="extract-utilities" Mar 11 02:06:00 crc kubenswrapper[4744]: I0311 02:06:00.169275 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="bd2b6b55-8f48-48e2-bfc9-9a81f5b277c9" containerName="extract-utilities" Mar 11 02:06:00 crc kubenswrapper[4744]: E0311 02:06:00.169307 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ac9ac009-2f5c-473d-9fd2-2bf19331ebbf" containerName="extract-content" Mar 11 02:06:00 crc kubenswrapper[4744]: I0311 02:06:00.169320 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="ac9ac009-2f5c-473d-9fd2-2bf19331ebbf" containerName="extract-content" Mar 11 02:06:00 crc kubenswrapper[4744]: E0311 02:06:00.169351 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ac9ac009-2f5c-473d-9fd2-2bf19331ebbf" containerName="registry-server" Mar 11 02:06:00 crc kubenswrapper[4744]: I0311 02:06:00.169364 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="ac9ac009-2f5c-473d-9fd2-2bf19331ebbf" containerName="registry-server" Mar 11 02:06:00 crc kubenswrapper[4744]: I0311 02:06:00.169799 4744 memory_manager.go:354] "RemoveStaleState removing state" podUID="bd2b6b55-8f48-48e2-bfc9-9a81f5b277c9" containerName="registry-server" Mar 11 02:06:00 crc kubenswrapper[4744]: I0311 02:06:00.169838 4744 memory_manager.go:354] "RemoveStaleState removing state" podUID="ac9ac009-2f5c-473d-9fd2-2bf19331ebbf" containerName="registry-server" Mar 11 02:06:00 crc kubenswrapper[4744]: I0311 02:06:00.170917 4744 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29553246-9rjnf" Mar 11 02:06:00 crc kubenswrapper[4744]: I0311 02:06:00.174964 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-rs77s" Mar 11 02:06:00 crc kubenswrapper[4744]: I0311 02:06:00.175634 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 11 02:06:00 crc kubenswrapper[4744]: I0311 02:06:00.177270 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 11 02:06:00 crc kubenswrapper[4744]: I0311 02:06:00.180575 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29553246-9rjnf"] Mar 11 02:06:00 crc kubenswrapper[4744]: I0311 02:06:00.361438 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s7hr2\" (UniqueName: \"kubernetes.io/projected/f89fcb79-ebd1-445b-a6f6-4ea3ca6335c9-kube-api-access-s7hr2\") pod \"auto-csr-approver-29553246-9rjnf\" (UID: \"f89fcb79-ebd1-445b-a6f6-4ea3ca6335c9\") " pod="openshift-infra/auto-csr-approver-29553246-9rjnf" Mar 11 02:06:00 crc kubenswrapper[4744]: I0311 02:06:00.463116 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s7hr2\" (UniqueName: \"kubernetes.io/projected/f89fcb79-ebd1-445b-a6f6-4ea3ca6335c9-kube-api-access-s7hr2\") pod \"auto-csr-approver-29553246-9rjnf\" (UID: \"f89fcb79-ebd1-445b-a6f6-4ea3ca6335c9\") " pod="openshift-infra/auto-csr-approver-29553246-9rjnf" Mar 11 02:06:00 crc kubenswrapper[4744]: I0311 02:06:00.498704 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s7hr2\" (UniqueName: \"kubernetes.io/projected/f89fcb79-ebd1-445b-a6f6-4ea3ca6335c9-kube-api-access-s7hr2\") pod \"auto-csr-approver-29553246-9rjnf\" (UID: \"f89fcb79-ebd1-445b-a6f6-4ea3ca6335c9\") " 
pod="openshift-infra/auto-csr-approver-29553246-9rjnf" Mar 11 02:06:00 crc kubenswrapper[4744]: I0311 02:06:00.509010 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29553246-9rjnf" Mar 11 02:06:00 crc kubenswrapper[4744]: I0311 02:06:00.814136 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29553246-9rjnf"] Mar 11 02:06:00 crc kubenswrapper[4744]: I0311 02:06:00.874855 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29553246-9rjnf" event={"ID":"f89fcb79-ebd1-445b-a6f6-4ea3ca6335c9","Type":"ContainerStarted","Data":"10421e8c40742ab0a01e63d29dafd032134d31d122647c36b3e731fa8b78348a"} Mar 11 02:06:02 crc kubenswrapper[4744]: I0311 02:06:02.898338 4744 generic.go:334] "Generic (PLEG): container finished" podID="f89fcb79-ebd1-445b-a6f6-4ea3ca6335c9" containerID="35f6dc0378397408500a41110d3b6ba8ee433f88cdbaeb2bf2a657dab34c270e" exitCode=0 Mar 11 02:06:02 crc kubenswrapper[4744]: I0311 02:06:02.898412 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29553246-9rjnf" event={"ID":"f89fcb79-ebd1-445b-a6f6-4ea3ca6335c9","Type":"ContainerDied","Data":"35f6dc0378397408500a41110d3b6ba8ee433f88cdbaeb2bf2a657dab34c270e"} Mar 11 02:06:04 crc kubenswrapper[4744]: I0311 02:06:04.269794 4744 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29553246-9rjnf" Mar 11 02:06:04 crc kubenswrapper[4744]: I0311 02:06:04.436155 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s7hr2\" (UniqueName: \"kubernetes.io/projected/f89fcb79-ebd1-445b-a6f6-4ea3ca6335c9-kube-api-access-s7hr2\") pod \"f89fcb79-ebd1-445b-a6f6-4ea3ca6335c9\" (UID: \"f89fcb79-ebd1-445b-a6f6-4ea3ca6335c9\") " Mar 11 02:06:04 crc kubenswrapper[4744]: I0311 02:06:04.453042 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f89fcb79-ebd1-445b-a6f6-4ea3ca6335c9-kube-api-access-s7hr2" (OuterVolumeSpecName: "kube-api-access-s7hr2") pod "f89fcb79-ebd1-445b-a6f6-4ea3ca6335c9" (UID: "f89fcb79-ebd1-445b-a6f6-4ea3ca6335c9"). InnerVolumeSpecName "kube-api-access-s7hr2". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 02:06:04 crc kubenswrapper[4744]: I0311 02:06:04.537633 4744 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s7hr2\" (UniqueName: \"kubernetes.io/projected/f89fcb79-ebd1-445b-a6f6-4ea3ca6335c9-kube-api-access-s7hr2\") on node \"crc\" DevicePath \"\"" Mar 11 02:06:04 crc kubenswrapper[4744]: I0311 02:06:04.920649 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29553246-9rjnf" event={"ID":"f89fcb79-ebd1-445b-a6f6-4ea3ca6335c9","Type":"ContainerDied","Data":"10421e8c40742ab0a01e63d29dafd032134d31d122647c36b3e731fa8b78348a"} Mar 11 02:06:04 crc kubenswrapper[4744]: I0311 02:06:04.920904 4744 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="10421e8c40742ab0a01e63d29dafd032134d31d122647c36b3e731fa8b78348a" Mar 11 02:06:04 crc kubenswrapper[4744]: I0311 02:06:04.920731 4744 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29553246-9rjnf" Mar 11 02:06:05 crc kubenswrapper[4744]: I0311 02:06:05.363639 4744 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29553240-qfzrv"] Mar 11 02:06:05 crc kubenswrapper[4744]: I0311 02:06:05.373656 4744 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29553240-qfzrv"] Mar 11 02:06:05 crc kubenswrapper[4744]: I0311 02:06:05.987896 4744 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d5295248-4dc5-4d74-b195-aef107864ca5" path="/var/lib/kubelet/pods/d5295248-4dc5-4d74-b195-aef107864ca5/volumes" Mar 11 02:06:12 crc kubenswrapper[4744]: I0311 02:06:12.408897 4744 patch_prober.go:28] interesting pod/machine-config-daemon-678nx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 11 02:06:12 crc kubenswrapper[4744]: I0311 02:06:12.409394 4744 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-678nx" podUID="a15dc7ac-7c34-4135-b6eb-a85122800ce9" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 11 02:06:12 crc kubenswrapper[4744]: I0311 02:06:12.409447 4744 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-678nx" Mar 11 02:06:12 crc kubenswrapper[4744]: I0311 02:06:12.410144 4744 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"b19c8ef67037a89700ac31d101d1a95194ef5d85b37d3ed88aad632d7215a921"} pod="openshift-machine-config-operator/machine-config-daemon-678nx" 
containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 11 02:06:12 crc kubenswrapper[4744]: I0311 02:06:12.410249 4744 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-678nx" podUID="a15dc7ac-7c34-4135-b6eb-a85122800ce9" containerName="machine-config-daemon" containerID="cri-o://b19c8ef67037a89700ac31d101d1a95194ef5d85b37d3ed88aad632d7215a921" gracePeriod=600 Mar 11 02:06:12 crc kubenswrapper[4744]: E0311 02:06:12.554872 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-678nx_openshift-machine-config-operator(a15dc7ac-7c34-4135-b6eb-a85122800ce9)\"" pod="openshift-machine-config-operator/machine-config-daemon-678nx" podUID="a15dc7ac-7c34-4135-b6eb-a85122800ce9" Mar 11 02:06:12 crc kubenswrapper[4744]: I0311 02:06:12.990774 4744 generic.go:334] "Generic (PLEG): container finished" podID="a15dc7ac-7c34-4135-b6eb-a85122800ce9" containerID="b19c8ef67037a89700ac31d101d1a95194ef5d85b37d3ed88aad632d7215a921" exitCode=0 Mar 11 02:06:12 crc kubenswrapper[4744]: I0311 02:06:12.990884 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-678nx" event={"ID":"a15dc7ac-7c34-4135-b6eb-a85122800ce9","Type":"ContainerDied","Data":"b19c8ef67037a89700ac31d101d1a95194ef5d85b37d3ed88aad632d7215a921"} Mar 11 02:06:12 crc kubenswrapper[4744]: I0311 02:06:12.991477 4744 scope.go:117] "RemoveContainer" containerID="31392fc530cf8815a26953cb2dc91784522d6a632ca8ff32cfa8a61da78aa9d3" Mar 11 02:06:12 crc kubenswrapper[4744]: I0311 02:06:12.992168 4744 scope.go:117] "RemoveContainer" containerID="b19c8ef67037a89700ac31d101d1a95194ef5d85b37d3ed88aad632d7215a921" Mar 11 02:06:12 crc kubenswrapper[4744]: E0311 02:06:12.992576 4744 
pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-678nx_openshift-machine-config-operator(a15dc7ac-7c34-4135-b6eb-a85122800ce9)\"" pod="openshift-machine-config-operator/machine-config-daemon-678nx" podUID="a15dc7ac-7c34-4135-b6eb-a85122800ce9" Mar 11 02:06:14 crc kubenswrapper[4744]: I0311 02:06:14.157139 4744 scope.go:117] "RemoveContainer" containerID="ff52bdb2a1a3df95d4a31c7119c69c9c1029cb28353134523ad21090eb1558a4" Mar 11 02:06:23 crc kubenswrapper[4744]: I0311 02:06:23.981992 4744 scope.go:117] "RemoveContainer" containerID="b19c8ef67037a89700ac31d101d1a95194ef5d85b37d3ed88aad632d7215a921" Mar 11 02:06:23 crc kubenswrapper[4744]: E0311 02:06:23.983096 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-678nx_openshift-machine-config-operator(a15dc7ac-7c34-4135-b6eb-a85122800ce9)\"" pod="openshift-machine-config-operator/machine-config-daemon-678nx" podUID="a15dc7ac-7c34-4135-b6eb-a85122800ce9" Mar 11 02:06:34 crc kubenswrapper[4744]: I0311 02:06:34.975241 4744 scope.go:117] "RemoveContainer" containerID="b19c8ef67037a89700ac31d101d1a95194ef5d85b37d3ed88aad632d7215a921" Mar 11 02:06:34 crc kubenswrapper[4744]: E0311 02:06:34.975944 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-678nx_openshift-machine-config-operator(a15dc7ac-7c34-4135-b6eb-a85122800ce9)\"" pod="openshift-machine-config-operator/machine-config-daemon-678nx" podUID="a15dc7ac-7c34-4135-b6eb-a85122800ce9" Mar 11 02:06:46 crc kubenswrapper[4744]: I0311 
02:06:46.974759 4744 scope.go:117] "RemoveContainer" containerID="b19c8ef67037a89700ac31d101d1a95194ef5d85b37d3ed88aad632d7215a921" Mar 11 02:06:46 crc kubenswrapper[4744]: E0311 02:06:46.975427 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-678nx_openshift-machine-config-operator(a15dc7ac-7c34-4135-b6eb-a85122800ce9)\"" pod="openshift-machine-config-operator/machine-config-daemon-678nx" podUID="a15dc7ac-7c34-4135-b6eb-a85122800ce9" Mar 11 02:07:00 crc kubenswrapper[4744]: I0311 02:07:00.975591 4744 scope.go:117] "RemoveContainer" containerID="b19c8ef67037a89700ac31d101d1a95194ef5d85b37d3ed88aad632d7215a921" Mar 11 02:07:00 crc kubenswrapper[4744]: E0311 02:07:00.976689 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-678nx_openshift-machine-config-operator(a15dc7ac-7c34-4135-b6eb-a85122800ce9)\"" pod="openshift-machine-config-operator/machine-config-daemon-678nx" podUID="a15dc7ac-7c34-4135-b6eb-a85122800ce9" Mar 11 02:07:13 crc kubenswrapper[4744]: I0311 02:07:13.981843 4744 scope.go:117] "RemoveContainer" containerID="b19c8ef67037a89700ac31d101d1a95194ef5d85b37d3ed88aad632d7215a921" Mar 11 02:07:13 crc kubenswrapper[4744]: E0311 02:07:13.982915 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-678nx_openshift-machine-config-operator(a15dc7ac-7c34-4135-b6eb-a85122800ce9)\"" pod="openshift-machine-config-operator/machine-config-daemon-678nx" podUID="a15dc7ac-7c34-4135-b6eb-a85122800ce9" Mar 11 02:07:25 crc 
kubenswrapper[4744]: I0311 02:07:25.975677 4744 scope.go:117] "RemoveContainer" containerID="b19c8ef67037a89700ac31d101d1a95194ef5d85b37d3ed88aad632d7215a921"
Mar 11 02:07:25 crc kubenswrapper[4744]: E0311 02:07:25.976745 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-678nx_openshift-machine-config-operator(a15dc7ac-7c34-4135-b6eb-a85122800ce9)\"" pod="openshift-machine-config-operator/machine-config-daemon-678nx" podUID="a15dc7ac-7c34-4135-b6eb-a85122800ce9"
Mar 11 02:07:40 crc kubenswrapper[4744]: I0311 02:07:40.975479 4744 scope.go:117] "RemoveContainer" containerID="b19c8ef67037a89700ac31d101d1a95194ef5d85b37d3ed88aad632d7215a921"
Mar 11 02:07:40 crc kubenswrapper[4744]: E0311 02:07:40.976798 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-678nx_openshift-machine-config-operator(a15dc7ac-7c34-4135-b6eb-a85122800ce9)\"" pod="openshift-machine-config-operator/machine-config-daemon-678nx" podUID="a15dc7ac-7c34-4135-b6eb-a85122800ce9"
Mar 11 02:07:54 crc kubenswrapper[4744]: I0311 02:07:54.975000 4744 scope.go:117] "RemoveContainer" containerID="b19c8ef67037a89700ac31d101d1a95194ef5d85b37d3ed88aad632d7215a921"
Mar 11 02:07:54 crc kubenswrapper[4744]: E0311 02:07:54.976003 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-678nx_openshift-machine-config-operator(a15dc7ac-7c34-4135-b6eb-a85122800ce9)\"" pod="openshift-machine-config-operator/machine-config-daemon-678nx" podUID="a15dc7ac-7c34-4135-b6eb-a85122800ce9"
Mar 11 02:08:00 crc kubenswrapper[4744]: I0311 02:08:00.168790 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29553248-94tnc"]
Mar 11 02:08:00 crc kubenswrapper[4744]: E0311 02:08:00.169790 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f89fcb79-ebd1-445b-a6f6-4ea3ca6335c9" containerName="oc"
Mar 11 02:08:00 crc kubenswrapper[4744]: I0311 02:08:00.169818 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="f89fcb79-ebd1-445b-a6f6-4ea3ca6335c9" containerName="oc"
Mar 11 02:08:00 crc kubenswrapper[4744]: I0311 02:08:00.170079 4744 memory_manager.go:354] "RemoveStaleState removing state" podUID="f89fcb79-ebd1-445b-a6f6-4ea3ca6335c9" containerName="oc"
Mar 11 02:08:00 crc kubenswrapper[4744]: I0311 02:08:00.170826 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29553248-94tnc"
Mar 11 02:08:00 crc kubenswrapper[4744]: I0311 02:08:00.176363 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Mar 11 02:08:00 crc kubenswrapper[4744]: I0311 02:08:00.176591 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Mar 11 02:08:00 crc kubenswrapper[4744]: I0311 02:08:00.176383 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-rs77s"
Mar 11 02:08:00 crc kubenswrapper[4744]: I0311 02:08:00.188059 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29553248-94tnc"]
Mar 11 02:08:00 crc kubenswrapper[4744]: I0311 02:08:00.220481 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rqjl7\" (UniqueName: \"kubernetes.io/projected/ae2bed9d-6b5c-4643-98d5-2eeec111948c-kube-api-access-rqjl7\") pod \"auto-csr-approver-29553248-94tnc\" (UID: \"ae2bed9d-6b5c-4643-98d5-2eeec111948c\") " pod="openshift-infra/auto-csr-approver-29553248-94tnc"
Mar 11 02:08:00 crc kubenswrapper[4744]: I0311 02:08:00.322341 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rqjl7\" (UniqueName: \"kubernetes.io/projected/ae2bed9d-6b5c-4643-98d5-2eeec111948c-kube-api-access-rqjl7\") pod \"auto-csr-approver-29553248-94tnc\" (UID: \"ae2bed9d-6b5c-4643-98d5-2eeec111948c\") " pod="openshift-infra/auto-csr-approver-29553248-94tnc"
Mar 11 02:08:00 crc kubenswrapper[4744]: I0311 02:08:00.355206 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rqjl7\" (UniqueName: \"kubernetes.io/projected/ae2bed9d-6b5c-4643-98d5-2eeec111948c-kube-api-access-rqjl7\") pod \"auto-csr-approver-29553248-94tnc\" (UID: \"ae2bed9d-6b5c-4643-98d5-2eeec111948c\") " pod="openshift-infra/auto-csr-approver-29553248-94tnc"
Mar 11 02:08:00 crc kubenswrapper[4744]: I0311 02:08:00.502410 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29553248-94tnc"
Mar 11 02:08:00 crc kubenswrapper[4744]: I0311 02:08:00.990187 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29553248-94tnc"]
Mar 11 02:08:00 crc kubenswrapper[4744]: W0311 02:08:00.997238 4744 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podae2bed9d_6b5c_4643_98d5_2eeec111948c.slice/crio-c55126d6ff7d48c738bd2bf5becb4f6256f76f8cc81ac534a8585694639faed1 WatchSource:0}: Error finding container c55126d6ff7d48c738bd2bf5becb4f6256f76f8cc81ac534a8585694639faed1: Status 404 returned error can't find the container with id c55126d6ff7d48c738bd2bf5becb4f6256f76f8cc81ac534a8585694639faed1
Mar 11 02:08:01 crc kubenswrapper[4744]: I0311 02:08:01.000542 4744 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Mar 11 02:08:01 crc kubenswrapper[4744]: I0311 02:08:01.999687 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29553248-94tnc" event={"ID":"ae2bed9d-6b5c-4643-98d5-2eeec111948c","Type":"ContainerStarted","Data":"c55126d6ff7d48c738bd2bf5becb4f6256f76f8cc81ac534a8585694639faed1"}
Mar 11 02:08:03 crc kubenswrapper[4744]: I0311 02:08:03.013837 4744 generic.go:334] "Generic (PLEG): container finished" podID="ae2bed9d-6b5c-4643-98d5-2eeec111948c" containerID="b319ca82c048c5ae85a476bc22562632793f5846a88ce68994a41c978dc8c991" exitCode=0
Mar 11 02:08:03 crc kubenswrapper[4744]: I0311 02:08:03.014237 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29553248-94tnc" event={"ID":"ae2bed9d-6b5c-4643-98d5-2eeec111948c","Type":"ContainerDied","Data":"b319ca82c048c5ae85a476bc22562632793f5846a88ce68994a41c978dc8c991"}
Mar 11 02:08:04 crc kubenswrapper[4744]: I0311 02:08:04.318562 4744 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29553248-94tnc"
Mar 11 02:08:04 crc kubenswrapper[4744]: I0311 02:08:04.489890 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rqjl7\" (UniqueName: \"kubernetes.io/projected/ae2bed9d-6b5c-4643-98d5-2eeec111948c-kube-api-access-rqjl7\") pod \"ae2bed9d-6b5c-4643-98d5-2eeec111948c\" (UID: \"ae2bed9d-6b5c-4643-98d5-2eeec111948c\") "
Mar 11 02:08:04 crc kubenswrapper[4744]: I0311 02:08:04.499786 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ae2bed9d-6b5c-4643-98d5-2eeec111948c-kube-api-access-rqjl7" (OuterVolumeSpecName: "kube-api-access-rqjl7") pod "ae2bed9d-6b5c-4643-98d5-2eeec111948c" (UID: "ae2bed9d-6b5c-4643-98d5-2eeec111948c"). InnerVolumeSpecName "kube-api-access-rqjl7". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 11 02:08:04 crc kubenswrapper[4744]: I0311 02:08:04.592246 4744 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rqjl7\" (UniqueName: \"kubernetes.io/projected/ae2bed9d-6b5c-4643-98d5-2eeec111948c-kube-api-access-rqjl7\") on node \"crc\" DevicePath \"\""
Mar 11 02:08:05 crc kubenswrapper[4744]: I0311 02:08:05.032018 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29553248-94tnc" event={"ID":"ae2bed9d-6b5c-4643-98d5-2eeec111948c","Type":"ContainerDied","Data":"c55126d6ff7d48c738bd2bf5becb4f6256f76f8cc81ac534a8585694639faed1"}
Mar 11 02:08:05 crc kubenswrapper[4744]: I0311 02:08:05.032053 4744 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c55126d6ff7d48c738bd2bf5becb4f6256f76f8cc81ac534a8585694639faed1"
Mar 11 02:08:05 crc kubenswrapper[4744]: I0311 02:08:05.032078 4744 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29553248-94tnc"
Mar 11 02:08:05 crc kubenswrapper[4744]: I0311 02:08:05.397600 4744 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29553242-c9ggj"]
Mar 11 02:08:05 crc kubenswrapper[4744]: I0311 02:08:05.402493 4744 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29553242-c9ggj"]
Mar 11 02:08:05 crc kubenswrapper[4744]: I0311 02:08:05.974703 4744 scope.go:117] "RemoveContainer" containerID="b19c8ef67037a89700ac31d101d1a95194ef5d85b37d3ed88aad632d7215a921"
Mar 11 02:08:05 crc kubenswrapper[4744]: E0311 02:08:05.975341 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-678nx_openshift-machine-config-operator(a15dc7ac-7c34-4135-b6eb-a85122800ce9)\"" pod="openshift-machine-config-operator/machine-config-daemon-678nx" podUID="a15dc7ac-7c34-4135-b6eb-a85122800ce9"
Mar 11 02:08:05 crc kubenswrapper[4744]: I0311 02:08:05.989261 4744 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e6abb1e5-6401-4c2e-862f-7762ed28775e" path="/var/lib/kubelet/pods/e6abb1e5-6401-4c2e-862f-7762ed28775e/volumes"
Mar 11 02:08:14 crc kubenswrapper[4744]: I0311 02:08:14.253640 4744 scope.go:117] "RemoveContainer" containerID="69ad3b48812eee0d5e9f23e72288bd4ed9437e16e887d68dc7fb0e1ccc69603a"
Mar 11 02:08:16 crc kubenswrapper[4744]: I0311 02:08:16.975268 4744 scope.go:117] "RemoveContainer" containerID="b19c8ef67037a89700ac31d101d1a95194ef5d85b37d3ed88aad632d7215a921"
Mar 11 02:08:16 crc kubenswrapper[4744]: E0311 02:08:16.975786 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-678nx_openshift-machine-config-operator(a15dc7ac-7c34-4135-b6eb-a85122800ce9)\"" pod="openshift-machine-config-operator/machine-config-daemon-678nx" podUID="a15dc7ac-7c34-4135-b6eb-a85122800ce9"
Mar 11 02:08:27 crc kubenswrapper[4744]: I0311 02:08:27.974798 4744 scope.go:117] "RemoveContainer" containerID="b19c8ef67037a89700ac31d101d1a95194ef5d85b37d3ed88aad632d7215a921"
Mar 11 02:08:27 crc kubenswrapper[4744]: E0311 02:08:27.975657 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-678nx_openshift-machine-config-operator(a15dc7ac-7c34-4135-b6eb-a85122800ce9)\"" pod="openshift-machine-config-operator/machine-config-daemon-678nx" podUID="a15dc7ac-7c34-4135-b6eb-a85122800ce9"
Mar 11 02:08:39 crc kubenswrapper[4744]: I0311 02:08:39.976115 4744 scope.go:117] "RemoveContainer" containerID="b19c8ef67037a89700ac31d101d1a95194ef5d85b37d3ed88aad632d7215a921"
Mar 11 02:08:39 crc kubenswrapper[4744]: E0311 02:08:39.977017 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-678nx_openshift-machine-config-operator(a15dc7ac-7c34-4135-b6eb-a85122800ce9)\"" pod="openshift-machine-config-operator/machine-config-daemon-678nx" podUID="a15dc7ac-7c34-4135-b6eb-a85122800ce9"
Mar 11 02:08:51 crc kubenswrapper[4744]: I0311 02:08:51.975806 4744 scope.go:117] "RemoveContainer" containerID="b19c8ef67037a89700ac31d101d1a95194ef5d85b37d3ed88aad632d7215a921"
Mar 11 02:08:51 crc kubenswrapper[4744]: E0311 02:08:51.976808 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-678nx_openshift-machine-config-operator(a15dc7ac-7c34-4135-b6eb-a85122800ce9)\"" pod="openshift-machine-config-operator/machine-config-daemon-678nx" podUID="a15dc7ac-7c34-4135-b6eb-a85122800ce9"
Mar 11 02:09:05 crc kubenswrapper[4744]: I0311 02:09:05.974827 4744 scope.go:117] "RemoveContainer" containerID="b19c8ef67037a89700ac31d101d1a95194ef5d85b37d3ed88aad632d7215a921"
Mar 11 02:09:05 crc kubenswrapper[4744]: E0311 02:09:05.975772 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-678nx_openshift-machine-config-operator(a15dc7ac-7c34-4135-b6eb-a85122800ce9)\"" pod="openshift-machine-config-operator/machine-config-daemon-678nx" podUID="a15dc7ac-7c34-4135-b6eb-a85122800ce9"
Mar 11 02:09:20 crc kubenswrapper[4744]: I0311 02:09:20.975688 4744 scope.go:117] "RemoveContainer" containerID="b19c8ef67037a89700ac31d101d1a95194ef5d85b37d3ed88aad632d7215a921"
Mar 11 02:09:20 crc kubenswrapper[4744]: E0311 02:09:20.977010 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-678nx_openshift-machine-config-operator(a15dc7ac-7c34-4135-b6eb-a85122800ce9)\"" pod="openshift-machine-config-operator/machine-config-daemon-678nx" podUID="a15dc7ac-7c34-4135-b6eb-a85122800ce9"
Mar 11 02:09:34 crc kubenswrapper[4744]: I0311 02:09:34.975818 4744 scope.go:117] "RemoveContainer" containerID="b19c8ef67037a89700ac31d101d1a95194ef5d85b37d3ed88aad632d7215a921"
Mar 11 02:09:34 crc kubenswrapper[4744]: E0311 02:09:34.978785 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-678nx_openshift-machine-config-operator(a15dc7ac-7c34-4135-b6eb-a85122800ce9)\"" pod="openshift-machine-config-operator/machine-config-daemon-678nx" podUID="a15dc7ac-7c34-4135-b6eb-a85122800ce9"
Mar 11 02:09:48 crc kubenswrapper[4744]: I0311 02:09:48.974845 4744 scope.go:117] "RemoveContainer" containerID="b19c8ef67037a89700ac31d101d1a95194ef5d85b37d3ed88aad632d7215a921"
Mar 11 02:09:48 crc kubenswrapper[4744]: E0311 02:09:48.976012 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-678nx_openshift-machine-config-operator(a15dc7ac-7c34-4135-b6eb-a85122800ce9)\"" pod="openshift-machine-config-operator/machine-config-daemon-678nx" podUID="a15dc7ac-7c34-4135-b6eb-a85122800ce9"
Mar 11 02:10:00 crc kubenswrapper[4744]: I0311 02:10:00.169123 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29553250-bmnkn"]
Mar 11 02:10:00 crc kubenswrapper[4744]: E0311 02:10:00.170184 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ae2bed9d-6b5c-4643-98d5-2eeec111948c" containerName="oc"
Mar 11 02:10:00 crc kubenswrapper[4744]: I0311 02:10:00.170206 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="ae2bed9d-6b5c-4643-98d5-2eeec111948c" containerName="oc"
Mar 11 02:10:00 crc kubenswrapper[4744]: I0311 02:10:00.170448 4744 memory_manager.go:354] "RemoveStaleState removing state" podUID="ae2bed9d-6b5c-4643-98d5-2eeec111948c" containerName="oc"
Mar 11 02:10:00 crc kubenswrapper[4744]: I0311 02:10:00.171149 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29553250-bmnkn"
Mar 11 02:10:00 crc kubenswrapper[4744]: I0311 02:10:00.174858 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Mar 11 02:10:00 crc kubenswrapper[4744]: I0311 02:10:00.174873 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-rs77s"
Mar 11 02:10:00 crc kubenswrapper[4744]: I0311 02:10:00.175765 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Mar 11 02:10:00 crc kubenswrapper[4744]: I0311 02:10:00.192341 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29553250-bmnkn"]
Mar 11 02:10:00 crc kubenswrapper[4744]: I0311 02:10:00.192551 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cc7q2\" (UniqueName: \"kubernetes.io/projected/a213726e-9cd5-4470-8725-769648f3002c-kube-api-access-cc7q2\") pod \"auto-csr-approver-29553250-bmnkn\" (UID: \"a213726e-9cd5-4470-8725-769648f3002c\") " pod="openshift-infra/auto-csr-approver-29553250-bmnkn"
Mar 11 02:10:00 crc kubenswrapper[4744]: I0311 02:10:00.294814 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cc7q2\" (UniqueName: \"kubernetes.io/projected/a213726e-9cd5-4470-8725-769648f3002c-kube-api-access-cc7q2\") pod \"auto-csr-approver-29553250-bmnkn\" (UID: \"a213726e-9cd5-4470-8725-769648f3002c\") " pod="openshift-infra/auto-csr-approver-29553250-bmnkn"
Mar 11 02:10:00 crc kubenswrapper[4744]: I0311 02:10:00.315717 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cc7q2\" (UniqueName: \"kubernetes.io/projected/a213726e-9cd5-4470-8725-769648f3002c-kube-api-access-cc7q2\") pod \"auto-csr-approver-29553250-bmnkn\" (UID: \"a213726e-9cd5-4470-8725-769648f3002c\") " pod="openshift-infra/auto-csr-approver-29553250-bmnkn"
Mar 11 02:10:00 crc kubenswrapper[4744]: I0311 02:10:00.532790 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29553250-bmnkn"
Mar 11 02:10:00 crc kubenswrapper[4744]: I0311 02:10:00.820156 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29553250-bmnkn"]
Mar 11 02:10:01 crc kubenswrapper[4744]: I0311 02:10:01.142220 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29553250-bmnkn" event={"ID":"a213726e-9cd5-4470-8725-769648f3002c","Type":"ContainerStarted","Data":"c27dcbab82a10ac5483b9d2470ab6664ddc4d9661e74c12ebc6af42c2627202f"}
Mar 11 02:10:01 crc kubenswrapper[4744]: I0311 02:10:01.974765 4744 scope.go:117] "RemoveContainer" containerID="b19c8ef67037a89700ac31d101d1a95194ef5d85b37d3ed88aad632d7215a921"
Mar 11 02:10:01 crc kubenswrapper[4744]: E0311 02:10:01.975189 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-678nx_openshift-machine-config-operator(a15dc7ac-7c34-4135-b6eb-a85122800ce9)\"" pod="openshift-machine-config-operator/machine-config-daemon-678nx" podUID="a15dc7ac-7c34-4135-b6eb-a85122800ce9"
Mar 11 02:10:03 crc kubenswrapper[4744]: I0311 02:10:03.161389 4744 generic.go:334] "Generic (PLEG): container finished" podID="a213726e-9cd5-4470-8725-769648f3002c" containerID="0ddbe5e315d6627cf58fa17ebf3562702027a0344dd287d5bebdda5faba1638a" exitCode=0
Mar 11 02:10:03 crc kubenswrapper[4744]: I0311 02:10:03.161454 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29553250-bmnkn" event={"ID":"a213726e-9cd5-4470-8725-769648f3002c","Type":"ContainerDied","Data":"0ddbe5e315d6627cf58fa17ebf3562702027a0344dd287d5bebdda5faba1638a"}
Mar 11 02:10:04 crc kubenswrapper[4744]: I0311 02:10:04.564365 4744 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29553250-bmnkn"
Mar 11 02:10:04 crc kubenswrapper[4744]: I0311 02:10:04.673095 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cc7q2\" (UniqueName: \"kubernetes.io/projected/a213726e-9cd5-4470-8725-769648f3002c-kube-api-access-cc7q2\") pod \"a213726e-9cd5-4470-8725-769648f3002c\" (UID: \"a213726e-9cd5-4470-8725-769648f3002c\") "
Mar 11 02:10:04 crc kubenswrapper[4744]: I0311 02:10:04.686121 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a213726e-9cd5-4470-8725-769648f3002c-kube-api-access-cc7q2" (OuterVolumeSpecName: "kube-api-access-cc7q2") pod "a213726e-9cd5-4470-8725-769648f3002c" (UID: "a213726e-9cd5-4470-8725-769648f3002c"). InnerVolumeSpecName "kube-api-access-cc7q2". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 11 02:10:04 crc kubenswrapper[4744]: I0311 02:10:04.775501 4744 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cc7q2\" (UniqueName: \"kubernetes.io/projected/a213726e-9cd5-4470-8725-769648f3002c-kube-api-access-cc7q2\") on node \"crc\" DevicePath \"\""
Mar 11 02:10:05 crc kubenswrapper[4744]: I0311 02:10:05.188321 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29553250-bmnkn" event={"ID":"a213726e-9cd5-4470-8725-769648f3002c","Type":"ContainerDied","Data":"c27dcbab82a10ac5483b9d2470ab6664ddc4d9661e74c12ebc6af42c2627202f"}
Mar 11 02:10:05 crc kubenswrapper[4744]: I0311 02:10:05.188799 4744 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c27dcbab82a10ac5483b9d2470ab6664ddc4d9661e74c12ebc6af42c2627202f"
Mar 11 02:10:05 crc kubenswrapper[4744]: I0311 02:10:05.188918 4744 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29553250-bmnkn"
Mar 11 02:10:05 crc kubenswrapper[4744]: I0311 02:10:05.659200 4744 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29553244-s2hjr"]
Mar 11 02:10:05 crc kubenswrapper[4744]: I0311 02:10:05.672333 4744 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29553244-s2hjr"]
Mar 11 02:10:05 crc kubenswrapper[4744]: I0311 02:10:05.989728 4744 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="efeef355-d6c4-454e-ab54-ed70864c4866" path="/var/lib/kubelet/pods/efeef355-d6c4-454e-ab54-ed70864c4866/volumes"
Mar 11 02:10:14 crc kubenswrapper[4744]: I0311 02:10:14.384871 4744 scope.go:117] "RemoveContainer" containerID="40bc43de380583bcbbe88e0d85720b655940108ce772a4e0bf566f11a31f8297"
Mar 11 02:10:15 crc kubenswrapper[4744]: I0311 02:10:15.975627 4744 scope.go:117] "RemoveContainer" containerID="b19c8ef67037a89700ac31d101d1a95194ef5d85b37d3ed88aad632d7215a921"
Mar 11 02:10:15 crc kubenswrapper[4744]: E0311 02:10:15.976367 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-678nx_openshift-machine-config-operator(a15dc7ac-7c34-4135-b6eb-a85122800ce9)\"" pod="openshift-machine-config-operator/machine-config-daemon-678nx" podUID="a15dc7ac-7c34-4135-b6eb-a85122800ce9"
Mar 11 02:10:27 crc kubenswrapper[4744]: I0311 02:10:27.974703 4744 scope.go:117] "RemoveContainer" containerID="b19c8ef67037a89700ac31d101d1a95194ef5d85b37d3ed88aad632d7215a921"
Mar 11 02:10:27 crc kubenswrapper[4744]: E0311 02:10:27.975699 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-678nx_openshift-machine-config-operator(a15dc7ac-7c34-4135-b6eb-a85122800ce9)\"" pod="openshift-machine-config-operator/machine-config-daemon-678nx" podUID="a15dc7ac-7c34-4135-b6eb-a85122800ce9"
Mar 11 02:10:38 crc kubenswrapper[4744]: I0311 02:10:38.975227 4744 scope.go:117] "RemoveContainer" containerID="b19c8ef67037a89700ac31d101d1a95194ef5d85b37d3ed88aad632d7215a921"
Mar 11 02:10:38 crc kubenswrapper[4744]: E0311 02:10:38.976304 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-678nx_openshift-machine-config-operator(a15dc7ac-7c34-4135-b6eb-a85122800ce9)\"" pod="openshift-machine-config-operator/machine-config-daemon-678nx" podUID="a15dc7ac-7c34-4135-b6eb-a85122800ce9"
Mar 11 02:10:52 crc kubenswrapper[4744]: I0311 02:10:52.975368 4744 scope.go:117] "RemoveContainer" containerID="b19c8ef67037a89700ac31d101d1a95194ef5d85b37d3ed88aad632d7215a921"
Mar 11 02:10:52 crc kubenswrapper[4744]: E0311 02:10:52.976482 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-678nx_openshift-machine-config-operator(a15dc7ac-7c34-4135-b6eb-a85122800ce9)\"" pod="openshift-machine-config-operator/machine-config-daemon-678nx" podUID="a15dc7ac-7c34-4135-b6eb-a85122800ce9"
Mar 11 02:11:05 crc kubenswrapper[4744]: I0311 02:11:05.975076 4744 scope.go:117] "RemoveContainer" containerID="b19c8ef67037a89700ac31d101d1a95194ef5d85b37d3ed88aad632d7215a921"
Mar 11 02:11:05 crc kubenswrapper[4744]: E0311 02:11:05.976309 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-678nx_openshift-machine-config-operator(a15dc7ac-7c34-4135-b6eb-a85122800ce9)\"" pod="openshift-machine-config-operator/machine-config-daemon-678nx" podUID="a15dc7ac-7c34-4135-b6eb-a85122800ce9"
Mar 11 02:11:14 crc kubenswrapper[4744]: I0311 02:11:14.486123 4744 scope.go:117] "RemoveContainer" containerID="75d3a3c1000ed49f84fa304b5634a8ed29618da3e31f26e7569c57d4096c40ab"
Mar 11 02:11:14 crc kubenswrapper[4744]: I0311 02:11:14.515570 4744 scope.go:117] "RemoveContainer" containerID="e586838ae2edbe785f0c592f366bdbbd0ad3db1a0795449cbf87f600db07ff60"
Mar 11 02:11:14 crc kubenswrapper[4744]: I0311 02:11:14.550759 4744 scope.go:117] "RemoveContainer" containerID="a197e6cb1f85e2985757976aa92660b34134b1c25b89ea245a4309fb3e5e41e9"
Mar 11 02:11:19 crc kubenswrapper[4744]: I0311 02:11:19.975466 4744 scope.go:117] "RemoveContainer" containerID="b19c8ef67037a89700ac31d101d1a95194ef5d85b37d3ed88aad632d7215a921"
Mar 11 02:11:20 crc kubenswrapper[4744]: I0311 02:11:20.890095 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-678nx" event={"ID":"a15dc7ac-7c34-4135-b6eb-a85122800ce9","Type":"ContainerStarted","Data":"e4c2aaf09ea940efd9719ce3215dbea5518811d2ea206c3c037ab368ae850bc0"}
Mar 11 02:12:00 crc kubenswrapper[4744]: I0311 02:12:00.145821 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29553252-jfhmn"]
Mar 11 02:12:00 crc kubenswrapper[4744]: E0311 02:12:00.147737 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a213726e-9cd5-4470-8725-769648f3002c" containerName="oc"
Mar 11 02:12:00 crc kubenswrapper[4744]: I0311 02:12:00.147833 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="a213726e-9cd5-4470-8725-769648f3002c" containerName="oc"
Mar 11 02:12:00 crc kubenswrapper[4744]: I0311 02:12:00.148025 4744 memory_manager.go:354] "RemoveStaleState removing state" podUID="a213726e-9cd5-4470-8725-769648f3002c" containerName="oc"
Mar 11 02:12:00 crc kubenswrapper[4744]: I0311 02:12:00.148479 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29553252-jfhmn"
Mar 11 02:12:00 crc kubenswrapper[4744]: I0311 02:12:00.150706 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Mar 11 02:12:00 crc kubenswrapper[4744]: I0311 02:12:00.151012 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Mar 11 02:12:00 crc kubenswrapper[4744]: I0311 02:12:00.151980 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-rs77s"
Mar 11 02:12:00 crc kubenswrapper[4744]: I0311 02:12:00.170938 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29553252-jfhmn"]
Mar 11 02:12:00 crc kubenswrapper[4744]: I0311 02:12:00.255717 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5ztrz\" (UniqueName: \"kubernetes.io/projected/5ee1e625-6b01-487f-8462-104afacd05e7-kube-api-access-5ztrz\") pod \"auto-csr-approver-29553252-jfhmn\" (UID: \"5ee1e625-6b01-487f-8462-104afacd05e7\") " pod="openshift-infra/auto-csr-approver-29553252-jfhmn"
Mar 11 02:12:00 crc kubenswrapper[4744]: I0311 02:12:00.357989 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5ztrz\" (UniqueName: \"kubernetes.io/projected/5ee1e625-6b01-487f-8462-104afacd05e7-kube-api-access-5ztrz\") pod \"auto-csr-approver-29553252-jfhmn\" (UID: \"5ee1e625-6b01-487f-8462-104afacd05e7\") " pod="openshift-infra/auto-csr-approver-29553252-jfhmn"
Mar 11 02:12:00 crc kubenswrapper[4744]: I0311 02:12:00.378270 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5ztrz\" (UniqueName: \"kubernetes.io/projected/5ee1e625-6b01-487f-8462-104afacd05e7-kube-api-access-5ztrz\") pod \"auto-csr-approver-29553252-jfhmn\" (UID: \"5ee1e625-6b01-487f-8462-104afacd05e7\") " pod="openshift-infra/auto-csr-approver-29553252-jfhmn"
Mar 11 02:12:00 crc kubenswrapper[4744]: I0311 02:12:00.483032 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29553252-jfhmn"
Mar 11 02:12:00 crc kubenswrapper[4744]: I0311 02:12:00.997608 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29553252-jfhmn"]
Mar 11 02:12:01 crc kubenswrapper[4744]: I0311 02:12:01.277848 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29553252-jfhmn" event={"ID":"5ee1e625-6b01-487f-8462-104afacd05e7","Type":"ContainerStarted","Data":"0d8872701bd442c6ac952b36d0165e32f9447f6fb3f5e6c61b072cfc609a77ff"}
Mar 11 02:12:03 crc kubenswrapper[4744]: I0311 02:12:03.317179 4744 generic.go:334] "Generic (PLEG): container finished" podID="5ee1e625-6b01-487f-8462-104afacd05e7" containerID="8c3928e4ec7e5bc9173bdc1f879fb35a19a9ce81fdb67bfc6438f584fd72ac27" exitCode=0
Mar 11 02:12:03 crc kubenswrapper[4744]: I0311 02:12:03.317298 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29553252-jfhmn" event={"ID":"5ee1e625-6b01-487f-8462-104afacd05e7","Type":"ContainerDied","Data":"8c3928e4ec7e5bc9173bdc1f879fb35a19a9ce81fdb67bfc6438f584fd72ac27"}
Mar 11 02:12:04 crc kubenswrapper[4744]: I0311 02:12:04.707206 4744 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29553252-jfhmn"
Mar 11 02:12:04 crc kubenswrapper[4744]: I0311 02:12:04.734942 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5ztrz\" (UniqueName: \"kubernetes.io/projected/5ee1e625-6b01-487f-8462-104afacd05e7-kube-api-access-5ztrz\") pod \"5ee1e625-6b01-487f-8462-104afacd05e7\" (UID: \"5ee1e625-6b01-487f-8462-104afacd05e7\") "
Mar 11 02:12:04 crc kubenswrapper[4744]: I0311 02:12:04.740308 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5ee1e625-6b01-487f-8462-104afacd05e7-kube-api-access-5ztrz" (OuterVolumeSpecName: "kube-api-access-5ztrz") pod "5ee1e625-6b01-487f-8462-104afacd05e7" (UID: "5ee1e625-6b01-487f-8462-104afacd05e7"). InnerVolumeSpecName "kube-api-access-5ztrz". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 11 02:12:04 crc kubenswrapper[4744]: I0311 02:12:04.836338 4744 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5ztrz\" (UniqueName: \"kubernetes.io/projected/5ee1e625-6b01-487f-8462-104afacd05e7-kube-api-access-5ztrz\") on node \"crc\" DevicePath \"\""
Mar 11 02:12:05 crc kubenswrapper[4744]: I0311 02:12:05.333086 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29553252-jfhmn" event={"ID":"5ee1e625-6b01-487f-8462-104afacd05e7","Type":"ContainerDied","Data":"0d8872701bd442c6ac952b36d0165e32f9447f6fb3f5e6c61b072cfc609a77ff"}
Mar 11 02:12:05 crc kubenswrapper[4744]: I0311 02:12:05.333428 4744 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0d8872701bd442c6ac952b36d0165e32f9447f6fb3f5e6c61b072cfc609a77ff"
Mar 11 02:12:05 crc kubenswrapper[4744]: I0311 02:12:05.333151 4744 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29553252-jfhmn"
Mar 11 02:12:05 crc kubenswrapper[4744]: I0311 02:12:05.814111 4744 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29553246-9rjnf"]
Mar 11 02:12:05 crc kubenswrapper[4744]: I0311 02:12:05.823405 4744 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29553246-9rjnf"]
Mar 11 02:12:05 crc kubenswrapper[4744]: I0311 02:12:05.986352 4744 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f89fcb79-ebd1-445b-a6f6-4ea3ca6335c9" path="/var/lib/kubelet/pods/f89fcb79-ebd1-445b-a6f6-4ea3ca6335c9/volumes"
Mar 11 02:12:14 crc kubenswrapper[4744]: I0311 02:12:14.635045 4744 scope.go:117] "RemoveContainer" containerID="35f6dc0378397408500a41110d3b6ba8ee433f88cdbaeb2bf2a657dab34c270e"
Mar 11 02:13:42 crc kubenswrapper[4744]: I0311 02:13:42.409787 4744 patch_prober.go:28] interesting pod/machine-config-daemon-678nx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 11 02:13:42 crc kubenswrapper[4744]: I0311 02:13:42.410484 4744 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-678nx" podUID="a15dc7ac-7c34-4135-b6eb-a85122800ce9" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 11 02:14:00 crc kubenswrapper[4744]: I0311 02:14:00.168739 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29553254-w8rth"]
Mar 11 02:14:00 crc kubenswrapper[4744]: E0311 02:14:00.170297 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5ee1e625-6b01-487f-8462-104afacd05e7" containerName="oc"
Mar 11 02:14:00 crc kubenswrapper[4744]: I0311 02:14:00.170325 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="5ee1e625-6b01-487f-8462-104afacd05e7" containerName="oc"
Mar 11 02:14:00 crc kubenswrapper[4744]: I0311 02:14:00.170626 4744 memory_manager.go:354] "RemoveStaleState removing state" podUID="5ee1e625-6b01-487f-8462-104afacd05e7" containerName="oc"
Mar 11 02:14:00 crc kubenswrapper[4744]: I0311 02:14:00.171358 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29553254-w8rth"
Mar 11 02:14:00 crc kubenswrapper[4744]: I0311 02:14:00.175086 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Mar 11 02:14:00 crc kubenswrapper[4744]: I0311 02:14:00.175131 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-rs77s"
Mar 11 02:14:00 crc kubenswrapper[4744]: I0311 02:14:00.175987 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Mar 11 02:14:00 crc kubenswrapper[4744]: I0311 02:14:00.183714 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29553254-w8rth"]
Mar 11 02:14:00 crc kubenswrapper[4744]: I0311 02:14:00.204803 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wgnlm\" (UniqueName: \"kubernetes.io/projected/ff8bcff3-14f9-4749-bf02-147262a8384c-kube-api-access-wgnlm\") pod \"auto-csr-approver-29553254-w8rth\" (UID: \"ff8bcff3-14f9-4749-bf02-147262a8384c\") " pod="openshift-infra/auto-csr-approver-29553254-w8rth"
Mar 11 02:14:00 crc kubenswrapper[4744]: I0311 02:14:00.306297 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wgnlm\" (UniqueName: \"kubernetes.io/projected/ff8bcff3-14f9-4749-bf02-147262a8384c-kube-api-access-wgnlm\") pod \"auto-csr-approver-29553254-w8rth\" (UID: \"ff8bcff3-14f9-4749-bf02-147262a8384c\") " pod="openshift-infra/auto-csr-approver-29553254-w8rth"
Mar 11 02:14:00 crc kubenswrapper[4744]: I0311 02:14:00.339046 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wgnlm\" (UniqueName: \"kubernetes.io/projected/ff8bcff3-14f9-4749-bf02-147262a8384c-kube-api-access-wgnlm\") pod \"auto-csr-approver-29553254-w8rth\" (UID: \"ff8bcff3-14f9-4749-bf02-147262a8384c\") " pod="openshift-infra/auto-csr-approver-29553254-w8rth"
Mar 11 02:14:00 crc kubenswrapper[4744]: I0311 02:14:00.503804 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29553254-w8rth"
Mar 11 02:14:00 crc kubenswrapper[4744]: I0311 02:14:00.955934 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29553254-w8rth"]
Mar 11 02:14:00 crc kubenswrapper[4744]: I0311 02:14:00.974687 4744 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Mar 11 02:14:01 crc kubenswrapper[4744]: I0311 02:14:01.509849 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29553254-w8rth" event={"ID":"ff8bcff3-14f9-4749-bf02-147262a8384c","Type":"ContainerStarted","Data":"646fe6f6619e8518676abc017df1d6931ebabce4583cd02c14b670fd7da797f7"}
Mar 11 02:14:01 crc kubenswrapper[4744]: I0311 02:14:01.914172 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-8gwvw"]
Mar 11 02:14:01 crc kubenswrapper[4744]: I0311 02:14:01.917874 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-8gwvw"
Mar 11 02:14:01 crc kubenswrapper[4744]: I0311 02:14:01.926863 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-8gwvw"]
Mar 11 02:14:02 crc kubenswrapper[4744]: I0311 02:14:02.045748 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7a9a82e0-1bd6-4ee9-b71a-947542ea6730-utilities\") pod \"redhat-marketplace-8gwvw\" (UID: \"7a9a82e0-1bd6-4ee9-b71a-947542ea6730\") " pod="openshift-marketplace/redhat-marketplace-8gwvw"
Mar 11 02:14:02 crc kubenswrapper[4744]: I0311 02:14:02.045859 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r9k5m\" (UniqueName: \"kubernetes.io/projected/7a9a82e0-1bd6-4ee9-b71a-947542ea6730-kube-api-access-r9k5m\") pod \"redhat-marketplace-8gwvw\" (UID: \"7a9a82e0-1bd6-4ee9-b71a-947542ea6730\") " pod="openshift-marketplace/redhat-marketplace-8gwvw"
Mar 11 02:14:02 crc kubenswrapper[4744]: I0311 02:14:02.046739 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7a9a82e0-1bd6-4ee9-b71a-947542ea6730-catalog-content\") pod \"redhat-marketplace-8gwvw\" (UID: \"7a9a82e0-1bd6-4ee9-b71a-947542ea6730\") " pod="openshift-marketplace/redhat-marketplace-8gwvw"
Mar 11 02:14:02 crc kubenswrapper[4744]: I0311 02:14:02.147641 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7a9a82e0-1bd6-4ee9-b71a-947542ea6730-catalog-content\") pod \"redhat-marketplace-8gwvw\" (UID: \"7a9a82e0-1bd6-4ee9-b71a-947542ea6730\") " pod="openshift-marketplace/redhat-marketplace-8gwvw"
Mar 11 02:14:02 crc kubenswrapper[4744]: I0311 02:14:02.147732 4744 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7a9a82e0-1bd6-4ee9-b71a-947542ea6730-utilities\") pod \"redhat-marketplace-8gwvw\" (UID: \"7a9a82e0-1bd6-4ee9-b71a-947542ea6730\") " pod="openshift-marketplace/redhat-marketplace-8gwvw" Mar 11 02:14:02 crc kubenswrapper[4744]: I0311 02:14:02.147764 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r9k5m\" (UniqueName: \"kubernetes.io/projected/7a9a82e0-1bd6-4ee9-b71a-947542ea6730-kube-api-access-r9k5m\") pod \"redhat-marketplace-8gwvw\" (UID: \"7a9a82e0-1bd6-4ee9-b71a-947542ea6730\") " pod="openshift-marketplace/redhat-marketplace-8gwvw" Mar 11 02:14:02 crc kubenswrapper[4744]: I0311 02:14:02.148443 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7a9a82e0-1bd6-4ee9-b71a-947542ea6730-catalog-content\") pod \"redhat-marketplace-8gwvw\" (UID: \"7a9a82e0-1bd6-4ee9-b71a-947542ea6730\") " pod="openshift-marketplace/redhat-marketplace-8gwvw" Mar 11 02:14:02 crc kubenswrapper[4744]: I0311 02:14:02.148491 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7a9a82e0-1bd6-4ee9-b71a-947542ea6730-utilities\") pod \"redhat-marketplace-8gwvw\" (UID: \"7a9a82e0-1bd6-4ee9-b71a-947542ea6730\") " pod="openshift-marketplace/redhat-marketplace-8gwvw" Mar 11 02:14:02 crc kubenswrapper[4744]: I0311 02:14:02.185402 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r9k5m\" (UniqueName: \"kubernetes.io/projected/7a9a82e0-1bd6-4ee9-b71a-947542ea6730-kube-api-access-r9k5m\") pod \"redhat-marketplace-8gwvw\" (UID: \"7a9a82e0-1bd6-4ee9-b71a-947542ea6730\") " pod="openshift-marketplace/redhat-marketplace-8gwvw" Mar 11 02:14:02 crc kubenswrapper[4744]: I0311 02:14:02.266556 4744 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-8gwvw" Mar 11 02:14:02 crc kubenswrapper[4744]: I0311 02:14:02.522322 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-8gwvw"] Mar 11 02:14:02 crc kubenswrapper[4744]: I0311 02:14:02.534320 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29553254-w8rth" event={"ID":"ff8bcff3-14f9-4749-bf02-147262a8384c","Type":"ContainerStarted","Data":"59fe0ddd69ed4d66043f9cd786907504f5ca9fbc1a188bde3b954af3e59fe105"} Mar 11 02:14:02 crc kubenswrapper[4744]: I0311 02:14:02.553275 4744 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29553254-w8rth" podStartSLOduration=1.463901741 podStartE2EDuration="2.553246222s" podCreationTimestamp="2026-03-11 02:14:00 +0000 UTC" firstStartedPulling="2026-03-11 02:14:00.971844694 +0000 UTC m=+4797.776062329" lastFinishedPulling="2026-03-11 02:14:02.061189205 +0000 UTC m=+4798.865406810" observedRunningTime="2026-03-11 02:14:02.547765463 +0000 UTC m=+4799.351983058" watchObservedRunningTime="2026-03-11 02:14:02.553246222 +0000 UTC m=+4799.357463827" Mar 11 02:14:03 crc kubenswrapper[4744]: I0311 02:14:03.547015 4744 generic.go:334] "Generic (PLEG): container finished" podID="ff8bcff3-14f9-4749-bf02-147262a8384c" containerID="59fe0ddd69ed4d66043f9cd786907504f5ca9fbc1a188bde3b954af3e59fe105" exitCode=0 Mar 11 02:14:03 crc kubenswrapper[4744]: I0311 02:14:03.547089 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29553254-w8rth" event={"ID":"ff8bcff3-14f9-4749-bf02-147262a8384c","Type":"ContainerDied","Data":"59fe0ddd69ed4d66043f9cd786907504f5ca9fbc1a188bde3b954af3e59fe105"} Mar 11 02:14:03 crc kubenswrapper[4744]: I0311 02:14:03.550355 4744 generic.go:334] "Generic (PLEG): container finished" podID="7a9a82e0-1bd6-4ee9-b71a-947542ea6730" 
containerID="6ca0ad16e8eac05eae377964fc962b889c231f113b234d7d7a328e1f1b666aae" exitCode=0 Mar 11 02:14:03 crc kubenswrapper[4744]: I0311 02:14:03.550409 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-8gwvw" event={"ID":"7a9a82e0-1bd6-4ee9-b71a-947542ea6730","Type":"ContainerDied","Data":"6ca0ad16e8eac05eae377964fc962b889c231f113b234d7d7a328e1f1b666aae"} Mar 11 02:14:03 crc kubenswrapper[4744]: I0311 02:14:03.550439 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-8gwvw" event={"ID":"7a9a82e0-1bd6-4ee9-b71a-947542ea6730","Type":"ContainerStarted","Data":"e2fc117718153029e4a271a6d1c7010eb4f92ab915ad1e4f9659581baf25acf2"} Mar 11 02:14:04 crc kubenswrapper[4744]: I0311 02:14:04.557947 4744 generic.go:334] "Generic (PLEG): container finished" podID="7a9a82e0-1bd6-4ee9-b71a-947542ea6730" containerID="2686055078adccff3c2dc08b046ea169e77b32c5cd821a1a4d02f3716be6c0c4" exitCode=0 Mar 11 02:14:04 crc kubenswrapper[4744]: I0311 02:14:04.558060 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-8gwvw" event={"ID":"7a9a82e0-1bd6-4ee9-b71a-947542ea6730","Type":"ContainerDied","Data":"2686055078adccff3c2dc08b046ea169e77b32c5cd821a1a4d02f3716be6c0c4"} Mar 11 02:14:04 crc kubenswrapper[4744]: I0311 02:14:04.895245 4744 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29553254-w8rth" Mar 11 02:14:04 crc kubenswrapper[4744]: I0311 02:14:04.992446 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wgnlm\" (UniqueName: \"kubernetes.io/projected/ff8bcff3-14f9-4749-bf02-147262a8384c-kube-api-access-wgnlm\") pod \"ff8bcff3-14f9-4749-bf02-147262a8384c\" (UID: \"ff8bcff3-14f9-4749-bf02-147262a8384c\") " Mar 11 02:14:04 crc kubenswrapper[4744]: I0311 02:14:04.999073 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ff8bcff3-14f9-4749-bf02-147262a8384c-kube-api-access-wgnlm" (OuterVolumeSpecName: "kube-api-access-wgnlm") pod "ff8bcff3-14f9-4749-bf02-147262a8384c" (UID: "ff8bcff3-14f9-4749-bf02-147262a8384c"). InnerVolumeSpecName "kube-api-access-wgnlm". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 02:14:05 crc kubenswrapper[4744]: I0311 02:14:05.096101 4744 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wgnlm\" (UniqueName: \"kubernetes.io/projected/ff8bcff3-14f9-4749-bf02-147262a8384c-kube-api-access-wgnlm\") on node \"crc\" DevicePath \"\"" Mar 11 02:14:05 crc kubenswrapper[4744]: I0311 02:14:05.564944 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-8gwvw" event={"ID":"7a9a82e0-1bd6-4ee9-b71a-947542ea6730","Type":"ContainerStarted","Data":"2920b0eac8821076fcb12b8b572ab134d173b799cfaac9c4cf5e53ef8f370fab"} Mar 11 02:14:05 crc kubenswrapper[4744]: I0311 02:14:05.567264 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29553254-w8rth" event={"ID":"ff8bcff3-14f9-4749-bf02-147262a8384c","Type":"ContainerDied","Data":"646fe6f6619e8518676abc017df1d6931ebabce4583cd02c14b670fd7da797f7"} Mar 11 02:14:05 crc kubenswrapper[4744]: I0311 02:14:05.567305 4744 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29553254-w8rth" Mar 11 02:14:05 crc kubenswrapper[4744]: I0311 02:14:05.567310 4744 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="646fe6f6619e8518676abc017df1d6931ebabce4583cd02c14b670fd7da797f7" Mar 11 02:14:05 crc kubenswrapper[4744]: I0311 02:14:05.597701 4744 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-8gwvw" podStartSLOduration=3.112465954 podStartE2EDuration="4.597680632s" podCreationTimestamp="2026-03-11 02:14:01 +0000 UTC" firstStartedPulling="2026-03-11 02:14:03.552779411 +0000 UTC m=+4800.356997056" lastFinishedPulling="2026-03-11 02:14:05.037994119 +0000 UTC m=+4801.842211734" observedRunningTime="2026-03-11 02:14:05.594170005 +0000 UTC m=+4802.398387610" watchObservedRunningTime="2026-03-11 02:14:05.597680632 +0000 UTC m=+4802.401898237" Mar 11 02:14:05 crc kubenswrapper[4744]: I0311 02:14:05.623059 4744 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29553248-94tnc"] Mar 11 02:14:05 crc kubenswrapper[4744]: I0311 02:14:05.627942 4744 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29553248-94tnc"] Mar 11 02:14:05 crc kubenswrapper[4744]: I0311 02:14:05.985248 4744 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ae2bed9d-6b5c-4643-98d5-2eeec111948c" path="/var/lib/kubelet/pods/ae2bed9d-6b5c-4643-98d5-2eeec111948c/volumes" Mar 11 02:14:12 crc kubenswrapper[4744]: I0311 02:14:12.267084 4744 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-8gwvw" Mar 11 02:14:12 crc kubenswrapper[4744]: I0311 02:14:12.267552 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-8gwvw" Mar 11 02:14:12 crc kubenswrapper[4744]: I0311 02:14:12.334276 4744 kubelet.go:2542] 
"SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-8gwvw" Mar 11 02:14:12 crc kubenswrapper[4744]: I0311 02:14:12.408970 4744 patch_prober.go:28] interesting pod/machine-config-daemon-678nx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 11 02:14:12 crc kubenswrapper[4744]: I0311 02:14:12.409027 4744 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-678nx" podUID="a15dc7ac-7c34-4135-b6eb-a85122800ce9" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 11 02:14:12 crc kubenswrapper[4744]: I0311 02:14:12.700888 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-8gwvw" Mar 11 02:14:12 crc kubenswrapper[4744]: I0311 02:14:12.759049 4744 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-8gwvw"] Mar 11 02:14:14 crc kubenswrapper[4744]: I0311 02:14:14.642272 4744 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-8gwvw" podUID="7a9a82e0-1bd6-4ee9-b71a-947542ea6730" containerName="registry-server" containerID="cri-o://2920b0eac8821076fcb12b8b572ab134d173b799cfaac9c4cf5e53ef8f370fab" gracePeriod=2 Mar 11 02:14:14 crc kubenswrapper[4744]: I0311 02:14:14.748814 4744 scope.go:117] "RemoveContainer" containerID="b319ca82c048c5ae85a476bc22562632793f5846a88ce68994a41c978dc8c991" Mar 11 02:14:15 crc kubenswrapper[4744]: I0311 02:14:15.156642 4744 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-8gwvw" Mar 11 02:14:15 crc kubenswrapper[4744]: I0311 02:14:15.354205 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7a9a82e0-1bd6-4ee9-b71a-947542ea6730-catalog-content\") pod \"7a9a82e0-1bd6-4ee9-b71a-947542ea6730\" (UID: \"7a9a82e0-1bd6-4ee9-b71a-947542ea6730\") " Mar 11 02:14:15 crc kubenswrapper[4744]: I0311 02:14:15.354375 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-r9k5m\" (UniqueName: \"kubernetes.io/projected/7a9a82e0-1bd6-4ee9-b71a-947542ea6730-kube-api-access-r9k5m\") pod \"7a9a82e0-1bd6-4ee9-b71a-947542ea6730\" (UID: \"7a9a82e0-1bd6-4ee9-b71a-947542ea6730\") " Mar 11 02:14:15 crc kubenswrapper[4744]: I0311 02:14:15.354444 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7a9a82e0-1bd6-4ee9-b71a-947542ea6730-utilities\") pod \"7a9a82e0-1bd6-4ee9-b71a-947542ea6730\" (UID: \"7a9a82e0-1bd6-4ee9-b71a-947542ea6730\") " Mar 11 02:14:15 crc kubenswrapper[4744]: I0311 02:14:15.356218 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7a9a82e0-1bd6-4ee9-b71a-947542ea6730-utilities" (OuterVolumeSpecName: "utilities") pod "7a9a82e0-1bd6-4ee9-b71a-947542ea6730" (UID: "7a9a82e0-1bd6-4ee9-b71a-947542ea6730"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 11 02:14:15 crc kubenswrapper[4744]: I0311 02:14:15.370388 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7a9a82e0-1bd6-4ee9-b71a-947542ea6730-kube-api-access-r9k5m" (OuterVolumeSpecName: "kube-api-access-r9k5m") pod "7a9a82e0-1bd6-4ee9-b71a-947542ea6730" (UID: "7a9a82e0-1bd6-4ee9-b71a-947542ea6730"). InnerVolumeSpecName "kube-api-access-r9k5m". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 02:14:15 crc kubenswrapper[4744]: I0311 02:14:15.406690 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7a9a82e0-1bd6-4ee9-b71a-947542ea6730-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "7a9a82e0-1bd6-4ee9-b71a-947542ea6730" (UID: "7a9a82e0-1bd6-4ee9-b71a-947542ea6730"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 11 02:14:15 crc kubenswrapper[4744]: I0311 02:14:15.457157 4744 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7a9a82e0-1bd6-4ee9-b71a-947542ea6730-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 11 02:14:15 crc kubenswrapper[4744]: I0311 02:14:15.457836 4744 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-r9k5m\" (UniqueName: \"kubernetes.io/projected/7a9a82e0-1bd6-4ee9-b71a-947542ea6730-kube-api-access-r9k5m\") on node \"crc\" DevicePath \"\"" Mar 11 02:14:15 crc kubenswrapper[4744]: I0311 02:14:15.457870 4744 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7a9a82e0-1bd6-4ee9-b71a-947542ea6730-utilities\") on node \"crc\" DevicePath \"\"" Mar 11 02:14:15 crc kubenswrapper[4744]: I0311 02:14:15.671570 4744 generic.go:334] "Generic (PLEG): container finished" podID="7a9a82e0-1bd6-4ee9-b71a-947542ea6730" containerID="2920b0eac8821076fcb12b8b572ab134d173b799cfaac9c4cf5e53ef8f370fab" exitCode=0 Mar 11 02:14:15 crc kubenswrapper[4744]: I0311 02:14:15.671628 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-8gwvw" event={"ID":"7a9a82e0-1bd6-4ee9-b71a-947542ea6730","Type":"ContainerDied","Data":"2920b0eac8821076fcb12b8b572ab134d173b799cfaac9c4cf5e53ef8f370fab"} Mar 11 02:14:15 crc kubenswrapper[4744]: I0311 02:14:15.671665 4744 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openshift-marketplace/redhat-marketplace-8gwvw" event={"ID":"7a9a82e0-1bd6-4ee9-b71a-947542ea6730","Type":"ContainerDied","Data":"e2fc117718153029e4a271a6d1c7010eb4f92ab915ad1e4f9659581baf25acf2"} Mar 11 02:14:15 crc kubenswrapper[4744]: I0311 02:14:15.671698 4744 scope.go:117] "RemoveContainer" containerID="2920b0eac8821076fcb12b8b572ab134d173b799cfaac9c4cf5e53ef8f370fab" Mar 11 02:14:15 crc kubenswrapper[4744]: I0311 02:14:15.671712 4744 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-8gwvw" Mar 11 02:14:15 crc kubenswrapper[4744]: I0311 02:14:15.700784 4744 scope.go:117] "RemoveContainer" containerID="2686055078adccff3c2dc08b046ea169e77b32c5cd821a1a4d02f3716be6c0c4" Mar 11 02:14:15 crc kubenswrapper[4744]: I0311 02:14:15.725255 4744 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-8gwvw"] Mar 11 02:14:15 crc kubenswrapper[4744]: I0311 02:14:15.744019 4744 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-8gwvw"] Mar 11 02:14:15 crc kubenswrapper[4744]: I0311 02:14:15.758555 4744 scope.go:117] "RemoveContainer" containerID="6ca0ad16e8eac05eae377964fc962b889c231f113b234d7d7a328e1f1b666aae" Mar 11 02:14:15 crc kubenswrapper[4744]: I0311 02:14:15.786763 4744 scope.go:117] "RemoveContainer" containerID="2920b0eac8821076fcb12b8b572ab134d173b799cfaac9c4cf5e53ef8f370fab" Mar 11 02:14:15 crc kubenswrapper[4744]: E0311 02:14:15.787309 4744 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2920b0eac8821076fcb12b8b572ab134d173b799cfaac9c4cf5e53ef8f370fab\": container with ID starting with 2920b0eac8821076fcb12b8b572ab134d173b799cfaac9c4cf5e53ef8f370fab not found: ID does not exist" containerID="2920b0eac8821076fcb12b8b572ab134d173b799cfaac9c4cf5e53ef8f370fab" Mar 11 02:14:15 crc kubenswrapper[4744]: I0311 02:14:15.787337 4744 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2920b0eac8821076fcb12b8b572ab134d173b799cfaac9c4cf5e53ef8f370fab"} err="failed to get container status \"2920b0eac8821076fcb12b8b572ab134d173b799cfaac9c4cf5e53ef8f370fab\": rpc error: code = NotFound desc = could not find container \"2920b0eac8821076fcb12b8b572ab134d173b799cfaac9c4cf5e53ef8f370fab\": container with ID starting with 2920b0eac8821076fcb12b8b572ab134d173b799cfaac9c4cf5e53ef8f370fab not found: ID does not exist" Mar 11 02:14:15 crc kubenswrapper[4744]: I0311 02:14:15.787357 4744 scope.go:117] "RemoveContainer" containerID="2686055078adccff3c2dc08b046ea169e77b32c5cd821a1a4d02f3716be6c0c4" Mar 11 02:14:15 crc kubenswrapper[4744]: E0311 02:14:15.787959 4744 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2686055078adccff3c2dc08b046ea169e77b32c5cd821a1a4d02f3716be6c0c4\": container with ID starting with 2686055078adccff3c2dc08b046ea169e77b32c5cd821a1a4d02f3716be6c0c4 not found: ID does not exist" containerID="2686055078adccff3c2dc08b046ea169e77b32c5cd821a1a4d02f3716be6c0c4" Mar 11 02:14:15 crc kubenswrapper[4744]: I0311 02:14:15.788019 4744 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2686055078adccff3c2dc08b046ea169e77b32c5cd821a1a4d02f3716be6c0c4"} err="failed to get container status \"2686055078adccff3c2dc08b046ea169e77b32c5cd821a1a4d02f3716be6c0c4\": rpc error: code = NotFound desc = could not find container \"2686055078adccff3c2dc08b046ea169e77b32c5cd821a1a4d02f3716be6c0c4\": container with ID starting with 2686055078adccff3c2dc08b046ea169e77b32c5cd821a1a4d02f3716be6c0c4 not found: ID does not exist" Mar 11 02:14:15 crc kubenswrapper[4744]: I0311 02:14:15.788058 4744 scope.go:117] "RemoveContainer" containerID="6ca0ad16e8eac05eae377964fc962b889c231f113b234d7d7a328e1f1b666aae" Mar 11 02:14:15 crc kubenswrapper[4744]: E0311 
02:14:15.788390 4744 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6ca0ad16e8eac05eae377964fc962b889c231f113b234d7d7a328e1f1b666aae\": container with ID starting with 6ca0ad16e8eac05eae377964fc962b889c231f113b234d7d7a328e1f1b666aae not found: ID does not exist" containerID="6ca0ad16e8eac05eae377964fc962b889c231f113b234d7d7a328e1f1b666aae" Mar 11 02:14:15 crc kubenswrapper[4744]: I0311 02:14:15.788411 4744 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6ca0ad16e8eac05eae377964fc962b889c231f113b234d7d7a328e1f1b666aae"} err="failed to get container status \"6ca0ad16e8eac05eae377964fc962b889c231f113b234d7d7a328e1f1b666aae\": rpc error: code = NotFound desc = could not find container \"6ca0ad16e8eac05eae377964fc962b889c231f113b234d7d7a328e1f1b666aae\": container with ID starting with 6ca0ad16e8eac05eae377964fc962b889c231f113b234d7d7a328e1f1b666aae not found: ID does not exist" Mar 11 02:14:15 crc kubenswrapper[4744]: I0311 02:14:15.991665 4744 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7a9a82e0-1bd6-4ee9-b71a-947542ea6730" path="/var/lib/kubelet/pods/7a9a82e0-1bd6-4ee9-b71a-947542ea6730/volumes" Mar 11 02:14:42 crc kubenswrapper[4744]: I0311 02:14:42.409367 4744 patch_prober.go:28] interesting pod/machine-config-daemon-678nx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 11 02:14:42 crc kubenswrapper[4744]: I0311 02:14:42.410085 4744 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-678nx" podUID="a15dc7ac-7c34-4135-b6eb-a85122800ce9" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection 
refused" Mar 11 02:14:42 crc kubenswrapper[4744]: I0311 02:14:42.410154 4744 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-678nx" Mar 11 02:14:42 crc kubenswrapper[4744]: I0311 02:14:42.411006 4744 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"e4c2aaf09ea940efd9719ce3215dbea5518811d2ea206c3c037ab368ae850bc0"} pod="openshift-machine-config-operator/machine-config-daemon-678nx" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 11 02:14:42 crc kubenswrapper[4744]: I0311 02:14:42.411098 4744 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-678nx" podUID="a15dc7ac-7c34-4135-b6eb-a85122800ce9" containerName="machine-config-daemon" containerID="cri-o://e4c2aaf09ea940efd9719ce3215dbea5518811d2ea206c3c037ab368ae850bc0" gracePeriod=600 Mar 11 02:14:42 crc kubenswrapper[4744]: I0311 02:14:42.933621 4744 generic.go:334] "Generic (PLEG): container finished" podID="a15dc7ac-7c34-4135-b6eb-a85122800ce9" containerID="e4c2aaf09ea940efd9719ce3215dbea5518811d2ea206c3c037ab368ae850bc0" exitCode=0 Mar 11 02:14:42 crc kubenswrapper[4744]: I0311 02:14:42.933696 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-678nx" event={"ID":"a15dc7ac-7c34-4135-b6eb-a85122800ce9","Type":"ContainerDied","Data":"e4c2aaf09ea940efd9719ce3215dbea5518811d2ea206c3c037ab368ae850bc0"} Mar 11 02:14:42 crc kubenswrapper[4744]: I0311 02:14:42.934198 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-678nx" event={"ID":"a15dc7ac-7c34-4135-b6eb-a85122800ce9","Type":"ContainerStarted","Data":"6c2c2442692de599f07518611dfa7798a497d5f80b894d7b831bca9043e56ffc"} Mar 11 02:14:42 crc 
kubenswrapper[4744]: I0311 02:14:42.934238 4744 scope.go:117] "RemoveContainer" containerID="b19c8ef67037a89700ac31d101d1a95194ef5d85b37d3ed88aad632d7215a921" Mar 11 02:14:45 crc kubenswrapper[4744]: I0311 02:14:45.649053 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-8hv4d"] Mar 11 02:14:45 crc kubenswrapper[4744]: E0311 02:14:45.650305 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7a9a82e0-1bd6-4ee9-b71a-947542ea6730" containerName="extract-content" Mar 11 02:14:45 crc kubenswrapper[4744]: I0311 02:14:45.650327 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="7a9a82e0-1bd6-4ee9-b71a-947542ea6730" containerName="extract-content" Mar 11 02:14:45 crc kubenswrapper[4744]: E0311 02:14:45.650349 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ff8bcff3-14f9-4749-bf02-147262a8384c" containerName="oc" Mar 11 02:14:45 crc kubenswrapper[4744]: I0311 02:14:45.650361 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="ff8bcff3-14f9-4749-bf02-147262a8384c" containerName="oc" Mar 11 02:14:45 crc kubenswrapper[4744]: E0311 02:14:45.650379 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7a9a82e0-1bd6-4ee9-b71a-947542ea6730" containerName="extract-utilities" Mar 11 02:14:45 crc kubenswrapper[4744]: I0311 02:14:45.650391 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="7a9a82e0-1bd6-4ee9-b71a-947542ea6730" containerName="extract-utilities" Mar 11 02:14:45 crc kubenswrapper[4744]: E0311 02:14:45.650407 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7a9a82e0-1bd6-4ee9-b71a-947542ea6730" containerName="registry-server" Mar 11 02:14:45 crc kubenswrapper[4744]: I0311 02:14:45.650419 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="7a9a82e0-1bd6-4ee9-b71a-947542ea6730" containerName="registry-server" Mar 11 02:14:45 crc kubenswrapper[4744]: I0311 02:14:45.650731 4744 memory_manager.go:354] 
"RemoveStaleState removing state" podUID="ff8bcff3-14f9-4749-bf02-147262a8384c" containerName="oc"
Mar 11 02:14:45 crc kubenswrapper[4744]: I0311 02:14:45.650771 4744 memory_manager.go:354] "RemoveStaleState removing state" podUID="7a9a82e0-1bd6-4ee9-b71a-947542ea6730" containerName="registry-server"
Mar 11 02:14:45 crc kubenswrapper[4744]: I0311 02:14:45.652569 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-8hv4d"
Mar 11 02:14:45 crc kubenswrapper[4744]: I0311 02:14:45.685376 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-8hv4d"]
Mar 11 02:14:45 crc kubenswrapper[4744]: I0311 02:14:45.787507 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/66bd2f4e-6c51-4841-9e0f-c2c0deb9e0a6-utilities\") pod \"community-operators-8hv4d\" (UID: \"66bd2f4e-6c51-4841-9e0f-c2c0deb9e0a6\") " pod="openshift-marketplace/community-operators-8hv4d"
Mar 11 02:14:45 crc kubenswrapper[4744]: I0311 02:14:45.787889 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rxrlt\" (UniqueName: \"kubernetes.io/projected/66bd2f4e-6c51-4841-9e0f-c2c0deb9e0a6-kube-api-access-rxrlt\") pod \"community-operators-8hv4d\" (UID: \"66bd2f4e-6c51-4841-9e0f-c2c0deb9e0a6\") " pod="openshift-marketplace/community-operators-8hv4d"
Mar 11 02:14:45 crc kubenswrapper[4744]: I0311 02:14:45.787970 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/66bd2f4e-6c51-4841-9e0f-c2c0deb9e0a6-catalog-content\") pod \"community-operators-8hv4d\" (UID: \"66bd2f4e-6c51-4841-9e0f-c2c0deb9e0a6\") " pod="openshift-marketplace/community-operators-8hv4d"
Mar 11 02:14:45 crc kubenswrapper[4744]: I0311 02:14:45.840844 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-7zprh"]
Mar 11 02:14:45 crc kubenswrapper[4744]: I0311 02:14:45.842258 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-7zprh"
Mar 11 02:14:45 crc kubenswrapper[4744]: I0311 02:14:45.866832 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-7zprh"]
Mar 11 02:14:45 crc kubenswrapper[4744]: I0311 02:14:45.888967 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rxrlt\" (UniqueName: \"kubernetes.io/projected/66bd2f4e-6c51-4841-9e0f-c2c0deb9e0a6-kube-api-access-rxrlt\") pod \"community-operators-8hv4d\" (UID: \"66bd2f4e-6c51-4841-9e0f-c2c0deb9e0a6\") " pod="openshift-marketplace/community-operators-8hv4d"
Mar 11 02:14:45 crc kubenswrapper[4744]: I0311 02:14:45.889025 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/66bd2f4e-6c51-4841-9e0f-c2c0deb9e0a6-catalog-content\") pod \"community-operators-8hv4d\" (UID: \"66bd2f4e-6c51-4841-9e0f-c2c0deb9e0a6\") " pod="openshift-marketplace/community-operators-8hv4d"
Mar 11 02:14:45 crc kubenswrapper[4744]: I0311 02:14:45.889072 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/66bd2f4e-6c51-4841-9e0f-c2c0deb9e0a6-utilities\") pod \"community-operators-8hv4d\" (UID: \"66bd2f4e-6c51-4841-9e0f-c2c0deb9e0a6\") " pod="openshift-marketplace/community-operators-8hv4d"
Mar 11 02:14:45 crc kubenswrapper[4744]: I0311 02:14:45.889565 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/66bd2f4e-6c51-4841-9e0f-c2c0deb9e0a6-utilities\") pod \"community-operators-8hv4d\" (UID: \"66bd2f4e-6c51-4841-9e0f-c2c0deb9e0a6\") " pod="openshift-marketplace/community-operators-8hv4d"
Mar 11 02:14:45 crc kubenswrapper[4744]: I0311 02:14:45.889840 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/66bd2f4e-6c51-4841-9e0f-c2c0deb9e0a6-catalog-content\") pod \"community-operators-8hv4d\" (UID: \"66bd2f4e-6c51-4841-9e0f-c2c0deb9e0a6\") " pod="openshift-marketplace/community-operators-8hv4d"
Mar 11 02:14:45 crc kubenswrapper[4744]: I0311 02:14:45.931230 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rxrlt\" (UniqueName: \"kubernetes.io/projected/66bd2f4e-6c51-4841-9e0f-c2c0deb9e0a6-kube-api-access-rxrlt\") pod \"community-operators-8hv4d\" (UID: \"66bd2f4e-6c51-4841-9e0f-c2c0deb9e0a6\") " pod="openshift-marketplace/community-operators-8hv4d"
Mar 11 02:14:45 crc kubenswrapper[4744]: I0311 02:14:45.985960 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-8hv4d"
Mar 11 02:14:45 crc kubenswrapper[4744]: I0311 02:14:45.991095 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f36fdbf3-5b5d-446f-9aa3-9c19012b17d3-catalog-content\") pod \"certified-operators-7zprh\" (UID: \"f36fdbf3-5b5d-446f-9aa3-9c19012b17d3\") " pod="openshift-marketplace/certified-operators-7zprh"
Mar 11 02:14:45 crc kubenswrapper[4744]: I0311 02:14:45.991153 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f36fdbf3-5b5d-446f-9aa3-9c19012b17d3-utilities\") pod \"certified-operators-7zprh\" (UID: \"f36fdbf3-5b5d-446f-9aa3-9c19012b17d3\") " pod="openshift-marketplace/certified-operators-7zprh"
Mar 11 02:14:45 crc kubenswrapper[4744]: I0311 02:14:45.991226 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-66bhp\" (UniqueName: \"kubernetes.io/projected/f36fdbf3-5b5d-446f-9aa3-9c19012b17d3-kube-api-access-66bhp\") pod \"certified-operators-7zprh\" (UID: \"f36fdbf3-5b5d-446f-9aa3-9c19012b17d3\") " pod="openshift-marketplace/certified-operators-7zprh"
Mar 11 02:14:46 crc kubenswrapper[4744]: I0311 02:14:46.092946 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f36fdbf3-5b5d-446f-9aa3-9c19012b17d3-catalog-content\") pod \"certified-operators-7zprh\" (UID: \"f36fdbf3-5b5d-446f-9aa3-9c19012b17d3\") " pod="openshift-marketplace/certified-operators-7zprh"
Mar 11 02:14:46 crc kubenswrapper[4744]: I0311 02:14:46.093038 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f36fdbf3-5b5d-446f-9aa3-9c19012b17d3-utilities\") pod \"certified-operators-7zprh\" (UID: \"f36fdbf3-5b5d-446f-9aa3-9c19012b17d3\") " pod="openshift-marketplace/certified-operators-7zprh"
Mar 11 02:14:46 crc kubenswrapper[4744]: I0311 02:14:46.093089 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-66bhp\" (UniqueName: \"kubernetes.io/projected/f36fdbf3-5b5d-446f-9aa3-9c19012b17d3-kube-api-access-66bhp\") pod \"certified-operators-7zprh\" (UID: \"f36fdbf3-5b5d-446f-9aa3-9c19012b17d3\") " pod="openshift-marketplace/certified-operators-7zprh"
Mar 11 02:14:46 crc kubenswrapper[4744]: I0311 02:14:46.094957 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f36fdbf3-5b5d-446f-9aa3-9c19012b17d3-catalog-content\") pod \"certified-operators-7zprh\" (UID: \"f36fdbf3-5b5d-446f-9aa3-9c19012b17d3\") " pod="openshift-marketplace/certified-operators-7zprh"
Mar 11 02:14:46 crc kubenswrapper[4744]: I0311 02:14:46.095339 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f36fdbf3-5b5d-446f-9aa3-9c19012b17d3-utilities\") pod \"certified-operators-7zprh\" (UID: \"f36fdbf3-5b5d-446f-9aa3-9c19012b17d3\") " pod="openshift-marketplace/certified-operators-7zprh"
Mar 11 02:14:46 crc kubenswrapper[4744]: I0311 02:14:46.138310 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-66bhp\" (UniqueName: \"kubernetes.io/projected/f36fdbf3-5b5d-446f-9aa3-9c19012b17d3-kube-api-access-66bhp\") pod \"certified-operators-7zprh\" (UID: \"f36fdbf3-5b5d-446f-9aa3-9c19012b17d3\") " pod="openshift-marketplace/certified-operators-7zprh"
Mar 11 02:14:46 crc kubenswrapper[4744]: I0311 02:14:46.166952 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-7zprh"
Mar 11 02:14:46 crc kubenswrapper[4744]: I0311 02:14:46.599781 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-8hv4d"]
Mar 11 02:14:46 crc kubenswrapper[4744]: W0311 02:14:46.607378 4744 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod66bd2f4e_6c51_4841_9e0f_c2c0deb9e0a6.slice/crio-0541e6bf59532af3df7fb665f019cf9ccebd8fa93f7b2f130dfc249edf2cb1ea WatchSource:0}: Error finding container 0541e6bf59532af3df7fb665f019cf9ccebd8fa93f7b2f130dfc249edf2cb1ea: Status 404 returned error can't find the container with id 0541e6bf59532af3df7fb665f019cf9ccebd8fa93f7b2f130dfc249edf2cb1ea
Mar 11 02:14:46 crc kubenswrapper[4744]: I0311 02:14:46.679970 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-7zprh"]
Mar 11 02:14:46 crc kubenswrapper[4744]: W0311 02:14:46.692794 4744 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf36fdbf3_5b5d_446f_9aa3_9c19012b17d3.slice/crio-6f5348f354aa09e5e2a3f5f0b9cf09fcf5edf01fb31aa1800cb8f34703f960d5 WatchSource:0}: Error finding container 6f5348f354aa09e5e2a3f5f0b9cf09fcf5edf01fb31aa1800cb8f34703f960d5: Status 404 returned error can't find the container with id 6f5348f354aa09e5e2a3f5f0b9cf09fcf5edf01fb31aa1800cb8f34703f960d5
Mar 11 02:14:47 crc kubenswrapper[4744]: I0311 02:14:47.001817 4744 generic.go:334] "Generic (PLEG): container finished" podID="66bd2f4e-6c51-4841-9e0f-c2c0deb9e0a6" containerID="8b6f786c5ad196e04c8abea38784a70229dd316a2f889973127f7046de03b470" exitCode=0
Mar 11 02:14:47 crc kubenswrapper[4744]: I0311 02:14:47.001967 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-8hv4d" event={"ID":"66bd2f4e-6c51-4841-9e0f-c2c0deb9e0a6","Type":"ContainerDied","Data":"8b6f786c5ad196e04c8abea38784a70229dd316a2f889973127f7046de03b470"}
Mar 11 02:14:47 crc kubenswrapper[4744]: I0311 02:14:47.002433 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-8hv4d" event={"ID":"66bd2f4e-6c51-4841-9e0f-c2c0deb9e0a6","Type":"ContainerStarted","Data":"0541e6bf59532af3df7fb665f019cf9ccebd8fa93f7b2f130dfc249edf2cb1ea"}
Mar 11 02:14:47 crc kubenswrapper[4744]: I0311 02:14:47.005284 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-7zprh" event={"ID":"f36fdbf3-5b5d-446f-9aa3-9c19012b17d3","Type":"ContainerStarted","Data":"bb8678c58f77ddae872a53fd441b3eb229a80a521b517889e1899c9039ee0a09"}
Mar 11 02:14:47 crc kubenswrapper[4744]: I0311 02:14:47.005331 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-7zprh" event={"ID":"f36fdbf3-5b5d-446f-9aa3-9c19012b17d3","Type":"ContainerStarted","Data":"6f5348f354aa09e5e2a3f5f0b9cf09fcf5edf01fb31aa1800cb8f34703f960d5"}
Mar 11 02:14:48 crc kubenswrapper[4744]: I0311 02:14:48.016659 4744 generic.go:334] "Generic (PLEG): container finished" podID="f36fdbf3-5b5d-446f-9aa3-9c19012b17d3" containerID="bb8678c58f77ddae872a53fd441b3eb229a80a521b517889e1899c9039ee0a09" exitCode=0
Mar 11 02:14:48 crc kubenswrapper[4744]: I0311 02:14:48.016747 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-7zprh" event={"ID":"f36fdbf3-5b5d-446f-9aa3-9c19012b17d3","Type":"ContainerDied","Data":"bb8678c58f77ddae872a53fd441b3eb229a80a521b517889e1899c9039ee0a09"}
Mar 11 02:14:48 crc kubenswrapper[4744]: I0311 02:14:48.848927 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-hgbzh"]
Mar 11 02:14:48 crc kubenswrapper[4744]: I0311 02:14:48.851963 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-hgbzh"
Mar 11 02:14:48 crc kubenswrapper[4744]: I0311 02:14:48.866453 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-hgbzh"]
Mar 11 02:14:48 crc kubenswrapper[4744]: I0311 02:14:48.936130 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nzbvk\" (UniqueName: \"kubernetes.io/projected/3cc336c4-4ff9-4d41-bfce-c1d369020563-kube-api-access-nzbvk\") pod \"redhat-operators-hgbzh\" (UID: \"3cc336c4-4ff9-4d41-bfce-c1d369020563\") " pod="openshift-marketplace/redhat-operators-hgbzh"
Mar 11 02:14:48 crc kubenswrapper[4744]: I0311 02:14:48.936174 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3cc336c4-4ff9-4d41-bfce-c1d369020563-catalog-content\") pod \"redhat-operators-hgbzh\" (UID: \"3cc336c4-4ff9-4d41-bfce-c1d369020563\") " pod="openshift-marketplace/redhat-operators-hgbzh"
Mar 11 02:14:48 crc kubenswrapper[4744]: I0311 02:14:48.936240 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3cc336c4-4ff9-4d41-bfce-c1d369020563-utilities\") pod \"redhat-operators-hgbzh\" (UID: \"3cc336c4-4ff9-4d41-bfce-c1d369020563\") " pod="openshift-marketplace/redhat-operators-hgbzh"
Mar 11 02:14:49 crc kubenswrapper[4744]: I0311 02:14:49.026759 4744 generic.go:334] "Generic (PLEG): container finished" podID="66bd2f4e-6c51-4841-9e0f-c2c0deb9e0a6" containerID="905211ba1598ce9f9b73082ba0ab62e6c5ac136f81cec8e4f9c428221afb94d1" exitCode=0
Mar 11 02:14:49 crc kubenswrapper[4744]: I0311 02:14:49.026844 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-8hv4d" event={"ID":"66bd2f4e-6c51-4841-9e0f-c2c0deb9e0a6","Type":"ContainerDied","Data":"905211ba1598ce9f9b73082ba0ab62e6c5ac136f81cec8e4f9c428221afb94d1"}
Mar 11 02:14:49 crc kubenswrapper[4744]: I0311 02:14:49.030440 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-7zprh" event={"ID":"f36fdbf3-5b5d-446f-9aa3-9c19012b17d3","Type":"ContainerStarted","Data":"bb4247cd1862ce006c783ba05ff078b571691209f1c3b7e62eeaa3b6ba6f04a6"}
Mar 11 02:14:49 crc kubenswrapper[4744]: I0311 02:14:49.037286 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3cc336c4-4ff9-4d41-bfce-c1d369020563-utilities\") pod \"redhat-operators-hgbzh\" (UID: \"3cc336c4-4ff9-4d41-bfce-c1d369020563\") " pod="openshift-marketplace/redhat-operators-hgbzh"
Mar 11 02:14:49 crc kubenswrapper[4744]: I0311 02:14:49.037404 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nzbvk\" (UniqueName: \"kubernetes.io/projected/3cc336c4-4ff9-4d41-bfce-c1d369020563-kube-api-access-nzbvk\") pod \"redhat-operators-hgbzh\" (UID: \"3cc336c4-4ff9-4d41-bfce-c1d369020563\") " pod="openshift-marketplace/redhat-operators-hgbzh"
Mar 11 02:14:49 crc kubenswrapper[4744]: I0311 02:14:49.037430 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3cc336c4-4ff9-4d41-bfce-c1d369020563-catalog-content\") pod \"redhat-operators-hgbzh\" (UID: \"3cc336c4-4ff9-4d41-bfce-c1d369020563\") " pod="openshift-marketplace/redhat-operators-hgbzh"
Mar 11 02:14:49 crc kubenswrapper[4744]: I0311 02:14:49.037984 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3cc336c4-4ff9-4d41-bfce-c1d369020563-catalog-content\") pod \"redhat-operators-hgbzh\" (UID: \"3cc336c4-4ff9-4d41-bfce-c1d369020563\") " pod="openshift-marketplace/redhat-operators-hgbzh"
Mar 11 02:14:49 crc kubenswrapper[4744]: I0311 02:14:49.039397 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3cc336c4-4ff9-4d41-bfce-c1d369020563-utilities\") pod \"redhat-operators-hgbzh\" (UID: \"3cc336c4-4ff9-4d41-bfce-c1d369020563\") " pod="openshift-marketplace/redhat-operators-hgbzh"
Mar 11 02:14:49 crc kubenswrapper[4744]: I0311 02:14:49.063051 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nzbvk\" (UniqueName: \"kubernetes.io/projected/3cc336c4-4ff9-4d41-bfce-c1d369020563-kube-api-access-nzbvk\") pod \"redhat-operators-hgbzh\" (UID: \"3cc336c4-4ff9-4d41-bfce-c1d369020563\") " pod="openshift-marketplace/redhat-operators-hgbzh"
Mar 11 02:14:49 crc kubenswrapper[4744]: I0311 02:14:49.180560 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-hgbzh"
Mar 11 02:14:49 crc kubenswrapper[4744]: I0311 02:14:49.604465 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-hgbzh"]
Mar 11 02:14:49 crc kubenswrapper[4744]: W0311 02:14:49.611800 4744 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3cc336c4_4ff9_4d41_bfce_c1d369020563.slice/crio-d0c31c853916ed2e72a0410238726f7cc0d714d7deddd58fc123b135bbd9b510 WatchSource:0}: Error finding container d0c31c853916ed2e72a0410238726f7cc0d714d7deddd58fc123b135bbd9b510: Status 404 returned error can't find the container with id d0c31c853916ed2e72a0410238726f7cc0d714d7deddd58fc123b135bbd9b510
Mar 11 02:14:50 crc kubenswrapper[4744]: I0311 02:14:50.048579 4744 generic.go:334] "Generic (PLEG): container finished" podID="3cc336c4-4ff9-4d41-bfce-c1d369020563" containerID="1cf9e774eba6704b255d77e4e09e85bf3c9515c7c56046a3e5adaeae5df021fa" exitCode=0
Mar 11 02:14:50 crc kubenswrapper[4744]: I0311 02:14:50.048633 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-hgbzh" event={"ID":"3cc336c4-4ff9-4d41-bfce-c1d369020563","Type":"ContainerDied","Data":"1cf9e774eba6704b255d77e4e09e85bf3c9515c7c56046a3e5adaeae5df021fa"}
Mar 11 02:14:50 crc kubenswrapper[4744]: I0311 02:14:50.048924 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-hgbzh" event={"ID":"3cc336c4-4ff9-4d41-bfce-c1d369020563","Type":"ContainerStarted","Data":"d0c31c853916ed2e72a0410238726f7cc0d714d7deddd58fc123b135bbd9b510"}
Mar 11 02:14:50 crc kubenswrapper[4744]: I0311 02:14:50.051044 4744 generic.go:334] "Generic (PLEG): container finished" podID="f36fdbf3-5b5d-446f-9aa3-9c19012b17d3" containerID="bb4247cd1862ce006c783ba05ff078b571691209f1c3b7e62eeaa3b6ba6f04a6" exitCode=0
Mar 11 02:14:50 crc kubenswrapper[4744]: I0311 02:14:50.052891 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-7zprh" event={"ID":"f36fdbf3-5b5d-446f-9aa3-9c19012b17d3","Type":"ContainerDied","Data":"bb4247cd1862ce006c783ba05ff078b571691209f1c3b7e62eeaa3b6ba6f04a6"}
Mar 11 02:14:50 crc kubenswrapper[4744]: I0311 02:14:50.056304 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-8hv4d" event={"ID":"66bd2f4e-6c51-4841-9e0f-c2c0deb9e0a6","Type":"ContainerStarted","Data":"48e58fcdfb104a3ab85b98b8573c171277b94e061de5d8773d868e57178028f2"}
Mar 11 02:14:50 crc kubenswrapper[4744]: I0311 02:14:50.091199 4744 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-8hv4d" podStartSLOduration=2.644752926 podStartE2EDuration="5.091184182s" podCreationTimestamp="2026-03-11 02:14:45 +0000 UTC" firstStartedPulling="2026-03-11 02:14:47.006021063 +0000 UTC m=+4843.810238678" lastFinishedPulling="2026-03-11 02:14:49.452452319 +0000 UTC m=+4846.256669934" observedRunningTime="2026-03-11 02:14:50.086317402 +0000 UTC m=+4846.890535007" watchObservedRunningTime="2026-03-11 02:14:50.091184182 +0000 UTC m=+4846.895401787"
Mar 11 02:14:51 crc kubenswrapper[4744]: I0311 02:14:51.068544 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-7zprh" event={"ID":"f36fdbf3-5b5d-446f-9aa3-9c19012b17d3","Type":"ContainerStarted","Data":"36dad5d4448edaf131769f164978e9be412c466fbdafa9eac43adabec685588b"}
Mar 11 02:14:51 crc kubenswrapper[4744]: I0311 02:14:51.071181 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-hgbzh" event={"ID":"3cc336c4-4ff9-4d41-bfce-c1d369020563","Type":"ContainerStarted","Data":"7079988634d8bf1a303b37b2edf5643a0dfb929ce5ebd250b54a3127bca21c09"}
Mar 11 02:14:51 crc kubenswrapper[4744]: I0311 02:14:51.087918 4744 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-7zprh" podStartSLOduration=3.650399824 podStartE2EDuration="6.087903124s" podCreationTimestamp="2026-03-11 02:14:45 +0000 UTC" firstStartedPulling="2026-03-11 02:14:48.018912695 +0000 UTC m=+4844.823130330" lastFinishedPulling="2026-03-11 02:14:50.456416025 +0000 UTC m=+4847.260633630" observedRunningTime="2026-03-11 02:14:51.08549785 +0000 UTC m=+4847.889715455" watchObservedRunningTime="2026-03-11 02:14:51.087903124 +0000 UTC m=+4847.892120729"
Mar 11 02:14:52 crc kubenswrapper[4744]: I0311 02:14:52.080953 4744 generic.go:334] "Generic (PLEG): container finished" podID="3cc336c4-4ff9-4d41-bfce-c1d369020563" containerID="7079988634d8bf1a303b37b2edf5643a0dfb929ce5ebd250b54a3127bca21c09" exitCode=0
Mar 11 02:14:52 crc kubenswrapper[4744]: I0311 02:14:52.081012 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-hgbzh" event={"ID":"3cc336c4-4ff9-4d41-bfce-c1d369020563","Type":"ContainerDied","Data":"7079988634d8bf1a303b37b2edf5643a0dfb929ce5ebd250b54a3127bca21c09"}
Mar 11 02:14:53 crc kubenswrapper[4744]: I0311 02:14:53.090832 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-hgbzh" event={"ID":"3cc336c4-4ff9-4d41-bfce-c1d369020563","Type":"ContainerStarted","Data":"52d012fbda5d3ec1308830094f52d23a816eb2e8798786f2a2708d487561fdda"}
Mar 11 02:14:53 crc kubenswrapper[4744]: I0311 02:14:53.127032 4744 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-hgbzh" podStartSLOduration=2.610038855 podStartE2EDuration="5.127015568s" podCreationTimestamp="2026-03-11 02:14:48 +0000 UTC" firstStartedPulling="2026-03-11 02:14:50.050637161 +0000 UTC m=+4846.854854766" lastFinishedPulling="2026-03-11 02:14:52.567613834 +0000 UTC m=+4849.371831479" observedRunningTime="2026-03-11 02:14:53.123574562 +0000 UTC m=+4849.927792167" watchObservedRunningTime="2026-03-11 02:14:53.127015568 +0000 UTC m=+4849.931233173"
Mar 11 02:14:55 crc kubenswrapper[4744]: I0311 02:14:55.989282 4744 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-8hv4d"
Mar 11 02:14:55 crc kubenswrapper[4744]: I0311 02:14:55.989628 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-8hv4d"
Mar 11 02:14:56 crc kubenswrapper[4744]: I0311 02:14:56.047159 4744 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-8hv4d"
Mar 11 02:14:56 crc kubenswrapper[4744]: I0311 02:14:56.167462 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-7zprh"
Mar 11 02:14:56 crc kubenswrapper[4744]: I0311 02:14:56.167553 4744 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-7zprh"
Mar 11 02:14:56 crc kubenswrapper[4744]: I0311 02:14:56.193421 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-8hv4d"
Mar 11 02:14:56 crc kubenswrapper[4744]: I0311 02:14:56.256235 4744 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-7zprh"
Mar 11 02:14:56 crc kubenswrapper[4744]: I0311 02:14:56.832366 4744 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-8hv4d"]
Mar 11 02:14:57 crc kubenswrapper[4744]: I0311 02:14:57.199468 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-7zprh"
Mar 11 02:14:58 crc kubenswrapper[4744]: I0311 02:14:58.137962 4744 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-8hv4d" podUID="66bd2f4e-6c51-4841-9e0f-c2c0deb9e0a6" containerName="registry-server" containerID="cri-o://48e58fcdfb104a3ab85b98b8573c171277b94e061de5d8773d868e57178028f2" gracePeriod=2
Mar 11 02:14:58 crc kubenswrapper[4744]: I0311 02:14:58.633733 4744 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-7zprh"]
Mar 11 02:14:59 crc kubenswrapper[4744]: I0311 02:14:59.105126 4744 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-8hv4d"
Mar 11 02:14:59 crc kubenswrapper[4744]: I0311 02:14:59.149630 4744 generic.go:334] "Generic (PLEG): container finished" podID="66bd2f4e-6c51-4841-9e0f-c2c0deb9e0a6" containerID="48e58fcdfb104a3ab85b98b8573c171277b94e061de5d8773d868e57178028f2" exitCode=0
Mar 11 02:14:59 crc kubenswrapper[4744]: I0311 02:14:59.149729 4744 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-8hv4d"
Mar 11 02:14:59 crc kubenswrapper[4744]: I0311 02:14:59.149821 4744 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-7zprh" podUID="f36fdbf3-5b5d-446f-9aa3-9c19012b17d3" containerName="registry-server" containerID="cri-o://36dad5d4448edaf131769f164978e9be412c466fbdafa9eac43adabec685588b" gracePeriod=2
Mar 11 02:14:59 crc kubenswrapper[4744]: I0311 02:14:59.150010 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-8hv4d" event={"ID":"66bd2f4e-6c51-4841-9e0f-c2c0deb9e0a6","Type":"ContainerDied","Data":"48e58fcdfb104a3ab85b98b8573c171277b94e061de5d8773d868e57178028f2"}
Mar 11 02:14:59 crc kubenswrapper[4744]: I0311 02:14:59.155929 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-8hv4d" event={"ID":"66bd2f4e-6c51-4841-9e0f-c2c0deb9e0a6","Type":"ContainerDied","Data":"0541e6bf59532af3df7fb665f019cf9ccebd8fa93f7b2f130dfc249edf2cb1ea"}
Mar 11 02:14:59 crc kubenswrapper[4744]: I0311 02:14:59.155981 4744 scope.go:117] "RemoveContainer" containerID="48e58fcdfb104a3ab85b98b8573c171277b94e061de5d8773d868e57178028f2"
Mar 11 02:14:59 crc kubenswrapper[4744]: I0311 02:14:59.181739 4744 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-hgbzh"
Mar 11 02:14:59 crc kubenswrapper[4744]: I0311 02:14:59.181784 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-hgbzh"
Mar 11 02:14:59 crc kubenswrapper[4744]: I0311 02:14:59.182848 4744 scope.go:117] "RemoveContainer" containerID="905211ba1598ce9f9b73082ba0ab62e6c5ac136f81cec8e4f9c428221afb94d1"
Mar 11 02:14:59 crc kubenswrapper[4744]: I0311 02:14:59.205446 4744 scope.go:117] "RemoveContainer" containerID="8b6f786c5ad196e04c8abea38784a70229dd316a2f889973127f7046de03b470"
Mar 11 02:14:59 crc kubenswrapper[4744]: I0311 02:14:59.235925 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/66bd2f4e-6c51-4841-9e0f-c2c0deb9e0a6-utilities\") pod \"66bd2f4e-6c51-4841-9e0f-c2c0deb9e0a6\" (UID: \"66bd2f4e-6c51-4841-9e0f-c2c0deb9e0a6\") "
Mar 11 02:14:59 crc kubenswrapper[4744]: I0311 02:14:59.236069 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/66bd2f4e-6c51-4841-9e0f-c2c0deb9e0a6-catalog-content\") pod \"66bd2f4e-6c51-4841-9e0f-c2c0deb9e0a6\" (UID: \"66bd2f4e-6c51-4841-9e0f-c2c0deb9e0a6\") "
Mar 11 02:14:59 crc kubenswrapper[4744]: I0311 02:14:59.236104 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rxrlt\" (UniqueName: \"kubernetes.io/projected/66bd2f4e-6c51-4841-9e0f-c2c0deb9e0a6-kube-api-access-rxrlt\") pod \"66bd2f4e-6c51-4841-9e0f-c2c0deb9e0a6\" (UID: \"66bd2f4e-6c51-4841-9e0f-c2c0deb9e0a6\") "
Mar 11 02:14:59 crc kubenswrapper[4744]: I0311 02:14:59.237330 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/66bd2f4e-6c51-4841-9e0f-c2c0deb9e0a6-utilities" (OuterVolumeSpecName: "utilities") pod "66bd2f4e-6c51-4841-9e0f-c2c0deb9e0a6" (UID: "66bd2f4e-6c51-4841-9e0f-c2c0deb9e0a6"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 11 02:14:59 crc kubenswrapper[4744]: I0311 02:14:59.243610 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/66bd2f4e-6c51-4841-9e0f-c2c0deb9e0a6-kube-api-access-rxrlt" (OuterVolumeSpecName: "kube-api-access-rxrlt") pod "66bd2f4e-6c51-4841-9e0f-c2c0deb9e0a6" (UID: "66bd2f4e-6c51-4841-9e0f-c2c0deb9e0a6"). InnerVolumeSpecName "kube-api-access-rxrlt". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 11 02:14:59 crc kubenswrapper[4744]: I0311 02:14:59.295221 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/66bd2f4e-6c51-4841-9e0f-c2c0deb9e0a6-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "66bd2f4e-6c51-4841-9e0f-c2c0deb9e0a6" (UID: "66bd2f4e-6c51-4841-9e0f-c2c0deb9e0a6"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 11 02:14:59 crc kubenswrapper[4744]: I0311 02:14:59.337402 4744 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/66bd2f4e-6c51-4841-9e0f-c2c0deb9e0a6-utilities\") on node \"crc\" DevicePath \"\""
Mar 11 02:14:59 crc kubenswrapper[4744]: I0311 02:14:59.337653 4744 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/66bd2f4e-6c51-4841-9e0f-c2c0deb9e0a6-catalog-content\") on node \"crc\" DevicePath \"\""
Mar 11 02:14:59 crc kubenswrapper[4744]: I0311 02:14:59.337666 4744 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rxrlt\" (UniqueName: \"kubernetes.io/projected/66bd2f4e-6c51-4841-9e0f-c2c0deb9e0a6-kube-api-access-rxrlt\") on node \"crc\" DevicePath \"\""
Mar 11 02:14:59 crc kubenswrapper[4744]: I0311 02:14:59.393278 4744 scope.go:117] "RemoveContainer" containerID="48e58fcdfb104a3ab85b98b8573c171277b94e061de5d8773d868e57178028f2"
Mar 11 02:14:59 crc kubenswrapper[4744]: E0311 02:14:59.394065 4744 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"48e58fcdfb104a3ab85b98b8573c171277b94e061de5d8773d868e57178028f2\": container with ID starting with 48e58fcdfb104a3ab85b98b8573c171277b94e061de5d8773d868e57178028f2 not found: ID does not exist" containerID="48e58fcdfb104a3ab85b98b8573c171277b94e061de5d8773d868e57178028f2"
Mar 11 02:14:59 crc kubenswrapper[4744]: I0311 02:14:59.394150 4744 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"48e58fcdfb104a3ab85b98b8573c171277b94e061de5d8773d868e57178028f2"} err="failed to get container status \"48e58fcdfb104a3ab85b98b8573c171277b94e061de5d8773d868e57178028f2\": rpc error: code = NotFound desc = could not find container \"48e58fcdfb104a3ab85b98b8573c171277b94e061de5d8773d868e57178028f2\": container with ID starting with 48e58fcdfb104a3ab85b98b8573c171277b94e061de5d8773d868e57178028f2 not found: ID does not exist"
Mar 11 02:14:59 crc kubenswrapper[4744]: I0311 02:14:59.394204 4744 scope.go:117] "RemoveContainer" containerID="905211ba1598ce9f9b73082ba0ab62e6c5ac136f81cec8e4f9c428221afb94d1"
Mar 11 02:14:59 crc kubenswrapper[4744]: E0311 02:14:59.394760 4744 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"905211ba1598ce9f9b73082ba0ab62e6c5ac136f81cec8e4f9c428221afb94d1\": container with ID starting with 905211ba1598ce9f9b73082ba0ab62e6c5ac136f81cec8e4f9c428221afb94d1 not found: ID does not exist" containerID="905211ba1598ce9f9b73082ba0ab62e6c5ac136f81cec8e4f9c428221afb94d1"
Mar 11 02:14:59 crc kubenswrapper[4744]: I0311 02:14:59.394800 4744 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"905211ba1598ce9f9b73082ba0ab62e6c5ac136f81cec8e4f9c428221afb94d1"} err="failed to get container status \"905211ba1598ce9f9b73082ba0ab62e6c5ac136f81cec8e4f9c428221afb94d1\": rpc error: code = NotFound desc = could not find container \"905211ba1598ce9f9b73082ba0ab62e6c5ac136f81cec8e4f9c428221afb94d1\": container with ID starting with 905211ba1598ce9f9b73082ba0ab62e6c5ac136f81cec8e4f9c428221afb94d1 not found: ID does not exist"
Mar 11 02:14:59 crc kubenswrapper[4744]: I0311 02:14:59.394823 4744 scope.go:117] "RemoveContainer" containerID="8b6f786c5ad196e04c8abea38784a70229dd316a2f889973127f7046de03b470"
Mar 11 02:14:59 crc kubenswrapper[4744]: E0311 02:14:59.395166 4744 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8b6f786c5ad196e04c8abea38784a70229dd316a2f889973127f7046de03b470\": container with ID starting with 8b6f786c5ad196e04c8abea38784a70229dd316a2f889973127f7046de03b470 not found: ID does not exist" containerID="8b6f786c5ad196e04c8abea38784a70229dd316a2f889973127f7046de03b470"
Mar 11 02:14:59 crc kubenswrapper[4744]: I0311 02:14:59.395229 4744 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8b6f786c5ad196e04c8abea38784a70229dd316a2f889973127f7046de03b470"} err="failed to get container status \"8b6f786c5ad196e04c8abea38784a70229dd316a2f889973127f7046de03b470\": rpc error: code = NotFound desc = could not find container \"8b6f786c5ad196e04c8abea38784a70229dd316a2f889973127f7046de03b470\": container with ID starting with 8b6f786c5ad196e04c8abea38784a70229dd316a2f889973127f7046de03b470 not found: ID does not exist"
Mar 11 02:14:59 crc kubenswrapper[4744]: I0311 02:14:59.492943 4744 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-8hv4d"]
Mar 11 02:14:59 crc kubenswrapper[4744]: I0311 02:14:59.498931 4744 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-8hv4d"]
Mar 11 02:14:59 crc kubenswrapper[4744]: I0311 02:14:59.556972 4744 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-7zprh"
Mar 11 02:14:59 crc kubenswrapper[4744]: I0311 02:14:59.641660 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f36fdbf3-5b5d-446f-9aa3-9c19012b17d3-catalog-content\") pod \"f36fdbf3-5b5d-446f-9aa3-9c19012b17d3\" (UID: \"f36fdbf3-5b5d-446f-9aa3-9c19012b17d3\") "
Mar 11 02:14:59 crc kubenswrapper[4744]: I0311 02:14:59.641704 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f36fdbf3-5b5d-446f-9aa3-9c19012b17d3-utilities\") pod \"f36fdbf3-5b5d-446f-9aa3-9c19012b17d3\" (UID: \"f36fdbf3-5b5d-446f-9aa3-9c19012b17d3\") "
Mar 11 02:14:59 crc kubenswrapper[4744]: I0311 02:14:59.641766 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-66bhp\" (UniqueName: \"kubernetes.io/projected/f36fdbf3-5b5d-446f-9aa3-9c19012b17d3-kube-api-access-66bhp\") pod \"f36fdbf3-5b5d-446f-9aa3-9c19012b17d3\" (UID: \"f36fdbf3-5b5d-446f-9aa3-9c19012b17d3\") "
Mar 11 02:14:59 crc kubenswrapper[4744]: I0311 02:14:59.642432 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f36fdbf3-5b5d-446f-9aa3-9c19012b17d3-utilities" (OuterVolumeSpecName: "utilities") pod "f36fdbf3-5b5d-446f-9aa3-9c19012b17d3" (UID: "f36fdbf3-5b5d-446f-9aa3-9c19012b17d3"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 11 02:14:59 crc kubenswrapper[4744]: I0311 02:14:59.645458 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f36fdbf3-5b5d-446f-9aa3-9c19012b17d3-kube-api-access-66bhp" (OuterVolumeSpecName: "kube-api-access-66bhp") pod "f36fdbf3-5b5d-446f-9aa3-9c19012b17d3" (UID: "f36fdbf3-5b5d-446f-9aa3-9c19012b17d3"). InnerVolumeSpecName "kube-api-access-66bhp". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 11 02:14:59 crc kubenswrapper[4744]: I0311 02:14:59.743087 4744 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f36fdbf3-5b5d-446f-9aa3-9c19012b17d3-utilities\") on node \"crc\" DevicePath \"\""
Mar 11 02:14:59 crc kubenswrapper[4744]: I0311 02:14:59.743118 4744 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-66bhp\" (UniqueName: \"kubernetes.io/projected/f36fdbf3-5b5d-446f-9aa3-9c19012b17d3-kube-api-access-66bhp\") on node \"crc\" DevicePath \"\""
Mar 11 02:14:59 crc kubenswrapper[4744]: I0311 02:14:59.825270 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f36fdbf3-5b5d-446f-9aa3-9c19012b17d3-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "f36fdbf3-5b5d-446f-9aa3-9c19012b17d3" (UID: "f36fdbf3-5b5d-446f-9aa3-9c19012b17d3"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 11 02:14:59 crc kubenswrapper[4744]: I0311 02:14:59.844506 4744 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f36fdbf3-5b5d-446f-9aa3-9c19012b17d3-catalog-content\") on node \"crc\" DevicePath \"\""
Mar 11 02:14:59 crc kubenswrapper[4744]: I0311 02:14:59.991204 4744 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="66bd2f4e-6c51-4841-9e0f-c2c0deb9e0a6" path="/var/lib/kubelet/pods/66bd2f4e-6c51-4841-9e0f-c2c0deb9e0a6/volumes"
Mar 11 02:15:00 crc kubenswrapper[4744]: I0311 02:15:00.168026 4744 generic.go:334] "Generic (PLEG): container finished" podID="f36fdbf3-5b5d-446f-9aa3-9c19012b17d3" containerID="36dad5d4448edaf131769f164978e9be412c466fbdafa9eac43adabec685588b" exitCode=0
Mar 11 02:15:00 crc kubenswrapper[4744]: I0311 02:15:00.168133 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-7zprh" event={"ID":"f36fdbf3-5b5d-446f-9aa3-9c19012b17d3","Type":"ContainerDied","Data":"36dad5d4448edaf131769f164978e9be412c466fbdafa9eac43adabec685588b"}
Mar 11 02:15:00 crc kubenswrapper[4744]: I0311 02:15:00.168176 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-7zprh" event={"ID":"f36fdbf3-5b5d-446f-9aa3-9c19012b17d3","Type":"ContainerDied","Data":"6f5348f354aa09e5e2a3f5f0b9cf09fcf5edf01fb31aa1800cb8f34703f960d5"}
Mar 11 02:15:00 crc kubenswrapper[4744]: I0311 02:15:00.168204 4744 scope.go:117] "RemoveContainer" containerID="36dad5d4448edaf131769f164978e9be412c466fbdafa9eac43adabec685588b"
Mar 11 02:15:00 crc kubenswrapper[4744]: I0311 02:15:00.168341 4744 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-7zprh"
Mar 11 02:15:00 crc kubenswrapper[4744]: I0311 02:15:00.175904 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29553255-vqqp4"]
Mar 11 02:15:00 crc kubenswrapper[4744]: E0311 02:15:00.176611 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f36fdbf3-5b5d-446f-9aa3-9c19012b17d3" containerName="extract-content"
Mar 11 02:15:00 crc kubenswrapper[4744]: I0311 02:15:00.176643 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="f36fdbf3-5b5d-446f-9aa3-9c19012b17d3" containerName="extract-content"
Mar 11 02:15:00 crc kubenswrapper[4744]: E0311 02:15:00.176670 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f36fdbf3-5b5d-446f-9aa3-9c19012b17d3" containerName="registry-server"
Mar 11 02:15:00 crc kubenswrapper[4744]: I0311 02:15:00.176686 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="f36fdbf3-5b5d-446f-9aa3-9c19012b17d3" containerName="registry-server"
Mar 11 02:15:00 crc kubenswrapper[4744]: E0311 02:15:00.176724 4744 cpu_manager.go:410] "RemoveStaleState: removing container"
podUID="66bd2f4e-6c51-4841-9e0f-c2c0deb9e0a6" containerName="registry-server" Mar 11 02:15:00 crc kubenswrapper[4744]: I0311 02:15:00.176738 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="66bd2f4e-6c51-4841-9e0f-c2c0deb9e0a6" containerName="registry-server" Mar 11 02:15:00 crc kubenswrapper[4744]: E0311 02:15:00.176757 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="66bd2f4e-6c51-4841-9e0f-c2c0deb9e0a6" containerName="extract-utilities" Mar 11 02:15:00 crc kubenswrapper[4744]: I0311 02:15:00.176770 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="66bd2f4e-6c51-4841-9e0f-c2c0deb9e0a6" containerName="extract-utilities" Mar 11 02:15:00 crc kubenswrapper[4744]: E0311 02:15:00.176802 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="66bd2f4e-6c51-4841-9e0f-c2c0deb9e0a6" containerName="extract-content" Mar 11 02:15:00 crc kubenswrapper[4744]: I0311 02:15:00.176814 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="66bd2f4e-6c51-4841-9e0f-c2c0deb9e0a6" containerName="extract-content" Mar 11 02:15:00 crc kubenswrapper[4744]: E0311 02:15:00.176834 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f36fdbf3-5b5d-446f-9aa3-9c19012b17d3" containerName="extract-utilities" Mar 11 02:15:00 crc kubenswrapper[4744]: I0311 02:15:00.176847 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="f36fdbf3-5b5d-446f-9aa3-9c19012b17d3" containerName="extract-utilities" Mar 11 02:15:00 crc kubenswrapper[4744]: I0311 02:15:00.177124 4744 memory_manager.go:354] "RemoveStaleState removing state" podUID="66bd2f4e-6c51-4841-9e0f-c2c0deb9e0a6" containerName="registry-server" Mar 11 02:15:00 crc kubenswrapper[4744]: I0311 02:15:00.177160 4744 memory_manager.go:354] "RemoveStaleState removing state" podUID="f36fdbf3-5b5d-446f-9aa3-9c19012b17d3" containerName="registry-server" Mar 11 02:15:00 crc kubenswrapper[4744]: I0311 02:15:00.177940 4744 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29553255-vqqp4" Mar 11 02:15:00 crc kubenswrapper[4744]: I0311 02:15:00.182995 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Mar 11 02:15:00 crc kubenswrapper[4744]: I0311 02:15:00.183451 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Mar 11 02:15:00 crc kubenswrapper[4744]: I0311 02:15:00.198864 4744 scope.go:117] "RemoveContainer" containerID="bb4247cd1862ce006c783ba05ff078b571691209f1c3b7e62eeaa3b6ba6f04a6" Mar 11 02:15:00 crc kubenswrapper[4744]: I0311 02:15:00.205145 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29553255-vqqp4"] Mar 11 02:15:00 crc kubenswrapper[4744]: I0311 02:15:00.216264 4744 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-7zprh"] Mar 11 02:15:00 crc kubenswrapper[4744]: I0311 02:15:00.233248 4744 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-7zprh"] Mar 11 02:15:00 crc kubenswrapper[4744]: I0311 02:15:00.241506 4744 scope.go:117] "RemoveContainer" containerID="bb8678c58f77ddae872a53fd441b3eb229a80a521b517889e1899c9039ee0a09" Mar 11 02:15:00 crc kubenswrapper[4744]: I0311 02:15:00.247159 4744 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-hgbzh" podUID="3cc336c4-4ff9-4d41-bfce-c1d369020563" containerName="registry-server" probeResult="failure" output=< Mar 11 02:15:00 crc kubenswrapper[4744]: timeout: failed to connect service ":50051" within 1s Mar 11 02:15:00 crc kubenswrapper[4744]: > Mar 11 02:15:00 crc kubenswrapper[4744]: I0311 02:15:00.274823 4744 scope.go:117] "RemoveContainer" containerID="36dad5d4448edaf131769f164978e9be412c466fbdafa9eac43adabec685588b" Mar 
11 02:15:00 crc kubenswrapper[4744]: E0311 02:15:00.275847 4744 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"36dad5d4448edaf131769f164978e9be412c466fbdafa9eac43adabec685588b\": container with ID starting with 36dad5d4448edaf131769f164978e9be412c466fbdafa9eac43adabec685588b not found: ID does not exist" containerID="36dad5d4448edaf131769f164978e9be412c466fbdafa9eac43adabec685588b" Mar 11 02:15:00 crc kubenswrapper[4744]: I0311 02:15:00.275930 4744 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"36dad5d4448edaf131769f164978e9be412c466fbdafa9eac43adabec685588b"} err="failed to get container status \"36dad5d4448edaf131769f164978e9be412c466fbdafa9eac43adabec685588b\": rpc error: code = NotFound desc = could not find container \"36dad5d4448edaf131769f164978e9be412c466fbdafa9eac43adabec685588b\": container with ID starting with 36dad5d4448edaf131769f164978e9be412c466fbdafa9eac43adabec685588b not found: ID does not exist" Mar 11 02:15:00 crc kubenswrapper[4744]: I0311 02:15:00.276014 4744 scope.go:117] "RemoveContainer" containerID="bb4247cd1862ce006c783ba05ff078b571691209f1c3b7e62eeaa3b6ba6f04a6" Mar 11 02:15:00 crc kubenswrapper[4744]: E0311 02:15:00.276389 4744 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bb4247cd1862ce006c783ba05ff078b571691209f1c3b7e62eeaa3b6ba6f04a6\": container with ID starting with bb4247cd1862ce006c783ba05ff078b571691209f1c3b7e62eeaa3b6ba6f04a6 not found: ID does not exist" containerID="bb4247cd1862ce006c783ba05ff078b571691209f1c3b7e62eeaa3b6ba6f04a6" Mar 11 02:15:00 crc kubenswrapper[4744]: I0311 02:15:00.276468 4744 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bb4247cd1862ce006c783ba05ff078b571691209f1c3b7e62eeaa3b6ba6f04a6"} err="failed to get container status 
\"bb4247cd1862ce006c783ba05ff078b571691209f1c3b7e62eeaa3b6ba6f04a6\": rpc error: code = NotFound desc = could not find container \"bb4247cd1862ce006c783ba05ff078b571691209f1c3b7e62eeaa3b6ba6f04a6\": container with ID starting with bb4247cd1862ce006c783ba05ff078b571691209f1c3b7e62eeaa3b6ba6f04a6 not found: ID does not exist" Mar 11 02:15:00 crc kubenswrapper[4744]: I0311 02:15:00.276570 4744 scope.go:117] "RemoveContainer" containerID="bb8678c58f77ddae872a53fd441b3eb229a80a521b517889e1899c9039ee0a09" Mar 11 02:15:00 crc kubenswrapper[4744]: E0311 02:15:00.276848 4744 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bb8678c58f77ddae872a53fd441b3eb229a80a521b517889e1899c9039ee0a09\": container with ID starting with bb8678c58f77ddae872a53fd441b3eb229a80a521b517889e1899c9039ee0a09 not found: ID does not exist" containerID="bb8678c58f77ddae872a53fd441b3eb229a80a521b517889e1899c9039ee0a09" Mar 11 02:15:00 crc kubenswrapper[4744]: I0311 02:15:00.276916 4744 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bb8678c58f77ddae872a53fd441b3eb229a80a521b517889e1899c9039ee0a09"} err="failed to get container status \"bb8678c58f77ddae872a53fd441b3eb229a80a521b517889e1899c9039ee0a09\": rpc error: code = NotFound desc = could not find container \"bb8678c58f77ddae872a53fd441b3eb229a80a521b517889e1899c9039ee0a09\": container with ID starting with bb8678c58f77ddae872a53fd441b3eb229a80a521b517889e1899c9039ee0a09 not found: ID does not exist" Mar 11 02:15:00 crc kubenswrapper[4744]: I0311 02:15:00.351605 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cvzkq\" (UniqueName: \"kubernetes.io/projected/349bd0c6-8667-4bfd-a44c-9df3268b64a1-kube-api-access-cvzkq\") pod \"collect-profiles-29553255-vqqp4\" (UID: \"349bd0c6-8667-4bfd-a44c-9df3268b64a1\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29553255-vqqp4" Mar 11 02:15:00 crc kubenswrapper[4744]: I0311 02:15:00.351698 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/349bd0c6-8667-4bfd-a44c-9df3268b64a1-secret-volume\") pod \"collect-profiles-29553255-vqqp4\" (UID: \"349bd0c6-8667-4bfd-a44c-9df3268b64a1\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29553255-vqqp4" Mar 11 02:15:00 crc kubenswrapper[4744]: I0311 02:15:00.351742 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/349bd0c6-8667-4bfd-a44c-9df3268b64a1-config-volume\") pod \"collect-profiles-29553255-vqqp4\" (UID: \"349bd0c6-8667-4bfd-a44c-9df3268b64a1\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29553255-vqqp4" Mar 11 02:15:00 crc kubenswrapper[4744]: I0311 02:15:00.453483 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/349bd0c6-8667-4bfd-a44c-9df3268b64a1-config-volume\") pod \"collect-profiles-29553255-vqqp4\" (UID: \"349bd0c6-8667-4bfd-a44c-9df3268b64a1\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29553255-vqqp4" Mar 11 02:15:00 crc kubenswrapper[4744]: I0311 02:15:00.453871 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cvzkq\" (UniqueName: \"kubernetes.io/projected/349bd0c6-8667-4bfd-a44c-9df3268b64a1-kube-api-access-cvzkq\") pod \"collect-profiles-29553255-vqqp4\" (UID: \"349bd0c6-8667-4bfd-a44c-9df3268b64a1\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29553255-vqqp4" Mar 11 02:15:00 crc kubenswrapper[4744]: I0311 02:15:00.454060 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: 
\"kubernetes.io/secret/349bd0c6-8667-4bfd-a44c-9df3268b64a1-secret-volume\") pod \"collect-profiles-29553255-vqqp4\" (UID: \"349bd0c6-8667-4bfd-a44c-9df3268b64a1\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29553255-vqqp4" Mar 11 02:15:00 crc kubenswrapper[4744]: I0311 02:15:00.454676 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/349bd0c6-8667-4bfd-a44c-9df3268b64a1-config-volume\") pod \"collect-profiles-29553255-vqqp4\" (UID: \"349bd0c6-8667-4bfd-a44c-9df3268b64a1\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29553255-vqqp4" Mar 11 02:15:00 crc kubenswrapper[4744]: I0311 02:15:00.459591 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/349bd0c6-8667-4bfd-a44c-9df3268b64a1-secret-volume\") pod \"collect-profiles-29553255-vqqp4\" (UID: \"349bd0c6-8667-4bfd-a44c-9df3268b64a1\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29553255-vqqp4" Mar 11 02:15:00 crc kubenswrapper[4744]: I0311 02:15:00.485610 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cvzkq\" (UniqueName: \"kubernetes.io/projected/349bd0c6-8667-4bfd-a44c-9df3268b64a1-kube-api-access-cvzkq\") pod \"collect-profiles-29553255-vqqp4\" (UID: \"349bd0c6-8667-4bfd-a44c-9df3268b64a1\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29553255-vqqp4" Mar 11 02:15:00 crc kubenswrapper[4744]: I0311 02:15:00.508311 4744 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29553255-vqqp4" Mar 11 02:15:00 crc kubenswrapper[4744]: W0311 02:15:00.817091 4744 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod349bd0c6_8667_4bfd_a44c_9df3268b64a1.slice/crio-5840e576170bf5ca0ccce7d7f59720a0721b898e85e30147932f88259424ff46 WatchSource:0}: Error finding container 5840e576170bf5ca0ccce7d7f59720a0721b898e85e30147932f88259424ff46: Status 404 returned error can't find the container with id 5840e576170bf5ca0ccce7d7f59720a0721b898e85e30147932f88259424ff46 Mar 11 02:15:00 crc kubenswrapper[4744]: I0311 02:15:00.827829 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29553255-vqqp4"] Mar 11 02:15:01 crc kubenswrapper[4744]: I0311 02:15:01.181257 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29553255-vqqp4" event={"ID":"349bd0c6-8667-4bfd-a44c-9df3268b64a1","Type":"ContainerStarted","Data":"3d5c33723a52527c6f3bb3ddd0aa347b95e39db44276533af657120fcffa209e"} Mar 11 02:15:01 crc kubenswrapper[4744]: I0311 02:15:01.181307 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29553255-vqqp4" event={"ID":"349bd0c6-8667-4bfd-a44c-9df3268b64a1","Type":"ContainerStarted","Data":"5840e576170bf5ca0ccce7d7f59720a0721b898e85e30147932f88259424ff46"} Mar 11 02:15:01 crc kubenswrapper[4744]: I0311 02:15:01.206945 4744 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29553255-vqqp4" podStartSLOduration=1.206926362 podStartE2EDuration="1.206926362s" podCreationTimestamp="2026-03-11 02:15:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-11 
02:15:01.201088462 +0000 UTC m=+4858.005306087" watchObservedRunningTime="2026-03-11 02:15:01.206926362 +0000 UTC m=+4858.011143987" Mar 11 02:15:01 crc kubenswrapper[4744]: I0311 02:15:01.983783 4744 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f36fdbf3-5b5d-446f-9aa3-9c19012b17d3" path="/var/lib/kubelet/pods/f36fdbf3-5b5d-446f-9aa3-9c19012b17d3/volumes" Mar 11 02:15:02 crc kubenswrapper[4744]: I0311 02:15:02.356156 4744 generic.go:334] "Generic (PLEG): container finished" podID="349bd0c6-8667-4bfd-a44c-9df3268b64a1" containerID="3d5c33723a52527c6f3bb3ddd0aa347b95e39db44276533af657120fcffa209e" exitCode=0 Mar 11 02:15:02 crc kubenswrapper[4744]: I0311 02:15:02.356200 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29553255-vqqp4" event={"ID":"349bd0c6-8667-4bfd-a44c-9df3268b64a1","Type":"ContainerDied","Data":"3d5c33723a52527c6f3bb3ddd0aa347b95e39db44276533af657120fcffa209e"} Mar 11 02:15:03 crc kubenswrapper[4744]: I0311 02:15:03.716987 4744 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29553255-vqqp4" Mar 11 02:15:03 crc kubenswrapper[4744]: I0311 02:15:03.749461 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/349bd0c6-8667-4bfd-a44c-9df3268b64a1-config-volume\") pod \"349bd0c6-8667-4bfd-a44c-9df3268b64a1\" (UID: \"349bd0c6-8667-4bfd-a44c-9df3268b64a1\") " Mar 11 02:15:03 crc kubenswrapper[4744]: I0311 02:15:03.749581 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cvzkq\" (UniqueName: \"kubernetes.io/projected/349bd0c6-8667-4bfd-a44c-9df3268b64a1-kube-api-access-cvzkq\") pod \"349bd0c6-8667-4bfd-a44c-9df3268b64a1\" (UID: \"349bd0c6-8667-4bfd-a44c-9df3268b64a1\") " Mar 11 02:15:03 crc kubenswrapper[4744]: I0311 02:15:03.749665 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/349bd0c6-8667-4bfd-a44c-9df3268b64a1-secret-volume\") pod \"349bd0c6-8667-4bfd-a44c-9df3268b64a1\" (UID: \"349bd0c6-8667-4bfd-a44c-9df3268b64a1\") " Mar 11 02:15:03 crc kubenswrapper[4744]: I0311 02:15:03.750579 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/349bd0c6-8667-4bfd-a44c-9df3268b64a1-config-volume" (OuterVolumeSpecName: "config-volume") pod "349bd0c6-8667-4bfd-a44c-9df3268b64a1" (UID: "349bd0c6-8667-4bfd-a44c-9df3268b64a1"). InnerVolumeSpecName "config-volume". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 11 02:15:03 crc kubenswrapper[4744]: I0311 02:15:03.750861 4744 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/349bd0c6-8667-4bfd-a44c-9df3268b64a1-config-volume\") on node \"crc\" DevicePath \"\"" Mar 11 02:15:03 crc kubenswrapper[4744]: I0311 02:15:03.758736 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/349bd0c6-8667-4bfd-a44c-9df3268b64a1-kube-api-access-cvzkq" (OuterVolumeSpecName: "kube-api-access-cvzkq") pod "349bd0c6-8667-4bfd-a44c-9df3268b64a1" (UID: "349bd0c6-8667-4bfd-a44c-9df3268b64a1"). InnerVolumeSpecName "kube-api-access-cvzkq". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 02:15:03 crc kubenswrapper[4744]: I0311 02:15:03.758748 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/349bd0c6-8667-4bfd-a44c-9df3268b64a1-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "349bd0c6-8667-4bfd-a44c-9df3268b64a1" (UID: "349bd0c6-8667-4bfd-a44c-9df3268b64a1"). InnerVolumeSpecName "secret-volume". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 02:15:03 crc kubenswrapper[4744]: I0311 02:15:03.852616 4744 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/349bd0c6-8667-4bfd-a44c-9df3268b64a1-secret-volume\") on node \"crc\" DevicePath \"\"" Mar 11 02:15:03 crc kubenswrapper[4744]: I0311 02:15:03.852723 4744 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cvzkq\" (UniqueName: \"kubernetes.io/projected/349bd0c6-8667-4bfd-a44c-9df3268b64a1-kube-api-access-cvzkq\") on node \"crc\" DevicePath \"\"" Mar 11 02:15:04 crc kubenswrapper[4744]: I0311 02:15:04.294957 4744 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29553210-6jzxm"] Mar 11 02:15:04 crc kubenswrapper[4744]: I0311 02:15:04.305318 4744 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29553210-6jzxm"] Mar 11 02:15:04 crc kubenswrapper[4744]: I0311 02:15:04.375871 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29553255-vqqp4" event={"ID":"349bd0c6-8667-4bfd-a44c-9df3268b64a1","Type":"ContainerDied","Data":"5840e576170bf5ca0ccce7d7f59720a0721b898e85e30147932f88259424ff46"} Mar 11 02:15:04 crc kubenswrapper[4744]: I0311 02:15:04.375975 4744 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5840e576170bf5ca0ccce7d7f59720a0721b898e85e30147932f88259424ff46" Mar 11 02:15:04 crc kubenswrapper[4744]: I0311 02:15:04.375912 4744 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29553255-vqqp4" Mar 11 02:15:05 crc kubenswrapper[4744]: I0311 02:15:05.991628 4744 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="85fad93b-fbcd-4de5-a08a-50d81b03f1c3" path="/var/lib/kubelet/pods/85fad93b-fbcd-4de5-a08a-50d81b03f1c3/volumes" Mar 11 02:15:09 crc kubenswrapper[4744]: I0311 02:15:09.536294 4744 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-hgbzh" Mar 11 02:15:09 crc kubenswrapper[4744]: I0311 02:15:09.609840 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-hgbzh" Mar 11 02:15:09 crc kubenswrapper[4744]: I0311 02:15:09.789980 4744 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-hgbzh"] Mar 11 02:15:11 crc kubenswrapper[4744]: I0311 02:15:11.453321 4744 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-hgbzh" podUID="3cc336c4-4ff9-4d41-bfce-c1d369020563" containerName="registry-server" containerID="cri-o://52d012fbda5d3ec1308830094f52d23a816eb2e8798786f2a2708d487561fdda" gracePeriod=2 Mar 11 02:15:11 crc kubenswrapper[4744]: I0311 02:15:11.930092 4744 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-hgbzh" Mar 11 02:15:11 crc kubenswrapper[4744]: I0311 02:15:11.974251 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3cc336c4-4ff9-4d41-bfce-c1d369020563-catalog-content\") pod \"3cc336c4-4ff9-4d41-bfce-c1d369020563\" (UID: \"3cc336c4-4ff9-4d41-bfce-c1d369020563\") " Mar 11 02:15:11 crc kubenswrapper[4744]: I0311 02:15:11.975047 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3cc336c4-4ff9-4d41-bfce-c1d369020563-utilities\") pod \"3cc336c4-4ff9-4d41-bfce-c1d369020563\" (UID: \"3cc336c4-4ff9-4d41-bfce-c1d369020563\") " Mar 11 02:15:11 crc kubenswrapper[4744]: I0311 02:15:11.975155 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nzbvk\" (UniqueName: \"kubernetes.io/projected/3cc336c4-4ff9-4d41-bfce-c1d369020563-kube-api-access-nzbvk\") pod \"3cc336c4-4ff9-4d41-bfce-c1d369020563\" (UID: \"3cc336c4-4ff9-4d41-bfce-c1d369020563\") " Mar 11 02:15:11 crc kubenswrapper[4744]: I0311 02:15:11.981490 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3cc336c4-4ff9-4d41-bfce-c1d369020563-utilities" (OuterVolumeSpecName: "utilities") pod "3cc336c4-4ff9-4d41-bfce-c1d369020563" (UID: "3cc336c4-4ff9-4d41-bfce-c1d369020563"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 11 02:15:12 crc kubenswrapper[4744]: I0311 02:15:12.075374 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3cc336c4-4ff9-4d41-bfce-c1d369020563-kube-api-access-nzbvk" (OuterVolumeSpecName: "kube-api-access-nzbvk") pod "3cc336c4-4ff9-4d41-bfce-c1d369020563" (UID: "3cc336c4-4ff9-4d41-bfce-c1d369020563"). InnerVolumeSpecName "kube-api-access-nzbvk". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 02:15:12 crc kubenswrapper[4744]: I0311 02:15:12.077780 4744 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3cc336c4-4ff9-4d41-bfce-c1d369020563-utilities\") on node \"crc\" DevicePath \"\"" Mar 11 02:15:12 crc kubenswrapper[4744]: I0311 02:15:12.077872 4744 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nzbvk\" (UniqueName: \"kubernetes.io/projected/3cc336c4-4ff9-4d41-bfce-c1d369020563-kube-api-access-nzbvk\") on node \"crc\" DevicePath \"\"" Mar 11 02:15:12 crc kubenswrapper[4744]: I0311 02:15:12.167697 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3cc336c4-4ff9-4d41-bfce-c1d369020563-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "3cc336c4-4ff9-4d41-bfce-c1d369020563" (UID: "3cc336c4-4ff9-4d41-bfce-c1d369020563"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 11 02:15:12 crc kubenswrapper[4744]: I0311 02:15:12.179628 4744 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3cc336c4-4ff9-4d41-bfce-c1d369020563-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 11 02:15:12 crc kubenswrapper[4744]: I0311 02:15:12.464176 4744 generic.go:334] "Generic (PLEG): container finished" podID="3cc336c4-4ff9-4d41-bfce-c1d369020563" containerID="52d012fbda5d3ec1308830094f52d23a816eb2e8798786f2a2708d487561fdda" exitCode=0 Mar 11 02:15:12 crc kubenswrapper[4744]: I0311 02:15:12.464233 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-hgbzh" event={"ID":"3cc336c4-4ff9-4d41-bfce-c1d369020563","Type":"ContainerDied","Data":"52d012fbda5d3ec1308830094f52d23a816eb2e8798786f2a2708d487561fdda"} Mar 11 02:15:12 crc kubenswrapper[4744]: I0311 02:15:12.464269 4744 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openshift-marketplace/redhat-operators-hgbzh" event={"ID":"3cc336c4-4ff9-4d41-bfce-c1d369020563","Type":"ContainerDied","Data":"d0c31c853916ed2e72a0410238726f7cc0d714d7deddd58fc123b135bbd9b510"} Mar 11 02:15:12 crc kubenswrapper[4744]: I0311 02:15:12.464274 4744 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-hgbzh" Mar 11 02:15:12 crc kubenswrapper[4744]: I0311 02:15:12.464291 4744 scope.go:117] "RemoveContainer" containerID="52d012fbda5d3ec1308830094f52d23a816eb2e8798786f2a2708d487561fdda" Mar 11 02:15:12 crc kubenswrapper[4744]: I0311 02:15:12.497813 4744 scope.go:117] "RemoveContainer" containerID="7079988634d8bf1a303b37b2edf5643a0dfb929ce5ebd250b54a3127bca21c09" Mar 11 02:15:12 crc kubenswrapper[4744]: I0311 02:15:12.515773 4744 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-hgbzh"] Mar 11 02:15:12 crc kubenswrapper[4744]: I0311 02:15:12.531931 4744 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-hgbzh"] Mar 11 02:15:12 crc kubenswrapper[4744]: I0311 02:15:12.538665 4744 scope.go:117] "RemoveContainer" containerID="1cf9e774eba6704b255d77e4e09e85bf3c9515c7c56046a3e5adaeae5df021fa" Mar 11 02:15:12 crc kubenswrapper[4744]: I0311 02:15:12.574779 4744 scope.go:117] "RemoveContainer" containerID="52d012fbda5d3ec1308830094f52d23a816eb2e8798786f2a2708d487561fdda" Mar 11 02:15:12 crc kubenswrapper[4744]: E0311 02:15:12.575754 4744 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"52d012fbda5d3ec1308830094f52d23a816eb2e8798786f2a2708d487561fdda\": container with ID starting with 52d012fbda5d3ec1308830094f52d23a816eb2e8798786f2a2708d487561fdda not found: ID does not exist" containerID="52d012fbda5d3ec1308830094f52d23a816eb2e8798786f2a2708d487561fdda" Mar 11 02:15:12 crc kubenswrapper[4744]: I0311 02:15:12.575826 4744 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"52d012fbda5d3ec1308830094f52d23a816eb2e8798786f2a2708d487561fdda"} err="failed to get container status \"52d012fbda5d3ec1308830094f52d23a816eb2e8798786f2a2708d487561fdda\": rpc error: code = NotFound desc = could not find container \"52d012fbda5d3ec1308830094f52d23a816eb2e8798786f2a2708d487561fdda\": container with ID starting with 52d012fbda5d3ec1308830094f52d23a816eb2e8798786f2a2708d487561fdda not found: ID does not exist" Mar 11 02:15:12 crc kubenswrapper[4744]: I0311 02:15:12.575876 4744 scope.go:117] "RemoveContainer" containerID="7079988634d8bf1a303b37b2edf5643a0dfb929ce5ebd250b54a3127bca21c09" Mar 11 02:15:12 crc kubenswrapper[4744]: E0311 02:15:12.576546 4744 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7079988634d8bf1a303b37b2edf5643a0dfb929ce5ebd250b54a3127bca21c09\": container with ID starting with 7079988634d8bf1a303b37b2edf5643a0dfb929ce5ebd250b54a3127bca21c09 not found: ID does not exist" containerID="7079988634d8bf1a303b37b2edf5643a0dfb929ce5ebd250b54a3127bca21c09" Mar 11 02:15:12 crc kubenswrapper[4744]: I0311 02:15:12.576646 4744 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7079988634d8bf1a303b37b2edf5643a0dfb929ce5ebd250b54a3127bca21c09"} err="failed to get container status \"7079988634d8bf1a303b37b2edf5643a0dfb929ce5ebd250b54a3127bca21c09\": rpc error: code = NotFound desc = could not find container \"7079988634d8bf1a303b37b2edf5643a0dfb929ce5ebd250b54a3127bca21c09\": container with ID starting with 7079988634d8bf1a303b37b2edf5643a0dfb929ce5ebd250b54a3127bca21c09 not found: ID does not exist" Mar 11 02:15:12 crc kubenswrapper[4744]: I0311 02:15:12.576686 4744 scope.go:117] "RemoveContainer" containerID="1cf9e774eba6704b255d77e4e09e85bf3c9515c7c56046a3e5adaeae5df021fa" Mar 11 02:15:12 crc kubenswrapper[4744]: E0311 
02:15:12.577075 4744 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1cf9e774eba6704b255d77e4e09e85bf3c9515c7c56046a3e5adaeae5df021fa\": container with ID starting with 1cf9e774eba6704b255d77e4e09e85bf3c9515c7c56046a3e5adaeae5df021fa not found: ID does not exist" containerID="1cf9e774eba6704b255d77e4e09e85bf3c9515c7c56046a3e5adaeae5df021fa" Mar 11 02:15:12 crc kubenswrapper[4744]: I0311 02:15:12.577120 4744 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1cf9e774eba6704b255d77e4e09e85bf3c9515c7c56046a3e5adaeae5df021fa"} err="failed to get container status \"1cf9e774eba6704b255d77e4e09e85bf3c9515c7c56046a3e5adaeae5df021fa\": rpc error: code = NotFound desc = could not find container \"1cf9e774eba6704b255d77e4e09e85bf3c9515c7c56046a3e5adaeae5df021fa\": container with ID starting with 1cf9e774eba6704b255d77e4e09e85bf3c9515c7c56046a3e5adaeae5df021fa not found: ID does not exist" Mar 11 02:15:13 crc kubenswrapper[4744]: I0311 02:15:13.990784 4744 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3cc336c4-4ff9-4d41-bfce-c1d369020563" path="/var/lib/kubelet/pods/3cc336c4-4ff9-4d41-bfce-c1d369020563/volumes" Mar 11 02:15:14 crc kubenswrapper[4744]: I0311 02:15:14.900718 4744 scope.go:117] "RemoveContainer" containerID="68f772e4c6ce843add7dac55131b742ad8eb2c2ab6bfafc3af5b4f00ec5575e1" Mar 11 02:15:45 crc kubenswrapper[4744]: I0311 02:15:45.818706 4744 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["crc-storage/crc-storage-crc-lkdth"] Mar 11 02:15:45 crc kubenswrapper[4744]: I0311 02:15:45.829620 4744 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["crc-storage/crc-storage-crc-lkdth"] Mar 11 02:15:45 crc kubenswrapper[4744]: I0311 02:15:45.948333 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["crc-storage/crc-storage-crc-gwzv2"] Mar 11 02:15:45 crc kubenswrapper[4744]: E0311 02:15:45.948831 4744 
cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3cc336c4-4ff9-4d41-bfce-c1d369020563" containerName="registry-server" Mar 11 02:15:45 crc kubenswrapper[4744]: I0311 02:15:45.948855 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="3cc336c4-4ff9-4d41-bfce-c1d369020563" containerName="registry-server" Mar 11 02:15:45 crc kubenswrapper[4744]: E0311 02:15:45.948875 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3cc336c4-4ff9-4d41-bfce-c1d369020563" containerName="extract-utilities" Mar 11 02:15:45 crc kubenswrapper[4744]: I0311 02:15:45.948889 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="3cc336c4-4ff9-4d41-bfce-c1d369020563" containerName="extract-utilities" Mar 11 02:15:45 crc kubenswrapper[4744]: E0311 02:15:45.948907 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3cc336c4-4ff9-4d41-bfce-c1d369020563" containerName="extract-content" Mar 11 02:15:45 crc kubenswrapper[4744]: I0311 02:15:45.948919 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="3cc336c4-4ff9-4d41-bfce-c1d369020563" containerName="extract-content" Mar 11 02:15:45 crc kubenswrapper[4744]: E0311 02:15:45.948955 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="349bd0c6-8667-4bfd-a44c-9df3268b64a1" containerName="collect-profiles" Mar 11 02:15:45 crc kubenswrapper[4744]: I0311 02:15:45.948967 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="349bd0c6-8667-4bfd-a44c-9df3268b64a1" containerName="collect-profiles" Mar 11 02:15:45 crc kubenswrapper[4744]: I0311 02:15:45.949225 4744 memory_manager.go:354] "RemoveStaleState removing state" podUID="349bd0c6-8667-4bfd-a44c-9df3268b64a1" containerName="collect-profiles" Mar 11 02:15:45 crc kubenswrapper[4744]: I0311 02:15:45.949257 4744 memory_manager.go:354] "RemoveStaleState removing state" podUID="3cc336c4-4ff9-4d41-bfce-c1d369020563" containerName="registry-server" Mar 11 02:15:45 crc kubenswrapper[4744]: I0311 02:15:45.949987 
4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="crc-storage/crc-storage-crc-gwzv2" Mar 11 02:15:45 crc kubenswrapper[4744]: I0311 02:15:45.952790 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"crc-storage"/"openshift-service-ca.crt" Mar 11 02:15:45 crc kubenswrapper[4744]: I0311 02:15:45.953379 4744 reflector.go:368] Caches populated for *v1.Secret from object-"crc-storage"/"crc-storage-dockercfg-9g4m8" Mar 11 02:15:45 crc kubenswrapper[4744]: I0311 02:15:45.953603 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"crc-storage"/"kube-root-ca.crt" Mar 11 02:15:45 crc kubenswrapper[4744]: I0311 02:15:45.954354 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"crc-storage"/"crc-storage" Mar 11 02:15:45 crc kubenswrapper[4744]: I0311 02:15:45.957976 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["crc-storage/crc-storage-crc-gwzv2"] Mar 11 02:15:45 crc kubenswrapper[4744]: I0311 02:15:45.982632 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w4lhr\" (UniqueName: \"kubernetes.io/projected/1dc1a5a3-94a8-41b5-aa23-05a8cb4d4039-kube-api-access-w4lhr\") pod \"crc-storage-crc-gwzv2\" (UID: \"1dc1a5a3-94a8-41b5-aa23-05a8cb4d4039\") " pod="crc-storage/crc-storage-crc-gwzv2" Mar 11 02:15:45 crc kubenswrapper[4744]: I0311 02:15:45.982787 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/1dc1a5a3-94a8-41b5-aa23-05a8cb4d4039-node-mnt\") pod \"crc-storage-crc-gwzv2\" (UID: \"1dc1a5a3-94a8-41b5-aa23-05a8cb4d4039\") " pod="crc-storage/crc-storage-crc-gwzv2" Mar 11 02:15:45 crc kubenswrapper[4744]: I0311 02:15:45.982836 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"crc-storage\" (UniqueName: 
\"kubernetes.io/configmap/1dc1a5a3-94a8-41b5-aa23-05a8cb4d4039-crc-storage\") pod \"crc-storage-crc-gwzv2\" (UID: \"1dc1a5a3-94a8-41b5-aa23-05a8cb4d4039\") " pod="crc-storage/crc-storage-crc-gwzv2" Mar 11 02:15:45 crc kubenswrapper[4744]: I0311 02:15:45.991883 4744 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9829f27b-c482-450d-8e09-231b8b9943bc" path="/var/lib/kubelet/pods/9829f27b-c482-450d-8e09-231b8b9943bc/volumes" Mar 11 02:15:46 crc kubenswrapper[4744]: I0311 02:15:46.090620 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/1dc1a5a3-94a8-41b5-aa23-05a8cb4d4039-node-mnt\") pod \"crc-storage-crc-gwzv2\" (UID: \"1dc1a5a3-94a8-41b5-aa23-05a8cb4d4039\") " pod="crc-storage/crc-storage-crc-gwzv2" Mar 11 02:15:46 crc kubenswrapper[4744]: I0311 02:15:46.090674 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/1dc1a5a3-94a8-41b5-aa23-05a8cb4d4039-crc-storage\") pod \"crc-storage-crc-gwzv2\" (UID: \"1dc1a5a3-94a8-41b5-aa23-05a8cb4d4039\") " pod="crc-storage/crc-storage-crc-gwzv2" Mar 11 02:15:46 crc kubenswrapper[4744]: I0311 02:15:46.090750 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w4lhr\" (UniqueName: \"kubernetes.io/projected/1dc1a5a3-94a8-41b5-aa23-05a8cb4d4039-kube-api-access-w4lhr\") pod \"crc-storage-crc-gwzv2\" (UID: \"1dc1a5a3-94a8-41b5-aa23-05a8cb4d4039\") " pod="crc-storage/crc-storage-crc-gwzv2" Mar 11 02:15:46 crc kubenswrapper[4744]: I0311 02:15:46.092028 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/1dc1a5a3-94a8-41b5-aa23-05a8cb4d4039-crc-storage\") pod \"crc-storage-crc-gwzv2\" (UID: \"1dc1a5a3-94a8-41b5-aa23-05a8cb4d4039\") " pod="crc-storage/crc-storage-crc-gwzv2" Mar 11 02:15:46 crc kubenswrapper[4744]: I0311 
02:15:46.092102 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/1dc1a5a3-94a8-41b5-aa23-05a8cb4d4039-node-mnt\") pod \"crc-storage-crc-gwzv2\" (UID: \"1dc1a5a3-94a8-41b5-aa23-05a8cb4d4039\") " pod="crc-storage/crc-storage-crc-gwzv2" Mar 11 02:15:46 crc kubenswrapper[4744]: I0311 02:15:46.120773 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w4lhr\" (UniqueName: \"kubernetes.io/projected/1dc1a5a3-94a8-41b5-aa23-05a8cb4d4039-kube-api-access-w4lhr\") pod \"crc-storage-crc-gwzv2\" (UID: \"1dc1a5a3-94a8-41b5-aa23-05a8cb4d4039\") " pod="crc-storage/crc-storage-crc-gwzv2" Mar 11 02:15:46 crc kubenswrapper[4744]: I0311 02:15:46.284397 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="crc-storage/crc-storage-crc-gwzv2" Mar 11 02:15:46 crc kubenswrapper[4744]: I0311 02:15:46.580673 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["crc-storage/crc-storage-crc-gwzv2"] Mar 11 02:15:46 crc kubenswrapper[4744]: I0311 02:15:46.780044 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="crc-storage/crc-storage-crc-gwzv2" event={"ID":"1dc1a5a3-94a8-41b5-aa23-05a8cb4d4039","Type":"ContainerStarted","Data":"c7d4950a4686046d93247d7cb83771d9772c1c4f36935d46656c9db5ebc3472a"} Mar 11 02:15:47 crc kubenswrapper[4744]: I0311 02:15:47.804878 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="crc-storage/crc-storage-crc-gwzv2" event={"ID":"1dc1a5a3-94a8-41b5-aa23-05a8cb4d4039","Type":"ContainerStarted","Data":"0c5592d8b1416e02e0c169a4c882eed0f2828a0303926738ff99ec427bd73979"} Mar 11 02:15:47 crc kubenswrapper[4744]: I0311 02:15:47.832417 4744 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="crc-storage/crc-storage-crc-gwzv2" podStartSLOduration=2.312455652 podStartE2EDuration="2.83238943s" podCreationTimestamp="2026-03-11 02:15:45 +0000 UTC" firstStartedPulling="2026-03-11 02:15:46.592368579 
+0000 UTC m=+4903.396586184" lastFinishedPulling="2026-03-11 02:15:47.112302327 +0000 UTC m=+4903.916519962" observedRunningTime="2026-03-11 02:15:47.825100916 +0000 UTC m=+4904.629318561" watchObservedRunningTime="2026-03-11 02:15:47.83238943 +0000 UTC m=+4904.636607075" Mar 11 02:15:48 crc kubenswrapper[4744]: I0311 02:15:48.816367 4744 generic.go:334] "Generic (PLEG): container finished" podID="1dc1a5a3-94a8-41b5-aa23-05a8cb4d4039" containerID="0c5592d8b1416e02e0c169a4c882eed0f2828a0303926738ff99ec427bd73979" exitCode=0 Mar 11 02:15:48 crc kubenswrapper[4744]: I0311 02:15:48.816426 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="crc-storage/crc-storage-crc-gwzv2" event={"ID":"1dc1a5a3-94a8-41b5-aa23-05a8cb4d4039","Type":"ContainerDied","Data":"0c5592d8b1416e02e0c169a4c882eed0f2828a0303926738ff99ec427bd73979"} Mar 11 02:15:50 crc kubenswrapper[4744]: I0311 02:15:50.213740 4744 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="crc-storage/crc-storage-crc-gwzv2" Mar 11 02:15:50 crc kubenswrapper[4744]: I0311 02:15:50.256985 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w4lhr\" (UniqueName: \"kubernetes.io/projected/1dc1a5a3-94a8-41b5-aa23-05a8cb4d4039-kube-api-access-w4lhr\") pod \"1dc1a5a3-94a8-41b5-aa23-05a8cb4d4039\" (UID: \"1dc1a5a3-94a8-41b5-aa23-05a8cb4d4039\") " Mar 11 02:15:50 crc kubenswrapper[4744]: I0311 02:15:50.257037 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/1dc1a5a3-94a8-41b5-aa23-05a8cb4d4039-crc-storage\") pod \"1dc1a5a3-94a8-41b5-aa23-05a8cb4d4039\" (UID: \"1dc1a5a3-94a8-41b5-aa23-05a8cb4d4039\") " Mar 11 02:15:50 crc kubenswrapper[4744]: I0311 02:15:50.257066 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/1dc1a5a3-94a8-41b5-aa23-05a8cb4d4039-node-mnt\") pod 
\"1dc1a5a3-94a8-41b5-aa23-05a8cb4d4039\" (UID: \"1dc1a5a3-94a8-41b5-aa23-05a8cb4d4039\") " Mar 11 02:15:50 crc kubenswrapper[4744]: I0311 02:15:50.257265 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/1dc1a5a3-94a8-41b5-aa23-05a8cb4d4039-node-mnt" (OuterVolumeSpecName: "node-mnt") pod "1dc1a5a3-94a8-41b5-aa23-05a8cb4d4039" (UID: "1dc1a5a3-94a8-41b5-aa23-05a8cb4d4039"). InnerVolumeSpecName "node-mnt". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 11 02:15:50 crc kubenswrapper[4744]: I0311 02:15:50.266141 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1dc1a5a3-94a8-41b5-aa23-05a8cb4d4039-kube-api-access-w4lhr" (OuterVolumeSpecName: "kube-api-access-w4lhr") pod "1dc1a5a3-94a8-41b5-aa23-05a8cb4d4039" (UID: "1dc1a5a3-94a8-41b5-aa23-05a8cb4d4039"). InnerVolumeSpecName "kube-api-access-w4lhr". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 02:15:50 crc kubenswrapper[4744]: I0311 02:15:50.296738 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1dc1a5a3-94a8-41b5-aa23-05a8cb4d4039-crc-storage" (OuterVolumeSpecName: "crc-storage") pod "1dc1a5a3-94a8-41b5-aa23-05a8cb4d4039" (UID: "1dc1a5a3-94a8-41b5-aa23-05a8cb4d4039"). InnerVolumeSpecName "crc-storage". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 11 02:15:50 crc kubenswrapper[4744]: I0311 02:15:50.358786 4744 reconciler_common.go:293] "Volume detached for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/1dc1a5a3-94a8-41b5-aa23-05a8cb4d4039-node-mnt\") on node \"crc\" DevicePath \"\"" Mar 11 02:15:50 crc kubenswrapper[4744]: I0311 02:15:50.358845 4744 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w4lhr\" (UniqueName: \"kubernetes.io/projected/1dc1a5a3-94a8-41b5-aa23-05a8cb4d4039-kube-api-access-w4lhr\") on node \"crc\" DevicePath \"\"" Mar 11 02:15:50 crc kubenswrapper[4744]: I0311 02:15:50.358865 4744 reconciler_common.go:293] "Volume detached for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/1dc1a5a3-94a8-41b5-aa23-05a8cb4d4039-crc-storage\") on node \"crc\" DevicePath \"\"" Mar 11 02:15:50 crc kubenswrapper[4744]: I0311 02:15:50.835459 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="crc-storage/crc-storage-crc-gwzv2" event={"ID":"1dc1a5a3-94a8-41b5-aa23-05a8cb4d4039","Type":"ContainerDied","Data":"c7d4950a4686046d93247d7cb83771d9772c1c4f36935d46656c9db5ebc3472a"} Mar 11 02:15:50 crc kubenswrapper[4744]: I0311 02:15:50.835820 4744 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c7d4950a4686046d93247d7cb83771d9772c1c4f36935d46656c9db5ebc3472a" Mar 11 02:15:50 crc kubenswrapper[4744]: I0311 02:15:50.835539 4744 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="crc-storage/crc-storage-crc-gwzv2" Mar 11 02:15:52 crc kubenswrapper[4744]: I0311 02:15:52.109354 4744 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["crc-storage/crc-storage-crc-gwzv2"] Mar 11 02:15:52 crc kubenswrapper[4744]: I0311 02:15:52.118762 4744 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["crc-storage/crc-storage-crc-gwzv2"] Mar 11 02:15:52 crc kubenswrapper[4744]: I0311 02:15:52.237572 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["crc-storage/crc-storage-crc-64xm4"] Mar 11 02:15:52 crc kubenswrapper[4744]: E0311 02:15:52.237919 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1dc1a5a3-94a8-41b5-aa23-05a8cb4d4039" containerName="storage" Mar 11 02:15:52 crc kubenswrapper[4744]: I0311 02:15:52.237940 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="1dc1a5a3-94a8-41b5-aa23-05a8cb4d4039" containerName="storage" Mar 11 02:15:52 crc kubenswrapper[4744]: I0311 02:15:52.238112 4744 memory_manager.go:354] "RemoveStaleState removing state" podUID="1dc1a5a3-94a8-41b5-aa23-05a8cb4d4039" containerName="storage" Mar 11 02:15:52 crc kubenswrapper[4744]: I0311 02:15:52.238654 4744 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="crc-storage/crc-storage-crc-64xm4" Mar 11 02:15:52 crc kubenswrapper[4744]: I0311 02:15:52.241897 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"crc-storage"/"kube-root-ca.crt" Mar 11 02:15:52 crc kubenswrapper[4744]: I0311 02:15:52.241990 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"crc-storage"/"openshift-service-ca.crt" Mar 11 02:15:52 crc kubenswrapper[4744]: I0311 02:15:52.242151 4744 reflector.go:368] Caches populated for *v1.Secret from object-"crc-storage"/"crc-storage-dockercfg-9g4m8" Mar 11 02:15:52 crc kubenswrapper[4744]: I0311 02:15:52.242163 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"crc-storage"/"crc-storage" Mar 11 02:15:52 crc kubenswrapper[4744]: I0311 02:15:52.262971 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["crc-storage/crc-storage-crc-64xm4"] Mar 11 02:15:52 crc kubenswrapper[4744]: I0311 02:15:52.395418 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qlffh\" (UniqueName: \"kubernetes.io/projected/fd8ab900-4d61-4d3a-8ff3-9819058acb9a-kube-api-access-qlffh\") pod \"crc-storage-crc-64xm4\" (UID: \"fd8ab900-4d61-4d3a-8ff3-9819058acb9a\") " pod="crc-storage/crc-storage-crc-64xm4" Mar 11 02:15:52 crc kubenswrapper[4744]: I0311 02:15:52.395496 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/fd8ab900-4d61-4d3a-8ff3-9819058acb9a-crc-storage\") pod \"crc-storage-crc-64xm4\" (UID: \"fd8ab900-4d61-4d3a-8ff3-9819058acb9a\") " pod="crc-storage/crc-storage-crc-64xm4" Mar 11 02:15:52 crc kubenswrapper[4744]: I0311 02:15:52.395607 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/fd8ab900-4d61-4d3a-8ff3-9819058acb9a-node-mnt\") pod \"crc-storage-crc-64xm4\" (UID: 
\"fd8ab900-4d61-4d3a-8ff3-9819058acb9a\") " pod="crc-storage/crc-storage-crc-64xm4" Mar 11 02:15:52 crc kubenswrapper[4744]: I0311 02:15:52.497218 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qlffh\" (UniqueName: \"kubernetes.io/projected/fd8ab900-4d61-4d3a-8ff3-9819058acb9a-kube-api-access-qlffh\") pod \"crc-storage-crc-64xm4\" (UID: \"fd8ab900-4d61-4d3a-8ff3-9819058acb9a\") " pod="crc-storage/crc-storage-crc-64xm4" Mar 11 02:15:52 crc kubenswrapper[4744]: I0311 02:15:52.497299 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/fd8ab900-4d61-4d3a-8ff3-9819058acb9a-crc-storage\") pod \"crc-storage-crc-64xm4\" (UID: \"fd8ab900-4d61-4d3a-8ff3-9819058acb9a\") " pod="crc-storage/crc-storage-crc-64xm4" Mar 11 02:15:52 crc kubenswrapper[4744]: I0311 02:15:52.497387 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/fd8ab900-4d61-4d3a-8ff3-9819058acb9a-node-mnt\") pod \"crc-storage-crc-64xm4\" (UID: \"fd8ab900-4d61-4d3a-8ff3-9819058acb9a\") " pod="crc-storage/crc-storage-crc-64xm4" Mar 11 02:15:52 crc kubenswrapper[4744]: I0311 02:15:52.497773 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/fd8ab900-4d61-4d3a-8ff3-9819058acb9a-node-mnt\") pod \"crc-storage-crc-64xm4\" (UID: \"fd8ab900-4d61-4d3a-8ff3-9819058acb9a\") " pod="crc-storage/crc-storage-crc-64xm4" Mar 11 02:15:52 crc kubenswrapper[4744]: I0311 02:15:52.498615 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/fd8ab900-4d61-4d3a-8ff3-9819058acb9a-crc-storage\") pod \"crc-storage-crc-64xm4\" (UID: \"fd8ab900-4d61-4d3a-8ff3-9819058acb9a\") " pod="crc-storage/crc-storage-crc-64xm4" Mar 11 02:15:52 crc kubenswrapper[4744]: I0311 02:15:52.531926 4744 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qlffh\" (UniqueName: \"kubernetes.io/projected/fd8ab900-4d61-4d3a-8ff3-9819058acb9a-kube-api-access-qlffh\") pod \"crc-storage-crc-64xm4\" (UID: \"fd8ab900-4d61-4d3a-8ff3-9819058acb9a\") " pod="crc-storage/crc-storage-crc-64xm4" Mar 11 02:15:52 crc kubenswrapper[4744]: I0311 02:15:52.565695 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="crc-storage/crc-storage-crc-64xm4" Mar 11 02:15:52 crc kubenswrapper[4744]: I0311 02:15:52.892394 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["crc-storage/crc-storage-crc-64xm4"] Mar 11 02:15:53 crc kubenswrapper[4744]: I0311 02:15:53.874398 4744 generic.go:334] "Generic (PLEG): container finished" podID="fd8ab900-4d61-4d3a-8ff3-9819058acb9a" containerID="4bff39561bd823229ccdd5fbd5829ee11268881965c2b75faa285247a8672306" exitCode=0 Mar 11 02:15:53 crc kubenswrapper[4744]: I0311 02:15:53.874695 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="crc-storage/crc-storage-crc-64xm4" event={"ID":"fd8ab900-4d61-4d3a-8ff3-9819058acb9a","Type":"ContainerDied","Data":"4bff39561bd823229ccdd5fbd5829ee11268881965c2b75faa285247a8672306"} Mar 11 02:15:53 crc kubenswrapper[4744]: I0311 02:15:53.874718 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="crc-storage/crc-storage-crc-64xm4" event={"ID":"fd8ab900-4d61-4d3a-8ff3-9819058acb9a","Type":"ContainerStarted","Data":"22141ca6456a5514face05e4dfdba10fbf3482a82322dc0c327bbf6c4da0220f"} Mar 11 02:15:54 crc kubenswrapper[4744]: I0311 02:15:54.006235 4744 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1dc1a5a3-94a8-41b5-aa23-05a8cb4d4039" path="/var/lib/kubelet/pods/1dc1a5a3-94a8-41b5-aa23-05a8cb4d4039/volumes" Mar 11 02:15:55 crc kubenswrapper[4744]: I0311 02:15:55.342419 4744 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="crc-storage/crc-storage-crc-64xm4" Mar 11 02:15:55 crc kubenswrapper[4744]: I0311 02:15:55.453356 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/fd8ab900-4d61-4d3a-8ff3-9819058acb9a-crc-storage\") pod \"fd8ab900-4d61-4d3a-8ff3-9819058acb9a\" (UID: \"fd8ab900-4d61-4d3a-8ff3-9819058acb9a\") " Mar 11 02:15:55 crc kubenswrapper[4744]: I0311 02:15:55.453576 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qlffh\" (UniqueName: \"kubernetes.io/projected/fd8ab900-4d61-4d3a-8ff3-9819058acb9a-kube-api-access-qlffh\") pod \"fd8ab900-4d61-4d3a-8ff3-9819058acb9a\" (UID: \"fd8ab900-4d61-4d3a-8ff3-9819058acb9a\") " Mar 11 02:15:55 crc kubenswrapper[4744]: I0311 02:15:55.453659 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/fd8ab900-4d61-4d3a-8ff3-9819058acb9a-node-mnt\") pod \"fd8ab900-4d61-4d3a-8ff3-9819058acb9a\" (UID: \"fd8ab900-4d61-4d3a-8ff3-9819058acb9a\") " Mar 11 02:15:55 crc kubenswrapper[4744]: I0311 02:15:55.453832 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/fd8ab900-4d61-4d3a-8ff3-9819058acb9a-node-mnt" (OuterVolumeSpecName: "node-mnt") pod "fd8ab900-4d61-4d3a-8ff3-9819058acb9a" (UID: "fd8ab900-4d61-4d3a-8ff3-9819058acb9a"). InnerVolumeSpecName "node-mnt". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 11 02:15:55 crc kubenswrapper[4744]: I0311 02:15:55.454111 4744 reconciler_common.go:293] "Volume detached for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/fd8ab900-4d61-4d3a-8ff3-9819058acb9a-node-mnt\") on node \"crc\" DevicePath \"\"" Mar 11 02:15:55 crc kubenswrapper[4744]: I0311 02:15:55.461164 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fd8ab900-4d61-4d3a-8ff3-9819058acb9a-kube-api-access-qlffh" (OuterVolumeSpecName: "kube-api-access-qlffh") pod "fd8ab900-4d61-4d3a-8ff3-9819058acb9a" (UID: "fd8ab900-4d61-4d3a-8ff3-9819058acb9a"). InnerVolumeSpecName "kube-api-access-qlffh". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 02:15:55 crc kubenswrapper[4744]: I0311 02:15:55.483364 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fd8ab900-4d61-4d3a-8ff3-9819058acb9a-crc-storage" (OuterVolumeSpecName: "crc-storage") pod "fd8ab900-4d61-4d3a-8ff3-9819058acb9a" (UID: "fd8ab900-4d61-4d3a-8ff3-9819058acb9a"). InnerVolumeSpecName "crc-storage". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 11 02:15:55 crc kubenswrapper[4744]: I0311 02:15:55.555556 4744 reconciler_common.go:293] "Volume detached for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/fd8ab900-4d61-4d3a-8ff3-9819058acb9a-crc-storage\") on node \"crc\" DevicePath \"\"" Mar 11 02:15:55 crc kubenswrapper[4744]: I0311 02:15:55.555610 4744 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qlffh\" (UniqueName: \"kubernetes.io/projected/fd8ab900-4d61-4d3a-8ff3-9819058acb9a-kube-api-access-qlffh\") on node \"crc\" DevicePath \"\"" Mar 11 02:15:55 crc kubenswrapper[4744]: I0311 02:15:55.905606 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="crc-storage/crc-storage-crc-64xm4" event={"ID":"fd8ab900-4d61-4d3a-8ff3-9819058acb9a","Type":"ContainerDied","Data":"22141ca6456a5514face05e4dfdba10fbf3482a82322dc0c327bbf6c4da0220f"} Mar 11 02:15:55 crc kubenswrapper[4744]: I0311 02:15:55.905668 4744 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="22141ca6456a5514face05e4dfdba10fbf3482a82322dc0c327bbf6c4da0220f" Mar 11 02:15:55 crc kubenswrapper[4744]: I0311 02:15:55.905751 4744 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="crc-storage/crc-storage-crc-64xm4" Mar 11 02:16:00 crc kubenswrapper[4744]: I0311 02:16:00.160505 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29553256-bbwlt"] Mar 11 02:16:00 crc kubenswrapper[4744]: E0311 02:16:00.161576 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fd8ab900-4d61-4d3a-8ff3-9819058acb9a" containerName="storage" Mar 11 02:16:00 crc kubenswrapper[4744]: I0311 02:16:00.161597 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="fd8ab900-4d61-4d3a-8ff3-9819058acb9a" containerName="storage" Mar 11 02:16:00 crc kubenswrapper[4744]: I0311 02:16:00.161929 4744 memory_manager.go:354] "RemoveStaleState removing state" podUID="fd8ab900-4d61-4d3a-8ff3-9819058acb9a" containerName="storage" Mar 11 02:16:00 crc kubenswrapper[4744]: I0311 02:16:00.162635 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29553256-bbwlt" Mar 11 02:16:00 crc kubenswrapper[4744]: I0311 02:16:00.166353 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-rs77s" Mar 11 02:16:00 crc kubenswrapper[4744]: I0311 02:16:00.166413 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 11 02:16:00 crc kubenswrapper[4744]: I0311 02:16:00.166448 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 11 02:16:00 crc kubenswrapper[4744]: I0311 02:16:00.176421 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29553256-bbwlt"] Mar 11 02:16:00 crc kubenswrapper[4744]: I0311 02:16:00.234854 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6dl76\" (UniqueName: \"kubernetes.io/projected/05dc404b-a4cb-49a5-a9e2-450c8ad160cb-kube-api-access-6dl76\") 
pod \"auto-csr-approver-29553256-bbwlt\" (UID: \"05dc404b-a4cb-49a5-a9e2-450c8ad160cb\") " pod="openshift-infra/auto-csr-approver-29553256-bbwlt" Mar 11 02:16:00 crc kubenswrapper[4744]: I0311 02:16:00.336596 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6dl76\" (UniqueName: \"kubernetes.io/projected/05dc404b-a4cb-49a5-a9e2-450c8ad160cb-kube-api-access-6dl76\") pod \"auto-csr-approver-29553256-bbwlt\" (UID: \"05dc404b-a4cb-49a5-a9e2-450c8ad160cb\") " pod="openshift-infra/auto-csr-approver-29553256-bbwlt" Mar 11 02:16:00 crc kubenswrapper[4744]: I0311 02:16:00.364558 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6dl76\" (UniqueName: \"kubernetes.io/projected/05dc404b-a4cb-49a5-a9e2-450c8ad160cb-kube-api-access-6dl76\") pod \"auto-csr-approver-29553256-bbwlt\" (UID: \"05dc404b-a4cb-49a5-a9e2-450c8ad160cb\") " pod="openshift-infra/auto-csr-approver-29553256-bbwlt" Mar 11 02:16:00 crc kubenswrapper[4744]: I0311 02:16:00.504174 4744 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29553256-bbwlt" Mar 11 02:16:00 crc kubenswrapper[4744]: I0311 02:16:00.781822 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29553256-bbwlt"] Mar 11 02:16:01 crc kubenswrapper[4744]: I0311 02:16:01.964092 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29553256-bbwlt" event={"ID":"05dc404b-a4cb-49a5-a9e2-450c8ad160cb","Type":"ContainerStarted","Data":"facef15c2320a83824ebfcd5464698cf5f17959476fd030b24e623d6f9aa9613"} Mar 11 02:16:02 crc kubenswrapper[4744]: I0311 02:16:02.992314 4744 generic.go:334] "Generic (PLEG): container finished" podID="05dc404b-a4cb-49a5-a9e2-450c8ad160cb" containerID="ae988813cdc8239c3af7965e7641ef987de1b0eb7eb8655b2c604fcb26a9323e" exitCode=0 Mar 11 02:16:02 crc kubenswrapper[4744]: I0311 02:16:02.992665 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29553256-bbwlt" event={"ID":"05dc404b-a4cb-49a5-a9e2-450c8ad160cb","Type":"ContainerDied","Data":"ae988813cdc8239c3af7965e7641ef987de1b0eb7eb8655b2c604fcb26a9323e"} Mar 11 02:16:04 crc kubenswrapper[4744]: I0311 02:16:04.406035 4744 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29553256-bbwlt" Mar 11 02:16:04 crc kubenswrapper[4744]: I0311 02:16:04.515663 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6dl76\" (UniqueName: \"kubernetes.io/projected/05dc404b-a4cb-49a5-a9e2-450c8ad160cb-kube-api-access-6dl76\") pod \"05dc404b-a4cb-49a5-a9e2-450c8ad160cb\" (UID: \"05dc404b-a4cb-49a5-a9e2-450c8ad160cb\") " Mar 11 02:16:04 crc kubenswrapper[4744]: I0311 02:16:04.523790 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/05dc404b-a4cb-49a5-a9e2-450c8ad160cb-kube-api-access-6dl76" (OuterVolumeSpecName: "kube-api-access-6dl76") pod "05dc404b-a4cb-49a5-a9e2-450c8ad160cb" (UID: "05dc404b-a4cb-49a5-a9e2-450c8ad160cb"). InnerVolumeSpecName "kube-api-access-6dl76". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 02:16:04 crc kubenswrapper[4744]: I0311 02:16:04.617482 4744 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6dl76\" (UniqueName: \"kubernetes.io/projected/05dc404b-a4cb-49a5-a9e2-450c8ad160cb-kube-api-access-6dl76\") on node \"crc\" DevicePath \"\"" Mar 11 02:16:05 crc kubenswrapper[4744]: I0311 02:16:05.021035 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29553256-bbwlt" event={"ID":"05dc404b-a4cb-49a5-a9e2-450c8ad160cb","Type":"ContainerDied","Data":"facef15c2320a83824ebfcd5464698cf5f17959476fd030b24e623d6f9aa9613"} Mar 11 02:16:05 crc kubenswrapper[4744]: I0311 02:16:05.021399 4744 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="facef15c2320a83824ebfcd5464698cf5f17959476fd030b24e623d6f9aa9613" Mar 11 02:16:05 crc kubenswrapper[4744]: I0311 02:16:05.021474 4744 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29553256-bbwlt" Mar 11 02:16:05 crc kubenswrapper[4744]: E0311 02:16:05.203010 4744 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod05dc404b_a4cb_49a5_a9e2_450c8ad160cb.slice/crio-facef15c2320a83824ebfcd5464698cf5f17959476fd030b24e623d6f9aa9613\": RecentStats: unable to find data in memory cache]" Mar 11 02:16:05 crc kubenswrapper[4744]: I0311 02:16:05.476966 4744 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29553250-bmnkn"] Mar 11 02:16:05 crc kubenswrapper[4744]: I0311 02:16:05.514719 4744 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29553250-bmnkn"] Mar 11 02:16:05 crc kubenswrapper[4744]: I0311 02:16:05.992375 4744 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a213726e-9cd5-4470-8725-769648f3002c" path="/var/lib/kubelet/pods/a213726e-9cd5-4470-8725-769648f3002c/volumes" Mar 11 02:16:15 crc kubenswrapper[4744]: I0311 02:16:15.017150 4744 scope.go:117] "RemoveContainer" containerID="0ddbe5e315d6627cf58fa17ebf3562702027a0344dd287d5bebdda5faba1638a" Mar 11 02:16:15 crc kubenswrapper[4744]: I0311 02:16:15.096406 4744 scope.go:117] "RemoveContainer" containerID="e8e7de9821f1f2395a016ec5d88815ec81f2044234a92bc5f54ebe8a619a2f2e" Mar 11 02:16:42 crc kubenswrapper[4744]: I0311 02:16:42.408946 4744 patch_prober.go:28] interesting pod/machine-config-daemon-678nx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 11 02:16:42 crc kubenswrapper[4744]: I0311 02:16:42.409610 4744 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-678nx" 
podUID="a15dc7ac-7c34-4135-b6eb-a85122800ce9" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 11 02:17:12 crc kubenswrapper[4744]: I0311 02:17:12.417291 4744 patch_prober.go:28] interesting pod/machine-config-daemon-678nx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 11 02:17:12 crc kubenswrapper[4744]: I0311 02:17:12.419745 4744 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-678nx" podUID="a15dc7ac-7c34-4135-b6eb-a85122800ce9" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 11 02:17:42 crc kubenswrapper[4744]: I0311 02:17:42.409167 4744 patch_prober.go:28] interesting pod/machine-config-daemon-678nx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 11 02:17:42 crc kubenswrapper[4744]: I0311 02:17:42.409934 4744 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-678nx" podUID="a15dc7ac-7c34-4135-b6eb-a85122800ce9" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 11 02:17:42 crc kubenswrapper[4744]: I0311 02:17:42.410017 4744 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-678nx" Mar 11 02:17:42 crc kubenswrapper[4744]: I0311 02:17:42.410945 4744 
kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"6c2c2442692de599f07518611dfa7798a497d5f80b894d7b831bca9043e56ffc"} pod="openshift-machine-config-operator/machine-config-daemon-678nx" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 11 02:17:42 crc kubenswrapper[4744]: I0311 02:17:42.411063 4744 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-678nx" podUID="a15dc7ac-7c34-4135-b6eb-a85122800ce9" containerName="machine-config-daemon" containerID="cri-o://6c2c2442692de599f07518611dfa7798a497d5f80b894d7b831bca9043e56ffc" gracePeriod=600 Mar 11 02:17:42 crc kubenswrapper[4744]: E0311 02:17:42.544365 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-678nx_openshift-machine-config-operator(a15dc7ac-7c34-4135-b6eb-a85122800ce9)\"" pod="openshift-machine-config-operator/machine-config-daemon-678nx" podUID="a15dc7ac-7c34-4135-b6eb-a85122800ce9" Mar 11 02:17:42 crc kubenswrapper[4744]: I0311 02:17:42.925218 4744 generic.go:334] "Generic (PLEG): container finished" podID="a15dc7ac-7c34-4135-b6eb-a85122800ce9" containerID="6c2c2442692de599f07518611dfa7798a497d5f80b894d7b831bca9043e56ffc" exitCode=0 Mar 11 02:17:42 crc kubenswrapper[4744]: I0311 02:17:42.925290 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-678nx" event={"ID":"a15dc7ac-7c34-4135-b6eb-a85122800ce9","Type":"ContainerDied","Data":"6c2c2442692de599f07518611dfa7798a497d5f80b894d7b831bca9043e56ffc"} Mar 11 02:17:42 crc kubenswrapper[4744]: I0311 02:17:42.925363 4744 scope.go:117] "RemoveContainer" 
containerID="e4c2aaf09ea940efd9719ce3215dbea5518811d2ea206c3c037ab368ae850bc0" Mar 11 02:17:42 crc kubenswrapper[4744]: I0311 02:17:42.926130 4744 scope.go:117] "RemoveContainer" containerID="6c2c2442692de599f07518611dfa7798a497d5f80b894d7b831bca9043e56ffc" Mar 11 02:17:42 crc kubenswrapper[4744]: E0311 02:17:42.926690 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-678nx_openshift-machine-config-operator(a15dc7ac-7c34-4135-b6eb-a85122800ce9)\"" pod="openshift-machine-config-operator/machine-config-daemon-678nx" podUID="a15dc7ac-7c34-4135-b6eb-a85122800ce9" Mar 11 02:17:54 crc kubenswrapper[4744]: I0311 02:17:54.975197 4744 scope.go:117] "RemoveContainer" containerID="6c2c2442692de599f07518611dfa7798a497d5f80b894d7b831bca9043e56ffc" Mar 11 02:17:54 crc kubenswrapper[4744]: E0311 02:17:54.977279 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-678nx_openshift-machine-config-operator(a15dc7ac-7c34-4135-b6eb-a85122800ce9)\"" pod="openshift-machine-config-operator/machine-config-daemon-678nx" podUID="a15dc7ac-7c34-4135-b6eb-a85122800ce9" Mar 11 02:18:00 crc kubenswrapper[4744]: I0311 02:18:00.170007 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29553258-fkvdg"] Mar 11 02:18:00 crc kubenswrapper[4744]: E0311 02:18:00.171154 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="05dc404b-a4cb-49a5-a9e2-450c8ad160cb" containerName="oc" Mar 11 02:18:00 crc kubenswrapper[4744]: I0311 02:18:00.171177 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="05dc404b-a4cb-49a5-a9e2-450c8ad160cb" containerName="oc" Mar 11 02:18:00 crc 
kubenswrapper[4744]: I0311 02:18:00.171412 4744 memory_manager.go:354] "RemoveStaleState removing state" podUID="05dc404b-a4cb-49a5-a9e2-450c8ad160cb" containerName="oc" Mar 11 02:18:00 crc kubenswrapper[4744]: I0311 02:18:00.172120 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29553258-fkvdg" Mar 11 02:18:00 crc kubenswrapper[4744]: I0311 02:18:00.175728 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 11 02:18:00 crc kubenswrapper[4744]: I0311 02:18:00.175864 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 11 02:18:00 crc kubenswrapper[4744]: I0311 02:18:00.176140 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-rs77s" Mar 11 02:18:00 crc kubenswrapper[4744]: I0311 02:18:00.187179 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29553258-fkvdg"] Mar 11 02:18:00 crc kubenswrapper[4744]: I0311 02:18:00.341646 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p2bvt\" (UniqueName: \"kubernetes.io/projected/d6ca5b5c-1076-451e-965c-62cf0ba58592-kube-api-access-p2bvt\") pod \"auto-csr-approver-29553258-fkvdg\" (UID: \"d6ca5b5c-1076-451e-965c-62cf0ba58592\") " pod="openshift-infra/auto-csr-approver-29553258-fkvdg" Mar 11 02:18:00 crc kubenswrapper[4744]: I0311 02:18:00.443949 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p2bvt\" (UniqueName: \"kubernetes.io/projected/d6ca5b5c-1076-451e-965c-62cf0ba58592-kube-api-access-p2bvt\") pod \"auto-csr-approver-29553258-fkvdg\" (UID: \"d6ca5b5c-1076-451e-965c-62cf0ba58592\") " pod="openshift-infra/auto-csr-approver-29553258-fkvdg" Mar 11 02:18:00 crc kubenswrapper[4744]: I0311 02:18:00.481348 4744 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p2bvt\" (UniqueName: \"kubernetes.io/projected/d6ca5b5c-1076-451e-965c-62cf0ba58592-kube-api-access-p2bvt\") pod \"auto-csr-approver-29553258-fkvdg\" (UID: \"d6ca5b5c-1076-451e-965c-62cf0ba58592\") " pod="openshift-infra/auto-csr-approver-29553258-fkvdg" Mar 11 02:18:00 crc kubenswrapper[4744]: I0311 02:18:00.500519 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29553258-fkvdg" Mar 11 02:18:01 crc kubenswrapper[4744]: I0311 02:18:01.024963 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29553258-fkvdg"] Mar 11 02:18:01 crc kubenswrapper[4744]: I0311 02:18:01.137020 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29553258-fkvdg" event={"ID":"d6ca5b5c-1076-451e-965c-62cf0ba58592","Type":"ContainerStarted","Data":"d0ed68766f65c376708fedcb545bdc3a60d2c6bd1e5f7e8f8514019df52fe551"} Mar 11 02:18:03 crc kubenswrapper[4744]: I0311 02:18:03.159698 4744 generic.go:334] "Generic (PLEG): container finished" podID="d6ca5b5c-1076-451e-965c-62cf0ba58592" containerID="24928f6076d279cd331d498bcbc05cd86f603a9a110faedb11c70c611144b9b5" exitCode=0 Mar 11 02:18:03 crc kubenswrapper[4744]: I0311 02:18:03.159990 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29553258-fkvdg" event={"ID":"d6ca5b5c-1076-451e-965c-62cf0ba58592","Type":"ContainerDied","Data":"24928f6076d279cd331d498bcbc05cd86f603a9a110faedb11c70c611144b9b5"} Mar 11 02:18:04 crc kubenswrapper[4744]: I0311 02:18:04.015526 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-c44667757-qq98n"] Mar 11 02:18:04 crc kubenswrapper[4744]: I0311 02:18:04.027711 4744 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-c44667757-qq98n" Mar 11 02:18:04 crc kubenswrapper[4744]: I0311 02:18:04.030292 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dnsmasq-dns-dockercfg-fckrw" Mar 11 02:18:04 crc kubenswrapper[4744]: I0311 02:18:04.031491 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns" Mar 11 02:18:04 crc kubenswrapper[4744]: I0311 02:18:04.036369 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"kube-root-ca.crt" Mar 11 02:18:04 crc kubenswrapper[4744]: I0311 02:18:04.036490 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openshift-service-ca.crt" Mar 11 02:18:04 crc kubenswrapper[4744]: I0311 02:18:04.045245 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-55c76fd6b7-mzqkh"] Mar 11 02:18:04 crc kubenswrapper[4744]: I0311 02:18:04.046464 4744 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-55c76fd6b7-mzqkh" Mar 11 02:18:04 crc kubenswrapper[4744]: I0311 02:18:04.048387 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns-svc" Mar 11 02:18:04 crc kubenswrapper[4744]: I0311 02:18:04.048464 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-c44667757-qq98n"] Mar 11 02:18:04 crc kubenswrapper[4744]: I0311 02:18:04.070007 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-55c76fd6b7-mzqkh"] Mar 11 02:18:04 crc kubenswrapper[4744]: I0311 02:18:04.108176 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7xhhk\" (UniqueName: \"kubernetes.io/projected/7d4298c0-6fc4-408e-9c4e-229479486dfe-kube-api-access-7xhhk\") pod \"dnsmasq-dns-c44667757-qq98n\" (UID: \"7d4298c0-6fc4-408e-9c4e-229479486dfe\") " pod="openstack/dnsmasq-dns-c44667757-qq98n" Mar 11 02:18:04 crc kubenswrapper[4744]: I0311 02:18:04.108246 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7d4298c0-6fc4-408e-9c4e-229479486dfe-config\") pod \"dnsmasq-dns-c44667757-qq98n\" (UID: \"7d4298c0-6fc4-408e-9c4e-229479486dfe\") " pod="openstack/dnsmasq-dns-c44667757-qq98n" Mar 11 02:18:04 crc kubenswrapper[4744]: I0311 02:18:04.209540 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7d4298c0-6fc4-408e-9c4e-229479486dfe-config\") pod \"dnsmasq-dns-c44667757-qq98n\" (UID: \"7d4298c0-6fc4-408e-9c4e-229479486dfe\") " pod="openstack/dnsmasq-dns-c44667757-qq98n" Mar 11 02:18:04 crc kubenswrapper[4744]: I0311 02:18:04.209601 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c2b128c1-8690-4f5b-8d3b-b7bb2545b79c-dns-svc\") pod 
\"dnsmasq-dns-55c76fd6b7-mzqkh\" (UID: \"c2b128c1-8690-4f5b-8d3b-b7bb2545b79c\") " pod="openstack/dnsmasq-dns-55c76fd6b7-mzqkh" Mar 11 02:18:04 crc kubenswrapper[4744]: I0311 02:18:04.209642 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5tfnm\" (UniqueName: \"kubernetes.io/projected/c2b128c1-8690-4f5b-8d3b-b7bb2545b79c-kube-api-access-5tfnm\") pod \"dnsmasq-dns-55c76fd6b7-mzqkh\" (UID: \"c2b128c1-8690-4f5b-8d3b-b7bb2545b79c\") " pod="openstack/dnsmasq-dns-55c76fd6b7-mzqkh" Mar 11 02:18:04 crc kubenswrapper[4744]: I0311 02:18:04.209671 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7xhhk\" (UniqueName: \"kubernetes.io/projected/7d4298c0-6fc4-408e-9c4e-229479486dfe-kube-api-access-7xhhk\") pod \"dnsmasq-dns-c44667757-qq98n\" (UID: \"7d4298c0-6fc4-408e-9c4e-229479486dfe\") " pod="openstack/dnsmasq-dns-c44667757-qq98n" Mar 11 02:18:04 crc kubenswrapper[4744]: I0311 02:18:04.209687 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c2b128c1-8690-4f5b-8d3b-b7bb2545b79c-config\") pod \"dnsmasq-dns-55c76fd6b7-mzqkh\" (UID: \"c2b128c1-8690-4f5b-8d3b-b7bb2545b79c\") " pod="openstack/dnsmasq-dns-55c76fd6b7-mzqkh" Mar 11 02:18:04 crc kubenswrapper[4744]: I0311 02:18:04.210567 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7d4298c0-6fc4-408e-9c4e-229479486dfe-config\") pod \"dnsmasq-dns-c44667757-qq98n\" (UID: \"7d4298c0-6fc4-408e-9c4e-229479486dfe\") " pod="openstack/dnsmasq-dns-c44667757-qq98n" Mar 11 02:18:04 crc kubenswrapper[4744]: I0311 02:18:04.230486 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7xhhk\" (UniqueName: \"kubernetes.io/projected/7d4298c0-6fc4-408e-9c4e-229479486dfe-kube-api-access-7xhhk\") pod 
\"dnsmasq-dns-c44667757-qq98n\" (UID: \"7d4298c0-6fc4-408e-9c4e-229479486dfe\") " pod="openstack/dnsmasq-dns-c44667757-qq98n" Mar 11 02:18:04 crc kubenswrapper[4744]: I0311 02:18:04.311132 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5tfnm\" (UniqueName: \"kubernetes.io/projected/c2b128c1-8690-4f5b-8d3b-b7bb2545b79c-kube-api-access-5tfnm\") pod \"dnsmasq-dns-55c76fd6b7-mzqkh\" (UID: \"c2b128c1-8690-4f5b-8d3b-b7bb2545b79c\") " pod="openstack/dnsmasq-dns-55c76fd6b7-mzqkh" Mar 11 02:18:04 crc kubenswrapper[4744]: I0311 02:18:04.311188 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c2b128c1-8690-4f5b-8d3b-b7bb2545b79c-config\") pod \"dnsmasq-dns-55c76fd6b7-mzqkh\" (UID: \"c2b128c1-8690-4f5b-8d3b-b7bb2545b79c\") " pod="openstack/dnsmasq-dns-55c76fd6b7-mzqkh" Mar 11 02:18:04 crc kubenswrapper[4744]: I0311 02:18:04.311257 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c2b128c1-8690-4f5b-8d3b-b7bb2545b79c-dns-svc\") pod \"dnsmasq-dns-55c76fd6b7-mzqkh\" (UID: \"c2b128c1-8690-4f5b-8d3b-b7bb2545b79c\") " pod="openstack/dnsmasq-dns-55c76fd6b7-mzqkh" Mar 11 02:18:04 crc kubenswrapper[4744]: I0311 02:18:04.312029 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c2b128c1-8690-4f5b-8d3b-b7bb2545b79c-dns-svc\") pod \"dnsmasq-dns-55c76fd6b7-mzqkh\" (UID: \"c2b128c1-8690-4f5b-8d3b-b7bb2545b79c\") " pod="openstack/dnsmasq-dns-55c76fd6b7-mzqkh" Mar 11 02:18:04 crc kubenswrapper[4744]: I0311 02:18:04.312113 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c2b128c1-8690-4f5b-8d3b-b7bb2545b79c-config\") pod \"dnsmasq-dns-55c76fd6b7-mzqkh\" (UID: \"c2b128c1-8690-4f5b-8d3b-b7bb2545b79c\") " 
pod="openstack/dnsmasq-dns-55c76fd6b7-mzqkh" Mar 11 02:18:04 crc kubenswrapper[4744]: I0311 02:18:04.345504 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-c44667757-qq98n" Mar 11 02:18:04 crc kubenswrapper[4744]: I0311 02:18:04.349295 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5tfnm\" (UniqueName: \"kubernetes.io/projected/c2b128c1-8690-4f5b-8d3b-b7bb2545b79c-kube-api-access-5tfnm\") pod \"dnsmasq-dns-55c76fd6b7-mzqkh\" (UID: \"c2b128c1-8690-4f5b-8d3b-b7bb2545b79c\") " pod="openstack/dnsmasq-dns-55c76fd6b7-mzqkh" Mar 11 02:18:04 crc kubenswrapper[4744]: I0311 02:18:04.387275 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-55c76fd6b7-mzqkh" Mar 11 02:18:04 crc kubenswrapper[4744]: I0311 02:18:04.404791 4744 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-55c76fd6b7-mzqkh"] Mar 11 02:18:04 crc kubenswrapper[4744]: I0311 02:18:04.441149 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5fb77f9685-nm4xr"] Mar 11 02:18:04 crc kubenswrapper[4744]: I0311 02:18:04.442242 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5fb77f9685-nm4xr" Mar 11 02:18:04 crc kubenswrapper[4744]: I0311 02:18:04.464154 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5fb77f9685-nm4xr"] Mar 11 02:18:04 crc kubenswrapper[4744]: I0311 02:18:04.614063 4744 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29553258-fkvdg" Mar 11 02:18:04 crc kubenswrapper[4744]: I0311 02:18:04.615260 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rwm2m\" (UniqueName: \"kubernetes.io/projected/defdeafa-942d-4432-bb2c-c8e1176ce936-kube-api-access-rwm2m\") pod \"dnsmasq-dns-5fb77f9685-nm4xr\" (UID: \"defdeafa-942d-4432-bb2c-c8e1176ce936\") " pod="openstack/dnsmasq-dns-5fb77f9685-nm4xr" Mar 11 02:18:04 crc kubenswrapper[4744]: I0311 02:18:04.615338 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/defdeafa-942d-4432-bb2c-c8e1176ce936-dns-svc\") pod \"dnsmasq-dns-5fb77f9685-nm4xr\" (UID: \"defdeafa-942d-4432-bb2c-c8e1176ce936\") " pod="openstack/dnsmasq-dns-5fb77f9685-nm4xr" Mar 11 02:18:04 crc kubenswrapper[4744]: I0311 02:18:04.615389 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/defdeafa-942d-4432-bb2c-c8e1176ce936-config\") pod \"dnsmasq-dns-5fb77f9685-nm4xr\" (UID: \"defdeafa-942d-4432-bb2c-c8e1176ce936\") " pod="openstack/dnsmasq-dns-5fb77f9685-nm4xr" Mar 11 02:18:04 crc kubenswrapper[4744]: I0311 02:18:04.716250 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-p2bvt\" (UniqueName: \"kubernetes.io/projected/d6ca5b5c-1076-451e-965c-62cf0ba58592-kube-api-access-p2bvt\") pod \"d6ca5b5c-1076-451e-965c-62cf0ba58592\" (UID: \"d6ca5b5c-1076-451e-965c-62cf0ba58592\") " Mar 11 02:18:04 crc kubenswrapper[4744]: I0311 02:18:04.716535 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/defdeafa-942d-4432-bb2c-c8e1176ce936-dns-svc\") pod \"dnsmasq-dns-5fb77f9685-nm4xr\" (UID: \"defdeafa-942d-4432-bb2c-c8e1176ce936\") " 
pod="openstack/dnsmasq-dns-5fb77f9685-nm4xr" Mar 11 02:18:04 crc kubenswrapper[4744]: I0311 02:18:04.716611 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/defdeafa-942d-4432-bb2c-c8e1176ce936-config\") pod \"dnsmasq-dns-5fb77f9685-nm4xr\" (UID: \"defdeafa-942d-4432-bb2c-c8e1176ce936\") " pod="openstack/dnsmasq-dns-5fb77f9685-nm4xr" Mar 11 02:18:04 crc kubenswrapper[4744]: I0311 02:18:04.716642 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rwm2m\" (UniqueName: \"kubernetes.io/projected/defdeafa-942d-4432-bb2c-c8e1176ce936-kube-api-access-rwm2m\") pod \"dnsmasq-dns-5fb77f9685-nm4xr\" (UID: \"defdeafa-942d-4432-bb2c-c8e1176ce936\") " pod="openstack/dnsmasq-dns-5fb77f9685-nm4xr" Mar 11 02:18:04 crc kubenswrapper[4744]: I0311 02:18:04.718926 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/defdeafa-942d-4432-bb2c-c8e1176ce936-dns-svc\") pod \"dnsmasq-dns-5fb77f9685-nm4xr\" (UID: \"defdeafa-942d-4432-bb2c-c8e1176ce936\") " pod="openstack/dnsmasq-dns-5fb77f9685-nm4xr" Mar 11 02:18:04 crc kubenswrapper[4744]: I0311 02:18:04.719635 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/defdeafa-942d-4432-bb2c-c8e1176ce936-config\") pod \"dnsmasq-dns-5fb77f9685-nm4xr\" (UID: \"defdeafa-942d-4432-bb2c-c8e1176ce936\") " pod="openstack/dnsmasq-dns-5fb77f9685-nm4xr" Mar 11 02:18:04 crc kubenswrapper[4744]: I0311 02:18:04.722926 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d6ca5b5c-1076-451e-965c-62cf0ba58592-kube-api-access-p2bvt" (OuterVolumeSpecName: "kube-api-access-p2bvt") pod "d6ca5b5c-1076-451e-965c-62cf0ba58592" (UID: "d6ca5b5c-1076-451e-965c-62cf0ba58592"). InnerVolumeSpecName "kube-api-access-p2bvt". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 02:18:04 crc kubenswrapper[4744]: I0311 02:18:04.740313 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rwm2m\" (UniqueName: \"kubernetes.io/projected/defdeafa-942d-4432-bb2c-c8e1176ce936-kube-api-access-rwm2m\") pod \"dnsmasq-dns-5fb77f9685-nm4xr\" (UID: \"defdeafa-942d-4432-bb2c-c8e1176ce936\") " pod="openstack/dnsmasq-dns-5fb77f9685-nm4xr" Mar 11 02:18:04 crc kubenswrapper[4744]: I0311 02:18:04.757293 4744 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-55c76fd6b7-mzqkh"] Mar 11 02:18:04 crc kubenswrapper[4744]: I0311 02:18:04.803327 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5fb77f9685-nm4xr" Mar 11 02:18:04 crc kubenswrapper[4744]: I0311 02:18:04.817555 4744 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-c44667757-qq98n"] Mar 11 02:18:04 crc kubenswrapper[4744]: I0311 02:18:04.823033 4744 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-p2bvt\" (UniqueName: \"kubernetes.io/projected/d6ca5b5c-1076-451e-965c-62cf0ba58592-kube-api-access-p2bvt\") on node \"crc\" DevicePath \"\"" Mar 11 02:18:04 crc kubenswrapper[4744]: I0311 02:18:04.868571 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-ff89b6977-xqrvm"] Mar 11 02:18:04 crc kubenswrapper[4744]: E0311 02:18:04.868907 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d6ca5b5c-1076-451e-965c-62cf0ba58592" containerName="oc" Mar 11 02:18:04 crc kubenswrapper[4744]: I0311 02:18:04.868936 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="d6ca5b5c-1076-451e-965c-62cf0ba58592" containerName="oc" Mar 11 02:18:04 crc kubenswrapper[4744]: I0311 02:18:04.869062 4744 memory_manager.go:354] "RemoveStaleState removing state" podUID="d6ca5b5c-1076-451e-965c-62cf0ba58592" containerName="oc" Mar 11 02:18:04 crc 
kubenswrapper[4744]: I0311 02:18:04.869786 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-ff89b6977-xqrvm" Mar 11 02:18:04 crc kubenswrapper[4744]: I0311 02:18:04.879290 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-ff89b6977-xqrvm"] Mar 11 02:18:05 crc kubenswrapper[4744]: I0311 02:18:05.010407 4744 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-c44667757-qq98n"] Mar 11 02:18:05 crc kubenswrapper[4744]: I0311 02:18:05.025754 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8b1f9328-7210-412e-9d4d-1d1a8b6804dc-config\") pod \"dnsmasq-dns-ff89b6977-xqrvm\" (UID: \"8b1f9328-7210-412e-9d4d-1d1a8b6804dc\") " pod="openstack/dnsmasq-dns-ff89b6977-xqrvm" Mar 11 02:18:05 crc kubenswrapper[4744]: I0311 02:18:05.025811 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/8b1f9328-7210-412e-9d4d-1d1a8b6804dc-dns-svc\") pod \"dnsmasq-dns-ff89b6977-xqrvm\" (UID: \"8b1f9328-7210-412e-9d4d-1d1a8b6804dc\") " pod="openstack/dnsmasq-dns-ff89b6977-xqrvm" Mar 11 02:18:05 crc kubenswrapper[4744]: I0311 02:18:05.025840 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ctjms\" (UniqueName: \"kubernetes.io/projected/8b1f9328-7210-412e-9d4d-1d1a8b6804dc-kube-api-access-ctjms\") pod \"dnsmasq-dns-ff89b6977-xqrvm\" (UID: \"8b1f9328-7210-412e-9d4d-1d1a8b6804dc\") " pod="openstack/dnsmasq-dns-ff89b6977-xqrvm" Mar 11 02:18:05 crc kubenswrapper[4744]: I0311 02:18:05.127022 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/8b1f9328-7210-412e-9d4d-1d1a8b6804dc-dns-svc\") pod \"dnsmasq-dns-ff89b6977-xqrvm\" (UID: 
\"8b1f9328-7210-412e-9d4d-1d1a8b6804dc\") " pod="openstack/dnsmasq-dns-ff89b6977-xqrvm" Mar 11 02:18:05 crc kubenswrapper[4744]: I0311 02:18:05.127095 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ctjms\" (UniqueName: \"kubernetes.io/projected/8b1f9328-7210-412e-9d4d-1d1a8b6804dc-kube-api-access-ctjms\") pod \"dnsmasq-dns-ff89b6977-xqrvm\" (UID: \"8b1f9328-7210-412e-9d4d-1d1a8b6804dc\") " pod="openstack/dnsmasq-dns-ff89b6977-xqrvm" Mar 11 02:18:05 crc kubenswrapper[4744]: I0311 02:18:05.127745 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8b1f9328-7210-412e-9d4d-1d1a8b6804dc-config\") pod \"dnsmasq-dns-ff89b6977-xqrvm\" (UID: \"8b1f9328-7210-412e-9d4d-1d1a8b6804dc\") " pod="openstack/dnsmasq-dns-ff89b6977-xqrvm" Mar 11 02:18:05 crc kubenswrapper[4744]: I0311 02:18:05.128201 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/8b1f9328-7210-412e-9d4d-1d1a8b6804dc-dns-svc\") pod \"dnsmasq-dns-ff89b6977-xqrvm\" (UID: \"8b1f9328-7210-412e-9d4d-1d1a8b6804dc\") " pod="openstack/dnsmasq-dns-ff89b6977-xqrvm" Mar 11 02:18:05 crc kubenswrapper[4744]: I0311 02:18:05.128311 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8b1f9328-7210-412e-9d4d-1d1a8b6804dc-config\") pod \"dnsmasq-dns-ff89b6977-xqrvm\" (UID: \"8b1f9328-7210-412e-9d4d-1d1a8b6804dc\") " pod="openstack/dnsmasq-dns-ff89b6977-xqrvm" Mar 11 02:18:05 crc kubenswrapper[4744]: I0311 02:18:05.182044 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-55c76fd6b7-mzqkh" event={"ID":"c2b128c1-8690-4f5b-8d3b-b7bb2545b79c","Type":"ContainerStarted","Data":"47a69d6021af0ecbf8a8892d781791026b1c0ad42cb682bdb00ae65c01f58a7f"} Mar 11 02:18:05 crc kubenswrapper[4744]: I0311 02:18:05.183246 4744 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29553258-fkvdg" event={"ID":"d6ca5b5c-1076-451e-965c-62cf0ba58592","Type":"ContainerDied","Data":"d0ed68766f65c376708fedcb545bdc3a60d2c6bd1e5f7e8f8514019df52fe551"} Mar 11 02:18:05 crc kubenswrapper[4744]: I0311 02:18:05.183283 4744 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29553258-fkvdg" Mar 11 02:18:05 crc kubenswrapper[4744]: I0311 02:18:05.183301 4744 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d0ed68766f65c376708fedcb545bdc3a60d2c6bd1e5f7e8f8514019df52fe551" Mar 11 02:18:05 crc kubenswrapper[4744]: I0311 02:18:05.276421 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ctjms\" (UniqueName: \"kubernetes.io/projected/8b1f9328-7210-412e-9d4d-1d1a8b6804dc-kube-api-access-ctjms\") pod \"dnsmasq-dns-ff89b6977-xqrvm\" (UID: \"8b1f9328-7210-412e-9d4d-1d1a8b6804dc\") " pod="openstack/dnsmasq-dns-ff89b6977-xqrvm" Mar 11 02:18:05 crc kubenswrapper[4744]: W0311 02:18:05.278010 4744 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7d4298c0_6fc4_408e_9c4e_229479486dfe.slice/crio-f2f3e90b90a8f12d6cfc0f49f46b784cd0e3e7e9c80af440e4f4a27d3fb5aabc WatchSource:0}: Error finding container f2f3e90b90a8f12d6cfc0f49f46b784cd0e3e7e9c80af440e4f4a27d3fb5aabc: Status 404 returned error can't find the container with id f2f3e90b90a8f12d6cfc0f49f46b784cd0e3e7e9c80af440e4f4a27d3fb5aabc Mar 11 02:18:05 crc kubenswrapper[4744]: I0311 02:18:05.298560 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5fb77f9685-nm4xr"] Mar 11 02:18:05 crc kubenswrapper[4744]: I0311 02:18:05.488298 4744 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-ff89b6977-xqrvm" Mar 11 02:18:05 crc kubenswrapper[4744]: I0311 02:18:05.597717 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Mar 11 02:18:05 crc kubenswrapper[4744]: I0311 02:18:05.600174 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Mar 11 02:18:05 crc kubenswrapper[4744]: I0311 02:18:05.602288 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-cell1-svc" Mar 11 02:18:05 crc kubenswrapper[4744]: I0311 02:18:05.602671 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-default-user" Mar 11 02:18:05 crc kubenswrapper[4744]: I0311 02:18:05.602819 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-plugins-conf" Mar 11 02:18:05 crc kubenswrapper[4744]: I0311 02:18:05.602963 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-server-conf" Mar 11 02:18:05 crc kubenswrapper[4744]: I0311 02:18:05.603252 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-erlang-cookie" Mar 11 02:18:05 crc kubenswrapper[4744]: I0311 02:18:05.603424 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-config-data" Mar 11 02:18:05 crc kubenswrapper[4744]: I0311 02:18:05.603607 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-server-dockercfg-t9pqx" Mar 11 02:18:05 crc kubenswrapper[4744]: I0311 02:18:05.617817 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Mar 11 02:18:05 crc kubenswrapper[4744]: I0311 02:18:05.692801 4744 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29553252-jfhmn"] Mar 11 02:18:05 crc kubenswrapper[4744]: 
I0311 02:18:05.697633 4744 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29553252-jfhmn"] Mar 11 02:18:05 crc kubenswrapper[4744]: I0311 02:18:05.736267 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/c54e9a99-5c2c-48df-a5c0-75fb8727a328-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"c54e9a99-5c2c-48df-a5c0-75fb8727a328\") " pod="openstack/rabbitmq-cell1-server-0" Mar 11 02:18:05 crc kubenswrapper[4744]: I0311 02:18:05.736308 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/c54e9a99-5c2c-48df-a5c0-75fb8727a328-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"c54e9a99-5c2c-48df-a5c0-75fb8727a328\") " pod="openstack/rabbitmq-cell1-server-0" Mar 11 02:18:05 crc kubenswrapper[4744]: I0311 02:18:05.736342 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/c54e9a99-5c2c-48df-a5c0-75fb8727a328-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"c54e9a99-5c2c-48df-a5c0-75fb8727a328\") " pod="openstack/rabbitmq-cell1-server-0" Mar 11 02:18:05 crc kubenswrapper[4744]: I0311 02:18:05.736425 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-c1cc55ee-0f9e-40f7-a7ff-6a53cac2e2c7\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-c1cc55ee-0f9e-40f7-a7ff-6a53cac2e2c7\") pod \"rabbitmq-cell1-server-0\" (UID: \"c54e9a99-5c2c-48df-a5c0-75fb8727a328\") " pod="openstack/rabbitmq-cell1-server-0" Mar 11 02:18:05 crc kubenswrapper[4744]: I0311 02:18:05.736497 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: 
\"kubernetes.io/projected/c54e9a99-5c2c-48df-a5c0-75fb8727a328-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"c54e9a99-5c2c-48df-a5c0-75fb8727a328\") " pod="openstack/rabbitmq-cell1-server-0" Mar 11 02:18:05 crc kubenswrapper[4744]: I0311 02:18:05.736598 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kxnz8\" (UniqueName: \"kubernetes.io/projected/c54e9a99-5c2c-48df-a5c0-75fb8727a328-kube-api-access-kxnz8\") pod \"rabbitmq-cell1-server-0\" (UID: \"c54e9a99-5c2c-48df-a5c0-75fb8727a328\") " pod="openstack/rabbitmq-cell1-server-0" Mar 11 02:18:05 crc kubenswrapper[4744]: I0311 02:18:05.736629 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/c54e9a99-5c2c-48df-a5c0-75fb8727a328-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"c54e9a99-5c2c-48df-a5c0-75fb8727a328\") " pod="openstack/rabbitmq-cell1-server-0" Mar 11 02:18:05 crc kubenswrapper[4744]: I0311 02:18:05.736651 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/c54e9a99-5c2c-48df-a5c0-75fb8727a328-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"c54e9a99-5c2c-48df-a5c0-75fb8727a328\") " pod="openstack/rabbitmq-cell1-server-0" Mar 11 02:18:05 crc kubenswrapper[4744]: I0311 02:18:05.736686 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/c54e9a99-5c2c-48df-a5c0-75fb8727a328-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"c54e9a99-5c2c-48df-a5c0-75fb8727a328\") " pod="openstack/rabbitmq-cell1-server-0" Mar 11 02:18:05 crc kubenswrapper[4744]: I0311 02:18:05.736824 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: 
\"kubernetes.io/empty-dir/c54e9a99-5c2c-48df-a5c0-75fb8727a328-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"c54e9a99-5c2c-48df-a5c0-75fb8727a328\") " pod="openstack/rabbitmq-cell1-server-0" Mar 11 02:18:05 crc kubenswrapper[4744]: I0311 02:18:05.736886 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/c54e9a99-5c2c-48df-a5c0-75fb8727a328-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"c54e9a99-5c2c-48df-a5c0-75fb8727a328\") " pod="openstack/rabbitmq-cell1-server-0" Mar 11 02:18:05 crc kubenswrapper[4744]: I0311 02:18:05.838205 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-c1cc55ee-0f9e-40f7-a7ff-6a53cac2e2c7\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-c1cc55ee-0f9e-40f7-a7ff-6a53cac2e2c7\") pod \"rabbitmq-cell1-server-0\" (UID: \"c54e9a99-5c2c-48df-a5c0-75fb8727a328\") " pod="openstack/rabbitmq-cell1-server-0" Mar 11 02:18:05 crc kubenswrapper[4744]: I0311 02:18:05.838302 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/c54e9a99-5c2c-48df-a5c0-75fb8727a328-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"c54e9a99-5c2c-48df-a5c0-75fb8727a328\") " pod="openstack/rabbitmq-cell1-server-0" Mar 11 02:18:05 crc kubenswrapper[4744]: I0311 02:18:05.838377 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kxnz8\" (UniqueName: \"kubernetes.io/projected/c54e9a99-5c2c-48df-a5c0-75fb8727a328-kube-api-access-kxnz8\") pod \"rabbitmq-cell1-server-0\" (UID: \"c54e9a99-5c2c-48df-a5c0-75fb8727a328\") " pod="openstack/rabbitmq-cell1-server-0" Mar 11 02:18:05 crc kubenswrapper[4744]: I0311 02:18:05.838420 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: 
\"kubernetes.io/configmap/c54e9a99-5c2c-48df-a5c0-75fb8727a328-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"c54e9a99-5c2c-48df-a5c0-75fb8727a328\") " pod="openstack/rabbitmq-cell1-server-0" Mar 11 02:18:05 crc kubenswrapper[4744]: I0311 02:18:05.838455 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/c54e9a99-5c2c-48df-a5c0-75fb8727a328-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"c54e9a99-5c2c-48df-a5c0-75fb8727a328\") " pod="openstack/rabbitmq-cell1-server-0" Mar 11 02:18:05 crc kubenswrapper[4744]: I0311 02:18:05.838497 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/c54e9a99-5c2c-48df-a5c0-75fb8727a328-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"c54e9a99-5c2c-48df-a5c0-75fb8727a328\") " pod="openstack/rabbitmq-cell1-server-0" Mar 11 02:18:05 crc kubenswrapper[4744]: I0311 02:18:05.838579 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/c54e9a99-5c2c-48df-a5c0-75fb8727a328-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"c54e9a99-5c2c-48df-a5c0-75fb8727a328\") " pod="openstack/rabbitmq-cell1-server-0" Mar 11 02:18:05 crc kubenswrapper[4744]: I0311 02:18:05.838638 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/c54e9a99-5c2c-48df-a5c0-75fb8727a328-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"c54e9a99-5c2c-48df-a5c0-75fb8727a328\") " pod="openstack/rabbitmq-cell1-server-0" Mar 11 02:18:05 crc kubenswrapper[4744]: I0311 02:18:05.838686 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/c54e9a99-5c2c-48df-a5c0-75fb8727a328-rabbitmq-confd\") pod 
\"rabbitmq-cell1-server-0\" (UID: \"c54e9a99-5c2c-48df-a5c0-75fb8727a328\") " pod="openstack/rabbitmq-cell1-server-0" Mar 11 02:18:05 crc kubenswrapper[4744]: I0311 02:18:05.838728 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/c54e9a99-5c2c-48df-a5c0-75fb8727a328-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"c54e9a99-5c2c-48df-a5c0-75fb8727a328\") " pod="openstack/rabbitmq-cell1-server-0" Mar 11 02:18:05 crc kubenswrapper[4744]: I0311 02:18:05.838783 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/c54e9a99-5c2c-48df-a5c0-75fb8727a328-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"c54e9a99-5c2c-48df-a5c0-75fb8727a328\") " pod="openstack/rabbitmq-cell1-server-0" Mar 11 02:18:05 crc kubenswrapper[4744]: I0311 02:18:05.840367 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/c54e9a99-5c2c-48df-a5c0-75fb8727a328-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"c54e9a99-5c2c-48df-a5c0-75fb8727a328\") " pod="openstack/rabbitmq-cell1-server-0" Mar 11 02:18:05 crc kubenswrapper[4744]: I0311 02:18:05.840978 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/c54e9a99-5c2c-48df-a5c0-75fb8727a328-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"c54e9a99-5c2c-48df-a5c0-75fb8727a328\") " pod="openstack/rabbitmq-cell1-server-0" Mar 11 02:18:05 crc kubenswrapper[4744]: I0311 02:18:05.841184 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/c54e9a99-5c2c-48df-a5c0-75fb8727a328-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"c54e9a99-5c2c-48df-a5c0-75fb8727a328\") " pod="openstack/rabbitmq-cell1-server-0" Mar 11 
02:18:05 crc kubenswrapper[4744]: I0311 02:18:05.841333 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/c54e9a99-5c2c-48df-a5c0-75fb8727a328-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"c54e9a99-5c2c-48df-a5c0-75fb8727a328\") " pod="openstack/rabbitmq-cell1-server-0" Mar 11 02:18:05 crc kubenswrapper[4744]: I0311 02:18:05.842160 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/c54e9a99-5c2c-48df-a5c0-75fb8727a328-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"c54e9a99-5c2c-48df-a5c0-75fb8727a328\") " pod="openstack/rabbitmq-cell1-server-0" Mar 11 02:18:05 crc kubenswrapper[4744]: I0311 02:18:05.845082 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/c54e9a99-5c2c-48df-a5c0-75fb8727a328-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"c54e9a99-5c2c-48df-a5c0-75fb8727a328\") " pod="openstack/rabbitmq-cell1-server-0" Mar 11 02:18:05 crc kubenswrapper[4744]: I0311 02:18:05.846626 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/c54e9a99-5c2c-48df-a5c0-75fb8727a328-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"c54e9a99-5c2c-48df-a5c0-75fb8727a328\") " pod="openstack/rabbitmq-cell1-server-0" Mar 11 02:18:05 crc kubenswrapper[4744]: I0311 02:18:05.847708 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/c54e9a99-5c2c-48df-a5c0-75fb8727a328-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"c54e9a99-5c2c-48df-a5c0-75fb8727a328\") " pod="openstack/rabbitmq-cell1-server-0" Mar 11 02:18:05 crc kubenswrapper[4744]: I0311 02:18:05.856249 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-kxnz8\" (UniqueName: \"kubernetes.io/projected/c54e9a99-5c2c-48df-a5c0-75fb8727a328-kube-api-access-kxnz8\") pod \"rabbitmq-cell1-server-0\" (UID: \"c54e9a99-5c2c-48df-a5c0-75fb8727a328\") " pod="openstack/rabbitmq-cell1-server-0" Mar 11 02:18:05 crc kubenswrapper[4744]: I0311 02:18:05.856905 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/c54e9a99-5c2c-48df-a5c0-75fb8727a328-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"c54e9a99-5c2c-48df-a5c0-75fb8727a328\") " pod="openstack/rabbitmq-cell1-server-0" Mar 11 02:18:05 crc kubenswrapper[4744]: I0311 02:18:05.857540 4744 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Mar 11 02:18:05 crc kubenswrapper[4744]: I0311 02:18:05.857566 4744 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-c1cc55ee-0f9e-40f7-a7ff-6a53cac2e2c7\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-c1cc55ee-0f9e-40f7-a7ff-6a53cac2e2c7\") pod \"rabbitmq-cell1-server-0\" (UID: \"c54e9a99-5c2c-48df-a5c0-75fb8727a328\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/f5b63ef380139cc0902f31bd0b7ac4425ff7a05031933db8e86fe68bd3f42ee7/globalmount\"" pod="openstack/rabbitmq-cell1-server-0" Mar 11 02:18:05 crc kubenswrapper[4744]: I0311 02:18:05.886549 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-c1cc55ee-0f9e-40f7-a7ff-6a53cac2e2c7\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-c1cc55ee-0f9e-40f7-a7ff-6a53cac2e2c7\") pod \"rabbitmq-cell1-server-0\" (UID: \"c54e9a99-5c2c-48df-a5c0-75fb8727a328\") " pod="openstack/rabbitmq-cell1-server-0" Mar 11 02:18:05 crc kubenswrapper[4744]: I0311 02:18:05.915250 4744 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Mar 11 02:18:05 crc kubenswrapper[4744]: I0311 02:18:05.994654 4744 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5ee1e625-6b01-487f-8462-104afacd05e7" path="/var/lib/kubelet/pods/5ee1e625-6b01-487f-8462-104afacd05e7/volumes" Mar 11 02:18:06 crc kubenswrapper[4744]: I0311 02:18:06.003548 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-server-0"] Mar 11 02:18:06 crc kubenswrapper[4744]: I0311 02:18:06.004645 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Mar 11 02:18:06 crc kubenswrapper[4744]: I0311 02:18:06.008005 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-svc" Mar 11 02:18:06 crc kubenswrapper[4744]: I0311 02:18:06.008234 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-plugins-conf" Mar 11 02:18:06 crc kubenswrapper[4744]: I0311 02:18:06.008331 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-erlang-cookie" Mar 11 02:18:06 crc kubenswrapper[4744]: I0311 02:18:06.008484 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-default-user" Mar 11 02:18:06 crc kubenswrapper[4744]: I0311 02:18:06.011725 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-server-dockercfg-9xkfn" Mar 11 02:18:06 crc kubenswrapper[4744]: I0311 02:18:06.011918 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-server-conf" Mar 11 02:18:06 crc kubenswrapper[4744]: I0311 02:18:06.012029 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-config-data" Mar 11 02:18:06 crc kubenswrapper[4744]: I0311 02:18:06.034149 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Mar 11 02:18:06 crc 
kubenswrapper[4744]: I0311 02:18:06.091231 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-ff89b6977-xqrvm"] Mar 11 02:18:06 crc kubenswrapper[4744]: I0311 02:18:06.144470 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-2cf35867-ca5d-4038-9115-b41f33433bb8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-2cf35867-ca5d-4038-9115-b41f33433bb8\") pod \"rabbitmq-server-0\" (UID: \"f6b347eb-1bcc-4fa4-96c4-c15523778e9c\") " pod="openstack/rabbitmq-server-0" Mar 11 02:18:06 crc kubenswrapper[4744]: I0311 02:18:06.144733 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/f6b347eb-1bcc-4fa4-96c4-c15523778e9c-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"f6b347eb-1bcc-4fa4-96c4-c15523778e9c\") " pod="openstack/rabbitmq-server-0" Mar 11 02:18:06 crc kubenswrapper[4744]: I0311 02:18:06.144788 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/f6b347eb-1bcc-4fa4-96c4-c15523778e9c-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"f6b347eb-1bcc-4fa4-96c4-c15523778e9c\") " pod="openstack/rabbitmq-server-0" Mar 11 02:18:06 crc kubenswrapper[4744]: I0311 02:18:06.145280 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/f6b347eb-1bcc-4fa4-96c4-c15523778e9c-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"f6b347eb-1bcc-4fa4-96c4-c15523778e9c\") " pod="openstack/rabbitmq-server-0" Mar 11 02:18:06 crc kubenswrapper[4744]: I0311 02:18:06.145312 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: 
\"kubernetes.io/downward-api/f6b347eb-1bcc-4fa4-96c4-c15523778e9c-pod-info\") pod \"rabbitmq-server-0\" (UID: \"f6b347eb-1bcc-4fa4-96c4-c15523778e9c\") " pod="openstack/rabbitmq-server-0" Mar 11 02:18:06 crc kubenswrapper[4744]: I0311 02:18:06.145346 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/f6b347eb-1bcc-4fa4-96c4-c15523778e9c-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"f6b347eb-1bcc-4fa4-96c4-c15523778e9c\") " pod="openstack/rabbitmq-server-0" Mar 11 02:18:06 crc kubenswrapper[4744]: I0311 02:18:06.145372 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/f6b347eb-1bcc-4fa4-96c4-c15523778e9c-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"f6b347eb-1bcc-4fa4-96c4-c15523778e9c\") " pod="openstack/rabbitmq-server-0" Mar 11 02:18:06 crc kubenswrapper[4744]: I0311 02:18:06.145398 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/f6b347eb-1bcc-4fa4-96c4-c15523778e9c-config-data\") pod \"rabbitmq-server-0\" (UID: \"f6b347eb-1bcc-4fa4-96c4-c15523778e9c\") " pod="openstack/rabbitmq-server-0" Mar 11 02:18:06 crc kubenswrapper[4744]: I0311 02:18:06.145416 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/f6b347eb-1bcc-4fa4-96c4-c15523778e9c-server-conf\") pod \"rabbitmq-server-0\" (UID: \"f6b347eb-1bcc-4fa4-96c4-c15523778e9c\") " pod="openstack/rabbitmq-server-0" Mar 11 02:18:06 crc kubenswrapper[4744]: I0311 02:18:06.145435 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vdhbc\" (UniqueName: 
\"kubernetes.io/projected/f6b347eb-1bcc-4fa4-96c4-c15523778e9c-kube-api-access-vdhbc\") pod \"rabbitmq-server-0\" (UID: \"f6b347eb-1bcc-4fa4-96c4-c15523778e9c\") " pod="openstack/rabbitmq-server-0" Mar 11 02:18:06 crc kubenswrapper[4744]: I0311 02:18:06.145457 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/f6b347eb-1bcc-4fa4-96c4-c15523778e9c-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"f6b347eb-1bcc-4fa4-96c4-c15523778e9c\") " pod="openstack/rabbitmq-server-0" Mar 11 02:18:06 crc kubenswrapper[4744]: I0311 02:18:06.195634 4744 generic.go:334] "Generic (PLEG): container finished" podID="c2b128c1-8690-4f5b-8d3b-b7bb2545b79c" containerID="b080bfc35df75ed87ff3f3d78ae63b7fef9983d3837294065c92b314b4daac63" exitCode=0 Mar 11 02:18:06 crc kubenswrapper[4744]: I0311 02:18:06.195772 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-55c76fd6b7-mzqkh" event={"ID":"c2b128c1-8690-4f5b-8d3b-b7bb2545b79c","Type":"ContainerDied","Data":"b080bfc35df75ed87ff3f3d78ae63b7fef9983d3837294065c92b314b4daac63"} Mar 11 02:18:06 crc kubenswrapper[4744]: I0311 02:18:06.197121 4744 generic.go:334] "Generic (PLEG): container finished" podID="defdeafa-942d-4432-bb2c-c8e1176ce936" containerID="441c557a948b7db640c405e67ff45aa288cc063de82f23528b1b553857a8f111" exitCode=0 Mar 11 02:18:06 crc kubenswrapper[4744]: I0311 02:18:06.197173 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5fb77f9685-nm4xr" event={"ID":"defdeafa-942d-4432-bb2c-c8e1176ce936","Type":"ContainerDied","Data":"441c557a948b7db640c405e67ff45aa288cc063de82f23528b1b553857a8f111"} Mar 11 02:18:06 crc kubenswrapper[4744]: I0311 02:18:06.197198 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5fb77f9685-nm4xr" 
event={"ID":"defdeafa-942d-4432-bb2c-c8e1176ce936","Type":"ContainerStarted","Data":"4802088f2b376b4b381360cf3f4a2856a37629c174f37040e123f6c55244b866"} Mar 11 02:18:06 crc kubenswrapper[4744]: I0311 02:18:06.200093 4744 generic.go:334] "Generic (PLEG): container finished" podID="7d4298c0-6fc4-408e-9c4e-229479486dfe" containerID="4f6b7aad4bb6afb342d10d16de64414212194c32dc350de4ce1961c0708274fc" exitCode=0 Mar 11 02:18:06 crc kubenswrapper[4744]: I0311 02:18:06.200160 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-c44667757-qq98n" event={"ID":"7d4298c0-6fc4-408e-9c4e-229479486dfe","Type":"ContainerDied","Data":"4f6b7aad4bb6afb342d10d16de64414212194c32dc350de4ce1961c0708274fc"} Mar 11 02:18:06 crc kubenswrapper[4744]: I0311 02:18:06.200189 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-c44667757-qq98n" event={"ID":"7d4298c0-6fc4-408e-9c4e-229479486dfe","Type":"ContainerStarted","Data":"f2f3e90b90a8f12d6cfc0f49f46b784cd0e3e7e9c80af440e4f4a27d3fb5aabc"} Mar 11 02:18:06 crc kubenswrapper[4744]: I0311 02:18:06.206672 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-ff89b6977-xqrvm" event={"ID":"8b1f9328-7210-412e-9d4d-1d1a8b6804dc","Type":"ContainerStarted","Data":"7e3311e9d9687fdc7517741101acc8e92512d0e044900204870cb3d0a7b1d43d"} Mar 11 02:18:06 crc kubenswrapper[4744]: I0311 02:18:06.247341 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/f6b347eb-1bcc-4fa4-96c4-c15523778e9c-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"f6b347eb-1bcc-4fa4-96c4-c15523778e9c\") " pod="openstack/rabbitmq-server-0" Mar 11 02:18:06 crc kubenswrapper[4744]: I0311 02:18:06.247707 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/f6b347eb-1bcc-4fa4-96c4-c15523778e9c-pod-info\") pod \"rabbitmq-server-0\" (UID: 
\"f6b347eb-1bcc-4fa4-96c4-c15523778e9c\") " pod="openstack/rabbitmq-server-0" Mar 11 02:18:06 crc kubenswrapper[4744]: I0311 02:18:06.247728 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/f6b347eb-1bcc-4fa4-96c4-c15523778e9c-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"f6b347eb-1bcc-4fa4-96c4-c15523778e9c\") " pod="openstack/rabbitmq-server-0" Mar 11 02:18:06 crc kubenswrapper[4744]: I0311 02:18:06.247756 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/f6b347eb-1bcc-4fa4-96c4-c15523778e9c-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"f6b347eb-1bcc-4fa4-96c4-c15523778e9c\") " pod="openstack/rabbitmq-server-0" Mar 11 02:18:06 crc kubenswrapper[4744]: I0311 02:18:06.247775 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/f6b347eb-1bcc-4fa4-96c4-c15523778e9c-config-data\") pod \"rabbitmq-server-0\" (UID: \"f6b347eb-1bcc-4fa4-96c4-c15523778e9c\") " pod="openstack/rabbitmq-server-0" Mar 11 02:18:06 crc kubenswrapper[4744]: I0311 02:18:06.247790 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/f6b347eb-1bcc-4fa4-96c4-c15523778e9c-server-conf\") pod \"rabbitmq-server-0\" (UID: \"f6b347eb-1bcc-4fa4-96c4-c15523778e9c\") " pod="openstack/rabbitmq-server-0" Mar 11 02:18:06 crc kubenswrapper[4744]: I0311 02:18:06.247808 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vdhbc\" (UniqueName: \"kubernetes.io/projected/f6b347eb-1bcc-4fa4-96c4-c15523778e9c-kube-api-access-vdhbc\") pod \"rabbitmq-server-0\" (UID: \"f6b347eb-1bcc-4fa4-96c4-c15523778e9c\") " pod="openstack/rabbitmq-server-0" Mar 11 02:18:06 crc kubenswrapper[4744]: I0311 02:18:06.247830 4744 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/f6b347eb-1bcc-4fa4-96c4-c15523778e9c-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"f6b347eb-1bcc-4fa4-96c4-c15523778e9c\") " pod="openstack/rabbitmq-server-0" Mar 11 02:18:06 crc kubenswrapper[4744]: I0311 02:18:06.247861 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-2cf35867-ca5d-4038-9115-b41f33433bb8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-2cf35867-ca5d-4038-9115-b41f33433bb8\") pod \"rabbitmq-server-0\" (UID: \"f6b347eb-1bcc-4fa4-96c4-c15523778e9c\") " pod="openstack/rabbitmq-server-0" Mar 11 02:18:06 crc kubenswrapper[4744]: I0311 02:18:06.247892 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/f6b347eb-1bcc-4fa4-96c4-c15523778e9c-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"f6b347eb-1bcc-4fa4-96c4-c15523778e9c\") " pod="openstack/rabbitmq-server-0" Mar 11 02:18:06 crc kubenswrapper[4744]: I0311 02:18:06.247921 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/f6b347eb-1bcc-4fa4-96c4-c15523778e9c-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"f6b347eb-1bcc-4fa4-96c4-c15523778e9c\") " pod="openstack/rabbitmq-server-0" Mar 11 02:18:06 crc kubenswrapper[4744]: I0311 02:18:06.249591 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/f6b347eb-1bcc-4fa4-96c4-c15523778e9c-config-data\") pod \"rabbitmq-server-0\" (UID: \"f6b347eb-1bcc-4fa4-96c4-c15523778e9c\") " pod="openstack/rabbitmq-server-0" Mar 11 02:18:06 crc kubenswrapper[4744]: I0311 02:18:06.255496 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: 
\"kubernetes.io/configmap/f6b347eb-1bcc-4fa4-96c4-c15523778e9c-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"f6b347eb-1bcc-4fa4-96c4-c15523778e9c\") " pod="openstack/rabbitmq-server-0" Mar 11 02:18:06 crc kubenswrapper[4744]: I0311 02:18:06.255982 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/f6b347eb-1bcc-4fa4-96c4-c15523778e9c-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"f6b347eb-1bcc-4fa4-96c4-c15523778e9c\") " pod="openstack/rabbitmq-server-0" Mar 11 02:18:06 crc kubenswrapper[4744]: I0311 02:18:06.256085 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/f6b347eb-1bcc-4fa4-96c4-c15523778e9c-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"f6b347eb-1bcc-4fa4-96c4-c15523778e9c\") " pod="openstack/rabbitmq-server-0" Mar 11 02:18:06 crc kubenswrapper[4744]: I0311 02:18:06.256177 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/f6b347eb-1bcc-4fa4-96c4-c15523778e9c-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"f6b347eb-1bcc-4fa4-96c4-c15523778e9c\") " pod="openstack/rabbitmq-server-0" Mar 11 02:18:06 crc kubenswrapper[4744]: I0311 02:18:06.256874 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/f6b347eb-1bcc-4fa4-96c4-c15523778e9c-server-conf\") pod \"rabbitmq-server-0\" (UID: \"f6b347eb-1bcc-4fa4-96c4-c15523778e9c\") " pod="openstack/rabbitmq-server-0" Mar 11 02:18:06 crc kubenswrapper[4744]: I0311 02:18:06.265184 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/f6b347eb-1bcc-4fa4-96c4-c15523778e9c-pod-info\") pod \"rabbitmq-server-0\" (UID: \"f6b347eb-1bcc-4fa4-96c4-c15523778e9c\") " pod="openstack/rabbitmq-server-0" 
Mar 11 02:18:06 crc kubenswrapper[4744]: I0311 02:18:06.268744 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/f6b347eb-1bcc-4fa4-96c4-c15523778e9c-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"f6b347eb-1bcc-4fa4-96c4-c15523778e9c\") " pod="openstack/rabbitmq-server-0" Mar 11 02:18:06 crc kubenswrapper[4744]: I0311 02:18:06.276310 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/f6b347eb-1bcc-4fa4-96c4-c15523778e9c-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"f6b347eb-1bcc-4fa4-96c4-c15523778e9c\") " pod="openstack/rabbitmq-server-0" Mar 11 02:18:06 crc kubenswrapper[4744]: I0311 02:18:06.279669 4744 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Mar 11 02:18:06 crc kubenswrapper[4744]: I0311 02:18:06.279730 4744 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-2cf35867-ca5d-4038-9115-b41f33433bb8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-2cf35867-ca5d-4038-9115-b41f33433bb8\") pod \"rabbitmq-server-0\" (UID: \"f6b347eb-1bcc-4fa4-96c4-c15523778e9c\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/b75736b3331f603a9da86ac6aa87ca334c35162c4bfbab228d172c616bad6b4f/globalmount\"" pod="openstack/rabbitmq-server-0" Mar 11 02:18:06 crc kubenswrapper[4744]: I0311 02:18:06.282637 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vdhbc\" (UniqueName: \"kubernetes.io/projected/f6b347eb-1bcc-4fa4-96c4-c15523778e9c-kube-api-access-vdhbc\") pod \"rabbitmq-server-0\" (UID: \"f6b347eb-1bcc-4fa4-96c4-c15523778e9c\") " pod="openstack/rabbitmq-server-0" Mar 11 02:18:06 crc kubenswrapper[4744]: I0311 02:18:06.315068 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"pvc-2cf35867-ca5d-4038-9115-b41f33433bb8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-2cf35867-ca5d-4038-9115-b41f33433bb8\") pod \"rabbitmq-server-0\" (UID: \"f6b347eb-1bcc-4fa4-96c4-c15523778e9c\") " pod="openstack/rabbitmq-server-0" Mar 11 02:18:06 crc kubenswrapper[4744]: E0311 02:18:06.404250 4744 log.go:32] "CreateContainer in sandbox from runtime service failed" err=< Mar 11 02:18:06 crc kubenswrapper[4744]: rpc error: code = Unknown desc = container create failed: mount `/var/lib/kubelet/pods/defdeafa-942d-4432-bb2c-c8e1176ce936/volume-subpaths/dns-svc/dnsmasq-dns/1` to `etc/dnsmasq.d/hosts/dns-svc`: No such file or directory Mar 11 02:18:06 crc kubenswrapper[4744]: > podSandboxID="4802088f2b376b4b381360cf3f4a2856a37629c174f37040e123f6c55244b866" Mar 11 02:18:06 crc kubenswrapper[4744]: E0311 02:18:06.404408 4744 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 11 02:18:06 crc kubenswrapper[4744]: container &Container{Name:dnsmasq-dns,Image:quay.io/podified-antelope-centos9/openstack-neutron-server@sha256:944833b50342d462c10637342bc85197a8cf099a3650df12e23854dde99af514,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv 
--log-queries],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:nb6hc5h68h68h594h659hdbh679h65ch5f6hdch6h5b9h8fh55hfhf8h57fhc7h56ch687h669h559h678h5dhc7hf7h697h5d6h9ch669h54fq,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:dns-svc,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/dns-svc,SubPath:dns-svc,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-rwm2m,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:nil,TCPSocket:&TCPSocketAction{Port:{0 5353 },Host:,},GRPC:nil,},InitialDelaySeconds:3,TimeoutSeconds:5,PeriodSeconds:3,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:nil,TCPSocket:&TCPSocketAction{Port:{0 5353 
},Host:,},GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:5,PeriodSeconds:5,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-5fb77f9685-nm4xr_openstack(defdeafa-942d-4432-bb2c-c8e1176ce936): CreateContainerError: container create failed: mount `/var/lib/kubelet/pods/defdeafa-942d-4432-bb2c-c8e1176ce936/volume-subpaths/dns-svc/dnsmasq-dns/1` to `etc/dnsmasq.d/hosts/dns-svc`: No such file or directory Mar 11 02:18:06 crc kubenswrapper[4744]: > logger="UnhandledError" Mar 11 02:18:06 crc kubenswrapper[4744]: E0311 02:18:06.405577 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"dnsmasq-dns\" with CreateContainerError: \"container create failed: mount `/var/lib/kubelet/pods/defdeafa-942d-4432-bb2c-c8e1176ce936/volume-subpaths/dns-svc/dnsmasq-dns/1` to `etc/dnsmasq.d/hosts/dns-svc`: No such file or directory\\n\"" pod="openstack/dnsmasq-dns-5fb77f9685-nm4xr" podUID="defdeafa-942d-4432-bb2c-c8e1176ce936" Mar 11 02:18:06 crc kubenswrapper[4744]: I0311 02:18:06.547102 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Mar 11 02:18:06 crc kubenswrapper[4744]: I0311 02:18:06.554680 4744 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-55c76fd6b7-mzqkh" Mar 11 02:18:06 crc kubenswrapper[4744]: I0311 02:18:06.565104 4744 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-c44667757-qq98n" Mar 11 02:18:06 crc kubenswrapper[4744]: I0311 02:18:06.628651 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstack-galera-0"] Mar 11 02:18:06 crc kubenswrapper[4744]: E0311 02:18:06.629030 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7d4298c0-6fc4-408e-9c4e-229479486dfe" containerName="init" Mar 11 02:18:06 crc kubenswrapper[4744]: I0311 02:18:06.629094 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="7d4298c0-6fc4-408e-9c4e-229479486dfe" containerName="init" Mar 11 02:18:06 crc kubenswrapper[4744]: E0311 02:18:06.629154 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c2b128c1-8690-4f5b-8d3b-b7bb2545b79c" containerName="init" Mar 11 02:18:06 crc kubenswrapper[4744]: I0311 02:18:06.629202 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="c2b128c1-8690-4f5b-8d3b-b7bb2545b79c" containerName="init" Mar 11 02:18:06 crc kubenswrapper[4744]: I0311 02:18:06.629371 4744 memory_manager.go:354] "RemoveStaleState removing state" podUID="7d4298c0-6fc4-408e-9c4e-229479486dfe" containerName="init" Mar 11 02:18:06 crc kubenswrapper[4744]: I0311 02:18:06.629441 4744 memory_manager.go:354] "RemoveStaleState removing state" podUID="c2b128c1-8690-4f5b-8d3b-b7bb2545b79c" containerName="init" Mar 11 02:18:06 crc kubenswrapper[4744]: I0311 02:18:06.630149 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-galera-0" Mar 11 02:18:06 crc kubenswrapper[4744]: I0311 02:18:06.630318 4744 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-server-0" Mar 11 02:18:06 crc kubenswrapper[4744]: I0311 02:18:06.632050 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"galera-openstack-dockercfg-kvs98" Mar 11 02:18:06 crc kubenswrapper[4744]: I0311 02:18:06.632943 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-galera-openstack-svc" Mar 11 02:18:06 crc kubenswrapper[4744]: I0311 02:18:06.633473 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-scripts" Mar 11 02:18:06 crc kubenswrapper[4744]: I0311 02:18:06.633654 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-config-data" Mar 11 02:18:06 crc kubenswrapper[4744]: I0311 02:18:06.647428 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-galera-0"] Mar 11 02:18:06 crc kubenswrapper[4744]: I0311 02:18:06.651264 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"combined-ca-bundle" Mar 11 02:18:06 crc kubenswrapper[4744]: I0311 02:18:06.651713 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7xhhk\" (UniqueName: \"kubernetes.io/projected/7d4298c0-6fc4-408e-9c4e-229479486dfe-kube-api-access-7xhhk\") pod \"7d4298c0-6fc4-408e-9c4e-229479486dfe\" (UID: \"7d4298c0-6fc4-408e-9c4e-229479486dfe\") " Mar 11 02:18:06 crc kubenswrapper[4744]: I0311 02:18:06.651773 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7d4298c0-6fc4-408e-9c4e-229479486dfe-config\") pod \"7d4298c0-6fc4-408e-9c4e-229479486dfe\" (UID: \"7d4298c0-6fc4-408e-9c4e-229479486dfe\") " Mar 11 02:18:06 crc kubenswrapper[4744]: I0311 02:18:06.651977 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: 
\"kubernetes.io/configmap/c2b128c1-8690-4f5b-8d3b-b7bb2545b79c-dns-svc\") pod \"c2b128c1-8690-4f5b-8d3b-b7bb2545b79c\" (UID: \"c2b128c1-8690-4f5b-8d3b-b7bb2545b79c\") " Mar 11 02:18:06 crc kubenswrapper[4744]: I0311 02:18:06.652052 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5tfnm\" (UniqueName: \"kubernetes.io/projected/c2b128c1-8690-4f5b-8d3b-b7bb2545b79c-kube-api-access-5tfnm\") pod \"c2b128c1-8690-4f5b-8d3b-b7bb2545b79c\" (UID: \"c2b128c1-8690-4f5b-8d3b-b7bb2545b79c\") " Mar 11 02:18:06 crc kubenswrapper[4744]: I0311 02:18:06.652096 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c2b128c1-8690-4f5b-8d3b-b7bb2545b79c-config\") pod \"c2b128c1-8690-4f5b-8d3b-b7bb2545b79c\" (UID: \"c2b128c1-8690-4f5b-8d3b-b7bb2545b79c\") " Mar 11 02:18:06 crc kubenswrapper[4744]: I0311 02:18:06.657381 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7d4298c0-6fc4-408e-9c4e-229479486dfe-kube-api-access-7xhhk" (OuterVolumeSpecName: "kube-api-access-7xhhk") pod "7d4298c0-6fc4-408e-9c4e-229479486dfe" (UID: "7d4298c0-6fc4-408e-9c4e-229479486dfe"). InnerVolumeSpecName "kube-api-access-7xhhk". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 02:18:06 crc kubenswrapper[4744]: I0311 02:18:06.657732 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c2b128c1-8690-4f5b-8d3b-b7bb2545b79c-kube-api-access-5tfnm" (OuterVolumeSpecName: "kube-api-access-5tfnm") pod "c2b128c1-8690-4f5b-8d3b-b7bb2545b79c" (UID: "c2b128c1-8690-4f5b-8d3b-b7bb2545b79c"). InnerVolumeSpecName "kube-api-access-5tfnm". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 02:18:06 crc kubenswrapper[4744]: I0311 02:18:06.676615 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c2b128c1-8690-4f5b-8d3b-b7bb2545b79c-config" (OuterVolumeSpecName: "config") pod "c2b128c1-8690-4f5b-8d3b-b7bb2545b79c" (UID: "c2b128c1-8690-4f5b-8d3b-b7bb2545b79c"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 11 02:18:06 crc kubenswrapper[4744]: I0311 02:18:06.679062 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c2b128c1-8690-4f5b-8d3b-b7bb2545b79c-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "c2b128c1-8690-4f5b-8d3b-b7bb2545b79c" (UID: "c2b128c1-8690-4f5b-8d3b-b7bb2545b79c"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 11 02:18:06 crc kubenswrapper[4744]: I0311 02:18:06.680416 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7d4298c0-6fc4-408e-9c4e-229479486dfe-config" (OuterVolumeSpecName: "config") pod "7d4298c0-6fc4-408e-9c4e-229479486dfe" (UID: "7d4298c0-6fc4-408e-9c4e-229479486dfe"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 11 02:18:06 crc kubenswrapper[4744]: I0311 02:18:06.774233 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a6e19d5d-20f5-4836-afcc-a5958a01bbf2-operator-scripts\") pod \"openstack-galera-0\" (UID: \"a6e19d5d-20f5-4836-afcc-a5958a01bbf2\") " pod="openstack/openstack-galera-0" Mar 11 02:18:06 crc kubenswrapper[4744]: I0311 02:18:06.774277 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f4f48\" (UniqueName: \"kubernetes.io/projected/a6e19d5d-20f5-4836-afcc-a5958a01bbf2-kube-api-access-f4f48\") pod \"openstack-galera-0\" (UID: \"a6e19d5d-20f5-4836-afcc-a5958a01bbf2\") " pod="openstack/openstack-galera-0" Mar 11 02:18:06 crc kubenswrapper[4744]: I0311 02:18:06.774335 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-8535954f-8479-4dd5-9de3-262e1e6b7235\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-8535954f-8479-4dd5-9de3-262e1e6b7235\") pod \"openstack-galera-0\" (UID: \"a6e19d5d-20f5-4836-afcc-a5958a01bbf2\") " pod="openstack/openstack-galera-0" Mar 11 02:18:06 crc kubenswrapper[4744]: I0311 02:18:06.774358 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a6e19d5d-20f5-4836-afcc-a5958a01bbf2-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"a6e19d5d-20f5-4836-afcc-a5958a01bbf2\") " pod="openstack/openstack-galera-0" Mar 11 02:18:06 crc kubenswrapper[4744]: I0311 02:18:06.774394 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/a6e19d5d-20f5-4836-afcc-a5958a01bbf2-config-data-generated\") pod \"openstack-galera-0\" (UID: 
\"a6e19d5d-20f5-4836-afcc-a5958a01bbf2\") " pod="openstack/openstack-galera-0" Mar 11 02:18:06 crc kubenswrapper[4744]: I0311 02:18:06.774419 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/a6e19d5d-20f5-4836-afcc-a5958a01bbf2-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"a6e19d5d-20f5-4836-afcc-a5958a01bbf2\") " pod="openstack/openstack-galera-0" Mar 11 02:18:06 crc kubenswrapper[4744]: I0311 02:18:06.774464 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/a6e19d5d-20f5-4836-afcc-a5958a01bbf2-kolla-config\") pod \"openstack-galera-0\" (UID: \"a6e19d5d-20f5-4836-afcc-a5958a01bbf2\") " pod="openstack/openstack-galera-0" Mar 11 02:18:06 crc kubenswrapper[4744]: I0311 02:18:06.774505 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/a6e19d5d-20f5-4836-afcc-a5958a01bbf2-config-data-default\") pod \"openstack-galera-0\" (UID: \"a6e19d5d-20f5-4836-afcc-a5958a01bbf2\") " pod="openstack/openstack-galera-0" Mar 11 02:18:06 crc kubenswrapper[4744]: I0311 02:18:06.774584 4744 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c2b128c1-8690-4f5b-8d3b-b7bb2545b79c-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 11 02:18:06 crc kubenswrapper[4744]: I0311 02:18:06.774597 4744 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5tfnm\" (UniqueName: \"kubernetes.io/projected/c2b128c1-8690-4f5b-8d3b-b7bb2545b79c-kube-api-access-5tfnm\") on node \"crc\" DevicePath \"\"" Mar 11 02:18:06 crc kubenswrapper[4744]: I0311 02:18:06.774609 4744 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/c2b128c1-8690-4f5b-8d3b-b7bb2545b79c-config\") on node \"crc\" DevicePath \"\"" Mar 11 02:18:06 crc kubenswrapper[4744]: I0311 02:18:06.774620 4744 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7xhhk\" (UniqueName: \"kubernetes.io/projected/7d4298c0-6fc4-408e-9c4e-229479486dfe-kube-api-access-7xhhk\") on node \"crc\" DevicePath \"\"" Mar 11 02:18:06 crc kubenswrapper[4744]: I0311 02:18:06.774630 4744 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7d4298c0-6fc4-408e-9c4e-229479486dfe-config\") on node \"crc\" DevicePath \"\"" Mar 11 02:18:06 crc kubenswrapper[4744]: I0311 02:18:06.875471 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-8535954f-8479-4dd5-9de3-262e1e6b7235\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-8535954f-8479-4dd5-9de3-262e1e6b7235\") pod \"openstack-galera-0\" (UID: \"a6e19d5d-20f5-4836-afcc-a5958a01bbf2\") " pod="openstack/openstack-galera-0" Mar 11 02:18:06 crc kubenswrapper[4744]: I0311 02:18:06.875976 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a6e19d5d-20f5-4836-afcc-a5958a01bbf2-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"a6e19d5d-20f5-4836-afcc-a5958a01bbf2\") " pod="openstack/openstack-galera-0" Mar 11 02:18:06 crc kubenswrapper[4744]: I0311 02:18:06.876046 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/a6e19d5d-20f5-4836-afcc-a5958a01bbf2-config-data-generated\") pod \"openstack-galera-0\" (UID: \"a6e19d5d-20f5-4836-afcc-a5958a01bbf2\") " pod="openstack/openstack-galera-0" Mar 11 02:18:06 crc kubenswrapper[4744]: I0311 02:18:06.876485 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: 
\"kubernetes.io/empty-dir/a6e19d5d-20f5-4836-afcc-a5958a01bbf2-config-data-generated\") pod \"openstack-galera-0\" (UID: \"a6e19d5d-20f5-4836-afcc-a5958a01bbf2\") " pod="openstack/openstack-galera-0" Mar 11 02:18:06 crc kubenswrapper[4744]: I0311 02:18:06.876623 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/a6e19d5d-20f5-4836-afcc-a5958a01bbf2-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"a6e19d5d-20f5-4836-afcc-a5958a01bbf2\") " pod="openstack/openstack-galera-0" Mar 11 02:18:06 crc kubenswrapper[4744]: I0311 02:18:06.877084 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/a6e19d5d-20f5-4836-afcc-a5958a01bbf2-kolla-config\") pod \"openstack-galera-0\" (UID: \"a6e19d5d-20f5-4836-afcc-a5958a01bbf2\") " pod="openstack/openstack-galera-0" Mar 11 02:18:06 crc kubenswrapper[4744]: I0311 02:18:06.877136 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/a6e19d5d-20f5-4836-afcc-a5958a01bbf2-config-data-default\") pod \"openstack-galera-0\" (UID: \"a6e19d5d-20f5-4836-afcc-a5958a01bbf2\") " pod="openstack/openstack-galera-0" Mar 11 02:18:06 crc kubenswrapper[4744]: I0311 02:18:06.877188 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a6e19d5d-20f5-4836-afcc-a5958a01bbf2-operator-scripts\") pod \"openstack-galera-0\" (UID: \"a6e19d5d-20f5-4836-afcc-a5958a01bbf2\") " pod="openstack/openstack-galera-0" Mar 11 02:18:06 crc kubenswrapper[4744]: I0311 02:18:06.877218 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f4f48\" (UniqueName: \"kubernetes.io/projected/a6e19d5d-20f5-4836-afcc-a5958a01bbf2-kube-api-access-f4f48\") pod \"openstack-galera-0\" (UID: 
\"a6e19d5d-20f5-4836-afcc-a5958a01bbf2\") " pod="openstack/openstack-galera-0" Mar 11 02:18:06 crc kubenswrapper[4744]: I0311 02:18:06.877902 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/a6e19d5d-20f5-4836-afcc-a5958a01bbf2-kolla-config\") pod \"openstack-galera-0\" (UID: \"a6e19d5d-20f5-4836-afcc-a5958a01bbf2\") " pod="openstack/openstack-galera-0" Mar 11 02:18:06 crc kubenswrapper[4744]: I0311 02:18:06.878420 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/a6e19d5d-20f5-4836-afcc-a5958a01bbf2-config-data-default\") pod \"openstack-galera-0\" (UID: \"a6e19d5d-20f5-4836-afcc-a5958a01bbf2\") " pod="openstack/openstack-galera-0" Mar 11 02:18:06 crc kubenswrapper[4744]: I0311 02:18:06.879328 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a6e19d5d-20f5-4836-afcc-a5958a01bbf2-operator-scripts\") pod \"openstack-galera-0\" (UID: \"a6e19d5d-20f5-4836-afcc-a5958a01bbf2\") " pod="openstack/openstack-galera-0" Mar 11 02:18:06 crc kubenswrapper[4744]: I0311 02:18:06.879908 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/a6e19d5d-20f5-4836-afcc-a5958a01bbf2-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"a6e19d5d-20f5-4836-afcc-a5958a01bbf2\") " pod="openstack/openstack-galera-0" Mar 11 02:18:06 crc kubenswrapper[4744]: I0311 02:18:06.880010 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a6e19d5d-20f5-4836-afcc-a5958a01bbf2-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"a6e19d5d-20f5-4836-afcc-a5958a01bbf2\") " pod="openstack/openstack-galera-0" Mar 11 02:18:06 crc kubenswrapper[4744]: I0311 02:18:06.883296 4744 csi_attacher.go:380] 
kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Mar 11 02:18:06 crc kubenswrapper[4744]: I0311 02:18:06.883344 4744 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-8535954f-8479-4dd5-9de3-262e1e6b7235\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-8535954f-8479-4dd5-9de3-262e1e6b7235\") pod \"openstack-galera-0\" (UID: \"a6e19d5d-20f5-4836-afcc-a5958a01bbf2\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/47e81184887632cb6d0105edff3dafc1f7d45e25d269a63a9004dfeb5ec88c85/globalmount\"" pod="openstack/openstack-galera-0" Mar 11 02:18:06 crc kubenswrapper[4744]: I0311 02:18:06.898231 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f4f48\" (UniqueName: \"kubernetes.io/projected/a6e19d5d-20f5-4836-afcc-a5958a01bbf2-kube-api-access-f4f48\") pod \"openstack-galera-0\" (UID: \"a6e19d5d-20f5-4836-afcc-a5958a01bbf2\") " pod="openstack/openstack-galera-0" Mar 11 02:18:06 crc kubenswrapper[4744]: I0311 02:18:06.928408 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-8535954f-8479-4dd5-9de3-262e1e6b7235\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-8535954f-8479-4dd5-9de3-262e1e6b7235\") pod \"openstack-galera-0\" (UID: \"a6e19d5d-20f5-4836-afcc-a5958a01bbf2\") " pod="openstack/openstack-galera-0" Mar 11 02:18:06 crc kubenswrapper[4744]: I0311 02:18:06.975085 4744 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstack-galera-0" Mar 11 02:18:06 crc kubenswrapper[4744]: I0311 02:18:06.976016 4744 scope.go:117] "RemoveContainer" containerID="6c2c2442692de599f07518611dfa7798a497d5f80b894d7b831bca9043e56ffc" Mar 11 02:18:06 crc kubenswrapper[4744]: E0311 02:18:06.976748 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-678nx_openshift-machine-config-operator(a15dc7ac-7c34-4135-b6eb-a85122800ce9)\"" pod="openshift-machine-config-operator/machine-config-daemon-678nx" podUID="a15dc7ac-7c34-4135-b6eb-a85122800ce9" Mar 11 02:18:07 crc kubenswrapper[4744]: I0311 02:18:07.231089 4744 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-c44667757-qq98n" Mar 11 02:18:07 crc kubenswrapper[4744]: I0311 02:18:07.231776 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-c44667757-qq98n" event={"ID":"7d4298c0-6fc4-408e-9c4e-229479486dfe","Type":"ContainerDied","Data":"f2f3e90b90a8f12d6cfc0f49f46b784cd0e3e7e9c80af440e4f4a27d3fb5aabc"} Mar 11 02:18:07 crc kubenswrapper[4744]: I0311 02:18:07.231834 4744 scope.go:117] "RemoveContainer" containerID="4f6b7aad4bb6afb342d10d16de64414212194c32dc350de4ce1961c0708274fc" Mar 11 02:18:07 crc kubenswrapper[4744]: I0311 02:18:07.255343 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"c54e9a99-5c2c-48df-a5c0-75fb8727a328","Type":"ContainerStarted","Data":"bca684a2b7d25d39eca2a9206a406028bf0332bff0b1e3b8f6ea882999ee9993"} Mar 11 02:18:07 crc kubenswrapper[4744]: I0311 02:18:07.270598 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-ff89b6977-xqrvm" 
event={"ID":"8b1f9328-7210-412e-9d4d-1d1a8b6804dc","Type":"ContainerDied","Data":"6b16695aae796d9c4e152a789dc0b8d1b9d928ae7d1d14a3228aeb2096998e2c"} Mar 11 02:18:07 crc kubenswrapper[4744]: I0311 02:18:07.269224 4744 generic.go:334] "Generic (PLEG): container finished" podID="8b1f9328-7210-412e-9d4d-1d1a8b6804dc" containerID="6b16695aae796d9c4e152a789dc0b8d1b9d928ae7d1d14a3228aeb2096998e2c" exitCode=0 Mar 11 02:18:07 crc kubenswrapper[4744]: I0311 02:18:07.285478 4744 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-55c76fd6b7-mzqkh" Mar 11 02:18:07 crc kubenswrapper[4744]: I0311 02:18:07.285900 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-55c76fd6b7-mzqkh" event={"ID":"c2b128c1-8690-4f5b-8d3b-b7bb2545b79c","Type":"ContainerDied","Data":"47a69d6021af0ecbf8a8892d781791026b1c0ad42cb682bdb00ae65c01f58a7f"} Mar 11 02:18:07 crc kubenswrapper[4744]: I0311 02:18:07.287629 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Mar 11 02:18:07 crc kubenswrapper[4744]: I0311 02:18:07.418827 4744 scope.go:117] "RemoveContainer" containerID="b080bfc35df75ed87ff3f3d78ae63b7fef9983d3837294065c92b314b4daac63" Mar 11 02:18:07 crc kubenswrapper[4744]: I0311 02:18:07.475571 4744 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-55c76fd6b7-mzqkh"] Mar 11 02:18:07 crc kubenswrapper[4744]: I0311 02:18:07.510991 4744 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-55c76fd6b7-mzqkh"] Mar 11 02:18:07 crc kubenswrapper[4744]: I0311 02:18:07.542146 4744 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-c44667757-qq98n"] Mar 11 02:18:07 crc kubenswrapper[4744]: I0311 02:18:07.550141 4744 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-c44667757-qq98n"] Mar 11 02:18:07 crc kubenswrapper[4744]: W0311 02:18:07.635971 4744 manager.go:1169] Failed to process 
watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda6e19d5d_20f5_4836_afcc_a5958a01bbf2.slice/crio-6b3fec3c3e861ea5a0cc68c5324a3f606779400d5f4c3b0a7611fdb6a9d94cda WatchSource:0}: Error finding container 6b3fec3c3e861ea5a0cc68c5324a3f606779400d5f4c3b0a7611fdb6a9d94cda: Status 404 returned error can't find the container with id 6b3fec3c3e861ea5a0cc68c5324a3f606779400d5f4c3b0a7611fdb6a9d94cda Mar 11 02:18:07 crc kubenswrapper[4744]: I0311 02:18:07.639252 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-galera-0"] Mar 11 02:18:07 crc kubenswrapper[4744]: I0311 02:18:07.991261 4744 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7d4298c0-6fc4-408e-9c4e-229479486dfe" path="/var/lib/kubelet/pods/7d4298c0-6fc4-408e-9c4e-229479486dfe/volumes" Mar 11 02:18:07 crc kubenswrapper[4744]: I0311 02:18:07.993828 4744 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c2b128c1-8690-4f5b-8d3b-b7bb2545b79c" path="/var/lib/kubelet/pods/c2b128c1-8690-4f5b-8d3b-b7bb2545b79c/volumes" Mar 11 02:18:08 crc kubenswrapper[4744]: I0311 02:18:08.152741 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstack-cell1-galera-0"] Mar 11 02:18:08 crc kubenswrapper[4744]: I0311 02:18:08.154434 4744 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstack-cell1-galera-0" Mar 11 02:18:08 crc kubenswrapper[4744]: I0311 02:18:08.156781 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"galera-openstack-cell1-dockercfg-vp7bx" Mar 11 02:18:08 crc kubenswrapper[4744]: I0311 02:18:08.158778 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-galera-openstack-cell1-svc" Mar 11 02:18:08 crc kubenswrapper[4744]: I0311 02:18:08.159193 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-cell1-config-data" Mar 11 02:18:08 crc kubenswrapper[4744]: I0311 02:18:08.159233 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-cell1-scripts" Mar 11 02:18:08 crc kubenswrapper[4744]: I0311 02:18:08.215161 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-cell1-galera-0"] Mar 11 02:18:08 crc kubenswrapper[4744]: I0311 02:18:08.294397 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"c54e9a99-5c2c-48df-a5c0-75fb8727a328","Type":"ContainerStarted","Data":"947621d45cfb896f5b5cc9e379963298c64e953b5dfb03f7e69683336842f434"} Mar 11 02:18:08 crc kubenswrapper[4744]: I0311 02:18:08.296212 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-ff89b6977-xqrvm" event={"ID":"8b1f9328-7210-412e-9d4d-1d1a8b6804dc","Type":"ContainerStarted","Data":"00c4fb01a5bb187a5e67e9974147771760103b19c876ca5ac454d10c43e33222"} Mar 11 02:18:08 crc kubenswrapper[4744]: I0311 02:18:08.296416 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-ff89b6977-xqrvm" Mar 11 02:18:08 crc kubenswrapper[4744]: I0311 02:18:08.299411 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5fb77f9685-nm4xr" 
event={"ID":"defdeafa-942d-4432-bb2c-c8e1176ce936","Type":"ContainerStarted","Data":"bb5e62ad4bd255387d738663d3f6dc8923bd0da4fb5f4d5278ff58987451b86b"} Mar 11 02:18:08 crc kubenswrapper[4744]: I0311 02:18:08.299856 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-5fb77f9685-nm4xr" Mar 11 02:18:08 crc kubenswrapper[4744]: I0311 02:18:08.301194 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"a6e19d5d-20f5-4836-afcc-a5958a01bbf2","Type":"ContainerStarted","Data":"80d9dfe14690a58e1f27ec3b945b74cda1edf4f7ac6a252534d472b22b200d29"} Mar 11 02:18:08 crc kubenswrapper[4744]: I0311 02:18:08.301298 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"a6e19d5d-20f5-4836-afcc-a5958a01bbf2","Type":"ContainerStarted","Data":"6b3fec3c3e861ea5a0cc68c5324a3f606779400d5f4c3b0a7611fdb6a9d94cda"} Mar 11 02:18:08 crc kubenswrapper[4744]: I0311 02:18:08.301964 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"f6b347eb-1bcc-4fa4-96c4-c15523778e9c","Type":"ContainerStarted","Data":"7826fcb383701dd15de9f51444f013455c812635534265e63dc2fb9e43d27c8d"} Mar 11 02:18:08 crc kubenswrapper[4744]: I0311 02:18:08.307808 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w99wn\" (UniqueName: \"kubernetes.io/projected/776b5477-2a1c-4938-9f48-c165db85c160-kube-api-access-w99wn\") pod \"openstack-cell1-galera-0\" (UID: \"776b5477-2a1c-4938-9f48-c165db85c160\") " pod="openstack/openstack-cell1-galera-0" Mar 11 02:18:08 crc kubenswrapper[4744]: I0311 02:18:08.307853 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/776b5477-2a1c-4938-9f48-c165db85c160-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: 
\"776b5477-2a1c-4938-9f48-c165db85c160\") " pod="openstack/openstack-cell1-galera-0" Mar 11 02:18:08 crc kubenswrapper[4744]: I0311 02:18:08.307899 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/776b5477-2a1c-4938-9f48-c165db85c160-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"776b5477-2a1c-4938-9f48-c165db85c160\") " pod="openstack/openstack-cell1-galera-0" Mar 11 02:18:08 crc kubenswrapper[4744]: I0311 02:18:08.307935 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-e26b9545-04f1-4f37-a685-cf63ea6a34ab\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-e26b9545-04f1-4f37-a685-cf63ea6a34ab\") pod \"openstack-cell1-galera-0\" (UID: \"776b5477-2a1c-4938-9f48-c165db85c160\") " pod="openstack/openstack-cell1-galera-0" Mar 11 02:18:08 crc kubenswrapper[4744]: I0311 02:18:08.307972 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/776b5477-2a1c-4938-9f48-c165db85c160-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"776b5477-2a1c-4938-9f48-c165db85c160\") " pod="openstack/openstack-cell1-galera-0" Mar 11 02:18:08 crc kubenswrapper[4744]: I0311 02:18:08.308001 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/776b5477-2a1c-4938-9f48-c165db85c160-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"776b5477-2a1c-4938-9f48-c165db85c160\") " pod="openstack/openstack-cell1-galera-0" Mar 11 02:18:08 crc kubenswrapper[4744]: I0311 02:18:08.308023 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: 
\"kubernetes.io/empty-dir/776b5477-2a1c-4938-9f48-c165db85c160-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"776b5477-2a1c-4938-9f48-c165db85c160\") " pod="openstack/openstack-cell1-galera-0" Mar 11 02:18:08 crc kubenswrapper[4744]: I0311 02:18:08.308041 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/776b5477-2a1c-4938-9f48-c165db85c160-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"776b5477-2a1c-4938-9f48-c165db85c160\") " pod="openstack/openstack-cell1-galera-0" Mar 11 02:18:08 crc kubenswrapper[4744]: I0311 02:18:08.387958 4744 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-5fb77f9685-nm4xr" podStartSLOduration=4.387940776 podStartE2EDuration="4.387940776s" podCreationTimestamp="2026-03-11 02:18:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-11 02:18:08.359367395 +0000 UTC m=+5045.163585000" watchObservedRunningTime="2026-03-11 02:18:08.387940776 +0000 UTC m=+5045.192158381" Mar 11 02:18:08 crc kubenswrapper[4744]: I0311 02:18:08.392814 4744 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-ff89b6977-xqrvm" podStartSLOduration=4.392779846 podStartE2EDuration="4.392779846s" podCreationTimestamp="2026-03-11 02:18:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-11 02:18:08.386490402 +0000 UTC m=+5045.190708007" watchObservedRunningTime="2026-03-11 02:18:08.392779846 +0000 UTC m=+5045.196997451" Mar 11 02:18:08 crc kubenswrapper[4744]: I0311 02:18:08.409080 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: 
\"kubernetes.io/configmap/776b5477-2a1c-4938-9f48-c165db85c160-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"776b5477-2a1c-4938-9f48-c165db85c160\") " pod="openstack/openstack-cell1-galera-0" Mar 11 02:18:08 crc kubenswrapper[4744]: I0311 02:18:08.409343 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w99wn\" (UniqueName: \"kubernetes.io/projected/776b5477-2a1c-4938-9f48-c165db85c160-kube-api-access-w99wn\") pod \"openstack-cell1-galera-0\" (UID: \"776b5477-2a1c-4938-9f48-c165db85c160\") " pod="openstack/openstack-cell1-galera-0" Mar 11 02:18:08 crc kubenswrapper[4744]: I0311 02:18:08.409419 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/776b5477-2a1c-4938-9f48-c165db85c160-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"776b5477-2a1c-4938-9f48-c165db85c160\") " pod="openstack/openstack-cell1-galera-0" Mar 11 02:18:08 crc kubenswrapper[4744]: I0311 02:18:08.410370 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/776b5477-2a1c-4938-9f48-c165db85c160-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"776b5477-2a1c-4938-9f48-c165db85c160\") " pod="openstack/openstack-cell1-galera-0" Mar 11 02:18:08 crc kubenswrapper[4744]: I0311 02:18:08.410505 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/776b5477-2a1c-4938-9f48-c165db85c160-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"776b5477-2a1c-4938-9f48-c165db85c160\") " pod="openstack/openstack-cell1-galera-0" Mar 11 02:18:08 crc kubenswrapper[4744]: I0311 02:18:08.410530 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-e26b9545-04f1-4f37-a685-cf63ea6a34ab\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-e26b9545-04f1-4f37-a685-cf63ea6a34ab\") pod \"openstack-cell1-galera-0\" (UID: \"776b5477-2a1c-4938-9f48-c165db85c160\") " pod="openstack/openstack-cell1-galera-0" Mar 11 02:18:08 crc kubenswrapper[4744]: I0311 02:18:08.410770 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/776b5477-2a1c-4938-9f48-c165db85c160-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"776b5477-2a1c-4938-9f48-c165db85c160\") " pod="openstack/openstack-cell1-galera-0" Mar 11 02:18:08 crc kubenswrapper[4744]: I0311 02:18:08.410916 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/776b5477-2a1c-4938-9f48-c165db85c160-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"776b5477-2a1c-4938-9f48-c165db85c160\") " pod="openstack/openstack-cell1-galera-0" Mar 11 02:18:08 crc kubenswrapper[4744]: I0311 02:18:08.410968 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/776b5477-2a1c-4938-9f48-c165db85c160-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"776b5477-2a1c-4938-9f48-c165db85c160\") " pod="openstack/openstack-cell1-galera-0" Mar 11 02:18:08 crc kubenswrapper[4744]: I0311 02:18:08.411926 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/776b5477-2a1c-4938-9f48-c165db85c160-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"776b5477-2a1c-4938-9f48-c165db85c160\") " pod="openstack/openstack-cell1-galera-0" Mar 11 02:18:08 crc kubenswrapper[4744]: I0311 02:18:08.412620 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: 
\"kubernetes.io/configmap/776b5477-2a1c-4938-9f48-c165db85c160-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"776b5477-2a1c-4938-9f48-c165db85c160\") " pod="openstack/openstack-cell1-galera-0" Mar 11 02:18:08 crc kubenswrapper[4744]: I0311 02:18:08.412701 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/776b5477-2a1c-4938-9f48-c165db85c160-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"776b5477-2a1c-4938-9f48-c165db85c160\") " pod="openstack/openstack-cell1-galera-0" Mar 11 02:18:08 crc kubenswrapper[4744]: I0311 02:18:08.417159 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/776b5477-2a1c-4938-9f48-c165db85c160-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"776b5477-2a1c-4938-9f48-c165db85c160\") " pod="openstack/openstack-cell1-galera-0" Mar 11 02:18:08 crc kubenswrapper[4744]: I0311 02:18:08.424588 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/776b5477-2a1c-4938-9f48-c165db85c160-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"776b5477-2a1c-4938-9f48-c165db85c160\") " pod="openstack/openstack-cell1-galera-0" Mar 11 02:18:08 crc kubenswrapper[4744]: I0311 02:18:08.426145 4744 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Mar 11 02:18:08 crc kubenswrapper[4744]: I0311 02:18:08.426184 4744 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-e26b9545-04f1-4f37-a685-cf63ea6a34ab\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-e26b9545-04f1-4f37-a685-cf63ea6a34ab\") pod \"openstack-cell1-galera-0\" (UID: \"776b5477-2a1c-4938-9f48-c165db85c160\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/e6aada9bb964cdeef8d60dfba3b6912ea1222f71e8caa2484e97f636934e930c/globalmount\"" pod="openstack/openstack-cell1-galera-0" Mar 11 02:18:08 crc kubenswrapper[4744]: I0311 02:18:08.440136 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w99wn\" (UniqueName: \"kubernetes.io/projected/776b5477-2a1c-4938-9f48-c165db85c160-kube-api-access-w99wn\") pod \"openstack-cell1-galera-0\" (UID: \"776b5477-2a1c-4938-9f48-c165db85c160\") " pod="openstack/openstack-cell1-galera-0" Mar 11 02:18:08 crc kubenswrapper[4744]: I0311 02:18:08.499155 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-e26b9545-04f1-4f37-a685-cf63ea6a34ab\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-e26b9545-04f1-4f37-a685-cf63ea6a34ab\") pod \"openstack-cell1-galera-0\" (UID: \"776b5477-2a1c-4938-9f48-c165db85c160\") " pod="openstack/openstack-cell1-galera-0" Mar 11 02:18:08 crc kubenswrapper[4744]: I0311 02:18:08.518109 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-cell1-galera-0" Mar 11 02:18:08 crc kubenswrapper[4744]: I0311 02:18:08.708262 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/memcached-0"] Mar 11 02:18:08 crc kubenswrapper[4744]: I0311 02:18:08.709398 4744 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/memcached-0" Mar 11 02:18:08 crc kubenswrapper[4744]: I0311 02:18:08.712967 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-memcached-svc" Mar 11 02:18:08 crc kubenswrapper[4744]: I0311 02:18:08.712975 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"memcached-config-data" Mar 11 02:18:08 crc kubenswrapper[4744]: I0311 02:18:08.713342 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"memcached-memcached-dockercfg-bkgnx" Mar 11 02:18:08 crc kubenswrapper[4744]: I0311 02:18:08.731703 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/memcached-0"] Mar 11 02:18:08 crc kubenswrapper[4744]: I0311 02:18:08.817701 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c82fl\" (UniqueName: \"kubernetes.io/projected/c1383c00-c842-43c0-a8c1-c81c615b6e8e-kube-api-access-c82fl\") pod \"memcached-0\" (UID: \"c1383c00-c842-43c0-a8c1-c81c615b6e8e\") " pod="openstack/memcached-0" Mar 11 02:18:08 crc kubenswrapper[4744]: I0311 02:18:08.817913 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/c1383c00-c842-43c0-a8c1-c81c615b6e8e-memcached-tls-certs\") pod \"memcached-0\" (UID: \"c1383c00-c842-43c0-a8c1-c81c615b6e8e\") " pod="openstack/memcached-0" Mar 11 02:18:08 crc kubenswrapper[4744]: I0311 02:18:08.817942 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c1383c00-c842-43c0-a8c1-c81c615b6e8e-combined-ca-bundle\") pod \"memcached-0\" (UID: \"c1383c00-c842-43c0-a8c1-c81c615b6e8e\") " pod="openstack/memcached-0" Mar 11 02:18:08 crc kubenswrapper[4744]: I0311 02:18:08.817998 4744 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/c1383c00-c842-43c0-a8c1-c81c615b6e8e-kolla-config\") pod \"memcached-0\" (UID: \"c1383c00-c842-43c0-a8c1-c81c615b6e8e\") " pod="openstack/memcached-0" Mar 11 02:18:08 crc kubenswrapper[4744]: I0311 02:18:08.818039 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/c1383c00-c842-43c0-a8c1-c81c615b6e8e-config-data\") pod \"memcached-0\" (UID: \"c1383c00-c842-43c0-a8c1-c81c615b6e8e\") " pod="openstack/memcached-0" Mar 11 02:18:08 crc kubenswrapper[4744]: I0311 02:18:08.920051 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/c1383c00-c842-43c0-a8c1-c81c615b6e8e-memcached-tls-certs\") pod \"memcached-0\" (UID: \"c1383c00-c842-43c0-a8c1-c81c615b6e8e\") " pod="openstack/memcached-0" Mar 11 02:18:08 crc kubenswrapper[4744]: I0311 02:18:08.920129 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c1383c00-c842-43c0-a8c1-c81c615b6e8e-combined-ca-bundle\") pod \"memcached-0\" (UID: \"c1383c00-c842-43c0-a8c1-c81c615b6e8e\") " pod="openstack/memcached-0" Mar 11 02:18:08 crc kubenswrapper[4744]: I0311 02:18:08.920223 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/c1383c00-c842-43c0-a8c1-c81c615b6e8e-kolla-config\") pod \"memcached-0\" (UID: \"c1383c00-c842-43c0-a8c1-c81c615b6e8e\") " pod="openstack/memcached-0" Mar 11 02:18:08 crc kubenswrapper[4744]: I0311 02:18:08.920312 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/c1383c00-c842-43c0-a8c1-c81c615b6e8e-config-data\") pod \"memcached-0\" (UID: 
\"c1383c00-c842-43c0-a8c1-c81c615b6e8e\") " pod="openstack/memcached-0" Mar 11 02:18:08 crc kubenswrapper[4744]: I0311 02:18:08.920499 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c82fl\" (UniqueName: \"kubernetes.io/projected/c1383c00-c842-43c0-a8c1-c81c615b6e8e-kube-api-access-c82fl\") pod \"memcached-0\" (UID: \"c1383c00-c842-43c0-a8c1-c81c615b6e8e\") " pod="openstack/memcached-0" Mar 11 02:18:08 crc kubenswrapper[4744]: I0311 02:18:08.922772 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/c1383c00-c842-43c0-a8c1-c81c615b6e8e-config-data\") pod \"memcached-0\" (UID: \"c1383c00-c842-43c0-a8c1-c81c615b6e8e\") " pod="openstack/memcached-0" Mar 11 02:18:08 crc kubenswrapper[4744]: I0311 02:18:08.922908 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/c1383c00-c842-43c0-a8c1-c81c615b6e8e-kolla-config\") pod \"memcached-0\" (UID: \"c1383c00-c842-43c0-a8c1-c81c615b6e8e\") " pod="openstack/memcached-0" Mar 11 02:18:08 crc kubenswrapper[4744]: I0311 02:18:08.924850 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/c1383c00-c842-43c0-a8c1-c81c615b6e8e-memcached-tls-certs\") pod \"memcached-0\" (UID: \"c1383c00-c842-43c0-a8c1-c81c615b6e8e\") " pod="openstack/memcached-0" Mar 11 02:18:08 crc kubenswrapper[4744]: I0311 02:18:08.926146 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c1383c00-c842-43c0-a8c1-c81c615b6e8e-combined-ca-bundle\") pod \"memcached-0\" (UID: \"c1383c00-c842-43c0-a8c1-c81c615b6e8e\") " pod="openstack/memcached-0" Mar 11 02:18:08 crc kubenswrapper[4744]: I0311 02:18:08.943438 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c82fl\" 
(UniqueName: \"kubernetes.io/projected/c1383c00-c842-43c0-a8c1-c81c615b6e8e-kube-api-access-c82fl\") pod \"memcached-0\" (UID: \"c1383c00-c842-43c0-a8c1-c81c615b6e8e\") " pod="openstack/memcached-0" Mar 11 02:18:08 crc kubenswrapper[4744]: I0311 02:18:08.968954 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-cell1-galera-0"] Mar 11 02:18:09 crc kubenswrapper[4744]: I0311 02:18:09.039413 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/memcached-0" Mar 11 02:18:09 crc kubenswrapper[4744]: I0311 02:18:09.311969 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"776b5477-2a1c-4938-9f48-c165db85c160","Type":"ContainerStarted","Data":"3f083960b990af224852992ac780801e27b15e975cd239501b3a7113aad98de9"} Mar 11 02:18:09 crc kubenswrapper[4744]: I0311 02:18:09.312294 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"776b5477-2a1c-4938-9f48-c165db85c160","Type":"ContainerStarted","Data":"afd2fce38a4749cd32f3643131dd80285b143eb023d0fc588d5fd331902e019b"} Mar 11 02:18:09 crc kubenswrapper[4744]: I0311 02:18:09.313444 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"f6b347eb-1bcc-4fa4-96c4-c15523778e9c","Type":"ContainerStarted","Data":"8cd19c2c7ffc99236ff0804a1d3ac5e9b65025c622357a50b91340eb2a661fca"} Mar 11 02:18:09 crc kubenswrapper[4744]: W0311 02:18:09.463583 4744 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc1383c00_c842_43c0_a8c1_c81c615b6e8e.slice/crio-9913de712e5a16a8f143a448cc730fdc55b76a1fa48b7111b73ce5525f596d47 WatchSource:0}: Error finding container 9913de712e5a16a8f143a448cc730fdc55b76a1fa48b7111b73ce5525f596d47: Status 404 returned error can't find the container with id 9913de712e5a16a8f143a448cc730fdc55b76a1fa48b7111b73ce5525f596d47 Mar 11 
02:18:09 crc kubenswrapper[4744]: I0311 02:18:09.479489 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/memcached-0"] Mar 11 02:18:10 crc kubenswrapper[4744]: I0311 02:18:10.326286 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" event={"ID":"c1383c00-c842-43c0-a8c1-c81c615b6e8e","Type":"ContainerStarted","Data":"ba6b5167fe9b5484a2239c9ec1278ba6e07b0e2ff98c72e780d9d4c829364f4d"} Mar 11 02:18:10 crc kubenswrapper[4744]: I0311 02:18:10.326620 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" event={"ID":"c1383c00-c842-43c0-a8c1-c81c615b6e8e","Type":"ContainerStarted","Data":"9913de712e5a16a8f143a448cc730fdc55b76a1fa48b7111b73ce5525f596d47"} Mar 11 02:18:10 crc kubenswrapper[4744]: I0311 02:18:10.328575 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/memcached-0" Mar 11 02:18:10 crc kubenswrapper[4744]: I0311 02:18:10.359290 4744 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/memcached-0" podStartSLOduration=2.359254138 podStartE2EDuration="2.359254138s" podCreationTimestamp="2026-03-11 02:18:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-11 02:18:10.350471867 +0000 UTC m=+5047.154689532" watchObservedRunningTime="2026-03-11 02:18:10.359254138 +0000 UTC m=+5047.163471793" Mar 11 02:18:12 crc kubenswrapper[4744]: I0311 02:18:12.352049 4744 generic.go:334] "Generic (PLEG): container finished" podID="a6e19d5d-20f5-4836-afcc-a5958a01bbf2" containerID="80d9dfe14690a58e1f27ec3b945b74cda1edf4f7ac6a252534d472b22b200d29" exitCode=0 Mar 11 02:18:12 crc kubenswrapper[4744]: I0311 02:18:12.352188 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" 
event={"ID":"a6e19d5d-20f5-4836-afcc-a5958a01bbf2","Type":"ContainerDied","Data":"80d9dfe14690a58e1f27ec3b945b74cda1edf4f7ac6a252534d472b22b200d29"} Mar 11 02:18:13 crc kubenswrapper[4744]: I0311 02:18:13.365780 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"a6e19d5d-20f5-4836-afcc-a5958a01bbf2","Type":"ContainerStarted","Data":"7d831950fe541bf6b340eec1298ef9e4231f76f42c3ac40bfb56e2519325fa3c"} Mar 11 02:18:13 crc kubenswrapper[4744]: I0311 02:18:13.368118 4744 generic.go:334] "Generic (PLEG): container finished" podID="776b5477-2a1c-4938-9f48-c165db85c160" containerID="3f083960b990af224852992ac780801e27b15e975cd239501b3a7113aad98de9" exitCode=0 Mar 11 02:18:13 crc kubenswrapper[4744]: I0311 02:18:13.368186 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"776b5477-2a1c-4938-9f48-c165db85c160","Type":"ContainerDied","Data":"3f083960b990af224852992ac780801e27b15e975cd239501b3a7113aad98de9"} Mar 11 02:18:13 crc kubenswrapper[4744]: I0311 02:18:13.405189 4744 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstack-galera-0" podStartSLOduration=8.405160545 podStartE2EDuration="8.405160545s" podCreationTimestamp="2026-03-11 02:18:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-11 02:18:13.398764237 +0000 UTC m=+5050.202981872" watchObservedRunningTime="2026-03-11 02:18:13.405160545 +0000 UTC m=+5050.209378180" Mar 11 02:18:14 crc kubenswrapper[4744]: I0311 02:18:14.042718 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/memcached-0" Mar 11 02:18:14 crc kubenswrapper[4744]: I0311 02:18:14.377229 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" 
event={"ID":"776b5477-2a1c-4938-9f48-c165db85c160","Type":"ContainerStarted","Data":"b3e5ffbb3d23247426e3e7c34f4ec740e5f76d87d4affbe8d2d8ea4bc32adaf1"} Mar 11 02:18:14 crc kubenswrapper[4744]: I0311 02:18:14.406031 4744 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstack-cell1-galera-0" podStartSLOduration=7.406001985 podStartE2EDuration="7.406001985s" podCreationTimestamp="2026-03-11 02:18:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-11 02:18:14.398910985 +0000 UTC m=+5051.203128600" watchObservedRunningTime="2026-03-11 02:18:14.406001985 +0000 UTC m=+5051.210219600" Mar 11 02:18:14 crc kubenswrapper[4744]: I0311 02:18:14.805802 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-5fb77f9685-nm4xr" Mar 11 02:18:15 crc kubenswrapper[4744]: I0311 02:18:15.206982 4744 scope.go:117] "RemoveContainer" containerID="8c3928e4ec7e5bc9173bdc1f879fb35a19a9ce81fdb67bfc6438f584fd72ac27" Mar 11 02:18:15 crc kubenswrapper[4744]: I0311 02:18:15.490548 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-ff89b6977-xqrvm" Mar 11 02:18:15 crc kubenswrapper[4744]: I0311 02:18:15.549340 4744 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5fb77f9685-nm4xr"] Mar 11 02:18:15 crc kubenswrapper[4744]: I0311 02:18:15.549614 4744 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-5fb77f9685-nm4xr" podUID="defdeafa-942d-4432-bb2c-c8e1176ce936" containerName="dnsmasq-dns" containerID="cri-o://bb5e62ad4bd255387d738663d3f6dc8923bd0da4fb5f4d5278ff58987451b86b" gracePeriod=10 Mar 11 02:18:16 crc kubenswrapper[4744]: I0311 02:18:16.297690 4744 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5fb77f9685-nm4xr" Mar 11 02:18:16 crc kubenswrapper[4744]: I0311 02:18:16.393996 4744 generic.go:334] "Generic (PLEG): container finished" podID="defdeafa-942d-4432-bb2c-c8e1176ce936" containerID="bb5e62ad4bd255387d738663d3f6dc8923bd0da4fb5f4d5278ff58987451b86b" exitCode=0 Mar 11 02:18:16 crc kubenswrapper[4744]: I0311 02:18:16.394047 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5fb77f9685-nm4xr" event={"ID":"defdeafa-942d-4432-bb2c-c8e1176ce936","Type":"ContainerDied","Data":"bb5e62ad4bd255387d738663d3f6dc8923bd0da4fb5f4d5278ff58987451b86b"} Mar 11 02:18:16 crc kubenswrapper[4744]: I0311 02:18:16.394080 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5fb77f9685-nm4xr" event={"ID":"defdeafa-942d-4432-bb2c-c8e1176ce936","Type":"ContainerDied","Data":"4802088f2b376b4b381360cf3f4a2856a37629c174f37040e123f6c55244b866"} Mar 11 02:18:16 crc kubenswrapper[4744]: I0311 02:18:16.394103 4744 scope.go:117] "RemoveContainer" containerID="bb5e62ad4bd255387d738663d3f6dc8923bd0da4fb5f4d5278ff58987451b86b" Mar 11 02:18:16 crc kubenswrapper[4744]: I0311 02:18:16.394109 4744 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5fb77f9685-nm4xr" Mar 11 02:18:16 crc kubenswrapper[4744]: I0311 02:18:16.414066 4744 scope.go:117] "RemoveContainer" containerID="441c557a948b7db640c405e67ff45aa288cc063de82f23528b1b553857a8f111" Mar 11 02:18:16 crc kubenswrapper[4744]: I0311 02:18:16.441291 4744 scope.go:117] "RemoveContainer" containerID="bb5e62ad4bd255387d738663d3f6dc8923bd0da4fb5f4d5278ff58987451b86b" Mar 11 02:18:16 crc kubenswrapper[4744]: I0311 02:18:16.441614 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/defdeafa-942d-4432-bb2c-c8e1176ce936-dns-svc\") pod \"defdeafa-942d-4432-bb2c-c8e1176ce936\" (UID: \"defdeafa-942d-4432-bb2c-c8e1176ce936\") " Mar 11 02:18:16 crc kubenswrapper[4744]: I0311 02:18:16.441690 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rwm2m\" (UniqueName: \"kubernetes.io/projected/defdeafa-942d-4432-bb2c-c8e1176ce936-kube-api-access-rwm2m\") pod \"defdeafa-942d-4432-bb2c-c8e1176ce936\" (UID: \"defdeafa-942d-4432-bb2c-c8e1176ce936\") " Mar 11 02:18:16 crc kubenswrapper[4744]: I0311 02:18:16.441749 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/defdeafa-942d-4432-bb2c-c8e1176ce936-config\") pod \"defdeafa-942d-4432-bb2c-c8e1176ce936\" (UID: \"defdeafa-942d-4432-bb2c-c8e1176ce936\") " Mar 11 02:18:16 crc kubenswrapper[4744]: E0311 02:18:16.441814 4744 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bb5e62ad4bd255387d738663d3f6dc8923bd0da4fb5f4d5278ff58987451b86b\": container with ID starting with bb5e62ad4bd255387d738663d3f6dc8923bd0da4fb5f4d5278ff58987451b86b not found: ID does not exist" containerID="bb5e62ad4bd255387d738663d3f6dc8923bd0da4fb5f4d5278ff58987451b86b" Mar 11 02:18:16 crc kubenswrapper[4744]: I0311 
02:18:16.441851 4744 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bb5e62ad4bd255387d738663d3f6dc8923bd0da4fb5f4d5278ff58987451b86b"} err="failed to get container status \"bb5e62ad4bd255387d738663d3f6dc8923bd0da4fb5f4d5278ff58987451b86b\": rpc error: code = NotFound desc = could not find container \"bb5e62ad4bd255387d738663d3f6dc8923bd0da4fb5f4d5278ff58987451b86b\": container with ID starting with bb5e62ad4bd255387d738663d3f6dc8923bd0da4fb5f4d5278ff58987451b86b not found: ID does not exist" Mar 11 02:18:16 crc kubenswrapper[4744]: I0311 02:18:16.441875 4744 scope.go:117] "RemoveContainer" containerID="441c557a948b7db640c405e67ff45aa288cc063de82f23528b1b553857a8f111" Mar 11 02:18:16 crc kubenswrapper[4744]: E0311 02:18:16.444974 4744 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"441c557a948b7db640c405e67ff45aa288cc063de82f23528b1b553857a8f111\": container with ID starting with 441c557a948b7db640c405e67ff45aa288cc063de82f23528b1b553857a8f111 not found: ID does not exist" containerID="441c557a948b7db640c405e67ff45aa288cc063de82f23528b1b553857a8f111" Mar 11 02:18:16 crc kubenswrapper[4744]: I0311 02:18:16.445026 4744 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"441c557a948b7db640c405e67ff45aa288cc063de82f23528b1b553857a8f111"} err="failed to get container status \"441c557a948b7db640c405e67ff45aa288cc063de82f23528b1b553857a8f111\": rpc error: code = NotFound desc = could not find container \"441c557a948b7db640c405e67ff45aa288cc063de82f23528b1b553857a8f111\": container with ID starting with 441c557a948b7db640c405e67ff45aa288cc063de82f23528b1b553857a8f111 not found: ID does not exist" Mar 11 02:18:16 crc kubenswrapper[4744]: I0311 02:18:16.449148 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/projected/defdeafa-942d-4432-bb2c-c8e1176ce936-kube-api-access-rwm2m" (OuterVolumeSpecName: "kube-api-access-rwm2m") pod "defdeafa-942d-4432-bb2c-c8e1176ce936" (UID: "defdeafa-942d-4432-bb2c-c8e1176ce936"). InnerVolumeSpecName "kube-api-access-rwm2m". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 02:18:16 crc kubenswrapper[4744]: I0311 02:18:16.474132 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/defdeafa-942d-4432-bb2c-c8e1176ce936-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "defdeafa-942d-4432-bb2c-c8e1176ce936" (UID: "defdeafa-942d-4432-bb2c-c8e1176ce936"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 11 02:18:16 crc kubenswrapper[4744]: I0311 02:18:16.481596 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/defdeafa-942d-4432-bb2c-c8e1176ce936-config" (OuterVolumeSpecName: "config") pod "defdeafa-942d-4432-bb2c-c8e1176ce936" (UID: "defdeafa-942d-4432-bb2c-c8e1176ce936"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 11 02:18:16 crc kubenswrapper[4744]: I0311 02:18:16.543120 4744 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/defdeafa-942d-4432-bb2c-c8e1176ce936-config\") on node \"crc\" DevicePath \"\"" Mar 11 02:18:16 crc kubenswrapper[4744]: I0311 02:18:16.543157 4744 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/defdeafa-942d-4432-bb2c-c8e1176ce936-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 11 02:18:16 crc kubenswrapper[4744]: I0311 02:18:16.543173 4744 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rwm2m\" (UniqueName: \"kubernetes.io/projected/defdeafa-942d-4432-bb2c-c8e1176ce936-kube-api-access-rwm2m\") on node \"crc\" DevicePath \"\"" Mar 11 02:18:16 crc kubenswrapper[4744]: I0311 02:18:16.741255 4744 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5fb77f9685-nm4xr"] Mar 11 02:18:16 crc kubenswrapper[4744]: I0311 02:18:16.751703 4744 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-5fb77f9685-nm4xr"] Mar 11 02:18:16 crc kubenswrapper[4744]: I0311 02:18:16.975080 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/openstack-galera-0" Mar 11 02:18:16 crc kubenswrapper[4744]: I0311 02:18:16.975417 4744 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/openstack-galera-0" Mar 11 02:18:17 crc kubenswrapper[4744]: I0311 02:18:17.990204 4744 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="defdeafa-942d-4432-bb2c-c8e1176ce936" path="/var/lib/kubelet/pods/defdeafa-942d-4432-bb2c-c8e1176ce936/volumes" Mar 11 02:18:18 crc kubenswrapper[4744]: I0311 02:18:18.518402 4744 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/openstack-cell1-galera-0" Mar 11 02:18:18 crc kubenswrapper[4744]: I0311 
02:18:18.518503 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/openstack-cell1-galera-0" Mar 11 02:18:19 crc kubenswrapper[4744]: I0311 02:18:19.451620 4744 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/openstack-galera-0" Mar 11 02:18:19 crc kubenswrapper[4744]: I0311 02:18:19.521700 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/openstack-galera-0" Mar 11 02:18:21 crc kubenswrapper[4744]: I0311 02:18:21.011393 4744 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/openstack-cell1-galera-0" Mar 11 02:18:21 crc kubenswrapper[4744]: I0311 02:18:21.125999 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/openstack-cell1-galera-0" Mar 11 02:18:21 crc kubenswrapper[4744]: I0311 02:18:21.975435 4744 scope.go:117] "RemoveContainer" containerID="6c2c2442692de599f07518611dfa7798a497d5f80b894d7b831bca9043e56ffc" Mar 11 02:18:21 crc kubenswrapper[4744]: E0311 02:18:21.976149 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-678nx_openshift-machine-config-operator(a15dc7ac-7c34-4135-b6eb-a85122800ce9)\"" pod="openshift-machine-config-operator/machine-config-daemon-678nx" podUID="a15dc7ac-7c34-4135-b6eb-a85122800ce9" Mar 11 02:18:25 crc kubenswrapper[4744]: I0311 02:18:25.631288 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/root-account-create-update-kkcqr"] Mar 11 02:18:25 crc kubenswrapper[4744]: E0311 02:18:25.632028 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="defdeafa-942d-4432-bb2c-c8e1176ce936" containerName="dnsmasq-dns" Mar 11 02:18:25 crc kubenswrapper[4744]: I0311 02:18:25.632051 4744 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="defdeafa-942d-4432-bb2c-c8e1176ce936" containerName="dnsmasq-dns" Mar 11 02:18:25 crc kubenswrapper[4744]: E0311 02:18:25.632081 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="defdeafa-942d-4432-bb2c-c8e1176ce936" containerName="init" Mar 11 02:18:25 crc kubenswrapper[4744]: I0311 02:18:25.632093 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="defdeafa-942d-4432-bb2c-c8e1176ce936" containerName="init" Mar 11 02:18:25 crc kubenswrapper[4744]: I0311 02:18:25.632402 4744 memory_manager.go:354] "RemoveStaleState removing state" podUID="defdeafa-942d-4432-bb2c-c8e1176ce936" containerName="dnsmasq-dns" Mar 11 02:18:25 crc kubenswrapper[4744]: I0311 02:18:25.633157 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-kkcqr" Mar 11 02:18:25 crc kubenswrapper[4744]: I0311 02:18:25.637368 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-mariadb-root-db-secret" Mar 11 02:18:25 crc kubenswrapper[4744]: I0311 02:18:25.645487 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-kkcqr"] Mar 11 02:18:25 crc kubenswrapper[4744]: I0311 02:18:25.716067 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4lpvb\" (UniqueName: \"kubernetes.io/projected/479acbff-2db0-4745-975c-be4dcddd3911-kube-api-access-4lpvb\") pod \"root-account-create-update-kkcqr\" (UID: \"479acbff-2db0-4745-975c-be4dcddd3911\") " pod="openstack/root-account-create-update-kkcqr" Mar 11 02:18:25 crc kubenswrapper[4744]: I0311 02:18:25.716143 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/479acbff-2db0-4745-975c-be4dcddd3911-operator-scripts\") pod \"root-account-create-update-kkcqr\" (UID: \"479acbff-2db0-4745-975c-be4dcddd3911\") " 
pod="openstack/root-account-create-update-kkcqr" Mar 11 02:18:25 crc kubenswrapper[4744]: I0311 02:18:25.818024 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4lpvb\" (UniqueName: \"kubernetes.io/projected/479acbff-2db0-4745-975c-be4dcddd3911-kube-api-access-4lpvb\") pod \"root-account-create-update-kkcqr\" (UID: \"479acbff-2db0-4745-975c-be4dcddd3911\") " pod="openstack/root-account-create-update-kkcqr" Mar 11 02:18:25 crc kubenswrapper[4744]: I0311 02:18:25.818094 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/479acbff-2db0-4745-975c-be4dcddd3911-operator-scripts\") pod \"root-account-create-update-kkcqr\" (UID: \"479acbff-2db0-4745-975c-be4dcddd3911\") " pod="openstack/root-account-create-update-kkcqr" Mar 11 02:18:25 crc kubenswrapper[4744]: I0311 02:18:25.819313 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/479acbff-2db0-4745-975c-be4dcddd3911-operator-scripts\") pod \"root-account-create-update-kkcqr\" (UID: \"479acbff-2db0-4745-975c-be4dcddd3911\") " pod="openstack/root-account-create-update-kkcqr" Mar 11 02:18:26 crc kubenswrapper[4744]: I0311 02:18:26.076716 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4lpvb\" (UniqueName: \"kubernetes.io/projected/479acbff-2db0-4745-975c-be4dcddd3911-kube-api-access-4lpvb\") pod \"root-account-create-update-kkcqr\" (UID: \"479acbff-2db0-4745-975c-be4dcddd3911\") " pod="openstack/root-account-create-update-kkcqr" Mar 11 02:18:26 crc kubenswrapper[4744]: I0311 02:18:26.270481 4744 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-kkcqr" Mar 11 02:18:26 crc kubenswrapper[4744]: I0311 02:18:26.636401 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-kkcqr"] Mar 11 02:18:27 crc kubenswrapper[4744]: I0311 02:18:27.492564 4744 generic.go:334] "Generic (PLEG): container finished" podID="479acbff-2db0-4745-975c-be4dcddd3911" containerID="5770b245a42de80a839f71f12517e543cd28cf787dba4568bd88ef87c3f896d8" exitCode=0 Mar 11 02:18:27 crc kubenswrapper[4744]: I0311 02:18:27.492647 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-kkcqr" event={"ID":"479acbff-2db0-4745-975c-be4dcddd3911","Type":"ContainerDied","Data":"5770b245a42de80a839f71f12517e543cd28cf787dba4568bd88ef87c3f896d8"} Mar 11 02:18:27 crc kubenswrapper[4744]: I0311 02:18:27.492695 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-kkcqr" event={"ID":"479acbff-2db0-4745-975c-be4dcddd3911","Type":"ContainerStarted","Data":"71d57317b659e882207aac7c7860af1b78de9c06c8d7e07b74ed3c2253e448ba"} Mar 11 02:18:28 crc kubenswrapper[4744]: I0311 02:18:28.916912 4744 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-kkcqr" Mar 11 02:18:28 crc kubenswrapper[4744]: I0311 02:18:28.971481 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/479acbff-2db0-4745-975c-be4dcddd3911-operator-scripts\") pod \"479acbff-2db0-4745-975c-be4dcddd3911\" (UID: \"479acbff-2db0-4745-975c-be4dcddd3911\") " Mar 11 02:18:28 crc kubenswrapper[4744]: I0311 02:18:28.971960 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4lpvb\" (UniqueName: \"kubernetes.io/projected/479acbff-2db0-4745-975c-be4dcddd3911-kube-api-access-4lpvb\") pod \"479acbff-2db0-4745-975c-be4dcddd3911\" (UID: \"479acbff-2db0-4745-975c-be4dcddd3911\") " Mar 11 02:18:28 crc kubenswrapper[4744]: I0311 02:18:28.972677 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/479acbff-2db0-4745-975c-be4dcddd3911-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "479acbff-2db0-4745-975c-be4dcddd3911" (UID: "479acbff-2db0-4745-975c-be4dcddd3911"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 11 02:18:28 crc kubenswrapper[4744]: I0311 02:18:28.977682 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/479acbff-2db0-4745-975c-be4dcddd3911-kube-api-access-4lpvb" (OuterVolumeSpecName: "kube-api-access-4lpvb") pod "479acbff-2db0-4745-975c-be4dcddd3911" (UID: "479acbff-2db0-4745-975c-be4dcddd3911"). InnerVolumeSpecName "kube-api-access-4lpvb". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 02:18:29 crc kubenswrapper[4744]: I0311 02:18:29.073849 4744 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/479acbff-2db0-4745-975c-be4dcddd3911-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 11 02:18:29 crc kubenswrapper[4744]: I0311 02:18:29.073903 4744 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4lpvb\" (UniqueName: \"kubernetes.io/projected/479acbff-2db0-4745-975c-be4dcddd3911-kube-api-access-4lpvb\") on node \"crc\" DevicePath \"\"" Mar 11 02:18:29 crc kubenswrapper[4744]: I0311 02:18:29.512901 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-kkcqr" event={"ID":"479acbff-2db0-4745-975c-be4dcddd3911","Type":"ContainerDied","Data":"71d57317b659e882207aac7c7860af1b78de9c06c8d7e07b74ed3c2253e448ba"} Mar 11 02:18:29 crc kubenswrapper[4744]: I0311 02:18:29.512963 4744 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="71d57317b659e882207aac7c7860af1b78de9c06c8d7e07b74ed3c2253e448ba" Mar 11 02:18:29 crc kubenswrapper[4744]: I0311 02:18:29.512984 4744 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-kkcqr" Mar 11 02:18:32 crc kubenswrapper[4744]: I0311 02:18:32.146417 4744 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/root-account-create-update-kkcqr"] Mar 11 02:18:32 crc kubenswrapper[4744]: I0311 02:18:32.154984 4744 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/root-account-create-update-kkcqr"] Mar 11 02:18:33 crc kubenswrapper[4744]: I0311 02:18:33.989594 4744 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="479acbff-2db0-4745-975c-be4dcddd3911" path="/var/lib/kubelet/pods/479acbff-2db0-4745-975c-be4dcddd3911/volumes" Mar 11 02:18:35 crc kubenswrapper[4744]: I0311 02:18:35.975215 4744 scope.go:117] "RemoveContainer" containerID="6c2c2442692de599f07518611dfa7798a497d5f80b894d7b831bca9043e56ffc" Mar 11 02:18:35 crc kubenswrapper[4744]: E0311 02:18:35.975946 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-678nx_openshift-machine-config-operator(a15dc7ac-7c34-4135-b6eb-a85122800ce9)\"" pod="openshift-machine-config-operator/machine-config-daemon-678nx" podUID="a15dc7ac-7c34-4135-b6eb-a85122800ce9" Mar 11 02:18:37 crc kubenswrapper[4744]: I0311 02:18:37.141281 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/root-account-create-update-mpb95"] Mar 11 02:18:37 crc kubenswrapper[4744]: E0311 02:18:37.141655 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="479acbff-2db0-4745-975c-be4dcddd3911" containerName="mariadb-account-create-update" Mar 11 02:18:37 crc kubenswrapper[4744]: I0311 02:18:37.141668 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="479acbff-2db0-4745-975c-be4dcddd3911" containerName="mariadb-account-create-update" Mar 11 02:18:37 crc kubenswrapper[4744]: I0311 02:18:37.141851 4744 
memory_manager.go:354] "RemoveStaleState removing state" podUID="479acbff-2db0-4745-975c-be4dcddd3911" containerName="mariadb-account-create-update" Mar 11 02:18:37 crc kubenswrapper[4744]: I0311 02:18:37.142402 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-mpb95" Mar 11 02:18:37 crc kubenswrapper[4744]: I0311 02:18:37.144317 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-mariadb-root-db-secret" Mar 11 02:18:37 crc kubenswrapper[4744]: I0311 02:18:37.153741 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-mpb95"] Mar 11 02:18:37 crc kubenswrapper[4744]: I0311 02:18:37.222906 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zhp9s\" (UniqueName: \"kubernetes.io/projected/b5bbcff7-a221-4aeb-8859-da7a80f83ab7-kube-api-access-zhp9s\") pod \"root-account-create-update-mpb95\" (UID: \"b5bbcff7-a221-4aeb-8859-da7a80f83ab7\") " pod="openstack/root-account-create-update-mpb95" Mar 11 02:18:37 crc kubenswrapper[4744]: I0311 02:18:37.223017 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b5bbcff7-a221-4aeb-8859-da7a80f83ab7-operator-scripts\") pod \"root-account-create-update-mpb95\" (UID: \"b5bbcff7-a221-4aeb-8859-da7a80f83ab7\") " pod="openstack/root-account-create-update-mpb95" Mar 11 02:18:37 crc kubenswrapper[4744]: I0311 02:18:37.324259 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zhp9s\" (UniqueName: \"kubernetes.io/projected/b5bbcff7-a221-4aeb-8859-da7a80f83ab7-kube-api-access-zhp9s\") pod \"root-account-create-update-mpb95\" (UID: \"b5bbcff7-a221-4aeb-8859-da7a80f83ab7\") " pod="openstack/root-account-create-update-mpb95" Mar 11 02:18:37 crc kubenswrapper[4744]: I0311 
02:18:37.324333 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b5bbcff7-a221-4aeb-8859-da7a80f83ab7-operator-scripts\") pod \"root-account-create-update-mpb95\" (UID: \"b5bbcff7-a221-4aeb-8859-da7a80f83ab7\") " pod="openstack/root-account-create-update-mpb95" Mar 11 02:18:37 crc kubenswrapper[4744]: I0311 02:18:37.325564 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b5bbcff7-a221-4aeb-8859-da7a80f83ab7-operator-scripts\") pod \"root-account-create-update-mpb95\" (UID: \"b5bbcff7-a221-4aeb-8859-da7a80f83ab7\") " pod="openstack/root-account-create-update-mpb95" Mar 11 02:18:37 crc kubenswrapper[4744]: I0311 02:18:37.355175 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zhp9s\" (UniqueName: \"kubernetes.io/projected/b5bbcff7-a221-4aeb-8859-da7a80f83ab7-kube-api-access-zhp9s\") pod \"root-account-create-update-mpb95\" (UID: \"b5bbcff7-a221-4aeb-8859-da7a80f83ab7\") " pod="openstack/root-account-create-update-mpb95" Mar 11 02:18:37 crc kubenswrapper[4744]: I0311 02:18:37.470819 4744 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-mpb95" Mar 11 02:18:37 crc kubenswrapper[4744]: I0311 02:18:37.922847 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-mpb95"] Mar 11 02:18:38 crc kubenswrapper[4744]: I0311 02:18:38.591851 4744 generic.go:334] "Generic (PLEG): container finished" podID="b5bbcff7-a221-4aeb-8859-da7a80f83ab7" containerID="5e1d2b9bc037730517f6b7f6b2300207d7a7f44cf7bd7d17f878b615b65139e6" exitCode=0 Mar 11 02:18:38 crc kubenswrapper[4744]: I0311 02:18:38.592383 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-mpb95" event={"ID":"b5bbcff7-a221-4aeb-8859-da7a80f83ab7","Type":"ContainerDied","Data":"5e1d2b9bc037730517f6b7f6b2300207d7a7f44cf7bd7d17f878b615b65139e6"} Mar 11 02:18:38 crc kubenswrapper[4744]: I0311 02:18:38.592407 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-mpb95" event={"ID":"b5bbcff7-a221-4aeb-8859-da7a80f83ab7","Type":"ContainerStarted","Data":"0f5d84224ebbc8c7168b78ececd28c0522175bb012e15d4a889aa5aa09a20565"} Mar 11 02:18:38 crc kubenswrapper[4744]: E0311 02:18:38.595273 4744 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb5bbcff7_a221_4aeb_8859_da7a80f83ab7.slice/crio-5e1d2b9bc037730517f6b7f6b2300207d7a7f44cf7bd7d17f878b615b65139e6.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb5bbcff7_a221_4aeb_8859_da7a80f83ab7.slice/crio-conmon-5e1d2b9bc037730517f6b7f6b2300207d7a7f44cf7bd7d17f878b615b65139e6.scope\": RecentStats: unable to find data in memory cache]" Mar 11 02:18:39 crc kubenswrapper[4744]: I0311 02:18:39.615819 4744 generic.go:334] "Generic (PLEG): container finished" podID="c54e9a99-5c2c-48df-a5c0-75fb8727a328" 
containerID="947621d45cfb896f5b5cc9e379963298c64e953b5dfb03f7e69683336842f434" exitCode=0 Mar 11 02:18:39 crc kubenswrapper[4744]: I0311 02:18:39.615933 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"c54e9a99-5c2c-48df-a5c0-75fb8727a328","Type":"ContainerDied","Data":"947621d45cfb896f5b5cc9e379963298c64e953b5dfb03f7e69683336842f434"} Mar 11 02:18:40 crc kubenswrapper[4744]: I0311 02:18:40.080338 4744 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-mpb95" Mar 11 02:18:40 crc kubenswrapper[4744]: I0311 02:18:40.187251 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zhp9s\" (UniqueName: \"kubernetes.io/projected/b5bbcff7-a221-4aeb-8859-da7a80f83ab7-kube-api-access-zhp9s\") pod \"b5bbcff7-a221-4aeb-8859-da7a80f83ab7\" (UID: \"b5bbcff7-a221-4aeb-8859-da7a80f83ab7\") " Mar 11 02:18:40 crc kubenswrapper[4744]: I0311 02:18:40.187538 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b5bbcff7-a221-4aeb-8859-da7a80f83ab7-operator-scripts\") pod \"b5bbcff7-a221-4aeb-8859-da7a80f83ab7\" (UID: \"b5bbcff7-a221-4aeb-8859-da7a80f83ab7\") " Mar 11 02:18:40 crc kubenswrapper[4744]: I0311 02:18:40.188054 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b5bbcff7-a221-4aeb-8859-da7a80f83ab7-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "b5bbcff7-a221-4aeb-8859-da7a80f83ab7" (UID: "b5bbcff7-a221-4aeb-8859-da7a80f83ab7"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 11 02:18:40 crc kubenswrapper[4744]: I0311 02:18:40.196844 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b5bbcff7-a221-4aeb-8859-da7a80f83ab7-kube-api-access-zhp9s" (OuterVolumeSpecName: "kube-api-access-zhp9s") pod "b5bbcff7-a221-4aeb-8859-da7a80f83ab7" (UID: "b5bbcff7-a221-4aeb-8859-da7a80f83ab7"). InnerVolumeSpecName "kube-api-access-zhp9s". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 02:18:40 crc kubenswrapper[4744]: I0311 02:18:40.289412 4744 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b5bbcff7-a221-4aeb-8859-da7a80f83ab7-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 11 02:18:40 crc kubenswrapper[4744]: I0311 02:18:40.289749 4744 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zhp9s\" (UniqueName: \"kubernetes.io/projected/b5bbcff7-a221-4aeb-8859-da7a80f83ab7-kube-api-access-zhp9s\") on node \"crc\" DevicePath \"\"" Mar 11 02:18:40 crc kubenswrapper[4744]: I0311 02:18:40.630045 4744 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-mpb95" Mar 11 02:18:40 crc kubenswrapper[4744]: I0311 02:18:40.630055 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-mpb95" event={"ID":"b5bbcff7-a221-4aeb-8859-da7a80f83ab7","Type":"ContainerDied","Data":"0f5d84224ebbc8c7168b78ececd28c0522175bb012e15d4a889aa5aa09a20565"} Mar 11 02:18:40 crc kubenswrapper[4744]: I0311 02:18:40.631565 4744 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0f5d84224ebbc8c7168b78ececd28c0522175bb012e15d4a889aa5aa09a20565" Mar 11 02:18:40 crc kubenswrapper[4744]: I0311 02:18:40.633953 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"c54e9a99-5c2c-48df-a5c0-75fb8727a328","Type":"ContainerStarted","Data":"cf0df1ce297bb3fc5a3654008ea28134277c57ad3cafbbd427f9b09c55d92a90"} Mar 11 02:18:40 crc kubenswrapper[4744]: I0311 02:18:40.634254 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-cell1-server-0" Mar 11 02:18:41 crc kubenswrapper[4744]: I0311 02:18:41.118881 4744 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-cell1-server-0" podStartSLOduration=37.118857194 podStartE2EDuration="37.118857194s" podCreationTimestamp="2026-03-11 02:18:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-11 02:18:40.711053418 +0000 UTC m=+5077.515271023" watchObservedRunningTime="2026-03-11 02:18:41.118857194 +0000 UTC m=+5077.923074829" Mar 11 02:18:41 crc kubenswrapper[4744]: I0311 02:18:41.647085 4744 generic.go:334] "Generic (PLEG): container finished" podID="f6b347eb-1bcc-4fa4-96c4-c15523778e9c" containerID="8cd19c2c7ffc99236ff0804a1d3ac5e9b65025c622357a50b91340eb2a661fca" exitCode=0 Mar 11 02:18:41 crc kubenswrapper[4744]: I0311 02:18:41.647197 4744 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"f6b347eb-1bcc-4fa4-96c4-c15523778e9c","Type":"ContainerDied","Data":"8cd19c2c7ffc99236ff0804a1d3ac5e9b65025c622357a50b91340eb2a661fca"} Mar 11 02:18:42 crc kubenswrapper[4744]: I0311 02:18:42.658569 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"f6b347eb-1bcc-4fa4-96c4-c15523778e9c","Type":"ContainerStarted","Data":"1a5cc361402c37f3dd2e605ad67444168ee2dc9662c5d7469ea009f0cba9bdab"} Mar 11 02:18:42 crc kubenswrapper[4744]: I0311 02:18:42.659110 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-server-0" Mar 11 02:18:42 crc kubenswrapper[4744]: I0311 02:18:42.697552 4744 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-server-0" podStartSLOduration=38.697498426 podStartE2EDuration="38.697498426s" podCreationTimestamp="2026-03-11 02:18:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-11 02:18:42.688334903 +0000 UTC m=+5079.492552508" watchObservedRunningTime="2026-03-11 02:18:42.697498426 +0000 UTC m=+5079.501716071" Mar 11 02:18:50 crc kubenswrapper[4744]: I0311 02:18:50.974885 4744 scope.go:117] "RemoveContainer" containerID="6c2c2442692de599f07518611dfa7798a497d5f80b894d7b831bca9043e56ffc" Mar 11 02:18:50 crc kubenswrapper[4744]: E0311 02:18:50.975705 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-678nx_openshift-machine-config-operator(a15dc7ac-7c34-4135-b6eb-a85122800ce9)\"" pod="openshift-machine-config-operator/machine-config-daemon-678nx" podUID="a15dc7ac-7c34-4135-b6eb-a85122800ce9" Mar 11 02:18:55 crc kubenswrapper[4744]: I0311 02:18:55.919771 4744 
kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-cell1-server-0" Mar 11 02:18:56 crc kubenswrapper[4744]: I0311 02:18:56.634735 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-server-0" Mar 11 02:19:00 crc kubenswrapper[4744]: I0311 02:19:00.510882 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-66d5bf7c87-2x58n"] Mar 11 02:19:00 crc kubenswrapper[4744]: E0311 02:19:00.511477 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b5bbcff7-a221-4aeb-8859-da7a80f83ab7" containerName="mariadb-account-create-update" Mar 11 02:19:00 crc kubenswrapper[4744]: I0311 02:19:00.511492 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="b5bbcff7-a221-4aeb-8859-da7a80f83ab7" containerName="mariadb-account-create-update" Mar 11 02:19:00 crc kubenswrapper[4744]: I0311 02:19:00.511812 4744 memory_manager.go:354] "RemoveStaleState removing state" podUID="b5bbcff7-a221-4aeb-8859-da7a80f83ab7" containerName="mariadb-account-create-update" Mar 11 02:19:00 crc kubenswrapper[4744]: I0311 02:19:00.515975 4744 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-66d5bf7c87-2x58n" Mar 11 02:19:00 crc kubenswrapper[4744]: I0311 02:19:00.518378 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-66d5bf7c87-2x58n"] Mar 11 02:19:00 crc kubenswrapper[4744]: I0311 02:19:00.645053 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cs42w\" (UniqueName: \"kubernetes.io/projected/a3b50521-42df-4429-92be-19652912970d-kube-api-access-cs42w\") pod \"dnsmasq-dns-66d5bf7c87-2x58n\" (UID: \"a3b50521-42df-4429-92be-19652912970d\") " pod="openstack/dnsmasq-dns-66d5bf7c87-2x58n" Mar 11 02:19:00 crc kubenswrapper[4744]: I0311 02:19:00.645119 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a3b50521-42df-4429-92be-19652912970d-dns-svc\") pod \"dnsmasq-dns-66d5bf7c87-2x58n\" (UID: \"a3b50521-42df-4429-92be-19652912970d\") " pod="openstack/dnsmasq-dns-66d5bf7c87-2x58n" Mar 11 02:19:00 crc kubenswrapper[4744]: I0311 02:19:00.645254 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a3b50521-42df-4429-92be-19652912970d-config\") pod \"dnsmasq-dns-66d5bf7c87-2x58n\" (UID: \"a3b50521-42df-4429-92be-19652912970d\") " pod="openstack/dnsmasq-dns-66d5bf7c87-2x58n" Mar 11 02:19:00 crc kubenswrapper[4744]: I0311 02:19:00.747122 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cs42w\" (UniqueName: \"kubernetes.io/projected/a3b50521-42df-4429-92be-19652912970d-kube-api-access-cs42w\") pod \"dnsmasq-dns-66d5bf7c87-2x58n\" (UID: \"a3b50521-42df-4429-92be-19652912970d\") " pod="openstack/dnsmasq-dns-66d5bf7c87-2x58n" Mar 11 02:19:00 crc kubenswrapper[4744]: I0311 02:19:00.747194 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a3b50521-42df-4429-92be-19652912970d-dns-svc\") pod \"dnsmasq-dns-66d5bf7c87-2x58n\" (UID: \"a3b50521-42df-4429-92be-19652912970d\") " pod="openstack/dnsmasq-dns-66d5bf7c87-2x58n" Mar 11 02:19:00 crc kubenswrapper[4744]: I0311 02:19:00.747327 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a3b50521-42df-4429-92be-19652912970d-config\") pod \"dnsmasq-dns-66d5bf7c87-2x58n\" (UID: \"a3b50521-42df-4429-92be-19652912970d\") " pod="openstack/dnsmasq-dns-66d5bf7c87-2x58n" Mar 11 02:19:00 crc kubenswrapper[4744]: I0311 02:19:00.748889 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a3b50521-42df-4429-92be-19652912970d-config\") pod \"dnsmasq-dns-66d5bf7c87-2x58n\" (UID: \"a3b50521-42df-4429-92be-19652912970d\") " pod="openstack/dnsmasq-dns-66d5bf7c87-2x58n" Mar 11 02:19:00 crc kubenswrapper[4744]: I0311 02:19:00.750361 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a3b50521-42df-4429-92be-19652912970d-dns-svc\") pod \"dnsmasq-dns-66d5bf7c87-2x58n\" (UID: \"a3b50521-42df-4429-92be-19652912970d\") " pod="openstack/dnsmasq-dns-66d5bf7c87-2x58n" Mar 11 02:19:00 crc kubenswrapper[4744]: I0311 02:19:00.769652 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cs42w\" (UniqueName: \"kubernetes.io/projected/a3b50521-42df-4429-92be-19652912970d-kube-api-access-cs42w\") pod \"dnsmasq-dns-66d5bf7c87-2x58n\" (UID: \"a3b50521-42df-4429-92be-19652912970d\") " pod="openstack/dnsmasq-dns-66d5bf7c87-2x58n" Mar 11 02:19:00 crc kubenswrapper[4744]: I0311 02:19:00.835664 4744 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-66d5bf7c87-2x58n" Mar 11 02:19:01 crc kubenswrapper[4744]: I0311 02:19:01.102909 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-66d5bf7c87-2x58n"] Mar 11 02:19:01 crc kubenswrapper[4744]: I0311 02:19:01.431075 4744 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-server-0"] Mar 11 02:19:01 crc kubenswrapper[4744]: I0311 02:19:01.820341 4744 generic.go:334] "Generic (PLEG): container finished" podID="a3b50521-42df-4429-92be-19652912970d" containerID="df09e6675842db56c4e7ad994d9d23095767345951713f6c8f7d2378173b6580" exitCode=0 Mar 11 02:19:01 crc kubenswrapper[4744]: I0311 02:19:01.820571 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-66d5bf7c87-2x58n" event={"ID":"a3b50521-42df-4429-92be-19652912970d","Type":"ContainerDied","Data":"df09e6675842db56c4e7ad994d9d23095767345951713f6c8f7d2378173b6580"} Mar 11 02:19:01 crc kubenswrapper[4744]: I0311 02:19:01.821321 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-66d5bf7c87-2x58n" event={"ID":"a3b50521-42df-4429-92be-19652912970d","Type":"ContainerStarted","Data":"ef23250f4220ab251351d05a7ec82ada9f7c9cb05ff697bb646b2c5effa2b5df"} Mar 11 02:19:02 crc kubenswrapper[4744]: I0311 02:19:02.358011 4744 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Mar 11 02:19:02 crc kubenswrapper[4744]: I0311 02:19:02.834495 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-66d5bf7c87-2x58n" event={"ID":"a3b50521-42df-4429-92be-19652912970d","Type":"ContainerStarted","Data":"a0c4e32c885eb269b89f304c93014e687e7f71020851ffca5cd6a7138006e5cf"} Mar 11 02:19:02 crc kubenswrapper[4744]: I0311 02:19:02.834919 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-66d5bf7c87-2x58n" Mar 11 02:19:02 crc kubenswrapper[4744]: I0311 02:19:02.867959 4744 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-66d5bf7c87-2x58n" podStartSLOduration=2.867930814 podStartE2EDuration="2.867930814s" podCreationTimestamp="2026-03-11 02:19:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-11 02:19:02.866874332 +0000 UTC m=+5099.671091967" watchObservedRunningTime="2026-03-11 02:19:02.867930814 +0000 UTC m=+5099.672148459"
Mar 11 02:19:03 crc kubenswrapper[4744]: I0311 02:19:03.981024 4744 scope.go:117] "RemoveContainer" containerID="6c2c2442692de599f07518611dfa7798a497d5f80b894d7b831bca9043e56ffc"
Mar 11 02:19:03 crc kubenswrapper[4744]: E0311 02:19:03.981849 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-678nx_openshift-machine-config-operator(a15dc7ac-7c34-4135-b6eb-a85122800ce9)\"" pod="openshift-machine-config-operator/machine-config-daemon-678nx" podUID="a15dc7ac-7c34-4135-b6eb-a85122800ce9"
Mar 11 02:19:06 crc kubenswrapper[4744]: I0311 02:19:06.038273 4744 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/rabbitmq-server-0" podUID="f6b347eb-1bcc-4fa4-96c4-c15523778e9c" containerName="rabbitmq" containerID="cri-o://1a5cc361402c37f3dd2e605ad67444168ee2dc9662c5d7469ea009f0cba9bdab" gracePeriod=604796
Mar 11 02:19:06 crc kubenswrapper[4744]: I0311 02:19:06.632186 4744 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/rabbitmq-server-0" podUID="f6b347eb-1bcc-4fa4-96c4-c15523778e9c" containerName="rabbitmq" probeResult="failure" output="dial tcp 10.217.1.27:5671: connect: connection refused"
Mar 11 02:19:06 crc kubenswrapper[4744]: I0311 02:19:06.854306 4744 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/rabbitmq-cell1-server-0" podUID="c54e9a99-5c2c-48df-a5c0-75fb8727a328" containerName="rabbitmq" containerID="cri-o://cf0df1ce297bb3fc5a3654008ea28134277c57ad3cafbbd427f9b09c55d92a90" gracePeriod=604796
Mar 11 02:19:10 crc kubenswrapper[4744]: I0311 02:19:10.837908 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-66d5bf7c87-2x58n"
Mar 11 02:19:10 crc kubenswrapper[4744]: I0311 02:19:10.915489 4744 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-ff89b6977-xqrvm"]
Mar 11 02:19:10 crc kubenswrapper[4744]: I0311 02:19:10.915815 4744 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-ff89b6977-xqrvm" podUID="8b1f9328-7210-412e-9d4d-1d1a8b6804dc" containerName="dnsmasq-dns" containerID="cri-o://00c4fb01a5bb187a5e67e9974147771760103b19c876ca5ac454d10c43e33222" gracePeriod=10
Mar 11 02:19:11 crc kubenswrapper[4744]: I0311 02:19:11.329040 4744 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-ff89b6977-xqrvm"
Mar 11 02:19:11 crc kubenswrapper[4744]: I0311 02:19:11.467596 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8b1f9328-7210-412e-9d4d-1d1a8b6804dc-config\") pod \"8b1f9328-7210-412e-9d4d-1d1a8b6804dc\" (UID: \"8b1f9328-7210-412e-9d4d-1d1a8b6804dc\") "
Mar 11 02:19:11 crc kubenswrapper[4744]: I0311 02:19:11.467698 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/8b1f9328-7210-412e-9d4d-1d1a8b6804dc-dns-svc\") pod \"8b1f9328-7210-412e-9d4d-1d1a8b6804dc\" (UID: \"8b1f9328-7210-412e-9d4d-1d1a8b6804dc\") "
Mar 11 02:19:11 crc kubenswrapper[4744]: I0311 02:19:11.467782 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ctjms\" (UniqueName: \"kubernetes.io/projected/8b1f9328-7210-412e-9d4d-1d1a8b6804dc-kube-api-access-ctjms\") pod \"8b1f9328-7210-412e-9d4d-1d1a8b6804dc\" (UID: \"8b1f9328-7210-412e-9d4d-1d1a8b6804dc\") "
Mar 11 02:19:11 crc kubenswrapper[4744]: I0311 02:19:11.482196 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8b1f9328-7210-412e-9d4d-1d1a8b6804dc-kube-api-access-ctjms" (OuterVolumeSpecName: "kube-api-access-ctjms") pod "8b1f9328-7210-412e-9d4d-1d1a8b6804dc" (UID: "8b1f9328-7210-412e-9d4d-1d1a8b6804dc"). InnerVolumeSpecName "kube-api-access-ctjms". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 11 02:19:11 crc kubenswrapper[4744]: I0311 02:19:11.498475 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8b1f9328-7210-412e-9d4d-1d1a8b6804dc-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "8b1f9328-7210-412e-9d4d-1d1a8b6804dc" (UID: "8b1f9328-7210-412e-9d4d-1d1a8b6804dc"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 11 02:19:11 crc kubenswrapper[4744]: I0311 02:19:11.516399 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8b1f9328-7210-412e-9d4d-1d1a8b6804dc-config" (OuterVolumeSpecName: "config") pod "8b1f9328-7210-412e-9d4d-1d1a8b6804dc" (UID: "8b1f9328-7210-412e-9d4d-1d1a8b6804dc"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 11 02:19:11 crc kubenswrapper[4744]: I0311 02:19:11.569417 4744 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8b1f9328-7210-412e-9d4d-1d1a8b6804dc-config\") on node \"crc\" DevicePath \"\""
Mar 11 02:19:11 crc kubenswrapper[4744]: I0311 02:19:11.569650 4744 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/8b1f9328-7210-412e-9d4d-1d1a8b6804dc-dns-svc\") on node \"crc\" DevicePath \"\""
Mar 11 02:19:11 crc kubenswrapper[4744]: I0311 02:19:11.569737 4744 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ctjms\" (UniqueName: \"kubernetes.io/projected/8b1f9328-7210-412e-9d4d-1d1a8b6804dc-kube-api-access-ctjms\") on node \"crc\" DevicePath \"\""
Mar 11 02:19:11 crc kubenswrapper[4744]: I0311 02:19:11.932818 4744 generic.go:334] "Generic (PLEG): container finished" podID="8b1f9328-7210-412e-9d4d-1d1a8b6804dc" containerID="00c4fb01a5bb187a5e67e9974147771760103b19c876ca5ac454d10c43e33222" exitCode=0
Mar 11 02:19:11 crc kubenswrapper[4744]: I0311 02:19:11.932902 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-ff89b6977-xqrvm" event={"ID":"8b1f9328-7210-412e-9d4d-1d1a8b6804dc","Type":"ContainerDied","Data":"00c4fb01a5bb187a5e67e9974147771760103b19c876ca5ac454d10c43e33222"}
Mar 11 02:19:11 crc kubenswrapper[4744]: I0311 02:19:11.932946 4744 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-ff89b6977-xqrvm"
Mar 11 02:19:11 crc kubenswrapper[4744]: I0311 02:19:11.932992 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-ff89b6977-xqrvm" event={"ID":"8b1f9328-7210-412e-9d4d-1d1a8b6804dc","Type":"ContainerDied","Data":"7e3311e9d9687fdc7517741101acc8e92512d0e044900204870cb3d0a7b1d43d"}
Mar 11 02:19:11 crc kubenswrapper[4744]: I0311 02:19:11.933033 4744 scope.go:117] "RemoveContainer" containerID="00c4fb01a5bb187a5e67e9974147771760103b19c876ca5ac454d10c43e33222"
Mar 11 02:19:11 crc kubenswrapper[4744]: I0311 02:19:11.964903 4744 scope.go:117] "RemoveContainer" containerID="6b16695aae796d9c4e152a789dc0b8d1b9d928ae7d1d14a3228aeb2096998e2c"
Mar 11 02:19:11 crc kubenswrapper[4744]: I0311 02:19:11.998074 4744 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-ff89b6977-xqrvm"]
Mar 11 02:19:12 crc kubenswrapper[4744]: I0311 02:19:12.001357 4744 scope.go:117] "RemoveContainer" containerID="00c4fb01a5bb187a5e67e9974147771760103b19c876ca5ac454d10c43e33222"
Mar 11 02:19:12 crc kubenswrapper[4744]: E0311 02:19:12.001975 4744 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"00c4fb01a5bb187a5e67e9974147771760103b19c876ca5ac454d10c43e33222\": container with ID starting with 00c4fb01a5bb187a5e67e9974147771760103b19c876ca5ac454d10c43e33222 not found: ID does not exist" containerID="00c4fb01a5bb187a5e67e9974147771760103b19c876ca5ac454d10c43e33222"
Mar 11 02:19:12 crc kubenswrapper[4744]: I0311 02:19:12.002043 4744 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"00c4fb01a5bb187a5e67e9974147771760103b19c876ca5ac454d10c43e33222"} err="failed to get container status \"00c4fb01a5bb187a5e67e9974147771760103b19c876ca5ac454d10c43e33222\": rpc error: code = NotFound desc = could not find container \"00c4fb01a5bb187a5e67e9974147771760103b19c876ca5ac454d10c43e33222\": container with ID starting with 00c4fb01a5bb187a5e67e9974147771760103b19c876ca5ac454d10c43e33222 not found: ID does not exist"
Mar 11 02:19:12 crc kubenswrapper[4744]: I0311 02:19:12.002089 4744 scope.go:117] "RemoveContainer" containerID="6b16695aae796d9c4e152a789dc0b8d1b9d928ae7d1d14a3228aeb2096998e2c"
Mar 11 02:19:12 crc kubenswrapper[4744]: E0311 02:19:12.002885 4744 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6b16695aae796d9c4e152a789dc0b8d1b9d928ae7d1d14a3228aeb2096998e2c\": container with ID starting with 6b16695aae796d9c4e152a789dc0b8d1b9d928ae7d1d14a3228aeb2096998e2c not found: ID does not exist" containerID="6b16695aae796d9c4e152a789dc0b8d1b9d928ae7d1d14a3228aeb2096998e2c"
Mar 11 02:19:12 crc kubenswrapper[4744]: I0311 02:19:12.002934 4744 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6b16695aae796d9c4e152a789dc0b8d1b9d928ae7d1d14a3228aeb2096998e2c"} err="failed to get container status \"6b16695aae796d9c4e152a789dc0b8d1b9d928ae7d1d14a3228aeb2096998e2c\": rpc error: code = NotFound desc = could not find container \"6b16695aae796d9c4e152a789dc0b8d1b9d928ae7d1d14a3228aeb2096998e2c\": container with ID starting with 6b16695aae796d9c4e152a789dc0b8d1b9d928ae7d1d14a3228aeb2096998e2c not found: ID does not exist"
Mar 11 02:19:12 crc kubenswrapper[4744]: I0311 02:19:12.004916 4744 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-ff89b6977-xqrvm"]
Mar 11 02:19:12 crc kubenswrapper[4744]: I0311 02:19:12.691563 4744 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0"
Mar 11 02:19:12 crc kubenswrapper[4744]: I0311 02:19:12.798123 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/f6b347eb-1bcc-4fa4-96c4-c15523778e9c-server-conf\") pod \"f6b347eb-1bcc-4fa4-96c4-c15523778e9c\" (UID: \"f6b347eb-1bcc-4fa4-96c4-c15523778e9c\") "
Mar 11 02:19:12 crc kubenswrapper[4744]: I0311 02:19:12.798168 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/f6b347eb-1bcc-4fa4-96c4-c15523778e9c-rabbitmq-erlang-cookie\") pod \"f6b347eb-1bcc-4fa4-96c4-c15523778e9c\" (UID: \"f6b347eb-1bcc-4fa4-96c4-c15523778e9c\") "
Mar 11 02:19:12 crc kubenswrapper[4744]: I0311 02:19:12.798204 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/f6b347eb-1bcc-4fa4-96c4-c15523778e9c-config-data\") pod \"f6b347eb-1bcc-4fa4-96c4-c15523778e9c\" (UID: \"f6b347eb-1bcc-4fa4-96c4-c15523778e9c\") "
Mar 11 02:19:12 crc kubenswrapper[4744]: I0311 02:19:12.798247 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/f6b347eb-1bcc-4fa4-96c4-c15523778e9c-rabbitmq-plugins\") pod \"f6b347eb-1bcc-4fa4-96c4-c15523778e9c\" (UID: \"f6b347eb-1bcc-4fa4-96c4-c15523778e9c\") "
Mar 11 02:19:12 crc kubenswrapper[4744]: I0311 02:19:12.798277 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/f6b347eb-1bcc-4fa4-96c4-c15523778e9c-erlang-cookie-secret\") pod \"f6b347eb-1bcc-4fa4-96c4-c15523778e9c\" (UID: \"f6b347eb-1bcc-4fa4-96c4-c15523778e9c\") "
Mar 11 02:19:12 crc kubenswrapper[4744]: I0311 02:19:12.798361 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"persistence\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-2cf35867-ca5d-4038-9115-b41f33433bb8\") pod \"f6b347eb-1bcc-4fa4-96c4-c15523778e9c\" (UID: \"f6b347eb-1bcc-4fa4-96c4-c15523778e9c\") "
Mar 11 02:19:12 crc kubenswrapper[4744]: I0311 02:19:12.798428 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/f6b347eb-1bcc-4fa4-96c4-c15523778e9c-plugins-conf\") pod \"f6b347eb-1bcc-4fa4-96c4-c15523778e9c\" (UID: \"f6b347eb-1bcc-4fa4-96c4-c15523778e9c\") "
Mar 11 02:19:12 crc kubenswrapper[4744]: I0311 02:19:12.798448 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/f6b347eb-1bcc-4fa4-96c4-c15523778e9c-rabbitmq-tls\") pod \"f6b347eb-1bcc-4fa4-96c4-c15523778e9c\" (UID: \"f6b347eb-1bcc-4fa4-96c4-c15523778e9c\") "
Mar 11 02:19:12 crc kubenswrapper[4744]: I0311 02:19:12.798468 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/f6b347eb-1bcc-4fa4-96c4-c15523778e9c-rabbitmq-confd\") pod \"f6b347eb-1bcc-4fa4-96c4-c15523778e9c\" (UID: \"f6b347eb-1bcc-4fa4-96c4-c15523778e9c\") "
Mar 11 02:19:12 crc kubenswrapper[4744]: I0311 02:19:12.798494 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/f6b347eb-1bcc-4fa4-96c4-c15523778e9c-pod-info\") pod \"f6b347eb-1bcc-4fa4-96c4-c15523778e9c\" (UID: \"f6b347eb-1bcc-4fa4-96c4-c15523778e9c\") "
Mar 11 02:19:12 crc kubenswrapper[4744]: I0311 02:19:12.798537 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vdhbc\" (UniqueName: \"kubernetes.io/projected/f6b347eb-1bcc-4fa4-96c4-c15523778e9c-kube-api-access-vdhbc\") pod \"f6b347eb-1bcc-4fa4-96c4-c15523778e9c\" (UID: \"f6b347eb-1bcc-4fa4-96c4-c15523778e9c\") "
Mar 11 02:19:12 crc kubenswrapper[4744]: I0311 02:19:12.799062 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f6b347eb-1bcc-4fa4-96c4-c15523778e9c-rabbitmq-erlang-cookie" (OuterVolumeSpecName: "rabbitmq-erlang-cookie") pod "f6b347eb-1bcc-4fa4-96c4-c15523778e9c" (UID: "f6b347eb-1bcc-4fa4-96c4-c15523778e9c"). InnerVolumeSpecName "rabbitmq-erlang-cookie". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 11 02:19:12 crc kubenswrapper[4744]: I0311 02:19:12.806235 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f6b347eb-1bcc-4fa4-96c4-c15523778e9c-rabbitmq-plugins" (OuterVolumeSpecName: "rabbitmq-plugins") pod "f6b347eb-1bcc-4fa4-96c4-c15523778e9c" (UID: "f6b347eb-1bcc-4fa4-96c4-c15523778e9c"). InnerVolumeSpecName "rabbitmq-plugins". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 11 02:19:12 crc kubenswrapper[4744]: I0311 02:19:12.806798 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f6b347eb-1bcc-4fa4-96c4-c15523778e9c-plugins-conf" (OuterVolumeSpecName: "plugins-conf") pod "f6b347eb-1bcc-4fa4-96c4-c15523778e9c" (UID: "f6b347eb-1bcc-4fa4-96c4-c15523778e9c"). InnerVolumeSpecName "plugins-conf". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 11 02:19:12 crc kubenswrapper[4744]: I0311 02:19:12.807095 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f6b347eb-1bcc-4fa4-96c4-c15523778e9c-rabbitmq-tls" (OuterVolumeSpecName: "rabbitmq-tls") pod "f6b347eb-1bcc-4fa4-96c4-c15523778e9c" (UID: "f6b347eb-1bcc-4fa4-96c4-c15523778e9c"). InnerVolumeSpecName "rabbitmq-tls". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 11 02:19:12 crc kubenswrapper[4744]: I0311 02:19:12.818089 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f6b347eb-1bcc-4fa4-96c4-c15523778e9c-erlang-cookie-secret" (OuterVolumeSpecName: "erlang-cookie-secret") pod "f6b347eb-1bcc-4fa4-96c4-c15523778e9c" (UID: "f6b347eb-1bcc-4fa4-96c4-c15523778e9c"). InnerVolumeSpecName "erlang-cookie-secret". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 11 02:19:12 crc kubenswrapper[4744]: I0311 02:19:12.822717 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f6b347eb-1bcc-4fa4-96c4-c15523778e9c-kube-api-access-vdhbc" (OuterVolumeSpecName: "kube-api-access-vdhbc") pod "f6b347eb-1bcc-4fa4-96c4-c15523778e9c" (UID: "f6b347eb-1bcc-4fa4-96c4-c15523778e9c"). InnerVolumeSpecName "kube-api-access-vdhbc". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 11 02:19:12 crc kubenswrapper[4744]: I0311 02:19:12.826019 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-2cf35867-ca5d-4038-9115-b41f33433bb8" (OuterVolumeSpecName: "persistence") pod "f6b347eb-1bcc-4fa4-96c4-c15523778e9c" (UID: "f6b347eb-1bcc-4fa4-96c4-c15523778e9c"). InnerVolumeSpecName "pvc-2cf35867-ca5d-4038-9115-b41f33433bb8". PluginName "kubernetes.io/csi", VolumeGidValue ""
Mar 11 02:19:12 crc kubenswrapper[4744]: I0311 02:19:12.826838 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/downward-api/f6b347eb-1bcc-4fa4-96c4-c15523778e9c-pod-info" (OuterVolumeSpecName: "pod-info") pod "f6b347eb-1bcc-4fa4-96c4-c15523778e9c" (UID: "f6b347eb-1bcc-4fa4-96c4-c15523778e9c"). InnerVolumeSpecName "pod-info". PluginName "kubernetes.io/downward-api", VolumeGidValue ""
Mar 11 02:19:12 crc kubenswrapper[4744]: I0311 02:19:12.853250 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f6b347eb-1bcc-4fa4-96c4-c15523778e9c-config-data" (OuterVolumeSpecName: "config-data") pod "f6b347eb-1bcc-4fa4-96c4-c15523778e9c" (UID: "f6b347eb-1bcc-4fa4-96c4-c15523778e9c"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 11 02:19:12 crc kubenswrapper[4744]: I0311 02:19:12.884728 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f6b347eb-1bcc-4fa4-96c4-c15523778e9c-server-conf" (OuterVolumeSpecName: "server-conf") pod "f6b347eb-1bcc-4fa4-96c4-c15523778e9c" (UID: "f6b347eb-1bcc-4fa4-96c4-c15523778e9c"). InnerVolumeSpecName "server-conf". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 11 02:19:12 crc kubenswrapper[4744]: I0311 02:19:12.899834 4744 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/f6b347eb-1bcc-4fa4-96c4-c15523778e9c-rabbitmq-plugins\") on node \"crc\" DevicePath \"\""
Mar 11 02:19:12 crc kubenswrapper[4744]: I0311 02:19:12.899875 4744 reconciler_common.go:293] "Volume detached for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/f6b347eb-1bcc-4fa4-96c4-c15523778e9c-erlang-cookie-secret\") on node \"crc\" DevicePath \"\""
Mar 11 02:19:12 crc kubenswrapper[4744]: I0311 02:19:12.899925 4744 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"pvc-2cf35867-ca5d-4038-9115-b41f33433bb8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-2cf35867-ca5d-4038-9115-b41f33433bb8\") on node \"crc\" "
Mar 11 02:19:12 crc kubenswrapper[4744]: I0311 02:19:12.899949 4744 reconciler_common.go:293] "Volume detached for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/f6b347eb-1bcc-4fa4-96c4-c15523778e9c-plugins-conf\") on node \"crc\" DevicePath \"\""
Mar 11 02:19:12 crc kubenswrapper[4744]: I0311 02:19:12.899965 4744 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/f6b347eb-1bcc-4fa4-96c4-c15523778e9c-rabbitmq-tls\") on node \"crc\" DevicePath \"\""
Mar 11 02:19:12 crc kubenswrapper[4744]: I0311 02:19:12.899980 4744 reconciler_common.go:293] "Volume detached for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/f6b347eb-1bcc-4fa4-96c4-c15523778e9c-pod-info\") on node \"crc\" DevicePath \"\""
Mar 11 02:19:12 crc kubenswrapper[4744]: I0311 02:19:12.899995 4744 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vdhbc\" (UniqueName: \"kubernetes.io/projected/f6b347eb-1bcc-4fa4-96c4-c15523778e9c-kube-api-access-vdhbc\") on node \"crc\" DevicePath \"\""
Mar 11 02:19:12 crc kubenswrapper[4744]: I0311 02:19:12.900009 4744 reconciler_common.go:293] "Volume detached for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/f6b347eb-1bcc-4fa4-96c4-c15523778e9c-server-conf\") on node \"crc\" DevicePath \"\""
Mar 11 02:19:12 crc kubenswrapper[4744]: I0311 02:19:12.900022 4744 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/f6b347eb-1bcc-4fa4-96c4-c15523778e9c-rabbitmq-erlang-cookie\") on node \"crc\" DevicePath \"\""
Mar 11 02:19:12 crc kubenswrapper[4744]: I0311 02:19:12.900036 4744 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/f6b347eb-1bcc-4fa4-96c4-c15523778e9c-config-data\") on node \"crc\" DevicePath \"\""
Mar 11 02:19:12 crc kubenswrapper[4744]: I0311 02:19:12.921499 4744 csi_attacher.go:630] kubernetes.io/csi: attacher.UnmountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping UnmountDevice...
Mar 11 02:19:12 crc kubenswrapper[4744]: I0311 02:19:12.921698 4744 operation_generator.go:917] UnmountDevice succeeded for volume "pvc-2cf35867-ca5d-4038-9115-b41f33433bb8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-2cf35867-ca5d-4038-9115-b41f33433bb8") on node "crc"
Mar 11 02:19:12 crc kubenswrapper[4744]: I0311 02:19:12.936012 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f6b347eb-1bcc-4fa4-96c4-c15523778e9c-rabbitmq-confd" (OuterVolumeSpecName: "rabbitmq-confd") pod "f6b347eb-1bcc-4fa4-96c4-c15523778e9c" (UID: "f6b347eb-1bcc-4fa4-96c4-c15523778e9c"). InnerVolumeSpecName "rabbitmq-confd". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 11 02:19:12 crc kubenswrapper[4744]: I0311 02:19:12.940368 4744 generic.go:334] "Generic (PLEG): container finished" podID="f6b347eb-1bcc-4fa4-96c4-c15523778e9c" containerID="1a5cc361402c37f3dd2e605ad67444168ee2dc9662c5d7469ea009f0cba9bdab" exitCode=0
Mar 11 02:19:12 crc kubenswrapper[4744]: I0311 02:19:12.940421 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"f6b347eb-1bcc-4fa4-96c4-c15523778e9c","Type":"ContainerDied","Data":"1a5cc361402c37f3dd2e605ad67444168ee2dc9662c5d7469ea009f0cba9bdab"}
Mar 11 02:19:12 crc kubenswrapper[4744]: I0311 02:19:12.940462 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"f6b347eb-1bcc-4fa4-96c4-c15523778e9c","Type":"ContainerDied","Data":"7826fcb383701dd15de9f51444f013455c812635534265e63dc2fb9e43d27c8d"}
Mar 11 02:19:12 crc kubenswrapper[4744]: I0311 02:19:12.940466 4744 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0"
Mar 11 02:19:12 crc kubenswrapper[4744]: I0311 02:19:12.940484 4744 scope.go:117] "RemoveContainer" containerID="1a5cc361402c37f3dd2e605ad67444168ee2dc9662c5d7469ea009f0cba9bdab"
Mar 11 02:19:12 crc kubenswrapper[4744]: I0311 02:19:12.961361 4744 scope.go:117] "RemoveContainer" containerID="8cd19c2c7ffc99236ff0804a1d3ac5e9b65025c622357a50b91340eb2a661fca"
Mar 11 02:19:12 crc kubenswrapper[4744]: I0311 02:19:12.985058 4744 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-server-0"]
Mar 11 02:19:13 crc kubenswrapper[4744]: I0311 02:19:12.999261 4744 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/rabbitmq-server-0"]
Mar 11 02:19:13 crc kubenswrapper[4744]: I0311 02:19:13.002814 4744 scope.go:117] "RemoveContainer" containerID="1a5cc361402c37f3dd2e605ad67444168ee2dc9662c5d7469ea009f0cba9bdab"
Mar 11 02:19:13 crc kubenswrapper[4744]: E0311 02:19:13.003550 4744 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1a5cc361402c37f3dd2e605ad67444168ee2dc9662c5d7469ea009f0cba9bdab\": container with ID starting with 1a5cc361402c37f3dd2e605ad67444168ee2dc9662c5d7469ea009f0cba9bdab not found: ID does not exist" containerID="1a5cc361402c37f3dd2e605ad67444168ee2dc9662c5d7469ea009f0cba9bdab"
Mar 11 02:19:13 crc kubenswrapper[4744]: I0311 02:19:13.003596 4744 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1a5cc361402c37f3dd2e605ad67444168ee2dc9662c5d7469ea009f0cba9bdab"} err="failed to get container status \"1a5cc361402c37f3dd2e605ad67444168ee2dc9662c5d7469ea009f0cba9bdab\": rpc error: code = NotFound desc = could not find container \"1a5cc361402c37f3dd2e605ad67444168ee2dc9662c5d7469ea009f0cba9bdab\": container with ID starting with 1a5cc361402c37f3dd2e605ad67444168ee2dc9662c5d7469ea009f0cba9bdab not found: ID does not exist"
Mar 11 02:19:13 crc kubenswrapper[4744]: I0311 02:19:13.003632 4744 scope.go:117] "RemoveContainer" containerID="8cd19c2c7ffc99236ff0804a1d3ac5e9b65025c622357a50b91340eb2a661fca"
Mar 11 02:19:13 crc kubenswrapper[4744]: E0311 02:19:13.004044 4744 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8cd19c2c7ffc99236ff0804a1d3ac5e9b65025c622357a50b91340eb2a661fca\": container with ID starting with 8cd19c2c7ffc99236ff0804a1d3ac5e9b65025c622357a50b91340eb2a661fca not found: ID does not exist" containerID="8cd19c2c7ffc99236ff0804a1d3ac5e9b65025c622357a50b91340eb2a661fca"
Mar 11 02:19:13 crc kubenswrapper[4744]: I0311 02:19:13.004089 4744 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8cd19c2c7ffc99236ff0804a1d3ac5e9b65025c622357a50b91340eb2a661fca"} err="failed to get container status \"8cd19c2c7ffc99236ff0804a1d3ac5e9b65025c622357a50b91340eb2a661fca\": rpc error: code = NotFound desc = could not find container \"8cd19c2c7ffc99236ff0804a1d3ac5e9b65025c622357a50b91340eb2a661fca\": container with ID starting with 8cd19c2c7ffc99236ff0804a1d3ac5e9b65025c622357a50b91340eb2a661fca not found: ID does not exist"
Mar 11 02:19:13 crc kubenswrapper[4744]: I0311 02:19:13.005363 4744 reconciler_common.go:293] "Volume detached for volume \"pvc-2cf35867-ca5d-4038-9115-b41f33433bb8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-2cf35867-ca5d-4038-9115-b41f33433bb8\") on node \"crc\" DevicePath \"\""
Mar 11 02:19:13 crc kubenswrapper[4744]: I0311 02:19:13.005396 4744 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/f6b347eb-1bcc-4fa4-96c4-c15523778e9c-rabbitmq-confd\") on node \"crc\" DevicePath \"\""
Mar 11 02:19:13 crc kubenswrapper[4744]: I0311 02:19:13.005582 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-server-0"]
Mar 11 02:19:13 crc kubenswrapper[4744]: E0311 02:19:13.005827 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8b1f9328-7210-412e-9d4d-1d1a8b6804dc" containerName="init"
Mar 11 02:19:13 crc kubenswrapper[4744]: I0311 02:19:13.005843 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="8b1f9328-7210-412e-9d4d-1d1a8b6804dc" containerName="init"
Mar 11 02:19:13 crc kubenswrapper[4744]: E0311 02:19:13.005856 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f6b347eb-1bcc-4fa4-96c4-c15523778e9c" containerName="setup-container"
Mar 11 02:19:13 crc kubenswrapper[4744]: I0311 02:19:13.005862 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="f6b347eb-1bcc-4fa4-96c4-c15523778e9c" containerName="setup-container"
Mar 11 02:19:13 crc kubenswrapper[4744]: E0311 02:19:13.005876 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8b1f9328-7210-412e-9d4d-1d1a8b6804dc" containerName="dnsmasq-dns"
Mar 11 02:19:13 crc kubenswrapper[4744]: I0311 02:19:13.005881 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="8b1f9328-7210-412e-9d4d-1d1a8b6804dc" containerName="dnsmasq-dns"
Mar 11 02:19:13 crc kubenswrapper[4744]: E0311 02:19:13.005899 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f6b347eb-1bcc-4fa4-96c4-c15523778e9c" containerName="rabbitmq"
Mar 11 02:19:13 crc kubenswrapper[4744]: I0311 02:19:13.005904 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="f6b347eb-1bcc-4fa4-96c4-c15523778e9c" containerName="rabbitmq"
Mar 11 02:19:13 crc kubenswrapper[4744]: I0311 02:19:13.006045 4744 memory_manager.go:354] "RemoveStaleState removing state" podUID="8b1f9328-7210-412e-9d4d-1d1a8b6804dc" containerName="dnsmasq-dns"
Mar 11 02:19:13 crc kubenswrapper[4744]: I0311 02:19:13.006069 4744 memory_manager.go:354] "RemoveStaleState removing state" podUID="f6b347eb-1bcc-4fa4-96c4-c15523778e9c" containerName="rabbitmq"
Mar 11 02:19:13 crc kubenswrapper[4744]: I0311 02:19:13.006796 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0"
Mar 11 02:19:13 crc kubenswrapper[4744]: I0311 02:19:13.010254 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-svc"
Mar 11 02:19:13 crc kubenswrapper[4744]: I0311 02:19:13.010382 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-erlang-cookie"
Mar 11 02:19:13 crc kubenswrapper[4744]: I0311 02:19:13.010721 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-plugins-conf"
Mar 11 02:19:13 crc kubenswrapper[4744]: I0311 02:19:13.010856 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-default-user"
Mar 11 02:19:13 crc kubenswrapper[4744]: I0311 02:19:13.010959 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-config-data"
Mar 11 02:19:13 crc kubenswrapper[4744]: I0311 02:19:13.011242 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-server-dockercfg-9xkfn"
Mar 11 02:19:13 crc kubenswrapper[4744]: I0311 02:19:13.011357 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-server-conf"
Mar 11 02:19:13 crc kubenswrapper[4744]: I0311 02:19:13.021020 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"]
Mar 11 02:19:13 crc kubenswrapper[4744]: I0311 02:19:13.106786 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/876a9769-a63b-46e0-961b-25b726ba177d-pod-info\") pod \"rabbitmq-server-0\" (UID: \"876a9769-a63b-46e0-961b-25b726ba177d\") " pod="openstack/rabbitmq-server-0"
Mar 11 02:19:13 crc kubenswrapper[4744]: I0311 02:19:13.106830 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/876a9769-a63b-46e0-961b-25b726ba177d-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"876a9769-a63b-46e0-961b-25b726ba177d\") " pod="openstack/rabbitmq-server-0"
Mar 11 02:19:13 crc kubenswrapper[4744]: I0311 02:19:13.106860 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ttmg9\" (UniqueName: \"kubernetes.io/projected/876a9769-a63b-46e0-961b-25b726ba177d-kube-api-access-ttmg9\") pod \"rabbitmq-server-0\" (UID: \"876a9769-a63b-46e0-961b-25b726ba177d\") " pod="openstack/rabbitmq-server-0"
Mar 11 02:19:13 crc kubenswrapper[4744]: I0311 02:19:13.106910 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/876a9769-a63b-46e0-961b-25b726ba177d-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"876a9769-a63b-46e0-961b-25b726ba177d\") " pod="openstack/rabbitmq-server-0"
Mar 11 02:19:13 crc kubenswrapper[4744]: I0311 02:19:13.106930 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/876a9769-a63b-46e0-961b-25b726ba177d-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"876a9769-a63b-46e0-961b-25b726ba177d\") " pod="openstack/rabbitmq-server-0"
Mar 11 02:19:13 crc kubenswrapper[4744]: I0311 02:19:13.106948 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/876a9769-a63b-46e0-961b-25b726ba177d-server-conf\") pod \"rabbitmq-server-0\" (UID: \"876a9769-a63b-46e0-961b-25b726ba177d\") " pod="openstack/rabbitmq-server-0"
Mar 11 02:19:13 crc kubenswrapper[4744]: I0311 02:19:13.106970 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/876a9769-a63b-46e0-961b-25b726ba177d-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"876a9769-a63b-46e0-961b-25b726ba177d\") " pod="openstack/rabbitmq-server-0"
Mar 11 02:19:13 crc kubenswrapper[4744]: I0311 02:19:13.106994 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/876a9769-a63b-46e0-961b-25b726ba177d-config-data\") pod \"rabbitmq-server-0\" (UID: \"876a9769-a63b-46e0-961b-25b726ba177d\") " pod="openstack/rabbitmq-server-0"
Mar 11 02:19:13 crc kubenswrapper[4744]: I0311 02:19:13.107012 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/876a9769-a63b-46e0-961b-25b726ba177d-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"876a9769-a63b-46e0-961b-25b726ba177d\") " pod="openstack/rabbitmq-server-0"
Mar 11 02:19:13 crc kubenswrapper[4744]: I0311 02:19:13.107028 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/876a9769-a63b-46e0-961b-25b726ba177d-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"876a9769-a63b-46e0-961b-25b726ba177d\") " pod="openstack/rabbitmq-server-0"
Mar 11 02:19:13 crc kubenswrapper[4744]: I0311 02:19:13.107091 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-2cf35867-ca5d-4038-9115-b41f33433bb8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-2cf35867-ca5d-4038-9115-b41f33433bb8\") pod \"rabbitmq-server-0\" (UID: \"876a9769-a63b-46e0-961b-25b726ba177d\") " pod="openstack/rabbitmq-server-0"
Mar 11 02:19:13 crc kubenswrapper[4744]: I0311 02:19:13.208188 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/876a9769-a63b-46e0-961b-25b726ba177d-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"876a9769-a63b-46e0-961b-25b726ba177d\") " pod="openstack/rabbitmq-server-0"
Mar 11 02:19:13 crc kubenswrapper[4744]: I0311 02:19:13.208228 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/876a9769-a63b-46e0-961b-25b726ba177d-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"876a9769-a63b-46e0-961b-25b726ba177d\") " pod="openstack/rabbitmq-server-0"
Mar 11 02:19:13 crc kubenswrapper[4744]: I0311 02:19:13.208251 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/876a9769-a63b-46e0-961b-25b726ba177d-server-conf\") pod \"rabbitmq-server-0\" (UID: \"876a9769-a63b-46e0-961b-25b726ba177d\") " pod="openstack/rabbitmq-server-0"
Mar 11 02:19:13 crc kubenswrapper[4744]: I0311 02:19:13.208274 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/876a9769-a63b-46e0-961b-25b726ba177d-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"876a9769-a63b-46e0-961b-25b726ba177d\") " pod="openstack/rabbitmq-server-0"
Mar 11 02:19:13 crc kubenswrapper[4744]: I0311 02:19:13.208299 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/876a9769-a63b-46e0-961b-25b726ba177d-config-data\") pod \"rabbitmq-server-0\" (UID: \"876a9769-a63b-46e0-961b-25b726ba177d\") " pod="openstack/rabbitmq-server-0"
Mar 11 02:19:13 crc kubenswrapper[4744]: I0311 02:19:13.208319 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/876a9769-a63b-46e0-961b-25b726ba177d-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"876a9769-a63b-46e0-961b-25b726ba177d\") " pod="openstack/rabbitmq-server-0"
Mar 11 02:19:13 crc kubenswrapper[4744]: I0311 02:19:13.208334 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/876a9769-a63b-46e0-961b-25b726ba177d-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"876a9769-a63b-46e0-961b-25b726ba177d\") " pod="openstack/rabbitmq-server-0"
Mar 11 02:19:13 crc kubenswrapper[4744]: I0311 02:19:13.208379 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-2cf35867-ca5d-4038-9115-b41f33433bb8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-2cf35867-ca5d-4038-9115-b41f33433bb8\") pod \"rabbitmq-server-0\" (UID: \"876a9769-a63b-46e0-961b-25b726ba177d\") " pod="openstack/rabbitmq-server-0"
Mar 11 02:19:13 crc kubenswrapper[4744]: I0311 02:19:13.208412 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/876a9769-a63b-46e0-961b-25b726ba177d-pod-info\") pod \"rabbitmq-server-0\" (UID: \"876a9769-a63b-46e0-961b-25b726ba177d\") " pod="openstack/rabbitmq-server-0"
Mar 11 02:19:13 crc kubenswrapper[4744]: I0311 02:19:13.208435 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/876a9769-a63b-46e0-961b-25b726ba177d-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"876a9769-a63b-46e0-961b-25b726ba177d\") " pod="openstack/rabbitmq-server-0"
Mar 11 02:19:13 crc kubenswrapper[4744]: I0311 02:19:13.208455 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ttmg9\" (UniqueName: \"kubernetes.io/projected/876a9769-a63b-46e0-961b-25b726ba177d-kube-api-access-ttmg9\") pod \"rabbitmq-server-0\" (UID: \"876a9769-a63b-46e0-961b-25b726ba177d\") " pod="openstack/rabbitmq-server-0"
Mar 11 02:19:13 crc kubenswrapper[4744]: I0311 02:19:13.208656 4744
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/876a9769-a63b-46e0-961b-25b726ba177d-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"876a9769-a63b-46e0-961b-25b726ba177d\") " pod="openstack/rabbitmq-server-0" Mar 11 02:19:13 crc kubenswrapper[4744]: I0311 02:19:13.209387 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/876a9769-a63b-46e0-961b-25b726ba177d-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"876a9769-a63b-46e0-961b-25b726ba177d\") " pod="openstack/rabbitmq-server-0" Mar 11 02:19:13 crc kubenswrapper[4744]: I0311 02:19:13.211044 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/876a9769-a63b-46e0-961b-25b726ba177d-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"876a9769-a63b-46e0-961b-25b726ba177d\") " pod="openstack/rabbitmq-server-0" Mar 11 02:19:13 crc kubenswrapper[4744]: I0311 02:19:13.211468 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/876a9769-a63b-46e0-961b-25b726ba177d-config-data\") pod \"rabbitmq-server-0\" (UID: \"876a9769-a63b-46e0-961b-25b726ba177d\") " pod="openstack/rabbitmq-server-0" Mar 11 02:19:13 crc kubenswrapper[4744]: I0311 02:19:13.212412 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/876a9769-a63b-46e0-961b-25b726ba177d-server-conf\") pod \"rabbitmq-server-0\" (UID: \"876a9769-a63b-46e0-961b-25b726ba177d\") " pod="openstack/rabbitmq-server-0" Mar 11 02:19:13 crc kubenswrapper[4744]: I0311 02:19:13.212587 4744 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Mar 11 02:19:13 crc kubenswrapper[4744]: I0311 02:19:13.212610 4744 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-2cf35867-ca5d-4038-9115-b41f33433bb8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-2cf35867-ca5d-4038-9115-b41f33433bb8\") pod \"rabbitmq-server-0\" (UID: \"876a9769-a63b-46e0-961b-25b726ba177d\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/b75736b3331f603a9da86ac6aa87ca334c35162c4bfbab228d172c616bad6b4f/globalmount\"" pod="openstack/rabbitmq-server-0" Mar 11 02:19:13 crc kubenswrapper[4744]: I0311 02:19:13.213803 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/876a9769-a63b-46e0-961b-25b726ba177d-pod-info\") pod \"rabbitmq-server-0\" (UID: \"876a9769-a63b-46e0-961b-25b726ba177d\") " pod="openstack/rabbitmq-server-0" Mar 11 02:19:13 crc kubenswrapper[4744]: I0311 02:19:13.214014 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/876a9769-a63b-46e0-961b-25b726ba177d-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"876a9769-a63b-46e0-961b-25b726ba177d\") " pod="openstack/rabbitmq-server-0" Mar 11 02:19:13 crc kubenswrapper[4744]: I0311 02:19:13.217777 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/876a9769-a63b-46e0-961b-25b726ba177d-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"876a9769-a63b-46e0-961b-25b726ba177d\") " pod="openstack/rabbitmq-server-0" Mar 11 02:19:13 crc kubenswrapper[4744]: I0311 02:19:13.218281 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/876a9769-a63b-46e0-961b-25b726ba177d-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"876a9769-a63b-46e0-961b-25b726ba177d\") " 
pod="openstack/rabbitmq-server-0" Mar 11 02:19:13 crc kubenswrapper[4744]: I0311 02:19:13.228923 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ttmg9\" (UniqueName: \"kubernetes.io/projected/876a9769-a63b-46e0-961b-25b726ba177d-kube-api-access-ttmg9\") pod \"rabbitmq-server-0\" (UID: \"876a9769-a63b-46e0-961b-25b726ba177d\") " pod="openstack/rabbitmq-server-0" Mar 11 02:19:13 crc kubenswrapper[4744]: I0311 02:19:13.252537 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-2cf35867-ca5d-4038-9115-b41f33433bb8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-2cf35867-ca5d-4038-9115-b41f33433bb8\") pod \"rabbitmq-server-0\" (UID: \"876a9769-a63b-46e0-961b-25b726ba177d\") " pod="openstack/rabbitmq-server-0" Mar 11 02:19:13 crc kubenswrapper[4744]: I0311 02:19:13.420099 4744 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Mar 11 02:19:13 crc kubenswrapper[4744]: I0311 02:19:13.422455 4744 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-server-0" Mar 11 02:19:13 crc kubenswrapper[4744]: I0311 02:19:13.513122 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/c54e9a99-5c2c-48df-a5c0-75fb8727a328-rabbitmq-plugins\") pod \"c54e9a99-5c2c-48df-a5c0-75fb8727a328\" (UID: \"c54e9a99-5c2c-48df-a5c0-75fb8727a328\") " Mar 11 02:19:13 crc kubenswrapper[4744]: I0311 02:19:13.513620 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"persistence\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-c1cc55ee-0f9e-40f7-a7ff-6a53cac2e2c7\") pod \"c54e9a99-5c2c-48df-a5c0-75fb8727a328\" (UID: \"c54e9a99-5c2c-48df-a5c0-75fb8727a328\") " Mar 11 02:19:13 crc kubenswrapper[4744]: I0311 02:19:13.513659 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/c54e9a99-5c2c-48df-a5c0-75fb8727a328-server-conf\") pod \"c54e9a99-5c2c-48df-a5c0-75fb8727a328\" (UID: \"c54e9a99-5c2c-48df-a5c0-75fb8727a328\") " Mar 11 02:19:13 crc kubenswrapper[4744]: I0311 02:19:13.513721 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kxnz8\" (UniqueName: \"kubernetes.io/projected/c54e9a99-5c2c-48df-a5c0-75fb8727a328-kube-api-access-kxnz8\") pod \"c54e9a99-5c2c-48df-a5c0-75fb8727a328\" (UID: \"c54e9a99-5c2c-48df-a5c0-75fb8727a328\") " Mar 11 02:19:13 crc kubenswrapper[4744]: I0311 02:19:13.513876 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/c54e9a99-5c2c-48df-a5c0-75fb8727a328-pod-info\") pod \"c54e9a99-5c2c-48df-a5c0-75fb8727a328\" (UID: \"c54e9a99-5c2c-48df-a5c0-75fb8727a328\") " Mar 11 02:19:13 crc kubenswrapper[4744]: I0311 02:19:13.513931 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/c54e9a99-5c2c-48df-a5c0-75fb8727a328-rabbitmq-tls\") pod \"c54e9a99-5c2c-48df-a5c0-75fb8727a328\" (UID: \"c54e9a99-5c2c-48df-a5c0-75fb8727a328\") " Mar 11 02:19:13 crc kubenswrapper[4744]: I0311 02:19:13.513967 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/c54e9a99-5c2c-48df-a5c0-75fb8727a328-plugins-conf\") pod \"c54e9a99-5c2c-48df-a5c0-75fb8727a328\" (UID: \"c54e9a99-5c2c-48df-a5c0-75fb8727a328\") " Mar 11 02:19:13 crc kubenswrapper[4744]: I0311 02:19:13.514038 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/c54e9a99-5c2c-48df-a5c0-75fb8727a328-rabbitmq-erlang-cookie\") pod \"c54e9a99-5c2c-48df-a5c0-75fb8727a328\" (UID: \"c54e9a99-5c2c-48df-a5c0-75fb8727a328\") " Mar 11 02:19:13 crc kubenswrapper[4744]: I0311 02:19:13.514099 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/c54e9a99-5c2c-48df-a5c0-75fb8727a328-rabbitmq-confd\") pod \"c54e9a99-5c2c-48df-a5c0-75fb8727a328\" (UID: \"c54e9a99-5c2c-48df-a5c0-75fb8727a328\") " Mar 11 02:19:13 crc kubenswrapper[4744]: I0311 02:19:13.514168 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/c54e9a99-5c2c-48df-a5c0-75fb8727a328-erlang-cookie-secret\") pod \"c54e9a99-5c2c-48df-a5c0-75fb8727a328\" (UID: \"c54e9a99-5c2c-48df-a5c0-75fb8727a328\") " Mar 11 02:19:13 crc kubenswrapper[4744]: I0311 02:19:13.514224 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/c54e9a99-5c2c-48df-a5c0-75fb8727a328-config-data\") pod \"c54e9a99-5c2c-48df-a5c0-75fb8727a328\" (UID: \"c54e9a99-5c2c-48df-a5c0-75fb8727a328\") " Mar 11 
02:19:13 crc kubenswrapper[4744]: I0311 02:19:13.515468 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c54e9a99-5c2c-48df-a5c0-75fb8727a328-rabbitmq-plugins" (OuterVolumeSpecName: "rabbitmq-plugins") pod "c54e9a99-5c2c-48df-a5c0-75fb8727a328" (UID: "c54e9a99-5c2c-48df-a5c0-75fb8727a328"). InnerVolumeSpecName "rabbitmq-plugins". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 11 02:19:13 crc kubenswrapper[4744]: I0311 02:19:13.516630 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c54e9a99-5c2c-48df-a5c0-75fb8727a328-rabbitmq-erlang-cookie" (OuterVolumeSpecName: "rabbitmq-erlang-cookie") pod "c54e9a99-5c2c-48df-a5c0-75fb8727a328" (UID: "c54e9a99-5c2c-48df-a5c0-75fb8727a328"). InnerVolumeSpecName "rabbitmq-erlang-cookie". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 11 02:19:13 crc kubenswrapper[4744]: I0311 02:19:13.517258 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c54e9a99-5c2c-48df-a5c0-75fb8727a328-plugins-conf" (OuterVolumeSpecName: "plugins-conf") pod "c54e9a99-5c2c-48df-a5c0-75fb8727a328" (UID: "c54e9a99-5c2c-48df-a5c0-75fb8727a328"). InnerVolumeSpecName "plugins-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 11 02:19:13 crc kubenswrapper[4744]: I0311 02:19:13.521375 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c54e9a99-5c2c-48df-a5c0-75fb8727a328-erlang-cookie-secret" (OuterVolumeSpecName: "erlang-cookie-secret") pod "c54e9a99-5c2c-48df-a5c0-75fb8727a328" (UID: "c54e9a99-5c2c-48df-a5c0-75fb8727a328"). InnerVolumeSpecName "erlang-cookie-secret". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 02:19:13 crc kubenswrapper[4744]: I0311 02:19:13.521462 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/downward-api/c54e9a99-5c2c-48df-a5c0-75fb8727a328-pod-info" (OuterVolumeSpecName: "pod-info") pod "c54e9a99-5c2c-48df-a5c0-75fb8727a328" (UID: "c54e9a99-5c2c-48df-a5c0-75fb8727a328"). InnerVolumeSpecName "pod-info". PluginName "kubernetes.io/downward-api", VolumeGidValue "" Mar 11 02:19:13 crc kubenswrapper[4744]: I0311 02:19:13.536109 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c54e9a99-5c2c-48df-a5c0-75fb8727a328-kube-api-access-kxnz8" (OuterVolumeSpecName: "kube-api-access-kxnz8") pod "c54e9a99-5c2c-48df-a5c0-75fb8727a328" (UID: "c54e9a99-5c2c-48df-a5c0-75fb8727a328"). InnerVolumeSpecName "kube-api-access-kxnz8". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 02:19:13 crc kubenswrapper[4744]: I0311 02:19:13.542160 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c54e9a99-5c2c-48df-a5c0-75fb8727a328-rabbitmq-tls" (OuterVolumeSpecName: "rabbitmq-tls") pod "c54e9a99-5c2c-48df-a5c0-75fb8727a328" (UID: "c54e9a99-5c2c-48df-a5c0-75fb8727a328"). InnerVolumeSpecName "rabbitmq-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 02:19:13 crc kubenswrapper[4744]: I0311 02:19:13.547246 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-c1cc55ee-0f9e-40f7-a7ff-6a53cac2e2c7" (OuterVolumeSpecName: "persistence") pod "c54e9a99-5c2c-48df-a5c0-75fb8727a328" (UID: "c54e9a99-5c2c-48df-a5c0-75fb8727a328"). InnerVolumeSpecName "pvc-c1cc55ee-0f9e-40f7-a7ff-6a53cac2e2c7". 
PluginName "kubernetes.io/csi", VolumeGidValue "" Mar 11 02:19:13 crc kubenswrapper[4744]: I0311 02:19:13.564009 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c54e9a99-5c2c-48df-a5c0-75fb8727a328-config-data" (OuterVolumeSpecName: "config-data") pod "c54e9a99-5c2c-48df-a5c0-75fb8727a328" (UID: "c54e9a99-5c2c-48df-a5c0-75fb8727a328"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 11 02:19:13 crc kubenswrapper[4744]: I0311 02:19:13.594250 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c54e9a99-5c2c-48df-a5c0-75fb8727a328-server-conf" (OuterVolumeSpecName: "server-conf") pod "c54e9a99-5c2c-48df-a5c0-75fb8727a328" (UID: "c54e9a99-5c2c-48df-a5c0-75fb8727a328"). InnerVolumeSpecName "server-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 11 02:19:13 crc kubenswrapper[4744]: I0311 02:19:13.616219 4744 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kxnz8\" (UniqueName: \"kubernetes.io/projected/c54e9a99-5c2c-48df-a5c0-75fb8727a328-kube-api-access-kxnz8\") on node \"crc\" DevicePath \"\"" Mar 11 02:19:13 crc kubenswrapper[4744]: I0311 02:19:13.616250 4744 reconciler_common.go:293] "Volume detached for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/c54e9a99-5c2c-48df-a5c0-75fb8727a328-pod-info\") on node \"crc\" DevicePath \"\"" Mar 11 02:19:13 crc kubenswrapper[4744]: I0311 02:19:13.616263 4744 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/c54e9a99-5c2c-48df-a5c0-75fb8727a328-rabbitmq-tls\") on node \"crc\" DevicePath \"\"" Mar 11 02:19:13 crc kubenswrapper[4744]: I0311 02:19:13.616277 4744 reconciler_common.go:293] "Volume detached for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/c54e9a99-5c2c-48df-a5c0-75fb8727a328-plugins-conf\") on node \"crc\" DevicePath \"\"" Mar 11 
02:19:13 crc kubenswrapper[4744]: I0311 02:19:13.616289 4744 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/c54e9a99-5c2c-48df-a5c0-75fb8727a328-rabbitmq-erlang-cookie\") on node \"crc\" DevicePath \"\"" Mar 11 02:19:13 crc kubenswrapper[4744]: I0311 02:19:13.616300 4744 reconciler_common.go:293] "Volume detached for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/c54e9a99-5c2c-48df-a5c0-75fb8727a328-erlang-cookie-secret\") on node \"crc\" DevicePath \"\"" Mar 11 02:19:13 crc kubenswrapper[4744]: I0311 02:19:13.616311 4744 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/c54e9a99-5c2c-48df-a5c0-75fb8727a328-config-data\") on node \"crc\" DevicePath \"\"" Mar 11 02:19:13 crc kubenswrapper[4744]: I0311 02:19:13.616322 4744 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/c54e9a99-5c2c-48df-a5c0-75fb8727a328-rabbitmq-plugins\") on node \"crc\" DevicePath \"\"" Mar 11 02:19:13 crc kubenswrapper[4744]: I0311 02:19:13.616357 4744 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"pvc-c1cc55ee-0f9e-40f7-a7ff-6a53cac2e2c7\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-c1cc55ee-0f9e-40f7-a7ff-6a53cac2e2c7\") on node \"crc\" " Mar 11 02:19:13 crc kubenswrapper[4744]: I0311 02:19:13.616371 4744 reconciler_common.go:293] "Volume detached for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/c54e9a99-5c2c-48df-a5c0-75fb8727a328-server-conf\") on node \"crc\" DevicePath \"\"" Mar 11 02:19:13 crc kubenswrapper[4744]: I0311 02:19:13.649498 4744 csi_attacher.go:630] kubernetes.io/csi: attacher.UnmountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping UnmountDevice... 
Mar 11 02:19:13 crc kubenswrapper[4744]: I0311 02:19:13.649752 4744 operation_generator.go:917] UnmountDevice succeeded for volume "pvc-c1cc55ee-0f9e-40f7-a7ff-6a53cac2e2c7" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-c1cc55ee-0f9e-40f7-a7ff-6a53cac2e2c7") on node "crc" Mar 11 02:19:13 crc kubenswrapper[4744]: I0311 02:19:13.678181 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c54e9a99-5c2c-48df-a5c0-75fb8727a328-rabbitmq-confd" (OuterVolumeSpecName: "rabbitmq-confd") pod "c54e9a99-5c2c-48df-a5c0-75fb8727a328" (UID: "c54e9a99-5c2c-48df-a5c0-75fb8727a328"). InnerVolumeSpecName "rabbitmq-confd". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 02:19:13 crc kubenswrapper[4744]: I0311 02:19:13.717257 4744 reconciler_common.go:293] "Volume detached for volume \"pvc-c1cc55ee-0f9e-40f7-a7ff-6a53cac2e2c7\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-c1cc55ee-0f9e-40f7-a7ff-6a53cac2e2c7\") on node \"crc\" DevicePath \"\"" Mar 11 02:19:13 crc kubenswrapper[4744]: I0311 02:19:13.717295 4744 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/c54e9a99-5c2c-48df-a5c0-75fb8727a328-rabbitmq-confd\") on node \"crc\" DevicePath \"\"" Mar 11 02:19:13 crc kubenswrapper[4744]: I0311 02:19:13.927357 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Mar 11 02:19:13 crc kubenswrapper[4744]: W0311 02:19:13.931169 4744 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod876a9769_a63b_46e0_961b_25b726ba177d.slice/crio-3ae81a19de84c15403dc7f1c8b8da8672a0438d1a75b8c733a9848f51c7d7672 WatchSource:0}: Error finding container 3ae81a19de84c15403dc7f1c8b8da8672a0438d1a75b8c733a9848f51c7d7672: Status 404 returned error can't find the container with id 
3ae81a19de84c15403dc7f1c8b8da8672a0438d1a75b8c733a9848f51c7d7672 Mar 11 02:19:13 crc kubenswrapper[4744]: I0311 02:19:13.956264 4744 generic.go:334] "Generic (PLEG): container finished" podID="c54e9a99-5c2c-48df-a5c0-75fb8727a328" containerID="cf0df1ce297bb3fc5a3654008ea28134277c57ad3cafbbd427f9b09c55d92a90" exitCode=0 Mar 11 02:19:13 crc kubenswrapper[4744]: I0311 02:19:13.956334 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"c54e9a99-5c2c-48df-a5c0-75fb8727a328","Type":"ContainerDied","Data":"cf0df1ce297bb3fc5a3654008ea28134277c57ad3cafbbd427f9b09c55d92a90"} Mar 11 02:19:13 crc kubenswrapper[4744]: I0311 02:19:13.956401 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"c54e9a99-5c2c-48df-a5c0-75fb8727a328","Type":"ContainerDied","Data":"bca684a2b7d25d39eca2a9206a406028bf0332bff0b1e3b8f6ea882999ee9993"} Mar 11 02:19:13 crc kubenswrapper[4744]: I0311 02:19:13.956405 4744 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Mar 11 02:19:13 crc kubenswrapper[4744]: I0311 02:19:13.956435 4744 scope.go:117] "RemoveContainer" containerID="cf0df1ce297bb3fc5a3654008ea28134277c57ad3cafbbd427f9b09c55d92a90" Mar 11 02:19:13 crc kubenswrapper[4744]: I0311 02:19:13.961143 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"876a9769-a63b-46e0-961b-25b726ba177d","Type":"ContainerStarted","Data":"3ae81a19de84c15403dc7f1c8b8da8672a0438d1a75b8c733a9848f51c7d7672"} Mar 11 02:19:13 crc kubenswrapper[4744]: I0311 02:19:13.982377 4744 scope.go:117] "RemoveContainer" containerID="947621d45cfb896f5b5cc9e379963298c64e953b5dfb03f7e69683336842f434" Mar 11 02:19:13 crc kubenswrapper[4744]: I0311 02:19:13.993471 4744 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8b1f9328-7210-412e-9d4d-1d1a8b6804dc" path="/var/lib/kubelet/pods/8b1f9328-7210-412e-9d4d-1d1a8b6804dc/volumes" Mar 11 02:19:13 crc kubenswrapper[4744]: I0311 02:19:13.995485 4744 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f6b347eb-1bcc-4fa4-96c4-c15523778e9c" path="/var/lib/kubelet/pods/f6b347eb-1bcc-4fa4-96c4-c15523778e9c/volumes" Mar 11 02:19:14 crc kubenswrapper[4744]: I0311 02:19:14.007251 4744 scope.go:117] "RemoveContainer" containerID="cf0df1ce297bb3fc5a3654008ea28134277c57ad3cafbbd427f9b09c55d92a90" Mar 11 02:19:14 crc kubenswrapper[4744]: E0311 02:19:14.007833 4744 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cf0df1ce297bb3fc5a3654008ea28134277c57ad3cafbbd427f9b09c55d92a90\": container with ID starting with cf0df1ce297bb3fc5a3654008ea28134277c57ad3cafbbd427f9b09c55d92a90 not found: ID does not exist" containerID="cf0df1ce297bb3fc5a3654008ea28134277c57ad3cafbbd427f9b09c55d92a90" Mar 11 02:19:14 crc kubenswrapper[4744]: I0311 02:19:14.007882 4744 pod_container_deletor.go:53] "DeleteContainer returned 
error" containerID={"Type":"cri-o","ID":"cf0df1ce297bb3fc5a3654008ea28134277c57ad3cafbbd427f9b09c55d92a90"} err="failed to get container status \"cf0df1ce297bb3fc5a3654008ea28134277c57ad3cafbbd427f9b09c55d92a90\": rpc error: code = NotFound desc = could not find container \"cf0df1ce297bb3fc5a3654008ea28134277c57ad3cafbbd427f9b09c55d92a90\": container with ID starting with cf0df1ce297bb3fc5a3654008ea28134277c57ad3cafbbd427f9b09c55d92a90 not found: ID does not exist" Mar 11 02:19:14 crc kubenswrapper[4744]: I0311 02:19:14.007915 4744 scope.go:117] "RemoveContainer" containerID="947621d45cfb896f5b5cc9e379963298c64e953b5dfb03f7e69683336842f434" Mar 11 02:19:14 crc kubenswrapper[4744]: E0311 02:19:14.008402 4744 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"947621d45cfb896f5b5cc9e379963298c64e953b5dfb03f7e69683336842f434\": container with ID starting with 947621d45cfb896f5b5cc9e379963298c64e953b5dfb03f7e69683336842f434 not found: ID does not exist" containerID="947621d45cfb896f5b5cc9e379963298c64e953b5dfb03f7e69683336842f434" Mar 11 02:19:14 crc kubenswrapper[4744]: I0311 02:19:14.008435 4744 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"947621d45cfb896f5b5cc9e379963298c64e953b5dfb03f7e69683336842f434"} err="failed to get container status \"947621d45cfb896f5b5cc9e379963298c64e953b5dfb03f7e69683336842f434\": rpc error: code = NotFound desc = could not find container \"947621d45cfb896f5b5cc9e379963298c64e953b5dfb03f7e69683336842f434\": container with ID starting with 947621d45cfb896f5b5cc9e379963298c64e953b5dfb03f7e69683336842f434 not found: ID does not exist" Mar 11 02:19:14 crc kubenswrapper[4744]: I0311 02:19:14.037421 4744 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Mar 11 02:19:14 crc kubenswrapper[4744]: I0311 02:19:14.051925 4744 kubelet.go:2431] "SyncLoop REMOVE" source="api" 
pods=["openstack/rabbitmq-cell1-server-0"] Mar 11 02:19:14 crc kubenswrapper[4744]: I0311 02:19:14.060549 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Mar 11 02:19:14 crc kubenswrapper[4744]: E0311 02:19:14.061450 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c54e9a99-5c2c-48df-a5c0-75fb8727a328" containerName="setup-container" Mar 11 02:19:14 crc kubenswrapper[4744]: I0311 02:19:14.061677 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="c54e9a99-5c2c-48df-a5c0-75fb8727a328" containerName="setup-container" Mar 11 02:19:14 crc kubenswrapper[4744]: E0311 02:19:14.061861 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c54e9a99-5c2c-48df-a5c0-75fb8727a328" containerName="rabbitmq" Mar 11 02:19:14 crc kubenswrapper[4744]: I0311 02:19:14.062005 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="c54e9a99-5c2c-48df-a5c0-75fb8727a328" containerName="rabbitmq" Mar 11 02:19:14 crc kubenswrapper[4744]: I0311 02:19:14.062446 4744 memory_manager.go:354] "RemoveStaleState removing state" podUID="c54e9a99-5c2c-48df-a5c0-75fb8727a328" containerName="rabbitmq" Mar 11 02:19:14 crc kubenswrapper[4744]: I0311 02:19:14.063926 4744 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Mar 11 02:19:14 crc kubenswrapper[4744]: I0311 02:19:14.068775 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-erlang-cookie" Mar 11 02:19:14 crc kubenswrapper[4744]: I0311 02:19:14.068919 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-server-dockercfg-t9pqx" Mar 11 02:19:14 crc kubenswrapper[4744]: I0311 02:19:14.069845 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-config-data" Mar 11 02:19:14 crc kubenswrapper[4744]: I0311 02:19:14.070337 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-default-user" Mar 11 02:19:14 crc kubenswrapper[4744]: I0311 02:19:14.070492 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-plugins-conf" Mar 11 02:19:14 crc kubenswrapper[4744]: I0311 02:19:14.070926 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-cell1-svc" Mar 11 02:19:14 crc kubenswrapper[4744]: I0311 02:19:14.075979 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-server-conf" Mar 11 02:19:14 crc kubenswrapper[4744]: I0311 02:19:14.080377 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Mar 11 02:19:14 crc kubenswrapper[4744]: I0311 02:19:14.225982 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/8d74643b-a5ad-4129-a109-0d49f957b306-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"8d74643b-a5ad-4129-a109-0d49f957b306\") " pod="openstack/rabbitmq-cell1-server-0" Mar 11 02:19:14 crc kubenswrapper[4744]: I0311 02:19:14.226097 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"pod-info\" (UniqueName: \"kubernetes.io/downward-api/8d74643b-a5ad-4129-a109-0d49f957b306-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"8d74643b-a5ad-4129-a109-0d49f957b306\") " pod="openstack/rabbitmq-cell1-server-0" Mar 11 02:19:14 crc kubenswrapper[4744]: I0311 02:19:14.226145 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/8d74643b-a5ad-4129-a109-0d49f957b306-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"8d74643b-a5ad-4129-a109-0d49f957b306\") " pod="openstack/rabbitmq-cell1-server-0" Mar 11 02:19:14 crc kubenswrapper[4744]: I0311 02:19:14.226192 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/8d74643b-a5ad-4129-a109-0d49f957b306-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"8d74643b-a5ad-4129-a109-0d49f957b306\") " pod="openstack/rabbitmq-cell1-server-0" Mar 11 02:19:14 crc kubenswrapper[4744]: I0311 02:19:14.226270 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/8d74643b-a5ad-4129-a109-0d49f957b306-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"8d74643b-a5ad-4129-a109-0d49f957b306\") " pod="openstack/rabbitmq-cell1-server-0" Mar 11 02:19:14 crc kubenswrapper[4744]: I0311 02:19:14.226320 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/8d74643b-a5ad-4129-a109-0d49f957b306-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"8d74643b-a5ad-4129-a109-0d49f957b306\") " pod="openstack/rabbitmq-cell1-server-0" Mar 11 02:19:14 crc kubenswrapper[4744]: I0311 02:19:14.226396 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"server-conf\" (UniqueName: \"kubernetes.io/configmap/8d74643b-a5ad-4129-a109-0d49f957b306-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"8d74643b-a5ad-4129-a109-0d49f957b306\") " pod="openstack/rabbitmq-cell1-server-0" Mar 11 02:19:14 crc kubenswrapper[4744]: I0311 02:19:14.226457 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-c1cc55ee-0f9e-40f7-a7ff-6a53cac2e2c7\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-c1cc55ee-0f9e-40f7-a7ff-6a53cac2e2c7\") pod \"rabbitmq-cell1-server-0\" (UID: \"8d74643b-a5ad-4129-a109-0d49f957b306\") " pod="openstack/rabbitmq-cell1-server-0" Mar 11 02:19:14 crc kubenswrapper[4744]: I0311 02:19:14.226551 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-26z9q\" (UniqueName: \"kubernetes.io/projected/8d74643b-a5ad-4129-a109-0d49f957b306-kube-api-access-26z9q\") pod \"rabbitmq-cell1-server-0\" (UID: \"8d74643b-a5ad-4129-a109-0d49f957b306\") " pod="openstack/rabbitmq-cell1-server-0" Mar 11 02:19:14 crc kubenswrapper[4744]: I0311 02:19:14.226615 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/8d74643b-a5ad-4129-a109-0d49f957b306-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"8d74643b-a5ad-4129-a109-0d49f957b306\") " pod="openstack/rabbitmq-cell1-server-0" Mar 11 02:19:14 crc kubenswrapper[4744]: I0311 02:19:14.226710 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/8d74643b-a5ad-4129-a109-0d49f957b306-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"8d74643b-a5ad-4129-a109-0d49f957b306\") " pod="openstack/rabbitmq-cell1-server-0" Mar 11 02:19:14 crc kubenswrapper[4744]: I0311 02:19:14.327916 4744 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/8d74643b-a5ad-4129-a109-0d49f957b306-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"8d74643b-a5ad-4129-a109-0d49f957b306\") " pod="openstack/rabbitmq-cell1-server-0" Mar 11 02:19:14 crc kubenswrapper[4744]: I0311 02:19:14.328023 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/8d74643b-a5ad-4129-a109-0d49f957b306-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"8d74643b-a5ad-4129-a109-0d49f957b306\") " pod="openstack/rabbitmq-cell1-server-0" Mar 11 02:19:14 crc kubenswrapper[4744]: I0311 02:19:14.328069 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-c1cc55ee-0f9e-40f7-a7ff-6a53cac2e2c7\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-c1cc55ee-0f9e-40f7-a7ff-6a53cac2e2c7\") pod \"rabbitmq-cell1-server-0\" (UID: \"8d74643b-a5ad-4129-a109-0d49f957b306\") " pod="openstack/rabbitmq-cell1-server-0" Mar 11 02:19:14 crc kubenswrapper[4744]: I0311 02:19:14.328112 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-26z9q\" (UniqueName: \"kubernetes.io/projected/8d74643b-a5ad-4129-a109-0d49f957b306-kube-api-access-26z9q\") pod \"rabbitmq-cell1-server-0\" (UID: \"8d74643b-a5ad-4129-a109-0d49f957b306\") " pod="openstack/rabbitmq-cell1-server-0" Mar 11 02:19:14 crc kubenswrapper[4744]: I0311 02:19:14.328157 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/8d74643b-a5ad-4129-a109-0d49f957b306-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"8d74643b-a5ad-4129-a109-0d49f957b306\") " pod="openstack/rabbitmq-cell1-server-0" Mar 11 02:19:14 crc kubenswrapper[4744]: I0311 02:19:14.328227 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/8d74643b-a5ad-4129-a109-0d49f957b306-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"8d74643b-a5ad-4129-a109-0d49f957b306\") " pod="openstack/rabbitmq-cell1-server-0" Mar 11 02:19:14 crc kubenswrapper[4744]: I0311 02:19:14.328274 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/8d74643b-a5ad-4129-a109-0d49f957b306-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"8d74643b-a5ad-4129-a109-0d49f957b306\") " pod="openstack/rabbitmq-cell1-server-0" Mar 11 02:19:14 crc kubenswrapper[4744]: I0311 02:19:14.328345 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/8d74643b-a5ad-4129-a109-0d49f957b306-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"8d74643b-a5ad-4129-a109-0d49f957b306\") " pod="openstack/rabbitmq-cell1-server-0" Mar 11 02:19:14 crc kubenswrapper[4744]: I0311 02:19:14.328381 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/8d74643b-a5ad-4129-a109-0d49f957b306-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"8d74643b-a5ad-4129-a109-0d49f957b306\") " pod="openstack/rabbitmq-cell1-server-0" Mar 11 02:19:14 crc kubenswrapper[4744]: I0311 02:19:14.328417 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/8d74643b-a5ad-4129-a109-0d49f957b306-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"8d74643b-a5ad-4129-a109-0d49f957b306\") " pod="openstack/rabbitmq-cell1-server-0" Mar 11 02:19:14 crc kubenswrapper[4744]: I0311 02:19:14.328470 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: 
\"kubernetes.io/empty-dir/8d74643b-a5ad-4129-a109-0d49f957b306-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"8d74643b-a5ad-4129-a109-0d49f957b306\") " pod="openstack/rabbitmq-cell1-server-0" Mar 11 02:19:14 crc kubenswrapper[4744]: I0311 02:19:14.329251 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/8d74643b-a5ad-4129-a109-0d49f957b306-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"8d74643b-a5ad-4129-a109-0d49f957b306\") " pod="openstack/rabbitmq-cell1-server-0" Mar 11 02:19:14 crc kubenswrapper[4744]: I0311 02:19:14.329752 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/8d74643b-a5ad-4129-a109-0d49f957b306-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"8d74643b-a5ad-4129-a109-0d49f957b306\") " pod="openstack/rabbitmq-cell1-server-0" Mar 11 02:19:14 crc kubenswrapper[4744]: I0311 02:19:14.329767 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/8d74643b-a5ad-4129-a109-0d49f957b306-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"8d74643b-a5ad-4129-a109-0d49f957b306\") " pod="openstack/rabbitmq-cell1-server-0" Mar 11 02:19:14 crc kubenswrapper[4744]: I0311 02:19:14.331329 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/8d74643b-a5ad-4129-a109-0d49f957b306-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"8d74643b-a5ad-4129-a109-0d49f957b306\") " pod="openstack/rabbitmq-cell1-server-0" Mar 11 02:19:14 crc kubenswrapper[4744]: I0311 02:19:14.331417 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/8d74643b-a5ad-4129-a109-0d49f957b306-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: 
\"8d74643b-a5ad-4129-a109-0d49f957b306\") " pod="openstack/rabbitmq-cell1-server-0" Mar 11 02:19:14 crc kubenswrapper[4744]: I0311 02:19:14.334657 4744 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Mar 11 02:19:14 crc kubenswrapper[4744]: I0311 02:19:14.334706 4744 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-c1cc55ee-0f9e-40f7-a7ff-6a53cac2e2c7\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-c1cc55ee-0f9e-40f7-a7ff-6a53cac2e2c7\") pod \"rabbitmq-cell1-server-0\" (UID: \"8d74643b-a5ad-4129-a109-0d49f957b306\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/f5b63ef380139cc0902f31bd0b7ac4425ff7a05031933db8e86fe68bd3f42ee7/globalmount\"" pod="openstack/rabbitmq-cell1-server-0" Mar 11 02:19:14 crc kubenswrapper[4744]: I0311 02:19:14.339901 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/8d74643b-a5ad-4129-a109-0d49f957b306-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"8d74643b-a5ad-4129-a109-0d49f957b306\") " pod="openstack/rabbitmq-cell1-server-0" Mar 11 02:19:14 crc kubenswrapper[4744]: I0311 02:19:14.340004 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/8d74643b-a5ad-4129-a109-0d49f957b306-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"8d74643b-a5ad-4129-a109-0d49f957b306\") " pod="openstack/rabbitmq-cell1-server-0" Mar 11 02:19:14 crc kubenswrapper[4744]: I0311 02:19:14.340401 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/8d74643b-a5ad-4129-a109-0d49f957b306-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"8d74643b-a5ad-4129-a109-0d49f957b306\") " pod="openstack/rabbitmq-cell1-server-0" Mar 11 02:19:14 crc 
kubenswrapper[4744]: I0311 02:19:14.341162 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/8d74643b-a5ad-4129-a109-0d49f957b306-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"8d74643b-a5ad-4129-a109-0d49f957b306\") " pod="openstack/rabbitmq-cell1-server-0" Mar 11 02:19:14 crc kubenswrapper[4744]: I0311 02:19:14.360606 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-26z9q\" (UniqueName: \"kubernetes.io/projected/8d74643b-a5ad-4129-a109-0d49f957b306-kube-api-access-26z9q\") pod \"rabbitmq-cell1-server-0\" (UID: \"8d74643b-a5ad-4129-a109-0d49f957b306\") " pod="openstack/rabbitmq-cell1-server-0" Mar 11 02:19:14 crc kubenswrapper[4744]: I0311 02:19:14.387973 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-c1cc55ee-0f9e-40f7-a7ff-6a53cac2e2c7\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-c1cc55ee-0f9e-40f7-a7ff-6a53cac2e2c7\") pod \"rabbitmq-cell1-server-0\" (UID: \"8d74643b-a5ad-4129-a109-0d49f957b306\") " pod="openstack/rabbitmq-cell1-server-0" Mar 11 02:19:14 crc kubenswrapper[4744]: I0311 02:19:14.429869 4744 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Mar 11 02:19:14 crc kubenswrapper[4744]: I0311 02:19:14.898399 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Mar 11 02:19:14 crc kubenswrapper[4744]: W0311 02:19:14.908998 4744 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8d74643b_a5ad_4129_a109_0d49f957b306.slice/crio-c918bfe160e087d925c7a199d89758257b50373a577be562ad8ed03a4a8ed28e WatchSource:0}: Error finding container c918bfe160e087d925c7a199d89758257b50373a577be562ad8ed03a4a8ed28e: Status 404 returned error can't find the container with id c918bfe160e087d925c7a199d89758257b50373a577be562ad8ed03a4a8ed28e Mar 11 02:19:14 crc kubenswrapper[4744]: I0311 02:19:14.975701 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"8d74643b-a5ad-4129-a109-0d49f957b306","Type":"ContainerStarted","Data":"c918bfe160e087d925c7a199d89758257b50373a577be562ad8ed03a4a8ed28e"} Mar 11 02:19:15 crc kubenswrapper[4744]: I0311 02:19:15.987197 4744 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c54e9a99-5c2c-48df-a5c0-75fb8727a328" path="/var/lib/kubelet/pods/c54e9a99-5c2c-48df-a5c0-75fb8727a328/volumes" Mar 11 02:19:16 crc kubenswrapper[4744]: I0311 02:19:16.976937 4744 scope.go:117] "RemoveContainer" containerID="6c2c2442692de599f07518611dfa7798a497d5f80b894d7b831bca9043e56ffc" Mar 11 02:19:16 crc kubenswrapper[4744]: E0311 02:19:16.977698 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-678nx_openshift-machine-config-operator(a15dc7ac-7c34-4135-b6eb-a85122800ce9)\"" pod="openshift-machine-config-operator/machine-config-daemon-678nx" podUID="a15dc7ac-7c34-4135-b6eb-a85122800ce9" Mar 
11 02:19:16 crc kubenswrapper[4744]: I0311 02:19:16.998766 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"8d74643b-a5ad-4129-a109-0d49f957b306","Type":"ContainerStarted","Data":"73c17a468a61a43f7410dd2d5f95222208f582d10ca3507189a6391ac3baf07e"} Mar 11 02:19:17 crc kubenswrapper[4744]: I0311 02:19:17.001671 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"876a9769-a63b-46e0-961b-25b726ba177d","Type":"ContainerStarted","Data":"61db4c39e39920ee48fff65710dfbb3287ee81a4ec465070a564ab6917834bf8"} Mar 11 02:19:29 crc kubenswrapper[4744]: I0311 02:19:29.975745 4744 scope.go:117] "RemoveContainer" containerID="6c2c2442692de599f07518611dfa7798a497d5f80b894d7b831bca9043e56ffc" Mar 11 02:19:29 crc kubenswrapper[4744]: E0311 02:19:29.976748 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-678nx_openshift-machine-config-operator(a15dc7ac-7c34-4135-b6eb-a85122800ce9)\"" pod="openshift-machine-config-operator/machine-config-daemon-678nx" podUID="a15dc7ac-7c34-4135-b6eb-a85122800ce9" Mar 11 02:19:42 crc kubenswrapper[4744]: I0311 02:19:42.974975 4744 scope.go:117] "RemoveContainer" containerID="6c2c2442692de599f07518611dfa7798a497d5f80b894d7b831bca9043e56ffc" Mar 11 02:19:42 crc kubenswrapper[4744]: E0311 02:19:42.976159 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-678nx_openshift-machine-config-operator(a15dc7ac-7c34-4135-b6eb-a85122800ce9)\"" pod="openshift-machine-config-operator/machine-config-daemon-678nx" podUID="a15dc7ac-7c34-4135-b6eb-a85122800ce9" Mar 11 02:19:50 crc kubenswrapper[4744]: 
I0311 02:19:50.315006 4744 generic.go:334] "Generic (PLEG): container finished" podID="876a9769-a63b-46e0-961b-25b726ba177d" containerID="61db4c39e39920ee48fff65710dfbb3287ee81a4ec465070a564ab6917834bf8" exitCode=0 Mar 11 02:19:50 crc kubenswrapper[4744]: I0311 02:19:50.315118 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"876a9769-a63b-46e0-961b-25b726ba177d","Type":"ContainerDied","Data":"61db4c39e39920ee48fff65710dfbb3287ee81a4ec465070a564ab6917834bf8"} Mar 11 02:19:50 crc kubenswrapper[4744]: I0311 02:19:50.319408 4744 generic.go:334] "Generic (PLEG): container finished" podID="8d74643b-a5ad-4129-a109-0d49f957b306" containerID="73c17a468a61a43f7410dd2d5f95222208f582d10ca3507189a6391ac3baf07e" exitCode=0 Mar 11 02:19:50 crc kubenswrapper[4744]: I0311 02:19:50.319473 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"8d74643b-a5ad-4129-a109-0d49f957b306","Type":"ContainerDied","Data":"73c17a468a61a43f7410dd2d5f95222208f582d10ca3507189a6391ac3baf07e"} Mar 11 02:19:51 crc kubenswrapper[4744]: I0311 02:19:51.327799 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"876a9769-a63b-46e0-961b-25b726ba177d","Type":"ContainerStarted","Data":"7e7f8d224f3d897fdcd32c62bc230a21731a03a4d64a81685a39a9d88ac4c57b"} Mar 11 02:19:51 crc kubenswrapper[4744]: I0311 02:19:51.328535 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-server-0" Mar 11 02:19:51 crc kubenswrapper[4744]: I0311 02:19:51.329883 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"8d74643b-a5ad-4129-a109-0d49f957b306","Type":"ContainerStarted","Data":"d84f4f276f3178d969ebe861327253637b33eaf74a3d69c3fa655c73eb25f503"} Mar 11 02:19:51 crc kubenswrapper[4744]: I0311 02:19:51.330148 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openstack/rabbitmq-cell1-server-0" Mar 11 02:19:51 crc kubenswrapper[4744]: I0311 02:19:51.353734 4744 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-server-0" podStartSLOduration=39.353716981 podStartE2EDuration="39.353716981s" podCreationTimestamp="2026-03-11 02:19:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-11 02:19:51.346946293 +0000 UTC m=+5148.151163898" watchObservedRunningTime="2026-03-11 02:19:51.353716981 +0000 UTC m=+5148.157934586" Mar 11 02:19:51 crc kubenswrapper[4744]: I0311 02:19:51.372485 4744 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-cell1-server-0" podStartSLOduration=37.372470209 podStartE2EDuration="37.372470209s" podCreationTimestamp="2026-03-11 02:19:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-11 02:19:51.368411855 +0000 UTC m=+5148.172629490" watchObservedRunningTime="2026-03-11 02:19:51.372470209 +0000 UTC m=+5148.176687804" Mar 11 02:19:57 crc kubenswrapper[4744]: I0311 02:19:57.975498 4744 scope.go:117] "RemoveContainer" containerID="6c2c2442692de599f07518611dfa7798a497d5f80b894d7b831bca9043e56ffc" Mar 11 02:19:57 crc kubenswrapper[4744]: E0311 02:19:57.976451 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-678nx_openshift-machine-config-operator(a15dc7ac-7c34-4135-b6eb-a85122800ce9)\"" pod="openshift-machine-config-operator/machine-config-daemon-678nx" podUID="a15dc7ac-7c34-4135-b6eb-a85122800ce9" Mar 11 02:20:00 crc kubenswrapper[4744]: I0311 02:20:00.166459 4744 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-infra/auto-csr-approver-29553260-5m7db"] Mar 11 02:20:00 crc kubenswrapper[4744]: I0311 02:20:00.168202 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29553260-5m7db" Mar 11 02:20:00 crc kubenswrapper[4744]: I0311 02:20:00.173429 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 11 02:20:00 crc kubenswrapper[4744]: I0311 02:20:00.173698 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 11 02:20:00 crc kubenswrapper[4744]: I0311 02:20:00.173715 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-rs77s" Mar 11 02:20:00 crc kubenswrapper[4744]: I0311 02:20:00.178438 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29553260-5m7db"] Mar 11 02:20:00 crc kubenswrapper[4744]: I0311 02:20:00.266816 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hfbht\" (UniqueName: \"kubernetes.io/projected/f04ce991-8347-48a6-84a7-619f7406c110-kube-api-access-hfbht\") pod \"auto-csr-approver-29553260-5m7db\" (UID: \"f04ce991-8347-48a6-84a7-619f7406c110\") " pod="openshift-infra/auto-csr-approver-29553260-5m7db" Mar 11 02:20:00 crc kubenswrapper[4744]: I0311 02:20:00.368719 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hfbht\" (UniqueName: \"kubernetes.io/projected/f04ce991-8347-48a6-84a7-619f7406c110-kube-api-access-hfbht\") pod \"auto-csr-approver-29553260-5m7db\" (UID: \"f04ce991-8347-48a6-84a7-619f7406c110\") " pod="openshift-infra/auto-csr-approver-29553260-5m7db" Mar 11 02:20:00 crc kubenswrapper[4744]: I0311 02:20:00.415414 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hfbht\" (UniqueName: 
\"kubernetes.io/projected/f04ce991-8347-48a6-84a7-619f7406c110-kube-api-access-hfbht\") pod \"auto-csr-approver-29553260-5m7db\" (UID: \"f04ce991-8347-48a6-84a7-619f7406c110\") " pod="openshift-infra/auto-csr-approver-29553260-5m7db" Mar 11 02:20:00 crc kubenswrapper[4744]: I0311 02:20:00.519994 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29553260-5m7db" Mar 11 02:20:01 crc kubenswrapper[4744]: I0311 02:20:01.013737 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29553260-5m7db"] Mar 11 02:20:01 crc kubenswrapper[4744]: I0311 02:20:01.022096 4744 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 11 02:20:01 crc kubenswrapper[4744]: I0311 02:20:01.414808 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29553260-5m7db" event={"ID":"f04ce991-8347-48a6-84a7-619f7406c110","Type":"ContainerStarted","Data":"63aa9ff2f3b7a7af04994230ebec90de7a8c19b9f70e1c52737ce3e72b74f1c9"} Mar 11 02:20:03 crc kubenswrapper[4744]: I0311 02:20:03.428760 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-server-0" Mar 11 02:20:03 crc kubenswrapper[4744]: I0311 02:20:03.438581 4744 generic.go:334] "Generic (PLEG): container finished" podID="f04ce991-8347-48a6-84a7-619f7406c110" containerID="692f4e0f9c3819a33b328d73569888269a69e6a812161c921ef2c00811a5ca28" exitCode=0 Mar 11 02:20:03 crc kubenswrapper[4744]: I0311 02:20:03.438742 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29553260-5m7db" event={"ID":"f04ce991-8347-48a6-84a7-619f7406c110","Type":"ContainerDied","Data":"692f4e0f9c3819a33b328d73569888269a69e6a812161c921ef2c00811a5ca28"} Mar 11 02:20:04 crc kubenswrapper[4744]: I0311 02:20:04.432698 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openstack/rabbitmq-cell1-server-0" Mar 11 02:20:04 crc kubenswrapper[4744]: I0311 02:20:04.774182 4744 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29553260-5m7db" Mar 11 02:20:04 crc kubenswrapper[4744]: I0311 02:20:04.848732 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hfbht\" (UniqueName: \"kubernetes.io/projected/f04ce991-8347-48a6-84a7-619f7406c110-kube-api-access-hfbht\") pod \"f04ce991-8347-48a6-84a7-619f7406c110\" (UID: \"f04ce991-8347-48a6-84a7-619f7406c110\") " Mar 11 02:20:04 crc kubenswrapper[4744]: I0311 02:20:04.856681 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f04ce991-8347-48a6-84a7-619f7406c110-kube-api-access-hfbht" (OuterVolumeSpecName: "kube-api-access-hfbht") pod "f04ce991-8347-48a6-84a7-619f7406c110" (UID: "f04ce991-8347-48a6-84a7-619f7406c110"). InnerVolumeSpecName "kube-api-access-hfbht". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 02:20:04 crc kubenswrapper[4744]: I0311 02:20:04.950240 4744 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hfbht\" (UniqueName: \"kubernetes.io/projected/f04ce991-8347-48a6-84a7-619f7406c110-kube-api-access-hfbht\") on node \"crc\" DevicePath \"\"" Mar 11 02:20:05 crc kubenswrapper[4744]: I0311 02:20:05.459804 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29553260-5m7db" event={"ID":"f04ce991-8347-48a6-84a7-619f7406c110","Type":"ContainerDied","Data":"63aa9ff2f3b7a7af04994230ebec90de7a8c19b9f70e1c52737ce3e72b74f1c9"} Mar 11 02:20:05 crc kubenswrapper[4744]: I0311 02:20:05.459897 4744 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="63aa9ff2f3b7a7af04994230ebec90de7a8c19b9f70e1c52737ce3e72b74f1c9" Mar 11 02:20:05 crc kubenswrapper[4744]: I0311 02:20:05.459869 4744 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29553260-5m7db" Mar 11 02:20:05 crc kubenswrapper[4744]: I0311 02:20:05.860149 4744 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29553254-w8rth"] Mar 11 02:20:05 crc kubenswrapper[4744]: I0311 02:20:05.871310 4744 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29553254-w8rth"] Mar 11 02:20:05 crc kubenswrapper[4744]: I0311 02:20:05.991146 4744 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ff8bcff3-14f9-4749-bf02-147262a8384c" path="/var/lib/kubelet/pods/ff8bcff3-14f9-4749-bf02-147262a8384c/volumes" Mar 11 02:20:07 crc kubenswrapper[4744]: I0311 02:20:07.585206 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/mariadb-client"] Mar 11 02:20:07 crc kubenswrapper[4744]: E0311 02:20:07.585548 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f04ce991-8347-48a6-84a7-619f7406c110" 
containerName="oc" Mar 11 02:20:07 crc kubenswrapper[4744]: I0311 02:20:07.585564 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="f04ce991-8347-48a6-84a7-619f7406c110" containerName="oc" Mar 11 02:20:07 crc kubenswrapper[4744]: I0311 02:20:07.585743 4744 memory_manager.go:354] "RemoveStaleState removing state" podUID="f04ce991-8347-48a6-84a7-619f7406c110" containerName="oc" Mar 11 02:20:07 crc kubenswrapper[4744]: I0311 02:20:07.586318 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-client" Mar 11 02:20:07 crc kubenswrapper[4744]: I0311 02:20:07.590085 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"default-dockercfg-cz2v8" Mar 11 02:20:07 crc kubenswrapper[4744]: I0311 02:20:07.605929 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-client"] Mar 11 02:20:07 crc kubenswrapper[4744]: I0311 02:20:07.696769 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fw99k\" (UniqueName: \"kubernetes.io/projected/fc3b8462-612d-4feb-b961-565966784043-kube-api-access-fw99k\") pod \"mariadb-client\" (UID: \"fc3b8462-612d-4feb-b961-565966784043\") " pod="openstack/mariadb-client" Mar 11 02:20:07 crc kubenswrapper[4744]: I0311 02:20:07.798999 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fw99k\" (UniqueName: \"kubernetes.io/projected/fc3b8462-612d-4feb-b961-565966784043-kube-api-access-fw99k\") pod \"mariadb-client\" (UID: \"fc3b8462-612d-4feb-b961-565966784043\") " pod="openstack/mariadb-client" Mar 11 02:20:07 crc kubenswrapper[4744]: I0311 02:20:07.830329 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fw99k\" (UniqueName: \"kubernetes.io/projected/fc3b8462-612d-4feb-b961-565966784043-kube-api-access-fw99k\") pod \"mariadb-client\" (UID: \"fc3b8462-612d-4feb-b961-565966784043\") " 
pod="openstack/mariadb-client" Mar 11 02:20:07 crc kubenswrapper[4744]: I0311 02:20:07.907142 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-client" Mar 11 02:20:08 crc kubenswrapper[4744]: I0311 02:20:08.116658 4744 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/openstack-galera-0" podUID="a6e19d5d-20f5-4836-afcc-a5958a01bbf2" containerName="galera" probeResult="failure" output="command timed out" Mar 11 02:20:08 crc kubenswrapper[4744]: I0311 02:20:08.391726 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-client"] Mar 11 02:20:08 crc kubenswrapper[4744]: I0311 02:20:08.484220 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client" event={"ID":"fc3b8462-612d-4feb-b961-565966784043","Type":"ContainerStarted","Data":"05f5fd992812203b308a2ffdce4de302ee16c735c5fea730283286edb47c29e8"} Mar 11 02:20:11 crc kubenswrapper[4744]: I0311 02:20:11.974596 4744 scope.go:117] "RemoveContainer" containerID="6c2c2442692de599f07518611dfa7798a497d5f80b894d7b831bca9043e56ffc" Mar 11 02:20:11 crc kubenswrapper[4744]: E0311 02:20:11.975643 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-678nx_openshift-machine-config-operator(a15dc7ac-7c34-4135-b6eb-a85122800ce9)\"" pod="openshift-machine-config-operator/machine-config-daemon-678nx" podUID="a15dc7ac-7c34-4135-b6eb-a85122800ce9" Mar 11 02:20:15 crc kubenswrapper[4744]: I0311 02:20:15.390607 4744 scope.go:117] "RemoveContainer" containerID="59fe0ddd69ed4d66043f9cd786907504f5ca9fbc1a188bde3b954af3e59fe105" Mar 11 02:20:16 crc kubenswrapper[4744]: I0311 02:20:16.550519 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client" 
event={"ID":"fc3b8462-612d-4feb-b961-565966784043","Type":"ContainerStarted","Data":"4f3796eb66c05adedbcd36c457e7132a073bdd0e231f4ca2f640fc04c9ea04dc"} Mar 11 02:20:22 crc kubenswrapper[4744]: I0311 02:20:22.974399 4744 scope.go:117] "RemoveContainer" containerID="6c2c2442692de599f07518611dfa7798a497d5f80b894d7b831bca9043e56ffc" Mar 11 02:20:22 crc kubenswrapper[4744]: E0311 02:20:22.975233 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-678nx_openshift-machine-config-operator(a15dc7ac-7c34-4135-b6eb-a85122800ce9)\"" pod="openshift-machine-config-operator/machine-config-daemon-678nx" podUID="a15dc7ac-7c34-4135-b6eb-a85122800ce9" Mar 11 02:20:34 crc kubenswrapper[4744]: I0311 02:20:34.500138 4744 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/mariadb-client" podStartSLOduration=19.739616019 podStartE2EDuration="27.500108665s" podCreationTimestamp="2026-03-11 02:20:07 +0000 UTC" firstStartedPulling="2026-03-11 02:20:08.401418991 +0000 UTC m=+5165.205636636" lastFinishedPulling="2026-03-11 02:20:16.161911637 +0000 UTC m=+5172.966129282" observedRunningTime="2026-03-11 02:20:16.57643327 +0000 UTC m=+5173.380650915" watchObservedRunningTime="2026-03-11 02:20:34.500108665 +0000 UTC m=+5191.304326310" Mar 11 02:20:34 crc kubenswrapper[4744]: I0311 02:20:34.511082 4744 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/mariadb-client"] Mar 11 02:20:34 crc kubenswrapper[4744]: I0311 02:20:34.511389 4744 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/mariadb-client" podUID="fc3b8462-612d-4feb-b961-565966784043" containerName="mariadb-client" containerID="cri-o://4f3796eb66c05adedbcd36c457e7132a073bdd0e231f4ca2f640fc04c9ea04dc" gracePeriod=30 Mar 11 02:20:34 crc kubenswrapper[4744]: I0311 02:20:34.706859 
4744 generic.go:334] "Generic (PLEG): container finished" podID="fc3b8462-612d-4feb-b961-565966784043" containerID="4f3796eb66c05adedbcd36c457e7132a073bdd0e231f4ca2f640fc04c9ea04dc" exitCode=143 Mar 11 02:20:34 crc kubenswrapper[4744]: I0311 02:20:34.706899 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client" event={"ID":"fc3b8462-612d-4feb-b961-565966784043","Type":"ContainerDied","Data":"4f3796eb66c05adedbcd36c457e7132a073bdd0e231f4ca2f640fc04c9ea04dc"} Mar 11 02:20:35 crc kubenswrapper[4744]: I0311 02:20:35.094683 4744 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-client" Mar 11 02:20:35 crc kubenswrapper[4744]: I0311 02:20:35.277692 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fw99k\" (UniqueName: \"kubernetes.io/projected/fc3b8462-612d-4feb-b961-565966784043-kube-api-access-fw99k\") pod \"fc3b8462-612d-4feb-b961-565966784043\" (UID: \"fc3b8462-612d-4feb-b961-565966784043\") " Mar 11 02:20:35 crc kubenswrapper[4744]: I0311 02:20:35.286767 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fc3b8462-612d-4feb-b961-565966784043-kube-api-access-fw99k" (OuterVolumeSpecName: "kube-api-access-fw99k") pod "fc3b8462-612d-4feb-b961-565966784043" (UID: "fc3b8462-612d-4feb-b961-565966784043"). InnerVolumeSpecName "kube-api-access-fw99k". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 02:20:35 crc kubenswrapper[4744]: I0311 02:20:35.379763 4744 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fw99k\" (UniqueName: \"kubernetes.io/projected/fc3b8462-612d-4feb-b961-565966784043-kube-api-access-fw99k\") on node \"crc\" DevicePath \"\"" Mar 11 02:20:35 crc kubenswrapper[4744]: I0311 02:20:35.719690 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client" event={"ID":"fc3b8462-612d-4feb-b961-565966784043","Type":"ContainerDied","Data":"05f5fd992812203b308a2ffdce4de302ee16c735c5fea730283286edb47c29e8"} Mar 11 02:20:35 crc kubenswrapper[4744]: I0311 02:20:35.719774 4744 scope.go:117] "RemoveContainer" containerID="4f3796eb66c05adedbcd36c457e7132a073bdd0e231f4ca2f640fc04c9ea04dc" Mar 11 02:20:35 crc kubenswrapper[4744]: I0311 02:20:35.719782 4744 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-client" Mar 11 02:20:35 crc kubenswrapper[4744]: I0311 02:20:35.767331 4744 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/mariadb-client"] Mar 11 02:20:35 crc kubenswrapper[4744]: I0311 02:20:35.771751 4744 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/mariadb-client"] Mar 11 02:20:35 crc kubenswrapper[4744]: I0311 02:20:35.974876 4744 scope.go:117] "RemoveContainer" containerID="6c2c2442692de599f07518611dfa7798a497d5f80b894d7b831bca9043e56ffc" Mar 11 02:20:35 crc kubenswrapper[4744]: E0311 02:20:35.975355 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-678nx_openshift-machine-config-operator(a15dc7ac-7c34-4135-b6eb-a85122800ce9)\"" pod="openshift-machine-config-operator/machine-config-daemon-678nx" podUID="a15dc7ac-7c34-4135-b6eb-a85122800ce9" Mar 11 02:20:36 crc 
kubenswrapper[4744]: I0311 02:20:36.003605 4744 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fc3b8462-612d-4feb-b961-565966784043" path="/var/lib/kubelet/pods/fc3b8462-612d-4feb-b961-565966784043/volumes" Mar 11 02:20:50 crc kubenswrapper[4744]: I0311 02:20:50.974908 4744 scope.go:117] "RemoveContainer" containerID="6c2c2442692de599f07518611dfa7798a497d5f80b894d7b831bca9043e56ffc" Mar 11 02:20:50 crc kubenswrapper[4744]: E0311 02:20:50.975843 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-678nx_openshift-machine-config-operator(a15dc7ac-7c34-4135-b6eb-a85122800ce9)\"" pod="openshift-machine-config-operator/machine-config-daemon-678nx" podUID="a15dc7ac-7c34-4135-b6eb-a85122800ce9" Mar 11 02:21:01 crc kubenswrapper[4744]: I0311 02:21:01.975023 4744 scope.go:117] "RemoveContainer" containerID="6c2c2442692de599f07518611dfa7798a497d5f80b894d7b831bca9043e56ffc" Mar 11 02:21:01 crc kubenswrapper[4744]: E0311 02:21:01.976046 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-678nx_openshift-machine-config-operator(a15dc7ac-7c34-4135-b6eb-a85122800ce9)\"" pod="openshift-machine-config-operator/machine-config-daemon-678nx" podUID="a15dc7ac-7c34-4135-b6eb-a85122800ce9" Mar 11 02:21:15 crc kubenswrapper[4744]: I0311 02:21:15.975272 4744 scope.go:117] "RemoveContainer" containerID="6c2c2442692de599f07518611dfa7798a497d5f80b894d7b831bca9043e56ffc" Mar 11 02:21:15 crc kubenswrapper[4744]: E0311 02:21:15.976208 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed 
container=machine-config-daemon pod=machine-config-daemon-678nx_openshift-machine-config-operator(a15dc7ac-7c34-4135-b6eb-a85122800ce9)\"" pod="openshift-machine-config-operator/machine-config-daemon-678nx" podUID="a15dc7ac-7c34-4135-b6eb-a85122800ce9" Mar 11 02:21:27 crc kubenswrapper[4744]: I0311 02:21:27.976225 4744 scope.go:117] "RemoveContainer" containerID="6c2c2442692de599f07518611dfa7798a497d5f80b894d7b831bca9043e56ffc" Mar 11 02:21:27 crc kubenswrapper[4744]: E0311 02:21:27.977298 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-678nx_openshift-machine-config-operator(a15dc7ac-7c34-4135-b6eb-a85122800ce9)\"" pod="openshift-machine-config-operator/machine-config-daemon-678nx" podUID="a15dc7ac-7c34-4135-b6eb-a85122800ce9" Mar 11 02:21:41 crc kubenswrapper[4744]: I0311 02:21:41.975617 4744 scope.go:117] "RemoveContainer" containerID="6c2c2442692de599f07518611dfa7798a497d5f80b894d7b831bca9043e56ffc" Mar 11 02:21:41 crc kubenswrapper[4744]: E0311 02:21:41.976657 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-678nx_openshift-machine-config-operator(a15dc7ac-7c34-4135-b6eb-a85122800ce9)\"" pod="openshift-machine-config-operator/machine-config-daemon-678nx" podUID="a15dc7ac-7c34-4135-b6eb-a85122800ce9" Mar 11 02:21:52 crc kubenswrapper[4744]: I0311 02:21:52.975809 4744 scope.go:117] "RemoveContainer" containerID="6c2c2442692de599f07518611dfa7798a497d5f80b894d7b831bca9043e56ffc" Mar 11 02:21:52 crc kubenswrapper[4744]: E0311 02:21:52.976378 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s 
restarting failed container=machine-config-daemon pod=machine-config-daemon-678nx_openshift-machine-config-operator(a15dc7ac-7c34-4135-b6eb-a85122800ce9)\"" pod="openshift-machine-config-operator/machine-config-daemon-678nx" podUID="a15dc7ac-7c34-4135-b6eb-a85122800ce9" Mar 11 02:22:00 crc kubenswrapper[4744]: I0311 02:22:00.174503 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29553262-z2v47"] Mar 11 02:22:00 crc kubenswrapper[4744]: E0311 02:22:00.175194 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fc3b8462-612d-4feb-b961-565966784043" containerName="mariadb-client" Mar 11 02:22:00 crc kubenswrapper[4744]: I0311 02:22:00.175211 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="fc3b8462-612d-4feb-b961-565966784043" containerName="mariadb-client" Mar 11 02:22:00 crc kubenswrapper[4744]: I0311 02:22:00.175384 4744 memory_manager.go:354] "RemoveStaleState removing state" podUID="fc3b8462-612d-4feb-b961-565966784043" containerName="mariadb-client" Mar 11 02:22:00 crc kubenswrapper[4744]: I0311 02:22:00.175984 4744 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29553262-z2v47" Mar 11 02:22:00 crc kubenswrapper[4744]: I0311 02:22:00.178984 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 11 02:22:00 crc kubenswrapper[4744]: I0311 02:22:00.180897 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-rs77s" Mar 11 02:22:00 crc kubenswrapper[4744]: I0311 02:22:00.181853 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 11 02:22:00 crc kubenswrapper[4744]: I0311 02:22:00.194151 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29553262-z2v47"] Mar 11 02:22:00 crc kubenswrapper[4744]: I0311 02:22:00.353607 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wjbsl\" (UniqueName: \"kubernetes.io/projected/f7b424dd-5928-4b87-ba28-83de9295aa31-kube-api-access-wjbsl\") pod \"auto-csr-approver-29553262-z2v47\" (UID: \"f7b424dd-5928-4b87-ba28-83de9295aa31\") " pod="openshift-infra/auto-csr-approver-29553262-z2v47" Mar 11 02:22:00 crc kubenswrapper[4744]: I0311 02:22:00.455495 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wjbsl\" (UniqueName: \"kubernetes.io/projected/f7b424dd-5928-4b87-ba28-83de9295aa31-kube-api-access-wjbsl\") pod \"auto-csr-approver-29553262-z2v47\" (UID: \"f7b424dd-5928-4b87-ba28-83de9295aa31\") " pod="openshift-infra/auto-csr-approver-29553262-z2v47" Mar 11 02:22:00 crc kubenswrapper[4744]: I0311 02:22:00.490082 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wjbsl\" (UniqueName: \"kubernetes.io/projected/f7b424dd-5928-4b87-ba28-83de9295aa31-kube-api-access-wjbsl\") pod \"auto-csr-approver-29553262-z2v47\" (UID: \"f7b424dd-5928-4b87-ba28-83de9295aa31\") " 
pod="openshift-infra/auto-csr-approver-29553262-z2v47" Mar 11 02:22:00 crc kubenswrapper[4744]: I0311 02:22:00.509550 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29553262-z2v47" Mar 11 02:22:01 crc kubenswrapper[4744]: I0311 02:22:01.011197 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29553262-z2v47"] Mar 11 02:22:01 crc kubenswrapper[4744]: W0311 02:22:01.026550 4744 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf7b424dd_5928_4b87_ba28_83de9295aa31.slice/crio-b44937fd915ea919b1ce896f9461594f9f970e75712bca9b15b2e37e00d305e7 WatchSource:0}: Error finding container b44937fd915ea919b1ce896f9461594f9f970e75712bca9b15b2e37e00d305e7: Status 404 returned error can't find the container with id b44937fd915ea919b1ce896f9461594f9f970e75712bca9b15b2e37e00d305e7 Mar 11 02:22:01 crc kubenswrapper[4744]: I0311 02:22:01.539744 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29553262-z2v47" event={"ID":"f7b424dd-5928-4b87-ba28-83de9295aa31","Type":"ContainerStarted","Data":"b44937fd915ea919b1ce896f9461594f9f970e75712bca9b15b2e37e00d305e7"} Mar 11 02:22:02 crc kubenswrapper[4744]: I0311 02:22:02.551413 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29553262-z2v47" event={"ID":"f7b424dd-5928-4b87-ba28-83de9295aa31","Type":"ContainerStarted","Data":"4208c0a9f8464658ee51ac3024f1ca010869bf0f60c3bdcacab6d78eb48ad219"} Mar 11 02:22:02 crc kubenswrapper[4744]: I0311 02:22:02.571669 4744 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29553262-z2v47" podStartSLOduration=1.686597334 podStartE2EDuration="2.57164191s" podCreationTimestamp="2026-03-11 02:22:00 +0000 UTC" firstStartedPulling="2026-03-11 02:22:01.029560356 +0000 UTC 
m=+5277.833778001" lastFinishedPulling="2026-03-11 02:22:01.914604942 +0000 UTC m=+5278.718822577" observedRunningTime="2026-03-11 02:22:02.564456608 +0000 UTC m=+5279.368674253" watchObservedRunningTime="2026-03-11 02:22:02.57164191 +0000 UTC m=+5279.375859555" Mar 11 02:22:03 crc kubenswrapper[4744]: I0311 02:22:03.563945 4744 generic.go:334] "Generic (PLEG): container finished" podID="f7b424dd-5928-4b87-ba28-83de9295aa31" containerID="4208c0a9f8464658ee51ac3024f1ca010869bf0f60c3bdcacab6d78eb48ad219" exitCode=0 Mar 11 02:22:03 crc kubenswrapper[4744]: I0311 02:22:03.564077 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29553262-z2v47" event={"ID":"f7b424dd-5928-4b87-ba28-83de9295aa31","Type":"ContainerDied","Data":"4208c0a9f8464658ee51ac3024f1ca010869bf0f60c3bdcacab6d78eb48ad219"} Mar 11 02:22:05 crc kubenswrapper[4744]: I0311 02:22:05.025348 4744 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29553262-z2v47" Mar 11 02:22:05 crc kubenswrapper[4744]: I0311 02:22:05.142966 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wjbsl\" (UniqueName: \"kubernetes.io/projected/f7b424dd-5928-4b87-ba28-83de9295aa31-kube-api-access-wjbsl\") pod \"f7b424dd-5928-4b87-ba28-83de9295aa31\" (UID: \"f7b424dd-5928-4b87-ba28-83de9295aa31\") " Mar 11 02:22:05 crc kubenswrapper[4744]: I0311 02:22:05.151317 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f7b424dd-5928-4b87-ba28-83de9295aa31-kube-api-access-wjbsl" (OuterVolumeSpecName: "kube-api-access-wjbsl") pod "f7b424dd-5928-4b87-ba28-83de9295aa31" (UID: "f7b424dd-5928-4b87-ba28-83de9295aa31"). InnerVolumeSpecName "kube-api-access-wjbsl". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 02:22:05 crc kubenswrapper[4744]: I0311 02:22:05.245150 4744 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wjbsl\" (UniqueName: \"kubernetes.io/projected/f7b424dd-5928-4b87-ba28-83de9295aa31-kube-api-access-wjbsl\") on node \"crc\" DevicePath \"\"" Mar 11 02:22:05 crc kubenswrapper[4744]: I0311 02:22:05.600768 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29553262-z2v47" event={"ID":"f7b424dd-5928-4b87-ba28-83de9295aa31","Type":"ContainerDied","Data":"b44937fd915ea919b1ce896f9461594f9f970e75712bca9b15b2e37e00d305e7"} Mar 11 02:22:05 crc kubenswrapper[4744]: I0311 02:22:05.600865 4744 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b44937fd915ea919b1ce896f9461594f9f970e75712bca9b15b2e37e00d305e7" Mar 11 02:22:05 crc kubenswrapper[4744]: I0311 02:22:05.600878 4744 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29553262-z2v47" Mar 11 02:22:05 crc kubenswrapper[4744]: I0311 02:22:05.670554 4744 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29553256-bbwlt"] Mar 11 02:22:05 crc kubenswrapper[4744]: I0311 02:22:05.680717 4744 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29553256-bbwlt"] Mar 11 02:22:05 crc kubenswrapper[4744]: I0311 02:22:05.975212 4744 scope.go:117] "RemoveContainer" containerID="6c2c2442692de599f07518611dfa7798a497d5f80b894d7b831bca9043e56ffc" Mar 11 02:22:05 crc kubenswrapper[4744]: E0311 02:22:05.975852 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-678nx_openshift-machine-config-operator(a15dc7ac-7c34-4135-b6eb-a85122800ce9)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-678nx" podUID="a15dc7ac-7c34-4135-b6eb-a85122800ce9" Mar 11 02:22:05 crc kubenswrapper[4744]: I0311 02:22:05.991698 4744 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="05dc404b-a4cb-49a5-a9e2-450c8ad160cb" path="/var/lib/kubelet/pods/05dc404b-a4cb-49a5-a9e2-450c8ad160cb/volumes" Mar 11 02:22:16 crc kubenswrapper[4744]: I0311 02:22:16.147304 4744 scope.go:117] "RemoveContainer" containerID="ae988813cdc8239c3af7965e7641ef987de1b0eb7eb8655b2c604fcb26a9323e" Mar 11 02:22:16 crc kubenswrapper[4744]: I0311 02:22:16.217847 4744 scope.go:117] "RemoveContainer" containerID="0c5592d8b1416e02e0c169a4c882eed0f2828a0303926738ff99ec427bd73979" Mar 11 02:22:16 crc kubenswrapper[4744]: I0311 02:22:16.974940 4744 scope.go:117] "RemoveContainer" containerID="6c2c2442692de599f07518611dfa7798a497d5f80b894d7b831bca9043e56ffc" Mar 11 02:22:16 crc kubenswrapper[4744]: E0311 02:22:16.975165 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-678nx_openshift-machine-config-operator(a15dc7ac-7c34-4135-b6eb-a85122800ce9)\"" pod="openshift-machine-config-operator/machine-config-daemon-678nx" podUID="a15dc7ac-7c34-4135-b6eb-a85122800ce9" Mar 11 02:22:30 crc kubenswrapper[4744]: I0311 02:22:30.975405 4744 scope.go:117] "RemoveContainer" containerID="6c2c2442692de599f07518611dfa7798a497d5f80b894d7b831bca9043e56ffc" Mar 11 02:22:30 crc kubenswrapper[4744]: E0311 02:22:30.976729 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-678nx_openshift-machine-config-operator(a15dc7ac-7c34-4135-b6eb-a85122800ce9)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-678nx" podUID="a15dc7ac-7c34-4135-b6eb-a85122800ce9" Mar 11 02:22:43 crc kubenswrapper[4744]: I0311 02:22:43.974698 4744 scope.go:117] "RemoveContainer" containerID="6c2c2442692de599f07518611dfa7798a497d5f80b894d7b831bca9043e56ffc" Mar 11 02:22:44 crc kubenswrapper[4744]: I0311 02:22:44.992217 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-678nx" event={"ID":"a15dc7ac-7c34-4135-b6eb-a85122800ce9","Type":"ContainerStarted","Data":"8aecf29c05b573f43c0dc7af38620d95b6c4462b39682ce2f5e33235c90e00ed"} Mar 11 02:24:00 crc kubenswrapper[4744]: I0311 02:24:00.190713 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29553264-55vm5"] Mar 11 02:24:00 crc kubenswrapper[4744]: E0311 02:24:00.191852 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f7b424dd-5928-4b87-ba28-83de9295aa31" containerName="oc" Mar 11 02:24:00 crc kubenswrapper[4744]: I0311 02:24:00.191875 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="f7b424dd-5928-4b87-ba28-83de9295aa31" containerName="oc" Mar 11 02:24:00 crc kubenswrapper[4744]: I0311 02:24:00.192159 4744 memory_manager.go:354] "RemoveStaleState removing state" podUID="f7b424dd-5928-4b87-ba28-83de9295aa31" containerName="oc" Mar 11 02:24:00 crc kubenswrapper[4744]: I0311 02:24:00.193056 4744 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29553264-55vm5" Mar 11 02:24:00 crc kubenswrapper[4744]: I0311 02:24:00.196476 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 11 02:24:00 crc kubenswrapper[4744]: I0311 02:24:00.197378 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 11 02:24:00 crc kubenswrapper[4744]: I0311 02:24:00.198334 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-rs77s" Mar 11 02:24:00 crc kubenswrapper[4744]: I0311 02:24:00.201846 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29553264-55vm5"] Mar 11 02:24:00 crc kubenswrapper[4744]: I0311 02:24:00.288578 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cbhnf\" (UniqueName: \"kubernetes.io/projected/7e1c5f90-fd1b-4043-8052-f79cc0531a9b-kube-api-access-cbhnf\") pod \"auto-csr-approver-29553264-55vm5\" (UID: \"7e1c5f90-fd1b-4043-8052-f79cc0531a9b\") " pod="openshift-infra/auto-csr-approver-29553264-55vm5" Mar 11 02:24:00 crc kubenswrapper[4744]: I0311 02:24:00.391170 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cbhnf\" (UniqueName: \"kubernetes.io/projected/7e1c5f90-fd1b-4043-8052-f79cc0531a9b-kube-api-access-cbhnf\") pod \"auto-csr-approver-29553264-55vm5\" (UID: \"7e1c5f90-fd1b-4043-8052-f79cc0531a9b\") " pod="openshift-infra/auto-csr-approver-29553264-55vm5" Mar 11 02:24:00 crc kubenswrapper[4744]: I0311 02:24:00.422988 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cbhnf\" (UniqueName: \"kubernetes.io/projected/7e1c5f90-fd1b-4043-8052-f79cc0531a9b-kube-api-access-cbhnf\") pod \"auto-csr-approver-29553264-55vm5\" (UID: \"7e1c5f90-fd1b-4043-8052-f79cc0531a9b\") " 
pod="openshift-infra/auto-csr-approver-29553264-55vm5" Mar 11 02:24:00 crc kubenswrapper[4744]: I0311 02:24:00.519500 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29553264-55vm5" Mar 11 02:24:00 crc kubenswrapper[4744]: I0311 02:24:00.798436 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29553264-55vm5"] Mar 11 02:24:01 crc kubenswrapper[4744]: I0311 02:24:01.773936 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29553264-55vm5" event={"ID":"7e1c5f90-fd1b-4043-8052-f79cc0531a9b","Type":"ContainerStarted","Data":"91089c1ab49ca398da3ef081129c1ff92b150c6b51485014c8a1b1d722e779dc"} Mar 11 02:24:02 crc kubenswrapper[4744]: I0311 02:24:02.785398 4744 generic.go:334] "Generic (PLEG): container finished" podID="7e1c5f90-fd1b-4043-8052-f79cc0531a9b" containerID="3897da90f7fff9a2b982be7ee015b74df608377bc2da88fc66e46b5c84cb5f26" exitCode=0 Mar 11 02:24:02 crc kubenswrapper[4744]: I0311 02:24:02.785472 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29553264-55vm5" event={"ID":"7e1c5f90-fd1b-4043-8052-f79cc0531a9b","Type":"ContainerDied","Data":"3897da90f7fff9a2b982be7ee015b74df608377bc2da88fc66e46b5c84cb5f26"} Mar 11 02:24:04 crc kubenswrapper[4744]: I0311 02:24:04.144556 4744 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29553264-55vm5" Mar 11 02:24:04 crc kubenswrapper[4744]: I0311 02:24:04.259484 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cbhnf\" (UniqueName: \"kubernetes.io/projected/7e1c5f90-fd1b-4043-8052-f79cc0531a9b-kube-api-access-cbhnf\") pod \"7e1c5f90-fd1b-4043-8052-f79cc0531a9b\" (UID: \"7e1c5f90-fd1b-4043-8052-f79cc0531a9b\") " Mar 11 02:24:04 crc kubenswrapper[4744]: I0311 02:24:04.269188 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7e1c5f90-fd1b-4043-8052-f79cc0531a9b-kube-api-access-cbhnf" (OuterVolumeSpecName: "kube-api-access-cbhnf") pod "7e1c5f90-fd1b-4043-8052-f79cc0531a9b" (UID: "7e1c5f90-fd1b-4043-8052-f79cc0531a9b"). InnerVolumeSpecName "kube-api-access-cbhnf". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 02:24:04 crc kubenswrapper[4744]: I0311 02:24:04.362081 4744 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cbhnf\" (UniqueName: \"kubernetes.io/projected/7e1c5f90-fd1b-4043-8052-f79cc0531a9b-kube-api-access-cbhnf\") on node \"crc\" DevicePath \"\"" Mar 11 02:24:04 crc kubenswrapper[4744]: I0311 02:24:04.808422 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29553264-55vm5" event={"ID":"7e1c5f90-fd1b-4043-8052-f79cc0531a9b","Type":"ContainerDied","Data":"91089c1ab49ca398da3ef081129c1ff92b150c6b51485014c8a1b1d722e779dc"} Mar 11 02:24:04 crc kubenswrapper[4744]: I0311 02:24:04.808487 4744 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="91089c1ab49ca398da3ef081129c1ff92b150c6b51485014c8a1b1d722e779dc" Mar 11 02:24:04 crc kubenswrapper[4744]: I0311 02:24:04.808557 4744 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29553264-55vm5" Mar 11 02:24:05 crc kubenswrapper[4744]: I0311 02:24:05.247127 4744 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29553258-fkvdg"] Mar 11 02:24:05 crc kubenswrapper[4744]: I0311 02:24:05.257284 4744 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29553258-fkvdg"] Mar 11 02:24:05 crc kubenswrapper[4744]: I0311 02:24:05.991644 4744 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d6ca5b5c-1076-451e-965c-62cf0ba58592" path="/var/lib/kubelet/pods/d6ca5b5c-1076-451e-965c-62cf0ba58592/volumes" Mar 11 02:24:16 crc kubenswrapper[4744]: I0311 02:24:16.331239 4744 scope.go:117] "RemoveContainer" containerID="24928f6076d279cd331d498bcbc05cd86f603a9a110faedb11c70c611144b9b5" Mar 11 02:24:28 crc kubenswrapper[4744]: I0311 02:24:28.240481 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/mariadb-copy-data"] Mar 11 02:24:28 crc kubenswrapper[4744]: E0311 02:24:28.241425 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7e1c5f90-fd1b-4043-8052-f79cc0531a9b" containerName="oc" Mar 11 02:24:28 crc kubenswrapper[4744]: I0311 02:24:28.241442 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="7e1c5f90-fd1b-4043-8052-f79cc0531a9b" containerName="oc" Mar 11 02:24:28 crc kubenswrapper[4744]: I0311 02:24:28.241669 4744 memory_manager.go:354] "RemoveStaleState removing state" podUID="7e1c5f90-fd1b-4043-8052-f79cc0531a9b" containerName="oc" Mar 11 02:24:28 crc kubenswrapper[4744]: I0311 02:24:28.242295 4744 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/mariadb-copy-data" Mar 11 02:24:28 crc kubenswrapper[4744]: I0311 02:24:28.245776 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"default-dockercfg-cz2v8" Mar 11 02:24:28 crc kubenswrapper[4744]: I0311 02:24:28.254140 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-copy-data"] Mar 11 02:24:28 crc kubenswrapper[4744]: I0311 02:24:28.944747 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-56b3cdd9-2c76-4819-ad58-add91bcfec06\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-56b3cdd9-2c76-4819-ad58-add91bcfec06\") pod \"mariadb-copy-data\" (UID: \"57813bc2-80d5-486e-8258-32b184f74ed6\") " pod="openstack/mariadb-copy-data" Mar 11 02:24:28 crc kubenswrapper[4744]: I0311 02:24:28.945480 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gm2xx\" (UniqueName: \"kubernetes.io/projected/57813bc2-80d5-486e-8258-32b184f74ed6-kube-api-access-gm2xx\") pod \"mariadb-copy-data\" (UID: \"57813bc2-80d5-486e-8258-32b184f74ed6\") " pod="openstack/mariadb-copy-data" Mar 11 02:24:29 crc kubenswrapper[4744]: I0311 02:24:29.046608 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-56b3cdd9-2c76-4819-ad58-add91bcfec06\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-56b3cdd9-2c76-4819-ad58-add91bcfec06\") pod \"mariadb-copy-data\" (UID: \"57813bc2-80d5-486e-8258-32b184f74ed6\") " pod="openstack/mariadb-copy-data" Mar 11 02:24:29 crc kubenswrapper[4744]: I0311 02:24:29.046641 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gm2xx\" (UniqueName: \"kubernetes.io/projected/57813bc2-80d5-486e-8258-32b184f74ed6-kube-api-access-gm2xx\") pod \"mariadb-copy-data\" (UID: \"57813bc2-80d5-486e-8258-32b184f74ed6\") " pod="openstack/mariadb-copy-data" 
Mar 11 02:24:29 crc kubenswrapper[4744]: I0311 02:24:29.049857 4744 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Mar 11 02:24:29 crc kubenswrapper[4744]: I0311 02:24:29.049999 4744 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-56b3cdd9-2c76-4819-ad58-add91bcfec06\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-56b3cdd9-2c76-4819-ad58-add91bcfec06\") pod \"mariadb-copy-data\" (UID: \"57813bc2-80d5-486e-8258-32b184f74ed6\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/7f7cab39913e071a22e64bd074da18c9fc0b6ab044c7670d003aa9649cbda89a/globalmount\"" pod="openstack/mariadb-copy-data" Mar 11 02:24:29 crc kubenswrapper[4744]: I0311 02:24:29.068563 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gm2xx\" (UniqueName: \"kubernetes.io/projected/57813bc2-80d5-486e-8258-32b184f74ed6-kube-api-access-gm2xx\") pod \"mariadb-copy-data\" (UID: \"57813bc2-80d5-486e-8258-32b184f74ed6\") " pod="openstack/mariadb-copy-data" Mar 11 02:24:29 crc kubenswrapper[4744]: I0311 02:24:29.086986 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-56b3cdd9-2c76-4819-ad58-add91bcfec06\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-56b3cdd9-2c76-4819-ad58-add91bcfec06\") pod \"mariadb-copy-data\" (UID: \"57813bc2-80d5-486e-8258-32b184f74ed6\") " pod="openstack/mariadb-copy-data" Mar 11 02:24:29 crc kubenswrapper[4744]: I0311 02:24:29.162144 4744 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/mariadb-copy-data" Mar 11 02:24:29 crc kubenswrapper[4744]: I0311 02:24:29.508219 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-copy-data"] Mar 11 02:24:30 crc kubenswrapper[4744]: I0311 02:24:30.053611 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-copy-data" event={"ID":"57813bc2-80d5-486e-8258-32b184f74ed6","Type":"ContainerStarted","Data":"3541fd61118dd1fbcf20e415c319cad4ab3eb19895413b0af4c14d66be2ed9c3"} Mar 11 02:24:30 crc kubenswrapper[4744]: I0311 02:24:30.053913 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-copy-data" event={"ID":"57813bc2-80d5-486e-8258-32b184f74ed6","Type":"ContainerStarted","Data":"4a49dea885555d0c1ffbd8fe79c93e8abea4615d90b9862d038c9728ed651662"} Mar 11 02:24:30 crc kubenswrapper[4744]: I0311 02:24:30.082154 4744 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/mariadb-copy-data" podStartSLOduration=3.082133035 podStartE2EDuration="3.082133035s" podCreationTimestamp="2026-03-11 02:24:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-11 02:24:30.080735803 +0000 UTC m=+5426.884953448" watchObservedRunningTime="2026-03-11 02:24:30.082133035 +0000 UTC m=+5426.886350650" Mar 11 02:24:33 crc kubenswrapper[4744]: I0311 02:24:33.267048 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/mariadb-client"] Mar 11 02:24:33 crc kubenswrapper[4744]: I0311 02:24:33.270312 4744 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/mariadb-client" Mar 11 02:24:33 crc kubenswrapper[4744]: I0311 02:24:33.277387 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-client"] Mar 11 02:24:33 crc kubenswrapper[4744]: I0311 02:24:33.428129 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t6bdm\" (UniqueName: \"kubernetes.io/projected/2beee4a1-471b-46c2-960a-3e9f8df4cfd1-kube-api-access-t6bdm\") pod \"mariadb-client\" (UID: \"2beee4a1-471b-46c2-960a-3e9f8df4cfd1\") " pod="openstack/mariadb-client" Mar 11 02:24:33 crc kubenswrapper[4744]: I0311 02:24:33.530143 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t6bdm\" (UniqueName: \"kubernetes.io/projected/2beee4a1-471b-46c2-960a-3e9f8df4cfd1-kube-api-access-t6bdm\") pod \"mariadb-client\" (UID: \"2beee4a1-471b-46c2-960a-3e9f8df4cfd1\") " pod="openstack/mariadb-client" Mar 11 02:24:33 crc kubenswrapper[4744]: I0311 02:24:33.559664 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t6bdm\" (UniqueName: \"kubernetes.io/projected/2beee4a1-471b-46c2-960a-3e9f8df4cfd1-kube-api-access-t6bdm\") pod \"mariadb-client\" (UID: \"2beee4a1-471b-46c2-960a-3e9f8df4cfd1\") " pod="openstack/mariadb-client" Mar 11 02:24:33 crc kubenswrapper[4744]: I0311 02:24:33.597036 4744 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/mariadb-client" Mar 11 02:24:34 crc kubenswrapper[4744]: I0311 02:24:34.097323 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-client"] Mar 11 02:24:34 crc kubenswrapper[4744]: W0311 02:24:34.097628 4744 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2beee4a1_471b_46c2_960a_3e9f8df4cfd1.slice/crio-e57362cd414d0b1d4c1a5ef582efbe6db1448c4d82acc46ea24369d976f9d2fd WatchSource:0}: Error finding container e57362cd414d0b1d4c1a5ef582efbe6db1448c4d82acc46ea24369d976f9d2fd: Status 404 returned error can't find the container with id e57362cd414d0b1d4c1a5ef582efbe6db1448c4d82acc46ea24369d976f9d2fd Mar 11 02:24:35 crc kubenswrapper[4744]: I0311 02:24:35.103153 4744 generic.go:334] "Generic (PLEG): container finished" podID="2beee4a1-471b-46c2-960a-3e9f8df4cfd1" containerID="e00093d931af12eea3fd89b3a8fd324d5304ee7099841b353d9ed35509df9a39" exitCode=0 Mar 11 02:24:35 crc kubenswrapper[4744]: I0311 02:24:35.103284 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client" event={"ID":"2beee4a1-471b-46c2-960a-3e9f8df4cfd1","Type":"ContainerDied","Data":"e00093d931af12eea3fd89b3a8fd324d5304ee7099841b353d9ed35509df9a39"} Mar 11 02:24:35 crc kubenswrapper[4744]: I0311 02:24:35.103551 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client" event={"ID":"2beee4a1-471b-46c2-960a-3e9f8df4cfd1","Type":"ContainerStarted","Data":"e57362cd414d0b1d4c1a5ef582efbe6db1448c4d82acc46ea24369d976f9d2fd"} Mar 11 02:24:36 crc kubenswrapper[4744]: I0311 02:24:36.488108 4744 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/mariadb-client" Mar 11 02:24:36 crc kubenswrapper[4744]: I0311 02:24:36.512458 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_mariadb-client_2beee4a1-471b-46c2-960a-3e9f8df4cfd1/mariadb-client/0.log" Mar 11 02:24:36 crc kubenswrapper[4744]: I0311 02:24:36.540769 4744 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/mariadb-client"] Mar 11 02:24:36 crc kubenswrapper[4744]: I0311 02:24:36.546097 4744 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/mariadb-client"] Mar 11 02:24:36 crc kubenswrapper[4744]: I0311 02:24:36.596275 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-t6bdm\" (UniqueName: \"kubernetes.io/projected/2beee4a1-471b-46c2-960a-3e9f8df4cfd1-kube-api-access-t6bdm\") pod \"2beee4a1-471b-46c2-960a-3e9f8df4cfd1\" (UID: \"2beee4a1-471b-46c2-960a-3e9f8df4cfd1\") " Mar 11 02:24:36 crc kubenswrapper[4744]: I0311 02:24:36.602030 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2beee4a1-471b-46c2-960a-3e9f8df4cfd1-kube-api-access-t6bdm" (OuterVolumeSpecName: "kube-api-access-t6bdm") pod "2beee4a1-471b-46c2-960a-3e9f8df4cfd1" (UID: "2beee4a1-471b-46c2-960a-3e9f8df4cfd1"). InnerVolumeSpecName "kube-api-access-t6bdm". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 02:24:36 crc kubenswrapper[4744]: I0311 02:24:36.697933 4744 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-t6bdm\" (UniqueName: \"kubernetes.io/projected/2beee4a1-471b-46c2-960a-3e9f8df4cfd1-kube-api-access-t6bdm\") on node \"crc\" DevicePath \"\"" Mar 11 02:24:36 crc kubenswrapper[4744]: I0311 02:24:36.703346 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/mariadb-client"] Mar 11 02:24:36 crc kubenswrapper[4744]: E0311 02:24:36.703720 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2beee4a1-471b-46c2-960a-3e9f8df4cfd1" containerName="mariadb-client" Mar 11 02:24:36 crc kubenswrapper[4744]: I0311 02:24:36.703742 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="2beee4a1-471b-46c2-960a-3e9f8df4cfd1" containerName="mariadb-client" Mar 11 02:24:36 crc kubenswrapper[4744]: I0311 02:24:36.703934 4744 memory_manager.go:354] "RemoveStaleState removing state" podUID="2beee4a1-471b-46c2-960a-3e9f8df4cfd1" containerName="mariadb-client" Mar 11 02:24:36 crc kubenswrapper[4744]: I0311 02:24:36.704465 4744 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/mariadb-client" Mar 11 02:24:36 crc kubenswrapper[4744]: I0311 02:24:36.720545 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-client"] Mar 11 02:24:36 crc kubenswrapper[4744]: I0311 02:24:36.799416 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dv8vj\" (UniqueName: \"kubernetes.io/projected/d036d416-56ea-41c8-9b75-d8d97f31c493-kube-api-access-dv8vj\") pod \"mariadb-client\" (UID: \"d036d416-56ea-41c8-9b75-d8d97f31c493\") " pod="openstack/mariadb-client" Mar 11 02:24:36 crc kubenswrapper[4744]: I0311 02:24:36.900825 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dv8vj\" (UniqueName: \"kubernetes.io/projected/d036d416-56ea-41c8-9b75-d8d97f31c493-kube-api-access-dv8vj\") pod \"mariadb-client\" (UID: \"d036d416-56ea-41c8-9b75-d8d97f31c493\") " pod="openstack/mariadb-client" Mar 11 02:24:36 crc kubenswrapper[4744]: I0311 02:24:36.926204 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dv8vj\" (UniqueName: \"kubernetes.io/projected/d036d416-56ea-41c8-9b75-d8d97f31c493-kube-api-access-dv8vj\") pod \"mariadb-client\" (UID: \"d036d416-56ea-41c8-9b75-d8d97f31c493\") " pod="openstack/mariadb-client" Mar 11 02:24:37 crc kubenswrapper[4744]: I0311 02:24:37.032169 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-client" Mar 11 02:24:37 crc kubenswrapper[4744]: I0311 02:24:37.125002 4744 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e57362cd414d0b1d4c1a5ef582efbe6db1448c4d82acc46ea24369d976f9d2fd" Mar 11 02:24:37 crc kubenswrapper[4744]: I0311 02:24:37.125070 4744 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/mariadb-client" Mar 11 02:24:37 crc kubenswrapper[4744]: I0311 02:24:37.152479 4744 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openstack/mariadb-client" oldPodUID="2beee4a1-471b-46c2-960a-3e9f8df4cfd1" podUID="d036d416-56ea-41c8-9b75-d8d97f31c493" Mar 11 02:24:37 crc kubenswrapper[4744]: I0311 02:24:37.352201 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-client"] Mar 11 02:24:37 crc kubenswrapper[4744]: W0311 02:24:37.357185 4744 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd036d416_56ea_41c8_9b75_d8d97f31c493.slice/crio-b5751f45dd4ae1321c4d017fb332072050ab9e960fda4cad8e7e7dcd46f618dc WatchSource:0}: Error finding container b5751f45dd4ae1321c4d017fb332072050ab9e960fda4cad8e7e7dcd46f618dc: Status 404 returned error can't find the container with id b5751f45dd4ae1321c4d017fb332072050ab9e960fda4cad8e7e7dcd46f618dc Mar 11 02:24:37 crc kubenswrapper[4744]: I0311 02:24:37.986335 4744 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2beee4a1-471b-46c2-960a-3e9f8df4cfd1" path="/var/lib/kubelet/pods/2beee4a1-471b-46c2-960a-3e9f8df4cfd1/volumes" Mar 11 02:24:38 crc kubenswrapper[4744]: I0311 02:24:38.136896 4744 generic.go:334] "Generic (PLEG): container finished" podID="d036d416-56ea-41c8-9b75-d8d97f31c493" containerID="d4e208856c71c2bb744e4a3393ad7bc0021795c1ee347a328e95d534bd8ae6a5" exitCode=0 Mar 11 02:24:38 crc kubenswrapper[4744]: I0311 02:24:38.136959 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client" event={"ID":"d036d416-56ea-41c8-9b75-d8d97f31c493","Type":"ContainerDied","Data":"d4e208856c71c2bb744e4a3393ad7bc0021795c1ee347a328e95d534bd8ae6a5"} Mar 11 02:24:38 crc kubenswrapper[4744]: I0311 02:24:38.137038 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client" 
event={"ID":"d036d416-56ea-41c8-9b75-d8d97f31c493","Type":"ContainerStarted","Data":"b5751f45dd4ae1321c4d017fb332072050ab9e960fda4cad8e7e7dcd46f618dc"} Mar 11 02:24:39 crc kubenswrapper[4744]: I0311 02:24:39.525059 4744 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-client" Mar 11 02:24:39 crc kubenswrapper[4744]: I0311 02:24:39.548615 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_mariadb-client_d036d416-56ea-41c8-9b75-d8d97f31c493/mariadb-client/0.log" Mar 11 02:24:39 crc kubenswrapper[4744]: I0311 02:24:39.586664 4744 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/mariadb-client"] Mar 11 02:24:39 crc kubenswrapper[4744]: I0311 02:24:39.597491 4744 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/mariadb-client"] Mar 11 02:24:39 crc kubenswrapper[4744]: I0311 02:24:39.650911 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dv8vj\" (UniqueName: \"kubernetes.io/projected/d036d416-56ea-41c8-9b75-d8d97f31c493-kube-api-access-dv8vj\") pod \"d036d416-56ea-41c8-9b75-d8d97f31c493\" (UID: \"d036d416-56ea-41c8-9b75-d8d97f31c493\") " Mar 11 02:24:39 crc kubenswrapper[4744]: I0311 02:24:39.665314 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d036d416-56ea-41c8-9b75-d8d97f31c493-kube-api-access-dv8vj" (OuterVolumeSpecName: "kube-api-access-dv8vj") pod "d036d416-56ea-41c8-9b75-d8d97f31c493" (UID: "d036d416-56ea-41c8-9b75-d8d97f31c493"). InnerVolumeSpecName "kube-api-access-dv8vj". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 02:24:39 crc kubenswrapper[4744]: I0311 02:24:39.754221 4744 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dv8vj\" (UniqueName: \"kubernetes.io/projected/d036d416-56ea-41c8-9b75-d8d97f31c493-kube-api-access-dv8vj\") on node \"crc\" DevicePath \"\"" Mar 11 02:24:39 crc kubenswrapper[4744]: I0311 02:24:39.987158 4744 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d036d416-56ea-41c8-9b75-d8d97f31c493" path="/var/lib/kubelet/pods/d036d416-56ea-41c8-9b75-d8d97f31c493/volumes" Mar 11 02:24:40 crc kubenswrapper[4744]: I0311 02:24:40.160113 4744 scope.go:117] "RemoveContainer" containerID="d4e208856c71c2bb744e4a3393ad7bc0021795c1ee347a328e95d534bd8ae6a5" Mar 11 02:24:40 crc kubenswrapper[4744]: I0311 02:24:40.160161 4744 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-client" Mar 11 02:25:11 crc kubenswrapper[4744]: I0311 02:25:11.778028 4744 prober.go:107] "Probe failed" probeType="Liveness" pod="metallb-system/frr-k8s-65fd2" podUID="21573ca2-d902-4d30-b94a-7b5ae891e084" containerName="frr" probeResult="failure" output="Get \"http://127.0.0.1:7573/livez\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 11 02:25:13 crc kubenswrapper[4744]: I0311 02:25:13.168288 4744 patch_prober.go:28] interesting pod/machine-config-daemon-678nx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 11 02:25:13 crc kubenswrapper[4744]: I0311 02:25:13.168571 4744 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-678nx" podUID="a15dc7ac-7c34-4135-b6eb-a85122800ce9" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": 
dial tcp 127.0.0.1:8798: connect: connection refused" Mar 11 02:25:13 crc kubenswrapper[4744]: I0311 02:25:13.367081 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-kwr6h"] Mar 11 02:25:13 crc kubenswrapper[4744]: E0311 02:25:13.367558 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d036d416-56ea-41c8-9b75-d8d97f31c493" containerName="mariadb-client" Mar 11 02:25:13 crc kubenswrapper[4744]: I0311 02:25:13.367583 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="d036d416-56ea-41c8-9b75-d8d97f31c493" containerName="mariadb-client" Mar 11 02:25:13 crc kubenswrapper[4744]: I0311 02:25:13.367861 4744 memory_manager.go:354] "RemoveStaleState removing state" podUID="d036d416-56ea-41c8-9b75-d8d97f31c493" containerName="mariadb-client" Mar 11 02:25:13 crc kubenswrapper[4744]: I0311 02:25:13.371835 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-kwr6h" Mar 11 02:25:13 crc kubenswrapper[4744]: I0311 02:25:13.399585 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-kwr6h"] Mar 11 02:25:13 crc kubenswrapper[4744]: I0311 02:25:13.413459 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8bf686cc-4b5a-4ff3-a1ed-32f1520cbf80-catalog-content\") pod \"redhat-marketplace-kwr6h\" (UID: \"8bf686cc-4b5a-4ff3-a1ed-32f1520cbf80\") " pod="openshift-marketplace/redhat-marketplace-kwr6h" Mar 11 02:25:13 crc kubenswrapper[4744]: I0311 02:25:13.413613 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hjmbx\" (UniqueName: \"kubernetes.io/projected/8bf686cc-4b5a-4ff3-a1ed-32f1520cbf80-kube-api-access-hjmbx\") pod \"redhat-marketplace-kwr6h\" (UID: \"8bf686cc-4b5a-4ff3-a1ed-32f1520cbf80\") " 
pod="openshift-marketplace/redhat-marketplace-kwr6h" Mar 11 02:25:13 crc kubenswrapper[4744]: I0311 02:25:13.413665 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8bf686cc-4b5a-4ff3-a1ed-32f1520cbf80-utilities\") pod \"redhat-marketplace-kwr6h\" (UID: \"8bf686cc-4b5a-4ff3-a1ed-32f1520cbf80\") " pod="openshift-marketplace/redhat-marketplace-kwr6h" Mar 11 02:25:13 crc kubenswrapper[4744]: I0311 02:25:13.515330 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8bf686cc-4b5a-4ff3-a1ed-32f1520cbf80-catalog-content\") pod \"redhat-marketplace-kwr6h\" (UID: \"8bf686cc-4b5a-4ff3-a1ed-32f1520cbf80\") " pod="openshift-marketplace/redhat-marketplace-kwr6h" Mar 11 02:25:13 crc kubenswrapper[4744]: I0311 02:25:13.515660 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hjmbx\" (UniqueName: \"kubernetes.io/projected/8bf686cc-4b5a-4ff3-a1ed-32f1520cbf80-kube-api-access-hjmbx\") pod \"redhat-marketplace-kwr6h\" (UID: \"8bf686cc-4b5a-4ff3-a1ed-32f1520cbf80\") " pod="openshift-marketplace/redhat-marketplace-kwr6h" Mar 11 02:25:13 crc kubenswrapper[4744]: I0311 02:25:13.515692 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8bf686cc-4b5a-4ff3-a1ed-32f1520cbf80-utilities\") pod \"redhat-marketplace-kwr6h\" (UID: \"8bf686cc-4b5a-4ff3-a1ed-32f1520cbf80\") " pod="openshift-marketplace/redhat-marketplace-kwr6h" Mar 11 02:25:13 crc kubenswrapper[4744]: I0311 02:25:13.515851 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8bf686cc-4b5a-4ff3-a1ed-32f1520cbf80-catalog-content\") pod \"redhat-marketplace-kwr6h\" (UID: \"8bf686cc-4b5a-4ff3-a1ed-32f1520cbf80\") " 
pod="openshift-marketplace/redhat-marketplace-kwr6h" Mar 11 02:25:13 crc kubenswrapper[4744]: I0311 02:25:13.515954 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8bf686cc-4b5a-4ff3-a1ed-32f1520cbf80-utilities\") pod \"redhat-marketplace-kwr6h\" (UID: \"8bf686cc-4b5a-4ff3-a1ed-32f1520cbf80\") " pod="openshift-marketplace/redhat-marketplace-kwr6h" Mar 11 02:25:13 crc kubenswrapper[4744]: I0311 02:25:13.534297 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hjmbx\" (UniqueName: \"kubernetes.io/projected/8bf686cc-4b5a-4ff3-a1ed-32f1520cbf80-kube-api-access-hjmbx\") pod \"redhat-marketplace-kwr6h\" (UID: \"8bf686cc-4b5a-4ff3-a1ed-32f1520cbf80\") " pod="openshift-marketplace/redhat-marketplace-kwr6h" Mar 11 02:25:13 crc kubenswrapper[4744]: I0311 02:25:13.715228 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-kwr6h" Mar 11 02:25:14 crc kubenswrapper[4744]: I0311 02:25:14.380692 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-kwr6h"] Mar 11 02:25:15 crc kubenswrapper[4744]: I0311 02:25:15.256693 4744 generic.go:334] "Generic (PLEG): container finished" podID="8bf686cc-4b5a-4ff3-a1ed-32f1520cbf80" containerID="34b4755a23c77de86dc7fc7905fa5314caa01f7eabaf013f0a8ed424523e2a83" exitCode=0 Mar 11 02:25:15 crc kubenswrapper[4744]: I0311 02:25:15.256817 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-kwr6h" event={"ID":"8bf686cc-4b5a-4ff3-a1ed-32f1520cbf80","Type":"ContainerDied","Data":"34b4755a23c77de86dc7fc7905fa5314caa01f7eabaf013f0a8ed424523e2a83"} Mar 11 02:25:15 crc kubenswrapper[4744]: I0311 02:25:15.257503 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-kwr6h" 
event={"ID":"8bf686cc-4b5a-4ff3-a1ed-32f1520cbf80","Type":"ContainerStarted","Data":"19796aa5c22cd014a94700b9f240f679c14288adb8fda06c91139f729c633ee3"} Mar 11 02:25:15 crc kubenswrapper[4744]: I0311 02:25:15.259266 4744 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 11 02:25:16 crc kubenswrapper[4744]: I0311 02:25:16.273235 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-kwr6h" event={"ID":"8bf686cc-4b5a-4ff3-a1ed-32f1520cbf80","Type":"ContainerStarted","Data":"65c01ccab4e886c72c8141100f9793603557fab2d48b3854eb1f82669f4a575a"} Mar 11 02:25:16 crc kubenswrapper[4744]: I0311 02:25:16.430927 4744 scope.go:117] "RemoveContainer" containerID="5770b245a42de80a839f71f12517e543cd28cf787dba4568bd88ef87c3f896d8" Mar 11 02:25:17 crc kubenswrapper[4744]: I0311 02:25:17.288836 4744 generic.go:334] "Generic (PLEG): container finished" podID="8bf686cc-4b5a-4ff3-a1ed-32f1520cbf80" containerID="65c01ccab4e886c72c8141100f9793603557fab2d48b3854eb1f82669f4a575a" exitCode=0 Mar 11 02:25:17 crc kubenswrapper[4744]: I0311 02:25:17.288903 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-kwr6h" event={"ID":"8bf686cc-4b5a-4ff3-a1ed-32f1520cbf80","Type":"ContainerDied","Data":"65c01ccab4e886c72c8141100f9793603557fab2d48b3854eb1f82669f4a575a"} Mar 11 02:25:18 crc kubenswrapper[4744]: I0311 02:25:18.301378 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-kwr6h" event={"ID":"8bf686cc-4b5a-4ff3-a1ed-32f1520cbf80","Type":"ContainerStarted","Data":"4952f6bd0f94a82bdde664b23670bad982b6e617d556cdff2471d4487d198832"} Mar 11 02:25:18 crc kubenswrapper[4744]: I0311 02:25:18.337875 4744 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-kwr6h" podStartSLOduration=2.916720898 podStartE2EDuration="5.337838911s" 
podCreationTimestamp="2026-03-11 02:25:13 +0000 UTC" firstStartedPulling="2026-03-11 02:25:15.258851537 +0000 UTC m=+5472.063069182" lastFinishedPulling="2026-03-11 02:25:17.67996955 +0000 UTC m=+5474.484187195" observedRunningTime="2026-03-11 02:25:18.330882947 +0000 UTC m=+5475.135100592" watchObservedRunningTime="2026-03-11 02:25:18.337838911 +0000 UTC m=+5475.142056556" Mar 11 02:25:19 crc kubenswrapper[4744]: I0311 02:25:19.908903 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-sb-0"] Mar 11 02:25:19 crc kubenswrapper[4744]: I0311 02:25:19.911144 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-sb-0" Mar 11 02:25:19 crc kubenswrapper[4744]: I0311 02:25:19.917588 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-sb-config" Mar 11 02:25:19 crc kubenswrapper[4744]: I0311 02:25:19.917639 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncluster-ovndbcluster-sb-dockercfg-m4dsq" Mar 11 02:25:19 crc kubenswrapper[4744]: I0311 02:25:19.917781 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovn-metrics" Mar 11 02:25:19 crc kubenswrapper[4744]: I0311 02:25:19.918272 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovndbcluster-sb-ovndbs" Mar 11 02:25:19 crc kubenswrapper[4744]: I0311 02:25:19.920997 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-sb-scripts" Mar 11 02:25:19 crc kubenswrapper[4744]: I0311 02:25:19.923845 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-sb-2"] Mar 11 02:25:19 crc kubenswrapper[4744]: I0311 02:25:19.925922 4744 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovsdbserver-sb-2" Mar 11 02:25:19 crc kubenswrapper[4744]: I0311 02:25:19.931317 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-0"] Mar 11 02:25:19 crc kubenswrapper[4744]: I0311 02:25:19.937595 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-sb-1"] Mar 11 02:25:19 crc kubenswrapper[4744]: I0311 02:25:19.938972 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-sb-1" Mar 11 02:25:19 crc kubenswrapper[4744]: I0311 02:25:19.952218 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-2"] Mar 11 02:25:19 crc kubenswrapper[4744]: I0311 02:25:19.961590 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-1"] Mar 11 02:25:20 crc kubenswrapper[4744]: I0311 02:25:20.045201 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-45b9c915-1721-44a4-8839-8dfba6c701ea\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-45b9c915-1721-44a4-8839-8dfba6c701ea\") pod \"ovsdbserver-sb-1\" (UID: \"371fb375-551c-4443-8f31-5e952e527f4d\") " pod="openstack/ovsdbserver-sb-1" Mar 11 02:25:20 crc kubenswrapper[4744]: I0311 02:25:20.045249 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/371fb375-551c-4443-8f31-5e952e527f4d-config\") pod \"ovsdbserver-sb-1\" (UID: \"371fb375-551c-4443-8f31-5e952e527f4d\") " pod="openstack/ovsdbserver-sb-1" Mar 11 02:25:20 crc kubenswrapper[4744]: I0311 02:25:20.045275 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-44fm5\" (UniqueName: \"kubernetes.io/projected/371fb375-551c-4443-8f31-5e952e527f4d-kube-api-access-44fm5\") pod \"ovsdbserver-sb-1\" (UID: \"371fb375-551c-4443-8f31-5e952e527f4d\") " 
pod="openstack/ovsdbserver-sb-1" Mar 11 02:25:20 crc kubenswrapper[4744]: I0311 02:25:20.045302 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e9cc891c-dd1f-4f59-aa6f-9b51f8ae9013-combined-ca-bundle\") pod \"ovsdbserver-sb-2\" (UID: \"e9cc891c-dd1f-4f59-aa6f-9b51f8ae9013\") " pod="openstack/ovsdbserver-sb-2" Mar 11 02:25:20 crc kubenswrapper[4744]: I0311 02:25:20.045328 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/371fb375-551c-4443-8f31-5e952e527f4d-scripts\") pod \"ovsdbserver-sb-1\" (UID: \"371fb375-551c-4443-8f31-5e952e527f4d\") " pod="openstack/ovsdbserver-sb-1" Mar 11 02:25:20 crc kubenswrapper[4744]: I0311 02:25:20.045360 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lt2bz\" (UniqueName: \"kubernetes.io/projected/e9cc891c-dd1f-4f59-aa6f-9b51f8ae9013-kube-api-access-lt2bz\") pod \"ovsdbserver-sb-2\" (UID: \"e9cc891c-dd1f-4f59-aa6f-9b51f8ae9013\") " pod="openstack/ovsdbserver-sb-2" Mar 11 02:25:20 crc kubenswrapper[4744]: I0311 02:25:20.045380 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/e9cc891c-dd1f-4f59-aa6f-9b51f8ae9013-scripts\") pod \"ovsdbserver-sb-2\" (UID: \"e9cc891c-dd1f-4f59-aa6f-9b51f8ae9013\") " pod="openstack/ovsdbserver-sb-2" Mar 11 02:25:20 crc kubenswrapper[4744]: I0311 02:25:20.045401 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d5efa1d3-6643-4d3e-a06b-2051d5f1664a-config\") pod \"ovsdbserver-sb-0\" (UID: \"d5efa1d3-6643-4d3e-a06b-2051d5f1664a\") " pod="openstack/ovsdbserver-sb-0" Mar 11 02:25:20 crc kubenswrapper[4744]: I0311 02:25:20.045425 4744 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/371fb375-551c-4443-8f31-5e952e527f4d-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-1\" (UID: \"371fb375-551c-4443-8f31-5e952e527f4d\") " pod="openstack/ovsdbserver-sb-1" Mar 11 02:25:20 crc kubenswrapper[4744]: I0311 02:25:20.045449 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/371fb375-551c-4443-8f31-5e952e527f4d-ovsdb-rundir\") pod \"ovsdbserver-sb-1\" (UID: \"371fb375-551c-4443-8f31-5e952e527f4d\") " pod="openstack/ovsdbserver-sb-1" Mar 11 02:25:20 crc kubenswrapper[4744]: I0311 02:25:20.045467 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/371fb375-551c-4443-8f31-5e952e527f4d-combined-ca-bundle\") pod \"ovsdbserver-sb-1\" (UID: \"371fb375-551c-4443-8f31-5e952e527f4d\") " pod="openstack/ovsdbserver-sb-1" Mar 11 02:25:20 crc kubenswrapper[4744]: I0311 02:25:20.045502 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/371fb375-551c-4443-8f31-5e952e527f4d-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-1\" (UID: \"371fb375-551c-4443-8f31-5e952e527f4d\") " pod="openstack/ovsdbserver-sb-1" Mar 11 02:25:20 crc kubenswrapper[4744]: I0311 02:25:20.045569 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/d5efa1d3-6643-4d3e-a06b-2051d5f1664a-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"d5efa1d3-6643-4d3e-a06b-2051d5f1664a\") " pod="openstack/ovsdbserver-sb-0" Mar 11 02:25:20 crc kubenswrapper[4744]: I0311 02:25:20.045600 4744 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e9cc891c-dd1f-4f59-aa6f-9b51f8ae9013-config\") pod \"ovsdbserver-sb-2\" (UID: \"e9cc891c-dd1f-4f59-aa6f-9b51f8ae9013\") " pod="openstack/ovsdbserver-sb-2" Mar 11 02:25:20 crc kubenswrapper[4744]: I0311 02:25:20.045658 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lwqhb\" (UniqueName: \"kubernetes.io/projected/d5efa1d3-6643-4d3e-a06b-2051d5f1664a-kube-api-access-lwqhb\") pod \"ovsdbserver-sb-0\" (UID: \"d5efa1d3-6643-4d3e-a06b-2051d5f1664a\") " pod="openstack/ovsdbserver-sb-0" Mar 11 02:25:20 crc kubenswrapper[4744]: I0311 02:25:20.045700 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/e9cc891c-dd1f-4f59-aa6f-9b51f8ae9013-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-2\" (UID: \"e9cc891c-dd1f-4f59-aa6f-9b51f8ae9013\") " pod="openstack/ovsdbserver-sb-2" Mar 11 02:25:20 crc kubenswrapper[4744]: I0311 02:25:20.045719 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/d5efa1d3-6643-4d3e-a06b-2051d5f1664a-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"d5efa1d3-6643-4d3e-a06b-2051d5f1664a\") " pod="openstack/ovsdbserver-sb-0" Mar 11 02:25:20 crc kubenswrapper[4744]: I0311 02:25:20.045738 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/e9cc891c-dd1f-4f59-aa6f-9b51f8ae9013-ovsdb-rundir\") pod \"ovsdbserver-sb-2\" (UID: \"e9cc891c-dd1f-4f59-aa6f-9b51f8ae9013\") " pod="openstack/ovsdbserver-sb-2" Mar 11 02:25:20 crc kubenswrapper[4744]: I0311 02:25:20.045781 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d5efa1d3-6643-4d3e-a06b-2051d5f1664a-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"d5efa1d3-6643-4d3e-a06b-2051d5f1664a\") " pod="openstack/ovsdbserver-sb-0" Mar 11 02:25:20 crc kubenswrapper[4744]: I0311 02:25:20.045830 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/d5efa1d3-6643-4d3e-a06b-2051d5f1664a-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"d5efa1d3-6643-4d3e-a06b-2051d5f1664a\") " pod="openstack/ovsdbserver-sb-0" Mar 11 02:25:20 crc kubenswrapper[4744]: I0311 02:25:20.045853 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-06c05069-bb6d-4780-9d78-979e83ea950d\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-06c05069-bb6d-4780-9d78-979e83ea950d\") pod \"ovsdbserver-sb-2\" (UID: \"e9cc891c-dd1f-4f59-aa6f-9b51f8ae9013\") " pod="openstack/ovsdbserver-sb-2" Mar 11 02:25:20 crc kubenswrapper[4744]: I0311 02:25:20.045877 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-33eb8ecb-2384-4df5-878d-ec11d2edba5b\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-33eb8ecb-2384-4df5-878d-ec11d2edba5b\") pod \"ovsdbserver-sb-0\" (UID: \"d5efa1d3-6643-4d3e-a06b-2051d5f1664a\") " pod="openstack/ovsdbserver-sb-0" Mar 11 02:25:20 crc kubenswrapper[4744]: I0311 02:25:20.045901 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/d5efa1d3-6643-4d3e-a06b-2051d5f1664a-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"d5efa1d3-6643-4d3e-a06b-2051d5f1664a\") " pod="openstack/ovsdbserver-sb-0" Mar 11 02:25:20 crc kubenswrapper[4744]: I0311 02:25:20.045924 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/e9cc891c-dd1f-4f59-aa6f-9b51f8ae9013-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-2\" (UID: \"e9cc891c-dd1f-4f59-aa6f-9b51f8ae9013\") " pod="openstack/ovsdbserver-sb-2" Mar 11 02:25:20 crc kubenswrapper[4744]: I0311 02:25:20.147244 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e9cc891c-dd1f-4f59-aa6f-9b51f8ae9013-combined-ca-bundle\") pod \"ovsdbserver-sb-2\" (UID: \"e9cc891c-dd1f-4f59-aa6f-9b51f8ae9013\") " pod="openstack/ovsdbserver-sb-2" Mar 11 02:25:20 crc kubenswrapper[4744]: I0311 02:25:20.147311 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/371fb375-551c-4443-8f31-5e952e527f4d-scripts\") pod \"ovsdbserver-sb-1\" (UID: \"371fb375-551c-4443-8f31-5e952e527f4d\") " pod="openstack/ovsdbserver-sb-1" Mar 11 02:25:20 crc kubenswrapper[4744]: I0311 02:25:20.147345 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lt2bz\" (UniqueName: \"kubernetes.io/projected/e9cc891c-dd1f-4f59-aa6f-9b51f8ae9013-kube-api-access-lt2bz\") pod \"ovsdbserver-sb-2\" (UID: \"e9cc891c-dd1f-4f59-aa6f-9b51f8ae9013\") " pod="openstack/ovsdbserver-sb-2" Mar 11 02:25:20 crc kubenswrapper[4744]: I0311 02:25:20.147360 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/e9cc891c-dd1f-4f59-aa6f-9b51f8ae9013-scripts\") pod \"ovsdbserver-sb-2\" (UID: \"e9cc891c-dd1f-4f59-aa6f-9b51f8ae9013\") " pod="openstack/ovsdbserver-sb-2" Mar 11 02:25:20 crc kubenswrapper[4744]: I0311 02:25:20.147379 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d5efa1d3-6643-4d3e-a06b-2051d5f1664a-config\") pod \"ovsdbserver-sb-0\" (UID: 
\"d5efa1d3-6643-4d3e-a06b-2051d5f1664a\") " pod="openstack/ovsdbserver-sb-0" Mar 11 02:25:20 crc kubenswrapper[4744]: I0311 02:25:20.147400 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/371fb375-551c-4443-8f31-5e952e527f4d-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-1\" (UID: \"371fb375-551c-4443-8f31-5e952e527f4d\") " pod="openstack/ovsdbserver-sb-1" Mar 11 02:25:20 crc kubenswrapper[4744]: I0311 02:25:20.147419 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/371fb375-551c-4443-8f31-5e952e527f4d-ovsdb-rundir\") pod \"ovsdbserver-sb-1\" (UID: \"371fb375-551c-4443-8f31-5e952e527f4d\") " pod="openstack/ovsdbserver-sb-1" Mar 11 02:25:20 crc kubenswrapper[4744]: I0311 02:25:20.147433 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/371fb375-551c-4443-8f31-5e952e527f4d-combined-ca-bundle\") pod \"ovsdbserver-sb-1\" (UID: \"371fb375-551c-4443-8f31-5e952e527f4d\") " pod="openstack/ovsdbserver-sb-1" Mar 11 02:25:20 crc kubenswrapper[4744]: I0311 02:25:20.148003 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/371fb375-551c-4443-8f31-5e952e527f4d-ovsdb-rundir\") pod \"ovsdbserver-sb-1\" (UID: \"371fb375-551c-4443-8f31-5e952e527f4d\") " pod="openstack/ovsdbserver-sb-1" Mar 11 02:25:20 crc kubenswrapper[4744]: I0311 02:25:20.148309 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/371fb375-551c-4443-8f31-5e952e527f4d-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-1\" (UID: \"371fb375-551c-4443-8f31-5e952e527f4d\") " pod="openstack/ovsdbserver-sb-1" Mar 11 02:25:20 crc kubenswrapper[4744]: I0311 02:25:20.148351 4744 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/d5efa1d3-6643-4d3e-a06b-2051d5f1664a-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"d5efa1d3-6643-4d3e-a06b-2051d5f1664a\") " pod="openstack/ovsdbserver-sb-0" Mar 11 02:25:20 crc kubenswrapper[4744]: I0311 02:25:20.148368 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e9cc891c-dd1f-4f59-aa6f-9b51f8ae9013-config\") pod \"ovsdbserver-sb-2\" (UID: \"e9cc891c-dd1f-4f59-aa6f-9b51f8ae9013\") " pod="openstack/ovsdbserver-sb-2" Mar 11 02:25:20 crc kubenswrapper[4744]: I0311 02:25:20.148390 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lwqhb\" (UniqueName: \"kubernetes.io/projected/d5efa1d3-6643-4d3e-a06b-2051d5f1664a-kube-api-access-lwqhb\") pod \"ovsdbserver-sb-0\" (UID: \"d5efa1d3-6643-4d3e-a06b-2051d5f1664a\") " pod="openstack/ovsdbserver-sb-0" Mar 11 02:25:20 crc kubenswrapper[4744]: I0311 02:25:20.148415 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/e9cc891c-dd1f-4f59-aa6f-9b51f8ae9013-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-2\" (UID: \"e9cc891c-dd1f-4f59-aa6f-9b51f8ae9013\") " pod="openstack/ovsdbserver-sb-2" Mar 11 02:25:20 crc kubenswrapper[4744]: I0311 02:25:20.148435 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/d5efa1d3-6643-4d3e-a06b-2051d5f1664a-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"d5efa1d3-6643-4d3e-a06b-2051d5f1664a\") " pod="openstack/ovsdbserver-sb-0" Mar 11 02:25:20 crc kubenswrapper[4744]: I0311 02:25:20.148450 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: 
\"kubernetes.io/empty-dir/e9cc891c-dd1f-4f59-aa6f-9b51f8ae9013-ovsdb-rundir\") pod \"ovsdbserver-sb-2\" (UID: \"e9cc891c-dd1f-4f59-aa6f-9b51f8ae9013\") " pod="openstack/ovsdbserver-sb-2" Mar 11 02:25:20 crc kubenswrapper[4744]: I0311 02:25:20.148487 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d5efa1d3-6643-4d3e-a06b-2051d5f1664a-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"d5efa1d3-6643-4d3e-a06b-2051d5f1664a\") " pod="openstack/ovsdbserver-sb-0" Mar 11 02:25:20 crc kubenswrapper[4744]: I0311 02:25:20.148505 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/d5efa1d3-6643-4d3e-a06b-2051d5f1664a-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"d5efa1d3-6643-4d3e-a06b-2051d5f1664a\") " pod="openstack/ovsdbserver-sb-0" Mar 11 02:25:20 crc kubenswrapper[4744]: I0311 02:25:20.148540 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-06c05069-bb6d-4780-9d78-979e83ea950d\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-06c05069-bb6d-4780-9d78-979e83ea950d\") pod \"ovsdbserver-sb-2\" (UID: \"e9cc891c-dd1f-4f59-aa6f-9b51f8ae9013\") " pod="openstack/ovsdbserver-sb-2" Mar 11 02:25:20 crc kubenswrapper[4744]: I0311 02:25:20.148560 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-33eb8ecb-2384-4df5-878d-ec11d2edba5b\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-33eb8ecb-2384-4df5-878d-ec11d2edba5b\") pod \"ovsdbserver-sb-0\" (UID: \"d5efa1d3-6643-4d3e-a06b-2051d5f1664a\") " pod="openstack/ovsdbserver-sb-0" Mar 11 02:25:20 crc kubenswrapper[4744]: I0311 02:25:20.148568 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d5efa1d3-6643-4d3e-a06b-2051d5f1664a-config\") pod \"ovsdbserver-sb-0\" (UID: 
\"d5efa1d3-6643-4d3e-a06b-2051d5f1664a\") " pod="openstack/ovsdbserver-sb-0" Mar 11 02:25:20 crc kubenswrapper[4744]: I0311 02:25:20.148584 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/d5efa1d3-6643-4d3e-a06b-2051d5f1664a-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"d5efa1d3-6643-4d3e-a06b-2051d5f1664a\") " pod="openstack/ovsdbserver-sb-0" Mar 11 02:25:20 crc kubenswrapper[4744]: I0311 02:25:20.148635 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/e9cc891c-dd1f-4f59-aa6f-9b51f8ae9013-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-2\" (UID: \"e9cc891c-dd1f-4f59-aa6f-9b51f8ae9013\") " pod="openstack/ovsdbserver-sb-2" Mar 11 02:25:20 crc kubenswrapper[4744]: I0311 02:25:20.148700 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-45b9c915-1721-44a4-8839-8dfba6c701ea\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-45b9c915-1721-44a4-8839-8dfba6c701ea\") pod \"ovsdbserver-sb-1\" (UID: \"371fb375-551c-4443-8f31-5e952e527f4d\") " pod="openstack/ovsdbserver-sb-1" Mar 11 02:25:20 crc kubenswrapper[4744]: I0311 02:25:20.148725 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/371fb375-551c-4443-8f31-5e952e527f4d-config\") pod \"ovsdbserver-sb-1\" (UID: \"371fb375-551c-4443-8f31-5e952e527f4d\") " pod="openstack/ovsdbserver-sb-1" Mar 11 02:25:20 crc kubenswrapper[4744]: I0311 02:25:20.148753 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-44fm5\" (UniqueName: \"kubernetes.io/projected/371fb375-551c-4443-8f31-5e952e527f4d-kube-api-access-44fm5\") pod \"ovsdbserver-sb-1\" (UID: \"371fb375-551c-4443-8f31-5e952e527f4d\") " pod="openstack/ovsdbserver-sb-1" Mar 11 02:25:20 crc 
kubenswrapper[4744]: I0311 02:25:20.149595 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/e9cc891c-dd1f-4f59-aa6f-9b51f8ae9013-scripts\") pod \"ovsdbserver-sb-2\" (UID: \"e9cc891c-dd1f-4f59-aa6f-9b51f8ae9013\") " pod="openstack/ovsdbserver-sb-2" Mar 11 02:25:20 crc kubenswrapper[4744]: I0311 02:25:20.150467 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/e9cc891c-dd1f-4f59-aa6f-9b51f8ae9013-ovsdb-rundir\") pod \"ovsdbserver-sb-2\" (UID: \"e9cc891c-dd1f-4f59-aa6f-9b51f8ae9013\") " pod="openstack/ovsdbserver-sb-2" Mar 11 02:25:20 crc kubenswrapper[4744]: I0311 02:25:20.150762 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/371fb375-551c-4443-8f31-5e952e527f4d-scripts\") pod \"ovsdbserver-sb-1\" (UID: \"371fb375-551c-4443-8f31-5e952e527f4d\") " pod="openstack/ovsdbserver-sb-1" Mar 11 02:25:20 crc kubenswrapper[4744]: I0311 02:25:20.151900 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e9cc891c-dd1f-4f59-aa6f-9b51f8ae9013-config\") pod \"ovsdbserver-sb-2\" (UID: \"e9cc891c-dd1f-4f59-aa6f-9b51f8ae9013\") " pod="openstack/ovsdbserver-sb-2" Mar 11 02:25:20 crc kubenswrapper[4744]: I0311 02:25:20.151932 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/371fb375-551c-4443-8f31-5e952e527f4d-config\") pod \"ovsdbserver-sb-1\" (UID: \"371fb375-551c-4443-8f31-5e952e527f4d\") " pod="openstack/ovsdbserver-sb-1" Mar 11 02:25:20 crc kubenswrapper[4744]: I0311 02:25:20.152277 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/d5efa1d3-6643-4d3e-a06b-2051d5f1664a-scripts\") pod \"ovsdbserver-sb-0\" (UID: 
\"d5efa1d3-6643-4d3e-a06b-2051d5f1664a\") " pod="openstack/ovsdbserver-sb-0" Mar 11 02:25:20 crc kubenswrapper[4744]: I0311 02:25:20.152598 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/d5efa1d3-6643-4d3e-a06b-2051d5f1664a-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"d5efa1d3-6643-4d3e-a06b-2051d5f1664a\") " pod="openstack/ovsdbserver-sb-0" Mar 11 02:25:20 crc kubenswrapper[4744]: I0311 02:25:20.153882 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/371fb375-551c-4443-8f31-5e952e527f4d-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-1\" (UID: \"371fb375-551c-4443-8f31-5e952e527f4d\") " pod="openstack/ovsdbserver-sb-1" Mar 11 02:25:20 crc kubenswrapper[4744]: I0311 02:25:20.156987 4744 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Mar 11 02:25:20 crc kubenswrapper[4744]: I0311 02:25:20.157032 4744 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Mar 11 02:25:20 crc kubenswrapper[4744]: I0311 02:25:20.157080 4744 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-45b9c915-1721-44a4-8839-8dfba6c701ea\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-45b9c915-1721-44a4-8839-8dfba6c701ea\") pod \"ovsdbserver-sb-1\" (UID: \"371fb375-551c-4443-8f31-5e952e527f4d\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1ee6c9e2323cce28f1cf7bdf311603a9c994bbe7dd7afc7cf1f09b4ec226acab/globalmount\"" pod="openstack/ovsdbserver-sb-1" Mar 11 02:25:20 crc kubenswrapper[4744]: I0311 02:25:20.157174 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/371fb375-551c-4443-8f31-5e952e527f4d-combined-ca-bundle\") pod \"ovsdbserver-sb-1\" (UID: \"371fb375-551c-4443-8f31-5e952e527f4d\") " pod="openstack/ovsdbserver-sb-1" Mar 11 02:25:20 crc kubenswrapper[4744]: I0311 02:25:20.157212 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/e9cc891c-dd1f-4f59-aa6f-9b51f8ae9013-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-2\" (UID: \"e9cc891c-dd1f-4f59-aa6f-9b51f8ae9013\") " pod="openstack/ovsdbserver-sb-2" Mar 11 02:25:20 crc kubenswrapper[4744]: I0311 02:25:20.157031 4744 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-06c05069-bb6d-4780-9d78-979e83ea950d\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-06c05069-bb6d-4780-9d78-979e83ea950d\") pod \"ovsdbserver-sb-2\" (UID: \"e9cc891c-dd1f-4f59-aa6f-9b51f8ae9013\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/53e6a11d513ffe79c02ffeab69ac8f25fe9f05df70fb3795ff89daf150e89f4e/globalmount\"" pod="openstack/ovsdbserver-sb-2" Mar 11 02:25:20 crc kubenswrapper[4744]: I0311 02:25:20.157877 4744 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/e9cc891c-dd1f-4f59-aa6f-9b51f8ae9013-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-2\" (UID: \"e9cc891c-dd1f-4f59-aa6f-9b51f8ae9013\") " pod="openstack/ovsdbserver-sb-2" Mar 11 02:25:20 crc kubenswrapper[4744]: I0311 02:25:20.158753 4744 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Mar 11 02:25:20 crc kubenswrapper[4744]: I0311 02:25:20.158950 4744 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-33eb8ecb-2384-4df5-878d-ec11d2edba5b\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-33eb8ecb-2384-4df5-878d-ec11d2edba5b\") pod \"ovsdbserver-sb-0\" (UID: \"d5efa1d3-6643-4d3e-a06b-2051d5f1664a\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/624d5fc82395a121f737cdafb11eb5c03dc1c18ac1eb067d4a10f1509c068a0a/globalmount\"" pod="openstack/ovsdbserver-sb-0" Mar 11 02:25:20 crc kubenswrapper[4744]: I0311 02:25:20.162824 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/371fb375-551c-4443-8f31-5e952e527f4d-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-1\" (UID: \"371fb375-551c-4443-8f31-5e952e527f4d\") " pod="openstack/ovsdbserver-sb-1" Mar 11 02:25:20 crc kubenswrapper[4744]: I0311 02:25:20.164099 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d5efa1d3-6643-4d3e-a06b-2051d5f1664a-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"d5efa1d3-6643-4d3e-a06b-2051d5f1664a\") " pod="openstack/ovsdbserver-sb-0" Mar 11 02:25:20 crc kubenswrapper[4744]: I0311 02:25:20.166065 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/d5efa1d3-6643-4d3e-a06b-2051d5f1664a-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"d5efa1d3-6643-4d3e-a06b-2051d5f1664a\") " pod="openstack/ovsdbserver-sb-0" Mar 11 02:25:20 crc kubenswrapper[4744]: I0311 02:25:20.169137 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/d5efa1d3-6643-4d3e-a06b-2051d5f1664a-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"d5efa1d3-6643-4d3e-a06b-2051d5f1664a\") " pod="openstack/ovsdbserver-sb-0" Mar 11 02:25:20 crc kubenswrapper[4744]: I0311 02:25:20.170200 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e9cc891c-dd1f-4f59-aa6f-9b51f8ae9013-combined-ca-bundle\") pod \"ovsdbserver-sb-2\" (UID: \"e9cc891c-dd1f-4f59-aa6f-9b51f8ae9013\") " pod="openstack/ovsdbserver-sb-2" Mar 11 02:25:20 crc kubenswrapper[4744]: I0311 02:25:20.175778 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lt2bz\" (UniqueName: \"kubernetes.io/projected/e9cc891c-dd1f-4f59-aa6f-9b51f8ae9013-kube-api-access-lt2bz\") pod \"ovsdbserver-sb-2\" (UID: \"e9cc891c-dd1f-4f59-aa6f-9b51f8ae9013\") " pod="openstack/ovsdbserver-sb-2" Mar 11 02:25:20 crc kubenswrapper[4744]: I0311 02:25:20.177478 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lwqhb\" (UniqueName: \"kubernetes.io/projected/d5efa1d3-6643-4d3e-a06b-2051d5f1664a-kube-api-access-lwqhb\") pod \"ovsdbserver-sb-0\" (UID: \"d5efa1d3-6643-4d3e-a06b-2051d5f1664a\") " pod="openstack/ovsdbserver-sb-0" Mar 11 02:25:20 crc kubenswrapper[4744]: I0311 02:25:20.186414 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-44fm5\" (UniqueName: \"kubernetes.io/projected/371fb375-551c-4443-8f31-5e952e527f4d-kube-api-access-44fm5\") pod \"ovsdbserver-sb-1\" (UID: 
\"371fb375-551c-4443-8f31-5e952e527f4d\") " pod="openstack/ovsdbserver-sb-1" Mar 11 02:25:20 crc kubenswrapper[4744]: I0311 02:25:20.234644 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-06c05069-bb6d-4780-9d78-979e83ea950d\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-06c05069-bb6d-4780-9d78-979e83ea950d\") pod \"ovsdbserver-sb-2\" (UID: \"e9cc891c-dd1f-4f59-aa6f-9b51f8ae9013\") " pod="openstack/ovsdbserver-sb-2" Mar 11 02:25:20 crc kubenswrapper[4744]: I0311 02:25:20.235630 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-45b9c915-1721-44a4-8839-8dfba6c701ea\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-45b9c915-1721-44a4-8839-8dfba6c701ea\") pod \"ovsdbserver-sb-1\" (UID: \"371fb375-551c-4443-8f31-5e952e527f4d\") " pod="openstack/ovsdbserver-sb-1" Mar 11 02:25:20 crc kubenswrapper[4744]: I0311 02:25:20.238856 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-33eb8ecb-2384-4df5-878d-ec11d2edba5b\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-33eb8ecb-2384-4df5-878d-ec11d2edba5b\") pod \"ovsdbserver-sb-0\" (UID: \"d5efa1d3-6643-4d3e-a06b-2051d5f1664a\") " pod="openstack/ovsdbserver-sb-0" Mar 11 02:25:20 crc kubenswrapper[4744]: I0311 02:25:20.260222 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-sb-2" Mar 11 02:25:20 crc kubenswrapper[4744]: I0311 02:25:20.272090 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-sb-1" Mar 11 02:25:20 crc kubenswrapper[4744]: I0311 02:25:20.540181 4744 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovsdbserver-sb-0" Mar 11 02:25:20 crc kubenswrapper[4744]: I0311 02:25:20.572094 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-nb-0"] Mar 11 02:25:20 crc kubenswrapper[4744]: I0311 02:25:20.573406 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-nb-0" Mar 11 02:25:20 crc kubenswrapper[4744]: I0311 02:25:20.577983 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncluster-ovndbcluster-nb-dockercfg-jvv9t" Mar 11 02:25:20 crc kubenswrapper[4744]: I0311 02:25:20.578012 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-nb-scripts" Mar 11 02:25:20 crc kubenswrapper[4744]: I0311 02:25:20.578451 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovndbcluster-nb-ovndbs" Mar 11 02:25:20 crc kubenswrapper[4744]: I0311 02:25:20.579407 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-nb-config" Mar 11 02:25:20 crc kubenswrapper[4744]: I0311 02:25:20.598808 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-0"] Mar 11 02:25:20 crc kubenswrapper[4744]: I0311 02:25:20.606536 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-nb-1"] Mar 11 02:25:20 crc kubenswrapper[4744]: I0311 02:25:20.607842 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-nb-1" Mar 11 02:25:20 crc kubenswrapper[4744]: I0311 02:25:20.613585 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-nb-2"] Mar 11 02:25:20 crc kubenswrapper[4744]: I0311 02:25:20.614611 4744 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovsdbserver-nb-2" Mar 11 02:25:20 crc kubenswrapper[4744]: I0311 02:25:20.632737 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-1"] Mar 11 02:25:20 crc kubenswrapper[4744]: I0311 02:25:20.639411 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-2"] Mar 11 02:25:20 crc kubenswrapper[4744]: I0311 02:25:20.656627 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-0ee15808-256f-430f-8bc3-c634fbc6d459\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-0ee15808-256f-430f-8bc3-c634fbc6d459\") pod \"ovsdbserver-nb-0\" (UID: \"6a398ee8-1d34-45b7-a50b-82d6880407d2\") " pod="openstack/ovsdbserver-nb-0" Mar 11 02:25:20 crc kubenswrapper[4744]: I0311 02:25:20.656689 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6a398ee8-1d34-45b7-a50b-82d6880407d2-config\") pod \"ovsdbserver-nb-0\" (UID: \"6a398ee8-1d34-45b7-a50b-82d6880407d2\") " pod="openstack/ovsdbserver-nb-0" Mar 11 02:25:20 crc kubenswrapper[4744]: I0311 02:25:20.656725 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mc5nk\" (UniqueName: \"kubernetes.io/projected/6a398ee8-1d34-45b7-a50b-82d6880407d2-kube-api-access-mc5nk\") pod \"ovsdbserver-nb-0\" (UID: \"6a398ee8-1d34-45b7-a50b-82d6880407d2\") " pod="openstack/ovsdbserver-nb-0" Mar 11 02:25:20 crc kubenswrapper[4744]: I0311 02:25:20.656779 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/6a398ee8-1d34-45b7-a50b-82d6880407d2-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"6a398ee8-1d34-45b7-a50b-82d6880407d2\") " pod="openstack/ovsdbserver-nb-0" Mar 11 02:25:20 crc kubenswrapper[4744]: I0311 
02:25:20.656799 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/6a398ee8-1d34-45b7-a50b-82d6880407d2-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"6a398ee8-1d34-45b7-a50b-82d6880407d2\") " pod="openstack/ovsdbserver-nb-0" Mar 11 02:25:20 crc kubenswrapper[4744]: I0311 02:25:20.656858 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/6a398ee8-1d34-45b7-a50b-82d6880407d2-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"6a398ee8-1d34-45b7-a50b-82d6880407d2\") " pod="openstack/ovsdbserver-nb-0" Mar 11 02:25:20 crc kubenswrapper[4744]: I0311 02:25:20.656885 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/6a398ee8-1d34-45b7-a50b-82d6880407d2-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"6a398ee8-1d34-45b7-a50b-82d6880407d2\") " pod="openstack/ovsdbserver-nb-0" Mar 11 02:25:20 crc kubenswrapper[4744]: I0311 02:25:20.656902 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6a398ee8-1d34-45b7-a50b-82d6880407d2-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"6a398ee8-1d34-45b7-a50b-82d6880407d2\") " pod="openstack/ovsdbserver-nb-0" Mar 11 02:25:20 crc kubenswrapper[4744]: I0311 02:25:20.759115 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bm6s9\" (UniqueName: \"kubernetes.io/projected/69276227-3db4-4f45-b4bf-3595c388000b-kube-api-access-bm6s9\") pod \"ovsdbserver-nb-2\" (UID: \"69276227-3db4-4f45-b4bf-3595c388000b\") " pod="openstack/ovsdbserver-nb-2" Mar 11 02:25:20 crc kubenswrapper[4744]: I0311 02:25:20.759160 4744 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6a398ee8-1d34-45b7-a50b-82d6880407d2-config\") pod \"ovsdbserver-nb-0\" (UID: \"6a398ee8-1d34-45b7-a50b-82d6880407d2\") " pod="openstack/ovsdbserver-nb-0" Mar 11 02:25:20 crc kubenswrapper[4744]: I0311 02:25:20.759210 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mc5nk\" (UniqueName: \"kubernetes.io/projected/6a398ee8-1d34-45b7-a50b-82d6880407d2-kube-api-access-mc5nk\") pod \"ovsdbserver-nb-0\" (UID: \"6a398ee8-1d34-45b7-a50b-82d6880407d2\") " pod="openstack/ovsdbserver-nb-0" Mar 11 02:25:20 crc kubenswrapper[4744]: I0311 02:25:20.759233 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/95b1b29d-10d0-4950-8b51-322683a204ad-config\") pod \"ovsdbserver-nb-1\" (UID: \"95b1b29d-10d0-4950-8b51-322683a204ad\") " pod="openstack/ovsdbserver-nb-1" Mar 11 02:25:20 crc kubenswrapper[4744]: I0311 02:25:20.759290 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/69276227-3db4-4f45-b4bf-3595c388000b-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-2\" (UID: \"69276227-3db4-4f45-b4bf-3595c388000b\") " pod="openstack/ovsdbserver-nb-2" Mar 11 02:25:20 crc kubenswrapper[4744]: I0311 02:25:20.759318 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/6a398ee8-1d34-45b7-a50b-82d6880407d2-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"6a398ee8-1d34-45b7-a50b-82d6880407d2\") " pod="openstack/ovsdbserver-nb-0" Mar 11 02:25:20 crc kubenswrapper[4744]: I0311 02:25:20.759449 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-e405410d-1481-4204-8a8c-75b0e68f727c\" 
(UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-e405410d-1481-4204-8a8c-75b0e68f727c\") pod \"ovsdbserver-nb-1\" (UID: \"95b1b29d-10d0-4950-8b51-322683a204ad\") " pod="openstack/ovsdbserver-nb-1" Mar 11 02:25:20 crc kubenswrapper[4744]: I0311 02:25:20.759470 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/6a398ee8-1d34-45b7-a50b-82d6880407d2-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"6a398ee8-1d34-45b7-a50b-82d6880407d2\") " pod="openstack/ovsdbserver-nb-0" Mar 11 02:25:20 crc kubenswrapper[4744]: I0311 02:25:20.759522 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/95b1b29d-10d0-4950-8b51-322683a204ad-scripts\") pod \"ovsdbserver-nb-1\" (UID: \"95b1b29d-10d0-4950-8b51-322683a204ad\") " pod="openstack/ovsdbserver-nb-1" Mar 11 02:25:20 crc kubenswrapper[4744]: I0311 02:25:20.759544 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/69276227-3db4-4f45-b4bf-3595c388000b-ovsdb-rundir\") pod \"ovsdbserver-nb-2\" (UID: \"69276227-3db4-4f45-b4bf-3595c388000b\") " pod="openstack/ovsdbserver-nb-2" Mar 11 02:25:20 crc kubenswrapper[4744]: I0311 02:25:20.759559 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/95b1b29d-10d0-4950-8b51-322683a204ad-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-1\" (UID: \"95b1b29d-10d0-4950-8b51-322683a204ad\") " pod="openstack/ovsdbserver-nb-1" Mar 11 02:25:20 crc kubenswrapper[4744]: I0311 02:25:20.759608 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/configmap/69276227-3db4-4f45-b4bf-3595c388000b-scripts\") pod \"ovsdbserver-nb-2\" (UID: \"69276227-3db4-4f45-b4bf-3595c388000b\") " pod="openstack/ovsdbserver-nb-2" Mar 11 02:25:20 crc kubenswrapper[4744]: I0311 02:25:20.759626 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/69276227-3db4-4f45-b4bf-3595c388000b-config\") pod \"ovsdbserver-nb-2\" (UID: \"69276227-3db4-4f45-b4bf-3595c388000b\") " pod="openstack/ovsdbserver-nb-2" Mar 11 02:25:20 crc kubenswrapper[4744]: I0311 02:25:20.759658 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/6a398ee8-1d34-45b7-a50b-82d6880407d2-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"6a398ee8-1d34-45b7-a50b-82d6880407d2\") " pod="openstack/ovsdbserver-nb-0" Mar 11 02:25:20 crc kubenswrapper[4744]: I0311 02:25:20.759674 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/69276227-3db4-4f45-b4bf-3595c388000b-combined-ca-bundle\") pod \"ovsdbserver-nb-2\" (UID: \"69276227-3db4-4f45-b4bf-3595c388000b\") " pod="openstack/ovsdbserver-nb-2" Mar 11 02:25:20 crc kubenswrapper[4744]: I0311 02:25:20.759692 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/6a398ee8-1d34-45b7-a50b-82d6880407d2-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"6a398ee8-1d34-45b7-a50b-82d6880407d2\") " pod="openstack/ovsdbserver-nb-0" Mar 11 02:25:20 crc kubenswrapper[4744]: I0311 02:25:20.759709 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6a398ee8-1d34-45b7-a50b-82d6880407d2-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"6a398ee8-1d34-45b7-a50b-82d6880407d2\") " 
pod="openstack/ovsdbserver-nb-0" Mar 11 02:25:20 crc kubenswrapper[4744]: I0311 02:25:20.759751 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-e417c620-0e5d-4eb2-b761-1c021aab6583\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-e417c620-0e5d-4eb2-b761-1c021aab6583\") pod \"ovsdbserver-nb-2\" (UID: \"69276227-3db4-4f45-b4bf-3595c388000b\") " pod="openstack/ovsdbserver-nb-2" Mar 11 02:25:20 crc kubenswrapper[4744]: I0311 02:25:20.759770 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/69276227-3db4-4f45-b4bf-3595c388000b-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-2\" (UID: \"69276227-3db4-4f45-b4bf-3595c388000b\") " pod="openstack/ovsdbserver-nb-2" Mar 11 02:25:20 crc kubenswrapper[4744]: I0311 02:25:20.759923 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r8wwx\" (UniqueName: \"kubernetes.io/projected/95b1b29d-10d0-4950-8b51-322683a204ad-kube-api-access-r8wwx\") pod \"ovsdbserver-nb-1\" (UID: \"95b1b29d-10d0-4950-8b51-322683a204ad\") " pod="openstack/ovsdbserver-nb-1" Mar 11 02:25:20 crc kubenswrapper[4744]: I0311 02:25:20.759942 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/95b1b29d-10d0-4950-8b51-322683a204ad-ovsdb-rundir\") pod \"ovsdbserver-nb-1\" (UID: \"95b1b29d-10d0-4950-8b51-322683a204ad\") " pod="openstack/ovsdbserver-nb-1" Mar 11 02:25:20 crc kubenswrapper[4744]: I0311 02:25:20.759984 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/95b1b29d-10d0-4950-8b51-322683a204ad-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-1\" (UID: 
\"95b1b29d-10d0-4950-8b51-322683a204ad\") " pod="openstack/ovsdbserver-nb-1" Mar 11 02:25:20 crc kubenswrapper[4744]: I0311 02:25:20.760002 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/95b1b29d-10d0-4950-8b51-322683a204ad-combined-ca-bundle\") pod \"ovsdbserver-nb-1\" (UID: \"95b1b29d-10d0-4950-8b51-322683a204ad\") " pod="openstack/ovsdbserver-nb-1" Mar 11 02:25:20 crc kubenswrapper[4744]: I0311 02:25:20.760024 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-0ee15808-256f-430f-8bc3-c634fbc6d459\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-0ee15808-256f-430f-8bc3-c634fbc6d459\") pod \"ovsdbserver-nb-0\" (UID: \"6a398ee8-1d34-45b7-a50b-82d6880407d2\") " pod="openstack/ovsdbserver-nb-0" Mar 11 02:25:20 crc kubenswrapper[4744]: I0311 02:25:20.761010 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6a398ee8-1d34-45b7-a50b-82d6880407d2-config\") pod \"ovsdbserver-nb-0\" (UID: \"6a398ee8-1d34-45b7-a50b-82d6880407d2\") " pod="openstack/ovsdbserver-nb-0" Mar 11 02:25:20 crc kubenswrapper[4744]: I0311 02:25:20.761426 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/6a398ee8-1d34-45b7-a50b-82d6880407d2-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"6a398ee8-1d34-45b7-a50b-82d6880407d2\") " pod="openstack/ovsdbserver-nb-0" Mar 11 02:25:20 crc kubenswrapper[4744]: I0311 02:25:20.761916 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/6a398ee8-1d34-45b7-a50b-82d6880407d2-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"6a398ee8-1d34-45b7-a50b-82d6880407d2\") " pod="openstack/ovsdbserver-nb-0" Mar 11 02:25:20 crc kubenswrapper[4744]: I0311 02:25:20.764657 4744 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/6a398ee8-1d34-45b7-a50b-82d6880407d2-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"6a398ee8-1d34-45b7-a50b-82d6880407d2\") " pod="openstack/ovsdbserver-nb-0" Mar 11 02:25:20 crc kubenswrapper[4744]: I0311 02:25:20.765355 4744 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Mar 11 02:25:20 crc kubenswrapper[4744]: I0311 02:25:20.765380 4744 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-0ee15808-256f-430f-8bc3-c634fbc6d459\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-0ee15808-256f-430f-8bc3-c634fbc6d459\") pod \"ovsdbserver-nb-0\" (UID: \"6a398ee8-1d34-45b7-a50b-82d6880407d2\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/796db270c4475dc3efc7e9c939272d631b28eb1aded3185d2fa56256f7657b4e/globalmount\"" pod="openstack/ovsdbserver-nb-0" Mar 11 02:25:20 crc kubenswrapper[4744]: I0311 02:25:20.765980 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6a398ee8-1d34-45b7-a50b-82d6880407d2-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"6a398ee8-1d34-45b7-a50b-82d6880407d2\") " pod="openstack/ovsdbserver-nb-0" Mar 11 02:25:20 crc kubenswrapper[4744]: I0311 02:25:20.768750 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/6a398ee8-1d34-45b7-a50b-82d6880407d2-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"6a398ee8-1d34-45b7-a50b-82d6880407d2\") " pod="openstack/ovsdbserver-nb-0" Mar 11 02:25:20 crc kubenswrapper[4744]: I0311 02:25:20.775541 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mc5nk\" (UniqueName: 
\"kubernetes.io/projected/6a398ee8-1d34-45b7-a50b-82d6880407d2-kube-api-access-mc5nk\") pod \"ovsdbserver-nb-0\" (UID: \"6a398ee8-1d34-45b7-a50b-82d6880407d2\") " pod="openstack/ovsdbserver-nb-0" Mar 11 02:25:20 crc kubenswrapper[4744]: W0311 02:25:20.812449 4744 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode9cc891c_dd1f_4f59_aa6f_9b51f8ae9013.slice/crio-076d927292cd4b08fd647a551aae7aea3b28a99ab73c8efb0973ef695e95aca7 WatchSource:0}: Error finding container 076d927292cd4b08fd647a551aae7aea3b28a99ab73c8efb0973ef695e95aca7: Status 404 returned error can't find the container with id 076d927292cd4b08fd647a551aae7aea3b28a99ab73c8efb0973ef695e95aca7 Mar 11 02:25:20 crc kubenswrapper[4744]: I0311 02:25:20.812892 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-2"] Mar 11 02:25:20 crc kubenswrapper[4744]: I0311 02:25:20.828817 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-0ee15808-256f-430f-8bc3-c634fbc6d459\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-0ee15808-256f-430f-8bc3-c634fbc6d459\") pod \"ovsdbserver-nb-0\" (UID: \"6a398ee8-1d34-45b7-a50b-82d6880407d2\") " pod="openstack/ovsdbserver-nb-0" Mar 11 02:25:20 crc kubenswrapper[4744]: I0311 02:25:20.861754 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-e417c620-0e5d-4eb2-b761-1c021aab6583\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-e417c620-0e5d-4eb2-b761-1c021aab6583\") pod \"ovsdbserver-nb-2\" (UID: \"69276227-3db4-4f45-b4bf-3595c388000b\") " pod="openstack/ovsdbserver-nb-2" Mar 11 02:25:20 crc kubenswrapper[4744]: I0311 02:25:20.861810 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/69276227-3db4-4f45-b4bf-3595c388000b-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-2\" (UID: 
\"69276227-3db4-4f45-b4bf-3595c388000b\") " pod="openstack/ovsdbserver-nb-2" Mar 11 02:25:20 crc kubenswrapper[4744]: I0311 02:25:20.861847 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r8wwx\" (UniqueName: \"kubernetes.io/projected/95b1b29d-10d0-4950-8b51-322683a204ad-kube-api-access-r8wwx\") pod \"ovsdbserver-nb-1\" (UID: \"95b1b29d-10d0-4950-8b51-322683a204ad\") " pod="openstack/ovsdbserver-nb-1" Mar 11 02:25:20 crc kubenswrapper[4744]: I0311 02:25:20.861872 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/95b1b29d-10d0-4950-8b51-322683a204ad-ovsdb-rundir\") pod \"ovsdbserver-nb-1\" (UID: \"95b1b29d-10d0-4950-8b51-322683a204ad\") " pod="openstack/ovsdbserver-nb-1" Mar 11 02:25:20 crc kubenswrapper[4744]: I0311 02:25:20.861897 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/95b1b29d-10d0-4950-8b51-322683a204ad-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-1\" (UID: \"95b1b29d-10d0-4950-8b51-322683a204ad\") " pod="openstack/ovsdbserver-nb-1" Mar 11 02:25:20 crc kubenswrapper[4744]: I0311 02:25:20.861917 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/95b1b29d-10d0-4950-8b51-322683a204ad-combined-ca-bundle\") pod \"ovsdbserver-nb-1\" (UID: \"95b1b29d-10d0-4950-8b51-322683a204ad\") " pod="openstack/ovsdbserver-nb-1" Mar 11 02:25:20 crc kubenswrapper[4744]: I0311 02:25:20.861950 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bm6s9\" (UniqueName: \"kubernetes.io/projected/69276227-3db4-4f45-b4bf-3595c388000b-kube-api-access-bm6s9\") pod \"ovsdbserver-nb-2\" (UID: \"69276227-3db4-4f45-b4bf-3595c388000b\") " pod="openstack/ovsdbserver-nb-2" Mar 11 02:25:20 crc kubenswrapper[4744]: I0311 
02:25:20.862027 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/95b1b29d-10d0-4950-8b51-322683a204ad-config\") pod \"ovsdbserver-nb-1\" (UID: \"95b1b29d-10d0-4950-8b51-322683a204ad\") " pod="openstack/ovsdbserver-nb-1" Mar 11 02:25:20 crc kubenswrapper[4744]: I0311 02:25:20.862062 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/69276227-3db4-4f45-b4bf-3595c388000b-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-2\" (UID: \"69276227-3db4-4f45-b4bf-3595c388000b\") " pod="openstack/ovsdbserver-nb-2" Mar 11 02:25:20 crc kubenswrapper[4744]: I0311 02:25:20.862126 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-e405410d-1481-4204-8a8c-75b0e68f727c\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-e405410d-1481-4204-8a8c-75b0e68f727c\") pod \"ovsdbserver-nb-1\" (UID: \"95b1b29d-10d0-4950-8b51-322683a204ad\") " pod="openstack/ovsdbserver-nb-1" Mar 11 02:25:20 crc kubenswrapper[4744]: I0311 02:25:20.862147 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/95b1b29d-10d0-4950-8b51-322683a204ad-scripts\") pod \"ovsdbserver-nb-1\" (UID: \"95b1b29d-10d0-4950-8b51-322683a204ad\") " pod="openstack/ovsdbserver-nb-1" Mar 11 02:25:20 crc kubenswrapper[4744]: I0311 02:25:20.862169 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/69276227-3db4-4f45-b4bf-3595c388000b-ovsdb-rundir\") pod \"ovsdbserver-nb-2\" (UID: \"69276227-3db4-4f45-b4bf-3595c388000b\") " pod="openstack/ovsdbserver-nb-2" Mar 11 02:25:20 crc kubenswrapper[4744]: I0311 02:25:20.862189 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/95b1b29d-10d0-4950-8b51-322683a204ad-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-1\" (UID: \"95b1b29d-10d0-4950-8b51-322683a204ad\") " pod="openstack/ovsdbserver-nb-1" Mar 11 02:25:20 crc kubenswrapper[4744]: I0311 02:25:20.862233 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/69276227-3db4-4f45-b4bf-3595c388000b-scripts\") pod \"ovsdbserver-nb-2\" (UID: \"69276227-3db4-4f45-b4bf-3595c388000b\") " pod="openstack/ovsdbserver-nb-2" Mar 11 02:25:20 crc kubenswrapper[4744]: I0311 02:25:20.862253 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/69276227-3db4-4f45-b4bf-3595c388000b-config\") pod \"ovsdbserver-nb-2\" (UID: \"69276227-3db4-4f45-b4bf-3595c388000b\") " pod="openstack/ovsdbserver-nb-2" Mar 11 02:25:20 crc kubenswrapper[4744]: I0311 02:25:20.862276 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/69276227-3db4-4f45-b4bf-3595c388000b-combined-ca-bundle\") pod \"ovsdbserver-nb-2\" (UID: \"69276227-3db4-4f45-b4bf-3595c388000b\") " pod="openstack/ovsdbserver-nb-2" Mar 11 02:25:20 crc kubenswrapper[4744]: I0311 02:25:20.863301 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/69276227-3db4-4f45-b4bf-3595c388000b-ovsdb-rundir\") pod \"ovsdbserver-nb-2\" (UID: \"69276227-3db4-4f45-b4bf-3595c388000b\") " pod="openstack/ovsdbserver-nb-2" Mar 11 02:25:20 crc kubenswrapper[4744]: I0311 02:25:20.863682 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/95b1b29d-10d0-4950-8b51-322683a204ad-config\") pod \"ovsdbserver-nb-1\" (UID: \"95b1b29d-10d0-4950-8b51-322683a204ad\") " pod="openstack/ovsdbserver-nb-1" Mar 11 02:25:20 crc kubenswrapper[4744]: 
I0311 02:25:20.863707 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/95b1b29d-10d0-4950-8b51-322683a204ad-scripts\") pod \"ovsdbserver-nb-1\" (UID: \"95b1b29d-10d0-4950-8b51-322683a204ad\") " pod="openstack/ovsdbserver-nb-1" Mar 11 02:25:20 crc kubenswrapper[4744]: I0311 02:25:20.863844 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/95b1b29d-10d0-4950-8b51-322683a204ad-ovsdb-rundir\") pod \"ovsdbserver-nb-1\" (UID: \"95b1b29d-10d0-4950-8b51-322683a204ad\") " pod="openstack/ovsdbserver-nb-1" Mar 11 02:25:20 crc kubenswrapper[4744]: I0311 02:25:20.864235 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/69276227-3db4-4f45-b4bf-3595c388000b-config\") pod \"ovsdbserver-nb-2\" (UID: \"69276227-3db4-4f45-b4bf-3595c388000b\") " pod="openstack/ovsdbserver-nb-2" Mar 11 02:25:20 crc kubenswrapper[4744]: I0311 02:25:20.865044 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/69276227-3db4-4f45-b4bf-3595c388000b-scripts\") pod \"ovsdbserver-nb-2\" (UID: \"69276227-3db4-4f45-b4bf-3595c388000b\") " pod="openstack/ovsdbserver-nb-2" Mar 11 02:25:20 crc kubenswrapper[4744]: I0311 02:25:20.866262 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/69276227-3db4-4f45-b4bf-3595c388000b-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-2\" (UID: \"69276227-3db4-4f45-b4bf-3595c388000b\") " pod="openstack/ovsdbserver-nb-2" Mar 11 02:25:20 crc kubenswrapper[4744]: I0311 02:25:20.866891 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/95b1b29d-10d0-4950-8b51-322683a204ad-ovsdbserver-nb-tls-certs\") pod 
\"ovsdbserver-nb-1\" (UID: \"95b1b29d-10d0-4950-8b51-322683a204ad\") " pod="openstack/ovsdbserver-nb-1" Mar 11 02:25:20 crc kubenswrapper[4744]: I0311 02:25:20.867274 4744 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Mar 11 02:25:20 crc kubenswrapper[4744]: I0311 02:25:20.867361 4744 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-e417c620-0e5d-4eb2-b761-1c021aab6583\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-e417c620-0e5d-4eb2-b761-1c021aab6583\") pod \"ovsdbserver-nb-2\" (UID: \"69276227-3db4-4f45-b4bf-3595c388000b\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/2870c42d9e434b4cf0451f867a27c4f601bb4f9aa97658cf194bc785987ca6a7/globalmount\"" pod="openstack/ovsdbserver-nb-2" Mar 11 02:25:20 crc kubenswrapper[4744]: I0311 02:25:20.867418 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/95b1b29d-10d0-4950-8b51-322683a204ad-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-1\" (UID: \"95b1b29d-10d0-4950-8b51-322683a204ad\") " pod="openstack/ovsdbserver-nb-1" Mar 11 02:25:20 crc kubenswrapper[4744]: I0311 02:25:20.868179 4744 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Mar 11 02:25:20 crc kubenswrapper[4744]: I0311 02:25:20.868219 4744 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-e405410d-1481-4204-8a8c-75b0e68f727c\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-e405410d-1481-4204-8a8c-75b0e68f727c\") pod \"ovsdbserver-nb-1\" (UID: \"95b1b29d-10d0-4950-8b51-322683a204ad\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/8c94ad0dcdc5527ad8d6b27558b682342d58e5ca96adbd8c5493e81be9330808/globalmount\"" pod="openstack/ovsdbserver-nb-1" Mar 11 02:25:20 crc kubenswrapper[4744]: I0311 02:25:20.868454 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/69276227-3db4-4f45-b4bf-3595c388000b-combined-ca-bundle\") pod \"ovsdbserver-nb-2\" (UID: \"69276227-3db4-4f45-b4bf-3595c388000b\") " pod="openstack/ovsdbserver-nb-2" Mar 11 02:25:20 crc kubenswrapper[4744]: I0311 02:25:20.869723 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/95b1b29d-10d0-4950-8b51-322683a204ad-combined-ca-bundle\") pod \"ovsdbserver-nb-1\" (UID: \"95b1b29d-10d0-4950-8b51-322683a204ad\") " pod="openstack/ovsdbserver-nb-1" Mar 11 02:25:20 crc kubenswrapper[4744]: I0311 02:25:20.875436 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/69276227-3db4-4f45-b4bf-3595c388000b-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-2\" (UID: \"69276227-3db4-4f45-b4bf-3595c388000b\") " pod="openstack/ovsdbserver-nb-2" Mar 11 02:25:20 crc kubenswrapper[4744]: I0311 02:25:20.883896 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r8wwx\" (UniqueName: \"kubernetes.io/projected/95b1b29d-10d0-4950-8b51-322683a204ad-kube-api-access-r8wwx\") pod \"ovsdbserver-nb-1\" (UID: 
\"95b1b29d-10d0-4950-8b51-322683a204ad\") " pod="openstack/ovsdbserver-nb-1" Mar 11 02:25:20 crc kubenswrapper[4744]: I0311 02:25:20.889808 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bm6s9\" (UniqueName: \"kubernetes.io/projected/69276227-3db4-4f45-b4bf-3595c388000b-kube-api-access-bm6s9\") pod \"ovsdbserver-nb-2\" (UID: \"69276227-3db4-4f45-b4bf-3595c388000b\") " pod="openstack/ovsdbserver-nb-2" Mar 11 02:25:20 crc kubenswrapper[4744]: I0311 02:25:20.901288 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-e417c620-0e5d-4eb2-b761-1c021aab6583\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-e417c620-0e5d-4eb2-b761-1c021aab6583\") pod \"ovsdbserver-nb-2\" (UID: \"69276227-3db4-4f45-b4bf-3595c388000b\") " pod="openstack/ovsdbserver-nb-2" Mar 11 02:25:20 crc kubenswrapper[4744]: I0311 02:25:20.914679 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-1"] Mar 11 02:25:20 crc kubenswrapper[4744]: I0311 02:25:20.922555 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-nb-0" Mar 11 02:25:20 crc kubenswrapper[4744]: I0311 02:25:20.924949 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-e405410d-1481-4204-8a8c-75b0e68f727c\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-e405410d-1481-4204-8a8c-75b0e68f727c\") pod \"ovsdbserver-nb-1\" (UID: \"95b1b29d-10d0-4950-8b51-322683a204ad\") " pod="openstack/ovsdbserver-nb-1" Mar 11 02:25:20 crc kubenswrapper[4744]: I0311 02:25:20.938005 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-nb-1" Mar 11 02:25:20 crc kubenswrapper[4744]: I0311 02:25:20.946662 4744 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovsdbserver-nb-2" Mar 11 02:25:21 crc kubenswrapper[4744]: I0311 02:25:21.070632 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-0"] Mar 11 02:25:21 crc kubenswrapper[4744]: I0311 02:25:21.327555 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-2" event={"ID":"e9cc891c-dd1f-4f59-aa6f-9b51f8ae9013","Type":"ContainerStarted","Data":"b8d85be2186235a4bbed323a9f9ea48db5033e34f6147fa754a4751eab99ee34"} Mar 11 02:25:21 crc kubenswrapper[4744]: I0311 02:25:21.327844 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-2" event={"ID":"e9cc891c-dd1f-4f59-aa6f-9b51f8ae9013","Type":"ContainerStarted","Data":"011fcf3d3da2744de3d7d6b418e2dc01c405b8c6701bb263a5a0eef9059949fc"} Mar 11 02:25:21 crc kubenswrapper[4744]: I0311 02:25:21.327860 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-2" event={"ID":"e9cc891c-dd1f-4f59-aa6f-9b51f8ae9013","Type":"ContainerStarted","Data":"076d927292cd4b08fd647a551aae7aea3b28a99ab73c8efb0973ef695e95aca7"} Mar 11 02:25:21 crc kubenswrapper[4744]: I0311 02:25:21.329065 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-1" event={"ID":"371fb375-551c-4443-8f31-5e952e527f4d","Type":"ContainerStarted","Data":"55d8c0875aa5b67f8a54e23fd2d908c963ebb96428dfd4039816b48a244aa908"} Mar 11 02:25:21 crc kubenswrapper[4744]: I0311 02:25:21.329105 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-1" event={"ID":"371fb375-551c-4443-8f31-5e952e527f4d","Type":"ContainerStarted","Data":"75300b15d526b574761809e0d31270fc81bed9b2af30ba6def14d823931dc893"} Mar 11 02:25:21 crc kubenswrapper[4744]: I0311 02:25:21.331363 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" 
event={"ID":"d5efa1d3-6643-4d3e-a06b-2051d5f1664a","Type":"ContainerStarted","Data":"ba9dee5cd23ff3aa7694a7935cc47352947b8ae6c2643a81178e7c5bee7249d6"} Mar 11 02:25:21 crc kubenswrapper[4744]: I0311 02:25:21.351365 4744 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-sb-2" podStartSLOduration=3.351343199 podStartE2EDuration="3.351343199s" podCreationTimestamp="2026-03-11 02:25:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-11 02:25:21.348088638 +0000 UTC m=+5478.152306253" watchObservedRunningTime="2026-03-11 02:25:21.351343199 +0000 UTC m=+5478.155560804" Mar 11 02:25:21 crc kubenswrapper[4744]: I0311 02:25:21.445012 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-0"] Mar 11 02:25:21 crc kubenswrapper[4744]: W0311 02:25:21.451156 4744 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6a398ee8_1d34_45b7_a50b_82d6880407d2.slice/crio-08ec8369987f87bc1b48e47df28f85786ba1bd5386936eeb07a5d3580f6aa351 WatchSource:0}: Error finding container 08ec8369987f87bc1b48e47df28f85786ba1bd5386936eeb07a5d3580f6aa351: Status 404 returned error can't find the container with id 08ec8369987f87bc1b48e47df28f85786ba1bd5386936eeb07a5d3580f6aa351 Mar 11 02:25:21 crc kubenswrapper[4744]: I0311 02:25:21.548252 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-1"] Mar 11 02:25:22 crc kubenswrapper[4744]: I0311 02:25:22.344789 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"6a398ee8-1d34-45b7-a50b-82d6880407d2","Type":"ContainerStarted","Data":"10ebdbeaee102073622903d0a0269ccf545e5e3789373ceeb452947c844484a9"} Mar 11 02:25:22 crc kubenswrapper[4744]: I0311 02:25:22.345235 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/ovsdbserver-nb-0" event={"ID":"6a398ee8-1d34-45b7-a50b-82d6880407d2","Type":"ContainerStarted","Data":"e58bcf960fd653d833e9611bcf82f072598b9dc6fcf675ab7d2f02ed421fde9e"} Mar 11 02:25:22 crc kubenswrapper[4744]: I0311 02:25:22.345277 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"6a398ee8-1d34-45b7-a50b-82d6880407d2","Type":"ContainerStarted","Data":"08ec8369987f87bc1b48e47df28f85786ba1bd5386936eeb07a5d3580f6aa351"} Mar 11 02:25:22 crc kubenswrapper[4744]: I0311 02:25:22.347830 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-1" event={"ID":"371fb375-551c-4443-8f31-5e952e527f4d","Type":"ContainerStarted","Data":"be76c02f63578aba1f9a741b20a4179ee34867a2cd9fda3f230d98d2ae8b1a47"} Mar 11 02:25:22 crc kubenswrapper[4744]: I0311 02:25:22.350139 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"d5efa1d3-6643-4d3e-a06b-2051d5f1664a","Type":"ContainerStarted","Data":"1f9c9923d7d3ced8b5a845227793c416c84c6f0e6de80b7cffeef9299620edc6"} Mar 11 02:25:22 crc kubenswrapper[4744]: I0311 02:25:22.350188 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"d5efa1d3-6643-4d3e-a06b-2051d5f1664a","Type":"ContainerStarted","Data":"6defe27d045d7acdbe00474c01b5ed4f71cb9077ead269bc095e7266a487869b"} Mar 11 02:25:22 crc kubenswrapper[4744]: I0311 02:25:22.353146 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-1" event={"ID":"95b1b29d-10d0-4950-8b51-322683a204ad","Type":"ContainerStarted","Data":"500d883e542e4841e6832c92bb653e7f857960f21421c64d1d7c476696dffa5d"} Mar 11 02:25:22 crc kubenswrapper[4744]: I0311 02:25:22.353207 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-1" 
event={"ID":"95b1b29d-10d0-4950-8b51-322683a204ad","Type":"ContainerStarted","Data":"8d0e914fd940adb96a78ba81d6126aeb1d84225ac9d81159d24b95477023d2df"} Mar 11 02:25:22 crc kubenswrapper[4744]: I0311 02:25:22.353236 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-1" event={"ID":"95b1b29d-10d0-4950-8b51-322683a204ad","Type":"ContainerStarted","Data":"9f912e63a64adc42ec2e5699c1bfb908af93860496faeee00dfe12a944ba9ff4"} Mar 11 02:25:22 crc kubenswrapper[4744]: I0311 02:25:22.391391 4744 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-nb-0" podStartSLOduration=3.391352938 podStartE2EDuration="3.391352938s" podCreationTimestamp="2026-03-11 02:25:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-11 02:25:22.373013313 +0000 UTC m=+5479.177230978" watchObservedRunningTime="2026-03-11 02:25:22.391352938 +0000 UTC m=+5479.195570583" Mar 11 02:25:22 crc kubenswrapper[4744]: I0311 02:25:22.416935 4744 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-sb-0" podStartSLOduration=4.416906895 podStartE2EDuration="4.416906895s" podCreationTimestamp="2026-03-11 02:25:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-11 02:25:22.395265698 +0000 UTC m=+5479.199483333" watchObservedRunningTime="2026-03-11 02:25:22.416906895 +0000 UTC m=+5479.221124540" Mar 11 02:25:22 crc kubenswrapper[4744]: I0311 02:25:22.434241 4744 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-nb-1" podStartSLOduration=3.434212398 podStartE2EDuration="3.434212398s" podCreationTimestamp="2026-03-11 02:25:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-11 
02:25:22.422359603 +0000 UTC m=+5479.226577248" watchObservedRunningTime="2026-03-11 02:25:22.434212398 +0000 UTC m=+5479.238430023" Mar 11 02:25:22 crc kubenswrapper[4744]: I0311 02:25:22.458494 4744 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-sb-1" podStartSLOduration=4.458472455 podStartE2EDuration="4.458472455s" podCreationTimestamp="2026-03-11 02:25:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-11 02:25:22.455381699 +0000 UTC m=+5479.259599344" watchObservedRunningTime="2026-03-11 02:25:22.458472455 +0000 UTC m=+5479.262690070" Mar 11 02:25:22 crc kubenswrapper[4744]: I0311 02:25:22.486848 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-2"] Mar 11 02:25:23 crc kubenswrapper[4744]: I0311 02:25:23.261920 4744 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-sb-2" Mar 11 02:25:23 crc kubenswrapper[4744]: I0311 02:25:23.272254 4744 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-sb-1" Mar 11 02:25:23 crc kubenswrapper[4744]: I0311 02:25:23.328002 4744 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-sb-2" Mar 11 02:25:23 crc kubenswrapper[4744]: I0311 02:25:23.365873 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-2" event={"ID":"69276227-3db4-4f45-b4bf-3595c388000b","Type":"ContainerStarted","Data":"9210cc49673e824d1c40c097fa3e458ce9b40fa5a4c4fcebd257817ced7de57e"} Mar 11 02:25:23 crc kubenswrapper[4744]: I0311 02:25:23.365919 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-2" event={"ID":"69276227-3db4-4f45-b4bf-3595c388000b","Type":"ContainerStarted","Data":"819e0f66fe4003cfffd8f0266a4bf5dcf8a693e1cba86510c1e86d6e1e0cbde2"} Mar 11 02:25:23 crc 
kubenswrapper[4744]: I0311 02:25:23.365932 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-2" event={"ID":"69276227-3db4-4f45-b4bf-3595c388000b","Type":"ContainerStarted","Data":"a994597b849756432a67cc531faedeb99f0653ef263b8ce9d45656c17ff916e8"}
Mar 11 02:25:23 crc kubenswrapper[4744]: I0311 02:25:23.367885 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-sb-2"
Mar 11 02:25:23 crc kubenswrapper[4744]: I0311 02:25:23.400407 4744 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-nb-2" podStartSLOduration=4.400386153 podStartE2EDuration="4.400386153s" podCreationTimestamp="2026-03-11 02:25:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-11 02:25:23.390694324 +0000 UTC m=+5480.194911969" watchObservedRunningTime="2026-03-11 02:25:23.400386153 +0000 UTC m=+5480.204603778"
Mar 11 02:25:23 crc kubenswrapper[4744]: I0311 02:25:23.540704 4744 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-sb-0"
Mar 11 02:25:23 crc kubenswrapper[4744]: I0311 02:25:23.715964 4744 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-kwr6h"
Mar 11 02:25:23 crc kubenswrapper[4744]: I0311 02:25:23.716696 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-kwr6h"
Mar 11 02:25:23 crc kubenswrapper[4744]: I0311 02:25:23.791297 4744 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-kwr6h"
Mar 11 02:25:23 crc kubenswrapper[4744]: I0311 02:25:23.923505 4744 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-nb-0"
Mar 11 02:25:23 crc kubenswrapper[4744]: I0311 02:25:23.938540 4744 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-nb-1"
Mar 11 02:25:23 crc kubenswrapper[4744]: I0311 02:25:23.947809 4744 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-nb-2"
Mar 11 02:25:24 crc kubenswrapper[4744]: I0311 02:25:24.462988 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-kwr6h"
Mar 11 02:25:24 crc kubenswrapper[4744]: I0311 02:25:24.537648 4744 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-kwr6h"]
Mar 11 02:25:25 crc kubenswrapper[4744]: I0311 02:25:25.273145 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-sb-1"
Mar 11 02:25:25 crc kubenswrapper[4744]: I0311 02:25:25.310397 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-sb-2"
Mar 11 02:25:25 crc kubenswrapper[4744]: I0311 02:25:25.540972 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-sb-0"
Mar 11 02:25:25 crc kubenswrapper[4744]: I0311 02:25:25.632367 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-55f698f5f-5sf7b"]
Mar 11 02:25:25 crc kubenswrapper[4744]: I0311 02:25:25.633629 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-55f698f5f-5sf7b"
Mar 11 02:25:25 crc kubenswrapper[4744]: I0311 02:25:25.636868 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovsdbserver-sb"
Mar 11 02:25:25 crc kubenswrapper[4744]: I0311 02:25:25.643325 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-55f698f5f-5sf7b"]
Mar 11 02:25:25 crc kubenswrapper[4744]: I0311 02:25:25.750690 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vlr52\" (UniqueName: \"kubernetes.io/projected/7c6cd15f-e462-4531-8034-5a4da5b67cb0-kube-api-access-vlr52\") pod \"dnsmasq-dns-55f698f5f-5sf7b\" (UID: \"7c6cd15f-e462-4531-8034-5a4da5b67cb0\") " pod="openstack/dnsmasq-dns-55f698f5f-5sf7b"
Mar 11 02:25:25 crc kubenswrapper[4744]: I0311 02:25:25.751004 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7c6cd15f-e462-4531-8034-5a4da5b67cb0-dns-svc\") pod \"dnsmasq-dns-55f698f5f-5sf7b\" (UID: \"7c6cd15f-e462-4531-8034-5a4da5b67cb0\") " pod="openstack/dnsmasq-dns-55f698f5f-5sf7b"
Mar 11 02:25:25 crc kubenswrapper[4744]: I0311 02:25:25.751135 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/7c6cd15f-e462-4531-8034-5a4da5b67cb0-ovsdbserver-sb\") pod \"dnsmasq-dns-55f698f5f-5sf7b\" (UID: \"7c6cd15f-e462-4531-8034-5a4da5b67cb0\") " pod="openstack/dnsmasq-dns-55f698f5f-5sf7b"
Mar 11 02:25:25 crc kubenswrapper[4744]: I0311 02:25:25.751241 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7c6cd15f-e462-4531-8034-5a4da5b67cb0-config\") pod \"dnsmasq-dns-55f698f5f-5sf7b\" (UID: \"7c6cd15f-e462-4531-8034-5a4da5b67cb0\") " pod="openstack/dnsmasq-dns-55f698f5f-5sf7b"
Mar 11 02:25:25 crc kubenswrapper[4744]: I0311 02:25:25.853101 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7c6cd15f-e462-4531-8034-5a4da5b67cb0-dns-svc\") pod \"dnsmasq-dns-55f698f5f-5sf7b\" (UID: \"7c6cd15f-e462-4531-8034-5a4da5b67cb0\") " pod="openstack/dnsmasq-dns-55f698f5f-5sf7b"
Mar 11 02:25:25 crc kubenswrapper[4744]: I0311 02:25:25.853341 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/7c6cd15f-e462-4531-8034-5a4da5b67cb0-ovsdbserver-sb\") pod \"dnsmasq-dns-55f698f5f-5sf7b\" (UID: \"7c6cd15f-e462-4531-8034-5a4da5b67cb0\") " pod="openstack/dnsmasq-dns-55f698f5f-5sf7b"
Mar 11 02:25:25 crc kubenswrapper[4744]: I0311 02:25:25.853451 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7c6cd15f-e462-4531-8034-5a4da5b67cb0-config\") pod \"dnsmasq-dns-55f698f5f-5sf7b\" (UID: \"7c6cd15f-e462-4531-8034-5a4da5b67cb0\") " pod="openstack/dnsmasq-dns-55f698f5f-5sf7b"
Mar 11 02:25:25 crc kubenswrapper[4744]: I0311 02:25:25.853596 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vlr52\" (UniqueName: \"kubernetes.io/projected/7c6cd15f-e462-4531-8034-5a4da5b67cb0-kube-api-access-vlr52\") pod \"dnsmasq-dns-55f698f5f-5sf7b\" (UID: \"7c6cd15f-e462-4531-8034-5a4da5b67cb0\") " pod="openstack/dnsmasq-dns-55f698f5f-5sf7b"
Mar 11 02:25:25 crc kubenswrapper[4744]: I0311 02:25:25.854441 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7c6cd15f-e462-4531-8034-5a4da5b67cb0-dns-svc\") pod \"dnsmasq-dns-55f698f5f-5sf7b\" (UID: \"7c6cd15f-e462-4531-8034-5a4da5b67cb0\") " pod="openstack/dnsmasq-dns-55f698f5f-5sf7b"
Mar 11 02:25:25 crc kubenswrapper[4744]: I0311 02:25:25.854527 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/7c6cd15f-e462-4531-8034-5a4da5b67cb0-ovsdbserver-sb\") pod \"dnsmasq-dns-55f698f5f-5sf7b\" (UID: \"7c6cd15f-e462-4531-8034-5a4da5b67cb0\") " pod="openstack/dnsmasq-dns-55f698f5f-5sf7b"
Mar 11 02:25:25 crc kubenswrapper[4744]: I0311 02:25:25.855339 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7c6cd15f-e462-4531-8034-5a4da5b67cb0-config\") pod \"dnsmasq-dns-55f698f5f-5sf7b\" (UID: \"7c6cd15f-e462-4531-8034-5a4da5b67cb0\") " pod="openstack/dnsmasq-dns-55f698f5f-5sf7b"
Mar 11 02:25:25 crc kubenswrapper[4744]: I0311 02:25:25.887846 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vlr52\" (UniqueName: \"kubernetes.io/projected/7c6cd15f-e462-4531-8034-5a4da5b67cb0-kube-api-access-vlr52\") pod \"dnsmasq-dns-55f698f5f-5sf7b\" (UID: \"7c6cd15f-e462-4531-8034-5a4da5b67cb0\") " pod="openstack/dnsmasq-dns-55f698f5f-5sf7b"
Mar 11 02:25:25 crc kubenswrapper[4744]: I0311 02:25:25.923693 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-nb-0"
Mar 11 02:25:25 crc kubenswrapper[4744]: I0311 02:25:25.938776 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-nb-1"
Mar 11 02:25:25 crc kubenswrapper[4744]: I0311 02:25:25.948589 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-nb-2"
Mar 11 02:25:25 crc kubenswrapper[4744]: I0311 02:25:25.949710 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-55f698f5f-5sf7b"
Mar 11 02:25:26 crc kubenswrapper[4744]: I0311 02:25:26.341411 4744 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-sb-1"
Mar 11 02:25:26 crc kubenswrapper[4744]: I0311 02:25:26.394466 4744 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-kwr6h" podUID="8bf686cc-4b5a-4ff3-a1ed-32f1520cbf80" containerName="registry-server" containerID="cri-o://4952f6bd0f94a82bdde664b23670bad982b6e617d556cdff2471d4487d198832" gracePeriod=2
Mar 11 02:25:26 crc kubenswrapper[4744]: I0311 02:25:26.421393 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-55f698f5f-5sf7b"]
Mar 11 02:25:26 crc kubenswrapper[4744]: I0311 02:25:26.468100 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-sb-1"
Mar 11 02:25:26 crc kubenswrapper[4744]: I0311 02:25:26.588762 4744 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-sb-0"
Mar 11 02:25:26 crc kubenswrapper[4744]: I0311 02:25:26.671410 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-sb-0"
Mar 11 02:25:26 crc kubenswrapper[4744]: I0311 02:25:26.854188 4744 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-kwr6h"
Mar 11 02:25:26 crc kubenswrapper[4744]: I0311 02:25:26.966908 4744 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-nb-0"
Mar 11 02:25:26 crc kubenswrapper[4744]: I0311 02:25:26.976283 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8bf686cc-4b5a-4ff3-a1ed-32f1520cbf80-catalog-content\") pod \"8bf686cc-4b5a-4ff3-a1ed-32f1520cbf80\" (UID: \"8bf686cc-4b5a-4ff3-a1ed-32f1520cbf80\") "
Mar 11 02:25:26 crc kubenswrapper[4744]: I0311 02:25:26.976590 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8bf686cc-4b5a-4ff3-a1ed-32f1520cbf80-utilities\") pod \"8bf686cc-4b5a-4ff3-a1ed-32f1520cbf80\" (UID: \"8bf686cc-4b5a-4ff3-a1ed-32f1520cbf80\") "
Mar 11 02:25:26 crc kubenswrapper[4744]: I0311 02:25:26.976746 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hjmbx\" (UniqueName: \"kubernetes.io/projected/8bf686cc-4b5a-4ff3-a1ed-32f1520cbf80-kube-api-access-hjmbx\") pod \"8bf686cc-4b5a-4ff3-a1ed-32f1520cbf80\" (UID: \"8bf686cc-4b5a-4ff3-a1ed-32f1520cbf80\") "
Mar 11 02:25:26 crc kubenswrapper[4744]: I0311 02:25:26.978432 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8bf686cc-4b5a-4ff3-a1ed-32f1520cbf80-utilities" (OuterVolumeSpecName: "utilities") pod "8bf686cc-4b5a-4ff3-a1ed-32f1520cbf80" (UID: "8bf686cc-4b5a-4ff3-a1ed-32f1520cbf80"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 11 02:25:26 crc kubenswrapper[4744]: I0311 02:25:26.980560 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8bf686cc-4b5a-4ff3-a1ed-32f1520cbf80-kube-api-access-hjmbx" (OuterVolumeSpecName: "kube-api-access-hjmbx") pod "8bf686cc-4b5a-4ff3-a1ed-32f1520cbf80" (UID: "8bf686cc-4b5a-4ff3-a1ed-32f1520cbf80"). InnerVolumeSpecName "kube-api-access-hjmbx". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 11 02:25:26 crc kubenswrapper[4744]: I0311 02:25:26.980814 4744 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-nb-1"
Mar 11 02:25:26 crc kubenswrapper[4744]: I0311 02:25:26.981195 4744 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-nb-2"
Mar 11 02:25:27 crc kubenswrapper[4744]: I0311 02:25:27.004687 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8bf686cc-4b5a-4ff3-a1ed-32f1520cbf80-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "8bf686cc-4b5a-4ff3-a1ed-32f1520cbf80" (UID: "8bf686cc-4b5a-4ff3-a1ed-32f1520cbf80"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 11 02:25:27 crc kubenswrapper[4744]: I0311 02:25:27.008115 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-nb-0"
Mar 11 02:25:27 crc kubenswrapper[4744]: I0311 02:25:27.020420 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-nb-1"
Mar 11 02:25:27 crc kubenswrapper[4744]: I0311 02:25:27.078937 4744 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8bf686cc-4b5a-4ff3-a1ed-32f1520cbf80-catalog-content\") on node \"crc\" DevicePath \"\""
Mar 11 02:25:27 crc kubenswrapper[4744]: I0311 02:25:27.078968 4744 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8bf686cc-4b5a-4ff3-a1ed-32f1520cbf80-utilities\") on node \"crc\" DevicePath \"\""
Mar 11 02:25:27 crc kubenswrapper[4744]: I0311 02:25:27.078982 4744 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hjmbx\" (UniqueName: \"kubernetes.io/projected/8bf686cc-4b5a-4ff3-a1ed-32f1520cbf80-kube-api-access-hjmbx\") on node \"crc\" DevicePath \"\""
Mar 11 02:25:27 crc kubenswrapper[4744]: I0311 02:25:27.311703 4744 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-55f698f5f-5sf7b"]
Mar 11 02:25:27 crc kubenswrapper[4744]: I0311 02:25:27.327952 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-7c76c8f9c5-hc9kl"]
Mar 11 02:25:27 crc kubenswrapper[4744]: E0311 02:25:27.328247 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8bf686cc-4b5a-4ff3-a1ed-32f1520cbf80" containerName="extract-utilities"
Mar 11 02:25:27 crc kubenswrapper[4744]: I0311 02:25:27.328266 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="8bf686cc-4b5a-4ff3-a1ed-32f1520cbf80" containerName="extract-utilities"
Mar 11 02:25:27 crc kubenswrapper[4744]: E0311 02:25:27.328278 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8bf686cc-4b5a-4ff3-a1ed-32f1520cbf80" containerName="registry-server"
Mar 11 02:25:27 crc kubenswrapper[4744]: I0311 02:25:27.328284 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="8bf686cc-4b5a-4ff3-a1ed-32f1520cbf80" containerName="registry-server"
Mar 11 02:25:27 crc kubenswrapper[4744]: E0311 02:25:27.328305 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8bf686cc-4b5a-4ff3-a1ed-32f1520cbf80" containerName="extract-content"
Mar 11 02:25:27 crc kubenswrapper[4744]: I0311 02:25:27.328312 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="8bf686cc-4b5a-4ff3-a1ed-32f1520cbf80" containerName="extract-content"
Mar 11 02:25:27 crc kubenswrapper[4744]: I0311 02:25:27.328466 4744 memory_manager.go:354] "RemoveStaleState removing state" podUID="8bf686cc-4b5a-4ff3-a1ed-32f1520cbf80" containerName="registry-server"
Mar 11 02:25:27 crc kubenswrapper[4744]: I0311 02:25:27.331959 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7c76c8f9c5-hc9kl"
Mar 11 02:25:27 crc kubenswrapper[4744]: I0311 02:25:27.334687 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovsdbserver-nb"
Mar 11 02:25:27 crc kubenswrapper[4744]: I0311 02:25:27.366614 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7c76c8f9c5-hc9kl"]
Mar 11 02:25:27 crc kubenswrapper[4744]: I0311 02:25:27.383075 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/4f1a10b7-2665-4a3b-a85d-f0e4831d8e24-ovsdbserver-sb\") pod \"dnsmasq-dns-7c76c8f9c5-hc9kl\" (UID: \"4f1a10b7-2665-4a3b-a85d-f0e4831d8e24\") " pod="openstack/dnsmasq-dns-7c76c8f9c5-hc9kl"
Mar 11 02:25:27 crc kubenswrapper[4744]: I0311 02:25:27.383132 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rxcwd\" (UniqueName: \"kubernetes.io/projected/4f1a10b7-2665-4a3b-a85d-f0e4831d8e24-kube-api-access-rxcwd\") pod \"dnsmasq-dns-7c76c8f9c5-hc9kl\" (UID: \"4f1a10b7-2665-4a3b-a85d-f0e4831d8e24\") " pod="openstack/dnsmasq-dns-7c76c8f9c5-hc9kl"
Mar 11 02:25:27 crc kubenswrapper[4744]: I0311 02:25:27.383157 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/4f1a10b7-2665-4a3b-a85d-f0e4831d8e24-ovsdbserver-nb\") pod \"dnsmasq-dns-7c76c8f9c5-hc9kl\" (UID: \"4f1a10b7-2665-4a3b-a85d-f0e4831d8e24\") " pod="openstack/dnsmasq-dns-7c76c8f9c5-hc9kl"
Mar 11 02:25:27 crc kubenswrapper[4744]: I0311 02:25:27.383199 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4f1a10b7-2665-4a3b-a85d-f0e4831d8e24-dns-svc\") pod \"dnsmasq-dns-7c76c8f9c5-hc9kl\" (UID: \"4f1a10b7-2665-4a3b-a85d-f0e4831d8e24\") " pod="openstack/dnsmasq-dns-7c76c8f9c5-hc9kl"
Mar 11 02:25:27 crc kubenswrapper[4744]: I0311 02:25:27.383249 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4f1a10b7-2665-4a3b-a85d-f0e4831d8e24-config\") pod \"dnsmasq-dns-7c76c8f9c5-hc9kl\" (UID: \"4f1a10b7-2665-4a3b-a85d-f0e4831d8e24\") " pod="openstack/dnsmasq-dns-7c76c8f9c5-hc9kl"
Mar 11 02:25:27 crc kubenswrapper[4744]: I0311 02:25:27.404226 4744 generic.go:334] "Generic (PLEG): container finished" podID="7c6cd15f-e462-4531-8034-5a4da5b67cb0" containerID="8e102e2161ec3033ce818b4c9a707b951130b1955362ce4eccd293eece9e7dfe" exitCode=0
Mar 11 02:25:27 crc kubenswrapper[4744]: I0311 02:25:27.404306 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-55f698f5f-5sf7b" event={"ID":"7c6cd15f-e462-4531-8034-5a4da5b67cb0","Type":"ContainerDied","Data":"8e102e2161ec3033ce818b4c9a707b951130b1955362ce4eccd293eece9e7dfe"}
Mar 11 02:25:27 crc kubenswrapper[4744]: I0311 02:25:27.404338 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-55f698f5f-5sf7b" event={"ID":"7c6cd15f-e462-4531-8034-5a4da5b67cb0","Type":"ContainerStarted","Data":"b2bc2404055ab4981f803364856bad59db105ae07a161cc04baa9143088a276c"}
Mar 11 02:25:27 crc kubenswrapper[4744]: I0311 02:25:27.408390 4744 generic.go:334] "Generic (PLEG): container finished" podID="8bf686cc-4b5a-4ff3-a1ed-32f1520cbf80" containerID="4952f6bd0f94a82bdde664b23670bad982b6e617d556cdff2471d4487d198832" exitCode=0
Mar 11 02:25:27 crc kubenswrapper[4744]: I0311 02:25:27.408471 4744 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-kwr6h"
Mar 11 02:25:27 crc kubenswrapper[4744]: I0311 02:25:27.408523 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-kwr6h" event={"ID":"8bf686cc-4b5a-4ff3-a1ed-32f1520cbf80","Type":"ContainerDied","Data":"4952f6bd0f94a82bdde664b23670bad982b6e617d556cdff2471d4487d198832"}
Mar 11 02:25:27 crc kubenswrapper[4744]: I0311 02:25:27.408666 4744 scope.go:117] "RemoveContainer" containerID="4952f6bd0f94a82bdde664b23670bad982b6e617d556cdff2471d4487d198832"
Mar 11 02:25:27 crc kubenswrapper[4744]: I0311 02:25:27.409124 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-kwr6h" event={"ID":"8bf686cc-4b5a-4ff3-a1ed-32f1520cbf80","Type":"ContainerDied","Data":"19796aa5c22cd014a94700b9f240f679c14288adb8fda06c91139f729c633ee3"}
Mar 11 02:25:27 crc kubenswrapper[4744]: I0311 02:25:27.452950 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-nb-2"
Mar 11 02:25:27 crc kubenswrapper[4744]: I0311 02:25:27.493858 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/4f1a10b7-2665-4a3b-a85d-f0e4831d8e24-ovsdbserver-sb\") pod \"dnsmasq-dns-7c76c8f9c5-hc9kl\" (UID: \"4f1a10b7-2665-4a3b-a85d-f0e4831d8e24\") " pod="openstack/dnsmasq-dns-7c76c8f9c5-hc9kl"
Mar 11 02:25:27 crc kubenswrapper[4744]: I0311 02:25:27.494008 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rxcwd\" (UniqueName: \"kubernetes.io/projected/4f1a10b7-2665-4a3b-a85d-f0e4831d8e24-kube-api-access-rxcwd\") pod \"dnsmasq-dns-7c76c8f9c5-hc9kl\" (UID: \"4f1a10b7-2665-4a3b-a85d-f0e4831d8e24\") " pod="openstack/dnsmasq-dns-7c76c8f9c5-hc9kl"
Mar 11 02:25:27 crc kubenswrapper[4744]: I0311 02:25:27.494236 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/4f1a10b7-2665-4a3b-a85d-f0e4831d8e24-ovsdbserver-nb\") pod \"dnsmasq-dns-7c76c8f9c5-hc9kl\" (UID: \"4f1a10b7-2665-4a3b-a85d-f0e4831d8e24\") " pod="openstack/dnsmasq-dns-7c76c8f9c5-hc9kl"
Mar 11 02:25:27 crc kubenswrapper[4744]: I0311 02:25:27.494376 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4f1a10b7-2665-4a3b-a85d-f0e4831d8e24-dns-svc\") pod \"dnsmasq-dns-7c76c8f9c5-hc9kl\" (UID: \"4f1a10b7-2665-4a3b-a85d-f0e4831d8e24\") " pod="openstack/dnsmasq-dns-7c76c8f9c5-hc9kl"
Mar 11 02:25:27 crc kubenswrapper[4744]: I0311 02:25:27.494618 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4f1a10b7-2665-4a3b-a85d-f0e4831d8e24-config\") pod \"dnsmasq-dns-7c76c8f9c5-hc9kl\" (UID: \"4f1a10b7-2665-4a3b-a85d-f0e4831d8e24\") " pod="openstack/dnsmasq-dns-7c76c8f9c5-hc9kl"
Mar 11 02:25:27 crc kubenswrapper[4744]: I0311 02:25:27.497128 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/4f1a10b7-2665-4a3b-a85d-f0e4831d8e24-ovsdbserver-sb\") pod \"dnsmasq-dns-7c76c8f9c5-hc9kl\" (UID: \"4f1a10b7-2665-4a3b-a85d-f0e4831d8e24\") " pod="openstack/dnsmasq-dns-7c76c8f9c5-hc9kl"
Mar 11 02:25:27 crc kubenswrapper[4744]: I0311 02:25:27.501780 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/4f1a10b7-2665-4a3b-a85d-f0e4831d8e24-ovsdbserver-nb\") pod \"dnsmasq-dns-7c76c8f9c5-hc9kl\" (UID: \"4f1a10b7-2665-4a3b-a85d-f0e4831d8e24\") " pod="openstack/dnsmasq-dns-7c76c8f9c5-hc9kl"
Mar 11 02:25:27 crc kubenswrapper[4744]: I0311 02:25:27.504465 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4f1a10b7-2665-4a3b-a85d-f0e4831d8e24-config\") pod \"dnsmasq-dns-7c76c8f9c5-hc9kl\" (UID: \"4f1a10b7-2665-4a3b-a85d-f0e4831d8e24\") " pod="openstack/dnsmasq-dns-7c76c8f9c5-hc9kl"
Mar 11 02:25:27 crc kubenswrapper[4744]: I0311 02:25:27.553350 4744 scope.go:117] "RemoveContainer" containerID="65c01ccab4e886c72c8141100f9793603557fab2d48b3854eb1f82669f4a575a"
Mar 11 02:25:27 crc kubenswrapper[4744]: I0311 02:25:27.562293 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rxcwd\" (UniqueName: \"kubernetes.io/projected/4f1a10b7-2665-4a3b-a85d-f0e4831d8e24-kube-api-access-rxcwd\") pod \"dnsmasq-dns-7c76c8f9c5-hc9kl\" (UID: \"4f1a10b7-2665-4a3b-a85d-f0e4831d8e24\") " pod="openstack/dnsmasq-dns-7c76c8f9c5-hc9kl"
Mar 11 02:25:27 crc kubenswrapper[4744]: I0311 02:25:27.571597 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4f1a10b7-2665-4a3b-a85d-f0e4831d8e24-dns-svc\") pod \"dnsmasq-dns-7c76c8f9c5-hc9kl\" (UID: \"4f1a10b7-2665-4a3b-a85d-f0e4831d8e24\") " pod="openstack/dnsmasq-dns-7c76c8f9c5-hc9kl"
Mar 11 02:25:27 crc kubenswrapper[4744]: I0311 02:25:27.599323 4744 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-kwr6h"]
Mar 11 02:25:27 crc kubenswrapper[4744]: I0311 02:25:27.606181 4744 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-kwr6h"]
Mar 11 02:25:27 crc kubenswrapper[4744]: I0311 02:25:27.657062 4744 scope.go:117] "RemoveContainer" containerID="34b4755a23c77de86dc7fc7905fa5314caa01f7eabaf013f0a8ed424523e2a83"
Mar 11 02:25:27 crc kubenswrapper[4744]: I0311 02:25:27.657569 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7c76c8f9c5-hc9kl"
Mar 11 02:25:27 crc kubenswrapper[4744]: I0311 02:25:27.730661 4744 scope.go:117] "RemoveContainer" containerID="4952f6bd0f94a82bdde664b23670bad982b6e617d556cdff2471d4487d198832"
Mar 11 02:25:27 crc kubenswrapper[4744]: E0311 02:25:27.731197 4744 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4952f6bd0f94a82bdde664b23670bad982b6e617d556cdff2471d4487d198832\": container with ID starting with 4952f6bd0f94a82bdde664b23670bad982b6e617d556cdff2471d4487d198832 not found: ID does not exist" containerID="4952f6bd0f94a82bdde664b23670bad982b6e617d556cdff2471d4487d198832"
Mar 11 02:25:27 crc kubenswrapper[4744]: I0311 02:25:27.731225 4744 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4952f6bd0f94a82bdde664b23670bad982b6e617d556cdff2471d4487d198832"} err="failed to get container status \"4952f6bd0f94a82bdde664b23670bad982b6e617d556cdff2471d4487d198832\": rpc error: code = NotFound desc = could not find container \"4952f6bd0f94a82bdde664b23670bad982b6e617d556cdff2471d4487d198832\": container with ID starting with 4952f6bd0f94a82bdde664b23670bad982b6e617d556cdff2471d4487d198832 not found: ID does not exist"
Mar 11 02:25:27 crc kubenswrapper[4744]: I0311 02:25:27.731243 4744 scope.go:117] "RemoveContainer" containerID="65c01ccab4e886c72c8141100f9793603557fab2d48b3854eb1f82669f4a575a"
Mar 11 02:25:27 crc kubenswrapper[4744]: E0311 02:25:27.731450 4744 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"65c01ccab4e886c72c8141100f9793603557fab2d48b3854eb1f82669f4a575a\": container with ID starting with 65c01ccab4e886c72c8141100f9793603557fab2d48b3854eb1f82669f4a575a not found: ID does not exist" containerID="65c01ccab4e886c72c8141100f9793603557fab2d48b3854eb1f82669f4a575a"
Mar 11 02:25:27 crc kubenswrapper[4744]: I0311 02:25:27.731483 4744 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"65c01ccab4e886c72c8141100f9793603557fab2d48b3854eb1f82669f4a575a"} err="failed to get container status \"65c01ccab4e886c72c8141100f9793603557fab2d48b3854eb1f82669f4a575a\": rpc error: code = NotFound desc = could not find container \"65c01ccab4e886c72c8141100f9793603557fab2d48b3854eb1f82669f4a575a\": container with ID starting with 65c01ccab4e886c72c8141100f9793603557fab2d48b3854eb1f82669f4a575a not found: ID does not exist"
Mar 11 02:25:27 crc kubenswrapper[4744]: I0311 02:25:27.731509 4744 scope.go:117] "RemoveContainer" containerID="34b4755a23c77de86dc7fc7905fa5314caa01f7eabaf013f0a8ed424523e2a83"
Mar 11 02:25:27 crc kubenswrapper[4744]: E0311 02:25:27.731998 4744 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"34b4755a23c77de86dc7fc7905fa5314caa01f7eabaf013f0a8ed424523e2a83\": container with ID starting with 34b4755a23c77de86dc7fc7905fa5314caa01f7eabaf013f0a8ed424523e2a83 not found: ID does not exist" containerID="34b4755a23c77de86dc7fc7905fa5314caa01f7eabaf013f0a8ed424523e2a83"
Mar 11 02:25:27 crc kubenswrapper[4744]: I0311 02:25:27.732031 4744 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"34b4755a23c77de86dc7fc7905fa5314caa01f7eabaf013f0a8ed424523e2a83"} err="failed to get container status \"34b4755a23c77de86dc7fc7905fa5314caa01f7eabaf013f0a8ed424523e2a83\": rpc error: code = NotFound desc = could not find container \"34b4755a23c77de86dc7fc7905fa5314caa01f7eabaf013f0a8ed424523e2a83\": container with ID starting with 34b4755a23c77de86dc7fc7905fa5314caa01f7eabaf013f0a8ed424523e2a83 not found: ID does not exist"
Mar 11 02:25:27 crc kubenswrapper[4744]: I0311 02:25:27.985392 4744 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8bf686cc-4b5a-4ff3-a1ed-32f1520cbf80" path="/var/lib/kubelet/pods/8bf686cc-4b5a-4ff3-a1ed-32f1520cbf80/volumes"
Mar 11 02:25:28 crc kubenswrapper[4744]: I0311 02:25:28.138740 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7c76c8f9c5-hc9kl"]
Mar 11 02:25:28 crc kubenswrapper[4744]: I0311 02:25:28.418188 4744 generic.go:334] "Generic (PLEG): container finished" podID="4f1a10b7-2665-4a3b-a85d-f0e4831d8e24" containerID="1049a69a35821280aaf43a1fac4134418d68046771aaeedcf51feb8bb6b1ef56" exitCode=0
Mar 11 02:25:28 crc kubenswrapper[4744]: I0311 02:25:28.418333 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7c76c8f9c5-hc9kl" event={"ID":"4f1a10b7-2665-4a3b-a85d-f0e4831d8e24","Type":"ContainerDied","Data":"1049a69a35821280aaf43a1fac4134418d68046771aaeedcf51feb8bb6b1ef56"}
Mar 11 02:25:28 crc kubenswrapper[4744]: I0311 02:25:28.418621 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7c76c8f9c5-hc9kl" event={"ID":"4f1a10b7-2665-4a3b-a85d-f0e4831d8e24","Type":"ContainerStarted","Data":"7628c42b959d40b90261e261cd00b42518b2c4e61e0c7c19d7558bdff8214ac9"}
Mar 11 02:25:28 crc kubenswrapper[4744]: I0311 02:25:28.422235 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-55f698f5f-5sf7b" event={"ID":"7c6cd15f-e462-4531-8034-5a4da5b67cb0","Type":"ContainerStarted","Data":"d02e379823697a32e4e0524a3158f941ac5d8f8cb239bb7d897a67d9a5175103"}
Mar 11 02:25:28 crc kubenswrapper[4744]: I0311 02:25:28.422561 4744 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-55f698f5f-5sf7b" podUID="7c6cd15f-e462-4531-8034-5a4da5b67cb0" containerName="dnsmasq-dns" containerID="cri-o://d02e379823697a32e4e0524a3158f941ac5d8f8cb239bb7d897a67d9a5175103" gracePeriod=10
Mar 11 02:25:28 crc kubenswrapper[4744]: I0311 02:25:28.422711 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-55f698f5f-5sf7b"
Mar 11 02:25:28 crc kubenswrapper[4744]: I0311 02:25:28.479059 4744 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-55f698f5f-5sf7b" podStartSLOduration=3.479041572 podStartE2EDuration="3.479041572s" podCreationTimestamp="2026-03-11 02:25:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-11 02:25:28.477341569 +0000 UTC m=+5485.281559204" watchObservedRunningTime="2026-03-11 02:25:28.479041572 +0000 UTC m=+5485.283259187"
Mar 11 02:25:28 crc kubenswrapper[4744]: I0311 02:25:28.795225 4744 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-55f698f5f-5sf7b"
Mar 11 02:25:28 crc kubenswrapper[4744]: I0311 02:25:28.836729 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vlr52\" (UniqueName: \"kubernetes.io/projected/7c6cd15f-e462-4531-8034-5a4da5b67cb0-kube-api-access-vlr52\") pod \"7c6cd15f-e462-4531-8034-5a4da5b67cb0\" (UID: \"7c6cd15f-e462-4531-8034-5a4da5b67cb0\") "
Mar 11 02:25:28 crc kubenswrapper[4744]: I0311 02:25:28.836774 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7c6cd15f-e462-4531-8034-5a4da5b67cb0-config\") pod \"7c6cd15f-e462-4531-8034-5a4da5b67cb0\" (UID: \"7c6cd15f-e462-4531-8034-5a4da5b67cb0\") "
Mar 11 02:25:28 crc kubenswrapper[4744]: I0311 02:25:28.836803 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7c6cd15f-e462-4531-8034-5a4da5b67cb0-dns-svc\") pod \"7c6cd15f-e462-4531-8034-5a4da5b67cb0\" (UID: \"7c6cd15f-e462-4531-8034-5a4da5b67cb0\") "
Mar 11 02:25:28 crc kubenswrapper[4744]: I0311 02:25:28.836836 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/7c6cd15f-e462-4531-8034-5a4da5b67cb0-ovsdbserver-sb\") pod \"7c6cd15f-e462-4531-8034-5a4da5b67cb0\" (UID: \"7c6cd15f-e462-4531-8034-5a4da5b67cb0\") "
Mar 11 02:25:28 crc kubenswrapper[4744]: I0311 02:25:28.847294 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7c6cd15f-e462-4531-8034-5a4da5b67cb0-kube-api-access-vlr52" (OuterVolumeSpecName: "kube-api-access-vlr52") pod "7c6cd15f-e462-4531-8034-5a4da5b67cb0" (UID: "7c6cd15f-e462-4531-8034-5a4da5b67cb0"). InnerVolumeSpecName "kube-api-access-vlr52". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 11 02:25:28 crc kubenswrapper[4744]: I0311 02:25:28.891235 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7c6cd15f-e462-4531-8034-5a4da5b67cb0-config" (OuterVolumeSpecName: "config") pod "7c6cd15f-e462-4531-8034-5a4da5b67cb0" (UID: "7c6cd15f-e462-4531-8034-5a4da5b67cb0"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 11 02:25:28 crc kubenswrapper[4744]: I0311 02:25:28.896484 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7c6cd15f-e462-4531-8034-5a4da5b67cb0-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "7c6cd15f-e462-4531-8034-5a4da5b67cb0" (UID: "7c6cd15f-e462-4531-8034-5a4da5b67cb0"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 11 02:25:28 crc kubenswrapper[4744]: I0311 02:25:28.913920 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7c6cd15f-e462-4531-8034-5a4da5b67cb0-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "7c6cd15f-e462-4531-8034-5a4da5b67cb0" (UID: "7c6cd15f-e462-4531-8034-5a4da5b67cb0"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 11 02:25:28 crc kubenswrapper[4744]: I0311 02:25:28.938276 4744 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7c6cd15f-e462-4531-8034-5a4da5b67cb0-dns-svc\") on node \"crc\" DevicePath \"\""
Mar 11 02:25:28 crc kubenswrapper[4744]: I0311 02:25:28.938313 4744 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/7c6cd15f-e462-4531-8034-5a4da5b67cb0-ovsdbserver-sb\") on node \"crc\" DevicePath \"\""
Mar 11 02:25:28 crc kubenswrapper[4744]: I0311 02:25:28.938328 4744 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vlr52\" (UniqueName: \"kubernetes.io/projected/7c6cd15f-e462-4531-8034-5a4da5b67cb0-kube-api-access-vlr52\") on node \"crc\" DevicePath \"\""
Mar 11 02:25:28 crc kubenswrapper[4744]: I0311 02:25:28.938341 4744 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7c6cd15f-e462-4531-8034-5a4da5b67cb0-config\") on node \"crc\" DevicePath \"\""
Mar 11 02:25:29 crc kubenswrapper[4744]: I0311 02:25:29.435608 4744 generic.go:334] "Generic (PLEG): container finished" podID="7c6cd15f-e462-4531-8034-5a4da5b67cb0" containerID="d02e379823697a32e4e0524a3158f941ac5d8f8cb239bb7d897a67d9a5175103" exitCode=0
Mar 11 02:25:29 crc kubenswrapper[4744]: I0311 02:25:29.435677 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-55f698f5f-5sf7b" event={"ID":"7c6cd15f-e462-4531-8034-5a4da5b67cb0","Type":"ContainerDied","Data":"d02e379823697a32e4e0524a3158f941ac5d8f8cb239bb7d897a67d9a5175103"}
Mar 11 02:25:29 crc kubenswrapper[4744]: I0311 02:25:29.436045 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-55f698f5f-5sf7b" event={"ID":"7c6cd15f-e462-4531-8034-5a4da5b67cb0","Type":"ContainerDied","Data":"b2bc2404055ab4981f803364856bad59db105ae07a161cc04baa9143088a276c"}
Mar 11 02:25:29 crc kubenswrapper[4744]: I0311 02:25:29.436071 4744 scope.go:117] "RemoveContainer" containerID="d02e379823697a32e4e0524a3158f941ac5d8f8cb239bb7d897a67d9a5175103"
Mar 11 02:25:29 crc kubenswrapper[4744]: I0311 02:25:29.435730 4744 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-55f698f5f-5sf7b"
Mar 11 02:25:29 crc kubenswrapper[4744]: I0311 02:25:29.441752 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7c76c8f9c5-hc9kl" event={"ID":"4f1a10b7-2665-4a3b-a85d-f0e4831d8e24","Type":"ContainerStarted","Data":"1a5b0273748bd5eaafaace5f1448a1d0f51d979c5bb0dd338f4fb22e28db547a"}
Mar 11 02:25:29 crc kubenswrapper[4744]: I0311 02:25:29.442069 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-7c76c8f9c5-hc9kl"
Mar 11 02:25:29 crc kubenswrapper[4744]: I0311 02:25:29.467599 4744 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-7c76c8f9c5-hc9kl" podStartSLOduration=2.467578247 podStartE2EDuration="2.467578247s" podCreationTimestamp="2026-03-11 02:25:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-11 02:25:29.463292874 +0000 UTC m=+5486.267510499" watchObservedRunningTime="2026-03-11 02:25:29.467578247 +0000 UTC m=+5486.271795852"
Mar 11 02:25:29 crc kubenswrapper[4744]: I0311 02:25:29.474504 4744 scope.go:117] "RemoveContainer" containerID="8e102e2161ec3033ce818b4c9a707b951130b1955362ce4eccd293eece9e7dfe"
Mar 11 02:25:29 crc kubenswrapper[4744]: I0311 02:25:29.502927 4744 scope.go:117] "RemoveContainer" containerID="d02e379823697a32e4e0524a3158f941ac5d8f8cb239bb7d897a67d9a5175103"
Mar 11 02:25:29 crc kubenswrapper[4744]: I0311 02:25:29.503034 4744 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-55f698f5f-5sf7b"]
Mar 11 02:25:29 crc kubenswrapper[4744]: E0311 
02:25:29.503595 4744 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d02e379823697a32e4e0524a3158f941ac5d8f8cb239bb7d897a67d9a5175103\": container with ID starting with d02e379823697a32e4e0524a3158f941ac5d8f8cb239bb7d897a67d9a5175103 not found: ID does not exist" containerID="d02e379823697a32e4e0524a3158f941ac5d8f8cb239bb7d897a67d9a5175103" Mar 11 02:25:29 crc kubenswrapper[4744]: I0311 02:25:29.503653 4744 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d02e379823697a32e4e0524a3158f941ac5d8f8cb239bb7d897a67d9a5175103"} err="failed to get container status \"d02e379823697a32e4e0524a3158f941ac5d8f8cb239bb7d897a67d9a5175103\": rpc error: code = NotFound desc = could not find container \"d02e379823697a32e4e0524a3158f941ac5d8f8cb239bb7d897a67d9a5175103\": container with ID starting with d02e379823697a32e4e0524a3158f941ac5d8f8cb239bb7d897a67d9a5175103 not found: ID does not exist" Mar 11 02:25:29 crc kubenswrapper[4744]: I0311 02:25:29.503686 4744 scope.go:117] "RemoveContainer" containerID="8e102e2161ec3033ce818b4c9a707b951130b1955362ce4eccd293eece9e7dfe" Mar 11 02:25:29 crc kubenswrapper[4744]: E0311 02:25:29.504115 4744 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8e102e2161ec3033ce818b4c9a707b951130b1955362ce4eccd293eece9e7dfe\": container with ID starting with 8e102e2161ec3033ce818b4c9a707b951130b1955362ce4eccd293eece9e7dfe not found: ID does not exist" containerID="8e102e2161ec3033ce818b4c9a707b951130b1955362ce4eccd293eece9e7dfe" Mar 11 02:25:29 crc kubenswrapper[4744]: I0311 02:25:29.504140 4744 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8e102e2161ec3033ce818b4c9a707b951130b1955362ce4eccd293eece9e7dfe"} err="failed to get container status \"8e102e2161ec3033ce818b4c9a707b951130b1955362ce4eccd293eece9e7dfe\": rpc 
error: code = NotFound desc = could not find container \"8e102e2161ec3033ce818b4c9a707b951130b1955362ce4eccd293eece9e7dfe\": container with ID starting with 8e102e2161ec3033ce818b4c9a707b951130b1955362ce4eccd293eece9e7dfe not found: ID does not exist" Mar 11 02:25:29 crc kubenswrapper[4744]: I0311 02:25:29.517059 4744 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-55f698f5f-5sf7b"] Mar 11 02:25:29 crc kubenswrapper[4744]: I0311 02:25:29.996139 4744 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7c6cd15f-e462-4531-8034-5a4da5b67cb0" path="/var/lib/kubelet/pods/7c6cd15f-e462-4531-8034-5a4da5b67cb0/volumes" Mar 11 02:25:30 crc kubenswrapper[4744]: I0311 02:25:30.356811 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-copy-data"] Mar 11 02:25:30 crc kubenswrapper[4744]: E0311 02:25:30.357704 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7c6cd15f-e462-4531-8034-5a4da5b67cb0" containerName="init" Mar 11 02:25:30 crc kubenswrapper[4744]: I0311 02:25:30.357882 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="7c6cd15f-e462-4531-8034-5a4da5b67cb0" containerName="init" Mar 11 02:25:30 crc kubenswrapper[4744]: E0311 02:25:30.358065 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7c6cd15f-e462-4531-8034-5a4da5b67cb0" containerName="dnsmasq-dns" Mar 11 02:25:30 crc kubenswrapper[4744]: I0311 02:25:30.358201 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="7c6cd15f-e462-4531-8034-5a4da5b67cb0" containerName="dnsmasq-dns" Mar 11 02:25:30 crc kubenswrapper[4744]: I0311 02:25:30.358654 4744 memory_manager.go:354] "RemoveStaleState removing state" podUID="7c6cd15f-e462-4531-8034-5a4da5b67cb0" containerName="dnsmasq-dns" Mar 11 02:25:30 crc kubenswrapper[4744]: I0311 02:25:30.359674 4744 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-copy-data" Mar 11 02:25:30 crc kubenswrapper[4744]: I0311 02:25:30.361981 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovn-data-cert" Mar 11 02:25:30 crc kubenswrapper[4744]: I0311 02:25:30.365578 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-copy-data"] Mar 11 02:25:30 crc kubenswrapper[4744]: I0311 02:25:30.465068 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nw4cz\" (UniqueName: \"kubernetes.io/projected/deb2a1ca-b54e-4439-8889-67b4e9407b3b-kube-api-access-nw4cz\") pod \"ovn-copy-data\" (UID: \"deb2a1ca-b54e-4439-8889-67b4e9407b3b\") " pod="openstack/ovn-copy-data" Mar 11 02:25:30 crc kubenswrapper[4744]: I0311 02:25:30.465374 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-data-cert\" (UniqueName: \"kubernetes.io/secret/deb2a1ca-b54e-4439-8889-67b4e9407b3b-ovn-data-cert\") pod \"ovn-copy-data\" (UID: \"deb2a1ca-b54e-4439-8889-67b4e9407b3b\") " pod="openstack/ovn-copy-data" Mar 11 02:25:30 crc kubenswrapper[4744]: I0311 02:25:30.465461 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-0785399a-19c7-46a6-853c-69ec31602521\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-0785399a-19c7-46a6-853c-69ec31602521\") pod \"ovn-copy-data\" (UID: \"deb2a1ca-b54e-4439-8889-67b4e9407b3b\") " pod="openstack/ovn-copy-data" Mar 11 02:25:30 crc kubenswrapper[4744]: I0311 02:25:30.567691 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-0785399a-19c7-46a6-853c-69ec31602521\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-0785399a-19c7-46a6-853c-69ec31602521\") pod \"ovn-copy-data\" (UID: \"deb2a1ca-b54e-4439-8889-67b4e9407b3b\") " pod="openstack/ovn-copy-data" Mar 11 02:25:30 crc kubenswrapper[4744]: I0311 
02:25:30.567867 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nw4cz\" (UniqueName: \"kubernetes.io/projected/deb2a1ca-b54e-4439-8889-67b4e9407b3b-kube-api-access-nw4cz\") pod \"ovn-copy-data\" (UID: \"deb2a1ca-b54e-4439-8889-67b4e9407b3b\") " pod="openstack/ovn-copy-data" Mar 11 02:25:30 crc kubenswrapper[4744]: I0311 02:25:30.568026 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-data-cert\" (UniqueName: \"kubernetes.io/secret/deb2a1ca-b54e-4439-8889-67b4e9407b3b-ovn-data-cert\") pod \"ovn-copy-data\" (UID: \"deb2a1ca-b54e-4439-8889-67b4e9407b3b\") " pod="openstack/ovn-copy-data" Mar 11 02:25:30 crc kubenswrapper[4744]: I0311 02:25:30.571896 4744 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Mar 11 02:25:30 crc kubenswrapper[4744]: I0311 02:25:30.571958 4744 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-0785399a-19c7-46a6-853c-69ec31602521\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-0785399a-19c7-46a6-853c-69ec31602521\") pod \"ovn-copy-data\" (UID: \"deb2a1ca-b54e-4439-8889-67b4e9407b3b\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/d5014f9158958919f435494f533bce2253e91973b34d88eb53c363fbee19a185/globalmount\"" pod="openstack/ovn-copy-data" Mar 11 02:25:30 crc kubenswrapper[4744]: I0311 02:25:30.574742 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-data-cert\" (UniqueName: \"kubernetes.io/secret/deb2a1ca-b54e-4439-8889-67b4e9407b3b-ovn-data-cert\") pod \"ovn-copy-data\" (UID: \"deb2a1ca-b54e-4439-8889-67b4e9407b3b\") " pod="openstack/ovn-copy-data" Mar 11 02:25:30 crc kubenswrapper[4744]: I0311 02:25:30.592783 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nw4cz\" (UniqueName: 
\"kubernetes.io/projected/deb2a1ca-b54e-4439-8889-67b4e9407b3b-kube-api-access-nw4cz\") pod \"ovn-copy-data\" (UID: \"deb2a1ca-b54e-4439-8889-67b4e9407b3b\") " pod="openstack/ovn-copy-data" Mar 11 02:25:30 crc kubenswrapper[4744]: I0311 02:25:30.642401 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-0785399a-19c7-46a6-853c-69ec31602521\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-0785399a-19c7-46a6-853c-69ec31602521\") pod \"ovn-copy-data\" (UID: \"deb2a1ca-b54e-4439-8889-67b4e9407b3b\") " pod="openstack/ovn-copy-data" Mar 11 02:25:30 crc kubenswrapper[4744]: I0311 02:25:30.682480 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-copy-data" Mar 11 02:25:31 crc kubenswrapper[4744]: I0311 02:25:31.349140 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-copy-data"] Mar 11 02:25:31 crc kubenswrapper[4744]: I0311 02:25:31.462848 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-copy-data" event={"ID":"deb2a1ca-b54e-4439-8889-67b4e9407b3b","Type":"ContainerStarted","Data":"6e615dba336620a052ef21d9d8cc1df20684581fac6fdb827ce8d78285a3c04e"} Mar 11 02:25:34 crc kubenswrapper[4744]: I0311 02:25:34.492483 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-copy-data" event={"ID":"deb2a1ca-b54e-4439-8889-67b4e9407b3b","Type":"ContainerStarted","Data":"ec34852624dbc19ffdf66ad75490dae88d5c78e36613058b2985e9710d5e3116"} Mar 11 02:25:37 crc kubenswrapper[4744]: I0311 02:25:37.658770 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-7c76c8f9c5-hc9kl" Mar 11 02:25:37 crc kubenswrapper[4744]: I0311 02:25:37.692254 4744 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-copy-data" podStartSLOduration=5.819128939 podStartE2EDuration="8.692235983s" podCreationTimestamp="2026-03-11 02:25:29 +0000 UTC" 
firstStartedPulling="2026-03-11 02:25:31.357755018 +0000 UTC m=+5488.161972653" lastFinishedPulling="2026-03-11 02:25:34.230862072 +0000 UTC m=+5491.035079697" observedRunningTime="2026-03-11 02:25:34.512185416 +0000 UTC m=+5491.316403421" watchObservedRunningTime="2026-03-11 02:25:37.692235983 +0000 UTC m=+5494.496453598" Mar 11 02:25:37 crc kubenswrapper[4744]: I0311 02:25:37.744391 4744 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-66d5bf7c87-2x58n"] Mar 11 02:25:37 crc kubenswrapper[4744]: I0311 02:25:37.744618 4744 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-66d5bf7c87-2x58n" podUID="a3b50521-42df-4429-92be-19652912970d" containerName="dnsmasq-dns" containerID="cri-o://a0c4e32c885eb269b89f304c93014e687e7f71020851ffca5cd6a7138006e5cf" gracePeriod=10 Mar 11 02:25:38 crc kubenswrapper[4744]: I0311 02:25:38.209948 4744 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-66d5bf7c87-2x58n" Mar 11 02:25:38 crc kubenswrapper[4744]: I0311 02:25:38.410143 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a3b50521-42df-4429-92be-19652912970d-config\") pod \"a3b50521-42df-4429-92be-19652912970d\" (UID: \"a3b50521-42df-4429-92be-19652912970d\") " Mar 11 02:25:38 crc kubenswrapper[4744]: I0311 02:25:38.410322 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cs42w\" (UniqueName: \"kubernetes.io/projected/a3b50521-42df-4429-92be-19652912970d-kube-api-access-cs42w\") pod \"a3b50521-42df-4429-92be-19652912970d\" (UID: \"a3b50521-42df-4429-92be-19652912970d\") " Mar 11 02:25:38 crc kubenswrapper[4744]: I0311 02:25:38.410422 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a3b50521-42df-4429-92be-19652912970d-dns-svc\") pod 
\"a3b50521-42df-4429-92be-19652912970d\" (UID: \"a3b50521-42df-4429-92be-19652912970d\") " Mar 11 02:25:38 crc kubenswrapper[4744]: I0311 02:25:38.417540 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a3b50521-42df-4429-92be-19652912970d-kube-api-access-cs42w" (OuterVolumeSpecName: "kube-api-access-cs42w") pod "a3b50521-42df-4429-92be-19652912970d" (UID: "a3b50521-42df-4429-92be-19652912970d"). InnerVolumeSpecName "kube-api-access-cs42w". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 02:25:38 crc kubenswrapper[4744]: I0311 02:25:38.487941 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a3b50521-42df-4429-92be-19652912970d-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "a3b50521-42df-4429-92be-19652912970d" (UID: "a3b50521-42df-4429-92be-19652912970d"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 11 02:25:38 crc kubenswrapper[4744]: I0311 02:25:38.489264 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a3b50521-42df-4429-92be-19652912970d-config" (OuterVolumeSpecName: "config") pod "a3b50521-42df-4429-92be-19652912970d" (UID: "a3b50521-42df-4429-92be-19652912970d"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 11 02:25:38 crc kubenswrapper[4744]: I0311 02:25:38.512425 4744 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a3b50521-42df-4429-92be-19652912970d-config\") on node \"crc\" DevicePath \"\"" Mar 11 02:25:38 crc kubenswrapper[4744]: I0311 02:25:38.512476 4744 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cs42w\" (UniqueName: \"kubernetes.io/projected/a3b50521-42df-4429-92be-19652912970d-kube-api-access-cs42w\") on node \"crc\" DevicePath \"\"" Mar 11 02:25:38 crc kubenswrapper[4744]: I0311 02:25:38.512503 4744 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a3b50521-42df-4429-92be-19652912970d-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 11 02:25:38 crc kubenswrapper[4744]: I0311 02:25:38.534699 4744 generic.go:334] "Generic (PLEG): container finished" podID="a3b50521-42df-4429-92be-19652912970d" containerID="a0c4e32c885eb269b89f304c93014e687e7f71020851ffca5cd6a7138006e5cf" exitCode=0 Mar 11 02:25:38 crc kubenswrapper[4744]: I0311 02:25:38.534825 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-66d5bf7c87-2x58n" event={"ID":"a3b50521-42df-4429-92be-19652912970d","Type":"ContainerDied","Data":"a0c4e32c885eb269b89f304c93014e687e7f71020851ffca5cd6a7138006e5cf"} Mar 11 02:25:38 crc kubenswrapper[4744]: I0311 02:25:38.535200 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-66d5bf7c87-2x58n" event={"ID":"a3b50521-42df-4429-92be-19652912970d","Type":"ContainerDied","Data":"ef23250f4220ab251351d05a7ec82ada9f7c9cb05ff697bb646b2c5effa2b5df"} Mar 11 02:25:38 crc kubenswrapper[4744]: I0311 02:25:38.535241 4744 scope.go:117] "RemoveContainer" containerID="a0c4e32c885eb269b89f304c93014e687e7f71020851ffca5cd6a7138006e5cf" Mar 11 02:25:38 crc kubenswrapper[4744]: I0311 02:25:38.534860 4744 util.go:48] "No ready 
sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-66d5bf7c87-2x58n" Mar 11 02:25:38 crc kubenswrapper[4744]: I0311 02:25:38.576203 4744 scope.go:117] "RemoveContainer" containerID="df09e6675842db56c4e7ad994d9d23095767345951713f6c8f7d2378173b6580" Mar 11 02:25:38 crc kubenswrapper[4744]: I0311 02:25:38.590935 4744 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-66d5bf7c87-2x58n"] Mar 11 02:25:38 crc kubenswrapper[4744]: I0311 02:25:38.600814 4744 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-66d5bf7c87-2x58n"] Mar 11 02:25:38 crc kubenswrapper[4744]: I0311 02:25:38.614328 4744 scope.go:117] "RemoveContainer" containerID="a0c4e32c885eb269b89f304c93014e687e7f71020851ffca5cd6a7138006e5cf" Mar 11 02:25:38 crc kubenswrapper[4744]: E0311 02:25:38.620111 4744 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a0c4e32c885eb269b89f304c93014e687e7f71020851ffca5cd6a7138006e5cf\": container with ID starting with a0c4e32c885eb269b89f304c93014e687e7f71020851ffca5cd6a7138006e5cf not found: ID does not exist" containerID="a0c4e32c885eb269b89f304c93014e687e7f71020851ffca5cd6a7138006e5cf" Mar 11 02:25:38 crc kubenswrapper[4744]: I0311 02:25:38.620152 4744 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a0c4e32c885eb269b89f304c93014e687e7f71020851ffca5cd6a7138006e5cf"} err="failed to get container status \"a0c4e32c885eb269b89f304c93014e687e7f71020851ffca5cd6a7138006e5cf\": rpc error: code = NotFound desc = could not find container \"a0c4e32c885eb269b89f304c93014e687e7f71020851ffca5cd6a7138006e5cf\": container with ID starting with a0c4e32c885eb269b89f304c93014e687e7f71020851ffca5cd6a7138006e5cf not found: ID does not exist" Mar 11 02:25:38 crc kubenswrapper[4744]: I0311 02:25:38.620177 4744 scope.go:117] "RemoveContainer" 
containerID="df09e6675842db56c4e7ad994d9d23095767345951713f6c8f7d2378173b6580" Mar 11 02:25:38 crc kubenswrapper[4744]: E0311 02:25:38.620723 4744 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"df09e6675842db56c4e7ad994d9d23095767345951713f6c8f7d2378173b6580\": container with ID starting with df09e6675842db56c4e7ad994d9d23095767345951713f6c8f7d2378173b6580 not found: ID does not exist" containerID="df09e6675842db56c4e7ad994d9d23095767345951713f6c8f7d2378173b6580" Mar 11 02:25:38 crc kubenswrapper[4744]: I0311 02:25:38.620781 4744 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"df09e6675842db56c4e7ad994d9d23095767345951713f6c8f7d2378173b6580"} err="failed to get container status \"df09e6675842db56c4e7ad994d9d23095767345951713f6c8f7d2378173b6580\": rpc error: code = NotFound desc = could not find container \"df09e6675842db56c4e7ad994d9d23095767345951713f6c8f7d2378173b6580\": container with ID starting with df09e6675842db56c4e7ad994d9d23095767345951713f6c8f7d2378173b6580 not found: ID does not exist" Mar 11 02:25:39 crc kubenswrapper[4744]: I0311 02:25:39.982814 4744 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a3b50521-42df-4429-92be-19652912970d" path="/var/lib/kubelet/pods/a3b50521-42df-4429-92be-19652912970d/volumes" Mar 11 02:25:40 crc kubenswrapper[4744]: I0311 02:25:40.033635 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-northd-0"] Mar 11 02:25:40 crc kubenswrapper[4744]: E0311 02:25:40.034038 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a3b50521-42df-4429-92be-19652912970d" containerName="dnsmasq-dns" Mar 11 02:25:40 crc kubenswrapper[4744]: I0311 02:25:40.034060 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="a3b50521-42df-4429-92be-19652912970d" containerName="dnsmasq-dns" Mar 11 02:25:40 crc kubenswrapper[4744]: E0311 02:25:40.034081 4744 
cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a3b50521-42df-4429-92be-19652912970d" containerName="init" Mar 11 02:25:40 crc kubenswrapper[4744]: I0311 02:25:40.034088 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="a3b50521-42df-4429-92be-19652912970d" containerName="init" Mar 11 02:25:40 crc kubenswrapper[4744]: I0311 02:25:40.037687 4744 memory_manager.go:354] "RemoveStaleState removing state" podUID="a3b50521-42df-4429-92be-19652912970d" containerName="dnsmasq-dns" Mar 11 02:25:40 crc kubenswrapper[4744]: I0311 02:25:40.038885 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-northd-0" Mar 11 02:25:40 crc kubenswrapper[4744]: I0311 02:25:40.041795 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovnnorthd-config" Mar 11 02:25:40 crc kubenswrapper[4744]: I0311 02:25:40.042046 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovnnorthd-ovndbs" Mar 11 02:25:40 crc kubenswrapper[4744]: I0311 02:25:40.042178 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovnnorthd-scripts" Mar 11 02:25:40 crc kubenswrapper[4744]: I0311 02:25:40.042586 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovnnorthd-ovnnorthd-dockercfg-rntmx" Mar 11 02:25:40 crc kubenswrapper[4744]: I0311 02:25:40.056788 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-northd-0"] Mar 11 02:25:40 crc kubenswrapper[4744]: I0311 02:25:40.140684 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/11bc054e-dfda-4532-85a0-a74a6afd5e4e-config\") pod \"ovn-northd-0\" (UID: \"11bc054e-dfda-4532-85a0-a74a6afd5e4e\") " pod="openstack/ovn-northd-0" Mar 11 02:25:40 crc kubenswrapper[4744]: I0311 02:25:40.141560 4744 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/11bc054e-dfda-4532-85a0-a74a6afd5e4e-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"11bc054e-dfda-4532-85a0-a74a6afd5e4e\") " pod="openstack/ovn-northd-0" Mar 11 02:25:40 crc kubenswrapper[4744]: I0311 02:25:40.141683 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/11bc054e-dfda-4532-85a0-a74a6afd5e4e-scripts\") pod \"ovn-northd-0\" (UID: \"11bc054e-dfda-4532-85a0-a74a6afd5e4e\") " pod="openstack/ovn-northd-0" Mar 11 02:25:40 crc kubenswrapper[4744]: I0311 02:25:40.141731 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h5xhk\" (UniqueName: \"kubernetes.io/projected/11bc054e-dfda-4532-85a0-a74a6afd5e4e-kube-api-access-h5xhk\") pod \"ovn-northd-0\" (UID: \"11bc054e-dfda-4532-85a0-a74a6afd5e4e\") " pod="openstack/ovn-northd-0" Mar 11 02:25:40 crc kubenswrapper[4744]: I0311 02:25:40.141833 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/11bc054e-dfda-4532-85a0-a74a6afd5e4e-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"11bc054e-dfda-4532-85a0-a74a6afd5e4e\") " pod="openstack/ovn-northd-0" Mar 11 02:25:40 crc kubenswrapper[4744]: I0311 02:25:40.141851 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/11bc054e-dfda-4532-85a0-a74a6afd5e4e-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"11bc054e-dfda-4532-85a0-a74a6afd5e4e\") " pod="openstack/ovn-northd-0" Mar 11 02:25:40 crc kubenswrapper[4744]: I0311 02:25:40.141898 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/11bc054e-dfda-4532-85a0-a74a6afd5e4e-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"11bc054e-dfda-4532-85a0-a74a6afd5e4e\") " pod="openstack/ovn-northd-0" Mar 11 02:25:40 crc kubenswrapper[4744]: I0311 02:25:40.243457 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/11bc054e-dfda-4532-85a0-a74a6afd5e4e-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"11bc054e-dfda-4532-85a0-a74a6afd5e4e\") " pod="openstack/ovn-northd-0" Mar 11 02:25:40 crc kubenswrapper[4744]: I0311 02:25:40.243712 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/11bc054e-dfda-4532-85a0-a74a6afd5e4e-scripts\") pod \"ovn-northd-0\" (UID: \"11bc054e-dfda-4532-85a0-a74a6afd5e4e\") " pod="openstack/ovn-northd-0" Mar 11 02:25:40 crc kubenswrapper[4744]: I0311 02:25:40.243744 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h5xhk\" (UniqueName: \"kubernetes.io/projected/11bc054e-dfda-4532-85a0-a74a6afd5e4e-kube-api-access-h5xhk\") pod \"ovn-northd-0\" (UID: \"11bc054e-dfda-4532-85a0-a74a6afd5e4e\") " pod="openstack/ovn-northd-0" Mar 11 02:25:40 crc kubenswrapper[4744]: I0311 02:25:40.243780 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/11bc054e-dfda-4532-85a0-a74a6afd5e4e-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"11bc054e-dfda-4532-85a0-a74a6afd5e4e\") " pod="openstack/ovn-northd-0" Mar 11 02:25:40 crc kubenswrapper[4744]: I0311 02:25:40.243796 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/11bc054e-dfda-4532-85a0-a74a6afd5e4e-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"11bc054e-dfda-4532-85a0-a74a6afd5e4e\") " pod="openstack/ovn-northd-0" Mar 11 02:25:40 crc 
kubenswrapper[4744]: I0311 02:25:40.243821 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/11bc054e-dfda-4532-85a0-a74a6afd5e4e-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"11bc054e-dfda-4532-85a0-a74a6afd5e4e\") " pod="openstack/ovn-northd-0" Mar 11 02:25:40 crc kubenswrapper[4744]: I0311 02:25:40.243848 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/11bc054e-dfda-4532-85a0-a74a6afd5e4e-config\") pod \"ovn-northd-0\" (UID: \"11bc054e-dfda-4532-85a0-a74a6afd5e4e\") " pod="openstack/ovn-northd-0" Mar 11 02:25:40 crc kubenswrapper[4744]: I0311 02:25:40.244578 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/11bc054e-dfda-4532-85a0-a74a6afd5e4e-config\") pod \"ovn-northd-0\" (UID: \"11bc054e-dfda-4532-85a0-a74a6afd5e4e\") " pod="openstack/ovn-northd-0" Mar 11 02:25:40 crc kubenswrapper[4744]: I0311 02:25:40.244845 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/11bc054e-dfda-4532-85a0-a74a6afd5e4e-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"11bc054e-dfda-4532-85a0-a74a6afd5e4e\") " pod="openstack/ovn-northd-0" Mar 11 02:25:40 crc kubenswrapper[4744]: I0311 02:25:40.245947 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/11bc054e-dfda-4532-85a0-a74a6afd5e4e-scripts\") pod \"ovn-northd-0\" (UID: \"11bc054e-dfda-4532-85a0-a74a6afd5e4e\") " pod="openstack/ovn-northd-0" Mar 11 02:25:40 crc kubenswrapper[4744]: I0311 02:25:40.249856 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/11bc054e-dfda-4532-85a0-a74a6afd5e4e-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: 
\"11bc054e-dfda-4532-85a0-a74a6afd5e4e\") " pod="openstack/ovn-northd-0" Mar 11 02:25:40 crc kubenswrapper[4744]: I0311 02:25:40.251114 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/11bc054e-dfda-4532-85a0-a74a6afd5e4e-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"11bc054e-dfda-4532-85a0-a74a6afd5e4e\") " pod="openstack/ovn-northd-0" Mar 11 02:25:40 crc kubenswrapper[4744]: I0311 02:25:40.251310 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/11bc054e-dfda-4532-85a0-a74a6afd5e4e-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"11bc054e-dfda-4532-85a0-a74a6afd5e4e\") " pod="openstack/ovn-northd-0" Mar 11 02:25:40 crc kubenswrapper[4744]: I0311 02:25:40.266501 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h5xhk\" (UniqueName: \"kubernetes.io/projected/11bc054e-dfda-4532-85a0-a74a6afd5e4e-kube-api-access-h5xhk\") pod \"ovn-northd-0\" (UID: \"11bc054e-dfda-4532-85a0-a74a6afd5e4e\") " pod="openstack/ovn-northd-0" Mar 11 02:25:40 crc kubenswrapper[4744]: I0311 02:25:40.358168 4744 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-northd-0" Mar 11 02:25:40 crc kubenswrapper[4744]: I0311 02:25:40.849387 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-northd-0"] Mar 11 02:25:41 crc kubenswrapper[4744]: I0311 02:25:41.599509 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"11bc054e-dfda-4532-85a0-a74a6afd5e4e","Type":"ContainerStarted","Data":"9843903ebf09c227db4f29acadd185f8878f3451109900b48faa97f20751cad9"} Mar 11 02:25:41 crc kubenswrapper[4744]: I0311 02:25:41.599835 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"11bc054e-dfda-4532-85a0-a74a6afd5e4e","Type":"ContainerStarted","Data":"29815c70edb362be0f84fcec0fc6d5817333f5c31033344d12ef2b4c3672e9d0"} Mar 11 02:25:41 crc kubenswrapper[4744]: I0311 02:25:41.599848 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"11bc054e-dfda-4532-85a0-a74a6afd5e4e","Type":"ContainerStarted","Data":"24bee9f5aed19e832728d55885d941c7dc8543eb6391ad5d806e13c1985413f0"} Mar 11 02:25:41 crc kubenswrapper[4744]: I0311 02:25:41.599962 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-northd-0" Mar 11 02:25:41 crc kubenswrapper[4744]: I0311 02:25:41.634814 4744 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-northd-0" podStartSLOduration=1.634786943 podStartE2EDuration="1.634786943s" podCreationTimestamp="2026-03-11 02:25:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-11 02:25:41.62430448 +0000 UTC m=+5498.428522115" watchObservedRunningTime="2026-03-11 02:25:41.634786943 +0000 UTC m=+5498.439004588" Mar 11 02:25:42 crc kubenswrapper[4744]: I0311 02:25:42.409630 4744 patch_prober.go:28] interesting pod/machine-config-daemon-678nx container/machine-config-daemon 
namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 11 02:25:42 crc kubenswrapper[4744]: I0311 02:25:42.409738 4744 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-678nx" podUID="a15dc7ac-7c34-4135-b6eb-a85122800ce9" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 11 02:25:45 crc kubenswrapper[4744]: I0311 02:25:45.277249 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-db-create-9v4nw"] Mar 11 02:25:45 crc kubenswrapper[4744]: I0311 02:25:45.279007 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-9v4nw" Mar 11 02:25:45 crc kubenswrapper[4744]: I0311 02:25:45.283768 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-create-9v4nw"] Mar 11 02:25:45 crc kubenswrapper[4744]: I0311 02:25:45.446384 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f5f595b1-576e-4f29-8a3f-5122b93accc9-operator-scripts\") pod \"keystone-db-create-9v4nw\" (UID: \"f5f595b1-576e-4f29-8a3f-5122b93accc9\") " pod="openstack/keystone-db-create-9v4nw" Mar 11 02:25:45 crc kubenswrapper[4744]: I0311 02:25:45.446559 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zc79t\" (UniqueName: \"kubernetes.io/projected/f5f595b1-576e-4f29-8a3f-5122b93accc9-kube-api-access-zc79t\") pod \"keystone-db-create-9v4nw\" (UID: \"f5f595b1-576e-4f29-8a3f-5122b93accc9\") " pod="openstack/keystone-db-create-9v4nw" Mar 11 02:25:45 crc kubenswrapper[4744]: I0311 02:25:45.481561 4744 kubelet.go:2421] "SyncLoop ADD" 
source="api" pods=["openstack/keystone-505a-account-create-update-dqmkk"] Mar 11 02:25:45 crc kubenswrapper[4744]: I0311 02:25:45.482480 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-505a-account-create-update-dqmkk" Mar 11 02:25:45 crc kubenswrapper[4744]: I0311 02:25:45.484350 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-db-secret" Mar 11 02:25:45 crc kubenswrapper[4744]: I0311 02:25:45.488453 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-505a-account-create-update-dqmkk"] Mar 11 02:25:45 crc kubenswrapper[4744]: I0311 02:25:45.548380 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zc79t\" (UniqueName: \"kubernetes.io/projected/f5f595b1-576e-4f29-8a3f-5122b93accc9-kube-api-access-zc79t\") pod \"keystone-db-create-9v4nw\" (UID: \"f5f595b1-576e-4f29-8a3f-5122b93accc9\") " pod="openstack/keystone-db-create-9v4nw" Mar 11 02:25:45 crc kubenswrapper[4744]: I0311 02:25:45.548566 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f5f595b1-576e-4f29-8a3f-5122b93accc9-operator-scripts\") pod \"keystone-db-create-9v4nw\" (UID: \"f5f595b1-576e-4f29-8a3f-5122b93accc9\") " pod="openstack/keystone-db-create-9v4nw" Mar 11 02:25:45 crc kubenswrapper[4744]: I0311 02:25:45.549355 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f5f595b1-576e-4f29-8a3f-5122b93accc9-operator-scripts\") pod \"keystone-db-create-9v4nw\" (UID: \"f5f595b1-576e-4f29-8a3f-5122b93accc9\") " pod="openstack/keystone-db-create-9v4nw" Mar 11 02:25:45 crc kubenswrapper[4744]: I0311 02:25:45.573441 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zc79t\" (UniqueName: 
\"kubernetes.io/projected/f5f595b1-576e-4f29-8a3f-5122b93accc9-kube-api-access-zc79t\") pod \"keystone-db-create-9v4nw\" (UID: \"f5f595b1-576e-4f29-8a3f-5122b93accc9\") " pod="openstack/keystone-db-create-9v4nw" Mar 11 02:25:45 crc kubenswrapper[4744]: I0311 02:25:45.634963 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-9v4nw" Mar 11 02:25:45 crc kubenswrapper[4744]: I0311 02:25:45.650127 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b8540b4c-50af-4bd6-abba-8e9e820cf22c-operator-scripts\") pod \"keystone-505a-account-create-update-dqmkk\" (UID: \"b8540b4c-50af-4bd6-abba-8e9e820cf22c\") " pod="openstack/keystone-505a-account-create-update-dqmkk" Mar 11 02:25:45 crc kubenswrapper[4744]: I0311 02:25:45.650192 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r7mdr\" (UniqueName: \"kubernetes.io/projected/b8540b4c-50af-4bd6-abba-8e9e820cf22c-kube-api-access-r7mdr\") pod \"keystone-505a-account-create-update-dqmkk\" (UID: \"b8540b4c-50af-4bd6-abba-8e9e820cf22c\") " pod="openstack/keystone-505a-account-create-update-dqmkk" Mar 11 02:25:45 crc kubenswrapper[4744]: I0311 02:25:45.763184 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b8540b4c-50af-4bd6-abba-8e9e820cf22c-operator-scripts\") pod \"keystone-505a-account-create-update-dqmkk\" (UID: \"b8540b4c-50af-4bd6-abba-8e9e820cf22c\") " pod="openstack/keystone-505a-account-create-update-dqmkk" Mar 11 02:25:45 crc kubenswrapper[4744]: I0311 02:25:45.762043 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b8540b4c-50af-4bd6-abba-8e9e820cf22c-operator-scripts\") pod \"keystone-505a-account-create-update-dqmkk\" 
(UID: \"b8540b4c-50af-4bd6-abba-8e9e820cf22c\") " pod="openstack/keystone-505a-account-create-update-dqmkk" Mar 11 02:25:45 crc kubenswrapper[4744]: I0311 02:25:45.763691 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r7mdr\" (UniqueName: \"kubernetes.io/projected/b8540b4c-50af-4bd6-abba-8e9e820cf22c-kube-api-access-r7mdr\") pod \"keystone-505a-account-create-update-dqmkk\" (UID: \"b8540b4c-50af-4bd6-abba-8e9e820cf22c\") " pod="openstack/keystone-505a-account-create-update-dqmkk" Mar 11 02:25:45 crc kubenswrapper[4744]: I0311 02:25:45.784985 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r7mdr\" (UniqueName: \"kubernetes.io/projected/b8540b4c-50af-4bd6-abba-8e9e820cf22c-kube-api-access-r7mdr\") pod \"keystone-505a-account-create-update-dqmkk\" (UID: \"b8540b4c-50af-4bd6-abba-8e9e820cf22c\") " pod="openstack/keystone-505a-account-create-update-dqmkk" Mar 11 02:25:45 crc kubenswrapper[4744]: I0311 02:25:45.800066 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-505a-account-create-update-dqmkk" Mar 11 02:25:46 crc kubenswrapper[4744]: I0311 02:25:45.920765 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-create-9v4nw"] Mar 11 02:25:46 crc kubenswrapper[4744]: I0311 02:25:46.286969 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-wc86p"] Mar 11 02:25:46 crc kubenswrapper[4744]: I0311 02:25:46.294901 4744 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-wc86p" Mar 11 02:25:46 crc kubenswrapper[4744]: I0311 02:25:46.314289 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-wc86p"] Mar 11 02:25:46 crc kubenswrapper[4744]: I0311 02:25:46.373932 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5640f82a-017e-4fdf-b2a9-bd27595857ac-catalog-content\") pod \"community-operators-wc86p\" (UID: \"5640f82a-017e-4fdf-b2a9-bd27595857ac\") " pod="openshift-marketplace/community-operators-wc86p" Mar 11 02:25:46 crc kubenswrapper[4744]: I0311 02:25:46.374079 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wtctj\" (UniqueName: \"kubernetes.io/projected/5640f82a-017e-4fdf-b2a9-bd27595857ac-kube-api-access-wtctj\") pod \"community-operators-wc86p\" (UID: \"5640f82a-017e-4fdf-b2a9-bd27595857ac\") " pod="openshift-marketplace/community-operators-wc86p" Mar 11 02:25:46 crc kubenswrapper[4744]: I0311 02:25:46.374115 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5640f82a-017e-4fdf-b2a9-bd27595857ac-utilities\") pod \"community-operators-wc86p\" (UID: \"5640f82a-017e-4fdf-b2a9-bd27595857ac\") " pod="openshift-marketplace/community-operators-wc86p" Mar 11 02:25:46 crc kubenswrapper[4744]: I0311 02:25:46.476427 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wtctj\" (UniqueName: \"kubernetes.io/projected/5640f82a-017e-4fdf-b2a9-bd27595857ac-kube-api-access-wtctj\") pod \"community-operators-wc86p\" (UID: \"5640f82a-017e-4fdf-b2a9-bd27595857ac\") " pod="openshift-marketplace/community-operators-wc86p" Mar 11 02:25:46 crc kubenswrapper[4744]: I0311 02:25:46.476542 4744 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5640f82a-017e-4fdf-b2a9-bd27595857ac-utilities\") pod \"community-operators-wc86p\" (UID: \"5640f82a-017e-4fdf-b2a9-bd27595857ac\") " pod="openshift-marketplace/community-operators-wc86p" Mar 11 02:25:46 crc kubenswrapper[4744]: I0311 02:25:46.476759 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5640f82a-017e-4fdf-b2a9-bd27595857ac-catalog-content\") pod \"community-operators-wc86p\" (UID: \"5640f82a-017e-4fdf-b2a9-bd27595857ac\") " pod="openshift-marketplace/community-operators-wc86p" Mar 11 02:25:46 crc kubenswrapper[4744]: I0311 02:25:46.477190 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5640f82a-017e-4fdf-b2a9-bd27595857ac-utilities\") pod \"community-operators-wc86p\" (UID: \"5640f82a-017e-4fdf-b2a9-bd27595857ac\") " pod="openshift-marketplace/community-operators-wc86p" Mar 11 02:25:46 crc kubenswrapper[4744]: I0311 02:25:46.477190 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5640f82a-017e-4fdf-b2a9-bd27595857ac-catalog-content\") pod \"community-operators-wc86p\" (UID: \"5640f82a-017e-4fdf-b2a9-bd27595857ac\") " pod="openshift-marketplace/community-operators-wc86p" Mar 11 02:25:46 crc kubenswrapper[4744]: I0311 02:25:46.499338 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wtctj\" (UniqueName: \"kubernetes.io/projected/5640f82a-017e-4fdf-b2a9-bd27595857ac-kube-api-access-wtctj\") pod \"community-operators-wc86p\" (UID: \"5640f82a-017e-4fdf-b2a9-bd27595857ac\") " pod="openshift-marketplace/community-operators-wc86p" Mar 11 02:25:46 crc kubenswrapper[4744]: I0311 02:25:46.643644 4744 generic.go:334] "Generic (PLEG): container 
finished" podID="f5f595b1-576e-4f29-8a3f-5122b93accc9" containerID="41d0437c40e87b1ae6feeddb4b53a62fdab0700f2af5b394699655107306d7cb" exitCode=0 Mar 11 02:25:46 crc kubenswrapper[4744]: I0311 02:25:46.643690 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-9v4nw" event={"ID":"f5f595b1-576e-4f29-8a3f-5122b93accc9","Type":"ContainerDied","Data":"41d0437c40e87b1ae6feeddb4b53a62fdab0700f2af5b394699655107306d7cb"} Mar 11 02:25:46 crc kubenswrapper[4744]: I0311 02:25:46.643719 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-9v4nw" event={"ID":"f5f595b1-576e-4f29-8a3f-5122b93accc9","Type":"ContainerStarted","Data":"84ca6163895cb82d7f7a3013fbb1a6b6daa72e498acd379f46613ea06fd978bb"} Mar 11 02:25:46 crc kubenswrapper[4744]: I0311 02:25:46.662942 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-wc86p" Mar 11 02:25:46 crc kubenswrapper[4744]: I0311 02:25:46.854589 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-505a-account-create-update-dqmkk"] Mar 11 02:25:47 crc kubenswrapper[4744]: W0311 02:25:47.203735 4744 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5640f82a_017e_4fdf_b2a9_bd27595857ac.slice/crio-cd7c754f100c21deb4643c8fa076f0e0ca03290c7c07bb5aa674d811b2d77676 WatchSource:0}: Error finding container cd7c754f100c21deb4643c8fa076f0e0ca03290c7c07bb5aa674d811b2d77676: Status 404 returned error can't find the container with id cd7c754f100c21deb4643c8fa076f0e0ca03290c7c07bb5aa674d811b2d77676 Mar 11 02:25:47 crc kubenswrapper[4744]: I0311 02:25:47.206505 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-wc86p"] Mar 11 02:25:47 crc kubenswrapper[4744]: I0311 02:25:47.653716 4744 generic.go:334] "Generic (PLEG): container finished" 
podID="5640f82a-017e-4fdf-b2a9-bd27595857ac" containerID="bd09dd1984db1ee52c5c604b4db2625f5da3388ece73dbcb548bc83fb4c22793" exitCode=0 Mar 11 02:25:47 crc kubenswrapper[4744]: I0311 02:25:47.653794 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-wc86p" event={"ID":"5640f82a-017e-4fdf-b2a9-bd27595857ac","Type":"ContainerDied","Data":"bd09dd1984db1ee52c5c604b4db2625f5da3388ece73dbcb548bc83fb4c22793"} Mar 11 02:25:47 crc kubenswrapper[4744]: I0311 02:25:47.653865 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-wc86p" event={"ID":"5640f82a-017e-4fdf-b2a9-bd27595857ac","Type":"ContainerStarted","Data":"cd7c754f100c21deb4643c8fa076f0e0ca03290c7c07bb5aa674d811b2d77676"} Mar 11 02:25:47 crc kubenswrapper[4744]: I0311 02:25:47.658304 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-505a-account-create-update-dqmkk" event={"ID":"b8540b4c-50af-4bd6-abba-8e9e820cf22c","Type":"ContainerStarted","Data":"1d75cc4fc5140c5d8674c518d3ebefbaeb41513facbec9f5626527ce0c40a28e"} Mar 11 02:25:47 crc kubenswrapper[4744]: I0311 02:25:47.658346 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-505a-account-create-update-dqmkk" event={"ID":"b8540b4c-50af-4bd6-abba-8e9e820cf22c","Type":"ContainerStarted","Data":"be3f55a16d82397942373c7c12f45d28fd3cfb5140beddd116d0788c3612a73f"} Mar 11 02:25:47 crc kubenswrapper[4744]: I0311 02:25:47.708922 4744 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-505a-account-create-update-dqmkk" podStartSLOduration=2.708897129 podStartE2EDuration="2.708897129s" podCreationTimestamp="2026-03-11 02:25:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-11 02:25:47.702729629 +0000 UTC m=+5504.506947254" watchObservedRunningTime="2026-03-11 02:25:47.708897129 +0000 UTC 
m=+5504.513114764" Mar 11 02:25:48 crc kubenswrapper[4744]: I0311 02:25:48.093267 4744 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-9v4nw" Mar 11 02:25:48 crc kubenswrapper[4744]: I0311 02:25:48.108193 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f5f595b1-576e-4f29-8a3f-5122b93accc9-operator-scripts\") pod \"f5f595b1-576e-4f29-8a3f-5122b93accc9\" (UID: \"f5f595b1-576e-4f29-8a3f-5122b93accc9\") " Mar 11 02:25:48 crc kubenswrapper[4744]: I0311 02:25:48.108356 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zc79t\" (UniqueName: \"kubernetes.io/projected/f5f595b1-576e-4f29-8a3f-5122b93accc9-kube-api-access-zc79t\") pod \"f5f595b1-576e-4f29-8a3f-5122b93accc9\" (UID: \"f5f595b1-576e-4f29-8a3f-5122b93accc9\") " Mar 11 02:25:48 crc kubenswrapper[4744]: I0311 02:25:48.109124 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f5f595b1-576e-4f29-8a3f-5122b93accc9-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "f5f595b1-576e-4f29-8a3f-5122b93accc9" (UID: "f5f595b1-576e-4f29-8a3f-5122b93accc9"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 11 02:25:48 crc kubenswrapper[4744]: I0311 02:25:48.113782 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f5f595b1-576e-4f29-8a3f-5122b93accc9-kube-api-access-zc79t" (OuterVolumeSpecName: "kube-api-access-zc79t") pod "f5f595b1-576e-4f29-8a3f-5122b93accc9" (UID: "f5f595b1-576e-4f29-8a3f-5122b93accc9"). InnerVolumeSpecName "kube-api-access-zc79t". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 02:25:48 crc kubenswrapper[4744]: I0311 02:25:48.209351 4744 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f5f595b1-576e-4f29-8a3f-5122b93accc9-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 11 02:25:48 crc kubenswrapper[4744]: I0311 02:25:48.209390 4744 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zc79t\" (UniqueName: \"kubernetes.io/projected/f5f595b1-576e-4f29-8a3f-5122b93accc9-kube-api-access-zc79t\") on node \"crc\" DevicePath \"\"" Mar 11 02:25:48 crc kubenswrapper[4744]: I0311 02:25:48.676849 4744 generic.go:334] "Generic (PLEG): container finished" podID="b8540b4c-50af-4bd6-abba-8e9e820cf22c" containerID="1d75cc4fc5140c5d8674c518d3ebefbaeb41513facbec9f5626527ce0c40a28e" exitCode=0 Mar 11 02:25:48 crc kubenswrapper[4744]: I0311 02:25:48.678100 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-505a-account-create-update-dqmkk" event={"ID":"b8540b4c-50af-4bd6-abba-8e9e820cf22c","Type":"ContainerDied","Data":"1d75cc4fc5140c5d8674c518d3ebefbaeb41513facbec9f5626527ce0c40a28e"} Mar 11 02:25:48 crc kubenswrapper[4744]: I0311 02:25:48.682423 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-9v4nw" event={"ID":"f5f595b1-576e-4f29-8a3f-5122b93accc9","Type":"ContainerDied","Data":"84ca6163895cb82d7f7a3013fbb1a6b6daa72e498acd379f46613ea06fd978bb"} Mar 11 02:25:48 crc kubenswrapper[4744]: I0311 02:25:48.682487 4744 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="84ca6163895cb82d7f7a3013fbb1a6b6daa72e498acd379f46613ea06fd978bb" Mar 11 02:25:48 crc kubenswrapper[4744]: I0311 02:25:48.682623 4744 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-create-9v4nw" Mar 11 02:25:49 crc kubenswrapper[4744]: I0311 02:25:49.697557 4744 generic.go:334] "Generic (PLEG): container finished" podID="5640f82a-017e-4fdf-b2a9-bd27595857ac" containerID="09395f279e733b9d94a547e034b81ac3d9e82520a702ee7f781065ca60d8e56f" exitCode=0 Mar 11 02:25:49 crc kubenswrapper[4744]: I0311 02:25:49.697706 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-wc86p" event={"ID":"5640f82a-017e-4fdf-b2a9-bd27595857ac","Type":"ContainerDied","Data":"09395f279e733b9d94a547e034b81ac3d9e82520a702ee7f781065ca60d8e56f"} Mar 11 02:25:50 crc kubenswrapper[4744]: I0311 02:25:50.138381 4744 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-505a-account-create-update-dqmkk" Mar 11 02:25:50 crc kubenswrapper[4744]: I0311 02:25:50.245146 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-r7mdr\" (UniqueName: \"kubernetes.io/projected/b8540b4c-50af-4bd6-abba-8e9e820cf22c-kube-api-access-r7mdr\") pod \"b8540b4c-50af-4bd6-abba-8e9e820cf22c\" (UID: \"b8540b4c-50af-4bd6-abba-8e9e820cf22c\") " Mar 11 02:25:50 crc kubenswrapper[4744]: I0311 02:25:50.245338 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b8540b4c-50af-4bd6-abba-8e9e820cf22c-operator-scripts\") pod \"b8540b4c-50af-4bd6-abba-8e9e820cf22c\" (UID: \"b8540b4c-50af-4bd6-abba-8e9e820cf22c\") " Mar 11 02:25:50 crc kubenswrapper[4744]: I0311 02:25:50.246370 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b8540b4c-50af-4bd6-abba-8e9e820cf22c-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "b8540b4c-50af-4bd6-abba-8e9e820cf22c" (UID: "b8540b4c-50af-4bd6-abba-8e9e820cf22c"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 11 02:25:50 crc kubenswrapper[4744]: I0311 02:25:50.254489 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b8540b4c-50af-4bd6-abba-8e9e820cf22c-kube-api-access-r7mdr" (OuterVolumeSpecName: "kube-api-access-r7mdr") pod "b8540b4c-50af-4bd6-abba-8e9e820cf22c" (UID: "b8540b4c-50af-4bd6-abba-8e9e820cf22c"). InnerVolumeSpecName "kube-api-access-r7mdr". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 02:25:50 crc kubenswrapper[4744]: I0311 02:25:50.346910 4744 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-r7mdr\" (UniqueName: \"kubernetes.io/projected/b8540b4c-50af-4bd6-abba-8e9e820cf22c-kube-api-access-r7mdr\") on node \"crc\" DevicePath \"\"" Mar 11 02:25:50 crc kubenswrapper[4744]: I0311 02:25:50.347206 4744 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b8540b4c-50af-4bd6-abba-8e9e820cf22c-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 11 02:25:50 crc kubenswrapper[4744]: I0311 02:25:50.710040 4744 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-505a-account-create-update-dqmkk" Mar 11 02:25:50 crc kubenswrapper[4744]: I0311 02:25:50.710103 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-505a-account-create-update-dqmkk" event={"ID":"b8540b4c-50af-4bd6-abba-8e9e820cf22c","Type":"ContainerDied","Data":"be3f55a16d82397942373c7c12f45d28fd3cfb5140beddd116d0788c3612a73f"} Mar 11 02:25:50 crc kubenswrapper[4744]: I0311 02:25:50.710226 4744 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="be3f55a16d82397942373c7c12f45d28fd3cfb5140beddd116d0788c3612a73f" Mar 11 02:25:50 crc kubenswrapper[4744]: I0311 02:25:50.713086 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-wc86p" event={"ID":"5640f82a-017e-4fdf-b2a9-bd27595857ac","Type":"ContainerStarted","Data":"b698ec194e35afa0dc092f33984327079f9539c6c4f9a6f35bb459f9c5ea2b40"} Mar 11 02:25:50 crc kubenswrapper[4744]: I0311 02:25:50.743227 4744 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-wc86p" podStartSLOduration=1.981516456 podStartE2EDuration="4.743207788s" podCreationTimestamp="2026-03-11 02:25:46 +0000 UTC" firstStartedPulling="2026-03-11 02:25:47.656467405 +0000 UTC m=+5504.460685020" lastFinishedPulling="2026-03-11 02:25:50.418158707 +0000 UTC m=+5507.222376352" observedRunningTime="2026-03-11 02:25:50.735279274 +0000 UTC m=+5507.539496889" watchObservedRunningTime="2026-03-11 02:25:50.743207788 +0000 UTC m=+5507.547425403" Mar 11 02:25:55 crc kubenswrapper[4744]: I0311 02:25:55.998889 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-db-sync-69x27"] Mar 11 02:25:56 crc kubenswrapper[4744]: E0311 02:25:55.999830 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f5f595b1-576e-4f29-8a3f-5122b93accc9" containerName="mariadb-database-create" Mar 11 02:25:56 crc kubenswrapper[4744]: I0311 
02:25:55.999847 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="f5f595b1-576e-4f29-8a3f-5122b93accc9" containerName="mariadb-database-create" Mar 11 02:25:56 crc kubenswrapper[4744]: E0311 02:25:55.999860 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b8540b4c-50af-4bd6-abba-8e9e820cf22c" containerName="mariadb-account-create-update" Mar 11 02:25:56 crc kubenswrapper[4744]: I0311 02:25:55.999869 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="b8540b4c-50af-4bd6-abba-8e9e820cf22c" containerName="mariadb-account-create-update" Mar 11 02:25:56 crc kubenswrapper[4744]: I0311 02:25:56.000066 4744 memory_manager.go:354] "RemoveStaleState removing state" podUID="f5f595b1-576e-4f29-8a3f-5122b93accc9" containerName="mariadb-database-create" Mar 11 02:25:56 crc kubenswrapper[4744]: I0311 02:25:56.000086 4744 memory_manager.go:354] "RemoveStaleState removing state" podUID="b8540b4c-50af-4bd6-abba-8e9e820cf22c" containerName="mariadb-account-create-update" Mar 11 02:25:56 crc kubenswrapper[4744]: I0311 02:25:56.000701 4744 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-sync-69x27" Mar 11 02:25:56 crc kubenswrapper[4744]: I0311 02:25:56.008735 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Mar 11 02:25:56 crc kubenswrapper[4744]: I0311 02:25:56.008823 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Mar 11 02:25:56 crc kubenswrapper[4744]: I0311 02:25:56.010927 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Mar 11 02:25:56 crc kubenswrapper[4744]: I0311 02:25:56.011954 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-gkpp9" Mar 11 02:25:56 crc kubenswrapper[4744]: I0311 02:25:56.026351 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-sync-69x27"] Mar 11 02:25:56 crc kubenswrapper[4744]: I0311 02:25:56.155033 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/efe36576-8433-45b2-a376-82d5439a0208-combined-ca-bundle\") pod \"keystone-db-sync-69x27\" (UID: \"efe36576-8433-45b2-a376-82d5439a0208\") " pod="openstack/keystone-db-sync-69x27" Mar 11 02:25:56 crc kubenswrapper[4744]: I0311 02:25:56.155381 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sk5p8\" (UniqueName: \"kubernetes.io/projected/efe36576-8433-45b2-a376-82d5439a0208-kube-api-access-sk5p8\") pod \"keystone-db-sync-69x27\" (UID: \"efe36576-8433-45b2-a376-82d5439a0208\") " pod="openstack/keystone-db-sync-69x27" Mar 11 02:25:56 crc kubenswrapper[4744]: I0311 02:25:56.155625 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/efe36576-8433-45b2-a376-82d5439a0208-config-data\") pod \"keystone-db-sync-69x27\" (UID: 
\"efe36576-8433-45b2-a376-82d5439a0208\") " pod="openstack/keystone-db-sync-69x27" Mar 11 02:25:56 crc kubenswrapper[4744]: I0311 02:25:56.257449 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/efe36576-8433-45b2-a376-82d5439a0208-combined-ca-bundle\") pod \"keystone-db-sync-69x27\" (UID: \"efe36576-8433-45b2-a376-82d5439a0208\") " pod="openstack/keystone-db-sync-69x27" Mar 11 02:25:56 crc kubenswrapper[4744]: I0311 02:25:56.257563 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sk5p8\" (UniqueName: \"kubernetes.io/projected/efe36576-8433-45b2-a376-82d5439a0208-kube-api-access-sk5p8\") pod \"keystone-db-sync-69x27\" (UID: \"efe36576-8433-45b2-a376-82d5439a0208\") " pod="openstack/keystone-db-sync-69x27" Mar 11 02:25:56 crc kubenswrapper[4744]: I0311 02:25:56.257705 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/efe36576-8433-45b2-a376-82d5439a0208-config-data\") pod \"keystone-db-sync-69x27\" (UID: \"efe36576-8433-45b2-a376-82d5439a0208\") " pod="openstack/keystone-db-sync-69x27" Mar 11 02:25:56 crc kubenswrapper[4744]: I0311 02:25:56.266551 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/efe36576-8433-45b2-a376-82d5439a0208-combined-ca-bundle\") pod \"keystone-db-sync-69x27\" (UID: \"efe36576-8433-45b2-a376-82d5439a0208\") " pod="openstack/keystone-db-sync-69x27" Mar 11 02:25:56 crc kubenswrapper[4744]: I0311 02:25:56.270440 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/efe36576-8433-45b2-a376-82d5439a0208-config-data\") pod \"keystone-db-sync-69x27\" (UID: \"efe36576-8433-45b2-a376-82d5439a0208\") " pod="openstack/keystone-db-sync-69x27" Mar 11 02:25:56 crc kubenswrapper[4744]: 
I0311 02:25:56.295423 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sk5p8\" (UniqueName: \"kubernetes.io/projected/efe36576-8433-45b2-a376-82d5439a0208-kube-api-access-sk5p8\") pod \"keystone-db-sync-69x27\" (UID: \"efe36576-8433-45b2-a376-82d5439a0208\") " pod="openstack/keystone-db-sync-69x27" Mar 11 02:25:56 crc kubenswrapper[4744]: I0311 02:25:56.322174 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-69x27" Mar 11 02:25:56 crc kubenswrapper[4744]: I0311 02:25:56.663873 4744 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-wc86p" Mar 11 02:25:56 crc kubenswrapper[4744]: I0311 02:25:56.664237 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-wc86p" Mar 11 02:25:56 crc kubenswrapper[4744]: I0311 02:25:56.707570 4744 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-wc86p" Mar 11 02:25:56 crc kubenswrapper[4744]: I0311 02:25:56.872548 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-wc86p" Mar 11 02:25:56 crc kubenswrapper[4744]: I0311 02:25:56.892261 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-sync-69x27"] Mar 11 02:25:56 crc kubenswrapper[4744]: W0311 02:25:56.895014 4744 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podefe36576_8433_45b2_a376_82d5439a0208.slice/crio-669eb1da906e7db1d154ed18610b9ebc84a8dc9a5fa240c5529841abfceec613 WatchSource:0}: Error finding container 669eb1da906e7db1d154ed18610b9ebc84a8dc9a5fa240c5529841abfceec613: Status 404 returned error can't find the container with id 669eb1da906e7db1d154ed18610b9ebc84a8dc9a5fa240c5529841abfceec613 Mar 11 02:25:56 crc 
kubenswrapper[4744]: I0311 02:25:56.955182 4744 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-wc86p"] Mar 11 02:25:57 crc kubenswrapper[4744]: I0311 02:25:57.786615 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-69x27" event={"ID":"efe36576-8433-45b2-a376-82d5439a0208","Type":"ContainerStarted","Data":"1a359cfcbdd2e0ac830ef649745ebc3c5a8e9453572a20dd943f035fbacac5e6"} Mar 11 02:25:57 crc kubenswrapper[4744]: I0311 02:25:57.787116 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-69x27" event={"ID":"efe36576-8433-45b2-a376-82d5439a0208","Type":"ContainerStarted","Data":"669eb1da906e7db1d154ed18610b9ebc84a8dc9a5fa240c5529841abfceec613"} Mar 11 02:25:57 crc kubenswrapper[4744]: I0311 02:25:57.822052 4744 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-db-sync-69x27" podStartSLOduration=2.8220277769999997 podStartE2EDuration="2.822027777s" podCreationTimestamp="2026-03-11 02:25:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-11 02:25:57.818867829 +0000 UTC m=+5514.623085474" watchObservedRunningTime="2026-03-11 02:25:57.822027777 +0000 UTC m=+5514.626245422" Mar 11 02:25:58 crc kubenswrapper[4744]: I0311 02:25:58.797928 4744 generic.go:334] "Generic (PLEG): container finished" podID="efe36576-8433-45b2-a376-82d5439a0208" containerID="1a359cfcbdd2e0ac830ef649745ebc3c5a8e9453572a20dd943f035fbacac5e6" exitCode=0 Mar 11 02:25:58 crc kubenswrapper[4744]: I0311 02:25:58.798030 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-69x27" event={"ID":"efe36576-8433-45b2-a376-82d5439a0208","Type":"ContainerDied","Data":"1a359cfcbdd2e0ac830ef649745ebc3c5a8e9453572a20dd943f035fbacac5e6"} Mar 11 02:25:58 crc kubenswrapper[4744]: I0311 02:25:58.798580 4744 
kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-wc86p" podUID="5640f82a-017e-4fdf-b2a9-bd27595857ac" containerName="registry-server" containerID="cri-o://b698ec194e35afa0dc092f33984327079f9539c6c4f9a6f35bb459f9c5ea2b40" gracePeriod=2 Mar 11 02:25:59 crc kubenswrapper[4744]: I0311 02:25:59.351649 4744 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-wc86p" Mar 11 02:25:59 crc kubenswrapper[4744]: I0311 02:25:59.540326 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5640f82a-017e-4fdf-b2a9-bd27595857ac-catalog-content\") pod \"5640f82a-017e-4fdf-b2a9-bd27595857ac\" (UID: \"5640f82a-017e-4fdf-b2a9-bd27595857ac\") " Mar 11 02:25:59 crc kubenswrapper[4744]: I0311 02:25:59.540483 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wtctj\" (UniqueName: \"kubernetes.io/projected/5640f82a-017e-4fdf-b2a9-bd27595857ac-kube-api-access-wtctj\") pod \"5640f82a-017e-4fdf-b2a9-bd27595857ac\" (UID: \"5640f82a-017e-4fdf-b2a9-bd27595857ac\") " Mar 11 02:25:59 crc kubenswrapper[4744]: I0311 02:25:59.540533 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5640f82a-017e-4fdf-b2a9-bd27595857ac-utilities\") pod \"5640f82a-017e-4fdf-b2a9-bd27595857ac\" (UID: \"5640f82a-017e-4fdf-b2a9-bd27595857ac\") " Mar 11 02:25:59 crc kubenswrapper[4744]: I0311 02:25:59.542223 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5640f82a-017e-4fdf-b2a9-bd27595857ac-utilities" (OuterVolumeSpecName: "utilities") pod "5640f82a-017e-4fdf-b2a9-bd27595857ac" (UID: "5640f82a-017e-4fdf-b2a9-bd27595857ac"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 11 02:25:59 crc kubenswrapper[4744]: I0311 02:25:59.558375 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5640f82a-017e-4fdf-b2a9-bd27595857ac-kube-api-access-wtctj" (OuterVolumeSpecName: "kube-api-access-wtctj") pod "5640f82a-017e-4fdf-b2a9-bd27595857ac" (UID: "5640f82a-017e-4fdf-b2a9-bd27595857ac"). InnerVolumeSpecName "kube-api-access-wtctj". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 02:25:59 crc kubenswrapper[4744]: I0311 02:25:59.616862 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5640f82a-017e-4fdf-b2a9-bd27595857ac-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "5640f82a-017e-4fdf-b2a9-bd27595857ac" (UID: "5640f82a-017e-4fdf-b2a9-bd27595857ac"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 11 02:25:59 crc kubenswrapper[4744]: I0311 02:25:59.642923 4744 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5640f82a-017e-4fdf-b2a9-bd27595857ac-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 11 02:25:59 crc kubenswrapper[4744]: I0311 02:25:59.642952 4744 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wtctj\" (UniqueName: \"kubernetes.io/projected/5640f82a-017e-4fdf-b2a9-bd27595857ac-kube-api-access-wtctj\") on node \"crc\" DevicePath \"\"" Mar 11 02:25:59 crc kubenswrapper[4744]: I0311 02:25:59.642968 4744 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5640f82a-017e-4fdf-b2a9-bd27595857ac-utilities\") on node \"crc\" DevicePath \"\"" Mar 11 02:25:59 crc kubenswrapper[4744]: I0311 02:25:59.815866 4744 generic.go:334] "Generic (PLEG): container finished" podID="5640f82a-017e-4fdf-b2a9-bd27595857ac" 
containerID="b698ec194e35afa0dc092f33984327079f9539c6c4f9a6f35bb459f9c5ea2b40" exitCode=0 Mar 11 02:25:59 crc kubenswrapper[4744]: I0311 02:25:59.815946 4744 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-wc86p" Mar 11 02:25:59 crc kubenswrapper[4744]: I0311 02:25:59.816013 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-wc86p" event={"ID":"5640f82a-017e-4fdf-b2a9-bd27595857ac","Type":"ContainerDied","Data":"b698ec194e35afa0dc092f33984327079f9539c6c4f9a6f35bb459f9c5ea2b40"} Mar 11 02:25:59 crc kubenswrapper[4744]: I0311 02:25:59.816090 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-wc86p" event={"ID":"5640f82a-017e-4fdf-b2a9-bd27595857ac","Type":"ContainerDied","Data":"cd7c754f100c21deb4643c8fa076f0e0ca03290c7c07bb5aa674d811b2d77676"} Mar 11 02:25:59 crc kubenswrapper[4744]: I0311 02:25:59.816131 4744 scope.go:117] "RemoveContainer" containerID="b698ec194e35afa0dc092f33984327079f9539c6c4f9a6f35bb459f9c5ea2b40" Mar 11 02:25:59 crc kubenswrapper[4744]: I0311 02:25:59.864945 4744 scope.go:117] "RemoveContainer" containerID="09395f279e733b9d94a547e034b81ac3d9e82520a702ee7f781065ca60d8e56f" Mar 11 02:25:59 crc kubenswrapper[4744]: I0311 02:25:59.878793 4744 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-wc86p"] Mar 11 02:25:59 crc kubenswrapper[4744]: I0311 02:25:59.890284 4744 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-wc86p"] Mar 11 02:25:59 crc kubenswrapper[4744]: I0311 02:25:59.911049 4744 scope.go:117] "RemoveContainer" containerID="bd09dd1984db1ee52c5c604b4db2625f5da3388ece73dbcb548bc83fb4c22793" Mar 11 02:25:59 crc kubenswrapper[4744]: I0311 02:25:59.954540 4744 scope.go:117] "RemoveContainer" containerID="b698ec194e35afa0dc092f33984327079f9539c6c4f9a6f35bb459f9c5ea2b40" Mar 11 
02:25:59 crc kubenswrapper[4744]: E0311 02:25:59.955155 4744 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b698ec194e35afa0dc092f33984327079f9539c6c4f9a6f35bb459f9c5ea2b40\": container with ID starting with b698ec194e35afa0dc092f33984327079f9539c6c4f9a6f35bb459f9c5ea2b40 not found: ID does not exist" containerID="b698ec194e35afa0dc092f33984327079f9539c6c4f9a6f35bb459f9c5ea2b40" Mar 11 02:25:59 crc kubenswrapper[4744]: I0311 02:25:59.955223 4744 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b698ec194e35afa0dc092f33984327079f9539c6c4f9a6f35bb459f9c5ea2b40"} err="failed to get container status \"b698ec194e35afa0dc092f33984327079f9539c6c4f9a6f35bb459f9c5ea2b40\": rpc error: code = NotFound desc = could not find container \"b698ec194e35afa0dc092f33984327079f9539c6c4f9a6f35bb459f9c5ea2b40\": container with ID starting with b698ec194e35afa0dc092f33984327079f9539c6c4f9a6f35bb459f9c5ea2b40 not found: ID does not exist" Mar 11 02:25:59 crc kubenswrapper[4744]: I0311 02:25:59.955276 4744 scope.go:117] "RemoveContainer" containerID="09395f279e733b9d94a547e034b81ac3d9e82520a702ee7f781065ca60d8e56f" Mar 11 02:25:59 crc kubenswrapper[4744]: E0311 02:25:59.955730 4744 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"09395f279e733b9d94a547e034b81ac3d9e82520a702ee7f781065ca60d8e56f\": container with ID starting with 09395f279e733b9d94a547e034b81ac3d9e82520a702ee7f781065ca60d8e56f not found: ID does not exist" containerID="09395f279e733b9d94a547e034b81ac3d9e82520a702ee7f781065ca60d8e56f" Mar 11 02:25:59 crc kubenswrapper[4744]: I0311 02:25:59.955777 4744 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"09395f279e733b9d94a547e034b81ac3d9e82520a702ee7f781065ca60d8e56f"} err="failed to get container status 
\"09395f279e733b9d94a547e034b81ac3d9e82520a702ee7f781065ca60d8e56f\": rpc error: code = NotFound desc = could not find container \"09395f279e733b9d94a547e034b81ac3d9e82520a702ee7f781065ca60d8e56f\": container with ID starting with 09395f279e733b9d94a547e034b81ac3d9e82520a702ee7f781065ca60d8e56f not found: ID does not exist" Mar 11 02:25:59 crc kubenswrapper[4744]: I0311 02:25:59.955815 4744 scope.go:117] "RemoveContainer" containerID="bd09dd1984db1ee52c5c604b4db2625f5da3388ece73dbcb548bc83fb4c22793" Mar 11 02:25:59 crc kubenswrapper[4744]: E0311 02:25:59.956577 4744 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bd09dd1984db1ee52c5c604b4db2625f5da3388ece73dbcb548bc83fb4c22793\": container with ID starting with bd09dd1984db1ee52c5c604b4db2625f5da3388ece73dbcb548bc83fb4c22793 not found: ID does not exist" containerID="bd09dd1984db1ee52c5c604b4db2625f5da3388ece73dbcb548bc83fb4c22793" Mar 11 02:25:59 crc kubenswrapper[4744]: I0311 02:25:59.956639 4744 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bd09dd1984db1ee52c5c604b4db2625f5da3388ece73dbcb548bc83fb4c22793"} err="failed to get container status \"bd09dd1984db1ee52c5c604b4db2625f5da3388ece73dbcb548bc83fb4c22793\": rpc error: code = NotFound desc = could not find container \"bd09dd1984db1ee52c5c604b4db2625f5da3388ece73dbcb548bc83fb4c22793\": container with ID starting with bd09dd1984db1ee52c5c604b4db2625f5da3388ece73dbcb548bc83fb4c22793 not found: ID does not exist" Mar 11 02:25:59 crc kubenswrapper[4744]: I0311 02:25:59.998054 4744 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5640f82a-017e-4fdf-b2a9-bd27595857ac" path="/var/lib/kubelet/pods/5640f82a-017e-4fdf-b2a9-bd27595857ac/volumes" Mar 11 02:26:00 crc kubenswrapper[4744]: I0311 02:26:00.136891 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29553266-2p9pn"] Mar 11 02:26:00 
crc kubenswrapper[4744]: E0311 02:26:00.138112 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5640f82a-017e-4fdf-b2a9-bd27595857ac" containerName="extract-utilities" Mar 11 02:26:00 crc kubenswrapper[4744]: I0311 02:26:00.138161 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="5640f82a-017e-4fdf-b2a9-bd27595857ac" containerName="extract-utilities" Mar 11 02:26:00 crc kubenswrapper[4744]: E0311 02:26:00.138199 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5640f82a-017e-4fdf-b2a9-bd27595857ac" containerName="registry-server" Mar 11 02:26:00 crc kubenswrapper[4744]: I0311 02:26:00.138230 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="5640f82a-017e-4fdf-b2a9-bd27595857ac" containerName="registry-server" Mar 11 02:26:00 crc kubenswrapper[4744]: E0311 02:26:00.138274 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5640f82a-017e-4fdf-b2a9-bd27595857ac" containerName="extract-content" Mar 11 02:26:00 crc kubenswrapper[4744]: I0311 02:26:00.138291 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="5640f82a-017e-4fdf-b2a9-bd27595857ac" containerName="extract-content" Mar 11 02:26:00 crc kubenswrapper[4744]: I0311 02:26:00.138691 4744 memory_manager.go:354] "RemoveStaleState removing state" podUID="5640f82a-017e-4fdf-b2a9-bd27595857ac" containerName="registry-server" Mar 11 02:26:00 crc kubenswrapper[4744]: I0311 02:26:00.139604 4744 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29553266-2p9pn" Mar 11 02:26:00 crc kubenswrapper[4744]: I0311 02:26:00.142315 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-rs77s" Mar 11 02:26:00 crc kubenswrapper[4744]: I0311 02:26:00.142580 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 11 02:26:00 crc kubenswrapper[4744]: I0311 02:26:00.142873 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 11 02:26:00 crc kubenswrapper[4744]: I0311 02:26:00.149948 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29553266-2p9pn"] Mar 11 02:26:00 crc kubenswrapper[4744]: I0311 02:26:00.224685 4744 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-69x27" Mar 11 02:26:00 crc kubenswrapper[4744]: I0311 02:26:00.255337 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w5q6v\" (UniqueName: \"kubernetes.io/projected/727a4f4b-1972-442d-a7e0-35607159db4a-kube-api-access-w5q6v\") pod \"auto-csr-approver-29553266-2p9pn\" (UID: \"727a4f4b-1972-442d-a7e0-35607159db4a\") " pod="openshift-infra/auto-csr-approver-29553266-2p9pn" Mar 11 02:26:00 crc kubenswrapper[4744]: I0311 02:26:00.357320 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/efe36576-8433-45b2-a376-82d5439a0208-config-data\") pod \"efe36576-8433-45b2-a376-82d5439a0208\" (UID: \"efe36576-8433-45b2-a376-82d5439a0208\") " Mar 11 02:26:00 crc kubenswrapper[4744]: I0311 02:26:00.357876 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sk5p8\" (UniqueName: 
\"kubernetes.io/projected/efe36576-8433-45b2-a376-82d5439a0208-kube-api-access-sk5p8\") pod \"efe36576-8433-45b2-a376-82d5439a0208\" (UID: \"efe36576-8433-45b2-a376-82d5439a0208\") " Mar 11 02:26:00 crc kubenswrapper[4744]: I0311 02:26:00.358103 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/efe36576-8433-45b2-a376-82d5439a0208-combined-ca-bundle\") pod \"efe36576-8433-45b2-a376-82d5439a0208\" (UID: \"efe36576-8433-45b2-a376-82d5439a0208\") " Mar 11 02:26:00 crc kubenswrapper[4744]: I0311 02:26:00.358985 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w5q6v\" (UniqueName: \"kubernetes.io/projected/727a4f4b-1972-442d-a7e0-35607159db4a-kube-api-access-w5q6v\") pod \"auto-csr-approver-29553266-2p9pn\" (UID: \"727a4f4b-1972-442d-a7e0-35607159db4a\") " pod="openshift-infra/auto-csr-approver-29553266-2p9pn" Mar 11 02:26:00 crc kubenswrapper[4744]: I0311 02:26:00.391470 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w5q6v\" (UniqueName: \"kubernetes.io/projected/727a4f4b-1972-442d-a7e0-35607159db4a-kube-api-access-w5q6v\") pod \"auto-csr-approver-29553266-2p9pn\" (UID: \"727a4f4b-1972-442d-a7e0-35607159db4a\") " pod="openshift-infra/auto-csr-approver-29553266-2p9pn" Mar 11 02:26:00 crc kubenswrapper[4744]: I0311 02:26:00.392444 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/efe36576-8433-45b2-a376-82d5439a0208-kube-api-access-sk5p8" (OuterVolumeSpecName: "kube-api-access-sk5p8") pod "efe36576-8433-45b2-a376-82d5439a0208" (UID: "efe36576-8433-45b2-a376-82d5439a0208"). InnerVolumeSpecName "kube-api-access-sk5p8". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 02:26:00 crc kubenswrapper[4744]: I0311 02:26:00.397160 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/efe36576-8433-45b2-a376-82d5439a0208-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "efe36576-8433-45b2-a376-82d5439a0208" (UID: "efe36576-8433-45b2-a376-82d5439a0208"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 02:26:00 crc kubenswrapper[4744]: I0311 02:26:00.437038 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/efe36576-8433-45b2-a376-82d5439a0208-config-data" (OuterVolumeSpecName: "config-data") pod "efe36576-8433-45b2-a376-82d5439a0208" (UID: "efe36576-8433-45b2-a376-82d5439a0208"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 02:26:00 crc kubenswrapper[4744]: I0311 02:26:00.447863 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-northd-0" Mar 11 02:26:00 crc kubenswrapper[4744]: I0311 02:26:00.461195 4744 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/efe36576-8433-45b2-a376-82d5439a0208-config-data\") on node \"crc\" DevicePath \"\"" Mar 11 02:26:00 crc kubenswrapper[4744]: I0311 02:26:00.461231 4744 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sk5p8\" (UniqueName: \"kubernetes.io/projected/efe36576-8433-45b2-a376-82d5439a0208-kube-api-access-sk5p8\") on node \"crc\" DevicePath \"\"" Mar 11 02:26:00 crc kubenswrapper[4744]: I0311 02:26:00.461245 4744 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/efe36576-8433-45b2-a376-82d5439a0208-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 11 02:26:00 crc kubenswrapper[4744]: I0311 02:26:00.517231 4744 util.go:30] "No 
sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29553266-2p9pn" Mar 11 02:26:00 crc kubenswrapper[4744]: I0311 02:26:00.828235 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-69x27" event={"ID":"efe36576-8433-45b2-a376-82d5439a0208","Type":"ContainerDied","Data":"669eb1da906e7db1d154ed18610b9ebc84a8dc9a5fa240c5529841abfceec613"} Mar 11 02:26:00 crc kubenswrapper[4744]: I0311 02:26:00.828596 4744 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="669eb1da906e7db1d154ed18610b9ebc84a8dc9a5fa240c5529841abfceec613" Mar 11 02:26:00 crc kubenswrapper[4744]: I0311 02:26:00.828254 4744 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-69x27" Mar 11 02:26:01 crc kubenswrapper[4744]: I0311 02:26:01.040746 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29553266-2p9pn"] Mar 11 02:26:01 crc kubenswrapper[4744]: I0311 02:26:01.089688 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-59b6c5dc-qrpfm"] Mar 11 02:26:01 crc kubenswrapper[4744]: E0311 02:26:01.090465 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="efe36576-8433-45b2-a376-82d5439a0208" containerName="keystone-db-sync" Mar 11 02:26:01 crc kubenswrapper[4744]: I0311 02:26:01.090482 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="efe36576-8433-45b2-a376-82d5439a0208" containerName="keystone-db-sync" Mar 11 02:26:01 crc kubenswrapper[4744]: I0311 02:26:01.090655 4744 memory_manager.go:354] "RemoveStaleState removing state" podUID="efe36576-8433-45b2-a376-82d5439a0208" containerName="keystone-db-sync" Mar 11 02:26:01 crc kubenswrapper[4744]: I0311 02:26:01.091426 4744 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-59b6c5dc-qrpfm" Mar 11 02:26:01 crc kubenswrapper[4744]: I0311 02:26:01.113982 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-59b6c5dc-qrpfm"] Mar 11 02:26:01 crc kubenswrapper[4744]: I0311 02:26:01.121968 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-bootstrap-cbkwb"] Mar 11 02:26:01 crc kubenswrapper[4744]: I0311 02:26:01.124034 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-cbkwb" Mar 11 02:26:01 crc kubenswrapper[4744]: I0311 02:26:01.126739 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-gkpp9" Mar 11 02:26:01 crc kubenswrapper[4744]: I0311 02:26:01.126935 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"osp-secret" Mar 11 02:26:01 crc kubenswrapper[4744]: I0311 02:26:01.127202 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Mar 11 02:26:01 crc kubenswrapper[4744]: I0311 02:26:01.127346 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Mar 11 02:26:01 crc kubenswrapper[4744]: I0311 02:26:01.127444 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Mar 11 02:26:01 crc kubenswrapper[4744]: I0311 02:26:01.133262 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-cbkwb"] Mar 11 02:26:01 crc kubenswrapper[4744]: I0311 02:26:01.280504 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/35fed1b9-7133-4072-b511-c4a19261f507-combined-ca-bundle\") pod \"keystone-bootstrap-cbkwb\" (UID: \"35fed1b9-7133-4072-b511-c4a19261f507\") " pod="openstack/keystone-bootstrap-cbkwb" Mar 11 02:26:01 crc kubenswrapper[4744]: I0311 
02:26:01.280570 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0818d695-0611-4e9d-b2d4-4894bec77500-config\") pod \"dnsmasq-dns-59b6c5dc-qrpfm\" (UID: \"0818d695-0611-4e9d-b2d4-4894bec77500\") " pod="openstack/dnsmasq-dns-59b6c5dc-qrpfm" Mar 11 02:26:01 crc kubenswrapper[4744]: I0311 02:26:01.280602 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/35fed1b9-7133-4072-b511-c4a19261f507-fernet-keys\") pod \"keystone-bootstrap-cbkwb\" (UID: \"35fed1b9-7133-4072-b511-c4a19261f507\") " pod="openstack/keystone-bootstrap-cbkwb" Mar 11 02:26:01 crc kubenswrapper[4744]: I0311 02:26:01.280630 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/35fed1b9-7133-4072-b511-c4a19261f507-config-data\") pod \"keystone-bootstrap-cbkwb\" (UID: \"35fed1b9-7133-4072-b511-c4a19261f507\") " pod="openstack/keystone-bootstrap-cbkwb" Mar 11 02:26:01 crc kubenswrapper[4744]: I0311 02:26:01.280659 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/0818d695-0611-4e9d-b2d4-4894bec77500-ovsdbserver-nb\") pod \"dnsmasq-dns-59b6c5dc-qrpfm\" (UID: \"0818d695-0611-4e9d-b2d4-4894bec77500\") " pod="openstack/dnsmasq-dns-59b6c5dc-qrpfm" Mar 11 02:26:01 crc kubenswrapper[4744]: I0311 02:26:01.280708 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/0818d695-0611-4e9d-b2d4-4894bec77500-dns-svc\") pod \"dnsmasq-dns-59b6c5dc-qrpfm\" (UID: \"0818d695-0611-4e9d-b2d4-4894bec77500\") " pod="openstack/dnsmasq-dns-59b6c5dc-qrpfm" Mar 11 02:26:01 crc kubenswrapper[4744]: I0311 02:26:01.280750 4744 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/35fed1b9-7133-4072-b511-c4a19261f507-credential-keys\") pod \"keystone-bootstrap-cbkwb\" (UID: \"35fed1b9-7133-4072-b511-c4a19261f507\") " pod="openstack/keystone-bootstrap-cbkwb" Mar 11 02:26:01 crc kubenswrapper[4744]: I0311 02:26:01.280778 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xzcxk\" (UniqueName: \"kubernetes.io/projected/35fed1b9-7133-4072-b511-c4a19261f507-kube-api-access-xzcxk\") pod \"keystone-bootstrap-cbkwb\" (UID: \"35fed1b9-7133-4072-b511-c4a19261f507\") " pod="openstack/keystone-bootstrap-cbkwb" Mar 11 02:26:01 crc kubenswrapper[4744]: I0311 02:26:01.280816 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/0818d695-0611-4e9d-b2d4-4894bec77500-ovsdbserver-sb\") pod \"dnsmasq-dns-59b6c5dc-qrpfm\" (UID: \"0818d695-0611-4e9d-b2d4-4894bec77500\") " pod="openstack/dnsmasq-dns-59b6c5dc-qrpfm" Mar 11 02:26:01 crc kubenswrapper[4744]: I0311 02:26:01.280877 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8nhpx\" (UniqueName: \"kubernetes.io/projected/0818d695-0611-4e9d-b2d4-4894bec77500-kube-api-access-8nhpx\") pod \"dnsmasq-dns-59b6c5dc-qrpfm\" (UID: \"0818d695-0611-4e9d-b2d4-4894bec77500\") " pod="openstack/dnsmasq-dns-59b6c5dc-qrpfm" Mar 11 02:26:01 crc kubenswrapper[4744]: I0311 02:26:01.280908 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/35fed1b9-7133-4072-b511-c4a19261f507-scripts\") pod \"keystone-bootstrap-cbkwb\" (UID: \"35fed1b9-7133-4072-b511-c4a19261f507\") " pod="openstack/keystone-bootstrap-cbkwb" Mar 11 02:26:01 crc kubenswrapper[4744]: I0311 
02:26:01.382109 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/35fed1b9-7133-4072-b511-c4a19261f507-credential-keys\") pod \"keystone-bootstrap-cbkwb\" (UID: \"35fed1b9-7133-4072-b511-c4a19261f507\") " pod="openstack/keystone-bootstrap-cbkwb" Mar 11 02:26:01 crc kubenswrapper[4744]: I0311 02:26:01.382158 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xzcxk\" (UniqueName: \"kubernetes.io/projected/35fed1b9-7133-4072-b511-c4a19261f507-kube-api-access-xzcxk\") pod \"keystone-bootstrap-cbkwb\" (UID: \"35fed1b9-7133-4072-b511-c4a19261f507\") " pod="openstack/keystone-bootstrap-cbkwb" Mar 11 02:26:01 crc kubenswrapper[4744]: I0311 02:26:01.382194 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/0818d695-0611-4e9d-b2d4-4894bec77500-ovsdbserver-sb\") pod \"dnsmasq-dns-59b6c5dc-qrpfm\" (UID: \"0818d695-0611-4e9d-b2d4-4894bec77500\") " pod="openstack/dnsmasq-dns-59b6c5dc-qrpfm" Mar 11 02:26:01 crc kubenswrapper[4744]: I0311 02:26:01.382241 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8nhpx\" (UniqueName: \"kubernetes.io/projected/0818d695-0611-4e9d-b2d4-4894bec77500-kube-api-access-8nhpx\") pod \"dnsmasq-dns-59b6c5dc-qrpfm\" (UID: \"0818d695-0611-4e9d-b2d4-4894bec77500\") " pod="openstack/dnsmasq-dns-59b6c5dc-qrpfm" Mar 11 02:26:01 crc kubenswrapper[4744]: I0311 02:26:01.382267 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/35fed1b9-7133-4072-b511-c4a19261f507-scripts\") pod \"keystone-bootstrap-cbkwb\" (UID: \"35fed1b9-7133-4072-b511-c4a19261f507\") " pod="openstack/keystone-bootstrap-cbkwb" Mar 11 02:26:01 crc kubenswrapper[4744]: I0311 02:26:01.382294 4744 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/35fed1b9-7133-4072-b511-c4a19261f507-combined-ca-bundle\") pod \"keystone-bootstrap-cbkwb\" (UID: \"35fed1b9-7133-4072-b511-c4a19261f507\") " pod="openstack/keystone-bootstrap-cbkwb" Mar 11 02:26:01 crc kubenswrapper[4744]: I0311 02:26:01.382310 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0818d695-0611-4e9d-b2d4-4894bec77500-config\") pod \"dnsmasq-dns-59b6c5dc-qrpfm\" (UID: \"0818d695-0611-4e9d-b2d4-4894bec77500\") " pod="openstack/dnsmasq-dns-59b6c5dc-qrpfm" Mar 11 02:26:01 crc kubenswrapper[4744]: I0311 02:26:01.382327 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/35fed1b9-7133-4072-b511-c4a19261f507-fernet-keys\") pod \"keystone-bootstrap-cbkwb\" (UID: \"35fed1b9-7133-4072-b511-c4a19261f507\") " pod="openstack/keystone-bootstrap-cbkwb" Mar 11 02:26:01 crc kubenswrapper[4744]: I0311 02:26:01.382348 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/35fed1b9-7133-4072-b511-c4a19261f507-config-data\") pod \"keystone-bootstrap-cbkwb\" (UID: \"35fed1b9-7133-4072-b511-c4a19261f507\") " pod="openstack/keystone-bootstrap-cbkwb" Mar 11 02:26:01 crc kubenswrapper[4744]: I0311 02:26:01.382368 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/0818d695-0611-4e9d-b2d4-4894bec77500-ovsdbserver-nb\") pod \"dnsmasq-dns-59b6c5dc-qrpfm\" (UID: \"0818d695-0611-4e9d-b2d4-4894bec77500\") " pod="openstack/dnsmasq-dns-59b6c5dc-qrpfm" Mar 11 02:26:01 crc kubenswrapper[4744]: I0311 02:26:01.382402 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: 
\"kubernetes.io/configmap/0818d695-0611-4e9d-b2d4-4894bec77500-dns-svc\") pod \"dnsmasq-dns-59b6c5dc-qrpfm\" (UID: \"0818d695-0611-4e9d-b2d4-4894bec77500\") " pod="openstack/dnsmasq-dns-59b6c5dc-qrpfm" Mar 11 02:26:01 crc kubenswrapper[4744]: I0311 02:26:01.383199 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/0818d695-0611-4e9d-b2d4-4894bec77500-dns-svc\") pod \"dnsmasq-dns-59b6c5dc-qrpfm\" (UID: \"0818d695-0611-4e9d-b2d4-4894bec77500\") " pod="openstack/dnsmasq-dns-59b6c5dc-qrpfm" Mar 11 02:26:01 crc kubenswrapper[4744]: I0311 02:26:01.384695 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0818d695-0611-4e9d-b2d4-4894bec77500-config\") pod \"dnsmasq-dns-59b6c5dc-qrpfm\" (UID: \"0818d695-0611-4e9d-b2d4-4894bec77500\") " pod="openstack/dnsmasq-dns-59b6c5dc-qrpfm" Mar 11 02:26:01 crc kubenswrapper[4744]: I0311 02:26:01.384806 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/0818d695-0611-4e9d-b2d4-4894bec77500-ovsdbserver-sb\") pod \"dnsmasq-dns-59b6c5dc-qrpfm\" (UID: \"0818d695-0611-4e9d-b2d4-4894bec77500\") " pod="openstack/dnsmasq-dns-59b6c5dc-qrpfm" Mar 11 02:26:01 crc kubenswrapper[4744]: I0311 02:26:01.385053 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/0818d695-0611-4e9d-b2d4-4894bec77500-ovsdbserver-nb\") pod \"dnsmasq-dns-59b6c5dc-qrpfm\" (UID: \"0818d695-0611-4e9d-b2d4-4894bec77500\") " pod="openstack/dnsmasq-dns-59b6c5dc-qrpfm" Mar 11 02:26:01 crc kubenswrapper[4744]: I0311 02:26:01.388913 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/35fed1b9-7133-4072-b511-c4a19261f507-combined-ca-bundle\") pod \"keystone-bootstrap-cbkwb\" (UID: 
\"35fed1b9-7133-4072-b511-c4a19261f507\") " pod="openstack/keystone-bootstrap-cbkwb" Mar 11 02:26:01 crc kubenswrapper[4744]: I0311 02:26:01.389130 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/35fed1b9-7133-4072-b511-c4a19261f507-scripts\") pod \"keystone-bootstrap-cbkwb\" (UID: \"35fed1b9-7133-4072-b511-c4a19261f507\") " pod="openstack/keystone-bootstrap-cbkwb" Mar 11 02:26:01 crc kubenswrapper[4744]: I0311 02:26:01.389228 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/35fed1b9-7133-4072-b511-c4a19261f507-credential-keys\") pod \"keystone-bootstrap-cbkwb\" (UID: \"35fed1b9-7133-4072-b511-c4a19261f507\") " pod="openstack/keystone-bootstrap-cbkwb" Mar 11 02:26:01 crc kubenswrapper[4744]: I0311 02:26:01.391269 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/35fed1b9-7133-4072-b511-c4a19261f507-config-data\") pod \"keystone-bootstrap-cbkwb\" (UID: \"35fed1b9-7133-4072-b511-c4a19261f507\") " pod="openstack/keystone-bootstrap-cbkwb" Mar 11 02:26:01 crc kubenswrapper[4744]: I0311 02:26:01.397639 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/35fed1b9-7133-4072-b511-c4a19261f507-fernet-keys\") pod \"keystone-bootstrap-cbkwb\" (UID: \"35fed1b9-7133-4072-b511-c4a19261f507\") " pod="openstack/keystone-bootstrap-cbkwb" Mar 11 02:26:01 crc kubenswrapper[4744]: I0311 02:26:01.402396 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xzcxk\" (UniqueName: \"kubernetes.io/projected/35fed1b9-7133-4072-b511-c4a19261f507-kube-api-access-xzcxk\") pod \"keystone-bootstrap-cbkwb\" (UID: \"35fed1b9-7133-4072-b511-c4a19261f507\") " pod="openstack/keystone-bootstrap-cbkwb" Mar 11 02:26:01 crc kubenswrapper[4744]: I0311 02:26:01.402449 4744 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8nhpx\" (UniqueName: \"kubernetes.io/projected/0818d695-0611-4e9d-b2d4-4894bec77500-kube-api-access-8nhpx\") pod \"dnsmasq-dns-59b6c5dc-qrpfm\" (UID: \"0818d695-0611-4e9d-b2d4-4894bec77500\") " pod="openstack/dnsmasq-dns-59b6c5dc-qrpfm" Mar 11 02:26:01 crc kubenswrapper[4744]: I0311 02:26:01.414992 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-59b6c5dc-qrpfm" Mar 11 02:26:01 crc kubenswrapper[4744]: I0311 02:26:01.442524 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-cbkwb" Mar 11 02:26:01 crc kubenswrapper[4744]: I0311 02:26:01.842790 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29553266-2p9pn" event={"ID":"727a4f4b-1972-442d-a7e0-35607159db4a","Type":"ContainerStarted","Data":"a803906741a16df11aadd51471bf208741a1151d721bfb7e9fc5bb252210aa35"} Mar 11 02:26:01 crc kubenswrapper[4744]: I0311 02:26:01.936761 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-59b6c5dc-qrpfm"] Mar 11 02:26:01 crc kubenswrapper[4744]: W0311 02:26:01.944529 4744 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0818d695_0611_4e9d_b2d4_4894bec77500.slice/crio-12f9617c6e6e915245f3c9c2d1cd2214d14f7d82c03fd81f3ce9d556baf6ff7c WatchSource:0}: Error finding container 12f9617c6e6e915245f3c9c2d1cd2214d14f7d82c03fd81f3ce9d556baf6ff7c: Status 404 returned error can't find the container with id 12f9617c6e6e915245f3c9c2d1cd2214d14f7d82c03fd81f3ce9d556baf6ff7c Mar 11 02:26:01 crc kubenswrapper[4744]: W0311 02:26:01.946766 4744 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod35fed1b9_7133_4072_b511_c4a19261f507.slice/crio-9f7c6df9d1defa05200f182e8385b19e9430b60553de5d122464ccbc77b2fe97 WatchSource:0}: Error finding container 9f7c6df9d1defa05200f182e8385b19e9430b60553de5d122464ccbc77b2fe97: Status 404 returned error can't find the container with id 9f7c6df9d1defa05200f182e8385b19e9430b60553de5d122464ccbc77b2fe97 Mar 11 02:26:01 crc kubenswrapper[4744]: I0311 02:26:01.961328 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-cbkwb"] Mar 11 02:26:02 crc kubenswrapper[4744]: I0311 02:26:02.353590 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-bknbb"] Mar 11 02:26:02 crc kubenswrapper[4744]: I0311 02:26:02.355296 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-bknbb" Mar 11 02:26:02 crc kubenswrapper[4744]: I0311 02:26:02.368019 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-bknbb"] Mar 11 02:26:02 crc kubenswrapper[4744]: I0311 02:26:02.514154 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/da90a95c-3594-4021-9a54-6efc0fcd1d10-utilities\") pod \"redhat-operators-bknbb\" (UID: \"da90a95c-3594-4021-9a54-6efc0fcd1d10\") " pod="openshift-marketplace/redhat-operators-bknbb" Mar 11 02:26:02 crc kubenswrapper[4744]: I0311 02:26:02.514784 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xz2sw\" (UniqueName: \"kubernetes.io/projected/da90a95c-3594-4021-9a54-6efc0fcd1d10-kube-api-access-xz2sw\") pod \"redhat-operators-bknbb\" (UID: \"da90a95c-3594-4021-9a54-6efc0fcd1d10\") " pod="openshift-marketplace/redhat-operators-bknbb" Mar 11 02:26:02 crc kubenswrapper[4744]: I0311 02:26:02.514986 4744 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/da90a95c-3594-4021-9a54-6efc0fcd1d10-catalog-content\") pod \"redhat-operators-bknbb\" (UID: \"da90a95c-3594-4021-9a54-6efc0fcd1d10\") " pod="openshift-marketplace/redhat-operators-bknbb" Mar 11 02:26:02 crc kubenswrapper[4744]: I0311 02:26:02.616053 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/da90a95c-3594-4021-9a54-6efc0fcd1d10-catalog-content\") pod \"redhat-operators-bknbb\" (UID: \"da90a95c-3594-4021-9a54-6efc0fcd1d10\") " pod="openshift-marketplace/redhat-operators-bknbb" Mar 11 02:26:02 crc kubenswrapper[4744]: I0311 02:26:02.616113 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/da90a95c-3594-4021-9a54-6efc0fcd1d10-utilities\") pod \"redhat-operators-bknbb\" (UID: \"da90a95c-3594-4021-9a54-6efc0fcd1d10\") " pod="openshift-marketplace/redhat-operators-bknbb" Mar 11 02:26:02 crc kubenswrapper[4744]: I0311 02:26:02.616167 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xz2sw\" (UniqueName: \"kubernetes.io/projected/da90a95c-3594-4021-9a54-6efc0fcd1d10-kube-api-access-xz2sw\") pod \"redhat-operators-bknbb\" (UID: \"da90a95c-3594-4021-9a54-6efc0fcd1d10\") " pod="openshift-marketplace/redhat-operators-bknbb" Mar 11 02:26:02 crc kubenswrapper[4744]: I0311 02:26:02.616450 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/da90a95c-3594-4021-9a54-6efc0fcd1d10-catalog-content\") pod \"redhat-operators-bknbb\" (UID: \"da90a95c-3594-4021-9a54-6efc0fcd1d10\") " pod="openshift-marketplace/redhat-operators-bknbb" Mar 11 02:26:02 crc kubenswrapper[4744]: I0311 02:26:02.616681 4744 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/da90a95c-3594-4021-9a54-6efc0fcd1d10-utilities\") pod \"redhat-operators-bknbb\" (UID: \"da90a95c-3594-4021-9a54-6efc0fcd1d10\") " pod="openshift-marketplace/redhat-operators-bknbb" Mar 11 02:26:02 crc kubenswrapper[4744]: I0311 02:26:02.634712 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xz2sw\" (UniqueName: \"kubernetes.io/projected/da90a95c-3594-4021-9a54-6efc0fcd1d10-kube-api-access-xz2sw\") pod \"redhat-operators-bknbb\" (UID: \"da90a95c-3594-4021-9a54-6efc0fcd1d10\") " pod="openshift-marketplace/redhat-operators-bknbb" Mar 11 02:26:02 crc kubenswrapper[4744]: I0311 02:26:02.672093 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-bknbb" Mar 11 02:26:02 crc kubenswrapper[4744]: I0311 02:26:02.857777 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-cbkwb" event={"ID":"35fed1b9-7133-4072-b511-c4a19261f507","Type":"ContainerStarted","Data":"a662d89bda27c07cba467b0d3fbbd711e592b119debb46681ce2dd0c7b97beac"} Mar 11 02:26:02 crc kubenswrapper[4744]: I0311 02:26:02.857818 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-cbkwb" event={"ID":"35fed1b9-7133-4072-b511-c4a19261f507","Type":"ContainerStarted","Data":"9f7c6df9d1defa05200f182e8385b19e9430b60553de5d122464ccbc77b2fe97"} Mar 11 02:26:02 crc kubenswrapper[4744]: I0311 02:26:02.862744 4744 generic.go:334] "Generic (PLEG): container finished" podID="0818d695-0611-4e9d-b2d4-4894bec77500" containerID="18bd2628c4584e7d6abbb3036f6a311757be83bf551d2a9823c7521df5cd7f6d" exitCode=0 Mar 11 02:26:02 crc kubenswrapper[4744]: I0311 02:26:02.862825 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-59b6c5dc-qrpfm" 
event={"ID":"0818d695-0611-4e9d-b2d4-4894bec77500","Type":"ContainerDied","Data":"18bd2628c4584e7d6abbb3036f6a311757be83bf551d2a9823c7521df5cd7f6d"} Mar 11 02:26:02 crc kubenswrapper[4744]: I0311 02:26:02.862847 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-59b6c5dc-qrpfm" event={"ID":"0818d695-0611-4e9d-b2d4-4894bec77500","Type":"ContainerStarted","Data":"12f9617c6e6e915245f3c9c2d1cd2214d14f7d82c03fd81f3ce9d556baf6ff7c"} Mar 11 02:26:02 crc kubenswrapper[4744]: I0311 02:26:02.867291 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29553266-2p9pn" event={"ID":"727a4f4b-1972-442d-a7e0-35607159db4a","Type":"ContainerStarted","Data":"d731f0365636e990ae07735281ccf07e9e3e9faffa874566428948e60f8fdcf5"} Mar 11 02:26:02 crc kubenswrapper[4744]: I0311 02:26:02.888000 4744 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-bootstrap-cbkwb" podStartSLOduration=1.887974804 podStartE2EDuration="1.887974804s" podCreationTimestamp="2026-03-11 02:26:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-11 02:26:02.887910712 +0000 UTC m=+5519.692128317" watchObservedRunningTime="2026-03-11 02:26:02.887974804 +0000 UTC m=+5519.692192409" Mar 11 02:26:02 crc kubenswrapper[4744]: I0311 02:26:02.950599 4744 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29553266-2p9pn" podStartSLOduration=1.602717172 podStartE2EDuration="2.95051028s" podCreationTimestamp="2026-03-11 02:26:00 +0000 UTC" firstStartedPulling="2026-03-11 02:26:01.060838373 +0000 UTC m=+5517.865055978" lastFinishedPulling="2026-03-11 02:26:02.408631481 +0000 UTC m=+5519.212849086" observedRunningTime="2026-03-11 02:26:02.92098523 +0000 UTC m=+5519.725202835" watchObservedRunningTime="2026-03-11 02:26:02.95051028 +0000 UTC m=+5519.754727885" Mar 11 
02:26:03 crc kubenswrapper[4744]: I0311 02:26:03.124484 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-bknbb"] Mar 11 02:26:03 crc kubenswrapper[4744]: W0311 02:26:03.130377 4744 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podda90a95c_3594_4021_9a54_6efc0fcd1d10.slice/crio-f47ef7472471a84aa8ffc47f10fffdf1185f3046ee6f45047a109e6f2e9ea556 WatchSource:0}: Error finding container f47ef7472471a84aa8ffc47f10fffdf1185f3046ee6f45047a109e6f2e9ea556: Status 404 returned error can't find the container with id f47ef7472471a84aa8ffc47f10fffdf1185f3046ee6f45047a109e6f2e9ea556 Mar 11 02:26:03 crc kubenswrapper[4744]: I0311 02:26:03.875683 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-59b6c5dc-qrpfm" event={"ID":"0818d695-0611-4e9d-b2d4-4894bec77500","Type":"ContainerStarted","Data":"cd933606e4dd92f4a349eaf4232b33fb467345779ce2c4de5b0339cd4c434c39"} Mar 11 02:26:03 crc kubenswrapper[4744]: I0311 02:26:03.875992 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-59b6c5dc-qrpfm" Mar 11 02:26:03 crc kubenswrapper[4744]: I0311 02:26:03.877169 4744 generic.go:334] "Generic (PLEG): container finished" podID="727a4f4b-1972-442d-a7e0-35607159db4a" containerID="d731f0365636e990ae07735281ccf07e9e3e9faffa874566428948e60f8fdcf5" exitCode=0 Mar 11 02:26:03 crc kubenswrapper[4744]: I0311 02:26:03.877245 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29553266-2p9pn" event={"ID":"727a4f4b-1972-442d-a7e0-35607159db4a","Type":"ContainerDied","Data":"d731f0365636e990ae07735281ccf07e9e3e9faffa874566428948e60f8fdcf5"} Mar 11 02:26:03 crc kubenswrapper[4744]: I0311 02:26:03.878531 4744 generic.go:334] "Generic (PLEG): container finished" podID="da90a95c-3594-4021-9a54-6efc0fcd1d10" 
containerID="b29a03b42fd9e311fed88e9dc86015718405a92912ec11b0783a54a6fc22c750" exitCode=0 Mar 11 02:26:03 crc kubenswrapper[4744]: I0311 02:26:03.878595 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-bknbb" event={"ID":"da90a95c-3594-4021-9a54-6efc0fcd1d10","Type":"ContainerDied","Data":"b29a03b42fd9e311fed88e9dc86015718405a92912ec11b0783a54a6fc22c750"} Mar 11 02:26:03 crc kubenswrapper[4744]: I0311 02:26:03.878677 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-bknbb" event={"ID":"da90a95c-3594-4021-9a54-6efc0fcd1d10","Type":"ContainerStarted","Data":"f47ef7472471a84aa8ffc47f10fffdf1185f3046ee6f45047a109e6f2e9ea556"} Mar 11 02:26:03 crc kubenswrapper[4744]: I0311 02:26:03.903815 4744 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-59b6c5dc-qrpfm" podStartSLOduration=2.903799108 podStartE2EDuration="2.903799108s" podCreationTimestamp="2026-03-11 02:26:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-11 02:26:03.897808573 +0000 UTC m=+5520.702026178" watchObservedRunningTime="2026-03-11 02:26:03.903799108 +0000 UTC m=+5520.708016703" Mar 11 02:26:04 crc kubenswrapper[4744]: I0311 02:26:04.891108 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-bknbb" event={"ID":"da90a95c-3594-4021-9a54-6efc0fcd1d10","Type":"ContainerStarted","Data":"7425ca425a7ac27cd7719f99c92f81b9431084f1b9df95971852a96adf536cb5"} Mar 11 02:26:05 crc kubenswrapper[4744]: I0311 02:26:05.250798 4744 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29553266-2p9pn" Mar 11 02:26:05 crc kubenswrapper[4744]: I0311 02:26:05.282882 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w5q6v\" (UniqueName: \"kubernetes.io/projected/727a4f4b-1972-442d-a7e0-35607159db4a-kube-api-access-w5q6v\") pod \"727a4f4b-1972-442d-a7e0-35607159db4a\" (UID: \"727a4f4b-1972-442d-a7e0-35607159db4a\") " Mar 11 02:26:05 crc kubenswrapper[4744]: I0311 02:26:05.293698 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/727a4f4b-1972-442d-a7e0-35607159db4a-kube-api-access-w5q6v" (OuterVolumeSpecName: "kube-api-access-w5q6v") pod "727a4f4b-1972-442d-a7e0-35607159db4a" (UID: "727a4f4b-1972-442d-a7e0-35607159db4a"). InnerVolumeSpecName "kube-api-access-w5q6v". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 02:26:05 crc kubenswrapper[4744]: I0311 02:26:05.384782 4744 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w5q6v\" (UniqueName: \"kubernetes.io/projected/727a4f4b-1972-442d-a7e0-35607159db4a-kube-api-access-w5q6v\") on node \"crc\" DevicePath \"\"" Mar 11 02:26:05 crc kubenswrapper[4744]: I0311 02:26:05.906620 4744 generic.go:334] "Generic (PLEG): container finished" podID="35fed1b9-7133-4072-b511-c4a19261f507" containerID="a662d89bda27c07cba467b0d3fbbd711e592b119debb46681ce2dd0c7b97beac" exitCode=0 Mar 11 02:26:05 crc kubenswrapper[4744]: I0311 02:26:05.906737 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-cbkwb" event={"ID":"35fed1b9-7133-4072-b511-c4a19261f507","Type":"ContainerDied","Data":"a662d89bda27c07cba467b0d3fbbd711e592b119debb46681ce2dd0c7b97beac"} Mar 11 02:26:05 crc kubenswrapper[4744]: I0311 02:26:05.911198 4744 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29553266-2p9pn" Mar 11 02:26:05 crc kubenswrapper[4744]: I0311 02:26:05.911195 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29553266-2p9pn" event={"ID":"727a4f4b-1972-442d-a7e0-35607159db4a","Type":"ContainerDied","Data":"a803906741a16df11aadd51471bf208741a1151d721bfb7e9fc5bb252210aa35"} Mar 11 02:26:05 crc kubenswrapper[4744]: I0311 02:26:05.911610 4744 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a803906741a16df11aadd51471bf208741a1151d721bfb7e9fc5bb252210aa35" Mar 11 02:26:05 crc kubenswrapper[4744]: I0311 02:26:05.915766 4744 generic.go:334] "Generic (PLEG): container finished" podID="da90a95c-3594-4021-9a54-6efc0fcd1d10" containerID="7425ca425a7ac27cd7719f99c92f81b9431084f1b9df95971852a96adf536cb5" exitCode=0 Mar 11 02:26:05 crc kubenswrapper[4744]: I0311 02:26:05.915808 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-bknbb" event={"ID":"da90a95c-3594-4021-9a54-6efc0fcd1d10","Type":"ContainerDied","Data":"7425ca425a7ac27cd7719f99c92f81b9431084f1b9df95971852a96adf536cb5"} Mar 11 02:26:06 crc kubenswrapper[4744]: I0311 02:26:06.347563 4744 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29553260-5m7db"] Mar 11 02:26:06 crc kubenswrapper[4744]: I0311 02:26:06.359805 4744 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29553260-5m7db"] Mar 11 02:26:07 crc kubenswrapper[4744]: I0311 02:26:07.364899 4744 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-cbkwb" Mar 11 02:26:07 crc kubenswrapper[4744]: I0311 02:26:07.425023 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xzcxk\" (UniqueName: \"kubernetes.io/projected/35fed1b9-7133-4072-b511-c4a19261f507-kube-api-access-xzcxk\") pod \"35fed1b9-7133-4072-b511-c4a19261f507\" (UID: \"35fed1b9-7133-4072-b511-c4a19261f507\") " Mar 11 02:26:07 crc kubenswrapper[4744]: I0311 02:26:07.425086 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/35fed1b9-7133-4072-b511-c4a19261f507-config-data\") pod \"35fed1b9-7133-4072-b511-c4a19261f507\" (UID: \"35fed1b9-7133-4072-b511-c4a19261f507\") " Mar 11 02:26:07 crc kubenswrapper[4744]: I0311 02:26:07.425123 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/35fed1b9-7133-4072-b511-c4a19261f507-scripts\") pod \"35fed1b9-7133-4072-b511-c4a19261f507\" (UID: \"35fed1b9-7133-4072-b511-c4a19261f507\") " Mar 11 02:26:07 crc kubenswrapper[4744]: I0311 02:26:07.425178 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/35fed1b9-7133-4072-b511-c4a19261f507-combined-ca-bundle\") pod \"35fed1b9-7133-4072-b511-c4a19261f507\" (UID: \"35fed1b9-7133-4072-b511-c4a19261f507\") " Mar 11 02:26:07 crc kubenswrapper[4744]: I0311 02:26:07.425250 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/35fed1b9-7133-4072-b511-c4a19261f507-fernet-keys\") pod \"35fed1b9-7133-4072-b511-c4a19261f507\" (UID: \"35fed1b9-7133-4072-b511-c4a19261f507\") " Mar 11 02:26:07 crc kubenswrapper[4744]: I0311 02:26:07.425288 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: 
\"kubernetes.io/secret/35fed1b9-7133-4072-b511-c4a19261f507-credential-keys\") pod \"35fed1b9-7133-4072-b511-c4a19261f507\" (UID: \"35fed1b9-7133-4072-b511-c4a19261f507\") " Mar 11 02:26:07 crc kubenswrapper[4744]: I0311 02:26:07.431108 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/35fed1b9-7133-4072-b511-c4a19261f507-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "35fed1b9-7133-4072-b511-c4a19261f507" (UID: "35fed1b9-7133-4072-b511-c4a19261f507"). InnerVolumeSpecName "credential-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 02:26:07 crc kubenswrapper[4744]: I0311 02:26:07.431196 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/35fed1b9-7133-4072-b511-c4a19261f507-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "35fed1b9-7133-4072-b511-c4a19261f507" (UID: "35fed1b9-7133-4072-b511-c4a19261f507"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 02:26:07 crc kubenswrapper[4744]: I0311 02:26:07.431261 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/35fed1b9-7133-4072-b511-c4a19261f507-scripts" (OuterVolumeSpecName: "scripts") pod "35fed1b9-7133-4072-b511-c4a19261f507" (UID: "35fed1b9-7133-4072-b511-c4a19261f507"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 02:26:07 crc kubenswrapper[4744]: I0311 02:26:07.433072 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/35fed1b9-7133-4072-b511-c4a19261f507-kube-api-access-xzcxk" (OuterVolumeSpecName: "kube-api-access-xzcxk") pod "35fed1b9-7133-4072-b511-c4a19261f507" (UID: "35fed1b9-7133-4072-b511-c4a19261f507"). InnerVolumeSpecName "kube-api-access-xzcxk". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 02:26:07 crc kubenswrapper[4744]: I0311 02:26:07.455234 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/35fed1b9-7133-4072-b511-c4a19261f507-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "35fed1b9-7133-4072-b511-c4a19261f507" (UID: "35fed1b9-7133-4072-b511-c4a19261f507"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 02:26:07 crc kubenswrapper[4744]: I0311 02:26:07.455861 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/35fed1b9-7133-4072-b511-c4a19261f507-config-data" (OuterVolumeSpecName: "config-data") pod "35fed1b9-7133-4072-b511-c4a19261f507" (UID: "35fed1b9-7133-4072-b511-c4a19261f507"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 02:26:07 crc kubenswrapper[4744]: I0311 02:26:07.527447 4744 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/35fed1b9-7133-4072-b511-c4a19261f507-fernet-keys\") on node \"crc\" DevicePath \"\"" Mar 11 02:26:07 crc kubenswrapper[4744]: I0311 02:26:07.527506 4744 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/35fed1b9-7133-4072-b511-c4a19261f507-credential-keys\") on node \"crc\" DevicePath \"\"" Mar 11 02:26:07 crc kubenswrapper[4744]: I0311 02:26:07.527550 4744 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xzcxk\" (UniqueName: \"kubernetes.io/projected/35fed1b9-7133-4072-b511-c4a19261f507-kube-api-access-xzcxk\") on node \"crc\" DevicePath \"\"" Mar 11 02:26:07 crc kubenswrapper[4744]: I0311 02:26:07.527569 4744 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/35fed1b9-7133-4072-b511-c4a19261f507-config-data\") on node \"crc\" DevicePath \"\"" 
Mar 11 02:26:07 crc kubenswrapper[4744]: I0311 02:26:07.527587 4744 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/35fed1b9-7133-4072-b511-c4a19261f507-scripts\") on node \"crc\" DevicePath \"\"" Mar 11 02:26:07 crc kubenswrapper[4744]: I0311 02:26:07.527605 4744 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/35fed1b9-7133-4072-b511-c4a19261f507-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 11 02:26:07 crc kubenswrapper[4744]: I0311 02:26:07.949704 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-bknbb" event={"ID":"da90a95c-3594-4021-9a54-6efc0fcd1d10","Type":"ContainerStarted","Data":"b528fdc4d6977f8f8f20b92c5a1feb2ab036eec15919ee5749c9ae4b25cf22d1"} Mar 11 02:26:07 crc kubenswrapper[4744]: I0311 02:26:07.952068 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-cbkwb" event={"ID":"35fed1b9-7133-4072-b511-c4a19261f507","Type":"ContainerDied","Data":"9f7c6df9d1defa05200f182e8385b19e9430b60553de5d122464ccbc77b2fe97"} Mar 11 02:26:07 crc kubenswrapper[4744]: I0311 02:26:07.952298 4744 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9f7c6df9d1defa05200f182e8385b19e9430b60553de5d122464ccbc77b2fe97" Mar 11 02:26:07 crc kubenswrapper[4744]: I0311 02:26:07.952115 4744 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-cbkwb" Mar 11 02:26:07 crc kubenswrapper[4744]: I0311 02:26:07.991371 4744 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-bknbb" podStartSLOduration=3.079170617 podStartE2EDuration="5.991345753s" podCreationTimestamp="2026-03-11 02:26:02 +0000 UTC" firstStartedPulling="2026-03-11 02:26:03.880183431 +0000 UTC m=+5520.684401026" lastFinishedPulling="2026-03-11 02:26:06.792358557 +0000 UTC m=+5523.596576162" observedRunningTime="2026-03-11 02:26:07.988089243 +0000 UTC m=+5524.792306868" watchObservedRunningTime="2026-03-11 02:26:07.991345753 +0000 UTC m=+5524.795563408" Mar 11 02:26:07 crc kubenswrapper[4744]: I0311 02:26:07.993885 4744 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f04ce991-8347-48a6-84a7-619f7406c110" path="/var/lib/kubelet/pods/f04ce991-8347-48a6-84a7-619f7406c110/volumes" Mar 11 02:26:08 crc kubenswrapper[4744]: I0311 02:26:08.107260 4744 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-bootstrap-cbkwb"] Mar 11 02:26:08 crc kubenswrapper[4744]: I0311 02:26:08.120044 4744 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-bootstrap-cbkwb"] Mar 11 02:26:08 crc kubenswrapper[4744]: I0311 02:26:08.222196 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-bootstrap-2w2gq"] Mar 11 02:26:08 crc kubenswrapper[4744]: E0311 02:26:08.222552 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="727a4f4b-1972-442d-a7e0-35607159db4a" containerName="oc" Mar 11 02:26:08 crc kubenswrapper[4744]: I0311 02:26:08.222571 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="727a4f4b-1972-442d-a7e0-35607159db4a" containerName="oc" Mar 11 02:26:08 crc kubenswrapper[4744]: E0311 02:26:08.222582 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="35fed1b9-7133-4072-b511-c4a19261f507" 
containerName="keystone-bootstrap" Mar 11 02:26:08 crc kubenswrapper[4744]: I0311 02:26:08.222605 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="35fed1b9-7133-4072-b511-c4a19261f507" containerName="keystone-bootstrap" Mar 11 02:26:08 crc kubenswrapper[4744]: I0311 02:26:08.222798 4744 memory_manager.go:354] "RemoveStaleState removing state" podUID="35fed1b9-7133-4072-b511-c4a19261f507" containerName="keystone-bootstrap" Mar 11 02:26:08 crc kubenswrapper[4744]: I0311 02:26:08.222829 4744 memory_manager.go:354] "RemoveStaleState removing state" podUID="727a4f4b-1972-442d-a7e0-35607159db4a" containerName="oc" Mar 11 02:26:08 crc kubenswrapper[4744]: I0311 02:26:08.223402 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-2w2gq" Mar 11 02:26:08 crc kubenswrapper[4744]: I0311 02:26:08.228641 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"osp-secret" Mar 11 02:26:08 crc kubenswrapper[4744]: I0311 02:26:08.230217 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Mar 11 02:26:08 crc kubenswrapper[4744]: I0311 02:26:08.230216 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Mar 11 02:26:08 crc kubenswrapper[4744]: I0311 02:26:08.230284 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Mar 11 02:26:08 crc kubenswrapper[4744]: I0311 02:26:08.230292 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-gkpp9" Mar 11 02:26:08 crc kubenswrapper[4744]: I0311 02:26:08.240563 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bmln6\" (UniqueName: \"kubernetes.io/projected/e325c578-6909-4a00-8a16-436f430a8071-kube-api-access-bmln6\") pod \"keystone-bootstrap-2w2gq\" (UID: \"e325c578-6909-4a00-8a16-436f430a8071\") " 
pod="openstack/keystone-bootstrap-2w2gq" Mar 11 02:26:08 crc kubenswrapper[4744]: I0311 02:26:08.240644 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/e325c578-6909-4a00-8a16-436f430a8071-credential-keys\") pod \"keystone-bootstrap-2w2gq\" (UID: \"e325c578-6909-4a00-8a16-436f430a8071\") " pod="openstack/keystone-bootstrap-2w2gq" Mar 11 02:26:08 crc kubenswrapper[4744]: I0311 02:26:08.240720 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e325c578-6909-4a00-8a16-436f430a8071-combined-ca-bundle\") pod \"keystone-bootstrap-2w2gq\" (UID: \"e325c578-6909-4a00-8a16-436f430a8071\") " pod="openstack/keystone-bootstrap-2w2gq" Mar 11 02:26:08 crc kubenswrapper[4744]: I0311 02:26:08.240956 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e325c578-6909-4a00-8a16-436f430a8071-config-data\") pod \"keystone-bootstrap-2w2gq\" (UID: \"e325c578-6909-4a00-8a16-436f430a8071\") " pod="openstack/keystone-bootstrap-2w2gq" Mar 11 02:26:08 crc kubenswrapper[4744]: I0311 02:26:08.241053 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/e325c578-6909-4a00-8a16-436f430a8071-fernet-keys\") pod \"keystone-bootstrap-2w2gq\" (UID: \"e325c578-6909-4a00-8a16-436f430a8071\") " pod="openstack/keystone-bootstrap-2w2gq" Mar 11 02:26:08 crc kubenswrapper[4744]: I0311 02:26:08.241106 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e325c578-6909-4a00-8a16-436f430a8071-scripts\") pod \"keystone-bootstrap-2w2gq\" (UID: \"e325c578-6909-4a00-8a16-436f430a8071\") " 
pod="openstack/keystone-bootstrap-2w2gq" Mar 11 02:26:08 crc kubenswrapper[4744]: I0311 02:26:08.248009 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-2w2gq"] Mar 11 02:26:08 crc kubenswrapper[4744]: I0311 02:26:08.342427 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bmln6\" (UniqueName: \"kubernetes.io/projected/e325c578-6909-4a00-8a16-436f430a8071-kube-api-access-bmln6\") pod \"keystone-bootstrap-2w2gq\" (UID: \"e325c578-6909-4a00-8a16-436f430a8071\") " pod="openstack/keystone-bootstrap-2w2gq" Mar 11 02:26:08 crc kubenswrapper[4744]: I0311 02:26:08.342737 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/e325c578-6909-4a00-8a16-436f430a8071-credential-keys\") pod \"keystone-bootstrap-2w2gq\" (UID: \"e325c578-6909-4a00-8a16-436f430a8071\") " pod="openstack/keystone-bootstrap-2w2gq" Mar 11 02:26:08 crc kubenswrapper[4744]: I0311 02:26:08.342790 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e325c578-6909-4a00-8a16-436f430a8071-combined-ca-bundle\") pod \"keystone-bootstrap-2w2gq\" (UID: \"e325c578-6909-4a00-8a16-436f430a8071\") " pod="openstack/keystone-bootstrap-2w2gq" Mar 11 02:26:08 crc kubenswrapper[4744]: I0311 02:26:08.342883 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e325c578-6909-4a00-8a16-436f430a8071-config-data\") pod \"keystone-bootstrap-2w2gq\" (UID: \"e325c578-6909-4a00-8a16-436f430a8071\") " pod="openstack/keystone-bootstrap-2w2gq" Mar 11 02:26:08 crc kubenswrapper[4744]: I0311 02:26:08.342917 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/e325c578-6909-4a00-8a16-436f430a8071-fernet-keys\") pod 
\"keystone-bootstrap-2w2gq\" (UID: \"e325c578-6909-4a00-8a16-436f430a8071\") " pod="openstack/keystone-bootstrap-2w2gq" Mar 11 02:26:08 crc kubenswrapper[4744]: I0311 02:26:08.342945 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e325c578-6909-4a00-8a16-436f430a8071-scripts\") pod \"keystone-bootstrap-2w2gq\" (UID: \"e325c578-6909-4a00-8a16-436f430a8071\") " pod="openstack/keystone-bootstrap-2w2gq" Mar 11 02:26:08 crc kubenswrapper[4744]: I0311 02:26:08.355564 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e325c578-6909-4a00-8a16-436f430a8071-scripts\") pod \"keystone-bootstrap-2w2gq\" (UID: \"e325c578-6909-4a00-8a16-436f430a8071\") " pod="openstack/keystone-bootstrap-2w2gq" Mar 11 02:26:08 crc kubenswrapper[4744]: I0311 02:26:08.355651 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/e325c578-6909-4a00-8a16-436f430a8071-credential-keys\") pod \"keystone-bootstrap-2w2gq\" (UID: \"e325c578-6909-4a00-8a16-436f430a8071\") " pod="openstack/keystone-bootstrap-2w2gq" Mar 11 02:26:08 crc kubenswrapper[4744]: I0311 02:26:08.355846 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/e325c578-6909-4a00-8a16-436f430a8071-fernet-keys\") pod \"keystone-bootstrap-2w2gq\" (UID: \"e325c578-6909-4a00-8a16-436f430a8071\") " pod="openstack/keystone-bootstrap-2w2gq" Mar 11 02:26:08 crc kubenswrapper[4744]: I0311 02:26:08.356394 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e325c578-6909-4a00-8a16-436f430a8071-config-data\") pod \"keystone-bootstrap-2w2gq\" (UID: \"e325c578-6909-4a00-8a16-436f430a8071\") " pod="openstack/keystone-bootstrap-2w2gq" Mar 11 02:26:08 crc kubenswrapper[4744]: I0311 
02:26:08.356816 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e325c578-6909-4a00-8a16-436f430a8071-combined-ca-bundle\") pod \"keystone-bootstrap-2w2gq\" (UID: \"e325c578-6909-4a00-8a16-436f430a8071\") " pod="openstack/keystone-bootstrap-2w2gq" Mar 11 02:26:08 crc kubenswrapper[4744]: I0311 02:26:08.361727 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bmln6\" (UniqueName: \"kubernetes.io/projected/e325c578-6909-4a00-8a16-436f430a8071-kube-api-access-bmln6\") pod \"keystone-bootstrap-2w2gq\" (UID: \"e325c578-6909-4a00-8a16-436f430a8071\") " pod="openstack/keystone-bootstrap-2w2gq" Mar 11 02:26:08 crc kubenswrapper[4744]: I0311 02:26:08.591600 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-2w2gq" Mar 11 02:26:09 crc kubenswrapper[4744]: I0311 02:26:09.035105 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-2w2gq"] Mar 11 02:26:09 crc kubenswrapper[4744]: I0311 02:26:09.970982 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-2w2gq" event={"ID":"e325c578-6909-4a00-8a16-436f430a8071","Type":"ContainerStarted","Data":"664d51ea20ee21d04086ff13d1c2201d4a90c51e74b13ee455c5ba70d494c948"} Mar 11 02:26:09 crc kubenswrapper[4744]: I0311 02:26:09.971321 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-2w2gq" event={"ID":"e325c578-6909-4a00-8a16-436f430a8071","Type":"ContainerStarted","Data":"556a3ce628bc13a44203310e0cd21bcf55297b039ec24793c6a689a2fe5fb209"} Mar 11 02:26:10 crc kubenswrapper[4744]: I0311 02:26:10.015050 4744 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-bootstrap-2w2gq" podStartSLOduration=2.015022017 podStartE2EDuration="2.015022017s" podCreationTimestamp="2026-03-11 02:26:08 +0000 UTC" 
firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-11 02:26:10.002854753 +0000 UTC m=+5526.807072388" watchObservedRunningTime="2026-03-11 02:26:10.015022017 +0000 UTC m=+5526.819239662" Mar 11 02:26:10 crc kubenswrapper[4744]: I0311 02:26:10.033075 4744 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="35fed1b9-7133-4072-b511-c4a19261f507" path="/var/lib/kubelet/pods/35fed1b9-7133-4072-b511-c4a19261f507/volumes" Mar 11 02:26:11 crc kubenswrapper[4744]: I0311 02:26:11.417044 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-59b6c5dc-qrpfm" Mar 11 02:26:11 crc kubenswrapper[4744]: I0311 02:26:11.493103 4744 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7c76c8f9c5-hc9kl"] Mar 11 02:26:11 crc kubenswrapper[4744]: I0311 02:26:11.493375 4744 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-7c76c8f9c5-hc9kl" podUID="4f1a10b7-2665-4a3b-a85d-f0e4831d8e24" containerName="dnsmasq-dns" containerID="cri-o://1a5b0273748bd5eaafaace5f1448a1d0f51d979c5bb0dd338f4fb22e28db547a" gracePeriod=10 Mar 11 02:26:12 crc kubenswrapper[4744]: I0311 02:26:12.017726 4744 generic.go:334] "Generic (PLEG): container finished" podID="4f1a10b7-2665-4a3b-a85d-f0e4831d8e24" containerID="1a5b0273748bd5eaafaace5f1448a1d0f51d979c5bb0dd338f4fb22e28db547a" exitCode=0 Mar 11 02:26:12 crc kubenswrapper[4744]: I0311 02:26:12.018032 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7c76c8f9c5-hc9kl" event={"ID":"4f1a10b7-2665-4a3b-a85d-f0e4831d8e24","Type":"ContainerDied","Data":"1a5b0273748bd5eaafaace5f1448a1d0f51d979c5bb0dd338f4fb22e28db547a"} Mar 11 02:26:12 crc kubenswrapper[4744]: I0311 02:26:12.018059 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7c76c8f9c5-hc9kl" 
event={"ID":"4f1a10b7-2665-4a3b-a85d-f0e4831d8e24","Type":"ContainerDied","Data":"7628c42b959d40b90261e261cd00b42518b2c4e61e0c7c19d7558bdff8214ac9"} Mar 11 02:26:12 crc kubenswrapper[4744]: I0311 02:26:12.018069 4744 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7628c42b959d40b90261e261cd00b42518b2c4e61e0c7c19d7558bdff8214ac9" Mar 11 02:26:12 crc kubenswrapper[4744]: I0311 02:26:12.018157 4744 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7c76c8f9c5-hc9kl" Mar 11 02:26:12 crc kubenswrapper[4744]: I0311 02:26:12.034127 4744 generic.go:334] "Generic (PLEG): container finished" podID="e325c578-6909-4a00-8a16-436f430a8071" containerID="664d51ea20ee21d04086ff13d1c2201d4a90c51e74b13ee455c5ba70d494c948" exitCode=0 Mar 11 02:26:12 crc kubenswrapper[4744]: I0311 02:26:12.034172 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-2w2gq" event={"ID":"e325c578-6909-4a00-8a16-436f430a8071","Type":"ContainerDied","Data":"664d51ea20ee21d04086ff13d1c2201d4a90c51e74b13ee455c5ba70d494c948"} Mar 11 02:26:12 crc kubenswrapper[4744]: I0311 02:26:12.113146 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/4f1a10b7-2665-4a3b-a85d-f0e4831d8e24-ovsdbserver-sb\") pod \"4f1a10b7-2665-4a3b-a85d-f0e4831d8e24\" (UID: \"4f1a10b7-2665-4a3b-a85d-f0e4831d8e24\") " Mar 11 02:26:12 crc kubenswrapper[4744]: I0311 02:26:12.113193 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/4f1a10b7-2665-4a3b-a85d-f0e4831d8e24-ovsdbserver-nb\") pod \"4f1a10b7-2665-4a3b-a85d-f0e4831d8e24\" (UID: \"4f1a10b7-2665-4a3b-a85d-f0e4831d8e24\") " Mar 11 02:26:12 crc kubenswrapper[4744]: I0311 02:26:12.113348 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/4f1a10b7-2665-4a3b-a85d-f0e4831d8e24-config\") pod \"4f1a10b7-2665-4a3b-a85d-f0e4831d8e24\" (UID: \"4f1a10b7-2665-4a3b-a85d-f0e4831d8e24\") " Mar 11 02:26:12 crc kubenswrapper[4744]: I0311 02:26:12.113394 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4f1a10b7-2665-4a3b-a85d-f0e4831d8e24-dns-svc\") pod \"4f1a10b7-2665-4a3b-a85d-f0e4831d8e24\" (UID: \"4f1a10b7-2665-4a3b-a85d-f0e4831d8e24\") " Mar 11 02:26:12 crc kubenswrapper[4744]: I0311 02:26:12.113479 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rxcwd\" (UniqueName: \"kubernetes.io/projected/4f1a10b7-2665-4a3b-a85d-f0e4831d8e24-kube-api-access-rxcwd\") pod \"4f1a10b7-2665-4a3b-a85d-f0e4831d8e24\" (UID: \"4f1a10b7-2665-4a3b-a85d-f0e4831d8e24\") " Mar 11 02:26:12 crc kubenswrapper[4744]: I0311 02:26:12.140715 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4f1a10b7-2665-4a3b-a85d-f0e4831d8e24-kube-api-access-rxcwd" (OuterVolumeSpecName: "kube-api-access-rxcwd") pod "4f1a10b7-2665-4a3b-a85d-f0e4831d8e24" (UID: "4f1a10b7-2665-4a3b-a85d-f0e4831d8e24"). InnerVolumeSpecName "kube-api-access-rxcwd". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 02:26:12 crc kubenswrapper[4744]: I0311 02:26:12.160955 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4f1a10b7-2665-4a3b-a85d-f0e4831d8e24-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "4f1a10b7-2665-4a3b-a85d-f0e4831d8e24" (UID: "4f1a10b7-2665-4a3b-a85d-f0e4831d8e24"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 11 02:26:12 crc kubenswrapper[4744]: I0311 02:26:12.170410 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4f1a10b7-2665-4a3b-a85d-f0e4831d8e24-config" (OuterVolumeSpecName: "config") pod "4f1a10b7-2665-4a3b-a85d-f0e4831d8e24" (UID: "4f1a10b7-2665-4a3b-a85d-f0e4831d8e24"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 11 02:26:12 crc kubenswrapper[4744]: I0311 02:26:12.177901 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4f1a10b7-2665-4a3b-a85d-f0e4831d8e24-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "4f1a10b7-2665-4a3b-a85d-f0e4831d8e24" (UID: "4f1a10b7-2665-4a3b-a85d-f0e4831d8e24"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 11 02:26:12 crc kubenswrapper[4744]: I0311 02:26:12.215033 4744 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4f1a10b7-2665-4a3b-a85d-f0e4831d8e24-config\") on node \"crc\" DevicePath \"\"" Mar 11 02:26:12 crc kubenswrapper[4744]: I0311 02:26:12.215235 4744 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4f1a10b7-2665-4a3b-a85d-f0e4831d8e24-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 11 02:26:12 crc kubenswrapper[4744]: I0311 02:26:12.215311 4744 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rxcwd\" (UniqueName: \"kubernetes.io/projected/4f1a10b7-2665-4a3b-a85d-f0e4831d8e24-kube-api-access-rxcwd\") on node \"crc\" DevicePath \"\"" Mar 11 02:26:12 crc kubenswrapper[4744]: I0311 02:26:12.215403 4744 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/4f1a10b7-2665-4a3b-a85d-f0e4831d8e24-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 11 02:26:12 crc kubenswrapper[4744]: I0311 
02:26:12.229884 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4f1a10b7-2665-4a3b-a85d-f0e4831d8e24-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "4f1a10b7-2665-4a3b-a85d-f0e4831d8e24" (UID: "4f1a10b7-2665-4a3b-a85d-f0e4831d8e24"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 11 02:26:12 crc kubenswrapper[4744]: I0311 02:26:12.316578 4744 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/4f1a10b7-2665-4a3b-a85d-f0e4831d8e24-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 11 02:26:12 crc kubenswrapper[4744]: I0311 02:26:12.409578 4744 patch_prober.go:28] interesting pod/machine-config-daemon-678nx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 11 02:26:12 crc kubenswrapper[4744]: I0311 02:26:12.409636 4744 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-678nx" podUID="a15dc7ac-7c34-4135-b6eb-a85122800ce9" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 11 02:26:12 crc kubenswrapper[4744]: I0311 02:26:12.409680 4744 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-678nx" Mar 11 02:26:12 crc kubenswrapper[4744]: I0311 02:26:12.410366 4744 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"8aecf29c05b573f43c0dc7af38620d95b6c4462b39682ce2f5e33235c90e00ed"} pod="openshift-machine-config-operator/machine-config-daemon-678nx" containerMessage="Container 
machine-config-daemon failed liveness probe, will be restarted" Mar 11 02:26:12 crc kubenswrapper[4744]: I0311 02:26:12.410419 4744 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-678nx" podUID="a15dc7ac-7c34-4135-b6eb-a85122800ce9" containerName="machine-config-daemon" containerID="cri-o://8aecf29c05b573f43c0dc7af38620d95b6c4462b39682ce2f5e33235c90e00ed" gracePeriod=600 Mar 11 02:26:12 crc kubenswrapper[4744]: I0311 02:26:12.672449 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-bknbb" Mar 11 02:26:12 crc kubenswrapper[4744]: I0311 02:26:12.672772 4744 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-bknbb" Mar 11 02:26:13 crc kubenswrapper[4744]: I0311 02:26:13.043734 4744 generic.go:334] "Generic (PLEG): container finished" podID="a15dc7ac-7c34-4135-b6eb-a85122800ce9" containerID="8aecf29c05b573f43c0dc7af38620d95b6c4462b39682ce2f5e33235c90e00ed" exitCode=0 Mar 11 02:26:13 crc kubenswrapper[4744]: I0311 02:26:13.043817 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-678nx" event={"ID":"a15dc7ac-7c34-4135-b6eb-a85122800ce9","Type":"ContainerDied","Data":"8aecf29c05b573f43c0dc7af38620d95b6c4462b39682ce2f5e33235c90e00ed"} Mar 11 02:26:13 crc kubenswrapper[4744]: I0311 02:26:13.044073 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-678nx" event={"ID":"a15dc7ac-7c34-4135-b6eb-a85122800ce9","Type":"ContainerStarted","Data":"0c6d2088731104df66ba686648eb351061ccd27cb0c0c1c48cf7ba6629e29690"} Mar 11 02:26:13 crc kubenswrapper[4744]: I0311 02:26:13.044098 4744 scope.go:117] "RemoveContainer" containerID="6c2c2442692de599f07518611dfa7798a497d5f80b894d7b831bca9043e56ffc" Mar 11 02:26:13 crc kubenswrapper[4744]: I0311 
02:26:13.044138 4744 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7c76c8f9c5-hc9kl" Mar 11 02:26:13 crc kubenswrapper[4744]: I0311 02:26:13.116139 4744 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7c76c8f9c5-hc9kl"] Mar 11 02:26:13 crc kubenswrapper[4744]: I0311 02:26:13.119759 4744 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-7c76c8f9c5-hc9kl"] Mar 11 02:26:13 crc kubenswrapper[4744]: I0311 02:26:13.430849 4744 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-2w2gq" Mar 11 02:26:13 crc kubenswrapper[4744]: I0311 02:26:13.534820 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/e325c578-6909-4a00-8a16-436f430a8071-credential-keys\") pod \"e325c578-6909-4a00-8a16-436f430a8071\" (UID: \"e325c578-6909-4a00-8a16-436f430a8071\") " Mar 11 02:26:13 crc kubenswrapper[4744]: I0311 02:26:13.535215 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e325c578-6909-4a00-8a16-436f430a8071-config-data\") pod \"e325c578-6909-4a00-8a16-436f430a8071\" (UID: \"e325c578-6909-4a00-8a16-436f430a8071\") " Mar 11 02:26:13 crc kubenswrapper[4744]: I0311 02:26:13.535363 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e325c578-6909-4a00-8a16-436f430a8071-combined-ca-bundle\") pod \"e325c578-6909-4a00-8a16-436f430a8071\" (UID: \"e325c578-6909-4a00-8a16-436f430a8071\") " Mar 11 02:26:13 crc kubenswrapper[4744]: I0311 02:26:13.535671 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/e325c578-6909-4a00-8a16-436f430a8071-fernet-keys\") pod 
\"e325c578-6909-4a00-8a16-436f430a8071\" (UID: \"e325c578-6909-4a00-8a16-436f430a8071\") " Mar 11 02:26:13 crc kubenswrapper[4744]: I0311 02:26:13.535877 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bmln6\" (UniqueName: \"kubernetes.io/projected/e325c578-6909-4a00-8a16-436f430a8071-kube-api-access-bmln6\") pod \"e325c578-6909-4a00-8a16-436f430a8071\" (UID: \"e325c578-6909-4a00-8a16-436f430a8071\") " Mar 11 02:26:13 crc kubenswrapper[4744]: I0311 02:26:13.536012 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e325c578-6909-4a00-8a16-436f430a8071-scripts\") pod \"e325c578-6909-4a00-8a16-436f430a8071\" (UID: \"e325c578-6909-4a00-8a16-436f430a8071\") " Mar 11 02:26:13 crc kubenswrapper[4744]: I0311 02:26:13.540121 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e325c578-6909-4a00-8a16-436f430a8071-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "e325c578-6909-4a00-8a16-436f430a8071" (UID: "e325c578-6909-4a00-8a16-436f430a8071"). InnerVolumeSpecName "credential-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 02:26:13 crc kubenswrapper[4744]: I0311 02:26:13.540246 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e325c578-6909-4a00-8a16-436f430a8071-kube-api-access-bmln6" (OuterVolumeSpecName: "kube-api-access-bmln6") pod "e325c578-6909-4a00-8a16-436f430a8071" (UID: "e325c578-6909-4a00-8a16-436f430a8071"). InnerVolumeSpecName "kube-api-access-bmln6". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 02:26:13 crc kubenswrapper[4744]: I0311 02:26:13.541316 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e325c578-6909-4a00-8a16-436f430a8071-scripts" (OuterVolumeSpecName: "scripts") pod "e325c578-6909-4a00-8a16-436f430a8071" (UID: "e325c578-6909-4a00-8a16-436f430a8071"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 02:26:13 crc kubenswrapper[4744]: I0311 02:26:13.549353 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e325c578-6909-4a00-8a16-436f430a8071-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "e325c578-6909-4a00-8a16-436f430a8071" (UID: "e325c578-6909-4a00-8a16-436f430a8071"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 02:26:13 crc kubenswrapper[4744]: I0311 02:26:13.575764 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e325c578-6909-4a00-8a16-436f430a8071-config-data" (OuterVolumeSpecName: "config-data") pod "e325c578-6909-4a00-8a16-436f430a8071" (UID: "e325c578-6909-4a00-8a16-436f430a8071"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 02:26:13 crc kubenswrapper[4744]: I0311 02:26:13.583739 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e325c578-6909-4a00-8a16-436f430a8071-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "e325c578-6909-4a00-8a16-436f430a8071" (UID: "e325c578-6909-4a00-8a16-436f430a8071"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 02:26:13 crc kubenswrapper[4744]: I0311 02:26:13.638621 4744 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/e325c578-6909-4a00-8a16-436f430a8071-credential-keys\") on node \"crc\" DevicePath \"\"" Mar 11 02:26:13 crc kubenswrapper[4744]: I0311 02:26:13.638670 4744 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e325c578-6909-4a00-8a16-436f430a8071-config-data\") on node \"crc\" DevicePath \"\"" Mar 11 02:26:13 crc kubenswrapper[4744]: I0311 02:26:13.638688 4744 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e325c578-6909-4a00-8a16-436f430a8071-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 11 02:26:13 crc kubenswrapper[4744]: I0311 02:26:13.638704 4744 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/e325c578-6909-4a00-8a16-436f430a8071-fernet-keys\") on node \"crc\" DevicePath \"\"" Mar 11 02:26:13 crc kubenswrapper[4744]: I0311 02:26:13.638720 4744 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bmln6\" (UniqueName: \"kubernetes.io/projected/e325c578-6909-4a00-8a16-436f430a8071-kube-api-access-bmln6\") on node \"crc\" DevicePath \"\"" Mar 11 02:26:13 crc kubenswrapper[4744]: I0311 02:26:13.638737 4744 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e325c578-6909-4a00-8a16-436f430a8071-scripts\") on node \"crc\" DevicePath \"\"" Mar 11 02:26:13 crc kubenswrapper[4744]: I0311 02:26:13.762632 4744 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-bknbb" podUID="da90a95c-3594-4021-9a54-6efc0fcd1d10" containerName="registry-server" probeResult="failure" output=< Mar 11 02:26:13 crc kubenswrapper[4744]: timeout: failed to connect service 
":50051" within 1s Mar 11 02:26:13 crc kubenswrapper[4744]: > Mar 11 02:26:13 crc kubenswrapper[4744]: I0311 02:26:13.993420 4744 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4f1a10b7-2665-4a3b-a85d-f0e4831d8e24" path="/var/lib/kubelet/pods/4f1a10b7-2665-4a3b-a85d-f0e4831d8e24/volumes" Mar 11 02:26:14 crc kubenswrapper[4744]: I0311 02:26:14.062939 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-2w2gq" event={"ID":"e325c578-6909-4a00-8a16-436f430a8071","Type":"ContainerDied","Data":"556a3ce628bc13a44203310e0cd21bcf55297b039ec24793c6a689a2fe5fb209"} Mar 11 02:26:14 crc kubenswrapper[4744]: I0311 02:26:14.062998 4744 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="556a3ce628bc13a44203310e0cd21bcf55297b039ec24793c6a689a2fe5fb209" Mar 11 02:26:14 crc kubenswrapper[4744]: I0311 02:26:14.063020 4744 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-2w2gq" Mar 11 02:26:14 crc kubenswrapper[4744]: I0311 02:26:14.208433 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-84c66b68ff-jvb9q"] Mar 11 02:26:14 crc kubenswrapper[4744]: E0311 02:26:14.209095 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e325c578-6909-4a00-8a16-436f430a8071" containerName="keystone-bootstrap" Mar 11 02:26:14 crc kubenswrapper[4744]: I0311 02:26:14.209117 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="e325c578-6909-4a00-8a16-436f430a8071" containerName="keystone-bootstrap" Mar 11 02:26:14 crc kubenswrapper[4744]: E0311 02:26:14.209129 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4f1a10b7-2665-4a3b-a85d-f0e4831d8e24" containerName="init" Mar 11 02:26:14 crc kubenswrapper[4744]: I0311 02:26:14.209137 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="4f1a10b7-2665-4a3b-a85d-f0e4831d8e24" containerName="init" Mar 11 02:26:14 crc 
kubenswrapper[4744]: E0311 02:26:14.209150 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4f1a10b7-2665-4a3b-a85d-f0e4831d8e24" containerName="dnsmasq-dns" Mar 11 02:26:14 crc kubenswrapper[4744]: I0311 02:26:14.209159 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="4f1a10b7-2665-4a3b-a85d-f0e4831d8e24" containerName="dnsmasq-dns" Mar 11 02:26:14 crc kubenswrapper[4744]: I0311 02:26:14.209369 4744 memory_manager.go:354] "RemoveStaleState removing state" podUID="4f1a10b7-2665-4a3b-a85d-f0e4831d8e24" containerName="dnsmasq-dns" Mar 11 02:26:14 crc kubenswrapper[4744]: I0311 02:26:14.209396 4744 memory_manager.go:354] "RemoveStaleState removing state" podUID="e325c578-6909-4a00-8a16-436f430a8071" containerName="keystone-bootstrap" Mar 11 02:26:14 crc kubenswrapper[4744]: I0311 02:26:14.210041 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-84c66b68ff-jvb9q" Mar 11 02:26:14 crc kubenswrapper[4744]: I0311 02:26:14.214834 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Mar 11 02:26:14 crc kubenswrapper[4744]: I0311 02:26:14.215152 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Mar 11 02:26:14 crc kubenswrapper[4744]: I0311 02:26:14.215773 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-keystone-internal-svc" Mar 11 02:26:14 crc kubenswrapper[4744]: I0311 02:26:14.215985 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-gkpp9" Mar 11 02:26:14 crc kubenswrapper[4744]: I0311 02:26:14.216127 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-keystone-public-svc" Mar 11 02:26:14 crc kubenswrapper[4744]: I0311 02:26:14.216912 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Mar 11 02:26:14 crc kubenswrapper[4744]: I0311 
02:26:14.239275 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-84c66b68ff-jvb9q"] Mar 11 02:26:14 crc kubenswrapper[4744]: I0311 02:26:14.352092 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5nvlv\" (UniqueName: \"kubernetes.io/projected/8eda2e74-8afe-403d-b20d-b3953a3bed0f-kube-api-access-5nvlv\") pod \"keystone-84c66b68ff-jvb9q\" (UID: \"8eda2e74-8afe-403d-b20d-b3953a3bed0f\") " pod="openstack/keystone-84c66b68ff-jvb9q" Mar 11 02:26:14 crc kubenswrapper[4744]: I0311 02:26:14.352167 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8eda2e74-8afe-403d-b20d-b3953a3bed0f-combined-ca-bundle\") pod \"keystone-84c66b68ff-jvb9q\" (UID: \"8eda2e74-8afe-403d-b20d-b3953a3bed0f\") " pod="openstack/keystone-84c66b68ff-jvb9q" Mar 11 02:26:14 crc kubenswrapper[4744]: I0311 02:26:14.352212 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8eda2e74-8afe-403d-b20d-b3953a3bed0f-scripts\") pod \"keystone-84c66b68ff-jvb9q\" (UID: \"8eda2e74-8afe-403d-b20d-b3953a3bed0f\") " pod="openstack/keystone-84c66b68ff-jvb9q" Mar 11 02:26:14 crc kubenswrapper[4744]: I0311 02:26:14.352243 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/8eda2e74-8afe-403d-b20d-b3953a3bed0f-internal-tls-certs\") pod \"keystone-84c66b68ff-jvb9q\" (UID: \"8eda2e74-8afe-403d-b20d-b3953a3bed0f\") " pod="openstack/keystone-84c66b68ff-jvb9q" Mar 11 02:26:14 crc kubenswrapper[4744]: I0311 02:26:14.352275 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/8eda2e74-8afe-403d-b20d-b3953a3bed0f-fernet-keys\") pod 
\"keystone-84c66b68ff-jvb9q\" (UID: \"8eda2e74-8afe-403d-b20d-b3953a3bed0f\") " pod="openstack/keystone-84c66b68ff-jvb9q" Mar 11 02:26:14 crc kubenswrapper[4744]: I0311 02:26:14.352291 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/8eda2e74-8afe-403d-b20d-b3953a3bed0f-credential-keys\") pod \"keystone-84c66b68ff-jvb9q\" (UID: \"8eda2e74-8afe-403d-b20d-b3953a3bed0f\") " pod="openstack/keystone-84c66b68ff-jvb9q" Mar 11 02:26:14 crc kubenswrapper[4744]: I0311 02:26:14.352313 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8eda2e74-8afe-403d-b20d-b3953a3bed0f-config-data\") pod \"keystone-84c66b68ff-jvb9q\" (UID: \"8eda2e74-8afe-403d-b20d-b3953a3bed0f\") " pod="openstack/keystone-84c66b68ff-jvb9q" Mar 11 02:26:14 crc kubenswrapper[4744]: I0311 02:26:14.352341 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/8eda2e74-8afe-403d-b20d-b3953a3bed0f-public-tls-certs\") pod \"keystone-84c66b68ff-jvb9q\" (UID: \"8eda2e74-8afe-403d-b20d-b3953a3bed0f\") " pod="openstack/keystone-84c66b68ff-jvb9q" Mar 11 02:26:14 crc kubenswrapper[4744]: I0311 02:26:14.453688 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8eda2e74-8afe-403d-b20d-b3953a3bed0f-combined-ca-bundle\") pod \"keystone-84c66b68ff-jvb9q\" (UID: \"8eda2e74-8afe-403d-b20d-b3953a3bed0f\") " pod="openstack/keystone-84c66b68ff-jvb9q" Mar 11 02:26:14 crc kubenswrapper[4744]: I0311 02:26:14.453810 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8eda2e74-8afe-403d-b20d-b3953a3bed0f-scripts\") pod \"keystone-84c66b68ff-jvb9q\" (UID: 
\"8eda2e74-8afe-403d-b20d-b3953a3bed0f\") " pod="openstack/keystone-84c66b68ff-jvb9q" Mar 11 02:26:14 crc kubenswrapper[4744]: I0311 02:26:14.453871 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/8eda2e74-8afe-403d-b20d-b3953a3bed0f-internal-tls-certs\") pod \"keystone-84c66b68ff-jvb9q\" (UID: \"8eda2e74-8afe-403d-b20d-b3953a3bed0f\") " pod="openstack/keystone-84c66b68ff-jvb9q" Mar 11 02:26:14 crc kubenswrapper[4744]: I0311 02:26:14.453940 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/8eda2e74-8afe-403d-b20d-b3953a3bed0f-fernet-keys\") pod \"keystone-84c66b68ff-jvb9q\" (UID: \"8eda2e74-8afe-403d-b20d-b3953a3bed0f\") " pod="openstack/keystone-84c66b68ff-jvb9q" Mar 11 02:26:14 crc kubenswrapper[4744]: I0311 02:26:14.453984 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/8eda2e74-8afe-403d-b20d-b3953a3bed0f-credential-keys\") pod \"keystone-84c66b68ff-jvb9q\" (UID: \"8eda2e74-8afe-403d-b20d-b3953a3bed0f\") " pod="openstack/keystone-84c66b68ff-jvb9q" Mar 11 02:26:14 crc kubenswrapper[4744]: I0311 02:26:14.454016 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8eda2e74-8afe-403d-b20d-b3953a3bed0f-config-data\") pod \"keystone-84c66b68ff-jvb9q\" (UID: \"8eda2e74-8afe-403d-b20d-b3953a3bed0f\") " pod="openstack/keystone-84c66b68ff-jvb9q" Mar 11 02:26:14 crc kubenswrapper[4744]: I0311 02:26:14.454060 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/8eda2e74-8afe-403d-b20d-b3953a3bed0f-public-tls-certs\") pod \"keystone-84c66b68ff-jvb9q\" (UID: \"8eda2e74-8afe-403d-b20d-b3953a3bed0f\") " pod="openstack/keystone-84c66b68ff-jvb9q" Mar 11 
02:26:14 crc kubenswrapper[4744]: I0311 02:26:14.454290 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5nvlv\" (UniqueName: \"kubernetes.io/projected/8eda2e74-8afe-403d-b20d-b3953a3bed0f-kube-api-access-5nvlv\") pod \"keystone-84c66b68ff-jvb9q\" (UID: \"8eda2e74-8afe-403d-b20d-b3953a3bed0f\") " pod="openstack/keystone-84c66b68ff-jvb9q" Mar 11 02:26:14 crc kubenswrapper[4744]: I0311 02:26:14.459918 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8eda2e74-8afe-403d-b20d-b3953a3bed0f-scripts\") pod \"keystone-84c66b68ff-jvb9q\" (UID: \"8eda2e74-8afe-403d-b20d-b3953a3bed0f\") " pod="openstack/keystone-84c66b68ff-jvb9q" Mar 11 02:26:14 crc kubenswrapper[4744]: I0311 02:26:14.460461 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8eda2e74-8afe-403d-b20d-b3953a3bed0f-combined-ca-bundle\") pod \"keystone-84c66b68ff-jvb9q\" (UID: \"8eda2e74-8afe-403d-b20d-b3953a3bed0f\") " pod="openstack/keystone-84c66b68ff-jvb9q" Mar 11 02:26:14 crc kubenswrapper[4744]: I0311 02:26:14.461074 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/8eda2e74-8afe-403d-b20d-b3953a3bed0f-credential-keys\") pod \"keystone-84c66b68ff-jvb9q\" (UID: \"8eda2e74-8afe-403d-b20d-b3953a3bed0f\") " pod="openstack/keystone-84c66b68ff-jvb9q" Mar 11 02:26:14 crc kubenswrapper[4744]: I0311 02:26:14.461124 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/8eda2e74-8afe-403d-b20d-b3953a3bed0f-public-tls-certs\") pod \"keystone-84c66b68ff-jvb9q\" (UID: \"8eda2e74-8afe-403d-b20d-b3953a3bed0f\") " pod="openstack/keystone-84c66b68ff-jvb9q" Mar 11 02:26:14 crc kubenswrapper[4744]: I0311 02:26:14.463204 4744 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8eda2e74-8afe-403d-b20d-b3953a3bed0f-config-data\") pod \"keystone-84c66b68ff-jvb9q\" (UID: \"8eda2e74-8afe-403d-b20d-b3953a3bed0f\") " pod="openstack/keystone-84c66b68ff-jvb9q" Mar 11 02:26:14 crc kubenswrapper[4744]: I0311 02:26:14.467196 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/8eda2e74-8afe-403d-b20d-b3953a3bed0f-fernet-keys\") pod \"keystone-84c66b68ff-jvb9q\" (UID: \"8eda2e74-8afe-403d-b20d-b3953a3bed0f\") " pod="openstack/keystone-84c66b68ff-jvb9q" Mar 11 02:26:14 crc kubenswrapper[4744]: I0311 02:26:14.467749 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/8eda2e74-8afe-403d-b20d-b3953a3bed0f-internal-tls-certs\") pod \"keystone-84c66b68ff-jvb9q\" (UID: \"8eda2e74-8afe-403d-b20d-b3953a3bed0f\") " pod="openstack/keystone-84c66b68ff-jvb9q" Mar 11 02:26:14 crc kubenswrapper[4744]: I0311 02:26:14.475187 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5nvlv\" (UniqueName: \"kubernetes.io/projected/8eda2e74-8afe-403d-b20d-b3953a3bed0f-kube-api-access-5nvlv\") pod \"keystone-84c66b68ff-jvb9q\" (UID: \"8eda2e74-8afe-403d-b20d-b3953a3bed0f\") " pod="openstack/keystone-84c66b68ff-jvb9q" Mar 11 02:26:14 crc kubenswrapper[4744]: I0311 02:26:14.537465 4744 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-84c66b68ff-jvb9q" Mar 11 02:26:15 crc kubenswrapper[4744]: I0311 02:26:15.007848 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-84c66b68ff-jvb9q"] Mar 11 02:26:15 crc kubenswrapper[4744]: I0311 02:26:15.083739 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-84c66b68ff-jvb9q" event={"ID":"8eda2e74-8afe-403d-b20d-b3953a3bed0f","Type":"ContainerStarted","Data":"71d49ea4d97a400b8eeb106cd7650f8bb6c05939126371d19dc14b56a65b3ba3"} Mar 11 02:26:16 crc kubenswrapper[4744]: I0311 02:26:16.093634 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-84c66b68ff-jvb9q" event={"ID":"8eda2e74-8afe-403d-b20d-b3953a3bed0f","Type":"ContainerStarted","Data":"26b0ad25027883928d61f103b97a07a2f6ed1d04316386255b6f4fa089f9c958"} Mar 11 02:26:16 crc kubenswrapper[4744]: I0311 02:26:16.094024 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/keystone-84c66b68ff-jvb9q" Mar 11 02:26:16 crc kubenswrapper[4744]: I0311 02:26:16.134076 4744 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-84c66b68ff-jvb9q" podStartSLOduration=2.134043167 podStartE2EDuration="2.134043167s" podCreationTimestamp="2026-03-11 02:26:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-11 02:26:16.11824052 +0000 UTC m=+5532.922458125" watchObservedRunningTime="2026-03-11 02:26:16.134043167 +0000 UTC m=+5532.938260822" Mar 11 02:26:16 crc kubenswrapper[4744]: I0311 02:26:16.498031 4744 scope.go:117] "RemoveContainer" containerID="692f4e0f9c3819a33b328d73569888269a69e6a812161c921ef2c00811a5ca28" Mar 11 02:26:22 crc kubenswrapper[4744]: I0311 02:26:22.757500 4744 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-bknbb" Mar 11 02:26:22 crc kubenswrapper[4744]: 
I0311 02:26:22.842601 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-bknbb" Mar 11 02:26:23 crc kubenswrapper[4744]: I0311 02:26:23.017862 4744 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-bknbb"] Mar 11 02:26:24 crc kubenswrapper[4744]: I0311 02:26:24.182029 4744 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-bknbb" podUID="da90a95c-3594-4021-9a54-6efc0fcd1d10" containerName="registry-server" containerID="cri-o://b528fdc4d6977f8f8f20b92c5a1feb2ab036eec15919ee5749c9ae4b25cf22d1" gracePeriod=2 Mar 11 02:26:24 crc kubenswrapper[4744]: I0311 02:26:24.759137 4744 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-bknbb" Mar 11 02:26:24 crc kubenswrapper[4744]: I0311 02:26:24.849432 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xz2sw\" (UniqueName: \"kubernetes.io/projected/da90a95c-3594-4021-9a54-6efc0fcd1d10-kube-api-access-xz2sw\") pod \"da90a95c-3594-4021-9a54-6efc0fcd1d10\" (UID: \"da90a95c-3594-4021-9a54-6efc0fcd1d10\") " Mar 11 02:26:24 crc kubenswrapper[4744]: I0311 02:26:24.849538 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/da90a95c-3594-4021-9a54-6efc0fcd1d10-utilities\") pod \"da90a95c-3594-4021-9a54-6efc0fcd1d10\" (UID: \"da90a95c-3594-4021-9a54-6efc0fcd1d10\") " Mar 11 02:26:24 crc kubenswrapper[4744]: I0311 02:26:24.849575 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/da90a95c-3594-4021-9a54-6efc0fcd1d10-catalog-content\") pod \"da90a95c-3594-4021-9a54-6efc0fcd1d10\" (UID: \"da90a95c-3594-4021-9a54-6efc0fcd1d10\") " Mar 11 02:26:24 crc kubenswrapper[4744]: I0311 
02:26:24.850722 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/da90a95c-3594-4021-9a54-6efc0fcd1d10-utilities" (OuterVolumeSpecName: "utilities") pod "da90a95c-3594-4021-9a54-6efc0fcd1d10" (UID: "da90a95c-3594-4021-9a54-6efc0fcd1d10"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 11 02:26:24 crc kubenswrapper[4744]: I0311 02:26:24.857429 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/da90a95c-3594-4021-9a54-6efc0fcd1d10-kube-api-access-xz2sw" (OuterVolumeSpecName: "kube-api-access-xz2sw") pod "da90a95c-3594-4021-9a54-6efc0fcd1d10" (UID: "da90a95c-3594-4021-9a54-6efc0fcd1d10"). InnerVolumeSpecName "kube-api-access-xz2sw". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 02:26:24 crc kubenswrapper[4744]: I0311 02:26:24.951554 4744 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/da90a95c-3594-4021-9a54-6efc0fcd1d10-utilities\") on node \"crc\" DevicePath \"\"" Mar 11 02:26:24 crc kubenswrapper[4744]: I0311 02:26:24.951619 4744 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xz2sw\" (UniqueName: \"kubernetes.io/projected/da90a95c-3594-4021-9a54-6efc0fcd1d10-kube-api-access-xz2sw\") on node \"crc\" DevicePath \"\"" Mar 11 02:26:25 crc kubenswrapper[4744]: I0311 02:26:25.050363 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/da90a95c-3594-4021-9a54-6efc0fcd1d10-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "da90a95c-3594-4021-9a54-6efc0fcd1d10" (UID: "da90a95c-3594-4021-9a54-6efc0fcd1d10"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 11 02:26:25 crc kubenswrapper[4744]: I0311 02:26:25.053583 4744 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/da90a95c-3594-4021-9a54-6efc0fcd1d10-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 11 02:26:25 crc kubenswrapper[4744]: I0311 02:26:25.193978 4744 generic.go:334] "Generic (PLEG): container finished" podID="da90a95c-3594-4021-9a54-6efc0fcd1d10" containerID="b528fdc4d6977f8f8f20b92c5a1feb2ab036eec15919ee5749c9ae4b25cf22d1" exitCode=0 Mar 11 02:26:25 crc kubenswrapper[4744]: I0311 02:26:25.194322 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-bknbb" event={"ID":"da90a95c-3594-4021-9a54-6efc0fcd1d10","Type":"ContainerDied","Data":"b528fdc4d6977f8f8f20b92c5a1feb2ab036eec15919ee5749c9ae4b25cf22d1"} Mar 11 02:26:25 crc kubenswrapper[4744]: I0311 02:26:25.194364 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-bknbb" event={"ID":"da90a95c-3594-4021-9a54-6efc0fcd1d10","Type":"ContainerDied","Data":"f47ef7472471a84aa8ffc47f10fffdf1185f3046ee6f45047a109e6f2e9ea556"} Mar 11 02:26:25 crc kubenswrapper[4744]: I0311 02:26:25.194393 4744 scope.go:117] "RemoveContainer" containerID="b528fdc4d6977f8f8f20b92c5a1feb2ab036eec15919ee5749c9ae4b25cf22d1" Mar 11 02:26:25 crc kubenswrapper[4744]: I0311 02:26:25.194593 4744 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-bknbb" Mar 11 02:26:25 crc kubenswrapper[4744]: I0311 02:26:25.230780 4744 scope.go:117] "RemoveContainer" containerID="7425ca425a7ac27cd7719f99c92f81b9431084f1b9df95971852a96adf536cb5" Mar 11 02:26:25 crc kubenswrapper[4744]: I0311 02:26:25.254196 4744 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-bknbb"] Mar 11 02:26:25 crc kubenswrapper[4744]: I0311 02:26:25.260497 4744 scope.go:117] "RemoveContainer" containerID="b29a03b42fd9e311fed88e9dc86015718405a92912ec11b0783a54a6fc22c750" Mar 11 02:26:25 crc kubenswrapper[4744]: I0311 02:26:25.265410 4744 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-bknbb"] Mar 11 02:26:25 crc kubenswrapper[4744]: I0311 02:26:25.306932 4744 scope.go:117] "RemoveContainer" containerID="b528fdc4d6977f8f8f20b92c5a1feb2ab036eec15919ee5749c9ae4b25cf22d1" Mar 11 02:26:25 crc kubenswrapper[4744]: E0311 02:26:25.307548 4744 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b528fdc4d6977f8f8f20b92c5a1feb2ab036eec15919ee5749c9ae4b25cf22d1\": container with ID starting with b528fdc4d6977f8f8f20b92c5a1feb2ab036eec15919ee5749c9ae4b25cf22d1 not found: ID does not exist" containerID="b528fdc4d6977f8f8f20b92c5a1feb2ab036eec15919ee5749c9ae4b25cf22d1" Mar 11 02:26:25 crc kubenswrapper[4744]: I0311 02:26:25.307610 4744 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b528fdc4d6977f8f8f20b92c5a1feb2ab036eec15919ee5749c9ae4b25cf22d1"} err="failed to get container status \"b528fdc4d6977f8f8f20b92c5a1feb2ab036eec15919ee5749c9ae4b25cf22d1\": rpc error: code = NotFound desc = could not find container \"b528fdc4d6977f8f8f20b92c5a1feb2ab036eec15919ee5749c9ae4b25cf22d1\": container with ID starting with b528fdc4d6977f8f8f20b92c5a1feb2ab036eec15919ee5749c9ae4b25cf22d1 not found: ID does 
not exist" Mar 11 02:26:25 crc kubenswrapper[4744]: I0311 02:26:25.307651 4744 scope.go:117] "RemoveContainer" containerID="7425ca425a7ac27cd7719f99c92f81b9431084f1b9df95971852a96adf536cb5" Mar 11 02:26:25 crc kubenswrapper[4744]: E0311 02:26:25.308127 4744 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7425ca425a7ac27cd7719f99c92f81b9431084f1b9df95971852a96adf536cb5\": container with ID starting with 7425ca425a7ac27cd7719f99c92f81b9431084f1b9df95971852a96adf536cb5 not found: ID does not exist" containerID="7425ca425a7ac27cd7719f99c92f81b9431084f1b9df95971852a96adf536cb5" Mar 11 02:26:25 crc kubenswrapper[4744]: I0311 02:26:25.308165 4744 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7425ca425a7ac27cd7719f99c92f81b9431084f1b9df95971852a96adf536cb5"} err="failed to get container status \"7425ca425a7ac27cd7719f99c92f81b9431084f1b9df95971852a96adf536cb5\": rpc error: code = NotFound desc = could not find container \"7425ca425a7ac27cd7719f99c92f81b9431084f1b9df95971852a96adf536cb5\": container with ID starting with 7425ca425a7ac27cd7719f99c92f81b9431084f1b9df95971852a96adf536cb5 not found: ID does not exist" Mar 11 02:26:25 crc kubenswrapper[4744]: I0311 02:26:25.308191 4744 scope.go:117] "RemoveContainer" containerID="b29a03b42fd9e311fed88e9dc86015718405a92912ec11b0783a54a6fc22c750" Mar 11 02:26:25 crc kubenswrapper[4744]: E0311 02:26:25.308580 4744 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b29a03b42fd9e311fed88e9dc86015718405a92912ec11b0783a54a6fc22c750\": container with ID starting with b29a03b42fd9e311fed88e9dc86015718405a92912ec11b0783a54a6fc22c750 not found: ID does not exist" containerID="b29a03b42fd9e311fed88e9dc86015718405a92912ec11b0783a54a6fc22c750" Mar 11 02:26:25 crc kubenswrapper[4744]: I0311 02:26:25.308612 4744 pod_container_deletor.go:53] 
"DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b29a03b42fd9e311fed88e9dc86015718405a92912ec11b0783a54a6fc22c750"} err="failed to get container status \"b29a03b42fd9e311fed88e9dc86015718405a92912ec11b0783a54a6fc22c750\": rpc error: code = NotFound desc = could not find container \"b29a03b42fd9e311fed88e9dc86015718405a92912ec11b0783a54a6fc22c750\": container with ID starting with b29a03b42fd9e311fed88e9dc86015718405a92912ec11b0783a54a6fc22c750 not found: ID does not exist" Mar 11 02:26:25 crc kubenswrapper[4744]: I0311 02:26:25.991083 4744 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="da90a95c-3594-4021-9a54-6efc0fcd1d10" path="/var/lib/kubelet/pods/da90a95c-3594-4021-9a54-6efc0fcd1d10/volumes" Mar 11 02:26:45 crc kubenswrapper[4744]: I0311 02:26:45.930023 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/keystone-84c66b68ff-jvb9q" Mar 11 02:26:50 crc kubenswrapper[4744]: I0311 02:26:50.374548 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstackclient"] Mar 11 02:26:50 crc kubenswrapper[4744]: E0311 02:26:50.375756 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="da90a95c-3594-4021-9a54-6efc0fcd1d10" containerName="extract-utilities" Mar 11 02:26:50 crc kubenswrapper[4744]: I0311 02:26:50.375781 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="da90a95c-3594-4021-9a54-6efc0fcd1d10" containerName="extract-utilities" Mar 11 02:26:50 crc kubenswrapper[4744]: E0311 02:26:50.375850 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="da90a95c-3594-4021-9a54-6efc0fcd1d10" containerName="extract-content" Mar 11 02:26:50 crc kubenswrapper[4744]: I0311 02:26:50.375865 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="da90a95c-3594-4021-9a54-6efc0fcd1d10" containerName="extract-content" Mar 11 02:26:50 crc kubenswrapper[4744]: E0311 02:26:50.376077 4744 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="da90a95c-3594-4021-9a54-6efc0fcd1d10" containerName="registry-server" Mar 11 02:26:50 crc kubenswrapper[4744]: I0311 02:26:50.376096 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="da90a95c-3594-4021-9a54-6efc0fcd1d10" containerName="registry-server" Mar 11 02:26:50 crc kubenswrapper[4744]: I0311 02:26:50.376415 4744 memory_manager.go:354] "RemoveStaleState removing state" podUID="da90a95c-3594-4021-9a54-6efc0fcd1d10" containerName="registry-server" Mar 11 02:26:50 crc kubenswrapper[4744]: I0311 02:26:50.377461 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient" Mar 11 02:26:50 crc kubenswrapper[4744]: I0311 02:26:50.389715 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-config-secret" Mar 11 02:26:50 crc kubenswrapper[4744]: I0311 02:26:50.389974 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-config" Mar 11 02:26:50 crc kubenswrapper[4744]: I0311 02:26:50.391824 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstackclient-openstackclient-dockercfg-wffsm" Mar 11 02:26:50 crc kubenswrapper[4744]: I0311 02:26:50.401458 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Mar 11 02:26:50 crc kubenswrapper[4744]: I0311 02:26:50.534735 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d4x7d\" (UniqueName: \"kubernetes.io/projected/b11cb88a-4127-48bb-9bb1-76c564f9d050-kube-api-access-d4x7d\") pod \"openstackclient\" (UID: \"b11cb88a-4127-48bb-9bb1-76c564f9d050\") " pod="openstack/openstackclient" Mar 11 02:26:50 crc kubenswrapper[4744]: I0311 02:26:50.534897 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config-secret\" (UniqueName: 
\"kubernetes.io/secret/b11cb88a-4127-48bb-9bb1-76c564f9d050-openstack-config-secret\") pod \"openstackclient\" (UID: \"b11cb88a-4127-48bb-9bb1-76c564f9d050\") " pod="openstack/openstackclient" Mar 11 02:26:50 crc kubenswrapper[4744]: I0311 02:26:50.535033 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b11cb88a-4127-48bb-9bb1-76c564f9d050-combined-ca-bundle\") pod \"openstackclient\" (UID: \"b11cb88a-4127-48bb-9bb1-76c564f9d050\") " pod="openstack/openstackclient" Mar 11 02:26:50 crc kubenswrapper[4744]: I0311 02:26:50.535129 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/b11cb88a-4127-48bb-9bb1-76c564f9d050-openstack-config\") pod \"openstackclient\" (UID: \"b11cb88a-4127-48bb-9bb1-76c564f9d050\") " pod="openstack/openstackclient" Mar 11 02:26:50 crc kubenswrapper[4744]: I0311 02:26:50.637732 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/b11cb88a-4127-48bb-9bb1-76c564f9d050-openstack-config\") pod \"openstackclient\" (UID: \"b11cb88a-4127-48bb-9bb1-76c564f9d050\") " pod="openstack/openstackclient" Mar 11 02:26:50 crc kubenswrapper[4744]: I0311 02:26:50.637991 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d4x7d\" (UniqueName: \"kubernetes.io/projected/b11cb88a-4127-48bb-9bb1-76c564f9d050-kube-api-access-d4x7d\") pod \"openstackclient\" (UID: \"b11cb88a-4127-48bb-9bb1-76c564f9d050\") " pod="openstack/openstackclient" Mar 11 02:26:50 crc kubenswrapper[4744]: I0311 02:26:50.638050 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/b11cb88a-4127-48bb-9bb1-76c564f9d050-openstack-config-secret\") pod 
\"openstackclient\" (UID: \"b11cb88a-4127-48bb-9bb1-76c564f9d050\") " pod="openstack/openstackclient" Mar 11 02:26:50 crc kubenswrapper[4744]: I0311 02:26:50.638095 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b11cb88a-4127-48bb-9bb1-76c564f9d050-combined-ca-bundle\") pod \"openstackclient\" (UID: \"b11cb88a-4127-48bb-9bb1-76c564f9d050\") " pod="openstack/openstackclient" Mar 11 02:26:50 crc kubenswrapper[4744]: I0311 02:26:50.639823 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/b11cb88a-4127-48bb-9bb1-76c564f9d050-openstack-config\") pod \"openstackclient\" (UID: \"b11cb88a-4127-48bb-9bb1-76c564f9d050\") " pod="openstack/openstackclient" Mar 11 02:26:50 crc kubenswrapper[4744]: I0311 02:26:50.646578 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b11cb88a-4127-48bb-9bb1-76c564f9d050-combined-ca-bundle\") pod \"openstackclient\" (UID: \"b11cb88a-4127-48bb-9bb1-76c564f9d050\") " pod="openstack/openstackclient" Mar 11 02:26:50 crc kubenswrapper[4744]: I0311 02:26:50.658966 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/b11cb88a-4127-48bb-9bb1-76c564f9d050-openstack-config-secret\") pod \"openstackclient\" (UID: \"b11cb88a-4127-48bb-9bb1-76c564f9d050\") " pod="openstack/openstackclient" Mar 11 02:26:50 crc kubenswrapper[4744]: I0311 02:26:50.661670 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d4x7d\" (UniqueName: \"kubernetes.io/projected/b11cb88a-4127-48bb-9bb1-76c564f9d050-kube-api-access-d4x7d\") pod \"openstackclient\" (UID: \"b11cb88a-4127-48bb-9bb1-76c564f9d050\") " pod="openstack/openstackclient" Mar 11 02:26:50 crc kubenswrapper[4744]: I0311 02:26:50.710867 4744 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient" Mar 11 02:26:50 crc kubenswrapper[4744]: I0311 02:26:50.999389 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Mar 11 02:26:51 crc kubenswrapper[4744]: I0311 02:26:51.505722 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" event={"ID":"b11cb88a-4127-48bb-9bb1-76c564f9d050","Type":"ContainerStarted","Data":"757715dcc4f5f34874ca22ab982f4cfb47878969527df4e626a2dcd426c1f7d9"} Mar 11 02:26:51 crc kubenswrapper[4744]: I0311 02:26:51.506054 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" event={"ID":"b11cb88a-4127-48bb-9bb1-76c564f9d050","Type":"ContainerStarted","Data":"24b63c82ff0a8a36b546c80f998cf183ae2b7ae472b101dd4041094339d14ac5"} Mar 11 02:26:51 crc kubenswrapper[4744]: I0311 02:26:51.539146 4744 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstackclient" podStartSLOduration=1.539116839 podStartE2EDuration="1.539116839s" podCreationTimestamp="2026-03-11 02:26:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-11 02:26:51.528573474 +0000 UTC m=+5568.332791119" watchObservedRunningTime="2026-03-11 02:26:51.539116839 +0000 UTC m=+5568.343334484" Mar 11 02:28:00 crc kubenswrapper[4744]: I0311 02:28:00.134584 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29553268-79s8p"] Mar 11 02:28:00 crc kubenswrapper[4744]: I0311 02:28:00.136209 4744 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29553268-79s8p" Mar 11 02:28:00 crc kubenswrapper[4744]: I0311 02:28:00.144018 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 11 02:28:00 crc kubenswrapper[4744]: I0311 02:28:00.144395 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 11 02:28:00 crc kubenswrapper[4744]: I0311 02:28:00.145881 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-rs77s" Mar 11 02:28:00 crc kubenswrapper[4744]: I0311 02:28:00.159360 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29553268-79s8p"] Mar 11 02:28:00 crc kubenswrapper[4744]: I0311 02:28:00.241479 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m5jbr\" (UniqueName: \"kubernetes.io/projected/dab956a3-b6e6-45d0-82ac-ba8e9c8c6dc1-kube-api-access-m5jbr\") pod \"auto-csr-approver-29553268-79s8p\" (UID: \"dab956a3-b6e6-45d0-82ac-ba8e9c8c6dc1\") " pod="openshift-infra/auto-csr-approver-29553268-79s8p" Mar 11 02:28:00 crc kubenswrapper[4744]: I0311 02:28:00.343611 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m5jbr\" (UniqueName: \"kubernetes.io/projected/dab956a3-b6e6-45d0-82ac-ba8e9c8c6dc1-kube-api-access-m5jbr\") pod \"auto-csr-approver-29553268-79s8p\" (UID: \"dab956a3-b6e6-45d0-82ac-ba8e9c8c6dc1\") " pod="openshift-infra/auto-csr-approver-29553268-79s8p" Mar 11 02:28:00 crc kubenswrapper[4744]: I0311 02:28:00.366295 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m5jbr\" (UniqueName: \"kubernetes.io/projected/dab956a3-b6e6-45d0-82ac-ba8e9c8c6dc1-kube-api-access-m5jbr\") pod \"auto-csr-approver-29553268-79s8p\" (UID: \"dab956a3-b6e6-45d0-82ac-ba8e9c8c6dc1\") " 
pod="openshift-infra/auto-csr-approver-29553268-79s8p" Mar 11 02:28:00 crc kubenswrapper[4744]: I0311 02:28:00.452785 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29553268-79s8p" Mar 11 02:28:00 crc kubenswrapper[4744]: I0311 02:28:00.749787 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29553268-79s8p"] Mar 11 02:28:01 crc kubenswrapper[4744]: I0311 02:28:01.298115 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29553268-79s8p" event={"ID":"dab956a3-b6e6-45d0-82ac-ba8e9c8c6dc1","Type":"ContainerStarted","Data":"a962f3215109316c2ac40432ecb3fbe7a7dd910754f552b996d05712a1664a1a"} Mar 11 02:28:02 crc kubenswrapper[4744]: I0311 02:28:02.308041 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29553268-79s8p" event={"ID":"dab956a3-b6e6-45d0-82ac-ba8e9c8c6dc1","Type":"ContainerStarted","Data":"2b1ed65fae0af39455b6a97f406ad98e8dfea27b325ab77d32a7c3ca47ef82ed"} Mar 11 02:28:02 crc kubenswrapper[4744]: I0311 02:28:02.326436 4744 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29553268-79s8p" podStartSLOduration=1.35042656 podStartE2EDuration="2.326418048s" podCreationTimestamp="2026-03-11 02:28:00 +0000 UTC" firstStartedPulling="2026-03-11 02:28:00.759290465 +0000 UTC m=+5637.563508060" lastFinishedPulling="2026-03-11 02:28:01.735281913 +0000 UTC m=+5638.539499548" observedRunningTime="2026-03-11 02:28:02.323365814 +0000 UTC m=+5639.127583449" watchObservedRunningTime="2026-03-11 02:28:02.326418048 +0000 UTC m=+5639.130635663" Mar 11 02:28:03 crc kubenswrapper[4744]: I0311 02:28:03.320249 4744 generic.go:334] "Generic (PLEG): container finished" podID="dab956a3-b6e6-45d0-82ac-ba8e9c8c6dc1" containerID="2b1ed65fae0af39455b6a97f406ad98e8dfea27b325ab77d32a7c3ca47ef82ed" exitCode=0 Mar 11 02:28:03 crc 
kubenswrapper[4744]: I0311 02:28:03.320368 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29553268-79s8p" event={"ID":"dab956a3-b6e6-45d0-82ac-ba8e9c8c6dc1","Type":"ContainerDied","Data":"2b1ed65fae0af39455b6a97f406ad98e8dfea27b325ab77d32a7c3ca47ef82ed"} Mar 11 02:28:04 crc kubenswrapper[4744]: I0311 02:28:04.733833 4744 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29553268-79s8p" Mar 11 02:28:04 crc kubenswrapper[4744]: I0311 02:28:04.824214 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-m5jbr\" (UniqueName: \"kubernetes.io/projected/dab956a3-b6e6-45d0-82ac-ba8e9c8c6dc1-kube-api-access-m5jbr\") pod \"dab956a3-b6e6-45d0-82ac-ba8e9c8c6dc1\" (UID: \"dab956a3-b6e6-45d0-82ac-ba8e9c8c6dc1\") " Mar 11 02:28:04 crc kubenswrapper[4744]: I0311 02:28:04.833769 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dab956a3-b6e6-45d0-82ac-ba8e9c8c6dc1-kube-api-access-m5jbr" (OuterVolumeSpecName: "kube-api-access-m5jbr") pod "dab956a3-b6e6-45d0-82ac-ba8e9c8c6dc1" (UID: "dab956a3-b6e6-45d0-82ac-ba8e9c8c6dc1"). InnerVolumeSpecName "kube-api-access-m5jbr". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 02:28:04 crc kubenswrapper[4744]: I0311 02:28:04.926736 4744 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-m5jbr\" (UniqueName: \"kubernetes.io/projected/dab956a3-b6e6-45d0-82ac-ba8e9c8c6dc1-kube-api-access-m5jbr\") on node \"crc\" DevicePath \"\"" Mar 11 02:28:05 crc kubenswrapper[4744]: I0311 02:28:05.356371 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29553268-79s8p" event={"ID":"dab956a3-b6e6-45d0-82ac-ba8e9c8c6dc1","Type":"ContainerDied","Data":"a962f3215109316c2ac40432ecb3fbe7a7dd910754f552b996d05712a1664a1a"} Mar 11 02:28:05 crc kubenswrapper[4744]: I0311 02:28:05.356836 4744 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a962f3215109316c2ac40432ecb3fbe7a7dd910754f552b996d05712a1664a1a" Mar 11 02:28:05 crc kubenswrapper[4744]: I0311 02:28:05.356627 4744 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29553268-79s8p" Mar 11 02:28:05 crc kubenswrapper[4744]: I0311 02:28:05.424223 4744 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29553262-z2v47"] Mar 11 02:28:05 crc kubenswrapper[4744]: I0311 02:28:05.433936 4744 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29553262-z2v47"] Mar 11 02:28:06 crc kubenswrapper[4744]: I0311 02:28:05.998124 4744 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f7b424dd-5928-4b87-ba28-83de9295aa31" path="/var/lib/kubelet/pods/f7b424dd-5928-4b87-ba28-83de9295aa31/volumes" Mar 11 02:28:12 crc kubenswrapper[4744]: I0311 02:28:12.409097 4744 patch_prober.go:28] interesting pod/machine-config-daemon-678nx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: 
connection refused" start-of-body= Mar 11 02:28:12 crc kubenswrapper[4744]: I0311 02:28:12.409727 4744 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-678nx" podUID="a15dc7ac-7c34-4135-b6eb-a85122800ce9" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 11 02:28:16 crc kubenswrapper[4744]: I0311 02:28:16.706453 4744 scope.go:117] "RemoveContainer" containerID="4208c0a9f8464658ee51ac3024f1ca010869bf0f60c3bdcacab6d78eb48ad219" Mar 11 02:28:41 crc kubenswrapper[4744]: I0311 02:28:41.072502 4744 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/root-account-create-update-mpb95"] Mar 11 02:28:41 crc kubenswrapper[4744]: I0311 02:28:41.086142 4744 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/root-account-create-update-mpb95"] Mar 11 02:28:41 crc kubenswrapper[4744]: I0311 02:28:41.995726 4744 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b5bbcff7-a221-4aeb-8859-da7a80f83ab7" path="/var/lib/kubelet/pods/b5bbcff7-a221-4aeb-8859-da7a80f83ab7/volumes" Mar 11 02:28:42 crc kubenswrapper[4744]: I0311 02:28:42.409752 4744 patch_prober.go:28] interesting pod/machine-config-daemon-678nx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 11 02:28:42 crc kubenswrapper[4744]: I0311 02:28:42.409837 4744 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-678nx" podUID="a15dc7ac-7c34-4135-b6eb-a85122800ce9" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 11 02:28:46 crc kubenswrapper[4744]: I0311 
02:28:46.299653 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-dp9mz"] Mar 11 02:28:46 crc kubenswrapper[4744]: E0311 02:28:46.300538 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dab956a3-b6e6-45d0-82ac-ba8e9c8c6dc1" containerName="oc" Mar 11 02:28:46 crc kubenswrapper[4744]: I0311 02:28:46.300560 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="dab956a3-b6e6-45d0-82ac-ba8e9c8c6dc1" containerName="oc" Mar 11 02:28:46 crc kubenswrapper[4744]: I0311 02:28:46.300843 4744 memory_manager.go:354] "RemoveStaleState removing state" podUID="dab956a3-b6e6-45d0-82ac-ba8e9c8c6dc1" containerName="oc" Mar 11 02:28:46 crc kubenswrapper[4744]: I0311 02:28:46.302919 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-dp9mz" Mar 11 02:28:46 crc kubenswrapper[4744]: I0311 02:28:46.309390 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-dp9mz"] Mar 11 02:28:46 crc kubenswrapper[4744]: I0311 02:28:46.428241 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1cf9a239-6beb-4d4d-813f-e6801193bba2-utilities\") pod \"certified-operators-dp9mz\" (UID: \"1cf9a239-6beb-4d4d-813f-e6801193bba2\") " pod="openshift-marketplace/certified-operators-dp9mz" Mar 11 02:28:46 crc kubenswrapper[4744]: I0311 02:28:46.428462 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8wml2\" (UniqueName: \"kubernetes.io/projected/1cf9a239-6beb-4d4d-813f-e6801193bba2-kube-api-access-8wml2\") pod \"certified-operators-dp9mz\" (UID: \"1cf9a239-6beb-4d4d-813f-e6801193bba2\") " pod="openshift-marketplace/certified-operators-dp9mz" Mar 11 02:28:46 crc kubenswrapper[4744]: I0311 02:28:46.428632 4744 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1cf9a239-6beb-4d4d-813f-e6801193bba2-catalog-content\") pod \"certified-operators-dp9mz\" (UID: \"1cf9a239-6beb-4d4d-813f-e6801193bba2\") " pod="openshift-marketplace/certified-operators-dp9mz" Mar 11 02:28:46 crc kubenswrapper[4744]: I0311 02:28:46.530348 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8wml2\" (UniqueName: \"kubernetes.io/projected/1cf9a239-6beb-4d4d-813f-e6801193bba2-kube-api-access-8wml2\") pod \"certified-operators-dp9mz\" (UID: \"1cf9a239-6beb-4d4d-813f-e6801193bba2\") " pod="openshift-marketplace/certified-operators-dp9mz" Mar 11 02:28:46 crc kubenswrapper[4744]: I0311 02:28:46.530450 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1cf9a239-6beb-4d4d-813f-e6801193bba2-catalog-content\") pod \"certified-operators-dp9mz\" (UID: \"1cf9a239-6beb-4d4d-813f-e6801193bba2\") " pod="openshift-marketplace/certified-operators-dp9mz" Mar 11 02:28:46 crc kubenswrapper[4744]: I0311 02:28:46.530581 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1cf9a239-6beb-4d4d-813f-e6801193bba2-utilities\") pod \"certified-operators-dp9mz\" (UID: \"1cf9a239-6beb-4d4d-813f-e6801193bba2\") " pod="openshift-marketplace/certified-operators-dp9mz" Mar 11 02:28:46 crc kubenswrapper[4744]: I0311 02:28:46.531215 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1cf9a239-6beb-4d4d-813f-e6801193bba2-catalog-content\") pod \"certified-operators-dp9mz\" (UID: \"1cf9a239-6beb-4d4d-813f-e6801193bba2\") " pod="openshift-marketplace/certified-operators-dp9mz" Mar 11 02:28:46 crc kubenswrapper[4744]: I0311 02:28:46.531362 4744 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1cf9a239-6beb-4d4d-813f-e6801193bba2-utilities\") pod \"certified-operators-dp9mz\" (UID: \"1cf9a239-6beb-4d4d-813f-e6801193bba2\") " pod="openshift-marketplace/certified-operators-dp9mz" Mar 11 02:28:46 crc kubenswrapper[4744]: I0311 02:28:46.569573 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8wml2\" (UniqueName: \"kubernetes.io/projected/1cf9a239-6beb-4d4d-813f-e6801193bba2-kube-api-access-8wml2\") pod \"certified-operators-dp9mz\" (UID: \"1cf9a239-6beb-4d4d-813f-e6801193bba2\") " pod="openshift-marketplace/certified-operators-dp9mz" Mar 11 02:28:46 crc kubenswrapper[4744]: I0311 02:28:46.631860 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-dp9mz" Mar 11 02:28:47 crc kubenswrapper[4744]: I0311 02:28:47.156107 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-dp9mz"] Mar 11 02:28:47 crc kubenswrapper[4744]: I0311 02:28:47.808282 4744 generic.go:334] "Generic (PLEG): container finished" podID="1cf9a239-6beb-4d4d-813f-e6801193bba2" containerID="3674d8e8006e1ae39933c41ab8292697637cd3cfcf78a4acf9d7e90416f2bb43" exitCode=0 Mar 11 02:28:47 crc kubenswrapper[4744]: I0311 02:28:47.808347 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-dp9mz" event={"ID":"1cf9a239-6beb-4d4d-813f-e6801193bba2","Type":"ContainerDied","Data":"3674d8e8006e1ae39933c41ab8292697637cd3cfcf78a4acf9d7e90416f2bb43"} Mar 11 02:28:47 crc kubenswrapper[4744]: I0311 02:28:47.808385 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-dp9mz" event={"ID":"1cf9a239-6beb-4d4d-813f-e6801193bba2","Type":"ContainerStarted","Data":"6b94c01b3f1df528e991089126a3a6644dbe055839d80ce49bcad14fa8016b54"} Mar 11 02:28:49 crc kubenswrapper[4744]: I0311 
02:28:49.824951 4744 generic.go:334] "Generic (PLEG): container finished" podID="1cf9a239-6beb-4d4d-813f-e6801193bba2" containerID="f459476d2ffda7a2503fd8e89df06f4693641cf3d5ea962a992f6a090b30a590" exitCode=0 Mar 11 02:28:49 crc kubenswrapper[4744]: I0311 02:28:49.825038 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-dp9mz" event={"ID":"1cf9a239-6beb-4d4d-813f-e6801193bba2","Type":"ContainerDied","Data":"f459476d2ffda7a2503fd8e89df06f4693641cf3d5ea962a992f6a090b30a590"} Mar 11 02:28:50 crc kubenswrapper[4744]: I0311 02:28:50.838546 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-dp9mz" event={"ID":"1cf9a239-6beb-4d4d-813f-e6801193bba2","Type":"ContainerStarted","Data":"eef56650595b4679f9aaeff952779a91bce1d9e4e4aa83d7ca689165de7b9194"} Mar 11 02:28:50 crc kubenswrapper[4744]: I0311 02:28:50.863456 4744 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-dp9mz" podStartSLOduration=2.454151867 podStartE2EDuration="4.863432785s" podCreationTimestamp="2026-03-11 02:28:46 +0000 UTC" firstStartedPulling="2026-03-11 02:28:47.810869585 +0000 UTC m=+5684.615087200" lastFinishedPulling="2026-03-11 02:28:50.220150513 +0000 UTC m=+5687.024368118" observedRunningTime="2026-03-11 02:28:50.855299234 +0000 UTC m=+5687.659516849" watchObservedRunningTime="2026-03-11 02:28:50.863432785 +0000 UTC m=+5687.667650400" Mar 11 02:28:56 crc kubenswrapper[4744]: I0311 02:28:56.632871 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-dp9mz" Mar 11 02:28:56 crc kubenswrapper[4744]: I0311 02:28:56.634736 4744 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-dp9mz" Mar 11 02:28:56 crc kubenswrapper[4744]: I0311 02:28:56.714050 4744 kubelet.go:2542] "SyncLoop (probe)" probe="startup" 
status="started" pod="openshift-marketplace/certified-operators-dp9mz" Mar 11 02:28:56 crc kubenswrapper[4744]: I0311 02:28:56.991894 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-dp9mz" Mar 11 02:28:57 crc kubenswrapper[4744]: I0311 02:28:57.056238 4744 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-dp9mz"] Mar 11 02:28:58 crc kubenswrapper[4744]: I0311 02:28:58.935313 4744 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-dp9mz" podUID="1cf9a239-6beb-4d4d-813f-e6801193bba2" containerName="registry-server" containerID="cri-o://eef56650595b4679f9aaeff952779a91bce1d9e4e4aa83d7ca689165de7b9194" gracePeriod=2 Mar 11 02:28:59 crc kubenswrapper[4744]: I0311 02:28:59.516930 4744 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-dp9mz" Mar 11 02:28:59 crc kubenswrapper[4744]: I0311 02:28:59.595209 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1cf9a239-6beb-4d4d-813f-e6801193bba2-utilities\") pod \"1cf9a239-6beb-4d4d-813f-e6801193bba2\" (UID: \"1cf9a239-6beb-4d4d-813f-e6801193bba2\") " Mar 11 02:28:59 crc kubenswrapper[4744]: I0311 02:28:59.595431 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8wml2\" (UniqueName: \"kubernetes.io/projected/1cf9a239-6beb-4d4d-813f-e6801193bba2-kube-api-access-8wml2\") pod \"1cf9a239-6beb-4d4d-813f-e6801193bba2\" (UID: \"1cf9a239-6beb-4d4d-813f-e6801193bba2\") " Mar 11 02:28:59 crc kubenswrapper[4744]: I0311 02:28:59.596493 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1cf9a239-6beb-4d4d-813f-e6801193bba2-utilities" (OuterVolumeSpecName: "utilities") pod 
"1cf9a239-6beb-4d4d-813f-e6801193bba2" (UID: "1cf9a239-6beb-4d4d-813f-e6801193bba2"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 11 02:28:59 crc kubenswrapper[4744]: I0311 02:28:59.597643 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1cf9a239-6beb-4d4d-813f-e6801193bba2-catalog-content\") pod \"1cf9a239-6beb-4d4d-813f-e6801193bba2\" (UID: \"1cf9a239-6beb-4d4d-813f-e6801193bba2\") " Mar 11 02:28:59 crc kubenswrapper[4744]: I0311 02:28:59.599048 4744 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1cf9a239-6beb-4d4d-813f-e6801193bba2-utilities\") on node \"crc\" DevicePath \"\"" Mar 11 02:28:59 crc kubenswrapper[4744]: I0311 02:28:59.611531 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1cf9a239-6beb-4d4d-813f-e6801193bba2-kube-api-access-8wml2" (OuterVolumeSpecName: "kube-api-access-8wml2") pod "1cf9a239-6beb-4d4d-813f-e6801193bba2" (UID: "1cf9a239-6beb-4d4d-813f-e6801193bba2"). InnerVolumeSpecName "kube-api-access-8wml2". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 02:28:59 crc kubenswrapper[4744]: I0311 02:28:59.700313 4744 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8wml2\" (UniqueName: \"kubernetes.io/projected/1cf9a239-6beb-4d4d-813f-e6801193bba2-kube-api-access-8wml2\") on node \"crc\" DevicePath \"\"" Mar 11 02:28:59 crc kubenswrapper[4744]: I0311 02:28:59.949308 4744 generic.go:334] "Generic (PLEG): container finished" podID="1cf9a239-6beb-4d4d-813f-e6801193bba2" containerID="eef56650595b4679f9aaeff952779a91bce1d9e4e4aa83d7ca689165de7b9194" exitCode=0 Mar 11 02:28:59 crc kubenswrapper[4744]: I0311 02:28:59.949421 4744 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-dp9mz" Mar 11 02:28:59 crc kubenswrapper[4744]: I0311 02:28:59.949466 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-dp9mz" event={"ID":"1cf9a239-6beb-4d4d-813f-e6801193bba2","Type":"ContainerDied","Data":"eef56650595b4679f9aaeff952779a91bce1d9e4e4aa83d7ca689165de7b9194"} Mar 11 02:28:59 crc kubenswrapper[4744]: I0311 02:28:59.949660 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-dp9mz" event={"ID":"1cf9a239-6beb-4d4d-813f-e6801193bba2","Type":"ContainerDied","Data":"6b94c01b3f1df528e991089126a3a6644dbe055839d80ce49bcad14fa8016b54"} Mar 11 02:28:59 crc kubenswrapper[4744]: I0311 02:28:59.949697 4744 scope.go:117] "RemoveContainer" containerID="eef56650595b4679f9aaeff952779a91bce1d9e4e4aa83d7ca689165de7b9194" Mar 11 02:28:59 crc kubenswrapper[4744]: I0311 02:28:59.993381 4744 scope.go:117] "RemoveContainer" containerID="f459476d2ffda7a2503fd8e89df06f4693641cf3d5ea962a992f6a090b30a590" Mar 11 02:29:00 crc kubenswrapper[4744]: I0311 02:29:00.023842 4744 scope.go:117] "RemoveContainer" containerID="3674d8e8006e1ae39933c41ab8292697637cd3cfcf78a4acf9d7e90416f2bb43" Mar 11 02:29:00 crc kubenswrapper[4744]: I0311 02:29:00.040645 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1cf9a239-6beb-4d4d-813f-e6801193bba2-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "1cf9a239-6beb-4d4d-813f-e6801193bba2" (UID: "1cf9a239-6beb-4d4d-813f-e6801193bba2"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 11 02:29:00 crc kubenswrapper[4744]: I0311 02:29:00.067626 4744 scope.go:117] "RemoveContainer" containerID="eef56650595b4679f9aaeff952779a91bce1d9e4e4aa83d7ca689165de7b9194" Mar 11 02:29:00 crc kubenswrapper[4744]: E0311 02:29:00.068432 4744 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"eef56650595b4679f9aaeff952779a91bce1d9e4e4aa83d7ca689165de7b9194\": container with ID starting with eef56650595b4679f9aaeff952779a91bce1d9e4e4aa83d7ca689165de7b9194 not found: ID does not exist" containerID="eef56650595b4679f9aaeff952779a91bce1d9e4e4aa83d7ca689165de7b9194" Mar 11 02:29:00 crc kubenswrapper[4744]: I0311 02:29:00.068478 4744 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"eef56650595b4679f9aaeff952779a91bce1d9e4e4aa83d7ca689165de7b9194"} err="failed to get container status \"eef56650595b4679f9aaeff952779a91bce1d9e4e4aa83d7ca689165de7b9194\": rpc error: code = NotFound desc = could not find container \"eef56650595b4679f9aaeff952779a91bce1d9e4e4aa83d7ca689165de7b9194\": container with ID starting with eef56650595b4679f9aaeff952779a91bce1d9e4e4aa83d7ca689165de7b9194 not found: ID does not exist" Mar 11 02:29:00 crc kubenswrapper[4744]: I0311 02:29:00.068507 4744 scope.go:117] "RemoveContainer" containerID="f459476d2ffda7a2503fd8e89df06f4693641cf3d5ea962a992f6a090b30a590" Mar 11 02:29:00 crc kubenswrapper[4744]: E0311 02:29:00.069404 4744 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f459476d2ffda7a2503fd8e89df06f4693641cf3d5ea962a992f6a090b30a590\": container with ID starting with f459476d2ffda7a2503fd8e89df06f4693641cf3d5ea962a992f6a090b30a590 not found: ID does not exist" containerID="f459476d2ffda7a2503fd8e89df06f4693641cf3d5ea962a992f6a090b30a590" Mar 11 02:29:00 crc kubenswrapper[4744]: I0311 02:29:00.069558 
4744 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f459476d2ffda7a2503fd8e89df06f4693641cf3d5ea962a992f6a090b30a590"} err="failed to get container status \"f459476d2ffda7a2503fd8e89df06f4693641cf3d5ea962a992f6a090b30a590\": rpc error: code = NotFound desc = could not find container \"f459476d2ffda7a2503fd8e89df06f4693641cf3d5ea962a992f6a090b30a590\": container with ID starting with f459476d2ffda7a2503fd8e89df06f4693641cf3d5ea962a992f6a090b30a590 not found: ID does not exist" Mar 11 02:29:00 crc kubenswrapper[4744]: I0311 02:29:00.069669 4744 scope.go:117] "RemoveContainer" containerID="3674d8e8006e1ae39933c41ab8292697637cd3cfcf78a4acf9d7e90416f2bb43" Mar 11 02:29:00 crc kubenswrapper[4744]: E0311 02:29:00.070336 4744 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3674d8e8006e1ae39933c41ab8292697637cd3cfcf78a4acf9d7e90416f2bb43\": container with ID starting with 3674d8e8006e1ae39933c41ab8292697637cd3cfcf78a4acf9d7e90416f2bb43 not found: ID does not exist" containerID="3674d8e8006e1ae39933c41ab8292697637cd3cfcf78a4acf9d7e90416f2bb43" Mar 11 02:29:00 crc kubenswrapper[4744]: I0311 02:29:00.070463 4744 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3674d8e8006e1ae39933c41ab8292697637cd3cfcf78a4acf9d7e90416f2bb43"} err="failed to get container status \"3674d8e8006e1ae39933c41ab8292697637cd3cfcf78a4acf9d7e90416f2bb43\": rpc error: code = NotFound desc = could not find container \"3674d8e8006e1ae39933c41ab8292697637cd3cfcf78a4acf9d7e90416f2bb43\": container with ID starting with 3674d8e8006e1ae39933c41ab8292697637cd3cfcf78a4acf9d7e90416f2bb43 not found: ID does not exist" Mar 11 02:29:00 crc kubenswrapper[4744]: I0311 02:29:00.109991 4744 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/1cf9a239-6beb-4d4d-813f-e6801193bba2-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 11 02:29:00 crc kubenswrapper[4744]: I0311 02:29:00.292704 4744 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-dp9mz"] Mar 11 02:29:00 crc kubenswrapper[4744]: I0311 02:29:00.318117 4744 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-dp9mz"] Mar 11 02:29:01 crc kubenswrapper[4744]: I0311 02:29:01.992075 4744 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1cf9a239-6beb-4d4d-813f-e6801193bba2" path="/var/lib/kubelet/pods/1cf9a239-6beb-4d4d-813f-e6801193bba2/volumes" Mar 11 02:29:12 crc kubenswrapper[4744]: I0311 02:29:12.409636 4744 patch_prober.go:28] interesting pod/machine-config-daemon-678nx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 11 02:29:12 crc kubenswrapper[4744]: I0311 02:29:12.410445 4744 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-678nx" podUID="a15dc7ac-7c34-4135-b6eb-a85122800ce9" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 11 02:29:12 crc kubenswrapper[4744]: I0311 02:29:12.410508 4744 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-678nx" Mar 11 02:29:12 crc kubenswrapper[4744]: I0311 02:29:12.411292 4744 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"0c6d2088731104df66ba686648eb351061ccd27cb0c0c1c48cf7ba6629e29690"} 
pod="openshift-machine-config-operator/machine-config-daemon-678nx" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 11 02:29:12 crc kubenswrapper[4744]: I0311 02:29:12.411386 4744 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-678nx" podUID="a15dc7ac-7c34-4135-b6eb-a85122800ce9" containerName="machine-config-daemon" containerID="cri-o://0c6d2088731104df66ba686648eb351061ccd27cb0c0c1c48cf7ba6629e29690" gracePeriod=600 Mar 11 02:29:12 crc kubenswrapper[4744]: E0311 02:29:12.545461 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-678nx_openshift-machine-config-operator(a15dc7ac-7c34-4135-b6eb-a85122800ce9)\"" pod="openshift-machine-config-operator/machine-config-daemon-678nx" podUID="a15dc7ac-7c34-4135-b6eb-a85122800ce9" Mar 11 02:29:13 crc kubenswrapper[4744]: I0311 02:29:13.130346 4744 generic.go:334] "Generic (PLEG): container finished" podID="a15dc7ac-7c34-4135-b6eb-a85122800ce9" containerID="0c6d2088731104df66ba686648eb351061ccd27cb0c0c1c48cf7ba6629e29690" exitCode=0 Mar 11 02:29:13 crc kubenswrapper[4744]: I0311 02:29:13.130415 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-678nx" event={"ID":"a15dc7ac-7c34-4135-b6eb-a85122800ce9","Type":"ContainerDied","Data":"0c6d2088731104df66ba686648eb351061ccd27cb0c0c1c48cf7ba6629e29690"} Mar 11 02:29:13 crc kubenswrapper[4744]: I0311 02:29:13.130805 4744 scope.go:117] "RemoveContainer" containerID="8aecf29c05b573f43c0dc7af38620d95b6c4462b39682ce2f5e33235c90e00ed" Mar 11 02:29:13 crc kubenswrapper[4744]: I0311 02:29:13.131754 4744 scope.go:117] "RemoveContainer" containerID="0c6d2088731104df66ba686648eb351061ccd27cb0c0c1c48cf7ba6629e29690" Mar 
11 02:29:13 crc kubenswrapper[4744]: E0311 02:29:13.132165 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-678nx_openshift-machine-config-operator(a15dc7ac-7c34-4135-b6eb-a85122800ce9)\"" pod="openshift-machine-config-operator/machine-config-daemon-678nx" podUID="a15dc7ac-7c34-4135-b6eb-a85122800ce9" Mar 11 02:29:16 crc kubenswrapper[4744]: I0311 02:29:16.802961 4744 scope.go:117] "RemoveContainer" containerID="5e1d2b9bc037730517f6b7f6b2300207d7a7f44cf7bd7d17f878b615b65139e6" Mar 11 02:29:25 crc kubenswrapper[4744]: I0311 02:29:25.975749 4744 scope.go:117] "RemoveContainer" containerID="0c6d2088731104df66ba686648eb351061ccd27cb0c0c1c48cf7ba6629e29690" Mar 11 02:29:25 crc kubenswrapper[4744]: E0311 02:29:25.976774 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-678nx_openshift-machine-config-operator(a15dc7ac-7c34-4135-b6eb-a85122800ce9)\"" pod="openshift-machine-config-operator/machine-config-daemon-678nx" podUID="a15dc7ac-7c34-4135-b6eb-a85122800ce9" Mar 11 02:29:37 crc kubenswrapper[4744]: I0311 02:29:37.975090 4744 scope.go:117] "RemoveContainer" containerID="0c6d2088731104df66ba686648eb351061ccd27cb0c0c1c48cf7ba6629e29690" Mar 11 02:29:37 crc kubenswrapper[4744]: E0311 02:29:37.975988 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-678nx_openshift-machine-config-operator(a15dc7ac-7c34-4135-b6eb-a85122800ce9)\"" pod="openshift-machine-config-operator/machine-config-daemon-678nx" 
podUID="a15dc7ac-7c34-4135-b6eb-a85122800ce9" Mar 11 02:29:52 crc kubenswrapper[4744]: I0311 02:29:52.975261 4744 scope.go:117] "RemoveContainer" containerID="0c6d2088731104df66ba686648eb351061ccd27cb0c0c1c48cf7ba6629e29690" Mar 11 02:29:52 crc kubenswrapper[4744]: E0311 02:29:52.977417 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-678nx_openshift-machine-config-operator(a15dc7ac-7c34-4135-b6eb-a85122800ce9)\"" pod="openshift-machine-config-operator/machine-config-daemon-678nx" podUID="a15dc7ac-7c34-4135-b6eb-a85122800ce9" Mar 11 02:30:00 crc kubenswrapper[4744]: I0311 02:30:00.167553 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29553270-sg6h2"] Mar 11 02:30:00 crc kubenswrapper[4744]: E0311 02:30:00.168816 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1cf9a239-6beb-4d4d-813f-e6801193bba2" containerName="extract-content" Mar 11 02:30:00 crc kubenswrapper[4744]: I0311 02:30:00.168882 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="1cf9a239-6beb-4d4d-813f-e6801193bba2" containerName="extract-content" Mar 11 02:30:00 crc kubenswrapper[4744]: E0311 02:30:00.168900 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1cf9a239-6beb-4d4d-813f-e6801193bba2" containerName="registry-server" Mar 11 02:30:00 crc kubenswrapper[4744]: I0311 02:30:00.168914 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="1cf9a239-6beb-4d4d-813f-e6801193bba2" containerName="registry-server" Mar 11 02:30:00 crc kubenswrapper[4744]: E0311 02:30:00.168951 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1cf9a239-6beb-4d4d-813f-e6801193bba2" containerName="extract-utilities" Mar 11 02:30:00 crc kubenswrapper[4744]: I0311 02:30:00.168968 4744 state_mem.go:107] 
"Deleted CPUSet assignment" podUID="1cf9a239-6beb-4d4d-813f-e6801193bba2" containerName="extract-utilities" Mar 11 02:30:00 crc kubenswrapper[4744]: I0311 02:30:00.169294 4744 memory_manager.go:354] "RemoveStaleState removing state" podUID="1cf9a239-6beb-4d4d-813f-e6801193bba2" containerName="registry-server" Mar 11 02:30:00 crc kubenswrapper[4744]: I0311 02:30:00.170209 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29553270-sg6h2" Mar 11 02:30:00 crc kubenswrapper[4744]: I0311 02:30:00.172574 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Mar 11 02:30:00 crc kubenswrapper[4744]: I0311 02:30:00.173273 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Mar 11 02:30:00 crc kubenswrapper[4744]: I0311 02:30:00.185134 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29553270-sg6h2"] Mar 11 02:30:00 crc kubenswrapper[4744]: I0311 02:30:00.258542 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/1204a779-7d0f-48dc-bf95-b3cd3fca74e6-config-volume\") pod \"collect-profiles-29553270-sg6h2\" (UID: \"1204a779-7d0f-48dc-bf95-b3cd3fca74e6\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29553270-sg6h2" Mar 11 02:30:00 crc kubenswrapper[4744]: I0311 02:30:00.258879 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vtp55\" (UniqueName: \"kubernetes.io/projected/1204a779-7d0f-48dc-bf95-b3cd3fca74e6-kube-api-access-vtp55\") pod \"collect-profiles-29553270-sg6h2\" (UID: \"1204a779-7d0f-48dc-bf95-b3cd3fca74e6\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29553270-sg6h2" Mar 11 02:30:00 crc kubenswrapper[4744]: I0311 02:30:00.258963 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/1204a779-7d0f-48dc-bf95-b3cd3fca74e6-secret-volume\") pod \"collect-profiles-29553270-sg6h2\" (UID: \"1204a779-7d0f-48dc-bf95-b3cd3fca74e6\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29553270-sg6h2" Mar 11 02:30:00 crc kubenswrapper[4744]: I0311 02:30:00.264187 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29553270-xv2sr"] Mar 11 02:30:00 crc kubenswrapper[4744]: I0311 02:30:00.266034 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29553270-xv2sr" Mar 11 02:30:00 crc kubenswrapper[4744]: I0311 02:30:00.269392 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-rs77s" Mar 11 02:30:00 crc kubenswrapper[4744]: I0311 02:30:00.269832 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 11 02:30:00 crc kubenswrapper[4744]: I0311 02:30:00.269952 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 11 02:30:00 crc kubenswrapper[4744]: I0311 02:30:00.291006 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29553270-xv2sr"] Mar 11 02:30:00 crc kubenswrapper[4744]: I0311 02:30:00.360128 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/1204a779-7d0f-48dc-bf95-b3cd3fca74e6-secret-volume\") pod \"collect-profiles-29553270-sg6h2\" (UID: \"1204a779-7d0f-48dc-bf95-b3cd3fca74e6\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29553270-sg6h2" 
Mar 11 02:30:00 crc kubenswrapper[4744]: I0311 02:30:00.360251 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/1204a779-7d0f-48dc-bf95-b3cd3fca74e6-config-volume\") pod \"collect-profiles-29553270-sg6h2\" (UID: \"1204a779-7d0f-48dc-bf95-b3cd3fca74e6\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29553270-sg6h2" Mar 11 02:30:00 crc kubenswrapper[4744]: I0311 02:30:00.360302 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vtp55\" (UniqueName: \"kubernetes.io/projected/1204a779-7d0f-48dc-bf95-b3cd3fca74e6-kube-api-access-vtp55\") pod \"collect-profiles-29553270-sg6h2\" (UID: \"1204a779-7d0f-48dc-bf95-b3cd3fca74e6\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29553270-sg6h2" Mar 11 02:30:00 crc kubenswrapper[4744]: I0311 02:30:00.360344 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ctnql\" (UniqueName: \"kubernetes.io/projected/0055e672-0093-4766-ad33-335b68aec14c-kube-api-access-ctnql\") pod \"auto-csr-approver-29553270-xv2sr\" (UID: \"0055e672-0093-4766-ad33-335b68aec14c\") " pod="openshift-infra/auto-csr-approver-29553270-xv2sr" Mar 11 02:30:00 crc kubenswrapper[4744]: I0311 02:30:00.362097 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/1204a779-7d0f-48dc-bf95-b3cd3fca74e6-config-volume\") pod \"collect-profiles-29553270-sg6h2\" (UID: \"1204a779-7d0f-48dc-bf95-b3cd3fca74e6\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29553270-sg6h2" Mar 11 02:30:00 crc kubenswrapper[4744]: I0311 02:30:00.378202 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/1204a779-7d0f-48dc-bf95-b3cd3fca74e6-secret-volume\") pod \"collect-profiles-29553270-sg6h2\" (UID: 
\"1204a779-7d0f-48dc-bf95-b3cd3fca74e6\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29553270-sg6h2" Mar 11 02:30:00 crc kubenswrapper[4744]: I0311 02:30:00.385753 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vtp55\" (UniqueName: \"kubernetes.io/projected/1204a779-7d0f-48dc-bf95-b3cd3fca74e6-kube-api-access-vtp55\") pod \"collect-profiles-29553270-sg6h2\" (UID: \"1204a779-7d0f-48dc-bf95-b3cd3fca74e6\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29553270-sg6h2" Mar 11 02:30:00 crc kubenswrapper[4744]: I0311 02:30:00.462448 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ctnql\" (UniqueName: \"kubernetes.io/projected/0055e672-0093-4766-ad33-335b68aec14c-kube-api-access-ctnql\") pod \"auto-csr-approver-29553270-xv2sr\" (UID: \"0055e672-0093-4766-ad33-335b68aec14c\") " pod="openshift-infra/auto-csr-approver-29553270-xv2sr" Mar 11 02:30:00 crc kubenswrapper[4744]: I0311 02:30:00.483597 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ctnql\" (UniqueName: \"kubernetes.io/projected/0055e672-0093-4766-ad33-335b68aec14c-kube-api-access-ctnql\") pod \"auto-csr-approver-29553270-xv2sr\" (UID: \"0055e672-0093-4766-ad33-335b68aec14c\") " pod="openshift-infra/auto-csr-approver-29553270-xv2sr" Mar 11 02:30:00 crc kubenswrapper[4744]: I0311 02:30:00.506805 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29553270-sg6h2" Mar 11 02:30:00 crc kubenswrapper[4744]: I0311 02:30:00.589760 4744 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29553270-xv2sr" Mar 11 02:30:00 crc kubenswrapper[4744]: I0311 02:30:00.962228 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29553270-sg6h2"] Mar 11 02:30:01 crc kubenswrapper[4744]: I0311 02:30:01.049890 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29553270-xv2sr"] Mar 11 02:30:01 crc kubenswrapper[4744]: W0311 02:30:01.055331 4744 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0055e672_0093_4766_ad33_335b68aec14c.slice/crio-28f9ec45eda0f7a4b7e6fd79bc8efa34a0137a9ecb536b3c839105309df71f88 WatchSource:0}: Error finding container 28f9ec45eda0f7a4b7e6fd79bc8efa34a0137a9ecb536b3c839105309df71f88: Status 404 returned error can't find the container with id 28f9ec45eda0f7a4b7e6fd79bc8efa34a0137a9ecb536b3c839105309df71f88 Mar 11 02:30:01 crc kubenswrapper[4744]: I0311 02:30:01.642956 4744 generic.go:334] "Generic (PLEG): container finished" podID="1204a779-7d0f-48dc-bf95-b3cd3fca74e6" containerID="fa51156438e6718f6fdb0e66f6e59a7337e3046cc7a5a85341d280655aff6563" exitCode=0 Mar 11 02:30:01 crc kubenswrapper[4744]: I0311 02:30:01.643038 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29553270-sg6h2" event={"ID":"1204a779-7d0f-48dc-bf95-b3cd3fca74e6","Type":"ContainerDied","Data":"fa51156438e6718f6fdb0e66f6e59a7337e3046cc7a5a85341d280655aff6563"} Mar 11 02:30:01 crc kubenswrapper[4744]: I0311 02:30:01.643379 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29553270-sg6h2" event={"ID":"1204a779-7d0f-48dc-bf95-b3cd3fca74e6","Type":"ContainerStarted","Data":"45cdd85af626d02f477dc58e9b3f9fa824e1555520eca1b2908e08d6ae1e60cc"} Mar 11 02:30:01 crc kubenswrapper[4744]: I0311 02:30:01.645183 4744 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29553270-xv2sr" event={"ID":"0055e672-0093-4766-ad33-335b68aec14c","Type":"ContainerStarted","Data":"28f9ec45eda0f7a4b7e6fd79bc8efa34a0137a9ecb536b3c839105309df71f88"} Mar 11 02:30:03 crc kubenswrapper[4744]: I0311 02:30:03.094097 4744 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29553270-sg6h2" Mar 11 02:30:03 crc kubenswrapper[4744]: I0311 02:30:03.220985 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/1204a779-7d0f-48dc-bf95-b3cd3fca74e6-config-volume\") pod \"1204a779-7d0f-48dc-bf95-b3cd3fca74e6\" (UID: \"1204a779-7d0f-48dc-bf95-b3cd3fca74e6\") " Mar 11 02:30:03 crc kubenswrapper[4744]: I0311 02:30:03.221111 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vtp55\" (UniqueName: \"kubernetes.io/projected/1204a779-7d0f-48dc-bf95-b3cd3fca74e6-kube-api-access-vtp55\") pod \"1204a779-7d0f-48dc-bf95-b3cd3fca74e6\" (UID: \"1204a779-7d0f-48dc-bf95-b3cd3fca74e6\") " Mar 11 02:30:03 crc kubenswrapper[4744]: I0311 02:30:03.221217 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/1204a779-7d0f-48dc-bf95-b3cd3fca74e6-secret-volume\") pod \"1204a779-7d0f-48dc-bf95-b3cd3fca74e6\" (UID: \"1204a779-7d0f-48dc-bf95-b3cd3fca74e6\") " Mar 11 02:30:03 crc kubenswrapper[4744]: I0311 02:30:03.223294 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1204a779-7d0f-48dc-bf95-b3cd3fca74e6-config-volume" (OuterVolumeSpecName: "config-volume") pod "1204a779-7d0f-48dc-bf95-b3cd3fca74e6" (UID: "1204a779-7d0f-48dc-bf95-b3cd3fca74e6"). InnerVolumeSpecName "config-volume". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 11 02:30:03 crc kubenswrapper[4744]: I0311 02:30:03.229465 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1204a779-7d0f-48dc-bf95-b3cd3fca74e6-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "1204a779-7d0f-48dc-bf95-b3cd3fca74e6" (UID: "1204a779-7d0f-48dc-bf95-b3cd3fca74e6"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 02:30:03 crc kubenswrapper[4744]: I0311 02:30:03.240812 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1204a779-7d0f-48dc-bf95-b3cd3fca74e6-kube-api-access-vtp55" (OuterVolumeSpecName: "kube-api-access-vtp55") pod "1204a779-7d0f-48dc-bf95-b3cd3fca74e6" (UID: "1204a779-7d0f-48dc-bf95-b3cd3fca74e6"). InnerVolumeSpecName "kube-api-access-vtp55". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 02:30:03 crc kubenswrapper[4744]: I0311 02:30:03.323955 4744 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/1204a779-7d0f-48dc-bf95-b3cd3fca74e6-secret-volume\") on node \"crc\" DevicePath \"\"" Mar 11 02:30:03 crc kubenswrapper[4744]: I0311 02:30:03.324004 4744 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/1204a779-7d0f-48dc-bf95-b3cd3fca74e6-config-volume\") on node \"crc\" DevicePath \"\"" Mar 11 02:30:03 crc kubenswrapper[4744]: I0311 02:30:03.324025 4744 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vtp55\" (UniqueName: \"kubernetes.io/projected/1204a779-7d0f-48dc-bf95-b3cd3fca74e6-kube-api-access-vtp55\") on node \"crc\" DevicePath \"\"" Mar 11 02:30:03 crc kubenswrapper[4744]: I0311 02:30:03.668668 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29553270-sg6h2" 
event={"ID":"1204a779-7d0f-48dc-bf95-b3cd3fca74e6","Type":"ContainerDied","Data":"45cdd85af626d02f477dc58e9b3f9fa824e1555520eca1b2908e08d6ae1e60cc"} Mar 11 02:30:03 crc kubenswrapper[4744]: I0311 02:30:03.669056 4744 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="45cdd85af626d02f477dc58e9b3f9fa824e1555520eca1b2908e08d6ae1e60cc" Mar 11 02:30:03 crc kubenswrapper[4744]: I0311 02:30:03.668698 4744 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29553270-sg6h2" Mar 11 02:30:03 crc kubenswrapper[4744]: I0311 02:30:03.671164 4744 generic.go:334] "Generic (PLEG): container finished" podID="0055e672-0093-4766-ad33-335b68aec14c" containerID="a7289668482f737751be18b35d1ae34a24da27d977aa5399f17716f7c760cc89" exitCode=0 Mar 11 02:30:03 crc kubenswrapper[4744]: I0311 02:30:03.671228 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29553270-xv2sr" event={"ID":"0055e672-0093-4766-ad33-335b68aec14c","Type":"ContainerDied","Data":"a7289668482f737751be18b35d1ae34a24da27d977aa5399f17716f7c760cc89"} Mar 11 02:30:04 crc kubenswrapper[4744]: I0311 02:30:04.176643 4744 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29553225-5xsk2"] Mar 11 02:30:04 crc kubenswrapper[4744]: I0311 02:30:04.182870 4744 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29553225-5xsk2"] Mar 11 02:30:05 crc kubenswrapper[4744]: I0311 02:30:05.065631 4744 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29553270-xv2sr" Mar 11 02:30:05 crc kubenswrapper[4744]: I0311 02:30:05.159763 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ctnql\" (UniqueName: \"kubernetes.io/projected/0055e672-0093-4766-ad33-335b68aec14c-kube-api-access-ctnql\") pod \"0055e672-0093-4766-ad33-335b68aec14c\" (UID: \"0055e672-0093-4766-ad33-335b68aec14c\") " Mar 11 02:30:05 crc kubenswrapper[4744]: I0311 02:30:05.171339 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0055e672-0093-4766-ad33-335b68aec14c-kube-api-access-ctnql" (OuterVolumeSpecName: "kube-api-access-ctnql") pod "0055e672-0093-4766-ad33-335b68aec14c" (UID: "0055e672-0093-4766-ad33-335b68aec14c"). InnerVolumeSpecName "kube-api-access-ctnql". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 02:30:05 crc kubenswrapper[4744]: I0311 02:30:05.261726 4744 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ctnql\" (UniqueName: \"kubernetes.io/projected/0055e672-0093-4766-ad33-335b68aec14c-kube-api-access-ctnql\") on node \"crc\" DevicePath \"\"" Mar 11 02:30:05 crc kubenswrapper[4744]: I0311 02:30:05.689290 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29553270-xv2sr" event={"ID":"0055e672-0093-4766-ad33-335b68aec14c","Type":"ContainerDied","Data":"28f9ec45eda0f7a4b7e6fd79bc8efa34a0137a9ecb536b3c839105309df71f88"} Mar 11 02:30:05 crc kubenswrapper[4744]: I0311 02:30:05.689326 4744 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="28f9ec45eda0f7a4b7e6fd79bc8efa34a0137a9ecb536b3c839105309df71f88" Mar 11 02:30:05 crc kubenswrapper[4744]: I0311 02:30:05.689367 4744 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29553270-xv2sr" Mar 11 02:30:05 crc kubenswrapper[4744]: I0311 02:30:05.975160 4744 scope.go:117] "RemoveContainer" containerID="0c6d2088731104df66ba686648eb351061ccd27cb0c0c1c48cf7ba6629e29690" Mar 11 02:30:05 crc kubenswrapper[4744]: E0311 02:30:05.975872 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-678nx_openshift-machine-config-operator(a15dc7ac-7c34-4135-b6eb-a85122800ce9)\"" pod="openshift-machine-config-operator/machine-config-daemon-678nx" podUID="a15dc7ac-7c34-4135-b6eb-a85122800ce9" Mar 11 02:30:05 crc kubenswrapper[4744]: I0311 02:30:05.995296 4744 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="10a8af84-d779-4bd2-860d-adc73f3b225f" path="/var/lib/kubelet/pods/10a8af84-d779-4bd2-860d-adc73f3b225f/volumes" Mar 11 02:30:06 crc kubenswrapper[4744]: I0311 02:30:06.145567 4744 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29553264-55vm5"] Mar 11 02:30:06 crc kubenswrapper[4744]: I0311 02:30:06.151697 4744 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29553264-55vm5"] Mar 11 02:30:07 crc kubenswrapper[4744]: I0311 02:30:07.992187 4744 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7e1c5f90-fd1b-4043-8052-f79cc0531a9b" path="/var/lib/kubelet/pods/7e1c5f90-fd1b-4043-8052-f79cc0531a9b/volumes" Mar 11 02:30:16 crc kubenswrapper[4744]: I0311 02:30:16.887327 4744 scope.go:117] "RemoveContainer" containerID="3897da90f7fff9a2b982be7ee015b74df608377bc2da88fc66e46b5c84cb5f26" Mar 11 02:30:16 crc kubenswrapper[4744]: I0311 02:30:16.955282 4744 scope.go:117] "RemoveContainer" containerID="1bba16946fdfb8a679db4d8e1f701b175df943377be9bd0206938c6bc0d528e1" Mar 11 02:30:19 crc kubenswrapper[4744]: 
I0311 02:30:19.975083 4744 scope.go:117] "RemoveContainer" containerID="0c6d2088731104df66ba686648eb351061ccd27cb0c0c1c48cf7ba6629e29690" Mar 11 02:30:19 crc kubenswrapper[4744]: E0311 02:30:19.975532 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-678nx_openshift-machine-config-operator(a15dc7ac-7c34-4135-b6eb-a85122800ce9)\"" pod="openshift-machine-config-operator/machine-config-daemon-678nx" podUID="a15dc7ac-7c34-4135-b6eb-a85122800ce9" Mar 11 02:30:31 crc kubenswrapper[4744]: I0311 02:30:31.975279 4744 scope.go:117] "RemoveContainer" containerID="0c6d2088731104df66ba686648eb351061ccd27cb0c0c1c48cf7ba6629e29690" Mar 11 02:30:31 crc kubenswrapper[4744]: E0311 02:30:31.976364 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-678nx_openshift-machine-config-operator(a15dc7ac-7c34-4135-b6eb-a85122800ce9)\"" pod="openshift-machine-config-operator/machine-config-daemon-678nx" podUID="a15dc7ac-7c34-4135-b6eb-a85122800ce9" Mar 11 02:30:42 crc kubenswrapper[4744]: I0311 02:30:42.975144 4744 scope.go:117] "RemoveContainer" containerID="0c6d2088731104df66ba686648eb351061ccd27cb0c0c1c48cf7ba6629e29690" Mar 11 02:30:42 crc kubenswrapper[4744]: E0311 02:30:42.978454 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-678nx_openshift-machine-config-operator(a15dc7ac-7c34-4135-b6eb-a85122800ce9)\"" pod="openshift-machine-config-operator/machine-config-daemon-678nx" podUID="a15dc7ac-7c34-4135-b6eb-a85122800ce9" Mar 11 02:30:53 crc 
kubenswrapper[4744]: I0311 02:30:53.985929 4744 scope.go:117] "RemoveContainer" containerID="0c6d2088731104df66ba686648eb351061ccd27cb0c0c1c48cf7ba6629e29690" Mar 11 02:30:53 crc kubenswrapper[4744]: E0311 02:30:53.988590 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-678nx_openshift-machine-config-operator(a15dc7ac-7c34-4135-b6eb-a85122800ce9)\"" pod="openshift-machine-config-operator/machine-config-daemon-678nx" podUID="a15dc7ac-7c34-4135-b6eb-a85122800ce9" Mar 11 02:31:06 crc kubenswrapper[4744]: I0311 02:31:06.975150 4744 scope.go:117] "RemoveContainer" containerID="0c6d2088731104df66ba686648eb351061ccd27cb0c0c1c48cf7ba6629e29690" Mar 11 02:31:06 crc kubenswrapper[4744]: E0311 02:31:06.977787 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-678nx_openshift-machine-config-operator(a15dc7ac-7c34-4135-b6eb-a85122800ce9)\"" pod="openshift-machine-config-operator/machine-config-daemon-678nx" podUID="a15dc7ac-7c34-4135-b6eb-a85122800ce9" Mar 11 02:31:17 crc kubenswrapper[4744]: I0311 02:31:17.081383 4744 scope.go:117] "RemoveContainer" containerID="e00093d931af12eea3fd89b3a8fd324d5304ee7099841b353d9ed35509df9a39" Mar 11 02:31:17 crc kubenswrapper[4744]: I0311 02:31:17.976240 4744 scope.go:117] "RemoveContainer" containerID="0c6d2088731104df66ba686648eb351061ccd27cb0c0c1c48cf7ba6629e29690" Mar 11 02:31:17 crc kubenswrapper[4744]: E0311 02:31:17.976418 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-678nx_openshift-machine-config-operator(a15dc7ac-7c34-4135-b6eb-a85122800ce9)\"" pod="openshift-machine-config-operator/machine-config-daemon-678nx" podUID="a15dc7ac-7c34-4135-b6eb-a85122800ce9" Mar 11 02:31:29 crc kubenswrapper[4744]: I0311 02:31:29.975401 4744 scope.go:117] "RemoveContainer" containerID="0c6d2088731104df66ba686648eb351061ccd27cb0c0c1c48cf7ba6629e29690" Mar 11 02:31:29 crc kubenswrapper[4744]: E0311 02:31:29.976592 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-678nx_openshift-machine-config-operator(a15dc7ac-7c34-4135-b6eb-a85122800ce9)\"" pod="openshift-machine-config-operator/machine-config-daemon-678nx" podUID="a15dc7ac-7c34-4135-b6eb-a85122800ce9" Mar 11 02:31:43 crc kubenswrapper[4744]: I0311 02:31:43.980437 4744 scope.go:117] "RemoveContainer" containerID="0c6d2088731104df66ba686648eb351061ccd27cb0c0c1c48cf7ba6629e29690" Mar 11 02:31:43 crc kubenswrapper[4744]: E0311 02:31:43.981188 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-678nx_openshift-machine-config-operator(a15dc7ac-7c34-4135-b6eb-a85122800ce9)\"" pod="openshift-machine-config-operator/machine-config-daemon-678nx" podUID="a15dc7ac-7c34-4135-b6eb-a85122800ce9" Mar 11 02:31:56 crc kubenswrapper[4744]: I0311 02:31:56.976678 4744 scope.go:117] "RemoveContainer" containerID="0c6d2088731104df66ba686648eb351061ccd27cb0c0c1c48cf7ba6629e29690" Mar 11 02:31:56 crc kubenswrapper[4744]: E0311 02:31:56.977824 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed 
container=machine-config-daemon pod=machine-config-daemon-678nx_openshift-machine-config-operator(a15dc7ac-7c34-4135-b6eb-a85122800ce9)\"" pod="openshift-machine-config-operator/machine-config-daemon-678nx" podUID="a15dc7ac-7c34-4135-b6eb-a85122800ce9" Mar 11 02:32:00 crc kubenswrapper[4744]: I0311 02:32:00.174327 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29553272-8f29r"] Mar 11 02:32:00 crc kubenswrapper[4744]: E0311 02:32:00.175366 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0055e672-0093-4766-ad33-335b68aec14c" containerName="oc" Mar 11 02:32:00 crc kubenswrapper[4744]: I0311 02:32:00.175390 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="0055e672-0093-4766-ad33-335b68aec14c" containerName="oc" Mar 11 02:32:00 crc kubenswrapper[4744]: E0311 02:32:00.175444 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1204a779-7d0f-48dc-bf95-b3cd3fca74e6" containerName="collect-profiles" Mar 11 02:32:00 crc kubenswrapper[4744]: I0311 02:32:00.175458 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="1204a779-7d0f-48dc-bf95-b3cd3fca74e6" containerName="collect-profiles" Mar 11 02:32:00 crc kubenswrapper[4744]: I0311 02:32:00.175775 4744 memory_manager.go:354] "RemoveStaleState removing state" podUID="1204a779-7d0f-48dc-bf95-b3cd3fca74e6" containerName="collect-profiles" Mar 11 02:32:00 crc kubenswrapper[4744]: I0311 02:32:00.175806 4744 memory_manager.go:354] "RemoveStaleState removing state" podUID="0055e672-0093-4766-ad33-335b68aec14c" containerName="oc" Mar 11 02:32:00 crc kubenswrapper[4744]: I0311 02:32:00.176758 4744 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29553272-8f29r" Mar 11 02:32:00 crc kubenswrapper[4744]: I0311 02:32:00.179992 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 11 02:32:00 crc kubenswrapper[4744]: I0311 02:32:00.180144 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 11 02:32:00 crc kubenswrapper[4744]: I0311 02:32:00.180880 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-rs77s" Mar 11 02:32:00 crc kubenswrapper[4744]: I0311 02:32:00.185670 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29553272-8f29r"] Mar 11 02:32:00 crc kubenswrapper[4744]: I0311 02:32:00.355484 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mrrh8\" (UniqueName: \"kubernetes.io/projected/58c3971f-b748-4eb0-a902-c0c87c4e8186-kube-api-access-mrrh8\") pod \"auto-csr-approver-29553272-8f29r\" (UID: \"58c3971f-b748-4eb0-a902-c0c87c4e8186\") " pod="openshift-infra/auto-csr-approver-29553272-8f29r" Mar 11 02:32:00 crc kubenswrapper[4744]: I0311 02:32:00.457606 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mrrh8\" (UniqueName: \"kubernetes.io/projected/58c3971f-b748-4eb0-a902-c0c87c4e8186-kube-api-access-mrrh8\") pod \"auto-csr-approver-29553272-8f29r\" (UID: \"58c3971f-b748-4eb0-a902-c0c87c4e8186\") " pod="openshift-infra/auto-csr-approver-29553272-8f29r" Mar 11 02:32:00 crc kubenswrapper[4744]: I0311 02:32:00.493172 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mrrh8\" (UniqueName: \"kubernetes.io/projected/58c3971f-b748-4eb0-a902-c0c87c4e8186-kube-api-access-mrrh8\") pod \"auto-csr-approver-29553272-8f29r\" (UID: \"58c3971f-b748-4eb0-a902-c0c87c4e8186\") " 
pod="openshift-infra/auto-csr-approver-29553272-8f29r" Mar 11 02:32:00 crc kubenswrapper[4744]: I0311 02:32:00.517111 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29553272-8f29r" Mar 11 02:32:01 crc kubenswrapper[4744]: I0311 02:32:01.024970 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29553272-8f29r"] Mar 11 02:32:01 crc kubenswrapper[4744]: W0311 02:32:01.036413 4744 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod58c3971f_b748_4eb0_a902_c0c87c4e8186.slice/crio-e2ff073926f4664b9ae60188063e48542f76228f938cf890a67aa7caf75cb249 WatchSource:0}: Error finding container e2ff073926f4664b9ae60188063e48542f76228f938cf890a67aa7caf75cb249: Status 404 returned error can't find the container with id e2ff073926f4664b9ae60188063e48542f76228f938cf890a67aa7caf75cb249 Mar 11 02:32:01 crc kubenswrapper[4744]: I0311 02:32:01.040343 4744 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 11 02:32:01 crc kubenswrapper[4744]: I0311 02:32:01.935962 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29553272-8f29r" event={"ID":"58c3971f-b748-4eb0-a902-c0c87c4e8186","Type":"ContainerStarted","Data":"e2ff073926f4664b9ae60188063e48542f76228f938cf890a67aa7caf75cb249"} Mar 11 02:32:02 crc kubenswrapper[4744]: I0311 02:32:02.952225 4744 generic.go:334] "Generic (PLEG): container finished" podID="58c3971f-b748-4eb0-a902-c0c87c4e8186" containerID="6fb87143ba994c47523a11bfa21d928076ef019c5d15b0cbc3c8ff9439f7c502" exitCode=0 Mar 11 02:32:02 crc kubenswrapper[4744]: I0311 02:32:02.952664 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29553272-8f29r" 
event={"ID":"58c3971f-b748-4eb0-a902-c0c87c4e8186","Type":"ContainerDied","Data":"6fb87143ba994c47523a11bfa21d928076ef019c5d15b0cbc3c8ff9439f7c502"} Mar 11 02:32:04 crc kubenswrapper[4744]: I0311 02:32:04.414788 4744 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29553272-8f29r" Mar 11 02:32:04 crc kubenswrapper[4744]: I0311 02:32:04.543898 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mrrh8\" (UniqueName: \"kubernetes.io/projected/58c3971f-b748-4eb0-a902-c0c87c4e8186-kube-api-access-mrrh8\") pod \"58c3971f-b748-4eb0-a902-c0c87c4e8186\" (UID: \"58c3971f-b748-4eb0-a902-c0c87c4e8186\") " Mar 11 02:32:04 crc kubenswrapper[4744]: I0311 02:32:04.552849 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/58c3971f-b748-4eb0-a902-c0c87c4e8186-kube-api-access-mrrh8" (OuterVolumeSpecName: "kube-api-access-mrrh8") pod "58c3971f-b748-4eb0-a902-c0c87c4e8186" (UID: "58c3971f-b748-4eb0-a902-c0c87c4e8186"). InnerVolumeSpecName "kube-api-access-mrrh8". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 02:32:04 crc kubenswrapper[4744]: I0311 02:32:04.645558 4744 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mrrh8\" (UniqueName: \"kubernetes.io/projected/58c3971f-b748-4eb0-a902-c0c87c4e8186-kube-api-access-mrrh8\") on node \"crc\" DevicePath \"\"" Mar 11 02:32:04 crc kubenswrapper[4744]: I0311 02:32:04.977300 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29553272-8f29r" event={"ID":"58c3971f-b748-4eb0-a902-c0c87c4e8186","Type":"ContainerDied","Data":"e2ff073926f4664b9ae60188063e48542f76228f938cf890a67aa7caf75cb249"} Mar 11 02:32:04 crc kubenswrapper[4744]: I0311 02:32:04.977360 4744 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e2ff073926f4664b9ae60188063e48542f76228f938cf890a67aa7caf75cb249" Mar 11 02:32:04 crc kubenswrapper[4744]: I0311 02:32:04.977446 4744 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29553272-8f29r" Mar 11 02:32:05 crc kubenswrapper[4744]: I0311 02:32:05.507563 4744 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29553266-2p9pn"] Mar 11 02:32:05 crc kubenswrapper[4744]: I0311 02:32:05.516461 4744 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29553266-2p9pn"] Mar 11 02:32:05 crc kubenswrapper[4744]: I0311 02:32:05.989834 4744 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="727a4f4b-1972-442d-a7e0-35607159db4a" path="/var/lib/kubelet/pods/727a4f4b-1972-442d-a7e0-35607159db4a/volumes" Mar 11 02:32:09 crc kubenswrapper[4744]: I0311 02:32:09.986591 4744 scope.go:117] "RemoveContainer" containerID="0c6d2088731104df66ba686648eb351061ccd27cb0c0c1c48cf7ba6629e29690" Mar 11 02:32:09 crc kubenswrapper[4744]: E0311 02:32:09.987900 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to 
\"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-678nx_openshift-machine-config-operator(a15dc7ac-7c34-4135-b6eb-a85122800ce9)\"" pod="openshift-machine-config-operator/machine-config-daemon-678nx" podUID="a15dc7ac-7c34-4135-b6eb-a85122800ce9" Mar 11 02:32:17 crc kubenswrapper[4744]: I0311 02:32:17.199453 4744 scope.go:117] "RemoveContainer" containerID="1049a69a35821280aaf43a1fac4134418d68046771aaeedcf51feb8bb6b1ef56" Mar 11 02:32:17 crc kubenswrapper[4744]: I0311 02:32:17.236753 4744 scope.go:117] "RemoveContainer" containerID="1a5b0273748bd5eaafaace5f1448a1d0f51d979c5bb0dd338f4fb22e28db547a" Mar 11 02:32:17 crc kubenswrapper[4744]: I0311 02:32:17.298795 4744 scope.go:117] "RemoveContainer" containerID="d731f0365636e990ae07735281ccf07e9e3e9faffa874566428948e60f8fdcf5" Mar 11 02:32:17 crc kubenswrapper[4744]: I0311 02:32:17.365772 4744 scope.go:117] "RemoveContainer" containerID="a662d89bda27c07cba467b0d3fbbd711e592b119debb46681ce2dd0c7b97beac" Mar 11 02:32:22 crc kubenswrapper[4744]: I0311 02:32:22.975308 4744 scope.go:117] "RemoveContainer" containerID="0c6d2088731104df66ba686648eb351061ccd27cb0c0c1c48cf7ba6629e29690" Mar 11 02:32:22 crc kubenswrapper[4744]: E0311 02:32:22.976450 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-678nx_openshift-machine-config-operator(a15dc7ac-7c34-4135-b6eb-a85122800ce9)\"" pod="openshift-machine-config-operator/machine-config-daemon-678nx" podUID="a15dc7ac-7c34-4135-b6eb-a85122800ce9" Mar 11 02:32:35 crc kubenswrapper[4744]: I0311 02:32:35.975311 4744 scope.go:117] "RemoveContainer" containerID="0c6d2088731104df66ba686648eb351061ccd27cb0c0c1c48cf7ba6629e29690" Mar 11 02:32:35 crc kubenswrapper[4744]: E0311 02:32:35.976133 4744 
pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-678nx_openshift-machine-config-operator(a15dc7ac-7c34-4135-b6eb-a85122800ce9)\"" pod="openshift-machine-config-operator/machine-config-daemon-678nx" podUID="a15dc7ac-7c34-4135-b6eb-a85122800ce9" Mar 11 02:32:46 crc kubenswrapper[4744]: I0311 02:32:46.975475 4744 scope.go:117] "RemoveContainer" containerID="0c6d2088731104df66ba686648eb351061ccd27cb0c0c1c48cf7ba6629e29690" Mar 11 02:32:46 crc kubenswrapper[4744]: E0311 02:32:46.976491 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-678nx_openshift-machine-config-operator(a15dc7ac-7c34-4135-b6eb-a85122800ce9)\"" pod="openshift-machine-config-operator/machine-config-daemon-678nx" podUID="a15dc7ac-7c34-4135-b6eb-a85122800ce9" Mar 11 02:33:01 crc kubenswrapper[4744]: I0311 02:33:01.975501 4744 scope.go:117] "RemoveContainer" containerID="0c6d2088731104df66ba686648eb351061ccd27cb0c0c1c48cf7ba6629e29690" Mar 11 02:33:01 crc kubenswrapper[4744]: E0311 02:33:01.976672 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-678nx_openshift-machine-config-operator(a15dc7ac-7c34-4135-b6eb-a85122800ce9)\"" pod="openshift-machine-config-operator/machine-config-daemon-678nx" podUID="a15dc7ac-7c34-4135-b6eb-a85122800ce9" Mar 11 02:33:12 crc kubenswrapper[4744]: I0311 02:33:12.975563 4744 scope.go:117] "RemoveContainer" containerID="0c6d2088731104df66ba686648eb351061ccd27cb0c0c1c48cf7ba6629e29690" Mar 11 02:33:12 crc kubenswrapper[4744]: E0311 
02:33:12.976633 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-678nx_openshift-machine-config-operator(a15dc7ac-7c34-4135-b6eb-a85122800ce9)\"" pod="openshift-machine-config-operator/machine-config-daemon-678nx" podUID="a15dc7ac-7c34-4135-b6eb-a85122800ce9" Mar 11 02:33:25 crc kubenswrapper[4744]: I0311 02:33:25.976636 4744 scope.go:117] "RemoveContainer" containerID="0c6d2088731104df66ba686648eb351061ccd27cb0c0c1c48cf7ba6629e29690" Mar 11 02:33:25 crc kubenswrapper[4744]: E0311 02:33:25.977900 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-678nx_openshift-machine-config-operator(a15dc7ac-7c34-4135-b6eb-a85122800ce9)\"" pod="openshift-machine-config-operator/machine-config-daemon-678nx" podUID="a15dc7ac-7c34-4135-b6eb-a85122800ce9" Mar 11 02:33:38 crc kubenswrapper[4744]: I0311 02:33:38.975273 4744 scope.go:117] "RemoveContainer" containerID="0c6d2088731104df66ba686648eb351061ccd27cb0c0c1c48cf7ba6629e29690" Mar 11 02:33:38 crc kubenswrapper[4744]: E0311 02:33:38.976303 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-678nx_openshift-machine-config-operator(a15dc7ac-7c34-4135-b6eb-a85122800ce9)\"" pod="openshift-machine-config-operator/machine-config-daemon-678nx" podUID="a15dc7ac-7c34-4135-b6eb-a85122800ce9" Mar 11 02:33:53 crc kubenswrapper[4744]: I0311 02:33:53.986586 4744 scope.go:117] "RemoveContainer" containerID="0c6d2088731104df66ba686648eb351061ccd27cb0c0c1c48cf7ba6629e29690" Mar 11 02:33:53 crc 
kubenswrapper[4744]: E0311 02:33:53.987489 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-678nx_openshift-machine-config-operator(a15dc7ac-7c34-4135-b6eb-a85122800ce9)\"" pod="openshift-machine-config-operator/machine-config-daemon-678nx" podUID="a15dc7ac-7c34-4135-b6eb-a85122800ce9" Mar 11 02:34:00 crc kubenswrapper[4744]: I0311 02:34:00.152881 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29553274-4z95m"] Mar 11 02:34:00 crc kubenswrapper[4744]: E0311 02:34:00.153866 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="58c3971f-b748-4eb0-a902-c0c87c4e8186" containerName="oc" Mar 11 02:34:00 crc kubenswrapper[4744]: I0311 02:34:00.153882 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="58c3971f-b748-4eb0-a902-c0c87c4e8186" containerName="oc" Mar 11 02:34:00 crc kubenswrapper[4744]: I0311 02:34:00.154096 4744 memory_manager.go:354] "RemoveStaleState removing state" podUID="58c3971f-b748-4eb0-a902-c0c87c4e8186" containerName="oc" Mar 11 02:34:00 crc kubenswrapper[4744]: I0311 02:34:00.154791 4744 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29553274-4z95m" Mar 11 02:34:00 crc kubenswrapper[4744]: I0311 02:34:00.157964 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 11 02:34:00 crc kubenswrapper[4744]: I0311 02:34:00.159428 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-rs77s" Mar 11 02:34:00 crc kubenswrapper[4744]: I0311 02:34:00.162759 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 11 02:34:00 crc kubenswrapper[4744]: I0311 02:34:00.194943 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29553274-4z95m"] Mar 11 02:34:00 crc kubenswrapper[4744]: I0311 02:34:00.220126 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-99rpn\" (UniqueName: \"kubernetes.io/projected/5ca71392-94d0-4b70-9dd7-2acb0deda0e0-kube-api-access-99rpn\") pod \"auto-csr-approver-29553274-4z95m\" (UID: \"5ca71392-94d0-4b70-9dd7-2acb0deda0e0\") " pod="openshift-infra/auto-csr-approver-29553274-4z95m" Mar 11 02:34:00 crc kubenswrapper[4744]: I0311 02:34:00.322972 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-99rpn\" (UniqueName: \"kubernetes.io/projected/5ca71392-94d0-4b70-9dd7-2acb0deda0e0-kube-api-access-99rpn\") pod \"auto-csr-approver-29553274-4z95m\" (UID: \"5ca71392-94d0-4b70-9dd7-2acb0deda0e0\") " pod="openshift-infra/auto-csr-approver-29553274-4z95m" Mar 11 02:34:00 crc kubenswrapper[4744]: I0311 02:34:00.351024 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-99rpn\" (UniqueName: \"kubernetes.io/projected/5ca71392-94d0-4b70-9dd7-2acb0deda0e0-kube-api-access-99rpn\") pod \"auto-csr-approver-29553274-4z95m\" (UID: \"5ca71392-94d0-4b70-9dd7-2acb0deda0e0\") " 
pod="openshift-infra/auto-csr-approver-29553274-4z95m" Mar 11 02:34:00 crc kubenswrapper[4744]: I0311 02:34:00.479564 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29553274-4z95m" Mar 11 02:34:00 crc kubenswrapper[4744]: I0311 02:34:00.799721 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29553274-4z95m"] Mar 11 02:34:01 crc kubenswrapper[4744]: I0311 02:34:01.273154 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29553274-4z95m" event={"ID":"5ca71392-94d0-4b70-9dd7-2acb0deda0e0","Type":"ContainerStarted","Data":"7f994033738f64008f9a5298eafecebfdaa23ebccc2a2b0ae0d8cae2fd80b103"} Mar 11 02:34:02 crc kubenswrapper[4744]: I0311 02:34:02.282266 4744 generic.go:334] "Generic (PLEG): container finished" podID="5ca71392-94d0-4b70-9dd7-2acb0deda0e0" containerID="d8ffc88fe2600f1c9b1fc84804dca89e6266e47e67064e0fb6c342fa47bd3c10" exitCode=0 Mar 11 02:34:02 crc kubenswrapper[4744]: I0311 02:34:02.282399 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29553274-4z95m" event={"ID":"5ca71392-94d0-4b70-9dd7-2acb0deda0e0","Type":"ContainerDied","Data":"d8ffc88fe2600f1c9b1fc84804dca89e6266e47e67064e0fb6c342fa47bd3c10"} Mar 11 02:34:03 crc kubenswrapper[4744]: I0311 02:34:03.722436 4744 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29553274-4z95m" Mar 11 02:34:03 crc kubenswrapper[4744]: I0311 02:34:03.789003 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-99rpn\" (UniqueName: \"kubernetes.io/projected/5ca71392-94d0-4b70-9dd7-2acb0deda0e0-kube-api-access-99rpn\") pod \"5ca71392-94d0-4b70-9dd7-2acb0deda0e0\" (UID: \"5ca71392-94d0-4b70-9dd7-2acb0deda0e0\") " Mar 11 02:34:03 crc kubenswrapper[4744]: I0311 02:34:03.798950 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5ca71392-94d0-4b70-9dd7-2acb0deda0e0-kube-api-access-99rpn" (OuterVolumeSpecName: "kube-api-access-99rpn") pod "5ca71392-94d0-4b70-9dd7-2acb0deda0e0" (UID: "5ca71392-94d0-4b70-9dd7-2acb0deda0e0"). InnerVolumeSpecName "kube-api-access-99rpn". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 02:34:03 crc kubenswrapper[4744]: I0311 02:34:03.892065 4744 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-99rpn\" (UniqueName: \"kubernetes.io/projected/5ca71392-94d0-4b70-9dd7-2acb0deda0e0-kube-api-access-99rpn\") on node \"crc\" DevicePath \"\"" Mar 11 02:34:04 crc kubenswrapper[4744]: I0311 02:34:04.304460 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29553274-4z95m" event={"ID":"5ca71392-94d0-4b70-9dd7-2acb0deda0e0","Type":"ContainerDied","Data":"7f994033738f64008f9a5298eafecebfdaa23ebccc2a2b0ae0d8cae2fd80b103"} Mar 11 02:34:04 crc kubenswrapper[4744]: I0311 02:34:04.304759 4744 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7f994033738f64008f9a5298eafecebfdaa23ebccc2a2b0ae0d8cae2fd80b103" Mar 11 02:34:04 crc kubenswrapper[4744]: I0311 02:34:04.304519 4744 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29553274-4z95m" Mar 11 02:34:04 crc kubenswrapper[4744]: I0311 02:34:04.803947 4744 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29553268-79s8p"] Mar 11 02:34:04 crc kubenswrapper[4744]: I0311 02:34:04.814423 4744 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29553268-79s8p"] Mar 11 02:34:06 crc kubenswrapper[4744]: I0311 02:34:06.007986 4744 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="dab956a3-b6e6-45d0-82ac-ba8e9c8c6dc1" path="/var/lib/kubelet/pods/dab956a3-b6e6-45d0-82ac-ba8e9c8c6dc1/volumes" Mar 11 02:34:06 crc kubenswrapper[4744]: I0311 02:34:06.974704 4744 scope.go:117] "RemoveContainer" containerID="0c6d2088731104df66ba686648eb351061ccd27cb0c0c1c48cf7ba6629e29690" Mar 11 02:34:06 crc kubenswrapper[4744]: E0311 02:34:06.975732 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-678nx_openshift-machine-config-operator(a15dc7ac-7c34-4135-b6eb-a85122800ce9)\"" pod="openshift-machine-config-operator/machine-config-daemon-678nx" podUID="a15dc7ac-7c34-4135-b6eb-a85122800ce9" Mar 11 02:34:17 crc kubenswrapper[4744]: I0311 02:34:17.474372 4744 scope.go:117] "RemoveContainer" containerID="2b1ed65fae0af39455b6a97f406ad98e8dfea27b325ab77d32a7c3ca47ef82ed" Mar 11 02:34:19 crc kubenswrapper[4744]: I0311 02:34:19.975790 4744 scope.go:117] "RemoveContainer" containerID="0c6d2088731104df66ba686648eb351061ccd27cb0c0c1c48cf7ba6629e29690" Mar 11 02:34:20 crc kubenswrapper[4744]: I0311 02:34:20.466110 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-678nx" 
event={"ID":"a15dc7ac-7c34-4135-b6eb-a85122800ce9","Type":"ContainerStarted","Data":"cd8c1772eb249dc13a76fd8580fc37b972dc3bd9cbe7efcd80c33ed6959bc79f"} Mar 11 02:35:49 crc kubenswrapper[4744]: I0311 02:35:49.062369 4744 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-db-create-9v4nw"] Mar 11 02:35:49 crc kubenswrapper[4744]: I0311 02:35:49.078728 4744 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-db-create-9v4nw"] Mar 11 02:35:49 crc kubenswrapper[4744]: I0311 02:35:49.990926 4744 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f5f595b1-576e-4f29-8a3f-5122b93accc9" path="/var/lib/kubelet/pods/f5f595b1-576e-4f29-8a3f-5122b93accc9/volumes" Mar 11 02:35:51 crc kubenswrapper[4744]: I0311 02:35:51.034982 4744 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-505a-account-create-update-dqmkk"] Mar 11 02:35:51 crc kubenswrapper[4744]: I0311 02:35:51.043089 4744 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-505a-account-create-update-dqmkk"] Mar 11 02:35:51 crc kubenswrapper[4744]: I0311 02:35:51.992856 4744 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b8540b4c-50af-4bd6-abba-8e9e820cf22c" path="/var/lib/kubelet/pods/b8540b4c-50af-4bd6-abba-8e9e820cf22c/volumes" Mar 11 02:36:00 crc kubenswrapper[4744]: I0311 02:36:00.046894 4744 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-db-sync-69x27"] Mar 11 02:36:00 crc kubenswrapper[4744]: I0311 02:36:00.061955 4744 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-db-sync-69x27"] Mar 11 02:36:00 crc kubenswrapper[4744]: I0311 02:36:00.186781 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29553276-rgjfz"] Mar 11 02:36:00 crc kubenswrapper[4744]: E0311 02:36:00.187351 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5ca71392-94d0-4b70-9dd7-2acb0deda0e0" 
containerName="oc" Mar 11 02:36:00 crc kubenswrapper[4744]: I0311 02:36:00.187379 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="5ca71392-94d0-4b70-9dd7-2acb0deda0e0" containerName="oc" Mar 11 02:36:00 crc kubenswrapper[4744]: I0311 02:36:00.187711 4744 memory_manager.go:354] "RemoveStaleState removing state" podUID="5ca71392-94d0-4b70-9dd7-2acb0deda0e0" containerName="oc" Mar 11 02:36:00 crc kubenswrapper[4744]: I0311 02:36:00.188661 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29553276-rgjfz" Mar 11 02:36:00 crc kubenswrapper[4744]: I0311 02:36:00.192227 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-rs77s" Mar 11 02:36:00 crc kubenswrapper[4744]: I0311 02:36:00.192800 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 11 02:36:00 crc kubenswrapper[4744]: I0311 02:36:00.193451 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 11 02:36:00 crc kubenswrapper[4744]: I0311 02:36:00.202457 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29553276-rgjfz"] Mar 11 02:36:00 crc kubenswrapper[4744]: I0311 02:36:00.269145 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8g9bz\" (UniqueName: \"kubernetes.io/projected/d75d4a39-c076-4584-9ede-73f57c608141-kube-api-access-8g9bz\") pod \"auto-csr-approver-29553276-rgjfz\" (UID: \"d75d4a39-c076-4584-9ede-73f57c608141\") " pod="openshift-infra/auto-csr-approver-29553276-rgjfz" Mar 11 02:36:00 crc kubenswrapper[4744]: I0311 02:36:00.370999 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8g9bz\" (UniqueName: \"kubernetes.io/projected/d75d4a39-c076-4584-9ede-73f57c608141-kube-api-access-8g9bz\") pod 
\"auto-csr-approver-29553276-rgjfz\" (UID: \"d75d4a39-c076-4584-9ede-73f57c608141\") " pod="openshift-infra/auto-csr-approver-29553276-rgjfz" Mar 11 02:36:00 crc kubenswrapper[4744]: I0311 02:36:00.397416 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8g9bz\" (UniqueName: \"kubernetes.io/projected/d75d4a39-c076-4584-9ede-73f57c608141-kube-api-access-8g9bz\") pod \"auto-csr-approver-29553276-rgjfz\" (UID: \"d75d4a39-c076-4584-9ede-73f57c608141\") " pod="openshift-infra/auto-csr-approver-29553276-rgjfz" Mar 11 02:36:00 crc kubenswrapper[4744]: I0311 02:36:00.511272 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29553276-rgjfz" Mar 11 02:36:01 crc kubenswrapper[4744]: I0311 02:36:01.017185 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29553276-rgjfz"] Mar 11 02:36:01 crc kubenswrapper[4744]: W0311 02:36:01.031681 4744 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd75d4a39_c076_4584_9ede_73f57c608141.slice/crio-687912f208b79596dab5370b7224e696d1e3041b43903e6699250440153d1cb3 WatchSource:0}: Error finding container 687912f208b79596dab5370b7224e696d1e3041b43903e6699250440153d1cb3: Status 404 returned error can't find the container with id 687912f208b79596dab5370b7224e696d1e3041b43903e6699250440153d1cb3 Mar 11 02:36:01 crc kubenswrapper[4744]: I0311 02:36:01.514188 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29553276-rgjfz" event={"ID":"d75d4a39-c076-4584-9ede-73f57c608141","Type":"ContainerStarted","Data":"687912f208b79596dab5370b7224e696d1e3041b43903e6699250440153d1cb3"} Mar 11 02:36:01 crc kubenswrapper[4744]: I0311 02:36:01.994735 4744 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="efe36576-8433-45b2-a376-82d5439a0208" 
path="/var/lib/kubelet/pods/efe36576-8433-45b2-a376-82d5439a0208/volumes" Mar 11 02:36:02 crc kubenswrapper[4744]: I0311 02:36:02.532575 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29553276-rgjfz" event={"ID":"d75d4a39-c076-4584-9ede-73f57c608141","Type":"ContainerStarted","Data":"106243b38d36c0556276a177f8344dfc32251b074df1cdf9839fde5e6f424e69"} Mar 11 02:36:02 crc kubenswrapper[4744]: I0311 02:36:02.555902 4744 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29553276-rgjfz" podStartSLOduration=1.609862446 podStartE2EDuration="2.55585122s" podCreationTimestamp="2026-03-11 02:36:00 +0000 UTC" firstStartedPulling="2026-03-11 02:36:01.043841234 +0000 UTC m=+6117.848058879" lastFinishedPulling="2026-03-11 02:36:01.989830018 +0000 UTC m=+6118.794047653" observedRunningTime="2026-03-11 02:36:02.548168104 +0000 UTC m=+6119.352385769" watchObservedRunningTime="2026-03-11 02:36:02.55585122 +0000 UTC m=+6119.360068825" Mar 11 02:36:03 crc kubenswrapper[4744]: I0311 02:36:03.544944 4744 generic.go:334] "Generic (PLEG): container finished" podID="d75d4a39-c076-4584-9ede-73f57c608141" containerID="106243b38d36c0556276a177f8344dfc32251b074df1cdf9839fde5e6f424e69" exitCode=0 Mar 11 02:36:03 crc kubenswrapper[4744]: I0311 02:36:03.545013 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29553276-rgjfz" event={"ID":"d75d4a39-c076-4584-9ede-73f57c608141","Type":"ContainerDied","Data":"106243b38d36c0556276a177f8344dfc32251b074df1cdf9839fde5e6f424e69"} Mar 11 02:36:05 crc kubenswrapper[4744]: I0311 02:36:05.012803 4744 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29553276-rgjfz" Mar 11 02:36:05 crc kubenswrapper[4744]: I0311 02:36:05.052957 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8g9bz\" (UniqueName: \"kubernetes.io/projected/d75d4a39-c076-4584-9ede-73f57c608141-kube-api-access-8g9bz\") pod \"d75d4a39-c076-4584-9ede-73f57c608141\" (UID: \"d75d4a39-c076-4584-9ede-73f57c608141\") " Mar 11 02:36:05 crc kubenswrapper[4744]: I0311 02:36:05.061760 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d75d4a39-c076-4584-9ede-73f57c608141-kube-api-access-8g9bz" (OuterVolumeSpecName: "kube-api-access-8g9bz") pod "d75d4a39-c076-4584-9ede-73f57c608141" (UID: "d75d4a39-c076-4584-9ede-73f57c608141"). InnerVolumeSpecName "kube-api-access-8g9bz". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 02:36:05 crc kubenswrapper[4744]: I0311 02:36:05.156238 4744 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8g9bz\" (UniqueName: \"kubernetes.io/projected/d75d4a39-c076-4584-9ede-73f57c608141-kube-api-access-8g9bz\") on node \"crc\" DevicePath \"\"" Mar 11 02:36:05 crc kubenswrapper[4744]: I0311 02:36:05.572972 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29553276-rgjfz" event={"ID":"d75d4a39-c076-4584-9ede-73f57c608141","Type":"ContainerDied","Data":"687912f208b79596dab5370b7224e696d1e3041b43903e6699250440153d1cb3"} Mar 11 02:36:05 crc kubenswrapper[4744]: I0311 02:36:05.573038 4744 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="687912f208b79596dab5370b7224e696d1e3041b43903e6699250440153d1cb3" Mar 11 02:36:05 crc kubenswrapper[4744]: I0311 02:36:05.573122 4744 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29553276-rgjfz" Mar 11 02:36:05 crc kubenswrapper[4744]: I0311 02:36:05.634040 4744 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29553270-xv2sr"] Mar 11 02:36:05 crc kubenswrapper[4744]: I0311 02:36:05.646013 4744 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29553270-xv2sr"] Mar 11 02:36:06 crc kubenswrapper[4744]: I0311 02:36:06.004461 4744 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0055e672-0093-4766-ad33-335b68aec14c" path="/var/lib/kubelet/pods/0055e672-0093-4766-ad33-335b68aec14c/volumes" Mar 11 02:36:14 crc kubenswrapper[4744]: I0311 02:36:14.049815 4744 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-bootstrap-2w2gq"] Mar 11 02:36:14 crc kubenswrapper[4744]: I0311 02:36:14.065272 4744 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-bootstrap-2w2gq"] Mar 11 02:36:15 crc kubenswrapper[4744]: I0311 02:36:15.988983 4744 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e325c578-6909-4a00-8a16-436f430a8071" path="/var/lib/kubelet/pods/e325c578-6909-4a00-8a16-436f430a8071/volumes" Mar 11 02:36:17 crc kubenswrapper[4744]: I0311 02:36:17.580630 4744 scope.go:117] "RemoveContainer" containerID="664d51ea20ee21d04086ff13d1c2201d4a90c51e74b13ee455c5ba70d494c948" Mar 11 02:36:17 crc kubenswrapper[4744]: I0311 02:36:17.645886 4744 scope.go:117] "RemoveContainer" containerID="1a359cfcbdd2e0ac830ef649745ebc3c5a8e9453572a20dd943f035fbacac5e6" Mar 11 02:36:17 crc kubenswrapper[4744]: I0311 02:36:17.684594 4744 scope.go:117] "RemoveContainer" containerID="a7289668482f737751be18b35d1ae34a24da27d977aa5399f17716f7c760cc89" Mar 11 02:36:17 crc kubenswrapper[4744]: I0311 02:36:17.762436 4744 scope.go:117] "RemoveContainer" containerID="41d0437c40e87b1ae6feeddb4b53a62fdab0700f2af5b394699655107306d7cb" Mar 11 02:36:17 crc 
kubenswrapper[4744]: I0311 02:36:17.788839 4744 scope.go:117] "RemoveContainer" containerID="1d75cc4fc5140c5d8674c518d3ebefbaeb41513facbec9f5626527ce0c40a28e" Mar 11 02:36:23 crc kubenswrapper[4744]: I0311 02:36:23.713212 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-6wd4b"] Mar 11 02:36:23 crc kubenswrapper[4744]: E0311 02:36:23.715116 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d75d4a39-c076-4584-9ede-73f57c608141" containerName="oc" Mar 11 02:36:23 crc kubenswrapper[4744]: I0311 02:36:23.715156 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="d75d4a39-c076-4584-9ede-73f57c608141" containerName="oc" Mar 11 02:36:23 crc kubenswrapper[4744]: I0311 02:36:23.715469 4744 memory_manager.go:354] "RemoveStaleState removing state" podUID="d75d4a39-c076-4584-9ede-73f57c608141" containerName="oc" Mar 11 02:36:23 crc kubenswrapper[4744]: I0311 02:36:23.717678 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-6wd4b" Mar 11 02:36:23 crc kubenswrapper[4744]: I0311 02:36:23.738926 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-6wd4b"] Mar 11 02:36:23 crc kubenswrapper[4744]: I0311 02:36:23.778133 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/39e3dc3b-a6db-41ed-98ca-eb381ead17ec-catalog-content\") pod \"redhat-operators-6wd4b\" (UID: \"39e3dc3b-a6db-41ed-98ca-eb381ead17ec\") " pod="openshift-marketplace/redhat-operators-6wd4b" Mar 11 02:36:23 crc kubenswrapper[4744]: I0311 02:36:23.778294 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/39e3dc3b-a6db-41ed-98ca-eb381ead17ec-utilities\") pod \"redhat-operators-6wd4b\" (UID: 
\"39e3dc3b-a6db-41ed-98ca-eb381ead17ec\") " pod="openshift-marketplace/redhat-operators-6wd4b" Mar 11 02:36:23 crc kubenswrapper[4744]: I0311 02:36:23.778353 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f489p\" (UniqueName: \"kubernetes.io/projected/39e3dc3b-a6db-41ed-98ca-eb381ead17ec-kube-api-access-f489p\") pod \"redhat-operators-6wd4b\" (UID: \"39e3dc3b-a6db-41ed-98ca-eb381ead17ec\") " pod="openshift-marketplace/redhat-operators-6wd4b" Mar 11 02:36:23 crc kubenswrapper[4744]: I0311 02:36:23.880441 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/39e3dc3b-a6db-41ed-98ca-eb381ead17ec-catalog-content\") pod \"redhat-operators-6wd4b\" (UID: \"39e3dc3b-a6db-41ed-98ca-eb381ead17ec\") " pod="openshift-marketplace/redhat-operators-6wd4b" Mar 11 02:36:23 crc kubenswrapper[4744]: I0311 02:36:23.880631 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/39e3dc3b-a6db-41ed-98ca-eb381ead17ec-utilities\") pod \"redhat-operators-6wd4b\" (UID: \"39e3dc3b-a6db-41ed-98ca-eb381ead17ec\") " pod="openshift-marketplace/redhat-operators-6wd4b" Mar 11 02:36:23 crc kubenswrapper[4744]: I0311 02:36:23.880702 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f489p\" (UniqueName: \"kubernetes.io/projected/39e3dc3b-a6db-41ed-98ca-eb381ead17ec-kube-api-access-f489p\") pod \"redhat-operators-6wd4b\" (UID: \"39e3dc3b-a6db-41ed-98ca-eb381ead17ec\") " pod="openshift-marketplace/redhat-operators-6wd4b" Mar 11 02:36:23 crc kubenswrapper[4744]: I0311 02:36:23.881297 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/39e3dc3b-a6db-41ed-98ca-eb381ead17ec-catalog-content\") pod \"redhat-operators-6wd4b\" (UID: 
\"39e3dc3b-a6db-41ed-98ca-eb381ead17ec\") " pod="openshift-marketplace/redhat-operators-6wd4b" Mar 11 02:36:23 crc kubenswrapper[4744]: I0311 02:36:23.881400 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/39e3dc3b-a6db-41ed-98ca-eb381ead17ec-utilities\") pod \"redhat-operators-6wd4b\" (UID: \"39e3dc3b-a6db-41ed-98ca-eb381ead17ec\") " pod="openshift-marketplace/redhat-operators-6wd4b" Mar 11 02:36:23 crc kubenswrapper[4744]: I0311 02:36:23.909014 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f489p\" (UniqueName: \"kubernetes.io/projected/39e3dc3b-a6db-41ed-98ca-eb381ead17ec-kube-api-access-f489p\") pod \"redhat-operators-6wd4b\" (UID: \"39e3dc3b-a6db-41ed-98ca-eb381ead17ec\") " pod="openshift-marketplace/redhat-operators-6wd4b" Mar 11 02:36:24 crc kubenswrapper[4744]: I0311 02:36:24.042809 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-6wd4b" Mar 11 02:36:24 crc kubenswrapper[4744]: I0311 02:36:24.287881 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-6wd4b"] Mar 11 02:36:24 crc kubenswrapper[4744]: I0311 02:36:24.491216 4744 generic.go:334] "Generic (PLEG): container finished" podID="39e3dc3b-a6db-41ed-98ca-eb381ead17ec" containerID="3ea67a89d822531e67a43f66fe05980f38d3770e4b629c9029bd4fd457c537ab" exitCode=0 Mar 11 02:36:24 crc kubenswrapper[4744]: I0311 02:36:24.491263 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-6wd4b" event={"ID":"39e3dc3b-a6db-41ed-98ca-eb381ead17ec","Type":"ContainerDied","Data":"3ea67a89d822531e67a43f66fe05980f38d3770e4b629c9029bd4fd457c537ab"} Mar 11 02:36:24 crc kubenswrapper[4744]: I0311 02:36:24.491289 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-6wd4b" 
event={"ID":"39e3dc3b-a6db-41ed-98ca-eb381ead17ec","Type":"ContainerStarted","Data":"793a90cf1ba9f4b0729491163278fff2c7d341974ac6a066d4b63dba2e067c8a"} Mar 11 02:36:25 crc kubenswrapper[4744]: I0311 02:36:25.502914 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-6wd4b" event={"ID":"39e3dc3b-a6db-41ed-98ca-eb381ead17ec","Type":"ContainerStarted","Data":"1d726c244b4a5f967bede3007287fcb39202f954516f06e3f02ffa6407ff142e"} Mar 11 02:36:26 crc kubenswrapper[4744]: I0311 02:36:26.516919 4744 generic.go:334] "Generic (PLEG): container finished" podID="39e3dc3b-a6db-41ed-98ca-eb381ead17ec" containerID="1d726c244b4a5f967bede3007287fcb39202f954516f06e3f02ffa6407ff142e" exitCode=0 Mar 11 02:36:26 crc kubenswrapper[4744]: I0311 02:36:26.516985 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-6wd4b" event={"ID":"39e3dc3b-a6db-41ed-98ca-eb381ead17ec","Type":"ContainerDied","Data":"1d726c244b4a5f967bede3007287fcb39202f954516f06e3f02ffa6407ff142e"} Mar 11 02:36:27 crc kubenswrapper[4744]: I0311 02:36:27.526871 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-6wd4b" event={"ID":"39e3dc3b-a6db-41ed-98ca-eb381ead17ec","Type":"ContainerStarted","Data":"24cab1151bdba77c20cccbc577c1047fb79a4af027ce02afd3fb233426e37e65"} Mar 11 02:36:27 crc kubenswrapper[4744]: I0311 02:36:27.547822 4744 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-6wd4b" podStartSLOduration=2.113932448 podStartE2EDuration="4.547801685s" podCreationTimestamp="2026-03-11 02:36:23 +0000 UTC" firstStartedPulling="2026-03-11 02:36:24.493005595 +0000 UTC m=+6141.297223200" lastFinishedPulling="2026-03-11 02:36:26.926874802 +0000 UTC m=+6143.731092437" observedRunningTime="2026-03-11 02:36:27.546890466 +0000 UTC m=+6144.351108111" watchObservedRunningTime="2026-03-11 02:36:27.547801685 +0000 UTC m=+6144.352019300" 
Mar 11 02:36:28 crc kubenswrapper[4744]: I0311 02:36:28.508091 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-25d8t"] Mar 11 02:36:28 crc kubenswrapper[4744]: I0311 02:36:28.511373 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-25d8t" Mar 11 02:36:28 crc kubenswrapper[4744]: I0311 02:36:28.516766 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-25d8t"] Mar 11 02:36:28 crc kubenswrapper[4744]: I0311 02:36:28.570797 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fc6175e8-ba3c-43b0-8093-78e9f7d25ae5-catalog-content\") pod \"community-operators-25d8t\" (UID: \"fc6175e8-ba3c-43b0-8093-78e9f7d25ae5\") " pod="openshift-marketplace/community-operators-25d8t" Mar 11 02:36:28 crc kubenswrapper[4744]: I0311 02:36:28.570990 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fc6175e8-ba3c-43b0-8093-78e9f7d25ae5-utilities\") pod \"community-operators-25d8t\" (UID: \"fc6175e8-ba3c-43b0-8093-78e9f7d25ae5\") " pod="openshift-marketplace/community-operators-25d8t" Mar 11 02:36:28 crc kubenswrapper[4744]: I0311 02:36:28.571133 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zlg4l\" (UniqueName: \"kubernetes.io/projected/fc6175e8-ba3c-43b0-8093-78e9f7d25ae5-kube-api-access-zlg4l\") pod \"community-operators-25d8t\" (UID: \"fc6175e8-ba3c-43b0-8093-78e9f7d25ae5\") " pod="openshift-marketplace/community-operators-25d8t" Mar 11 02:36:28 crc kubenswrapper[4744]: I0311 02:36:28.672394 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zlg4l\" (UniqueName: 
\"kubernetes.io/projected/fc6175e8-ba3c-43b0-8093-78e9f7d25ae5-kube-api-access-zlg4l\") pod \"community-operators-25d8t\" (UID: \"fc6175e8-ba3c-43b0-8093-78e9f7d25ae5\") " pod="openshift-marketplace/community-operators-25d8t" Mar 11 02:36:28 crc kubenswrapper[4744]: I0311 02:36:28.672748 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fc6175e8-ba3c-43b0-8093-78e9f7d25ae5-catalog-content\") pod \"community-operators-25d8t\" (UID: \"fc6175e8-ba3c-43b0-8093-78e9f7d25ae5\") " pod="openshift-marketplace/community-operators-25d8t" Mar 11 02:36:28 crc kubenswrapper[4744]: I0311 02:36:28.672940 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fc6175e8-ba3c-43b0-8093-78e9f7d25ae5-utilities\") pod \"community-operators-25d8t\" (UID: \"fc6175e8-ba3c-43b0-8093-78e9f7d25ae5\") " pod="openshift-marketplace/community-operators-25d8t" Mar 11 02:36:28 crc kubenswrapper[4744]: I0311 02:36:28.673195 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fc6175e8-ba3c-43b0-8093-78e9f7d25ae5-catalog-content\") pod \"community-operators-25d8t\" (UID: \"fc6175e8-ba3c-43b0-8093-78e9f7d25ae5\") " pod="openshift-marketplace/community-operators-25d8t" Mar 11 02:36:28 crc kubenswrapper[4744]: I0311 02:36:28.673316 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fc6175e8-ba3c-43b0-8093-78e9f7d25ae5-utilities\") pod \"community-operators-25d8t\" (UID: \"fc6175e8-ba3c-43b0-8093-78e9f7d25ae5\") " pod="openshift-marketplace/community-operators-25d8t" Mar 11 02:36:28 crc kubenswrapper[4744]: I0311 02:36:28.693196 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zlg4l\" (UniqueName: 
\"kubernetes.io/projected/fc6175e8-ba3c-43b0-8093-78e9f7d25ae5-kube-api-access-zlg4l\") pod \"community-operators-25d8t\" (UID: \"fc6175e8-ba3c-43b0-8093-78e9f7d25ae5\") " pod="openshift-marketplace/community-operators-25d8t" Mar 11 02:36:28 crc kubenswrapper[4744]: I0311 02:36:28.841227 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-25d8t" Mar 11 02:36:29 crc kubenswrapper[4744]: I0311 02:36:29.327232 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-25d8t"] Mar 11 02:36:29 crc kubenswrapper[4744]: W0311 02:36:29.334803 4744 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfc6175e8_ba3c_43b0_8093_78e9f7d25ae5.slice/crio-ad04ec5cda119f58c652cc88499d581f056b2a2b8d5c4bbf4dc856ec82d7d5ad WatchSource:0}: Error finding container ad04ec5cda119f58c652cc88499d581f056b2a2b8d5c4bbf4dc856ec82d7d5ad: Status 404 returned error can't find the container with id ad04ec5cda119f58c652cc88499d581f056b2a2b8d5c4bbf4dc856ec82d7d5ad Mar 11 02:36:29 crc kubenswrapper[4744]: I0311 02:36:29.500376 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-xl9gl"] Mar 11 02:36:29 crc kubenswrapper[4744]: I0311 02:36:29.503016 4744 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-xl9gl" Mar 11 02:36:29 crc kubenswrapper[4744]: I0311 02:36:29.549224 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-xl9gl"] Mar 11 02:36:29 crc kubenswrapper[4744]: I0311 02:36:29.579475 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-25d8t" event={"ID":"fc6175e8-ba3c-43b0-8093-78e9f7d25ae5","Type":"ContainerStarted","Data":"ad04ec5cda119f58c652cc88499d581f056b2a2b8d5c4bbf4dc856ec82d7d5ad"} Mar 11 02:36:29 crc kubenswrapper[4744]: I0311 02:36:29.691547 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5d9bfc2e-0f8e-4e89-982b-16d1c7ec5978-utilities\") pod \"redhat-marketplace-xl9gl\" (UID: \"5d9bfc2e-0f8e-4e89-982b-16d1c7ec5978\") " pod="openshift-marketplace/redhat-marketplace-xl9gl" Mar 11 02:36:29 crc kubenswrapper[4744]: I0311 02:36:29.691662 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-242hv\" (UniqueName: \"kubernetes.io/projected/5d9bfc2e-0f8e-4e89-982b-16d1c7ec5978-kube-api-access-242hv\") pod \"redhat-marketplace-xl9gl\" (UID: \"5d9bfc2e-0f8e-4e89-982b-16d1c7ec5978\") " pod="openshift-marketplace/redhat-marketplace-xl9gl" Mar 11 02:36:29 crc kubenswrapper[4744]: I0311 02:36:29.691737 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5d9bfc2e-0f8e-4e89-982b-16d1c7ec5978-catalog-content\") pod \"redhat-marketplace-xl9gl\" (UID: \"5d9bfc2e-0f8e-4e89-982b-16d1c7ec5978\") " pod="openshift-marketplace/redhat-marketplace-xl9gl" Mar 11 02:36:29 crc kubenswrapper[4744]: I0311 02:36:29.793404 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-242hv\" (UniqueName: 
\"kubernetes.io/projected/5d9bfc2e-0f8e-4e89-982b-16d1c7ec5978-kube-api-access-242hv\") pod \"redhat-marketplace-xl9gl\" (UID: \"5d9bfc2e-0f8e-4e89-982b-16d1c7ec5978\") " pod="openshift-marketplace/redhat-marketplace-xl9gl" Mar 11 02:36:29 crc kubenswrapper[4744]: I0311 02:36:29.793479 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5d9bfc2e-0f8e-4e89-982b-16d1c7ec5978-catalog-content\") pod \"redhat-marketplace-xl9gl\" (UID: \"5d9bfc2e-0f8e-4e89-982b-16d1c7ec5978\") " pod="openshift-marketplace/redhat-marketplace-xl9gl" Mar 11 02:36:29 crc kubenswrapper[4744]: I0311 02:36:29.793532 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5d9bfc2e-0f8e-4e89-982b-16d1c7ec5978-utilities\") pod \"redhat-marketplace-xl9gl\" (UID: \"5d9bfc2e-0f8e-4e89-982b-16d1c7ec5978\") " pod="openshift-marketplace/redhat-marketplace-xl9gl" Mar 11 02:36:29 crc kubenswrapper[4744]: I0311 02:36:29.793933 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5d9bfc2e-0f8e-4e89-982b-16d1c7ec5978-utilities\") pod \"redhat-marketplace-xl9gl\" (UID: \"5d9bfc2e-0f8e-4e89-982b-16d1c7ec5978\") " pod="openshift-marketplace/redhat-marketplace-xl9gl" Mar 11 02:36:29 crc kubenswrapper[4744]: I0311 02:36:29.794113 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5d9bfc2e-0f8e-4e89-982b-16d1c7ec5978-catalog-content\") pod \"redhat-marketplace-xl9gl\" (UID: \"5d9bfc2e-0f8e-4e89-982b-16d1c7ec5978\") " pod="openshift-marketplace/redhat-marketplace-xl9gl" Mar 11 02:36:29 crc kubenswrapper[4744]: I0311 02:36:29.829406 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-242hv\" (UniqueName: 
\"kubernetes.io/projected/5d9bfc2e-0f8e-4e89-982b-16d1c7ec5978-kube-api-access-242hv\") pod \"redhat-marketplace-xl9gl\" (UID: \"5d9bfc2e-0f8e-4e89-982b-16d1c7ec5978\") " pod="openshift-marketplace/redhat-marketplace-xl9gl" Mar 11 02:36:29 crc kubenswrapper[4744]: I0311 02:36:29.856057 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-xl9gl" Mar 11 02:36:30 crc kubenswrapper[4744]: I0311 02:36:30.341197 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-xl9gl"] Mar 11 02:36:30 crc kubenswrapper[4744]: I0311 02:36:30.588096 4744 generic.go:334] "Generic (PLEG): container finished" podID="fc6175e8-ba3c-43b0-8093-78e9f7d25ae5" containerID="445fbdc740f654829827dd8b2f6bf203e5812b4cd7e0bc8f86f9e345e6950eb0" exitCode=0 Mar 11 02:36:30 crc kubenswrapper[4744]: I0311 02:36:30.588163 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-25d8t" event={"ID":"fc6175e8-ba3c-43b0-8093-78e9f7d25ae5","Type":"ContainerDied","Data":"445fbdc740f654829827dd8b2f6bf203e5812b4cd7e0bc8f86f9e345e6950eb0"} Mar 11 02:36:30 crc kubenswrapper[4744]: I0311 02:36:30.592000 4744 generic.go:334] "Generic (PLEG): container finished" podID="5d9bfc2e-0f8e-4e89-982b-16d1c7ec5978" containerID="28a718fecd509eb3dd33eb659f91085edb18119e49c6b9aeeaa0426583daeeb9" exitCode=0 Mar 11 02:36:30 crc kubenswrapper[4744]: I0311 02:36:30.592046 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-xl9gl" event={"ID":"5d9bfc2e-0f8e-4e89-982b-16d1c7ec5978","Type":"ContainerDied","Data":"28a718fecd509eb3dd33eb659f91085edb18119e49c6b9aeeaa0426583daeeb9"} Mar 11 02:36:30 crc kubenswrapper[4744]: I0311 02:36:30.592075 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-xl9gl" 
event={"ID":"5d9bfc2e-0f8e-4e89-982b-16d1c7ec5978","Type":"ContainerStarted","Data":"513a80b1dfdda0e9b8c86f62307bec9ee63c2c8d73fddbdcfa6e10fd1973c6f3"} Mar 11 02:36:32 crc kubenswrapper[4744]: I0311 02:36:32.624249 4744 generic.go:334] "Generic (PLEG): container finished" podID="5d9bfc2e-0f8e-4e89-982b-16d1c7ec5978" containerID="d33eb3a2591ca261bd9a103f5ee278754016fbacb973d0c322c9133ec14a79db" exitCode=0 Mar 11 02:36:32 crc kubenswrapper[4744]: I0311 02:36:32.624384 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-xl9gl" event={"ID":"5d9bfc2e-0f8e-4e89-982b-16d1c7ec5978","Type":"ContainerDied","Data":"d33eb3a2591ca261bd9a103f5ee278754016fbacb973d0c322c9133ec14a79db"} Mar 11 02:36:32 crc kubenswrapper[4744]: I0311 02:36:32.628145 4744 generic.go:334] "Generic (PLEG): container finished" podID="fc6175e8-ba3c-43b0-8093-78e9f7d25ae5" containerID="e8056fa0a97080aac7e03ffbc541e2897c161765807742e512ec68610a9707b9" exitCode=0 Mar 11 02:36:32 crc kubenswrapper[4744]: I0311 02:36:32.628209 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-25d8t" event={"ID":"fc6175e8-ba3c-43b0-8093-78e9f7d25ae5","Type":"ContainerDied","Data":"e8056fa0a97080aac7e03ffbc541e2897c161765807742e512ec68610a9707b9"} Mar 11 02:36:33 crc kubenswrapper[4744]: I0311 02:36:33.638558 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-xl9gl" event={"ID":"5d9bfc2e-0f8e-4e89-982b-16d1c7ec5978","Type":"ContainerStarted","Data":"e706d0987d5b76dee43346076b118868cbca3aed2bc98b7ac5fd895f1949088f"} Mar 11 02:36:33 crc kubenswrapper[4744]: I0311 02:36:33.640906 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-25d8t" event={"ID":"fc6175e8-ba3c-43b0-8093-78e9f7d25ae5","Type":"ContainerStarted","Data":"07d891ec38456a7b3eb4c39669848a913aa3843cb102b41959c8e94520efd2e0"} Mar 11 02:36:33 crc kubenswrapper[4744]: I0311 
02:36:33.667294 4744 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-xl9gl" podStartSLOduration=2.226899912 podStartE2EDuration="4.667281208s" podCreationTimestamp="2026-03-11 02:36:29 +0000 UTC" firstStartedPulling="2026-03-11 02:36:30.593727141 +0000 UTC m=+6147.397944756" lastFinishedPulling="2026-03-11 02:36:33.034108407 +0000 UTC m=+6149.838326052" observedRunningTime="2026-03-11 02:36:33.665377809 +0000 UTC m=+6150.469595414" watchObservedRunningTime="2026-03-11 02:36:33.667281208 +0000 UTC m=+6150.471498813" Mar 11 02:36:33 crc kubenswrapper[4744]: I0311 02:36:33.696858 4744 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-25d8t" podStartSLOduration=3.042432789 podStartE2EDuration="5.696831237s" podCreationTimestamp="2026-03-11 02:36:28 +0000 UTC" firstStartedPulling="2026-03-11 02:36:30.589906793 +0000 UTC m=+6147.394124408" lastFinishedPulling="2026-03-11 02:36:33.244305241 +0000 UTC m=+6150.048522856" observedRunningTime="2026-03-11 02:36:33.684403955 +0000 UTC m=+6150.488621560" watchObservedRunningTime="2026-03-11 02:36:33.696831237 +0000 UTC m=+6150.501048862" Mar 11 02:36:34 crc kubenswrapper[4744]: I0311 02:36:34.043005 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-6wd4b" Mar 11 02:36:34 crc kubenswrapper[4744]: I0311 02:36:34.043273 4744 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-6wd4b" Mar 11 02:36:35 crc kubenswrapper[4744]: I0311 02:36:35.098527 4744 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-6wd4b" podUID="39e3dc3b-a6db-41ed-98ca-eb381ead17ec" containerName="registry-server" probeResult="failure" output=< Mar 11 02:36:35 crc kubenswrapper[4744]: timeout: failed to connect service ":50051" within 1s Mar 11 02:36:35 crc 
kubenswrapper[4744]: > Mar 11 02:36:38 crc kubenswrapper[4744]: I0311 02:36:38.842324 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-25d8t" Mar 11 02:36:38 crc kubenswrapper[4744]: I0311 02:36:38.842630 4744 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-25d8t" Mar 11 02:36:38 crc kubenswrapper[4744]: I0311 02:36:38.901097 4744 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-25d8t" Mar 11 02:36:39 crc kubenswrapper[4744]: I0311 02:36:39.781183 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-25d8t" Mar 11 02:36:39 crc kubenswrapper[4744]: I0311 02:36:39.856857 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-xl9gl" Mar 11 02:36:39 crc kubenswrapper[4744]: I0311 02:36:39.856928 4744 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-xl9gl" Mar 11 02:36:39 crc kubenswrapper[4744]: I0311 02:36:39.933674 4744 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-xl9gl" Mar 11 02:36:40 crc kubenswrapper[4744]: I0311 02:36:40.773541 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-xl9gl" Mar 11 02:36:41 crc kubenswrapper[4744]: I0311 02:36:41.288798 4744 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-25d8t"] Mar 11 02:36:41 crc kubenswrapper[4744]: I0311 02:36:41.716960 4744 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-25d8t" podUID="fc6175e8-ba3c-43b0-8093-78e9f7d25ae5" containerName="registry-server" 
containerID="cri-o://07d891ec38456a7b3eb4c39669848a913aa3843cb102b41959c8e94520efd2e0" gracePeriod=2 Mar 11 02:36:42 crc kubenswrapper[4744]: I0311 02:36:42.260975 4744 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-25d8t" Mar 11 02:36:42 crc kubenswrapper[4744]: I0311 02:36:42.275900 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fc6175e8-ba3c-43b0-8093-78e9f7d25ae5-catalog-content\") pod \"fc6175e8-ba3c-43b0-8093-78e9f7d25ae5\" (UID: \"fc6175e8-ba3c-43b0-8093-78e9f7d25ae5\") " Mar 11 02:36:42 crc kubenswrapper[4744]: I0311 02:36:42.276271 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zlg4l\" (UniqueName: \"kubernetes.io/projected/fc6175e8-ba3c-43b0-8093-78e9f7d25ae5-kube-api-access-zlg4l\") pod \"fc6175e8-ba3c-43b0-8093-78e9f7d25ae5\" (UID: \"fc6175e8-ba3c-43b0-8093-78e9f7d25ae5\") " Mar 11 02:36:42 crc kubenswrapper[4744]: I0311 02:36:42.276712 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fc6175e8-ba3c-43b0-8093-78e9f7d25ae5-utilities\") pod \"fc6175e8-ba3c-43b0-8093-78e9f7d25ae5\" (UID: \"fc6175e8-ba3c-43b0-8093-78e9f7d25ae5\") " Mar 11 02:36:42 crc kubenswrapper[4744]: I0311 02:36:42.279717 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fc6175e8-ba3c-43b0-8093-78e9f7d25ae5-utilities" (OuterVolumeSpecName: "utilities") pod "fc6175e8-ba3c-43b0-8093-78e9f7d25ae5" (UID: "fc6175e8-ba3c-43b0-8093-78e9f7d25ae5"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 11 02:36:42 crc kubenswrapper[4744]: I0311 02:36:42.285867 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fc6175e8-ba3c-43b0-8093-78e9f7d25ae5-kube-api-access-zlg4l" (OuterVolumeSpecName: "kube-api-access-zlg4l") pod "fc6175e8-ba3c-43b0-8093-78e9f7d25ae5" (UID: "fc6175e8-ba3c-43b0-8093-78e9f7d25ae5"). InnerVolumeSpecName "kube-api-access-zlg4l". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 02:36:42 crc kubenswrapper[4744]: I0311 02:36:42.295604 4744 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-xl9gl"] Mar 11 02:36:42 crc kubenswrapper[4744]: I0311 02:36:42.349138 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fc6175e8-ba3c-43b0-8093-78e9f7d25ae5-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "fc6175e8-ba3c-43b0-8093-78e9f7d25ae5" (UID: "fc6175e8-ba3c-43b0-8093-78e9f7d25ae5"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 11 02:36:42 crc kubenswrapper[4744]: I0311 02:36:42.379280 4744 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fc6175e8-ba3c-43b0-8093-78e9f7d25ae5-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 11 02:36:42 crc kubenswrapper[4744]: I0311 02:36:42.379307 4744 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zlg4l\" (UniqueName: \"kubernetes.io/projected/fc6175e8-ba3c-43b0-8093-78e9f7d25ae5-kube-api-access-zlg4l\") on node \"crc\" DevicePath \"\"" Mar 11 02:36:42 crc kubenswrapper[4744]: I0311 02:36:42.379317 4744 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fc6175e8-ba3c-43b0-8093-78e9f7d25ae5-utilities\") on node \"crc\" DevicePath \"\"" Mar 11 02:36:42 crc kubenswrapper[4744]: I0311 02:36:42.408720 4744 patch_prober.go:28] interesting pod/machine-config-daemon-678nx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 11 02:36:42 crc kubenswrapper[4744]: I0311 02:36:42.408799 4744 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-678nx" podUID="a15dc7ac-7c34-4135-b6eb-a85122800ce9" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 11 02:36:42 crc kubenswrapper[4744]: I0311 02:36:42.729995 4744 generic.go:334] "Generic (PLEG): container finished" podID="fc6175e8-ba3c-43b0-8093-78e9f7d25ae5" containerID="07d891ec38456a7b3eb4c39669848a913aa3843cb102b41959c8e94520efd2e0" exitCode=0 Mar 11 02:36:42 crc kubenswrapper[4744]: I0311 02:36:42.730074 4744 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-25d8t" Mar 11 02:36:42 crc kubenswrapper[4744]: I0311 02:36:42.730119 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-25d8t" event={"ID":"fc6175e8-ba3c-43b0-8093-78e9f7d25ae5","Type":"ContainerDied","Data":"07d891ec38456a7b3eb4c39669848a913aa3843cb102b41959c8e94520efd2e0"} Mar 11 02:36:42 crc kubenswrapper[4744]: I0311 02:36:42.730538 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-25d8t" event={"ID":"fc6175e8-ba3c-43b0-8093-78e9f7d25ae5","Type":"ContainerDied","Data":"ad04ec5cda119f58c652cc88499d581f056b2a2b8d5c4bbf4dc856ec82d7d5ad"} Mar 11 02:36:42 crc kubenswrapper[4744]: I0311 02:36:42.730552 4744 scope.go:117] "RemoveContainer" containerID="07d891ec38456a7b3eb4c39669848a913aa3843cb102b41959c8e94520efd2e0" Mar 11 02:36:42 crc kubenswrapper[4744]: I0311 02:36:42.731078 4744 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-xl9gl" podUID="5d9bfc2e-0f8e-4e89-982b-16d1c7ec5978" containerName="registry-server" containerID="cri-o://e706d0987d5b76dee43346076b118868cbca3aed2bc98b7ac5fd895f1949088f" gracePeriod=2 Mar 11 02:36:42 crc kubenswrapper[4744]: I0311 02:36:42.768043 4744 scope.go:117] "RemoveContainer" containerID="e8056fa0a97080aac7e03ffbc541e2897c161765807742e512ec68610a9707b9" Mar 11 02:36:42 crc kubenswrapper[4744]: I0311 02:36:42.784337 4744 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-25d8t"] Mar 11 02:36:42 crc kubenswrapper[4744]: I0311 02:36:42.795959 4744 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-25d8t"] Mar 11 02:36:42 crc kubenswrapper[4744]: I0311 02:36:42.833057 4744 scope.go:117] "RemoveContainer" containerID="445fbdc740f654829827dd8b2f6bf203e5812b4cd7e0bc8f86f9e345e6950eb0" Mar 11 02:36:42 crc 
kubenswrapper[4744]: I0311 02:36:42.920050 4744 scope.go:117] "RemoveContainer" containerID="07d891ec38456a7b3eb4c39669848a913aa3843cb102b41959c8e94520efd2e0" Mar 11 02:36:42 crc kubenswrapper[4744]: E0311 02:36:42.920456 4744 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"07d891ec38456a7b3eb4c39669848a913aa3843cb102b41959c8e94520efd2e0\": container with ID starting with 07d891ec38456a7b3eb4c39669848a913aa3843cb102b41959c8e94520efd2e0 not found: ID does not exist" containerID="07d891ec38456a7b3eb4c39669848a913aa3843cb102b41959c8e94520efd2e0" Mar 11 02:36:42 crc kubenswrapper[4744]: I0311 02:36:42.920493 4744 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"07d891ec38456a7b3eb4c39669848a913aa3843cb102b41959c8e94520efd2e0"} err="failed to get container status \"07d891ec38456a7b3eb4c39669848a913aa3843cb102b41959c8e94520efd2e0\": rpc error: code = NotFound desc = could not find container \"07d891ec38456a7b3eb4c39669848a913aa3843cb102b41959c8e94520efd2e0\": container with ID starting with 07d891ec38456a7b3eb4c39669848a913aa3843cb102b41959c8e94520efd2e0 not found: ID does not exist" Mar 11 02:36:42 crc kubenswrapper[4744]: I0311 02:36:42.920539 4744 scope.go:117] "RemoveContainer" containerID="e8056fa0a97080aac7e03ffbc541e2897c161765807742e512ec68610a9707b9" Mar 11 02:36:42 crc kubenswrapper[4744]: E0311 02:36:42.920759 4744 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e8056fa0a97080aac7e03ffbc541e2897c161765807742e512ec68610a9707b9\": container with ID starting with e8056fa0a97080aac7e03ffbc541e2897c161765807742e512ec68610a9707b9 not found: ID does not exist" containerID="e8056fa0a97080aac7e03ffbc541e2897c161765807742e512ec68610a9707b9" Mar 11 02:36:42 crc kubenswrapper[4744]: I0311 02:36:42.920785 4744 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"e8056fa0a97080aac7e03ffbc541e2897c161765807742e512ec68610a9707b9"} err="failed to get container status \"e8056fa0a97080aac7e03ffbc541e2897c161765807742e512ec68610a9707b9\": rpc error: code = NotFound desc = could not find container \"e8056fa0a97080aac7e03ffbc541e2897c161765807742e512ec68610a9707b9\": container with ID starting with e8056fa0a97080aac7e03ffbc541e2897c161765807742e512ec68610a9707b9 not found: ID does not exist" Mar 11 02:36:42 crc kubenswrapper[4744]: I0311 02:36:42.920802 4744 scope.go:117] "RemoveContainer" containerID="445fbdc740f654829827dd8b2f6bf203e5812b4cd7e0bc8f86f9e345e6950eb0" Mar 11 02:36:42 crc kubenswrapper[4744]: E0311 02:36:42.921041 4744 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"445fbdc740f654829827dd8b2f6bf203e5812b4cd7e0bc8f86f9e345e6950eb0\": container with ID starting with 445fbdc740f654829827dd8b2f6bf203e5812b4cd7e0bc8f86f9e345e6950eb0 not found: ID does not exist" containerID="445fbdc740f654829827dd8b2f6bf203e5812b4cd7e0bc8f86f9e345e6950eb0" Mar 11 02:36:42 crc kubenswrapper[4744]: I0311 02:36:42.921063 4744 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"445fbdc740f654829827dd8b2f6bf203e5812b4cd7e0bc8f86f9e345e6950eb0"} err="failed to get container status \"445fbdc740f654829827dd8b2f6bf203e5812b4cd7e0bc8f86f9e345e6950eb0\": rpc error: code = NotFound desc = could not find container \"445fbdc740f654829827dd8b2f6bf203e5812b4cd7e0bc8f86f9e345e6950eb0\": container with ID starting with 445fbdc740f654829827dd8b2f6bf203e5812b4cd7e0bc8f86f9e345e6950eb0 not found: ID does not exist" Mar 11 02:36:43 crc kubenswrapper[4744]: I0311 02:36:43.257713 4744 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-xl9gl" Mar 11 02:36:43 crc kubenswrapper[4744]: I0311 02:36:43.300373 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5d9bfc2e-0f8e-4e89-982b-16d1c7ec5978-utilities\") pod \"5d9bfc2e-0f8e-4e89-982b-16d1c7ec5978\" (UID: \"5d9bfc2e-0f8e-4e89-982b-16d1c7ec5978\") " Mar 11 02:36:43 crc kubenswrapper[4744]: I0311 02:36:43.300544 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-242hv\" (UniqueName: \"kubernetes.io/projected/5d9bfc2e-0f8e-4e89-982b-16d1c7ec5978-kube-api-access-242hv\") pod \"5d9bfc2e-0f8e-4e89-982b-16d1c7ec5978\" (UID: \"5d9bfc2e-0f8e-4e89-982b-16d1c7ec5978\") " Mar 11 02:36:43 crc kubenswrapper[4744]: I0311 02:36:43.300617 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5d9bfc2e-0f8e-4e89-982b-16d1c7ec5978-catalog-content\") pod \"5d9bfc2e-0f8e-4e89-982b-16d1c7ec5978\" (UID: \"5d9bfc2e-0f8e-4e89-982b-16d1c7ec5978\") " Mar 11 02:36:43 crc kubenswrapper[4744]: I0311 02:36:43.301275 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5d9bfc2e-0f8e-4e89-982b-16d1c7ec5978-utilities" (OuterVolumeSpecName: "utilities") pod "5d9bfc2e-0f8e-4e89-982b-16d1c7ec5978" (UID: "5d9bfc2e-0f8e-4e89-982b-16d1c7ec5978"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 11 02:36:43 crc kubenswrapper[4744]: I0311 02:36:43.307433 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5d9bfc2e-0f8e-4e89-982b-16d1c7ec5978-kube-api-access-242hv" (OuterVolumeSpecName: "kube-api-access-242hv") pod "5d9bfc2e-0f8e-4e89-982b-16d1c7ec5978" (UID: "5d9bfc2e-0f8e-4e89-982b-16d1c7ec5978"). InnerVolumeSpecName "kube-api-access-242hv". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 02:36:43 crc kubenswrapper[4744]: I0311 02:36:43.326445 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5d9bfc2e-0f8e-4e89-982b-16d1c7ec5978-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "5d9bfc2e-0f8e-4e89-982b-16d1c7ec5978" (UID: "5d9bfc2e-0f8e-4e89-982b-16d1c7ec5978"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 11 02:36:43 crc kubenswrapper[4744]: I0311 02:36:43.401615 4744 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5d9bfc2e-0f8e-4e89-982b-16d1c7ec5978-utilities\") on node \"crc\" DevicePath \"\"" Mar 11 02:36:43 crc kubenswrapper[4744]: I0311 02:36:43.401652 4744 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-242hv\" (UniqueName: \"kubernetes.io/projected/5d9bfc2e-0f8e-4e89-982b-16d1c7ec5978-kube-api-access-242hv\") on node \"crc\" DevicePath \"\"" Mar 11 02:36:43 crc kubenswrapper[4744]: I0311 02:36:43.401663 4744 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5d9bfc2e-0f8e-4e89-982b-16d1c7ec5978-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 11 02:36:43 crc kubenswrapper[4744]: I0311 02:36:43.746139 4744 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-xl9gl" Mar 11 02:36:43 crc kubenswrapper[4744]: I0311 02:36:43.746124 4744 generic.go:334] "Generic (PLEG): container finished" podID="5d9bfc2e-0f8e-4e89-982b-16d1c7ec5978" containerID="e706d0987d5b76dee43346076b118868cbca3aed2bc98b7ac5fd895f1949088f" exitCode=0 Mar 11 02:36:43 crc kubenswrapper[4744]: I0311 02:36:43.746173 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-xl9gl" event={"ID":"5d9bfc2e-0f8e-4e89-982b-16d1c7ec5978","Type":"ContainerDied","Data":"e706d0987d5b76dee43346076b118868cbca3aed2bc98b7ac5fd895f1949088f"} Mar 11 02:36:43 crc kubenswrapper[4744]: I0311 02:36:43.747211 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-xl9gl" event={"ID":"5d9bfc2e-0f8e-4e89-982b-16d1c7ec5978","Type":"ContainerDied","Data":"513a80b1dfdda0e9b8c86f62307bec9ee63c2c8d73fddbdcfa6e10fd1973c6f3"} Mar 11 02:36:43 crc kubenswrapper[4744]: I0311 02:36:43.747262 4744 scope.go:117] "RemoveContainer" containerID="e706d0987d5b76dee43346076b118868cbca3aed2bc98b7ac5fd895f1949088f" Mar 11 02:36:43 crc kubenswrapper[4744]: I0311 02:36:43.787735 4744 scope.go:117] "RemoveContainer" containerID="d33eb3a2591ca261bd9a103f5ee278754016fbacb973d0c322c9133ec14a79db" Mar 11 02:36:43 crc kubenswrapper[4744]: I0311 02:36:43.796313 4744 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-xl9gl"] Mar 11 02:36:43 crc kubenswrapper[4744]: I0311 02:36:43.801869 4744 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-xl9gl"] Mar 11 02:36:43 crc kubenswrapper[4744]: I0311 02:36:43.849486 4744 scope.go:117] "RemoveContainer" containerID="28a718fecd509eb3dd33eb659f91085edb18119e49c6b9aeeaa0426583daeeb9" Mar 11 02:36:43 crc kubenswrapper[4744]: I0311 02:36:43.885847 4744 scope.go:117] "RemoveContainer" 
containerID="e706d0987d5b76dee43346076b118868cbca3aed2bc98b7ac5fd895f1949088f" Mar 11 02:36:43 crc kubenswrapper[4744]: E0311 02:36:43.886598 4744 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e706d0987d5b76dee43346076b118868cbca3aed2bc98b7ac5fd895f1949088f\": container with ID starting with e706d0987d5b76dee43346076b118868cbca3aed2bc98b7ac5fd895f1949088f not found: ID does not exist" containerID="e706d0987d5b76dee43346076b118868cbca3aed2bc98b7ac5fd895f1949088f" Mar 11 02:36:43 crc kubenswrapper[4744]: I0311 02:36:43.886697 4744 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e706d0987d5b76dee43346076b118868cbca3aed2bc98b7ac5fd895f1949088f"} err="failed to get container status \"e706d0987d5b76dee43346076b118868cbca3aed2bc98b7ac5fd895f1949088f\": rpc error: code = NotFound desc = could not find container \"e706d0987d5b76dee43346076b118868cbca3aed2bc98b7ac5fd895f1949088f\": container with ID starting with e706d0987d5b76dee43346076b118868cbca3aed2bc98b7ac5fd895f1949088f not found: ID does not exist" Mar 11 02:36:43 crc kubenswrapper[4744]: I0311 02:36:43.886739 4744 scope.go:117] "RemoveContainer" containerID="d33eb3a2591ca261bd9a103f5ee278754016fbacb973d0c322c9133ec14a79db" Mar 11 02:36:43 crc kubenswrapper[4744]: E0311 02:36:43.887334 4744 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d33eb3a2591ca261bd9a103f5ee278754016fbacb973d0c322c9133ec14a79db\": container with ID starting with d33eb3a2591ca261bd9a103f5ee278754016fbacb973d0c322c9133ec14a79db not found: ID does not exist" containerID="d33eb3a2591ca261bd9a103f5ee278754016fbacb973d0c322c9133ec14a79db" Mar 11 02:36:43 crc kubenswrapper[4744]: I0311 02:36:43.887390 4744 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"d33eb3a2591ca261bd9a103f5ee278754016fbacb973d0c322c9133ec14a79db"} err="failed to get container status \"d33eb3a2591ca261bd9a103f5ee278754016fbacb973d0c322c9133ec14a79db\": rpc error: code = NotFound desc = could not find container \"d33eb3a2591ca261bd9a103f5ee278754016fbacb973d0c322c9133ec14a79db\": container with ID starting with d33eb3a2591ca261bd9a103f5ee278754016fbacb973d0c322c9133ec14a79db not found: ID does not exist" Mar 11 02:36:43 crc kubenswrapper[4744]: I0311 02:36:43.887424 4744 scope.go:117] "RemoveContainer" containerID="28a718fecd509eb3dd33eb659f91085edb18119e49c6b9aeeaa0426583daeeb9" Mar 11 02:36:43 crc kubenswrapper[4744]: E0311 02:36:43.887913 4744 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"28a718fecd509eb3dd33eb659f91085edb18119e49c6b9aeeaa0426583daeeb9\": container with ID starting with 28a718fecd509eb3dd33eb659f91085edb18119e49c6b9aeeaa0426583daeeb9 not found: ID does not exist" containerID="28a718fecd509eb3dd33eb659f91085edb18119e49c6b9aeeaa0426583daeeb9" Mar 11 02:36:43 crc kubenswrapper[4744]: I0311 02:36:43.888028 4744 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"28a718fecd509eb3dd33eb659f91085edb18119e49c6b9aeeaa0426583daeeb9"} err="failed to get container status \"28a718fecd509eb3dd33eb659f91085edb18119e49c6b9aeeaa0426583daeeb9\": rpc error: code = NotFound desc = could not find container \"28a718fecd509eb3dd33eb659f91085edb18119e49c6b9aeeaa0426583daeeb9\": container with ID starting with 28a718fecd509eb3dd33eb659f91085edb18119e49c6b9aeeaa0426583daeeb9 not found: ID does not exist" Mar 11 02:36:43 crc kubenswrapper[4744]: I0311 02:36:43.986766 4744 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5d9bfc2e-0f8e-4e89-982b-16d1c7ec5978" path="/var/lib/kubelet/pods/5d9bfc2e-0f8e-4e89-982b-16d1c7ec5978/volumes" Mar 11 02:36:43 crc kubenswrapper[4744]: I0311 
02:36:43.987536 4744 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fc6175e8-ba3c-43b0-8093-78e9f7d25ae5" path="/var/lib/kubelet/pods/fc6175e8-ba3c-43b0-8093-78e9f7d25ae5/volumes" Mar 11 02:36:44 crc kubenswrapper[4744]: I0311 02:36:44.116725 4744 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-6wd4b" Mar 11 02:36:44 crc kubenswrapper[4744]: I0311 02:36:44.200733 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-6wd4b" Mar 11 02:36:46 crc kubenswrapper[4744]: I0311 02:36:46.689307 4744 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-6wd4b"] Mar 11 02:36:46 crc kubenswrapper[4744]: I0311 02:36:46.690018 4744 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-6wd4b" podUID="39e3dc3b-a6db-41ed-98ca-eb381ead17ec" containerName="registry-server" containerID="cri-o://24cab1151bdba77c20cccbc577c1047fb79a4af027ce02afd3fb233426e37e65" gracePeriod=2 Mar 11 02:36:47 crc kubenswrapper[4744]: I0311 02:36:47.204883 4744 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-6wd4b" Mar 11 02:36:47 crc kubenswrapper[4744]: I0311 02:36:47.406346 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/39e3dc3b-a6db-41ed-98ca-eb381ead17ec-catalog-content\") pod \"39e3dc3b-a6db-41ed-98ca-eb381ead17ec\" (UID: \"39e3dc3b-a6db-41ed-98ca-eb381ead17ec\") " Mar 11 02:36:47 crc kubenswrapper[4744]: I0311 02:36:47.407870 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-f489p\" (UniqueName: \"kubernetes.io/projected/39e3dc3b-a6db-41ed-98ca-eb381ead17ec-kube-api-access-f489p\") pod \"39e3dc3b-a6db-41ed-98ca-eb381ead17ec\" (UID: \"39e3dc3b-a6db-41ed-98ca-eb381ead17ec\") " Mar 11 02:36:47 crc kubenswrapper[4744]: I0311 02:36:47.408135 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/39e3dc3b-a6db-41ed-98ca-eb381ead17ec-utilities\") pod \"39e3dc3b-a6db-41ed-98ca-eb381ead17ec\" (UID: \"39e3dc3b-a6db-41ed-98ca-eb381ead17ec\") " Mar 11 02:36:47 crc kubenswrapper[4744]: I0311 02:36:47.411392 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/39e3dc3b-a6db-41ed-98ca-eb381ead17ec-utilities" (OuterVolumeSpecName: "utilities") pod "39e3dc3b-a6db-41ed-98ca-eb381ead17ec" (UID: "39e3dc3b-a6db-41ed-98ca-eb381ead17ec"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 11 02:36:47 crc kubenswrapper[4744]: I0311 02:36:47.416811 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/39e3dc3b-a6db-41ed-98ca-eb381ead17ec-kube-api-access-f489p" (OuterVolumeSpecName: "kube-api-access-f489p") pod "39e3dc3b-a6db-41ed-98ca-eb381ead17ec" (UID: "39e3dc3b-a6db-41ed-98ca-eb381ead17ec"). InnerVolumeSpecName "kube-api-access-f489p". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 02:36:47 crc kubenswrapper[4744]: I0311 02:36:47.510253 4744 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-f489p\" (UniqueName: \"kubernetes.io/projected/39e3dc3b-a6db-41ed-98ca-eb381ead17ec-kube-api-access-f489p\") on node \"crc\" DevicePath \"\"" Mar 11 02:36:47 crc kubenswrapper[4744]: I0311 02:36:47.510309 4744 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/39e3dc3b-a6db-41ed-98ca-eb381ead17ec-utilities\") on node \"crc\" DevicePath \"\"" Mar 11 02:36:47 crc kubenswrapper[4744]: I0311 02:36:47.613180 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/39e3dc3b-a6db-41ed-98ca-eb381ead17ec-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "39e3dc3b-a6db-41ed-98ca-eb381ead17ec" (UID: "39e3dc3b-a6db-41ed-98ca-eb381ead17ec"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 11 02:36:47 crc kubenswrapper[4744]: I0311 02:36:47.714090 4744 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/39e3dc3b-a6db-41ed-98ca-eb381ead17ec-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 11 02:36:47 crc kubenswrapper[4744]: I0311 02:36:47.798759 4744 generic.go:334] "Generic (PLEG): container finished" podID="39e3dc3b-a6db-41ed-98ca-eb381ead17ec" containerID="24cab1151bdba77c20cccbc577c1047fb79a4af027ce02afd3fb233426e37e65" exitCode=0 Mar 11 02:36:47 crc kubenswrapper[4744]: I0311 02:36:47.798811 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-6wd4b" event={"ID":"39e3dc3b-a6db-41ed-98ca-eb381ead17ec","Type":"ContainerDied","Data":"24cab1151bdba77c20cccbc577c1047fb79a4af027ce02afd3fb233426e37e65"} Mar 11 02:36:47 crc kubenswrapper[4744]: I0311 02:36:47.798891 4744 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openshift-marketplace/redhat-operators-6wd4b" event={"ID":"39e3dc3b-a6db-41ed-98ca-eb381ead17ec","Type":"ContainerDied","Data":"793a90cf1ba9f4b0729491163278fff2c7d341974ac6a066d4b63dba2e067c8a"} Mar 11 02:36:47 crc kubenswrapper[4744]: I0311 02:36:47.798889 4744 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-6wd4b" Mar 11 02:36:47 crc kubenswrapper[4744]: I0311 02:36:47.798920 4744 scope.go:117] "RemoveContainer" containerID="24cab1151bdba77c20cccbc577c1047fb79a4af027ce02afd3fb233426e37e65" Mar 11 02:36:47 crc kubenswrapper[4744]: I0311 02:36:47.834376 4744 scope.go:117] "RemoveContainer" containerID="1d726c244b4a5f967bede3007287fcb39202f954516f06e3f02ffa6407ff142e" Mar 11 02:36:47 crc kubenswrapper[4744]: I0311 02:36:47.850903 4744 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-6wd4b"] Mar 11 02:36:47 crc kubenswrapper[4744]: I0311 02:36:47.857698 4744 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-6wd4b"] Mar 11 02:36:47 crc kubenswrapper[4744]: I0311 02:36:47.870647 4744 scope.go:117] "RemoveContainer" containerID="3ea67a89d822531e67a43f66fe05980f38d3770e4b629c9029bd4fd457c537ab" Mar 11 02:36:47 crc kubenswrapper[4744]: I0311 02:36:47.913716 4744 scope.go:117] "RemoveContainer" containerID="24cab1151bdba77c20cccbc577c1047fb79a4af027ce02afd3fb233426e37e65" Mar 11 02:36:47 crc kubenswrapper[4744]: E0311 02:36:47.914398 4744 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"24cab1151bdba77c20cccbc577c1047fb79a4af027ce02afd3fb233426e37e65\": container with ID starting with 24cab1151bdba77c20cccbc577c1047fb79a4af027ce02afd3fb233426e37e65 not found: ID does not exist" containerID="24cab1151bdba77c20cccbc577c1047fb79a4af027ce02afd3fb233426e37e65" Mar 11 02:36:47 crc kubenswrapper[4744]: I0311 02:36:47.914436 4744 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"24cab1151bdba77c20cccbc577c1047fb79a4af027ce02afd3fb233426e37e65"} err="failed to get container status \"24cab1151bdba77c20cccbc577c1047fb79a4af027ce02afd3fb233426e37e65\": rpc error: code = NotFound desc = could not find container \"24cab1151bdba77c20cccbc577c1047fb79a4af027ce02afd3fb233426e37e65\": container with ID starting with 24cab1151bdba77c20cccbc577c1047fb79a4af027ce02afd3fb233426e37e65 not found: ID does not exist" Mar 11 02:36:47 crc kubenswrapper[4744]: I0311 02:36:47.914461 4744 scope.go:117] "RemoveContainer" containerID="1d726c244b4a5f967bede3007287fcb39202f954516f06e3f02ffa6407ff142e" Mar 11 02:36:47 crc kubenswrapper[4744]: E0311 02:36:47.914851 4744 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1d726c244b4a5f967bede3007287fcb39202f954516f06e3f02ffa6407ff142e\": container with ID starting with 1d726c244b4a5f967bede3007287fcb39202f954516f06e3f02ffa6407ff142e not found: ID does not exist" containerID="1d726c244b4a5f967bede3007287fcb39202f954516f06e3f02ffa6407ff142e" Mar 11 02:36:47 crc kubenswrapper[4744]: I0311 02:36:47.914882 4744 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1d726c244b4a5f967bede3007287fcb39202f954516f06e3f02ffa6407ff142e"} err="failed to get container status \"1d726c244b4a5f967bede3007287fcb39202f954516f06e3f02ffa6407ff142e\": rpc error: code = NotFound desc = could not find container \"1d726c244b4a5f967bede3007287fcb39202f954516f06e3f02ffa6407ff142e\": container with ID starting with 1d726c244b4a5f967bede3007287fcb39202f954516f06e3f02ffa6407ff142e not found: ID does not exist" Mar 11 02:36:47 crc kubenswrapper[4744]: I0311 02:36:47.914899 4744 scope.go:117] "RemoveContainer" containerID="3ea67a89d822531e67a43f66fe05980f38d3770e4b629c9029bd4fd457c537ab" Mar 11 02:36:47 crc kubenswrapper[4744]: E0311 
02:36:47.915429 4744 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3ea67a89d822531e67a43f66fe05980f38d3770e4b629c9029bd4fd457c537ab\": container with ID starting with 3ea67a89d822531e67a43f66fe05980f38d3770e4b629c9029bd4fd457c537ab not found: ID does not exist" containerID="3ea67a89d822531e67a43f66fe05980f38d3770e4b629c9029bd4fd457c537ab" Mar 11 02:36:47 crc kubenswrapper[4744]: I0311 02:36:47.915465 4744 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3ea67a89d822531e67a43f66fe05980f38d3770e4b629c9029bd4fd457c537ab"} err="failed to get container status \"3ea67a89d822531e67a43f66fe05980f38d3770e4b629c9029bd4fd457c537ab\": rpc error: code = NotFound desc = could not find container \"3ea67a89d822531e67a43f66fe05980f38d3770e4b629c9029bd4fd457c537ab\": container with ID starting with 3ea67a89d822531e67a43f66fe05980f38d3770e4b629c9029bd4fd457c537ab not found: ID does not exist" Mar 11 02:36:47 crc kubenswrapper[4744]: I0311 02:36:47.987656 4744 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="39e3dc3b-a6db-41ed-98ca-eb381ead17ec" path="/var/lib/kubelet/pods/39e3dc3b-a6db-41ed-98ca-eb381ead17ec/volumes" Mar 11 02:37:12 crc kubenswrapper[4744]: I0311 02:37:12.409299 4744 patch_prober.go:28] interesting pod/machine-config-daemon-678nx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 11 02:37:12 crc kubenswrapper[4744]: I0311 02:37:12.409840 4744 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-678nx" podUID="a15dc7ac-7c34-4135-b6eb-a85122800ce9" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection 
refused" Mar 11 02:37:42 crc kubenswrapper[4744]: I0311 02:37:42.409451 4744 patch_prober.go:28] interesting pod/machine-config-daemon-678nx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 11 02:37:42 crc kubenswrapper[4744]: I0311 02:37:42.410185 4744 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-678nx" podUID="a15dc7ac-7c34-4135-b6eb-a85122800ce9" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 11 02:37:42 crc kubenswrapper[4744]: I0311 02:37:42.410244 4744 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-678nx" Mar 11 02:37:42 crc kubenswrapper[4744]: I0311 02:37:42.410957 4744 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"cd8c1772eb249dc13a76fd8580fc37b972dc3bd9cbe7efcd80c33ed6959bc79f"} pod="openshift-machine-config-operator/machine-config-daemon-678nx" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 11 02:37:42 crc kubenswrapper[4744]: I0311 02:37:42.411044 4744 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-678nx" podUID="a15dc7ac-7c34-4135-b6eb-a85122800ce9" containerName="machine-config-daemon" containerID="cri-o://cd8c1772eb249dc13a76fd8580fc37b972dc3bd9cbe7efcd80c33ed6959bc79f" gracePeriod=600 Mar 11 02:37:43 crc kubenswrapper[4744]: I0311 02:37:43.389606 4744 generic.go:334] "Generic (PLEG): container finished" podID="a15dc7ac-7c34-4135-b6eb-a85122800ce9" 
containerID="cd8c1772eb249dc13a76fd8580fc37b972dc3bd9cbe7efcd80c33ed6959bc79f" exitCode=0 Mar 11 02:37:43 crc kubenswrapper[4744]: I0311 02:37:43.389683 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-678nx" event={"ID":"a15dc7ac-7c34-4135-b6eb-a85122800ce9","Type":"ContainerDied","Data":"cd8c1772eb249dc13a76fd8580fc37b972dc3bd9cbe7efcd80c33ed6959bc79f"} Mar 11 02:37:43 crc kubenswrapper[4744]: I0311 02:37:43.390353 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-678nx" event={"ID":"a15dc7ac-7c34-4135-b6eb-a85122800ce9","Type":"ContainerStarted","Data":"7b8ee7052cfbcac075f3aaf8e56a0a7732d17a49604ba786945b064164a25ca7"} Mar 11 02:37:43 crc kubenswrapper[4744]: I0311 02:37:43.390386 4744 scope.go:117] "RemoveContainer" containerID="0c6d2088731104df66ba686648eb351061ccd27cb0c0c1c48cf7ba6629e29690" Mar 11 02:38:00 crc kubenswrapper[4744]: I0311 02:38:00.223384 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29553278-6hcqx"] Mar 11 02:38:00 crc kubenswrapper[4744]: E0311 02:38:00.224719 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="39e3dc3b-a6db-41ed-98ca-eb381ead17ec" containerName="extract-content" Mar 11 02:38:00 crc kubenswrapper[4744]: I0311 02:38:00.224743 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="39e3dc3b-a6db-41ed-98ca-eb381ead17ec" containerName="extract-content" Mar 11 02:38:00 crc kubenswrapper[4744]: E0311 02:38:00.224773 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fc6175e8-ba3c-43b0-8093-78e9f7d25ae5" containerName="extract-utilities" Mar 11 02:38:00 crc kubenswrapper[4744]: I0311 02:38:00.224788 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="fc6175e8-ba3c-43b0-8093-78e9f7d25ae5" containerName="extract-utilities" Mar 11 02:38:00 crc kubenswrapper[4744]: E0311 02:38:00.224815 4744 cpu_manager.go:410] 
"RemoveStaleState: removing container" podUID="5d9bfc2e-0f8e-4e89-982b-16d1c7ec5978" containerName="registry-server" Mar 11 02:38:00 crc kubenswrapper[4744]: I0311 02:38:00.224829 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="5d9bfc2e-0f8e-4e89-982b-16d1c7ec5978" containerName="registry-server" Mar 11 02:38:00 crc kubenswrapper[4744]: E0311 02:38:00.224847 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fc6175e8-ba3c-43b0-8093-78e9f7d25ae5" containerName="extract-content" Mar 11 02:38:00 crc kubenswrapper[4744]: I0311 02:38:00.224861 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="fc6175e8-ba3c-43b0-8093-78e9f7d25ae5" containerName="extract-content" Mar 11 02:38:00 crc kubenswrapper[4744]: E0311 02:38:00.224879 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="39e3dc3b-a6db-41ed-98ca-eb381ead17ec" containerName="registry-server" Mar 11 02:38:00 crc kubenswrapper[4744]: I0311 02:38:00.224891 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="39e3dc3b-a6db-41ed-98ca-eb381ead17ec" containerName="registry-server" Mar 11 02:38:00 crc kubenswrapper[4744]: E0311 02:38:00.224910 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="39e3dc3b-a6db-41ed-98ca-eb381ead17ec" containerName="extract-utilities" Mar 11 02:38:00 crc kubenswrapper[4744]: I0311 02:38:00.224922 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="39e3dc3b-a6db-41ed-98ca-eb381ead17ec" containerName="extract-utilities" Mar 11 02:38:00 crc kubenswrapper[4744]: E0311 02:38:00.224947 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5d9bfc2e-0f8e-4e89-982b-16d1c7ec5978" containerName="extract-utilities" Mar 11 02:38:00 crc kubenswrapper[4744]: I0311 02:38:00.224959 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="5d9bfc2e-0f8e-4e89-982b-16d1c7ec5978" containerName="extract-utilities" Mar 11 02:38:00 crc kubenswrapper[4744]: E0311 02:38:00.224978 4744 cpu_manager.go:410] 
"RemoveStaleState: removing container" podUID="fc6175e8-ba3c-43b0-8093-78e9f7d25ae5" containerName="registry-server" Mar 11 02:38:00 crc kubenswrapper[4744]: I0311 02:38:00.224990 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="fc6175e8-ba3c-43b0-8093-78e9f7d25ae5" containerName="registry-server" Mar 11 02:38:00 crc kubenswrapper[4744]: E0311 02:38:00.225010 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5d9bfc2e-0f8e-4e89-982b-16d1c7ec5978" containerName="extract-content" Mar 11 02:38:00 crc kubenswrapper[4744]: I0311 02:38:00.225021 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="5d9bfc2e-0f8e-4e89-982b-16d1c7ec5978" containerName="extract-content" Mar 11 02:38:00 crc kubenswrapper[4744]: I0311 02:38:00.225302 4744 memory_manager.go:354] "RemoveStaleState removing state" podUID="fc6175e8-ba3c-43b0-8093-78e9f7d25ae5" containerName="registry-server" Mar 11 02:38:00 crc kubenswrapper[4744]: I0311 02:38:00.225332 4744 memory_manager.go:354] "RemoveStaleState removing state" podUID="39e3dc3b-a6db-41ed-98ca-eb381ead17ec" containerName="registry-server" Mar 11 02:38:00 crc kubenswrapper[4744]: I0311 02:38:00.225365 4744 memory_manager.go:354] "RemoveStaleState removing state" podUID="5d9bfc2e-0f8e-4e89-982b-16d1c7ec5978" containerName="registry-server" Mar 11 02:38:00 crc kubenswrapper[4744]: I0311 02:38:00.226419 4744 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29553278-6hcqx" Mar 11 02:38:00 crc kubenswrapper[4744]: I0311 02:38:00.228987 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-rs77s" Mar 11 02:38:00 crc kubenswrapper[4744]: I0311 02:38:00.231353 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 11 02:38:00 crc kubenswrapper[4744]: I0311 02:38:00.231559 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 11 02:38:00 crc kubenswrapper[4744]: I0311 02:38:00.232926 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29553278-6hcqx"] Mar 11 02:38:00 crc kubenswrapper[4744]: I0311 02:38:00.302003 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dfkkd\" (UniqueName: \"kubernetes.io/projected/2438c214-5451-4ee6-b516-801897f7afc3-kube-api-access-dfkkd\") pod \"auto-csr-approver-29553278-6hcqx\" (UID: \"2438c214-5451-4ee6-b516-801897f7afc3\") " pod="openshift-infra/auto-csr-approver-29553278-6hcqx" Mar 11 02:38:00 crc kubenswrapper[4744]: I0311 02:38:00.403995 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dfkkd\" (UniqueName: \"kubernetes.io/projected/2438c214-5451-4ee6-b516-801897f7afc3-kube-api-access-dfkkd\") pod \"auto-csr-approver-29553278-6hcqx\" (UID: \"2438c214-5451-4ee6-b516-801897f7afc3\") " pod="openshift-infra/auto-csr-approver-29553278-6hcqx" Mar 11 02:38:00 crc kubenswrapper[4744]: I0311 02:38:00.445684 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dfkkd\" (UniqueName: \"kubernetes.io/projected/2438c214-5451-4ee6-b516-801897f7afc3-kube-api-access-dfkkd\") pod \"auto-csr-approver-29553278-6hcqx\" (UID: \"2438c214-5451-4ee6-b516-801897f7afc3\") " 
pod="openshift-infra/auto-csr-approver-29553278-6hcqx" Mar 11 02:38:00 crc kubenswrapper[4744]: I0311 02:38:00.591217 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29553278-6hcqx" Mar 11 02:38:00 crc kubenswrapper[4744]: I0311 02:38:00.906337 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29553278-6hcqx"] Mar 11 02:38:00 crc kubenswrapper[4744]: W0311 02:38:00.910274 4744 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2438c214_5451_4ee6_b516_801897f7afc3.slice/crio-40e3800b2c277217b1809e472461dc416719c0e0495c21068922e4b8d045d217 WatchSource:0}: Error finding container 40e3800b2c277217b1809e472461dc416719c0e0495c21068922e4b8d045d217: Status 404 returned error can't find the container with id 40e3800b2c277217b1809e472461dc416719c0e0495c21068922e4b8d045d217 Mar 11 02:38:00 crc kubenswrapper[4744]: I0311 02:38:00.914937 4744 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 11 02:38:01 crc kubenswrapper[4744]: I0311 02:38:01.570160 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29553278-6hcqx" event={"ID":"2438c214-5451-4ee6-b516-801897f7afc3","Type":"ContainerStarted","Data":"40e3800b2c277217b1809e472461dc416719c0e0495c21068922e4b8d045d217"} Mar 11 02:38:02 crc kubenswrapper[4744]: I0311 02:38:02.587769 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29553278-6hcqx" event={"ID":"2438c214-5451-4ee6-b516-801897f7afc3","Type":"ContainerStarted","Data":"3d9a88545c4ec5eb96d01c5266e0505f2cae6f7420ea65516225785398fa1d80"} Mar 11 02:38:02 crc kubenswrapper[4744]: I0311 02:38:02.623117 4744 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29553278-6hcqx" 
podStartSLOduration=1.587819418 podStartE2EDuration="2.623090713s" podCreationTimestamp="2026-03-11 02:38:00 +0000 UTC" firstStartedPulling="2026-03-11 02:38:00.914661257 +0000 UTC m=+6237.718878872" lastFinishedPulling="2026-03-11 02:38:01.949932532 +0000 UTC m=+6238.754150167" observedRunningTime="2026-03-11 02:38:02.612466786 +0000 UTC m=+6239.416684401" watchObservedRunningTime="2026-03-11 02:38:02.623090713 +0000 UTC m=+6239.427308338" Mar 11 02:38:03 crc kubenswrapper[4744]: I0311 02:38:03.602901 4744 generic.go:334] "Generic (PLEG): container finished" podID="2438c214-5451-4ee6-b516-801897f7afc3" containerID="3d9a88545c4ec5eb96d01c5266e0505f2cae6f7420ea65516225785398fa1d80" exitCode=0 Mar 11 02:38:03 crc kubenswrapper[4744]: I0311 02:38:03.602960 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29553278-6hcqx" event={"ID":"2438c214-5451-4ee6-b516-801897f7afc3","Type":"ContainerDied","Data":"3d9a88545c4ec5eb96d01c5266e0505f2cae6f7420ea65516225785398fa1d80"} Mar 11 02:38:05 crc kubenswrapper[4744]: I0311 02:38:05.095722 4744 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29553278-6hcqx" Mar 11 02:38:05 crc kubenswrapper[4744]: I0311 02:38:05.208004 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dfkkd\" (UniqueName: \"kubernetes.io/projected/2438c214-5451-4ee6-b516-801897f7afc3-kube-api-access-dfkkd\") pod \"2438c214-5451-4ee6-b516-801897f7afc3\" (UID: \"2438c214-5451-4ee6-b516-801897f7afc3\") " Mar 11 02:38:05 crc kubenswrapper[4744]: I0311 02:38:05.222137 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2438c214-5451-4ee6-b516-801897f7afc3-kube-api-access-dfkkd" (OuterVolumeSpecName: "kube-api-access-dfkkd") pod "2438c214-5451-4ee6-b516-801897f7afc3" (UID: "2438c214-5451-4ee6-b516-801897f7afc3"). InnerVolumeSpecName "kube-api-access-dfkkd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 02:38:05 crc kubenswrapper[4744]: I0311 02:38:05.310451 4744 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dfkkd\" (UniqueName: \"kubernetes.io/projected/2438c214-5451-4ee6-b516-801897f7afc3-kube-api-access-dfkkd\") on node \"crc\" DevicePath \"\"" Mar 11 02:38:05 crc kubenswrapper[4744]: I0311 02:38:05.630332 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29553278-6hcqx" event={"ID":"2438c214-5451-4ee6-b516-801897f7afc3","Type":"ContainerDied","Data":"40e3800b2c277217b1809e472461dc416719c0e0495c21068922e4b8d045d217"} Mar 11 02:38:05 crc kubenswrapper[4744]: I0311 02:38:05.630368 4744 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29553278-6hcqx" Mar 11 02:38:05 crc kubenswrapper[4744]: I0311 02:38:05.630428 4744 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="40e3800b2c277217b1809e472461dc416719c0e0495c21068922e4b8d045d217" Mar 11 02:38:05 crc kubenswrapper[4744]: I0311 02:38:05.699804 4744 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29553272-8f29r"] Mar 11 02:38:05 crc kubenswrapper[4744]: I0311 02:38:05.709129 4744 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29553272-8f29r"] Mar 11 02:38:05 crc kubenswrapper[4744]: E0311 02:38:05.793178 4744 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2438c214_5451_4ee6_b516_801897f7afc3.slice/crio-40e3800b2c277217b1809e472461dc416719c0e0495c21068922e4b8d045d217\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2438c214_5451_4ee6_b516_801897f7afc3.slice\": RecentStats: unable to find data in memory cache]" 
Mar 11 02:38:06 crc kubenswrapper[4744]: I0311 02:38:06.002821 4744 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="58c3971f-b748-4eb0-a902-c0c87c4e8186" path="/var/lib/kubelet/pods/58c3971f-b748-4eb0-a902-c0c87c4e8186/volumes" Mar 11 02:38:17 crc kubenswrapper[4744]: I0311 02:38:17.995284 4744 scope.go:117] "RemoveContainer" containerID="6fb87143ba994c47523a11bfa21d928076ef019c5d15b0cbc3c8ff9439f7c502" Mar 11 02:38:52 crc kubenswrapper[4744]: I0311 02:38:52.559727 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-ksj8s"] Mar 11 02:38:52 crc kubenswrapper[4744]: E0311 02:38:52.560427 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2438c214-5451-4ee6-b516-801897f7afc3" containerName="oc" Mar 11 02:38:52 crc kubenswrapper[4744]: I0311 02:38:52.560439 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="2438c214-5451-4ee6-b516-801897f7afc3" containerName="oc" Mar 11 02:38:52 crc kubenswrapper[4744]: I0311 02:38:52.560615 4744 memory_manager.go:354] "RemoveStaleState removing state" podUID="2438c214-5451-4ee6-b516-801897f7afc3" containerName="oc" Mar 11 02:38:52 crc kubenswrapper[4744]: I0311 02:38:52.561663 4744 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-ksj8s" Mar 11 02:38:52 crc kubenswrapper[4744]: I0311 02:38:52.581488 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-ksj8s"] Mar 11 02:38:52 crc kubenswrapper[4744]: I0311 02:38:52.675238 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8075cdf9-4ecf-4fbf-9fcd-e96bed3d0ea9-utilities\") pod \"certified-operators-ksj8s\" (UID: \"8075cdf9-4ecf-4fbf-9fcd-e96bed3d0ea9\") " pod="openshift-marketplace/certified-operators-ksj8s" Mar 11 02:38:52 crc kubenswrapper[4744]: I0311 02:38:52.675334 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8075cdf9-4ecf-4fbf-9fcd-e96bed3d0ea9-catalog-content\") pod \"certified-operators-ksj8s\" (UID: \"8075cdf9-4ecf-4fbf-9fcd-e96bed3d0ea9\") " pod="openshift-marketplace/certified-operators-ksj8s" Mar 11 02:38:52 crc kubenswrapper[4744]: I0311 02:38:52.675609 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fvvcm\" (UniqueName: \"kubernetes.io/projected/8075cdf9-4ecf-4fbf-9fcd-e96bed3d0ea9-kube-api-access-fvvcm\") pod \"certified-operators-ksj8s\" (UID: \"8075cdf9-4ecf-4fbf-9fcd-e96bed3d0ea9\") " pod="openshift-marketplace/certified-operators-ksj8s" Mar 11 02:38:52 crc kubenswrapper[4744]: I0311 02:38:52.776971 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fvvcm\" (UniqueName: \"kubernetes.io/projected/8075cdf9-4ecf-4fbf-9fcd-e96bed3d0ea9-kube-api-access-fvvcm\") pod \"certified-operators-ksj8s\" (UID: \"8075cdf9-4ecf-4fbf-9fcd-e96bed3d0ea9\") " pod="openshift-marketplace/certified-operators-ksj8s" Mar 11 02:38:52 crc kubenswrapper[4744]: I0311 02:38:52.777053 4744 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8075cdf9-4ecf-4fbf-9fcd-e96bed3d0ea9-utilities\") pod \"certified-operators-ksj8s\" (UID: \"8075cdf9-4ecf-4fbf-9fcd-e96bed3d0ea9\") " pod="openshift-marketplace/certified-operators-ksj8s" Mar 11 02:38:52 crc kubenswrapper[4744]: I0311 02:38:52.777118 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8075cdf9-4ecf-4fbf-9fcd-e96bed3d0ea9-catalog-content\") pod \"certified-operators-ksj8s\" (UID: \"8075cdf9-4ecf-4fbf-9fcd-e96bed3d0ea9\") " pod="openshift-marketplace/certified-operators-ksj8s" Mar 11 02:38:52 crc kubenswrapper[4744]: I0311 02:38:52.777609 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8075cdf9-4ecf-4fbf-9fcd-e96bed3d0ea9-catalog-content\") pod \"certified-operators-ksj8s\" (UID: \"8075cdf9-4ecf-4fbf-9fcd-e96bed3d0ea9\") " pod="openshift-marketplace/certified-operators-ksj8s" Mar 11 02:38:52 crc kubenswrapper[4744]: I0311 02:38:52.777887 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8075cdf9-4ecf-4fbf-9fcd-e96bed3d0ea9-utilities\") pod \"certified-operators-ksj8s\" (UID: \"8075cdf9-4ecf-4fbf-9fcd-e96bed3d0ea9\") " pod="openshift-marketplace/certified-operators-ksj8s" Mar 11 02:38:52 crc kubenswrapper[4744]: I0311 02:38:52.807782 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fvvcm\" (UniqueName: \"kubernetes.io/projected/8075cdf9-4ecf-4fbf-9fcd-e96bed3d0ea9-kube-api-access-fvvcm\") pod \"certified-operators-ksj8s\" (UID: \"8075cdf9-4ecf-4fbf-9fcd-e96bed3d0ea9\") " pod="openshift-marketplace/certified-operators-ksj8s" Mar 11 02:38:52 crc kubenswrapper[4744]: I0311 02:38:52.878619 4744 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-ksj8s" Mar 11 02:38:53 crc kubenswrapper[4744]: I0311 02:38:53.201792 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-ksj8s"] Mar 11 02:38:54 crc kubenswrapper[4744]: I0311 02:38:54.164769 4744 generic.go:334] "Generic (PLEG): container finished" podID="8075cdf9-4ecf-4fbf-9fcd-e96bed3d0ea9" containerID="da1780dca9248382d57a8bdb110d5e57ea414c057217977aa4160777b7c4a3b6" exitCode=0 Mar 11 02:38:54 crc kubenswrapper[4744]: I0311 02:38:54.164863 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-ksj8s" event={"ID":"8075cdf9-4ecf-4fbf-9fcd-e96bed3d0ea9","Type":"ContainerDied","Data":"da1780dca9248382d57a8bdb110d5e57ea414c057217977aa4160777b7c4a3b6"} Mar 11 02:38:54 crc kubenswrapper[4744]: I0311 02:38:54.165162 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-ksj8s" event={"ID":"8075cdf9-4ecf-4fbf-9fcd-e96bed3d0ea9","Type":"ContainerStarted","Data":"02437365e2b39868134beeab20dba77d933ca85df59a24ce42ddbb863eea97b3"} Mar 11 02:38:55 crc kubenswrapper[4744]: I0311 02:38:55.180478 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-ksj8s" event={"ID":"8075cdf9-4ecf-4fbf-9fcd-e96bed3d0ea9","Type":"ContainerStarted","Data":"378ef1addedbf879c0987b8a535e637738766b209955184cd4b8294f1ec15bdf"} Mar 11 02:38:56 crc kubenswrapper[4744]: I0311 02:38:56.194283 4744 generic.go:334] "Generic (PLEG): container finished" podID="8075cdf9-4ecf-4fbf-9fcd-e96bed3d0ea9" containerID="378ef1addedbf879c0987b8a535e637738766b209955184cd4b8294f1ec15bdf" exitCode=0 Mar 11 02:38:56 crc kubenswrapper[4744]: I0311 02:38:56.194347 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-ksj8s" 
event={"ID":"8075cdf9-4ecf-4fbf-9fcd-e96bed3d0ea9","Type":"ContainerDied","Data":"378ef1addedbf879c0987b8a535e637738766b209955184cd4b8294f1ec15bdf"} Mar 11 02:38:57 crc kubenswrapper[4744]: I0311 02:38:57.209373 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-ksj8s" event={"ID":"8075cdf9-4ecf-4fbf-9fcd-e96bed3d0ea9","Type":"ContainerStarted","Data":"0babeac9dfc8a38b25f9d4b082218ac5e7e22479f00d478d01ee21a789372b12"} Mar 11 02:38:57 crc kubenswrapper[4744]: I0311 02:38:57.232858 4744 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-ksj8s" podStartSLOduration=2.752603263 podStartE2EDuration="5.232838467s" podCreationTimestamp="2026-03-11 02:38:52 +0000 UTC" firstStartedPulling="2026-03-11 02:38:54.167560075 +0000 UTC m=+6290.971777710" lastFinishedPulling="2026-03-11 02:38:56.647795269 +0000 UTC m=+6293.452012914" observedRunningTime="2026-03-11 02:38:57.224009345 +0000 UTC m=+6294.028226950" watchObservedRunningTime="2026-03-11 02:38:57.232838467 +0000 UTC m=+6294.037056082" Mar 11 02:39:02 crc kubenswrapper[4744]: I0311 02:39:02.879263 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-ksj8s" Mar 11 02:39:02 crc kubenswrapper[4744]: I0311 02:39:02.880167 4744 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-ksj8s" Mar 11 02:39:02 crc kubenswrapper[4744]: I0311 02:39:02.959310 4744 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-ksj8s" Mar 11 02:39:03 crc kubenswrapper[4744]: I0311 02:39:03.308863 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-ksj8s" Mar 11 02:39:03 crc kubenswrapper[4744]: I0311 02:39:03.358937 4744 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openshift-marketplace/certified-operators-ksj8s"] Mar 11 02:39:05 crc kubenswrapper[4744]: I0311 02:39:05.293948 4744 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-ksj8s" podUID="8075cdf9-4ecf-4fbf-9fcd-e96bed3d0ea9" containerName="registry-server" containerID="cri-o://0babeac9dfc8a38b25f9d4b082218ac5e7e22479f00d478d01ee21a789372b12" gracePeriod=2 Mar 11 02:39:05 crc kubenswrapper[4744]: I0311 02:39:05.852721 4744 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-ksj8s" Mar 11 02:39:05 crc kubenswrapper[4744]: I0311 02:39:05.872968 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fvvcm\" (UniqueName: \"kubernetes.io/projected/8075cdf9-4ecf-4fbf-9fcd-e96bed3d0ea9-kube-api-access-fvvcm\") pod \"8075cdf9-4ecf-4fbf-9fcd-e96bed3d0ea9\" (UID: \"8075cdf9-4ecf-4fbf-9fcd-e96bed3d0ea9\") " Mar 11 02:39:05 crc kubenswrapper[4744]: I0311 02:39:05.873087 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8075cdf9-4ecf-4fbf-9fcd-e96bed3d0ea9-catalog-content\") pod \"8075cdf9-4ecf-4fbf-9fcd-e96bed3d0ea9\" (UID: \"8075cdf9-4ecf-4fbf-9fcd-e96bed3d0ea9\") " Mar 11 02:39:05 crc kubenswrapper[4744]: I0311 02:39:05.873193 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8075cdf9-4ecf-4fbf-9fcd-e96bed3d0ea9-utilities\") pod \"8075cdf9-4ecf-4fbf-9fcd-e96bed3d0ea9\" (UID: \"8075cdf9-4ecf-4fbf-9fcd-e96bed3d0ea9\") " Mar 11 02:39:05 crc kubenswrapper[4744]: I0311 02:39:05.875111 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8075cdf9-4ecf-4fbf-9fcd-e96bed3d0ea9-utilities" (OuterVolumeSpecName: "utilities") pod "8075cdf9-4ecf-4fbf-9fcd-e96bed3d0ea9" (UID: 
"8075cdf9-4ecf-4fbf-9fcd-e96bed3d0ea9"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 11 02:39:05 crc kubenswrapper[4744]: I0311 02:39:05.883576 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8075cdf9-4ecf-4fbf-9fcd-e96bed3d0ea9-kube-api-access-fvvcm" (OuterVolumeSpecName: "kube-api-access-fvvcm") pod "8075cdf9-4ecf-4fbf-9fcd-e96bed3d0ea9" (UID: "8075cdf9-4ecf-4fbf-9fcd-e96bed3d0ea9"). InnerVolumeSpecName "kube-api-access-fvvcm". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 02:39:05 crc kubenswrapper[4744]: I0311 02:39:05.954950 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8075cdf9-4ecf-4fbf-9fcd-e96bed3d0ea9-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "8075cdf9-4ecf-4fbf-9fcd-e96bed3d0ea9" (UID: "8075cdf9-4ecf-4fbf-9fcd-e96bed3d0ea9"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 11 02:39:05 crc kubenswrapper[4744]: I0311 02:39:05.976504 4744 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8075cdf9-4ecf-4fbf-9fcd-e96bed3d0ea9-utilities\") on node \"crc\" DevicePath \"\"" Mar 11 02:39:05 crc kubenswrapper[4744]: I0311 02:39:05.976538 4744 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fvvcm\" (UniqueName: \"kubernetes.io/projected/8075cdf9-4ecf-4fbf-9fcd-e96bed3d0ea9-kube-api-access-fvvcm\") on node \"crc\" DevicePath \"\"" Mar 11 02:39:05 crc kubenswrapper[4744]: I0311 02:39:05.976549 4744 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8075cdf9-4ecf-4fbf-9fcd-e96bed3d0ea9-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 11 02:39:06 crc kubenswrapper[4744]: I0311 02:39:06.302379 4744 generic.go:334] "Generic (PLEG): container finished" 
podID="8075cdf9-4ecf-4fbf-9fcd-e96bed3d0ea9" containerID="0babeac9dfc8a38b25f9d4b082218ac5e7e22479f00d478d01ee21a789372b12" exitCode=0 Mar 11 02:39:06 crc kubenswrapper[4744]: I0311 02:39:06.302428 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-ksj8s" event={"ID":"8075cdf9-4ecf-4fbf-9fcd-e96bed3d0ea9","Type":"ContainerDied","Data":"0babeac9dfc8a38b25f9d4b082218ac5e7e22479f00d478d01ee21a789372b12"} Mar 11 02:39:06 crc kubenswrapper[4744]: I0311 02:39:06.302461 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-ksj8s" event={"ID":"8075cdf9-4ecf-4fbf-9fcd-e96bed3d0ea9","Type":"ContainerDied","Data":"02437365e2b39868134beeab20dba77d933ca85df59a24ce42ddbb863eea97b3"} Mar 11 02:39:06 crc kubenswrapper[4744]: I0311 02:39:06.302462 4744 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-ksj8s" Mar 11 02:39:06 crc kubenswrapper[4744]: I0311 02:39:06.302509 4744 scope.go:117] "RemoveContainer" containerID="0babeac9dfc8a38b25f9d4b082218ac5e7e22479f00d478d01ee21a789372b12" Mar 11 02:39:06 crc kubenswrapper[4744]: I0311 02:39:06.324148 4744 scope.go:117] "RemoveContainer" containerID="378ef1addedbf879c0987b8a535e637738766b209955184cd4b8294f1ec15bdf" Mar 11 02:39:06 crc kubenswrapper[4744]: I0311 02:39:06.328350 4744 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-ksj8s"] Mar 11 02:39:06 crc kubenswrapper[4744]: I0311 02:39:06.340029 4744 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-ksj8s"] Mar 11 02:39:06 crc kubenswrapper[4744]: I0311 02:39:06.352781 4744 scope.go:117] "RemoveContainer" containerID="da1780dca9248382d57a8bdb110d5e57ea414c057217977aa4160777b7c4a3b6" Mar 11 02:39:06 crc kubenswrapper[4744]: I0311 02:39:06.405283 4744 scope.go:117] "RemoveContainer" 
containerID="0babeac9dfc8a38b25f9d4b082218ac5e7e22479f00d478d01ee21a789372b12" Mar 11 02:39:06 crc kubenswrapper[4744]: E0311 02:39:06.406140 4744 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0babeac9dfc8a38b25f9d4b082218ac5e7e22479f00d478d01ee21a789372b12\": container with ID starting with 0babeac9dfc8a38b25f9d4b082218ac5e7e22479f00d478d01ee21a789372b12 not found: ID does not exist" containerID="0babeac9dfc8a38b25f9d4b082218ac5e7e22479f00d478d01ee21a789372b12" Mar 11 02:39:06 crc kubenswrapper[4744]: I0311 02:39:06.406199 4744 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0babeac9dfc8a38b25f9d4b082218ac5e7e22479f00d478d01ee21a789372b12"} err="failed to get container status \"0babeac9dfc8a38b25f9d4b082218ac5e7e22479f00d478d01ee21a789372b12\": rpc error: code = NotFound desc = could not find container \"0babeac9dfc8a38b25f9d4b082218ac5e7e22479f00d478d01ee21a789372b12\": container with ID starting with 0babeac9dfc8a38b25f9d4b082218ac5e7e22479f00d478d01ee21a789372b12 not found: ID does not exist" Mar 11 02:39:06 crc kubenswrapper[4744]: I0311 02:39:06.406253 4744 scope.go:117] "RemoveContainer" containerID="378ef1addedbf879c0987b8a535e637738766b209955184cd4b8294f1ec15bdf" Mar 11 02:39:06 crc kubenswrapper[4744]: E0311 02:39:06.406751 4744 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"378ef1addedbf879c0987b8a535e637738766b209955184cd4b8294f1ec15bdf\": container with ID starting with 378ef1addedbf879c0987b8a535e637738766b209955184cd4b8294f1ec15bdf not found: ID does not exist" containerID="378ef1addedbf879c0987b8a535e637738766b209955184cd4b8294f1ec15bdf" Mar 11 02:39:06 crc kubenswrapper[4744]: I0311 02:39:06.406787 4744 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"378ef1addedbf879c0987b8a535e637738766b209955184cd4b8294f1ec15bdf"} err="failed to get container status \"378ef1addedbf879c0987b8a535e637738766b209955184cd4b8294f1ec15bdf\": rpc error: code = NotFound desc = could not find container \"378ef1addedbf879c0987b8a535e637738766b209955184cd4b8294f1ec15bdf\": container with ID starting with 378ef1addedbf879c0987b8a535e637738766b209955184cd4b8294f1ec15bdf not found: ID does not exist" Mar 11 02:39:06 crc kubenswrapper[4744]: I0311 02:39:06.406812 4744 scope.go:117] "RemoveContainer" containerID="da1780dca9248382d57a8bdb110d5e57ea414c057217977aa4160777b7c4a3b6" Mar 11 02:39:06 crc kubenswrapper[4744]: E0311 02:39:06.407278 4744 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"da1780dca9248382d57a8bdb110d5e57ea414c057217977aa4160777b7c4a3b6\": container with ID starting with da1780dca9248382d57a8bdb110d5e57ea414c057217977aa4160777b7c4a3b6 not found: ID does not exist" containerID="da1780dca9248382d57a8bdb110d5e57ea414c057217977aa4160777b7c4a3b6" Mar 11 02:39:06 crc kubenswrapper[4744]: I0311 02:39:06.407354 4744 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"da1780dca9248382d57a8bdb110d5e57ea414c057217977aa4160777b7c4a3b6"} err="failed to get container status \"da1780dca9248382d57a8bdb110d5e57ea414c057217977aa4160777b7c4a3b6\": rpc error: code = NotFound desc = could not find container \"da1780dca9248382d57a8bdb110d5e57ea414c057217977aa4160777b7c4a3b6\": container with ID starting with da1780dca9248382d57a8bdb110d5e57ea414c057217977aa4160777b7c4a3b6 not found: ID does not exist" Mar 11 02:39:07 crc kubenswrapper[4744]: I0311 02:39:07.991000 4744 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8075cdf9-4ecf-4fbf-9fcd-e96bed3d0ea9" path="/var/lib/kubelet/pods/8075cdf9-4ecf-4fbf-9fcd-e96bed3d0ea9/volumes" Mar 11 02:39:42 crc kubenswrapper[4744]: I0311 
02:39:42.409481 4744 patch_prober.go:28] interesting pod/machine-config-daemon-678nx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 11 02:39:42 crc kubenswrapper[4744]: I0311 02:39:42.410253 4744 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-678nx" podUID="a15dc7ac-7c34-4135-b6eb-a85122800ce9" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 11 02:40:00 crc kubenswrapper[4744]: I0311 02:40:00.164414 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29553280-44mf9"] Mar 11 02:40:00 crc kubenswrapper[4744]: E0311 02:40:00.165630 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8075cdf9-4ecf-4fbf-9fcd-e96bed3d0ea9" containerName="extract-utilities" Mar 11 02:40:00 crc kubenswrapper[4744]: I0311 02:40:00.165649 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="8075cdf9-4ecf-4fbf-9fcd-e96bed3d0ea9" containerName="extract-utilities" Mar 11 02:40:00 crc kubenswrapper[4744]: E0311 02:40:00.165669 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8075cdf9-4ecf-4fbf-9fcd-e96bed3d0ea9" containerName="extract-content" Mar 11 02:40:00 crc kubenswrapper[4744]: I0311 02:40:00.165677 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="8075cdf9-4ecf-4fbf-9fcd-e96bed3d0ea9" containerName="extract-content" Mar 11 02:40:00 crc kubenswrapper[4744]: E0311 02:40:00.165708 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8075cdf9-4ecf-4fbf-9fcd-e96bed3d0ea9" containerName="registry-server" Mar 11 02:40:00 crc kubenswrapper[4744]: I0311 02:40:00.165717 4744 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="8075cdf9-4ecf-4fbf-9fcd-e96bed3d0ea9" containerName="registry-server" Mar 11 02:40:00 crc kubenswrapper[4744]: I0311 02:40:00.165979 4744 memory_manager.go:354] "RemoveStaleState removing state" podUID="8075cdf9-4ecf-4fbf-9fcd-e96bed3d0ea9" containerName="registry-server" Mar 11 02:40:00 crc kubenswrapper[4744]: I0311 02:40:00.166758 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29553280-44mf9" Mar 11 02:40:00 crc kubenswrapper[4744]: I0311 02:40:00.171256 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 11 02:40:00 crc kubenswrapper[4744]: I0311 02:40:00.172208 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-rs77s" Mar 11 02:40:00 crc kubenswrapper[4744]: I0311 02:40:00.174208 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 11 02:40:00 crc kubenswrapper[4744]: I0311 02:40:00.179429 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29553280-44mf9"] Mar 11 02:40:00 crc kubenswrapper[4744]: I0311 02:40:00.278921 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zzg6n\" (UniqueName: \"kubernetes.io/projected/5e700547-ace4-49e2-b216-0fd80fcb915a-kube-api-access-zzg6n\") pod \"auto-csr-approver-29553280-44mf9\" (UID: \"5e700547-ace4-49e2-b216-0fd80fcb915a\") " pod="openshift-infra/auto-csr-approver-29553280-44mf9" Mar 11 02:40:00 crc kubenswrapper[4744]: I0311 02:40:00.381080 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zzg6n\" (UniqueName: \"kubernetes.io/projected/5e700547-ace4-49e2-b216-0fd80fcb915a-kube-api-access-zzg6n\") pod \"auto-csr-approver-29553280-44mf9\" (UID: \"5e700547-ace4-49e2-b216-0fd80fcb915a\") " 
pod="openshift-infra/auto-csr-approver-29553280-44mf9" Mar 11 02:40:00 crc kubenswrapper[4744]: I0311 02:40:00.428151 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zzg6n\" (UniqueName: \"kubernetes.io/projected/5e700547-ace4-49e2-b216-0fd80fcb915a-kube-api-access-zzg6n\") pod \"auto-csr-approver-29553280-44mf9\" (UID: \"5e700547-ace4-49e2-b216-0fd80fcb915a\") " pod="openshift-infra/auto-csr-approver-29553280-44mf9" Mar 11 02:40:00 crc kubenswrapper[4744]: I0311 02:40:00.488114 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29553280-44mf9" Mar 11 02:40:01 crc kubenswrapper[4744]: I0311 02:40:01.029170 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29553280-44mf9"] Mar 11 02:40:01 crc kubenswrapper[4744]: W0311 02:40:01.034091 4744 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5e700547_ace4_49e2_b216_0fd80fcb915a.slice/crio-8f941461c3d8b1f012fdbf96d1c21e0f683c6b9c74e6aa916ed1a7df1ceb0052 WatchSource:0}: Error finding container 8f941461c3d8b1f012fdbf96d1c21e0f683c6b9c74e6aa916ed1a7df1ceb0052: Status 404 returned error can't find the container with id 8f941461c3d8b1f012fdbf96d1c21e0f683c6b9c74e6aa916ed1a7df1ceb0052 Mar 11 02:40:01 crc kubenswrapper[4744]: I0311 02:40:01.918098 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29553280-44mf9" event={"ID":"5e700547-ace4-49e2-b216-0fd80fcb915a","Type":"ContainerStarted","Data":"8f941461c3d8b1f012fdbf96d1c21e0f683c6b9c74e6aa916ed1a7df1ceb0052"} Mar 11 02:40:02 crc kubenswrapper[4744]: I0311 02:40:02.932636 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29553280-44mf9" 
event={"ID":"5e700547-ace4-49e2-b216-0fd80fcb915a","Type":"ContainerStarted","Data":"17da550c8b0e104ba33c462670009067131bb69dd24e6d3e97177480cd2e832f"} Mar 11 02:40:02 crc kubenswrapper[4744]: I0311 02:40:02.951507 4744 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29553280-44mf9" podStartSLOduration=1.498979833 podStartE2EDuration="2.951476399s" podCreationTimestamp="2026-03-11 02:40:00 +0000 UTC" firstStartedPulling="2026-03-11 02:40:01.037958451 +0000 UTC m=+6357.842176096" lastFinishedPulling="2026-03-11 02:40:02.490455057 +0000 UTC m=+6359.294672662" observedRunningTime="2026-03-11 02:40:02.948634792 +0000 UTC m=+6359.752852427" watchObservedRunningTime="2026-03-11 02:40:02.951476399 +0000 UTC m=+6359.755694044" Mar 11 02:40:03 crc kubenswrapper[4744]: I0311 02:40:03.944332 4744 generic.go:334] "Generic (PLEG): container finished" podID="5e700547-ace4-49e2-b216-0fd80fcb915a" containerID="17da550c8b0e104ba33c462670009067131bb69dd24e6d3e97177480cd2e832f" exitCode=0 Mar 11 02:40:03 crc kubenswrapper[4744]: I0311 02:40:03.944456 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29553280-44mf9" event={"ID":"5e700547-ace4-49e2-b216-0fd80fcb915a","Type":"ContainerDied","Data":"17da550c8b0e104ba33c462670009067131bb69dd24e6d3e97177480cd2e832f"} Mar 11 02:40:05 crc kubenswrapper[4744]: I0311 02:40:05.439614 4744 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29553280-44mf9" Mar 11 02:40:05 crc kubenswrapper[4744]: I0311 02:40:05.488612 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zzg6n\" (UniqueName: \"kubernetes.io/projected/5e700547-ace4-49e2-b216-0fd80fcb915a-kube-api-access-zzg6n\") pod \"5e700547-ace4-49e2-b216-0fd80fcb915a\" (UID: \"5e700547-ace4-49e2-b216-0fd80fcb915a\") " Mar 11 02:40:05 crc kubenswrapper[4744]: I0311 02:40:05.496586 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5e700547-ace4-49e2-b216-0fd80fcb915a-kube-api-access-zzg6n" (OuterVolumeSpecName: "kube-api-access-zzg6n") pod "5e700547-ace4-49e2-b216-0fd80fcb915a" (UID: "5e700547-ace4-49e2-b216-0fd80fcb915a"). InnerVolumeSpecName "kube-api-access-zzg6n". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 02:40:05 crc kubenswrapper[4744]: I0311 02:40:05.592842 4744 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zzg6n\" (UniqueName: \"kubernetes.io/projected/5e700547-ace4-49e2-b216-0fd80fcb915a-kube-api-access-zzg6n\") on node \"crc\" DevicePath \"\"" Mar 11 02:40:05 crc kubenswrapper[4744]: I0311 02:40:05.969433 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29553280-44mf9" event={"ID":"5e700547-ace4-49e2-b216-0fd80fcb915a","Type":"ContainerDied","Data":"8f941461c3d8b1f012fdbf96d1c21e0f683c6b9c74e6aa916ed1a7df1ceb0052"} Mar 11 02:40:05 crc kubenswrapper[4744]: I0311 02:40:05.969535 4744 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29553280-44mf9" Mar 11 02:40:05 crc kubenswrapper[4744]: I0311 02:40:05.969588 4744 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8f941461c3d8b1f012fdbf96d1c21e0f683c6b9c74e6aa916ed1a7df1ceb0052" Mar 11 02:40:06 crc kubenswrapper[4744]: I0311 02:40:06.061635 4744 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29553274-4z95m"] Mar 11 02:40:06 crc kubenswrapper[4744]: I0311 02:40:06.067198 4744 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29553274-4z95m"] Mar 11 02:40:07 crc kubenswrapper[4744]: I0311 02:40:07.995188 4744 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5ca71392-94d0-4b70-9dd7-2acb0deda0e0" path="/var/lib/kubelet/pods/5ca71392-94d0-4b70-9dd7-2acb0deda0e0/volumes" Mar 11 02:40:12 crc kubenswrapper[4744]: I0311 02:40:12.409648 4744 patch_prober.go:28] interesting pod/machine-config-daemon-678nx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 11 02:40:12 crc kubenswrapper[4744]: I0311 02:40:12.410317 4744 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-678nx" podUID="a15dc7ac-7c34-4135-b6eb-a85122800ce9" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 11 02:40:18 crc kubenswrapper[4744]: I0311 02:40:18.147647 4744 scope.go:117] "RemoveContainer" containerID="d8ffc88fe2600f1c9b1fc84804dca89e6266e47e67064e0fb6c342fa47bd3c10" Mar 11 02:40:42 crc kubenswrapper[4744]: I0311 02:40:42.409301 4744 patch_prober.go:28] interesting pod/machine-config-daemon-678nx container/machine-config-daemon 
namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 11 02:40:42 crc kubenswrapper[4744]: I0311 02:40:42.410334 4744 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-678nx" podUID="a15dc7ac-7c34-4135-b6eb-a85122800ce9" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 11 02:40:42 crc kubenswrapper[4744]: I0311 02:40:42.410426 4744 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-678nx" Mar 11 02:40:42 crc kubenswrapper[4744]: I0311 02:40:42.411640 4744 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"7b8ee7052cfbcac075f3aaf8e56a0a7732d17a49604ba786945b064164a25ca7"} pod="openshift-machine-config-operator/machine-config-daemon-678nx" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 11 02:40:42 crc kubenswrapper[4744]: I0311 02:40:42.411787 4744 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-678nx" podUID="a15dc7ac-7c34-4135-b6eb-a85122800ce9" containerName="machine-config-daemon" containerID="cri-o://7b8ee7052cfbcac075f3aaf8e56a0a7732d17a49604ba786945b064164a25ca7" gracePeriod=600 Mar 11 02:40:42 crc kubenswrapper[4744]: E0311 02:40:42.565227 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-678nx_openshift-machine-config-operator(a15dc7ac-7c34-4135-b6eb-a85122800ce9)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-678nx" podUID="a15dc7ac-7c34-4135-b6eb-a85122800ce9" Mar 11 02:40:43 crc kubenswrapper[4744]: I0311 02:40:43.425982 4744 generic.go:334] "Generic (PLEG): container finished" podID="a15dc7ac-7c34-4135-b6eb-a85122800ce9" containerID="7b8ee7052cfbcac075f3aaf8e56a0a7732d17a49604ba786945b064164a25ca7" exitCode=0 Mar 11 02:40:43 crc kubenswrapper[4744]: I0311 02:40:43.426226 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-678nx" event={"ID":"a15dc7ac-7c34-4135-b6eb-a85122800ce9","Type":"ContainerDied","Data":"7b8ee7052cfbcac075f3aaf8e56a0a7732d17a49604ba786945b064164a25ca7"} Mar 11 02:40:43 crc kubenswrapper[4744]: I0311 02:40:43.426587 4744 scope.go:117] "RemoveContainer" containerID="cd8c1772eb249dc13a76fd8580fc37b972dc3bd9cbe7efcd80c33ed6959bc79f" Mar 11 02:40:43 crc kubenswrapper[4744]: I0311 02:40:43.427737 4744 scope.go:117] "RemoveContainer" containerID="7b8ee7052cfbcac075f3aaf8e56a0a7732d17a49604ba786945b064164a25ca7" Mar 11 02:40:43 crc kubenswrapper[4744]: E0311 02:40:43.428403 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-678nx_openshift-machine-config-operator(a15dc7ac-7c34-4135-b6eb-a85122800ce9)\"" pod="openshift-machine-config-operator/machine-config-daemon-678nx" podUID="a15dc7ac-7c34-4135-b6eb-a85122800ce9" Mar 11 02:40:57 crc kubenswrapper[4744]: I0311 02:40:57.975552 4744 scope.go:117] "RemoveContainer" containerID="7b8ee7052cfbcac075f3aaf8e56a0a7732d17a49604ba786945b064164a25ca7" Mar 11 02:40:57 crc kubenswrapper[4744]: E0311 02:40:57.976888 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed 
container=machine-config-daemon pod=machine-config-daemon-678nx_openshift-machine-config-operator(a15dc7ac-7c34-4135-b6eb-a85122800ce9)\"" pod="openshift-machine-config-operator/machine-config-daemon-678nx" podUID="a15dc7ac-7c34-4135-b6eb-a85122800ce9" Mar 11 02:41:10 crc kubenswrapper[4744]: I0311 02:41:10.976299 4744 scope.go:117] "RemoveContainer" containerID="7b8ee7052cfbcac075f3aaf8e56a0a7732d17a49604ba786945b064164a25ca7" Mar 11 02:41:10 crc kubenswrapper[4744]: E0311 02:41:10.977353 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-678nx_openshift-machine-config-operator(a15dc7ac-7c34-4135-b6eb-a85122800ce9)\"" pod="openshift-machine-config-operator/machine-config-daemon-678nx" podUID="a15dc7ac-7c34-4135-b6eb-a85122800ce9" Mar 11 02:41:21 crc kubenswrapper[4744]: I0311 02:41:21.975676 4744 scope.go:117] "RemoveContainer" containerID="7b8ee7052cfbcac075f3aaf8e56a0a7732d17a49604ba786945b064164a25ca7" Mar 11 02:41:21 crc kubenswrapper[4744]: E0311 02:41:21.978903 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-678nx_openshift-machine-config-operator(a15dc7ac-7c34-4135-b6eb-a85122800ce9)\"" pod="openshift-machine-config-operator/machine-config-daemon-678nx" podUID="a15dc7ac-7c34-4135-b6eb-a85122800ce9" Mar 11 02:41:33 crc kubenswrapper[4744]: I0311 02:41:33.983969 4744 scope.go:117] "RemoveContainer" containerID="7b8ee7052cfbcac075f3aaf8e56a0a7732d17a49604ba786945b064164a25ca7" Mar 11 02:41:33 crc kubenswrapper[4744]: E0311 02:41:33.985257 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s 
restarting failed container=machine-config-daemon pod=machine-config-daemon-678nx_openshift-machine-config-operator(a15dc7ac-7c34-4135-b6eb-a85122800ce9)\"" pod="openshift-machine-config-operator/machine-config-daemon-678nx" podUID="a15dc7ac-7c34-4135-b6eb-a85122800ce9" Mar 11 02:41:47 crc kubenswrapper[4744]: I0311 02:41:47.975417 4744 scope.go:117] "RemoveContainer" containerID="7b8ee7052cfbcac075f3aaf8e56a0a7732d17a49604ba786945b064164a25ca7" Mar 11 02:41:47 crc kubenswrapper[4744]: E0311 02:41:47.976507 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-678nx_openshift-machine-config-operator(a15dc7ac-7c34-4135-b6eb-a85122800ce9)\"" pod="openshift-machine-config-operator/machine-config-daemon-678nx" podUID="a15dc7ac-7c34-4135-b6eb-a85122800ce9" Mar 11 02:41:59 crc kubenswrapper[4744]: I0311 02:41:59.975566 4744 scope.go:117] "RemoveContainer" containerID="7b8ee7052cfbcac075f3aaf8e56a0a7732d17a49604ba786945b064164a25ca7" Mar 11 02:41:59 crc kubenswrapper[4744]: E0311 02:41:59.976618 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-678nx_openshift-machine-config-operator(a15dc7ac-7c34-4135-b6eb-a85122800ce9)\"" pod="openshift-machine-config-operator/machine-config-daemon-678nx" podUID="a15dc7ac-7c34-4135-b6eb-a85122800ce9" Mar 11 02:42:00 crc kubenswrapper[4744]: I0311 02:42:00.172373 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29553282-xtj65"] Mar 11 02:42:00 crc kubenswrapper[4744]: E0311 02:42:00.173297 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5e700547-ace4-49e2-b216-0fd80fcb915a" containerName="oc" Mar 11 02:42:00 crc 
kubenswrapper[4744]: I0311 02:42:00.173350 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="5e700547-ace4-49e2-b216-0fd80fcb915a" containerName="oc" Mar 11 02:42:00 crc kubenswrapper[4744]: I0311 02:42:00.173716 4744 memory_manager.go:354] "RemoveStaleState removing state" podUID="5e700547-ace4-49e2-b216-0fd80fcb915a" containerName="oc" Mar 11 02:42:00 crc kubenswrapper[4744]: I0311 02:42:00.174657 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29553282-xtj65" Mar 11 02:42:00 crc kubenswrapper[4744]: I0311 02:42:00.177794 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 11 02:42:00 crc kubenswrapper[4744]: I0311 02:42:00.178398 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 11 02:42:00 crc kubenswrapper[4744]: I0311 02:42:00.179869 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-rs77s" Mar 11 02:42:00 crc kubenswrapper[4744]: I0311 02:42:00.198136 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29553282-xtj65"] Mar 11 02:42:00 crc kubenswrapper[4744]: I0311 02:42:00.325355 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jphkf\" (UniqueName: \"kubernetes.io/projected/d38d875a-4ced-4b03-9d48-e28f25888a1b-kube-api-access-jphkf\") pod \"auto-csr-approver-29553282-xtj65\" (UID: \"d38d875a-4ced-4b03-9d48-e28f25888a1b\") " pod="openshift-infra/auto-csr-approver-29553282-xtj65" Mar 11 02:42:00 crc kubenswrapper[4744]: I0311 02:42:00.427445 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jphkf\" (UniqueName: \"kubernetes.io/projected/d38d875a-4ced-4b03-9d48-e28f25888a1b-kube-api-access-jphkf\") pod \"auto-csr-approver-29553282-xtj65\" 
(UID: \"d38d875a-4ced-4b03-9d48-e28f25888a1b\") " pod="openshift-infra/auto-csr-approver-29553282-xtj65" Mar 11 02:42:00 crc kubenswrapper[4744]: I0311 02:42:00.460764 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jphkf\" (UniqueName: \"kubernetes.io/projected/d38d875a-4ced-4b03-9d48-e28f25888a1b-kube-api-access-jphkf\") pod \"auto-csr-approver-29553282-xtj65\" (UID: \"d38d875a-4ced-4b03-9d48-e28f25888a1b\") " pod="openshift-infra/auto-csr-approver-29553282-xtj65" Mar 11 02:42:00 crc kubenswrapper[4744]: I0311 02:42:00.512322 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29553282-xtj65" Mar 11 02:42:00 crc kubenswrapper[4744]: I0311 02:42:00.788256 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29553282-xtj65"] Mar 11 02:42:01 crc kubenswrapper[4744]: I0311 02:42:01.277886 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29553282-xtj65" event={"ID":"d38d875a-4ced-4b03-9d48-e28f25888a1b","Type":"ContainerStarted","Data":"014d7f2a738865bb7c8dd4de001023c2c143319a8b159439f1859e96484597bd"} Mar 11 02:42:02 crc kubenswrapper[4744]: I0311 02:42:02.314802 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29553282-xtj65" event={"ID":"d38d875a-4ced-4b03-9d48-e28f25888a1b","Type":"ContainerStarted","Data":"8195e789fb44a7f42c07d4aa739650c54ba686ab4836dbc82ee8f6cf38509f36"} Mar 11 02:42:02 crc kubenswrapper[4744]: I0311 02:42:02.334953 4744 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29553282-xtj65" podStartSLOduration=1.426168829 podStartE2EDuration="2.334936994s" podCreationTimestamp="2026-03-11 02:42:00 +0000 UTC" firstStartedPulling="2026-03-11 02:42:00.785735469 +0000 UTC m=+6477.589953084" lastFinishedPulling="2026-03-11 02:42:01.694503634 +0000 UTC 
m=+6478.498721249" observedRunningTime="2026-03-11 02:42:02.333161069 +0000 UTC m=+6479.137378674" watchObservedRunningTime="2026-03-11 02:42:02.334936994 +0000 UTC m=+6479.139154599" Mar 11 02:42:03 crc kubenswrapper[4744]: I0311 02:42:03.326464 4744 generic.go:334] "Generic (PLEG): container finished" podID="d38d875a-4ced-4b03-9d48-e28f25888a1b" containerID="8195e789fb44a7f42c07d4aa739650c54ba686ab4836dbc82ee8f6cf38509f36" exitCode=0 Mar 11 02:42:03 crc kubenswrapper[4744]: I0311 02:42:03.326775 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29553282-xtj65" event={"ID":"d38d875a-4ced-4b03-9d48-e28f25888a1b","Type":"ContainerDied","Data":"8195e789fb44a7f42c07d4aa739650c54ba686ab4836dbc82ee8f6cf38509f36"} Mar 11 02:42:04 crc kubenswrapper[4744]: I0311 02:42:04.803403 4744 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29553282-xtj65" Mar 11 02:42:04 crc kubenswrapper[4744]: I0311 02:42:04.823366 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jphkf\" (UniqueName: \"kubernetes.io/projected/d38d875a-4ced-4b03-9d48-e28f25888a1b-kube-api-access-jphkf\") pod \"d38d875a-4ced-4b03-9d48-e28f25888a1b\" (UID: \"d38d875a-4ced-4b03-9d48-e28f25888a1b\") " Mar 11 02:42:04 crc kubenswrapper[4744]: I0311 02:42:04.831000 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d38d875a-4ced-4b03-9d48-e28f25888a1b-kube-api-access-jphkf" (OuterVolumeSpecName: "kube-api-access-jphkf") pod "d38d875a-4ced-4b03-9d48-e28f25888a1b" (UID: "d38d875a-4ced-4b03-9d48-e28f25888a1b"). InnerVolumeSpecName "kube-api-access-jphkf". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 02:42:04 crc kubenswrapper[4744]: I0311 02:42:04.925611 4744 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jphkf\" (UniqueName: \"kubernetes.io/projected/d38d875a-4ced-4b03-9d48-e28f25888a1b-kube-api-access-jphkf\") on node \"crc\" DevicePath \"\"" Mar 11 02:42:05 crc kubenswrapper[4744]: I0311 02:42:05.352475 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29553282-xtj65" event={"ID":"d38d875a-4ced-4b03-9d48-e28f25888a1b","Type":"ContainerDied","Data":"014d7f2a738865bb7c8dd4de001023c2c143319a8b159439f1859e96484597bd"} Mar 11 02:42:05 crc kubenswrapper[4744]: I0311 02:42:05.352554 4744 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="014d7f2a738865bb7c8dd4de001023c2c143319a8b159439f1859e96484597bd" Mar 11 02:42:05 crc kubenswrapper[4744]: I0311 02:42:05.352640 4744 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29553282-xtj65" Mar 11 02:42:05 crc kubenswrapper[4744]: I0311 02:42:05.438132 4744 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29553276-rgjfz"] Mar 11 02:42:05 crc kubenswrapper[4744]: I0311 02:42:05.447309 4744 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29553276-rgjfz"] Mar 11 02:42:05 crc kubenswrapper[4744]: I0311 02:42:05.997293 4744 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d75d4a39-c076-4584-9ede-73f57c608141" path="/var/lib/kubelet/pods/d75d4a39-c076-4584-9ede-73f57c608141/volumes" Mar 11 02:42:10 crc kubenswrapper[4744]: I0311 02:42:10.975747 4744 scope.go:117] "RemoveContainer" containerID="7b8ee7052cfbcac075f3aaf8e56a0a7732d17a49604ba786945b064164a25ca7" Mar 11 02:42:10 crc kubenswrapper[4744]: E0311 02:42:10.979274 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to 
\"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-678nx_openshift-machine-config-operator(a15dc7ac-7c34-4135-b6eb-a85122800ce9)\"" pod="openshift-machine-config-operator/machine-config-daemon-678nx" podUID="a15dc7ac-7c34-4135-b6eb-a85122800ce9" Mar 11 02:42:18 crc kubenswrapper[4744]: I0311 02:42:18.276757 4744 scope.go:117] "RemoveContainer" containerID="106243b38d36c0556276a177f8344dfc32251b074df1cdf9839fde5e6f424e69" Mar 11 02:42:22 crc kubenswrapper[4744]: I0311 02:42:22.975820 4744 scope.go:117] "RemoveContainer" containerID="7b8ee7052cfbcac075f3aaf8e56a0a7732d17a49604ba786945b064164a25ca7" Mar 11 02:42:22 crc kubenswrapper[4744]: E0311 02:42:22.976798 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-678nx_openshift-machine-config-operator(a15dc7ac-7c34-4135-b6eb-a85122800ce9)\"" pod="openshift-machine-config-operator/machine-config-daemon-678nx" podUID="a15dc7ac-7c34-4135-b6eb-a85122800ce9" Mar 11 02:42:36 crc kubenswrapper[4744]: I0311 02:42:36.974828 4744 scope.go:117] "RemoveContainer" containerID="7b8ee7052cfbcac075f3aaf8e56a0a7732d17a49604ba786945b064164a25ca7" Mar 11 02:42:36 crc kubenswrapper[4744]: E0311 02:42:36.975857 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-678nx_openshift-machine-config-operator(a15dc7ac-7c34-4135-b6eb-a85122800ce9)\"" pod="openshift-machine-config-operator/machine-config-daemon-678nx" podUID="a15dc7ac-7c34-4135-b6eb-a85122800ce9" Mar 11 02:42:49 crc kubenswrapper[4744]: I0311 02:42:49.975555 4744 scope.go:117] "RemoveContainer" 
containerID="7b8ee7052cfbcac075f3aaf8e56a0a7732d17a49604ba786945b064164a25ca7" Mar 11 02:42:49 crc kubenswrapper[4744]: E0311 02:42:49.976712 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-678nx_openshift-machine-config-operator(a15dc7ac-7c34-4135-b6eb-a85122800ce9)\"" pod="openshift-machine-config-operator/machine-config-daemon-678nx" podUID="a15dc7ac-7c34-4135-b6eb-a85122800ce9" Mar 11 02:43:01 crc kubenswrapper[4744]: I0311 02:43:01.975016 4744 scope.go:117] "RemoveContainer" containerID="7b8ee7052cfbcac075f3aaf8e56a0a7732d17a49604ba786945b064164a25ca7" Mar 11 02:43:01 crc kubenswrapper[4744]: E0311 02:43:01.976121 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-678nx_openshift-machine-config-operator(a15dc7ac-7c34-4135-b6eb-a85122800ce9)\"" pod="openshift-machine-config-operator/machine-config-daemon-678nx" podUID="a15dc7ac-7c34-4135-b6eb-a85122800ce9" Mar 11 02:43:13 crc kubenswrapper[4744]: I0311 02:43:13.980073 4744 scope.go:117] "RemoveContainer" containerID="7b8ee7052cfbcac075f3aaf8e56a0a7732d17a49604ba786945b064164a25ca7" Mar 11 02:43:13 crc kubenswrapper[4744]: E0311 02:43:13.981252 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-678nx_openshift-machine-config-operator(a15dc7ac-7c34-4135-b6eb-a85122800ce9)\"" pod="openshift-machine-config-operator/machine-config-daemon-678nx" podUID="a15dc7ac-7c34-4135-b6eb-a85122800ce9" Mar 11 02:43:28 crc kubenswrapper[4744]: I0311 02:43:28.974606 4744 scope.go:117] 
"RemoveContainer" containerID="7b8ee7052cfbcac075f3aaf8e56a0a7732d17a49604ba786945b064164a25ca7" Mar 11 02:43:28 crc kubenswrapper[4744]: E0311 02:43:28.975634 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-678nx_openshift-machine-config-operator(a15dc7ac-7c34-4135-b6eb-a85122800ce9)\"" pod="openshift-machine-config-operator/machine-config-daemon-678nx" podUID="a15dc7ac-7c34-4135-b6eb-a85122800ce9" Mar 11 02:43:41 crc kubenswrapper[4744]: I0311 02:43:41.980774 4744 scope.go:117] "RemoveContainer" containerID="7b8ee7052cfbcac075f3aaf8e56a0a7732d17a49604ba786945b064164a25ca7" Mar 11 02:43:41 crc kubenswrapper[4744]: E0311 02:43:41.981758 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-678nx_openshift-machine-config-operator(a15dc7ac-7c34-4135-b6eb-a85122800ce9)\"" pod="openshift-machine-config-operator/machine-config-daemon-678nx" podUID="a15dc7ac-7c34-4135-b6eb-a85122800ce9" Mar 11 02:43:55 crc kubenswrapper[4744]: I0311 02:43:55.975309 4744 scope.go:117] "RemoveContainer" containerID="7b8ee7052cfbcac075f3aaf8e56a0a7732d17a49604ba786945b064164a25ca7" Mar 11 02:43:55 crc kubenswrapper[4744]: E0311 02:43:55.976261 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-678nx_openshift-machine-config-operator(a15dc7ac-7c34-4135-b6eb-a85122800ce9)\"" pod="openshift-machine-config-operator/machine-config-daemon-678nx" podUID="a15dc7ac-7c34-4135-b6eb-a85122800ce9" Mar 11 02:44:00 crc kubenswrapper[4744]: I0311 02:44:00.163595 
4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29553284-86ggt"] Mar 11 02:44:00 crc kubenswrapper[4744]: E0311 02:44:00.164360 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d38d875a-4ced-4b03-9d48-e28f25888a1b" containerName="oc" Mar 11 02:44:00 crc kubenswrapper[4744]: I0311 02:44:00.164381 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="d38d875a-4ced-4b03-9d48-e28f25888a1b" containerName="oc" Mar 11 02:44:00 crc kubenswrapper[4744]: I0311 02:44:00.164707 4744 memory_manager.go:354] "RemoveStaleState removing state" podUID="d38d875a-4ced-4b03-9d48-e28f25888a1b" containerName="oc" Mar 11 02:44:00 crc kubenswrapper[4744]: I0311 02:44:00.166597 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29553284-86ggt" Mar 11 02:44:00 crc kubenswrapper[4744]: I0311 02:44:00.169088 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-rs77s" Mar 11 02:44:00 crc kubenswrapper[4744]: I0311 02:44:00.169251 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 11 02:44:00 crc kubenswrapper[4744]: I0311 02:44:00.171410 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 11 02:44:00 crc kubenswrapper[4744]: I0311 02:44:00.193657 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29553284-86ggt"] Mar 11 02:44:00 crc kubenswrapper[4744]: I0311 02:44:00.346991 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v6fdv\" (UniqueName: \"kubernetes.io/projected/48f900d5-0d77-43f3-a274-5c5488b8b03c-kube-api-access-v6fdv\") pod \"auto-csr-approver-29553284-86ggt\" (UID: \"48f900d5-0d77-43f3-a274-5c5488b8b03c\") " 
pod="openshift-infra/auto-csr-approver-29553284-86ggt" Mar 11 02:44:00 crc kubenswrapper[4744]: I0311 02:44:00.448720 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v6fdv\" (UniqueName: \"kubernetes.io/projected/48f900d5-0d77-43f3-a274-5c5488b8b03c-kube-api-access-v6fdv\") pod \"auto-csr-approver-29553284-86ggt\" (UID: \"48f900d5-0d77-43f3-a274-5c5488b8b03c\") " pod="openshift-infra/auto-csr-approver-29553284-86ggt" Mar 11 02:44:00 crc kubenswrapper[4744]: I0311 02:44:00.470693 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v6fdv\" (UniqueName: \"kubernetes.io/projected/48f900d5-0d77-43f3-a274-5c5488b8b03c-kube-api-access-v6fdv\") pod \"auto-csr-approver-29553284-86ggt\" (UID: \"48f900d5-0d77-43f3-a274-5c5488b8b03c\") " pod="openshift-infra/auto-csr-approver-29553284-86ggt" Mar 11 02:44:00 crc kubenswrapper[4744]: I0311 02:44:00.498657 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29553284-86ggt" Mar 11 02:44:00 crc kubenswrapper[4744]: I0311 02:44:00.780675 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29553284-86ggt"] Mar 11 02:44:00 crc kubenswrapper[4744]: W0311 02:44:00.787607 4744 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod48f900d5_0d77_43f3_a274_5c5488b8b03c.slice/crio-17ee523c1845fb709adceff87a46286bc170c77aa4b8bc5673ae2804bda2a84e WatchSource:0}: Error finding container 17ee523c1845fb709adceff87a46286bc170c77aa4b8bc5673ae2804bda2a84e: Status 404 returned error can't find the container with id 17ee523c1845fb709adceff87a46286bc170c77aa4b8bc5673ae2804bda2a84e Mar 11 02:44:00 crc kubenswrapper[4744]: I0311 02:44:00.791996 4744 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 11 02:44:01 crc kubenswrapper[4744]: 
I0311 02:44:01.530066 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29553284-86ggt" event={"ID":"48f900d5-0d77-43f3-a274-5c5488b8b03c","Type":"ContainerStarted","Data":"17ee523c1845fb709adceff87a46286bc170c77aa4b8bc5673ae2804bda2a84e"} Mar 11 02:44:02 crc kubenswrapper[4744]: I0311 02:44:02.544134 4744 generic.go:334] "Generic (PLEG): container finished" podID="48f900d5-0d77-43f3-a274-5c5488b8b03c" containerID="55e124aaa63812ade25310b009d2693f30abec3c7d241b5553b1b20a205f5d6a" exitCode=0 Mar 11 02:44:02 crc kubenswrapper[4744]: I0311 02:44:02.544301 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29553284-86ggt" event={"ID":"48f900d5-0d77-43f3-a274-5c5488b8b03c","Type":"ContainerDied","Data":"55e124aaa63812ade25310b009d2693f30abec3c7d241b5553b1b20a205f5d6a"} Mar 11 02:44:04 crc kubenswrapper[4744]: I0311 02:44:04.005325 4744 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29553284-86ggt" Mar 11 02:44:04 crc kubenswrapper[4744]: I0311 02:44:04.143041 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v6fdv\" (UniqueName: \"kubernetes.io/projected/48f900d5-0d77-43f3-a274-5c5488b8b03c-kube-api-access-v6fdv\") pod \"48f900d5-0d77-43f3-a274-5c5488b8b03c\" (UID: \"48f900d5-0d77-43f3-a274-5c5488b8b03c\") " Mar 11 02:44:04 crc kubenswrapper[4744]: I0311 02:44:04.165307 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/48f900d5-0d77-43f3-a274-5c5488b8b03c-kube-api-access-v6fdv" (OuterVolumeSpecName: "kube-api-access-v6fdv") pod "48f900d5-0d77-43f3-a274-5c5488b8b03c" (UID: "48f900d5-0d77-43f3-a274-5c5488b8b03c"). InnerVolumeSpecName "kube-api-access-v6fdv". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 02:44:04 crc kubenswrapper[4744]: I0311 02:44:04.244992 4744 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v6fdv\" (UniqueName: \"kubernetes.io/projected/48f900d5-0d77-43f3-a274-5c5488b8b03c-kube-api-access-v6fdv\") on node \"crc\" DevicePath \"\"" Mar 11 02:44:04 crc kubenswrapper[4744]: I0311 02:44:04.563902 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29553284-86ggt" event={"ID":"48f900d5-0d77-43f3-a274-5c5488b8b03c","Type":"ContainerDied","Data":"17ee523c1845fb709adceff87a46286bc170c77aa4b8bc5673ae2804bda2a84e"} Mar 11 02:44:04 crc kubenswrapper[4744]: I0311 02:44:04.563959 4744 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="17ee523c1845fb709adceff87a46286bc170c77aa4b8bc5673ae2804bda2a84e" Mar 11 02:44:04 crc kubenswrapper[4744]: I0311 02:44:04.564032 4744 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29553284-86ggt" Mar 11 02:44:05 crc kubenswrapper[4744]: I0311 02:44:05.108191 4744 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29553278-6hcqx"] Mar 11 02:44:05 crc kubenswrapper[4744]: I0311 02:44:05.115966 4744 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29553278-6hcqx"] Mar 11 02:44:05 crc kubenswrapper[4744]: I0311 02:44:05.995864 4744 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2438c214-5451-4ee6-b516-801897f7afc3" path="/var/lib/kubelet/pods/2438c214-5451-4ee6-b516-801897f7afc3/volumes" Mar 11 02:44:10 crc kubenswrapper[4744]: I0311 02:44:10.974951 4744 scope.go:117] "RemoveContainer" containerID="7b8ee7052cfbcac075f3aaf8e56a0a7732d17a49604ba786945b064164a25ca7" Mar 11 02:44:10 crc kubenswrapper[4744]: E0311 02:44:10.977571 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to 
\"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-678nx_openshift-machine-config-operator(a15dc7ac-7c34-4135-b6eb-a85122800ce9)\"" pod="openshift-machine-config-operator/machine-config-daemon-678nx" podUID="a15dc7ac-7c34-4135-b6eb-a85122800ce9" Mar 11 02:44:18 crc kubenswrapper[4744]: I0311 02:44:18.410721 4744 scope.go:117] "RemoveContainer" containerID="3d9a88545c4ec5eb96d01c5266e0505f2cae6f7420ea65516225785398fa1d80" Mar 11 02:44:23 crc kubenswrapper[4744]: I0311 02:44:23.982041 4744 scope.go:117] "RemoveContainer" containerID="7b8ee7052cfbcac075f3aaf8e56a0a7732d17a49604ba786945b064164a25ca7" Mar 11 02:44:23 crc kubenswrapper[4744]: E0311 02:44:23.983011 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-678nx_openshift-machine-config-operator(a15dc7ac-7c34-4135-b6eb-a85122800ce9)\"" pod="openshift-machine-config-operator/machine-config-daemon-678nx" podUID="a15dc7ac-7c34-4135-b6eb-a85122800ce9" Mar 11 02:44:35 crc kubenswrapper[4744]: I0311 02:44:35.974757 4744 scope.go:117] "RemoveContainer" containerID="7b8ee7052cfbcac075f3aaf8e56a0a7732d17a49604ba786945b064164a25ca7" Mar 11 02:44:35 crc kubenswrapper[4744]: E0311 02:44:35.975756 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-678nx_openshift-machine-config-operator(a15dc7ac-7c34-4135-b6eb-a85122800ce9)\"" pod="openshift-machine-config-operator/machine-config-daemon-678nx" podUID="a15dc7ac-7c34-4135-b6eb-a85122800ce9" Mar 11 02:44:47 crc kubenswrapper[4744]: I0311 02:44:47.977632 4744 scope.go:117] "RemoveContainer" 
containerID="7b8ee7052cfbcac075f3aaf8e56a0a7732d17a49604ba786945b064164a25ca7" Mar 11 02:44:47 crc kubenswrapper[4744]: E0311 02:44:47.978708 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-678nx_openshift-machine-config-operator(a15dc7ac-7c34-4135-b6eb-a85122800ce9)\"" pod="openshift-machine-config-operator/machine-config-daemon-678nx" podUID="a15dc7ac-7c34-4135-b6eb-a85122800ce9" Mar 11 02:45:00 crc kubenswrapper[4744]: I0311 02:45:00.174850 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29553285-qr99w"] Mar 11 02:45:00 crc kubenswrapper[4744]: E0311 02:45:00.175943 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="48f900d5-0d77-43f3-a274-5c5488b8b03c" containerName="oc" Mar 11 02:45:00 crc kubenswrapper[4744]: I0311 02:45:00.176451 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="48f900d5-0d77-43f3-a274-5c5488b8b03c" containerName="oc" Mar 11 02:45:00 crc kubenswrapper[4744]: I0311 02:45:00.176816 4744 memory_manager.go:354] "RemoveStaleState removing state" podUID="48f900d5-0d77-43f3-a274-5c5488b8b03c" containerName="oc" Mar 11 02:45:00 crc kubenswrapper[4744]: I0311 02:45:00.177695 4744 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29553285-qr99w" Mar 11 02:45:00 crc kubenswrapper[4744]: I0311 02:45:00.181351 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Mar 11 02:45:00 crc kubenswrapper[4744]: I0311 02:45:00.183028 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Mar 11 02:45:00 crc kubenswrapper[4744]: I0311 02:45:00.189927 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29553285-qr99w"] Mar 11 02:45:00 crc kubenswrapper[4744]: I0311 02:45:00.293674 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/90e8981e-7421-4e08-9bf3-d2fc5a655c2e-secret-volume\") pod \"collect-profiles-29553285-qr99w\" (UID: \"90e8981e-7421-4e08-9bf3-d2fc5a655c2e\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29553285-qr99w" Mar 11 02:45:00 crc kubenswrapper[4744]: I0311 02:45:00.293999 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/90e8981e-7421-4e08-9bf3-d2fc5a655c2e-config-volume\") pod \"collect-profiles-29553285-qr99w\" (UID: \"90e8981e-7421-4e08-9bf3-d2fc5a655c2e\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29553285-qr99w" Mar 11 02:45:00 crc kubenswrapper[4744]: I0311 02:45:00.294565 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vwsbb\" (UniqueName: \"kubernetes.io/projected/90e8981e-7421-4e08-9bf3-d2fc5a655c2e-kube-api-access-vwsbb\") pod \"collect-profiles-29553285-qr99w\" (UID: \"90e8981e-7421-4e08-9bf3-d2fc5a655c2e\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29553285-qr99w" Mar 11 02:45:00 crc kubenswrapper[4744]: I0311 02:45:00.399999 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vwsbb\" (UniqueName: \"kubernetes.io/projected/90e8981e-7421-4e08-9bf3-d2fc5a655c2e-kube-api-access-vwsbb\") pod \"collect-profiles-29553285-qr99w\" (UID: \"90e8981e-7421-4e08-9bf3-d2fc5a655c2e\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29553285-qr99w" Mar 11 02:45:00 crc kubenswrapper[4744]: I0311 02:45:00.400212 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/90e8981e-7421-4e08-9bf3-d2fc5a655c2e-secret-volume\") pod \"collect-profiles-29553285-qr99w\" (UID: \"90e8981e-7421-4e08-9bf3-d2fc5a655c2e\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29553285-qr99w" Mar 11 02:45:00 crc kubenswrapper[4744]: I0311 02:45:00.400383 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/90e8981e-7421-4e08-9bf3-d2fc5a655c2e-config-volume\") pod \"collect-profiles-29553285-qr99w\" (UID: \"90e8981e-7421-4e08-9bf3-d2fc5a655c2e\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29553285-qr99w" Mar 11 02:45:00 crc kubenswrapper[4744]: I0311 02:45:00.401786 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/90e8981e-7421-4e08-9bf3-d2fc5a655c2e-config-volume\") pod \"collect-profiles-29553285-qr99w\" (UID: \"90e8981e-7421-4e08-9bf3-d2fc5a655c2e\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29553285-qr99w" Mar 11 02:45:00 crc kubenswrapper[4744]: I0311 02:45:00.412181 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: 
\"kubernetes.io/secret/90e8981e-7421-4e08-9bf3-d2fc5a655c2e-secret-volume\") pod \"collect-profiles-29553285-qr99w\" (UID: \"90e8981e-7421-4e08-9bf3-d2fc5a655c2e\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29553285-qr99w" Mar 11 02:45:00 crc kubenswrapper[4744]: I0311 02:45:00.436913 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vwsbb\" (UniqueName: \"kubernetes.io/projected/90e8981e-7421-4e08-9bf3-d2fc5a655c2e-kube-api-access-vwsbb\") pod \"collect-profiles-29553285-qr99w\" (UID: \"90e8981e-7421-4e08-9bf3-d2fc5a655c2e\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29553285-qr99w" Mar 11 02:45:00 crc kubenswrapper[4744]: I0311 02:45:00.519926 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29553285-qr99w" Mar 11 02:45:00 crc kubenswrapper[4744]: I0311 02:45:00.826932 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29553285-qr99w"] Mar 11 02:45:01 crc kubenswrapper[4744]: I0311 02:45:01.180374 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29553285-qr99w" event={"ID":"90e8981e-7421-4e08-9bf3-d2fc5a655c2e","Type":"ContainerStarted","Data":"75993f40c3a8bfbbd91de5dc63b64b6123f462a31ce64cf33838ea5a0b0374a6"} Mar 11 02:45:01 crc kubenswrapper[4744]: I0311 02:45:01.180435 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29553285-qr99w" event={"ID":"90e8981e-7421-4e08-9bf3-d2fc5a655c2e","Type":"ContainerStarted","Data":"ba3e28a26cc5961e28fca24c64b10dfbaef85711f5dde2f0277248268066d91e"} Mar 11 02:45:01 crc kubenswrapper[4744]: I0311 02:45:01.209731 4744 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29553285-qr99w" 
podStartSLOduration=1.209715867 podStartE2EDuration="1.209715867s" podCreationTimestamp="2026-03-11 02:45:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-11 02:45:01.20624058 +0000 UTC m=+6658.010458185" watchObservedRunningTime="2026-03-11 02:45:01.209715867 +0000 UTC m=+6658.013933472" Mar 11 02:45:01 crc kubenswrapper[4744]: I0311 02:45:01.975376 4744 scope.go:117] "RemoveContainer" containerID="7b8ee7052cfbcac075f3aaf8e56a0a7732d17a49604ba786945b064164a25ca7" Mar 11 02:45:01 crc kubenswrapper[4744]: E0311 02:45:01.975854 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-678nx_openshift-machine-config-operator(a15dc7ac-7c34-4135-b6eb-a85122800ce9)\"" pod="openshift-machine-config-operator/machine-config-daemon-678nx" podUID="a15dc7ac-7c34-4135-b6eb-a85122800ce9" Mar 11 02:45:02 crc kubenswrapper[4744]: I0311 02:45:02.190762 4744 generic.go:334] "Generic (PLEG): container finished" podID="90e8981e-7421-4e08-9bf3-d2fc5a655c2e" containerID="75993f40c3a8bfbbd91de5dc63b64b6123f462a31ce64cf33838ea5a0b0374a6" exitCode=0 Mar 11 02:45:02 crc kubenswrapper[4744]: I0311 02:45:02.190823 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29553285-qr99w" event={"ID":"90e8981e-7421-4e08-9bf3-d2fc5a655c2e","Type":"ContainerDied","Data":"75993f40c3a8bfbbd91de5dc63b64b6123f462a31ce64cf33838ea5a0b0374a6"} Mar 11 02:45:03 crc kubenswrapper[4744]: I0311 02:45:03.584089 4744 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29553285-qr99w" Mar 11 02:45:03 crc kubenswrapper[4744]: I0311 02:45:03.661210 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vwsbb\" (UniqueName: \"kubernetes.io/projected/90e8981e-7421-4e08-9bf3-d2fc5a655c2e-kube-api-access-vwsbb\") pod \"90e8981e-7421-4e08-9bf3-d2fc5a655c2e\" (UID: \"90e8981e-7421-4e08-9bf3-d2fc5a655c2e\") " Mar 11 02:45:03 crc kubenswrapper[4744]: I0311 02:45:03.661265 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/90e8981e-7421-4e08-9bf3-d2fc5a655c2e-config-volume\") pod \"90e8981e-7421-4e08-9bf3-d2fc5a655c2e\" (UID: \"90e8981e-7421-4e08-9bf3-d2fc5a655c2e\") " Mar 11 02:45:03 crc kubenswrapper[4744]: I0311 02:45:03.661425 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/90e8981e-7421-4e08-9bf3-d2fc5a655c2e-secret-volume\") pod \"90e8981e-7421-4e08-9bf3-d2fc5a655c2e\" (UID: \"90e8981e-7421-4e08-9bf3-d2fc5a655c2e\") " Mar 11 02:45:03 crc kubenswrapper[4744]: I0311 02:45:03.663174 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/90e8981e-7421-4e08-9bf3-d2fc5a655c2e-config-volume" (OuterVolumeSpecName: "config-volume") pod "90e8981e-7421-4e08-9bf3-d2fc5a655c2e" (UID: "90e8981e-7421-4e08-9bf3-d2fc5a655c2e"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 11 02:45:03 crc kubenswrapper[4744]: I0311 02:45:03.670001 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/90e8981e-7421-4e08-9bf3-d2fc5a655c2e-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "90e8981e-7421-4e08-9bf3-d2fc5a655c2e" (UID: "90e8981e-7421-4e08-9bf3-d2fc5a655c2e"). InnerVolumeSpecName "secret-volume". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 02:45:03 crc kubenswrapper[4744]: I0311 02:45:03.684698 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/90e8981e-7421-4e08-9bf3-d2fc5a655c2e-kube-api-access-vwsbb" (OuterVolumeSpecName: "kube-api-access-vwsbb") pod "90e8981e-7421-4e08-9bf3-d2fc5a655c2e" (UID: "90e8981e-7421-4e08-9bf3-d2fc5a655c2e"). InnerVolumeSpecName "kube-api-access-vwsbb". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 02:45:03 crc kubenswrapper[4744]: I0311 02:45:03.763833 4744 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/90e8981e-7421-4e08-9bf3-d2fc5a655c2e-secret-volume\") on node \"crc\" DevicePath \"\"" Mar 11 02:45:03 crc kubenswrapper[4744]: I0311 02:45:03.763905 4744 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vwsbb\" (UniqueName: \"kubernetes.io/projected/90e8981e-7421-4e08-9bf3-d2fc5a655c2e-kube-api-access-vwsbb\") on node \"crc\" DevicePath \"\"" Mar 11 02:45:03 crc kubenswrapper[4744]: I0311 02:45:03.763938 4744 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/90e8981e-7421-4e08-9bf3-d2fc5a655c2e-config-volume\") on node \"crc\" DevicePath \"\"" Mar 11 02:45:04 crc kubenswrapper[4744]: I0311 02:45:04.213586 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29553285-qr99w" event={"ID":"90e8981e-7421-4e08-9bf3-d2fc5a655c2e","Type":"ContainerDied","Data":"ba3e28a26cc5961e28fca24c64b10dfbaef85711f5dde2f0277248268066d91e"} Mar 11 02:45:04 crc kubenswrapper[4744]: I0311 02:45:04.213639 4744 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ba3e28a26cc5961e28fca24c64b10dfbaef85711f5dde2f0277248268066d91e" Mar 11 02:45:04 crc kubenswrapper[4744]: I0311 02:45:04.213721 4744 util.go:48] "No ready sandbox for pod can be 
found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29553285-qr99w" Mar 11 02:45:04 crc kubenswrapper[4744]: I0311 02:45:04.311815 4744 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29553240-xqm9p"] Mar 11 02:45:04 crc kubenswrapper[4744]: I0311 02:45:04.325884 4744 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29553240-xqm9p"] Mar 11 02:45:05 crc kubenswrapper[4744]: I0311 02:45:05.985065 4744 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9e7e1390-6f32-40d5-adec-3769768dae25" path="/var/lib/kubelet/pods/9e7e1390-6f32-40d5-adec-3769768dae25/volumes" Mar 11 02:45:15 crc kubenswrapper[4744]: I0311 02:45:15.977566 4744 scope.go:117] "RemoveContainer" containerID="7b8ee7052cfbcac075f3aaf8e56a0a7732d17a49604ba786945b064164a25ca7" Mar 11 02:45:15 crc kubenswrapper[4744]: E0311 02:45:15.978281 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-678nx_openshift-machine-config-operator(a15dc7ac-7c34-4135-b6eb-a85122800ce9)\"" pod="openshift-machine-config-operator/machine-config-daemon-678nx" podUID="a15dc7ac-7c34-4135-b6eb-a85122800ce9" Mar 11 02:45:18 crc kubenswrapper[4744]: I0311 02:45:18.485453 4744 scope.go:117] "RemoveContainer" containerID="4ba9951769dc9cbd8f5ca426397018c1fd0f022b4f623255d5d0354ce3981c5b" Mar 11 02:45:27 crc kubenswrapper[4744]: I0311 02:45:27.975657 4744 scope.go:117] "RemoveContainer" containerID="7b8ee7052cfbcac075f3aaf8e56a0a7732d17a49604ba786945b064164a25ca7" Mar 11 02:45:27 crc kubenswrapper[4744]: E0311 02:45:27.978805 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s 
restarting failed container=machine-config-daemon pod=machine-config-daemon-678nx_openshift-machine-config-operator(a15dc7ac-7c34-4135-b6eb-a85122800ce9)\"" pod="openshift-machine-config-operator/machine-config-daemon-678nx" podUID="a15dc7ac-7c34-4135-b6eb-a85122800ce9" Mar 11 02:45:36 crc kubenswrapper[4744]: I0311 02:45:36.553768 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-jt92c/must-gather-c62jv"] Mar 11 02:45:36 crc kubenswrapper[4744]: E0311 02:45:36.554483 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="90e8981e-7421-4e08-9bf3-d2fc5a655c2e" containerName="collect-profiles" Mar 11 02:45:36 crc kubenswrapper[4744]: I0311 02:45:36.554496 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="90e8981e-7421-4e08-9bf3-d2fc5a655c2e" containerName="collect-profiles" Mar 11 02:45:36 crc kubenswrapper[4744]: I0311 02:45:36.554685 4744 memory_manager.go:354] "RemoveStaleState removing state" podUID="90e8981e-7421-4e08-9bf3-d2fc5a655c2e" containerName="collect-profiles" Mar 11 02:45:36 crc kubenswrapper[4744]: I0311 02:45:36.555463 4744 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-jt92c/must-gather-c62jv" Mar 11 02:45:36 crc kubenswrapper[4744]: I0311 02:45:36.557848 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-must-gather-jt92c"/"default-dockercfg-6s558" Mar 11 02:45:36 crc kubenswrapper[4744]: I0311 02:45:36.560956 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-jt92c"/"openshift-service-ca.crt" Mar 11 02:45:36 crc kubenswrapper[4744]: I0311 02:45:36.562760 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-jt92c"/"kube-root-ca.crt" Mar 11 02:45:36 crc kubenswrapper[4744]: I0311 02:45:36.563752 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-jt92c/must-gather-c62jv"] Mar 11 02:45:36 crc kubenswrapper[4744]: I0311 02:45:36.730871 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q6b4t\" (UniqueName: \"kubernetes.io/projected/76eef030-d104-4d21-85b9-d3d5be6456f5-kube-api-access-q6b4t\") pod \"must-gather-c62jv\" (UID: \"76eef030-d104-4d21-85b9-d3d5be6456f5\") " pod="openshift-must-gather-jt92c/must-gather-c62jv" Mar 11 02:45:36 crc kubenswrapper[4744]: I0311 02:45:36.730981 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/76eef030-d104-4d21-85b9-d3d5be6456f5-must-gather-output\") pod \"must-gather-c62jv\" (UID: \"76eef030-d104-4d21-85b9-d3d5be6456f5\") " pod="openshift-must-gather-jt92c/must-gather-c62jv" Mar 11 02:45:36 crc kubenswrapper[4744]: I0311 02:45:36.833171 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/76eef030-d104-4d21-85b9-d3d5be6456f5-must-gather-output\") pod \"must-gather-c62jv\" (UID: \"76eef030-d104-4d21-85b9-d3d5be6456f5\") " 
pod="openshift-must-gather-jt92c/must-gather-c62jv" Mar 11 02:45:36 crc kubenswrapper[4744]: I0311 02:45:36.833410 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q6b4t\" (UniqueName: \"kubernetes.io/projected/76eef030-d104-4d21-85b9-d3d5be6456f5-kube-api-access-q6b4t\") pod \"must-gather-c62jv\" (UID: \"76eef030-d104-4d21-85b9-d3d5be6456f5\") " pod="openshift-must-gather-jt92c/must-gather-c62jv" Mar 11 02:45:36 crc kubenswrapper[4744]: I0311 02:45:36.833749 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/76eef030-d104-4d21-85b9-d3d5be6456f5-must-gather-output\") pod \"must-gather-c62jv\" (UID: \"76eef030-d104-4d21-85b9-d3d5be6456f5\") " pod="openshift-must-gather-jt92c/must-gather-c62jv" Mar 11 02:45:36 crc kubenswrapper[4744]: I0311 02:45:36.868323 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q6b4t\" (UniqueName: \"kubernetes.io/projected/76eef030-d104-4d21-85b9-d3d5be6456f5-kube-api-access-q6b4t\") pod \"must-gather-c62jv\" (UID: \"76eef030-d104-4d21-85b9-d3d5be6456f5\") " pod="openshift-must-gather-jt92c/must-gather-c62jv" Mar 11 02:45:36 crc kubenswrapper[4744]: I0311 02:45:36.880842 4744 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-jt92c/must-gather-c62jv" Mar 11 02:45:37 crc kubenswrapper[4744]: W0311 02:45:37.347641 4744 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod76eef030_d104_4d21_85b9_d3d5be6456f5.slice/crio-35e5ad951e4e17b7bebb4c07b3c8fc369addfe1aea5d8a0410d58e0c6fa93f17 WatchSource:0}: Error finding container 35e5ad951e4e17b7bebb4c07b3c8fc369addfe1aea5d8a0410d58e0c6fa93f17: Status 404 returned error can't find the container with id 35e5ad951e4e17b7bebb4c07b3c8fc369addfe1aea5d8a0410d58e0c6fa93f17 Mar 11 02:45:37 crc kubenswrapper[4744]: I0311 02:45:37.354058 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-jt92c/must-gather-c62jv"] Mar 11 02:45:37 crc kubenswrapper[4744]: I0311 02:45:37.546291 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-jt92c/must-gather-c62jv" event={"ID":"76eef030-d104-4d21-85b9-d3d5be6456f5","Type":"ContainerStarted","Data":"35e5ad951e4e17b7bebb4c07b3c8fc369addfe1aea5d8a0410d58e0c6fa93f17"} Mar 11 02:45:42 crc kubenswrapper[4744]: I0311 02:45:42.975912 4744 scope.go:117] "RemoveContainer" containerID="7b8ee7052cfbcac075f3aaf8e56a0a7732d17a49604ba786945b064164a25ca7" Mar 11 02:45:44 crc kubenswrapper[4744]: I0311 02:45:44.641422 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-678nx" event={"ID":"a15dc7ac-7c34-4135-b6eb-a85122800ce9","Type":"ContainerStarted","Data":"72c4c724e8cb1e13760e1df0ad3d11a092cb6f1b4892570fe646d673551f7a5b"} Mar 11 02:45:44 crc kubenswrapper[4744]: I0311 02:45:44.645899 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-jt92c/must-gather-c62jv" event={"ID":"76eef030-d104-4d21-85b9-d3d5be6456f5","Type":"ContainerStarted","Data":"b549092ef85ef388f7f6865bebaedb4c2adb7ea129e5f1d56808b134338cfedb"} Mar 11 02:45:44 crc kubenswrapper[4744]: I0311 
02:45:44.645945 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-jt92c/must-gather-c62jv" event={"ID":"76eef030-d104-4d21-85b9-d3d5be6456f5","Type":"ContainerStarted","Data":"b94f773583c3a98f3b770276ba85b410eb9cadeae5eabeb798e897181aabce29"} Mar 11 02:45:44 crc kubenswrapper[4744]: I0311 02:45:44.730014 4744 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-jt92c/must-gather-c62jv" podStartSLOduration=2.645408581 podStartE2EDuration="8.729994584s" podCreationTimestamp="2026-03-11 02:45:36 +0000 UTC" firstStartedPulling="2026-03-11 02:45:37.35049778 +0000 UTC m=+6694.154715425" lastFinishedPulling="2026-03-11 02:45:43.435083813 +0000 UTC m=+6700.239301428" observedRunningTime="2026-03-11 02:45:44.713765354 +0000 UTC m=+6701.517982989" watchObservedRunningTime="2026-03-11 02:45:44.729994584 +0000 UTC m=+6701.534212199" Mar 11 02:45:46 crc kubenswrapper[4744]: I0311 02:45:46.909127 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-jt92c/crc-debug-jjjvt"] Mar 11 02:45:46 crc kubenswrapper[4744]: I0311 02:45:46.910488 4744 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-jt92c/crc-debug-jjjvt" Mar 11 02:45:47 crc kubenswrapper[4744]: I0311 02:45:47.046973 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9prnf\" (UniqueName: \"kubernetes.io/projected/e630e3e9-dbe6-4671-ae2a-34a3cad21197-kube-api-access-9prnf\") pod \"crc-debug-jjjvt\" (UID: \"e630e3e9-dbe6-4671-ae2a-34a3cad21197\") " pod="openshift-must-gather-jt92c/crc-debug-jjjvt" Mar 11 02:45:47 crc kubenswrapper[4744]: I0311 02:45:47.047556 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/e630e3e9-dbe6-4671-ae2a-34a3cad21197-host\") pod \"crc-debug-jjjvt\" (UID: \"e630e3e9-dbe6-4671-ae2a-34a3cad21197\") " pod="openshift-must-gather-jt92c/crc-debug-jjjvt" Mar 11 02:45:47 crc kubenswrapper[4744]: I0311 02:45:47.149258 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/e630e3e9-dbe6-4671-ae2a-34a3cad21197-host\") pod \"crc-debug-jjjvt\" (UID: \"e630e3e9-dbe6-4671-ae2a-34a3cad21197\") " pod="openshift-must-gather-jt92c/crc-debug-jjjvt" Mar 11 02:45:47 crc kubenswrapper[4744]: I0311 02:45:47.149361 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9prnf\" (UniqueName: \"kubernetes.io/projected/e630e3e9-dbe6-4671-ae2a-34a3cad21197-kube-api-access-9prnf\") pod \"crc-debug-jjjvt\" (UID: \"e630e3e9-dbe6-4671-ae2a-34a3cad21197\") " pod="openshift-must-gather-jt92c/crc-debug-jjjvt" Mar 11 02:45:47 crc kubenswrapper[4744]: I0311 02:45:47.151104 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/e630e3e9-dbe6-4671-ae2a-34a3cad21197-host\") pod \"crc-debug-jjjvt\" (UID: \"e630e3e9-dbe6-4671-ae2a-34a3cad21197\") " pod="openshift-must-gather-jt92c/crc-debug-jjjvt" Mar 11 02:45:47 crc 
kubenswrapper[4744]: I0311 02:45:47.177034 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9prnf\" (UniqueName: \"kubernetes.io/projected/e630e3e9-dbe6-4671-ae2a-34a3cad21197-kube-api-access-9prnf\") pod \"crc-debug-jjjvt\" (UID: \"e630e3e9-dbe6-4671-ae2a-34a3cad21197\") " pod="openshift-must-gather-jt92c/crc-debug-jjjvt" Mar 11 02:45:47 crc kubenswrapper[4744]: I0311 02:45:47.242540 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-jt92c/crc-debug-jjjvt" Mar 11 02:45:47 crc kubenswrapper[4744]: W0311 02:45:47.273591 4744 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode630e3e9_dbe6_4671_ae2a_34a3cad21197.slice/crio-1589e23468324690e366b49bbf3255bc968e125aa24f833cc1f00189782c3aa6 WatchSource:0}: Error finding container 1589e23468324690e366b49bbf3255bc968e125aa24f833cc1f00189782c3aa6: Status 404 returned error can't find the container with id 1589e23468324690e366b49bbf3255bc968e125aa24f833cc1f00189782c3aa6 Mar 11 02:45:47 crc kubenswrapper[4744]: I0311 02:45:47.671160 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-jt92c/crc-debug-jjjvt" event={"ID":"e630e3e9-dbe6-4671-ae2a-34a3cad21197","Type":"ContainerStarted","Data":"1589e23468324690e366b49bbf3255bc968e125aa24f833cc1f00189782c3aa6"} Mar 11 02:45:58 crc kubenswrapper[4744]: I0311 02:45:58.748925 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-jt92c/crc-debug-jjjvt" event={"ID":"e630e3e9-dbe6-4671-ae2a-34a3cad21197","Type":"ContainerStarted","Data":"2c18008c454c81d2418635809c131d5a5bb3f609efd347876a2e33b7ae2ad4fd"} Mar 11 02:46:00 crc kubenswrapper[4744]: I0311 02:46:00.176931 4744 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-jt92c/crc-debug-jjjvt" podStartSLOduration=3.731749749 podStartE2EDuration="14.176904945s" 
podCreationTimestamp="2026-03-11 02:45:46 +0000 UTC" firstStartedPulling="2026-03-11 02:45:47.277135882 +0000 UTC m=+6704.081353527" lastFinishedPulling="2026-03-11 02:45:57.722291108 +0000 UTC m=+6714.526508723" observedRunningTime="2026-03-11 02:45:58.766006101 +0000 UTC m=+6715.570223716" watchObservedRunningTime="2026-03-11 02:46:00.176904945 +0000 UTC m=+6716.981122570" Mar 11 02:46:00 crc kubenswrapper[4744]: I0311 02:46:00.179899 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29553286-gvvcf"] Mar 11 02:46:00 crc kubenswrapper[4744]: I0311 02:46:00.181072 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29553286-gvvcf" Mar 11 02:46:00 crc kubenswrapper[4744]: I0311 02:46:00.184361 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 11 02:46:00 crc kubenswrapper[4744]: I0311 02:46:00.184622 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 11 02:46:00 crc kubenswrapper[4744]: I0311 02:46:00.185234 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-rs77s" Mar 11 02:46:00 crc kubenswrapper[4744]: I0311 02:46:00.188581 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29553286-gvvcf"] Mar 11 02:46:00 crc kubenswrapper[4744]: I0311 02:46:00.267430 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fvjcz\" (UniqueName: \"kubernetes.io/projected/58a97075-eefa-4f1a-b520-b3ce094b7413-kube-api-access-fvjcz\") pod \"auto-csr-approver-29553286-gvvcf\" (UID: \"58a97075-eefa-4f1a-b520-b3ce094b7413\") " pod="openshift-infra/auto-csr-approver-29553286-gvvcf" Mar 11 02:46:00 crc kubenswrapper[4744]: I0311 02:46:00.368623 4744 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-fvjcz\" (UniqueName: \"kubernetes.io/projected/58a97075-eefa-4f1a-b520-b3ce094b7413-kube-api-access-fvjcz\") pod \"auto-csr-approver-29553286-gvvcf\" (UID: \"58a97075-eefa-4f1a-b520-b3ce094b7413\") " pod="openshift-infra/auto-csr-approver-29553286-gvvcf" Mar 11 02:46:00 crc kubenswrapper[4744]: I0311 02:46:00.398174 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fvjcz\" (UniqueName: \"kubernetes.io/projected/58a97075-eefa-4f1a-b520-b3ce094b7413-kube-api-access-fvjcz\") pod \"auto-csr-approver-29553286-gvvcf\" (UID: \"58a97075-eefa-4f1a-b520-b3ce094b7413\") " pod="openshift-infra/auto-csr-approver-29553286-gvvcf" Mar 11 02:46:00 crc kubenswrapper[4744]: I0311 02:46:00.500156 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29553286-gvvcf" Mar 11 02:46:00 crc kubenswrapper[4744]: I0311 02:46:00.979872 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29553286-gvvcf"] Mar 11 02:46:01 crc kubenswrapper[4744]: I0311 02:46:01.776426 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29553286-gvvcf" event={"ID":"58a97075-eefa-4f1a-b520-b3ce094b7413","Type":"ContainerStarted","Data":"fc4fe37d7f71822daab940da50ca916592a0ec62a9841695dea17aa50172f3d6"} Mar 11 02:46:04 crc kubenswrapper[4744]: I0311 02:46:04.810299 4744 generic.go:334] "Generic (PLEG): container finished" podID="58a97075-eefa-4f1a-b520-b3ce094b7413" containerID="b4bc7a03dafbe98d63eb695bcb550fed4d0f8e0f5ec40f99bdec48b7b01b08a0" exitCode=0 Mar 11 02:46:04 crc kubenswrapper[4744]: I0311 02:46:04.810541 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29553286-gvvcf" 
event={"ID":"58a97075-eefa-4f1a-b520-b3ce094b7413","Type":"ContainerDied","Data":"b4bc7a03dafbe98d63eb695bcb550fed4d0f8e0f5ec40f99bdec48b7b01b08a0"} Mar 11 02:46:06 crc kubenswrapper[4744]: I0311 02:46:06.234189 4744 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29553286-gvvcf" Mar 11 02:46:06 crc kubenswrapper[4744]: I0311 02:46:06.267124 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fvjcz\" (UniqueName: \"kubernetes.io/projected/58a97075-eefa-4f1a-b520-b3ce094b7413-kube-api-access-fvjcz\") pod \"58a97075-eefa-4f1a-b520-b3ce094b7413\" (UID: \"58a97075-eefa-4f1a-b520-b3ce094b7413\") " Mar 11 02:46:06 crc kubenswrapper[4744]: I0311 02:46:06.281695 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/58a97075-eefa-4f1a-b520-b3ce094b7413-kube-api-access-fvjcz" (OuterVolumeSpecName: "kube-api-access-fvjcz") pod "58a97075-eefa-4f1a-b520-b3ce094b7413" (UID: "58a97075-eefa-4f1a-b520-b3ce094b7413"). InnerVolumeSpecName "kube-api-access-fvjcz". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 02:46:06 crc kubenswrapper[4744]: I0311 02:46:06.369958 4744 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fvjcz\" (UniqueName: \"kubernetes.io/projected/58a97075-eefa-4f1a-b520-b3ce094b7413-kube-api-access-fvjcz\") on node \"crc\" DevicePath \"\"" Mar 11 02:46:06 crc kubenswrapper[4744]: I0311 02:46:06.829580 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29553286-gvvcf" event={"ID":"58a97075-eefa-4f1a-b520-b3ce094b7413","Type":"ContainerDied","Data":"fc4fe37d7f71822daab940da50ca916592a0ec62a9841695dea17aa50172f3d6"} Mar 11 02:46:06 crc kubenswrapper[4744]: I0311 02:46:06.829973 4744 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="fc4fe37d7f71822daab940da50ca916592a0ec62a9841695dea17aa50172f3d6" Mar 11 02:46:06 crc kubenswrapper[4744]: I0311 02:46:06.829712 4744 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29553286-gvvcf" Mar 11 02:46:07 crc kubenswrapper[4744]: I0311 02:46:07.294357 4744 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29553280-44mf9"] Mar 11 02:46:07 crc kubenswrapper[4744]: I0311 02:46:07.300576 4744 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29553280-44mf9"] Mar 11 02:46:07 crc kubenswrapper[4744]: I0311 02:46:07.994179 4744 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5e700547-ace4-49e2-b216-0fd80fcb915a" path="/var/lib/kubelet/pods/5e700547-ace4-49e2-b216-0fd80fcb915a/volumes" Mar 11 02:46:13 crc kubenswrapper[4744]: I0311 02:46:13.892675 4744 generic.go:334] "Generic (PLEG): container finished" podID="e630e3e9-dbe6-4671-ae2a-34a3cad21197" containerID="2c18008c454c81d2418635809c131d5a5bb3f609efd347876a2e33b7ae2ad4fd" exitCode=0 Mar 11 02:46:13 crc kubenswrapper[4744]: I0311 02:46:13.892764 4744 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-jt92c/crc-debug-jjjvt" event={"ID":"e630e3e9-dbe6-4671-ae2a-34a3cad21197","Type":"ContainerDied","Data":"2c18008c454c81d2418635809c131d5a5bb3f609efd347876a2e33b7ae2ad4fd"} Mar 11 02:46:14 crc kubenswrapper[4744]: I0311 02:46:14.985620 4744 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-jt92c/crc-debug-jjjvt" Mar 11 02:46:15 crc kubenswrapper[4744]: I0311 02:46:15.035641 4744 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-jt92c/crc-debug-jjjvt"] Mar 11 02:46:15 crc kubenswrapper[4744]: I0311 02:46:15.044873 4744 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-jt92c/crc-debug-jjjvt"] Mar 11 02:46:15 crc kubenswrapper[4744]: I0311 02:46:15.124852 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/e630e3e9-dbe6-4671-ae2a-34a3cad21197-host\") pod \"e630e3e9-dbe6-4671-ae2a-34a3cad21197\" (UID: \"e630e3e9-dbe6-4671-ae2a-34a3cad21197\") " Mar 11 02:46:15 crc kubenswrapper[4744]: I0311 02:46:15.125033 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/e630e3e9-dbe6-4671-ae2a-34a3cad21197-host" (OuterVolumeSpecName: "host") pod "e630e3e9-dbe6-4671-ae2a-34a3cad21197" (UID: "e630e3e9-dbe6-4671-ae2a-34a3cad21197"). InnerVolumeSpecName "host". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 11 02:46:15 crc kubenswrapper[4744]: I0311 02:46:15.125702 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9prnf\" (UniqueName: \"kubernetes.io/projected/e630e3e9-dbe6-4671-ae2a-34a3cad21197-kube-api-access-9prnf\") pod \"e630e3e9-dbe6-4671-ae2a-34a3cad21197\" (UID: \"e630e3e9-dbe6-4671-ae2a-34a3cad21197\") " Mar 11 02:46:15 crc kubenswrapper[4744]: I0311 02:46:15.126649 4744 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/e630e3e9-dbe6-4671-ae2a-34a3cad21197-host\") on node \"crc\" DevicePath \"\"" Mar 11 02:46:15 crc kubenswrapper[4744]: I0311 02:46:15.131668 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e630e3e9-dbe6-4671-ae2a-34a3cad21197-kube-api-access-9prnf" (OuterVolumeSpecName: "kube-api-access-9prnf") pod "e630e3e9-dbe6-4671-ae2a-34a3cad21197" (UID: "e630e3e9-dbe6-4671-ae2a-34a3cad21197"). InnerVolumeSpecName "kube-api-access-9prnf". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 02:46:15 crc kubenswrapper[4744]: I0311 02:46:15.228425 4744 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9prnf\" (UniqueName: \"kubernetes.io/projected/e630e3e9-dbe6-4671-ae2a-34a3cad21197-kube-api-access-9prnf\") on node \"crc\" DevicePath \"\"" Mar 11 02:46:15 crc kubenswrapper[4744]: I0311 02:46:15.914724 4744 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1589e23468324690e366b49bbf3255bc968e125aa24f833cc1f00189782c3aa6" Mar 11 02:46:15 crc kubenswrapper[4744]: I0311 02:46:15.914819 4744 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-jt92c/crc-debug-jjjvt" Mar 11 02:46:15 crc kubenswrapper[4744]: I0311 02:46:15.997555 4744 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e630e3e9-dbe6-4671-ae2a-34a3cad21197" path="/var/lib/kubelet/pods/e630e3e9-dbe6-4671-ae2a-34a3cad21197/volumes" Mar 11 02:46:16 crc kubenswrapper[4744]: I0311 02:46:16.245343 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-jt92c/crc-debug-dk2gk"] Mar 11 02:46:16 crc kubenswrapper[4744]: E0311 02:46:16.245831 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e630e3e9-dbe6-4671-ae2a-34a3cad21197" containerName="container-00" Mar 11 02:46:16 crc kubenswrapper[4744]: I0311 02:46:16.245859 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="e630e3e9-dbe6-4671-ae2a-34a3cad21197" containerName="container-00" Mar 11 02:46:16 crc kubenswrapper[4744]: E0311 02:46:16.245892 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="58a97075-eefa-4f1a-b520-b3ce094b7413" containerName="oc" Mar 11 02:46:16 crc kubenswrapper[4744]: I0311 02:46:16.245904 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="58a97075-eefa-4f1a-b520-b3ce094b7413" containerName="oc" Mar 11 02:46:16 crc kubenswrapper[4744]: I0311 02:46:16.246217 4744 memory_manager.go:354] "RemoveStaleState removing state" podUID="58a97075-eefa-4f1a-b520-b3ce094b7413" containerName="oc" Mar 11 02:46:16 crc kubenswrapper[4744]: I0311 02:46:16.246246 4744 memory_manager.go:354] "RemoveStaleState removing state" podUID="e630e3e9-dbe6-4671-ae2a-34a3cad21197" containerName="container-00" Mar 11 02:46:16 crc kubenswrapper[4744]: I0311 02:46:16.247204 4744 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-jt92c/crc-debug-dk2gk" Mar 11 02:46:16 crc kubenswrapper[4744]: I0311 02:46:16.349612 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/d0f0f9e0-17a3-4223-a92a-b36e02d268d0-host\") pod \"crc-debug-dk2gk\" (UID: \"d0f0f9e0-17a3-4223-a92a-b36e02d268d0\") " pod="openshift-must-gather-jt92c/crc-debug-dk2gk" Mar 11 02:46:16 crc kubenswrapper[4744]: I0311 02:46:16.350225 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-krqzp\" (UniqueName: \"kubernetes.io/projected/d0f0f9e0-17a3-4223-a92a-b36e02d268d0-kube-api-access-krqzp\") pod \"crc-debug-dk2gk\" (UID: \"d0f0f9e0-17a3-4223-a92a-b36e02d268d0\") " pod="openshift-must-gather-jt92c/crc-debug-dk2gk" Mar 11 02:46:16 crc kubenswrapper[4744]: I0311 02:46:16.451713 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-krqzp\" (UniqueName: \"kubernetes.io/projected/d0f0f9e0-17a3-4223-a92a-b36e02d268d0-kube-api-access-krqzp\") pod \"crc-debug-dk2gk\" (UID: \"d0f0f9e0-17a3-4223-a92a-b36e02d268d0\") " pod="openshift-must-gather-jt92c/crc-debug-dk2gk" Mar 11 02:46:16 crc kubenswrapper[4744]: I0311 02:46:16.451884 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/d0f0f9e0-17a3-4223-a92a-b36e02d268d0-host\") pod \"crc-debug-dk2gk\" (UID: \"d0f0f9e0-17a3-4223-a92a-b36e02d268d0\") " pod="openshift-must-gather-jt92c/crc-debug-dk2gk" Mar 11 02:46:16 crc kubenswrapper[4744]: I0311 02:46:16.452113 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/d0f0f9e0-17a3-4223-a92a-b36e02d268d0-host\") pod \"crc-debug-dk2gk\" (UID: \"d0f0f9e0-17a3-4223-a92a-b36e02d268d0\") " pod="openshift-must-gather-jt92c/crc-debug-dk2gk" Mar 11 02:46:16 crc 
kubenswrapper[4744]: I0311 02:46:16.476895 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-krqzp\" (UniqueName: \"kubernetes.io/projected/d0f0f9e0-17a3-4223-a92a-b36e02d268d0-kube-api-access-krqzp\") pod \"crc-debug-dk2gk\" (UID: \"d0f0f9e0-17a3-4223-a92a-b36e02d268d0\") " pod="openshift-must-gather-jt92c/crc-debug-dk2gk" Mar 11 02:46:16 crc kubenswrapper[4744]: I0311 02:46:16.575896 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-jt92c/crc-debug-dk2gk" Mar 11 02:46:16 crc kubenswrapper[4744]: W0311 02:46:16.630122 4744 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd0f0f9e0_17a3_4223_a92a_b36e02d268d0.slice/crio-b94fb95d968bc04b3aba84f69bea695e44db6db5a565a914993400d20949633e WatchSource:0}: Error finding container b94fb95d968bc04b3aba84f69bea695e44db6db5a565a914993400d20949633e: Status 404 returned error can't find the container with id b94fb95d968bc04b3aba84f69bea695e44db6db5a565a914993400d20949633e Mar 11 02:46:16 crc kubenswrapper[4744]: I0311 02:46:16.929411 4744 generic.go:334] "Generic (PLEG): container finished" podID="d0f0f9e0-17a3-4223-a92a-b36e02d268d0" containerID="964c9cbf35b2cc144d816415714f9067239426ca4d0fa9c09e9547b743408c47" exitCode=1 Mar 11 02:46:16 crc kubenswrapper[4744]: I0311 02:46:16.929545 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-jt92c/crc-debug-dk2gk" event={"ID":"d0f0f9e0-17a3-4223-a92a-b36e02d268d0","Type":"ContainerDied","Data":"964c9cbf35b2cc144d816415714f9067239426ca4d0fa9c09e9547b743408c47"} Mar 11 02:46:16 crc kubenswrapper[4744]: I0311 02:46:16.929732 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-jt92c/crc-debug-dk2gk" event={"ID":"d0f0f9e0-17a3-4223-a92a-b36e02d268d0","Type":"ContainerStarted","Data":"b94fb95d968bc04b3aba84f69bea695e44db6db5a565a914993400d20949633e"} Mar 11 
02:46:16 crc kubenswrapper[4744]: I0311 02:46:16.973598 4744 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-jt92c/crc-debug-dk2gk"] Mar 11 02:46:16 crc kubenswrapper[4744]: I0311 02:46:16.983622 4744 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-jt92c/crc-debug-dk2gk"] Mar 11 02:46:18 crc kubenswrapper[4744]: I0311 02:46:18.035409 4744 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-jt92c/crc-debug-dk2gk" Mar 11 02:46:18 crc kubenswrapper[4744]: I0311 02:46:18.185008 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/d0f0f9e0-17a3-4223-a92a-b36e02d268d0-host\") pod \"d0f0f9e0-17a3-4223-a92a-b36e02d268d0\" (UID: \"d0f0f9e0-17a3-4223-a92a-b36e02d268d0\") " Mar 11 02:46:18 crc kubenswrapper[4744]: I0311 02:46:18.185176 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/d0f0f9e0-17a3-4223-a92a-b36e02d268d0-host" (OuterVolumeSpecName: "host") pod "d0f0f9e0-17a3-4223-a92a-b36e02d268d0" (UID: "d0f0f9e0-17a3-4223-a92a-b36e02d268d0"). InnerVolumeSpecName "host". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 11 02:46:18 crc kubenswrapper[4744]: I0311 02:46:18.185789 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-krqzp\" (UniqueName: \"kubernetes.io/projected/d0f0f9e0-17a3-4223-a92a-b36e02d268d0-kube-api-access-krqzp\") pod \"d0f0f9e0-17a3-4223-a92a-b36e02d268d0\" (UID: \"d0f0f9e0-17a3-4223-a92a-b36e02d268d0\") " Mar 11 02:46:18 crc kubenswrapper[4744]: I0311 02:46:18.186498 4744 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/d0f0f9e0-17a3-4223-a92a-b36e02d268d0-host\") on node \"crc\" DevicePath \"\"" Mar 11 02:46:18 crc kubenswrapper[4744]: I0311 02:46:18.191909 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d0f0f9e0-17a3-4223-a92a-b36e02d268d0-kube-api-access-krqzp" (OuterVolumeSpecName: "kube-api-access-krqzp") pod "d0f0f9e0-17a3-4223-a92a-b36e02d268d0" (UID: "d0f0f9e0-17a3-4223-a92a-b36e02d268d0"). InnerVolumeSpecName "kube-api-access-krqzp". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 02:46:18 crc kubenswrapper[4744]: I0311 02:46:18.288102 4744 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-krqzp\" (UniqueName: \"kubernetes.io/projected/d0f0f9e0-17a3-4223-a92a-b36e02d268d0-kube-api-access-krqzp\") on node \"crc\" DevicePath \"\"" Mar 11 02:46:18 crc kubenswrapper[4744]: I0311 02:46:18.571927 4744 scope.go:117] "RemoveContainer" containerID="17da550c8b0e104ba33c462670009067131bb69dd24e6d3e97177480cd2e832f" Mar 11 02:46:18 crc kubenswrapper[4744]: I0311 02:46:18.946938 4744 scope.go:117] "RemoveContainer" containerID="964c9cbf35b2cc144d816415714f9067239426ca4d0fa9c09e9547b743408c47" Mar 11 02:46:18 crc kubenswrapper[4744]: I0311 02:46:18.947153 4744 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-jt92c/crc-debug-dk2gk" Mar 11 02:46:19 crc kubenswrapper[4744]: I0311 02:46:19.984411 4744 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d0f0f9e0-17a3-4223-a92a-b36e02d268d0" path="/var/lib/kubelet/pods/d0f0f9e0-17a3-4223-a92a-b36e02d268d0/volumes" Mar 11 02:46:26 crc kubenswrapper[4744]: I0311 02:46:26.968464 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-68264"] Mar 11 02:46:26 crc kubenswrapper[4744]: E0311 02:46:26.970019 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d0f0f9e0-17a3-4223-a92a-b36e02d268d0" containerName="container-00" Mar 11 02:46:26 crc kubenswrapper[4744]: I0311 02:46:26.970045 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="d0f0f9e0-17a3-4223-a92a-b36e02d268d0" containerName="container-00" Mar 11 02:46:26 crc kubenswrapper[4744]: I0311 02:46:26.970339 4744 memory_manager.go:354] "RemoveStaleState removing state" podUID="d0f0f9e0-17a3-4223-a92a-b36e02d268d0" containerName="container-00" Mar 11 02:46:26 crc kubenswrapper[4744]: I0311 02:46:26.972315 4744 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-68264" Mar 11 02:46:26 crc kubenswrapper[4744]: I0311 02:46:26.986462 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-68264"] Mar 11 02:46:27 crc kubenswrapper[4744]: I0311 02:46:27.039841 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ca21227a-204d-477a-aae8-a7897115151f-catalog-content\") pod \"redhat-operators-68264\" (UID: \"ca21227a-204d-477a-aae8-a7897115151f\") " pod="openshift-marketplace/redhat-operators-68264" Mar 11 02:46:27 crc kubenswrapper[4744]: I0311 02:46:27.040036 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ca21227a-204d-477a-aae8-a7897115151f-utilities\") pod \"redhat-operators-68264\" (UID: \"ca21227a-204d-477a-aae8-a7897115151f\") " pod="openshift-marketplace/redhat-operators-68264" Mar 11 02:46:27 crc kubenswrapper[4744]: I0311 02:46:27.040130 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z2km7\" (UniqueName: \"kubernetes.io/projected/ca21227a-204d-477a-aae8-a7897115151f-kube-api-access-z2km7\") pod \"redhat-operators-68264\" (UID: \"ca21227a-204d-477a-aae8-a7897115151f\") " pod="openshift-marketplace/redhat-operators-68264" Mar 11 02:46:27 crc kubenswrapper[4744]: I0311 02:46:27.141490 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z2km7\" (UniqueName: \"kubernetes.io/projected/ca21227a-204d-477a-aae8-a7897115151f-kube-api-access-z2km7\") pod \"redhat-operators-68264\" (UID: \"ca21227a-204d-477a-aae8-a7897115151f\") " pod="openshift-marketplace/redhat-operators-68264" Mar 11 02:46:27 crc kubenswrapper[4744]: I0311 02:46:27.141601 4744 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ca21227a-204d-477a-aae8-a7897115151f-catalog-content\") pod \"redhat-operators-68264\" (UID: \"ca21227a-204d-477a-aae8-a7897115151f\") " pod="openshift-marketplace/redhat-operators-68264" Mar 11 02:46:27 crc kubenswrapper[4744]: I0311 02:46:27.141686 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ca21227a-204d-477a-aae8-a7897115151f-utilities\") pod \"redhat-operators-68264\" (UID: \"ca21227a-204d-477a-aae8-a7897115151f\") " pod="openshift-marketplace/redhat-operators-68264" Mar 11 02:46:27 crc kubenswrapper[4744]: I0311 02:46:27.142047 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ca21227a-204d-477a-aae8-a7897115151f-catalog-content\") pod \"redhat-operators-68264\" (UID: \"ca21227a-204d-477a-aae8-a7897115151f\") " pod="openshift-marketplace/redhat-operators-68264" Mar 11 02:46:27 crc kubenswrapper[4744]: I0311 02:46:27.142109 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ca21227a-204d-477a-aae8-a7897115151f-utilities\") pod \"redhat-operators-68264\" (UID: \"ca21227a-204d-477a-aae8-a7897115151f\") " pod="openshift-marketplace/redhat-operators-68264" Mar 11 02:46:27 crc kubenswrapper[4744]: I0311 02:46:27.164914 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z2km7\" (UniqueName: \"kubernetes.io/projected/ca21227a-204d-477a-aae8-a7897115151f-kube-api-access-z2km7\") pod \"redhat-operators-68264\" (UID: \"ca21227a-204d-477a-aae8-a7897115151f\") " pod="openshift-marketplace/redhat-operators-68264" Mar 11 02:46:27 crc kubenswrapper[4744]: I0311 02:46:27.308919 4744 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-68264" Mar 11 02:46:27 crc kubenswrapper[4744]: I0311 02:46:27.776141 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-68264"] Mar 11 02:46:28 crc kubenswrapper[4744]: I0311 02:46:28.061965 4744 generic.go:334] "Generic (PLEG): container finished" podID="ca21227a-204d-477a-aae8-a7897115151f" containerID="ad73e8a5ed32a712be56fabb4c3903ffb6b4e7cd734a460343c2d182f5624192" exitCode=0 Mar 11 02:46:28 crc kubenswrapper[4744]: I0311 02:46:28.062068 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-68264" event={"ID":"ca21227a-204d-477a-aae8-a7897115151f","Type":"ContainerDied","Data":"ad73e8a5ed32a712be56fabb4c3903ffb6b4e7cd734a460343c2d182f5624192"} Mar 11 02:46:28 crc kubenswrapper[4744]: I0311 02:46:28.062210 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-68264" event={"ID":"ca21227a-204d-477a-aae8-a7897115151f","Type":"ContainerStarted","Data":"a06c00c1bf1e6799613f13b6e6c169b8d379168da4226d8765e8f21bf4684b43"} Mar 11 02:46:30 crc kubenswrapper[4744]: I0311 02:46:30.078863 4744 generic.go:334] "Generic (PLEG): container finished" podID="ca21227a-204d-477a-aae8-a7897115151f" containerID="695271db24390bd07f37635f0cf7b245e98bca6befcf0ffbcb88dca5f9630f26" exitCode=0 Mar 11 02:46:30 crc kubenswrapper[4744]: I0311 02:46:30.079446 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-68264" event={"ID":"ca21227a-204d-477a-aae8-a7897115151f","Type":"ContainerDied","Data":"695271db24390bd07f37635f0cf7b245e98bca6befcf0ffbcb88dca5f9630f26"} Mar 11 02:46:31 crc kubenswrapper[4744]: I0311 02:46:31.095075 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-68264" 
event={"ID":"ca21227a-204d-477a-aae8-a7897115151f","Type":"ContainerStarted","Data":"028255a82ab0501631ebb4dc7f36d24d832b604ebd2f4fbe7e7b4c076414764f"} Mar 11 02:46:31 crc kubenswrapper[4744]: I0311 02:46:31.124369 4744 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-68264" podStartSLOduration=2.648018923 podStartE2EDuration="5.12435342s" podCreationTimestamp="2026-03-11 02:46:26 +0000 UTC" firstStartedPulling="2026-03-11 02:46:28.063012581 +0000 UTC m=+6744.867230186" lastFinishedPulling="2026-03-11 02:46:30.539347038 +0000 UTC m=+6747.343564683" observedRunningTime="2026-03-11 02:46:31.121359288 +0000 UTC m=+6747.925576893" watchObservedRunningTime="2026-03-11 02:46:31.12435342 +0000 UTC m=+6747.928571025" Mar 11 02:46:37 crc kubenswrapper[4744]: I0311 02:46:37.309531 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-68264" Mar 11 02:46:37 crc kubenswrapper[4744]: I0311 02:46:37.309953 4744 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-68264" Mar 11 02:46:38 crc kubenswrapper[4744]: I0311 02:46:38.360006 4744 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-68264" podUID="ca21227a-204d-477a-aae8-a7897115151f" containerName="registry-server" probeResult="failure" output=< Mar 11 02:46:38 crc kubenswrapper[4744]: timeout: failed to connect service ":50051" within 1s Mar 11 02:46:38 crc kubenswrapper[4744]: > Mar 11 02:46:40 crc kubenswrapper[4744]: I0311 02:46:40.991881 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-f4xg9"] Mar 11 02:46:41 crc kubenswrapper[4744]: I0311 02:46:41.000800 4744 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-f4xg9" Mar 11 02:46:41 crc kubenswrapper[4744]: I0311 02:46:41.017848 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-f4xg9"] Mar 11 02:46:41 crc kubenswrapper[4744]: I0311 02:46:41.124174 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/db79fe26-1042-4b93-927f-6c570a90a49d-utilities\") pod \"community-operators-f4xg9\" (UID: \"db79fe26-1042-4b93-927f-6c570a90a49d\") " pod="openshift-marketplace/community-operators-f4xg9" Mar 11 02:46:41 crc kubenswrapper[4744]: I0311 02:46:41.124209 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xc7bm\" (UniqueName: \"kubernetes.io/projected/db79fe26-1042-4b93-927f-6c570a90a49d-kube-api-access-xc7bm\") pod \"community-operators-f4xg9\" (UID: \"db79fe26-1042-4b93-927f-6c570a90a49d\") " pod="openshift-marketplace/community-operators-f4xg9" Mar 11 02:46:41 crc kubenswrapper[4744]: I0311 02:46:41.124285 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/db79fe26-1042-4b93-927f-6c570a90a49d-catalog-content\") pod \"community-operators-f4xg9\" (UID: \"db79fe26-1042-4b93-927f-6c570a90a49d\") " pod="openshift-marketplace/community-operators-f4xg9" Mar 11 02:46:41 crc kubenswrapper[4744]: I0311 02:46:41.225843 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xc7bm\" (UniqueName: \"kubernetes.io/projected/db79fe26-1042-4b93-927f-6c570a90a49d-kube-api-access-xc7bm\") pod \"community-operators-f4xg9\" (UID: \"db79fe26-1042-4b93-927f-6c570a90a49d\") " pod="openshift-marketplace/community-operators-f4xg9" Mar 11 02:46:41 crc kubenswrapper[4744]: I0311 02:46:41.225888 4744 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/db79fe26-1042-4b93-927f-6c570a90a49d-utilities\") pod \"community-operators-f4xg9\" (UID: \"db79fe26-1042-4b93-927f-6c570a90a49d\") " pod="openshift-marketplace/community-operators-f4xg9" Mar 11 02:46:41 crc kubenswrapper[4744]: I0311 02:46:41.225941 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/db79fe26-1042-4b93-927f-6c570a90a49d-catalog-content\") pod \"community-operators-f4xg9\" (UID: \"db79fe26-1042-4b93-927f-6c570a90a49d\") " pod="openshift-marketplace/community-operators-f4xg9" Mar 11 02:46:41 crc kubenswrapper[4744]: I0311 02:46:41.226391 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/db79fe26-1042-4b93-927f-6c570a90a49d-utilities\") pod \"community-operators-f4xg9\" (UID: \"db79fe26-1042-4b93-927f-6c570a90a49d\") " pod="openshift-marketplace/community-operators-f4xg9" Mar 11 02:46:41 crc kubenswrapper[4744]: I0311 02:46:41.226419 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/db79fe26-1042-4b93-927f-6c570a90a49d-catalog-content\") pod \"community-operators-f4xg9\" (UID: \"db79fe26-1042-4b93-927f-6c570a90a49d\") " pod="openshift-marketplace/community-operators-f4xg9" Mar 11 02:46:41 crc kubenswrapper[4744]: I0311 02:46:41.246309 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xc7bm\" (UniqueName: \"kubernetes.io/projected/db79fe26-1042-4b93-927f-6c570a90a49d-kube-api-access-xc7bm\") pod \"community-operators-f4xg9\" (UID: \"db79fe26-1042-4b93-927f-6c570a90a49d\") " pod="openshift-marketplace/community-operators-f4xg9" Mar 11 02:46:41 crc kubenswrapper[4744]: I0311 02:46:41.339149 4744 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-f4xg9" Mar 11 02:46:41 crc kubenswrapper[4744]: I0311 02:46:41.492985 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-59b6c5dc-qrpfm_0818d695-0611-4e9d-b2d4-4894bec77500/init/0.log" Mar 11 02:46:41 crc kubenswrapper[4744]: I0311 02:46:41.670181 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-f4xg9"] Mar 11 02:46:41 crc kubenswrapper[4744]: W0311 02:46:41.682576 4744 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poddb79fe26_1042_4b93_927f_6c570a90a49d.slice/crio-b8e7c837402e0a10c0bf8ec1e6c825d2838da4d900459af4f86719a9b932b5b4 WatchSource:0}: Error finding container b8e7c837402e0a10c0bf8ec1e6c825d2838da4d900459af4f86719a9b932b5b4: Status 404 returned error can't find the container with id b8e7c837402e0a10c0bf8ec1e6c825d2838da4d900459af4f86719a9b932b5b4 Mar 11 02:46:41 crc kubenswrapper[4744]: I0311 02:46:41.806396 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-59b6c5dc-qrpfm_0818d695-0611-4e9d-b2d4-4894bec77500/dnsmasq-dns/0.log" Mar 11 02:46:41 crc kubenswrapper[4744]: I0311 02:46:41.806643 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-59b6c5dc-qrpfm_0818d695-0611-4e9d-b2d4-4894bec77500/init/0.log" Mar 11 02:46:41 crc kubenswrapper[4744]: I0311 02:46:41.897025 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_keystone-84c66b68ff-jvb9q_8eda2e74-8afe-403d-b20d-b3953a3bed0f/keystone-api/0.log" Mar 11 02:46:41 crc kubenswrapper[4744]: I0311 02:46:41.999032 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_mariadb-copy-data_57813bc2-80d5-486e-8258-32b184f74ed6/adoption/0.log" Mar 11 02:46:42 crc kubenswrapper[4744]: I0311 02:46:42.202139 4744 generic.go:334] "Generic (PLEG): container finished" 
podID="db79fe26-1042-4b93-927f-6c570a90a49d" containerID="e9058cfd7080074d79154bf42592c86773d012a92beb518b8c46bffc62515bf1" exitCode=0 Mar 11 02:46:42 crc kubenswrapper[4744]: I0311 02:46:42.202370 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-f4xg9" event={"ID":"db79fe26-1042-4b93-927f-6c570a90a49d","Type":"ContainerDied","Data":"e9058cfd7080074d79154bf42592c86773d012a92beb518b8c46bffc62515bf1"} Mar 11 02:46:42 crc kubenswrapper[4744]: I0311 02:46:42.202396 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-f4xg9" event={"ID":"db79fe26-1042-4b93-927f-6c570a90a49d","Type":"ContainerStarted","Data":"b8e7c837402e0a10c0bf8ec1e6c825d2838da4d900459af4f86719a9b932b5b4"} Mar 11 02:46:42 crc kubenswrapper[4744]: I0311 02:46:42.254576 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_776b5477-2a1c-4938-9f48-c165db85c160/mysql-bootstrap/0.log" Mar 11 02:46:42 crc kubenswrapper[4744]: I0311 02:46:42.387928 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_776b5477-2a1c-4938-9f48-c165db85c160/mysql-bootstrap/0.log" Mar 11 02:46:42 crc kubenswrapper[4744]: I0311 02:46:42.436472 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_776b5477-2a1c-4938-9f48-c165db85c160/galera/0.log" Mar 11 02:46:42 crc kubenswrapper[4744]: I0311 02:46:42.613022 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_a6e19d5d-20f5-4836-afcc-a5958a01bbf2/mysql-bootstrap/0.log" Mar 11 02:46:42 crc kubenswrapper[4744]: I0311 02:46:42.792182 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_a6e19d5d-20f5-4836-afcc-a5958a01bbf2/mysql-bootstrap/0.log" Mar 11 02:46:42 crc kubenswrapper[4744]: I0311 02:46:42.802344 4744 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_openstack-galera-0_a6e19d5d-20f5-4836-afcc-a5958a01bbf2/galera/0.log" Mar 11 02:46:43 crc kubenswrapper[4744]: I0311 02:46:43.030413 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstackclient_b11cb88a-4127-48bb-9bb1-76c564f9d050/openstackclient/0.log" Mar 11 02:46:43 crc kubenswrapper[4744]: I0311 02:46:43.105854 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_memcached-0_c1383c00-c842-43c0-a8c1-c81c615b6e8e/memcached/0.log" Mar 11 02:46:43 crc kubenswrapper[4744]: I0311 02:46:43.126432 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-copy-data_deb2a1ca-b54e-4439-8889-67b4e9407b3b/adoption/0.log" Mar 11 02:46:43 crc kubenswrapper[4744]: I0311 02:46:43.210049 4744 generic.go:334] "Generic (PLEG): container finished" podID="db79fe26-1042-4b93-927f-6c570a90a49d" containerID="bea1422fcc5a27f26873fa2b861154b29c43a7ca73ffe16165b9db516255f3a2" exitCode=0 Mar 11 02:46:43 crc kubenswrapper[4744]: I0311 02:46:43.210089 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-f4xg9" event={"ID":"db79fe26-1042-4b93-927f-6c570a90a49d","Type":"ContainerDied","Data":"bea1422fcc5a27f26873fa2b861154b29c43a7ca73ffe16165b9db516255f3a2"} Mar 11 02:46:43 crc kubenswrapper[4744]: I0311 02:46:43.255396 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_11bc054e-dfda-4532-85a0-a74a6afd5e4e/ovn-northd/0.log" Mar 11 02:46:43 crc kubenswrapper[4744]: I0311 02:46:43.258032 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_11bc054e-dfda-4532-85a0-a74a6afd5e4e/openstack-network-exporter/0.log" Mar 11 02:46:43 crc kubenswrapper[4744]: I0311 02:46:43.422657 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_6a398ee8-1d34-45b7-a50b-82d6880407d2/openstack-network-exporter/0.log" Mar 11 02:46:43 crc kubenswrapper[4744]: I0311 
02:46:43.438335 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_6a398ee8-1d34-45b7-a50b-82d6880407d2/ovsdbserver-nb/0.log" Mar 11 02:46:43 crc kubenswrapper[4744]: I0311 02:46:43.505003 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-1_95b1b29d-10d0-4950-8b51-322683a204ad/openstack-network-exporter/0.log" Mar 11 02:46:43 crc kubenswrapper[4744]: I0311 02:46:43.621425 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-1_95b1b29d-10d0-4950-8b51-322683a204ad/ovsdbserver-nb/0.log" Mar 11 02:46:43 crc kubenswrapper[4744]: I0311 02:46:43.654883 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-2_69276227-3db4-4f45-b4bf-3595c388000b/openstack-network-exporter/0.log" Mar 11 02:46:43 crc kubenswrapper[4744]: I0311 02:46:43.691145 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-2_69276227-3db4-4f45-b4bf-3595c388000b/ovsdbserver-nb/0.log" Mar 11 02:46:43 crc kubenswrapper[4744]: I0311 02:46:43.825404 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_d5efa1d3-6643-4d3e-a06b-2051d5f1664a/openstack-network-exporter/0.log" Mar 11 02:46:43 crc kubenswrapper[4744]: I0311 02:46:43.864108 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_d5efa1d3-6643-4d3e-a06b-2051d5f1664a/ovsdbserver-sb/0.log" Mar 11 02:46:44 crc kubenswrapper[4744]: I0311 02:46:44.128484 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-1_371fb375-551c-4443-8f31-5e952e527f4d/ovsdbserver-sb/0.log" Mar 11 02:46:44 crc kubenswrapper[4744]: I0311 02:46:44.163219 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-1_371fb375-551c-4443-8f31-5e952e527f4d/openstack-network-exporter/0.log" Mar 11 02:46:44 crc kubenswrapper[4744]: I0311 02:46:44.182308 4744 
kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-sw9hx"] Mar 11 02:46:44 crc kubenswrapper[4744]: I0311 02:46:44.183803 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-sw9hx" Mar 11 02:46:44 crc kubenswrapper[4744]: I0311 02:46:44.198907 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-sw9hx"] Mar 11 02:46:44 crc kubenswrapper[4744]: I0311 02:46:44.220529 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-f4xg9" event={"ID":"db79fe26-1042-4b93-927f-6c570a90a49d","Type":"ContainerStarted","Data":"2fa3d8da961afde64a33b911d8ec05ffb8b5b4fa21478ac303c7ed26de1fda11"} Mar 11 02:46:44 crc kubenswrapper[4744]: I0311 02:46:44.249036 4744 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-f4xg9" podStartSLOduration=2.754559423 podStartE2EDuration="4.249009401s" podCreationTimestamp="2026-03-11 02:46:40 +0000 UTC" firstStartedPulling="2026-03-11 02:46:42.20553545 +0000 UTC m=+6759.009753055" lastFinishedPulling="2026-03-11 02:46:43.699985388 +0000 UTC m=+6760.504203033" observedRunningTime="2026-03-11 02:46:44.245551834 +0000 UTC m=+6761.049769429" watchObservedRunningTime="2026-03-11 02:46:44.249009401 +0000 UTC m=+6761.053227006" Mar 11 02:46:44 crc kubenswrapper[4744]: I0311 02:46:44.280595 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/28c2ead7-f71e-45fa-bb1f-7e97702113b7-utilities\") pod \"redhat-marketplace-sw9hx\" (UID: \"28c2ead7-f71e-45fa-bb1f-7e97702113b7\") " pod="openshift-marketplace/redhat-marketplace-sw9hx" Mar 11 02:46:44 crc kubenswrapper[4744]: I0311 02:46:44.280638 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/28c2ead7-f71e-45fa-bb1f-7e97702113b7-catalog-content\") pod \"redhat-marketplace-sw9hx\" (UID: \"28c2ead7-f71e-45fa-bb1f-7e97702113b7\") " pod="openshift-marketplace/redhat-marketplace-sw9hx" Mar 11 02:46:44 crc kubenswrapper[4744]: I0311 02:46:44.280679 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zd48h\" (UniqueName: \"kubernetes.io/projected/28c2ead7-f71e-45fa-bb1f-7e97702113b7-kube-api-access-zd48h\") pod \"redhat-marketplace-sw9hx\" (UID: \"28c2ead7-f71e-45fa-bb1f-7e97702113b7\") " pod="openshift-marketplace/redhat-marketplace-sw9hx" Mar 11 02:46:44 crc kubenswrapper[4744]: I0311 02:46:44.334983 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-2_e9cc891c-dd1f-4f59-aa6f-9b51f8ae9013/openstack-network-exporter/0.log" Mar 11 02:46:44 crc kubenswrapper[4744]: I0311 02:46:44.382042 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/28c2ead7-f71e-45fa-bb1f-7e97702113b7-utilities\") pod \"redhat-marketplace-sw9hx\" (UID: \"28c2ead7-f71e-45fa-bb1f-7e97702113b7\") " pod="openshift-marketplace/redhat-marketplace-sw9hx" Mar 11 02:46:44 crc kubenswrapper[4744]: I0311 02:46:44.382081 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/28c2ead7-f71e-45fa-bb1f-7e97702113b7-catalog-content\") pod \"redhat-marketplace-sw9hx\" (UID: \"28c2ead7-f71e-45fa-bb1f-7e97702113b7\") " pod="openshift-marketplace/redhat-marketplace-sw9hx" Mar 11 02:46:44 crc kubenswrapper[4744]: I0311 02:46:44.382119 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zd48h\" (UniqueName: \"kubernetes.io/projected/28c2ead7-f71e-45fa-bb1f-7e97702113b7-kube-api-access-zd48h\") pod \"redhat-marketplace-sw9hx\" (UID: 
\"28c2ead7-f71e-45fa-bb1f-7e97702113b7\") " pod="openshift-marketplace/redhat-marketplace-sw9hx" Mar 11 02:46:44 crc kubenswrapper[4744]: I0311 02:46:44.382500 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/28c2ead7-f71e-45fa-bb1f-7e97702113b7-catalog-content\") pod \"redhat-marketplace-sw9hx\" (UID: \"28c2ead7-f71e-45fa-bb1f-7e97702113b7\") " pod="openshift-marketplace/redhat-marketplace-sw9hx" Mar 11 02:46:44 crc kubenswrapper[4744]: I0311 02:46:44.382520 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/28c2ead7-f71e-45fa-bb1f-7e97702113b7-utilities\") pod \"redhat-marketplace-sw9hx\" (UID: \"28c2ead7-f71e-45fa-bb1f-7e97702113b7\") " pod="openshift-marketplace/redhat-marketplace-sw9hx" Mar 11 02:46:44 crc kubenswrapper[4744]: I0311 02:46:44.400304 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zd48h\" (UniqueName: \"kubernetes.io/projected/28c2ead7-f71e-45fa-bb1f-7e97702113b7-kube-api-access-zd48h\") pod \"redhat-marketplace-sw9hx\" (UID: \"28c2ead7-f71e-45fa-bb1f-7e97702113b7\") " pod="openshift-marketplace/redhat-marketplace-sw9hx" Mar 11 02:46:44 crc kubenswrapper[4744]: I0311 02:46:44.428344 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-2_e9cc891c-dd1f-4f59-aa6f-9b51f8ae9013/ovsdbserver-sb/0.log" Mar 11 02:46:44 crc kubenswrapper[4744]: I0311 02:46:44.500206 4744 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-sw9hx" Mar 11 02:46:44 crc kubenswrapper[4744]: I0311 02:46:44.525078 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_8d74643b-a5ad-4129-a109-0d49f957b306/setup-container/0.log" Mar 11 02:46:44 crc kubenswrapper[4744]: I0311 02:46:44.749196 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_8d74643b-a5ad-4129-a109-0d49f957b306/setup-container/0.log" Mar 11 02:46:44 crc kubenswrapper[4744]: I0311 02:46:44.866820 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_8d74643b-a5ad-4129-a109-0d49f957b306/rabbitmq/0.log" Mar 11 02:46:44 crc kubenswrapper[4744]: I0311 02:46:44.874755 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_876a9769-a63b-46e0-961b-25b726ba177d/setup-container/0.log" Mar 11 02:46:44 crc kubenswrapper[4744]: I0311 02:46:44.940721 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-sw9hx"] Mar 11 02:46:44 crc kubenswrapper[4744]: W0311 02:46:44.946225 4744 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod28c2ead7_f71e_45fa_bb1f_7e97702113b7.slice/crio-edfb8f064e5f68b50022b304db1aa6d3bed0efb7d7aa47e0a4fb7127a0fa7ee5 WatchSource:0}: Error finding container edfb8f064e5f68b50022b304db1aa6d3bed0efb7d7aa47e0a4fb7127a0fa7ee5: Status 404 returned error can't find the container with id edfb8f064e5f68b50022b304db1aa6d3bed0efb7d7aa47e0a4fb7127a0fa7ee5 Mar 11 02:46:45 crc kubenswrapper[4744]: I0311 02:46:45.145705 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_876a9769-a63b-46e0-961b-25b726ba177d/setup-container/0.log" Mar 11 02:46:45 crc kubenswrapper[4744]: I0311 02:46:45.159054 4744 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_rabbitmq-server-0_876a9769-a63b-46e0-961b-25b726ba177d/rabbitmq/0.log" Mar 11 02:46:45 crc kubenswrapper[4744]: I0311 02:46:45.228787 4744 generic.go:334] "Generic (PLEG): container finished" podID="28c2ead7-f71e-45fa-bb1f-7e97702113b7" containerID="8fef10cc65ac57d9a36d2be2fef74007f1de230af45f8672ecda85a80493fc9b" exitCode=0 Mar 11 02:46:45 crc kubenswrapper[4744]: I0311 02:46:45.228887 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-sw9hx" event={"ID":"28c2ead7-f71e-45fa-bb1f-7e97702113b7","Type":"ContainerDied","Data":"8fef10cc65ac57d9a36d2be2fef74007f1de230af45f8672ecda85a80493fc9b"} Mar 11 02:46:45 crc kubenswrapper[4744]: I0311 02:46:45.228948 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-sw9hx" event={"ID":"28c2ead7-f71e-45fa-bb1f-7e97702113b7","Type":"ContainerStarted","Data":"edfb8f064e5f68b50022b304db1aa6d3bed0efb7d7aa47e0a4fb7127a0fa7ee5"} Mar 11 02:46:46 crc kubenswrapper[4744]: I0311 02:46:46.243099 4744 generic.go:334] "Generic (PLEG): container finished" podID="28c2ead7-f71e-45fa-bb1f-7e97702113b7" containerID="8d36f09360d57fb0f4ffd8a98eb252ccb2764ddf819ad1e4188510a65e4d653b" exitCode=0 Mar 11 02:46:46 crc kubenswrapper[4744]: I0311 02:46:46.243178 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-sw9hx" event={"ID":"28c2ead7-f71e-45fa-bb1f-7e97702113b7","Type":"ContainerDied","Data":"8d36f09360d57fb0f4ffd8a98eb252ccb2764ddf819ad1e4188510a65e4d653b"} Mar 11 02:46:47 crc kubenswrapper[4744]: I0311 02:46:47.253577 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-sw9hx" event={"ID":"28c2ead7-f71e-45fa-bb1f-7e97702113b7","Type":"ContainerStarted","Data":"5e6a55ea071ab8eee63d119688ceb71d4120016a8f8d89a419d8915f38c33388"} Mar 11 02:46:47 crc kubenswrapper[4744]: I0311 02:46:47.272088 4744 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-sw9hx" podStartSLOduration=1.614490816 podStartE2EDuration="3.27207004s" podCreationTimestamp="2026-03-11 02:46:44 +0000 UTC" firstStartedPulling="2026-03-11 02:46:45.230212728 +0000 UTC m=+6762.034430333" lastFinishedPulling="2026-03-11 02:46:46.887791952 +0000 UTC m=+6763.692009557" observedRunningTime="2026-03-11 02:46:47.270939105 +0000 UTC m=+6764.075156720" watchObservedRunningTime="2026-03-11 02:46:47.27207004 +0000 UTC m=+6764.076287645" Mar 11 02:46:47 crc kubenswrapper[4744]: I0311 02:46:47.362825 4744 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-68264" Mar 11 02:46:47 crc kubenswrapper[4744]: I0311 02:46:47.414701 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-68264" Mar 11 02:46:49 crc kubenswrapper[4744]: I0311 02:46:49.575799 4744 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-68264"] Mar 11 02:46:49 crc kubenswrapper[4744]: I0311 02:46:49.576214 4744 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-68264" podUID="ca21227a-204d-477a-aae8-a7897115151f" containerName="registry-server" containerID="cri-o://028255a82ab0501631ebb4dc7f36d24d832b604ebd2f4fbe7e7b4c076414764f" gracePeriod=2 Mar 11 02:46:50 crc kubenswrapper[4744]: I0311 02:46:50.080737 4744 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-68264" Mar 11 02:46:50 crc kubenswrapper[4744]: I0311 02:46:50.170161 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ca21227a-204d-477a-aae8-a7897115151f-utilities\") pod \"ca21227a-204d-477a-aae8-a7897115151f\" (UID: \"ca21227a-204d-477a-aae8-a7897115151f\") " Mar 11 02:46:50 crc kubenswrapper[4744]: I0311 02:46:50.170254 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ca21227a-204d-477a-aae8-a7897115151f-catalog-content\") pod \"ca21227a-204d-477a-aae8-a7897115151f\" (UID: \"ca21227a-204d-477a-aae8-a7897115151f\") " Mar 11 02:46:50 crc kubenswrapper[4744]: I0311 02:46:50.170369 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-z2km7\" (UniqueName: \"kubernetes.io/projected/ca21227a-204d-477a-aae8-a7897115151f-kube-api-access-z2km7\") pod \"ca21227a-204d-477a-aae8-a7897115151f\" (UID: \"ca21227a-204d-477a-aae8-a7897115151f\") " Mar 11 02:46:50 crc kubenswrapper[4744]: I0311 02:46:50.173898 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ca21227a-204d-477a-aae8-a7897115151f-utilities" (OuterVolumeSpecName: "utilities") pod "ca21227a-204d-477a-aae8-a7897115151f" (UID: "ca21227a-204d-477a-aae8-a7897115151f"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 11 02:46:50 crc kubenswrapper[4744]: I0311 02:46:50.177081 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ca21227a-204d-477a-aae8-a7897115151f-kube-api-access-z2km7" (OuterVolumeSpecName: "kube-api-access-z2km7") pod "ca21227a-204d-477a-aae8-a7897115151f" (UID: "ca21227a-204d-477a-aae8-a7897115151f"). InnerVolumeSpecName "kube-api-access-z2km7". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 02:46:50 crc kubenswrapper[4744]: I0311 02:46:50.275160 4744 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ca21227a-204d-477a-aae8-a7897115151f-utilities\") on node \"crc\" DevicePath \"\"" Mar 11 02:46:50 crc kubenswrapper[4744]: I0311 02:46:50.275195 4744 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-z2km7\" (UniqueName: \"kubernetes.io/projected/ca21227a-204d-477a-aae8-a7897115151f-kube-api-access-z2km7\") on node \"crc\" DevicePath \"\"" Mar 11 02:46:50 crc kubenswrapper[4744]: I0311 02:46:50.284893 4744 generic.go:334] "Generic (PLEG): container finished" podID="ca21227a-204d-477a-aae8-a7897115151f" containerID="028255a82ab0501631ebb4dc7f36d24d832b604ebd2f4fbe7e7b4c076414764f" exitCode=0 Mar 11 02:46:50 crc kubenswrapper[4744]: I0311 02:46:50.284961 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-68264" event={"ID":"ca21227a-204d-477a-aae8-a7897115151f","Type":"ContainerDied","Data":"028255a82ab0501631ebb4dc7f36d24d832b604ebd2f4fbe7e7b4c076414764f"} Mar 11 02:46:50 crc kubenswrapper[4744]: I0311 02:46:50.285000 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-68264" event={"ID":"ca21227a-204d-477a-aae8-a7897115151f","Type":"ContainerDied","Data":"a06c00c1bf1e6799613f13b6e6c169b8d379168da4226d8765e8f21bf4684b43"} Mar 11 02:46:50 crc kubenswrapper[4744]: I0311 02:46:50.285004 4744 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-68264" Mar 11 02:46:50 crc kubenswrapper[4744]: I0311 02:46:50.285028 4744 scope.go:117] "RemoveContainer" containerID="028255a82ab0501631ebb4dc7f36d24d832b604ebd2f4fbe7e7b4c076414764f" Mar 11 02:46:50 crc kubenswrapper[4744]: I0311 02:46:50.294924 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ca21227a-204d-477a-aae8-a7897115151f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "ca21227a-204d-477a-aae8-a7897115151f" (UID: "ca21227a-204d-477a-aae8-a7897115151f"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 11 02:46:50 crc kubenswrapper[4744]: I0311 02:46:50.310076 4744 scope.go:117] "RemoveContainer" containerID="695271db24390bd07f37635f0cf7b245e98bca6befcf0ffbcb88dca5f9630f26" Mar 11 02:46:50 crc kubenswrapper[4744]: I0311 02:46:50.352562 4744 scope.go:117] "RemoveContainer" containerID="ad73e8a5ed32a712be56fabb4c3903ffb6b4e7cd734a460343c2d182f5624192" Mar 11 02:46:50 crc kubenswrapper[4744]: I0311 02:46:50.377191 4744 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ca21227a-204d-477a-aae8-a7897115151f-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 11 02:46:50 crc kubenswrapper[4744]: I0311 02:46:50.387971 4744 scope.go:117] "RemoveContainer" containerID="028255a82ab0501631ebb4dc7f36d24d832b604ebd2f4fbe7e7b4c076414764f" Mar 11 02:46:50 crc kubenswrapper[4744]: E0311 02:46:50.388612 4744 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"028255a82ab0501631ebb4dc7f36d24d832b604ebd2f4fbe7e7b4c076414764f\": container with ID starting with 028255a82ab0501631ebb4dc7f36d24d832b604ebd2f4fbe7e7b4c076414764f not found: ID does not exist" containerID="028255a82ab0501631ebb4dc7f36d24d832b604ebd2f4fbe7e7b4c076414764f" Mar 11 02:46:50 crc 
kubenswrapper[4744]: I0311 02:46:50.388660 4744 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"028255a82ab0501631ebb4dc7f36d24d832b604ebd2f4fbe7e7b4c076414764f"} err="failed to get container status \"028255a82ab0501631ebb4dc7f36d24d832b604ebd2f4fbe7e7b4c076414764f\": rpc error: code = NotFound desc = could not find container \"028255a82ab0501631ebb4dc7f36d24d832b604ebd2f4fbe7e7b4c076414764f\": container with ID starting with 028255a82ab0501631ebb4dc7f36d24d832b604ebd2f4fbe7e7b4c076414764f not found: ID does not exist" Mar 11 02:46:50 crc kubenswrapper[4744]: I0311 02:46:50.388695 4744 scope.go:117] "RemoveContainer" containerID="695271db24390bd07f37635f0cf7b245e98bca6befcf0ffbcb88dca5f9630f26" Mar 11 02:46:50 crc kubenswrapper[4744]: E0311 02:46:50.389049 4744 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"695271db24390bd07f37635f0cf7b245e98bca6befcf0ffbcb88dca5f9630f26\": container with ID starting with 695271db24390bd07f37635f0cf7b245e98bca6befcf0ffbcb88dca5f9630f26 not found: ID does not exist" containerID="695271db24390bd07f37635f0cf7b245e98bca6befcf0ffbcb88dca5f9630f26" Mar 11 02:46:50 crc kubenswrapper[4744]: I0311 02:46:50.389074 4744 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"695271db24390bd07f37635f0cf7b245e98bca6befcf0ffbcb88dca5f9630f26"} err="failed to get container status \"695271db24390bd07f37635f0cf7b245e98bca6befcf0ffbcb88dca5f9630f26\": rpc error: code = NotFound desc = could not find container \"695271db24390bd07f37635f0cf7b245e98bca6befcf0ffbcb88dca5f9630f26\": container with ID starting with 695271db24390bd07f37635f0cf7b245e98bca6befcf0ffbcb88dca5f9630f26 not found: ID does not exist" Mar 11 02:46:50 crc kubenswrapper[4744]: I0311 02:46:50.389093 4744 scope.go:117] "RemoveContainer" containerID="ad73e8a5ed32a712be56fabb4c3903ffb6b4e7cd734a460343c2d182f5624192" Mar 11 
02:46:50 crc kubenswrapper[4744]: E0311 02:46:50.389449 4744 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ad73e8a5ed32a712be56fabb4c3903ffb6b4e7cd734a460343c2d182f5624192\": container with ID starting with ad73e8a5ed32a712be56fabb4c3903ffb6b4e7cd734a460343c2d182f5624192 not found: ID does not exist" containerID="ad73e8a5ed32a712be56fabb4c3903ffb6b4e7cd734a460343c2d182f5624192" Mar 11 02:46:50 crc kubenswrapper[4744]: I0311 02:46:50.389477 4744 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ad73e8a5ed32a712be56fabb4c3903ffb6b4e7cd734a460343c2d182f5624192"} err="failed to get container status \"ad73e8a5ed32a712be56fabb4c3903ffb6b4e7cd734a460343c2d182f5624192\": rpc error: code = NotFound desc = could not find container \"ad73e8a5ed32a712be56fabb4c3903ffb6b4e7cd734a460343c2d182f5624192\": container with ID starting with ad73e8a5ed32a712be56fabb4c3903ffb6b4e7cd734a460343c2d182f5624192 not found: ID does not exist" Mar 11 02:46:50 crc kubenswrapper[4744]: I0311 02:46:50.624532 4744 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-68264"] Mar 11 02:46:50 crc kubenswrapper[4744]: I0311 02:46:50.638896 4744 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-68264"] Mar 11 02:46:51 crc kubenswrapper[4744]: I0311 02:46:51.340126 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-f4xg9" Mar 11 02:46:51 crc kubenswrapper[4744]: I0311 02:46:51.340204 4744 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-f4xg9" Mar 11 02:46:51 crc kubenswrapper[4744]: I0311 02:46:51.403309 4744 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-f4xg9" Mar 11 02:46:52 crc 
kubenswrapper[4744]: I0311 02:46:52.005752 4744 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ca21227a-204d-477a-aae8-a7897115151f" path="/var/lib/kubelet/pods/ca21227a-204d-477a-aae8-a7897115151f/volumes" Mar 11 02:46:52 crc kubenswrapper[4744]: I0311 02:46:52.374626 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-f4xg9" Mar 11 02:46:53 crc kubenswrapper[4744]: I0311 02:46:53.785682 4744 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-f4xg9"] Mar 11 02:46:54 crc kubenswrapper[4744]: I0311 02:46:54.330773 4744 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-f4xg9" podUID="db79fe26-1042-4b93-927f-6c570a90a49d" containerName="registry-server" containerID="cri-o://2fa3d8da961afde64a33b911d8ec05ffb8b5b4fa21478ac303c7ed26de1fda11" gracePeriod=2 Mar 11 02:46:54 crc kubenswrapper[4744]: I0311 02:46:54.500412 4744 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-sw9hx" Mar 11 02:46:54 crc kubenswrapper[4744]: I0311 02:46:54.500771 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-sw9hx" Mar 11 02:46:54 crc kubenswrapper[4744]: I0311 02:46:54.591757 4744 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-sw9hx" Mar 11 02:46:54 crc kubenswrapper[4744]: I0311 02:46:54.790410 4744 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-f4xg9" Mar 11 02:46:54 crc kubenswrapper[4744]: I0311 02:46:54.966137 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xc7bm\" (UniqueName: \"kubernetes.io/projected/db79fe26-1042-4b93-927f-6c570a90a49d-kube-api-access-xc7bm\") pod \"db79fe26-1042-4b93-927f-6c570a90a49d\" (UID: \"db79fe26-1042-4b93-927f-6c570a90a49d\") " Mar 11 02:46:54 crc kubenswrapper[4744]: I0311 02:46:54.966277 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/db79fe26-1042-4b93-927f-6c570a90a49d-utilities\") pod \"db79fe26-1042-4b93-927f-6c570a90a49d\" (UID: \"db79fe26-1042-4b93-927f-6c570a90a49d\") " Mar 11 02:46:54 crc kubenswrapper[4744]: I0311 02:46:54.966389 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/db79fe26-1042-4b93-927f-6c570a90a49d-catalog-content\") pod \"db79fe26-1042-4b93-927f-6c570a90a49d\" (UID: \"db79fe26-1042-4b93-927f-6c570a90a49d\") " Mar 11 02:46:54 crc kubenswrapper[4744]: I0311 02:46:54.968269 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/db79fe26-1042-4b93-927f-6c570a90a49d-utilities" (OuterVolumeSpecName: "utilities") pod "db79fe26-1042-4b93-927f-6c570a90a49d" (UID: "db79fe26-1042-4b93-927f-6c570a90a49d"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 11 02:46:54 crc kubenswrapper[4744]: I0311 02:46:54.971562 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/db79fe26-1042-4b93-927f-6c570a90a49d-kube-api-access-xc7bm" (OuterVolumeSpecName: "kube-api-access-xc7bm") pod "db79fe26-1042-4b93-927f-6c570a90a49d" (UID: "db79fe26-1042-4b93-927f-6c570a90a49d"). InnerVolumeSpecName "kube-api-access-xc7bm". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 02:46:55 crc kubenswrapper[4744]: I0311 02:46:55.051668 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/db79fe26-1042-4b93-927f-6c570a90a49d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "db79fe26-1042-4b93-927f-6c570a90a49d" (UID: "db79fe26-1042-4b93-927f-6c570a90a49d"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 11 02:46:55 crc kubenswrapper[4744]: I0311 02:46:55.069351 4744 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/db79fe26-1042-4b93-927f-6c570a90a49d-utilities\") on node \"crc\" DevicePath \"\"" Mar 11 02:46:55 crc kubenswrapper[4744]: I0311 02:46:55.069394 4744 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/db79fe26-1042-4b93-927f-6c570a90a49d-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 11 02:46:55 crc kubenswrapper[4744]: I0311 02:46:55.069408 4744 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xc7bm\" (UniqueName: \"kubernetes.io/projected/db79fe26-1042-4b93-927f-6c570a90a49d-kube-api-access-xc7bm\") on node \"crc\" DevicePath \"\"" Mar 11 02:46:55 crc kubenswrapper[4744]: I0311 02:46:55.345639 4744 generic.go:334] "Generic (PLEG): container finished" podID="db79fe26-1042-4b93-927f-6c570a90a49d" containerID="2fa3d8da961afde64a33b911d8ec05ffb8b5b4fa21478ac303c7ed26de1fda11" exitCode=0 Mar 11 02:46:55 crc kubenswrapper[4744]: I0311 02:46:55.345732 4744 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-f4xg9" Mar 11 02:46:55 crc kubenswrapper[4744]: I0311 02:46:55.345753 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-f4xg9" event={"ID":"db79fe26-1042-4b93-927f-6c570a90a49d","Type":"ContainerDied","Data":"2fa3d8da961afde64a33b911d8ec05ffb8b5b4fa21478ac303c7ed26de1fda11"} Mar 11 02:46:55 crc kubenswrapper[4744]: I0311 02:46:55.345815 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-f4xg9" event={"ID":"db79fe26-1042-4b93-927f-6c570a90a49d","Type":"ContainerDied","Data":"b8e7c837402e0a10c0bf8ec1e6c825d2838da4d900459af4f86719a9b932b5b4"} Mar 11 02:46:55 crc kubenswrapper[4744]: I0311 02:46:55.345838 4744 scope.go:117] "RemoveContainer" containerID="2fa3d8da961afde64a33b911d8ec05ffb8b5b4fa21478ac303c7ed26de1fda11" Mar 11 02:46:55 crc kubenswrapper[4744]: I0311 02:46:55.374970 4744 scope.go:117] "RemoveContainer" containerID="bea1422fcc5a27f26873fa2b861154b29c43a7ca73ffe16165b9db516255f3a2" Mar 11 02:46:55 crc kubenswrapper[4744]: I0311 02:46:55.401152 4744 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-f4xg9"] Mar 11 02:46:55 crc kubenswrapper[4744]: I0311 02:46:55.419205 4744 scope.go:117] "RemoveContainer" containerID="e9058cfd7080074d79154bf42592c86773d012a92beb518b8c46bffc62515bf1" Mar 11 02:46:55 crc kubenswrapper[4744]: I0311 02:46:55.420289 4744 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-f4xg9"] Mar 11 02:46:55 crc kubenswrapper[4744]: I0311 02:46:55.425116 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-sw9hx" Mar 11 02:46:55 crc kubenswrapper[4744]: I0311 02:46:55.478361 4744 scope.go:117] "RemoveContainer" containerID="2fa3d8da961afde64a33b911d8ec05ffb8b5b4fa21478ac303c7ed26de1fda11" Mar 11 02:46:55 crc 
kubenswrapper[4744]: E0311 02:46:55.479020 4744 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2fa3d8da961afde64a33b911d8ec05ffb8b5b4fa21478ac303c7ed26de1fda11\": container with ID starting with 2fa3d8da961afde64a33b911d8ec05ffb8b5b4fa21478ac303c7ed26de1fda11 not found: ID does not exist" containerID="2fa3d8da961afde64a33b911d8ec05ffb8b5b4fa21478ac303c7ed26de1fda11" Mar 11 02:46:55 crc kubenswrapper[4744]: I0311 02:46:55.479197 4744 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2fa3d8da961afde64a33b911d8ec05ffb8b5b4fa21478ac303c7ed26de1fda11"} err="failed to get container status \"2fa3d8da961afde64a33b911d8ec05ffb8b5b4fa21478ac303c7ed26de1fda11\": rpc error: code = NotFound desc = could not find container \"2fa3d8da961afde64a33b911d8ec05ffb8b5b4fa21478ac303c7ed26de1fda11\": container with ID starting with 2fa3d8da961afde64a33b911d8ec05ffb8b5b4fa21478ac303c7ed26de1fda11 not found: ID does not exist" Mar 11 02:46:55 crc kubenswrapper[4744]: I0311 02:46:55.479385 4744 scope.go:117] "RemoveContainer" containerID="bea1422fcc5a27f26873fa2b861154b29c43a7ca73ffe16165b9db516255f3a2" Mar 11 02:46:55 crc kubenswrapper[4744]: E0311 02:46:55.479959 4744 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bea1422fcc5a27f26873fa2b861154b29c43a7ca73ffe16165b9db516255f3a2\": container with ID starting with bea1422fcc5a27f26873fa2b861154b29c43a7ca73ffe16165b9db516255f3a2 not found: ID does not exist" containerID="bea1422fcc5a27f26873fa2b861154b29c43a7ca73ffe16165b9db516255f3a2" Mar 11 02:46:55 crc kubenswrapper[4744]: I0311 02:46:55.480005 4744 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bea1422fcc5a27f26873fa2b861154b29c43a7ca73ffe16165b9db516255f3a2"} err="failed to get container status 
\"bea1422fcc5a27f26873fa2b861154b29c43a7ca73ffe16165b9db516255f3a2\": rpc error: code = NotFound desc = could not find container \"bea1422fcc5a27f26873fa2b861154b29c43a7ca73ffe16165b9db516255f3a2\": container with ID starting with bea1422fcc5a27f26873fa2b861154b29c43a7ca73ffe16165b9db516255f3a2 not found: ID does not exist" Mar 11 02:46:55 crc kubenswrapper[4744]: I0311 02:46:55.480035 4744 scope.go:117] "RemoveContainer" containerID="e9058cfd7080074d79154bf42592c86773d012a92beb518b8c46bffc62515bf1" Mar 11 02:46:55 crc kubenswrapper[4744]: E0311 02:46:55.480609 4744 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e9058cfd7080074d79154bf42592c86773d012a92beb518b8c46bffc62515bf1\": container with ID starting with e9058cfd7080074d79154bf42592c86773d012a92beb518b8c46bffc62515bf1 not found: ID does not exist" containerID="e9058cfd7080074d79154bf42592c86773d012a92beb518b8c46bffc62515bf1" Mar 11 02:46:55 crc kubenswrapper[4744]: I0311 02:46:55.480797 4744 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e9058cfd7080074d79154bf42592c86773d012a92beb518b8c46bffc62515bf1"} err="failed to get container status \"e9058cfd7080074d79154bf42592c86773d012a92beb518b8c46bffc62515bf1\": rpc error: code = NotFound desc = could not find container \"e9058cfd7080074d79154bf42592c86773d012a92beb518b8c46bffc62515bf1\": container with ID starting with e9058cfd7080074d79154bf42592c86773d012a92beb518b8c46bffc62515bf1 not found: ID does not exist" Mar 11 02:46:55 crc kubenswrapper[4744]: I0311 02:46:55.993000 4744 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="db79fe26-1042-4b93-927f-6c570a90a49d" path="/var/lib/kubelet/pods/db79fe26-1042-4b93-927f-6c570a90a49d/volumes" Mar 11 02:46:56 crc kubenswrapper[4744]: I0311 02:46:56.986315 4744 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-sw9hx"] Mar 11 
02:46:57 crc kubenswrapper[4744]: I0311 02:46:57.367201 4744 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-sw9hx" podUID="28c2ead7-f71e-45fa-bb1f-7e97702113b7" containerName="registry-server" containerID="cri-o://5e6a55ea071ab8eee63d119688ceb71d4120016a8f8d89a419d8915f38c33388" gracePeriod=2 Mar 11 02:46:57 crc kubenswrapper[4744]: I0311 02:46:57.872888 4744 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-sw9hx" Mar 11 02:46:58 crc kubenswrapper[4744]: I0311 02:46:58.032020 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/28c2ead7-f71e-45fa-bb1f-7e97702113b7-catalog-content\") pod \"28c2ead7-f71e-45fa-bb1f-7e97702113b7\" (UID: \"28c2ead7-f71e-45fa-bb1f-7e97702113b7\") " Mar 11 02:46:58 crc kubenswrapper[4744]: I0311 02:46:58.032075 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zd48h\" (UniqueName: \"kubernetes.io/projected/28c2ead7-f71e-45fa-bb1f-7e97702113b7-kube-api-access-zd48h\") pod \"28c2ead7-f71e-45fa-bb1f-7e97702113b7\" (UID: \"28c2ead7-f71e-45fa-bb1f-7e97702113b7\") " Mar 11 02:46:58 crc kubenswrapper[4744]: I0311 02:46:58.032245 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/28c2ead7-f71e-45fa-bb1f-7e97702113b7-utilities\") pod \"28c2ead7-f71e-45fa-bb1f-7e97702113b7\" (UID: \"28c2ead7-f71e-45fa-bb1f-7e97702113b7\") " Mar 11 02:46:58 crc kubenswrapper[4744]: I0311 02:46:58.033081 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/28c2ead7-f71e-45fa-bb1f-7e97702113b7-utilities" (OuterVolumeSpecName: "utilities") pod "28c2ead7-f71e-45fa-bb1f-7e97702113b7" (UID: "28c2ead7-f71e-45fa-bb1f-7e97702113b7"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 11 02:46:58 crc kubenswrapper[4744]: I0311 02:46:58.039896 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/28c2ead7-f71e-45fa-bb1f-7e97702113b7-kube-api-access-zd48h" (OuterVolumeSpecName: "kube-api-access-zd48h") pod "28c2ead7-f71e-45fa-bb1f-7e97702113b7" (UID: "28c2ead7-f71e-45fa-bb1f-7e97702113b7"). InnerVolumeSpecName "kube-api-access-zd48h". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 02:46:58 crc kubenswrapper[4744]: I0311 02:46:58.063054 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/28c2ead7-f71e-45fa-bb1f-7e97702113b7-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "28c2ead7-f71e-45fa-bb1f-7e97702113b7" (UID: "28c2ead7-f71e-45fa-bb1f-7e97702113b7"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 11 02:46:58 crc kubenswrapper[4744]: I0311 02:46:58.134185 4744 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zd48h\" (UniqueName: \"kubernetes.io/projected/28c2ead7-f71e-45fa-bb1f-7e97702113b7-kube-api-access-zd48h\") on node \"crc\" DevicePath \"\"" Mar 11 02:46:58 crc kubenswrapper[4744]: I0311 02:46:58.134429 4744 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/28c2ead7-f71e-45fa-bb1f-7e97702113b7-utilities\") on node \"crc\" DevicePath \"\"" Mar 11 02:46:58 crc kubenswrapper[4744]: I0311 02:46:58.134463 4744 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/28c2ead7-f71e-45fa-bb1f-7e97702113b7-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 11 02:46:58 crc kubenswrapper[4744]: I0311 02:46:58.375710 4744 generic.go:334] "Generic (PLEG): container finished" podID="28c2ead7-f71e-45fa-bb1f-7e97702113b7" 
containerID="5e6a55ea071ab8eee63d119688ceb71d4120016a8f8d89a419d8915f38c33388" exitCode=0 Mar 11 02:46:58 crc kubenswrapper[4744]: I0311 02:46:58.375789 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-sw9hx" event={"ID":"28c2ead7-f71e-45fa-bb1f-7e97702113b7","Type":"ContainerDied","Data":"5e6a55ea071ab8eee63d119688ceb71d4120016a8f8d89a419d8915f38c33388"} Mar 11 02:46:58 crc kubenswrapper[4744]: I0311 02:46:58.376007 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-sw9hx" event={"ID":"28c2ead7-f71e-45fa-bb1f-7e97702113b7","Type":"ContainerDied","Data":"edfb8f064e5f68b50022b304db1aa6d3bed0efb7d7aa47e0a4fb7127a0fa7ee5"} Mar 11 02:46:58 crc kubenswrapper[4744]: I0311 02:46:58.376034 4744 scope.go:117] "RemoveContainer" containerID="5e6a55ea071ab8eee63d119688ceb71d4120016a8f8d89a419d8915f38c33388" Mar 11 02:46:58 crc kubenswrapper[4744]: I0311 02:46:58.375856 4744 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-sw9hx" Mar 11 02:46:58 crc kubenswrapper[4744]: I0311 02:46:58.412466 4744 scope.go:117] "RemoveContainer" containerID="8d36f09360d57fb0f4ffd8a98eb252ccb2764ddf819ad1e4188510a65e4d653b" Mar 11 02:46:58 crc kubenswrapper[4744]: I0311 02:46:58.436322 4744 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-sw9hx"] Mar 11 02:46:58 crc kubenswrapper[4744]: I0311 02:46:58.454222 4744 scope.go:117] "RemoveContainer" containerID="8fef10cc65ac57d9a36d2be2fef74007f1de230af45f8672ecda85a80493fc9b" Mar 11 02:46:58 crc kubenswrapper[4744]: I0311 02:46:58.455550 4744 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-sw9hx"] Mar 11 02:46:58 crc kubenswrapper[4744]: I0311 02:46:58.489881 4744 scope.go:117] "RemoveContainer" containerID="5e6a55ea071ab8eee63d119688ceb71d4120016a8f8d89a419d8915f38c33388" Mar 11 02:46:58 crc kubenswrapper[4744]: E0311 02:46:58.491435 4744 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5e6a55ea071ab8eee63d119688ceb71d4120016a8f8d89a419d8915f38c33388\": container with ID starting with 5e6a55ea071ab8eee63d119688ceb71d4120016a8f8d89a419d8915f38c33388 not found: ID does not exist" containerID="5e6a55ea071ab8eee63d119688ceb71d4120016a8f8d89a419d8915f38c33388" Mar 11 02:46:58 crc kubenswrapper[4744]: I0311 02:46:58.491477 4744 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5e6a55ea071ab8eee63d119688ceb71d4120016a8f8d89a419d8915f38c33388"} err="failed to get container status \"5e6a55ea071ab8eee63d119688ceb71d4120016a8f8d89a419d8915f38c33388\": rpc error: code = NotFound desc = could not find container \"5e6a55ea071ab8eee63d119688ceb71d4120016a8f8d89a419d8915f38c33388\": container with ID starting with 5e6a55ea071ab8eee63d119688ceb71d4120016a8f8d89a419d8915f38c33388 not found: 
ID does not exist" Mar 11 02:46:58 crc kubenswrapper[4744]: I0311 02:46:58.491615 4744 scope.go:117] "RemoveContainer" containerID="8d36f09360d57fb0f4ffd8a98eb252ccb2764ddf819ad1e4188510a65e4d653b" Mar 11 02:46:58 crc kubenswrapper[4744]: E0311 02:46:58.492078 4744 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8d36f09360d57fb0f4ffd8a98eb252ccb2764ddf819ad1e4188510a65e4d653b\": container with ID starting with 8d36f09360d57fb0f4ffd8a98eb252ccb2764ddf819ad1e4188510a65e4d653b not found: ID does not exist" containerID="8d36f09360d57fb0f4ffd8a98eb252ccb2764ddf819ad1e4188510a65e4d653b" Mar 11 02:46:58 crc kubenswrapper[4744]: I0311 02:46:58.492111 4744 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8d36f09360d57fb0f4ffd8a98eb252ccb2764ddf819ad1e4188510a65e4d653b"} err="failed to get container status \"8d36f09360d57fb0f4ffd8a98eb252ccb2764ddf819ad1e4188510a65e4d653b\": rpc error: code = NotFound desc = could not find container \"8d36f09360d57fb0f4ffd8a98eb252ccb2764ddf819ad1e4188510a65e4d653b\": container with ID starting with 8d36f09360d57fb0f4ffd8a98eb252ccb2764ddf819ad1e4188510a65e4d653b not found: ID does not exist" Mar 11 02:46:58 crc kubenswrapper[4744]: I0311 02:46:58.492131 4744 scope.go:117] "RemoveContainer" containerID="8fef10cc65ac57d9a36d2be2fef74007f1de230af45f8672ecda85a80493fc9b" Mar 11 02:46:58 crc kubenswrapper[4744]: E0311 02:46:58.492386 4744 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8fef10cc65ac57d9a36d2be2fef74007f1de230af45f8672ecda85a80493fc9b\": container with ID starting with 8fef10cc65ac57d9a36d2be2fef74007f1de230af45f8672ecda85a80493fc9b not found: ID does not exist" containerID="8fef10cc65ac57d9a36d2be2fef74007f1de230af45f8672ecda85a80493fc9b" Mar 11 02:46:58 crc kubenswrapper[4744]: I0311 02:46:58.492405 4744 pod_container_deletor.go:53] 
"DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8fef10cc65ac57d9a36d2be2fef74007f1de230af45f8672ecda85a80493fc9b"} err="failed to get container status \"8fef10cc65ac57d9a36d2be2fef74007f1de230af45f8672ecda85a80493fc9b\": rpc error: code = NotFound desc = could not find container \"8fef10cc65ac57d9a36d2be2fef74007f1de230af45f8672ecda85a80493fc9b\": container with ID starting with 8fef10cc65ac57d9a36d2be2fef74007f1de230af45f8672ecda85a80493fc9b not found: ID does not exist" Mar 11 02:46:59 crc kubenswrapper[4744]: I0311 02:46:59.993231 4744 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="28c2ead7-f71e-45fa-bb1f-7e97702113b7" path="/var/lib/kubelet/pods/28c2ead7-f71e-45fa-bb1f-7e97702113b7/volumes" Mar 11 02:47:02 crc kubenswrapper[4744]: I0311 02:47:02.004004 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_951b5e6f67e727348308cf03f2b463d3e2bd27386b453f6699e03a49bac8qt7_1e9ee3f9-00c2-4838-b6e8-acc23ff0ffe9/util/0.log" Mar 11 02:47:02 crc kubenswrapper[4744]: I0311 02:47:02.131492 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_951b5e6f67e727348308cf03f2b463d3e2bd27386b453f6699e03a49bac8qt7_1e9ee3f9-00c2-4838-b6e8-acc23ff0ffe9/util/0.log" Mar 11 02:47:02 crc kubenswrapper[4744]: I0311 02:47:02.134652 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_951b5e6f67e727348308cf03f2b463d3e2bd27386b453f6699e03a49bac8qt7_1e9ee3f9-00c2-4838-b6e8-acc23ff0ffe9/pull/0.log" Mar 11 02:47:02 crc kubenswrapper[4744]: I0311 02:47:02.220248 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_951b5e6f67e727348308cf03f2b463d3e2bd27386b453f6699e03a49bac8qt7_1e9ee3f9-00c2-4838-b6e8-acc23ff0ffe9/pull/0.log" Mar 11 02:47:02 crc kubenswrapper[4744]: I0311 02:47:02.413877 4744 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack-operators_951b5e6f67e727348308cf03f2b463d3e2bd27386b453f6699e03a49bac8qt7_1e9ee3f9-00c2-4838-b6e8-acc23ff0ffe9/pull/0.log" Mar 11 02:47:02 crc kubenswrapper[4744]: I0311 02:47:02.425739 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_951b5e6f67e727348308cf03f2b463d3e2bd27386b453f6699e03a49bac8qt7_1e9ee3f9-00c2-4838-b6e8-acc23ff0ffe9/util/0.log" Mar 11 02:47:02 crc kubenswrapper[4744]: I0311 02:47:02.433708 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_951b5e6f67e727348308cf03f2b463d3e2bd27386b453f6699e03a49bac8qt7_1e9ee3f9-00c2-4838-b6e8-acc23ff0ffe9/extract/0.log" Mar 11 02:47:02 crc kubenswrapper[4744]: I0311 02:47:02.815837 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_designate-operator-controller-manager-66d56f6ff4-x5rps_9dd5fbf6-4c71-4ed4-b31b-4e1d43c34c73/manager/0.log" Mar 11 02:47:03 crc kubenswrapper[4744]: I0311 02:47:03.104582 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_glance-operator-controller-manager-5964f64c48-wgnqt_115ee58b-0cd9-4993-b15e-226885cef1d8/manager/0.log" Mar 11 02:47:03 crc kubenswrapper[4744]: I0311 02:47:03.215863 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_heat-operator-controller-manager-77b6666d85-gd9tg_c1595560-d9f2-48bf-8b30-f8a36f13e1f4/manager/0.log" Mar 11 02:47:03 crc kubenswrapper[4744]: I0311 02:47:03.392385 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_horizon-operator-controller-manager-6d9d6b584d-pr4mx_4b8d6717-6a3c-4421-83c8-c86ff18d1e3b/manager/0.log" Mar 11 02:47:03 crc kubenswrapper[4744]: I0311 02:47:03.932262 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ironic-operator-controller-manager-6bbb499bbc-m6668_cf4745d2-2e5f-4150-9c60-34e91e5f1e80/manager/0.log" Mar 11 02:47:04 crc kubenswrapper[4744]: I0311 02:47:04.120913 
4744 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_infra-operator-controller-manager-5995f4446f-hxxgl_daef3605-2bdc-4e16-b55f-61d2d3cfc2fd/manager/0.log" Mar 11 02:47:04 crc kubenswrapper[4744]: I0311 02:47:04.381313 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_keystone-operator-controller-manager-684f77d66d-c946t_f5ec1f31-33e9-47d8-91bc-450e319479a3/manager/0.log" Mar 11 02:47:04 crc kubenswrapper[4744]: I0311 02:47:04.501662 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_manila-operator-controller-manager-68f45f9d9f-6rtj8_6b30e096-2cc6-41f3-aaca-c2d7a3d8b138/manager/0.log" Mar 11 02:47:04 crc kubenswrapper[4744]: I0311 02:47:04.717330 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_mariadb-operator-controller-manager-658d4cdd5-gc9cm_3a8c82be-b391-42a8-a16c-a99850c14b19/manager/0.log" Mar 11 02:47:04 crc kubenswrapper[4744]: I0311 02:47:04.969070 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_neutron-operator-controller-manager-776c5696bf-wt7zj_431b2a18-7bf7-4dda-a49e-15bfa629e1f9/manager/0.log" Mar 11 02:47:05 crc kubenswrapper[4744]: I0311 02:47:05.200181 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_cinder-operator-controller-manager-984cd4dcf-7lh6p_1a7f92f2-6b8c-4ea2-ab12-c9ebcb043600/manager/0.log" Mar 11 02:47:05 crc kubenswrapper[4744]: I0311 02:47:05.272984 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_nova-operator-controller-manager-569cc54c5-cp8jm_6e760654-b96d-4979-a98f-dc162fc1b41e/manager/0.log" Mar 11 02:47:05 crc kubenswrapper[4744]: I0311 02:47:05.393072 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_octavia-operator-controller-manager-5f4f55cb5c-fmvh8_63676a12-7f22-401f-98e2-eb2495777d96/manager/0.log" Mar 11 02:47:05 crc kubenswrapper[4744]: I0311 
02:47:05.539174 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-baremetal-operator-controller-manager-6647d7885f8pxr2_ffd0a9c9-75f4-4721-a045-b4f9dc285388/manager/0.log" Mar 11 02:47:05 crc kubenswrapper[4744]: I0311 02:47:05.729927 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-init-6cf8df7788-8965x_d69a58f4-054d-456c-9ea1-c68b893dadb4/operator/0.log" Mar 11 02:47:06 crc kubenswrapper[4744]: I0311 02:47:06.087474 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-index-p7t5b_04455556-98d1-4461-945a-9a5b74b6508f/registry-server/0.log" Mar 11 02:47:06 crc kubenswrapper[4744]: I0311 02:47:06.117632 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ovn-operator-controller-manager-bbc5b68f9-mgfnd_c2d319ab-e093-4ff0-8a47-d73b7ffc8a68/manager/0.log" Mar 11 02:47:06 crc kubenswrapper[4744]: I0311 02:47:06.293721 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_placement-operator-controller-manager-574d45c66c-wk9q5_78b3e53a-dda4-4cc9-bd65-e2bcaedb3d2b/manager/0.log" Mar 11 02:47:06 crc kubenswrapper[4744]: I0311 02:47:06.380341 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_rabbitmq-cluster-operator-manager-668c99d594-924zq_5f30ecde-348a-43ce-980f-b27ebd7971bb/operator/0.log" Mar 11 02:47:06 crc kubenswrapper[4744]: I0311 02:47:06.490715 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_swift-operator-controller-manager-677c674df7-h9rf2_0c8c19f0-aa8c-4ab0-9b6a-7600684d5bc8/manager/0.log" Mar 11 02:47:06 crc kubenswrapper[4744]: I0311 02:47:06.745555 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_telemetry-operator-controller-manager-6cd66dbd4b-qrwcx_1b8a814c-cd04-49bf-b774-441c96a1faa4/manager/0.log" Mar 11 02:47:06 crc 
kubenswrapper[4744]: I0311 02:47:06.795053 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_test-operator-controller-manager-5c5cb9c4d7-rrbl2_cf1ef450-174d-43d9-b921-ce85078476b4/manager/0.log" Mar 11 02:47:06 crc kubenswrapper[4744]: I0311 02:47:06.961133 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_watcher-operator-controller-manager-6dd88c6f67-qtvcf_27f5807c-3d27-4ffc-bcc7-c43a07d04fa1/manager/0.log" Mar 11 02:47:07 crc kubenswrapper[4744]: I0311 02:47:07.019440 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-manager-6679ddfdc7-8trgb_f341be86-aa09-4703-aa02-29ef571ad003/manager/0.log" Mar 11 02:47:14 crc kubenswrapper[4744]: I0311 02:47:14.535457 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_barbican-operator-controller-manager-677bd678f7-xh4cc_ab0713f2-46ce-4987-852a-18371473f327/manager/0.log" Mar 11 02:47:28 crc kubenswrapper[4744]: I0311 02:47:28.950335 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_control-plane-machine-set-operator-78cbb6b69f-f8rpg_1ab34621-2907-423a-81ef-36fb8377874d/control-plane-machine-set-operator/0.log" Mar 11 02:47:29 crc kubenswrapper[4744]: I0311 02:47:29.121032 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-lws9c_7d1c92dd-43a7-4311-90b1-54441f84787e/kube-rbac-proxy/0.log" Mar 11 02:47:29 crc kubenswrapper[4744]: I0311 02:47:29.150370 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-lws9c_7d1c92dd-43a7-4311-90b1-54441f84787e/machine-api-operator/0.log" Mar 11 02:47:44 crc kubenswrapper[4744]: I0311 02:47:44.094612 4744 log.go:25] "Finished parsing log file" 
path="/var/log/pods/cert-manager_cert-manager-545d4d4674-9dkcx_351f588e-9ef6-498a-a322-b2f00dad1d35/cert-manager-controller/0.log" Mar 11 02:47:44 crc kubenswrapper[4744]: I0311 02:47:44.192539 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-cainjector-5545bd876-gdbpx_9496814f-7ec3-4763-a421-1a050c4b1ff5/cert-manager-cainjector/0.log" Mar 11 02:47:44 crc kubenswrapper[4744]: I0311 02:47:44.300962 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-webhook-6888856db4-wk9vl_68995fde-1ad0-4641-8cb3-2af8f1117cfd/cert-manager-webhook/0.log" Mar 11 02:47:59 crc kubenswrapper[4744]: I0311 02:47:59.259048 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-console-plugin-5dcbbd79cf-6jfrf_3af6f71e-7739-4a00-8f26-42d043c0d179/nmstate-console-plugin/0.log" Mar 11 02:47:59 crc kubenswrapper[4744]: I0311 02:47:59.451269 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-handler-4gtjr_61689687-6d5f-4f04-bd77-cf749c0a77ee/nmstate-handler/0.log" Mar 11 02:47:59 crc kubenswrapper[4744]: I0311 02:47:59.495690 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-69594cc75-k2dqq_5fd864c8-f5b3-4d6d-a9b0-85bb8661f5dc/kube-rbac-proxy/0.log" Mar 11 02:47:59 crc kubenswrapper[4744]: I0311 02:47:59.515120 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-69594cc75-k2dqq_5fd864c8-f5b3-4d6d-a9b0-85bb8661f5dc/nmstate-metrics/0.log" Mar 11 02:47:59 crc kubenswrapper[4744]: I0311 02:47:59.638310 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-operator-75c5dccd6c-lc2cz_385a32da-b61b-4128-b192-6ad240a2a6e8/nmstate-operator/0.log" Mar 11 02:47:59 crc kubenswrapper[4744]: I0311 02:47:59.656688 4744 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-nmstate_nmstate-webhook-786f45cff4-bm8mj_8cc11049-1fae-4c88-acad-91cb1622c0bc/nmstate-webhook/0.log" Mar 11 02:48:00 crc kubenswrapper[4744]: I0311 02:48:00.160643 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29553288-flrqj"] Mar 11 02:48:00 crc kubenswrapper[4744]: E0311 02:48:00.161356 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ca21227a-204d-477a-aae8-a7897115151f" containerName="registry-server" Mar 11 02:48:00 crc kubenswrapper[4744]: I0311 02:48:00.161376 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="ca21227a-204d-477a-aae8-a7897115151f" containerName="registry-server" Mar 11 02:48:00 crc kubenswrapper[4744]: E0311 02:48:00.161390 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="db79fe26-1042-4b93-927f-6c570a90a49d" containerName="extract-content" Mar 11 02:48:00 crc kubenswrapper[4744]: I0311 02:48:00.161398 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="db79fe26-1042-4b93-927f-6c570a90a49d" containerName="extract-content" Mar 11 02:48:00 crc kubenswrapper[4744]: E0311 02:48:00.161415 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="28c2ead7-f71e-45fa-bb1f-7e97702113b7" containerName="extract-utilities" Mar 11 02:48:00 crc kubenswrapper[4744]: I0311 02:48:00.161423 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="28c2ead7-f71e-45fa-bb1f-7e97702113b7" containerName="extract-utilities" Mar 11 02:48:00 crc kubenswrapper[4744]: E0311 02:48:00.161433 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ca21227a-204d-477a-aae8-a7897115151f" containerName="extract-content" Mar 11 02:48:00 crc kubenswrapper[4744]: I0311 02:48:00.161440 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="ca21227a-204d-477a-aae8-a7897115151f" containerName="extract-content" Mar 11 02:48:00 crc kubenswrapper[4744]: E0311 02:48:00.161456 4744 cpu_manager.go:410] 
"RemoveStaleState: removing container" podUID="28c2ead7-f71e-45fa-bb1f-7e97702113b7" containerName="registry-server" Mar 11 02:48:00 crc kubenswrapper[4744]: I0311 02:48:00.161463 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="28c2ead7-f71e-45fa-bb1f-7e97702113b7" containerName="registry-server" Mar 11 02:48:00 crc kubenswrapper[4744]: E0311 02:48:00.161479 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="db79fe26-1042-4b93-927f-6c570a90a49d" containerName="extract-utilities" Mar 11 02:48:00 crc kubenswrapper[4744]: I0311 02:48:00.161487 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="db79fe26-1042-4b93-927f-6c570a90a49d" containerName="extract-utilities" Mar 11 02:48:00 crc kubenswrapper[4744]: E0311 02:48:00.161503 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="db79fe26-1042-4b93-927f-6c570a90a49d" containerName="registry-server" Mar 11 02:48:00 crc kubenswrapper[4744]: I0311 02:48:00.161588 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="db79fe26-1042-4b93-927f-6c570a90a49d" containerName="registry-server" Mar 11 02:48:00 crc kubenswrapper[4744]: E0311 02:48:00.161604 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ca21227a-204d-477a-aae8-a7897115151f" containerName="extract-utilities" Mar 11 02:48:00 crc kubenswrapper[4744]: I0311 02:48:00.161611 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="ca21227a-204d-477a-aae8-a7897115151f" containerName="extract-utilities" Mar 11 02:48:00 crc kubenswrapper[4744]: E0311 02:48:00.161628 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="28c2ead7-f71e-45fa-bb1f-7e97702113b7" containerName="extract-content" Mar 11 02:48:00 crc kubenswrapper[4744]: I0311 02:48:00.161635 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="28c2ead7-f71e-45fa-bb1f-7e97702113b7" containerName="extract-content" Mar 11 02:48:00 crc kubenswrapper[4744]: I0311 02:48:00.161821 4744 memory_manager.go:354] 
"RemoveStaleState removing state" podUID="db79fe26-1042-4b93-927f-6c570a90a49d" containerName="registry-server" Mar 11 02:48:00 crc kubenswrapper[4744]: I0311 02:48:00.161843 4744 memory_manager.go:354] "RemoveStaleState removing state" podUID="28c2ead7-f71e-45fa-bb1f-7e97702113b7" containerName="registry-server" Mar 11 02:48:00 crc kubenswrapper[4744]: I0311 02:48:00.161863 4744 memory_manager.go:354] "RemoveStaleState removing state" podUID="ca21227a-204d-477a-aae8-a7897115151f" containerName="registry-server" Mar 11 02:48:00 crc kubenswrapper[4744]: I0311 02:48:00.162632 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29553288-flrqj" Mar 11 02:48:00 crc kubenswrapper[4744]: I0311 02:48:00.164221 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-rs77s" Mar 11 02:48:00 crc kubenswrapper[4744]: I0311 02:48:00.165972 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 11 02:48:00 crc kubenswrapper[4744]: I0311 02:48:00.168062 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 11 02:48:00 crc kubenswrapper[4744]: I0311 02:48:00.175985 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29553288-flrqj"] Mar 11 02:48:00 crc kubenswrapper[4744]: I0311 02:48:00.297805 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ttdsb\" (UniqueName: \"kubernetes.io/projected/2ce2f4e7-fe5e-4f49-a763-0b113adedb6a-kube-api-access-ttdsb\") pod \"auto-csr-approver-29553288-flrqj\" (UID: \"2ce2f4e7-fe5e-4f49-a763-0b113adedb6a\") " pod="openshift-infra/auto-csr-approver-29553288-flrqj" Mar 11 02:48:00 crc kubenswrapper[4744]: I0311 02:48:00.400548 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-ttdsb\" (UniqueName: \"kubernetes.io/projected/2ce2f4e7-fe5e-4f49-a763-0b113adedb6a-kube-api-access-ttdsb\") pod \"auto-csr-approver-29553288-flrqj\" (UID: \"2ce2f4e7-fe5e-4f49-a763-0b113adedb6a\") " pod="openshift-infra/auto-csr-approver-29553288-flrqj" Mar 11 02:48:00 crc kubenswrapper[4744]: I0311 02:48:00.432618 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ttdsb\" (UniqueName: \"kubernetes.io/projected/2ce2f4e7-fe5e-4f49-a763-0b113adedb6a-kube-api-access-ttdsb\") pod \"auto-csr-approver-29553288-flrqj\" (UID: \"2ce2f4e7-fe5e-4f49-a763-0b113adedb6a\") " pod="openshift-infra/auto-csr-approver-29553288-flrqj" Mar 11 02:48:00 crc kubenswrapper[4744]: I0311 02:48:00.478825 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29553288-flrqj" Mar 11 02:48:00 crc kubenswrapper[4744]: I0311 02:48:00.979441 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29553288-flrqj"] Mar 11 02:48:00 crc kubenswrapper[4744]: W0311 02:48:00.987724 4744 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2ce2f4e7_fe5e_4f49_a763_0b113adedb6a.slice/crio-f2c97fb474acafdf53b887053fb2d2525380716b49e30b7a99367f15670a34ae WatchSource:0}: Error finding container f2c97fb474acafdf53b887053fb2d2525380716b49e30b7a99367f15670a34ae: Status 404 returned error can't find the container with id f2c97fb474acafdf53b887053fb2d2525380716b49e30b7a99367f15670a34ae Mar 11 02:48:01 crc kubenswrapper[4744]: I0311 02:48:01.010000 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29553288-flrqj" event={"ID":"2ce2f4e7-fe5e-4f49-a763-0b113adedb6a","Type":"ContainerStarted","Data":"f2c97fb474acafdf53b887053fb2d2525380716b49e30b7a99367f15670a34ae"} Mar 11 02:48:03 crc kubenswrapper[4744]: I0311 02:48:03.026276 4744 generic.go:334] 
"Generic (PLEG): container finished" podID="2ce2f4e7-fe5e-4f49-a763-0b113adedb6a" containerID="7a2e6060c28d91017dae870ec76075115ef64c2ec16669acc5a9323c4e8e9d1b" exitCode=0 Mar 11 02:48:03 crc kubenswrapper[4744]: I0311 02:48:03.026443 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29553288-flrqj" event={"ID":"2ce2f4e7-fe5e-4f49-a763-0b113adedb6a","Type":"ContainerDied","Data":"7a2e6060c28d91017dae870ec76075115ef64c2ec16669acc5a9323c4e8e9d1b"} Mar 11 02:48:04 crc kubenswrapper[4744]: I0311 02:48:04.362413 4744 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29553288-flrqj" Mar 11 02:48:04 crc kubenswrapper[4744]: I0311 02:48:04.365031 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ttdsb\" (UniqueName: \"kubernetes.io/projected/2ce2f4e7-fe5e-4f49-a763-0b113adedb6a-kube-api-access-ttdsb\") pod \"2ce2f4e7-fe5e-4f49-a763-0b113adedb6a\" (UID: \"2ce2f4e7-fe5e-4f49-a763-0b113adedb6a\") " Mar 11 02:48:04 crc kubenswrapper[4744]: I0311 02:48:04.371798 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2ce2f4e7-fe5e-4f49-a763-0b113adedb6a-kube-api-access-ttdsb" (OuterVolumeSpecName: "kube-api-access-ttdsb") pod "2ce2f4e7-fe5e-4f49-a763-0b113adedb6a" (UID: "2ce2f4e7-fe5e-4f49-a763-0b113adedb6a"). InnerVolumeSpecName "kube-api-access-ttdsb". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 02:48:04 crc kubenswrapper[4744]: I0311 02:48:04.466534 4744 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ttdsb\" (UniqueName: \"kubernetes.io/projected/2ce2f4e7-fe5e-4f49-a763-0b113adedb6a-kube-api-access-ttdsb\") on node \"crc\" DevicePath \"\"" Mar 11 02:48:05 crc kubenswrapper[4744]: I0311 02:48:05.052122 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29553288-flrqj" event={"ID":"2ce2f4e7-fe5e-4f49-a763-0b113adedb6a","Type":"ContainerDied","Data":"f2c97fb474acafdf53b887053fb2d2525380716b49e30b7a99367f15670a34ae"} Mar 11 02:48:05 crc kubenswrapper[4744]: I0311 02:48:05.052159 4744 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f2c97fb474acafdf53b887053fb2d2525380716b49e30b7a99367f15670a34ae" Mar 11 02:48:05 crc kubenswrapper[4744]: I0311 02:48:05.052208 4744 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29553288-flrqj" Mar 11 02:48:05 crc kubenswrapper[4744]: I0311 02:48:05.423013 4744 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29553282-xtj65"] Mar 11 02:48:05 crc kubenswrapper[4744]: I0311 02:48:05.429269 4744 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29553282-xtj65"] Mar 11 02:48:05 crc kubenswrapper[4744]: I0311 02:48:05.988131 4744 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d38d875a-4ced-4b03-9d48-e28f25888a1b" path="/var/lib/kubelet/pods/d38d875a-4ced-4b03-9d48-e28f25888a1b/volumes" Mar 11 02:48:12 crc kubenswrapper[4744]: I0311 02:48:12.409145 4744 patch_prober.go:28] interesting pod/machine-config-daemon-678nx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: 
connection refused" start-of-body= Mar 11 02:48:12 crc kubenswrapper[4744]: I0311 02:48:12.409929 4744 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-678nx" podUID="a15dc7ac-7c34-4135-b6eb-a85122800ce9" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 11 02:48:18 crc kubenswrapper[4744]: I0311 02:48:18.741252 4744 scope.go:117] "RemoveContainer" containerID="8195e789fb44a7f42c07d4aa739650c54ba686ab4836dbc82ee8f6cf38509f36" Mar 11 02:48:31 crc kubenswrapper[4744]: I0311 02:48:31.684136 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-86ddb6bd46-dncxf_6f9ffaa2-fe65-4509-b1a2-4577548128ae/kube-rbac-proxy/0.log" Mar 11 02:48:31 crc kubenswrapper[4744]: I0311 02:48:31.906273 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-65fd2_21573ca2-d902-4d30-b94a-7b5ae891e084/cp-frr-files/0.log" Mar 11 02:48:31 crc kubenswrapper[4744]: I0311 02:48:31.999413 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-86ddb6bd46-dncxf_6f9ffaa2-fe65-4509-b1a2-4577548128ae/controller/0.log" Mar 11 02:48:32 crc kubenswrapper[4744]: I0311 02:48:32.121452 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-65fd2_21573ca2-d902-4d30-b94a-7b5ae891e084/cp-frr-files/0.log" Mar 11 02:48:32 crc kubenswrapper[4744]: I0311 02:48:32.125753 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-65fd2_21573ca2-d902-4d30-b94a-7b5ae891e084/cp-metrics/0.log" Mar 11 02:48:32 crc kubenswrapper[4744]: I0311 02:48:32.140498 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-65fd2_21573ca2-d902-4d30-b94a-7b5ae891e084/cp-reloader/0.log" Mar 11 02:48:32 crc kubenswrapper[4744]: I0311 02:48:32.146990 4744 log.go:25] "Finished 
parsing log file" path="/var/log/pods/metallb-system_frr-k8s-65fd2_21573ca2-d902-4d30-b94a-7b5ae891e084/cp-reloader/0.log" Mar 11 02:48:32 crc kubenswrapper[4744]: I0311 02:48:32.331345 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-65fd2_21573ca2-d902-4d30-b94a-7b5ae891e084/cp-frr-files/0.log" Mar 11 02:48:32 crc kubenswrapper[4744]: I0311 02:48:32.359316 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-65fd2_21573ca2-d902-4d30-b94a-7b5ae891e084/cp-reloader/0.log" Mar 11 02:48:32 crc kubenswrapper[4744]: I0311 02:48:32.360318 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-65fd2_21573ca2-d902-4d30-b94a-7b5ae891e084/cp-metrics/0.log" Mar 11 02:48:32 crc kubenswrapper[4744]: I0311 02:48:32.398259 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-65fd2_21573ca2-d902-4d30-b94a-7b5ae891e084/cp-metrics/0.log" Mar 11 02:48:32 crc kubenswrapper[4744]: I0311 02:48:32.587741 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-65fd2_21573ca2-d902-4d30-b94a-7b5ae891e084/cp-reloader/0.log" Mar 11 02:48:32 crc kubenswrapper[4744]: I0311 02:48:32.588677 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-65fd2_21573ca2-d902-4d30-b94a-7b5ae891e084/cp-frr-files/0.log" Mar 11 02:48:32 crc kubenswrapper[4744]: I0311 02:48:32.601013 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-65fd2_21573ca2-d902-4d30-b94a-7b5ae891e084/cp-metrics/0.log" Mar 11 02:48:32 crc kubenswrapper[4744]: I0311 02:48:32.628289 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-65fd2_21573ca2-d902-4d30-b94a-7b5ae891e084/controller/0.log" Mar 11 02:48:32 crc kubenswrapper[4744]: I0311 02:48:32.768550 4744 log.go:25] "Finished parsing log file" 
path="/var/log/pods/metallb-system_frr-k8s-65fd2_21573ca2-d902-4d30-b94a-7b5ae891e084/frr-metrics/0.log" Mar 11 02:48:32 crc kubenswrapper[4744]: I0311 02:48:32.771358 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-65fd2_21573ca2-d902-4d30-b94a-7b5ae891e084/kube-rbac-proxy/0.log" Mar 11 02:48:32 crc kubenswrapper[4744]: I0311 02:48:32.832362 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-65fd2_21573ca2-d902-4d30-b94a-7b5ae891e084/kube-rbac-proxy-frr/0.log" Mar 11 02:48:32 crc kubenswrapper[4744]: I0311 02:48:32.989622 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-65fd2_21573ca2-d902-4d30-b94a-7b5ae891e084/reloader/0.log" Mar 11 02:48:33 crc kubenswrapper[4744]: I0311 02:48:33.057115 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-webhook-server-7f989f654f-pm8pv_0cf3eb75-8045-4beb-b5bf-68879b344482/frr-k8s-webhook-server/0.log" Mar 11 02:48:33 crc kubenswrapper[4744]: I0311 02:48:33.236456 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-controller-manager-7c4458d67b-6zbc2_e20b89e2-b171-4bef-877a-b8670fb99ce4/manager/0.log" Mar 11 02:48:33 crc kubenswrapper[4744]: I0311 02:48:33.335465 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-webhook-server-76dbb5d8c-qwk8s_743b9bba-abd5-45d5-b1ce-b59c1e7182a6/webhook-server/0.log" Mar 11 02:48:33 crc kubenswrapper[4744]: I0311 02:48:33.442065 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-czr59_bf3cb003-2eb3-4bd2-9ed2-4e2db93f0c49/kube-rbac-proxy/0.log" Mar 11 02:48:34 crc kubenswrapper[4744]: I0311 02:48:34.054137 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-czr59_bf3cb003-2eb3-4bd2-9ed2-4e2db93f0c49/speaker/0.log" Mar 11 02:48:34 crc kubenswrapper[4744]: I0311 02:48:34.766733 4744 log.go:25] 
"Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-65fd2_21573ca2-d902-4d30-b94a-7b5ae891e084/frr/0.log" Mar 11 02:48:42 crc kubenswrapper[4744]: I0311 02:48:42.409300 4744 patch_prober.go:28] interesting pod/machine-config-daemon-678nx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 11 02:48:42 crc kubenswrapper[4744]: I0311 02:48:42.409866 4744 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-678nx" podUID="a15dc7ac-7c34-4135-b6eb-a85122800ce9" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 11 02:48:49 crc kubenswrapper[4744]: I0311 02:48:49.436894 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82lpr5r_e5525fbf-f26b-400d-bcb1-1489bcfc7476/util/0.log" Mar 11 02:48:49 crc kubenswrapper[4744]: I0311 02:48:49.566824 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82lpr5r_e5525fbf-f26b-400d-bcb1-1489bcfc7476/pull/0.log" Mar 11 02:48:49 crc kubenswrapper[4744]: I0311 02:48:49.569961 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82lpr5r_e5525fbf-f26b-400d-bcb1-1489bcfc7476/util/0.log" Mar 11 02:48:49 crc kubenswrapper[4744]: I0311 02:48:49.646987 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82lpr5r_e5525fbf-f26b-400d-bcb1-1489bcfc7476/pull/0.log" Mar 11 02:48:49 crc kubenswrapper[4744]: I0311 02:48:49.741110 4744 log.go:25] 
"Finished parsing log file" path="/var/log/pods/openshift-marketplace_0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82lpr5r_e5525fbf-f26b-400d-bcb1-1489bcfc7476/extract/0.log" Mar 11 02:48:49 crc kubenswrapper[4744]: I0311 02:48:49.745943 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82lpr5r_e5525fbf-f26b-400d-bcb1-1489bcfc7476/util/0.log" Mar 11 02:48:49 crc kubenswrapper[4744]: I0311 02:48:49.754550 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82lpr5r_e5525fbf-f26b-400d-bcb1-1489bcfc7476/pull/0.log" Mar 11 02:48:49 crc kubenswrapper[4744]: I0311 02:48:49.881282 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e59qt5s_807021c7-f538-4f3a-abb8-b5eecaa837b0/util/0.log" Mar 11 02:48:50 crc kubenswrapper[4744]: I0311 02:48:50.093739 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e59qt5s_807021c7-f538-4f3a-abb8-b5eecaa837b0/util/0.log" Mar 11 02:48:50 crc kubenswrapper[4744]: I0311 02:48:50.133837 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e59qt5s_807021c7-f538-4f3a-abb8-b5eecaa837b0/pull/0.log" Mar 11 02:48:50 crc kubenswrapper[4744]: I0311 02:48:50.155399 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e59qt5s_807021c7-f538-4f3a-abb8-b5eecaa837b0/pull/0.log" Mar 11 02:48:50 crc kubenswrapper[4744]: I0311 02:48:50.280980 4744 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e59qt5s_807021c7-f538-4f3a-abb8-b5eecaa837b0/util/0.log" Mar 11 02:48:50 crc kubenswrapper[4744]: I0311 02:48:50.340327 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e59qt5s_807021c7-f538-4f3a-abb8-b5eecaa837b0/extract/0.log" Mar 11 02:48:50 crc kubenswrapper[4744]: I0311 02:48:50.354236 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e59qt5s_807021c7-f538-4f3a-abb8-b5eecaa837b0/pull/0.log" Mar 11 02:48:50 crc kubenswrapper[4744]: I0311 02:48:50.526006 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-tf2dx_62146385-9b56-4dcc-9698-f63685b49374/extract-utilities/0.log" Mar 11 02:48:50 crc kubenswrapper[4744]: I0311 02:48:50.637756 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-tf2dx_62146385-9b56-4dcc-9698-f63685b49374/extract-content/0.log" Mar 11 02:48:50 crc kubenswrapper[4744]: I0311 02:48:50.647017 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-tf2dx_62146385-9b56-4dcc-9698-f63685b49374/extract-utilities/0.log" Mar 11 02:48:50 crc kubenswrapper[4744]: I0311 02:48:50.647149 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-tf2dx_62146385-9b56-4dcc-9698-f63685b49374/extract-content/0.log" Mar 11 02:48:50 crc kubenswrapper[4744]: I0311 02:48:50.785861 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-tf2dx_62146385-9b56-4dcc-9698-f63685b49374/extract-content/0.log" Mar 11 02:48:50 crc kubenswrapper[4744]: I0311 02:48:50.816076 4744 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_certified-operators-tf2dx_62146385-9b56-4dcc-9698-f63685b49374/extract-utilities/0.log" Mar 11 02:48:50 crc kubenswrapper[4744]: I0311 02:48:50.971211 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-ng6wp_34fd0e84-9ac8-4c64-94e2-9e774f709cda/extract-utilities/0.log" Mar 11 02:48:51 crc kubenswrapper[4744]: I0311 02:48:51.189240 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-ng6wp_34fd0e84-9ac8-4c64-94e2-9e774f709cda/extract-content/0.log" Mar 11 02:48:51 crc kubenswrapper[4744]: I0311 02:48:51.228306 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-ng6wp_34fd0e84-9ac8-4c64-94e2-9e774f709cda/extract-content/0.log" Mar 11 02:48:51 crc kubenswrapper[4744]: I0311 02:48:51.230632 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-ng6wp_34fd0e84-9ac8-4c64-94e2-9e774f709cda/extract-utilities/0.log" Mar 11 02:48:51 crc kubenswrapper[4744]: I0311 02:48:51.377273 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-ng6wp_34fd0e84-9ac8-4c64-94e2-9e774f709cda/extract-utilities/0.log" Mar 11 02:48:51 crc kubenswrapper[4744]: I0311 02:48:51.525351 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-ng6wp_34fd0e84-9ac8-4c64-94e2-9e774f709cda/extract-content/0.log" Mar 11 02:48:51 crc kubenswrapper[4744]: I0311 02:48:51.686521 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4mmpw4_38f80f5b-94aa-4852-a041-427b37320e97/util/0.log" Mar 11 02:48:51 crc kubenswrapper[4744]: I0311 02:48:51.764252 4744 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_certified-operators-tf2dx_62146385-9b56-4dcc-9698-f63685b49374/registry-server/0.log" Mar 11 02:48:51 crc kubenswrapper[4744]: I0311 02:48:51.845587 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4mmpw4_38f80f5b-94aa-4852-a041-427b37320e97/util/0.log" Mar 11 02:48:51 crc kubenswrapper[4744]: I0311 02:48:51.880764 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4mmpw4_38f80f5b-94aa-4852-a041-427b37320e97/pull/0.log" Mar 11 02:48:51 crc kubenswrapper[4744]: I0311 02:48:51.993642 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4mmpw4_38f80f5b-94aa-4852-a041-427b37320e97/pull/0.log" Mar 11 02:48:52 crc kubenswrapper[4744]: I0311 02:48:52.186707 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4mmpw4_38f80f5b-94aa-4852-a041-427b37320e97/pull/0.log" Mar 11 02:48:52 crc kubenswrapper[4744]: I0311 02:48:52.206595 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4mmpw4_38f80f5b-94aa-4852-a041-427b37320e97/util/0.log" Mar 11 02:48:52 crc kubenswrapper[4744]: I0311 02:48:52.228899 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4mmpw4_38f80f5b-94aa-4852-a041-427b37320e97/extract/0.log" Mar 11 02:48:52 crc kubenswrapper[4744]: I0311 02:48:52.393880 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-ng6wp_34fd0e84-9ac8-4c64-94e2-9e774f709cda/registry-server/0.log" Mar 11 02:48:52 crc kubenswrapper[4744]: I0311 
02:48:52.426072 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_marketplace-operator-79b997595-mbjqk_14ecea06-1017-42af-b26b-2859e4f4db7f/marketplace-operator/0.log" Mar 11 02:48:52 crc kubenswrapper[4744]: I0311 02:48:52.543728 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-4z9p9_7ea4c0d9-6a60-4dbf-b768-4a2c39b36f24/extract-utilities/0.log" Mar 11 02:48:52 crc kubenswrapper[4744]: I0311 02:48:52.700312 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-4z9p9_7ea4c0d9-6a60-4dbf-b768-4a2c39b36f24/extract-utilities/0.log" Mar 11 02:48:52 crc kubenswrapper[4744]: I0311 02:48:52.732482 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-4z9p9_7ea4c0d9-6a60-4dbf-b768-4a2c39b36f24/extract-content/0.log" Mar 11 02:48:52 crc kubenswrapper[4744]: I0311 02:48:52.744811 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-4z9p9_7ea4c0d9-6a60-4dbf-b768-4a2c39b36f24/extract-content/0.log" Mar 11 02:48:52 crc kubenswrapper[4744]: I0311 02:48:52.887164 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-4z9p9_7ea4c0d9-6a60-4dbf-b768-4a2c39b36f24/extract-content/0.log" Mar 11 02:48:52 crc kubenswrapper[4744]: I0311 02:48:52.899774 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-4z9p9_7ea4c0d9-6a60-4dbf-b768-4a2c39b36f24/extract-utilities/0.log" Mar 11 02:48:53 crc kubenswrapper[4744]: I0311 02:48:53.099648 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-9h79z_3b2fa563-23f0-4670-a9db-c24f901242ba/extract-utilities/0.log" Mar 11 02:48:53 crc kubenswrapper[4744]: I0311 02:48:53.135754 4744 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_redhat-marketplace-4z9p9_7ea4c0d9-6a60-4dbf-b768-4a2c39b36f24/registry-server/0.log" Mar 11 02:48:53 crc kubenswrapper[4744]: I0311 02:48:53.251383 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-9h79z_3b2fa563-23f0-4670-a9db-c24f901242ba/extract-utilities/0.log" Mar 11 02:48:53 crc kubenswrapper[4744]: I0311 02:48:53.252246 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-9h79z_3b2fa563-23f0-4670-a9db-c24f901242ba/extract-content/0.log" Mar 11 02:48:53 crc kubenswrapper[4744]: I0311 02:48:53.281949 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-9h79z_3b2fa563-23f0-4670-a9db-c24f901242ba/extract-content/0.log" Mar 11 02:48:53 crc kubenswrapper[4744]: I0311 02:48:53.467731 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-9h79z_3b2fa563-23f0-4670-a9db-c24f901242ba/extract-utilities/0.log" Mar 11 02:48:53 crc kubenswrapper[4744]: I0311 02:48:53.469920 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-9h79z_3b2fa563-23f0-4670-a9db-c24f901242ba/extract-content/0.log" Mar 11 02:48:54 crc kubenswrapper[4744]: I0311 02:48:54.183268 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-9h79z_3b2fa563-23f0-4670-a9db-c24f901242ba/registry-server/0.log" Mar 11 02:49:12 crc kubenswrapper[4744]: I0311 02:49:12.409323 4744 patch_prober.go:28] interesting pod/machine-config-daemon-678nx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 11 02:49:12 crc kubenswrapper[4744]: I0311 02:49:12.411372 4744 prober.go:107] "Probe failed" probeType="Liveness" 
pod="openshift-machine-config-operator/machine-config-daemon-678nx" podUID="a15dc7ac-7c34-4135-b6eb-a85122800ce9" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 11 02:49:12 crc kubenswrapper[4744]: I0311 02:49:12.411621 4744 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-678nx" Mar 11 02:49:12 crc kubenswrapper[4744]: I0311 02:49:12.412507 4744 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"72c4c724e8cb1e13760e1df0ad3d11a092cb6f1b4892570fe646d673551f7a5b"} pod="openshift-machine-config-operator/machine-config-daemon-678nx" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 11 02:49:12 crc kubenswrapper[4744]: I0311 02:49:12.412758 4744 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-678nx" podUID="a15dc7ac-7c34-4135-b6eb-a85122800ce9" containerName="machine-config-daemon" containerID="cri-o://72c4c724e8cb1e13760e1df0ad3d11a092cb6f1b4892570fe646d673551f7a5b" gracePeriod=600 Mar 11 02:49:12 crc kubenswrapper[4744]: I0311 02:49:12.641680 4744 generic.go:334] "Generic (PLEG): container finished" podID="a15dc7ac-7c34-4135-b6eb-a85122800ce9" containerID="72c4c724e8cb1e13760e1df0ad3d11a092cb6f1b4892570fe646d673551f7a5b" exitCode=0 Mar 11 02:49:12 crc kubenswrapper[4744]: I0311 02:49:12.641761 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-678nx" event={"ID":"a15dc7ac-7c34-4135-b6eb-a85122800ce9","Type":"ContainerDied","Data":"72c4c724e8cb1e13760e1df0ad3d11a092cb6f1b4892570fe646d673551f7a5b"} Mar 11 02:49:12 crc kubenswrapper[4744]: I0311 02:49:12.641948 4744 scope.go:117] "RemoveContainer" 
containerID="7b8ee7052cfbcac075f3aaf8e56a0a7732d17a49604ba786945b064164a25ca7" Mar 11 02:49:13 crc kubenswrapper[4744]: I0311 02:49:13.650413 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-678nx" event={"ID":"a15dc7ac-7c34-4135-b6eb-a85122800ce9","Type":"ContainerStarted","Data":"867fe62c68fc64bbbe9fc068c1ec0f2fbc22ec45bffc65a7807699fdd918c7b7"} Mar 11 02:49:21 crc kubenswrapper[4744]: I0311 02:49:21.770116 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-zftlg"] Mar 11 02:49:21 crc kubenswrapper[4744]: E0311 02:49:21.770925 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2ce2f4e7-fe5e-4f49-a763-0b113adedb6a" containerName="oc" Mar 11 02:49:21 crc kubenswrapper[4744]: I0311 02:49:21.770940 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="2ce2f4e7-fe5e-4f49-a763-0b113adedb6a" containerName="oc" Mar 11 02:49:21 crc kubenswrapper[4744]: I0311 02:49:21.771281 4744 memory_manager.go:354] "RemoveStaleState removing state" podUID="2ce2f4e7-fe5e-4f49-a763-0b113adedb6a" containerName="oc" Mar 11 02:49:21 crc kubenswrapper[4744]: I0311 02:49:21.772903 4744 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-zftlg" Mar 11 02:49:21 crc kubenswrapper[4744]: I0311 02:49:21.785761 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-zftlg"] Mar 11 02:49:21 crc kubenswrapper[4744]: I0311 02:49:21.874141 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jhwk9\" (UniqueName: \"kubernetes.io/projected/54cce732-86e1-49a1-b2c1-58658e666ded-kube-api-access-jhwk9\") pod \"certified-operators-zftlg\" (UID: \"54cce732-86e1-49a1-b2c1-58658e666ded\") " pod="openshift-marketplace/certified-operators-zftlg" Mar 11 02:49:21 crc kubenswrapper[4744]: I0311 02:49:21.874179 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/54cce732-86e1-49a1-b2c1-58658e666ded-utilities\") pod \"certified-operators-zftlg\" (UID: \"54cce732-86e1-49a1-b2c1-58658e666ded\") " pod="openshift-marketplace/certified-operators-zftlg" Mar 11 02:49:21 crc kubenswrapper[4744]: I0311 02:49:21.874368 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/54cce732-86e1-49a1-b2c1-58658e666ded-catalog-content\") pod \"certified-operators-zftlg\" (UID: \"54cce732-86e1-49a1-b2c1-58658e666ded\") " pod="openshift-marketplace/certified-operators-zftlg" Mar 11 02:49:21 crc kubenswrapper[4744]: I0311 02:49:21.975613 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/54cce732-86e1-49a1-b2c1-58658e666ded-catalog-content\") pod \"certified-operators-zftlg\" (UID: \"54cce732-86e1-49a1-b2c1-58658e666ded\") " pod="openshift-marketplace/certified-operators-zftlg" Mar 11 02:49:21 crc kubenswrapper[4744]: I0311 02:49:21.975703 4744 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-jhwk9\" (UniqueName: \"kubernetes.io/projected/54cce732-86e1-49a1-b2c1-58658e666ded-kube-api-access-jhwk9\") pod \"certified-operators-zftlg\" (UID: \"54cce732-86e1-49a1-b2c1-58658e666ded\") " pod="openshift-marketplace/certified-operators-zftlg" Mar 11 02:49:21 crc kubenswrapper[4744]: I0311 02:49:21.975722 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/54cce732-86e1-49a1-b2c1-58658e666ded-utilities\") pod \"certified-operators-zftlg\" (UID: \"54cce732-86e1-49a1-b2c1-58658e666ded\") " pod="openshift-marketplace/certified-operators-zftlg" Mar 11 02:49:21 crc kubenswrapper[4744]: I0311 02:49:21.976132 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/54cce732-86e1-49a1-b2c1-58658e666ded-utilities\") pod \"certified-operators-zftlg\" (UID: \"54cce732-86e1-49a1-b2c1-58658e666ded\") " pod="openshift-marketplace/certified-operators-zftlg" Mar 11 02:49:21 crc kubenswrapper[4744]: I0311 02:49:21.976337 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/54cce732-86e1-49a1-b2c1-58658e666ded-catalog-content\") pod \"certified-operators-zftlg\" (UID: \"54cce732-86e1-49a1-b2c1-58658e666ded\") " pod="openshift-marketplace/certified-operators-zftlg" Mar 11 02:49:21 crc kubenswrapper[4744]: I0311 02:49:21.994295 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jhwk9\" (UniqueName: \"kubernetes.io/projected/54cce732-86e1-49a1-b2c1-58658e666ded-kube-api-access-jhwk9\") pod \"certified-operators-zftlg\" (UID: \"54cce732-86e1-49a1-b2c1-58658e666ded\") " pod="openshift-marketplace/certified-operators-zftlg" Mar 11 02:49:22 crc kubenswrapper[4744]: I0311 02:49:22.092442 4744 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-zftlg" Mar 11 02:49:22 crc kubenswrapper[4744]: I0311 02:49:22.546477 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-zftlg"] Mar 11 02:49:22 crc kubenswrapper[4744]: I0311 02:49:22.730681 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-zftlg" event={"ID":"54cce732-86e1-49a1-b2c1-58658e666ded","Type":"ContainerStarted","Data":"5dfc304f3a426e6cc63535911f91b81105d49f6765137ef2809dc82fb4271095"} Mar 11 02:49:23 crc kubenswrapper[4744]: I0311 02:49:23.739270 4744 generic.go:334] "Generic (PLEG): container finished" podID="54cce732-86e1-49a1-b2c1-58658e666ded" containerID="547fa18467278e31c40b2ebb4f7a2a8c2bbac53fd9ca67c6f82de0fe11ee2c01" exitCode=0 Mar 11 02:49:23 crc kubenswrapper[4744]: I0311 02:49:23.739495 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-zftlg" event={"ID":"54cce732-86e1-49a1-b2c1-58658e666ded","Type":"ContainerDied","Data":"547fa18467278e31c40b2ebb4f7a2a8c2bbac53fd9ca67c6f82de0fe11ee2c01"} Mar 11 02:49:23 crc kubenswrapper[4744]: I0311 02:49:23.741558 4744 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 11 02:49:24 crc kubenswrapper[4744]: I0311 02:49:24.752320 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-zftlg" event={"ID":"54cce732-86e1-49a1-b2c1-58658e666ded","Type":"ContainerStarted","Data":"5a823202bef47dda3e14a27bc882a3f0bd7f3182233ecb8e46f78d0e6c0f1559"} Mar 11 02:49:25 crc kubenswrapper[4744]: I0311 02:49:25.764432 4744 generic.go:334] "Generic (PLEG): container finished" podID="54cce732-86e1-49a1-b2c1-58658e666ded" containerID="5a823202bef47dda3e14a27bc882a3f0bd7f3182233ecb8e46f78d0e6c0f1559" exitCode=0 Mar 11 02:49:25 crc kubenswrapper[4744]: I0311 02:49:25.764556 4744 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openshift-marketplace/certified-operators-zftlg" event={"ID":"54cce732-86e1-49a1-b2c1-58658e666ded","Type":"ContainerDied","Data":"5a823202bef47dda3e14a27bc882a3f0bd7f3182233ecb8e46f78d0e6c0f1559"} Mar 11 02:49:26 crc kubenswrapper[4744]: I0311 02:49:26.777780 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-zftlg" event={"ID":"54cce732-86e1-49a1-b2c1-58658e666ded","Type":"ContainerStarted","Data":"43a3ceaa0ad5ec47be626ccad6ee847a0c8908898f7de3e120d3bc7cc1aa63ba"} Mar 11 02:49:26 crc kubenswrapper[4744]: I0311 02:49:26.812036 4744 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-zftlg" podStartSLOduration=3.387957171 podStartE2EDuration="5.812010677s" podCreationTimestamp="2026-03-11 02:49:21 +0000 UTC" firstStartedPulling="2026-03-11 02:49:23.741327271 +0000 UTC m=+6920.545544876" lastFinishedPulling="2026-03-11 02:49:26.165380767 +0000 UTC m=+6922.969598382" observedRunningTime="2026-03-11 02:49:26.809014426 +0000 UTC m=+6923.613232061" watchObservedRunningTime="2026-03-11 02:49:26.812010677 +0000 UTC m=+6923.616228322" Mar 11 02:49:32 crc kubenswrapper[4744]: I0311 02:49:32.093397 4744 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-zftlg" Mar 11 02:49:32 crc kubenswrapper[4744]: I0311 02:49:32.093922 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-zftlg" Mar 11 02:49:32 crc kubenswrapper[4744]: I0311 02:49:32.135824 4744 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-zftlg" Mar 11 02:49:32 crc kubenswrapper[4744]: I0311 02:49:32.907894 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-zftlg" Mar 11 02:49:32 crc kubenswrapper[4744]: I0311 
02:49:32.973246 4744 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-zftlg"] Mar 11 02:49:34 crc kubenswrapper[4744]: I0311 02:49:34.850098 4744 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-zftlg" podUID="54cce732-86e1-49a1-b2c1-58658e666ded" containerName="registry-server" containerID="cri-o://43a3ceaa0ad5ec47be626ccad6ee847a0c8908898f7de3e120d3bc7cc1aa63ba" gracePeriod=2 Mar 11 02:49:35 crc kubenswrapper[4744]: I0311 02:49:35.429430 4744 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-zftlg" Mar 11 02:49:35 crc kubenswrapper[4744]: I0311 02:49:35.526461 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jhwk9\" (UniqueName: \"kubernetes.io/projected/54cce732-86e1-49a1-b2c1-58658e666ded-kube-api-access-jhwk9\") pod \"54cce732-86e1-49a1-b2c1-58658e666ded\" (UID: \"54cce732-86e1-49a1-b2c1-58658e666ded\") " Mar 11 02:49:35 crc kubenswrapper[4744]: I0311 02:49:35.526542 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/54cce732-86e1-49a1-b2c1-58658e666ded-utilities\") pod \"54cce732-86e1-49a1-b2c1-58658e666ded\" (UID: \"54cce732-86e1-49a1-b2c1-58658e666ded\") " Mar 11 02:49:35 crc kubenswrapper[4744]: I0311 02:49:35.526623 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/54cce732-86e1-49a1-b2c1-58658e666ded-catalog-content\") pod \"54cce732-86e1-49a1-b2c1-58658e666ded\" (UID: \"54cce732-86e1-49a1-b2c1-58658e666ded\") " Mar 11 02:49:35 crc kubenswrapper[4744]: I0311 02:49:35.528976 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/54cce732-86e1-49a1-b2c1-58658e666ded-utilities" (OuterVolumeSpecName: 
"utilities") pod "54cce732-86e1-49a1-b2c1-58658e666ded" (UID: "54cce732-86e1-49a1-b2c1-58658e666ded"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 11 02:49:35 crc kubenswrapper[4744]: I0311 02:49:35.536823 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/54cce732-86e1-49a1-b2c1-58658e666ded-kube-api-access-jhwk9" (OuterVolumeSpecName: "kube-api-access-jhwk9") pod "54cce732-86e1-49a1-b2c1-58658e666ded" (UID: "54cce732-86e1-49a1-b2c1-58658e666ded"). InnerVolumeSpecName "kube-api-access-jhwk9". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 02:49:35 crc kubenswrapper[4744]: I0311 02:49:35.578007 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/54cce732-86e1-49a1-b2c1-58658e666ded-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "54cce732-86e1-49a1-b2c1-58658e666ded" (UID: "54cce732-86e1-49a1-b2c1-58658e666ded"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 11 02:49:35 crc kubenswrapper[4744]: I0311 02:49:35.628563 4744 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/54cce732-86e1-49a1-b2c1-58658e666ded-utilities\") on node \"crc\" DevicePath \"\"" Mar 11 02:49:35 crc kubenswrapper[4744]: I0311 02:49:35.628592 4744 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/54cce732-86e1-49a1-b2c1-58658e666ded-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 11 02:49:35 crc kubenswrapper[4744]: I0311 02:49:35.628603 4744 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jhwk9\" (UniqueName: \"kubernetes.io/projected/54cce732-86e1-49a1-b2c1-58658e666ded-kube-api-access-jhwk9\") on node \"crc\" DevicePath \"\"" Mar 11 02:49:35 crc kubenswrapper[4744]: I0311 02:49:35.858119 4744 generic.go:334] "Generic (PLEG): container finished" podID="54cce732-86e1-49a1-b2c1-58658e666ded" containerID="43a3ceaa0ad5ec47be626ccad6ee847a0c8908898f7de3e120d3bc7cc1aa63ba" exitCode=0 Mar 11 02:49:35 crc kubenswrapper[4744]: I0311 02:49:35.858160 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-zftlg" event={"ID":"54cce732-86e1-49a1-b2c1-58658e666ded","Type":"ContainerDied","Data":"43a3ceaa0ad5ec47be626ccad6ee847a0c8908898f7de3e120d3bc7cc1aa63ba"} Mar 11 02:49:35 crc kubenswrapper[4744]: I0311 02:49:35.858183 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-zftlg" event={"ID":"54cce732-86e1-49a1-b2c1-58658e666ded","Type":"ContainerDied","Data":"5dfc304f3a426e6cc63535911f91b81105d49f6765137ef2809dc82fb4271095"} Mar 11 02:49:35 crc kubenswrapper[4744]: I0311 02:49:35.858199 4744 scope.go:117] "RemoveContainer" containerID="43a3ceaa0ad5ec47be626ccad6ee847a0c8908898f7de3e120d3bc7cc1aa63ba" Mar 11 02:49:35 crc kubenswrapper[4744]: I0311 
02:49:35.858297 4744 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-zftlg" Mar 11 02:49:35 crc kubenswrapper[4744]: I0311 02:49:35.901376 4744 scope.go:117] "RemoveContainer" containerID="5a823202bef47dda3e14a27bc882a3f0bd7f3182233ecb8e46f78d0e6c0f1559" Mar 11 02:49:35 crc kubenswrapper[4744]: I0311 02:49:35.920960 4744 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-zftlg"] Mar 11 02:49:35 crc kubenswrapper[4744]: I0311 02:49:35.942271 4744 scope.go:117] "RemoveContainer" containerID="547fa18467278e31c40b2ebb4f7a2a8c2bbac53fd9ca67c6f82de0fe11ee2c01" Mar 11 02:49:35 crc kubenswrapper[4744]: I0311 02:49:35.944292 4744 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-zftlg"] Mar 11 02:49:35 crc kubenswrapper[4744]: I0311 02:49:35.997909 4744 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="54cce732-86e1-49a1-b2c1-58658e666ded" path="/var/lib/kubelet/pods/54cce732-86e1-49a1-b2c1-58658e666ded/volumes" Mar 11 02:49:36 crc kubenswrapper[4744]: I0311 02:49:36.014533 4744 scope.go:117] "RemoveContainer" containerID="43a3ceaa0ad5ec47be626ccad6ee847a0c8908898f7de3e120d3bc7cc1aa63ba" Mar 11 02:49:36 crc kubenswrapper[4744]: E0311 02:49:36.014950 4744 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"43a3ceaa0ad5ec47be626ccad6ee847a0c8908898f7de3e120d3bc7cc1aa63ba\": container with ID starting with 43a3ceaa0ad5ec47be626ccad6ee847a0c8908898f7de3e120d3bc7cc1aa63ba not found: ID does not exist" containerID="43a3ceaa0ad5ec47be626ccad6ee847a0c8908898f7de3e120d3bc7cc1aa63ba" Mar 11 02:49:36 crc kubenswrapper[4744]: I0311 02:49:36.014980 4744 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"43a3ceaa0ad5ec47be626ccad6ee847a0c8908898f7de3e120d3bc7cc1aa63ba"} err="failed to get 
container status \"43a3ceaa0ad5ec47be626ccad6ee847a0c8908898f7de3e120d3bc7cc1aa63ba\": rpc error: code = NotFound desc = could not find container \"43a3ceaa0ad5ec47be626ccad6ee847a0c8908898f7de3e120d3bc7cc1aa63ba\": container with ID starting with 43a3ceaa0ad5ec47be626ccad6ee847a0c8908898f7de3e120d3bc7cc1aa63ba not found: ID does not exist" Mar 11 02:49:36 crc kubenswrapper[4744]: I0311 02:49:36.015002 4744 scope.go:117] "RemoveContainer" containerID="5a823202bef47dda3e14a27bc882a3f0bd7f3182233ecb8e46f78d0e6c0f1559" Mar 11 02:49:36 crc kubenswrapper[4744]: E0311 02:49:36.015683 4744 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5a823202bef47dda3e14a27bc882a3f0bd7f3182233ecb8e46f78d0e6c0f1559\": container with ID starting with 5a823202bef47dda3e14a27bc882a3f0bd7f3182233ecb8e46f78d0e6c0f1559 not found: ID does not exist" containerID="5a823202bef47dda3e14a27bc882a3f0bd7f3182233ecb8e46f78d0e6c0f1559" Mar 11 02:49:36 crc kubenswrapper[4744]: I0311 02:49:36.015706 4744 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5a823202bef47dda3e14a27bc882a3f0bd7f3182233ecb8e46f78d0e6c0f1559"} err="failed to get container status \"5a823202bef47dda3e14a27bc882a3f0bd7f3182233ecb8e46f78d0e6c0f1559\": rpc error: code = NotFound desc = could not find container \"5a823202bef47dda3e14a27bc882a3f0bd7f3182233ecb8e46f78d0e6c0f1559\": container with ID starting with 5a823202bef47dda3e14a27bc882a3f0bd7f3182233ecb8e46f78d0e6c0f1559 not found: ID does not exist" Mar 11 02:49:36 crc kubenswrapper[4744]: I0311 02:49:36.015721 4744 scope.go:117] "RemoveContainer" containerID="547fa18467278e31c40b2ebb4f7a2a8c2bbac53fd9ca67c6f82de0fe11ee2c01" Mar 11 02:49:36 crc kubenswrapper[4744]: E0311 02:49:36.015964 4744 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"547fa18467278e31c40b2ebb4f7a2a8c2bbac53fd9ca67c6f82de0fe11ee2c01\": container with ID starting with 547fa18467278e31c40b2ebb4f7a2a8c2bbac53fd9ca67c6f82de0fe11ee2c01 not found: ID does not exist" containerID="547fa18467278e31c40b2ebb4f7a2a8c2bbac53fd9ca67c6f82de0fe11ee2c01"
Mar 11 02:49:36 crc kubenswrapper[4744]: I0311 02:49:36.015982 4744 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"547fa18467278e31c40b2ebb4f7a2a8c2bbac53fd9ca67c6f82de0fe11ee2c01"} err="failed to get container status \"547fa18467278e31c40b2ebb4f7a2a8c2bbac53fd9ca67c6f82de0fe11ee2c01\": rpc error: code = NotFound desc = could not find container \"547fa18467278e31c40b2ebb4f7a2a8c2bbac53fd9ca67c6f82de0fe11ee2c01\": container with ID starting with 547fa18467278e31c40b2ebb4f7a2a8c2bbac53fd9ca67c6f82de0fe11ee2c01 not found: ID does not exist"
Mar 11 02:50:00 crc kubenswrapper[4744]: I0311 02:50:00.179482 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29553290-x69r4"]
Mar 11 02:50:00 crc kubenswrapper[4744]: E0311 02:50:00.180578 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="54cce732-86e1-49a1-b2c1-58658e666ded" containerName="extract-utilities"
Mar 11 02:50:00 crc kubenswrapper[4744]: I0311 02:50:00.180599 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="54cce732-86e1-49a1-b2c1-58658e666ded" containerName="extract-utilities"
Mar 11 02:50:00 crc kubenswrapper[4744]: E0311 02:50:00.180623 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="54cce732-86e1-49a1-b2c1-58658e666ded" containerName="registry-server"
Mar 11 02:50:00 crc kubenswrapper[4744]: I0311 02:50:00.180635 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="54cce732-86e1-49a1-b2c1-58658e666ded" containerName="registry-server"
Mar 11 02:50:00 crc kubenswrapper[4744]: E0311 02:50:00.180674 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="54cce732-86e1-49a1-b2c1-58658e666ded" containerName="extract-content"
Mar 11 02:50:00 crc kubenswrapper[4744]: I0311 02:50:00.180686 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="54cce732-86e1-49a1-b2c1-58658e666ded" containerName="extract-content"
Mar 11 02:50:00 crc kubenswrapper[4744]: I0311 02:50:00.180992 4744 memory_manager.go:354] "RemoveStaleState removing state" podUID="54cce732-86e1-49a1-b2c1-58658e666ded" containerName="registry-server"
Mar 11 02:50:00 crc kubenswrapper[4744]: I0311 02:50:00.184915 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29553290-x69r4"
Mar 11 02:50:00 crc kubenswrapper[4744]: I0311 02:50:00.187566 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-rs77s"
Mar 11 02:50:00 crc kubenswrapper[4744]: I0311 02:50:00.187923 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Mar 11 02:50:00 crc kubenswrapper[4744]: I0311 02:50:00.188419 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Mar 11 02:50:00 crc kubenswrapper[4744]: I0311 02:50:00.194887 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29553290-x69r4"]
Mar 11 02:50:00 crc kubenswrapper[4744]: I0311 02:50:00.362078 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bcn97\" (UniqueName: \"kubernetes.io/projected/cdb66e28-b1fe-4d4e-bcb0-b1f9d590fecc-kube-api-access-bcn97\") pod \"auto-csr-approver-29553290-x69r4\" (UID: \"cdb66e28-b1fe-4d4e-bcb0-b1f9d590fecc\") " pod="openshift-infra/auto-csr-approver-29553290-x69r4"
Mar 11 02:50:00 crc kubenswrapper[4744]: I0311 02:50:00.463616 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bcn97\" (UniqueName: \"kubernetes.io/projected/cdb66e28-b1fe-4d4e-bcb0-b1f9d590fecc-kube-api-access-bcn97\") pod \"auto-csr-approver-29553290-x69r4\" (UID: \"cdb66e28-b1fe-4d4e-bcb0-b1f9d590fecc\") " pod="openshift-infra/auto-csr-approver-29553290-x69r4"
Mar 11 02:50:00 crc kubenswrapper[4744]: I0311 02:50:00.505248 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bcn97\" (UniqueName: \"kubernetes.io/projected/cdb66e28-b1fe-4d4e-bcb0-b1f9d590fecc-kube-api-access-bcn97\") pod \"auto-csr-approver-29553290-x69r4\" (UID: \"cdb66e28-b1fe-4d4e-bcb0-b1f9d590fecc\") " pod="openshift-infra/auto-csr-approver-29553290-x69r4"
Mar 11 02:50:00 crc kubenswrapper[4744]: I0311 02:50:00.558845 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29553290-x69r4"
Mar 11 02:50:01 crc kubenswrapper[4744]: I0311 02:50:01.085382 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29553290-x69r4"]
Mar 11 02:50:01 crc kubenswrapper[4744]: I0311 02:50:01.160460 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29553290-x69r4" event={"ID":"cdb66e28-b1fe-4d4e-bcb0-b1f9d590fecc","Type":"ContainerStarted","Data":"cea995575fd6e4e0d25eed2112de0e0d5d53db75ff45e7525ab482a14adbcc96"}
Mar 11 02:50:03 crc kubenswrapper[4744]: I0311 02:50:03.185786 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29553290-x69r4" event={"ID":"cdb66e28-b1fe-4d4e-bcb0-b1f9d590fecc","Type":"ContainerStarted","Data":"7bc0e5baf9d32af5544c3ac4251fd439d416e378057e9037dc258d86a1a03828"}
Mar 11 02:50:03 crc kubenswrapper[4744]: I0311 02:50:03.219175 4744 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29553290-x69r4" podStartSLOduration=1.7690234120000001 podStartE2EDuration="3.219152245s" podCreationTimestamp="2026-03-11 02:50:00 +0000 UTC" firstStartedPulling="2026-03-11 02:50:01.093908784 +0000 UTC m=+6957.898126399" lastFinishedPulling="2026-03-11 02:50:02.544037597 +0000 UTC m=+6959.348255232" observedRunningTime="2026-03-11 02:50:03.20960538 +0000 UTC m=+6960.013823015" watchObservedRunningTime="2026-03-11 02:50:03.219152245 +0000 UTC m=+6960.023369860"
Mar 11 02:50:04 crc kubenswrapper[4744]: I0311 02:50:04.198129 4744 generic.go:334] "Generic (PLEG): container finished" podID="cdb66e28-b1fe-4d4e-bcb0-b1f9d590fecc" containerID="7bc0e5baf9d32af5544c3ac4251fd439d416e378057e9037dc258d86a1a03828" exitCode=0
Mar 11 02:50:04 crc kubenswrapper[4744]: I0311 02:50:04.198168 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29553290-x69r4" event={"ID":"cdb66e28-b1fe-4d4e-bcb0-b1f9d590fecc","Type":"ContainerDied","Data":"7bc0e5baf9d32af5544c3ac4251fd439d416e378057e9037dc258d86a1a03828"}
Mar 11 02:50:05 crc kubenswrapper[4744]: I0311 02:50:05.617675 4744 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29553290-x69r4"
Mar 11 02:50:05 crc kubenswrapper[4744]: I0311 02:50:05.771324 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bcn97\" (UniqueName: \"kubernetes.io/projected/cdb66e28-b1fe-4d4e-bcb0-b1f9d590fecc-kube-api-access-bcn97\") pod \"cdb66e28-b1fe-4d4e-bcb0-b1f9d590fecc\" (UID: \"cdb66e28-b1fe-4d4e-bcb0-b1f9d590fecc\") "
Mar 11 02:50:05 crc kubenswrapper[4744]: I0311 02:50:05.780449 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cdb66e28-b1fe-4d4e-bcb0-b1f9d590fecc-kube-api-access-bcn97" (OuterVolumeSpecName: "kube-api-access-bcn97") pod "cdb66e28-b1fe-4d4e-bcb0-b1f9d590fecc" (UID: "cdb66e28-b1fe-4d4e-bcb0-b1f9d590fecc"). InnerVolumeSpecName "kube-api-access-bcn97". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 11 02:50:05 crc kubenswrapper[4744]: I0311 02:50:05.876677 4744 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bcn97\" (UniqueName: \"kubernetes.io/projected/cdb66e28-b1fe-4d4e-bcb0-b1f9d590fecc-kube-api-access-bcn97\") on node \"crc\" DevicePath \"\""
Mar 11 02:50:06 crc kubenswrapper[4744]: I0311 02:50:06.222651 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29553290-x69r4" event={"ID":"cdb66e28-b1fe-4d4e-bcb0-b1f9d590fecc","Type":"ContainerDied","Data":"cea995575fd6e4e0d25eed2112de0e0d5d53db75ff45e7525ab482a14adbcc96"}
Mar 11 02:50:06 crc kubenswrapper[4744]: I0311 02:50:06.222715 4744 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="cea995575fd6e4e0d25eed2112de0e0d5d53db75ff45e7525ab482a14adbcc96"
Mar 11 02:50:06 crc kubenswrapper[4744]: I0311 02:50:06.222797 4744 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29553290-x69r4"
Mar 11 02:50:06 crc kubenswrapper[4744]: I0311 02:50:06.304179 4744 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29553284-86ggt"]
Mar 11 02:50:06 crc kubenswrapper[4744]: I0311 02:50:06.314280 4744 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29553284-86ggt"]
Mar 11 02:50:07 crc kubenswrapper[4744]: I0311 02:50:07.992306 4744 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="48f900d5-0d77-43f3-a274-5c5488b8b03c" path="/var/lib/kubelet/pods/48f900d5-0d77-43f3-a274-5c5488b8b03c/volumes"
Mar 11 02:50:18 crc kubenswrapper[4744]: I0311 02:50:18.858237 4744 scope.go:117] "RemoveContainer" containerID="55e124aaa63812ade25310b009d2693f30abec3c7d241b5553b1b20a205f5d6a"
Mar 11 02:50:25 crc kubenswrapper[4744]: I0311 02:50:25.413689 4744 generic.go:334] "Generic (PLEG): container finished" podID="76eef030-d104-4d21-85b9-d3d5be6456f5" containerID="b94f773583c3a98f3b770276ba85b410eb9cadeae5eabeb798e897181aabce29" exitCode=0
Mar 11 02:50:25 crc kubenswrapper[4744]: I0311 02:50:25.413744 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-jt92c/must-gather-c62jv" event={"ID":"76eef030-d104-4d21-85b9-d3d5be6456f5","Type":"ContainerDied","Data":"b94f773583c3a98f3b770276ba85b410eb9cadeae5eabeb798e897181aabce29"}
Mar 11 02:50:25 crc kubenswrapper[4744]: I0311 02:50:25.414928 4744 scope.go:117] "RemoveContainer" containerID="b94f773583c3a98f3b770276ba85b410eb9cadeae5eabeb798e897181aabce29"
Mar 11 02:50:26 crc kubenswrapper[4744]: I0311 02:50:26.077869 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-jt92c_must-gather-c62jv_76eef030-d104-4d21-85b9-d3d5be6456f5/gather/0.log"
Mar 11 02:50:34 crc kubenswrapper[4744]: I0311 02:50:34.085336 4744 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-jt92c/must-gather-c62jv"]
Mar 11 02:50:34 crc kubenswrapper[4744]: I0311 02:50:34.086192 4744 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-must-gather-jt92c/must-gather-c62jv" podUID="76eef030-d104-4d21-85b9-d3d5be6456f5" containerName="copy" containerID="cri-o://b549092ef85ef388f7f6865bebaedb4c2adb7ea129e5f1d56808b134338cfedb" gracePeriod=2
Mar 11 02:50:34 crc kubenswrapper[4744]: I0311 02:50:34.094262 4744 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-jt92c/must-gather-c62jv"]
Mar 11 02:50:34 crc kubenswrapper[4744]: I0311 02:50:34.506601 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-jt92c_must-gather-c62jv_76eef030-d104-4d21-85b9-d3d5be6456f5/copy/0.log"
Mar 11 02:50:34 crc kubenswrapper[4744]: I0311 02:50:34.507617 4744 generic.go:334] "Generic (PLEG): container finished" podID="76eef030-d104-4d21-85b9-d3d5be6456f5" containerID="b549092ef85ef388f7f6865bebaedb4c2adb7ea129e5f1d56808b134338cfedb" exitCode=143
Mar 11 02:50:34 crc kubenswrapper[4744]: I0311 02:50:34.507702 4744 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="35e5ad951e4e17b7bebb4c07b3c8fc369addfe1aea5d8a0410d58e0c6fa93f17"
Mar 11 02:50:34 crc kubenswrapper[4744]: I0311 02:50:34.550758 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-jt92c_must-gather-c62jv_76eef030-d104-4d21-85b9-d3d5be6456f5/copy/0.log"
Mar 11 02:50:34 crc kubenswrapper[4744]: I0311 02:50:34.551263 4744 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-jt92c/must-gather-c62jv"
Mar 11 02:50:34 crc kubenswrapper[4744]: I0311 02:50:34.638293 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-q6b4t\" (UniqueName: \"kubernetes.io/projected/76eef030-d104-4d21-85b9-d3d5be6456f5-kube-api-access-q6b4t\") pod \"76eef030-d104-4d21-85b9-d3d5be6456f5\" (UID: \"76eef030-d104-4d21-85b9-d3d5be6456f5\") "
Mar 11 02:50:34 crc kubenswrapper[4744]: I0311 02:50:34.638378 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/76eef030-d104-4d21-85b9-d3d5be6456f5-must-gather-output\") pod \"76eef030-d104-4d21-85b9-d3d5be6456f5\" (UID: \"76eef030-d104-4d21-85b9-d3d5be6456f5\") "
Mar 11 02:50:34 crc kubenswrapper[4744]: I0311 02:50:34.645752 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/76eef030-d104-4d21-85b9-d3d5be6456f5-kube-api-access-q6b4t" (OuterVolumeSpecName: "kube-api-access-q6b4t") pod "76eef030-d104-4d21-85b9-d3d5be6456f5" (UID: "76eef030-d104-4d21-85b9-d3d5be6456f5"). InnerVolumeSpecName "kube-api-access-q6b4t". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 11 02:50:34 crc kubenswrapper[4744]: I0311 02:50:34.740171 4744 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-q6b4t\" (UniqueName: \"kubernetes.io/projected/76eef030-d104-4d21-85b9-d3d5be6456f5-kube-api-access-q6b4t\") on node \"crc\" DevicePath \"\""
Mar 11 02:50:34 crc kubenswrapper[4744]: I0311 02:50:34.765810 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/76eef030-d104-4d21-85b9-d3d5be6456f5-must-gather-output" (OuterVolumeSpecName: "must-gather-output") pod "76eef030-d104-4d21-85b9-d3d5be6456f5" (UID: "76eef030-d104-4d21-85b9-d3d5be6456f5"). InnerVolumeSpecName "must-gather-output". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 11 02:50:34 crc kubenswrapper[4744]: I0311 02:50:34.841983 4744 reconciler_common.go:293] "Volume detached for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/76eef030-d104-4d21-85b9-d3d5be6456f5-must-gather-output\") on node \"crc\" DevicePath \"\""
Mar 11 02:50:35 crc kubenswrapper[4744]: I0311 02:50:35.513455 4744 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-jt92c/must-gather-c62jv"
Mar 11 02:50:35 crc kubenswrapper[4744]: I0311 02:50:35.996602 4744 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="76eef030-d104-4d21-85b9-d3d5be6456f5" path="/var/lib/kubelet/pods/76eef030-d104-4d21-85b9-d3d5be6456f5/volumes"
Mar 11 02:51:12 crc kubenswrapper[4744]: I0311 02:51:12.409166 4744 patch_prober.go:28] interesting pod/machine-config-daemon-678nx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 11 02:51:12 crc kubenswrapper[4744]: I0311 02:51:12.409825 4744 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-678nx" podUID="a15dc7ac-7c34-4135-b6eb-a85122800ce9" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 11 02:51:42 crc kubenswrapper[4744]: I0311 02:51:42.409385 4744 patch_prober.go:28] interesting pod/machine-config-daemon-678nx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 11 02:51:42 crc kubenswrapper[4744]: I0311 02:51:42.410072 4744 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-678nx" podUID="a15dc7ac-7c34-4135-b6eb-a85122800ce9" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 11 02:52:00 crc kubenswrapper[4744]: I0311 02:52:00.164078 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29553292-xzvmp"]
Mar 11 02:52:00 crc kubenswrapper[4744]: E0311 02:52:00.165368 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="76eef030-d104-4d21-85b9-d3d5be6456f5" containerName="copy"
Mar 11 02:52:00 crc kubenswrapper[4744]: I0311 02:52:00.165395 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="76eef030-d104-4d21-85b9-d3d5be6456f5" containerName="copy"
Mar 11 02:52:00 crc kubenswrapper[4744]: E0311 02:52:00.165416 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="76eef030-d104-4d21-85b9-d3d5be6456f5" containerName="gather"
Mar 11 02:52:00 crc kubenswrapper[4744]: I0311 02:52:00.165429 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="76eef030-d104-4d21-85b9-d3d5be6456f5" containerName="gather"
Mar 11 02:52:00 crc kubenswrapper[4744]: E0311 02:52:00.165455 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cdb66e28-b1fe-4d4e-bcb0-b1f9d590fecc" containerName="oc"
Mar 11 02:52:00 crc kubenswrapper[4744]: I0311 02:52:00.165468 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="cdb66e28-b1fe-4d4e-bcb0-b1f9d590fecc" containerName="oc"
Mar 11 02:52:00 crc kubenswrapper[4744]: I0311 02:52:00.165837 4744 memory_manager.go:354] "RemoveStaleState removing state" podUID="76eef030-d104-4d21-85b9-d3d5be6456f5" containerName="gather"
Mar 11 02:52:00 crc kubenswrapper[4744]: I0311 02:52:00.165864 4744 memory_manager.go:354] "RemoveStaleState removing state" podUID="76eef030-d104-4d21-85b9-d3d5be6456f5" containerName="copy"
Mar 11 02:52:00 crc kubenswrapper[4744]: I0311 02:52:00.165887 4744 memory_manager.go:354] "RemoveStaleState removing state" podUID="cdb66e28-b1fe-4d4e-bcb0-b1f9d590fecc" containerName="oc"
Mar 11 02:52:00 crc kubenswrapper[4744]: I0311 02:52:00.166811 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29553292-xzvmp"
Mar 11 02:52:00 crc kubenswrapper[4744]: I0311 02:52:00.170387 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Mar 11 02:52:00 crc kubenswrapper[4744]: I0311 02:52:00.170437 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-rs77s"
Mar 11 02:52:00 crc kubenswrapper[4744]: I0311 02:52:00.171708 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Mar 11 02:52:00 crc kubenswrapper[4744]: I0311 02:52:00.191102 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29553292-xzvmp"]
Mar 11 02:52:00 crc kubenswrapper[4744]: I0311 02:52:00.258070 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8vztp\" (UniqueName: \"kubernetes.io/projected/21a16025-1d24-4c2d-82c3-798041500f11-kube-api-access-8vztp\") pod \"auto-csr-approver-29553292-xzvmp\" (UID: \"21a16025-1d24-4c2d-82c3-798041500f11\") " pod="openshift-infra/auto-csr-approver-29553292-xzvmp"
Mar 11 02:52:00 crc kubenswrapper[4744]: I0311 02:52:00.360963 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8vztp\" (UniqueName: \"kubernetes.io/projected/21a16025-1d24-4c2d-82c3-798041500f11-kube-api-access-8vztp\") pod \"auto-csr-approver-29553292-xzvmp\" (UID: \"21a16025-1d24-4c2d-82c3-798041500f11\") " pod="openshift-infra/auto-csr-approver-29553292-xzvmp"
Mar 11 02:52:00 crc kubenswrapper[4744]: I0311 02:52:00.398599 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8vztp\" (UniqueName: \"kubernetes.io/projected/21a16025-1d24-4c2d-82c3-798041500f11-kube-api-access-8vztp\") pod \"auto-csr-approver-29553292-xzvmp\" (UID: \"21a16025-1d24-4c2d-82c3-798041500f11\") " pod="openshift-infra/auto-csr-approver-29553292-xzvmp"
Mar 11 02:52:00 crc kubenswrapper[4744]: I0311 02:52:00.499128 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29553292-xzvmp"
Mar 11 02:52:01 crc kubenswrapper[4744]: I0311 02:52:01.031079 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29553292-xzvmp"]
Mar 11 02:52:01 crc kubenswrapper[4744]: I0311 02:52:01.465914 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29553292-xzvmp" event={"ID":"21a16025-1d24-4c2d-82c3-798041500f11","Type":"ContainerStarted","Data":"cd44b4deeff48dff9b845411ebbdede8eee794d8f939c7df28465e396849c7b5"}
Mar 11 02:52:03 crc kubenswrapper[4744]: I0311 02:52:03.491884 4744 generic.go:334] "Generic (PLEG): container finished" podID="21a16025-1d24-4c2d-82c3-798041500f11" containerID="e8825168bbdb35aeff81461bbacbd0305d3497a3b4f88b02815006e19a2f0d7c" exitCode=0
Mar 11 02:52:03 crc kubenswrapper[4744]: I0311 02:52:03.492479 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29553292-xzvmp" event={"ID":"21a16025-1d24-4c2d-82c3-798041500f11","Type":"ContainerDied","Data":"e8825168bbdb35aeff81461bbacbd0305d3497a3b4f88b02815006e19a2f0d7c"}
Mar 11 02:52:04 crc kubenswrapper[4744]: I0311 02:52:04.927337 4744 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29553292-xzvmp"
Mar 11 02:52:04 crc kubenswrapper[4744]: I0311 02:52:04.972854 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8vztp\" (UniqueName: \"kubernetes.io/projected/21a16025-1d24-4c2d-82c3-798041500f11-kube-api-access-8vztp\") pod \"21a16025-1d24-4c2d-82c3-798041500f11\" (UID: \"21a16025-1d24-4c2d-82c3-798041500f11\") "
Mar 11 02:52:04 crc kubenswrapper[4744]: I0311 02:52:04.995312 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/21a16025-1d24-4c2d-82c3-798041500f11-kube-api-access-8vztp" (OuterVolumeSpecName: "kube-api-access-8vztp") pod "21a16025-1d24-4c2d-82c3-798041500f11" (UID: "21a16025-1d24-4c2d-82c3-798041500f11"). InnerVolumeSpecName "kube-api-access-8vztp". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 11 02:52:05 crc kubenswrapper[4744]: I0311 02:52:05.075921 4744 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8vztp\" (UniqueName: \"kubernetes.io/projected/21a16025-1d24-4c2d-82c3-798041500f11-kube-api-access-8vztp\") on node \"crc\" DevicePath \"\""
Mar 11 02:52:05 crc kubenswrapper[4744]: I0311 02:52:05.515027 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29553292-xzvmp" event={"ID":"21a16025-1d24-4c2d-82c3-798041500f11","Type":"ContainerDied","Data":"cd44b4deeff48dff9b845411ebbdede8eee794d8f939c7df28465e396849c7b5"}
Mar 11 02:52:05 crc kubenswrapper[4744]: I0311 02:52:05.515062 4744 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="cd44b4deeff48dff9b845411ebbdede8eee794d8f939c7df28465e396849c7b5"
Mar 11 02:52:05 crc kubenswrapper[4744]: I0311 02:52:05.515369 4744 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29553292-xzvmp"
Mar 11 02:52:06 crc kubenswrapper[4744]: I0311 02:52:06.001368 4744 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29553286-gvvcf"]
Mar 11 02:52:06 crc kubenswrapper[4744]: I0311 02:52:06.006612 4744 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29553286-gvvcf"]
Mar 11 02:52:07 crc kubenswrapper[4744]: I0311 02:52:07.992692 4744 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="58a97075-eefa-4f1a-b520-b3ce094b7413" path="/var/lib/kubelet/pods/58a97075-eefa-4f1a-b520-b3ce094b7413/volumes"
Mar 11 02:52:12 crc kubenswrapper[4744]: I0311 02:52:12.409561 4744 patch_prober.go:28] interesting pod/machine-config-daemon-678nx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 11 02:52:12 crc kubenswrapper[4744]: I0311 02:52:12.410343 4744 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-678nx" podUID="a15dc7ac-7c34-4135-b6eb-a85122800ce9" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 11 02:52:12 crc kubenswrapper[4744]: I0311 02:52:12.410430 4744 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-678nx"
Mar 11 02:52:12 crc kubenswrapper[4744]: I0311 02:52:12.411793 4744 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"867fe62c68fc64bbbe9fc068c1ec0f2fbc22ec45bffc65a7807699fdd918c7b7"} pod="openshift-machine-config-operator/machine-config-daemon-678nx" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Mar 11 02:52:12 crc kubenswrapper[4744]: I0311 02:52:12.411898 4744 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-678nx" podUID="a15dc7ac-7c34-4135-b6eb-a85122800ce9" containerName="machine-config-daemon" containerID="cri-o://867fe62c68fc64bbbe9fc068c1ec0f2fbc22ec45bffc65a7807699fdd918c7b7" gracePeriod=600
Mar 11 02:52:12 crc kubenswrapper[4744]: E0311 02:52:12.547187 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-678nx_openshift-machine-config-operator(a15dc7ac-7c34-4135-b6eb-a85122800ce9)\"" pod="openshift-machine-config-operator/machine-config-daemon-678nx" podUID="a15dc7ac-7c34-4135-b6eb-a85122800ce9"
Mar 11 02:52:12 crc kubenswrapper[4744]: I0311 02:52:12.598233 4744 generic.go:334] "Generic (PLEG): container finished" podID="a15dc7ac-7c34-4135-b6eb-a85122800ce9" containerID="867fe62c68fc64bbbe9fc068c1ec0f2fbc22ec45bffc65a7807699fdd918c7b7" exitCode=0
Mar 11 02:52:12 crc kubenswrapper[4744]: I0311 02:52:12.598295 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-678nx" event={"ID":"a15dc7ac-7c34-4135-b6eb-a85122800ce9","Type":"ContainerDied","Data":"867fe62c68fc64bbbe9fc068c1ec0f2fbc22ec45bffc65a7807699fdd918c7b7"}
Mar 11 02:52:12 crc kubenswrapper[4744]: I0311 02:52:12.598339 4744 scope.go:117] "RemoveContainer" containerID="72c4c724e8cb1e13760e1df0ad3d11a092cb6f1b4892570fe646d673551f7a5b"
Mar 11 02:52:12 crc kubenswrapper[4744]: I0311 02:52:12.599358 4744 scope.go:117] "RemoveContainer" containerID="867fe62c68fc64bbbe9fc068c1ec0f2fbc22ec45bffc65a7807699fdd918c7b7"
Mar 11 02:52:12 crc kubenswrapper[4744]: E0311 02:52:12.599899 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-678nx_openshift-machine-config-operator(a15dc7ac-7c34-4135-b6eb-a85122800ce9)\"" pod="openshift-machine-config-operator/machine-config-daemon-678nx" podUID="a15dc7ac-7c34-4135-b6eb-a85122800ce9"
Mar 11 02:52:19 crc kubenswrapper[4744]: I0311 02:52:18.999625 4744 scope.go:117] "RemoveContainer" containerID="2c18008c454c81d2418635809c131d5a5bb3f609efd347876a2e33b7ae2ad4fd"
Mar 11 02:52:19 crc kubenswrapper[4744]: I0311 02:52:19.042050 4744 scope.go:117] "RemoveContainer" containerID="b549092ef85ef388f7f6865bebaedb4c2adb7ea129e5f1d56808b134338cfedb"
Mar 11 02:52:19 crc kubenswrapper[4744]: I0311 02:52:19.080298 4744 scope.go:117] "RemoveContainer" containerID="b4bc7a03dafbe98d63eb695bcb550fed4d0f8e0f5ec40f99bdec48b7b01b08a0"
Mar 11 02:52:19 crc kubenswrapper[4744]: I0311 02:52:19.130859 4744 scope.go:117] "RemoveContainer" containerID="b94f773583c3a98f3b770276ba85b410eb9cadeae5eabeb798e897181aabce29"
Mar 11 02:52:25 crc kubenswrapper[4744]: I0311 02:52:25.976113 4744 scope.go:117] "RemoveContainer" containerID="867fe62c68fc64bbbe9fc068c1ec0f2fbc22ec45bffc65a7807699fdd918c7b7"
Mar 11 02:52:25 crc kubenswrapper[4744]: E0311 02:52:25.977659 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-678nx_openshift-machine-config-operator(a15dc7ac-7c34-4135-b6eb-a85122800ce9)\"" pod="openshift-machine-config-operator/machine-config-daemon-678nx" podUID="a15dc7ac-7c34-4135-b6eb-a85122800ce9"
Mar 11 02:52:39 crc kubenswrapper[4744]: I0311 02:52:39.975016 4744 scope.go:117] "RemoveContainer" containerID="867fe62c68fc64bbbe9fc068c1ec0f2fbc22ec45bffc65a7807699fdd918c7b7"
Mar 11 02:52:39 crc kubenswrapper[4744]: E0311 02:52:39.976388 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-678nx_openshift-machine-config-operator(a15dc7ac-7c34-4135-b6eb-a85122800ce9)\"" pod="openshift-machine-config-operator/machine-config-daemon-678nx" podUID="a15dc7ac-7c34-4135-b6eb-a85122800ce9"
Mar 11 02:52:52 crc kubenswrapper[4744]: I0311 02:52:52.975617 4744 scope.go:117] "RemoveContainer" containerID="867fe62c68fc64bbbe9fc068c1ec0f2fbc22ec45bffc65a7807699fdd918c7b7"
Mar 11 02:52:52 crc kubenswrapper[4744]: E0311 02:52:52.976427 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-678nx_openshift-machine-config-operator(a15dc7ac-7c34-4135-b6eb-a85122800ce9)\"" pod="openshift-machine-config-operator/machine-config-daemon-678nx" podUID="a15dc7ac-7c34-4135-b6eb-a85122800ce9"